
“Gen AI research points to a potential loss of human capital through undermining three important characteristics of a high-functioning workforce: critical thinking skills, subject knowledge and motivation,” says Dr KAREN MACPHERSON
Across the globe, organisations are embracing Generative AI with open arms.

Its Big Tech developers have assured us that Gen AI will increase efficiencies and cut costs, thereby raising productivity – the holy grail of any government or business enterprise.
What should we make of Gen AI? Undoubtedly, it is bringing significant benefits in medicine, science, industry, and in many other fields.
But we need to be clear-eyed about its potential downsides as well – and identify and manage these downsides proactively. We must not repeat the mistakes we made in recent decades with social media harms.
There are some significant emerging concerns with Gen AI that already warrant attention.
One of these concerns is that Google Gemini has hijacked our internet search vehicle and is now in the driver’s seat as we speed along the information superhighway.
There are two important implications: first, humans have been relegated to passenger status, so that our individual agency to locate accurate information from credible sources has been diminished. Second, our need to be able to recognise misinformation has increased dramatically.
Fortunately, we already have tools that can be used to mitigate these harms: we can ensure that every school across Australia teaches the package of skills known as “information literacy”.
Information literacy is all about how to navigate, locate, interpret and use information, including from that firehose of undifferentiated “information” that is the internet.
Critical thinking skills of evaluating the credibility, relevance and bias of information are a major subset of the information literacy suite. These skills can and should be taught explicitly (step-by-step) in every subject and reinforced at every age level. They are powerful tools for understanding and decision making.
Yet significantly for the workforce of the future, a mounting body of evidence suggests that too much reliance on Gen AI could actually erode these essential thinking skills: a decrease in subject knowledge, a narrowing of pathways to expertise, and disengagement from work tasks.
In other words, a reduction in both human capital and in an individual’s motivation and sense of purpose. Research findings from big players such as Microsoft, Harvard University and the Australian Public Service, through its implementation trial of Copilot, are united by a thread of such observations.
A key term in these discussions is “cognitive offloading”. We all offload various mental operations and/or memory tasks. Examples are writing a “to-do” list, or engaging cruise control when driving. In terms of mental effort, this is useful: it frees up thinking room for more complex tasks, such as monitoring road conditions for danger.
But the issue with Gen AI is that offloading too many complex thinking tasks on a regular basis could mean that essential thinking skills are lost. The saying “use it or lose it” applies even more to thinking skills than it does to manual skills such as driving a car.
Another concerning issue is the reduction of entry-level work opportunities through AI-related job cuts that are now occurring in many organisations. Quite apart from the impact of the job losses themselves, this trend is worrying from a human learning perspective.
Deep knowledge, or expertise, is developed over many years. It is acquired through learning, practice and wide experience. Think apprentice/master; intern/specialist; student teacher/principal; flight officer/captain. Humans become experts by identifying and understanding patterns in large datasets, starting with junior-level tasks.
If juniors are not performing tasks such as routine reading of reports and work-related information, or writing summaries and reports themselves, important pathways to the development of human expertise are being narrowed.
Reading a Gen AI summary (even assuming it’s correct) doesn’t provide the immersion in information that enables basic patterns to be identified, and “mental models” of concepts to be formed.
Over time, expertise builds as patterns of information, and mental models of those patterns, become complex and unique.
Eventually, an expert is able to identify patterns in highly complex information that a novice does not even see. Coupled with the human brain’s ability to reason and to make rapid unconscious novel connections (“intuition”), this sophisticated pattern recognition underpins expert judgment. Who would you rather have diagnosing your complex medical condition: an intern or a specialist?
We already have significant shortages of expertise in many trades and professions: electricians, engineers, scientists, teachers and doctors. These shortages are causing serious systemic performance and service issues around Australia. We must be careful to balance productivity gains from Gen AI with the need to provide meaningful and engaging jobs, and long-term opportunities for our workforce to grow expertise and human capital.
A final thought. There is a colossal network of AI data centre superstructures spreading across the surface of our planet. It is a voracious energy consumer. Yet that vast global network cannot yet compete with the reasoning power of a single human mind. Each of us carries in our head the most creative, intuitive, sense-making instrument that has ever evolved. We are called Homo sapiens for a reason. We really do need to value what we have.
Dr Karen Macpherson is an independent advocate for public education.