Is a college degree the best way to prepare for work in the digital economy? Photograph: Aaron Murphy

While universities have traditionally viewed their mission as expanding knowledge, developing powerful minds and transmitting culture, most students and parents would probably agree that their primary motivation for investing in higher education is to ensure access to a secure and financially rewarding career. Given that studies continue to find that college graduates, on average, have better financial outcomes than those without a degree, such an outlay seems eminently sensible. This is particularly so in developing economies where the critical shortage of qualified personnel ensures a life-changing impact for those who can access such an education.

But those returns are diminishing in established economies, where highly educated workers are abundant. The desire to stand out in a crowded field has led to intense competition for spots at top universities, with students being pushed to succeed in what has been called the “Race to Nowhere”.

An ironic consequence of this trend is that students are ever more pressured to perform, but not necessarily required to learn deeply or conceptually.

As Artificial Intelligence (AI), Machine Learning (ML) and robotics accelerate their disruption of the world of work and employment, much of the content students are currently absorbing in college is likely to be out of date by the time they graduate. At the same time, employers are complaining that graduates are not “work ready”, and that despite their extensive education, they lack the skills needed to apply and expand their learning in the field.

With college fees rising much faster than prices in most other sectors of the economy and student debt ballooning, parents and students are increasingly likely to question the value of a university education, particularly as many graduates continue to struggle to find secure and rewarding jobs. Preparing masses of students for employment has not been a core element of higher education’s complex and evolving mission, but it is timely to review what role higher education can and should play in preparing students for the Fourth Industrial Revolution (4IR).

The cost of a US college education has doubled in the past 20 years. Source: AEI

This question is tackled head on in a new book, “Higher Education in the Era of the Fourth Industrial Revolution”, edited by Nancy Gleason, Senior Lecturer of Global Affairs and Director of the Centre for Teaching & Learning at Yale-NUS in Singapore. (The book is available to download for free here.)

In determining the ongoing role of higher education, Gleason acknowledges that no one can yet foresee many of the future jobs that this education will be required to support. AI is projected to create more jobs than it replaces, but these jobs will be very different from today’s, and the transition will be challenging. Based on emerging talent shortages, there will be clear demand for skills in creativity, data analytics and cybersecurity, but beyond that, even well-established, high-skill white-collar jobs are likely to see decreasing demand for human labor.

There is more confidence when it comes to identifying the skills that will be required to thrive in an increasingly automated economy. As more and more tasks are digitalised, work will become increasingly fluid, requiring employees to be agile and able to transition between different types of tasks and contexts. As such, skills including complex problem solving, critical thinking, creativity, collaboration, emotional intelligence, judgement and decision making together with cognitive flexibility will be important.

The emphasis on skills does not replace the need for a deep understanding of content. These skills will, in fact, provide continuity through the many cycles of content that will need to be absorbed to keep pace with a constantly transforming economy. More importantly, it is these skills and capabilities that will enable the effective translation of that content into real-world impact and value.

In light of the above, Gleason concedes that traditional undergraduate education, developed for the industrial age, is no longer a viable form of education to ensure employment and a career, and will need to change both in terms of what students learn and how they learn it.

The “What” consists of content, the skills identified above, and optimised tools and strategies to support a lifetime of learning. As the transition to AI and automation heightens demand for the things that make us human, our emotional intelligence and creativity, together with adaptable and flexible minds, several of the book’s contributors point to Liberal Arts education as providing the necessary foundation. They suggest that its human-centred, cross-disciplinary approach is ideal for developing the higher-order thinking skills, cognitive flexibility and ethical tools needed to deal with the challenges new technologies will bring.

At the same time, they acknowledge that an understanding of technology will also be essential, and this combination of STEM (Science, Technology, Engineering and Mathematics) with Liberal and Design Arts is driving the emergence of integrated STEAM educational approaches. Furthermore, recognising that not everyone can attend a Liberal Arts college, they recommend that higher education institutions, in collaboration with governments and industry, support life-long learners by using Liberal Arts as a framework across all curricula and training to aid the development of the skills needed.

The “How” draws on research that shows students learn and recall more through project-based learning around open-ended challenges and are consequently more motivated and engaged with discipline content. This will require a move towards student-centred and personalised education, in place of more traditional information transfer approaches – from the “Sage on the Stage” to the “Guide on the Side”.

The collection then explores several emerging developments including:

  • Education systems, particularly in Asia, are starting to emphasise “general education” at the university level, promoting cross-disciplinary approaches, in many cases based on traditional Liberal Arts models.
  • Universities are establishing entrepreneurial incubation centres to not only equip students with 4IR skills and capabilities, but to launch them into engagement with industry and on the path to entrepreneurial careers. More than 3 million university students are currently engaged in entrepreneurship and innovation events in China alone.
  • In order to more effectively support life-long learning, the higher education sector is exploring concepts including micro-credentials, nano-degrees and micro-learning to support students and those already in the workforce with just-in-time learning opportunities.

These recommendations represent profound changes to the way universities operate and teach, and raise the question of whether they are willing and able to make the transformation. It also remains to be seen how quickly students and employers will accept the value of these new approaches to content and learning, and indeed, whether they will view universities as the best place to get them.

So, is a university education still a good investment? Graduating from an elite university does send a positive signal to which many employers still respond. However, as costs continue to rise and the breaking scandal over college admissions brings the whole sector under intense scrutiny, this seems likely to change. Parents and students continue to show a willingness to invest heavily in education to secure a flourishing future. The question is whether universities will be able to continue to justify that investment, or whether nimble, cost-effective new players will disrupt the sector and dominate 4IR education and learning.