How might AI support how people learn outside the classroom?

Hanna Barakat + AIxDESIGN & Archival Images of AI / Better Images of AI / Data Mining 3 / CC-BY 4.0

Every day hundreds of posts are written on social media about AI and education. Every day yet more papers are published about AI and education. There are webinars, seminars and conferences about AI and education. Yet nearly all of them are about formal education, education in the classroom. But as Stephen Downes says in a commentary on a blog post by Alan Levine, we need more on how people actually teach and actually learn. "We get a lot in the literature about how it happens in the classroom. But the classroom is a very specialized environment, designed to deal with the need to foster a common set of knowledge and values on a large population despite constraints in staff and resources. But if we go out into homes or workplaces, we see teaching and learning happening all the time..."

And of course people learn in different ways - through being shown how to do something, through watching a video, through working, playing and talking. Sadly, in all these discussions about AI and education there is little about how people learn, and even less on how AI might support (or hinder) informal learning.

These technologies are complex…

Nadia Piet and AIxDESIGN & Archival Images of AI / Better Images of AI / Limits of Classification / CC-BY 4.0

There are some pretty fearsome discussions going on this week between so-called sceptics of Gen AI and its supporters (although much of the shouting is over the terms of the debate).

But it seems pretty incontestable that the big AI technology providers are trying to muscle in on education as a promising market.

In a series of posts on LinkedIn, Ben Williamson from Edinburgh University has looked at the different initiatives by the companies, who, not surprisingly, are offering incentives to sign up with their AI variant. Google, he said, is almost literally buying institutions, with prime ministerial endorsement, to advance its AI interests. Google last week announced the launch of the AI Campus, with UK Prime Minister Sir Keir Starmer attending “to show his support for our groundbreaking initiative to improve digital skills in the UK in our London home and his constituency.” The pilot, they said, will offer students access to cutting-edge resources on AI and machine learning, as well as mentoring and industry expertise from Google, Google DeepMind, and others.

Meanwhile, not to be outdone, Amazon's cloud computing subsidiary AWS (actually its biggest profit centre) announced a $100 million programme to provide AI skills to underserved kids.

But as Williamson pointed out, that $100m is pretty restricted, coming in the form of cloud credits as part of the AWS Education Equity Initiative.

These cloud credits, they say, “essentially act like cash that organizations can use to offset the costs of using AWS's cloud services. Recipients can then take advantage of AWS's comprehensive portfolio of cloud technology and advanced AI services..."

And Microsoft, which has already locked many institutions into its Teams app with all kinds of AI add-ons, is telling education institutions they must upgrade their cloud contracts even further, for purposes of data governance, when they use generative AI.

AI is increasingly seen as a vehicle to expand the cloud business in education, says Williamson, locking education institutions into hard-to-cancel cloud contracts under the guise of claims about AI efficiencies and improvements in outcomes (unproven as they are). He believes AI in education can't be separated from the cloud business model.

In an article in his Substack newsletter, Edward Ongweso Jr points out: "These technologies are complex: their origins, their development, the motivations driving their financing, the political projects they’re connected to, the products they’re integrated into."

This shows a need to go beyond present understandings of AI Literacy to understand the activities, intentions and impact of the big technology companies. And for education, it further suggests the need to develop our own applications, based on open source software and independent of a reliance on these companies. OpenAI, which started with a mission to develop AI to benefit society, now makes no pretence of its profit-driven motivation, and if that means privatising education, that is not a barrier.

About the image

This image shows a gradual transformation from fish to woman and vice versa - questioning the rigid boundaries of classification and emphasising the fluid, in-between states where entities cannot be neatly boxed into one category or the other. It is particularly reminiscent of early image-generation GAN models, where images could be generated at the midpoint between two concepts in latent space, outputting entertaining visuals while highlighting the complexity, ambiguity, and limits of data labelling.

AI – Productivity, Jobs and Skills

Hanna Barakat + AIxDESIGN & Archival Images of AI / Better Images of AI / Textiles and Tech 1 / CC-BY 4.0

Much of the big excitement about Generative AI was driven by the idea that it would boost productivity (and thus profit). Conversely, one of the fears was that it would lead to job losses, although there was little or no consensus about how severe such job losses might be; indeed, some commentators speculated that new jobs created by AI would balance out the losses.

Early research and reports into the impact of AI were conflicted, with increasing levels of hype perhaps overwhelming more sober research findings. And even now there is only a limited consensus on the impact of Generative AI on employment. Let's look first at productivity. Early research has tended to emphasise that less experienced staff have gained most from using AI, with only limited gains for more senior employees, although of course there are big differences between sectors and occupations. A recent report, Reclaim your Day: the impact of AI PCs on Productivity, about a study by Intel which tried to see whether AI can save time and boost productivity, found that “current AI PC owners spend longer on tasks than their counterparts using traditional PCs.” According to the study, users of these AI PCs spent a long time trying to identify “how best to communicate with AI tools to get the desired answers or response,” which is why they took longer. However, there is also a stark lack of data in the report on how much time was spent monitoring and correcting the AI outputs. Despite this, the study was optimistic, stating that people need to be better educated on using these AI tools.

Women in Technology has published a study by Sarah Writtenhouse entitled The Great Tech Job Migration is Upon Us - What you need to know about how jobs are adapting to the new tech climate (paywalled), looking at how jobs in the software industry are changing. The software industry is interesting as this is one of the sectors for which the big Gen AI companies have claimed big productivity savings. Software jobs were already in decline, but Writtenhouse says that Software Development job postings on LinkedIn fell almost 25% in October this year, shrinking from 22,000 to just under 17,000. But not all is as it seems, Writtenhouse says:

“These jobs are just evolving into the next generation of software development work by adding new skills to new job titles.

AI, ML, and Cloud Computing Engineers — Just new names for “Software Developer”

In terms of skills, she says “Python, Java, and C++ are still core skills, but an added upskill to ML frameworks, cloud AI toolsets, and LLM models create new AI-centric development jobs… oops, I mean AI-centric engineering jobs.”

It seems AI Engineer postings rose sharply in October, increasing 55% from 10,000 in September to almost 16,000, with a doubling in openings for Cloud Computing Engineers and ML Engineers. Similarly, there was an increase in demand for Data Analysts, Data Engineers, and Data Scientists.

I suspect that changes in skills demand and job titles may be more significant than overall employment levels in different sectors. However, this suggests that higher levels of advanced skills training are going to be needed. It may well be that those working in the software industry are used to fast-moving technology change, but this may not be the case in other sectors, where professional training is needed to help employees keep up.

About the image

'Textiles and Tech' intertwines the visual elements of circuits and textiles, merging the past and future, wires and strings. The collages draw inspiration from the history of 1960s Silicon Valley, where Navajo women were employed by Fairchild Semiconductor for their weaving expertise to assemble circuits that laid the groundwork for today’s microchips. By compiling archival images of hands, the series seeks to personify the anonymity of tech labor. The strings and wires running through the visuals encourage viewers to reflect: what is uncovered when we pull on these threads?

How to be a trusted voice online

UNESCO have launched an online course in response to a survey of digital content creators, 73 per cent of whom requested training. According to UNESCO the course aims to empower content creators to address disinformation and hate speech and provide them with a solid grounding in global human rights standards on both Freedom of Expression and Information. The content was produced by media and information literacy experts in close collaboration with leading influencers around the world to directly address the reality of situations experienced by digital content creators.

The course has just started and runs for 4 weeks; over 9,000 people from 160 countries enrolled and are currently taking it. They will learn how to:

  • source information using a diverse range of sources,
  • assess and verify the quality of information,
  • be transparent about the sources which inspire their content,
  • identify, debunk and report misinformation, disinformation and hate speech,
  • collaborate with journalists and traditional media to amplify fact-based information.

The UNESCO “Behind the screens” survey found that fact-checking is not the norm, and that content creators have difficulty determining the best criteria for assessing the credibility of information they find online. 42% of respondents said they used “the number of ‘likes’ and ‘shares’ a post had received” on social media as the main indicator. 21% were happy to share content with their audiences if it had been shared with them “by friends they trusted”, and 19% said they relied “on the reputation” of the original author or publisher of content.

UNESCO says that although journalists could be a valuable aid for digital content creators to verify the reliability of their information, links and cooperation are still rare between these two communities. Mainstream news media is only the third most common source (36.9%) for content creators, after their own experience and their own research and interviews.

The survey also revealed that a majority of digital content creators (59%) were either unfamiliar with or had only heard of regulatory frameworks and international standards relating to digital communications. Only slightly more than half of the respondents (56.4%) are aware of training programmes addressed to them, and only 13.9% of those who are aware of these programmes have participated in any of them.