For those of us who have been working on AI in Education, it is a bit of a strange time. On the one hand, it is no longer difficult to interest policy makers, managers, or teachers and trainers in AI. On the other hand, at the moment AI seems to be conflated with the hype around ChatGPT. As one senior policy person said to me yesterday: "I hadn't even heard of Generative AI models until two weeks ago."
And of course there are a lot more things happening, or about to happen, not just on the AI side but in general developments and innovation with technology that are likely to impact on education. So much, in fact, that it is hard to keep up. But I think it is important to keep up and not just leave the developing technology to the tech researchers. And that is why I am ultra impressed with the new publication from the Netherlands SURF network – 'Tech Trends 2023'.
In the introduction they say:
This trend report aims to help us understand the technological developments that are going on around us, to make sense of our observations, and to inspire. We have chosen the technology perspective to provide an overview of signals and trends, and to show some examples of how the technology is evolving.
SURF scanned multiple trend reports and market intelligence services to identify the big technology themes. They continue:
We identified some major themes: Extended Realities, Quantum, Artificial intelligence, Edge, Network, and advanced computing. We believe these themes cover the major technological developments that are relevant to research and education in the coming years.
But what I particularly like is that for each trend there is a link to public values and the readiness level as well. The values are taken from the diagram above. As SURF say, "public values are key to efficient education and research."
In the last few weeks the discussions about technology for education and learning have been dominated by the impact of GPT-3 on the future of education – a discussion which Alexandra Mihai, in a blog post entitled 'Let's get off the fear carousel', characterises as "hysteria".
"The way I see it," she says, "academia's response to ChatGPT is more about academic culture than about the tool itself." As she points out, AI tools are not new and are already in use in a wide range of applications commonly used in education. But probably the greatest concern, or even panic, about ChatGPT is in relation to assessment.
Alexandra draws attention to 7 things that the current debate reveals about our academic culture. Although she is focused on Higher Education, much the same applies to Vocational Education and Training – though I think that many teachers and trainers in VET may be more open to AI, given that it already plays a considerable role in the jobs vocational students are being trained for.
Her 7 things are:
Lots of pressure/ high workloads: regardless of our positions, everyone seems to be under a great amount of pressure to perform
Non-transparent procedures: university administration is very often a black box with missing or inefficient communication channels
Lack of trust in students: this very harmful narrative is unfortunately a premise for many educators, not entirely (or not always) out of bad will but rather stemming from a teacher-centred paradigm which emphasises the idea of control.
Stale quality assurance (QA) policies: quality assurance in education is a complex mix of many factors (including faculty professional development, technology integration and academic integrity policies, to name just the more relevant ones for the current debate)
Inertia: the biggest enemy, in her opinion. Responding to change in a timely and efficient manner is not one of the strong points of HE institutions.
Technological determinism: the only thing that is, she feels, equally if not more dangerous than banning technology is thinking it can solve all problems.
Alexandra wants us to "take a moment to actually talk to and really listen to our students". She says: "All this will help us understand them better and design learning experiences that make sense to them. Not necessarily assignments where they cannot cheat, but activities and assignments they genuinely want to engage in because they see them as relevant for their present and their future."
In an earlier blog she invites us to reflect on two questions.
Firstly, how do you balance three assessment purposes: students' expertise development, backward design and constructive alignment, and feasibility for students, teachers and the organisation?
Secondly, how do you take into account the three principles for optimally balancing different assessment purposes, in order to guide students towards professional independence?
There is no shortage of resources on ChatGPT in education: a list which is growing by the day. Here are five that Alexandra suggests: