Despite last week’s weather, VICTVS made it to the Federation of Awarding Bodies 2024 Conference, Exhibition, and Awards ceremony. Hosted in Leicester, the annual FAB conference is a must-attend event for many awarding organisations, ed-tech companies, assessment services, and anyone with an interest in the education industry.
As longstanding attendees of FAB, we returned this year to host a coffee bar, with our team ready to assist attendees with any queries about our services. While the team manned our stand, I had the chance to explore the conference as an attendee and join several engaging panel discussions and talks.
The conference opened with speeches and panel discussions featuring Jane Belfourd, Director for Technical Education at the Department for Education, alongside representatives from SIAS, JISC, Ofqual, and more. Two panels followed: one on the future of UK skills and further education, and the other on AI and assessment. The latter highlighted both the opportunities AI presents and the challenges it poses in assessments and awarding.
With generative AI dominating public discourse, this was just one of many AI-related talks at FAB. The panel explored how AI can enhance a learner’s experience—for example, by helping those whose first language is not English improve their pronunciation and grammar. One speaker also suggested that schools should teach students how to create effective AI chatbot prompts, as this is quickly becoming a vital workplace skill.
As I have a background in digital rights and cybersecurity, I couldn’t help but approach the discussions from this perspective. Debra Grey, Principal and CEO of Hull College, raised an important point about schools using free AI tools to save money, warning: “Just because they’re free, doesn’t mean they’re safe.”
While the panel quickly moved on from this point, I was left wanting a deeper examination of this issue—particularly considering the amount of research that has gone into unsafe, free apps being used in schools.
In 2022, Human Rights Watch investigated over 150 educational apps, most of them free, and found that 89% monitored or had the ability to monitor their users. Additionally, more than 140 of the apps examined gave third parties direct access to user data, or sold that data, reaching 196 different advertising companies.
Just last year, it was revealed that 96% of free apps used or recommended for use by K-12 students in America shared students’ personal information with third parties.
So there is clearly already a problem with dubious apps being used in schools. With the rise of AI and the growing number of AI-powered apps, it seems likely that many schools will unknowingly recommend even more apps with questionable data-sharing practices.
Given the weight of this topic, I was disappointed that it wasn’t explored further, alongside broader dialogue on student safety, data protection, accessibility, and generative AI’s inherent fallibility.
Later in the day, I attended two other talks: ‘The Future of Assessment – Towards a Vision for Tech-Enabled Assessment’ and ‘Navigating Assessment Integrity in the Age of Generative AI.’ Both sparked meaningful discussions, particularly around ideas for potential apps and technology that could be used in the classroom. The second talk, led by a Turnitin representative, highlighted the challenges of detecting AI in coursework and raised important questions about how assessment formats must evolve alongside students’ changing use of technology.
This theme ran throughout much of FAB: How can assessments keep pace with rapidly changing technology? And how should students be encouraged to use these tools in formal assessments?
AI and technology dominated the day’s conversations, and I found the panels both insightful and thought-provoking. I particularly enjoyed the opening discussions and the chance to speak with other attendees, who were clearly passionate about the future of education and how best to support learners of all ages.
That said, I felt there was much missing from the conversation—particularly around cybersecurity, data privacy, and the risks of integrating emerging technologies into underprepared schools. Moreover, while everyone acknowledged AI’s potential, I wish there had been more focus on its flaws, such as its bias and tendency toward inaccuracy. This technology is far from perfect, and failing to acknowledge its limitations does nothing to help us. Rather than discouraging its use, students should be taught how to harness generative AI effectively while critically assessing its outputs.
In the future, it would be helpful to invite AI and technology experts from outside the education sector to speak on these issues. By doing so, we can gain a better understanding of emerging technologies and how to integrate them into assessments and the classroom in a genuinely helpful and safe manner.
I only attended the Monday of FAB, but I found the day both thought-provoking and inspiring—not just because of what was discussed, but because of what was left unsaid. It will be interesting to see how education evolves between now and FAB 2025. Hopefully, by then, we’ll have answers to many of the questions raised at this year’s event.