Shaping Future Success: AI Literacy Framework & the PISA 2029 MAIL Assessment

Two complementary frameworks are shaping how students learn to engage with AI responsibly, creatively, and critically in a fast-changing digital world.

by Katie Finnegan
December 12, 2025
3 min. read

Engage with Our Work

We invite you to explore the draft AI Literacy Framework and share your thoughts.

Download Draft Framework

In recent years, AI systems have begun surpassing human abilities in several areas, such as reading comprehension, language understanding, and image recognition. However, not all skills are equally easy for AI to master. In PISA assessments, AI has demonstrated particular strength in reading, while humans still hold the edge in mathematics.

These differences between human abilities and AI capabilities are not lost on the next generation. Students are already questioning what skills will truly matter in a world where machines can outperform us in so many ways. Surveys reveal that a large majority believe AI will play a crucial role in their professional futures. Many are concerned that AI could widen academic gaps among peers. Some feel their schools are doing a good job preparing them for an age of AI, while others are less confident about how well their teachers are prepared to use AI in education. This makes the role of schools and teachers in fostering AI literacy more important than ever. 

Recognizing this need, the AI Literacy (AILit) Framework and the PISA 2029 Media & AI Literacy (MAIL) assessment are two complementary initiatives designed to prepare students for the opportunities and challenges of a world increasingly influenced by AI.

How do the AILit and the PISA 2029 MAIL Frameworks relate?

Designed as a practical toolkit for primary and secondary teachers, the AILit Framework empowers students not just with technical knowledge, but with durable skills and future-ready mindsets. The framework encourages learners to engage with AI, create new ideas alongside it, manage its use wisely, and design it—all while considering the benefits, risks, and ethical dilemmas AI presents.

The current draft of the AILit Framework identifies four dimensions of AI literacy. The first involves engaging with AI to generate new content, recommendations, and information. The second invites students to collaborate with these tools in creative projects and problem-solving. The third concerns intentionally deciding how AI can complement and improve our work and lives. The final dimension involves exploring the data behind AI models and making thoughtful decisions throughout the design process.

The MAIL assessment aims to evaluate the abilities students need to engage effectively, ethically, and responsibly with digital content, media platforms, and AI itself—the skills that matter most in the digital age. Much of what the MAIL assessment measures aligns with the dimensions the AILit Framework aims to develop. For instance, MAIL will ask students to critically assess both human- and AI-generated digital content, to create with AI systems, and to use AI for collaboration and for improving human work. Encouraging students to think more deeply about training data and design decisions, however, involves processes that are harder to capture in a large-scale assessment, and so falls beyond the scope of MAIL.

What will the MAIL assessment experience look like?

Each student will participate in an hour-long series of tasks that will include an AI literacy component, followed by a brief survey about their experiences and perceptions of AI. To give a concrete example, the AILit Framework expects learners to evaluate whether AI outputs should be accepted, revised, or rejected. Correspondingly, the MAIL assessment will ask students to critically assess the credibility, bias, and purpose of all kinds of digital content. In a potential task like “flag the AI,” students may be challenged to spot instances where media has been created or manipulated by AI—and, importantly, to explain why they think so. The assessment will provide valuable insights to inform policy decisions that promote the foundational competences students need to succeed in a continuously evolving digital landscape.

Complementary Approaches

Technology is advancing rapidly, but these two initiatives offer complementary approaches for preparing students to engage responsibly with AI and to navigate evolving educational and professional landscapes. To learn more about the PISA 2029 MAIL Assessment Framework, visit: PISA 2029 Media and Artificial Intelligence Literacy | OECD.

