Scholarly background

The research behind the 5Es

Professor Natalia Kucirkova introduced the 5Es Framework in a 2023 article and in two further publications, including a contribution to i2Insights. The 5Es Framework, as an education-specific evaluation of impact, shares its name with the well-known 5E Instructional Model; both emphasize structured, student-centered approaches to learning.

Five research reports and academic articles document the research behind the framework. EduEvidence’s Academic Advisory Board members continue to iterate on the 5Es parameters in response to feedback from researchers, practitioners, and policymakers. Our goal is to lead the field with clear impact standards against which EdTech solutions can be evaluated and compared, summarized in a single total impact score.


Efficacy and Effectiveness

The first report systematically reviewed the literature on the efficacy and effectiveness of EdTech tools and identified 65 evaluation frameworks, which were consolidated based on weight of evidence. Efficacy assesses the research foundation (logic model, theory of change, use of science in products) and experimental evidence (including randomized controlled trials). Effectiveness assesses instructional fit, usability evidence, and cost-effectiveness.

Kucirkova, N. I., Cermakova, A. L. & Vackova, P. (2024). Consolidated Benchmark for Efficacy and Effectiveness Frameworks in EdTech. University of Stavanger. https://doi.org/10.31265/usps.270

Check out the report →
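The parameter structure above lends itself to a simple checklist representation. Below is a minimal, illustrative sketch of how the Efficacy and Effectiveness criteria could be recorded and rolled up into per-dimension scores. The criteria names come from the report summary above; the scoring rule (share of criteria met) and all other names are assumptions for illustration, not EduEvidence’s actual method.

```python
# Illustrative sketch only: a hypothetical way to record the Efficacy and
# Effectiveness criteria described above and roll them up into simple scores.
# The scoring rule (share of criteria met) is an assumption, not the
# official EduEvidence scoring method.

EFFICACY_CRITERIA = [
    "logic_model",
    "theory_of_change",
    "use_of_science_in_product",
    "experimental_evidence_rct",
]

EFFECTIVENESS_CRITERIA = [
    "instructional_fit",
    "usability_evidence",
    "cost_effectiveness",
]


def dimension_score(evidence: dict[str, bool], criteria: list[str]) -> float:
    """Return the share of criteria for which evidence was found (0.0-1.0)."""
    return sum(evidence.get(name, False) for name in criteria) / len(criteria)


# Example: a hypothetical tool with a logic model, an RCT, and usability data.
tool_evidence = {
    "logic_model": True,
    "experimental_evidence_rct": True,
    "usability_evidence": True,
}

print("Efficacy:", dimension_score(tool_evidence, EFFICACY_CRITERIA))            # 0.5
print("Effectiveness:", dimension_score(tool_evidence, EFFECTIVENESS_CRITERIA))  # ~0.33
```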


Ethics

Experts in ethics reviewed key evaluation frameworks related to data, privacy, and the ethical handling of technologies with and without AI. The Ethics parameter covers privacy compliance, data processing and integration, accountability, and transparency.

Atabey, A., Robinson, C., Lindroos Cermakova, A., Siibak, A. & Kucirkova, I. N. (2024). Ethics in EdTech: Consolidating Standards for Responsible Data Handling and User-Centric Design. University of Stavanger. https://doi.org/10.31265/USPS.283

Check out the report →


Equity

The Equity report summarized literature on inclusive design, bias mitigation, and equitable implementation, with attention to contextual fit and respect for local communities and marginalized groups.

Lindroos Cermakova, A., Prado, Y. & Kucirkova, I. N. (2024). Equity in EdTech by Design. University of Stavanger. https://doi.org/10.31265/USPS.277

Check out the report →


Environment

This report summarizes key environmental considerations for EdTech tools, with attention to the connection between green and digital education.

Shengjergji, S., Luzai, A., Mills, S., Van Nostrand, P., Cermakova, A. L., & Kucirkova, N. (2024). Environmental impact of EdTech: The hidden costs of digital learning. University of Stavanger. https://doi.org/10.31265/USPS.285

Check out the report →


An international equivalence system across all major EdTech certifications

Consolidated benchmarks based on weight of evidence can create an equivalence system across multiple evaluation methods and frameworks. The EduEvidence Academic Board and WiKIT Research Group created a research-based equivalence framework, which became the international EdTech impact standard.

Here is how it works in practice:

The ESSA (Every Student Succeeds Act) standards are guidelines developed by the US government to assess the efficacy of EdTech tools. These standards are categorized into four tiers, each representing a different level of evidence: strong, moderate, promising, and “demonstrates a rationale” (the fourth and lowest tier).

The WiKIT research group established the equivalence between ESSA and the internationally consolidated benchmark. EduEvidence then mapped the consolidated benchmark to our international certification levels: Gold, Silver, and Bronze. This alignment makes it possible to evaluate and compare the evidence behind EdTech tools on a global scale.
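As an illustration only, the sketch below shows one way such an equivalence could be encoded: ESSA tiers translate to consolidated benchmark levels, which in turn map to certification levels. The specific pairings and the benchmark level names are hypothetical; the actual equivalence tables are defined by the EduEvidence Academic Board and WiKIT, not here.

```python
# Hypothetical illustration of an equivalence mapping. The actual pairings
# between ESSA tiers, the consolidated benchmark, and EduEvidence
# certification levels are defined by the EduEvidence Academic Board and
# WiKIT, not by this sketch.

# ESSA evidence tiers, from strongest (1) to weakest (4).
ESSA_TIERS = {
    1: "strong",
    2: "moderate",
    3: "promising",
    4: "demonstrates a rationale",
}

# Assumed mapping from ESSA tier to a consolidated benchmark level.
ESSA_TO_BENCHMARK = {
    1: "benchmark level A",
    2: "benchmark level B",
    3: "benchmark level C",
    4: "benchmark level C",
}

# Assumed mapping from benchmark level to certification level.
BENCHMARK_TO_CERTIFICATION = {
    "benchmark level A": "Gold",
    "benchmark level B": "Silver",
    "benchmark level C": "Bronze",
}


def certification_for_essa_tier(tier: int) -> str:
    """Translate an ESSA tier into a certification level via the benchmark."""
    benchmark = ESSA_TO_BENCHMARK[tier]
    return BENCHMARK_TO_CERTIFICATION[benchmark]


# Example: a tool with ESSA Tier 2 ("moderate") evidence.
print(certification_for_essa_tier(2))  # Silver (under these assumed pairings)
```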