Environmental Responsibility

Built for purpose. Not for scale.

Eduface was designed for academic assessment — not to be the biggest model, but to be the most efficient one for the task.

93% smaller than frontier models (120B vs ~1.8T parameters)

6B active parameters per query (vs 37B+ for MoE systems)

80–95% less energy per inference (vs general-purpose AI)

5% of parameters active at once (sparse activation)

The Problem

General-purpose AI has a hidden cost

The largest AI models contain up to 1.8 trillion parameters. Routing every student submission through these systems carries a significant environmental footprint.

Research shows the most energy-intensive models consume
33+ Wh per long prompt, over 70× more than smaller, purpose-built alternatives.

GPT-4o estimated annual energy: 390,000–460,000 MWh/year, enough to power ~35,000 US homes.

Energy per long prompt (frontier model): 33+ Wh, roughly 70× higher than purpose-built alternatives.
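A back-of-envelope calculation using the figures cited above (33 Wh per long prompt, roughly 70× less for purpose-built models) shows what the gap means at scale. The submission count is a hypothetical illustration, not an Eduface statistic:

```python
# Illustrative comparison using the figures cited above.
FRONTIER_WH_PER_PROMPT = 33.0
EFFICIENCY_FACTOR = 70.0  # frontier vs purpose-built, per the research cited
purpose_built_wh = FRONTIER_WH_PER_PROMPT / EFFICIENCY_FACTOR  # ~0.47 Wh

submissions = 100_000  # hypothetical university-scale annual workload

frontier_kwh = FRONTIER_WH_PER_PROMPT * submissions / 1000  # 3,300 kWh
purpose_kwh = purpose_built_wh * submissions / 1000         # ~47 kWh

print(f"Frontier model:      {frontier_kwh:,.0f} kWh/year")
print(f"Purpose-built model: {purpose_kwh:,.0f} kWh/year")
```

At this hypothetical volume the frontier route consumes megawatt-hour-scale energy where a purpose-built model stays in the tens of kilowatt-hours.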

Model Architecture

A model built for one thing: assessment

Eduface Model

120B total parameters · 6B active per query · ~5% activation ratio

Sparse MoE architecture routes each task to only the relevant parameters: no wasted compute.

Frontier Models (est.)

~1.8T total parameters (GPT-4, reported) · 37B+ active per query · ~2% activation ratio · 15–20× more compute per query

Designed for breadth across all knowledge domains, most of which are irrelevant to academic assessment.

Mixtral 8×7B activates 13B / 47B (28%): 3–5× more than Eduface

DeepSeek activates 37B / 671B (5.5%): 6× more active parameters per query than Eduface

Eduface activates 6B / 120B (5%): the domain-optimised minimum
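The ratios above follow directly from dividing active by total parameters. A quick check, with figures taken from the comparison above:

```python
# Activation ratio = active parameters per query / total parameters.
# Mixtral and DeepSeek figures are published; Eduface figures reflect
# the architecture described on this page.
models = {
    "Mixtral 8x7B": (13e9, 47e9),
    "DeepSeek":     (37e9, 671e9),
    "Eduface":      (6e9, 120e9),
}

for name, (active, total) in models.items():
    print(f"{name:12s} {active / total:.1%} of parameters active per query")
```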

How It Works

Sparse activation: only what is needed

Step 01

Submission arrives

A student's assessment is received by the Eduface platform.

Step 02

Router activates experts

Our MoE architecture routes the task to only the 6B relevant parameters.

Step 03

Minimal output

Feedback is generated using a fraction of the energy a general-purpose model would consume.
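The three steps above describe standard sparse mixture-of-experts routing. A minimal sketch of the generic top-k gating pattern (after Shazeer et al., 2017, cited below) — this illustrates the technique, not Eduface's actual implementation, and all names and sizes are hypothetical:

```python
import numpy as np

def route_top_k(token_embedding, gate_weights, k=2):
    """Generic top-k MoE gating: score every expert, keep only the top k.
    All parameters outside the chosen experts stay idle for this token."""
    scores = gate_weights @ token_embedding      # one score per expert
    top_k = np.argsort(scores)[-k:]              # indices of chosen experts
    # Softmax over the selected scores only, to weight the experts' outputs.
    weights = np.exp(scores[top_k] - scores[top_k].max())
    weights /= weights.sum()
    return top_k, weights

rng = np.random.default_rng(0)
num_experts, dim = 16, 64                        # hypothetical sizes
gate = rng.standard_normal((num_experts, dim))   # learned router, in practice
token = rng.standard_normal(dim)

experts, weights = route_top_k(token, gate, k=2)
# Only 2 of 16 experts run for this token; the other 14 cost nothing.
```

Energy savings come from the same mechanism: compute scales with the parameters that actually run, not with the total parameter count.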

For Universities

Sustainability is now a procurement decision

Routing tens of thousands of student submissions through frontier AI carries a cumulative energy cost that is not trivial. Choosing Eduface means choosing infrastructure engineered to be sustainable from day one.

"Universities that invest in AI should invest in AI that was built for them. Domain-specific models are not only more accurate — they are significantly more sustainable."

Eduface

References & Sources

Fedus et al. (2022) Switch Transformers, JMLR · Shazeer et al. (2017) Sparsely-Gated MoE, ICLR · Jiang et al. (2024) Mixtral of Experts, arXiv:2401.04088 · DeepSeek AI (2024) arXiv:2412.19437 · Caravaca et al. (2025) arXiv:2511.05597 · You et al. (2024) Scientific Reports 14, 26310 · Dettmers et al. (2024) D2DMoE, NeurIPS.

GPT-4 parameter count unconfirmed by OpenAI; figures reflect reported industry estimates. Eduface figures (120B total, 6B active) reflect our own architecture.