ADR 0006: Evidence as Evaluations (Not a Global Portfolio) in v1
Date: 2025-12-04
Status: Accepted
Context
As Evalium expands beyond pure online tests, we need to support evidence:
- Files, photos, videos, documents demonstrating that a learner can perform tasks.
- Artefacts that may be reviewed, approved, and used in compliance/audit contexts.
Typical patterns in the market:
- “File upload question” inside a test/observation.
- Separate “portfolio” / “evidence library” modules (often in apprenticeship or e-portfolio systems).
We needed to decide whether Evalium should introduce a standalone evidence system in v1 (global artefacts, reusable across multiple contexts) or keep evidence strictly within the assessment lifecycle.
Decision
In v1, Evalium will treat evidence entirely within the Evaluation model:
- Inline Evidence
  - A `file_upload` item type is available in any Evaluation (Knowledge or Observation).
  - These uploads are part of the Submission and are scored/reviewed alongside other items.
- Evidence Evaluations
  - Some Evaluations will be "evidence-focused" (kind/tag = `evidence`).
  - Their payload is primarily `file_upload` items + optional reflection/rubric fields.
  - They are assigned and submitted like any other Assessment, especially as Programme components.
We will not introduce a global “evidence portfolio” or top-level EvidenceArtifact in v1 that exists independently of Evaluations.
Programmes define how these Evidence Evaluations contribute to completion and competence.
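The two mechanisms above can be sketched as a minimal domain model. All names here (`ItemType`, `Evaluation`, `isEvidenceEvaluation`, the `kind` values) are illustrative assumptions for this ADR, not Evalium's actual schema.

```typescript
// Hypothetical v1 model sketch — names are assumptions, not real Evalium code.

type ItemType = "multiple_choice" | "free_text" | "file_upload";

interface EvaluationItem {
  id: string;
  type: ItemType;
  prompt: string;
}

// An Evaluation is either a regular Knowledge/Observation assessment
// or an evidence-focused one (kind/tag = "evidence").
interface Evaluation {
  id: string;
  kind: "knowledge" | "observation" | "evidence";
  items: EvaluationItem[];
}

// Evidence Evaluations carry primarily file_upload items.
function isEvidenceEvaluation(e: Evaluation): boolean {
  return e.kind === "evidence" && e.items.some((i) => i.type === "file_upload");
}

// An evidence step in a Programme: uploads plus an optional reflection field.
const portfolioStep: Evaluation = {
  id: "eval-1",
  kind: "evidence",
  items: [
    { id: "item-1", type: "file_upload", prompt: "Upload a work sample" },
    { id: "item-2", type: "free_text", prompt: "Reflect on the task" },
  ],
};

console.log(isEvidenceEvaluation(portfolioStep)); // true
```

Note that inline evidence needs no special modelling at all: a `file_upload` item inside a `knowledge` or `observation` Evaluation is just another item.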
Options Considered
1. Global Evidence Portfolio (rejected for v1)
Introduce a top-level EvidenceArtifact entity:
- Upload evidence once.
- Link it to multiple competencies, evaluations or programmes.
- Manage separate approval flows and lifecycles.
Pros:
- Highly flexible for e-portfolio and apprenticeship-like use cases.
- Artefacts reusable across many contexts.
Cons:
- Adds a parallel lifecycle to Evaluations/Submissions.
- Overlaps with document management/e-portfolio products.
- Higher complexity for RLS, audit trails and UX (where do users go to “do work”?).
- Risks scope creep away from AMS towards general file storage/portfolio.
2. File Upload Only as Question Type (rejected)
Allow evidence only via `file_upload` questions inside standard Evaluations.
Pros:
- Simple implementation.
- Familiar AMS pattern.
Cons:
- Cannot model “evidence as a separate task” in a Programme (e.g. submit three real-world artefacts over time).
- Hard to communicate required evidence as a standalone evaluation step.
- Less flexible for asynchronous evidence collection.
3. Evidence Inside Evaluations (chosen)
Use two mechanisms:
- Inline uploads inside Knowledge/Observation Evaluations.
- Dedicated Evidence Evaluations as first-class Programme components.
Pros:
- All evidence lives inside the familiar Evaluation → Assignment → Session → Submission lifecycle.
- Programmes can treat evidence steps the same way as tests and observations.
- Keeps domain and implementation simpler while still covering key evidence scenarios.
- Leaves the door open to introduce a true `EvidenceArtifact` later if needed, using submissions as the source of truth.
Cons:
- Evidence cannot (yet) be reused across multiple Programmes/competencies without duplication.
- Some advanced portfolio/e-portfolio use cases will not be fully supported in v1.
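To make the "familiar lifecycle" argument concrete, here is a sketch of evidence riding the Evaluation → Assignment → Session → Submission chain. The type and field names are assumptions invented for illustration, not Evalium's real API.

```typescript
// Illustrative-only sketch: evidence files live on the Submission,
// never in a separate global evidence store.

interface Assignment {
  evaluationId: string;
  learnerId: string;
}

interface Session {
  assignment: Assignment;
  startedAt: string; // ISO timestamp
}

interface UploadedFile {
  itemId: string; // the file_upload item this answers
  fileUrl: string;
}

interface Submission {
  session: Session;
  uploads: UploadedFile[];
  status: "submitted";
}

// Submitting a Session bundles the uploads into an ordinary Submission.
function submit(session: Session, uploads: UploadedFile[]): Submission {
  return { session, uploads, status: "submitted" };
}

const assignment: Assignment = { evaluationId: "eval-1", learnerId: "learner-9" };
const session: Session = { assignment, startedAt: "2025-12-04T10:00:00Z" };
const submission = submit(session, [
  { itemId: "item-1", fileUrl: "https://example.test/artefact.pdf" },
]);

console.log(submission.uploads.length, submission.status); // 1 submitted
```

Because the artefact is keyed to a Session and Assignment, existing RLS and audit rules for Submissions cover evidence with no parallel lifecycle.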
Consequences
Positive:
- Clear mental model for authors and learners: evidence is always tied to a specific Evaluation.
- No extra global “evidence area” or stray files with unclear context.
- Easier to implement review and approval workflows using Submission statuses.
- Supports both simple “inline upload” and “evidence as its own evaluation step” while staying AMS-native.
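The point about review and approval via Submission statuses can be sketched as a small state machine. The status names and transition table below are hypothetical, chosen only to show that evidence approval needs no lifecycle beyond the Submission's own.

```typescript
// Hypothetical Submission review workflow — status names are assumptions.

type SubmissionStatus = "submitted" | "in_review" | "approved" | "rejected";

// Allowed transitions: a rejected submission re-enters review on resubmission.
const transitions: Record<SubmissionStatus, SubmissionStatus[]> = {
  submitted: ["in_review"],
  in_review: ["approved", "rejected"],
  approved: [],
  rejected: ["in_review"],
};

function advance(from: SubmissionStatus, to: SubmissionStatus): SubmissionStatus {
  if (!transitions[from].includes(to)) {
    throw new Error(`Illegal transition ${from} -> ${to}`);
  }
  return to;
}

let status: SubmissionStatus = "submitted";
status = advance(status, "in_review");
status = advance(status, "approved");
console.log(status); // approved
```

A reviewer approving an evidence Submission is therefore the same operation as marking any other Submission, which is exactly the simplification this ADR is after.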
Negative:
- Customers with strong e-portfolio expectations may require integrations or later enhancements.
- Reuse of the same artefact across multiple outcomes or programmes will initially require re-upload or duplication via future tooling.
Notes
- Future versions may introduce a promoted `EvidenceArtifact` concept that indexes evidence derived from Submissions, if and when customer demand justifies the additional complexity.
- Until then, all evidence-related features must be designed within Evaluations and Programmes.