🧭 Programmes & Certifications Roadmap

Owner: Product Engineering
Status: Draft
Pillar: 5 — Programmes & Certifications
Related: FOUNDATION.md, architecture, assignments-roadmap.md, roles-and-access-control.md, CHANGELOG.md


0. Purpose & Scope

This roadmap defines the Programmes (Certifications / Learning Paths) layer that sits above Assignments.

It formalises how Evalium will:

  • Orchestrate multiple evaluations into a single credential.
  • Persist requirement-level progress for performance and auditability.
  • Handle versioning (pinned vs latest) for fairness and reproducibility.
  • Support expiry and recertification loops.
  • Maintain strict RLS and tenant isolation even when aggregating across assignments and submissions.

Programmes are long-lived credentials, not just “large assignments”.


1. Conceptual Model

1.1 Hierarchy

  1. Programme
    High-level credential definition, e.g.
    “Certified Safety Inspector 2025”.

  2. Programme Requirements
    Each requirement represents one component, usually mapped to a single evaluation (Knowledge, Observation, or Evidence evaluation), e.g.:

    • “Safety Theory” – minimum score 80%
    • “Hazard Perception Observation” – minimum score 70%
    • “Evidence: Upload Incident Investigation Report” – pass/fail
  3. Assignments & Submissions
    Delivery engine remains unchanged:

    • Programme enrolment drives creation of one or more assignments.
    • Submissions record attempts per evaluation/version.
  4. Programme Progress
    Requirement-level ledger recording, for each requirement:

    • Whether it is pending, met, or failed.
    • Which submission satisfied it.
    • When it was completed.
  5. Certificates
    Once all mandatory requirements are satisfied:

    • A certificate record is issued.
    • It has an expires_at date.
    • It exposes a public verification token.
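The hierarchy above can be sketched as a small set of records. This is a minimal illustration, not the implementation: field names follow the schema sections later in this document, but the classes themselves (and `validity_period_days` as a day count) are assumptions for the sketch.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class Programme:
    # High-level credential definition, e.g. "Certified Safety Inspector 2025".
    id: str
    tenant_id: str
    name: str
    validity_period_days: Optional[int] = None  # drives expiry / recertification

@dataclass
class ProgrammeRequirement:
    # One component of the credential, usually mapped to a single evaluation.
    id: str
    program_id: str
    evaluation_id: str
    min_score: Optional[int] = None  # None => pass/fail or non-scored
    is_mandatory: bool = True

@dataclass
class ProgrammeProgress:
    # Requirement-level ledger row: pending / met / failed, plus provenance.
    program_enrolment_id: str
    program_requirement_id: str
    status: str = "pending"
    submission_id: Optional[str] = None
    completed_at: Optional[datetime] = None
```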

1.2 Key Principles

  • Programmes do not replace Assignments; they orchestrate them.
  • Requirement satisfaction is persisted, not recomputed on demand.
  • Enrolments are pinned to a snapshot of programme requirements.
  • Recertification is driven by a validity period on the programme.
  • All entities are tenant / org-scoped and enforced via RLS.

2. Design Decisions from Feedback

2.1 Requirement State vs Computed State

Decision: Introduce program_progress as the canonical ledger of requirement state.

  • Dashboards, admin views, and candidate progress bars query program_progress, not raw submissions.
  • Audit trails can answer:
    “Which submission satisfied which requirement, and when?”

2.2 Versioning – “Pinned vs Latest”

Decision: Programme enrolments behave like snapshots.

  • When a user enrols:
    • A programme version snapshot is stored on the enrolment.
    • Requirements for that enrolment are frozen (linked to specific evaluation IDs and any rules at that time).
  • Existing enrolments remain pinned to their snapshot, even if:
    • New evaluation versions are published.
    • The programme definition is updated.
  • New enrolments use the latest published programme definition and underlying evaluation versions.
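The pinned-vs-latest behaviour can be sketched as a deep copy taken at enrolment time. This is illustrative only (the real service persists the snapshot via TxManager; `enrol` and the dict shapes are assumptions):

```python
import copy

def enrol(user_id, programme):
    """Create an enrolment pinned to a snapshot of the current requirements.

    `programme` is a dict with a "requirements" list; the snapshot is a
    deep copy, so later edits to the programme definition cannot leak into
    existing enrolments.
    """
    return {
        "user_id": user_id,
        "program_id": programme["id"],
        "program_snapshot": copy.deepcopy(programme["requirements"]),
    }

programme = {"id": "p1", "requirements": [{"evaluation_id": "e1", "min_score": 80}]}
enrolment = enrol("u1", programme)

# The programme definition changes after enrolment...
programme["requirements"][0]["min_score"] = 90

# ...but the existing enrolment stays pinned to its snapshot.
assert enrolment["program_snapshot"][0]["min_score"] == 80
```

New enrolments, by contrast, would simply snapshot the definition as it stands at that later time.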

2.3 Recertification & Expiry Loop

Decision: Recertification is modelled as a new enrolment triggered by expiry rules.

  • Programmes define a validity_period (e.g. 1 year).
  • Certificates store certified_at and expires_at.
  • A scheduled job:
    • Detects certificates approaching expiry.
    • Creates a renewal enrolment for the same programme.
    • Uses the current programme definition at the time of renewal.

2.4 RLS & Isolation

Decisions and invariants:

  • Programmes, requirements, enrolments, progress, and certificates are all tenant-scoped and typically org-unit-scoped.
  • A programme in Organisation A:
    • Can only reference evaluations within Organisation A.
    • Can only be assigned to users visible within that tenant/org context.
  • Cross-entity joins (Programmes → Requirements → Assignments → Submissions) must never bypass RLS; all access flows through TxManager under an authenticated user context.

3. Phased Roadmap

We treat Programmes as a distinct pillar above Assignments, implemented in four phases.


Phase 1 — Programme Definition (Schema & Invariants)

Goal: Define the canonical storage for programmes and their requirements, including validity rules.

3.1 Schema

Tables (core):

  • programs

    • id (UUID, PK)
    • tenant_id, org_unit_id
    • name
    • description
    • badge_image_url (for certificate visuals)
    • validity_period (INTERVAL, optional; e.g. 1 year)
    • is_active (BOOLEAN)
    • created_at, created_by
    • updated_at, updated_by
  • program_requirements

    • id (UUID, PK)
    • tenant_id, org_unit_id
    • program_id (FK → programs.id)
    • evaluation_id (FK → evaluation container, not specific version)
    • min_score (INT; nullable when pass/fail or non-scored)
    • is_mandatory (BOOLEAN; default true)
    • sort_order (INT)
    • metadata (JSONB; for future requirement types)

Note: Requirement “type” (Knowledge / Observation / Evidence) is currently implied by the linked evaluation’s type. In future we may support non-evaluation requirement types (e.g. “upload external certificate”), which will be modelled via metadata.

  • Optional / future-ready: program_versions
    • Provides a formal place to store the “programme version snapshot” if later needed beyond the enrolment itself.

3.2 Invariants

  • A programme:
    • Must belong to a single tenant_id.
    • May be scoped to an org_unit_id; RLS enforces visibility.
  • A requirement:
    • Must reference an evaluation that belongs to the same tenant (and compatible org scope).
  • A programme cannot be deleted once enrolments exist; it can only be archived/disabled.

3.3 Outputs

  • SQL migrations for programs and program_requirements.
  • RLS policies ensuring:
    • Row visibility is tenant/org-restricted.
    • Only permitted roles can create / update programme definitions.
  • Service layer definitions:
    • ProgrammeDefinitionService interface for create / update / list operations.

Phase 2 — Tracker: Enrolment & Progress (State Ledger)

Goal: Persist per-learner progress for each programme requirement, including which submission satisfied it.

4.1 Schema

  • program_enrolments

    • id (UUID, PK)
    • tenant_id, org_unit_id
    • user_id
    • program_id
    • status (in_progress, completed, certified, expired, cancelled)
    • program_snapshot (JSONB)
      • Frozen copy of requirements and key rules at enrolment time.
    • started_at
    • completed_at (when all requirements satisfied, before certificate issuance)
    • certified_at (when certificate issued)
    • expires_at (derived from validity_period, if present)
    • renewal_of_enrolment_id (nullable; FK to previous enrolment for recertification chains)
    • metadata (JSONB)
  • program_progress

    • id (UUID, PK)
    • tenant_id, org_unit_id
    • program_enrolment_id (FK → program_enrolments.id)
    • program_requirement_id (FK → program_requirements.id as defined in the snapshot)
    • status (pending, met, failed)
    • submission_id (FK → submissions.id; nullable until satisfied)
    • attempt_count (INT; optional, for analytics)
    • completed_at (TIMESTAMPTZ; when status first became met or failed)
    • last_updated_at (TIMESTAMPTZ)

4.2 Invariants

  • On creation of a program_enrolment:
    • The system initialises program_progress rows for each requirement in the snapshot, all with status pending.
  • A program_progress row:
    • Must never change program_enrolment_id or program_requirement_id after creation.
  • Status transitions:
    • pending → met (when a submission satisfies the requirement).
    • pending → failed (if business rules require explicit failure state).
    • met should generally be monotonic (do not revert to pending or failed unless explicitly designed).
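The transition rules above can be expressed as an explicit allow-list, which makes `met` monotonic by construction. A sketch under the assumption that progress rows are dicts (the real rows live in `program_progress`):

```python
# Allowed transitions for a program_progress row; anything not listed
# (including met -> pending) is rejected, so "met" is monotonic.
ALLOWED = {
    ("pending", "met"),
    ("pending", "failed"),
}

def transition(row, new_status, submission_id=None, now=None):
    """Apply a status transition to a program_progress row (dict sketch),
    raising on any transition outside the allow-list."""
    if (row["status"], new_status) not in ALLOWED:
        raise ValueError(f"illegal transition {row['status']} -> {new_status}")
    row["status"] = new_status
    row["submission_id"] = submission_id
    row["completed_at"] = now
    return row
```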

4.3 Outputs

  • Migrations for program_enrolments and program_progress.
  • RLS policies with strict tenant/org enforcement.
  • Service methods:
    • Create enrolment (with snapshot initialisation).
    • Fetch enrolment and progress summaries.
  • Basic internal queries for:
    • “Programme completion percentage per enrolment.”
    • “List requirements and their states for a given enrolment.”

Phase 3 — Orchestrator: Assignment & Submission Logic

Goal: Connect Programmes to the existing delivery engine without polluting the Assignment model.

5.1 Auto-Assignment on Enrolment

When a program_enrolment is created:

  • The Programme Orchestrator:
    • Inspects the snapshot requirements.
    • For each requirement with status = pending:
      • Uses AssignmentService to create one assignment linked to:
        • user_id
        • The correct evaluation (and version, per snapshot rules).
        • The programme context (program_enrolment_id, program_requirement_id) in metadata.

Notes:

  • Constraints (time limits, attempt limits) can either:
    • Inherit from default evaluation/assignment templates, or
    • Be overridden by programme-level settings (future extension).

Idempotency invariant:
For a given (program_enrolment_id, program_requirement_id, user_id) tuple, the orchestrator must not create duplicate assignments. Re-running the enrolment initialisation logic should detect existing assignments and avoid creating additional ones for the same requirement.
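The idempotency invariant can be sketched as a check-before-create loop keyed on that tuple. `existing_assignments` and `create_assignment` are stand-ins (in practice the check is a query or unique constraint, and creation goes through AssignmentService):

```python
def ensure_assignments(enrolment, existing_assignments, create_assignment):
    """Create one assignment per snapshot requirement, skipping any that exist.

    `existing_assignments` is the set of (enrolment_id, requirement_id,
    user_id) tuples already created. Re-running this function is safe: it
    detects existing assignments and creates no duplicates.
    """
    created = []
    for req in enrolment["program_snapshot"]:
        key = (enrolment["id"], req["id"], enrolment["user_id"])
        if key in existing_assignments:
            continue  # idempotency: assignment already exists for this tuple
        create_assignment(key)
        existing_assignments.add(key)
        created.append(key)
    return created
```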

5.2 Listener on Submission Events

When a submission is created:

  • Listener checks whether the submission:
    • Belongs to an assignment linked to a programme enrolment.
  • If yes:
    • Evaluates the result against the requirement rules (e.g. score ≥ min_score).
    • Updates the corresponding program_progress row:
      • Set status = met when satisfied.
      • Set or increment attempt_count.
      • Attach submission_id and completed_at when first satisfied.
  • After updating requirement state:
    • The orchestrator checks if all mandatory requirements have status = met.
    • If so, it marks the enrolment as completed and triggers certificate issuance.
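The listener logic above can be sketched as a single handler. This is an illustrative shape only (dicts stand in for rows; requirement rules come from the enrolment snapshot, and the real handler runs inside the existing submission event flow):

```python
def on_submission_created(submission, enrolment, progress_rows):
    """Evaluate a submission against its requirement and update the ledger.

    `progress_rows` maps requirement id -> program_progress dict. Returns
    True when all mandatory requirements are met, i.e. the enrolment should
    move to `completed` and trigger certificate issuance.
    """
    req = next(r for r in enrolment["program_snapshot"]
               if r["id"] == submission["requirement_id"])
    row = progress_rows[req["id"]]
    row["attempt_count"] = row.get("attempt_count", 0) + 1
    passed = req["min_score"] is None or submission["score"] >= req["min_score"]
    if passed and row["status"] == "pending":
        row["status"] = "met"            # first passing attempt satisfies it
        row["submission_id"] = submission["id"]
    return all(progress_rows[r["id"]]["status"] == "met"
               for r in enrolment["program_snapshot"] if r["is_mandatory"])
```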

Interaction with Result Remediation

Programme requirement evaluation must always use the latest score for a submission, including changes made by Correction Batches (remediation). The authoritative source for a submission’s score/outcome is submission_score_versions + submissions.latest_score_version.

Near-term contract:
Phase 3 will update program_progress based on the score at the time of submission creation. A later phase will extend this by reacting to remediation events (e.g. when a Correction Batch changes a score) and updating program_progress and enrolment status accordingly, without mutating snapshots or answers.

5.3 Handling Retakes

  • Assignments and submissions remain the system of record for attempts.
  • Programme logic determines:
    • Whether additional assignments are created (retake flows).
    • Whether multiple attempts can contribute to a single requirement.
  • Default rule (can be refined later):
    • Requirement is satisfied on first passing attempt; later attempts do not downgrade it.

5.4 Outputs

  • Service layer:
    • ProgrammeOrchestrator with methods:
      • OnEnrolmentCreated
      • OnSubmissionCreated
  • Wiring into existing event flows:
    • Hooked into submission creation path already used for scoring and telemetry.
  • Tests:
    • Enrolment creates assignments.
    • Passing submissions mark requirements as met.
    • All requirements met → enrolment moves to completed.

Phase 4 — Outcome: Certificates & Recertification

Goal: Represent certificates as first-class records and support public verification and renewal.

6.1 Certificates Schema

  • certificates
    • id (UUID, PK)
    • tenant_id, org_unit_id
    • program_enrolment_id (FK)
    • user_id
    • program_id
    • status (active, expired, revoked)
    • issued_at (TIMESTAMPTZ)
    • expires_at (TIMESTAMPTZ; derived from programme’s validity_period, if present)
    • public_token (TEXT; unique, random, used for verification URL)
    • metadata (JSONB; e.g. badge details, serial numbers)

Status: Implemented core issuance in code (certificates table, unique per enrolment, issuance on enrolment completion, public token, expiry based on programme validity period). Verification endpoint live at /api/v1/certs/verify/{token}.

6.2 Issuance Flow

  • When a program_enrolment transitions to completed:
    • Certificate service:
      • Creates a certificates row.
      • Sets issued_at and expires_at (if validity_period is present).
      • Marks enrolment as certified.
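The issuance step above can be sketched as follows (illustrative: the real service writes the row transactionally and marks the enrolment `certified`; the function shape and day-based validity period are assumptions):

```python
from datetime import datetime, timedelta
import secrets

def issue_certificate(enrolment, validity_period_days=None, now=None):
    """Create a certificate record when an enrolment completes.

    Sets issued_at, derives expires_at from the programme's validity period
    (if any), and generates a random public verification token.
    """
    now = now or datetime.utcnow()
    expires_at = (now + timedelta(days=validity_period_days)
                  if validity_period_days else None)
    return {
        "program_enrolment_id": enrolment["id"],
        "status": "active",
        "issued_at": now,
        "expires_at": expires_at,
        "public_token": secrets.token_urlsafe(32),  # used in the verify URL
    }
```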

6.3 Verification Endpoint

  • Public endpoint (no auth required):

    • GET /certs/verify/{public_token}
  • Behaviour:

    • Looks up certificate by public_token under appropriate RLS restrictions.
    • Returns a minimal, non-sensitive view:
      • Programme name.
      • Candidate display name (subject to tenant privacy rules).
      • Issued / expiry dates.
      • Status (valid / expired / revoked).
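The minimal view returned by the endpoint can be sketched like this. Field names are illustrative, and the valid/expired/revoked derivation from stored `status` plus `expires_at` is an assumption about how the handler composes its response:

```python
from datetime import datetime

def verify_view(certificate, now=None):
    """Build the minimal, non-sensitive payload for public verification.

    Nothing tenant-internal is exposed: just programme name, candidate
    display name, dates, and a derived validity state.
    """
    now = now or datetime.utcnow()
    if certificate["status"] == "revoked":
        state = "revoked"
    elif certificate["expires_at"] and certificate["expires_at"] < now:
        state = "expired"
    else:
        state = "valid"
    return {
        "programme_name": certificate["programme_name"],
        "candidate_name": certificate["candidate_display_name"],
        "issued_at": certificate["issued_at"].isoformat(),
        "expires_at": (certificate["expires_at"].isoformat()
                       if certificate["expires_at"] else None),
        "status": state,
    }
```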

6.4 Recertification Flow

  • A scheduled job:
    • Identifies certificates approaching expires_at (e.g. within 30 days).
    • Triggers creation of a new programme enrolment for the same user and programme:
      • Links renewal_of_enrolment_id to the previous enrolment.
      • Uses the current programme definition (and evaluation versions).
  • Certificates are never “edited” to extend validity; renewal is always:
    • A new enrolment,
    • A new set of assignments/submissions,
    • A new certificate.
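The selection step of the scheduled job can be sketched as a window query over active certificates (illustrative: `find_renewals_due` and the 30-day default are assumptions; each selected certificate would trigger a new enrolment linked via renewal_of_enrolment_id):

```python
from datetime import datetime, timedelta

def find_renewals_due(certificates, window_days=30, now=None):
    """Select active certificates expiring within the renewal window.

    Certificates themselves are never edited to extend validity; each hit
    here should result in a fresh enrolment, assignments, and certificate.
    """
    now = now or datetime.utcnow()
    cutoff = now + timedelta(days=window_days)
    return [c for c in certificates
            if c["status"] == "active"
            and c["expires_at"] is not None
            and now <= c["expires_at"] <= cutoff]
```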

6.5 Outputs

  • Certificate creation logic wired into Programme Orchestrator.
  • Public verification handler.
  • Background job spec for recertification initiation.

7. Non-Goals (for this Pillar)

  • Designing non-test requirement types (“watch video”, “upload ID”) in detail.
    • The schema (metadata on requirements, program_progress) is intended to be flexible enough to support these later.
  • Advanced analytics dashboards across programmes (e.g. cross-programme heatmaps).
    • Basic programme-level progress queries are in scope; full BI experiences are covered under reporting/analytics roadmaps.
  • Cross-tenant or cross-organisation sharing of programmes (marketplace).
    • Current scope assumes programmes are internal to a single tenant.

8. Open Questions to Capture Explicitly

These are design decisions to be recorded (e.g. in ADRs) as the implementation progresses:

  1. Retake semantics

    • Do requirements ever downgrade from met?
    • Do multiple passing attempts have any additional meaning?
  2. Programme versions vs enrolment snapshots

    • Is program_snapshot on enrolment sufficient as the long-term record of “which requirements applied”, or do we need a formal program_versions table to track changes to program_requirements over time (especially if requirements are edited while existing enrolments are in progress)?
  3. Assignment constraints in a programme context

    • Do programmes override attempt limits and time windows, or always defer to evaluation/assignment defaults?
  4. Candidate UX ordering

    • On the candidate dashboard, how are standalone assignments vs programme-linked assignments presented and grouped?

These questions do not block the schema or orchestrator, but they influence UX and some service-level policies and should be captured in future ADRs.