Assessment & Grading
Overview
Assessment & Grading covers how authors define learning assessments — from simple multiple-choice problems (CAPA) to open-response essays (ORA), drag-and-drop exercises, proctored exams, and staff-graded assignments. It also covers the grading policy framework that determines how component grades roll up to course grades.
This is one of the platform's most complex and historically deep feature areas, with the CAPA (Computer-Assisted Problem Answering) system dating to the original MIT platform (6.002x).
Current State (2026)
• CAPA: The primary problem type framework — supports multiple-choice, checkbox, text input, numerical, custom Python-graded, and many more; authored in XML
• ORA (Open Response Assessment): Multi-step assessment with peer review, staff grading, and rubrics; major standalone system
• Proctoring: Proctored exam configuration via `edx-proctoring`; supports multiple proctoring vendors via a plugin interface
• Grading policy: Course-level grading rules (assignment types, weights, cutoffs) defined in Studio settings; computed by LMS grading engine
• Staff grading: `staff-graded-xblock` for assignments where instructors manually enter grades
• Problem authoring: Problems authored in OLX XML within Studio; advanced editors exist for some types
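To illustrate the OLX authoring format mentioned above, here is a minimal CAPA multiple-choice problem. The element names (`problem`, `multiplechoiceresponse`, `choicegroup`, `choice`) follow the documented OLX schema; the question text itself is an invented example, not taken from any real course:

```xml
<problem>
  <multiplechoiceresponse>
    <label>Which component stores learner problem state?</label>
    <choicegroup type="MultipleChoice">
      <choice correct="false">edx-submissions</choice>
      <choice correct="true">StudentModule</choice>
      <choice correct="false">codejail</choice>
    </choicegroup>
  </multiplechoiceresponse>
</problem>
```

Studio's advanced editor exposes this XML directly; the simple editors generate it behind the scenes.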
Architecture
• CAPA: XBlock-based problem rendering and grading in `edx-platform`; Python-graded problems run in `codejail` sandbox; submissions stored in `StudentModule`
• ORA: `edx-ora2` is a Django app + XBlock; uses `edx-submissions` for submission storage; separate peer review workflow
• Grading engine: Subsection-level grade computation in `edx-platform`; grades stored per-learner, per-subsection; recomputed on policy changes
• Proctoring: `edx-proctoring` provides a provider abstraction; third-party vendors (Proctorio, Honorlock) integrate via the proctoring API
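The grading rollup described above (assignment-type weights plus grade cutoffs) can be sketched as follows. This is a minimal illustrative model, not the actual `edx-platform` grading engine: the policy shape mirrors the `GRADER`/`GRADE_CUTOFFS` structure of a course's grading policy, but the function names and rollup code here are hypothetical:

```python
# Illustrative sketch of grading-policy rollup; not the edx-platform implementation.
POLICY = {
    "GRADER": [
        {"type": "Homework", "weight": 0.4},
        {"type": "Exam", "weight": 0.6},
    ],
    "GRADE_CUTOFFS": {"A": 0.85, "B": 0.70, "Pass": 0.50},
}

def assignment_type_average(scores):
    """Average the earned/possible fractions for one assignment type."""
    if not scores:
        return 0.0
    return sum(earned / possible for earned, possible in scores) / len(scores)

def course_grade(policy, scores_by_type):
    """Weighted sum of per-type averages, mapped to a letter via cutoffs."""
    percent = sum(
        grader["weight"] * assignment_type_average(scores_by_type.get(grader["type"], []))
        for grader in policy["GRADER"]
    )
    # The highest cutoff the percent clears wins; None means failing.
    letter = None
    for name, cutoff in sorted(policy["GRADE_CUTOFFS"].items(),
                               key=lambda kv: kv[1], reverse=True):
        if percent >= cutoff:
            letter = name
            break
    return percent, letter

scores = {
    "Homework": [(8, 10), (9, 10)],   # averages to 0.85
    "Exam": [(75, 100)],              # 0.75
}
percent, letter = course_grade(POLICY, scores)
# 0.4 * 0.85 + 0.6 * 0.75 = 0.79, which falls in the "B" band
```

A model like this also shows why grades must be recomputed when the policy changes: the weights and cutoffs are inputs to the rollup, so editing them in Studio invalidates every stored course grade.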
History
Origin
• Year introduced: 2012 on edX (CAPA itself predates Open edX; it originates from MIT's 6.002x course infrastructure)
• Initial implementation: CAPA (Computer-Assisted Problem Answering) system from MIT; ported from the original MIT course infrastructure
• Context: CAPA was MIT's internal problem system; when edX was built, it incorporated CAPA as the foundation for graded problems
Key Milestones
CAPA system incorporated from MIT
ORA (Open Response Assessment) launched
Proctored exams introduced
ORA 2.0 with peer review improvements
ORA micro-frontend (MFE) development
Open Questions
- What was the original CAPA system from MIT and how much was preserved vs. rewritten?
- When was ORA first introduced and what was the academic motivation for peer review?
- How does the Python-graded problem type work with codejail?
- What proctoring vendors have been integrated historically?
- What are the limitations of the CAPA XML authoring format that authors find most painful?
- How is the grading engine architected — what happens when a grading policy changes?