The Theory and Practice of Upper Secondary Certification

5. Responsibilities for assessment

Abstract

The responsibilities for assessment setting and marking can shape the experiences students have of assessment and may influence the grades they receive. This chapter discusses the three broad categories for the location of assessment setting and marking responsibilities: locally; externally; and at an intermediate level or with a mixed combination of local/external actors. The chapter discusses the implications for assessment validity and reliability of different arrangements for marking and setting responsibility. It provides examples of different responsibility arrangements across countries.

Assessment responsibilities across upper secondary certificates

There are two key questions that capture where responsibilities for assessments lie (discussed further in Chapter 2):
Who sets the assessment i.e. who writes the questions and tasks?
Who marks and judges the student’s assessment responses or evidence?
For upper secondary certificates, assessments are typically set and assessed either by the student’s own teacher/school or externally, i.e. by a central agency – see Figure 5.1. Of the 71 certificates analysed by this report, most include at least one component that is set (53 out of 71) and/or assessed (56 out of 71) by the student’s own teacher/school. Most certificates also include at least one assessment component which is set (52 out of 71) and/or assessed (41 out of 71) externally. For each upper secondary certificate analysed, individual assessment components were identified collaboratively by the OECD Secretariat and country representatives. The term ‘assessment components’ refers to the different assessments or types of assessments students experience, with each component labelled in the terms familiar to students and teachers in that country. For example, one assessment component might be ‘final exams’ and another might be ‘teacher-based assessment’.
Figure 5.1. Assessment responsibilities across upper secondary certificates
Number of certificates by who sets and marks assessment components
Note: Most certificates include multiple components. For example, a certificate may include some components that are set by the teacher/school and some components that are set externally.
Source: Transitions in Upper Secondary Education Certificate and Assessment mapping exercise, data collected across 2024 and 2025.
Teacher responsibilities for setting and marking assessments
Assessments set by teachers
Across the certificates analysed by this report, more than half (53 out of 71 certificates) include tasks where responsibility for assessment-setting is devolved to the student’s own teacher or school (Figure 5.1). Annex 5.A. sets out all the assessment components within upper secondary certificates and countries that are set by the school/teacher. In some systems, devolving upper secondary assessments reflects an underlying culture of high trust in teachers’ professionalism, with a high value placed on local and individual responsibility. This has historically been the case in Sweden, where teachers determine students’ overall grades for upper secondary certification, and in Norway, where teachers’ grades make up 80% of the overall student grade for upper secondary certification.
In other systems, responsibility for setting assessments may be devolved to the teacher or school because schools or teachers also have responsibility for the curriculum, so assessment must be set locally by those with knowledge of what is being taught. For example, in British Columbia (Canada), while there is a state level curriculum, Boards of Education can develop their own courses locally for students to complete the requirements of the upper secondary certificate – the Dogwood Diploma.
Validity and reliability
Teacher-set assessments are typically those where the tasks, timing and conditions are unspecified at a national level, giving teachers responsibility for determining these factors when they set the assessment. Internal assessment has the potential to provide flexibility and adaptability: with this responsibility devolved to teachers, there is a wide range of assessment formats that teachers may potentially use. The flexibility of teacher-set assessments may also be used to enhance integration with teaching and learning, providing formative value for students, who can use their results and teacher feedback to deepen and strengthen their learning over the year (McDowell, Sambell and Davison, 2009[1]). Aspects of the assessment which may be variable or adaptable include:
Content: For example, schools may be able to choose which literature or history topics they teach and adapt the assessments to suit.
Context: Schools may be able to choose how students are assessed. For example, assessing students’ knife care and use skills within a cookery vocational programme could take place during a regularly scheduled lesson, during a capstone event put on for parents and guests or during a student’s work placement.
Conditions: Teachers may be able to determine if students have access to external resources like the internet or AI or if students experience more closed and controlled conditions.
To support reliability, teacher-set assessments typically aim to assess skills and knowledge at consistent levels of competence and performance across schools. National learning standards based on the curriculum are often used to support this.
Associated challenges
Even in systems where there is a long history of teacher-assessed grades and this is the dominant mode of assessment for upper secondary certification, there can be issues, or perceived issues, around the reliability and credibility of the assessment (as discussed in Chapter 2 in Sweden). In systems where upper secondary certificates are entirely based on teacher-set assessments, standardised tests for entry to tertiary education are typically used to address concerns about reliability and overall credibility. This is the case in the United States, where many states do not require students to take state-level assessments for high school graduation and where alternative standardised tests such as the SAT are often used for higher education entrance. Other systems that are also entirely – or almost entirely – teacher-assessed and have separate examinations for higher education entrance are Greece, Finland, Japan, Korea, Poland, Spain and Türkiye – see Box 5.1. Even in systems with a strong culture of trust in teacher professionalism, such as Norway, teacher assessment is accompanied by an externally set, standardised examination for upper secondary certification.
Some of the reliability challenges of teacher-set assessment relate to the consistency and quality of the marking and judging, but there are also tensions in ensuring that teacher-set assessments align with agreed national standards or curricular outcomes. Developing valid assessments for the breadth of curricular outcomes requires significant assessment literacy from teachers. Even where teachers achieve this, it is rarely manageable to have sufficient external control mechanisms providing the transparency and monitoring needed to ensure teacher-set assessments are valid across all schools and classrooms, which contributes to concerns about their validity and alignment with agreed national standards.
Furthermore, even though teachers may potentially use a wide range of assessment methods and have flexibility to determine when and how assessment takes place, this potential is not always realised. In several systems where assessment-setting and marking is entirely devolved to schools, such as Japan, Korea and Türkiye, the actual curriculum in the classroom still routinely focuses on test-preparation (aligned to the higher education entrance exams) and students still experience a high proportion of tests (OECD, 2018[2]; Kwon, Lee and Shin, 2017[3]; Kitchen et al., 2019[4]).
Box 5.1. In Poland, in addition to the school-leaving certificate, students sit the Matura to access higher education
In Poland, schools and teachers are responsible for both setting and marking the grades on which upper secondary certificates are solely based. In line with requirements specified by the school, students – in both general and vocational education – need, as a general rule, a passing grade in all core curriculum subjects at the basic level, a passing grade in at least one (for students in technikum) or two (for students in liceum) elective subjects at the extended level, and to have attended at least half of classes. The school-leaving certificate is entirely based on students’ annual grades, and almost all students (typically around 95% of the cohort) achieve it at the end of school.
To access higher education, students need to complete the nationally-set Matura exam. In 2024, around 88% of liceum (general upper secondary) graduates and approximately 83% of technikum (vocational upper secondary) graduates took the Matura exam. The Matura is based entirely on externally developed and marked exams. Students need to pass exams in Polish language (written and oral), a modern foreign language (written and oral) and mathematics (written). At least one additional subject must be taken at an extended (i.e. advanced) level, with most students taking two.
This approach created challenges for students from vocational upper secondary education in accessing higher education, since they had to perform in the same subjects and at the same level as students who had completed general education. This is also the situation in several other countries, notably Lithuania, where students sit the same exams despite students in vocational education and training (VET) having less learning time devoted to general content. In Lithuania, only around 2% of vocational upper secondary school graduates typically access higher education each year (OECD, 2023[5]). In Poland, a change was introduced in 2018 to facilitate vocational students’ access to higher education. Now, for the Matura, students in vocational education can replace the additional extended subject with a vocational qualification or diploma at the technician level. Higher education institutions providing “higher VET” courses at ISCED 5 have generally been positive about allowing technikum graduates to replace an extended general subject with a vocational qualification, since these students bring strong occupational experience and skills.
Source: Ministerstwo Edukacji Narodowej [Ministry of National Education] (n.d.[6]), Egzamin maturalny [Matura exam], https://www.gov.pl/web/edukacja/egzamin-maturalny2; Centralna Komisja Egzaminacyjna [Central Exam Commission] (2024[7]), Preliminary information on the results of the matriculation exam conducted in the main term (in May).
Assessments marked by teachers
Across the certificates analysed by this report, marking/judging responsibilities for assessments often mirror arrangements for setting the assessment. Often, this is for practical reasons: when the teacher sets the assessment, it is seen as more practical for them to decide on the mark scheme and to apply it. Teacher marking of student assessment responses may be used when the assessment task is broad and open-ended, for example, a project to assess research, analysis and problem-solving skills. Such tasks provide scope for student agency and individualisation when external marking would be unreliable (if external markers are not familiar with the content of student-selected project topics, resulting in a focus on process over final output quality) or unmanageable, because each task would need to be matched with a suitably knowledgeable external marker. Assigning marking to the student’s own teacher can help to address these challenges, since teachers can observe the student’s processes, learn more if needed about the topic and ask the student questions to check their understanding or assure themselves that the work is the student’s own.
Validity and reliability
Research shows that, in grading students, teachers often make judgements that go beyond measurement of any defined curriculum. Teachers report taking into account aspects such as attendance, behaviour, motivation and effort. Teachers take this approach to try to be fairer to students and to ensure that the assessment supports learning (McMillan and Nash, 2000[8]). While some view this as reducing the reliability of the assessment and the credibility of the upper secondary certificate (Guskey and Jung, 2012[9]), others argue that teacher assessment is contextualised and relationship-based, and that its quality should be judged differently (Alonzo, 2019[10]).
Studies show that teacher-assigned grades have greater predictive validity for the student’s next steps in learning than some standardised tests (see Box 5.2) (Atkinson and Geiser, 2009[11]; Cliffordson, 2008[12]). Teacher grading might have greater predictive validity than standardised tests because it provides a more holistic evaluation of student aptitudes and potential (Harlen, 2005[13]). Researchers suggest that the consistently greater predictive validity of teacher grades reflects the fact that teachers are measuring important life skills associated with success at all stages of life: self-control, self-efficacy and tenacity (Bowers, 2019[14]). This has important implications for systems aiming to assess social and emotional skills within upper secondary education, as such skills may already be implicitly captured by some forms of teacher-based assessment (see Chapter 1 for a discussion of the knowledge, skills and understanding that upper secondary certificates aim to assess). It also has implications for higher education admissions policies. In the United States, research into the predictive validity of teacher-given grades (Box 5.2) has become prominent in the context of changing university admission factors. The number of 4-year colleges and universities in the United States considered to be ‘test-optional’ (i.e. neither recommending nor requiring admissions tests) has increased from about 12% prior to 2020 to over 85% for the 2024 academic year (Camara, 2024[15]).
Box 5.2. Predictive validity of teacher-assessed grades and standardised tests
Some systems – like Sweden and the United States – have both internally assessed upper secondary certificates and separate, externally set and marked entrance tests for higher education. Systems where the two approaches co-exist make it possible to analyse how well each set of information predicts student performance in higher education.
Numerous studies have found that teacher-assessed grades are better than standardised entrance test results at predicting student success in higher education (Westrick et al., 2015[16]; Geiser and Santelices, 2007[17]). In the United States, for example, a study of nearly 150,000 first-time entrants to higher education in 1999, across higher education institutions in four states, found that teachers’ grades were stronger predictors than SAT scores (standardised tests for higher education admission) of students’ on-time programme completion and overall grades (Bowen, Chingos and McPherson, 2009[18]). In Sweden, a study of 164,106 students between 1993 and 2001 found that the grades teachers gave their students in upper secondary education were better predictors of success in the first year of higher education (measured by grades) than scores from Sweden’s standardised entrance test for higher education, the SweSat (Cliffordson, 2008[12]).
Most recently, as part of Sweden’s national inquiry into school grading, the government-appointed committee conducting the investigation analysed the predictive power of teachers’ grades, nationally standardised tests and SweSat marks for success in higher education (measured as grades in students’ first year). The analysis found that teachers’ grades were the most strongly correlated with student success in higher education (0.28), while SweSat results were the least correlated (0.19) (Figure 5.2).
Explanations for the greater predictive validity of teacher-assessed grades focus on the fact that success in higher education requires not only cognitive ability but a wide range of other competencies related to motivation and self-regulation which teachers’ grades can capture (Galla et al., 2019[19]).
Figure 5.2. Predictive validity for success in higher education, 2018/19 in Sweden
Note: Predictive power (measured as correlation) of first-year academic success for grades (merit value), national final examinations and university entrance examination results. The sample comprises students who graduated from upper secondary school in 2018 and 2019, who participated in all of the above-mentioned tests as well as the university entrance examination, and who have study results from higher education (n = 7 148).
Source: Cliffordson (2008[12]), “Differential Prediction of Study Success Across Academic Programs in the Swedish Context: The Validity of Grades and Tests as Selection Instruments for Higher Education”, in Educational Assessment, Vol. 13/1, https://doi.org/10.1080/10627190801968240; Galla et al. (2019[19]), “Why High School Grades Are Better Predictors of On-Time College Graduation Than Are Admissions Test Scores: The Roles of Self-Regulation and Cognitive Ability” in American Educational Research Journal, Vol. 56/6, https://doi.org/10.3102/0002831219843292; Bowen, Chingos and McPherson (2009[18]), Crossing the Finish Line: Completing College at America’s Public Universities. Princeton University Press, http://www.jstor.org/stable/j.ctt7rp39; Atkinson and Geiser (2009[11]), Reflections on a Century of College Admissions Tests in Educational Researcher, Vol. 38/9, https://doi.org/10.3102/0013189X09351981; Geiser and Santelices (2007[17]), Validity of high school grades in predicting student success beyond the freshman year: High-school record vs. standardized tests as indicators of four-year college outcomes, Centre of Studies in Higher Education, University of California, Berkeley, https://files.eric.ed.gov/fulltext/ED502858.pdf; Swedish Government (2025[20]), An Equivalent Grading System (Ett likvärdigt betygssystem), https://www.regeringen.se/contentassets/56408e57ea8e4954bf38491c8bc33d4c/ett-likvardigt-betygssystem-sou-202518-volym-1.pdf (accessed 4 June 2025); SOU 2025:18 (2025[21]), An equal grading system: Report of the Inquiry on Equivalent Grades and Merit Values, https://www.regeringen.se/rattsliga-dokument/statens-offentliga-utredningar/2025/02/sou-202518/.
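At base, the predictive-validity comparisons reported in Box 5.2 are correlation computations: each admission measure is correlated with a measure of first-year success. The sketch below is illustrative only – the data are synthetic, not the Swedish inquiry’s dataset, and the variable names are invented for the example – but it shows how such Pearson correlations are computed and compared.

```python
# Illustrative sketch (synthetic data, NOT the Swedish inquiry's dataset):
# predictive validity measured as the Pearson correlation between each
# admission measure and first-year success in higher education.

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Synthetic records: (teacher grade, national test, entrance test, first-year grade)
students = [
    (17.5, 62, 0.9, 3.1), (14.0, 55, 1.1, 2.4), (19.2, 70, 1.4, 3.6),
    (12.3, 48, 0.6, 2.0), (16.1, 66, 1.0, 2.9), (18.4, 59, 1.3, 3.3),
]
success = [s[3] for s in students]
for name, idx in [("teacher grades", 0), ("national tests", 1), ("entrance test", 2)]:
    measure = [s[idx] for s in students]
    print(f"{name}: r = {pearson_r(measure, success):+.2f}")
```

With real data, the measure whose correlation with first-year grades is highest is the one described in the text as having the greatest predictive validity.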
Associated challenges
Teacher marking or judging of student assessment responses is the focus of much debate. There are often significant variations in the grades that teachers give to students across different schools, and even within classrooms, that are not consistent with differences in national testing (see, for example, recent reports from Sweden and France highlighting variations in teacher-assessed grades (Skolverket, 2020[22]; SOU, 2025[21]; Direction de l’évaluation, de la prospective et de la performance, 2025[23])). Variations in teacher-assessed grades might reflect that teachers are assessing students against broader measures of skills than those in the curriculum, but this raises many challenges. Teachers – just like anyone – are likely to have biases or preferences towards certain students, or types of students or behaviours. At a practical level, an individual teacher with 20-30 students is unlikely to be able to observe every relevant behaviour or provide equitable opportunities for all students to demonstrate their skills.
Intermediate responsibilities for setting and marking assessments
The role of intermediate individuals or bodies
In some upper secondary certification systems, standardisation, oversight or stakeholder involvement in assessment setting lies with an intermediate individual or body, such as a school inspectorate, a local authority, employers, or an assessment jury or council combining internal and external actors. Table 5.1 lists certificates which include an assessment component set at the ‘intermediate’ level. If an assessment component is set at the intermediate level, it also tends to be marked at the intermediate level.
Promoting balance through shared responsibilities across national, regional and school levels
Intermediate individuals or bodies might be used to set and mark assessments in order to balance teacher- or school-level flexibility with standardisation achieved through external oversight. For example, in Queensland (Australia), the state assessment organisation, the Queensland Curriculum and Assessment Authority (QCAA), shares responsibility for the quality of internal assessment by working with schools to develop teacher-set assessments. In Queensland, when schools develop summative internal assessments for general subjects, they follow parameters outlined in the QCAA-defined syllabus, which specifies the type of assessment, the conditions under which it should be administered and a marking scheme. QCAA recruits and trains teacher assessors to endorse schools’ internal assessment instruments before they are used in the classroom. After the assessments have been administered, QCAA requests a sample of student work for every subject in every school to confirm that the provisional marks awarded by teachers are accurate. Adjustments may be made to students’ results depending on the outcomes of this process.
Table 5.1. Countries and certificates including one or more assessment components set and/or assessed across multiple or intermediate level actors
| Country | Certificate | Orientation | Component | Intermediate set | Intermediate marked |
|---|---|---|---|---|---|
| Australia (Queensland) | Queensland Certificate of Education | GEN | Internal assessment | | |
| Belgium (French Community) | Certificat d’Enseignement Secondaire Supérieur (GEN) [Certificate of Upper Secondary Education] | GEN | External certificate assessment standards | | |
| Belgium (French Community) | Certificat d’Enseignement Secondaire Supérieur (VET) [Certificate of Upper Secondary Education] | VET | External certificate assessment standards | | |
| Croatia | Final exam | VET | School exams | | |
| Czechia | Upper secondary education | VET | VET final exam (practical exam); (theoretical exam) | | |
| Czechia | Upper secondary education with Maturita | Both | Maturita (profile/school part) | | |
| Czechia | Upper secondary education with VET certificate | VET | VET final exam – written exam; practical exam; oral exam | | |
| Estonia | Vocational upper secondary education | VET | Professional exam | | |
| Finland | Vocational qualifications | VET | Compulsory and optional study units | | |
| France | Baccalauréat | GEN | Contrôle terminal (Final assessment) | | |
| France | Professional Aptitude Certificate | VET | Written tests; Contrôle en cours de formation; Le chef-d’œuvre; Professional training period | | |
| France | Baccalauréat professionnel | VET | Professional training period (PFMP); Le projet (project) | | |
| Israel | Matriculation Exam | GEN | Portfolio or project | | |
| Italy | Diploma d’istruzione secondaria di secondo grado | Both | State Matura (Esame di maturità) – Written exam | | |
| Italy | Diploma d’istruzione secondaria di secondo grado | Both | State Matura (Esame di maturità) – Oral exam | | |
| Luxembourg | Diplôme de fin d’études secondaires [High School Diploma] | GEN | Exam | | |
| Luxembourg | Diplôme de fin d’études secondaires [High School Diploma] | GEN | Oral exam | | |
| Norway | Diploma | GEN | Exam grades | | |
| Norway | Vocational and apprentice and practical certificates | VET | Exam grades | | |
| Poland | School leaving certificate – for liceum, technikum and branżowa szkoła I and II stopnia subjects | Both | Practical training component | | |
| Poland | Matura | GEN | Oral exams | | |
| Portugal | Secondary school completion certificate and diploma – Professional certificate | VET | Professional Aptitude Test (final project); On-the-job training | | |
| Slovak Republic | Maturitná skúška [Maturita] | GEN | Oral exam | | |
| Slovak Republic | Záverečná skúška [2-3 year VET], Absolventská skúška [Art Schools] | VET | School-based grades and assessment | | |
| Slovak Republic | Maturitná skúška [Maturita] – 4-year VET | VET | Oral exam; Practical component | | |
| Slovenia | Vocational matura | VET | Examinations | | |
| Slovenia | Vocational matura | VET | Seminar paper or product or service with defence | | |
| Spain | Ciclo Formativo de Grado Medio [Intermediate Vocational Training Cycle] | VET | Workplace assessment | | |
| Türkiye | Lise Diploması [High School Diploma] | Both | Teacher-based assessment | | |
| United Kingdom – Wales | Level 3 WJEC Vocational Qualifications | VET | Non-exam assessment | | |

Source: Transitions in Upper Secondary Education Certificate and Assessment mapping exercise, data collected across 2024 and 2025.
Juries, boards and assessment committees
Many of the examples in Table 5.1, particularly where students do oral or practical vocational exams, involve an assessment committee, board or jury. This group, typically made up of a range of internal and external actors, has responsibility for determining which questions or tasks are put to students to answer or complete. Figure 5.3 outlines the process, from task-setting to marking, when a jury or assessment board is involved, with specific examples from relevant countries. Marking or providing assessment judgements collectively can also be a way to mitigate the challenges inherent in teacher marking or judging of assessment evidence. Systems with no recent history or culture of external marking might use such approaches to introduce some oversight or externality into the marking without taking the task itself out of the hands of teachers. For example, in Norway, all locally developed exams are assessed by two examiners, including at least one external examiner (see Figure 5.3).
In vocational systems, employers may be involved in judging, or signing off on, how student evidence is assessed. In other systems, particularly where students do work-based learning, assessment decisions are made collaboratively by teachers and employers. While not strictly a jury or formal examining board, in Portugal, for on-the-job training, teachers/trainers and employers jointly decide whether students have satisfactorily completed this aspect of the programme (see Figure 5.3).
Figure 5.3. Assessing students using a jury or board
Examples of how countries use juries or boards
Source: Transitions in Upper Secondary Education Certificate and Assessment mapping exercise, data collected across 2024 and 2025.
External responsibilities for setting and marking assessments
The role of external agencies
Of the 71 certificates analysed, 52 include an assessment component which is set by an external agency, often a national or state examinations board. Figure 5.1 shows that 41 out of 71 certificates include an externally marked assessment component. The difference between these two figures arises because, in several systems, including Austria, Belgium (French Community), Greece, Italy, Luxembourg, Slovenia and Sweden, there are centrally set tests or exams that are marked by teachers/schools. For example, in Greece, for the Apolytirio, students sit final exams. Teachers are responsible for developing half of the exam items, and the remaining half are developed by experts and randomly drawn from a national repository. After students complete the exam, all items are graded by the respective teachers.
Annex 5.B. sets out the certificates which include an externally set and/or marked component. Assessment agencies or boards responsible for setting assessments are often part of the overall state education system. Such bodies are typically managed at arm’s length from government to ensure a degree of separation, so that assessments are independent of political influence (Baird et al., 2018[24]). Assessment agencies may be responsible for assessment of general qualifications, vocational qualifications, and sometimes national and international assessments.
Externally setting assessments helps to ensure their standardisation and credibility. Assessment tasks that are externally set and assessed tend to be exams (Annex 5.B.). However, some systems set and/or assess a wider range of formats externally. For example, in Ireland, New Zealand and the systems in the United Kingdom, students undertake projects or coursework that are set or marked externally. This approach can a) enable systems to assess the wider range of skills and knowledge that such assessment formats allow, b) provide scope for students to explore topics or themes of personal interest (discussed in Chapter 3) and c) ensure that a consistent marking approach is applied. Some systems use this approach for creative arts subjects, and Box 5.3 discusses an example of externally assessed portfolios for Visual Arts in New Zealand.
Box 5.3. Use of portfolios in New Zealand
In New Zealand, for some subjects, including Visual Arts subjects such as photography, painting, printmaking and design, externally-assessed standards are assessed via portfolios. For National Certificate of Educational Achievement (NCEA) Level 3, students submit either a physical or a digital portfolio at the end of the year. This is sent to the New Zealand Qualifications Authority for external marking (see Annex 3.B for further information).
The portfolio assessment specifications require that students integrate artistic conventions of the field to build and refine their ideas. Within this broad scope, students can select their own themes, ideas and styles, creating a context where they can explore and express their individual identities, including ethnic and cultural heritage (Smith, Pohio and Hoeberigs, 2018[25]; Smith, 2019[26]). Developing a year-long portfolio can also provide opportunities for the development of complex skills like time management and independence (Doole, 2023[27]).
Source: Doole (2023[27]), Student Wellbeing and Digital Technology Use in Visual Art Education: a case study examining the perceptions and experiences of senior students from one secondary school in New Zealand, Thesis, https://hdl.handle.net/10092/105646; Smith, Pohio and Hoeberigs (2018[25]), “Cross-sector perspectives: how teachers are responding to the ethnic and cultural diversity of young people in New Zealand through visual arts” in Multicultural Education Review, Vol. 2/10, https://doi.org/10.1080/2005615X.2018.1460895; Smith (2019[26]), “Curriculum, Assessment and Pedagogy: How These Dimensions are Enriching Visual Arts Education for Ethnically Diverse Students in New Zealand Secondary Schools” in Journal of Education and Culture Studies, Vol. 3/3, https://doi.org/10.22158/jecs.v3n3p311.
Validity and reliability
The standardised nature of externally set assessments means they are widely felt to be fairer and more reliable: everyone is assessed in the same way at the same time, and the process is less open to malpractice, cheating and fraud. External marking also supports the reliability of the assessment, in particular by ensuring that the marker does not know the student and will, in theory, mark objectively and without bias.
Associated challenges
External assessment agencies might be seen as distant from the curriculum that students actually experience, potentially resulting in assessments that do not reflect agreed curriculum outcomes or learning intentions. If assessment items are seen to extend beyond the scope of the curriculum, or are framed in ways that students are not expected to have been taught, this can be perceived as unfair (discussed in Chapter 1).
Some subjects and types of questions will always be more difficult to mark consistently than others, and this can lead to debates about marking consistency and the reliability of the assessment result. This can happen for several reasons, but debate often centres on the marking of subjects that may be viewed as more ‘subjective’, leading to differing interpretations of what a high-quality student response looks like. This is particularly the case for national language and humanities subjects. In the United Kingdom, for example, the marking of English exams frequently features in the news media (Weale, 2024[28]). If the subject in question is also one that is taken by many students, is required for certification or is given significant weight for higher education entrance, then any perceived inconsistencies in marking will have important consequences for students.
Itemised marking (i.e. marking items individually, with different markers involved in the marking of a written exam) has been shown to reduce the over- or under-marking that results from the so-called ‘halo or horns’ bias, which stems from markers forming general impressions of the candidate rather than an objective judgement of each answer (Tisi et al., 2013[29]). Some online systems facilitate itemised marking so that no marker marks a whole student response, but instead marks one question or item across several students. More broadly, developments in technology have helped assessment bodies streamline processes, increasing both the efficiency and the effectiveness of checking, quality assurance and marking standardisation processes. Modern online marking systems also provide relatively streamlined ways for the assessment body to train and support markers, and to check the quality and consistency, and therefore reliability, of their marking (Jadhav, 2018[30]).
Occasionally, assessments that are set externally are marked by the student’s own teacher or school, often following centrally-set guidelines or marking criteria. Adding this element of external input into the certification can promote credibility, particularly for upper secondary certificates that are otherwise entirely internally set and assessed. Having different actors responsible for different parts of upper secondary assessment can also be a way to promote manageability by spreading the assessment load. In the Slovak Republic, for example, the national centre is only responsible for marking the written exams, with essays being marked by schools.
Linking assessment responsibilities to certification purposes
Policy decisions about who is responsible for setting and marking assessments are directly related to the functions of certification – certifying student achievement of what they have learnt while at school and signalling their skills, knowledge and understanding to inform their next steps into higher education, training and work (discussed in Chapter 1).
External setting and marking is common when certificates and grades are central for selection for post-secondary destinations
The use of upper secondary certificates for selection purposes tends to create a need for standardisation and consistency to ensure fairness. In systems where there are more students than available places for high-status or popular pathways, upper secondary certificates are often used by higher education institutions to enable meritocratic selection across students. In some cases, systems might use rank scores based on grades, with students being granted a place in a particular course or institution if they achieve a certain score. In some systems, cut-off scores are even calculated to the decimal place. Systems where certificates are the central tool in this type of competitive and selective access to higher education include Australia, Ireland, Portugal and systems across the United Kingdom. In these cases – where competition for places in post-secondary pathways is particularly pronounced, and the system prioritises the selection function of upper secondary certification – certification is likely to be more externally based.
Setting and marking at the school level is more common when certificates have little role in informing selection for higher education
In a few systems, upper secondary certification plays a limited role in selection to high-demand pathways, notably higher education. This is the case when systems have separate selection tests for higher education, as in Greece, Korea, Poland, Spain and the United States. Similarly, some vocational certificates certify programmes that do not provide access to higher education, such as Spain’s Ciclo Formativo de Grado Medio and the Slovak Republic’s 2-3-year VET programme. In these cases, there is less need to directly compare students and therefore less emphasis on standardisation and consistency within the certificate. Assessments set by the student’s own teacher or school are more acceptable in such a context: there is more flexibility in how students can be assessed, and teacher-set and/or teacher-marked assessments are more common.
References
[11] Atkinson, R. and S. Geiser (2009), “Reflections on a century of college admissions tests”, Educational Researcher, Vol. 38/9, pp. 665-676, https://doi.org/10.3102/0013189X09351981.
[24] Baird, J. et al. (eds.) (2018), Examination Standards: How measures and meanings differ around the world, UCL IOE Press, https://www.ucl-ioe-press.com/books/assessment-examination-standards/.
[18] Bowen, W., M. Chingos and M. McPherson (2009), Crossing the finish line: Completing college at America’s public universities, Princeton University Press, https://www.jstor.org/stable/j.ctt7rp39.
[10] Brookhart, S. (ed.) (2019), Defining Trustworthiness for Teachers’ Multiple Uses of Classroom Assessment Results.
[15] Camara, W. (2024), Admission Testing in Higher Education: Changing Landscape and Outcomes from Test-Optional Policies, John Wiley and Sons Inc, https://doi.org/10.1111/emip.12651.
[7] Centralna Komisja Egzaminacyjna [Central Exam Commission] (2024), Preliminary information on the results of the matriculation exam conducted in the main term (in May) 2024, http://cke.gov.pl/images/_EGZAMIN_MATURALNY_OD_2023/Informacje_o_wynikach/2024/20240709%20Wst%C4%99pne%20informacje%20EM24.pdf (accessed on 16 November 2025).
[12] Cliffordson, C. (2008), “Differential Prediction of Study Success Across Academic Programs in the Swedish Context: The Validity of Grades and Tests as Selection Instruments for Higher Education”, Educational Assessment, Vol. 13/1, pp. 56-75, https://doi.org/10.1080/10627190801968240.
[23] Direction de l’évaluation, de la prospective et de la performance (2025), Moyennes de contrôle continu et résultats aux épreuves terminales du baccalauréat général, https://www.education.gouv.fr/moyennes-de-controle-continu-et-resultats-aux-epreuves-terminales-du-baccalaureat-general-450152 (accessed on 8 September 2025).
[27] Doole, S. (2023), Student Wellbeing and Digital Technology Use in Visual Art Education: a case study examining the perceptions and experiences of senior students from one secondary school in New Zealand., University of Canterbury, https://doi.org/10.26021/14740.
[19] Galla, B. et al. (2019), “Why High School Grades Are Better Predictors of On-Time College Graduation Than Are Admissions Test Scores: The Roles of Self-Regulation and Cognitive Ability”, American Educational Research Journal, Vol. 56/6, pp. 2077-2115, https://doi.org/10.3102/00028312198432.
[17] Geiser, S. and M. Santelices (2007), “Validity of high school grades in predicting student success beyond the freshman year: High-school record vs. standardized tests as indicators of four-year college outcomes”, Research & Occasional Paper Series, Vol. CSHE.6.07, https://files.eric.ed.gov/fulltext/ED502858.pdf (accessed on 4 June 2025).
[9] Guskey, T. and L. Jung (2012), “Four Steps in Grading Reform ”, Principal Leadership, Vol. 13/4, https://eric.ed.gov/?id=EJ1002406 (accessed on 27 February 2025).
[13] Harlen, W. (2005), “Teachers’ summative practices and assessment for learning – tensions and synergies”, The Curriculum Journal, Vol. 16/2, pp. 207-223, https://doi.org/10.1080/09585170500136093 (accessed on 27 February 2025).
[30] Jadhav, C. (2018), Exam marking: how technology is improving the quality of marking, https://ofqual.blog.gov.uk/2018/04/20/exam-marking-how-technology-is-improving-the-quality-of-marking/ (accessed on 28 February 2025).
[4] Kitchen, H. et al. (2019), OECD Reviews of Evaluation and Assessment in Education: Student Assessment in Turkey, OECD Reviews of Evaluation and Assessment in Education, OECD Publishing, Paris, https://doi.org/10.1787/5edc0abe-en.
[3] Kwon, S., M. Lee and D. Shin (2017), “Educational assessment in the Republic of Korea: lights and shadows of high-stake exam-based education system”, Assessment in Education: Principles, Policy and Practice, Vol. 24/1, https://doi.org/10.1080/0969594X.2015.1074540.
[14] Brookhart, S. (2019), “Why Schools Need both Teacher-Assigned Grades and Standardized Assessments”, in M., S. and J. McMillan (eds.), Towards Measures of Different and Useful Aspects of Schooling.
[8] McMillan, J. and S. Nash (2000), Teacher Classroom Assessment and Grading Practices Decision Making, Paper presented at the 2000 NCME Annual Meeting, April 27 2000, https://eric.ed.gov/?id=ED447195 (accessed on 27 February 2025).
[6] Ministerstwo Edukacji Narodowej [Ministry of National Education] (n.d.), Egzamin maturalny [Matura exam], https://www.gov.pl/web/edukacja/egzamin-maturalny2 (accessed on 23 April 2025).
[5] OECD (2023), Strengthening Upper Secondary Education in Lithuania, OECD Publishing, Paris, https://doi.org/10.1787/a69409d7-en.
[2] OECD (2018), Education Policy in Japan: Building Bridges towards 2030, Reviews of National Policies for Education, OECD Publishing, Paris, https://doi.org/10.1787/9789264302402-en.
[1] Rust, C. (ed.) (2009), Assessment for learning: a brief history and review of terminology, Oxford Centre for Staff and Learning Development, https://researchportal.northumbria.ac.uk/en/publications/assessment-for-learning-a-brief-history-and-review-of-terminology.
[22] Skolverket (2020), Analyses of equivalence in secondary school grading: Comparisons between course grades and course tests, National Agency for Education Skolverket, https://www.skolverket.se/download/18.1a8151cc170ae4599bce10/1585902805741/pdf6564.pdf (accessed on 8 November 2024).
[26] Smith, J. (2019), “Curriculum, Assessment and Pedagogy: How These Dimensions are Enriching Visual Arts Education for Ethnically Diverse Students in New Zealand Secondary Schools”, Journal of Education and Culture Studies, Vol. 3/3, https://doi.org/10.22158/jecs.v3n3p311.
[25] Smith, J., L. Pohio and R. Hoeberigs (2018), “Cross-sector perspectives: how teachers are responding to the ethnic and cultural diversity of young people in New Zealand through visual arts”, Multicultural Education Review, Vol. 10/2, https://doi.org/10.1080/2005615X.2018.1460895.
[21] SOU (2025), An equal grading system: Report of the Inquiry on Equivalent Grades and Merit Values (SOU 2025:18), https://www.regeringen.se/rattsliga-dokument/statens-offentliga-utredningar/2025/02/sou-202518/ (accessed on 2 June 2025).
[20] Swedish Government (2025), An Equivalent Grading System (Ett likvärdigt betygssystem), https://www.regeringen.se/contentassets/56408e57ea8e4954bf38491c8bc33d4c/ett-likvardigt-betygssystem-sou-202518-volym-1.pdf (accessed on 4 June 2025).
[29] Tisi, J. et al. (2013), A Review of Literature on Marking Reliability Research (Report for Ofqual), NFER, https://assets.publishing.service.gov.uk/media/5a81cc0e40f0b62302699360/0613_JoTisi_et_al-nfer-a-review-of-literature-on-marking-reliability.pdf (accessed on 28 February 2025).
[28] Weale, S. (2024), “Experts raise concerns over ‘unreliable’ marking of GCSE English”, The Guardian, https://www.theguardian.com/education/2024/sep/13/unreliable-marking-gcse-english (accessed on 28 May 2025).
[16] Westrick, P. et al. (2015), “College performance and retention: A meta-analysis of the predictive validities of ACT scores, high school grades, and SES”, Educational Assessment, Vol. 20, pp. 23-45, https://doi.org/10.1080/10627197.2015.997614.
Annex 5.A. Teacher-based assessment
Annex Table 5.A.1. Certificates including a teacher/school set and/or assessed component

| Country | Certificate | Orientation | Teacher/school set and/or assessed component(s) |
|---|---|---|---|
| Australia (Australian Capital Territory - ACT) | ACT Senior Secondary Certificate (ACT SSC) | Both | School-based internal assessment |
| Australia (Queensland) | Queensland Certificate of Education | GEN | Internal assessment |
| Austria | Standardised competency-oriented school leaving examination at Academic Secondary School (AHS) | GEN | Extended Essay; Written examinations (excluding mathematics and languages); Oral examinations; Practical examinations (for some versions of the programme only); Written examinations in mathematics and languages (externally set but locally marked) |
| Austria | Standardised competency-oriented school leaving examination at College for Higher Vocational Education (BHS) | VET | Diploma Thesis; Written examinations (excluding mathematics and languages); Oral examinations; Practical examinations (for some versions of the programme only); Written examinations in mathematics and languages (externally set but locally marked) |
| Belgium (Flanders) | Orientation towards higher education | GEN | School-related assessment |
| Belgium (Flanders) | Orientation towards higher education and labour market | GEN | School-related assessment |
| Belgium (Flanders) | Orientation towards labour market | VET | School-related assessment |
| Belgium (French Community) | Certificat d’Enseignement Secondaire Supérieur (GEN) | GEN | Internal assessment standards |
| Belgium (French Community) | Certificat d’Enseignement Secondaire Supérieur (VET) | VET | Internal assessment standards |
| Canada (British Columbia) | British Columbia Certificate of Graduation (Dogwood Diploma) | GEN | Credits towards the Dogwood Diploma |
| Colombia | Bachiller Académico [Academic Bachiller] | GEN | School/teacher-developed assessments |
| Colombia | Bachiller Técnico [Technical Bachiller] | VET | School/teacher-developed assessments (Certificado de aptitud profesional, CAP) |
| Costa Rica | Bachillerato | Both | Teacher grades |
| Croatia | Final exam | VET | School exams |
| Czechia | Upper secondary education | VET | VET final exam (practical exam); VET final exam (theoretical exam) |
| Denmark | Higher Commercial Examination (hhx) | GEN | Subject assessment / grades for the year's work; Examinations; Major written assignment (3rd year) |
| Denmark | Higher General Examination Programme (stx) | GEN | Subject assessment / grades for the year's work; Examinations; Major written assignment (3rd year) |
| Denmark | Higher Preparatory Examination programme (hf) | GEN | Examinations; Major written assignment (fourth semester) |
| Denmark | Higher Technical Examination Programme (htx) | GEN | Subject assessment / grades for the year's work; Examinations; Major written assignment (3rd year) |
| Estonia | General upper secondary education | GEN | School exams; Practical work or subject research |
| Estonia | Vocational Upper Secondary Education | VET | Completion of modules |
| Finland | Certificate of General Upper Secondary Education | GEN | Units of scope for the syllabus (150 credits) |
| Finland | Matriculation Examination Certificate | GEN | Matriculation Exam (preliminary marking only) |
| France | Baccalauréat (lycée général et technologique upper secondary education) [Baccalaureate] | GEN | Contrôle continu (Continuous assessment) |
| France | Baccalauréat professionnel [Professional baccalaureate] | VET | Contrôle en cours de formation |
| Greece | Apolytirio [Graduation certificate] | GEN | Internal Exams; Internal Coursework |
| Greece | Ptychio [Qualification] | VET | EPAL exams, including practical exam; Assessment during the course of study (oral, practical and written); Practical training |
| Israel | Matriculation exam | GEN | Annual grade; Portfolio or project |
| Italy | Diploma d’istruzione secondaria di secondo grado | GEN/VET | Subject grades |
| Japan | High school certificate | GEN | Credits |
| Korea | High school certificate | GEN | Credits from subjects; Participation in Creative Experiential Activities |
| Luxembourg | Diplôme de fin d’études secondaires | GEN | Regular classroom assessment |
| Mexico | Bachillerato | GEN | Teacher approval of modules, sub-modules, or courses that integrate the curriculum and have curricular value |
| Mexico | Vocational Training Certificates | VET | Teacher approval of modules, sub-modules, or courses that integrate the curriculum and have curricular value |
| Netherlands | Senior general secondary education (HAVO) and pre-university education (VWO) | GEN | School exam |
| Netherlands | Secondary vocational education (MBO) | VET | Institutional exam |
| New Zealand | National Certificates of Educational Achievement (NCEA) | GEN | Internal assessment standards |
| Norway | Diploma | GEN | Standing grades |
| Norway | Vocational and apprentice and practical certificates | VET | Standing grades |
| Poland | School leaving certificate – for lyceum, technikum and branżowa szkoła I and II stopnia subjects | Both | Annual grade |
| Portugal | Certificado de conclusão do ensino secundário e diploma - Nível 3 do Quadro Nacional de Qualificações [Secondary school completion certificate and diploma - Level 3 of the National Qualifications Framework] | GEN | Internal coursework |
| Portugal | Secondary school completion certificate and diploma – Professional certificate | VET | Internal coursework |
| Slovak Republic | Maturita | GEN | Essay |
| Slovak Republic | 2-3 year VET, Art Schools | VET | School-based grades and assessment |
| Slovak Republic | Maturita - 4-year VET | VET | Essay |
| Slovenia | Vocational matura | VET | Internal assessment |
| Slovenia | General matura | GEN | Internal assessment (oral part, work performed, seminar paper or exam presentation) |
| South Africa | Secondary National Certificate | GEN | School-based Assessment |
| Spain | Bachillerato (General) and Vocational Upper Secondary | GEN/VET | Continuous and differentiated assessment |
| Spain | Intermediate Vocational Training Cycle | VET | Continuous and differentiated assessment |
| Sweden | General and Vocational Programmes | GEN/VET | Teacher-based assessment; Nationally standardised tests (all students for some subjects, randomly allocated for other subjects) |
| Türkiye | Lise Diploması [High School Diploma] | GEN/VET | Teacher-based assessment |
| United Kingdom - England, Northern Ireland and Wales | Business Technology and Education Council (BTEC) Nationals | VET | Internally assessed units |
| United Kingdom - England, Northern Ireland and Wales | General Certificates of Secondary Education (GCSEs) | GEN | Non-exam assessment |
| United Kingdom - England, Northern Ireland and Wales | Advanced Subsidiary and Advanced Levels | GEN | Non-exam assessment |
| United Kingdom - Northern Ireland | Vocational Qualifications and Council for the Curriculum, Examinations & Assessment (CCEA) qualifications | Both | Assessment Units |
| United Kingdom - Scotland | National 5s, Higher qualifications | GEN | Coursework |
| United Kingdom - Scotland | Advanced Higher qualifications | GEN | Coursework |
| United Kingdom - Scotland | Vocational Qualifications e.g. National Certificates, National Progress Awards, Higher National Certificates, Higher National Diplomas, Foundation Apprenticeships | VET | Assessment towards Outcomes of Units of Qualifications |
| United Kingdom - Wales | Level 3 WJEC Vocational Qualifications | VET | Non-exam assessment |
Notes: In Austria, written exams are teacher/school set and assessed, except for mathematics and languages subjects which are externally set but locally marked.
In Finland, teachers in upper secondary schools perform a preliminary assessment before the tests are sent to the Matriculation Examination Board. Technically, this means that the Matriculation Examination is teacher/school assessed. However, in practice, grades are based on external marking.
In Scotland, coursework is predominantly externally set and assessed. However, for some subjects, such as Practical Metalworking, Music, or Chemistry, coursework may be set by schools and/or coursework may be marked internally. Each course specification sets out precise assessment responsibilities.
In England, Northern Ireland and Wales, non-exam assessment for GCSEs and AS/A Levels is predominantly marked by teachers, but there are some instances when it is marked by exam boards e.g. for language qualifications that are regulated in England by Ofqual.
BTECs are a brand of qualifications and not a specific type of qualification. BTECs are offered by Pearson, an awarding organisation. The term BTEC is often used more broadly by students, their families and schools to refer to vocational qualifications that are taken alongside A levels. However, the mapping in this report is specific to Pearson’s ‘BTEC Nationals’.
In Greece, students in Vocational High Schools work towards both the Ptychio and the Apolytirio.
Source: Transitions in Upper Secondary Education Certificate and Assessment mapping exercise, data collected across 2024 and 2025.
Annex 5.B. External assessment
Annex Table 5.B.1. Certificates which include an externally set and/or marked component

| Country | Certificate | Orientation | Externally set and/or marked component(s) |
|---|---|---|---|
| Australia (Queensland) | Queensland Certificate of Education | GEN | External assessment |
| Austria | Standardised competency-oriented school leaving examination at AHS | GEN | Written examinations (in specific subjects only e.g. languages, mathematics) |
| Austria | Standardised competency-oriented school leaving examination at BHS | VET | Written examinations |
| Belgium (Flemish Community) | Orientation towards higher education | GEN | Flemish tests (central assessment) |
| Belgium (Flemish Community) | Orientation towards higher education and labour market | GEN | Flemish tests (central assessment) |
| Belgium (Flemish Community) | Orientation towards labour market | VET | Flemish tests (central assessment) |
| Belgium (French Community) | Certificat d’Enseignement Secondaire Supérieur (GEN) | GEN | External certificate assessment standards |
| Belgium (French Community) | Certificat d’Enseignement Secondaire Supérieur (VET) | VET | External certificate assessment standards |
| Canada (British Columbia) | B.C. Certificate of Graduation (Dogwood Diploma) | GEN | Grade 10 numeracy assessment, Grade 10 literacy assessment and Grade 12 literacy assessment |
| Colombia | Bachiller Académico | GEN | Saber 11 |
| Colombia | Bachiller Técnico | VET | Saber 11 |
| Costa Rica | Bachillerato | Both | Prueba Nacional Estandarizada |
| Croatia | State Matura | GEN | External exams |
| Czechia | Upper secondary education with Maturita | Both | Maturita (state part) |
| Czechia | Upper secondary education with VET certificate | VET | VET final exam - written exam |
| Denmark | Higher Commercial Examination (hhx) | GEN | Examinations; Major written assignment (3rd year) |
| Denmark | Higher General Examination Programme (stx) | GEN | Examinations; Major written assignment (3rd year) |
| Denmark | Higher Preparatory Examination programme (hf) | GEN | Examinations; Major written assignment (fourth semester) |
| Denmark | Higher Technical Examination Programme (htx) | GEN | Examinations; Major written assignment (3rd year) |
| Estonia | General Upper Secondary Education | GEN | State exams |
| Finland | Matriculation Examination Certificate | GEN | Matriculation Exam |
| France | Baccalauréat | GEN | Contrôle terminal (Final assessment) |
| France | Professional baccalauréat | VET | Epreuves (tests) |
| Greece | Apolytirio | GEN | Internal Exams |
| Greece | Ptychio | VET | EPAL exams, including practical exam |
| Ireland | Leaving Certificate (Established and Vocational Programme) | GEN | Final exams; Coursework |
| Israel | Matriculation exam | GEN | Exam; Oral examinations |
| Italy | Diploma d’istruzione secondaria di secondo grado | Both | State Matura (Esame di maturità) Written exam |
| Lithuania | General upper secondary certificate | GEN | Matura Intermediate examinations; Matura Final examinations |
| Lithuania | Vocational upper secondary certificate (4th-5th EQF) | VET | Matura Intermediate examinations; Matura Final examinations; Vocational examination |
| Luxembourg | Diplôme de fin d’études secondaires | GEN | Exam |
| Netherlands | Senior general secondary education (HAVO) and pre-university education (VWO) | GEN | National examinations |
| Netherlands | Secondary vocational education (MBO) | VET | National examinations |
| New Zealand | National Certificates of Educational Achievement (NCEA) | GEN | External assessment standards; Literacy and numeracy co-requisite |
| Norway | Diploma | GEN | Exam grades |
| Norway | Vocational and apprentice and practical certificates | VET | Exam grades |
| Poland | Egzamin zawodowy [Professional qualification] | VET | Practical vocational exam; Theoretical exam |
| Poland | Matura | GEN | Oral exams; Written exams |
| Portugal | Secondary school completion certificate and diploma - Level 3 of the National Qualifications Framework | GEN | Final exams |
| Slovak Republic | Maturita | GEN | Written test / exam; Essay |
| Slovak Republic | Maturita - 4-year VET | VET | Written test; Essay |
| Slovenia | Vocational matura | VET | Examinations |
| Slovenia | General matura | GEN | Exams (written, oral, practical, exam presentation) |
| South Africa | Secondary National Certificate | GEN | End of year examination; Practical component (practical subjects); Language oral assessment |
| Sweden | General and Vocational Programmes | Both | Nationally standardised tests (all students for some subjects, randomly allocated for other subjects) |
| United Kingdom - England | T Levels | VET | Core Component - Core exam; Core Component - Employer-set project; Occupational Specialism |
| United Kingdom - England, Northern Ireland and Wales | AS and A Levels | GEN | Non-exam assessment; Examinations |
| United Kingdom - England, Northern Ireland and Wales | GCSEs | GEN | Examinations; Non-exam assessment |
| United Kingdom - England, Northern Ireland and Wales | BTEC Nationals | VET | Externally assessed units |
| United Kingdom - Scotland | National 5s, Higher qualifications | GEN | Question papers; Coursework |
| United Kingdom - Scotland | Advanced Higher qualifications | GEN | Question papers; Coursework |
| United Kingdom - Wales | Level 3 WJEC Vocational Qualifications | VET | Exams; Non-exam assessment |
Notes: In Austria, written exams are teacher/school set and assessed, except for mathematics and languages subjects which are externally set but locally marked.
In Belgium (Flemish Community), starting from the cohort of students who will take the last two years of upper secondary school (i.e. the third grade) in 2026-2027, pupils in the last year (i.e. the second year of the third grade) will sit Flemish tests.
In Finland, teachers in upper secondary schools perform a preliminary assessment before the tests are sent to the Matriculation Examination Board. Technically, this means that the Matriculation Examination is teacher/school assessed. However, in practice, grades are based on external marking.
In England, Northern Ireland and Wales, non-exam assessment for GCSEs and AS/A Levels is predominantly marked by teachers, but there are some instances when it is marked by exam boards e.g. for language qualifications that are regulated in England by Ofqual.
BTECs are a brand of qualifications and not a specific type of qualification. BTECs are offered by Pearson, an awarding organisation. The term BTEC is often used more broadly by students, their families and schools to refer to vocational qualifications that are taken alongside A levels. However, the mapping in this report is specific to Pearson’s ‘BTEC Nationals’.
In Scotland, coursework is predominantly externally set and assessed. However, for some subjects, such as Practical Metalworking, Music, or Chemistry, coursework may be set by schools and/or coursework may be marked internally. Each course specification sets out precise assessment responsibilities.
Source: Transitions in Upper Secondary Education Certificate and Assessment mapping exercise, data collected across 2024 and 2025.