This chapter examines the relationship between the two cycles of the Survey of Adult Skills (PIAAC) and other international skills surveys. It first focuses on previous international adult skills surveys, notably the International Adult Literacy Survey (IALS) and the Adult Literacy and Life Skills Survey (ALL). It then looks at its relationship to the OECD Programme for International Student Assessment (PISA), which targets 15-year-old students.
6. Relationship of the Survey of Adult Skills to other international adult skills surveys
Abstract
The Survey of Adult Skills (PIAAC) provides an unparalleled source of evidence for policy makers about the skills of adults. Its first cycle was conducted in three separate rounds of data collection between 2011 and 2018. In this cycle, 245 000 adults were interviewed in 39 countries and economies, representing 1.15 billion people. In the second cycle, the 2023 Survey of Adult Skills, 160 000 adults were interviewed in 31 countries and economies, representing 673 million people.
This chapter is organised in two parts. The first describes the relationship between the two cycles of the Survey of Adult Skills and previous international adult skills surveys. In particular, it compares the Survey of Adult Skills with the previous two international assessments of adult skills: the International Adult Literacy Survey (IALS) of 1994-98 and the Adult Literacy and Life Skills Survey (ALL) of 2003-07.1 The first cycle of the Survey of Adult Skills was also related to two surveys conducted by UNESCO (the Literacy Assessment and Monitoring Programme – LAMP) and the World Bank (the STEP Measurement Study). Information about the relationship between the first cycle of the Survey of Adult Skills and the LAMP and STEP surveys can be found in OECD (2019[1]) and in Keslair and Paccagnella (2020[2]).2
The second part examines the relationship between the Survey of Adult Skills and the OECD Programme for International Student Assessment (PISA), which focuses on 15-year-old students. Although there are similarities between these two surveys in terms of how skills are defined, there are significant differences between them, primarily in the target populations and the measures used to assess skills.
The relationship between the Survey of Adult Skills and other adult skills surveys
This section compares the two cycles of the Survey of Adult Skills with previous adult skills surveys, IALS and ALL. The Survey of Adult Skills aimed to provide measures of proficiency in literacy and numeracy that are comparable with those of the earlier surveys, and there has been some overlap in participation. In total, 27 countries and economies participated in both cycles of the Survey of Adult Skills, while 17 countries and economies that participated in the 2023 Survey of Adult Skills also participated in IALS. Eight countries and economies participated in both IALS and ALL. IALS was undertaken in three separate waves, with data collection occurring in 1994, 1996 and 1998. ALL was undertaken in two waves, with data collection taking place in 2003 and 2006-08. Table 6.1 lists the countries and economies participating in IALS, ALL and the two cycles of the Survey of Adult Skills, together with the dates of data collection.
Table 6.1. Countries and economies participating in adult skills surveys
| OECD countries and economies | IALS (1994-98) | ALL (2003-07) | Survey of Adult Skills (first cycle, 2011-18) | Survey of Adult Skills (second cycle, 2022-23) |
|---|---|---|---|---|
| OECD countries | | | | |
| Australia | 1996 | 2006-07 | 2011-12 | - |
| Austria | - | - | 2011-12 | 2022-23 |
| Canada | 1994 | 2003 | 2011-12 | 2022-23 |
| Chile | 1998 | - | 2014-15 | 2022-23 |
| Czechia | 1998 | - | 2011-12 | 2022-23 |
| Denmark | 1998 | - | 2011-12 | 2022-23 |
| Estonia | - | - | 2011-12 | 2022-23 |
| Finland | 1998 | - | 2011-12 | 2022-23 |
| France | - | - | 2011-12 | 2022-23 |
| Germany | 1994 | - | 2011-12 | 2022-23 |
| Hungary | 1998 | 2007-08 | 2017-18 | 2022-23 |
| Ireland | 1994 | - | 2011-12 | 2022-23 |
| Israel | - | - | 2014-15 | 2022-23 |
| Italy | 1998 | 2003-04 | 2011-12 | 2022-23 |
| Japan | - | - | 2011-12 | 2022-23 |
| Korea | - | - | 2011-12 | 2022-23 |
| Latvia | - | - | - | 2022-23 |
| Lithuania | - | - | 2014-15 | 2022-23 |
| Netherlands | 1994 | 2007-08 | 2011-12 | 2022-23 |
| New Zealand | 1996 | 2005 and 2007 | 2014-15 | 2022-23 |
| Norway | 1998 | 2003 | 2011-12 | 2022-23 |
| Poland | 1994 | - | 2011-12 | 2022-23 |
| Portugal | - | - | - | 2022-23 |
| Slovak Republic | - | - | 2011-12 | 2022-23 |
| Slovenia | 1998 | - | 2014-15 | - |
| Spain | - | - | 2011-12 | 2022-23 |
| Sweden | 1994 | - | 2011-12 | 2022-23 |
| Switzerland | 1994 | - | - | 2022-23 |
| United States1 | 1994 | 2003 | 2011-12, 2014-15 and 2017 | 2022-23 |
| Subnational entities | | | | |
| Flemish Region (Belgium) | - | - | 2011-12 | 2022-23 |
| England (UK) | 1996 | - | 2011-12 | 2022-23 |
| Northern Ireland (UK) | 1996 | - | 2011-12 | - |
| Partner countries | | | | |
| Croatia | - | - | - | 2022-23 |
| Singapore | - | - | 2014-15 | 2022-23 |
1. The United States participated in all three rounds of the first cycle of the Survey of Adult Skills (PIAAC). It collected data as part of Round 1 in 2011-12. It then collected additional data for targeted population groups as part of a National PIAAC Supplement (Rampey et al., 2016[3]) in 2014 and participated in Round 3 in 2017. The data collected in 2011-12 and 2014 were combined and reweighted to totals related to the 2010 census (while the 2011/12 data alone had been weighted to totals from the 2000 census). The 2012/14 data set has fully replaced the original 2011/12 data set in all OECD reports, as it provides a more accurate representation of the proficiency of the working-age population at that point in time. The 2017 data collected as part of Round 3 can be used as an additional data point. Details of the PIAAC data collection in the United States can be found in the technical reports for the survey and the National PIAAC Supplement (Hogan et al., 2016[4]; OECD, 2019[5]).
Evolution of assessment frameworks and instruments
Over time, assessment frameworks in large-scale assessments (including adult assessments) face competing pressures. On the one hand, there is a desire to retain continuity in measures (to provide reliable measures of change over time). On the other hand, measures need to be relevant to contemporary realities and changing understanding of the phenomena measured. Three main factors drive these changes: 1) developments in the understanding of the skills measured; 2) technological and social developments that affect the nature and practice of these skills in everyday life, work and study; and 3) technological and methodological advances in the science and practice of measurement (OECD, 2021[6]).
Although the different international adult assessments have been designed to be linked psychometrically in the domains of literacy and numeracy, the constructs measured have undergone considerable revision and extension, even if a common core remains. The skills assessed in the 2023 Survey of Adult Skills and its predecessors are presented in Table 6.2. Domains shown in the same row of the table are comparable in terms of the constructs measured and the content of the assessment instruments.
Table 6.2. Skills assessed in adult skills surveys
| IALS (1994-98) | ALL (2003-07) | Survey of Adult Skills (first cycle, 2011-18) | Survey of Adult Skills (second cycle, 2022-23) |
|---|---|---|---|
| Prose literacy | Prose literacy | | |
| Document literacy | Document literacy | | |
| Literacy (rescaled to combine prose and document literacy) | Literacy (rescaled to combine prose and document literacy) | Literacy (encompasses the reading of prose and document texts as well as digital texts); Reading components (print vocabulary, sentence meaning and passage fluency) | Literacy – includes the dimensions of organisation (density of content, representations and access devices) and source (single or multiple authors/publishers) to better represent the range of texts accessible in digital environments, including interactive texts; Reading components (measuring sentence meaning and passage fluency and being integrated into the literacy scale) |
| | Numeracy | Numeracy | Numeracy (expanded to include representations of mathematical information in the form of “structured information” and also “dynamic applications”); Numeracy components (measuring quantity and relative magnitudes and being integrated into the numeracy scale) |
| Quantitative literacy | | | |
| | Problem solving | | |
| | | Problem solving in technology-rich environments | |
| | | | Adaptive problem solving |
Note: See Box 6.1 for references for the assessment frameworks for each programme.
The descriptors used to describe the characteristics of the tasks at each proficiency level differ between surveys, because of the evolution of the assessment frameworks. Differences are even more marked when comparing the two cycles of the Survey of Adult Skills with IALS and ALL because of the introduction of the new domain of “literacy” (which replaced the previously separate domains of prose and document literacy) and because the way in which the “proficiency” of individuals and the “difficulty” of items are defined also changed. In particular, the Survey of Adult Skills locates items and individuals on the proficiency scales for literacy, numeracy and problem solving using a response probability (RP) value of 0.67. In other words, individuals are located on the scale at the point at which they have a 67% probability of successfully completing a random set of items representing the construct, and items are located on the scale at the point at which they have a 67% probability of being successfully completed by a random sample of the adult population (see Chapter 3). In IALS and ALL a response probability of 0.80 was used. The first cycle of the Survey of Adult Skills moved from a 0.80 RP to a 0.67 RP in order to align with the practice adopted in PISA (OECD, 2010, p. 48[7]).
The change in response probability has no consequences for either the estimation of the proficiency or the precision of the scales. The estimation of proficiency is independent of the selection of an RP value, as it is a function of the level of correct response to the test items. The precision of the scale is a function of the number of items in the scale, which is again independent of the choice of RP value. What the change in RP value does affect is the way proficiency is defined and described. In effect, “proficiency” is defined in terms of a different probability of successfully completing tasks. In the case of the shift from an RP value of 0.80 to one of 0.67, the result is that proficiency is described in terms of more difficult items that are completed with a lower probability of success (OECD, 2019[1]).
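To illustrate this point, consider a two-parameter logistic (2PL) item response model. This is a simplified stand-in for the operational scaling models, used here only to show how the choice of RP value shifts the scale point at which an item is described without changing the item's estimated parameters or respondents' proficiency estimates.

```latex
% Illustrative 2PL model (an assumption for this sketch, not the operational model):
% probability that a respondent with proficiency \theta answers an item with
% discrimination a and difficulty b correctly.
P(\theta) = \frac{1}{1 + e^{-a(\theta - b)}}
% Solving P(\theta_{RP}) = RP gives the scale point at which the item is described:
\theta_{RP} = b + \frac{1}{a}\,\ln\frac{RP}{1 - RP}
% For the same item parameters (a, b), the RP80 location lies above the RP67 location:
\theta_{0.80} - \theta_{0.67} = \frac{1}{a}\left(\ln\frac{0.80}{0.20} - \ln\frac{0.67}{0.33}\right) \approx \frac{0.68}{a}
```

Under this sketch, the item parameters (and hence proficiency estimates) are untouched; only the point at which an item is said to be "mastered" moves, which is why, at a given scale point, proficiency under RP67 is described in terms of more difficult items completed with a lower probability of success than under RP80.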
Literacy
The Survey of Adult Skills defines literacy more broadly than IALS and ALL did. Literacy encompasses the domains of prose and document literacy,3 which were assessed separately in IALS and ALL. In addition, literacy includes a stronger emphasis on reading digital and mixed-format texts (i.e. texts containing both continuous and non-continuous elements; see Chapter 2). The second cycle further expanded the classification of texts to include the dimensions of organisation (density of content, representations and access devices) and source (single or multiple authors/publishers) to better represent the range of texts accessible in digital environments, including interactive texts. The conceptualisation of the cognitive processes used in gaining meaning from text, the definition of the contexts in which reading takes place, and the factors affecting the difficulty of test items are very similar across all adult skills surveys. Still, the 2023 Survey of Adult Skills places greater emphasis on evaluation, understood as assessing the accuracy, soundness and task relevance of a text in relation to both its source and content. Additionally, cognitive processes are now considered independently of the factors affecting task difficulty. Task difficulty is conceived as being driven by the features of the stimulus text(s), the formulation of the question/task description and the interaction of the text and question/task description (see Table 6.3).
A set of common test items provided a psychometric link between the first cycle of the Survey of Adult Skills and IALS and ALL. A similar approach was used to link the first and second cycles of the Survey of Adult Skills: 28 of the 80 literacy items included in the second cycle assessment were linking items (i.e. items that were used in the first cycle of the survey).
The assessment of reading components was introduced in the first cycle of the Survey of Adult Skills as a new element. These were also included in the second cycle, with minor changes in the construct (print vocabulary was omitted). A more important change introduced in the second cycle was the inclusion of the performance on the components assessment into the main literacy proficiency scale.4 Table 6.3 shows the evolution of literacy constructs across the adult skills surveys.
Table 6.3. Evolution of literacy assessment frameworks across adult skills surveys
| | IALS (1994-98) / ALL (2003-07) | IALS (1994-98) / ALL (2003-07) | Survey of Adult Skills (first cycle, 2011-18) | Survey of Adult Skills (second cycle, 2022-23) |
|---|---|---|---|---|
| Construct | Prose literacy | Document literacy | Literacy | Literacy |
| Definition | Literacy is using printed and written information to function in society, to achieve one’s goals, and to develop one’s knowledge and potential. | Literacy is using printed and written information to function in society, to achieve one’s goals, and to develop one’s knowledge and potential. | Literacy is the ability to understand, evaluate, use and engage with written texts to participate in society, to achieve one’s goals, and to develop one’s knowledge and potential. Literacy encompasses a range of skills, from the decoding of written words and sentences to the comprehension, interpretation, and evaluation of complex texts. | Literacy is accessing, understanding, evaluating, and reflecting on written texts in order to achieve one’s goals, develop one’s knowledge and potential, and to participate in society. |
| | Prose literacy is the knowledge and skills needed to understand and use information from texts, including editorials, news stories, brochures and instruction manuals. | Document literacy is the knowledge and skills required to locate and use information contained in various formats, including job applications, payroll forms, transportation schedules, maps, tables and charts. | | |
| Cognitive processes | | | | |
| Content | Continuous texts | Non-continuous texts | Texts characterised by their medium (print-based or digital) and by format | Texts characterised by their organisation (density of content, representations and access devices) and source (single or multiple authors/publishers) |
| Contexts | | | | |
| Factors affecting task difficulty | | | | |
Source: Murray, Kirsch and Jenkins (1998[8]), OECD/Statistics Canada (2000[9]), Murray, Clermont and Binkley (2005[10]), OECD (2012[11]) and Rouet et al. (2021[12]).
Numeracy
The conceptualisation of numeracy in the 2023 Survey of Adult Skills remains similar to the definitions used in its first cycle and ALL, with a focus on accessing, using and reasoning critically with mathematical content, information and ideas represented in multiple ways in order to engage in and manage the mathematical demands of a range of situations in adult life. As shown in Table 6.2, the domain of numeracy was introduced in ALL to replace quantitative literacy, which had been measured in IALS. Quantitative literacy covers the skills needed to undertake arithmetic operations such as addition, subtraction, multiplication and division, either singly or in combination, using numbers or quantities embedded in printed material. Numeracy covers a broader range of situations in which actors must deal with mathematical information of different types, not just situations involving numbers embedded in printed materials (Gal et al., 2005[13]).
The revisions to the numeracy framework for the 2023 Survey of Adult Skills were primarily focused on ensuring that the assessment reflects the importance of digital information, representations, devices and applications that adults have to manage in dealing with the numerical demands of everyday life. In particular, some of the key elements of this revision included 1) addressing 21st century skills, including critical thinking and reflection, reasoning and understanding of degree of accuracy; 2) considering advances in technology and information and communication technologies (ICT) while keeping a balance with more traditional modes and means of communication and undertaking numeracy tasks; 3) making better use of technology for assessment in relation to both authenticity and making items accessible; and 4) addressing a number of issues regarding adults’ numeracy performance and understanding, including a person’s disposition to use mathematics and to see mathematics in a numeracy situation (Tout et al., 2021[14]).
As in the case of the literacy assessment, the first cycle of the Survey of Adult Skills included common items with ALL, while several numeracy items are now common to both cycles of the Survey of Adult Skills. Out of the 80 numeracy items included in the second cycle numeracy assessment, 32 were linking items (i.e. items that were used in the first cycle of the survey).
The 2023 Survey of Adult Skills introduced an assessment of numeracy components that aimed to provide insights into the skills and knowledge of adults with low levels of numeracy (below Level 1). The content is limited to the fundamentals of number sense. More specifically, they cover understanding of quantity (16 items requiring respondents to identify how many objects are displayed) and relative magnitude (14 items asking respondents to identify the biggest number in a set). As with reading components, the numeracy component items are integrated into the numeracy proficiency scale.5 Table 6.4 shows the evolution of the numeracy constructs across the different adult skills surveys.
Table 6.4. Evolution of numeracy assessment frameworks across adult skills surveys
| | IALS (1994-98) | ALL (2003-07) | Survey of Adult Skills (first cycle, 2011-18) | Survey of Adult Skills (second cycle, 2022-23) |
|---|---|---|---|---|
| Construct | Quantitative literacy | Numeracy | Numeracy | Numeracy |
| Definition | Quantitative literacy is the knowledge and skills required to apply arithmetic operations, either alone or sequentially, to numbers embedded in printed materials, such as balancing a chequebook, figuring out a tip, completing an order form or determining the amount of interest in a loan from an advertisement. | Numeracy is the knowledge and skills required to effectively manage and respond to the mathematical demands of diverse situations. Numerate behaviour is observed when people manage a situation or solve a problem in a real context; it involves responding to information about mathematical ideas that may be represented in a range of ways; it requires the activation of a range of enabling knowledge, factors and processes. | Numeracy is the ability to access, use, interpret and communicate mathematical information and ideas in order to engage in and manage the mathematical demands of a range of situations in adult life. To this end, numeracy involves managing a situation or solving a problem in a real context by responding to mathematical content/information/ideas represented in multiple ways. | Numeracy is accessing, using and reasoning critically with mathematical content, information and ideas represented in multiple ways in order to engage in and manage the mathematical demands of a range of situations in adult life. |
| Cognitive processes | | | | |
| Content | Non-continuous texts | Mathematical information; representations of mathematical information | Mathematical content, information and ideas; representations of mathematical content | Mathematical content, information and ideas; mathematical representations |
| Contexts | | | | |
| Factors affecting task difficulty | | | | |
Source: Murray, Kirsch and Jenkins (1998[8]), OECD/Statistics Canada (2000[9]), Murray, Clermont and Binkley (2005[10]), OECD (2012[11]) and Tout et al. (2021[14]).
Adaptive problem solving
The domain of adaptive problem solving (APS) was first introduced in the 2023 Survey of Adult Skills and is independent of previous measures of problem solving. Results from the APS assessment are therefore not comparable with results on problem solving in technology-rich environments (PSTRE) from the first cycle of the survey. PSTRE was an assessment of problem-solving skills as they apply to technology-rich environments. APS, on the other hand, did not systematically assess the proficiency of problem solvers at interacting with technology-rich environments.
APS represents the return to a concept of general problem solving that is relevant to a range of information environments and contexts and is not limited to digitally embedded problems, even though digital aspects as a mode of problem solving play an important role in APS. The concept of APS recognises that problems often dynamically change while they are being solved, which requires constant monitoring and, if necessary, adapting the original solution. These changes occur because of unexpected physical and/or social events in the environment and because of the unintended consequences of the problem solver’s actions. Adaptive problem solving is measured through 65 items. Table 6.5 shows the evolution of the problem-solving constructs across the adult skills surveys.
Table 6.5. Evolution of problem-solving assessment frameworks across adult skills surveys
| | ALL (2003-07) | Survey of Adult Skills (first cycle, 2011-18) | Survey of Adult Skills (second cycle, 2022-23) |
|---|---|---|---|
| Construct | Analytical problem solving | Problem solving in technology-rich environments | Adaptive problem solving |
| Definition | Problem solving involves goal-directed thinking and action in situations for which no routine solution procedure is available. The problem solver has a more or less well defined goal, but does not immediately know how to reach it. The incongruence of goals and admissible operators constitutes a problem. The understanding of the problem situation and its step-by-step transformation, based on planning and reasoning, constitute the process of problem solving. | Problem solving in technology-rich environments involves the ability to use digital technology, communication tools and networks to acquire and evaluate information, communicate with others and perform practical tasks. The assessment focuses on the abilities to solve problems by setting up appropriate goals and plans, and accessing and making use of information through computers and computer networks. | Adaptive problem solving involves the capacity to achieve one’s goals in a dynamic situation, in which a method for solution is not immediately available. It requires engaging in cognitive and metacognitive processes to define the problem, search for information, and apply a solution in a variety of information environments and contexts. |
| Cognitive processes | | | |
| Content | Problems | Technology (hardware devices, software applications, commands and functions, representations such as text, graphics and video); nature of problems | Aspects of the environment in which adaptive problem solving tasks are embedded |
| Contexts | Not specified | | |
| Factors affecting task difficulty | Not specified | | |
Source: Murray, Clermont and Binkley (2005[10]), OECD (2012[11]) and Greiff et al. (2021[15]).
Implications for the comparability of results over time
The evolution of the frameworks and the methodological changes introduced between the first and the second cycle of the Survey of Adult Skills (notably the inclusion of components in the main literacy and numeracy scales in the 2023 survey) have implications for the comparability of results between the two cycles of the survey.
When looking at changes in proficiency between the first and second cycle of the Survey of Adult Skills, additional uncertainty around scale values due to changes in the assessment frameworks and items must be taken into account (“is a score of 235 in the second cycle of the survey the same as 235 in the first cycle?”). The difference between a score in the second cycle scale and the corresponding score in the first cycle scale is modelled as a constant, equal for all countries and at all points of the proficiency scale. While the actual value of this constant remains unknown, its standard deviation can be estimated and is known as the linking error. This linking error should be added to the standard error of any trend statistics expressed as a proficiency score (e.g. difference in mean proficiency across cycles or the values of the percentiles of the proficiency distribution). More formally, the standard error of the change in proficiency for country (or subgroup) g between the first and second cycle is \( SE(\Delta_g) = \sqrt{SE_{g,1}^2 + SE_{g,2}^2 + LE^2} \), where \( SE_{g,1} \) is the standard error of the proficiency of group g in the first cycle, \( SE_{g,2} \) is the standard error of the proficiency of group g in the second cycle, and \( LE \) is the linking error between the two cycles. The actual value of the linking error is 3.27 for literacy and 2.95 for numeracy.
It should be noted that the linking error does not apply to trends of any statistic which is analogous to a score point difference, such as gender or age gaps in proficiency scores. Given that the additional uncertainty for comparing results across cycles is modelled as a constant at all points of the scale, when taking the difference between two scores, the uncertainties associated with each score cancel each other out, and there is no need to add the linking error term to the standard error of the trend.
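To make the calculation concrete, the sketch below applies the formula above to invented numbers (the mean scores and standard errors are purely illustrative; only the linking-error values of 3.27 for literacy and 2.95 for numeracy come from the text) and contrasts a trend in mean proficiency, which requires the linking error, with a trend in a gender gap, which does not.

```python
import math

# Linking errors between the two cycles of the Survey of Adult Skills, as quoted above
LINKING_ERROR = {"literacy": 3.27, "numeracy": 2.95}

def se_trend_mean(se_cycle1: float, se_cycle2: float, domain: str) -> float:
    """Standard error of a change in mean proficiency between cycles,
    including the cross-cycle linking error."""
    le = LINKING_ERROR[domain]
    return math.sqrt(se_cycle1 ** 2 + se_cycle2 ** 2 + le ** 2)

def se_trend_gap(se_gap_cycle1: float, se_gap_cycle2: float) -> float:
    """Standard error of a change in a score-point gap (e.g. a gender gap):
    the linking constants cancel out, so no linking error is added."""
    return math.sqrt(se_gap_cycle1 ** 2 + se_gap_cycle2 ** 2)

# Illustrative (invented) numbers: mean literacy had SE 1.1 in cycle 1 and SE 1.3 in cycle 2
print("SE of change in mean literacy:", round(se_trend_mean(1.1, 1.3, "literacy"), 2))

# Illustrative (invented) numbers: the gender gap had SE 1.5 in cycle 1 and SE 1.4 in cycle 2
print("SE of change in gender gap:  ", round(se_trend_gap(1.5, 1.4), 2))
```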
A more complex case is when the analysis looks at trends in the shares of the population scoring at a given proficiency level. In this case, the additional error term for the standard error of these trends depends on the linking error \( LE \), but also on the density of the proficiency score distribution around the cut-off scores that define the level. For instance, the linking error for the trend in the proportion of the population scoring at Level 1 in group g is approximately \( LE \) multiplied by the difference between the densities of group g's proficiency distribution at the two score thresholds that delimit Level 1.
The inclusion of performance in the components assessment into the main literacy and numeracy scales improves the precision of the estimates of proficiency at the bottom end of the literacy and numeracy scales but may also affect the comparability of results over time, particularly for adults who only took the components assessment after they failed the locator test. The estimated proficiency of these adults is based on a much richer set of information in the second cycle of the survey than it was in the first cycle. Caution is therefore advised when analysing changes in proficiency over time for subgroups of adults in which low-skilled adults who failed the locator are over-represented. Adults who failed the locator and only took the components assessment constitute a small minority of the overall sample. For most analyses, the impact of this methodological change is therefore negligible. However, in some groups of the population, the share of such adults may be larger, especially in some countries. In OECD reports, as a general (albeit somewhat subjective) rule, changes in proficiency are not reported when the share of adults who only took the components assessment constitutes more than 20% of the group analysed. Table 6.6 presents, for each country and economy, the share of respondents who only took the components, both as a percentage of the overall population and as a percentage of foreign-born adults.
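The 20% rule can be applied as a simple screen before reporting trends for a subgroup. In the sketch below, the data frame and its column names (country, subgroup, took_components_only) are hypothetical placeholders rather than actual variables from the PIAAC database, and a real analysis would apply the survey weights when computing the shares.

```python
import pandas as pd

# Hypothetical respondent-level data: one row per respondent, with a flag marking
# adults who failed the locator and therefore only took the components assessment.
df = pd.DataFrame({
    "country": ["AAA"] * 6 + ["BBB"] * 6,
    "subgroup": ["foreign-born", "native-born"] * 6,
    "took_components_only": [1, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0],
})

MAX_SHARE = 0.20  # reporting threshold described in the text

shares = (
    df.groupby(["country", "subgroup"])["took_components_only"]
      .mean()                      # unweighted share; real analyses would use survey weights
      .rename("share_components_only")
      .reset_index()
)
shares["report_trend"] = shares["share_components_only"] <= MAX_SHARE
print(shares)
```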
Table 6.6. Share of respondents who failed the locator test and only took the reading and numeracy components assessments
| OECD countries and economies | % of the overall population | % of foreign-born adults |
|---|---|---|
| OECD countries | | |
| Austria | 1.6 | 3.7 |
| Canada | 1.1 | 1.3 |
| Chile | 7.1 | 8.4 |
| Czechia | 0.5 | 0.4 |
| Denmark | 0.5 | 1.0 |
| Estonia | 1.0 | 1.3 |
| Finland | 0.9 | 1.4 |
| France | 2.2 | 7.2 |
| Germany | 1.5 | 4.8 |
| Hungary | 1.7 | 0.8 |
| Ireland | 0.5 | 0.6 |
| Israel | 3.4 | 2.7 |
| Italy | 1.9 | 5.1 |
| Japan | 0.9 | 1.6 |
| Korea | 1.4 | 4.2 |
| Latvia | 1.0 | 0.9 |
| Lithuania | 1.3 | 1.2 |
| Netherlands | 1.8 | 5.3 |
| New Zealand | 4.1 | 5.0 |
| Norway | 1.0 | 3.5 |
| Poland | 7.1 | 18.9 |
| Portugal | 3.3 | 3.5 |
| Slovak Republic | 1.3 | 0.3 |
| Spain | 1.1 | 2.3 |
| Sweden | 0.8 | 2.5 |
| Switzerland | 2.1 | 5.1 |
| United States | 4.8 | 15.1 |
| Subnational entities | | |
| England (UK) | 1.4 | 2.9 |
| Flemish Region (Belgium) | 1.0 | 3.0 |
| Partner countries | | |
| Croatia | 2.0 | 4.1 |
| Singapore | 2.7 | 3.9 |
Mode of delivery
A major difference between the two cycles of the Survey of Adult Skills and IALS and ALL is the way in which the assessment was delivered. Both cycles of the Survey of Adult Skills were designed to be delivered on digital devices. The first cycle of the survey relied on laptops, although there was a pencil-and-paper option for respondents who did not have sufficient computer skills to take a digital assessment. The 2023 Survey of Adult Skills relied solely on tablets. A digital stylus was also available, to replicate to the extent possible the experience respondents would have with paper-based instruments. In contrast, both IALS and ALL were exclusively based on paper-and-pencil instruments: respondents received printed booklets and responded to questions in writing. During the field trial of the first cycle of the Survey of Adult Skills, a study was conducted into the comparability of results across the two delivery modes (paper and computer; see OECD (2013[16])). In the second cycle, the parameters of linking items that had also been administered in the first cycle were rather stable, despite being administered on a tablet rather than on a laptop, providing evidence that the change from laptop to tablet had no impact on the comparability of results across the two cycles.
Box 6.1. Assessment framework references for the adult skills surveys
2023 Survey of Adult Skills
OECD (2021[6]), The Assessment Frameworks for Cycle 2 of the Programme for the International Assessment of Adult Competencies, OECD Skills Studies, OECD Publishing, Paris, https://doi.org/10.1787/4bc2342d-en.
First cycle of the Survey of Adult Skills
OECD (2012[11]), Literacy, Numeracy and Problem Solving in Technology-Rich Environments: Framework for the OECD Survey of Adult Skills, OECD Publishing, Paris, http://dx.doi.org/10.1787/9789264128859-en.
ALL
Murray, S., Y. Clermont and M. Binkley (eds) (2005[10]), Measuring Adult Literacy and Life Skills: New Frameworks for Assessment, Statistics Canada, Ottawa, Catalogue No. 89-552-MIE, No. 13.
IALS
Murray, S., I. Kirsch and L. Jenkins (eds) (1998[8]), Adult Literacy in OECD Countries: Technical Report on the First International Adult Literacy Survey, National Center for Education Statistics, Office of Educational Research and Improvement, Washington, DC.
OECD/Statistics Canada (2000[9]), Literacy in the Information Age: Final Report of the International Adult Literacy Survey, OECD Publishing, Paris, http://dx.doi.org/10.1787/9789264181762-en.
Comparability of background questionnaires
The extent to which comparisons can be made between the two cycles of the Survey of Adult Skills and their predecessors does not just depend on the psychometric links between them. If the results for subgroups of the population are to be reliably compared between surveys, the definitions of the relevant subgroups must also be similar over time.
The background questionnaire for the 2023 Survey of Adult Skills was designed to ensure the comparability of questions with those administered in previous surveys. Moreover, it enriched the range of information collected by including: 1) a new section on social and emotional skills; 2) new items to capture changes in working environments and societies (e.g. use of digital skills); 3) improved measurement of education and training; 4) an improved process for implementing national adaptations and adding national extensions; and 5) new items to capture more information about being out of work and respondents’ social background. More information on the content of the background questionnaire in the 2023 Survey of Adult Skills can be found in Chapter 4 and in OECD (forthcoming[17]).
In areas such as the personal characteristics of respondents, language background, immigration status, educational attainment and participation, and labour-force status, there is a high degree of similarity between the questions and response categories used in the two cycles of the Survey of Adult Skills and those used in IALS and ALL. Some caution must be exercised when comparing levels of educational attainment between the first and the second cycle of the survey, as some countries have changed the classification of some of their qualifications within the International Standard Classification of Education (ISCED) framework (OECD, forthcoming[18]).
Care must also be taken when comparing levels of education with previous surveys. For four countries participating in IALS (Czechia, Germany, Poland and the United Kingdom), the proportion of the adult population classified as having educational attainment below upper secondary level (ISCED 0-2) is considerably lower and the proportion with upper secondary and post-secondary non-tertiary attainment (ISCED 3-4) is considerably higher than is found in other statistics on educational attainment for the years when the IALS data were collected (1994 or 1996 depending on the country) such as those published by the OECD in Education at a Glance (Gesthuizen, Solga and Künster, 2010[19]). Analysts should bear this in mind when comparing results between IALS and ALL and the Survey of Adult Skills for these countries. Gesthuizen, Solga and Künster (2010[19]) propose a method to correct the attribution of respondents to levels of educational attainment in the IALS data set that provides distributions in line with other attainment statistics.
Concerning the items which were designed to capture the frequency of use of certain skills, the information collected regarding reading at home and work, influence, and task discretion is comparable across the two cycles of the Survey of Adult Skills. In the case of ICT use at work and home, learning at work, numeracy use at work and home, and writing at work and at home, however, the innovations that were introduced to better measure some concepts have compromised the comparability over time. The codebook for the Survey of Adult Skills database clearly identifies whether variables are comparable through the variable names; more detailed information is contained in the Survey of Adult Skills Data Analysis Manual (OECD, forthcoming[17]).
Survey methods and operational standards and procedures
Other things being equal, differences in design, methodology and operational procedures may have a potentially significant effect on the comparability of different assessments. This section compares IALS, ALL and the two cycles of the Survey of Adult Skills in terms of:
the target population
sample design and procedures
survey operations
survey response rates.
The target population
The target population defined for both IALS and ALL is identical to that of the two cycles of the Survey of Adult Skills, i.e. civilian, non-institutionalised persons aged 16-65.6 In each of the four surveys, participating countries/economies were required to use sampling frames that covered the target population. Exclusions of up to a maximum of 5% of the target population were permitted, and all countries met the requirement of including 95% or more of the target population in their sampling frames.7
Sample design
In all four surveys, participating countries and economies were required to use a probability sample representative of the target population. There were no deviations from this requirement in either cycle of the Survey of Adult Skills or ALL. In IALS, there was one deviation: Germany employed a non-probability selection method at the second stage of its three-stage sample design (Murray, Kirsch and Jenkins, 1998[8]). However, the extent of deviation from strict probability sampling was assessed to be “relatively minor” and was not believed to have “introduced significant bias into the survey estimates”.
In the second cycle of the Survey of Adult Skills there was evidence that in Lithuania and in the Slovak Republic not all eligible persons in a household were given a chance to be selected to participate in the survey, which could lead to undercoverage bias. Measures were taken to reduce undercoverage bias (weight calibration). While some additional caution should be taken when analysing data from these countries, the results of additional analysis, including non-response bias analysis (NRBA), suggest that the effect of this deviation from the sampling standards is rather small (OECD, forthcoming[18]).
Survey operations and the introduction of the doorstep interview
Both the degree of standardisation of survey procedures and the effort put into monitoring compliance with these standards have been greater in the two cycles of the Survey of Adult Skills than was the case in either IALS or ALL. An external review of the implementation of the first round of IALS8 conducted in the second half of 1995 concluded that while there were no concerns regarding the development of instrumentation: “The variation in survey execution across countries is so large that we recommend that all comparative analyses across countries should be interpreted with due caution” (Kalton, Lyberg and Rempp, 1998, p. 4[20]). In particular, while guidance on survey procedures was provided to the participating countries and economies, the reviewers found that little was done to “enforce adherence to specific procedures” (Kalton, Lyberg and Rempp, 1998, p. 4[20]). Quality assurance procedures were subsequently improved for the second and third rounds of IALS (OECD/Statistics Canada, 2000[9]) and in ALL.9
Maximising standardisation in processes and procedures, and therefore minimising any differentials in error resulting from variation in implementation, was a central objective of the Survey of Adult Skills. The quality assurance and quality control procedures put in place are among the most comprehensive and stringent ever implemented for an international household-based survey. The standards that participating countries and economies are required to meet in implementing the two cycles of the Survey of Adult Skills were set out in two comprehensive sets of Technical Standards and Guidelines. These were accompanied by a quality assurance and quality control process at key stages of implementation (e.g. sampling designs) and data collection throughout the project. The results of the quality control activity fed into an assessment of the overall quality of the data from each participating country (see also Chapter 5).
An important innovation in design and operations introduced in the second cycle of the Survey of Adult Skills is the administration of a doorstep interview to adults who were not able to participate in the survey because of language barriers (literacy-related non-respondents). In previous adult skills surveys, no information was collected on such adults, effectively leading to a small undercoverage of the target population and an upward bias in the estimation of the average proficiency of the population10.
In the 2023 Survey of Adult Skills, such adults were administered a very short questionnaire available in many languages (the doorstep interview). Respondents completed this on a tablet by themselves. The limited information collected through this doorstep interview (age, gender, level of education, employment status and migration history) was used to generate plausible values for these respondents, thus allowing them to contribute to the estimation of the average proficiency of the population. A separate population model was used for estimating the proficiency of such respondents, which was constrained not to exceed the proficiency of respondents who failed the locator assessment (OECD, forthcoming[18]).
The introduction of the doorstep interview creates a small misalignment between the populations for which proficiency estimates are available across different surveys. For this reason, it is recommended to systematically exclude doorstep interview cases when comparing results from the 2023 Survey of Adult Skills with those of previous surveys.
Survey response rates
Non-response is a potentially significant source of error in any survey. In comparing results across the adult skills surveys, it is important to be aware of their different response rates. Table 6.7 presents the response rates of the four surveys for those countries/economies for which repeated observations are available. As is evident in the table, response rates have declined over time in most countries. This is a general trend common to all surveys, and it has possibly accelerated in the aftermath of the COVID-19 pandemic, when data for the 2023 Survey of Adult Skills were collected. Low response rates increase the possibility that results of the survey are affected by non-response bias. For this reason, NRBA has been conducted in the first and second cycle of the survey for countries with response rates below 70%. More details on this analysis and its results are presented in Chapter 5, as well as in the Technical Report of the 2023 Survey of Adult Skills (OECD, forthcoming[18]). For the first cycle of the survey, the results of the NRBA can be found in OECD (2019[5]; 2019[1]).
Table 6.7. Response rates across adult skills surveys (%)
| OECD countries and economies | IALS (1994-98) | ALL (2003-07) | Survey of Adult Skills (first cycle, 2011-18) | Survey of Adult Skills (second cycle, 2022-23) |
|---|---|---|---|---|
| OECD countries | | | | |
| Australia | 96 | 79 | 71 | - |
| Austria | - | - | 53 | 39 |
| Canada | 69 | 66 | 59 | 28 |
| Chile | 74 | - | 66 | 56 |
| Czechia | 61 | - | 66 | 40 |
| Denmark | 66 | - | 50 | 27 |
| Estonia | - | - | 63 | 50 |
| Finland | 69 | - | 66 | 34 |
| France | - | - | 67 | 55 |
| Germany | 69 | - | 55 | 45 |
| Hungary | - | 63 | 57 | 59 |
| Ireland | 60 | - | 72 | 47 |
| Israel | - | - | 61 | 61 |
| Italy | 35 | 44 | 56 | 29 |
| Japan | - | - | 50 | 41 |
| Korea | - | - | 75 | 73 |
| Latvia | - | - | - | 28 |
| Lithuania | - | - | 54 | 44 |
| Netherlands | 45 | 47 | 51 | 40 |
| New Zealand | 74 | 64 | 63 | 48 |
| Norway | 61 | 56 | 62 | 41 |
| Poland | 75 | - | 56 | 57 |
| Portugal | - | - | - | 39 |
| Slovak Republic | - | - | 66 | 70 |
| Slovenia | 70 | - | 62 | - |
| Spain | - | - | 48 | 61 |
| Sweden | 60 | - | 62 | 31 |
| Switzerland | 55 | - | - | 30 |
| United States | 60 | 66 | 68 (2012/14); 56 (2017) | 28 |
| Subnational entities | | | | |
| England (UK) | 63 | - | 59 | 38 |
| Flemish Region (Belgium) | 36 | - | 62 | 35 |
| Partner countries | | | | |
| Croatia | - | - | - | 36 |
| Singapore | - | - | 63 | 62 |
The relationship between the Survey of Adult Skills and PISA
All the countries and economies participating in the Survey of Adult Skills have also participated in at least some rounds of the OECD Programme for International Student Assessment. As a result, some of the adults sampled for the Survey of Adult Skills will have been eligible to participate in PISA at some point in time.
PISA and the Survey of Adult Skills assess ostensibly similar skills. In particular, literacy and numeracy as assessed in the Survey of Adult Skills have clear similarities with reading and mathematics assessed in PISA. Given the overlap in terms of the cohorts assessed and the content of the assessments, this section illustrates the similarities and differences between the two studies and the extent to which the results of the two studies can be compared.
The conceptualisation of literacy and numeracy skills in the Survey of Adult Skills has much in common with the skills of reading literacy and mathematical literacy in PISA. However, the Survey of Adult Skills was not designed to be linked psychometrically to PISA. Even in those areas in which conceptual links are strongest (in the domains of literacy/reading literacy and numeracy/mathematical literacy), the measurement scales are distinct.
PISA cohorts in the target population of the Survey of Adult Skills
The target population for the two cycles of the Survey of Adult Skills includes cohorts who were eligible to participate in PISA 2000, 2003, 2006, 2009, 2012, 2015, 2018 and 2022. Table 6.8 shows how old the cohorts assessed in the eight rounds of PISA between 2000 and 2022 would have been at the time when the data for the two cycles of the Survey of Adult Skills were being collected.
Table 6.8. Age of PISA cohorts at the time of data collection for the Survey of Adult Skills
| | Survey of Adult Skills, first cycle: Age in 2011-12 | Survey of Adult Skills, first cycle: Age in 2014-15 | Survey of Adult Skills, first cycle: Age in 2017-18 | Survey of Adult Skills, second cycle: Age in 2022-23 |
|---|---|---|---|---|
| PISA 2000 | 26-27 | 29-30 | 32-33 | 37-38 |
| PISA 2003 | 23-24 | 26-27 | 29-30 | 34-35 |
| PISA 2006 | 20-21 | 23-24 | 26-27 | 31-32 |
| PISA 2009 | 17-18 | 20-21 | 23-24 | 28-29 |
| PISA 2012 | - | 17-18 | 20-22 | 25-26 |
| PISA 2015 | - | - | 17-18 | 22-23 |
| PISA 2018 | - | - | - | 19-20 |
| PISA 2022 | - | - | - | 16-17 |
Differences in the target population
As noted above, several “PISA cohorts” are included in the population assessed in the two cycles of the Survey of Adult Skills. There are differences in coverage of these cohorts, which need to be considered when comparing the results from these surveys. In particular, the target population of the Survey of Adult Skills is broader than that of PISA; as a result, not all adults in these “PISA cohorts” were in fact part of the PISA target population.
The target population of PISA is young people aged from 15 years and 3 months to 16 years and 2 months at the beginning of the assessment period who were enrolled in an educational institution in Grade 7 or above. Fifteen-year-olds who are not enrolled at an educational institution are not tested as part of PISA, and in all countries participating in the eight rounds of PISA between 2000 and 2022, a proportion of 15-year-olds were out of school or in grades lower than Grade 7, and therefore excluded from the PISA target population. In 2018, for example, the PISA sample represented around 90% of the 15-year-old population in most countries that participated in the 2023 Survey of Adult Skills. The coverage was lowest in Israel (81%) and highest in Germany (99%) (OECD, 2019[22]). In contrast, the target population for the Survey of Adult Skills is the entire resident population. Therefore, the “PISA cohorts” surveyed in the Survey of Adult Skills include, in addition to persons who were at school at age 15 (and, therefore, part of the PISA target population), those who were out of school at the age of 15 (and, therefore, outside the PISA target population). Irrespective of any other considerations, the different rates of coverage are relevant to comparisons of the results of the two surveys for these cohorts. In particular, it seems likely that, in most countries, the mean proficiency scores for the full 15-year-old cohort would have been lower than those observed for 15-year-olds who were in school,11 as the available evidence suggests that early school-leavers are less proficient than students who continue in schooling (Spaull and Taylor, 2015[23]; Taylor and Spaull, 2015[24]; OECD, 2019[22]).
Skills assessed
Table 6.9 compares the skill domains assessed in the Survey of Adult Skills and those assessed across the PISA rounds that have been administered since 2000. As can be seen, both studies assess skills in the domains of literacy/reading and numeracy/mathematics. Both also assess problem solving, but in PISA problem solving has been an innovative domain assessed only in some rounds, and the problem-solving domains have changed across the two cycles of the Survey of Adult Skills. As a result, the comparability of the different problem-solving assessments across the two studies is not discussed here. The one area in which there is certainly no overlap is that of science, which the Survey of Adult Skills does not cover.
Table 6.9. Comparison of the skill domains assessed by the Survey of Adult Skills and PISA
| Survey of Adult Skills (second cycle, 2022-23) | Survey of Adult Skills (first cycle, 2011-18) | PISA |
|---|---|---|
| Literacy | Literacy | Reading (2000, 2003, 2006, 2009, 2012, 2015, 2018, 2022); Electronic reading (2009) |
| Numeracy | Numeracy | Mathematics (2000, 2003, 2006, 2009, 2012, 2015, 2018, 2022) |
| | | Science |
| Adaptive problem solving | Problem solving in technology-rich environments | Problem solving (2003, 2012); Collaborative problem solving (2015) |
Psychometric links
The two cycles of the Survey of Adult Skills were not designed to allow direct comparisons of their results with those of PISA. Despite similarities in the broad approach to defining the skills assessed, the Survey of Adult Skills and PISA share no common items, and their results cannot be treated as being on the same scale in any of the domains that they ostensibly have in common.
An objective of the first round of PISA was to establish a psychometric link between PISA and the International Adult Literacy Survey (IALS) in the domain of literacy (OECD, 1999[25]). Fifteen prose items from IALS were embedded in the PISA 2000 test booklets for the main study. Items from IALS were not included in the assessments of reading literacy conducted in subsequent rounds of PISA, however.
The outcomes of an analysis investigating whether students taking the PISA 2000 assessment could be placed on the IALS prose literacy scale are reported in Yamamoto (2002[26]) and OECD (2002[27]). Yamamoto concluded that PISA students could be placed on the IALS prose literacy scale.12 OECD (2002[27]) presents the distribution of students in participating countries across the five IALS proficiency levels.
More recently, concordance between the PISA and PIAAC scales was established through a statistical link that exploited a pseudo-equivalent group design (Borgonovi et al., 2017[28]; Pokropek and Borgonovi, 2019[29]). In 2012, when both PISA and the Survey of Adult Skills were administered in Poland, some students who participated in PISA were selected on the basis of attending Grade 10. They were therefore older than 15, and thus also eligible to participate in the Survey of Adult Skills. Scale concordance scores for reading/literacy and for mathematics/numeracy were used to map PISA and PIAAC scales to one another on the basis of this partial sample overlap in Poland and the existence of comparable background information in the Polish questionnaires for the two surveys.
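The concordance itself was derived with more elaborate pseudo-equivalent group methods, but the basic intuition of mapping two scales through groups assumed to be equivalent can be illustrated with a simple equipercentile matching exercise. The sketch below uses simulated score distributions and is only a schematic illustration of that idea, not the procedure used by Borgonovi et al. (2017[28]).

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated scores for two groups assumed to be equivalent (pseudo-equivalent groups)
pisa_scores = rng.normal(500, 90, size=5000)    # PISA-like scale (simulated)
piaac_scores = rng.normal(270, 45, size=5000)   # PIAAC-like scale (simulated)

def concordance(pisa_value: float) -> float:
    """Map a PISA-scale score to the PIAAC-scale score at the same percentile rank."""
    pct = (pisa_scores < pisa_value).mean() * 100
    return float(np.percentile(piaac_scores, pct))

for score in (400, 500, 600):
    print(f"PISA {score} -> PIAAC-scale {concordance(score):.0f} (simulated equipercentile link)")
```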
The relationship between constructs in the domains of literacy and numeracy
While there has been no attempt to link the cycles of the Survey of Adult Skills to the cycles of PISA in any assessment domain, the two studies share a similar approach in terms of the definition of the domains assessed.
Both the Survey of Adult Skills and PISA hold an action-oriented or functional conception of skills. The object of interest is the application and use of knowledge and know-how in common life situations as opposed to the mastery of a body of knowledge or a repertoire of techniques. In defining assessment domains, the emphasis is placed on the purposive and reflective use and processing of information to achieve a variety of goals. To this end, in both surveys, the skills assessed are defined in terms of a set of behaviours through which the skill is manifested and a set of goals that the behaviours in question are intended to achieve.
The Survey of Adult Skills and PISA also share a common approach to the specification of the constructs measured.13 The frameworks defining the constructs specify their features in terms of three dimensions: content, cognitive processes and context. The dimension of content (“knowledge domain” in PISA) relates to the artefacts, tools, knowledge, representations, cognitive challenges, etc., that constitute the corpus to which an individual (an adult, in the case of the Survey of Adult Skills; a 15-year-old student in the case of PISA) must respond or that they must use. Cognitive processes (“competencies” in PISA) cover the mental processes that individuals bring into play to respond to or appropriately use given content. Context (“context and situation” in PISA) refers to the different situations in which individuals read, display numerate behaviour, solve problems or use scientific knowledge.
The similarities and differences between the conceptualisation of the domains of literacy and numeracy in the Survey of Adult Skills and those of reading and mathematics in PISA are compared below through definitions taken from their respective assessment frameworks. The comparison focuses on the latest PISA assessment frameworks: 2018 for reading and 2022 for mathematics.
Literacy and reading
Table 6.10 compares the definition, content, cognitive processes and context dimensions of the literacy frameworks of the two cycles of the Survey of Adult Skills with those of the latest PISA reading literacy framework (PISA 2018).
Table 6.10. Comparison of the Survey of Adult Skills and PISA: Literacy and reading
| | Survey of Adult Skills (second cycle, 2022-23) | Survey of Adult Skills (first cycle, 2011-18) | PISA 2018 |
|---|---|---|---|
| Definition | Literacy is accessing, understanding, evaluating, and reflecting on written texts in order to achieve one’s goals, develop one’s knowledge and potential, and participate in society. | The ability to understand, evaluate, use and engage with written texts to participate in society, to achieve one’s goals, and to develop one’s knowledge and potential. | Reading literacy is understanding, using, evaluating, reflecting on and engaging with texts in order to achieve one’s goals, develop one’s knowledge and potential and participate in society. |
| Cognitive processes | | | |
| Content | Texts characterised by their organisation (density of content, representations and access devices) and source (single or multiple authors/publishers) | Texts characterised by their medium (print-based or digital) and by format | Text format; Text type |
| Context | | | |
| Factors affecting task difficulty | Text-by-task factors (type of match, presence of distracting or irrelevant information) | | |
| Assessment mode | Computer-based (tablet device). One-to-one administration with the presence of an interviewer. | Computer-based (laptop device) and paper-based option. One-to-one administration with the presence of an interviewer. | Computer-based, with paper-based option for countries that were unable to implement a digital survey. Exam-style administration in a school context. |
The two cycles of the Survey of Adult Skills and PISA 2018 share similar conceptualisations of literacy and reading literacy. This is evident in the similarity of cognitive processes that are identified as parts of the assessment domains, the content types and the range of contexts for reading.
Numeracy and mathematics
Table 6.11 compares the definition, content, cognitive processes and context dimensions of the numeracy frameworks of the two cycles of the Survey of Adult Skills with those of the latest PISA mathematical literacy framework (PISA 2022).
Table 6.11. Comparison of the Survey of Adult Skills and PISA: Numeracy and mathematics
| | Survey of Adult Skills (second cycle, 2022-23) | Survey of Adult Skills (first cycle, 2011-18) | PISA 2022 |
|---|---|---|---|
| Definition | Numeracy is accessing, using and reasoning critically with mathematical content, information and ideas represented in multiple ways in order to engage in and manage the mathematical demands of a range of situations in adult life. | The ability to access, use, interpret and communicate mathematical information and ideas in order to engage in and manage the mathematical demands of a range of situations in adult life. | Mathematical literacy is an individual’s capacity to reason mathematically and to formulate, employ and interpret mathematics to solve problems in a variety of real-world contexts. It includes concepts, procedures, facts and tools to describe, explain and predict phenomena. It assists individuals to know the role that mathematics plays in the world and to make the well-founded judgements and decisions needed by constructive, engaged and reflective 21st century citizens. |
| Cognitive processes | | | |
| Content | Mathematical content, information and ideas: Mathematical representations: | Mathematical content, information and ideas: Representations of mathematical content: | |
| Context | | | |
| Assessment mode | Computer-based (tablet device). One-to-one administration with the presence of an interviewer. | Computer-based (laptop device) and paper-based option. One-to-one administration with the presence of an interviewer. | Computer-based, with paper-based option for countries that were unable to implement a digital survey. Exam-style administration in a school context. |
In sum, the two cycles of the Survey of Adult Skills and PISA 2022 have overlapping conceptualisations of numeracy and mathematical literacy. This overlap is evident in the similarity of the cognitive processes identified as part of the assessment domains. However, the second cycle of the Survey of Adult Skills includes an additional content type, mathematical representations, that is not explicitly included in the assessment frameworks of the first cycle or PISA, although these frameworks are likely to include it implicitly. The range of contexts also differs. PISA includes a scientific context, covering mathematical problems in the context of mathematics as a science or field of human endeavour, including mathematics as it is typically studied at school. The first cycle of the Survey of Adult Skills incorporated this as an aspect of the education and training context. In contrast, the second cycle has subsumed it into the social/community context, and it is no longer separately specified.
References
[28] Borgonovi, F. et al. (2017), “Youth in Transition: How Do Some of The Cohorts Participating in PISA Fare in PIAAC?”, OECD Education Working Papers, No. 155, OECD Publishing, Paris, https://doi.org/10.1787/51479ec2-en.
[31] Gaëlle, P. et al. (2014), “STEP skills measurement surveys: Innovative tools for assessing skills”, Social Protection and Labor Discussion Paper, No. 1421, World Bank, Washington, DC, http://documents.worldbank.org/curated/en/516741468178736065/STEP-skills-measurement-surveys-innovative-tools-for-assessing-skills.
[13] Gal, I. et al. (2005), “Adult numeracy and its assessment in the ALL survey: A conceptual framework and pilot results”, in International Adult Literacy Survey: Measuring Adult Literacy and Life Skills: New Frameworks for Assessment, Statistics Canada.
[19] Gesthuizen, M., H. Solga and R. Künster (2010), “Context matters: Economic marginalization of low-educated workers in cross-national perspective”, European Sociological Review, Vol. 27/2, pp. 264-280, https://doi.org/10.1093/esr/jcq006.
[15] Greiff, S. et al. (2021), “PIAAC Cycle 2 assessment framework: Adaptive problem solving”, in The Assessment Frameworks for Cycle 2 of the Programme for the International Assessment of Adult Competencies, OECD Publishing, Paris, https://doi.org/10.1787/3a14db8b-en.
[4] Hogan, J. et al. (2016), U.S. Program for the International Assessment of Adult Competencies (PIAAC) 2012/2014: Main Study and National Supplement Technical Report, National Center for Education Statistics, Institute of Education Sciences, U.S. Department of Education, https://nces.ed.gov/pubs2016/2016036_rev.pdf (accessed on 2 October 2024).
[20] Kalton, G., L. Lyberg and J. Rempp (1998), “Appendix A: Review of methodology”, in Adult Literacy in OECD Countries: Technical Report on the First International Adult Literacy Survey, National Center for Education Statistics, Office of Educational Research and Improvement, Washington, DC.
[2] Keslair, F. and M. Paccagnella (2020), “Assessing adults’ skills on a global scale: A joint analysis of results from PIAAC and STEP”, OECD Education Working Papers, No. 230, OECD Publishing, Paris, https://doi.org/10.1787/ae2f95d5-en.
[10] Murray, T., Y. Clermont and M. Binkley (2005), Measuring Adult Literacy and Life Skills: New Frameworks for Assessment, Statistics Canada, Ottawa.
[8] Murray, T., I. Kirsch and L. Jenkins (eds.) (1998), Adult Literacy in OECD Countries: Technical Report on the First International Adult Literacy Survey, National Center for Education Statistics, Office of Educational Research and Improvement.
[6] OECD (2021), The Assessment Frameworks for Cycle 2 of the Programme for the International Assessment of Adult Competencies, OECD Skills Studies, OECD Publishing, Paris, https://doi.org/10.1787/4bc2342d-en.
[22] OECD (2019), PISA 2018 Results (Volume I): What Students Know and Can Do, PISA, OECD Publishing, Paris, https://doi.org/10.1787/5f07c754-en.
[32] OECD (2019), Skills Matter: Additional Results from the Survey of Adult Skills, OECD Skills Studies, OECD Publishing, Paris, https://doi.org/10.1787/1f029d8f-en.
[5] OECD (2019), Technical Report of the Survey of Adult Skills (PIAAC), Third Edition, OECD, Paris, https://www.oecd.org/content/dam/oecd/en/about/programmes/edu/piaac/technical-reports/cycle-1/PIAAC_Technical_Report_2019.pdf (accessed on 2 October 2024).
[1] OECD (2019), The Survey of Adult Skills: Reader’s Companion, Third Edition, OECD Skills Studies, OECD Publishing, Paris, https://doi.org/10.1787/f70238c7-en.
[16] OECD (2013), The Survey of Adult Skills: Reader’s Companion, OECD Publishing, Paris, https://doi.org/10.1787/9789264204027-en.
[11] OECD (2012), Literacy, Numeracy and Problem Solving in Technology-Rich Environments: Framework for the OECD Survey of Adult Skills, OECD Publishing, Paris, https://doi.org/10.1787/9789264128859-en.
[7] OECD (2010), PISA 2009 Results: What Students Know and Can Do: Student Performance in Reading, Mathematics and Science (Volume I), PISA, OECD Publishing, Paris, https://doi.org/10.1787/9789264091450-en.
[27] OECD (2002), Reading for Change: Performance and Engagement across Countries: Results from PISA 2000, PISA, OECD Publishing, Paris, https://doi.org/10.1787/9789264099289-en.
[25] OECD (1999), Measuring Student Knowledge and Skills: A New Framework for Assessment, OECD Publishing, Paris, https://doi.org/10.1787/9789264173125-en.
[17] OECD (forthcoming), Survey of Adult Skills 2023 Data Analysis Manual, OECD Publishing, Paris.
[18] OECD (forthcoming), Survey of Adult Skills 2023 Technical Report, OECD Publishing, Paris.
[21] OECD/Statistics Canada (2011), Literacy for Life: Further Results from the Adult Literacy and Life Skills Survey, OECD Publishing, Paris, https://doi.org/10.1787/9789264091269-en.
[30] OECD/Statistics Canada (2005), Learning a Living: First Results of the Adult Literacy and Life Skills Survey, OECD Publishing, Paris, https://doi.org/10.1787/9789264010390-en.
[9] OECD/Statistics Canada (2000), Literacy in the Information Age: Final Report of the International Adult Literacy Survey, OECD Publishing, Paris, https://doi.org/10.1787/9789264181762-en.
[29] Pokropek, A. and F. Borgonovi (2019), “Linking via Pseudo‐Equivalent Group Design: Methodological Considerations and an Application to the PISA and PIAAC Assessments”, Journal of Educational Measurement, Vol. 57/4, pp. 527-546, https://doi.org/10.1111/jedm.12261.
[3] Rampey, B. et al. (2016), Skills of U.S. Unemployed, Young, and Older Adults in Sharper Focus: Results from the Program for the International Assessment of Adult Competencies (PIAAC) 2012/2014: First Look, National Center for Education Statistics, Washington, DC, https://nces.ed.gov/pubs2016/2016039.pdf.
[12] Rouet, J. et al. (2021), “PIAAC Cycle 2 assessment framework: Literacy”, in The Assessment Frameworks for Cycle 2 of the Programme for the International Assessment of Adult Competencies, OECD Publishing, Paris, https://doi.org/10.1787/7b3bf33b-en.
[23] Spaull, N. and S. Taylor (2015), “Access to what? Creating a composite measure of educational quantity and educational quality for 11 African countries”, Comparative Education Review, Vol. 59/1, pp. 133-165, https://doi.org/10.1086/679295.
[24] Taylor, S. and N. Spaull (2015), “Measuring access to learning over a period of increased access to schooling: The case of Southern and Eastern Africa since 2000”, International Journal of Educational Development, Vol. 41, pp. 47-59, https://doi.org/10.1016/j.ijedudev.2014.12.001.
[14] Tout, D. et al. (2021), “PIAAC Cycle 2 assessment framework: Numeracy”, in The Assessment Frameworks for Cycle 2 of the Programme for the International Assessment of Adult Competencies, OECD Publishing, Paris, https://doi.org/10.1787/c4221062-en.
[26] Yamamoto, K. (2002), Estimating PISA students on the IALS Prose Literacy Scale, https://typeset.io/pdf/estimating-pisa-students-on-the-ials-prose-literacy-scale-2os020grtp.pdf.
Notes
← 1. See OECD/Statistics Canada (2000[9]; 2005[30]; 2011[21]) for information on the methods and results of IALS and ALL, and OECD (2019[32]) for information on the methods and results of the first cycle of the Survey of Adult Skills.
← 2. Information about LAMP can be found at https://unesdoc.unesco.org/ark:/48223/pf0000217138 and information regarding STEP in Gaëlle et al. (2014[31]).
← 3. In IALS and ALL, prose literacy was defined as the knowledge and skills needed to understand and use continuous texts – information organised in sentence and paragraph formats. Document literacy represented the knowledge and skills needed to process documents (or non-continuous texts) in which information is organised in matrix structures (i.e. in rows and columns). The type of documents covered by this domain included tables, signs, indexes, lists, coupons, schedules, charts, graphs, maps and forms.
← 4. Reading component items were scaled in three steps. First, only the (other) literacy items were scaled. Second, these literacy items were finalised and item fits were evaluated in a way that was not affected by the reading component items. Third, the reading component items were added to the scaling procedure and their item fits were evaluated.
← 5. Numeracy component items were scaled in three steps. First, only the (other) numeracy items were scaled. Second, these numeracy items were finalised and item fits were evaluated in a way that was not affected by the numeracy component items. Third, the numeracy component items were added to the scaling procedure and their item fits were evaluated.
← 6. See chapter 5 for more information on which adults were excluded from the target population.
← 7. Exclusions were permitted for “practical operational” reasons in ALL (OECD/Statistics Canada, 2005[30]). Murray, Kirsch and Jenkins (1998, p. 27[8]) provide a list of exclusions in participating countries for the first wave of IALS.
← 8. The first round involved nine countries: Canada, France, Germany, Ireland, the Netherlands, Poland, Sweden, Switzerland and the United States. France withdrew from the study in 1995 citing concerns regarding data quality.
← 9. A technical report covering the first wave of IALS was published in 1998 (Murray, Kirsch and Jenkins, 1998[8]). Some information on the implementation of the second and third waves of IALS and of ALL is available in the methodological appendices of OECD/Statistics Canada (2000[9]; 2005[30]; 2011[21]). However, no technical reports covering the second and third waves of IALS or the two waves of ALL have been released.
← 10. To minimise literacy-related non-response, it was possible to rely on translators or interpreters (either family members or staff of the survey organisation) to help respondents answer the background questionnaire. Sweden is the only country that, in both cycles of the survey, was able to provide a sufficient number of interpreters so that all respondents, even those with severe language barriers, were able to complete the full background questionnaire. For this reason, no respondent in Sweden was classified as a literacy-related non-respondent in the first cycle, and no respondent took the doorstep interview in the second cycle. This improved the precision of the proficiency estimates for such respondents in Sweden, because richer information is available for them. However, it also introduced a small difference in survey operations, as adults who were unable to participate because of language barriers were treated slightly differently in Sweden than in other countries. As such adults constitute a very small percentage of the population, the threat to comparability remains low. Nevertheless, some caution is warranted when analysing results for subgroups of the population (for example, recent immigrants) in which adults with very low language proficiency can constitute a larger share.
← 11. Fifteen-year-olds in home schooling may be an exception.
← 12. Some block-order effects (i.e. responses were affected by where items were placed in the assessment) were found for the IALS items administered in PISA that were not present in IALS itself.
← 13. This reflects the influence of the IALS frameworks on the development of both the PISA literacy framework (OECD, 1999[25]) and the literacy framework of the Survey of Adult Skills.