Survey of Adult Skills – Reader’s Companion

2. Measuring cognitive skills in the 2023 Survey of Adult Skills

Abstract
This chapter describes the approach taken by the 2023 Survey of Adult Skills to measure proficiency in literacy, numeracy and adaptive problem solving. It discusses the content, cognitive processes and contexts applicable to the assessment and provides some examples of assessment items.
A unique feature of the Survey of Adult Skills is its inclusion of a direct assessment of the information-processing skills of participating adults. The information gathered through this assessment allows us to estimate the distribution of skills among the adult population in participating countries and economies.
The 2023 Survey of Adult Skills assessed adults in three domains: literacy, numeracy and adaptive problem solving. These skills are deemed essential for full participation in the economic and social life of modern societies. Results from past adult skills surveys – such as the International Adult Literacy Survey (IALS, conducted in the mid-1990s), the Adult Literacy and Life Skills Survey (ALL, conducted in the mid-2000s) and the first cycle of the Survey of Adult Skills (conducted between 2012 and 2018) – have repeatedly demonstrated the importance of literacy and numeracy skills for economic and non-economic outcomes.1 Adaptive problem solving is a new domain assessed for the first time in the 2023 Survey of Adult Skills.
This chapter provides an overview of the approach to assessing adult skills. It then describes in greater detail how the skills to be measured were conceptualised and how assessment items were developed. The chapter also provides examples of some assessment items.
Some key features of the assessment
A focus on key information-processing skills
The assessment tasks in the Survey of Adult Skills focus on respondents’ ability to draw on information-processing strategies to perform tasks in real-world contexts. Whereas large-scale assessments of school-age populations may focus on the sets of skills that students are expected to have mastered at key stages of their education (without being linked specifically to any particular curriculum), the assessment tasks in the Survey of Adult Skills are designed to measure a broad set of foundational skills required to successfully interact with the range of real-life tasks and materials that adults encounter in everyday life.
Successful completion of these tasks does not require specialised knowledge or more specific skills: in this sense, the skills assessed in the Survey of Adult Skills can be considered “foundational” or, more appropriately, the general skills required across a very broad range of situations and domains. In no way should they be seen as basic skills that are less complex than higher-order or specialised skills. The Survey of Adult Skills does not take a prescriptive approach in defining a minimum level of skills that adults are supposed to achieve to “fully function” in modern societies.
Reflecting the changing nature of information
Data-intensive, complex digital environments are more and more pervasive in both the workplace and everyday life, and it has become increasingly important for adults to be able to navigate, critically analyse and solve problems in these new environments. The conceptual frameworks underlying the three assessment domains all emphasise this changing nature of information as a critical feature that had to be reflected in the assessment tasks if the survey results were to be truly informative about the skills adults need in today’s societies.
To meet this goal, the literacy and numeracy frameworks used in the first cycle of the Survey of Adult Skills have been updated, and a number of innovative aspects introduced to reflect the types of tasks found in digital environments. For example, some literacy tasks require participants to consult multiple sources of information, including both static and dynamic texts, in order to respond. Similarly, some numeracy tasks include dynamic applications that require interactive, digitally based tools. The new domain of adaptive problem solving was also developed to account for the digital environments adults now routinely navigate.
Accounting for very high and very low levels of proficiency
As well as including tasks that focus on the more sophisticated strategies required in digital environments, it was equally important to assess the skills of adults with more limited proficiency. In all domains, care has been taken to design items of varying levels of difficulty to provide as much coverage as possible across the entire ability distribution in all participating countries.
For example, the adaptive problem solving domain includes a set of “static” tasks without the dynamic features that call for adaptive strategies; these provide some measure of basic problem solving ability among adults with more limited skills. In the case of literacy and numeracy, the assessment includes two types of tasks specifically designed to measure skills at the lower end of the proficiency distribution: the locator tasks and the component measures. The locator tasks are among the easier tasks in the assessment and consist of eight numeracy and eight literacy items. All respondents take these 16 items.2 The component tasks provide information about the basic reading and numeracy skills that support proficient performance in each domain. Respondents who struggle with the locator tasks are asked to complete the component tasks in order to collect some information about their foundational reading and numeracy skills.
An overview of literacy, numeracy and adaptive problem solving
Panels of subject-matter experts developed the conceptual frameworks for each domain, guided the development and selection of items, and informed the interpretation of results.3 The complete domain frameworks can be found in OECD (2021[1]). The frameworks define and describe the underlying latent skills the assessment aims to measure. To inform item development, they identify the key task dimensions that should be used to build the assessment and report results. Across the domains, the dimensions focus on:
Content: the various representations of information, or types of materials and tools, that adults use to complete tasks.
Cognitive processes: the information-processing strategies required to use specific materials to meet task demands successfully.
Contexts: the social and situational contexts in which tasks are embedded.
Table 2.1 provides an overview of the definition, content, cognitive processes and contexts for each of the three domains. The remainder of this chapter describes these dimensions in greater detail for each domain.
Table 2.1. Summary of assessment domains in the 2023 Survey of Adult Skills
| | Literacy | Numeracy | Adaptive problem solving (APS) |
|---|---|---|---|
| Definition | Literacy is accessing, understanding, evaluating and reflecting on written texts in order to achieve one’s goals, to develop one’s knowledge and potential, and to participate in society. | Numeracy is accessing, using and reasoning critically with mathematical content, information and ideas represented in multiple ways in order to engage in and manage the mathematical demands of a range of situations in adult life. | Adaptive problem solving involves the capacity to achieve one’s goals in a dynamic situation, in which a method for reaching a solution is not immediately available. It requires engaging in cognitive and metacognitive processes to define the problem, search for information, and apply a solution in a variety of information environments and contexts. |
| Content | Content within the literacy domain includes both static and interactive texts, characterised by their source (single or multiple) and by their format (continuous, non-continuous or mixed). Texts reflect a range of genres (e.g. narrative, descriptive and argumentative) and can be organised using a variety of layout features, content representations and digital tools such as scrolling and hyperlinks. | Mathematical content associated with numeracy tasks spans four types of representations of information: text or symbols, images of objects, structured information and dynamic applications. Numeracy content also reflects four key areas of mathematical content: quantity and number, space and shape, change and relationships, and data and chance. | The environments in which adaptive problem solving tasks are embedded are characterised by the problem configuration, the dynamics of the situation and the features of the environment. APS tasks draw on three types of information sources: physical, social and digital resources. |
| Cognitive processes | Accessing text; understanding; evaluating. | Accessing and assessing situations mathematically; acting on and using mathematics; evaluating, critically reflecting and making judgements. | Both cognitive and metacognitive processes, each of which may be required in the three stages of problem solving: defining the problem, searching for information and applying a solution. |
| Contexts | Work and occupation; personal; community and citizenship; education and training. | Work; personal; social/community. | Work; personal; social/community. |
Literacy
The conceptual framework for the literacy domain is largely based on the one used in the first cycle of the Survey (OECD, 2012[2]). For this cycle, the Literacy Expert Group suggested updates to reflect the growing importance of reading in digital environments, which pose different cognitive demands and challenges to the reader. In particular, the new framework emphasises that readers increasingly need to interact effectively with the multiple texts that they often encounter online.
Definition
The framework used in the 2023 Survey of Adult Skills defines literacy as “accessing, understanding, evaluating and reflecting on written texts in order to achieve one’s goals, to develop one’s knowledge and potential, and to participate in society” (Rouet et al., 2021[3]).
The framework further elaborates on the key aspects of this definition:
The term literacy is used in its broadest, but also most literal, sense to describe the ability to read written language presented in the form of texts and documents.
Readers engage in accessing texts when they search for texts, or passages within texts, that are relevant to their purpose.
Any literate activity requires some level of understanding. This can range from the most basic skills, such as the literal comprehension of words and sentences, to more complex inferential skills, such as understanding the dispute between two authors making conflicting claims about an argument.
Evaluating involves making judgements about a text, which may include deciding whether it is appropriate for the task at hand or whether it presents accurate and reliable information.
Written texts are defined as including both static and interactive materials. The latter, which are reflective of digital environments, may include features such as hyperlinks.
In line with this definition, the framework defined three core dimensions of literacy: content, cognitive processes and social contexts.
Content
In everyday life, readers engage with a variety of content and read for a range of purposes. As noted in the definition, the literacy assessment focuses on the comprehension of written texts. Texts are further classified by:
source
format
type (or genre)
organisation.
Source: texts can be single or multiple. Single texts originate from a single source – e.g. a single author, publication medium and date of publication. Multiple texts have different authors, or are published in different media or at different times.
Format: texts can be continuous, non-continuous or mixed. Continuous texts have sentences organised into paragraphs. Examples include newspaper articles, essays, brochures and announcements. Non-continuous texts include tables, graphs, forms and diagrams, where information is often displayed in lists. Examples include restaurant menus, tables showing interest rates or sports rankings, and lists of navigation links shown on a web page. Mixed texts include both continuous and non-continuous elements. A web page or article with paragraphs of information supported by a table or graph is an example of a mixed text.
Type: texts can be divided into six types – description, narration, exposition, argumentation, instruction and transaction – which cover most texts adults will encounter in everyday life.
Organisation: the devices used to present content and facilitate access to information often affect how texts are organised. These include layout features and content representations, such as titles, headings and, in the case of longer texts, chapters and indices. Digital texts may include tools such as windows, scroll bars, tabs and hyperlinks.
Cognitive processes
The framework identifies three cognitive processes that support the range of adult reading activities that are the focus of the literacy assessment:
accessing
understanding
evaluating.
Accessing texts involves identifying one or more texts that are relevant to a presented task and locating information within them. Readers must navigate across texts or passages, or within texts, as a function of task demands. In an assessment context, the complexity of an access task is driven by the interaction between the question posed to the test taker and the features of the presented texts.
Understanding consists of constructing meaning and representations. This aspect includes both literal and inferential comprehension of material within a single text, as well as across multiple texts.
Evaluating involves assessing the accuracy and credibility of information in a text; evaluating the soundness of a text (i.e. the completeness and consistency of the information); evaluating the relevance of one or more texts for a given task; and reflecting on the author’s intent, purpose and effectiveness.
Contexts
Adult reading typically occurs within a social setting. The context may influence both the motivation to read and the interpretation of content. As a result, the texts in the literacy assessment derive from four contexts that will be familiar to a broad range of participants. These are:
work and occupation
personal
community and citizenship
education and training.
Texts related to work contexts include general workplace materials associated with finding employment, finance and being on the job. Examples include job listings, workplace policies and employment practices. The framework notes that specialised job-specific texts are not appropriate for inclusion in the assessment due to the background knowledge required.
Materials in the personal context include texts associated with interpersonal relationships, personal health and safety, home and family, consumer economics, and leisure and recreation. Examples include articles on disease prevention, safety and accident prevention, housing, and personal finance.
Texts in the community context are associated with community resources, public services and staying informed. They include official documents, community announcements, blog posts, bulletin boards and news.
Finally, materials related to education and training focus on opportunities for further learning and personal or professional goals.
Distribution of test items by task characteristics
A total of 80 literacy items were included in the final item pool. Items were selected from this pool to construct the assessments administered to adults in the 2023 Survey of Adult Skills. The selected items had to:
provide accurate and reliable measurement of the construct across a range of difficulties
meet the target distribution of the key dimensions of literacy as defined in the framework
include enough items used in previous surveys to ensure comparability of results
fulfil the requirements of the adaptive main study assessment design.
Table 2.2 describes the item pool according to the characteristics presented above.
Table 2.2. Distribution of literacy items across the framework dimensions
| | Number (80 items) | Percent |
|---|---|---|
| Cognitive process | | |
| Accessing text | 30 | 38% |
| Understanding | 35 | 44% |
| Evaluating | 15 | 19% |
| Text source | | |
| Single | 51 | 64% |
| Multiple | 29 | 36% |
| Text format | | |
| Continuous | 40 | 50% |
| Non-continuous | 25 | 31% |
| Mixed | 15 | 19% |
| Context | | |
| Work and occupation | 9 | 11% |
| Personal | 33 | 41% |
| Community and citizenship | 28 | 35% |
| Education and training | 10 | 13% |
Sample literacy items
This section presents three example literacy items. The items are shown using screenshots of the displays that appear on the tablet used to deliver the assessment. To view and interact with the full set of sample items, see https://www.oecd.org/en/about/programmes/piaac/piaac-released-items.html.
Sample item 1: Bread, question 1
This first example, one of three items in this unit, is an easy item and focuses on the following aspects of the literacy construct:
process: accessing text
source: single
text format: continuous
text display: static
context: personal.
Participants must locate and tap on the sentence that states the moisture level at which crackers become soft. Each sentence in the passage can be selected, or deselected, by tapping on it. This item is relatively easy because crackers are only addressed in the last paragraph of this short passage and only one sentence mentions “soft” crackers.
Figure 2.1. Sample literacy item 1: Bread
Sample item 2: Bread, question 2
This second item is somewhat more difficult. Readers must make inferences based on the information presented in the text in order to determine if a set of statements is true for bread, crackers or both. Respondents are asked to tap on a response for each of the presented statements. Only one response can be selected for each row.
The item focuses on the following aspects of the literacy construct:
process: understanding
source: single
text format: continuous
text display: static
context: personal.
Figure 2.2. Sample literacy item 2: Bread
Prior to the next item in this unit, a second text is introduced for respondents to read. Respondents see a transition screen that says, “You look on the web and find a short article with more information about retrogradation. Tap on the NEXT arrow to read the article.”
This staged presentation of stimuli is used throughout the literacy assessment in cases where multiple texts are included in a unit.
Sample item 3: Bread, question 3
As shown in the image below, the second text displays on its own tab on the right side of the screen. While respondents can tap on the tabs at the top of the screen to toggle back and forth between the available texts, only the information presented in the second text is required to answer this third question. Note that if the question required respondents to use information in both the first and second texts, the source would be classified as multiple.
This item is of medium difficulty. As in Sample item 2, readers must make inferences based on the information presented in the text in order to put three storage methods in order. Respondents must drag and drop each method into one location in order to answer.
The item focuses on the following aspects of the literacy construct:
process: understanding
source: single
text format: continuous
text display: static
context: personal.
Figure 2.3. Sample literacy item 3: Bread
Reading components
Reading components represent the basic set of decoding skills that are essential for extracting meaning from written texts. As in the first cycle of the Survey of Adult Skills, the assessment of reading components has been included to provide more information about the skills of adults at the lower end of the literacy proficiency scale.
Two types of reading component tasks are included in the 2023 Survey of Adult Skills: sentence comprehension and passage comprehension. Sentence comprehension tasks ask respondents to identify whether or not a sentence makes sense. Passage comprehension tasks ask respondents to read a short passage as it displays on screen sentence by sentence. For each sentence containing a pair of underlined words, respondents are asked to identify the word that makes sense in the sentence.
For both types of tasks, timing data are collected as well as the answers respondents give to the items. Timing data are useful as they provide a measure of fluency, but they do not contribute to the estimation of literacy proficiency.
Sample reading component items
Sentence comprehension
A single sentence is displayed on the screen, and the respondent is asked to indicate whether the sentence makes sense. As soon as the respondent taps on “YES” or “NO”, the screen displays the next sentence.
Figure 2.4. Sample reading component item: Sentence comprehension
Additional sample sentence comprehension items include:
Two boys threw the wall.
The lightest balloon floated in the bright sky.
A comfortable pillow is soft and rocky.
Passage comprehension
Respondents are asked to read a short article which builds sentence by sentence on the screen. Most sentences include two underlined words, and respondents are asked to select the one that best completes the sentence. The example below shows a portion of one article where three selections have already been made and a fourth sentence has just been displayed on the screen.
Figure 2.5. Sample reading component item: Passage comprehension
Numeracy
The development of the numeracy framework for the 2023 Survey of Adult Skills was inspired by a review conducted by a group of experts charged with identifying changes in the field since the framework for the first cycle of the survey was conceived and suggesting appropriate revisions and updates (Tout et al., 2017[4]).
That review urged that the framework for the 2023 Survey of Adult Skills should:
reflect the importance of digital information, representations, devices and applications as realities that adults have to manage in dealing with the numerical demands of everyday life
incorporate a wide range of different mathematical and quantitative skills and knowledge and avoid a narrow view that sees numeracy as only dealing with numbers and arithmetical operations
better emphasise the importance of critical thinking, reflection and reasoning in the context of numeracy
describe the full range of numeracy skills in the adult population.
Definition
The framework for the 2023 Survey of Adult Skills defines numeracy as “accessing, using and reasoning critically with mathematical content, information and ideas represented in multiple ways in order to engage in and manage the mathematical demands of a range of situations in adult life” (Tout et al., 2021[5]).
Key changes from the definition used in the first cycle of the survey include:
The elimination of the word “ability”, as it was thought to imply an innate ability that some people may not possess. This implication was not aligned with the Numeracy Expert Group’s belief that all adults have the capacity to learn mathematics and apply it successfully in their lives.
The phrase “interpret and communicate” has been replaced with “reason critically” to align with the expert group’s view that one core cognitive process for numeracy, particularly in the context of technology environments, is the ability to evaluate, critically reflect and make judgements.
The phrase “represented in multiple ways” was included to reflect the importance of digital information, representations, devices and applications in meeting the numeracy demands of everyday life.
Three core dimensions of numeracy were defined: content (including both areas of mathematical content and different representations of information), cognitive processes and contexts.
Content
As in the first cycle of the survey, the assessment covers four key areas of mathematical content, information and ideas:
quantity and number
space and shape
change and relationships
data and chance.
Quantity and number involve understanding ordering, counts, place value, magnitudes, indicators, relative size and numerical trends. Space and shape cover understanding and using measurement systems and formulas, dimensions and units, location and direction, geometric shapes and patterns, angle properties, symmetry, transformations, and two- and three-dimensional representations and perspectives. Change and relationships cover understanding ways to describe, model and interpret mathematical relationships, quantitative patterns and change. This involves understanding, using and applying proportional reasoning and rates of change, including the use and application of ratios, and recognising, describing and/or using relationships between different variables. Data and chance include topics such as data collection, data displays, charts and graphs, measures of central tendency and variance, and understanding and knowing about chance and probability.
In addition, the framework identifies four types of representations that are found in real-world numeracy tasks:
text or symbols
images of objects
structured information
dynamic applications.
Texts in the assessment can include symbols and numerical information. Note that in order to limit the impact of reading skills on the numeracy assessment, any text-based stimuli were kept short, simple and direct. Images of objects include photos or images of physical objects. Structured information consists of data or information represented in tables, graphs, charts and maps and may include calendars, schedules, timetables and infographics. Dynamic applications include interactive applications, animations, and applications supporting calculations such as loan calculators, currency or measurement converters, spreadsheets and drawing programs.
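To give a concrete sense of the calculations that such dynamic tools support, the sketch below implements a standard amortised-loan repayment formula in Python. The loan figures and the function name are illustrative assumptions, not material drawn from the assessment.

```python
def monthly_payment(principal: float, annual_rate: float, years: int) -> float:
    """Standard amortised-loan formula: fixed monthly repayment for a given principal, rate and term."""
    r = annual_rate / 12          # monthly interest rate
    n = years * 12                # number of monthly payments
    if r == 0:
        return principal / n
    return principal * r * (1 + r) ** n / ((1 + r) ** n - 1)

# Hypothetical loan for illustration only: 200 000 borrowed at 4.5% over 25 years.
print(f"{monthly_payment(200_000, 0.045, 25):.2f}")   # roughly 1112 per month
```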
Cognitive processes
The framework identifies three cognitive processes associated with numeracy skills:
accessing and assessing situations mathematically
acting on and using mathematics
evaluating, critically reflecting and making judgements.
In order to access and assess situations mathematically, adults must be able to examine a contextual problem and determine if and where they can extract the essential mathematics to analyse, set up and solve the problem.
Acting on and using mathematics includes the processes of ordering, counting, estimating, computing, measuring, graphing and drawing. Adults must use their knowledge of mathematical processes, facts and procedures to solve real-world problems. Where relevant, they must also select and use appropriate tools, including those present in digital environments.
Evaluating, critically reflecting and making judgements is the process of evaluating whether a solution to a real-world problem is reasonable and relevant to the original problem situation and context. Based on these judgements, a decision can be made about whether to accept the solution or revise and adjust it.
Contexts
Tasks in the numeracy assessment reflect three real-world context areas that are important for adults:
work
personal
social and community.
Mathematical situations encountered at work are typically more specialised than those in everyday personal life. Examples include completing purchase orders, maintaining inventories, managing schedules, interpreting workplace diagrams, and making and recording measurements.
Tasks in the personal context focus on numeracy-related activities for individuals and their immediate families. These include those associated with handling money and personal or family finances, health and well-being, shopping, personal time management, and travel and holiday planning.
Adults must be able to use quantitative data and statistics to interpret information presented by a range of community or governmental authorities, as well as perform tasks associated with community activities and social events. Sample tasks in this category include understanding graphs and numerical information presenting local or national crime or health data.
Distribution of test items by task characteristics
A total of 80 numeracy items were included in the final item pool. Items were selected from this pool to construct the assessment testlets administered in the 2023 Survey of Adult Skills. Table 2.3 shows the distribution of the pool items.
Table 2.3. Distribution of numeracy items across the framework dimensions
| | Number (80 items) | Percent |
|---|---|---|
| Cognitive process | | |
| Access and assess situations mathematically | 23 | 29% |
| Act on and use mathematics | 38 | 48% |
| Evaluate, critically reflect, make judgements | 19 | 24% |
| Representation | | |
| Text or symbols | 15 | 19% |
| Images of objects | 11 | 14% |
| Structured information | 39 | 49% |
| Dynamic applications | 15 | 19% |
| Mathematical content area | | |
| Quantity and number | 19 | 24% |
| Space and shape | 16 | 20% |
| Change and relationships | 17 | 21% |
| Data and chance | 28 | 35% |
| Context | | |
| Work | 25 | 31% |
| Personal | 26 | 33% |
| Social/community | 29 | 36% |
Sample numeracy items
This section presents three example numeracy items. To view and interact with the full set of sample items, see https://www.oecd.org/en/about/programmes/piaac/piaac-released-items.html.
Sample item 1: Tolerances
This first item is a multi-part multiple-choice item. Respondents are presented with a scenario about a coolroom – a room that keeps foods frozen at a food processing company – which must maintain a temperature within the range of -20°C to -15°C. In the question itself, respondents are given a table of different temperatures and asked to identify whether each temperature is within the acceptable range.
This item focuses on the following aspects of the numeracy construct:
process: access and assess situations mathematically
content: space and shape
representation: images of physical objects
context: work.
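The reasoning this item calls for is a simple range check. Below is a minimal Python sketch of that logic; the temperature readings and the function name are hypothetical, as the values shown in the actual item are not reproduced here.

```python
# Acceptable coolroom temperature range described in the item (degrees Celsius).
LOWER, UPPER = -20.0, -15.0

def within_tolerance(temperature_c: float) -> bool:
    """Return True if a reading falls inside the acceptable range (inclusive)."""
    return LOWER <= temperature_c <= UPPER

# Hypothetical readings for illustration only; the item presents its own table of values.
readings = [-21.5, -19.0, -15.0, -14.2]
for t in readings:
    print(f"{t:>6} °C -> {'within range' if within_tolerance(t) else 'out of range'}")
```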
Figure 2.6. Sample numeracy item 1: Tolerances
Sample item 2: Render Mix
In this second sample item respondents must calculate the size of the wall to be covered by the render mix and then use the information on the packaging to determine how many kilograms of mix are needed.
The item focuses on the following aspects of the numeracy construct:
process: act on and use mathematics
content: space and shape
representation: images of physical objects
context: work.
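The calculation behind this item is a two-step area-and-coverage computation. The sketch below illustrates it with hypothetical wall dimensions and a hypothetical coverage rate standing in for the packaging information; none of these values, nor the function name, come from the actual item.

```python
import math

def render_mix_needed(wall_width_m: float, wall_height_m: float, coverage_kg_per_m2: float) -> float:
    """Compute the kilograms of render mix needed to cover a rectangular wall."""
    area_m2 = wall_width_m * wall_height_m
    return area_m2 * coverage_kg_per_m2

# Hypothetical values for illustration only.
kg_needed = render_mix_needed(wall_width_m=4.0, wall_height_m=2.5, coverage_kg_per_m2=5.0)
print(f"Mix required: {kg_needed:.1f} kg")                    # 50.0 kg for a 10 m2 wall at 5 kg/m2
print(f"Bags of 20 kg to buy: {math.ceil(kg_needed / 20)}")   # rounded up to whole bags
```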
Figure 2.7. Sample numeracy item 2: Render Mix
Sample item 3: Wallpaper
The third sample item is a numeric entry item, requiring respondents to fill in a number to answer. It uses a novel interactive tool, called “wallpaper calculator”. For this item, the wallpaper calculator has already been used to determine the number of rolls needed. However, an error was made with one or more values that were entered into the tool. The task is to identify the error(s) and enter the correct value(s).
The item focuses on the following aspects of the numeracy construct:
process: evaluate, critically reflect, make judgements
content: space and shape
representation: dynamic application
context: personal.
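The wallpaper calculator is an interactive tool embedded in the item. As a rough illustration of the kind of calculation such a tool performs, the sketch below estimates the number of rolls needed from hypothetical room and roll dimensions; the actual tool's inputs, formula and values are not reproduced here.

```python
import math

def rolls_needed(wall_perimeter_m: float, wall_height_m: float,
                 roll_width_m: float = 0.53, roll_length_m: float = 10.0) -> int:
    """Estimate how many wallpaper rolls cover the walls of a room (no window or door deductions)."""
    strips_per_roll = int(roll_length_m // wall_height_m)        # full-height strips cut from one roll
    strips_required = math.ceil(wall_perimeter_m / roll_width_m)  # strips needed to go around the room
    return math.ceil(strips_required / strips_per_roll)

# Hypothetical room for illustration only: 3 m x 4 m floor (14 m perimeter), walls 2.4 m high.
print(rolls_needed(wall_perimeter_m=14.0, wall_height_m=2.4))    # 7 rolls
```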
Figure 2.8. Sample numeracy item 3: Wallpaper
Numeracy components
The numeracy component assessment is a new element of the 2023 Survey of Adult Skills and the first assessment of its kind in the context of large-scale international adult surveys. As with reading components, numeracy components should be thought of as basic numeracy skills that are a prerequisite for developing the more advanced skills measured in the numeracy assessment. The inclusion of numeracy components allows numeracy skills at the lowest end of the proficiency distribution to be measured more accurately.
Based on their review of the literature and consideration of both conceptual issues and delivery constraints, the Numeracy Expert Group recommended that the numeracy components assessment focus on number sense. Number sense relates to the understanding of quantities and how numbers represent quantities. The numeracy components items ask participants to estimate quantities from real-life pictures and to estimate the relative size of several numerical representations of quantities.
Numeracy components include two types of fluency-based measures, each focusing on different aspects of number sense: how many and which is biggest. How many tasks ask respondents to look at an image and identify how many items are shown. Which is biggest tasks ask respondents to identify the largest of four numbers.
Sample numeracy component items
How many?
Respondents are shown a screen with an image of a set of objects and are asked to tap on a number to indicate how many items are shown. As soon as a number is selected, the next screen displays. The items vary in terms of the number of objects shown and the format in which they are displayed (e.g. presented in an organised array, grouped or in a random visual display).
Figure 2.9. Sample numeracy component item: How many?
Which is biggest?
Respondents are shown a group of four numbers and asked to tap on the number that is biggest. As with the “how many?” items, once a selection is made, the next screen displays.
Figure 2.10. Sample numeracy component item: Which is biggest?
Additional “Which is biggest?” items display sets of numbers such as:
58 35 16 81
336 313 352 381
67.91 4.7 83 0.96
78.1 81.7 8.71 7.91
Adaptive problem solving
Adaptive problem solving (APS) is a new domain whose conceptual framework was specifically developed for the 2023 Survey of Adult Skills. APS was introduced to replace the assessment of problem solving in technology-rich environments (PSTRE) that was administered in the first cycle of the survey. PSTRE was exclusively focused on problems in digital environments and, as a consequence, it conflated problem solving and information and communication technologies (ICT) skills, as only test-takers with some (basic) ICT skills could participate and display proficiency in this domain. There was a sizeable non-response due to lack of familiarity with ICT devices or poor computer skills (between 8% and 57%, depending on the country). APS was therefore conceptualised as “general” problem solving, relevant to a range of information environments and contexts and not limited to digitally embedded problems.
The underlying conceptual model for APS was developed by Greiff et al. (2017[6]) who defined the three stages of problem solving (definition, search and application), the cognitive and metacognitive processes associated with each stage, and the information environments in which problem-solving tasks are situated. This model highlighted important aspects of adaptive problem solving that should be a focus of the assessment. Additional specifications to define the scale and inform item development were undertaken by the APS Expert Group and articulated in the final framework for this domain.
Definition
The conceptual framework defines adaptive problem solving as “(…) the capacity to achieve one’s goals in a dynamic situation, in which a method for solution is not immediately available. It requires engaging in cognitive and metacognitive processes to define the problem, search for information, and apply a solution in a variety of information environments and contexts” (Greiff et al., 2021[7]).
This definition highlights features that distinguish adaptive problem solving from problem solving more generally. One critical feature of adaptive problem solving tasks is that they involve dynamic situations, in which the resources and information needed to solve a problem are not readily available, or some aspect of the problem changes while the solution is being developed. The definition also emphasises the importance of metacognition: metacognitive skills are called upon to monitor the problem-solving process and to adapt solution strategies as a problem changes.
Content
The APS framework identifies three features of tasks that require adaptive problem solving skills:
problem configuration
dynamics of the situation
features of the environment.
Items were not classified according to these features; instead, the test developers manipulated aspects of these features to develop tasks spanning a range of difficulty.
The problem configuration refers to the initial problem setup and goal state(s). This includes the elements presented in the problem, the relationships among those elements (e.g. whether they interact with each other or are independent) and the resources or operators made available to the problem solver.
The dynamics of the situation refers to the change (or absence of change) in the problem situation, the problem constraints across time and how these affect the problem configuration. It is the dynamism of the problem that requires respondents to demonstrate adaptive problem solving skills.4 The number of features that change, along with the frequency and salience of those changes, drives the difficulty of an APS task.
Features of the environment refers to various features that are characteristic of the environment and the information and resources that are available. The adaptive problem solving process is affected by the amount and sources of available information and how relevant it is to solve the problem.
The APS framework also identifies the types of information sources that are available to solve a problem:
physical resources
social resources
digital resources.
Physical resources are hands-on and can be manipulated. These might include resources available for driving a car or operating machinery by pressing buttons and pulling levers. Social resources require the problem solver to engage in interpersonal and social interactions. These include planning an activity with friends or leading a group discussion. Digital resources require the problem solver to use digital knowledge and skills to interact with digital features or devices. Examples include using digital tools to sort a table, send an email or format text.
Cognitive and metacognitive processes
As stated in the definition, adaptive problem solving involves both cognitive and metacognitive processes. Metacognitive processes become more important as problems become more complex and have aspects that change during the course of solving them. Different cognitive and metacognitive processes may be required within each of the three stages of problem solving: defining the problem, searching for information relevant to the solution of the problem and applying a solution.
Defining the problem
The three cognitive processes associated with this stage of problem solving are: selecting, organising and integrating problem information into a mental model; retrieving relevant background information; and externalising an internal problem representation by creating a table, making a drawing, etc. The two metacognitive processes are goal setting and monitoring problem comprehension.
Searching for information relevant for a solution
The framework identifies two cognitive processes associated with searching for a solution: searching for operators, within the mental model and in the environment, and evaluating how well these operators satisfy the problem constraints. The metacognitive processes associated with searching for a solution involve evaluating resources with respect to whether they can be executed. In the artificial problem-solving context of an assessment, this evaluation process is difficult to distinguish from the cognitive processes described above. Therefore, the expert group specified that items that tapped into this process should be coded for analysis as requiring both cognitive and metacognitive processes.
Applying a solution
The primary cognitive process in the third stage is to implement the selected operator(s) to solve the problem. As part of the metacognitive processes associated with applying a solution, problem solvers must evaluate whether they are progressing towards the goal and take actions if they are not. This involves monitoring progress, regulating the application of the operators and reflection.
Contexts
The situational contexts in which a problem might be embedded are:
work
personal
social and community.
Problems in a work context might include situations where one is working under supervision or with co-workers. Tasks in a personal context include problems related to one’s home, family, education, hobbies and finances. Social and community tasks may include interactions with others in leisure activities or use of community resources.
Distribution of test items by task characteristics
A total of 65 adaptive problem solving items were included in the final item pool. Items were selected from this pool to construct the assessments administered in the 2023 Survey of Adult Skills. Table 2.4 shows the distribution of the pool items.
Table 2.4. Distribution of adaptive problem solving items across the framework dimensions
| | Number (65 items) | Percent |
|---|---|---|
| Information environment | | |
| Digital | 26 | 40% |
| Physical | 24 | 37% |
| Social | 15 | 23% |
| Cognitive processes | | |
| Definition | 19 | 29% |
| Searching | 33 | 51% |
| Application | 13 | 20% |
| Metacognitive processes | | |
| Definition | 23 | 35% |
| Searching | 22 | 34% |
| Application | 12 | 19% |
| Not applicable (static items) | 8 | 12% |
| Context | | |
| Work | 26 | 40% |
| Personal | 27 | 42% |
| Social/community | 12 | 18% |
Note: Static items with no dynamic features do not require the application of metacognitive strategies.
Sample adaptive problem solving items
All the sample items for adaptive problem solving are taken from a single unit, in which the problem solver is asked to use an interactive map to accomplish pre-defined goals. Initially, the situation is static; it then becomes dynamic as obstacles change the presented problem and the available solutions. To view an interactive version of these sample items, see https://www.oecd.org/en/about/programmes/piaac/piaac-released-items.html.
Sample item 1: Best Route, question 1
In the first item, the problem solver needs to use an interactive map to find the fastest route to accomplish three goals, keeping a set of time constraints in mind.
The problem solver needs to: take a child to school by a designated time, purchase groceries and return home by a designated time. The total driving time (shown at the bottom right of the screen) updates as the route is selected by the respondent. This could be considered a standard problem-solving task, in which a solution needs to be found given some constraints that need to be satisfied.
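Conceptually, the task amounts to searching over possible orderings of the stops and comparing total driving times against the stated constraints. The sketch below illustrates this with a hypothetical driving-time matrix; the locations, times and simplified constraint handling are illustrative assumptions and do not reproduce the interactive map used in the item.

```python
from itertools import permutations

# Hypothetical driving times in minutes between locations (the actual item uses an interactive map).
times = {
    ("home", "school"): 10, ("school", "home"): 10,
    ("home", "shop"): 8,    ("shop", "home"): 8,
    ("school", "shop"): 6,  ("shop", "school"): 6,
}

stops = ["school", "shop"]          # drop the child at school and buy groceries
best = None
for order in permutations(stops):
    route = ["home", *order, "home"]
    total = sum(times[(a, b)] for a, b in zip(route, route[1:]))
    # A fuller model would also check the "arrive at school by ..." and "return home by ..." deadlines.
    if best is None or total < best[0]:
        best = (total, route)

print(best)   # shortest total driving time and the corresponding route
```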
Figure 2.11. Sample adaptive problem solving item 1: Best Route
Sample item 2: Best Route, question 2
In the second item, the situation becomes dynamic as the problem solver has to deal with new circumstances that interfere with the initial problem solution. Impasses must be overcome and additional constraints need to be taken into consideration when adapting the initial problem solution.
Figure 2.12. Sample adaptive problem solving item 2: Best Route
References
[7] Greiff, S. et al. (2021), “PIAAC Cycle 2 assessment framework: Adaptive problem solving”, in The Assessment Frameworks for Cycle 2 of the Programme for the International Assessment of Adult Competencies, OECD Publishing, Paris, https://doi.org/10.1787/3a14db8b-en.
[6] Greiff, S. et al. (2017), “Adaptive problem solving: Moving towards a new assessment domain in the second cycle of PIAAC”, OECD Education Working Papers, No. 156, OECD Publishing, Paris, https://doi.org/10.1787/90fde2f4-en.
[1] OECD (2021), The Assessment Frameworks for Cycle 2 of the Programme for the International Assessment of Adult Competencies, OECD Skills Studies, OECD Publishing, Paris, https://doi.org/10.1787/4bc2342d-en.
[2] OECD (2012), Literacy, Numeracy and Problem Solving in Technology-Rich Environments: Framework for the OECD Survey of Adult Skills, OECD Publishing, Paris, https://doi.org/10.1787/9789264128859-en.
[3] Rouet, J. et al. (2021), “PIAAC Cycle 2 assessment framework: Literacy”, in The Assessment Frameworks for Cycle 2 of the Programme for the International Assessment of Adult Competencies, OECD Publishing, Paris, https://doi.org/10.1787/7b3bf33b-en.
[4] Tout, D. et al. (2017), Review of the PIAAC Numeracy Assessment Framework: Final Report, Australian Council for Education Research (ACER), Camberwell, Australia.
[5] Tout, D. et al. (2021), “PIAAC Cycle 2 assessment framework: Numeracy”, in The Assessment Frameworks for Cycle 2 of the Programme for the International Assessment of Adult Competencies, OECD Publishing, Paris, https://doi.org/10.1787/c4221062-en.
Notes
1. See Chapter 6 for a more in-depth discussion of the relationship between the 2023 Survey of Adult Skills and previous international surveys of adult skills.
2. In the first cycle of the Survey of Adult Skills, the locator test (which was referred to as the “core”) included only eight items. The larger number of items included in the 2023 Survey of Adult Skills locator, as well as the inclusion of a few tasks of medium difficulty, allows the skills of low-performing adults to be more accurately measured without requiring them to take the full assessment. The assessment design is described in more detail in Chapter 5, and the differences between the first and the second cycles of the survey are discussed in Chapter 6.
3. The composition of the different expert groups is reported in Annex B.
4. It should be noted that the expert group specified that a small number of APS units should not include dynamic elements – i.e. that they should be static. These were included as baseline problem-solving tasks and were intended to be the easiest of the APS tasks.