4. Tracking outcomes in adult education

Abstract

As interest in non-formal adult education and training (AET) builds, the collection of data enabling the tracking of educational and labour-market outcomes becomes increasingly important. Unlike in formal education, data collection systems in this area tend to have grown piecemeal, if data are gathered at all. Drawing on state-level experiences with community college data in the United States, as well as national examples from Austria, Denmark, Germany and the United Kingdom, this chapter analyses best practices and the conditions needed to enable consistent data collection. These systems can then be integrated with other data frameworks, facilitating national and international comparisons and the tracking of broader outcomes. The chapter presents a data taxonomy that can be customised to different contexts and concludes with recommendations, based on the analysis of these practices and examples, to enable policy makers to track outcomes more effectively and in turn strengthen their AET systems.
Introduction: Trends in non-formal adult education and training
As interest in non-formal adult education and training (AET) accelerates globally, questions of access and quality are emerging as central issues (European Commission, 2020[1]; Van Noy and Michael, 2022[2]), raising questions about practice that can only be answered with high-quality data. Data are particularly necessary to track the educational and labour-market outcomes of short-term programmes leading to non-degree credentials (NDCs) for adults, as learners move in, through and out of the workforce. Tracking these outcomes would also provide information on how to improve non-formal and adult education programmes, increase or streamline the number and type of providers, and help adult learners make more informed decisions about pathways into the workforce.
In the first year of the COVID-19 global pandemic, policy makers and practitioners alike refocused workforce development efforts onto shorter-term credentials that could help individuals rapidly transition into new careers. The “great resignation”, when individuals left jobs and professions, became a global phenomenon, with many of those leaving considering additional education and alternative employment (Tharoor, 2021[3]). Educational pursuits have not all been toward additional formal degrees: more than 70% of those in the United States who pursued training enrolled in programmes lasting no longer than six months (Cengage Group, 2022[4]). In the United States, 50% of adults considering further education in 2019 preferred non-degree options. By 2020, the share had risen to 68% (Strada, 2020[5]).
AET credentials have also been an area of focus for policy making and grant making in the European Union. In 2022, a team of scholars funded by the European Commission published the MICROBOL framework for the integration of short-term AET credentials into established higher education systems to encourage the development of such credentials (Microbol, 2023[6]). Efforts are underway to define these offerings to advance AET and consider their impacts (Caballero et al., 2022[7]; Zanville, 2021[8]). International survey data have found that these alternative credentials are particularly relevant for adult learners who have already participated in higher education (Kato, Galán-Muros and Weko, 2020[9]).
AET credentials can take many forms, including certificates issued by training providers, employers and others; certifications based on industry standards; and licensure, typically associated with government approval to participate in a profession (Workcred, 2021[10]). Certificates issued by providers are typically based on participation in a structured educational experience, while certifications and licences based on industry standards and/or governmental approval may or may not involve a structured educational experience but typically involve a standard examination.
Ultimately, non-formal AET can be offered by a wide array of providers, including online training outlets, private firms, unions or organised labour, industry associations, governments and, perhaps most notably, higher education institutions (Van Noy and Michael, 2022[2]). Globally, the number and diversity of short-term AET programmes are increasing, providing potential alternative pathways to employment. These include emerging online avenues, bootcamps and industry-based programme offerings (Gaston and Van Noy, 2022[11]). Meanwhile, more long-standing providers at education institutions have been increasingly shifting their focus to occupational offerings in their non-formal programmes (Van Noy and Hughes, 2022[12]).
In the United States, these providers are often public (government-supported) community and technical colleges offering non-formal AET, usually referred to in this context as noncredit community college education (see Box 1.1 in Chapter 1 for a definition of non-formal AET). As established institutions with significant resources to support record-keeping, community colleges are more likely to have existing or emergent data on these activities. In contrast, newer providers typically do not systematically collect data and usually have few incentives to do so. With that in mind, this chapter concentrates on the efforts to build a data infrastructure for these alternative learning pathways among community colleges, as the institutions where these data primarily exist. These efforts offer lessons for data collection with other providers.
This chapter explores the importance of developing a robust data infrastructure that will create a better understanding of educational offerings, enrolment, and outcomes of AET, which are an increasingly important part of the discussion on workforce development and adult education. The sections that follow draw upon research into the community college data infrastructure across multiple US states. The chapter considers the policy context and drivers for AET, and the accompanying data and the conditions needed for consistent data collection. It presents a data taxonomy with the data categories and elements that can be captured uniformly. It also looks at the importance of data partnerships to measure credential-based and labour-market outcomes, and the nature of functional data platforms as the foundation of such efforts. Although the taxonomy has been based on state-level data from US community college systems, the effort and the resulting categorisations offer insights into approaches to creating more systematic methods to track outcomes for AET across providers. To help apply the work to a global setting, the chapter includes case studies from selected OECD countries to provide comparative context.
The need for better data on adult education and training
Adult education leads to many different types of credentials issued by many different types of training providers. A vast array of courses and credentials is offered by traditional educational institutions such as colleges and universities, by private training providers, and through employer-sponsored training. Adult education spans a wide range of topics, from pre-college and language training to courses designed to facilitate career transitions and avocational programmes (i.e. those allowing individuals to pursue personal interests not related to their work). Funding models also vary significantly from programme to programme and from learner to learner; in some cases, a student may pay the entire cost through tuition fees, whereas in other cases a programme may be funded solely by government or philanthropic sources.
In recent years, this landscape of opportunities and providers has expanded and innovated with the development of new forms of technology and platforms for AET. Online learning opportunities are transforming learning, as are new forms of credentials, such as badges and micro-credentials that incorporate a digital component into the documentation of learning. The interest in and need for more AET is being further fuelled by changes in the labour market, leading to widespread recognition of the need for ongoing training to keep up with workplace changes and retraining to adapt to changing labour-market demands. Yet, even as a major restructuring of higher learning, including AET, is underway, and as policy makers and other leaders seek to make sense of these trends and understand their outcomes, the systems and initiatives to track and measure these disparate activities are scattered at best.
In contrast to initial education, which focuses on students progressing through the “traditional” educational pathways of primary and secondary education, immediately followed by postsecondary education and/or training for career entry, education targeted at adults is generally spread across disparate and disconnected systems. By and large, data are not consistently tracked within each of these systems, and even less frequently do systems share data to enable learners to be tracked as they move across different educational and career pathways. To understand the data available on AET, it is first important to map the general landscape of these educational opportunities and the extent to which data are available for each element. Even within countries, the educational systems and resources from governments to support AET are quite diverse, and data are limited. Emerging AET providers often exist outside the traditional governance structures that would require data reporting and the longitudinal tracking of learners. Employers who provide AET for their workers may track outcomes internally, but would not routinely share that information with other providers or with governments. Any examination of data sources for AET must recognise this complexity of providers and patchwork data availability.
In the United States, as in many other OECD countries, government-supported institutions like community colleges are some of the main providers of education leading to NDCs. Non-formal AET typically comprises short-term training that does not lead to a credit-based certificate or degree and fits into one of the following four categories: i) occupational education in which individuals enrol; ii) sponsored occupational education, often through employer contracts; iii) pre-college courses that include various forms of secondary-equivalent adult education; and iv) avocational courses that meet the personal interests and needs of members of the community (D’Amico et al., 2014[13]). Historically in the United States, data on learners in these programmes have not been systematically collected in the same way as data on degree-focused students, since there is no national/federal data collection (Erwin, 2019[14]), and data are collected differently across states due to inconsistent state funding (D’Amico et al., 2017[15]; Van Noy et al., 2008[16]). As these credentials become more of a priority, it is critically important to gain a better understanding of non-formal AET and provide recommendations on standardising data collection.
Approaches to tracking outcomes of non-formal adult education and training
In many OECD countries, there is a great deal of variation across government entities and educational sectors in terms of who collects AET data, along with how and why they collect them. Ultimately, the governance of education systems goes hand in hand with the governance of their data. Understanding the interplay between governance and data collection is an important step towards identifying how to promote the development of the data. Among US states, the two main policy drivers behind the development of data collection on non-formal education are governance and funding. Examples of efforts in other OECD countries, examined in the four case studies included in this chapter, illustrate how many of these same issues apply. These cases serve to illustrate the broader foundational issues to be tackled in building a data infrastructure.
The governance of adult education data is as varied as policy development itself. For instance, data collection may depend on whether government systems require data reporting for funding, or whether individual institutions are more self-governing. Centralisation is clearly an important factor facilitating the compilation of adult education data. For example, the state department of education in Iowa and the state systems for community colleges in Virginia and Louisiana facilitate the collection of data on community college students attending non-formal programmes. There are similar advantages to centralisation in other OECD countries such as Estonia, where a largely unified policy framework for higher education facilitates the collection of comprehensive national data. Austria (Box 4.1) offers another example of a highly centralised data system.
Box 4.1. Austria’s approach: A highly centralised data system
The majority of Austria’s federal data are developed by one federal agency, Statistics Austria, which by federal law serves as the main public information centre for a broad array of data on topics including demographics, social issues, economics and labour markets, and vocational education and training (VET) and continuing education. This highly centralised system incorporates common methods for the collection, production and dissemination of the large majority of federal data, which is further reflected by there being no subordination of regional agencies to Statistics Austria. For example, most data are collected through centralised surveys. However, in order to develop the alignment of systems between Statistics Austria and the nine provinces, provincial representatives serve as members of technical subcommittees and the statistics council, which meet regularly with federal staff. This central-regional co-ordination is based on language in the Federal Constitution to address technical, financial, and organisational matters (Statistics Austria, 2022[17]).
The Initiative for Adult Education (IEB) was launched in 2012 in response to the results of the Programme for the International Assessment of Adult Competencies (PIAAC) Survey of Adult Skills, which showed that 17% of adults in Austria had poor literacy skills. It was set up by the Austrian Federal Ministry of Education in co-operation with the nine Austrian provinces. Funded nationally and through the European Social Fund, the IEB is intended to provide free basic skills and lower secondary education to adults in all provinces who lack formal education, irrespective of language, birthplace or social status (UNESCO, 2016[18]).
Statistics Austria implements the Adult Education Survey (AES) in co-ordination with the statistical office of the European Union, Eurostat (Statistics Austria, 2023[19]). However, Austria also has its own national focus on non-formal education. For example, federal legislation has paid increased attention to improving data collection on VET programmes, particularly around salaries and qualifications of apprenticeships (Hoeckel, 2010[20]).
IEB-funded programmes are also subject to regular data monitoring and oversight to ensure sustainability (UNESCO, 2016[18]). For example, providers must supply information on courses and participants in IEB-funded educational programmes to an IEB database in order to collect data on demographics, educational progress, qualifications, and dropouts (IEB, 2022[21]). Programmes that receive this funding are also anonymously reviewed using these data in reports by the IEB central office twice a year and at their conclusion (IEB, 2022[21]). Although the data and reports lack specific information about employment outcomes, they are nonetheless able to indicate the quality of the inputs of IEB-funded programmes, as well as the educational progress of the adults taking part.
Through the centralised approach of Statistics Austria and the IEB, Austria seems to have been improving its federal data infrastructure for more than a decade, particularly in the area of adult and non-formal education. Although there have been significant data gaps involving the long-term tracking of those completing VET programmes, and linking them to employment and other labour-market outcomes (Hoeckel, 2010[20]), the strong centralisation of its data infrastructure and methodology, along with the co-ordination among provinces, may place Austria in an advantageous position in developing its non-formal and adult education data related to outcomes more broadly.
The collection of non-formal AET data is often motivated by funding and associated accountability efforts. Earlier literature found such data collection was often tied to state-level funding initiatives. The same trend is illustrated by current efforts around AET data, where funding clearly drives data collection. Where funding existed, data were much more likely to exist, and tended to be of higher quality. Even within the same state, data on non-formal offerings without financial support were typically unavailable. This variation within states illustrates the importance of funding as a driver for data collection. In Iowa, the vast majority (99%) of occupational non-formal AET was funded, as were most pre-college offerings (87%), and this drove much of the data collection. Virginia’s FastForward programme provided significant funding through the Workforce Credential Grant for occupational AET that leads to industry-recognised credentials in a high-demand field. This funding is designed as a pay-for-performance system: students and institutions receive funding based on students’ completion of programmes and attainment of credentials. Data collection is a requirement of funding and therefore occurs at scale with a degree of completeness.
Creating the conditions for consistency
There are three fundamental principles that underpin consistent data collection on adult education and training: i) the alignment of policy, non-formal AET functions and data priorities; ii) a clear conceptual structure for the data landscape; and iii) the development of consistent operational definitions. Each of these principles is applicable to any setting that values short-term training leading to the earning of non-degree credentials.
However, any education or training that falls outside the traditional credit and degree structures may lack uniform programme standards and reporting mechanisms. Globally, this is the case for much of the emerging area of alternative credentials. Even among the longest-standing providers of these credentials, traditional educational institutions, the data collected tend to lack structure and consistency.
As described in the previous section, non-formal AET data in the United States exist in an inconsistent environment due to the lack of any national infrastructure (Erwin, 2019[14]) and inconsistencies in data collection across states (D’Amico et al., 2017[15]; Van Noy et al., 2008[16]). In the absence of co-ordination, states collect the information that meets their own individual needs, often from colleges that in turn collect data based on reporting mandates and the minimum data required to run courses and programmes, using a variety of collection mechanisms. Such conditions often lead to inconsistent definitions, missing data and data that do not translate between states or even between institutions. The effort to develop a data taxonomy therefore aims to provide a framework for consistency in thinking about the data infrastructure at the state level that can: i) help enable cross-state comparisons; ii) help states collect more uniform data; and iii) eventually guide efforts toward national data collection. All three are benchmarks for measuring educational activity in workforce-relevant AET. These themes are illustrative of dynamics across OECD countries as global efforts seek to make sense of this complex and emerging ecosystem of providers and alternative credentials. Based on experience in the context of US community colleges, the key considerations that follow can be adapted for use in wider contexts.
Aligning data priorities, educational purposes and funding policy
Central to measuring the outcomes of non-formal AET is ensuring that the data captured are aligned with the educational purposes of the programme, as well as with the policy drivers, which are most often associated with funding. However, OECD countries vary in the purpose, policy development and funding mechanisms used for non-formal education. In the US context, data priorities are primarily driven by the community colleges’ missions. With a long-established comprehensive direction to offer workforce preparation (O’Banion, 2018[22]), community colleges in the United States address the employment training needs of individuals and the industries and communities they serve. They fulfil this part of their mission through close collaboration with industries to meet specific skill-based needs, but also need to have a keen awareness of the needs of individuals, usually in a clearly defined local service area. They provide training that leads to credit-based degrees and certificates, as well as short-term training that leads to non-formal credentials. These institutions and the wider community college systems use clearly articulated data to gain a greater understanding of how they fulfil their missions, reflecting the four different categories of non-formal education they offer. Data collected must align with the outcomes relevant to each training function and purpose.
An additional contextual factor is the policy environment. Most often, the driving policy context is related to available funding. Where funding is available for non-formal AET, each function may have a different funding mechanism. Whether governments are collecting data to provide accountability associated with funding, or because they are trying to make the case that they are preparing the workforce, robust data collection is critical (D’Amico et al., 2017[15]; Van Noy and Jacobs, 2009[23]; Voorhees and Milam, 2005[24]) and will be needed to measure progress towards educational attainment goals that include AET (Lumina Foundation, 2023[25]).
Although the United States has community college data from both survey and administrative sources, its data infrastructure is less comprehensive and centralised than that of other OECD countries. For example, while Austria (Box 4.1), Denmark (Box 4.4), Germany (Box 4.3) and the United Kingdom (Box 4.2) may have mixed approaches to collecting non-formal AET data, they all have central or federal agencies that collect survey and/or administrative educational and vocational training data, and may be better able to expand the ongoing development of data in this area. In the United States, community colleges, as well as the states they reside in, may use different methods and sources to collect data, which may not be uniform from one college, other educational provider or state to another.
Creating a clear conceptual structure for data collection
Once priorities have been established for AET, an appropriate conceptual structure for data collection must be assembled to ensure that the data needed to measure outcomes will be available. Enrolment data may often be available, but historically it has been more challenging to obtain the data needed to measure labour-market outcomes. During recovery from the Great Recession, the US government funded the Trade Adjustment Assistance Community College and Career Training grant programme through the Department of Labor. These grants enabled many states to build the capacity to merge their education and labour-market datasets, allowing better measurement of outcomes (Mikelson et al., 2017[26]). Additional priority areas for enabling the success of data collection efforts include enhancing the descriptions of the non-formal offerings (programmes and courses), credentials earned, student identifiers and associated financial data.
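As a hedged illustration of what such a merge involves in practice, the sketch below joins a small, invented table of non-formal enrolment records to quarterly wage records on a shared person-level identifier. The column names (person_id, offering_id, completion_date, quarter_start, wages) are assumptions made for this example and do not correspond to any particular state system.

```python
import pandas as pd

# Hypothetical enrolment records held by an education agency
enrolments = pd.DataFrame({
    "person_id": ["A1", "A2", "A3"],
    "offering_id": ["WELD-101", "CNA-200", "CDL-110"],
    "completion_date": pd.to_datetime(["2022-03-15", "2022-06-30", "2022-09-01"]),
})

# Hypothetical quarterly wage records held by a labour-market agency
wages = pd.DataFrame({
    "person_id": ["A1", "A1", "A2", "A4"],
    "quarter_start": pd.to_datetime(["2022-01-01", "2022-07-01", "2022-10-01", "2022-04-01"]),
    "wages": [6200, 8100, 9500, 4300],
})

# Link the two sources on the shared identifier; a left join keeps
# completers without a wage match visible so coverage gaps can be assessed.
linked = enrolments.merge(wages, on="person_id", how="left")

# Flag wage quarters observed after programme completion
linked["post_completion"] = linked["quarter_start"] > linked["completion_date"]
print(linked)
```

A merge of this kind is only a first step: in real systems the identifiers are governed by data sharing agreements and the wage records span many quarters before and after completion.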
Although other OECD countries have similar data gaps and priorities, they are already collecting or planning to collect these kinds of data under a systemic structure. For example, the United Kingdom already has an administrative dataset known as the Outcomes-Based Success Measures (OBSM) (Box 4.2). This matches education and employment records, and captures information on the technical and abstract knowledge covered by those credentials that are part of the National Qualifications Framework (NQF). These administrative data are also supplemented by employer surveys developed by the UK Department for Education to collect information on employer experiences with training providers and their skills development.
Box 4.2. The UK approach: Using existing administrative data as the basis for broad data collection
In the United Kingdom, the term “further education” is used to refer to a wide range of non-formal courses and credentials, which themselves are often described as “learning aims” in official documents. Further education also covers apprenticeships and occupational licensing. To learn more about which of these programmes were associated with the greatest labour-market returns, the United Kingdom has made use of existing administrative data to develop a dataset known as Outcomes-Based Success Measures (OBSM).
Consistent with the broad use of the term “further education” in the United Kingdom, the OBSM covers all credentials that fall within the UK National Qualifications Framework (NQF). The NQF divides credentials into different levels depending on the level of technical and abstract knowledge and skill required, and the OBSM tracks the individual learning aims represented by credentials pursued at different levels of the NQF, with a focus on levels 1-4 (out of 8). The OBSM is also notable for the quality of the administrative data pulled into the system, using income data directly from national tax records and including self-employment income as well as wages. The OBSM data are very complete, with a match rate between education and employment records of between 98% and 100% depending on the credential.
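A match rate of this kind is simple to audit once the linkage exists. The sketch below shows one way such a coverage figure could be computed; it uses invented identifiers rather than OBSM data, and the field names are assumptions for illustration only.

```python
import pandas as pd

# Hypothetical learner records (education side) and earnings records (tax side)
learners = pd.DataFrame({"learner_id": ["L1", "L2", "L3", "L4", "L5"]})
earnings = pd.DataFrame({
    "learner_id": ["L1", "L2", "L3", "L5"],
    "income": [21000, 18500, 30250, 26000],
})

# Share of learners with at least one matched earnings record
matched = learners["learner_id"].isin(earnings["learner_id"])
print(f"Match rate: {matched.mean():.0%}")  # 80% in this toy example
```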
At first glance, the OBSM has many similarities to efforts in the United States to use administrative records to track longitudinal outcomes, such as Post-Secondary Education Outcomes (PSEO) and state longitudinal data systems. However, the UK government has not prioritised efforts to disseminate the data to the public in the same way as the United States has done through resources such as TrainingProviderResults.gov, or data dashboards maintained by individual states. The UK Department for Education’s Unit for Future Skills has developed an interactive dashboard using data from the OBSM dataset, but the dashboard only reports on aggregated categories of credentials, not individual credentials, nor is there evidence of a substantial effort to promote it for use in career advice or decision making. There are additional plans to create a dashboard that could facilitate public comparisons of individual providers, with colour-coded ratings (red-amber-green) being applied to training providers based on their standing on key performance indicators. These indicators include:
A newly created “skills measure” which adjusts data on the percentage of learners moving into sustained employment or higher-level learning to allow for contextual factors related to the occupation/industry associated with the credential, local labour-market conditions and the population served by the training provider.
Learner employability.
Employer experience – a measure to be derived from employer surveys to identify the extent to which a training provider engages with local employers, and employers’ perceptions of the training provider’s responsiveness to their skill needs.
In addition to the Department for Education’s efforts to supplement data on training providers with surveys of employers, some regional bodies in the United Kingdom conduct their own surveys. The Greater London Authority, in particular, has launched a recurring London Learner Survey (https://londonlearnersurvey.co.uk/learner/) that collects data on self-reported well-being and civic engagement outcomes in addition to earnings, with the goal of measuring the full range of social outcomes associated with further education.
The key strengths of the United Kingdom’s approach to non-formal AET data collection and analysis are the quality of the administrative records and the direct applicability of the data collected to policy making. However, the United Kingdom has not yet incorporated its data into user-friendly products or services that directly help learners to choose between training providers or inform the work of professional career advisors or counsellors.
Developing consistent operational definitions
In order for impacts and outcomes to be measured, priorities must be set, and data elements must be identified in a clear conceptual data structure. However, the establishment of consistent operational definitions is perhaps the most critical condition for ensuring that the data collected by institutions are accurately reported over time. In the absence of national data collection or definitions that cross state borders, the identification of key data elements with consistent definitions is fundamental to accurately measuring outcomes (D’Amico and Van Noy, 2022[27]).
Outside the United States, other OECD countries also use different definitions for similar concepts surrounding non-formal education. For example, in the United Kingdom non-formal education falls under the term “further education” (Box 4.2), while in Denmark it is incorporated into Arbejdsmarkedsuddannelser (AMU; literally translated as labour-market training) (Box 4.4), and in Germany it is considered part of vocational education and training (VET) (Box 4.3). The European Commission has made significant progress towards establishing common definitions within the EU through its Skills Agenda and Council Resolution on a New European Agenda for Adult Learning, but there is little evidence that European data definitions are being adopted outside Europe. To develop a better understanding of the AET data infrastructure across OECD countries, one solution could be the intentional sharing of data, information (including definitions and conceptualisations) and resources that may already exist within and across countries in order to identify and fill any gaps.
A taxonomy for data collection
Once the conditions for consistency have been established, more specificity is needed in terms of data elements. While these will inevitably vary by context and goal, the non-formal AET data taxonomy (D’Amico et al., 2023[28]) that follows is designed to embrace the educational functions of AET as well as the related policy priorities; offer a conceptual structure; and provide an inventory of relevant data elements that can be applied in international, national or regional settings. The specific operational definitions outlined here may vary across providers and countries, but the intention is for the taxonomy to serve as a launching point for educational systems and governing structures to identify their own data elements within each of the categories. The taxonomy is divided into four main sections: i) purpose and design; ii) outcomes; iii) demographics and enrolment; and iv) finance (Figure 4.1). The subsections below discuss the opportunities and challenges arising in each category.
Figure 4.1. Non-formal education data taxonomy
Source: Adapted from D’Amico et al. (2023[28]), Collecting and Understanding Noncredit Community College Data: A Taxonomy and How-to Guide for States, https://sites.rutgers.edu/state-noncredit-data/wp-content/uploads/sites/794/2023/11/State-Noncredit-Taxonomy_EERC_11.17.23.pdf.
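One hedged way to operationalise the taxonomy in Figure 4.1 is as a simple machine-readable structure that a state or provider could extend with its own data elements. The category and element names below merely echo the chapter's discussion; they are illustrative, not a prescribed standard.

```python
# A minimal sketch of the non-formal AET data taxonomy as a nested structure.
# Any system adopting it would substitute its own operational definitions.
TAXONOMY = {
    "purpose_and_design": [
        "field_of_study", "offering_type", "contact_hours",
        "delivery_mode", "work_based_learning", "associated_credential",
    ],
    "outcomes": [
        "course_completion", "transition_to_credit",
        "employment", "earnings", "non_degree_credential_earned",
    ],
    "demographics_and_enrolment": [
        "person_identifier", "age", "sex_gender",
        "race_ethnicity", "enrolment_count",
    ],
    "finance": [
        "tuition_fees", "state_funding", "employer_funding",
        "public_subsidy_eligibility",
    ],
}

for category, elements in TAXONOMY.items():
    print(f"{category}: {', '.join(elements)}")
```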
Purpose and design
The purpose and design category (Figure 4.2) covers the intent of specific programme offerings and the ways in which they are delivered to students. When collecting data on non-formal courses and programmes (i.e. offerings), it is critical to capture the relevant topic area (field of study) and the type of offering. Data collection systems can make use of established categorisations for consistency. In the United States, the most common mapping for field of study is the US government-established Classification of Instructional Programs (CIP) codes (NCES, 2020[29]). Offerings can be categorised into the four non-formal types of offerings already described, as shown in Figure 4.2. However, this typology does not address specific career fields, which is why field of study and programme type are complementary data elements.
Purpose and design should also record the form in which programmes are delivered. This includes the programme characteristics, with data elements such as the number of contact hours for each course; delivery, including whether face-to-face, online or blended, and any work-based learning requirements; and the credentials associated with each offering.
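To make these purpose-and-design elements concrete, the sketch below defines a hypothetical offering record carrying the fields described above. The field names, the example CIP code and the credential label are assumptions for illustration, not an agreed schema.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class OfferingType(Enum):
    OCCUPATIONAL = "occupational"
    SPONSORED_OCCUPATIONAL = "sponsored_occupational"
    PRE_COLLEGE = "pre_college"
    AVOCATIONAL = "avocational"

class DeliveryMode(Enum):
    FACE_TO_FACE = "face_to_face"
    ONLINE = "online"
    BLENDED = "blended"

@dataclass
class NonFormalOffering:
    """Illustrative purpose-and-design record for one non-formal offering."""
    offering_id: str
    field_of_study_code: str          # e.g. a CIP code in the US context
    offering_type: OfferingType
    contact_hours: int
    delivery_mode: DeliveryMode
    work_based_learning: bool
    associated_credential: Optional[str] = None

# Example record with invented values
course = NonFormalOffering(
    offering_id="WELD-101",
    field_of_study_code="48.0508",    # illustrative welding-related CIP code
    offering_type=OfferingType.OCCUPATIONAL,
    contact_hours=160,
    delivery_mode=DeliveryMode.BLENDED,
    work_based_learning=True,
    associated_credential="industry welding certification (example)",
)
print(course)
```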
These different categories within purpose and design are, in many cases, relevant to funding in those states where resources are available. The US federal government is also considering whether to include short-term training of at least 150 hours in the national student need-based grant programme, Pell Grants (Dembicki, 2023[30]). At present, federal student aid policy is much more focused on programmes leading to traditional credit-based higher education. As other types of credentials become increasingly important, a framework developed by Workcred (2021[10]) to differentiate between college-issued certificates, industry-based certifications and occupational licensure is particularly relevant. Recent policy attention on micro-credentials (Educause, 2023[31]), as well as on government-approved national standards for apprenticeship programmes (US Department of Labor, n.d.[32]), further underlines the need to delineate the different types of AET, as the desired credentials are often associated with particular types of programme (e.g. occupational training) or specific in-demand career fields such as science, technology, engineering and mathematics (STEM). One persistent challenge is the lack of incentives for tracking outcome data in environments that do not support non-formal education or outcomes. Researchers also struggle to match data from government and non-government sources.
Internationally, the purpose and design data elements described here most closely resemble the data elements in the United Kingdom’s OBSM, described in Box 4.2. Unsurprisingly, the US approach places more emphasis than many other countries on metadata to describe the characteristics of programmes, due to the lack of agreed definitions and standards to define quality within credential categories. Certificates are particularly problematic in the United States, as rigour and quality can vary greatly across different institutions in the absence of quality standards imposed by government (Gaston and Van Noy, 2022[11]). In other OECD countries, this issue may be less significant, due to the greater centralisation in the design and purpose of data collection on training and education. For example, Austria’s federal constitution includes references to the co-ordination of educational data collection between federal and provincial technical subcommittees, as well as an explicit goal of improving adults’ basic skills, such as literacy, and providing lower secondary education. Similarly, Germany has multiple federal bodies dedicated to improving the coherence of non-formal and adult education data collection and reporting, including its many subareas dealing with inputs, processes and outcomes such as certification. Nonetheless, there are still areas for improvement in terms of non-formal and adult education, particularly around connecting inputs to outcomes.
Figure 4.2. Purpose and design categories and sample data elements
Source: Adapted from D’Amico et al. (2023[28]), Collecting and Understanding Noncredit Community College Data: A Taxonomy and How-to Guide for States, https://sites.rutgers.edu/state-noncredit-data/wp-content/uploads/sites/794/2023/11/State-Noncredit-Taxonomy_EERC_11.17.23.pdf.
Outcomes
The outcomes category addresses outcomes for learners such as credentials earned, labour-market benefits or transition to higher education. While the measurement of outcomes might appear to be standardised, the varied nature of non-formal AET across programmes, countries and sub-national governments results in a wide range of relevant outcomes. In a credit-based higher education environment, a typical outcome would be the earning of a degree or other post-secondary qualification, but for adult education it might involve the attainment of industry-based credentials, licensure or institution-based certificates. An additional challenge is that those collecting the information do not all have the same access to data that would help them measure outcomes; they need to forge a web of agreements with other government and industry groups, such as state licensure agencies for government-issued credentials and industry organisations issuing credentials, to access the relevant information.
Outcomes for non-formal adult education and training are therefore multifaceted. They can be divided into three main categories: academic, labour-market and non-degree credential outcomes (Figure 4.3). Academic outcomes include the completion of courses and programmes, and also whether students transition to credit-based programmes relevant to the earning of higher education degrees. Such progressions of educational opportunities are often referred to as stackable credentials, although the transition from a non-formal to a credit-based programme is not the norm (Bahr et al., 2022[33]; Buckwalter and Maag, 2019[34]). High-quality data that are integrated across different missions of the college (degree and non-degree seeking) are critical to measuring academic outcomes. States like Iowa and Louisiana have worked to integrate all student records into a single state data system so that academic outcomes can be measured accurately.
The second category, labour-market outcomes, is primarily measured through employment and earnings before and after the pursuit and completion of AET. These data can be challenging to obtain, since employment and earnings data are maintained by government agencies outside of education; however, some efforts are underway to improve the ability to cross-match records, and states like Virginia and Iowa have formed data partnerships that allow them to measure labour-market impacts. As the case study in Box 4.2 shows, the United Kingdom has the capacity to track longitudinal outcomes with robust connections between education and employment records. Denmark (Box 4.4) is also able to track the earnings and employment of individuals. Although Denmark’s system shows great capacity, its data are used primarily for targeted, ad hoc policy research rather than as part of a commitment to public transparency about these outcomes. Germany (Box 4.3) has also been actively developing plans to collect more comprehensive and uniform data about informal, formal and non-formal education outcomes related to educational returns and placements.
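Where education and wage records can be matched, a common summary is average earnings before and after completion. The hedged sketch below illustrates the calculation with invented person-quarter records; the field names and the pre/post windows are assumptions, not those of any state or national system.

```python
import pandas as pd

# Hypothetical matched earnings by quarter relative to programme completion
records = pd.DataFrame({
    "person_id": ["A1", "A1", "A1", "A2", "A2", "A2"],
    "quarters_from_completion": [-2, -1, 2, -1, 1, 2],
    "earnings": [5200, 5400, 7100, 4800, 6000, 6300],
})

# Label each observation as pre- or post-completion
records["period"] = records["quarters_from_completion"].apply(
    lambda q: "post" if q > 0 else "pre"
)

# Average quarterly earnings before and after completion, per person
summary = (
    records.groupby(["person_id", "period"])["earnings"]
    .mean()
    .unstack()
)
summary["change"] = summary["post"] - summary["pre"]
print(summary)
```

A descriptive pre/post comparison of this kind does not establish causality, which is why analyses often also adjust for contextual factors such as local labour-market conditions, as in the UK skills measure described in Box 4.2.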
Finally, the third category, non-degree credential outcomes, includes qualifications such as industry-based certifications, occupational licences, college-issued certificates, micro-credentials and certificates of apprenticeship. Although these are important outcomes for AET, collecting the data presents similar challenges to data on labour-market outcomes. Licences to work in fields such as health care are often maintained by agencies outside of education, and data partnerships are required to obtain the records. Information on industry-based credentials can be even more difficult to obtain, since the issuing organisations are typically private and may not be compelled to share data. The section below on data partnerships provides more information on data sharing agreements.
Figure 4.3. Outcome categories and sample data elements
Source: Adapted from D’Amico et al. (2023[28]), Collecting and Understanding Noncredit Community College Data: A Taxonomy and How-to Guide for States, https://sites.rutgers.edu/state-noncredit-data/wp-content/uploads/sites/794/2023/11/State-Noncredit-Taxonomy_EERC_11.17.23.pdf.
Demographics and enrolment
The demographics and enrolment category is intended to capture the numbers and characteristics of learners. Analyses of equitable access to and outcomes of non-formal offerings require the ability to differentiate under-represented or marginalised groups, and demographic data are critical to such work. Unique person-level identifiers, such as social security numbers in the United States, are also a key part of this category as they are central to any outcomes analysis that requires educational data to be matched with employment and earnings records. Figure 4.4 provides a list of sample data elements to be included in robust AET datasets.
In formal programmes that lead to degrees, capturing enrolment figures, student identifiers and demographic data is the norm. In the United States, this is partly driven by national data collection through the federal Integrated Postsecondary Education Data System, requirements for federal financial aid, and the verified enrolments required for state-level funding. In the realm of AET, with greater variation in state funding streams and no federal reporting, demographic and enrolment data are collected by most states, but not all (D’Amico et al., 2017[15]; Van Noy et al., 2008[16]).
Demographic data on race/ethnicity (which can include immigration status) and sex/gender are often missing in state-level datasets for a number of reasons. The first is that these data can be requested, but students are not required to supply them. The second is that any student-level details may be difficult to obtain because of the way some non-formal programmes are provided. For example, when a community college offers training for a specific employer, the number of students enrolled may be captured, but the employer might not provide the names, identifiers and demographic details of the employees completing the training. Indeed, in a recent study of three states, race/ethnicity data were unknown or missing for more than half of the participants enrolled in occupational non-formal education. In this environment, colleges may only collect what they must and report what is required.
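Auditing how complete the demographic fields actually are is a simple first step towards the equity analyses described above. The sketch below computes missingness rates over a small, invented enrolment extract; the field names and the shares of missing values are illustrative only.

```python
import pandas as pd

# Hypothetical non-formal enrolment extract with incomplete demographics
enrolments = pd.DataFrame({
    "person_id":      ["A1", "A2", "A3", "A4"],
    "race_ethnicity": ["Hispanic", None, None, "White"],
    "sex_gender":     ["F", "M", None, None],
    "age":            [34, 41, None, 29],
})

# Share of records missing each demographic element
missing_rates = enrolments[["race_ethnicity", "sex_gender", "age"]].isna().mean()
print(missing_rates.map("{:.0%}".format))
```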
Although there are challenges to obtaining enrolment and demographic data in a policy environment where incomplete data are the norm, it is very possible from a technical standpoint to enhance existing datasets. For example, combining data on enrolments and contact hours will be critical to any national-level proposals to provide student need-based funding. In Germany, there is a strong focus on improving data collection around demographic, economic and social indicators for students, including immigrants (Box 4.3). For example, a recent federal report has argued for expanded efforts to develop a system of integrated further education reporting (iWBBe in German), with the goal of creating a rigorous data infrastructure framework to improve data collection on continuing education across a broader range of indicator groups and subareas, including non-formal and adult education (Münchhausen et al., 2023[35]).
Figure 4.4. Demographics and enrolment categories and sample data elements
Source: Adapted from D’Amico et al. (2023[28]), Collecting and Understanding Noncredit Community College Data: A Taxonomy and How-to Guide for States, https://sites.rutgers.edu/state-noncredit-data/wp-content/uploads/sites/794/2023/11/State-Noncredit-Taxonomy_EERC_11.17.23.pdf.
Finance
The finance data elements address relevant policy by identifying tuition costs and subsidised funding arrangements. In general, the policy environment around non-formal AET is quite varied, and for noncredit community college education the data collected can depend on what types of offerings are funded, how they are funded and how these data fit within the purpose of the relevant training. The current situation supports the findings of previous studies regarding the relationship between funding and the availability of high-quality data (D’Amico et al., 2017[15]; Van Noy and Jacobs, 2009[23]; Voorhees and Milam, 2005[24]). Iowa is one example of a state in which data collection is supported because nearly all occupational training receives support from the state. Similarly, Virginia’s FastForward programme features an outcome-based student tuition reimbursement model, leading the state to prioritise data collection (D’Amico et al., 2023[36]). The more comprehensive data infrastructure and central financing in these US states are closer to the situation in other OECD countries that have sources of central funding. For example, in Austria there is central funding and federal legislation to not only facilitate but also monitor and oversee the Initiative for Adult Education (IEB), which includes data reporting requirements for providers, anonymous reviews of providers and their adult programmes, and a central IEB database that issues regular reports on programmes (Box 4.1).
The data elements in the finance category might include tuition fees, state funding, whether courses grant students eligibility to use federal training resources, whether employers fund training or students enrol on their own initiative, and whether states provide economic development subsidies (Figure 4.5). These finance data do not operate in a vacuum, however. Combining them with data elements from the other three categories enables meaningful analyses (see the sketch following Figure 4.5), showing the critical and supplementary role of each category and data element in the taxonomy.
Figure 4.5. Finance category and sample data elements
Source: Adapted from D’Amico et al. (2023[28]), Collecting and Understanding Noncredit Community College Data: A Taxonomy and How-to Guide for States, https://sites.rutgers.edu/state-noncredit-data/wp-content/uploads/sites/794/2023/11/State-Noncredit-Taxonomy_EERC_11.17.23.pdf.
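As a hedged illustration of combining finance data with the other taxonomy categories, the sketch below joins invented per-offering funding figures to completion counts to estimate public cost per completion. All figures, field names and offerings are assumptions made for the example.

```python
import pandas as pd

# Hypothetical per-offering funding (finance category) and outcomes
funding = pd.DataFrame({
    "offering_id": ["WELD-101", "CNA-200"],
    "state_funding": [120000, 80000],
    "tuition_revenue": [30000, 45000],
})
outcomes = pd.DataFrame({
    "offering_id": ["WELD-101", "CNA-200"],
    "completions": [60, 50],
    "credentials_earned": [48, 42],
})

# Join the categories and derive a simple cost-effectiveness indicator
combined = funding.merge(outcomes, on="offering_id")
combined["public_cost_per_completion"] = (
    combined["state_funding"] / combined["completions"]
)
print(combined[["offering_id", "public_cost_per_completion"]])
```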
Partnerships for data sharing
Given the wide range of AET providers, as well as the many entities involved in tracking outcomes for any given provider type, partnerships are essential. Compiling data on AET outcomes requires education providers, industry groups and government agencies to collaborate. For example, individual institutions will need to be co-ordinated, often by government agencies, so they use standardised data elements to define non-formal offerings, outcomes, student identifiers and funding. Decisions about which data elements to capture and how the operational definitions are to be used to guide data collection will usually be made at the state level. In states where reporting of non-formal AET data is neither mandatory nor strongly encouraged, these data are typically not collected at all, or only partially.
External data partnerships are also key to compiling a complete set of the data elements needed to measure outcomes. Partnerships with workforce or commerce departments may be needed to match enrolment or completion data with labour-market outcomes. For example, the Virginia Employment Commission shared unemployment insurance records to provide labour-market (wage/employment) outcomes, and Iowa maintained partnerships with other state agencies that allowed it to obtain data on labour-market outcomes. AET data may also be linked to information on occupational licensure. Virginia’s Department of Health and its Department of Labor and Industry worked together to connect FastForward completions with licensures. Likewise, Iowa’s Department of Education shared information to identify state licensures for all healthcare fields and commercial truck driving. The development of these linkages took a number of years due to the need to negotiate access and integrate disparate systems.
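A hedged sketch of the kind of cross-agency linkage described here: completions held by an education system joined to a licensure extract shared by a licensing agency. The identifiers, programme codes and licence fields are invented for the example and do not reflect the Virginia or Iowa systems.

```python
import pandas as pd

# Hypothetical completions held by the education agency
completions = pd.DataFrame({
    "person_id": ["A1", "A2", "A3"],
    "programme": ["CNA-200", "CNA-200", "CDL-110"],
})

# Hypothetical licensure extract shared by a licensing agency
licences = pd.DataFrame({
    "person_id": ["A1", "A3"],
    "licence_type": ["Nursing assistant licence", "Commercial driving licence"],
    "issue_date": pd.to_datetime(["2022-08-01", "2022-11-15"]),
})

# A left join keeps completers without a licence match visible
linked = completions.merge(licences, on="person_id", how="left")
linked["licensed"] = linked["licence_type"].notna()
print(linked[["person_id", "programme", "licensed"]])
```

In practice, a join of this kind sits behind a negotiated data sharing agreement, which is why the chapter notes that these linkages can take years to establish.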
One of the primary findings from work with US states is that each is on what is referred to as a “noncredit data journey” (D’Amico et al., 2023, p. 23[36]). As this journey progresses, the expectation is that data elements will continue to be added to meet new and developing priorities or as additional data sources become available. This contrasts with the experiences of other countries, such as Estonia, where data systems were rapidly scaled by government mandates accompanied by technical support. In Germany, the iWBBe report in 2023 outlined efforts to create more comprehensive and systematic reporting on continuing education (Box 4.3). At present, many of the data elements discussed above are not being collected, but the report proposes a new system incorporating three main components: inputs, processes and outputs, which mirror some of the overall categories in the data taxonomy. Although much remains to be done, in many respects US state-level data systems are further along than other national systems. For example, although Germany has invested in the creation of national data systems, individual US states tend to capture a greater depth of data than the national German system.
Box 4.3. Germany’s approach: Actively developing more coherent reporting
In Germany, policy, research and practice on vocational education and training (VET), which includes continuing education programmes that do not lead to credits (i.e. non-formal AET), are co-ordinated by the Federal Institute for Vocational Education and Training (BIBB). Although BIBB does not directly administer VET or non-formal education, it is the main government body that provides research-based regulatory and developmental support to training providers (e.g. employers and schools) and advises the German Federal Parliament and regional parliaments on VET administration and policy. Moreover, it is the central institution that develops, collects and analyses data on VET and employment outcomes. Another important government agency in the area of AET is the German Institute for Adult Education (DIE), which focuses on improving the quality of adult and continuing education through research and policy advice.
Both BIBB and DIE have recognised the importance of both formal and non-formal types of continuing VET. Funded by the Federal Ministry of Education and Research (BMBF), the two agencies released a report in 2023 on new and expansive efforts to develop integrated further education reporting (iWBBe) (Münchhausen et al., 2023[35]). The report argues that Germany currently lacks comprehensive data collection on this type of training and education, including data on important indicators (e.g. demographic, economic and social) and outcomes (e.g. certifications, competencies and quality, advising and personnel, and educational returns). The broad policy recommendation of the iWBBe report was to improve the largely fragmented data and reporting infrastructure currently in use by developing a more coherent, uniform, and rigorous data infrastructure framework that covers all relevant topics of continuing vocational education. The proposed framework incorporates inputs, processes, and outputs across multiple indicator groups, broken down into different subareas of continuing vocational education, including non-formal AET, in order to develop a more complete understanding of the German education and training landscape. Another primary recommendation from the iWBBe report is the creation of a digital interactive education dashboard to improve the availability, integration, and coherence of information across all facets of VET (Münchhausen et al., 2023[35]).
Although the iWBBe report and recommendations are important for the future of continuing education as well as for broader VET in Germany, there are existing strategies in place to collect data on continuing and non-formal education. VET analyses and reports use related international data from the European Union and the OECD (Münchhausen et al., 2023[35]), but German-specific data are primarily collected through surveys conducted by BIBB and other governmental partners. These include:
The Employment Survey of the Working Population on Qualification and Working Conditions in Germany (ETB), which is a telephone survey that collects robust and representative data on the German working population (i.e. those aged over 15 and working at least 10 hours a week). First conducted in 1979 and last administered in 2018, the ETB is conducted every four years in co-ordination between BIBB and the Federal Institute for Occupational Safety and Health (Rohrbach-Schmidt and Hall, 2020[37]).
“wbmonitor”, administered in co-ordination between BIBB and DIE, is an annual online survey of continuing education and training providers, and facilitates information exchange on the varying landscape of continuing education and training (BIBB, 2022[38]).
The continuing education data from these surveys have largely been disseminated, albeit not as the main focus, through three national reports designed for different audiences:
The Report on Vocational Education and Training 2021 (BMBF, 2021[39]), a politically framed report which, since 1977, has been legally mandated to be issued every year.
The VET Data Report Germany 2019 (BIBB, 2022[40]), which is a non-political accompaniment to the Report on Vocational Education and Training and provides the underlying data and statistics on VET and continuing education.
The National Education Report (KMK and BMBF, 2016[41]) is a report on the entire German education system, intended for a broad audience. It is commissioned by BMBF and the Standing Conference of the Ministers of Education and Cultural Affairs and has been released every two years since 2006.
Though limited in its current form and largely survey-based, Germany’s existing continuing education data and reporting infrastructure is actively maintained and continually being developed. Key institutions, such as BIBB, are aware of what structures and processes would be needed to build a more comprehensive data system for a broader range of VET and its subareas. Germany should therefore be relatively well equipped to further expand its data collection and analysis as required to tackle the ongoing and future challenges of VET among a wider population of students.
Data system integration
Integrating data about AET programme characteristics and outcomes remains a significant issue across the OECD. For example, in the United States, non-formal community college education is often referred to as the “hidden college” (Voorhees and Milam, 2005[24]). One of the key reasons for its lack of visibility in US higher education is that data on these programmes are often not held in the same or similar datasets as data on credit-based programmes. With limited federal aid, varied state funding, and non-formal education not often leading to credit-based programmes, the fact that many consider non-formal instruction, and the data about it, to be “hidden” should come as no surprise.
However, increasing non-degree credential attainment and quality has become an important policy goal for community colleges and other non-formal providers (Van Noy, 2023[42]). Tracking non-formal enrolments to accurately account for the non-formal function and student funding also needs to be considered (Romano and D’Amico, 2021[43]). Meanwhile, efforts continue to capture students’ transitions between non-formal and credit-based programmes (Xu and Ran, 2020[44]). As outlined in previous sections, not only is it important to improve data quality, but data systems must also be structured in a way that enables them to address these developments.
An analysis conducted for the US Department of Education showed that non-formal AET data are not always included in statewide longitudinal data systems and are highly inconsistent even within state higher education systems. In addition, states do not always house non-formal enrolment data in the same systems as credit enrolments (Office of Career, Technical, and Adult Education, 2014[45]). Some of this variation reflects practices within colleges: some colleges have chosen to integrate workforce education functions that span credit and noncredit education, while others have kept separate structures (Van Noy et al., 2008[16]).
Among the US states with robust non-formal community college data, different approaches are emerging. For example, Iowa combines all noncredit and credit data into a single state-level management information system (D’Amico et al., 2023[28]). Virginia’s system allows state leaders to view individual college data in real time. Louisiana, which has recently included “credentials of value” in its overall higher education attainment goals, has implemented the same student information system (Banner) in all 12 colleges in its system, for use with all credit and noncredit data (D’Amico et al., 2023[36]). Ultimately, researchers find that the data elements captured are only as good as the data platform they populate, making data management systems essential to the successful measurement of AET outcomes.
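To make the idea of a combined credit/noncredit data system concrete, the sketch below shows, under purely illustrative assumptions, how enrolment records from both sides of a college might be stacked into a single state-level table keyed on a common student identifier. The column names, codes and use of pandas are hypothetical and do not represent the actual systems in Iowa, Virginia or Louisiana.

```python
# Illustrative sketch only: hypothetical column names, not an actual state schema.
# Shows the basic idea of merging credit and noncredit enrolment records into one
# state-level table with a common student identifier and an instruction-type flag.
import pandas as pd

credit = pd.DataFrame({
    "student_id": ["A1", "A2"],
    "college": ["College X", "College Y"],
    "program_code": ["52.0201", "51.3801"],   # CIP-style codes (assumed)
    "term": ["2023F", "2023F"],
})
noncredit = pd.DataFrame({
    "student_id": ["A2", "B7"],
    "college": ["College Y", "College Z"],
    "program_code": ["51.0801", "49.0205"],
    "term": ["2023F", "2023F"],
})

credit["instruction_type"] = "credit"
noncredit["instruction_type"] = "noncredit"

# A single longitudinal table makes it possible to follow the same learner across
# both sides of the college, e.g. noncredit-to-credit transitions.
enrolments = pd.concat([credit, noncredit], ignore_index=True)
transitions = (
    enrolments.groupby("student_id")["instruction_type"]
    .nunique()
    .loc[lambda s: s > 1]
)
print(enrolments)
print("Students appearing in both credit and noncredit records:",
      list(transitions.index))
```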
The United States’ emphasis on synchronisation and standardisation is shared by other OECD countries, some of which have administrative data systems that offer even deeper linkages than the most comprehensive American statewide longitudinal data systems. Two countries are particularly noteworthy: Denmark and Estonia. Denmark’s Integrated Database for Labor Market Analysis (IDA) allows for linkages to rich longitudinal labour-market data, even if coverage of adult education credentials obtained outside the higher education system is not complete (Box 4.4). The Estonian Education Information System (EHIS) combines rich data on all forms of educational attainment with data on education and training providers. For a more comprehensive view that includes labour-market data, the Haridusportaal (Education Portal) integrates various information sources, such as educational and labour-market data, and presents them through rich personalised dashboards to guide career decision making. In contrast, in the United States, states do not typically use non-formal administrative data to offer individuals personalised recommendations for further education and career decision making (see Chapter 3 for a discussion of how such outcomes data could support more adults in making informed choices about AET).
Box 4.4. Denmark’s approach: Rich integrated datasets used primarily for targeted policy research
Each year, about 200 000 Danes participate in courses that might be considered roughly equivalent to the concept of non-formal or workforce education in other parts of the world. Courses that do not lead to higher-level credentials and have an explicit workforce focus are known as Arbejdsmarkedsuddannelser (AMU; literally translated as labour-market training). AMU courses are offered primarily by vocational colleges and standalone AMU centres, although trade unions are also important sources of AMU training (Kirkgaard Nielsen, 2022[46]).
Danish researchers (and foreign researchers using Danish data) have long enjoyed some of the world’s richest administrative data, due to the unusual degree to which systems are linked together across public sector sources. The Integrated Database for Labor Market Analysis (IDA) contains rich data on individuals’ earnings and the characteristics of the firms employing them (Timmermans, 2010[47]). Denmark could be described as almost an ideal case for the matching of data on non-formal training and credentials onto longitudinal labour-market data, potentially offering great insights into the long-term labour-market rewards associated with such credentials. However, despite the richness of data available, Denmark has not invested in the creation of data products that would provide public transparency on the outcomes of individual credentials and training providers.
Rather than collecting data on enrolment in non-formal programmes and non-degree credentials (NDCs) on a continuous basis, Denmark does so on an ad hoc basis. The analysis of administrative data is also frequently supplemented with learner surveys. For example, the Danish Ministry of Education uses a standard questionnaire for selected AMU courses, referred to as vis kvalitet (show quality). The questionnaire asks working learners questions that, unlike traditional university course evaluation questionnaires, directly gauge the relationship between course materials and on-the-job learning. Sample questions include:
Is this course useful to you in your current job?
Has this course meant that you are better able to take on new tasks at your workplace if there is a need for it?
Was the teacher good at showing the connection to what you do at work?
Although the Danish government has not created a specific dataset designed to track the longitudinal labour-market outcomes associated with completing an AMU course, think tanks and research organisations do occasionally conduct research to identify the current state of the field. A recent study published by the National Research Centre for Social Sciences found that completing an AMU course is associated with higher salaries and a greater probability of employment post-completion, but the study did not produce evidence of long-term labour-market impacts (VIVE, 2022[48]). Aggregate research of this kind has not resulted in any interactive dashboards or other data tools that would allow learners to compare AMU programmes directly, or that could inform programme development or funding.
The key strength of the Danish approach to non-formal AET data collection is its integration with national administrative datasets. This unlocks opportunities for research into the effectiveness of non-formal instruction across a wide range of social and demographic variables and minimises research costs. However, there has been little sustained interest in tracking the outcomes of individual programmes in a systematic manner, and no efforts have been made to launch data dashboards or dissemination tools that could directly aid individuals making career decisions. While interest in data collection and research may grow as non-formal micro-credentials proliferate (potentially with the support of European Union funding), no significant initiatives are expected for the foreseeable future.
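As a purely hypothetical illustration of the register-style linkage that a system like IDA enables, the sketch below joins training completion records to annual earnings records on a pseudonymised person identifier and compares average earnings before and after completion. All field names and values are invented; they are not drawn from IDA or any Danish dataset, and the before/after comparison is descriptive only.

```python
# Minimal sketch of register-style linkage, assuming hypothetical field names
# rather than any actual IDA schema: join completion records to annual earnings
# records on a pseudonymised person identifier and compare mean earnings in the
# years before and after completion.
import pandas as pd

completions = pd.DataFrame({
    "person_id": [101, 102],
    "completion_year": [2018, 2019],
    "amu_course": ["forklift operation", "welding"],
})
earnings = pd.DataFrame({
    "person_id": [101, 101, 101, 102, 102, 102],
    "year":      [2017, 2019, 2020, 2018, 2020, 2021],
    "annual_earnings_dkk": [310_000, 335_000, 342_000, 295_000, 320_000, 328_000],
})

linked = earnings.merge(completions, on="person_id")
linked["period"] = linked.apply(
    lambda r: "before" if r["year"] < r["completion_year"] else "after", axis=1
)
# Descriptive before/after means only; a real evaluation would need a comparison
# group before saying anything about causal impact.
summary = (
    linked.groupby(["amu_course", "period"])["annual_earnings_dkk"]
    .mean()
    .unstack("period")
)
print(summary)
```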
The United States is arguably more advanced than many other countries with respect to the standardisation of adult education data. In contrast, while Australia and Mexico do not have comprehensive national or significant state-level efforts specifically aimed at co‑ordinating data collection on adult education and training (AET), their data systems still capture important aspects of vocational education and training (VET). For example, Australia’s National Centre for Vocational Education Research (NCVER) collects extensive data on nationally recognised (formal) VET, including the Survey of Employers’ Use and Views of VET (SEUV), which gathers information on employers’ experiences with non-formal training. Similarly, individual universities in both Mexico and Australia (e.g. Tecnológico de Monterrey and the University of Melbourne) are also undertaking their own efforts to track labour-market outcomes related to AET. However, these institutional efforts often rely on surveys of individuals who have completed AET and may be limited by restricted access to administrative data systems that could better track workforce outcomes. National-level surveys are important components of efforts to learn more about the value of adult education in several other countries, particularly in the European Union where the Eurostat Adult Education Survey (referenced in Box 4.1) collects data on adult education completion and associated labour-market outcomes across each member state.
Conclusions and policy recommendations
Main findings
This chapter has explored the importance of building robust data collection systems for tracking information about non-formal AET and the different approaches taken in selected OECD countries. It has considered the conditions needed to enable consistent and comparable data and proposed an outline taxonomy based on experience in the context of US community colleges. It has also identified some of the partnerships and integration needed to provide comprehensive and relevant information about outcomes, such as labour-market impact. In conclusion, much progress is being made across OECD countries towards the creation of data systems, but gaps remain.
1. More comprehensive and consistent data collection is needed to enable tracking of the outcomes of non-formal AET and its associated credentials.
Such data collection can help to address two goals. The first, as highlighted throughout this chapter, is to better understand adult education and training outcomes among and across providers within national and sub-national contexts. The second is to enable systematic comparison of attainment rates and labour-market outcomes across different national (or sub-national) non-formal AET data systems. This could enable policy makers to better understand how non-formal credentials contribute to national competitiveness and how effectively different components of their skills development policies may be performing.
2. Establishing commonalities in definitions would allow countries and sub-national entities with robust data systems to contribute data to international benchmarking projects.
Current differences between countries in terminology, data collection methodologies and the availability of resources for research limit the ability to generate such cross-national statistics. Fragmented governance of key data systems (especially in the United States, where sub-national governments are leading the creation of data systems in a piecemeal fashion) also limits countries’ ability to create national datasets that can be effectively compared with each other, notwithstanding existing ad hoc data-sharing agreements such as the Coleridge Initiative in the United States.
As the differences among even the countries profiled in this chapter indicate, the availability and quality of non-formal AET data vary tremendously across the globe. Diligent efforts to fill gaps in data collection, together with technical assistance to increase analytical capacity, could help researchers begin to build a global picture of non-formal adult attainment. Higher education institutions that work in multiple countries, as well as technology platform providers such as Coursera that partner with higher education institutions in many countries, could also help to develop global estimates of non-formal attainment. Evidence about the labour-market impact of AET in developing countries could be particularly important for decision making about the allocation of funding towards such programmes, which might provide more cost-effective pathways out of poverty than traditional credit-bearing credentials.
Policy recommendations
The following recommendations are intended to provide a starting point as national and sub-national governments consider stronger and more consistent outcomes tracking for adult education and training:
Recommendation 1: Take an inventory of the policy conditions supporting consistent data collection, and the infrastructure available to achieve it.
Effectively incorporating data into evidence-based policy making requires both the research capacity to analyse the data and interest on the part of policy makers. Policy choices related to the use of non-formal AET data – such as the use of non-formal programme outcomes as a criterion in public funding decisions – and cultural factors both seem to influence the likelihood of a data system being used to build data dashboards and similar mechanisms that learners can use to guide their own decision making. It is up to policy makers to decide whether the returns, in terms of better student advice, better ways of distributing funding and better policy making, justify investment in the collection of non-formal AET data, but there is growing momentum behind the creation of more comprehensive and useful data systems in many countries. This momentum reflects the growing trend towards establishing viable alternative pathways to career advancement, promoting prosperity for individuals, communities and nations alike.
It is also important to keep in mind that data can come from survey responses as well as from administrative records. While survey response rates have fallen in recent years throughout much of the world, national and international survey research projects (such as PIAAC and, to a more limited extent, the Gallup World Poll) could be modified to ensure that adult education credentials are accurately identified and recorded in survey results. Although surveys will generally not produce enough data points to allow individual training programmes to be compared with each other, there is still much that can be learned from them about the labour-market rewards of adult education programmes at different levels and in different fields of study. Adding questions related to adult education attainment to PIAAC and ensuring consistency in the coding of results would be one option for the OECD to increase the availability of data on the credentials held by adult learners.
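One practical aspect of such consistency is coding: free-text survey answers about credentials need to be mapped onto a harmonised code list before results can be pooled across survey rounds or countries. The sketch below illustrates the idea with an invented code list and keyword mapping; it is not PIAAC’s actual coding frame, and the codes shown are assumptions for demonstration only.

```python
# Hypothetical sketch of consistent coding: map free-text credential answers to a
# small harmonised code list. The codes and keywords are invented for illustration.
HARMONISED_CODES = {
    "NDC-01": "non-credit certificate (education provider)",
    "NDC-02": "industry or vendor certification",
    "NDC-03": "licence issued by a public authority",
    "NDC-99": "other / unclassifiable non-degree credential",
}

KEYWORD_MAP = {
    "certificate": "NDC-01",
    "certification": "NDC-02",
    "license": "NDC-03",
    "licence": "NDC-03",
}

def code_response(free_text: str) -> str:
    """Map a respondent's free-text description to a harmonised code."""
    text = free_text.lower()
    for keyword, code in KEYWORD_MAP.items():
        if keyword in text:
            return code
    return "NDC-99"

for answer in ["Forklift license", "Cloud computing certification", "Welding certificate"]:
    print(answer, "->", code_response(answer), "-", HARMONISED_CODES[code_response(answer)])
```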
Recommendation 2: Employ a consistent data taxonomy and operational definitions across providers and governmental systems.
Just as US states are beginning to discuss the use of common definitions, national actors, potentially in co-ordination with international organisations such as the OECD, could work to align their definitions and data collection practices. The proposed data taxonomy discussed above is a step towards achieving consistency, providing a guide to developing approaches that can be adapted to individual countries’ particular needs and contexts.
An important starting point is a universal definition of NDCs and non-formal learning that translates across national borders. It may be necessary for an international working group to categorise credentials that are of a similar nature to each other, even if they do not share all the same characteristics. For instance, while it is clear that a non-credit certificate issued by a US community college has more in common with a Danish AMU course than with a Danish university degree, flexibility may be needed in how non-formal programmes are defined, both in terms of the way courses can be “stacked” to earn degrees and in terms of fields of study.
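As a purely illustrative sketch of what a shared, machine-readable taxonomy might look like, the example below defines a minimal record structure for non-formal offerings. The field names and categories are assumptions for demonstration only; they are not the taxonomy proposed by D’Amico et al. (2023[28]) or any national standard.

```python
# Hypothetical sketch of a machine-readable data taxonomy for non-formal AET
# records. Field names and categories are illustrative assumptions only.
from dataclasses import dataclass
from enum import Enum

class CredentialType(Enum):
    NONCREDIT_CERTIFICATE = "noncredit_certificate"
    INDUSTRY_CERTIFICATION_PREP = "industry_certification_prep"
    MICRO_CREDENTIAL = "micro_credential"
    NO_CREDENTIAL = "no_credential"          # e.g. a personal-interest course

@dataclass
class NonFormalRecord:
    provider_id: str           # e.g. a community college or AMU centre
    programme_id: str
    field_of_study: str        # ideally mapped to an agreed classification
    contact_hours: int
    credential_type: CredentialType
    stackable_to_degree: bool  # can the course be "stacked" towards a degree?
    public_funding: bool

record = NonFormalRecord(
    provider_id="PROV-0042",
    programme_id="WELD-101",
    field_of_study="precision metal working",
    contact_hours=120,
    credential_type=CredentialType.NONCREDIT_CERTIFICATE,
    stackable_to_degree=True,
    public_funding=True,
)
print(record)
```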
Recommendation 3: Identify and establish partnerships for data sharing.
As noted in the examples discussed above, robust non-formal data might begin with providers, but partnerships are also needed if workforce-relevant outcomes are to be included. These partnerships might begin with other government agencies that gather data on employment, wages and licensing, and may expand to include industry groups capturing information on the receipt of credentials.
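The sketch below illustrates, under invented file layouts and identifiers, the kind of cross-agency match such a partnership typically enables: completion records held by an education agency are matched to quarterly wage records held by a labour agency via a hashed identifier, so that employment and median wages can be reported by programme. Nothing here reflects any agency’s actual schema.

```python
# Illustrative sketch of a cross-agency match with hypothetical file layouts:
# education-agency completion records matched to labour-agency wage records via a
# hashed identifier, then aggregated into programme-level outcome metrics.
import pandas as pd

completions = pd.DataFrame({
    "hashed_id": ["h1", "h2", "h3"],
    "programme": ["CDL truck driving", "CDL truck driving", "phlebotomy"],
    "completion_quarter": ["2022Q4", "2022Q4", "2023Q1"],
})
wages = pd.DataFrame({
    "hashed_id": ["h1", "h3", "h9"],
    "quarter": ["2023Q2", "2023Q3", "2023Q2"],
    "quarterly_wage_usd": [14_500, 11_200, 9_800],
})

# Left join keeps completers without a wage match, so employment rates can be computed.
matched = completions.merge(wages, on="hashed_id", how="left")
outcomes = matched.groupby("programme").agg(
    completers=("hashed_id", "nunique"),
    employed=("quarterly_wage_usd", lambda s: s.notna().sum()),
    median_wage=("quarterly_wage_usd", "median"),
)
print(outcomes)
```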
Recommendation 4: Implement a multi-phased approach to building the data infrastructure needed to measure outcomes.
The data systems discussed in this chapter have undergone a multi-year “non-credit data journey”, adding data elements over time based on availability and priorities. National and sub-national governments would be well served to take steps toward more robust data collection, knowing that investments in data and systems occur over time and are often driven by similar motivations and needs, such as ensuring that funding is allocated to the programmes that will have the most impact. Each country seeking to improve its data infrastructure to capture the outcomes of AET will start from the baseline of already available data. The first three recommendations listed here – inventorying existing data-gathering infrastructure, identifying key data elements, and building partnerships – will be critical to obtaining the data needed to capture outcomes accurately and meaningfully. These are the first steps towards mapping out the data needed to cover the evolving landscape of credentials, providers and outcomes.
References
[33] Bahr, P. et al. (2022), Investigating the Hidden College: A Study of Community College Noncredit Education in Five States, University of Michigan and Opportunity America, https://bit.ly/CCNoncredit_FiveStates.
[40] BIBB (2022), Facts and Analyses to Accompany the Federal Government Report on Vocational Education and Training – Selected Findings, Federal Institute for Vocational Education and Training (BIBB), Bonn.
[38] BIBB (2022), wbmonitor—The Continuing Education and Training Landscape from the Provider Perspective, Federal Institute for Vocational Education and Training (BIBB), Bonn, https://www.bibb.de/en/2160.php.
[39] BMBF (2021), Report on Vocational Education and Training 2021, Federal Ministry of Education and Research (BMBF), https://www.bmbf.de/SharedDocs/Publikationen/de/bmbf/FS/31702_Berufsbildungsbericht_2021_en.pdf.
[34] Buckwalter, V. and T. Maag (2019), Closing the Credit-Noncredit Divide: Bridging the Gap in Postsecondary Education to Expand Opportunity for Low-Wage Working Adults, Jobs for the Future, https://www.luminafoundation.org/wp-content/uploads/2019/11/closing-the-credit-noncredit-divide.pdf.
[7] Caballero, A. et al. (2022), “Microcredentials: A new category of education is rising”, University World News, https://www.universityworldnews.com/post.php?story=20220705223949571.
[4] Cengage Group (2022), From the Great Resignation to the Great Reskilling: Insight on What’s Next for the “Great Resigners”, Cengage Group, https://cengage.widen.net/s/78hrkqgfj7/cg-great-resigners-research-report-final.
[15] D’Amico, M. et al. (2017), “A national analysis of noncredit community college education: Enrollment, funding, accountability, and contextual issues”, Community College Journal of Research and Practice, Vol. 41/4-5, pp. 288-302, https://doi.org/10.1080/10668926.2016.1251349.
[13] D’Amico, M. et al. (2014), “An exploration of noncredit community college enrollment”, Journal of Continuing Higher Education, Vol. 62/3, pp. 152-162, https://doi.org/10.1080/07377363.2014.953438.
[27] D’Amico, M. and M. Van Noy (2022), “State-level noncredit data definitions”, Issue Brief, Rutgers Education and Employment Research Center, https://smlr.rutgers.edu/sites/default/files/Documents/Centers/EERC/Data%20Definitions%20Report_Final%208.16.22tc.pdf.
[28] D’Amico, M. et al. (2023), Collecting and Understanding Noncredit Community College Data: A Taxonomy and How-to Guide for States, Rutgers Education and Employment Research Center, https://sites.rutgers.edu/state-noncredit-data/wp-content/uploads/sites/794/2023/11/State-Noncredit-Taxonomy_EERC_11.17.23.pdf.
[36] D’Amico, M. et al. (2023), The State Community College Noncredit Data Infrastructure: Lessons from Iowa, Louisiana, and Virginia, Rutgers Education and Employment Research Center, https://sites.rutgers.edu/state-noncredit-data/wp-content/uploads/sites/794/2023/08/The-State-Community-College-EERC-8.2023.pdf.
[30] Dembicki, M. (2023), “Momentum behind short-term Pell”, Community College Daily, https://www.ccdaily.com/2023/03/momentum-behind-short-term-pell/.
[31] Educause (2023), EDUCAUSE and WCET QuickPoll Results: Current Trends in Microcredential Design and Delivery, Educause, https://er.educause.edu/articles/2023/5/educause-and-wcet-quickpoll-results-current-trends-in-microcredential-design-and-delivery.
[14] Erwin, M. (2019), Noncredit Enrollment and Related Activities, National Postsecondary Education Cooperative, https://nces.ed.gov/ipeds/pdf/NPEC/data/NPEC_Paper_Noncredit_Enrollment_and_Related_Activities.pdf.
[1] European Commission (2020), Achieving a European education area by 2025 and resetting education and training for the digital age [Press Release], https://ec.europa.eu/commission/presscorner/detail/en/ip_20_1743.
[11] Gaston, P. and M. Van Noy (2022), Credentials: Understand the Problems. Identify the Opportunities. Create the Solutions, Routledge.
[20] Hoeckel, K. (2010), OECD Reviews of Vocational Education and Training: A Learning for Jobs Review of Austria 2010, OECD Reviews of Vocational Education and Training, OECD Publishing, Paris, https://doi.org/10.1787/9789264113695-en.
[21] IEB (2022), Initiative for Adult Education (IEB) and Curriculum Basic Education, IEB, https://conference.basicskills.eu/wp-content/uploads/2022/06/D1_P_2_Kemper_Weber_IEB-presentation-esbn.pdf.
[9] Kato, S., V. Galán-Muros and T. Weko (2020), “The emergence of alternative credentials”, OECD Education Working Papers, No. 216, OECD Publishing, Paris, https://doi.org/10.1787/b741f39e-en.
[46] Kirkgaard Nielsen, D. (2022), The AMU System Lets the Danish Workforce Continuously Upgrade Its Skills, Electronic Platform for Adult Learning in Europe, European Commission, Brussels, https://epale.ec.europa.eu/en/blog/amu-system-lets-danish-workforce-continuously-improve-its-skills.
[41] KMK and BMBF (2016), Education in Germany 2016: An Indicator-Based Report Including an Analysis of Education and Migration, Kultusministerkonferenz (KMK) and the Federal Ministry of Education and Research (BMBF).
[25] Lumina Foundation (2023), We’re tracking America’s progress toward the 60% attainment goal, Lumina Foundation website, https://www.luminafoundation.org/stronger-nation/report/#/progress.
[6] Microbol (2023), Micro-credentials Linked to the Bologna Key Commitments: A Common Framework for Micro-credentials in the EHEA, Knowledge Innovation Center, Malta, https://microbol.knowledgeinnovation.eu/wp-content/uploads/sites/20/2022/03/Micro-credentials_Framework_final-1.pdf.
[26] Mikelson, K. et al. (2017), TAACCCT Goals, Design, and Evaluation: The Trade Adjustment Assistance Community College and Career Training Grant Program Brief 1, Urban Institute, https://www.urban.org/sites/default/files/publication/89321/2017.02.08_taaccct_brief_1_final_v2_1.pdf.
[35] Münchhausen, G. et al. (2023), Integrated Continuing Education Reporting - Establishment of a Systematic Reporting on Vocational Training (iWBBe) (translated from German), BIBB and DIE, Bonn.
[29] NCES (2020), CIP: The Classification of Instructional Programs, National Center for Education Statistics, US Department of Education, https://nces.ed.gov/ipeds/cipcode/default.aspx?y=56.
[22] O’Banion, T. (2018), “A brief history of workforce education in community colleges”, Community College Journal of Research and Practice, Vol. 43/3, pp. 216-223, https://doi.org/10.1080/10668926.2018.1547668.
[45] Office of Career, Technical, and Adult Education (2014), Availability of Data on Noncredit Education and Postsecondary Certifications: An Analysis of Selected State-Level Data Systems, US Department of Education, https://files.eric.ed.gov/fulltext/ED555237.pdf.
[37] Rohrbach-Schmidt, D. and A. Hall (2020), BIBB/BAuA Employment Survey 2018: BIBB-FDZ Data and Methodological Report, Federal Institute for Vocational Education and Training (BIBB), Bonn, https://www.bibb.de/dienst/publikationen/de/16563.
[43] Romano, R. and M. D’Amico (2021), “How federal data shortchange the community college”, Change: The Magazine of Higher Learning, Vol. 53/4, pp. 22-28, https://doi.org/10.1080/00091383.2021.1930978.
[19] Statistics Austria (2023), AES Adult Education Survey, Statistics Austria website, https://www.statistik.at/en/about-us/surveys/individual-and-household-surveys/aes-adult-education-survey.
[17] Statistics Austria (2022), The National Statistical System, Statistics Austria website, https://www.statistik.at/en/about-us/organisation/the-national-statistical-system.
[5] Strada (2020), “COVID-19 work and education survey”, Public Viewpoint, Strada, https://cci.stradaeducation.org/wp-content/uploads/sites/2/2020/11/Report-September-16-2020.pdf.
[3] Tharoor, I. (2021), “The ’great resignation’ goes global”, Washington Post, https://www.washingtonpost.com/world/2021/10/18/labor-great-resignation-global/.
[47] Timmermans, B. (2010), “The Danish integrated database for labor market research: Towards demystification for the English speaking audience”, DRUID Working Papers, No. 10-16, Aalborg University, Aalborg, Denmark, http://www3.druid.dk/wp/20100016.pdf.
[18] UNESCO (2016), Initiative for Adult Education, Austria, UNESCO, https://www.uil.unesco.org/en/litbase/initiative-adult-education-austria.
[32] US Department of Labor (n.d.), A quick-start toolkit: Building registered apprenticeship programs, US Department of Labor, https://www.doleta.gov/oa/employers/apprenticeship_toolkit.pdf.
[42] Van Noy, M. (2023), “Making sense of quality in the non-degree credential (NDC) marketplace: Implications for policymakers and practitioners”, Issue Brief, Rutgers Education and Employment Research Center, https://smlr.rutgers.edu/sites/default/files/Documents/Centers/EERC/Making%20Sense%20of%20Quality%20in%20the%20NDC%20Marketplace%20-%20EERC%20-%208.2023.pdf.
[12] Van Noy, M. and K. Hughes (2022), “A snapshot of the shifting landscape of noncredit community college workforce education”, Issue Brief, Rutgers School of Management and Labor Relations, https://smlr.rutgers.edu/sites/default/files/Documents/Centers/EERC/Snapshot%20of%20Shifting%20Landscape%20Issue%20Brief.FINAL_0.pdf.
[23] Van Noy, M. and J. Jacobs (2009), “The outlook for noncredit workforce education”, New Directions for Community Colleges, Vol. 146, pp. 87-94, https://doi.org/10.1002/cc.369.
[16] Van Noy, M. et al. (2008), Noncredit Enrollment in Workforce Education: State Policies and Community College Practices, American Association of Community Colleges, Washington, DC, http://www.aacc.nche.edu.
[2] Van Noy, M. and S. Michael (2022), Promoting Quality, Creating Value: Organizational Influences in the Non-degree Credential Marketplace, Rutgers Education and Employment Research Center, https://smlr.rutgers.edu/sites/default/files/Documents/Centers/EERC/Market%20Processes%20Paper-V6.pdf.
[48] VIVE (2022), Effekter af Efteruddannelse: Analyser for Kommissionen for Andengenerationsreformer, National Research Centre for Social Sciences (VIVE), Copenhagen, https://www.vive.dk/da/udgivelser/effekter-af-efteruddannelse-rv7m4ovn/.
[24] Voorhees, R. and J. Milam (2005), The Hidden College: Noncredit Education in the United States, Curry School of Education, University of Virginia, https://www.issuelab.org/resources/9664/9664.pdf.
[10] Workcred (2021), How Do Credentials Differ?, Workcred, https://workcred.org/Documents/How-Do-Credentials-Differ.pdf.
[44] Xu, D. and F. Ran (2020), “Noncredit education in community college: Students, course enrollments, and academic outcomes”, Community College Review, Vol. 48/1, pp. 77-101, https://journals.sagepub.com/doi/pdf/10.1177/0091552119876039.
[8] Zanville, H. (2021), Nondegree Credentialing: A Global Issue, Nondegree Credential Research Network, https://cpb-us-e1.wpmucdn.com/blogs.gwu.edu/dist/8/3867/files/2021/05/NCRN-Report_OECD-Webinar_May-12-2021_final_PDF.pdf.