Improving Active Labour Market Policies in Spain
9. Monitoring and evaluation
Abstract
This chapter examines how selected ALMPs in Spain integrated monitoring and evaluation (M&E) frameworks. It begins by outlining the rationale for assessing M&E systems and presents the benchmark used for the qualitative assessment. The chapter analyses the extent to which programmes established mechanisms to monitor participation and outcomes, used feedback to improve delivery, evaluated their impact on participants, and promoted transparency and accountability. It highlights good practices identified during the assessment and concludes with recommendations to strengthen M&E frameworks in future ALMPs.
9.1. Understanding the criterion and its benchmark for qualitative assessment
Public policies, including ALMPs, must be introduced in a setting where transparency and accountability are promoted, and effectiveness and efficiency are analysed. This is where monitoring and evaluation play a crucial role. Monitoring and evaluation not only provide the objective evidence necessary for policymakers to adapt or terminate inefficient policies and promote policies that meet changing labour market needs effectively and efficiently, but also ensure that public money is spent on policies that produce the desired outcomes. The results generated through monitoring and evaluation make it possible to establish a process of continuous feedback for the ongoing improvement of ALMPs based on the evidence generated (OECD, 2022[31]; OECD, 2022[30]; OECD, 2023[32]).1
Monitoring and evaluation, though distinct in function, are complementary in their purpose to generate the evidence necessary to assess the success of a programme. Monitoring is an ongoing process of gathering and analysing information about a programme to ensure that it is progressing as planned, that activities have been implemented and deadlines are being respected, that participants and staff are satisfied and that objectives are being met. Such monitoring makes it easier to identify different kinds of issues with the design, implementation, and results of ALMPs and enables quick reactions and the provision of appropriate solutions.
In tandem with monitoring, evaluation activities focus on determining how effectively the programme has been implemented, whether it meets its objectives, whether there are causal links between the programme and its results, and if its benefits outweigh its costs (OECD, 2020[33]; OECD, 2020[34]). They serve as a systematic review of an ongoing or completed programme, encompassing its design, execution, and outcomes (OECD, 2022[35]). Evaluation demands a higher investment of time and resources compared to conventional monitoring indicators but is nonetheless critical to ensure a programme’s effectiveness.
Although numerous evaluation types exist and they can be classified in multiple ways, one proposed categorisation is as follows (OECD, 2023[17]; OECD, 2020[33]):
Formative evaluations: Ex ante assessment of whether a programme or intervention is feasible, appropriate, and acceptable before it is fully implemented. Most appropriate for assessing the evaluation criterion “Relevance”.
Process evaluation: Determines whether programme activities have been implemented as intended. Conducted to assess the “Coherence” criterion. As part of this evaluation, user experiences can be incorporated to better understand the effectiveness of programme implementation from the perspective of those directly involved.
(Intermediate) outcome evaluation: Measures intermediate programme effects in the target population by assessing progress in the outcomes or outcome objectives that the programme is intended to achieve.
Impact evaluation: Assesses programme effectiveness in achieving its ultimate goals.
Cost-effectiveness and cost-benefit evaluation: Examines the programme’s outcomes (cost-effectiveness) or impacts (cost-benefit) in relation to the costs of implementing the programme and, if possible, the opportunity costs for beneficiaries (e.g. foregone earnings) as well as indirect costs on non-beneficiaries (e.g. negative externalities).
These diverse types of evaluations are integral to a comprehensive monitoring and evaluation framework for ALMPs. Crucially, the insights derived from these monitoring and evaluation processes directly inform and refine programme design and implementation (see Chapter 2).
Box 9.1. Benchmark for excellence: Monitoring and evaluation of ALMPs
The benchmark ALMP is distinguished by its comprehensive monitoring and evaluation (M&E) framework, designed to ensure continuous improvement and accountability. This framework includes:
Systematic monitoring: The programme incorporates a systematic process for ongoing data collection and analysis. Monitoring includes tracking participant progress, employment rates, skills acquisition, and other relevant indicators. The aim is to establish a feedback loop, facilitating real-time adjustments to the programme to maintain alignment with strategic goals and evolving labour market demands.
Evaluation: Evaluation in the benchmark ALMP goes beyond traditional monitoring measures. This involves assessing (internally or in collaboration with independent experts) the effectiveness of the programme in achieving its objectives and its broader impact, using methodologies such as counterfactual impact evaluations. The focus is on measuring the (causal) impact of the programme on outcomes such as employment, job quality, its cost-effectiveness, as well as understanding the broader economic and social benefits of the programme.
Stakeholder feedback integration: The programme actively seeks and incorporates feedback from key stakeholders, including participants and employers. Methods like surveys, interviews, and focus groups are used to gather this feedback, ensuring that the programme remains responsive and participant-focused.
Transparency and responsive adjustments: The programme emphasises transparency in its reporting processes and implements accountability measures. This includes publicly sharing findings, undergoing independent reviews, and making adjustments to the programme based on monitoring and evaluation results.
9.2. Summary of the qualitative assessment results
9.2.1. Systematic monitoring
All assessed programmes were subject to the monitoring framework established under the RRP, which mandates the collection of common indicators to track participant progress. Implementing authorities submit bimonthly reports detailing programme implementation, participant characteristics, and service provision. The data collected typically included detailed tracking of all services and actions undertaken by participants (e.g. guidance sessions, assistance from job search teams, trainings, and other supportive actions). Each service or action was documented with its start and end dates, duration, the service provider in charge, and the location where it was delivered. This structured approach allows for comprehensive tracking of programme activities and ensures that service provision aligns with programme objectives.
However, for some programmes such as those targeting vulnerable groups, the RRP framework had a limitation: it did not require the systematic monitoring of labour market integration outcomes. This drawback has led some programmes to implement independent employment tracking mechanisms to address this gap. For most other programmes, such as those targeting women victims of gender-based violence and those supporting women in rural and urban areas, employment outcomes were tracked due to the direct link between service provider payments and successful job placements. In these cases, employment contracts were tracked, including the start and anticipated end date, the type of contract, the occupation, and the location where it was performed, along with the employer’s identification, such as its VAT number and corporate name. For self-employment, the programmes also tracked registration in the self-employment social security scheme.
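The record layout described above can be made concrete with a simple data structure. The sketch below is purely illustrative: the field names and types are hypothetical, and the actual RRP reporting templates differ.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class ServiceRecord:
    """One service or action delivered to a participant (illustrative fields only)."""
    participant_id: str
    service_type: str          # e.g. "guidance_session", "training"
    start_date: date
    end_date: Optional[date]   # None while the action is still ongoing
    duration_hours: float
    provider: str
    location: str

@dataclass
class EmploymentRecord:
    """A job placement tracked after participation (illustrative fields only)."""
    participant_id: str
    contract_start: date
    anticipated_end: Optional[date]  # None for open-ended contracts
    contract_type: str               # e.g. "open-ended", "fixed-term"
    occupation: str
    workplace_location: str
    employer_vat: str
    employer_name: str

# Example records: a guidance session followed by a fixed-term placement
svc = ServiceRecord("P001", "guidance_session", date(2023, 3, 1),
                    date(2023, 3, 1), 1.5, "Provider A", "Palma")
job = EmploymentRecord("P001", date(2023, 6, 1), date(2023, 12, 1),
                       "fixed-term", "warehouse operative", "Palma",
                       "ESX1234567", "Example S.L.")
```

Structuring records this way is what allows the service history and the subsequent employment spell of the same participant to be linked through a common identifier.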
In some programmes, data collection went beyond the RRP framework to capture more nuanced indicators of programme success. For instance, in some cases, employability assessments were conducted at both the start and the conclusion of itineraries to measure progress in terms of employability. Additionally, skill acquisition was monitored in some programmes using competency-based evaluation questionnaires, theoretical exams and the attainment of professional certificates.
In addition to quantitative monitoring, some programmes, particularly those targeting women victims of gender-based violence, collected more detailed qualitative data to gain a deeper understanding of participant progress. For example, reports were generated that included qualitative information on each participant’s self-esteem, autonomy, and other personal developments that were not immediately captured through employment metrics.
The technical infrastructure supporting monitoring within the RRP framework requires service providers to pre-fill bimonthly monitoring files, which are then submitted to the regional PES. The regional PES conduct quality checks on the collected data before forwarding it to SEPE, ensuring that the data meets the necessary standards. Simultaneously, SEPE, together with the regional PES, maintains a comprehensive national database known as the Information System of Public Employment Services (SISPE). This system aggregates data on the implementation of ALMPs across the country, capturing details about jobseekers, the various services and programmes they engage with, as well as data related to the management of ALMPs and unemployment benefits. Data flows into SISPE are facilitated through interfaces with the operational databases of regional PES, enabling compatible and efficient access to programme data.
Some regions have automated systems that allow service providers to input data directly into regional or national databases, streamlining the monitoring process and ensuring real-time data availability. However, for many programmes assessed, regions still relied on manual data entry, which increased the administrative burden and the risk of information gaps. Moreover, for some programmes, regions had not established any form of automated data exchange, nor did they manually input data into SISPE, further exacerbating the gaps in PES records concerning jobseekers participating in the programmes.
Moreover, the need to manage two parallel data collection systems generated significant inefficiencies, placing a substantial workload on both service providers and PES staff. Service providers had to duplicate efforts by entering similar information into both systems, while regional PES staff were tasked with performing quality checks and reconciling data from both sources to ensure consistency. This dual system created unnecessary complexity and delays, hindering the efficiency of the monitoring process.
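The reconciliation work that dual systems impose on PES staff can be sketched in a few lines. This is a hypothetical simplification with invented field names and data; actual quality checks are far more extensive.

```python
# Minimal sketch of a consistency check between two parallel monitoring
# systems (e.g. RRP monitoring files vs. PES operational records).
# All field names and records below are invented for illustration.

def reconcile(rrp_records, pes_records):
    """Compare record dictionaries keyed by participant ID; report discrepancies."""
    only_in_rrp = sorted(set(rrp_records) - set(pes_records))
    only_in_pes = sorted(set(pes_records) - set(rrp_records))
    mismatches = {}
    for pid in set(rrp_records) & set(pes_records):
        diffs = {k: (rrp_records[pid].get(k), pes_records[pid].get(k))
                 for k in rrp_records[pid]
                 if rrp_records[pid].get(k) != pes_records[pid].get(k)}
        if diffs:
            mismatches[pid] = diffs
    return {"only_in_rrp": only_in_rrp, "only_in_pes": only_in_pes,
            "mismatches": mismatches}

report = reconcile(
    {"P001": {"service": "training", "hours": 40},
     "P002": {"service": "guidance", "hours": 2}},
    {"P001": {"service": "training", "hours": 40},
     "P003": {"service": "guidance", "hours": 1}},
)
# Participants recorded in only one system surface as gaps to be resolved.
```

Even this toy version shows why integrating the two systems is preferable: every divergence between them becomes manual resolution work that a single data flow would avoid.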
9.2.2. Evaluation practices
While monitoring processes were well-established, ensuring that activities were implemented as planned, identifying bottlenecks or inefficiencies in service delivery, and tracking participant progress and outcomes, evaluation frameworks remained largely underdeveloped. Most programmes lacked formal impact evaluation mechanisms, which made it difficult to determine whether observed employment outcomes were directly attributable to programme participation. In rare cases, experimental or quasi‑experimental evaluations are being piloted to measure causal programme impact on employment and well-being. One such example, still in its early stages, aims to assess the effects of participation by comparing the employability outcomes of participants with those of a control group, which serves as a counterfactual to isolate the programme’s causal impact. The evaluation also seeks to capture broader qualitative effects, such as improved self-confidence and autonomy among participating women, providing a more comprehensive understanding of the programme’s impact.
9.2.3. Stakeholder feedback integration
Some programmes actively collected feedback from stakeholders through surveys and interviews, providing valuable insights into programme quality and effectiveness. For example, participant feedback was sometimes gathered at multiple stages throughout the programme, allowing for real-time adjustments based on jobseekers’ experiences. When employment was an integral part of the programmes, employer feedback helped assess job readiness and the relevance of training, and identify sector-specific skills gaps, ensuring that training and employment support aligned with market needs. However, the integration of both participant and employer feedback was not yet systematic across all programmes, and there was a lack of consistent mechanisms to ensure this feedback was used to inform ongoing programme improvements.
9.2.4. Transparency and responsive adjustments
While monitoring results were systematically reported to managing authorities, their use for public accountability and programme adaptation remained inconsistent across programmes. In most cases, final reports submitted by service providers remained internal to the PES, limiting opportunities for broader learning and programme improvement. Additionally, findings from M&E activities were not consistently integrated into decision making processes to inform significant programme redesigns, often leading to only minor procedural adjustments rather than more comprehensive refinements based on evidence.
However, some programmes took important steps towards greater transparency and responsiveness. For instance, publishing programme results and statistical breakdowns online has increased public accountability and facilitated cross-programme learning. Furthermore, some programmes demonstrated a commitment to data-driven decision making by using M&E results to introduce mid-course adjustments, responding to emerging challenges and fostering continuous improvement throughout implementation.
9.3. Good practices identified
The following two examples, presented in Box on good practices 16 and Box on good practices 17, illustrate particularly noteworthy approaches to monitoring and evaluation identified during the assessment exercise. The first, from the programme targeting vulnerable groups in Baleares, shows how regional authorities can develop an integrated and transparent M&E system that goes beyond mandatory RRP requirements. The second, managed by SEPE through its Provincial Directorate in Ceuta, Melilla, and five other regions (Andalucía, Aragón, Canarias, Cataluña, Madrid), combines systematic monitoring with additional tools to capture broader dimensions of reintegration and participant feedback.
Box on good practices 16. Monitoring and evaluation (M&E) for vulnerable groups programme in Baleares
Baleares exemplifies a good practice in monitoring and evaluation (M&E) within the framework of ALMPs targeting vulnerable groups. The comprehensive M&E framework implemented in Baleares aligns with the benchmark for excellence by ensuring continuous improvement and maintaining high transparency standards.
Detailed and systematic monitoring
In Baleares, service providers are given access to specialised applications, ACCFOR and ESOIB, to meticulously document all aspects of participant engagement. These platforms allow for the direct upload of comprehensive data on every service and action provided to participants – from initial enrolment to programme completion, including any instances of dropout. This data is seamlessly integrated into both regional and national databases managed by the PES, contributing to the Information System of Public Employment Services (SISPE). This automated data exchange enhances the effectiveness of the monitoring process by ensuring accurate and timely information flow.
Proactive labour market integration monitoring
Distinctively, despite the RRP monitoring framework not mandating such measures, Baleares proactively monitors labour market integration 6 and 12 months after programme completion. The regional PES (SOIB) accomplishes this by tracking participants within the social security database to verify their engagement in employment contracts. This proactive approach allows for a more accurate assessment of the programme’s impact on employability.
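The follow-up check described here — verifying employment at fixed intervals after completion — can be sketched as follows. This is an illustrative simplification with invented data; the actual verification runs against social security records.

```python
# Illustrative sketch of follow-up outcome tracking: does a participant
# hold an employment contract 6 and 12 months after programme completion?
# Contract spells are (start, end) date pairs; end is None for open-ended.
from datetime import date, timedelta

def employed_on(check_date, contracts):
    """True if any contract spell covers check_date."""
    return any(start <= check_date and (end is None or check_date <= end)
               for start, end in contracts)

def follow_up_outcomes(completion_date, contracts):
    """Employment status at 6 and 12 months (approximated as 30-day months)."""
    return {months: employed_on(completion_date + timedelta(days=30 * months),
                                contracts)
            for months in (6, 12)}

# Invented example: completion on 1 Jan 2023, one contract March-October 2023
outcomes = follow_up_outcomes(date(2023, 1, 1),
                              [(date(2023, 3, 1), date(2023, 10, 31))])
# -> employed at the 6-month check, no longer employed at the 12-month check
```

Checking at two horizons rather than one is what distinguishes sustained integration from short employment spells, which is the point of the 6- and 12-month design.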
Integrating participant feedback
Baleares places a strong emphasis on participant feedback to refine and enhance service delivery. Throughout the programme, participants are invited to evaluate each training session and the overall guidance and counselling they receive through structured surveys. This direct feedback is essential for ongoing programme adjustments and is fully integrated into the overall monitoring process.
Commitment to transparency and dissemination
SOIB in Baleares sets a high standard for transparency by publishing analytical reports on its official website (https://soib.es/avaluacio-de-programes-i-serveis/). These reports provide an analysis of monitoring actions and survey results, offering key statistics and insights on all aspects of the programme. Information provided includes demographic breakdowns of participants by gender, level of education, and age. Additionally, the reports include breakdowns by service provider, and analyse outcomes such as dropout rates, successful completions, labour market integration, and participant satisfaction.
Source: Authors from information collected through questionnaires and consultations.
Box on good practices 17. Comprehensive and nuanced approach to monitoring and evaluation (M&E) in the programme targeting women victims of gender violence and human trafficking by SEPE in Ceuta, Melilla, and in the provinces of Zaragoza (Aragón), Barcelona (Cataluña), Madrid (Comunidad de Madrid), Las Palmas (Canarias), Málaga, and Sevilla (Andalucía)
Systematic monitoring for continuous improvement
A core strength of the programme is its adherence to the structured monitoring processes mandated by the RRP monitoring and evaluation (M&E) framework. This includes the collection of anonymised participant data, ensuring that privacy concerns are respected while still providing comprehensive tracking of participants’ progress. Through bimonthly reports submitted to SEPE, the programme tracks critical indicators such as the number of women targeted, employment status, training details, and legal instruments used. The programme introduces a range of documents that facilitate real-time monitoring. Examples include reports generated for services like “Servicio Puedo” and “Servicio Proyecto”, which track participants’ progress in personal, social, and employment skills development. Monthly reports document attendance, active job searches, and employer interactions, while final reports detail employment outcomes, such as contract types, work placements, and self-employment transitions.
Additionally, the programme uses a project management system based on the Logical Framework Approach (LFA), a structured methodology for planning, managing, and evaluating projects. The LFA’s focus on setting measurable indicators from the outset supports the systematic collection and analysis of data, ensuring that the programme remains aligned with its objectives and can adapt based on participant progress.
Assessment of Survivor Outcomes (ASO)
A unique feature of the SEPE/Red Cross programme is the use of the Assessment of Survivor Outcomes (ASO) tool, developed by the International Justice Mission. The ASO tool goes beyond traditional monitoring by assessing the holistic recovery and reintegration of women survivors of violence and exploitation. This tool monitors participants’ progress across six critical domains that are essential for reducing vulnerability to revictimisation and fostering long-term reintegration:
Safety: Assesses whether participants are living in safe environments.
Legal protection: Tracks participants’ access to legal support and rights.
Mental well-being: Monitors psychological recovery and emotional health.
Economic empowerment and education: Evaluates participants’ readiness and ability to re‑enter the workforce or education and training systems.
Social support: Measures the strength of social networks and the support available to participants.
Physical well-being: Assesses participants’ overall health and physical recovery.
This tool provides a comprehensive view of a survivor’s progress, going beyond economic outcomes to capture personal empowerment and societal reintegration.
Integration of participant feedback
A vital component of the programme’s M&E is its proactive use of participant feedback. Regular satisfaction surveys gauge participants’ experiences, focusing on the relevance and usefulness of activities, the quality of information and support provided, and the alignment of the programme with their expectations and needs. Additionally, the programme offers open channels for suggestions and complaints through the official website of the Spanish Red Cross, allowing for continuous feedback and adaptation.
Source: Authors from information collected through questionnaires and consultations.
9.4. Policy directions: Avenues for strengthening monitoring and evaluation in future ALMPs
Spain has established an integrated model for monitoring and evaluating ALMPs, structured around four key processes: (1) monitoring and evaluation of the current Spanish Strategy for Active Support to Employment and its annual Employment Policy Plans;2 (2) monitoring of services provided to jobseekers and employers through the Common Portfolio of Services; (3) performance assessments of PES, including through benchlearning exercises; and (4) external and independent evaluations of ALMPs and their instruments. Drawing on evidence from the assessment conducted, the following recommendations seek to reinforce and enhance this framework.
9.4.1. Enhance systematic monitoring
While systematic monitoring was in place for most programmes, improvements are necessary to ensure consistent data collection and streamline the monitoring process across regions. Future ALMPs should:
Systematise the tracking of labour market integration outcomes as a core part of the monitoring process. This involves the mandatory tracking of outcomes such as job placements, the type of contract, job duration, and other key labour market integration indicators. Ideally, monitoring should extend beyond the end of the programme, ensuring long-term follow-up on employment sustainability and career progression.
Encourage the collection of outcome data on broader dimensions such as social integration, well-being and skills acquisition, to provide a more holistic understanding of participant success and programme effectiveness. Initiatives such as ES_DataLab, which facilitates secure access to linked administrative microdata from SEPE and multiple other public institutions, can support the analysis of these broader outcomes.
Integrate parallel data monitoring systems, such as the RRP monitoring system and PES databases, to reduce inefficiencies, duplication of effort and administrative burdens on service providers and PES staff, ensuring more consistent and accurate data flows.
Prioritise the development of automated systems for data exchange in regions that have not yet implemented them. This would enable real-time data tracking and secure data exchange, reduce the risk of information gaps, and improve the overall responsiveness and efficiency of the monitoring system.
9.4.2. Implement robust evaluations
While monitoring processes were established, most programmes lacked robust evaluation mechanisms to assess their impact on employment, job quality and other outcomes. Such evaluations could provide the evidence needed to refine programme design, ensure that public resources are used effectively, and assess the economic efficiency of programmes. Future ALMPs should:
Incorporate formal impact evaluations, using experimental or quasi‑experimental methodologies to measure causal impacts on employment, social integration, and well-being. This could include, for example, conducting randomised controlled trials (RCTs) at the national or regional level, or using staggered implementation across geographical areas and over time to enable the use of comparison groups. This would provide a more comprehensive understanding of the programme’s effectiveness.
Include cost-benefit analysis to assess the economic efficiency of programmes, comparing the costs of implementation with the benefits derived from improved employment outcomes, social integration, and other impacts on participants.
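The two recommendations above — causal impact estimation and cost-benefit analysis — rest on calculations that can be illustrated in stylised form. The sketch below uses invented figures; real counterfactual evaluations add matching or weighting, covariate adjustment and standard errors, and real cost-benefit analyses cover far more cost and benefit categories.

```python
# Stylised sketch of (1) a counterfactual impact estimate as the difference
# in employment rates between participants and a comparison group, and
# (2) a simple benefit-cost ratio. All data and figures are invented.

def employment_rate(outcomes):
    """Share employed, where 1 = employed and 0 = not employed."""
    return sum(outcomes) / len(outcomes)

def estimated_impact(treated, control):
    """Difference in employment rates between treatment and comparison group."""
    return employment_rate(treated) - employment_rate(control)

def npv(flows, rate):
    """Net present value of yearly flows, with the first flow at year 0."""
    return sum(f / (1 + rate) ** t for t, f in enumerate(flows))

treated = [1, 1, 0, 1, 1, 0, 1, 1]   # 8 participants, 6 employed
control = [1, 0, 0, 1, 0, 0, 1, 0]   # 8 comparison jobseekers, 3 employed
impact = estimated_impact(treated, control)          # 0.75 - 0.375 = 0.375

cost_per_participant = 3000.0                        # delivery cost, year 0
extra_earnings = [0.0, 1500.0, 1500.0, 1500.0]       # attributable gains, years 0-3
benefit = npv(extra_earnings, rate=0.03)
benefit_cost_ratio = benefit / cost_per_participant  # > 1: benefits exceed costs
```

The comparison group is what turns a raw employment rate into an impact estimate: without it, the 75% placement rate among participants would be uninterpretable, since some would have found work anyway.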
9.4.3. Strengthen stakeholder feedback integration
While some programmes collected feedback from participants and employers, this process was not systematic across all programmes. Future ALMPs should:
Systematically gather participant feedback at multiple points throughout the programme, ensuring that their experiences and suggestions can inform real-time adjustments.
Integrate employer feedback in a more structured way to assess the relevance of itineraries and training, job readiness, and sector-specific skills gaps. This would help tailor training programmes to actual labour market needs and improve employment outcomes.
Use stakeholder feedback to make responsive adjustments throughout the programme cycle, ensuring that the programme remains dynamic and adaptive.
9.4.4. Improve the transparency and use of M&E findings
Transparency in reporting M&E results is vital for public accountability and continuous programme improvement. In line with the provisions of Title VI of the Employment Law 3/2023, which establishes the obligation to publish evaluation results and integrate them into decision making processes, future ALMPs should:
Publish M&E findings, enabling external scrutiny, promoting accountability, and facilitating cross-programme learning.
Ensure that findings from M&E activities are systematically used to inform not only minor adjustments but also more comprehensive programme redesigns based on evidence.
9.4.5. Invest in capacity building for M&E systems
Effective M&E activities require adequate resources and capacity at all levels of programme implementation. Building on existing initiatives, such as the training guarantee established under Royal Decree 438/2024 and the permanent training plan approved by the Sectoral Conference, future ALMPs should invest in training and capacity-building for staff responsible for data collection, analysis, and M&E in both service providers and PES. This could help ensure high-quality data and effective use of M&E systems across regions. Additionally, regions, particularly those with limited capacity and experience, should receive technical and financial support to develop and implement automated data systems.
Notes
1. See the OECD-EC project on policy impact evaluation through the use of linked administrative and survey data.
2. Currently called annual Plans for the Promotion of Decent Employment (PAFED).