Centre for Effective Learning Environments (CELE)

Evaluating Quality in Educational Facilities: PEB Working Group

To help countries improve the quality of their school buildings, grounds and equipment, PEB established an ad hoc Working Group on Evaluating Quality in Educational Facilities. At its first meeting, held in Paris on 18-19 September 2006 and attended by nearly 30 participants from 22 countries, the working group discussed PEB’s existing framework on quality evaluation and ways of developing quality indicators, common user-centred tools and international quality benchmarks. The meeting advanced work initiated at two PEB experts’ group meetings on evaluating quality in educational facilities held in 2005.

PEB Organising Framework

In this session, participants reviewed the PEB Organising Framework on Evaluating Quality in Educational Facilities, one of the outcomes of the 2005 experts’ group meetings. The objective of the framework is to demonstrate the inter-relationships, over a facility’s life cycle, between the broad policy matters that both shape and respond to quality issues in educational facilities; current conceptions of what defines “quality” in educational facilities; the demands of users and other stakeholders, and the benefits of the facility; and methods to evaluate different aspects of quality. The framework comprises a definition, principles and criteria of quality; examples of successful methods for evaluating quality in educational facilities; and the parameters within which quality in educational facilities must be defined and evaluated. Specifically, the working group considered whether the existing organising framework is an acceptable conceptual model and reflected on the proposed educational policy issues that the framework should address.

The framework was substantially revised in light of the group’s discussions to highlight the important role of quality facilities in increasing access and equity for all in education, improving educational effectiveness and promoting acquisition of key competencies, and optimising building performance, operation and cost-effectiveness. Five principles of quality were adopted:

  • The facility is fit for purpose in relation to the users’ needs.
  • The facility is fit for purpose in relation to operational layout.
  • The facility is visually pleasing and educational, and the design offers symbolic meaning.
  • The facility provides a healthy and safe environment.
  • The facility is environmentally sustainable.

The working group agreed that results from any evaluation process must feed back into the building cycle and seek to raise awareness among those who can influence funding and improve design, namely officials and decision-makers. The group also stressed that the framework should not serve as a checklist; rather, it is a multi-dimensional, policy-oriented tool to help determine the most appropriate means of evaluating aspects of quality in educational facilities in different countries at local, regional and/or national levels. Individual countries should also use it to assess “quality” in terms of their own goals and priorities.

Developing common user-centred tools

The working group participants considered a proposal for developing common user-centred tools by way of an international study using a methodology modelled on post-occupancy evaluation. Following this session, a number of participants expressed interest in collaborating on a proposal for a pilot study. The study would use a variety of qualitative research tools to assess the educational effectiveness of a facility over its life cycle, from the perspective of the building’s users and other key stakeholders in the procurement process. More information on launching the Facility Performance Evaluation Pilot Study is available in the “News” section of this edition of PEB Exchange.

Developing indicators

This session explored two sources for developing quality indicators: PISA questionnaires, and statistics and indicators derived from administrative data.

The OECD Programme for International Student Assessment (PISA) offers many possibilities for exploring the relationship between school building quality and student performance through its questionnaires (see PEB Exchange, No. 58). However, a review of data and items related to the quality of physical infrastructure in the 2003 PISA School (Principal’s) Questionnaire – in which school principals were asked about the extent to which they perceived that the school’s capacity to provide instruction was hindered by a shortage or inadequacy of school buildings and grounds; heating/cooling and lighting systems; and instructional space (e.g. classrooms) – revealed little about the nature of this relationship. The working group suggested ways to improve existing items in the School Questionnaire and provided comments on a draft questionnaire on the Physical Learning Environment, to be completed by 15-year-old students.

In early 2007, the PISA Governing Board will consider the working group’s recommendations for the PISA 2009 cycle.

The few facilities-related data available in the OECD Education Database were presented, together with the results of a recent PEB Survey on Availability of Data on Educational Facilities. Although this survey, completed by 15 countries, indicates that much national-level data exists, there is a dearth of international data on educational facilities. The working group reflected upon the potential internationally comparable indicators that could be generated from existing sources, and the policy issues that such indicators could address.

The group agreed that data collection must reflect common interests and improve understanding of policy developments and issues concerning educational facilities in different countries. However, existing definitions and data relating to educational facilities – such as the cost of maintenance, school and classroom capacity, and the size of schools and educational spaces – are difficult to compare within and between countries. For these reasons, any future data collection must be simple and, at least initially, should not aim to create comparable definitions and datasets. The group also agreed that there is definite value in collecting data from countries on existing definitions and standards – for example on overcrowding, the age of buildings, funding models, the availability of teaching spaces, and access for students with special needs.

Participants suggested that indicators could address common areas of policy interest, such as:

  • investment in educational facilities annually and over time, and as a percentage of total expenditure on education, total GDP and GDP per capita, and educational expenditure per student;
  • the role of the private sector in financing educational facilities;
  • the decentralisation of funding and of responsibility in financing and managing educational facilities;
  • the capacity of educational facilities – in terms of their number, type and location – to meet the demands of changing school-age populations and of general trends in OECD countries, such as the rising age of compulsory education and increased investment in early childhood education and care.

A small meeting involving the 15 existing PEB country contacts for this activity and other interested experts will take place in 2007 to identify areas of common interest to countries.

Developing international benchmarks

In this session, the working group considered the constraints involved in developing international benchmarks, such as school construction and maintenance standards, student capacity regulations for classrooms and schools, and space standards. They also considered the types of international benchmarks (if any) that could be developed, and the real and potential policy impact of establishing standards, guidelines or benchmarks of quality in educational facilities at an international level.

The group agreed that it is neither feasible nor desirable to establish international performance standards for the purpose of measuring or ranking the quality of educational facilities, since standards vary greatly within and between countries owing to differences in education systems, procurement processes and degrees of decentralisation. However, there is value in developing international performance standards from which countries can derive their own national or local measurable standards of quality in educational facilities. The Secretariat will develop such standards as part of the Facility Performance Evaluation pilot study.

For further information on these activities, please contact Hannah.vonAhlefeld@oecd.org or consult www.oecd.org/edu/facilities/evaluatingquality.
