API rate limiting
The OECD Data Explorer API is subject to rate limiting to protect the network, manage traffic efficiently, and ensure a responsive experience for all users.
API access is currently restricted to a maximum of 60 data downloads per hour. Any requests exceeding this limit will be temporarily blocked. This restriction also applies to CSV file downloads from the data-explorer.oecd.org interface. Additionally, traffic originating from VPNs or anonymized sources is not allowed.
Please also note that API requests using certain parameters are restricted, as they can impact overall system performance. The list of restricted parameters is available here.
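The 60-downloads-per-hour cap can be respected client-side with a simple sliding-window throttle, so your scripts pause rather than get blocked. This is an illustrative sketch, not part of the API itself; the class name and interface are our own:

```python
import time
from collections import deque

class HourlyLimiter:
    """Client-side throttle allowing at most `max_calls` requests per `window` seconds."""

    def __init__(self, max_calls=60, window=3600.0):
        self.max_calls = max_calls
        self.window = window
        self._calls = deque()  # monotonic timestamps of recent requests

    def wait(self, now=None, sleep=time.sleep):
        """Block until another request is allowed, then record it.

        Returns the number of seconds slept (0.0 when under the limit).
        `now` and `sleep` are injectable to make the class testable.
        """
        now = time.monotonic() if now is None else now
        # Drop timestamps that have aged out of the window.
        while self._calls and now - self._calls[0] >= self.window:
            self._calls.popleft()
        waited = 0.0
        if len(self._calls) >= self.max_calls:
            # Sleep until the oldest recorded call leaves the window.
            waited = self.window - (now - self._calls[0])
            sleep(waited)
            now += waited
            self._calls.popleft()
        self._calls.append(now)
        return waited
```

Typical use: create one `HourlyLimiter()` per process and call `limiter.wait()` immediately before each data download.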
API best practices
The following suggestions are provided to support more efficient use of the OECD Data Explorer API, particularly for users who regularly retrieve the same datasets.
With limited exceptions, primarily for high-frequency economic indicators, most OECD datasets are updated infrequently, with revisions typically occurring once or twice a year. As a result, API responses tend to remain relatively stable over time. Implementing efficient querying strategies can help minimise unnecessary repeated requests and reduce the need for large-scale downloads.
1. Use the contentconstraint query
- This query provides a ValidFrom timestamp, indicating the last update time, and an Annotation with the total observation count.
- It also ensures you receive the latest version of the dataset, allowing you to verify if the dataset version has changed.
Example: https://sdmx.oecd.org/public/rest/contentconstraint/OECD.ELS.SPD/CR_A_DSD_SOCX_AGG@DF_SOCX_AGG/
https://sdmx.oecd.org/public/rest/ + contentconstraint/ + AGENCY_ID + /CR_A_ + DATASET_ID + /
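The URL pattern above can be wrapped in a small helper, together with the ValidFrom comparison it enables. This is a minimal sketch; the function names are our own, and only the URL structure comes from the example:

```python
BASE = "https://sdmx.oecd.org/public/rest/"

def contentconstraint_url(agency_id, dataset_id):
    """Build the contentconstraint URL for a dataset, following the pattern
    BASE + contentconstraint/ + AGENCY_ID + /CR_A_ + DATASET_ID + /."""
    return f"{BASE}contentconstraint/{agency_id}/CR_A_{dataset_id}/"

def dataset_changed(previous_valid_from, current_valid_from):
    """Re-download only when the ValidFrom timestamp returned by the
    contentconstraint query differs from the one stored locally."""
    return previous_valid_from != current_valid_from
```

In a typical workflow you would store the ValidFrom value from each contentconstraint response and skip the full data download whenever `dataset_changed` returns False.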
2. Cache your results locally
- If you or multiple users within the same organisation frequently fetch the same data, store results locally in a database or file.
- This avoids redundant requests, reduces server load, and prevents you from hitting API usage limits.
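A file-based cache like the one described above can be sketched in a few lines. This is an illustrative pattern, not an official client; `fetch` stands in for whatever HTTP call you already use (for example a wrapper around requests.get):

```python
import hashlib
from pathlib import Path

def get_with_cache(url, fetch, cache_dir="oecd_cache"):
    """Return the response body for `url`, reusing a local copy when present.

    `fetch` is any callable taking the URL and returning the body as a
    string; it is only invoked on a cache miss, so repeated requests for
    the same query never reach the API.
    """
    cache = Path(cache_dir)
    cache.mkdir(parents=True, exist_ok=True)
    # Hash the URL to get a safe, stable file name for the cache entry.
    entry = cache / (hashlib.sha256(url.encode("utf-8")).hexdigest() + ".txt")
    if entry.exists():
        return entry.read_text(encoding="utf-8")
    body = fetch(url)
    entry.write_text(body, encoding="utf-8")
    return body
```

Combined with the contentconstraint check, the cache can be invalidated only when the dataset's ValidFrom timestamp changes, rather than on a fixed schedule.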
3. Optimise query sizes
- Run larger, consolidated queries whenever possible, rather than multiple smaller ones.
- For very large datasets (e.g., those with over 10 million records), consider breaking the queries into smaller slices for better manageability.
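One common way to slice a large query is by time period, using the standard SDMX REST startPeriod/endPeriod parameters. The helper below is a sketch under that assumption; the function name and slice width are our own choices:

```python
def period_slices(start_year, end_year, span=5):
    """Split [start_year, end_year] into (startPeriod, endPeriod) pairs
    covering at most `span` years each, so one very large query becomes
    several smaller, more manageable ones."""
    slices = []
    year = start_year
    while year <= end_year:
        upper = min(year + span - 1, end_year)
        slices.append((str(year), str(upper)))
        year = upper + 1
    return slices
```

Each pair can then be appended to the data query as `?startPeriod=...&endPeriod=...`, and the partial results concatenated locally.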
Thank you for following these best practices to ensure efficient and reliable use of the API.