API rate limiting
To protect the OECD Data Explorer API, manage traffic effectively, and maintain a responsive experience for all users, rate limiting was introduced in November 2024.
API access is currently limited to a maximum of 20 data downloads per hour. Requests exceeding this threshold will be temporarily blocked. In addition, traffic originating from VPNs or anonymised sources is not permitted.
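One way to stay under the published quota is to throttle on the client side before each download. The sketch below is an illustrative, self-contained limiter (the class name and default values are our own; only the 20-downloads-per-hour figure comes from the policy above):

```python
import time
from collections import deque


class HourlyLimiter:
    """Client-side guard for a downloads-per-hour quota.

    The 20-per-hour default reflects the current published OECD limit;
    adjust it if the policy changes.
    """

    def __init__(self, max_calls=20, window_seconds=3600):
        self.max_calls = max_calls
        self.window = window_seconds
        self.calls = deque()  # timestamps of recent downloads

    def acquire(self, now=None):
        """Return True if another download may be started now, else False."""
        now = time.monotonic() if now is None else now
        # Discard timestamps that have aged out of the rolling window.
        while self.calls and now - self.calls[0] >= self.window:
            self.calls.popleft()
        if len(self.calls) < self.max_calls:
            self.calls.append(now)
            return True
        return False
```

Before each API download, call `acquire()` and wait (or defer the job) when it returns `False`, rather than letting the server block you.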
OECD is actively working to ease these restrictions in the future. This will be made possible through planned infrastructure upgrades aimed at supporting higher loads and improved concurrency. However, this transition is expected to take several more months.
API best practices
The following suggestions are provided to support more efficient use of the OECD Data Explorer API, particularly for users who regularly retrieve the same datasets.
With limited exceptions, mainly high-frequency economic indicators, most OECD datasets are updated infrequently, with revisions typically occurring once or twice a year. As a result, API responses tend to remain stable over time. Efficient querying strategies can therefore eliminate unnecessary repeated requests and reduce the need for large-scale downloads.
1. Use the contentconstraint query
- This query provides a ValidFrom timestamp, indicating the last update time, and an Annotation with the total observation count.
- It also ensures you receive the latest version of the dataset, allowing you to verify if the dataset version has changed.
Example: https://sdmx.oecd.org/public/rest/contentconstraint/OECD.ELS.SPD/CR_A_DSD_SOCX_AGG@DF_SOCX_AGG/
https://sdmx.oecd.org/public/rest/ + contentconstraint/ + AGENCY_ID + / + CR_A_ + DATASET_ID + /
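A lightweight way to use this is to pull the `validFrom` value out of the contentconstraint response and compare it with the value you stored after your last download. The helper below is a sketch: it assumes `validFrom` appears as an XML attribute somewhere in the SDMX-ML response (the usual SDMX 2.1 convention), and the sample document in the usage example is illustrative, not actual API output.

```python
import xml.etree.ElementTree as ET


def extract_valid_from(xml_text):
    """Return the first validFrom attribute found in an SDMX-ML response,
    or None if no such attribute is present.

    The attribute name and placement are assumptions based on typical
    SDMX 2.1 contentconstraint responses; inspect a real response from
    the endpoint above to confirm.
    """
    root = ET.fromstring(xml_text)
    for elem in root.iter():
        for attr, value in elem.attrib.items():
            # Attribute keys may carry a namespace prefix, so match the suffix.
            if attr.endswith("validFrom"):
                return value
    return None
```

If the extracted timestamp matches the one you saved previously, the dataset has not changed and you can skip the (much larger) data download entirely.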
2. Cache your results locally
- If you or multiple users within the same organisation frequently fetch the same data, store results locally in a database or file.
- This avoids redundant requests, reduces server load, and prevents you from hitting API usage limits.
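A minimal file-based cache can look like the sketch below. The function and directory names are our own, and the HTTP client is injected as a plain callable so the caching logic stays independent of whichever library you use:

```python
import hashlib
from pathlib import Path


def cached_fetch(url, fetcher, cache_dir="oecd_cache"):
    """Return the response body for `url`, reading from a local file
    cache when available.

    `fetcher` is any callable mapping url -> str (e.g. a wrapper around
    your HTTP client). It is only invoked on a cache miss.
    """
    cache = Path(cache_dir)
    cache.mkdir(parents=True, exist_ok=True)
    # Hash the URL so it becomes a safe, fixed-length file name.
    key = hashlib.sha256(url.encode("utf-8")).hexdigest()
    path = cache / f"{key}.txt"
    if path.exists():
        return path.read_text(encoding="utf-8")
    body = fetcher(url)
    path.write_text(body, encoding="utf-8")
    return body
```

For shared use within an organisation, the same idea scales up to a database table or a network share; combined with the `validFrom` check above, the cache only needs refreshing when the dataset version actually changes.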
3. Optimise query sizes
- Run larger, consolidated queries whenever possible, rather than multiple smaller ones.
- For very large datasets (e.g., those with over 10 million records), consider breaking the queries into smaller slices for better manageability.
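For time-series data, slicing by period is a simple, dataset-agnostic way to break a very large query into manageable pieces, since SDMX data queries accept `startPeriod` and `endPeriod` parameters. The helper below is a sketch (the function name and 5-year default are our own choices):

```python
def year_slices(start_year, end_year, chunk=5):
    """Split the inclusive range [start_year, end_year] into consecutive
    (start, end) pairs of at most `chunk` years each, suitable for use
    as startPeriod/endPeriod query parameters."""
    slices = []
    year = start_year
    while year <= end_year:
        upper = min(year + chunk - 1, end_year)
        slices.append((year, upper))
        year = upper + 1
    return slices
```

Each returned pair then becomes one request, e.g. `...?startPeriod=2000&endPeriod=2004`, which keeps individual responses small while the full range is still covered without overlap.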
Thank you for following these best practices to ensure efficient and reliable use of the API.