Limits of the Wikipedia API

Wikipedia documents this in its API:Etiquette and API:FAQ pages. API:Etiquette says:

There is no hard and fast limit on read requests, but we ask that you be considerate and try not to take a site down. Most sysadmins reserve the right to unceremoniously block you if you do endanger the stability of their site.

If you make your requests in series rather than in parallel (i.e. wait for the one request to finish before sending a new request, such that you're never making more than one request at the same time), then you should definitely be fine. Also try to combine things into one request where you can (e.g. use multiple titles in a titles parameter instead of making a new request for each title).

The API:FAQ states that you can retrieve at most 50 pages per API request.
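A rough sketch of both points, batching up to 50 titles per call and issuing requests strictly one at a time, might look like the following. It uses the Python requests library against the standard Action API endpoint; the User-Agent contact address is hypothetical.

```python
import requests

API_URL = "https://en.wikipedia.org/w/api.php"
HEADERS = {"User-Agent": "MyResearchBot/0.1 (contact@example.com)"}  # identify your client; address is hypothetical


def fetch_in_batches(titles, batch_size=50):
    """Query the Action API serially, up to 50 titles per request."""
    session = requests.Session()
    session.headers.update(HEADERS)
    pages = {}
    for i in range(0, len(titles), batch_size):
        batch = titles[i:i + batch_size]
        params = {
            "action": "query",
            "format": "json",
            "prop": "info",
            "titles": "|".join(batch),  # combine titles instead of one request per title
        }
        resp = session.get(API_URL, params=params, timeout=30)
        resp.raise_for_status()
        pages.update(resp.json().get("query", {}).get("pages", {}))
    return pages


# Example: fetch_in_batches(["Python (programming language)", "MediaWiki"])
```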

You can also use the data dumps if you need content offline (they may be slightly out of date).

To terminate your script gracefully if you do hit a limit, handle the errors and warnings that API calls return as status messages.
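A minimal sketch of that kind of handling, assuming the default format=json response layout ({"error": {...}} and {"warnings": {...}}), could be:

```python
import sys
import requests

API_URL = "https://en.wikipedia.org/w/api.php"


def query_or_exit(session, params):
    """Run one API request; stop gracefully on an API error, report warnings."""
    resp = session.get(API_URL, params={**params, "format": "json"}, timeout=30)
    resp.raise_for_status()
    data = resp.json()
    if "error" in data:
        # e.g. {"error": {"code": "ratelimited", "info": "..."}}
        code = data["error"].get("code", "unknown")
        sys.exit(f"API error '{code}': {data['error'].get('info', '')}")
    for module, warning in data.get("warnings", {}).items():
        print(f"warning from {module}: {warning.get('*', warning)}", file=sys.stderr)
    return data
```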


If you don't need live data, it is better to use a data dump.


What do you plan to do with the Wikipedia API? See the Wikimedia REST API "Terms and conditions" for the current rate limits (200 requests per second as of 2022).
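A simple client-side throttle that stays well under that ceiling might look like this sketch against the REST API's page summary endpoint (again, the User-Agent contact is hypothetical):

```python
import time
from urllib.parse import quote

import requests

REST_API = "https://en.wikipedia.org/api/rest_v1"
MIN_INTERVAL = 0.01  # at most 100 requests/second, well under the documented 200/s ceiling

session = requests.Session()
session.headers.update({"User-Agent": "MyResearchBot/0.1 (contact@example.com)"})  # hypothetical contact

last_request = 0.0


def get_summary(title):
    """Fetch a page summary from the REST API, throttling on the client side."""
    global last_request
    wait = MIN_INTERVAL - (time.time() - last_request)
    if wait > 0:
        time.sleep(wait)
    last_request = time.time()
    resp = session.get(f"{REST_API}/page/summary/{quote(title)}", timeout=30)
    resp.raise_for_status()
    return resp.json()
```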