Rate limit clarifications




We are currently planning a new tool that fetches information about certain entities from Wikidata. We would like to do this by appending the .json suffix to the entity’s URL to get parsable data (e.g. for http://www.wikidata.org/entity/Q23240 we would call http://www.wikidata.org/entity/Q23240.json).
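As a minimal sketch of that approach (the helper names and the User-Agent string are illustrative, not an established API):

```python
# Minimal sketch: build the ".json" URL for a Wikidata entity and fetch it.
# urllib and json are standard library; the User-Agent string is a
# placeholder that should be replaced with your tool's own identification.
import json
import urllib.request

def entity_json_url(entity_id):
    """Return the parsable-data URL for an entity, e.g. Q23240."""
    return f"http://www.wikidata.org/entity/{entity_id}.json"

def fetch_entity(entity_id):
    """Fetch and parse the JSON representation of one entity."""
    req = urllib.request.Request(
        entity_json_url(entity_id),
        headers={"User-Agent": "my-wikidata-tool/0.1 (me@example.com)"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```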

For the initial release we will probably also want to fetch the JSON data for a large number of entities.
With that in mind, we would like to know what the rate limits are for reading.

I’m asking here because I couldn’t find a clear answer. Below is what I have found on the subject:

As you can see, the statements are contradictory, so I would really appreciate clear information about this.

Thanks for your help :bowing_man:


The first two links are about the action API, the last about RESTBase. You are asking about the Linked Data interface, which is neither.


OK, and is there a rate limit for this interface?
I have found this page but I can’t see anything related to rate limiting.


No clue; maybe @addshore knows.


Somebody? Nobody? At least a suggestion for where I should look?


You could ask on wikidata-tech or the tech advice meeting (which is mostly manned by Wikidata engineers).


What sort of rate were you thinking?

The guidelines for request limits can be found @ https://www.mediawiki.org/wiki/API:Etiquette#Request_limit

There is no hard and fast limit on read requests, but we ask that you be considerate and try not to take a site down. Most system administrators reserve the right to unceremoniously block you if you do endanger the stability of their site.

If you make your requests in series rather than in parallel (i.e. wait for one request to finish before sending a new one, so that you’re never making more than one request at a time), then you should definitely be fine. Also try to combine things into one request. For example: specify multiple ‘|’-separated titles in a titles parameter instead of making a new request for each title, and use a “generator” instead of making a request for each result of another request.
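A sketch of that advice applied to Wikidata, assuming the action API’s wbgetentities module with ‘|’-separated ids (the batch size of 50 is the usual per-request cap for ordinary clients; check the module docs for your case):

```python
# Sketch of serial, batched requests: combine up to BATCH_SIZE entity ids
# into one wbgetentities call instead of one request per entity, then
# fetch the resulting URLs one after another (never in parallel).
import urllib.parse

BATCH_SIZE = 50  # assumed per-request id cap for ordinary clients

def batch_ids(ids, size=BATCH_SIZE):
    """Group entity ids into '|'-joined strings, one per request."""
    return ["|".join(ids[i:i + size]) for i in range(0, len(ids), size)]

def request_urls(ids):
    """Build one action-API URL per batch; fetch these in series."""
    base = "https://www.wikidata.org/w/api.php"
    return [
        base + "?" + urllib.parse.urlencode(
            {"action": "wbgetentities", "ids": batch, "format": "json"}
        )
        for batch in batch_ids(ids)
    ]
```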

This relates specifically to the Wikimedia REST APIs @ https://en.wikipedia.org/api/rest_v1/ and https://wikimedia.org/api/rest_v1/ for example.

It should also be noted that you can get a dump of all entity JSON at https://www.wikidata.org/wiki/Wikidata:Database_download#JSON_dumps_(recommended)