We are planning to build a new tool that fetches information about certain entities from Wikidata. We intend to do this by appending the .json suffix to the entity's URL to get machine-parsable data (e.g., for http://www.wikidata.org/entity/Q23240 we would call http://www.wikidata.org/entity/Q23240.json).
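To make the intended usage concrete, here is a minimal sketch of that fetch pattern using only the Python standard library. The tool name and contact address in the User-Agent header are placeholders (Wikimedia's etiquette guidelines ask clients to identify themselves):

```python
import json
import urllib.request

def entity_json_url(entity_id):
    # Append ".json" to the canonical entity URI to get parsable data
    return f"http://www.wikidata.org/entity/{entity_id}.json"

def fetch_entity(entity_id):
    # A descriptive User-Agent with contact info is requested by
    # Wikimedia's API etiquette; the value below is a placeholder.
    req = urllib.request.Request(
        entity_json_url(entity_id),
        headers={"User-Agent": "MyTool/0.1 (contact@example.com)"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Example: fetch_entity("Q23240") returns the entity's JSON as a dict.
```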
For the initial release, we will probably also need to fetch the JSON data for a large number of entities.
Given this, we would like to know what the rate limits are for read requests.
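Since there may be no hard read limit, our current plan is to throttle on the client side anyway. A minimal sketch of such a throttle follows; the 0.5-second interval is an arbitrary conservative choice on our part, not a documented limit:

```python
import time

class Throttle:
    """Enforce a minimum delay between successive requests (client-side politeness)."""

    def __init__(self, min_interval=0.5):
        # min_interval is our own conservative guess (~2 req/s), not an official limit
        self.min_interval = min_interval
        self._last = 0.0

    def wait(self):
        # Sleep just long enough so consecutive calls are at least
        # min_interval seconds apart, then record the current time.
        elapsed = time.monotonic() - self._last
        if elapsed < self.min_interval:
            time.sleep(self.min_interval - elapsed)
        self._last = time.monotonic()

# Usage: call throttle.wait() before each request in the bulk-fetch loop.
```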
I’m asking here because I couldn’t find a clear answer. Here is what I have found on the subject so far:
- http://wikimedia.7.x6.nabble.com/best-practice-for-rate-limits-for-accessing-the-English-Wikipedia-with-the-API-td1145619.html : “Rate limits are for editing and logging in only”
- https://www.mediawiki.org/wiki/API:Etiquette : “There is no hard and fast limit on read requests, but we ask that you be considerate and try not to take a site down”
- https://www.mediawiki.org/wiki/REST_API : “200 requests/s to this API overall”
As you can see, these statements are contradictory, so I would really appreciate clear information on this.
Thanks for your help