Wikimedia Developer Support

Developing an RDF recommender system to help look for biases/content gaps or simply recommend a Wikipedia page to edit


#1

Hi guys, this is my first post here.

I am a Norwegian computer science master's degree student currently on an exchange year in Chile, where I am taking a postgraduate course on the Semantic Web. For my final project I want to develop an RDF recommender system that is fed Wikipedia revisions/edits in real time and combines them with Linked Open Data such as Wikidata or DBpedia to recommend other pages that should be edited.

I wanted to ask if anybody would be so kind as to point me in the right direction on how to create such a stream of edits/revisions. Ideally I would use Apache Kafka, with a producer writing these streams to a topic. Maybe something like this already exists, or maybe there is an API I could use.
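To make the producer-to-topic idea concrete, here is a minimal sketch of what the Kafka side might look like. The topic name and broker address are placeholders, and it uses the third-party kafka-python package; it is an illustration of the architecture, not a finished design.

```python
# Hypothetical sketch: forwarding edit events to a Kafka topic.
# Topic name and broker address are made up for illustration.
import json

TOPIC = "wiki-edits"  # hypothetical topic name


def forward_edit(producer, edit: dict) -> None:
    """Serialize one edit event as JSON and send it to the topic."""
    producer.send(TOPIC, json.dumps(edit).encode("utf-8"))


def main() -> None:
    # Requires a running Kafka broker and `pip install kafka-python`;
    # not called automatically.
    from kafka import KafkaProducer

    producer = KafkaProducer(bootstrap_servers="localhost:9092")
    forward_edit(producer, {"wiki": "enwiki", "title": "Chile", "type": "edit"})
    producer.flush()
```

The edit events themselves would come from whatever revision source turns out to exist (see the replies below this post).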

Any pointers in the right direction would be massively appreciated.

Thanks!


#2

There is EventStreams, which uses Kafka internally but is exposed as an SSE-based web service.
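For reference, consuming it from Python takes only a few lines. The stream URL below is the documented `recentchange` endpoint; the field names match that stream's schema, but treat the filtering logic as illustrative.

```python
# Minimal sketch of consuming Wikimedia EventStreams (SSE).
import json

STREAM_URL = "https://stream.wikimedia.org/v2/stream/recentchange"


def parse_event(data: str) -> dict:
    """Extract the fields a recommender would likely care about."""
    ev = json.loads(data)
    return {
        "wiki": ev.get("wiki"),
        "title": ev.get("title"),
        "type": ev.get("type"),  # "edit", "new", "log", "categorize", ...
        "user": ev.get("user"),
        "comment": ev.get("comment"),
    }


def stream_edits(wiki: str = "enwiki"):
    """Yield parsed edit events for one wiki (needs `pip install sseclient`)."""
    from sseclient import SSEClient

    for event in SSEClient(STREAM_URL):
        # Skip keepalive messages, which arrive with empty data.
        if event.event == "message" and event.data:
            edit = parse_event(event.data)
            if edit["wiki"] == wiki and edit["type"] == "edit":
                yield edit
```

If you still want Kafka in your pipeline, a consumer of this stream can simply re-publish each parsed event to your own topic.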


#3

Thanks @Tgr, I found the CodePen example very helpful. Looking at the stream of edits, is there any internal classification of the motivation behind an edit (e.g. fixing typos, obtaining a more neutral point of view, etc.), or is this something I would have to try to do manually?


#4

There’s this, in a very experimental state I think. (See ORES for the wider context.) T130251 is the tracking task; you can probably ask for more info there.
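For experimenting in the meantime, the general ORES scoring API can be queried over plain HTTP. This sketch uses the long-standing `damaging` and `goodfaith` models as stand-ins, since the experimental edit-type model may not be deployed; the endpoint shape is the documented v3 scores API.

```python
# Hedged sketch: asking ORES for model scores on a revision.
import json
import urllib.parse
import urllib.request

ORES_URL = "https://ores.wikimedia.org/v3/scores/{context}/"


def build_request_url(context: str, rev_id: int, models: str) -> str:
    """Build a v3 scores URL, e.g. .../scores/enwiki/?revids=1&models=damaging."""
    query = urllib.parse.urlencode({"revids": rev_id, "models": models})
    return ORES_URL.format(context=context) + "?" + query


def fetch_scores(context: str, rev_id: int,
                 models: str = "damaging|goodfaith") -> dict:
    """Fetch and unwrap the score dict for one revision (does a network call)."""
    url = build_request_url(context, rev_id, models)
    with urllib.request.urlopen(url) as resp:
        payload = json.load(resp)
    return payload[context]["scores"][str(rev_id)]
```

Joining those scores with the EventStreams feed (each event carries a revision id) would give you a rough, automated label per edit.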