Grants:IEG/Wiki needs pictures/Timeline

Timeline for Wiki needs pictures

  • 4-6 December 2015: first planning of the work, together at Alexmar983's house
  • 15 January 2016: test of the beta version of the tool during the wikimeeting in Florence for Wikipedia's 15th birthday
  • 30 May 2016: studied how to interact with Wikidata and prepared SPARQL queries
  • 15 August 2016: GitHub repository 100% complete, updated online and copied over to the server
  • 30 August 2016: bug fixes and compatibility testing on all platforms
  • 14 September 2016: final report submitted


Monthly updates

Please prepare a brief project update each month, in a format of your choice, to share progress and learnings with the community along the way. Submit the link below as you complete each update.

December 2015

During this month we had a preliminary brainstorming session about how to start and which tasks could be problematic, stress-testing the demo.

Coordinates

We understood that the main point of the tool is alerting users that an image is missing, with a strong geolocation component. So we analyzed the state of the art of coordinates across Wikidata and Wikipedia. Looking at the data, both with SPARQL queries against the Wikidata endpoint and with API calls to the local Wikipedias, we noticed that there are many uncertainties concerning geodata:

  1. sometimes Wikidata entities have more than one lat/long value;
  2. sometimes the value on Wikidata is quite different from the value on Wikipedia and, moreover, it can differ again from one language version to another;
  3. sometimes coordinates are simply wrong (the tool is very useful for immediately spotting wrong coordinates in an area the user knows);
  4. sometimes coordinates are not enough to describe the position of very large entities.

Seeing these problems, we thought about:

  1. a way to use these data with our tool idea (e.g. averaging the values or displaying all the dots with a connection between them; see the sketch after this list);
  2. a way to improve the quality of geodata, by drafting another possible tool.
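A minimal, hypothetical sketch of the averaging idea in point 1 (not the tool's actual code): it pulls every coordinate (P625) of one class of items from the Wikidata endpoint and collapses duplicates into one averaged pin. The class used here (lighthouse, Q39715) is only an illustration to keep the query small.

  // Fetch all coordinates of one class of items from the public endpoint.
  const endpoint = 'https://query.wikidata.org/sparql';
  const query = `
    SELECT ?item ?coord WHERE {
      ?item wdt:P31 wd:Q39715 ;
            wdt:P625 ?coord .
    }`;

  fetch(endpoint + '?format=json&query=' + encodeURIComponent(query))
    .then(r => r.json())
    .then(data => {
      const byItem = new Map();
      for (const row of data.results.bindings) {
        // P625 values come back as WKT strings like "Point(11.25 43.77)".
        const [lon, lat] = row.coord.value
          .replace('Point(', '').replace(')', '').split(' ').map(Number);
        if (!byItem.has(row.item.value)) byItem.set(row.item.value, []);
        byItem.get(row.item.value).push({ lat, lon });
      }
      // Average the values whenever an item carries more than one coordinate.
      for (const [item, coords] of byItem) {
        if (coords.length < 2) continue;
        const lat = coords.reduce((s, c) => s + c.lat, 0) / coords.length;
        const lon = coords.reduce((s, c) => s + c.lon, 0) / coords.length;
        console.log(item, '-> averaged pin at', lat, lon);
      }
    });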

Data

We thought about which kinds of Wikidata items to treat as lacking an image (as you know, at the moment Wikidata does not fully reflect the number of images that Wikipedia and Commons have). So we established that we can start with items that have no image property, no Commons link or gallery, and do have coordinates; a query sketch follows below.
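A minimal sketch of that selection rule as a query, assuming the image property is P18, the Commons link is approximated with the Commons category property P373, and coordinates are P625; the restriction to one class (castle, Q23413) and the LIMIT are only there to keep the query answerable by the public endpoint.

  const endpoint = 'https://query.wikidata.org/sparql';
  // Items of one class that have coordinates but no image and no Commons
  // category: the FILTER NOT EXISTS clauses encode the "without image,
  // without Commons link" rule described above.
  const query = `
    SELECT ?item ?coord WHERE {
      ?item wdt:P31 wd:Q23413 ;
            wdt:P625 ?coord .
      FILTER NOT EXISTS { ?item wdt:P18 ?image . }
      FILTER NOT EXISTS { ?item wdt:P373 ?commonsCat . }
    }
    LIMIT 100`;

  fetch(endpoint + '?format=json&query=' + encodeURIComponent(query))
    .then(r => r.json())
    .then(data => console.log(data.results.bindings.length, 'candidate items'));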

User interface

  1. We thought about a simple configuration file to make the interface translatable into as many languages as possible (see the sketch after this list).
  2. The demo version uses the type provided by Extension:GeoData to generate the filter menu. We thought that we could ask for an improvement of those categories, or we could simply use the "instance of" property of Wikidata.
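A hypothetical sketch of such a configuration file (keys and strings are invented for illustration): keeping every user-visible string in one flat file per language means adding a translation is just copying and translating that file.

  // en.js - one file per language; a translator only touches this file.
  const messages = {
    'map-title':      'Wiki needs pictures',
    'filter-label':   'Type of place',
    'loading':        'Loading items…',
    'popup-no-image': 'This place has no image yet. Can you take one?'
  };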

API

Thinking about which APIs the tool may expose, we focused on two main aspects (sketched after this list):

  1. Data API: this would return at least the Wikidata item, its coordinates and its type (assembling all our data, not just what is currently on Wikidata);
  2. Statistics API: this would return the numbers of new and processed items and some metrics about where this happens.
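Hypothetical sketches of the two response shapes; every field name here is invented for illustration, not the tool's real schema.

  // Data API: one record per item still missing an image.
  const dataRecord = {
    item:   'Q2044',                         // Wikidata ID (Florence)
    coords: { lat: 43.7714, lon: 11.2542 },
    type:   'city'                           // merged from GeoData and P31
  };

  // Statistics API: counts of new and processed items, plus a rough
  // breakdown of where the activity happens.
  const statsRecord = {
    newItems:       1240,
    processedItems: 86,
    byCountry:      { IT: 45, GR: 12 }
  };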

January 2016

  • During the 15th birthday celebration in Florence, the beta of WNP was shown to a group of it-N users active on itWikipedia, itWikiversity and Commons. Reactions were very interesting:
  1. the problem of obsolescence and limited coverage of local templates could be explained in a few minutes; many users were completely unaware of this issue but soon realized its existence with a simple look at the distribution of the flags, the lack of overlap and the percentage of false negatives;
  2. the flexibility of the tool was also easily understood. For example, it was specifically requested to overlay the "unused" Commons images on the map. I think that in the future Commons files with geolocation that are not used on any project could be displayed on the map as well;
  3. the use of the "house" icon for places (towns, villages, neighbourhoods) was confusing; it should be replaced with something else.

February 2016

  • I have started to contact "local" users, i.e. users focused on regional and subregional topics, on itWiki and on Wikidata. I showed them the WikiShootMe tool and tried to understand whether they were willing to play with it. This was a test to analyze their interest in the topic and their tendency to cooperate from a local to a meta-wiki perspective. I selected about ten users from at least 6 different Italian regions (Piedmont, Liguria, Tuscany, Umbria, Abruzzo, Sicily); approximately half showed interest. They all agreed that WikiShootMe was full of false positives and false negatives, but they were willing to cooperate to fix the situation. This means that even users with a specifically local focus are interested, and a more elaborate tool specifically aimed at constantly improving the census of missing images will be well received.--Alexmar983 (talk) 22:12, 7 February 2016 (UTC)
This type of analysis is part of the budget item "Analyze how each community face the problem and survey of all other existing image tools to better interface with them". I have already sampled the different category systems on the local Wikipedias and I am expanding on the "sociological" aspects of the tool.--Alexmar983 (talk) 02:36, 8 February 2016 (UTC)

March 2016

  • I have started with Incola the first wikimetrics analysis of Commons users. Please take a look here and here. By the time the tool is ready, a specific targeted message presenting all the image tools can be delivered to interested users.--Alexmar983 (talk) 12:25, 13 March 2016 (UTC)

May 2016

  • Found contacts to translate the interface into Modern Greek.
  • Studied how to get data from Wikidata via SPARQL queries. We found out that it is not possible to grab all the data quickly (via query, not dump) and in a granular way. So, as we imagined from our previous attempts, we chose a compromise: not all places (timeout errors) but only buildings, at a low level of granularity, to reduce the number of queries and their complexity. A sketch of this chunking follows below.
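A hypothetical sketch of that compromise (the class list and query shape are illustrative): instead of one huge "all places" query, which hits the endpoint's timeout, the tool can run one small query per building class.

  const endpoint = 'https://query.wikidata.org/sparql';
  const buildingClasses = ['Q16970' /* church */, 'Q23413' /* castle */,
                           'Q33506' /* museum */];

  // One request per class keeps each query simple enough to finish
  // before the endpoint's timeout.
  async function fetchClass(qid) {
    const query = `
      SELECT ?item ?coord WHERE {
        ?item wdt:P31 wd:${qid} ;
              wdt:P625 ?coord .
        FILTER NOT EXISTS { ?item wdt:P18 ?image . }
      }`;
    const r = await fetch(endpoint + '?format=json&query=' +
                          encodeURIComponent(query));
    return (await r.json()).results.bindings;
  }

  for (const qid of buildingClasses) {
    fetchClass(qid).then(rows => console.log(qid, rows.length, 'items'));
  }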

July 2016

  • Studied and drafted code to load a large quantity of pins onto the map. Lots of problems here. The solution found was to use a clustering library written for Leaflet (the engine we chose to display the map); a sketch follows below. The UX is nice once everything is loaded: beta testers didn't feel confused and everyone understood immediately how to use the interface. However, we suffered from a very slow start and high CPU usage while loading. This means a few seconds of waiting in the desktop experience (not so bad, and we added a loading bar so the app feels alive), but some alert messages (about freezing) on mobile. This problem is linearly correlated with the size of the dataset.
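A minimal sketch of this approach, assuming the clustering library is the Leaflet.markercluster plugin (the text does not name it); the `items` array is a stand-in for the records the tool loads.

  // Assumes Leaflet and Leaflet.markercluster are loaded on the page.
  const items = [
    { lat: 43.7696, lon: 11.2558, label: 'Example item without image' }
    // ...in practice, thousands of records from the data API
  ];

  const map = L.map('map').setView([43.77, 11.25], 6);
  L.tileLayer('https://tile.openstreetmap.org/{z}/{x}/{y}.png', {
    attribution: '&copy; OpenStreetMap contributors'
  }).addTo(map);

  // The cluster group merges nearby pins into one clickable bubble,
  // keeping the map readable with many markers.
  const clusters = L.markerClusterGroup();
  for (const item of items) {
    clusters.addLayer(L.marker([item.lat, item.lon]).bindPopup(item.label));
  }
  // Adding the whole group at once is far cheaper than adding
  // thousands of individual markers directly to the map.
  map.addLayer(clusters);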

August 2016

  • Deployed the prepared code on the server, enabling all the services needed (web server and cron jobs); a sketch follows below.
  • Interacted with other users on GitHub who opened issues about the tool.
  • Performance tests and fixes.
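A hypothetical deployment sketch, assuming a Node.js stack with the express and node-cron packages; the package choice, port and schedule are illustrative, not the project's actual setup.

  const express = require('express');
  const cron = require('node-cron');

  // Web server: serves the static Leaflet front end.
  const app = express();
  app.use(express.static('public'));
  app.listen(8080);

  // Cron job: refresh the items-without-image dataset every night at 03:00.
  cron.schedule('0 3 * * *', () => {
    console.log('rebuilding dataset from the Wikidata endpoint...');
    // ...run the per-class SPARQL queries sketched above...
  });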

September 2016

  • Final dissemination of the tool.


Is your final report due but you need more time?



Extension request

New end date

30 August 2016

Rationale

Our testing period took longer than we expected.

This extension request is approved, with a new project end date of 30 August 2016. --Marti (WMF) (talk) 17:32, 5 August 2016 (UTC)