This is an archive of past discussions. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page.
Archive 1 | Archive 2 | Archive 3 | Archive 4 | Archive 5 | → | Archive 10 |
You can see all open tickets related to Wikidata here. If you want to help, you can also have a look at the tasks needing a volunteer.
Facto Post – Issue 24 – 17 May 2019

The Editor is Charles Matthews, for ContentMine. Please leave feedback for him on his User talk page.

To subscribe to Facto Post go to Wikipedia:Facto Post mailing list. For the ways to unsubscribe, see the footer.
Two dozen issues, and this may be the last, a valediction at least for a while. It's time for a two-year summation of ContentMine projects involving TDM (text and data mining).

Wikidata and now Structured Data on Commons represent the overlap of Wikimedia with the Semantic Web. This common ground is helping to convert an engineering concept into a movement. TDM generally has little enough connection with the Semantic Web, being instead in the orbit of machine learning, which is no respecter of the semantic. Don't break a taboo by asking bots "and what do you mean by that?"

The ScienceSource project innovates in TDM by storing its text-mining results in a Wikibase site. It strives for compliance of its fact mining, on drug treatments of diseases, with an automated form of the relevant Wikipedia referencing guideline, MEDRS. Where WikiFactMine set up an API for reuse of its results, ScienceSource has a SPARQL query service, with look-and-feel exactly that of Wikidata's at query.wikidata.org. It also now has a custom front end, and its content can be federated, in other words used in data mashups: it is one of over 50 sites that can federate with Wikidata.

The human factor comes to bear through the front end, which combines a link to the HTML version of a paper, text-mining results organised in drug and disease columns, and a SPARQL display of nearby drug and disease terms. Much software to develop and explain, so little time! Rather than telling the tale, Facto Post brings you ScienceSource links, starting from the how-to video, lower right.
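Federation here means that a query run on one SPARQL endpoint can pull in data from another mid-query, via the SERVICE keyword from the SPARQL 1.1 Federated Query standard. A minimal sketch of such a mashup, run from the Wikidata Query Service: the ScienceSource endpoint URL and the triple pattern inside the SERVICE block are illustrative assumptions, not taken from the project's documentation.

```sparql
# Sketch only: run against query.wikidata.org; the federated
# endpoint URL and the pattern inside SERVICE are hypothetical.
SELECT ?drug ?drugLabel WHERE {
  SERVICE <https://sciencesource-query.wmflabs.org/sparql> {
    ?annotation ?property ?drug .   # a ScienceSource text-mining result
  }
  ?drug wdt:P31 wd:Q12140 .         # keep only items Wikidata classes as medications
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en" . }
}
LIMIT 10
```

The point of the pattern is that filtering and labelling happen on Wikidata's side while the raw annotations come from the remote Wikibase, which is what "data mashup" means in the newsletter's sense.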
The review tool requires a log in on sciencesource.wmflabs.org, and an OAuth permission (bottom of a review page) to operate. It can be used in simple and more advanced workflows. Examples of queries for the latter are at d:Wikidata_talk:ScienceSource project/Queries#SS_disease_list and d:Wikidata_talk:ScienceSource_project/Queries#NDF-RT issue. Please be aware that this is a research project in development, and may have outages for planned maintenance. That will apply for the next few days, at least. The ScienceSource wiki main page carries information on practical matters. Email is not enabled on the wiki: use site mail here to Charles Matthews in case of difficulty, or if you need support. Further explanatory videos will be put into commons:Category:ContentMine videos.

If you wish to receive no further issues of Facto Post, please remove your name from our mailing list. Alternatively, to opt out of all massmessage mailings, you may add Category:Wikipedians who opt out of message delivery to your user talk page.
Newsletter delivered by MediaWiki message delivery

MediaWiki message delivery (talk) 18:52, 17 May 2019 (UTC)
lastrevid of entities to JSON dumps – thanks Pintoch!
News and updates associated with user scripts from the past month (May 2019).
Hello everyone and welcome to the 6th issue of the Wikipedia Scripts++ Newsletter:
Enjoy your summer, --DannyS712 (talk) 23:44, 31 May 2019 (UTC)