Attention visitors
This page discusses the Query Interface at http://en.wikipedia.org/w/query.php
The new API is being developed at mw:API, using the knowledge gained from this project and your feedback. The API will allow querying as well as posting back to the wiki.
I will be spending most of my available time on the new API, so for now the work on new query features is postponed. Any feedback is welcome, especially on the way the new API will incorporate the existing capabilities. -- Yurik 06:41, 17 September 2006 (UTC)
See Completed Requests Archive for older requests.
Please monitor this section as it may impact your code, and make any suggestions below.
There are several kinds of page entries the API currently returns:
Proposals
Outstanding issues:
This would enable things like Interiot's javascript edit counter to use the query interface instead of screen-scraping contribs pages. I'm not sure why the rvoffset parameter is disappearing (has disappeared?), but it'd be good to have a way to query entire article histories if that's really what's needed. Lupin| talk| popups 13:14, 12 June 2006 (UTC)
-- Yurik 17:57, 12 June 2006 (UTC)
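For illustration, paging through a full history from a client could look like the sketch below. It assumes the rvoffset parameter discussed above is available and that revisions come back as <revision> elements; both are assumptions to verify against the live output.

<?php
// Rough sketch: page through an article's full history via query.php.
// Assumes the rvoffset parameter discussed above exists and that each
// revision is returned as a <revision> element; verify both.
$base = 'http://en.wikipedia.org/w/query.php?format=xml'
      . '&what=revisions&titles=Main_Page&rvlimit=100';
$offset = 0;
do {
    $xml = simplexml_load_string(ltrim(file_get_contents(
        $base . '&rvoffset=' . $offset)));
    if ($xml === false) {
        break;  // malformed response, bail out
    }
    $revs = $xml->xpath('//revision');
    foreach ($revs as $rev) {
        // process each revision here
    }
    $offset += count($revs);
} while (count($revs) > 0);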
I would like to have a bot interface for Special:Log - i.e. a way to query the logging table. It should be possible to filter the result by type (upload, delete, new account, etc), page, page-namespace, user and timestamp. Ideally, it should be possible to give more than one value for the type and namespace options, maybe also for user and page.
This feature could be used by CommonsTicker as a fallback in case the replication lag on the toolserver gets too large (as it has in the last few days).
Thanks for your great work! -- G. Gearloose (?!) 22:53, 19 June 2006 (UTC)
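Purely as a sketch of the kind of call being requested: no such query exists yet, and every what=log parameter name below is invented.

<?php
// Hypothetical only: what=log and the lg* parameters do not exist in
// query.php; this just shows the shape of the request suggested above.
$url = 'http://en.wikipedia.org/w/query.php?format=xml&what=log'
     . '&lgtype=' . urlencode('upload|delete')    // several types at once
     . '&lgnamespace=' . urlencode('0|6')         // several namespaces
     . '&lguser=' . urlencode('Example')
     . '&lgstart=20060601000000'                  // timestamp filter
     . '&lglimit=100';
echo $url, "\n";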
Is it possible to list the contents of a category (possibly filtered by namespace) using query.php? I couldn't figure it out... It would be nice to have, especially as I find people spidering CategoryTree to retrieve the category structure. -- de:Benutzer:Duesentrieb 10:00, 3 July 2006 (UTC)
Conversely, is it possible to return the categories to which a page belongs (or constrain the links to ns14 or something)? Or am I stupidly overlooking that possibility? maarten 00:16, 8 July 2006 (UTC)
Hi, great tool! Would it be possible to always give the namespace attribute, even when it is 0? For example, when the page is in mainspace the ns attribute is left out. Not critical, but it would make coding a bit easier. thanks Martin 10:34, 12 July 2006 (UTC)
What about having a counting feature, for example of how many pages there are in a combination of categories? -- 193.175.201.126 12:19, 26 July 2006 (UTC)
Would it be possible to query special pages, e.g. Special:Deadendpages (but obviously not others like Special:Userlogin)? If so, it would be great if the results for each different page used exactly the same format, so that the only variable would be the URL. Martin 10:40, 27 July 2006 (UTC)
Render wiki markup as HTML and return that, without the skin customizations. Not sure if it's possible to have a generic HTML version of the page so that CSS changes everything... -- Yurik 15:51, 27 July 2006 (UTC)
These urls [1] [2] give you revisions adjacent to revision 61168673 of the Main Page. Could query.php support such relative revid specifications? I'd like to use this to download two revisions in a single request in order to perform a diff on them, for example to generate a preview for this link. Currently I can perform two separate requests with action=raw (which does support the direction parameter), but it should be possible to make just one request. Lupin| talk| popups 02:51, 28 July 2006 (UTC)
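For reference, the current two-request workaround can be sketched like this; it leans on the action=raw oldid and direction parameters mentioned above.

<?php
// Sketch of the current workaround: two action=raw requests fetch a
// revision and its predecessor, which are then diffed client-side.
$rev = 61168673;
$new = file_get_contents(
    "http://en.wikipedia.org/w/index.php?action=raw&oldid=$rev");
$old = file_get_contents(
    "http://en.wikipedia.org/w/index.php?action=raw&oldid=$rev&direction=prev");
// run a diff of $old against $new to build the preview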
It just occurred to me that it may be nice to support YAML output. Not that I need it or anything :P Just a thought. -- G. Gearloose (?!) 10:38, 4 August 2006 (UTC)
/sign this request. http://spyc.sourceforge.net/, too. (I was feeling lucky with Google.) AKX 11:57, 13 August 2006 (UTC)
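For what it's worth, a result could be dumped as YAML with the Spyc library linked above; a small sketch follows (the array structure is made up for illustration).

<?php
// Sketch: serializing a query-style result as YAML with Spyc
// (http://spyc.sourceforge.net/). The array structure is invented.
require_once 'spyc.php';
$result = array(
    'pages' => array(
        array('title' => 'Main Page', 'ns' => 0, 'id' => 15580374),
    ),
);
echo Spyc::YAMLDump($result);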
This query's XML output doesn't parse in MSIE, Firefox, or the Perl parser I'm using. Firefox displays the error "XML Parsing Error: xml declaration not at start of external entity". It looks like there's an extra line at the beginning that's throwing the parsers off. (a workaround of removing all whitespace from the beginning worked for me) -- Interiot 05:48, 6 August 2006 (UTC)
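The workaround described above, as a small PHP sketch: strip the stray leading whitespace before handing the response to the parser.

<?php
// Sketch of the whitespace workaround: ltrim() drops the stray blank line
// so the XML declaration sits at the very start of the input.
$raw = file_get_contents('http://en.wikipedia.org/w/query.php'
    . '?format=xml&what=revisions&titles=Main_Page&rvlimit=1');
$xml = simplexml_load_string(ltrim($raw));
if ($xml === false) {
    die("still failed to parse\n");
}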
ERRORMESSAGE: after uploading query.php into mysite/wiki/extensions/botquery/query.php I can run it, but it stops immediately saying:
Parse error: parse error, unexpected '{' in /.../wiki/extensions/botquery/query.php on line 580
* MediaWiki: 1.6.10
* PHP: 4.3.10-19 (apache2handler)
* MySQL: 4.1.11-Debian_4sarge7-log
Do you have an idea about the problem?
Written by Fortyfoxes 00:30, 7 August 2006 (UTC). With the latest version of query.php installed and the following server setup:
* MediaWiki: 1.6.7
* PHP: 4.4.2 (cgi)
* MySQL: 5.0.18-standard-log
I get the following error:
Parse error: syntax error, unexpected '&', expecting T_VARIABLE or '$' in pathToMyDomain/w/extensions/query.php on line 557... and also 721, 722, 740....
Also, now that I have PHP 5 and a later version of MediaWiki installed:
* MediaWiki: 1.6.7
* PHP: 4.4.2 (cgi)
* MySQL: 5.0.18-standard-log
I get the following:
Warning: require_once(/home/.goethe/fortyfoxes/architex.tv/w/extensions/../../includes/Defines.php) [function.require-once]: failed to open stream: No such file or directory in /home/.goethe/fortyfoxes/architex.tv/w/extensions/query.php on line 56
Fatal error: require_once() [function.require]: Failed opening required '/home/.goethe/fortyfoxes/architex.tv/w/extensions/../../includes/Defines.php' (include_path='/home/.goethe/fortyfoxes/architex.tv/w:/home/.goethe/fortyfoxes/architex.tv/w/includes:/home/.goethe/fortyfoxes/architex.tv/w/languages:.:/usr/local/php5/lib/php') in /home/.goethe/fortyfoxes/architex.tv/w/extensions/query.php on line 56
I have a .htaccess rewrite for short URLs which may be causing the problem?
RewriteEngine on
# uncomment this rule if you want Apache to redirect from www.mysite.com/ to www.mysite.com/wiki/Main_Page
# RewriteRule ^$ /wiki/Main_Page [R]
# do the rewrite
RewriteRule ^wiki/?(.*)$ /?title=$1 [L,QSA]
Either way, I need a few more instructions to be able to set this up on a virtual host (in my case DreamHost).
It would be great to have output for the currently logged-in user's watchlist. Presently I can parse the watchlist page directly, but that isn't as nice as using this API. It's nothing to worry about, especially if it is particularly difficult for any reason; it would just be useful to have eventually. Martin 19:23, 6 August 2006 (UTC)
Would it be possible to set up an edit interface so bots could edit pages without downloading tens of kilobytes of unneeded HTML? -- Carnildo 20:04, 6 August 2006 (UTC)
Since this request has not been (formally) rejected (so far :-(), I guess I may post another suggestion here. An easy way to make editing possible could be to add two "global" fields when querying "what=revisions": wfTimestampNow() before anything else, and $wgUser->editToken() (or maybe htmlspecialchars($wgUser->editToken())?). Of the other two hidden fields I mentioned above, wpAutoSummary seems not necessary (so far). As for wpEdittime: if the page already exists, wpEdittime is the timestamp of the last revision of the article, so this is already returned when querying "what=revisions"; otherwise, it is equal to wpStarttime. ( Liberatore, 2006). 16:01, 28 August 2006 (UTC)
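To make that concrete, the two fields might surface in the XML output roughly like this (element names and values are invented here, modeled on the existing output format):

<yurik>
  <meta>
    <!-- wfTimestampNow(), emitted before anything else -->
    <timestamp>20060828160100</timestamp>
    <!-- $wgUser->editToken(), HTML-escaped if needed -->
    <edittoken>a1b2c3d4e5f6+\</edittoken>
  </meta>
</yurik>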
Hiya, I think it may be useful to make this work as a SOAP service. If there is a WSDL file, then third-party application developers will have a much easier time developing tools that use Query. We made something similar for LyricWiki's SOAP API (check out the External links on that page for help implementing a SOAP service in PHP using nuSOAP).
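A minimal nuSOAP sketch of such a wrapper; the service name, method name, and namespace are all invented for illustration.

<?php
// Minimal nuSOAP sketch of a SOAP front end for query.php; the service
// name, method name, and namespace are invented for illustration.
require_once 'nusoap.php';

$server = new soap_server();
$server->configureWSDL('QueryService', 'urn:queryservice');
$server->register('getRevisions',
    array('title' => 'xsd:string'),    // input
    array('return' => 'xsd:string'),   // output: raw XML from query.php
    'urn:queryservice');

function getRevisions($title) {
    return file_get_contents('http://en.wikipedia.org/w/query.php'
        . '?format=xml&what=revisions&rvlimit=1&titles=' . urlencode($title));
}

$server->service(isset($HTTP_RAW_POST_DATA) ? $HTTP_RAW_POST_DATA : '');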
Is it possible to somehow expose the ISBN numbers on a given page, used for those Special:Booksources links (it would be great for combining data)? Also, I fail to understand how to use an (imageinfo?) query to return the (upload.wikimedia.*) URLs of the images displayed on a given page. maarten 12:28, 15 August 2006 (UTC)
I note that "what=links" returns the links in an alphabetically sorted order. Would it be possible to add an optional flag such as "sort=none" or something like that so that one could retrieve them in the same order in which they appear within the text? This could also be used with "what=categories", "what=langlinks" and "what=templates". -- Russ Blau (talk) 14:08, 16 August 2006 (UTC)
I suppose this is a feature request, but being able to query a user's status, or even fetch the entire list of admins or whatever, would be tremendously helpful to me. joshbuddy, talk 18:26, 23 August 2006 (UTC)
What exactly is the meaning of the touched attribute? I wrongly supposed it was the last-modified date, but I have a bunch of articles that don't match... So I suppose it's the last generated date (templates & co).
Is there a way to get the last modified date?
Thanks
Gonioul 20:37, 26 August 2006 (UTC)
Is there any way to filter the output of what=backlinks to determine which of the linking pages are actually redirects? At first I thought blfilter=redirects would do this, but that is not what it does -- it filters the list of titles, not the output. (Presumably there should be a way to implement this, since the output of [[Special:Whatlinkshere/Page]] does show which links are redirects, and even includes the backlinks to those pages.) -- Russ Blau (talk) 20:04, 30 August 2006 (UTC)
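Until something like that exists, here is a client-side sketch of a two-step filter. The second query ("what=info") and the redirect flag are guesses; check the XML your version actually returns.

<?php
// Client-side sketch: fetch the backlinks, then re-query the linking
// titles and keep the ones flagged as redirects. The "what=info" query
// and the redirect attribute are guesses; verify against real output.
$bl = simplexml_load_string(ltrim(file_get_contents(
    'http://en.wikipedia.org/w/query.php?format=xml&what=backlinks&titles=Example')));
$titles = array();
foreach ($bl->xpath('//page/title') as $t) {
    $titles[] = (string) $t;
}
$info = simplexml_load_string(ltrim(file_get_contents(
    'http://en.wikipedia.org/w/query.php?format=xml&what=info&titles='
    . urlencode(implode('|', $titles)))));
foreach ($info->xpath('//page[@redirect]') as $p) {
    echo $p->title, " is a redirect\n";
}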
Hi, query is great :) but I think the "recentchanges" property is missing a "ns" field. I would be interested in filtering the RC and retrieving those related to talk pages, not articles. Thanks, keep up the good work. Dake 21:39, 6 September 2006 (UTC)
Some of my JavaScript extensions look up interface strings which are stored in the MediaWiki namespace and also in the "message cache". It turns out this is not always very easy or possible with action=raw, and even when it is, it can require up to 4 xmlhttprequests per message. It would be nice to have a Query API which takes the arguments for these functions, calls them, and returns the result in some very very simple format - preferably as simple as action=raw — Hippietrail 12:36, 8 September 2006 (UTC)
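For reference, the current workaround looks roughly like the sketch below; the fallback chain is what makes it cost several requests per message.

<?php
// Sketch of the action=raw workaround: try the likely locations of an
// interface message in turn. Every miss costs another request.
function fetchMessage($name) {
    $candidates = array(
        "MediaWiki:$name",     // local override
        "MediaWiki:$name/en",  // language subpage
    );
    foreach ($candidates as $title) {
        $text = @file_get_contents(
            'http://en.wikipedia.org/w/index.php?action=raw&title='
            . urlencode($title));
        if ($text !== false && $text !== '') {
            return $text;
        }
    }
    return null;  // caller falls back to the built-in default
}
echo fetchMessage('Summary');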
I've written two autocomplete tools that use allpages, one for the search inputbox, and one for the article edit box (see links at top of my user page). There are two things I need to make them better:
Otherwise, query.php rocks :) Zocky | picture popups 17:30, 12 September 2006 (UTC)
May I ask the people watching this page to test the following links:
I get the incoming links in the first three cases, in a second. However, I get a timeout on the fourth query; this is just the combination of both pages plus the restriction on the namespace. ( Liberatore, 2006). 14:19, 22 September 2006 (UTC)
Just FYI, I've discovered a little snag using the API. If a query URL is too long, you get an HTTP 400 error that contains "The following error was encountered: Invalid URL". 17:56, 25 September 2006 (UTC)
Hi, how can I make query.php return a list of images used in an article? "what=links" doesn't seem to include images. -- Magnus Manske 12:11, 26 September 2006 (UTC)
It was recommended here that I bring my request to this page. On the old Kate's tool, it was possible to display deleted edits as well as "actual" edits; could this feature be in the new query.php? Thanks Batmanand | Talk 19:21, 27 September 2006 (UTC)
Running into an issue where the main query.php page is functioning and some of the queries are working, but when I try to perform something like this:
query.php?what=content|templates&titles=Junk
I'm getting:
Fatal error: Call to undefined method LinkBatch::isEmpty() in /var/www/mediawiki-1.6.7/w/BotQuery/query.php on line 2040
I did notice this commit, but I don't think that change is in 1.6.7. Is that version just not supported? Note: putting in the isEmpty() and getSize() functions fixes the problem, but I'm not sure what else could be missing (worried about production-level readiness on 1.6.7). Thanks! -- Malcom 04:09, 9 October 2006 (UTC)
1599  $linkBatch = new LinkBatch;
      ...
1620  if( $linkBatch->isEmpty() ) {
Hi, Yurik. Is it possible for your query.php to provide info about interwiki/language links? I only need language links (links that disappear and are put into the left bar), so I don't have to parse Special:SiteMatrix and build into my bot special knowledge about commons and other exceptions; but complete information about interwiki links and their URL templates wouldn't be bad either. Anyway, yes or no, I want to thank you very much for this useful tool. Greetings. -- es:Usuario:Angus 19:12, 18 November 2006 (UTC)
Hi Yurik. Congratulations on the work on Query API/api.php; at last someone is working on it. I'm trying to rewrite our Java library for interfacing (when possible) with Query API/api.php, but all I'm getting are limited results - more precisely, the error «Error: User requested 99999 pages, which is over 500 pages allowed (cplimit)». It doesn't seem to be possible to query for something like "give me everything", such as querying for ALL articles in a category, am I correct? Could you please explain why (or point me to a relevant discussion on the subject) if I am correct (hope not), or how I can achieve such results? Best regards, Nuno Tavares 03:01, 2 January 2007 (UTC)
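As far as I can tell the cap is by design, so a client has to fetch in batches. Below is a sketch of walking a category 500 pages at a time; the what=category, cptitle, and cpfrom names are guesses inferred from the cplimit error above, so check the live query.php help for the real ones.

<?php
// Sketch: stay under the 500-page cap by fetching in batches and
// continuing from the last title seen. Parameter names ("what=category",
// "cptitle", "cpfrom") are guesses; verify against query.php's help.
$base = 'http://en.wikipedia.org/w/query.php?format=xml'
      . '&what=category&cptitle=Example_category&cplimit=500';
$from = null;
do {
    $url = $base . ($from !== null ? '&cpfrom=' . urlencode($from) : '');
    $xml = simplexml_load_string(ltrim(file_get_contents($url)));
    $pages = $xml->xpath('//page/title');
    foreach ($pages as $t) {
        echo $t, "\n";
    }
    $from = (count($pages) == 500) ? (string) end($pages) : null;
} while ($from !== null);
// if cpfrom is inclusive, each later batch repeats one row; drop it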
Hello, I would like to know if some special pages can be used to extend the query API. For example, I need Special:Prefixindex. Thanks a lot. fr:user:bayo 193.248.56.95 01:54, 26 January 2007 (UTC)
Hi, I expected to get a list of 10 articles starting at "h" with this:
http://memory-alpha.org/en/query.php?what=allpages&aplimit=10&apnamespace=0&apfrom=h
But as you can see, that is not what is returned. Is this the expected behaviour, and am I missing something? Also, is there a way to make the results case-insensitive? I'm using it for this. Thanks. -- Bp0 01:11, 2 February 2007 (UTC)
http://memory-alpha.org/en/query.php?what=allpages&aplimit=10&apnamespace=0&apfrom=H
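A likely explanation: page titles are stored with a capitalized first letter and apfrom compares raw title order, so a lowercase prefix starts past the uppercase range (note the second URL, with apfrom=H). A client-side workaround sketch:

<?php
// Workaround sketch: capitalize the prefix the way MediaWiki capitalizes
// titles before passing it as apfrom.
$prefix = 'h';
echo file_get_contents(
    'http://memory-alpha.org/en/query.php?what=allpages&aplimit=10'
    . '&apnamespace=0&apfrom=' . urlencode(ucfirst($prefix)));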
userinfo only provides information on the current user, but it would be very useful to be able to query some information on other users; for example, ProxyDB needs to check if a proxy is blocked. This could be done by only using $wgUser if no username is provided, possibly providing less information (no preferences, for example) on other users.
It would also be very useful to be able to query multiple users, returning the results in an array:
http://en.wikipedia.org/w/query.php?what=userinfo&uiisblocked&names=foo|bar

<yurik>
  <meta>
    <user>
      <name>foo</name>
      <isblocked>0</isblocked>
    </user>
    <user>
      <name>bar</name>
      <isblocked>1</isblocked>
    </user>
  </meta>
</yurik>
—{ admin} Pathoschild 09:02:43, 08 February 2007 (UTC)
Hi. I have a small suggestion: add a query for the list of unused images. The list would be similar to Special:UnusedImages. -- Jutiphan | Talk - 05:28, 12 February 2007 (UTC)
This url gives an error, which seems to be the result of a recent change. Lupin| talk| popups 22:43, 27 February 2007 (UTC)
Is there any way to make the API return the length of a revision when retrieving the RecentChanges list (e.g. via rc_old_len and rc_new_len)? Those values don't seem to be covered by the API yet. I'm trying to use it with this query... -- Ace NoOne 14:33, 14 March 2007 (UTC)
Does this contain deleted edits or not? There's about a one-thousand-edit difference between query.php and Wannabe Kate in my edit count. Will ( We're flying the flag all over the world) 01:59, 15 May 2007 (UTC)
http://en.wikipedia.org/w/query.php?titles=Image:Hope_Diamond.jpg&what=imageinfo&iishared
This is coming up blank for me right now... if you remove iishared then it behaves correctly. Any idea what's wrong? Lupin| talk| popups 16:38, 3 June 2007 (UTC)
*------ Error: This site does not have shared image repository (ii_noshared) ------*
[The following was on WP:VPT; I have copied it here. Tizio 12:12, 5 June 2007 (UTC)]
Does anyone know why the first of these two queries fails but the second one works? success, error. — Carl ( CBM · talk) 16:32, 4 June 2007 (UTC)
I emptied the category Category:Stub-Class mathematics articles and I still get pi_badpageids when I try to query its contents, instead of the usual "emptyresult". — Carl ( CBM · talk) 13:09, 9 June 2007 (UTC)
Ok, I know that this is about to be replaced by the API, but I am still using old scripts, so I noticed this bug: when two edits have the very same timestamp, they may be listed in the wrong order. For example:
http://en.wikipedia.org/w/query.php?format=xml&what=revisions&titles=Homicidal&rvlimit=3
lists 169395135 before 169395138, while the web interface gets them in the right order:
http://en.wikipedia.org/?title=Homicidal&action=history
I don't know if this is really worth fixing. Tizio 18:51, 6 December 2007 (UTC)
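Until that is fixed, a client can impose the order itself; here is a sketch that breaks timestamp ties with the revision id (assuming ids increase in edit order).

<?php
// Client-side fix sketch: sort newest-first, breaking timestamp ties with
// the revision id, which is assumed to grow in edit order.
function cmpRevisionsDesc($a, $b) {
    if ($a['timestamp'] !== $b['timestamp']) {
        return strcmp($b['timestamp'], $a['timestamp']);
    }
    return $b['revid'] - $a['revid'];  // newer id first on ties
}

$revs = array(
    array('revid' => 169395135, 'timestamp' => '20071205120000'),
    array('revid' => 169395138, 'timestamp' => '20071205120000'),
);
usort($revs, 'cmpRevisionsDesc');
// $revs is now newest-first, matching the history page:
// 169395138 before 169395135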