Operator: Xaxafrad

Automatic or Manually Assisted: Manually Assisted.

Programming Language(s): Python

Function Summary: Reading all the articles linked to from Centuries, Decades, and List of years.

Edit period(s) (e.g. Continuous, daily, one time run): Intermittent

Edit rate requested: Nil.

Already has a bot flag (Y/N): N

Function Details: Will get all articles linked to from Centuries, Decades, and List of years.
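
A minimal sketch of what that gathering step might look like, assuming the raw wikitext of the three index pages is fetched through MediaWiki's index.php?action=raw endpoint (the function and variable names here are illustrative, not XaxaBot's actual code):

import re
import urllib.parse
import urllib.request

def fetch_wikitext(title):
    """Fetch a page's raw wikitext via index.php?action=raw."""
    url = ("https://en.wikipedia.org/w/index.php?action=raw&title="
           + urllib.parse.quote(title))
    # Wikipedia rejects the default Python user agent, so send our own.
    req = urllib.request.Request(url, headers={"User-Agent": "XaxaBot-sketch/0.1"})
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8")

# Match [[Target]] or [[Target|label]]; capture the target up to '|', ']' or '#'.
# A real run would also filter out namespaced links (File:, Category:, ...).
WIKILINK = re.compile(r"\[\[([^|\]#]+)")

def linked_titles(index_pages=("Centuries", "Decades", "List of years")):
    """Collect the set of article titles linked from the index pages."""
    titles = set()
    for page in index_pages:
        for m in WIKILINK.finditer(fetch_wikitext(page)):
            titles.add(m.group(1).strip())
    return sorted(titles)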

Discussion

So this will just generate a few large lists of pages, right? How does it go about making them? Voice-of-All 04:16, 31 December 2006 (UTC)

The large output won't be "duplicated" in the database, if that's an issue; local output (on my computer alone, and maybe emailed to project collaborators) will work equally well. It's going to juggle words through ifs and fors, searching for "Events", "Important *", and other sections, then look for something like a list (a rudimentary heuristic, I guess), and reformat the text into a list suitable for processing by another script (likely Bash), which will do an immense amount of interpreting before producing a grid-style array for plotting onto an image, for a series of frames for a pretty little animated GIF. I'm probably overreaching, but I want to catch as much detail as I can; a zooming feature is on my todo list. For Wikipedia, I want to upload some of the eye-catchier animations. Xaxafrad 20:33, 31 December 2006 (UTC)
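A rough illustration of that section-scanning heuristic, assuming the wikitext of one year article is already in hand (a sketch under assumed names, not the bot's real code):

import re

# A wikitext heading: == Title ==, === Title ===, etc.
HEADING = re.compile(r"^(==+)\s*(.*?)\s*\1\s*$")
# Sections worth keeping, per the description above.
WANTED = re.compile(r"^(Events|Important\b)", re.IGNORECASE)

def extract_events(wikitext):
    """Collect bullet items under 'Events' / 'Important *' headings."""
    items, keep = [], False
    for line in wikitext.splitlines():
        m = HEADING.match(line)
        if m:
            # Entering a new section; keep lines only if it's a wanted one.
            keep = bool(WANTED.match(m.group(2)))
            continue
        if keep and line.startswith("*"):
            items.append(line.lstrip("* ").strip())
    return items

The output list could then be written one item per line for the downstream Bash script to interpret.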
So the goal is to make some sort of slideshow? Will it be making any edits on-wiki? It seems to be using a series of GET requests rather than heavy editing (POST requests). If you are going to make a lot of GET requests that don't need to be up to the minute, you could perhaps download a database dump and work from that, though if this is a one-time (or occasional) project it may not be worth the extra effort. Voice-of-All 22:54, 31 December 2006 (UTC)
OK, after reading User:XaxaBot (which has more info than here), this seems fine. Try to avoid exceeding 20 requests per minute. It seems like there will be minimal or no on-wiki editing. Voice-of-All 22:59, 31 December 2006 (UTC)
Thanks, 20 requests per minute should be no problem. I ran two tests, one fetching about 20 articles and the other some 40-odd, without any additional processing; the average time seemed to bear out 20 GETs per minute. I don't know whether this is due to a built-in throttle or just natural internet lag combined with single-threaded processing, but it sounds acceptable. Xaxafrad 08:41, 1 January 2007 (UTC)
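Rather than relying on network lag to stay under the ceiling, the limit is easy to guarantee client-side. A minimal sketch (illustrative names, not the bot's code):

import time

class Throttle:
    """Enforce a minimum gap between requests: 20/min = one every 3 s."""
    def __init__(self, per_minute=20):
        self.interval = 60.0 / per_minute
        self.last = 0.0

    def wait(self):
        """Sleep just long enough to honor the rate limit, then stamp."""
        delay = self.last + self.interval - time.monotonic()
        if delay > 0:
            time.sleep(delay)
        self.last = time.monotonic()

# Usage: call throttle.wait() immediately before each GET.
throttle = Throttle(per_minute=20)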
Approved; the bot shall run without a flag. Voice-of-All 22:59, 31 December 2006 (UTC)