Wikipedia:Bot requests/Archive 1


HTML cleanup bot

(Moved from Wikipedia talk:Bots) Kevin Rector 14:42, Aug 19, 2004 (UTC)

I'm thinking that it would be a good idea to run a bot that attempts to clean up HTML and replace it with valid wikisyntax, especially with regards to tables. What does everyone else think? -- Grunt (talk) 01:00, 2004 Aug 17 (UTC)

How would you determine which pages to modify? Kevin Rector 01:42, Aug 17, 2004 (UTC)
Looks like Guanabot is doing this. ··gracefool | 01:51, 30 Aug 2004 (UTC)

Bot needed to fix AMOS

If you go to the AMOS article, and then ask for What Links Here, you'll see a whole bunch of List of asteroids pages. All of the AMOS links from these pages should NOT point to AMOS (Advanced Mortar System) but rather to Air Force Maui Optical and Supercomputing observatory (AMOS). Could a bot fix this?

Urhixidur 03:10, 2004 Sep 20 (UTC)

This problem is now moot.
Urhixidur 19:30, 2004 Nov 23 (UTC)

Bot to fix "the the"

Nearly 1500 articles include an instance of « the the » (sometimes « the "The »). With the exception of the rock group The The, these need to be fixed.
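
If anyone does attempt this by machine, here is a minimal sketch of the detection side only (plain Python; the names are made up for the sketch). It flags occurrences for human review rather than editing anything, and the case-sensitive pattern deliberately skips the band name "The The".

    import re

    # Matches "the the" or "The the", but not "The The" (the band).
    PATTERN = re.compile(r'\b(?:the|The) the\b')

    def find_candidates(title, text):
        """Return (title, snippet) pairs for a human to review."""
        hits = []
        for m in PATTERN.finditer(text):
            start = max(m.start() - 30, 0)
            hits.append((title, text[start:m.end() + 30]))
        return hits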

Where is the list of such articles? RedWolf 05:29, Dec 10, 2004 (UTC)
I believe the the the list in question was at User:Topbanana/Reports/This_article_contains_a_repeated_word. However, it and several variations have since been processed manually. I suspect that a fully robotic solution to this sort of task is inappropriate, but I'd be glad to be proven wrong. - TB 16:22, 2004 Dec 10 (UTC)
No, there is at least one article which has a legitimate use of the coupling "The The" unconnected with the rock group ... but I forget what it was. Whatever, you cannot blithely make the assertion that you can fix errors of this sort by machine; you'll get 99% right and completely knacker 1% ... better to do it by hand, sadly. --Tagishsimon (talk)
No bot should be run without supervision, so the 1% shouldn't be an issue. The operator will have to pay attention, as is true of all edits. --—Ben Brockert (42) UE News 00:22, Jan 1, 2005 (UTC)
See User:R3m0t/Reports2/the for a list of false positives. This is all fixed. r3m0t 15:14, Feb 25, 2005 (UTC)

Links to Christian

There are at least 500 (and who knows how many more) wikilinks to Christian, all or almost all of which refer to Christians in the sense of Christianity. "Christian" is a disambig page with a link to Christianity as well as a few other uses of the word "Christian". It would be nice to have a bot to change all these links from "Christian" to "Christianity|Christian", while generating a log so that the changes can be reviewed for any that actually point to something other than Christianity (I expect there will be almost none). --Gary D 00:15, Dec 2, 2004 (UTC)

Thanks to User:Netoholic for this bot! --Gary D 00:25, Jan 1, 2005 (UTC)
You're welcome. There are a few hundred links, and no way to fully automate this, since it requires interpretation (does the link refer to the religion or the people?). It took me a few sessions to get through them all, but I think it's a lot better than before. If you want to see a log, view the contribs of NetBot. -- Netoholic @ 18:34, 2005 Jan 2 (UTC)


Italians in the wrong country

I've just discovered that the page Province of Brescia lists about 200 communes in this Italian province. Each of those communes has its own page, and on each of them is a handy-dandy table (not a template, sadly), listing region, province, location, etc etc. In each case, "province" links to the page Provinces of Italy, but region links to Regions of Germany rather than Regions of Italy. This sounds like a job for a bot... unless someone wants to change 200 tables by hand? Any help would be greatly appreciated. Grutness|hello? 08:23, 17 Jan 2005 (UTC)
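
For reference, a minimal sketch of the kind of one-off fix being asked for, assuming the pywikibot library purely for illustration (untested; the page and link names come from the request above):

    import pywikibot

    site = pywikibot.Site('en', 'wikipedia')
    province = pywikibot.Page(site, 'Province of Brescia')

    for page in province.linkedPages():      # every commune linked from the province page
        text = page.text
        if '[[Regions of Germany' in text:
            page.text = text.replace('[[Regions of Germany', '[[Regions of Italy')
            page.save(summary='Fix region link: Regions of Germany -> Regions of Italy')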

This problem has already been taken care of, according to User:Grutness. -- AllyUnion (talk) 19:10, 25 Feb 2005 (UTC)

BBC: On this day

(moved from Wikipedia talk:Bots) Can someone make a bot to go through the 366 days of the year and add links to the BBC's On This Day pages? They come in the format http://news.bbc.co.uk/onthisday/hi/dates/stories/month/date/ with month replaced by the month name and date by the numerical day, without a leading zero (9, not 09). Dunc| 22:39, 19 Feb 2005 (UTC)

I don't think you need a bot for that... really... See my comment on your talk page. -- AllyUnion (talk) 11:10, 20 Feb 2005 (UTC)

Sorry if I didn't make myself clear. The link ought to go into an external links section in the pages in the style of "February 20" rather than the little ones that appear on the front page. It should be possible to go to March 18 (say) on any date of the year and the link should link to http://news.bbc.co.uk/onthisday/hi/dates/stories/march/18. If it is coded into the article text with {{}} then that will always take the user to today's article, rather than the date in the article title. Dunc| 12:45, 20 Feb 2005 (UTC)
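
To make the format concrete, here is a minimal sketch of generating all 366 links (plain Python; 2004 is used only because it is a leap year, so 29 February is included):

    from datetime import date, timedelta

    def bbc_links():
        """Yield (article title, external link) for every day of the year."""
        d = date(2004, 1, 1)
        while d.year == 2004:
            month = d.strftime('%B')              # e.g. "March"
            title = '%s %d' % (month, d.day)      # article name, e.g. "March 18"
            url = ('http://news.bbc.co.uk/onthisday/hi/dates/stories/%s/%d'
                   % (month.lower(), d.day))      # lower-case month, day without leading zero
            yield title, url
            d += timedelta(days=1)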

This has been or is being handled under User:Kakashi Bot, maintained and run by me. Since this is a one-time 'run', I don't think a bot flag is necessary. -- AllyUnion (talk) 19:09, 25 Feb 2005 (UTC)
Bot completed its task -- AllyUnion (talk) 11:03, 27 Feb 2005 (UTC)


Simple re-save to update templates

Template:China-geo-stub has just had its category destination changed from Category:China geography stubs to Category:Mainland China geography stubs. So... all 190 or so articles in Category:China geography stubs have to be opened, edited and resaved so that the new template destination can come into effect. I could do it by hand, but it would be a hell of a lot simpler with a bot. Please? Grutness|hello? 08:09, 11 Mar 2005 (UTC)
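
For reference, a minimal sketch of the "touch every article in the category" approach, assuming the pywikibot library for illustration (its touch() method is assumed here to perform a null edit, which is all that is needed for the template's new category destination to take effect):

    import pywikibot

    site = pywikibot.Site('en', 'wikipedia')
    cat = pywikibot.Category(site, 'Category:China geography stubs')

    for page in cat.articles():
        # A null edit re-renders the page, so the stub template's new
        # category destination takes effect; no revision is recorded.
        page.touch()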

This will be the next thing that User:KevinBot does. Easy as pie. Kevin Rector 23:52, Mar 11, 2005 (UTC)
Done. Kevin Rector 06:12, Mar 12, 2005 (UTC)


Bot to add categories

Every page linked from List of villages in Wisconsin, except those in the see also section, should have the category Category:Villages in Wisconsin; some of them do, some of them don't. I guess that someone already has a bot which does this? If so, could you please point your bot that way. I started doing it manually; a & b were easy, but when I got to e I realised that most of the pages don't have the category. Your help appreciated. Azikala 13:52, 6 Nov 2004 (UTC)
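
A minimal sketch of what such a pass might look like, assuming the pywikibot library for illustration; pages reached via the "See also" section would still need to be excluded by hand:

    import pywikibot

    site = pywikibot.Site('en', 'wikipedia')
    source = pywikibot.Page(site, 'List of villages in Wisconsin')
    CATEGORY = '[[Category:Villages in Wisconsin]]'

    for page in source.linkedPages():
        text = page.text
        if 'Category:Villages in Wisconsin' in text:
            continue                              # already categorised
        page.text = text.rstrip() + '\n\n' + CATEGORY + '\n'
        page.save(summary='Add Category:Villages in Wisconsin')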

Pearle might be about to do this, see Wikipedia:Auto-categorization. -- User:Docu
rambot is currently doing just this. It's not a problem if someone else does it, but since I am fixing and adding other things, adding categories is just one of the things to do, and I have already done it to every remaining County article and 98% of cities from states A - K. The rest will follow. 35,000 articles takes a bit of time though, so give it a few days. -- RM

This appears to have been completed by RamBot, so I moved it to the archive. Kevin Rector 06:22, Mar 12, 2005 (UTC)


Read-only random-page getter for statistical stuff

Hello. I don't know if this is the right place but here we are. I'm thinking about running a read-only robot to get random pages for the purpose of doing statistical analysis on them. Probably the edit history will be just as interesting as the page text. I don't think I could ever get more than 1 page per second, which iirc is below the limit placed on spiders (can't find the rules governing spiders at the moment). Does that seem OK in general? If anyone has any comments I'd be interested to hear them. Regards & happy editing, Wile E. Heresiarch 06:25, 4 Jan 2005 (UTC)
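
Purely for illustration, a minimal sketch of such a read-only fetcher using only the Python standard library, throttled to one request per second (the User-Agent string below is a made-up example; Special:Random redirects to a random article):

    import time
    import urllib.request

    def fetch_random_pages(n, delay=1.0):
        """Fetch n random articles, at most one request per `delay` seconds."""
        pages = []
        for _ in range(n):
            req = urllib.request.Request(
                'https://en.wikipedia.org/wiki/Special:Random',
                headers={'User-Agent': 'random-sampler/0.1 (statistical analysis)'})
            with urllib.request.urlopen(req) as resp:
                pages.append((resp.geturl(), resp.read()))   # final URL after redirect, raw HTML
            time.sleep(delay)
        return pages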

You should probably copy this discussion to Wikipedia talk:Bots. As I understand it, this page is for requesting help from existing bots, but that one is for proposing new ones. -- Netoholic @ 05:51, 2005 Jan 8 (UTC)
Thanks for the heads up. I've copied the comment to Wikipedia talk:Bots. Regards, Wile E. Heresiarch 22:19, 8 Jan 2005 (UTC)

Transwiki Bot

Currently, the transwiki to wiktionary (Category:Copy to Wiktionary) category has a huge backlog (of about 1000 entries). Would a bot be a workable mechanism to alleviate this problem? Radiant! 12:29, Mar 10, 2005 (UTC)

  • Yes, User:KevinBot could do this, but I'd have to get permission on Wiktionary to run my bot there (I don't know the procedures there or if there are any - I don't really know much about the Wiktionary). I'll look into it. Kevin Rector 22:40, Mar 11, 2005 (UTC)
Here's the discussion on the wiktionary about this issue.
    • I strongly discourage the mass-moving of stuff to Wiktionary. Many entries are more than definitions and some of them are poorly written. I've spent some time at Wiktionary, and unlike Wikipedia, they don't seem to accept just any kind of crap. This kind of action would create a lot of work for them. Personally, I would not create an entry (in Wiktionary) for Balloon modelling because it's just modelling with balloons. There's no need to define 44 gallon drum as a drum that can hold 44 gallons, it's obviously implied in the name. Besides, that entry should stay on Wikipedia, because it's more than a definition. Sadly, Category:Copy to Wiktionary is not being used correctly, I think, because it's a lot easier to take care of inappropriate stuff by tagging it as a move, removing it from a category and forgetting about it than having to deal with VfDs, merging it, or expanding it. But that's Wikipedia's problem and moving it all elsewhere is not a solution. --jag123 18:11, 12 Mar 2005 (UTC)
  • If you go to the beer parlour over at wiktionary it appears that they do want the articles transwikied to a special namespace where they can go through them at their leisure. The question we need to deal with is not should we send them over, but what to do with them in our system once they've been sent over. I would recommend creating a new category (and template) named something like Category: Transwikied to Wiktionary. Then administrators can go through those articles and either delete them if they are only dicdefs or remove them from the category if they have evolved to actual encyclopedia articles. I'm copying this message to Category talk:Move to Wiktionary. Kevin Rector 22:55, Mar 14, 2005 (UTC)
    • Sounds like a good idea. Radiant! 08:33, Mar 15, 2005 (UTC)

I'm now moving this to the archive and marking it done. The backlog is below 200 and I'll keep working on it. Kevin Rector 04:58, Apr 2, 2005 (UTC)

Wiktionary moving

I wonder whether some automation wouldn't be helpful for Category:Move_to_Wiktionary, assuming a human checks to make sure that the articles listed there are appropriate for moving. It seems fine if the articles get cleaned up into Wiktionary style proper after they are moved. -- Beland 02:22, 2 Apr 2005 (UTC)

This has already been done; see Archive 1. Kevin Rector 17:37, Apr 6, 2005 (UTC)


Grammarbot request

(Moved from Wikipedia talk:Bots) Kevin Rector 14:42, Aug 19, 2004 (UTC)

Is it possible that a bot could be created that would put punctuation inside the quotation marks and replace short dashes (-) with long dashes (—)? Neutrality 20:12, 2 Jul 2004 (UTC)

Moving punctuation inside quotes is a bad idea. Punctuation inside quotes is appropriate for some types of quotation, such as spoken dialogue. ("I say!" he said.) But it's often wrong in technical articles. (To get a directory listing, you type "ls", not "ls,".)
For dashes you need to be careful. Em-dashes are inappropriate for ranges of numbers or dates or for minus signs. And hyphens appear in so many contexts (ISO 8601 dates, hyphenated words and names, prefixes and suffixes, chemical formulae) that a naive bot could wreak havoc... Gdr 20:30, 2004 Jul 2 (UTC)
There is also I believe a British/American English style difference in the locating of punctuation around quotes. Rmhermen 21:55, Jul 2, 2004 (UTC)
You believe correctly. See American and British English differences#Punctuation. He said, "British English is this way". She replied, "Americans do it thus." This is also true with similar punctuation marks (such as brackets). That's how it's done in Britain (though in America they're more commonly known as parentheses, and punctuated thus.) Note that Wikipedia:Manual_of_Style#Quotation_marks recommends British usage in this matter for Wikipedia. Marnanel 22:18, 2 Jul 2004 (UTC)
That's not right. In quoted dialogue in British English, punctuation goes inside the quotes, the same as in American English. "British English is this way too," he said. Also, parentheses are parenthetic in Britain too. Gdr 11:05, 2004 Jul 3 (UTC)
See MOS talk/Dashes for the ongoing problems with dashes. Rmhermen 23:09, Jul 2, 2004 (UTC)

I strongly oppose any bot whose purpose is to fix punctuation, grammar, or spelling in an automated way. There are just too many instances where automated changes are wrong due to context. (There are some very funny jokes out there where someone allowed a spell checker to respell things wrong.) Not only that, I strongly oppose any even semi-automated changes to standardize spelling or punctuation based on a single country's standard. --ssd 04:58, 16 Jul 2004 (UTC)

Seconded and seconded. Also, these issues of punctuation style are far from agreed upon. VV 21:53, 18 Jul 2004 (UTC)
I disagree. I think an interactive bot, using an international English dictionary including jargon, understanding Wikipedia markup, looking for correctly-spelled words that are improbable in their context, and giving the user context and consent over every request like the disambiguation bot does, would be a highly effective and safe means of spell-checking articles. Admittedly, no one has written such a thing, but someone very well could. Derrick Coetzee 16:48, 29 Jul 2004 (UTC)
An interactive bot would not be quite as bad, but I still doubt there is any chance it could do punctuation correctly, and you better have a VERY long list of correct alternate spellings that are NOT to be touched. --ssd 05:04, 6 Aug 2004 (UTC)

This is now accomplished at Wikipedia:WikiProject Punctuation. -- Beland 23:13, 30 July 2005 (UTC)

Article name checking

It may be useful for merging purposes to run an automated search on 'articles with the same name except one is singular and the other is plural' (apple vs. apples) and/or 'articles with the same name but different capitalization' (apple tree vs Apple Tree). Except of course if either of them is already a redirect. Then compile a list so people can do cleanup and merging/redirecting from it. Radiant! 12:01, Mar 4, 2005 (UTC)
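
A minimal sketch of the comparison itself, best run offline over a list of article titles from a database dump; the normalisation is deliberately crude (lower-case, strip a trailing "s"), so the output is only a candidate list for human review:

    from collections import defaultdict

    def candidate_merges(titles):
        """Group titles that collide after lower-casing and stripping a plural 's'."""
        groups = defaultdict(set)
        for t in titles:
            key = t.lower()
            if key.endswith('s'):
                key = key[:-1]
            groups[key].add(t)
        return [sorted(g) for g in groups.values() if len(g) > 1]

    # candidate_merges(['Apple', 'apples', 'Apple tree', 'Apple Tree'])
    # -> [['Apple', 'apples'], ['Apple Tree', 'Apple tree']]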

This is now being done on Wikipedia:WikiProject Red Link Recovery. -- Beland 23:26, 30 July 2005 (UTC)

Bot to purge watchlist of redirects

Simple redirects must account for somewhere between 10% and 15% of my watchlist total. Could a bot be written to scan through a user's watchlist and deselect (stop watching) those articles that are only redirects? Or am I barking up the wrong tree and that's not what a bot can/should do? Hajor 04:56, 7 Mar 2005 (UTC)

A bot could be made to use your account to edit your watchlist, and since no changes made by this bot would be visible to others, I don't see why permission would be needed to run this. However, I would ask you to reconsider. We NEED people watching redirects as they are a possible target for vandals. Plugwash 17:25, 8 Mar 2005 (UTC)

Need ppl watching redirs. And no one other than the creator is likely to be watching. Good point. Ok. Hajor 16:46, 5 Apr 2005 (UTC)

I for one do not want my watchlists purged of redirects. As Plugwash notes, we need people watching them. Also, I don't see why there's a problem with watching a redirect - it's unlikely to change much so won't appear in the recent changes to your watchlist page, jguk 17:50, 5 Apr 2005 (UTC)


Unnecessary spaces and line breaks

I see a lot of unnecessary spaces and line breaks in the HTML for articles. I know that it does not increase memory use or download time by much, but it must have some effect. Sometimes if I am editing an article, I will 'search and replace' double spaces. I sometimes also remove unnecessary line breaks. Perhaps a bot could do one or both of these tasks. Bobblewik (talk) 21:01, 30 Jan 2005 (UTC)

Those are probably added for source readability. Thus, I usually leave them alone (not that I like them, though :-) ). I'm not suggesting that you should do that, too. — Pladask 15:01, 31 Jan 2005 (UTC)
There's at least one article I've been involved with where someone took out "unnecessary line breaks", and I put them back, simply because they were there deliberately to shape the article around images and tables. What may appear to be an unnecessary white space to one person often does have a use. Grutness|hello? 23:09, 31 Jan 2005 (UTC)
I don't care if they are necessary or not; it's annoying to go check on some change to my watchlist, only to find out someone has been needlessly doing nothing other than messing around with spaces. Do it in conjunction with other edits if you want to, but not just for its own sake. I do also think that there are times when spaces which may not be necessary are useful in making editing easier, as in a list that will be all run together in what we see, but is easier to edit with each entry on one line. Gene Nygaard 01:45, 1 Feb 2005 (UTC)
I definitely wouldn't recommend doing this except in conjunction with other edits. Pearle is currently undergoing an overhaul to fix whitespace in the category and interwiki link sections. Doing a scan of all articles with extraneous whitespace would probably create more server load and take up more storage space than it reduces on either front. Also, it's important to maintain human readability. The time that humans devote to editing is currently our most-constrained resource. Perhaps the best thing to do would be to write a cleanup routine, get it approved, and lobby to have it included in other bots. -- Beland 04:20, 27 Feb 2005 (UTC)
Do remember that line breaks are significant in wiki markup, unlike in HTML where they only count as whitespace. Plugwash 01:01, 9 May 2005 (UTC)


Compositions by composer category cleanup

I don't know if this can be done by a bot, but I've written some instructions at User:Pladask/Compositions by composer category cleanup. See also Category talk:Compositions by composer. -- Pladask 20:13, Mar 31, 2005 (UTC)

Done. Gdr 23:02, 2005 May 17 (UTC)

Latin name redirects

It is incredible the number of genus and Latin names that have English articles but no associated redirects. I am sure it wouldn't be hard to build a bot that would check for redirects for the "taxon" variable in {{taxobox genus entry}} and {{Taxobox species entry}} and create them to the English articles whenever they are missing. Circeus 16:26, May 18, 2005 (UTC)
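
A minimal sketch of the idea, assuming the pywikibot library and a naive regex for the "taxon" parameter (anything it creates would still want checking by a human; the regex assumes the templates are used roughly as named above):

    import re
    import pywikibot

    site = pywikibot.Site('en', 'wikipedia')
    TAXON = re.compile(
        r'\{\{\s*[Tt]axobox (?:genus|species) entry[^}]*?\|\s*taxon\s*=\s*([^|}]+)')

    def ensure_redirect(article):
        """If the taxobox names a taxon with no page of its own, redirect it to the article."""
        m = TAXON.search(article.text)
        if not m:
            return
        taxon = m.group(1).strip().strip("'[]")   # drop italics/link markup around the name
        target = pywikibot.Page(site, taxon)
        if not target.exists():
            target.text = '#REDIRECT [[%s]]' % article.title()
            target.save(summary='Redirect Latin name to existing English article')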

Done. See User:Gdr/Nomialbot Gdr 19:05, 2005 May 18 (UTC)

TV

There are many, many wikilinks that go to TV, a redirect to Television; whether more than 500 or not I don't know, but it is a very common mistake, and it is highly tedious human work to repair the links to avoid the redirect. Could a bot do this? SqueakBox 22:01, Jun 11, 2005 (UTC)

I'm not sure that this actually needs to be done. As far as I can tell, the redirect doesn't hurt anything. (Actually it saves a bit on typing piped links.) -Aranel ("Sarah") 22:22, 17 July 2005 (UTC)


Duplicate content in articles

Per Wikipedia:Village_pump_(technical)#duplicate_content_in_articles, there seem to be not altogether infrequent occurrences of wholesale duplication of article text (not currently known whether this is due to a software bug or user error). Automatically identifying articles that have duplicated content would help in both fixing the affected articles and (perhaps) figuring out how this is actually occurring. One approach might be to scan articles for multiple, identical, headers. -- Rick Block 15:00, 1 Apr 2005 (UTC)

This should be done via a db dump and not done online. Kevin Rector 17:37, Apr 1, 2005 (UTC)
Agreed for a process like this which will use the bulk of wikipedia text you should download the cur dump and process it locally. Though it may be an idea to use a bot to post to the talk pages of articles where there is a likely problem. Plugwash 03:02, 2 Apr 2005 (UTC)
Is there some separate page for offline analysis requests I should have posted this to instead of here? I sort of assumed at least some bots start with an offline copy of the DB (like, say, Beland's orphaned categories scan). In any event, is anyone interested in doing this? -- Rick Block 03:49, 7 Apr 2005 (UTC)
I wasn't saying that that this was the wrong place to discuss this, I was just noting that it should be done offline, that's all. No harm, no foul. Kevin Rector 16:39, Apr 7, 2005 (UTC)
It looks like user:Dragons flight has identified a coding error in the source that leads to this issue (see Wikipedia_talk:Templates_for_deletion#Page_duplication), which means it might actually get fixed (yeah!). After it's fixed, it would be swell if someone who gets an offline copy of the database would write a script to look for articles with duplicate content. With over 600,000 articles I'd be willing to bet there are at least a few with as yet undetected content duplication. -- Rick Block (talk) July 6, 2005 01:24 (UTC)

So Rick says Bugzilla:275 was apparently causing this, and it's fixed. Bugzilla is down right now, so I have no idea about that. In any case, he asked if I could run an analysis on an offline copy of the database to look for these duplicated sections. I thought that was an interesting problem and that a solution should be easy enough to whip up. (Plus I'm really procrastinating on my job hunt.) Well, for the benefit of posterity, let me say that just looking for duplicate headers (which was my first thought, as well), does not work well. It produces a huge number of false positives. The next most obvious thing to do is to do a brute-force check for duplicated text. As a cheap approximation to doing this, I tried chopping page text into three-word chunks, and seeing how many of those chunks occur more than once. I spent several minutes watching it analyze a single article. I phoomped my code, and I got it running fast enough to cover the whole database in maybe 5 hours or so. But this was also producing a very long list. So, I decided to chain these filters, and only look for duplicated triplets in pages that have duplicated headers. This completed in about 20 minutes and seems to have produced a relatively useful listing. I have posted it at Wikipedia:Duplicated sections. Enjoy. -- Beland 06:18, 1 August 2005 (UTC)
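
For anyone who wants to reproduce or refine this, a minimal sketch of the chained filter described above (duplicate headers first, then repeated three-word chunks), operating on the wikitext of a single page; the threshold is arbitrary:

    import re
    from collections import Counter

    HEADER = re.compile(r'^(=+ *.+? *=+) *$', re.MULTILINE)

    def has_duplicate_header(text):
        headers = HEADER.findall(text)
        return len(headers) != len(set(headers))

    def duplicated_triplets(text):
        """Count three-word chunks that occur more than once (a cheap duplication signal)."""
        words = re.findall(r'\w+', text.lower())
        return sum(1 for c in Counter(zip(words, words[1:], words[2:])).values() if c > 1)

    def looks_duplicated(text, threshold=20):
        # Only bother with the triplet count on pages that already repeat a header.
        return has_duplicate_header(text) and duplicated_triplets(text) >= threshold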

Category Cleanup Bot

(Moved from Wikipedia:Help wanted) Kevin Rector 14:48, Aug 19, 2004 (UTC)

Category:Lists of people by occupation is a horrible mess. Similar problems to lists of writers. Secretlondon 02:36, 21 Jun 2004 (UTC)

All lower-case entries capitalised, but some "occupations" are a little dubious. Secretlondon 02:43, 21 Jun 2004 (UTC)
Everything under L for List sorted. All lists of actors dumped into their own sub-category. Maybe someone more sympathetic to what I see as utter trivia should deal with the rest... Secretlondon 03:38, 21 Jun 2004 (UTC)
For trivia, you may want to duplicate this to List of trivia lists. You could have saved some work if you had alphabetized the lists as per "list of ..". Anyway, I changed the ones on Category:Lists of actors to match the other sorting as well. -- User:Docu

Tagged for category cleanup due to need to alphabetize properly. Otherwise, this looks fixed, and not in need of bot attention. -- Beland 04:08, 7 September 2005 (UTC)

Asteroid de-stubbing bot

Asteroids are rather dull things. There are also rather a lot of them. Wikipedia has a lot of articles like 315 Constantia which have quite extensive numerical data on orbits and the like but are also stubs.

If the idea is that stubs are an invitation to expand an article with relevant information, then the stubs here are no longer valid, as all the relevant information is in the table (there's not a lot you can say about your average asteroid).

There are too many asteroid stubs like this to remove the stub labels manually, so I believe it prudent that a bot should be created to remove the stub label from all articles in Category:Asteroids and subcategories which contain the "minor planet" table. LukeSurl 18:35, 9 Mar 2005 (UTC)
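
A minimal sketch of what such a bot might do, assuming the pywikibot library for illustration; the stub-template pattern and the test for the minor planet table are both deliberately crude and would need narrowing before any real run:

    import re
    import pywikibot

    site = pywikibot.Site('en', 'wikipedia')
    cat = pywikibot.Category(site, 'Category:Asteroids')
    STUB = re.compile(r'\{\{[^{}|]*-stub\}\}\n?')      # any {{xxx-stub}} tag

    for page in cat.articles(recurse=True):            # include subcategories
        text = page.text
        if 'Minor Planet' in text and STUB.search(text):   # crude check for the data table
            page.text = STUB.sub('', text)
            page.save(summary='Remove stub tag from asteroid article with full data table')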

Substub sifting

I've been busily trying to sift through Category:Substubs, to see what can be salvaged and what can't. The items that can be salvaged, I've enlarged a little and moved to one of the subcategories of Category:Stub. It looks, though, like someone else has already had a go at this task, but has left Template:substub on the ones he or she has sifted. This means that quite a number of articles have both Template:substub and one of the Stub category templates (an example is Clean Air Act 1956). Is it possible to have a bot go through all 3000 substubs and remove Template:Substub from any that have other stub templates? Grutness|hello? 01:22, 20 Mar 2005 (UTC)
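
A minimal sketch of that clean-up, assuming the pywikibot library for illustration; it only touches pages that carry both {{substub}} and some other stub template:

    import re
    import pywikibot

    site = pywikibot.Site('en', 'wikipedia')
    cat = pywikibot.Category(site, 'Category:Substubs')
    SUBSTUB = re.compile(r'\{\{[Ss]ubstub\}\}\n?')
    OTHER_STUB = re.compile(r'\{\{[^{}|]*-stub\}\}')    # e.g. {{UK-stub}}, {{geo-stub}}

    for page in cat.articles():
        text = page.text
        if SUBSTUB.search(text) and OTHER_STUB.search(text):
            page.text = SUBSTUB.sub('', text)
            page.save(summary='Remove {{substub}}: article already has a stub template')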

Category:Puerto Rican people

This cat has about 430 people and should be depopulated, since it is a parent category for Category:Puerto Rican people by occupation. I don't relish the thought of doing this manually. Thanks. Gamaliel 18:24, 29 Mar 2005 (UTC)

If you nominate this to WP:CFD, Pearle will do it. -- Beland 02:20, 2 Apr 2005 (UTC)
Can I do that if I don't actually want the category to be deleted? Gamaliel 20:51, 13 Apr 2005 (UTC)
Yes. Just specify that in your nomination. -- Beland 23:26, 30 July 2005 (UTC)

Adapt excel list to wiki

A friend sent me an .xls file with ~25,000 entries (an index) from Polski Słownik Biograficzny - i.e. a list of 25,000 people who deserve an encyclopedic article. Is there a bot or any other tool that could convert it into a wikified list (or several... 25,000 ENTRIES!)? I hope this is the right place to ask for help. If you know anybody (or anything) I can contact and ask for help, let me know at my talk page as well. I created a subpage at my userpage for this project explaining what Polski Słownik Biograficzny is at User:Piotrus/List of Poles. --Piotr Konieczny aka Prokonsul Piotrus Talk 13:35, 9 May 2005 (UTC)

Converting to a list shouldn't be a problem, and won't require a bot. Export the .xls file to a .csv or text format (if possible, only export the column with the name) then run a text editor that will do macros on it to format the data. I can help if you provide a link to the .xls file or the exported data. --ChrisRuvolo (t) 16:48, 9 May 2005 (UTC)
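
To make that concrete, a minimal sketch of the .csv-to-wikilist step (standard library only; it assumes the names are in the first column, and splits the output so no single page has to hold all 25,000 entries):

    import csv

    CHUNK = 1000   # entries per output page, to keep page sizes manageable

    def wikify(csv_path):
        with open(csv_path, newline='', encoding='utf-8') as f:
            names = [row[0].strip() for row in csv.reader(f) if row and row[0].strip()]
        for i in range(0, len(names), CHUNK):
            lines = ['* [[%s]]' % n for n in names[i:i + CHUNK]]
            with open('list_part_%02d.txt' % (i // CHUNK + 1), 'w', encoding='utf-8') as out:
                out.write('\n'.join(lines) + '\n')

    # wikify('psb_index.csv')  # then paste each list_part_NN.txt into its own wiki page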

Paralympics English → German Bot

(Moved here from Village pump) --Wclark 15:08, 2004 Jul 15 (UTC)

(Moved again from Wikipedia talk:Bots) Kevin Rector 14:29, Aug 19, 2004 (UTC)

How do you make a Bot Request? I want to link all the Paralympics games in English to the Paralympics games in German by using the [[en:]] and [[de:]] tags, but there are too many to do manually. Salasks 14:59, 15 Jul 2004 (UTC)

This seems to have been taken care of manually. (It's not actually that many.) -- Beland 01:24, 15 September 2005 (UTC)


Taxobot

I believe the time has come to make a taxobot (taxonomy infobox bot). Here are the various tasks it needs to handle:

  1. Convert plain HTML and plain wikitable taxonomy boxes into template taxoboxes.
  2. Fix buggy taxoboxes caused by the conversion of the taxobox templates from HTML to wiki markup ({{Taxobox_end}} should be on its own line, and no blank line should follow it.)
  3. The authority templates should be updated to take a single parameter.

While I'm a software engineer in real life, I have no idea how to get this started. Anyone for some bot fun? - UtherSRG 04:41, Jan 8, 2005 (UTC)
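
For item 2 above, a minimal sketch of the text fix on its own (plain Python; the conversions in items 1 and 3 need far more care than a few lines can show):

    import re

    def fix_taxobox_end(text):
        """Put {{Taxobox_end}} on its own line and drop any blank lines right after it."""
        # Make sure the template starts at the beginning of a line.
        text = re.sub(r'(?<!\n)\{\{Taxobox_end\}\}', '\n{{Taxobox_end}}', text)
        # Make sure exactly one newline (no blank line) follows it.
        text = re.sub(r'\{\{Taxobox_end\}\}[ \t]*\n+', '{{Taxobox_end}}\n', text)
        return text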

Have you ever thought about contacting André Engels at the nl.wikipedia ? If you want a bot, he's the man to see. JoJan 22:40, 8 Jan 2005 (UTC)

Well, I've tried, but with no luck, to get Andre's attention. Anyone wanna help me build a bot? - UtherSRG 02:10, Feb 1, 2005 (UTC)

Assisting UtherSRG on this matter. -- AllyUnion (talk) 19:33, 25 Feb 2005 (UTC)
The use of templates for taxoboxes is a pain for those of us who follow interwiki links to pages we can't read in order to add images (you don't need to be able to read the text of a species article to add an image to its taxobox). Plugwash 17:18, 8 Mar 2005 (UTC)


Bot for updating and tabulating CIA Factbook Data

I have made Template:Military and implemented it in all countries with a military beginning with the letters A, F, Z (e.g. Military of Armenia). Currently, CIA data is presented in an unappealing way in semi-hairy HTML (see Military of Cameroon for example). In addition, the CIA updates its data every year, so most articles generated back in 2001 are outdated. If anyone can update and reformat the CIA data with a bot, then it will save us a lot of time (now and in the future, because a new factbook is issued each year). This can also be expanded beyond articles on militaries to other daughter articles prescribed by WikiProject Countries. --Jiang 11:06, 10 Jan 2005 (UTC)

I am assisting Jiang with this matter. -- AllyUnion (talk) 08:18, 27 Feb 2005 (UTC)


Bot to undo damage by bot putting in U.S. census places

In all the demographics for U.S. census places, the information is cluttered up by useless false precision in the numbers. All of the trailing zeros in the percentages in the example below are useless, misleading information.

There are 1,974,181 households out of which 30.90% have children under the age of 18 living with them, 44.00% are married couples living together, 15.60% have a female householder with no husband present, and 35.70% are non-families. 29.40% of all households are made up of individuals and 9.30% have someone living alone who is 65 years of age or older. The average household size is 2.68 and the average family size is 3.38.
In the county the population is spread out with 26.00% under the age of 18, 9.90% from 18 to 24, 31.70% from 25 to 44, 20.70% from 45 to 64, and 11.70% who are 65 years of age or older. The median age is 34 years. For every 100 females there are 93.90 males. For every 100 females age 18 and over, there are 90.50 males.
The median income for a household in the county is $45,922, and the median income for a family is $53,784. Males have a median income of $40,690 versus $31,298 for females. The per capita income for the county is $23,227. 13.50% of the population and 10.60% of families are below the poverty line. Out of the total population, 18.90% of those under the age of 18 and 10.30% of those 65 and older are living below the poverty line.

Gene Nygaard 07:37, 27 Jan 2005 (UTC)
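
A minimal sketch of the clean-up rule being asked for, as a pure text transformation: it trims trailing zeros after the decimal point, so 44.00% becomes 44% and 30.90% becomes 30.9%, while figures like $45,922 and 2.68 are left alone (whether some values should instead be re-rounded is an editorial call):

    import re

    def strip_false_precision(text):
        """Drop trailing zeros from decimal figures: 30.90% -> 30.9%, 44.00% -> 44%."""
        def trim(m):
            return m.group(0).rstrip('0').rstrip('.')
        return re.sub(r'\b\d+\.\d*0\b', trim, text)

    # strip_false_precision("44.00% are married couples and 30.90% have children; size is 2.68")
    # -> "44% are married couples and 30.9% have children; size is 2.68"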

This user is currently talking with the census bot owner. -- AllyUnion (talk) 19:16, 25 Feb 2005 (UTC)
Looks like this is resolved. -- Beland 01:24, 15 September 2005 (UTC)