This is Haiyizhu's talk page, where you can send her messages and comments.
Welcome :)-- Haiyizhu ( talk) 04:24, 24 January 2011 (UTC)
Here are my answers to your questions. Thanks for your questions and let me know if you need more information. mheart ( talk) 20:40, 24 January 2011 (UTC)
1. How many times (approximately) have you participated in the alternative music collaboration of the week?
2. Why did you participate in the alternative music collaboration of the week?
3. Do you feel you become a better editor through participating in collaboration? If so, could you provide some examples?
I don't know if this counts as being specifically a part of the alternative music collaboration of the week, but the Soundgarden article was once chosen for the collaboration. At another time, I assisted another user in editing the Soundgarden article to obtain good article status on Wikipedia. Cannibaloki and I pushed each other to get the article up to good standards, and our respective know-how about editing and the band ultimately sent the Soundgarden article into good article territory.
4. How did the alternative music collaboration of the week change your behaviors? What caused the changes?
5. Are you still participating in the alternative music collaboration of the week? Why?
Hi, I replied to your questions over on my talk page. Hope they are useful in some way. Chevy monte carlo 19:00, 25 January 2011 (UTC)
Hi Haiyizhu,
Thanks for your interest in Wikipedia. Here are my answers to your five questions.
1. I've taken part in Wikiproject Oregon's Collaboration of the Week between 10 and 20 times since 2007.
2. I usually work more or less alone on topics that interest me. An Oregon collaboration from time to time adds variety.
3. Collaborations help me see how others view things and how they go about their work. They know things that I don't know, and I learn from them by observation or discussion. For example, when I took part in a collaboration to improve articles about Oregon hospitals, I imitated what others were doing and asked their advice.
4. The collaborations did not change my basic behaviors.
5. I haven't participated as much lately because I've spent increasing amounts of time reviewing, especially at WP:PR. I have a wide variety of Wikipedia interests, many of which are collaborative.
Hope this helps. Finetooth ( talk) 02:40, 30 January 2011 (UTC)
Q1. How many times (approximately) have you participated in Alternative Music Collaboration of the Week?
Q2. How much do you learn from participating in Alternative Music Collaboration of the Week? A. A lot; B. A little bit; C. Not at all (please skip Q3 if you choose C)
Q3. What did you learn from participating in Alternative Music Collaboration of the Week? Please provide examples if possible.
Q4. Do you have any negative experiences with Alternative Music Collaboration of the Week?
Q5. What do you think are some of the reasons for Wikiproject Alternative Music's cancellation of collaboration of the week?
Best of luck with your research.-- Michig ( talk) 20:33, 31 January 2011 (UTC)
Q1. How many times (approximately) have you participated in Alternative Music Collaboration of the Week?
Q2. How much do you learn from participating in Alternative Music Collaboration of the Week? A. A lot; B. A little bit; C. Not at all (please skip Q3 if you choose C)
Q3. What did you learn from participating in Alternative Music Collaboration of the Week? Please provide examples if possible.
Q4. Do you have any negative experiences with Alternative Music Collaboration of the Week?
Q5. What do you think are some of the reasons for Wikiproject Alternative Music's cancellation of collaboration of the week?
Hope I helped. Tezero ( talk) 00:04, 5 February 2011 (UTC)
I'd rather not do any interviews. Sorry. Finetooth ( talk) 21:32, 30 October 2011 (UTC)
Sorry, but lately I've been having some problems with the messenger, and I'm unable to use it. Regards. Tintor2 ( talk) 15:29, 8 November 2011 (UTC)
I am willing to take part in the interview, and can be reached through Windows Live Messenger.
I can be contacted through MSN at merlinsorca@hotmail.com
There is really no specific time I need to do the interview. Most of the time, I'll be logged in whenever I'm on the computer. If you find me online today or during this week, you can contact me whenever you have the time!
Merlinsorca 16:18, 8 November 2011 (UTC)
Hi. I'll have to decline on the interview as I don't think I've ever participated in any of the collaborations of the week/month. Good luck with your research. TH1RT3EN talk ♦ contribs 01:02, 9 November 2011 (UTC)
Hi, I'm willing and able to participate in a text-based interview via Skype (my Skype name is game-guru999). I'm free all weekend and would prefer doing it then, but I'll also be available on Thursday evening (between 5 and 8 PM, UTC+1) and Friday afternoon (between 3 and 6 PM).
Regards, Game-Guru999 ( talk) 16:58, 9 November 2011 (UTC)
I would be happy to try to find time to answer questions about the video game collaboration of the week. To be fair, when I started using it, it had been decreasing in use. It's been changed from weekly to monthly and I haven't been watching it too closely, but I can answer questions about selection, my involvement, or how it had been used historically. I'm likely more available for a text chat than anything else. — Ost ( talk) 22:45, 14 November 2011 (UTC)
Hi Haiyi,
Sorry it took me a while to respond, I haven't been on Wikipedia much recently. I can try to help as much as I can, though I haven't edited much on WP in a year or 2. If you want to talk, probably the best way would be to email me your questions, and I can then respond. If you are still interested, shoot me an email @ pdwinfre (at) loyno (dot) edu. -- Samwisep86 ( talk) 16:46, 23 November 2011 (UTC)
Hi Haiyizhu, I wanted to let you know that there was some discussion of your research work on Wikipedia published in the Features section of The WikiProject Video Games Newsletter Volume 5, Number 3. If there is any concern that you have been misquoted or that your views have been distorted in any way then please let me know so we can issue a correction in the next newsletter. Thank you. - Thibbs ( talk) 13:50, 17 October 2012 (UTC)
Given your work, WP:ORCID may be of interest. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 11:57, 24 November 2014 (UTC)
Hello Haiyizhu,
Would you be able to help evaluate the accuracy of translations of Wikipedia articles from Chinese to English Wikipedia?
This would involve evaluating a translated article on the English Wikipedia by comparing it to the original Chinese article, and marking it "Pass" or "Fail" based on whether the translation faithfully represents the original. Here's the reason for this request:
There are a number of articles on English Wikipedia that were created as machine translations from different languages, including Chinese, using the Content Translation tool, sometimes by users with no knowledge of the source language. The config problem that allowed this to happen has since been fixed, but this has left us with a backlog of articles whose accuracy of translation is suspect or unknown, including some articles translated from Chinese. In many cases, other editors have come forward later to copyedit and fix any English grammar or style issues, but that doesn't necessarily mean that the translation is accurate, as factual errors from the original translation may remain. To put it another way: Good English is not the same as good translation.
If you can help out, that would be great. Here's a sample of the articles that need checking:
All you have to do is compare the English article to the Chinese article and mark it "Pass" or "Fail" (the templates {{Pass}} and {{Fail}} may be useful). (Naturally, if you feel like fixing an inaccurate translation and then marking it "Pass", that's even better, but it isn't required.)
If you can help, please let me know. Thanks! Mathglot ( talk) 22:22, 3 June 2017 (UTC)
-- 02:07, Thursday, October 19, 2017 ( UTC)
Mission 1 | Mission 2 | Mission 3 | Mission 4 | Mission 5 | Mission 6 | Mission 7 |
Say Hello to the World | An Invitation to Earth | Small Changes, Big Impact | The Neutral Point of View | The Veil of Verifiability | The Civility Code | Looking Good Together |
Imagine you’ve just spent 27 minutes working on what you earnestly thought would be a helpful edit to your favorite article. You click that bright blue “Publish changes” button for the very first time, and you see your edit go live! Weeee! But 52 seconds later, you refresh the page and discover that your edit has been reverted and wiped off the planet.
However, your post was not deleted by a human editor; an AI system called ORES has been contributing to this rapid judgement of hundreds of thousands of editors' work on Wikipedia. ORES is a machine learning system that automatically predicts edit and article quality to support editing tools in Wikipedia. For example, when you go to RecentChanges, you can see whether an edit is flagged as damaging and should be reviewed. This is based on ORES predictions, and RecentChanges even allows you to interact with ORES by changing the sensitivity of the algorithm to "High (flags more edits)" or "Low (flags fewer edits)".
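The "sensitivity" setting corresponds to the probability threshold applied to the model's damaging score: a lower threshold flags more edits (catching more damage but also more good edits), while a higher threshold flags fewer. A minimal sketch of that idea, using hypothetical scores rather than real ORES output:

```python
# Hypothetical damaging probabilities for six edits (not real ORES scores).
scores = [0.02, 0.10, 0.35, 0.55, 0.80, 0.97]

def flagged(scores, threshold):
    """Return indices of edits whose damaging score meets the threshold."""
    return [i for i, s in enumerate(scores) if s >= threshold]

print(flagged(scores, 0.9))   # "Low" sensitivity: flags fewer edits -> [5]
print(flagged(scores, 0.3))   # "High" sensitivity: flags more edits -> [2, 3, 4, 5]
```

Each of the four models below can be read as one such threshold choice applied to the same underlying classifier.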
In this discussion post, we invite editors to discuss the following *4 potential ORES models*. Among the four, which one do you think presents the best outcomes, and which would you recommend for the English Wikipedia community to use? Why?
ABOUT US: We are a group of HCI researchers at Carnegie Mellon University, and we are inviting editors to discuss the values of Wikipedia as they relate to ORES. We aim to help build a better Wikipedia community by engaging editors in the design and development of AI tools. More details are available at our research metapage.
Model 1

Group / Metrics | Accuracy | False Positive Rate | False Negative Rate | Damaging Rate |
---|---|---|---|---|
Overall | 97.7% | 0.1% | 56.8% | 1.8% |
Newcomer | 92.7% | 0.9% | 55.0% | 6.2% |
Experienced | 99.6% | 0.0% | 83.6% | 0.1% |
Anonymous | 92.3% | 0.3% | 53.8% | 6.6% |
Advantages:
1. For anonymous and newcomer editors, only a small proportion of their edits are flagged as damaging.
2. At this threshold, the algorithm treats experienced and non-experienced editors as similarly as possible.
Disadvantages:
1. For all three groups of editors, a large proportion of their damaging edits will be classified by the algorithm as good.
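The metrics in these tables follow the standard confusion-matrix definitions (assuming "Damaging Rate" here means the proportion of edits the model flags as damaging). A hypothetical sketch with made-up counts, not the actual ORES evaluation code:

```python
def rates(tp, fp, tn, fn):
    """Compute the four table metrics from confusion-matrix counts."""
    total = tp + fp + tn + fn
    accuracy = (tp + tn) / total   # edits classified correctly
    fpr = fp / (fp + tn)           # good edits wrongly flagged as damaging
    fnr = fn / (fn + tp)           # damaging edits missed (classified as good)
    flag_rate = (tp + fp) / total  # edits flagged as damaging, right or wrong
    return accuracy, fpr, fnr, flag_rate

# Hypothetical counts for 1000 edits.
acc, fpr, fnr, flag = rates(tp=30, fp=10, tn=950, fn=10)
print(f"accuracy={acc:.1%} FPR={fpr:.1%} FNR={fnr:.1%} flagged={flag:.1%}")
```

Note the trade-off visible in all four tables: pushing the false positive rate down (fewer good edits flagged) necessarily pushes the false negative rate up (more damaging edits missed), and vice versa.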
Model 2

Group / Metrics | Accuracy | False Positive Rate | False Negative Rate | Damaging Rate |
---|---|---|---|---|
Overall | 98.0% | 1.6% | 11.5% | 5.0% |
Newcomer | 95.9% | 3.8% | 7.0% | 14.3% |
Experienced | 99.8% | 0.0% | 41.8% | 0.3% |
Anonymous | 91.6% | 8.3% | 9.2% | 19.7% |
Advantages:
1. For all three groups of editors, only a small proportion of their good edits will be flagged by the algorithm as damaging.
Disadvantages:
1. For all three groups of editors, a relatively large proportion of their damaging edits will be classified by the algorithm as good.
Model 3

Group / Metrics | Accuracy | False Positive Rate | False Negative Rate | Damaging Rate |
---|---|---|---|---|
Overall | 95.5% | 4.6% | 1.6% | 8.2% |
Newcomer | 90.8% | 10.4% | 0.5% | 21.0% |
Experienced | 99.9% | 0.0% | 11.9% | 0.4% |
Anonymous | 80.0% | 23.1% | 0.6% | 33.6% |
Advantages:
1. The algorithm can correctly classify most edits by experienced editors.
Disadvantages:
1. At this threshold, the algorithm treats experienced and non-experienced editors quite differently.
Model 4

Group / Metrics | Accuracy | False Positive Rate | False Negative Rate | Damaging Rate |
---|---|---|---|---|
Overall | 90.5% | 9.9% | 0.8% | 13.3% |
Newcomer | 78.7% | 24.2% | 0.0% | 33.2% |
Experienced | 99.9% | 0.1% | 7.5% | 0.5% |
Anonymous | 58.0% | 48.6% | 0.2% | 55.7% |
Advantages:
1. Only a very small proportion of damaging edits will be classified by the algorithm as good.
Disadvantages:
1. At this threshold, the algorithm treats experienced and non-experienced editors very differently.