A Bit More Detail

Assorted Personal Notations, Essays, and Other Jottings

Posts Tagged ‘wikipedia’

[BLOG] Some Friday links

  • D-Brief shares rare video of beaked whales on the move.
  • Dangerous Minds notes that someone has actually begun selling unauthorized action figures of Trump Administration figures like Bannon and Spencer.
  • Language Log looks at a linguistic feature of an Emma Watson quote: her ending it with a preposition.
  • Marginal Revolution’s Tyler Cowen considers, originally for Bloomberg View, if Trump could be seen as a placebo for what ails America.
  • The New APPS Blog takes a Marxist angle on the issue of big data, from the perspective of (among other things) primitive accumulation.
  • The Search reports on the phenomenon of the Women’s History Month Wikipedia edit-a-thon, which aims to increase the representation of notable women on Wikipedia.
  • Towleroad notes the six men who will be stars of a new Fire Island reality television show.
  • The Volokh Conspiracy finds some merit in Ben Carson’s description of American slaves as immigrants. (Some.)
  • Window on Eurasia argues that Belarusians are beginning to mobilize against their government and suggests they are already making headway.

[BLOG] Some Saturday links

  • Centauri Dreams looks at the advanced microelectronics that might last a space probe the two decades it would take to get to Proxima Centauri.
  • Dangerous Minds links to a 1980 filmed concert performance by Queen.
  • The Dragon’s Gaze reports on the discovery of potassium in the atmosphere of WASP-17b.
  • Language Hat looks at the Carmina of Optatianus, an interesting piece of Latin literature.
  • Lawyers, Guns and Money reports on the shameless anti-democratic maneuvering of the Republicans in North Carolina.
  • The LRB Blog reflects on the shamelessness of the perpetrators of the Aleppo massacres.
  • Marginal Revolution looks at what Charles Darwin’s reading habits have to say about the man’s process of research.
  • North!’s Justin Petrone looks at the elves of Estonia.
  • The NYRB Daily praises the new movie Manchester by the Sea.
  • The Planetary Society Blog shares a recent photo of Phobos.
  • Peter Rukavina argues that the Island’s low PISA scores do not necessarily reflect what Islanders have learned.
  • Savage Minds shares an essay by someone who combines academic work with library work.
  • Torontoist notes the city’s subsidies to some major water polluters.
  • Window on Eurasia notes the anniversary of some important riots in Kazakhstan.
  • Arnold Zwicky reflects on the penguin-related caption of a photo on Wikipedia that has made the world laugh.

[LINK] “A Wikipedian Explains How Wikipedia Stays Reliable in the Fake News Era”

Vice’s Mike Pearl interviews Wikipedia editor Victor Grigas to examine Wikipedia’s strategies for exposing fraud.

VICE: How’d you get into writing about fake news?
Victor Grigas: Chicago stuff is what I write about, and I had all these friends who were like, “This is bullshit, man!” when Trump got elected. And I was like, “Send your [protest] photos in!” I had one friend who did, and I uploaded them. [So] I’m pretty happy with where [Wikipedia’s articles about Trump protests have] gone. But in the process of researching it, if you type in “Trump protests,” you’ll find these fake news articles that say there were people paid, and it’s crazy! If you actually read the fake news articles, they’ll cite this one YouTube video of a dash cam camera driving in Chicago past a bunch of buses. So it’s like, “Oh, because these buses are here, they’ve bused in protesters from everywhere!”

Is that claim backed up by any sources Wikipedia considers reliable?
It’s total nonsense with no basis whatsoever! But they’re writing this to feed whatever beast. I don’t know if they’re writing it just to make money, or if there’s a political incentive. I have no fucking clue, but it’s obviously not reliable. But for some reason it’s coming up near the top of my Google searches, which is really infuriating. So I want to make sure that when people read about these things, they know they’re not there.

Does the existence of this fake news merit its own inclusion in well-sourced articles?
At the bottom of the page about the protests, there’s one or two lines about [fake news]. And I got into a little bit of an editing conflict about that because I tried using the fake news site as a source about the fake news. They deleted what I wrote, and I think the line was “awful reference!” and it got deleted right away, automatically without reading or trying to understand what I was trying to do about it.

So when veteran Wikipedia editors aren’t around, what happens when an article shows up based on fake news?
There’s a lot of policing that happens on Wikipedia, which people see as a real barrier to entry to get started, because there’s a huge learning curve. One of the aspects of that learning curve is what you’re allowed to write, basically. And it takes a little bit of patience to figure out how to make it work. So one of the things that happens is you start editing and stuff gets deleted like that.

What kind of stuff do you mean?
If you start [sourcing] like a blog, or a personal site, or something like that, it’s gonna bite the dust real fast. People are gonna take it out, and they’re gonna point you to the reason why they took it out, usually.

Written by Randy McDonald

December 2, 2016 at 6:00 pm

[NEWS] Some Tuesday links

  • The Inter Press Service suggests climate change is contributing to a severe drought in Nicaragua.
  • Reuters notes China’s plan to implement sanctions against North Korea.
  • Atlas Obscura explores the now-defunct medium of vinyl movies.
  • Science goes into detail about the findings that many pre-contact American populations did not survive conquest at all.
  • CBC notes evidence that salmon prefer dark-walled tanks.
  • Universe Today notes the discovery of a spinning neutron star in the Andromeda Galaxy.
  • Vice’s Motherboard notes how Angolan users of free limited-access internet sites are sharing files through Wikipedia.
  • Maclean’s notes how an ordinary British Columbia man’s boudoir photos for his wife have led to a modelling gig.

[LINK] “At 15, Wikipedia Is Finally Finding Its Way to the Truth”

Wired’s Cade Metz notes Wikipedia’s evolution in recent years, for good and for ill.

Today, Wikipedia celebrates its fifteenth birthday. In Internet years, that’s pretty old. But “the encyclopedia that anyone can edit” is different from services like Google, Amazon, and Facebook. Though Wikipedia has long been one of the Internet’s most popular sites—a force that decimated institutions like the Encyclopedia Britannica—it’s only just reaching maturity.

The site’s defining moment, it turns out, came about a decade ago, when Stephen Colbert coined the term “Wikiality.” In a 2006 episode of The Colbert Report, the comedian spotlighted Wikipedia’s most obvious weakness: With a crowdsourced encyclopedia, we run the risk of a small group of people—or even a single person—bending reality to suit their particular opinions or attitudes or motivations.

“Any user can change any entry, and if enough other users agree with them, it becomes true,” Colbert said, before ironically praising Wikipedia in a way that exposed one of its biggest flaws. “Who is Britannica to tell me that George Washington had slaves? If I want to say he didn’t, that’s my right. And now, thanks to Wikipedia, it’s also a fact. We should apply these principles to all information. All we need to do is convince a majority of people that some factoid is true.”

Fifteen years on, Wikipedia is approaching an equilibrium.

To prove his point, Colbert invited viewers to add incorrect information to Wikipedia’s article on elephants. And they did. In the end, this wonderfully clever piece of participatory social commentary sparked a response from Jimmy Wales, Wikipedia’s co-founder and figurehead-in-chief. At the 2006 Wikimania—an annual gathering of Wikipedia’s core editors and administrators—Wales signaled a shift in the site’s priorities, saying the community would put a greater emphasis on the quality of its articles, as opposed to the quantity. “We’re going from the era of growth to the era of quality,” Wales told The New York Times.

And that’s just what happened. The site’s administrators redoubled efforts to stop site vandalism, to prevent the kind of “truthiness” Colbert had satirized. In many ways, it worked. “There was a major switch,” says Aaron Halfaker, a researcher with the Wikimedia Foundation, the non-profit that oversees Wikipedia. Volunteers policed pages with a greater vigor and, generally speaking, became more wary of anyone who wasn’t already a part of the community. The article on elephants is still “protected” from unknown editors.

Written by Randy McDonald

January 23, 2016 at 1:01 pm

[LINK] On the potential decline of Wikipedia

Marginal Revolution’s Tyler Cowen linked to the preprint of an upcoming paper by Halfaker et al., “The Rise and Decline of an Open Collaboration System: How Wikipedia’s reaction to popularity is causing its decline”.

Open collaboration systems like Wikipedia need to maintain a pool of volunteer contributors in order to remain relevant. Wikipedia was created through a tremendous number of contributions by millions of contributors. However, recent research has shown that the number of active contributors in Wikipedia has been declining steadily for years, and suggests that a sharp decline in the retention of newcomers is the cause. This paper presents data that show that several changes the Wikipedia community made to manage quality and consistency in the face of a massive growth in participation have ironically crippled the very growth they were designed to manage. Specifically, the restrictiveness of the encyclopedia’s primary quality control mechanism and the algorithmic tools used to reject contributions are implicated as key causes of decreased newcomer retention. Further, the community’s formal mechanisms for norm articulation are shown to have calcified against changes – especially changes proposed by newer editors.

Written by Randy McDonald

January 2, 2016 at 5:45 pm

[LINK] “Wikipedia Deploys AI to Expand Its Ranks of Human Editors”

Wikipedia’s development of AI editors, noted by Wired’s Cade Metz, worries me. If AIs will be doing the work there, what role will humans have?

Aaron Halfaker just built an artificial intelligence engine designed to automatically analyze changes to Wikipedia.

Wikipedia is the online encyclopedia anyone can edit. In crowdsourcing the creation of an encyclopedia, the not-for-profit website forever changed the way we get information. It’s among the ten most-visited sites on the Internet, and it has swept tomes like World Book and Encyclopedia Britannica into the dustbin of history. But it’s not without flaws. If anyone can edit Wikipedia, anyone can mistakenly add bogus information. And anyone can vandalize the site, purposefully adding bogus information. Halfaker, a senior research scientist at the Wikimedia Foundation, the organization that oversees Wikipedia, built his AI engine as a way of identifying such vandalism.

In one sense, this means less work for the volunteer editors who police Wikipedia’s articles. And it might seem like a step toward phasing these editors out, another example of AI replacing humans. But Halfaker’s project is actually an effort to increase human participation in Wikipedia. Although some predict that AI and robotics will replace as much as 47 percent of our jobs over the next 20 years, others believe that AI will also create a significant number of new jobs. This project is at least a small example of that dynamic at work.

“This project is one attempt to bring back the human element,” says Dario Taraborelli, Wikimedia’s head of research, “to allocate human attention where it’s most needed.”
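
The excerpt doesn’t get into the mechanics, but the engine Halfaker built scores individual edits for how likely they are to be damaging, so that human patrollers can focus their attention where it matters. As a rough illustration only, here is a minimal Python sketch against the Wikimedia Foundation’s public ORES scoring service, which grew out of this work; the endpoint, model name, response shape, and revision ID below are my assumptions for the sketch, not details taken from the article.

    # A sketch, not the article's implementation: ask the ORES "damaging" model
    # (https://ores.wikimedia.org) how likely a given English Wikipedia revision
    # is to be vandalism. Endpoint and response shape are assumed; check the
    # live API documentation before relying on them.
    import requests

    ORES_URL = "https://ores.wikimedia.org/v3/scores/enwiki/"  # enwiki = English Wikipedia

    def damaging_probability(rev_id):
        # Request a score for a single revision from the "damaging" model.
        resp = requests.get(
            ORES_URL,
            params={"models": "damaging", "revids": rev_id},
            timeout=10,
        )
        resp.raise_for_status()
        data = resp.json()
        # Assumed shape: {"enwiki": {"scores": {"<rev_id>": {"damaging": {"score": {...}}}}}}
        score = data["enwiki"]["scores"][str(rev_id)]["damaging"]["score"]
        return score["probability"]["true"]

    if __name__ == "__main__":
        rev = 123456789  # placeholder revision ID, not an example from the post
        print("P(damaging) for revision", rev, "=", damaging_probability(rev))

In practice a score like this is a triage signal rather than a verdict: edits above some probability threshold get surfaced to a human patroller for review, which is how a tool like this can direct “human attention where it’s most needed” instead of replacing it.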

Written by Randy McDonald

December 2, 2015 at 3:49 pm