
Posted by aaron at 12:01 PM, Monday, November 1st, 2004

Wikipedia and the Myth of Objectivity

I was poking around on Wikipedia this afternoon, and happened to look at a page I'd edited a couple of months back. Since then, someone had edited it again, removing many of my additions and adding new ones containing factual and grammatical errors. That prompted me to think about the nature of distributed information systems like this one.

It's a tough nut. How do you set up environments like this in ways that minimize errors and the 'net-centric tendency for people to actively refute anything with which they personally disagree? The latter is bad enough on discussion boards, where two strong personalities are apt to hijack a discussion and make it all about their pet beliefs. On Wikipedia, the strong personalities can go through and, rather than merely posting a contrary opinion, remove any content that offends them. Wikipedia does keep a change list so that you can go back and see earlier revisions, but I don't think that really addresses the problem, particularly when you have to dig through an unsophisticated interface to find old revisions.
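The raw material for catching this sort of thing does exist: spotting what a later edit removed is just a diff between two stored revisions. Here's a minimal sketch of the kind of "what happened to my edit?" view I have in mind -- in Python, with invented revision texts, and no claim that this is how Wikipedia's own software works internally:

    import difflib

    # Two invented revisions of the same entry.
    old_revision = [
        "The company was founded in 1998.",
        "It sells its products directly to customers.",
    ]
    new_revision = [
        "The company was founded in 1999.",
        "It sell's it's products directly to customers.",
    ]

    # unified_diff marks removed lines with '-' and added lines with '+',
    # which is exactly what a reverted contributor needs to see at a glance.
    diff = difflib.unified_diff(
        old_revision,
        new_revision,
        fromfile="my revision",
        tofile="current revision",
        lineterm="",
    )
    for line in diff:
        print(line)

None of that is hard; the hard part is putting it in front of readers and editors so that reverted work stays visible, instead of burying it behind a history page.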

I think the problem needs to be addressed from the user side, but I'm not sure how that works. People should always be more considerate; that's a truism. One person's idea of inconsequential, irrelevant fluff is another's idea of Truth, and when the former person wipes out the latter's changes, bad feelings fester. It's terribly difficult to change user behavior by changing the natures of the people who use Wikipedia; perhaps it's easier to change it by changing the technology itself.

So how can you mandate respect and consideration via technology? I think technology can foster attitudes, but it can't enforce them; trying to enforce behavior or grammar through technology is probably counterproductive. What if Wikipedia offered a space for multiple people to provide articles for a given entry? What if it allowed the authors to set whether other people were allowed to edit their content -- or whether anonymous users were allowed to? Would this sort of granularity of control be an improvement, or would it just be irritating? Or perhaps both?
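To make that concrete, here's a rough sketch of the data model I'm imagining -- hypothetical names throughout, and emphatically not how Wikipedia's software actually works -- in which one entry holds several signed articles, each carrying its own edit policy:

    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class Article:
        # One author's signed take on a topic, with its own edit policy.
        author: str
        text: str
        others_may_edit: bool = True       # may other registered users edit?
        anonymous_may_edit: bool = False   # may anonymous users edit?

        def can_edit(self, user: Optional[str]) -> bool:
            if user == self.author:
                return True                # authors always keep their own text
            if user is None:               # anonymous reader
                return self.others_may_edit and self.anonymous_may_edit
            return self.others_may_edit

    @dataclass
    class Entry:
        # A single topic holding multiple, possibly conflicting, articles.
        topic: str
        articles: List[Article] = field(default_factory=list)

    # Two differing takes living side by side under one entry.
    entry = Entry("Objectivity")
    entry.articles.append(Article("aaron", "Objectivity is largely a myth...",
                                  others_may_edit=False))
    entry.articles.append(Article("visitor", "Objectivity is attainable...",
                                  anonymous_may_edit=True))

    for a in entry.articles:
        print(a.author, "- anonymous edits allowed?", a.can_edit(None))

Whether that kind of knob-twiddling would improve things or just multiply the irritation is exactly the open question -- but it's technically cheap to represent, so the obstacle isn't the code.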

The problem isn't just one of respect, or of Wikipedia's setup not being conducive to presenting multiple, often-conflicting viewpoints. The problem also involves the technical aspects of writing. There are people on Wikipedia who cannot (or, at least, do not) write with good grammar. There are people who confuse "its" and "it's" all over the internet, and it's offensive to those of us who know how to use possessives to see someone "correct" a phrase like "This company sells its products to customers..." by changing it to "This company sell's it's products..." How do you deal with editorial issues like that when you don't have editors who exercise control over the final version? How do you create a single "voice" for style and editing in an encyclopedia presenting a diaspora of voices in a single entry? And should you even try to pretend that this diaspora is a single voice, as I think Wikipedia does?

One of the nice things about the New Yorker is that the magazine is carefully edited for grammar, and for consistency in things like comma usage and spelling. In these days when even the New York Times can't seem to get "it's" and "its" right, that degree of control and care is downright old-school. And it says something to me about the care the people who create the magazine take with the rest of the content.

Should I take a wiki entry seriously if I see two grammatical errors in the first sentence? How about if I then see two new edits that introduce statements containing factual errors, errors the most trivial of verification checks would have caught? If I read some diatribe about computer usability on the web, and, in the first paragraph, the author makes two mistakes about when certain GUI features were introduced -- while proclaiming that this is the very expertise from which he makes his living -- I stop reading. I can't be bothered to give a damn about his "well-reasoned" opinions if a lot of the evidence he bases them on is made up out of his own ignorance. One of the principal reasons I buy particular magazines is that reading them has led me to believe I can trust their editorial departments to check the facts that they can check, and to fix the errors that they can fix. There's accountability, and they take pride in their work. That saves me from having to take everything I read with a grain of salt until I can verify the factual statements. The traditional media have been touting this reputation -- this accountability and editorial control and enforced objectivity -- as their big advantage over blogs and other such pesky things you read on the internet. They'd have a point if, for the most part, they had any such reputation, or objectivity, in the first place. Obviously, papers like the Times don't.

That is not to say there's good reason to trust anonymous essays you find on the net. I think the anonymity is part of what I don't like about Wikipedia. For a long time, through my early education and all through high school, I was taught that effective essays should be dispassionate and objective, built from a predetermined structure, their style attaining an anonymity such that any two excellent essayists might write exactly the same essay. It wasn't until I had a class with a great English professor at Bowdoin that I realized what a crock that idea was. Postmodernist feminist theory teaches us that objectivity is largely a myth, and that any viewpoint someone expresses is inherently bound up with his background and situation, whether he's intentionally pushing an agenda or not. Aiming for inoffensive anonymity of style merely perpetuates that myth: "This is the One Truth. How could anyone disagree?" Certainly, facts (or lies presented as such) should be verified; beyond that, couching your personal viewpoint as that of "everyone," or of "every reasonable person," might be a good definition of propaganda. The exclusion and mockery of opposing viewpoints in the run-up to the Iraq war was quite effective, and Democrats in Congress fell over themselves to be "reasonable" and "patriotic" in accepting the President's (lying) words at face value. "Everyone agreed," after all, that Saddam had a nuclear weapons program. Even though "everyone" was really only the people who stood to gain financially from a war, anybody who disagreed about WMD or the "need" for war was easily, and effectively, painted as a nobody. Suddenly there was powerful isopraxis at work, and people who, out of integrity and honesty, demonstrated against a war based on lies -- along with anyone who bought French wine or cheese -- were The Enemy. The myth of media objectivity failed the public interest spectacularly, at the hands of a few powerful cynics who expertly painted their viewpoint (which was based on lies) as everybody's -- or at least every patriot's -- in order to alienate dissenters and weaken their influence in public debate.

We learn that society thinks it's more important to be in consensus -- even if you're wrong -- than to be correct, if such correctness comes at the expense of being personal rather than acceptably bland. And Wikipedia's collective editing, in my experience, enforces exactly that: impersonal, generic blandness at the expense of correctness.

In light of all that, I don't think it makes me a cynic if I view with some suspicion anything presented as the single correct, authoritative summary of a complex (and non-scientific) issue. But if I read an essay where the author is not afraid to say "I," to present her experiences and rationales as personal, as based on one person's incontrovertible experiences and sensations rather than universal truths, I'm much more impressed. I can't argue that Author X didn't feel this way because of Experience Y, even though I can still verify whether Experience Y actually happened.

Sadly, the collective editing that the Wikipedia framework embraces has a tendency to erase the personal. Opinions get run over and replaced with "facts." So who's to say that, during the next run-up to a war fueled by lies, the wiki entries won't fan the flames of xenophobia and fear just as the Times and the rest of the corporate media did -- presenting the experiences of every author with an innate (but perhaps unconscious) agenda as the one true version of truth, rather than allowing different authors to present their own personal (and possibly conflicting) versions of the truth within the same wiki entry and framework? That may sound paranoid. But in this era of bad education; of computer programmers who are supposed to know the grammar of their programming languages but don't bother to learn the grammar of their spoken and written language; of Orwellian Newspeak like "Freedom isn't free," when black is white and lies are truth and Iraq had a nuclear weapons program; when language itself is being destroyed by the neocon agenda in order to steal from the People their most powerful weapon, that of oration and meaningful public discourse -- well, I think it's important to look at how technology shapes our societal dialog, and our storage of information, of fact and opinion each disguised as the other.

Technologists too rarely consider the impacts of the technologies they help create and shape. The physicists who helped the United States build the atom bomb were caught up in the race to beat the Germans, and were discouraged from stopping to think things out -- although a few, like Einstein and Oppenheimer, later had the courage to admit their mistake. The programmers who helped amass and sift data so that Gillette could send me a free razor blade on my 18th birthday either didn't think about the privacy implications of their work, or didn't care. I'm not even sure that the coders who made the first blogging frameworks thought much about the effects they might have -- the democratization of the notion of "media," or the easy and cheap dissemination of memes, ideas, and facts that blogs would support. Many of the early bloggers -- the users -- absolutely did, though, just as did many of the geeks who made the earliest websites in the 90s. From what I've read, the people who invented Wikipedia have thought a lot about how their technology would be used, and what sorts of things it could facilitate. There are still problems, though. We've seen that monolithic entities like the Times are willing to subvert the truth to drive the agenda of their corporate sponsors and upper-tax-bracket management. Certainly, many of the individual contributors to Wikipedia do the same, intentionally or not. I don't mean that as an insult; it's just human nature. And while the Wikipedia framework aims to address the agenda issue by allowing anyone to edit any entry at any time, it attempts to address the problem of accuracy simply by having enough people looking at each article to correct any errors as they get posted, so that you, the viewer, presumably have a good chance of seeing a "correct" article. But that's always going to be a compromise, and while the notion of involving the consumer of wiki information as a contributor, too, is a compelling one, the system itself limits the extent of the dialog it might foster by presenting each article as the One Definitive Truth about a given subject.

Technologists -- and programmers particularly, if my interview experiences are at all representative -- are apt to fall for the lie of the One Truth. They're apt to buy into Java architectural fads as if they're the second coming, and to label any disbelievers as heretics unfit to touch code. They're just like my second-rate high school history teachers in that. I bet some of the wiki people think there's One Right Way to write any given Wikipedia entry, just as there was One Right Way to write the high school essay on JFK funding the moon landings or on carpetbaggers during Reconstruction, just as there's One Right Way to build a Java enterprise website. Just as there's One Right Sex of people to sleep with if you're a man, just as there was One Right Position to take on WMD in 2003. Some of us, however, are repulsed by such absolutism, because we've been on the pointy end of it, and still bear the wounds. Larry Wall was fond of saying "there's more than one way to do it," just as any good postmodernist might say "there's more than one way to see it," or "there's more than one way to explain it."

If the point of your framework is to achieve accuracy by consensus, you need the culture generating the content to be geared towards consensus, and populated by individuals largely capable of assessing factual accuracy and writing accurately. I think the cultural divisiveness prevalent in the States now, the Judy Miller fiasco about the pre-war WMD "consensus," and Wikipedia's problems with grammar illustrate, respectively, that we fail to meet each of those requirements. Wikipedia lacks accuracy on specific pages, and instead hews to the conventional wisdom prevalent in the population doing the editing. Minority opinions and voices are quashed, and along the way, correct facts and good grammar are replaced with refutable errors, all for the sake of inoffensive isopraxis, because the framework itself is based on a fictional notion of One Objective Truth. So why not change the framework to reflect the multiplicity of thought in our culture, in a way geared towards how our culture actually functions, rather than towards an early-20th-century myth of objectivity?