Who's Keeping You in Line Online?
Before you post that zinger, meet the people who might take it down.
Oct. 14, 2008 -- For every 10 user comments posted to ABCNews.com, two or three of them get banished from the site. They're too profane or too bigoted or just too much to bear, even by the wide-open standards of Internet conversation.
The job of tempering the discussion falls to people like Deirdre, a 41-year-old online moderator who asked that her last name be withheld to protect her privacy.
During an election season, for instance, the volume of comments escalates and, with it, the potential for incendiary speech, said Deirdre, who works for Mzinga, a Burlington, Mass.-based social media and community management firm.
Offensive posts often include nasty twists on the names of politicians and political parties, she said. "Obambi" and "McSane" are harmless enough, but "Democrap" and "Congoleeza Rice," for Secretary of State Condoleezza Rice, are examples of name-calling that goes too far.
As companies embrace social media and launch online forums that allow customers to share opinions about their products and services, they're realizing that simply creating Web sites is insufficient.
The savviest managers are paying close attention to what transpires on those sites as well, because they understand that user comments have the potential to influence other consumers and, sometimes, even investors.
Becoming the 'Party Giver'
Observing a shift in the pervasiveness of public commentary and conversation, Barry Libert launched a predecessor to Mzinga in 2001.
His founding thesis? "Become the party giver," he said. "Stop being the dance hall."
What makes a party memorable, the Mzinga board chairman said, isn't the location of the get-together. Instead, it's the food, music and, most of all, the people.
Similarly, a Web forum on its own isn't going to attract and retain users. The "hosts," or moderators, are the ones who have the power to ensure a positive online experience.
Mzinga's clientele includes big-name companies such as American Express, Cadbury Schweppes, The New York Times Digital and ABC. Its business has doubled in the past year.
Mike Pascucci, Mzinga's director of moderation services, said that 50 full-time and about 30 part-time moderators in the United States and overseas manage about 14,000 communities for Mzinga's dozens of clients.
Once a company decides to work with Mzinga, it creates a set of guidelines to govern its online community. Larger clients tend to be more set in their ways, while smaller ones are more flexible about the kinds of content they allow, he said.
But as the sites evolve, so, too, do the rules and the roles of the moderators.
The Making of the Moderator
"Most companies are just educating themselves about what that role needs," Pascucci said.
When moderators are first assigned to a new client, they undergo preliminary training.
But the key to successful moderation is ongoing communication and guidance, he said. Moderators stay in regular touch with their clients as they learn to identify the line that separates the acceptable from the atrocious.
For some clients, moderators seed content and interact with community members. For example, if new users are shy about submitting personal stories on a parenting site, the moderator will get the conversation started with stories of her own.
But for other communities, moderators review content and remove posts that breach the site's user guidelines. Racist or sexist speech, for example, could get a comment unpublished. Posts that directly target and insult other users could also be rejected.
Most companies also remove unattributed content that could prompt copyright infringement suits.
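None of the firms in this story describe their software, but the workflow Pascucci outlines -- software flags, a person decides -- is simple enough to sketch. The Python below is a minimal, hypothetical pre-screen; the rule names and word lists are invented for illustration, and any real client would maintain far richer, site-specific guidelines.

```python
import re

# Invented rule categories, loosely echoing the guidelines described above;
# a real moderation firm would keep richer, client-specific lists.
RULES = {
    "slur_or_hate": re.compile(r"\b(nazi|scum)\b", re.IGNORECASE),
    "targeted_insult": re.compile(r"\byou('re| are)?\s+(an?\s+)?(idiot|moron)\b",
                                  re.IGNORECASE),
}

def prescreen(post):
    """Return the rule categories a post trips, for a human moderator to review."""
    return [name for name, pattern in RULES.items() if pattern.search(post)]

for comment in ["Great point about the debate!", "You are an idiot."]:
    flags = prescreen(comment)
    print(comment, "->", ", ".join(flags) if flags else "pass")
```

A filter like this only queues posts for review; as the moderators quoted here make clear, the judgment call about context stays with a person.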
Pascucci said Mzinga moderators span a wide age range and include college students and parents with full-time day jobs who want to supplement their income.
Other companies, however, such as the boutique social media firm eModeration, say they prefer older moderators with more life experience under their belts.
Tamara Littleton, CEO of the London-based firm, said that a high proportion of her company's moderators are women who want to juggle raising a family with flexible work. But, increasingly, she said, more men have joined the fold.
Although her company is headquartered in London, 60 percent of its business is in the United States. Littleton said the industry pays moderators anywhere from $8 to $19 an hour, adding that more specialized firms like hers tend to pay on the higher end of the range. Because the field is emerging, it's difficult to quantify the size of the industry.
They Don't Spend the Day in PJs
The moderation business is a billion-dollar industry, Littleton said. But because it's behind-the-scenes work that people do from home, misperceptions persist that it isn't "proper work."
Bill Keller, 48, who moderates for the Emeryville, Calif.-based Lithium from his home outside Kansas City, Mo., agrees.
"When I said I have a job online working as a moderator, [my father and brother] both rolled their eyes," he said. "They said, 'Working online isn't really a job, working from home isn't a job.' I had to convince them it was making money."
Christina Mattoni, 40, a moderator for LiveWorld, a San Jose, Calif.-based social network marketing agency, said that despite relentless myths to the contrary, moderators aren't "paid to surf the Internet and sit in their pajamas and drink coffee all day."
That they're solitary folks who shun social interaction is another popular misperception.
"I do interact with quite a few people on a regular basis through chat and e-mail," Keller told ABCNews.com. "It's not like we don't hear another human being's voice."
In fact, moderation companies say that in hiring new recruits, they look for solid backgrounds in communications and online communities and favor effective team players.
The 'Toxic Poster'
Moderation professionals emphasize that most of the people who use online message boards, and submit user comments, do so with respect for other users and the owners of the sites. But, they say, just one "toxic poster" can derail a conversation and kill the mood for a whole community.
Sherry Wilcox, 39, who works for eModeration from Jacksonville, Fla., said that after more than 15 years in the business, she can predict the direction of a conversation from a single foul post.
"When you're moderating a site, you know that a discussion is going to go in one direction, and it's not going to be good," she said. "But if you remove that one single post, the whole thing might go back on track."
Although Mzinga's Deirdre said she removes, on average, 20 to 30 percent of the posts she reads on ABCNews.com, other companies, such as Lithium, say that removing even 5 percent of a site's content is high for them.
While Web sites differ in their tolerance of unsavory speech, you can be almost certain that the unoriginal perennials -- "scum," "idiot" and "Nazi" -- are near the top of the "do not post" list.
But offensive speech is not a stationary target.
As society's sensitivity to certain words diminishes, those words lose their blacklist status.
Five years ago, Deirdre said, calling someone a terrorist was new to our vernacular. Because it was an unfamiliar insult, moderators would immediately remove posts that contained it.
"But it's like now you can call any Tom, Dick or Harry a terrorist," she said. "So if I unpublish every post where someone called someone a terrorist, it'd be an empty message board."
Moderators say that tensions tend to run higher on political and issue-oriented sites, but they also point out that fires can flare on just about any site.
Jonathan Wishart, 21, a full-time student in Tempe, Ariz., is an avid gamer who followed his passion onto the Internet. Despite a full course load, he spends 40 hours a week monitoring a gaming site owned by a Lithium client.
He said that on his site he has seen members duke it out over seemingly innocuous topics, like gaming consoles and choosing teams for sports video games.
"It's a competition. You're trying to show that you know more than others," he said. "It could be two people who are huge fans of Madden. They both love the game, but one person thinks the Cardinals are better and the other, the Patriots. It can be something as simple as that."
The 'Disinhibition Effect'
Deirdre is a veteran of the online moderation business; she started moderating for America Online in 1996 as a volunteer. Still, she said, malicious posts, and the negativity they reflect, sadden her.
"If you look at the online community as sort of a metapicture of the bigger world, people's willingness and readiness and speed with which they devolve into name-calling ... would never, ever happen in the real life," she said.
The most upsetting part, she continued, is the implication: if people are bold enough to hurl hateful speech on the Internet under the veil of anonymity, they must also think those thoughts offline and simply never articulate them.
John Grohol, a clinical psychologist and founder of the online mental health resource Psych Central, explained Deirdre's observation.
"Psychology has identified what we call the "disinhibition effect" on online behavior," he said, adding that when people communicate through message boards, they do often say things that they wouldn't dare utter in face-to-face interactions.
"We tend to lose sight of the fact that behind those screen names there are people just like us," he said. Because we only see the words -- but know nothing about the person offering them -- we tend to slip more quickly into an emotional and argumentative mode.
That passion can often lead to a desire to proselytize, he said, which presents moderators with a tricky task: encouraging people to express their opinions without letting the conversations become flame-fests.
But assuming we don't let online forums deteriorate into verbal cesspools, some sociologists suggest that we actually need the culture clash.
According to Keith Hampton, a sociologist at the Annenberg School for Communication at the University of Pennsylvania, a growing body of research shows that our social circles have become smaller in the past 20 years.
This loss of exposure to diverse opinions probably has some impact on our ability to deal with others, he said.
"It's opportunities like [online message boards] that expose us to these very strong opinions," he said. "Even if they are very strong, at least over time it helps us recognize that there are other people with other points of view and gives us strategies for coping with very strong opinions."