Although Mzinga's Deirdre said that, on average, she removes 20 to 30 percent of the posts she reads on ABCNews.com, other companies, such as Lithium, said that removing 5 percent of the content on a given site is high for them.
While Web sites differ in their tolerance of unsavory speech, you can be almost certain that the unoriginal perennials -- "scum," "idiot" and "Nazi" -- are near the top of the "do not post" list.
But offensive speech is not a stationary target.
As society's sensitivity to certain words diminishes, those words lose their blacklist status.
Five years ago, Deirdre said, calling someone a terrorist was new to our vernacular. Because it was an unfamiliar insult, moderators would immediately remove posts that contained it.
"But it's like now you can call any Tom, Dick or Harry a terrorist," she said. "So if I unpublish every post where someone called someone a terrorist, it'd be an empty message board."
Moderators say that tensions tend to run higher on political and issue-oriented sites, but they also point out that fires can flare on just about any site.
Jonathan Wishart, 21, a full-time student in Tempe, Ariz., is an avid gamer who followed his passion onto the Internet. Despite his full course load, for 40 hours a week, he monitors a gaming site owned by a Lithium client.
He said that on his site he has seen members duke it out over seemingly innocuous topics, like gaming consoles and choosing teams for sports video games.
"It's a competition. You're trying to show that you know more than others," he said. "It could be two people who are huge fans of Madden. They both love the game, but one person thinks the Cardinals are better and the other, the Patriots. It can be something as simple as that."
Deirdre is a veteran of the online moderation business; she started moderating for America Online in 1996 as a volunteer. Even so, she said, malicious posts, and the negativity they reflect, still sadden her.
"If you look at the online community as sort of a metapicture of the bigger world, people's willingness and readiness and speed with which they devolve into name-calling ... would never, ever happen in the real life," she said.
The most upsetting part, she continued, is that if people are bold enough to hurl hateful speech on the Internet under the veil of anonymity, they must harbor those same thoughts offline and simply never articulate them.
John Grohol, a clinical psychologist and founder of the online mental health resource Psych Central, explained Deirdre's observation.
"Psychology has identified what we call the 'disinhibition effect' in online behavior," he said, adding that when people communicate through message boards, they often say things they wouldn't dare utter in face-to-face interactions.
"We tend to lose sight of the fact that behind those screen names there are people just like us," he said. Because we only see the words -- but know nothing about the person offering them -- we tend to slip more quickly into an emotional and argumentative mode.
That passion can often lead to a desire to proselytize, he said, which presents moderators with a tricky task: encouraging people to express their opinions without letting the conversations become flame-fests.
But assuming we don't let online forums deteriorate into verbal cesspools, some sociologists suggest that we actually need the culture clash.
According to Keith Hampton, a sociologist at the Annenberg School for Communication at the University of Pennsylvania, a growing body of research shows that our social circles have become smaller in the past 20 years.
This loss of exposure to diverse opinions probably has some impact on our ability to deal with others, he said.
"It's opportunities like [online message boards] that expose us to these very strong opinions," he said. "Even if they are very strong, at least over time it helps us recognize that there are other people with other points of view and gives us strategies for coping with very strong opinions."