Transcript for Facebook cracks down on hate speech
One group that is aware of what the Russians did in 2016 is the company Facebook. They announced overnight that they are taking new steps to crack down on the misuse of their platform, both by Russia and by other extremist groups, in a statement saying: "We don't allow hate groups to maintain a presence on Facebook. We consider a number of different signals, among them organizations and their leaders that have called for or directly carried out violence against people based on race, religious affiliation, nationality, ethnicity, gender, sex, sexual orientation, serious disease or disability." Of course, that statement comes as they formally kicked off the platform Alex Jones of Infowars and Louis Farrakhan, the Nation of Islam leader, among others. Quite a controversial move. What does this mean for the presidential campaign coming up, and how did they decide what actually is too extreme? We're joined now by two experts on both of those things: John Cohen, ABC News national security contributor, who is with us on the phone, and Paul Barrett, the deputy director of the NYU Stern Center for Business and Human Rights, is standing by as well. But John, let's start with you. What do you make of Facebook's effort today to actually kick off people they deem to be extremists?

Well, on the one hand, Devin, it illustrates that Facebook recognizes that there's a real problem and that their platform, along with the other social media platforms, is part of the problem. Social media platforms, not just Facebook, but YouTube and Instagram, have really become the platforms of choice for terrorist groups, extremist organizations, other purveyors of hate, even foreign intelligence organizations, to spread disinformation or to spread hateful rhetoric directly for the purposes of inciting violence. But at the same time, and I've had conversations with law enforcement and Homeland Security officials across the country on this issue,
they are a little skeptical that these companies are the right entities to police their networks. In that sense, we're asking a private company to be the arbiter of free speech, to make decisions on whether something that is being posted on the platform is incendiary, or whether it is part of a pattern of behavior that exhibits, you know, a potential for violence. We're also asking them to do something that very well may be inconsistent with their underlying business interest, and that raises concerns about how committed they will be to it. So there are definitely things that we have to do as it relates to social media usage by individuals who are trying to incite violence or who are about to conduct an attack, but I think there's real skepticism that having these companies police their own networks is the right approach.

And Paul Barrett, some of the criticism that Facebook has encountered, both from the individuals that were booted and from others, particularly conservatives, is that the definition here is very squishy. What do you make of where Facebook has drawn the line for what is and isn't allowed and what constitutes an extremist?

Well, I mean, I think Facebook looks at things along the lines of what was just mentioned: people who harass others in a systematic way, engage in hate speech, and spread misinformation, sort of brutally untrue facts. And I guess I disagree with your colleague, because I don't know that there's really an alternative to Facebook, to some degree, policing its own platform, because we certainly can't, under the First Amendment, have the government step in. So the only other alternative, if Facebook's hands are tied, is that there's a total free-for-all at all times, and that just doesn't seem to be the best option.

And who should hold them to account, Paul? You know, there's a lot of talk here in this town, in Washington, about imposing regulation on Facebook.
But again, in today's examples of individuals kicked off the platform, we're hearing from some of the Trump people over in the White House complaining again about the ham-handedness with which this is allegedly done, that algorithms are identifying people as extremists rather than a proportional human police force, if you will. So how do you grapple with that if you're a company like Facebook?

Well, I think they have to move very carefully. I don't think they needed an algorithm to identify Alex Jones, who has disseminated all sorts of hoaxes and conspiracy theories and allegations, such as the notion that school shootings are false events where children are trained as actors to pretend that they'd been shot. So, you know, again, I think inevitably Facebook, a private organization that owns and operates an Internet platform, has to regulate content; having the government do it is much, much less desirable, if not unconstitutional.

John Cohen, how do you respond to that? What role is there for anyone outside of Facebook itself here?

Yeah, I actually disagree with what was just said. Facebook's actions are probably going to be challenged in court. It's their platform, and while they are a private company that can regulate what's on their platform, they're going to run into the same challenges about free speech that a government entity would. Where I do agree is that government should not be regulating speech, and law enforcement in particular shouldn't be in the position of regulating what material is on those platforms. What government should be doing is incorporating social media behavior into broader efforts to identify behaviors that may be associated with violent activity, and then using that information as part of a broader effort to identify high-risk individuals and take steps to stop violent attacks. So trying to regulate the speech is going to be much harder than looking at
this speech as part of a broad range of behaviors that are associated with violent activity and taking steps to stop violent attacks.

Interesting debate here on the role of government and the business obligations of Facebook. Before we let you go, Paul, I want to just go back to this definitional issue again, because so much of the criticism today in the blogosphere and on social media about these moves is about what the definition of extremism is. Obviously, as you said, some of the examples are fairly obvious, but does Facebook need to do more to spell out what is and isn't appropriate, and doesn't that get them into trouble?

I would agree with you that the more Facebook does to publicly explain the principles that lie behind what they're doing, the general principles that they're applying in these cases, the better. So I'm sure that they could do a better job. But, you know, I think they're well within their rights, and in fact I would say they have an obligation to demote and/or remove material that's patently false and that is dangerous.

And final thoughts to you, John Cohen.

I think the key is that it's not for the federal government to police this. In fact, in this country, extreme thoughts and even extreme speech are protected by the Constitution. What law enforcement, counterterrorism authorities and other government agencies are concerned about is identifying people who may be prepared to use those thoughts as motivation for violence. And that's where police and other law enforcement officials would like to work more closely with companies like Facebook and YouTube and Google, so that they can be more effective in identifying behaviors exhibited on the digital street that, when looked at broadly, give us great insight into the people who are preparing to conduct attacks.
Fascinating conversation. John Cohen joining us by phone, ABC News contributor, and great to have Paul Barrett with us, the deputy director of the NYU Stern Center for Business and Human Rights. Thank you both very much.
This transcript has been automatically generated and may not be 100% accurate.