Bloomberg Radio, President Joe Biden, Santa Clara University School Of Law discussed on Bloomberg Law
Automatic transcript
This is Bloomberg Law with June Grosso from Bloomberg Radio.

It's been the subject of controversy for years: Section 230 of the Communications Decency Act, a legal shield for social media platforms, and Congress has been debating whether it should be reformed or revoked. Repealing the law may be the one thing that President Joe Biden and former President Donald Trump agree on, though of course not for the same reasons.

"We must hold social media platforms accountable for the national experiment they're conducting on our children for profit."

"If big tech persists, in coordination with the mainstream media, we must immediately strip them of their Section 230 protection."

There's been no action in Congress in the face of partisan differences, and now the Supreme Court has decided to step into the middle of this politically fraught debate over whether some of the world's most powerful tech companies should continue to be protected or should be held accountable for third-party content. My guest is Eric Goldman, a professor at Santa Clara University School of Law and co-director of the school's High Tech Law Institute.

So, Eric, tell us about Section 230.

Section 230 says that websites aren't liable for third-party content. It's a really simple premise. The idea is that people who post content take responsibility for it, but the services that they use to post that content don't.

Both cases the Court is going to consider involve terrorist attacks abroad, one in Paris in 2015 and another in Istanbul in 2017. Tell us about the plaintiffs' arguments against Google and Twitter.

Well, really, both of them involve pretty much the same set of facts. They involve terrorist attacks abroad that were allegedly related to social media.
And the relationship can vary based on the facts, but the general gist is that the terrorist organizations recruited and radicalized readers online, and because of that, the plaintiffs argue, the services should now take responsibility for the actions done by these terrorist organizations or the people they radicalized.

So are the plaintiffs in these cases complaining about the algorithm-generated recommendations?

Well, that's part of it. You could look at it a little bit more broadly. I think the starting premise