Facebook groups, discussed on Marketplace Tech with Molly Wood
Facebook announced last month that it will stop recommending political and civic groups to its users. The company says users want less politics in their feeds, and has said it didn't realize how much its groups were going to spread medical misinformation, be used to radicalize people, and be one of the home bases for the people who planned the Capitol insurrection on January 6th. But this week The Wall Street Journal reported that the company has had internal research for months, if not years, about private groups being toxic, full of calls for violence, and still being recommended to Facebook users. Renée DiResta, a research manager at the Stanford Internet Observatory, says the shift to groups creates a long-standing cycle of radicalization.

What was happening, beginning in 2016, was you were starting to see these very conspiratorial communities that were taking shape, and they were being recommended to people who had an interest in other conspiratorial communities. And when you started to have this, it was like a correlation matrix. We're saying, oh, you're interested in this wild theory? Well, here, try this one on. Again, that engagement: highly engaged communities, wildly sensational content, high-volume groups and posts really came to, in some ways, become a much more significant part of the experience for people who went and participated in those communities.

Is it credible to you for Facebook to say, we couldn't have anticipated how conspiratorial these groups were going to get?
No, no, not at all, and that's because there was a Wall Street Journal article that came out last year that said that what the platform's own internal research had shown back in 2016 was that they were realizing that 64 percent of people who joined some of what they called more extreme groups were doing it because of prompts from the recommendation engine. For researchers such as myself, who were seeing it from the outside, we have had a very kind of anecdotal sense of the problem. Like, right now, if you were to go to Instagram and follow Robert F. Kennedy Jr.'s account, you'll see a whole range of recommended accounts that will be suggested to you that are mostly coronavirus-denial accounts. Now, that's the thing: I can see it small-scale, but I really don't know if that's a systemic problem or a problem that's anecdotal.

How hopeful are you about this move to stop recommending political and civic groups? Like, you know, how big a deal do you think it could be to untangle the recommendations from the existence of the groups themselves?

I'm not sure that a blanket ban on topics is the way to go about doing this. And Facebook has encountered some challenges with the definition of "political" on other product fronts, like ads, for example, where in order to run advertisements that were related to political issues, you had to go and get yourself verified. Now, I think that that's a reasonable amount of friction, but the question then became: what is a political issue? I think that there are plenty of political groups that do stay within the realm of healthy behaviors, right? They're not being used for organizing violence. The decision to remove them from recommendations just means that people will have to go and kind of proactively search for them. And I think it'll be interesting to see what impact that has on their growth. I don't think it's a silver bullet, though.
And then how about the announcement that Facebook will require moderators to spend more time reviewing member posts? Like, is moderation a better solution?

There's some really interesting evidence from Reddit that suggests that the answer to that is yes. Reddit really worked to empower moderation at the local level, giving tools to moderators at the subreddit level to make determinations about what norms and values and standards were appropriate for their communities. And so I think improving moderation tools, and then also, you know, putting the onus on people, so that if they create these groups, they don't just kind of, you know, let them go haywire and then say, oh, I just didn't know. If you're choosing to form a community, it gives us perhaps more of a sense going forward that that choice is something that we're going to be expected to take responsibility for.

Renée DiResta is a research manager at the Stanford Internet Observatory.