Social media platforms claim that content moderation will reduce harassment, misinformation and conspiracy theories. It will not.

If the U.S. wants to protect democracy and public health, it must acknowledge that internet platforms cause serious harm and accept that executives like Mark Zuckerberg are not sincere in their promises to do better. The "solutions" Facebook and the others have proposed will not work. They are meant to distract us.

The news of recent weeks has highlighted both the good and the harm of platforms like Facebook and Twitter. The good: graphic videos of police violence in several cities transformed public opinion on race, creating a possible opening to address a problem that has plagued the country since its founding, and peaceful demonstrators used social platforms to spread their message, outcompeting the minority who argued for violent tactics. The bad: waves of misinformation from politicians, police departments, Fox News and others denied the reality of police brutality and overstated the role of looters and radical groups such as antifa in the protests. Only a month earlier, critics had exposed the role of internet platforms in amplifying health misinformation during the COVID-19 pandemic. That misinformation convinced millions of people to treat face masks and social distancing as culture-war issues rather than as the public-health measures that would allow the economy to reopen safely.

The internet platforms have worked hard to minimize the perception of harm from their business. Confronted with evidence they cannot deny or deflect, their answer is always an apology and a promise to do better. In the case of Facebook, University of North Carolina scholar Zeynep Tüfekçi coined the phrase "Zuckerberg's 14-year apology tour." When pressed to offer a roadmap, tech CEOs exploit the opaque nature of their platforms to create an illusion of progress while minimizing the impact of any proposed solution on their business practices. Despite repeated revelations of harm, beginning with their role in undermining the integrity of the 2016 election, these platforms continue to succeed at framing the issues in a favorable light.

When pressed to reduce targeted harassment, misinformation and conspiracy theories, the platforms frame the solution in terms of content moderation, as if there were no other options. Yet despite several waves of investment in artificial intelligence and human moderators, no platform has successfully limited the harm from third-party content. When public pressure forces them to remove harmful material, the platforms address symptoms rather than causes, so old problems never quite go away even as new ones develop. Banning the conspiracy theorist Alex Jones, for example, removed him from the biggest platforms but did nothing to stop the flood of similar content from others.

The platforms meet every new challenge with a PR campaign, another promise, and sometimes an increase in investment in moderation. They have done this so often that I have lost track, and yet policymakers and journalists still largely let them get away with it. We must recognize that internet platforms are experts in human attention. They know how to distract us. They know we will eventually get bored and move on.

Despite ample evidence to the contrary, too many politicians and journalists behave as if the platforms will eventually curb targeted harassment, misinformation and conspiracy theories through content moderation. There are three reasons why that will not happen: scale, latency and intent.

These platforms are huge.
In the most recent quarter, Facebook reported that 1.7 billion people use its main platform every day, and roughly 2.3 billion use at least one of its four major platforms. The company does not disclose how many messages are posted each day, but the number is likely in the hundreds of millions, if not a billion or more, on Facebook alone. No investment in artificial intelligence and human moderators, however large, can prevent millions of harmful messages from getting through.

The second obstacle is latency, the time it takes for a moderator to identify and remove a harmful message. AI works quickly, but human review can take minutes or days, which means a large number of messages circulate for some time before they are finally removed. The harm is done in that window. It is tempting to think AI can solve everything, but it is a long way from being able to do so: AI systems are trained on data sets drawn from older systems, and they are still unable to interpret nuanced content such as incitement.

The last, and most important, obstacle to content moderation is intent. The sad truth is that the content we ask the internet platforms to remove is extremely valuable to them, and they do not want to remove it. As a result, the rules that guide AI and human moderators are designed to approve as much content as possible. Of the three obstacles to moderation, intent can only be addressed through regulation.

A permissive approach to content has two major benefits for the platforms: profit and power. The business model of internet platforms like Facebook, Instagram, YouTube and Twitter is based on advertising, the value of which depends on consumer attention. Where traditional media creates content for mass audiences, internet platforms optimize content for each individual user, using surveillance that enables exceptionally precise targeting. Advertisers are hooked on the precision and convenience the platforms offer; every year they shift a growing share of their spending to them, and the platforms reap enormous profits and wealth. Limiting the amplification of targeted harassment, misinformation and conspiracy theories would reduce engagement and revenue.

Power, in the form of political influence, is an essential element in the success of the largest internet platforms. They are ubiquitous, which makes them vulnerable to politics, so they align themselves tightly with the powerful to ensure success in every country, accommodating authoritarian governments, including those that violate human rights. Facebook, for example, enabled a government-sponsored genocide in Myanmar and enabled repression by the governments of Cambodia and the Philippines. In the U.S., Facebook and other platforms have ignored or changed their terms of service to allow Trump and his allies to use the platforms in ways that would otherwise be prohibited. When journalists exposed Trump campaign ads whose falsehoods violated Facebook's terms of service, Facebook changed its terms rather than pulling the ads. Facebook has also chosen not to follow Twitter in attaching a public-safety notice to a Trump post that promised violence in the event of looting.

With their enormous reach, the platforms play an important role in campaign fundraising and communications for candidates of both parties. The dollars involved are not meaningful to the platforms, but the power and influence they gain play a significant role in electoral politics. This is especially true for Facebook.

At present, the platforms bear no liability for the harms caused by their business model.
Their algorithms will continue to amplify harmful content until there is an economic incentive to do otherwise. The solution is for Congress to change the incentives by carving an exception out of the safe harbor of Section 230 of the Communications Decency Act for algorithmic amplification of harmful content, and by giving those harmed the right to litigate against the platforms. This approach does not infringe First Amendment rights: the platforms remain free to continue their existing business practices, but they would bear liability for the resulting harms.

Thanks to COVID-19 and the protest marches, consumers and policymakers are far more aware of the role internet platforms play in amplifying misinformation. For the first time in a generation there is bipartisan support in Congress for revising Section 230, and public support for regulation is growing. We do not need to accept misinformation as the cost of access to internet platforms. Harmful amplification is the result of business choices that can be changed. It is up to us and to our elected representatives to make that happen. The pandemic and the protests for social justice underscore the urgency of doing so.
Photo: Ilana Panich-Linsman / The Washington Post via Getty Images