Facebook Is Notifying Users Who Shared Coronavirus Misinformation. Could It Do the Same for Politics?

Hoping to stem the flow of false remedies and conspiracy theories about COVID-19, Facebook announced Thursday that it would begin informing users who liked, commented on or shared "harmful" coronavirus misinformation, and directing them toward a trusted source. Facebook hopes the move will dramatically reduce the online spread of false information about the coronavirus, a growing crisis the World Health Organization (WHO) has described as an "infodemic." "We want to connect people who may have interacted with harmful misinformation about the virus with the truth from authoritative sources, in case they see or hear these claims again," said Guy Rosen, Facebook's vice president for integrity, in a blog post published early Thursday. The new policy applies only to false claims about the coronavirus, but activists say the announcement could set the stage for a breakthrough in the fight against political disinformation. "Facebook applying this to the pandemic is a good first step, but it should also be applied to political disinformation, especially as the 2020 U.S. elections draw closer," says Fadi Quran, a campaign director at Avaaz, a global advocacy group that has been pushing Facebook to "correct the record" on inaccurate posts since 2018. (In the language of online security, "disinformation" refers to the coordinated, targeted dissemination of false information, while "misinformation" refers to accidental inaccuracies.) "I hope this will be expanded to other topics, and quickly," Quran tells TIME. Under Facebook's new policy, when a piece of "harmful" coronavirus-related misinformation is flagged by its fact-checkers and removed from the site, all users who interacted with the post will be shown a message in their news feed directing them to the WHO's list of debunked myths.
People who liked, commented on or shared Facebook posts claiming, for example, that drinking bleach can cure COVID-19, or that social distancing does not prevent the spread of the disease, will be among the first to see the new messages "in the coming weeks," Rosen said. Currently, users who merely read a piece of misinformation without interacting with it will not be shown corrections. And under the policy announced Thursday, Facebook's correction message is not tailored to the specific piece of misinformation a user saw; instead, users are shown a single blanket message reading "Help friends and family avoid false information about COVID-19," with a link to the WHO website. However, Facebook plans to test different versions of the messages, leaving open the possibility of more localized corrections targeting everyone who saw the misinformation. "We think it should be more precise," says Christopher Schott, another campaign director at Avaaz. "With the first step it has taken, Facebook is effectively admitting that giving people factual information is the right thing to do. It's time to go further, and we think they should." Going further would mean informing everyone who has seen the harmful information, not only those who interacted with it, as current plans provide. Schott says Facebook already has sophisticated tools, available to its advertising clients, that can tell how long a user has looked at a piece of content even if they never interact with it. Those tools, he says, could be repurposed from driving Facebook's advertising revenue to correcting disinformation. "If someone watches a video claiming that holding your breath for 10 seconds can tell you whether or not you have COVID-19 [a false claim circulating online], but doesn't like it, comment on it or share it, should that person not be informed? That just doesn't make sense to me," Schott said.
Although activists have welcomed the news, several obstacles could still prevent Facebook from issuing corrections for political falsehoods the way it now does for those about public health. The last four years have shown Facebook reluctant to crack down on political disinformation. After Russia tried to influence the 2016 U.S. elections by flooding social media with false news, Facebook clamped down on foreign interference. But it has made only limited attempts to tackle home-grown misinformation, wary in particular of First Amendment concerns about policing speech. Meanwhile, the company has been drawn into a partisan battle over the issue, with President Trump accusing Facebook (along with Twitter and YouTube) of anti-conservative bias. That dispute contributed to Facebook's announcement in September 2019 that it would not fact-check political ads during the 2020 elections, effectively giving political disinformation a free pass so long as it is not part of a foreign influence campaign. Amid political consensus that coronavirus misinformation is dangerous, however, Facebook has moved relatively fast to combat the spread of false information about COVID-19. As conspiracy theories falsely linking the disease to 5G telecommunications equipment circulated in February and March, and 5G masts were attacked, Facebook said it would remove posts inciting attacks on 5G infrastructure. While Avaaz activists acknowledge that overcoming political opposition will be a big job, they are optimistic now that Facebook has conceded that sending corrections is a viable strategy for combating false information. "In our discussions with Facebook and other social media platforms, they began by saying that correcting the record would be impossible," Quran said. "Then, when it was proved possible, the argument was that corrections were ineffective. And then, when we had the research to prove that corrections are effective, the argument became that it was politically controversial."
Facebook's announcement came after Avaaz presented the company with a study it had commissioned, showing that users are almost 50% less likely to believe misinformation on Facebook if they are subsequently shown a correction notification. In the study, a representative sample of 2,000 American adults was shown a replica version of a Facebook feed. One group was shown misinformation alone. A second group was shown misinformation and then a correction displayed prominently in the news feed. A third, control group was shown no misinformation. The group that saw the corrections was 49.4% less likely to believe the false information, the researchers, from George Washington University and Ohio State University, found. Moreover, Quran says, the study also showed that people exposed to both the disinformation and the corrections were better at separating fact from fiction than people who were shown neither. "This shows that correcting the record could make users more resistant to misinformation," Quran tells TIME. "Even people who don't see the false information." Facebook, for its part, now says its existing labels are working. According to Rosen's blog post, when users were shown warnings on over 400,000 pieces of content posted in March, 95% of the time they chose not to click through and read the flagged information. Notably, these were posts that Facebook did not deem immediately "harmful," meaning they were allowed to stay online. Under the policy announced Thursday, users who interacted with such posts would not receive a correction in their news feed. Avaaz activists say it is not necessarily a bad thing that false posts not considered "dangerous" remain online, but they believe Facebook should quickly expand its new correction policy so that users are informed about this type of misinformation, too. "The important point is that the content doesn't have to be removed, unless it has the potential to cause direct harm," Schott tells TIME.
"Just go back to people and say, 'Just so you know, on this particular post, fact-checkers found factual errors.' It gives people more information, and it's much better than a single authority saying, 'this false news has to come down.'" Please send any tips, leads and stories to [email protected].
Image courtesy of Facebook