Alphabet's YouTube said on Friday that the platform would stop removing content that spreads false claims related to the 2020 and earlier US presidential elections. The change to YouTube's elections misinformation policy takes effect immediately.
"In the current environment, we find that while removing this content does curb some misinformation, it could also have the unintended effect of curtailing political speech," YouTube said in a blog post. The platform also said the rest of its policies against hate speech, harassment, and incitement to violence would continue to apply to all user content, including elections. The proliferation of disinformation has raised questions about how social media platforms enforce their policies against misleading content about elections.
Other social media platforms like Twitter and Meta Platforms' Facebook have also seen a spike in disinformation related to elections.
In March, YouTube lifted restrictions on former US President Donald Trump's channel, following a suspension of more than two years after the deadly Capitol Hill riot on January 6, 2021.
"We carefully evaluated the continued risk of real-world violence, while balancing the chance for voters to hear equally from major national candidates in the run up to an election," YouTube said in a tweet, referring to the move.
The video-streaming platform banned Trump in 2021 for violating its policy against incitement to violence after his supporters stormed the US Capitol as Congress began to certify Joe Biden's victory in the presidential election.
In the same month, the US Federal Trade Commission (FTC) issued orders to eight social media and video-streaming firms, including Meta Platforms, Twitter, TikTok, and YouTube, seeking information on how the platforms screen for misleading advertisements.
© Thomson Reuters 2023