No, Zuckerberg Didn’t Admit “The Science Was Wrong About Covid”

On June 10, 2023, the editorial staff of Facta.news received a message via WhatsApp asking us to verify a post published on Facebook. The subject of our analysis is an article by La Verità, published the same day under the headline “Zuckerberg Apologizes for Censorship: ‘Science Was Wrong About Covid’”. The news also spread widely on TikTok.

According to the La Verità article, Zuckerberg declared that there were issues the community was discussing, “for example Covid: at the beginning of the pandemic there were real health implications but there was no time to study the breadth of scientific hypotheses that emerged.” According to Zuckerberg, the article continues, “a large part of the establishment” was, in a certain sense, “confused about many factual elements” and “asked for censorship of several stories which later proved to be at least dubious if not entirely true.” This, the article concludes, ultimately undermined citizens’ trust in institutions.

The article refers to the interview given on June 9, 2023 by Mark Zuckerberg, CEO of the US company Meta, which owns Facebook, Instagram and WhatsApp, to Lex Fridman, a research scientist at the Massachusetts Institute of Technology (MIT) and podcast host. For the sake of transparency, we specify that Facta.news receives funds from Meta as part of its third-party fact-checking program.

The content reported by La Verità, however, is misleading and spreads false news.

The interview in question lasts just over two hours and is available in its full English version on Fridman’s YouTube channel; a full transcript of the interview can be found here. The discussion on the future of Meta focused, particularly in the first part, on the role of artificial intelligence and on censorship.

In one of the many questions Fridman put to Zuckerberg, the host asked what standard Meta uses to evaluate what is harmful, what is misleading, and what is harmless. Zuckerberg said that on some issues everyone agrees the content should have no space on social networks because it is harmful, such as child sexual exploitation, terrorism and incitement to violence. The founder of Facebook then specified that, when it comes to misinformation, there is no consensus as clear as in the previous cases: “There are things that are obviously false, right? Or things that may be factual but are probably not harmful.” Precisely for this reason, Zuckerberg continued, it is difficult to blame someone who made a mistake but did no harm.

Zuckerberg then gave the example of disinformation about Covid-19: “In other cases, however, which relate to Covid content at the beginning of the pandemic, there were real health implications, but there was no time for a comprehensive examination of a series of scientific hypotheses.” In this case, Zuckerberg continued, “Unfortunately, I believe that many organizations wavered on some facts and requested the censorship of some content that, in hindsight, turned out to be debatable or true.”

As can be verified by listening to the interview (or reading its transcript), Zuckerberg never claimed that “the science was wrong about Covid.” In a broader reflection on how to deal with disinformation, the Facebook founder instead said that at the start of the pandemic, when information was scarce, it was difficult to tell which content was correct and which was instead disinformation.

Moreover, Zuckerberg did not apologize for what the article calls “censorship.” On this issue, Zuckerberg specified that we need to distinguish between content that poses a real safety issue and the question of how people prefer to see certain content, even misleading content, that is, whether or not it is labeled. Meta, in fact, recently introduced a tool that allows people to read fact-checking content: anyone who does not trust fact-checking, the founder of Facebook explained, can simply not read the fact-checking article attached to content identified as misinformation. In this case, the post is not blocked but remains on the social network with a notice that users can read or ignore. Instead, Zuckerberg concluded, “If the content violates some rules, like incitement to violence, or something similar, it’s not allowed.”
