A new report makes suggestions that its authors say could “eliminate the dangers posed to democracies by the chaos of information.”
One suggestion is that social networks should release details of their algorithms and core functions to vetted researchers, so that the technology can be independently examined.
The report also recommends building checks into online sharing to slow the spread of misinformation.
The report is published by the Forum on Information and Democracy, which was set up to make non-binding recommendations to 38 countries, including Australia, Canada, France, Germany, India, South Korea and the United Kingdom.
Cambridge Analytica whistleblower Christopher Wylie and early Facebook investor Roger McNamee, a long-time critic of social media, are among the contributors to the report.
Digital rights group the Electronic Frontier Foundation and freedom-of-expression organisation Article 19 were also consulted.
What does the report say?
One of the key recommendations is the creation of a statutory “building code” setting out mandatory safety and quality requirements for digital platforms.
Mr Wylie said: “If I want to make a kitchen appliance, I have to follow a host of compliance rules, but not if I want to build Facebook.”
He said social networks should be required to account for the potential harms caused by their design and engineering decisions.
The report also suggests that if an independent fact-checker determines a story is false, social networks should notify everyone who shared it.
Other suggestions are as follows:
Implement a “circuit breaker” that temporarily stops new viral content from spreading while it is checked.
Require social networks to disclose why particular content has been recommended to a user.
Restrict the use of micro-targeting in advertising messages.
Make it illegal to exclude people from content on the basis of race or religion, such as hiding a room-for-rent advert from non-white users.
Prohibit the use of so-called “dark patterns” - interfaces designed to confuse or frustrate users, for example by making it difficult to delete an account.
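The “circuit breaker” suggestion above can be illustrated with a small sketch. This is purely illustrative - the report does not specify an implementation, and the threshold, time window and review mechanism below are invented for the example.

```python
from collections import defaultdict, deque
import time

class ViralCircuitBreaker:
    """Illustrative sketch: pause further sharing of an item once its
    share rate crosses a threshold, until a review clears it.
    The threshold and window values are arbitrary examples."""

    def __init__(self, max_shares=1000, window_seconds=3600):
        self.max_shares = max_shares
        self.window = window_seconds
        self.share_times = defaultdict(deque)  # item_id -> share timestamps
        self.held_for_review = set()

    def record_share(self, item_id, now=None):
        """Return True if the share is allowed, False if the breaker tripped."""
        if item_id in self.held_for_review:
            return False
        now = time.time() if now is None else now
        times = self.share_times[item_id]
        times.append(now)
        # Drop shares that fall outside the sliding window.
        while times and now - times[0] > self.window:
            times.popleft()
        if len(times) > self.max_shares:
            self.held_for_review.add(item_id)  # trip the breaker
            return False
        return True

    def clear_review(self, item_id):
        """A fact-checker has reviewed the item; sharing may resume."""
        self.held_for_review.discard(item_id)
        self.share_times[item_id].clear()
```

In use, a platform would call `record_share` on every share attempt and route tripped items to a human review queue; `clear_review` models the fact-checker releasing the hold.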
It also includes some steps that Facebook, Twitter and YouTube already take voluntarily, such as:
Labeling accounts of state-controlled news organizations
Limiting how often messages can be forwarded to large groups, as Facebook does on WhatsApp
A copy of the report was sent to the three platforms on Wednesday and the BBC invited them to comment.
Nick Pickles, Twitter’s director of public policy strategy, said: “We support an open internet and forward-thinking regulation that protects freedom of expression and fair competition online.
“Openness and transparency are central to Twitter, as seen in our public API, our information archives, our commitment to user choice, our decision to ban political advertising and our labelling, which adds context to information. We also publish a Twitter transparency report.
“However, technology companies are not all the same, nor is technology the only part of the media ecosystem. Addressing these important issues requires a response from every part of society.”
In an interview, Mr Wylie said the report’s recommendations were designed to protect, not restrict, an individual’s freedom of expression.
What follows is a condensed version of that interview:
Whenever there is a proposal to regulate social media, there are warnings that freedom of expression will be stifled. Don’t your suggestions carry the same risk?
In most Western democracies you have freedom of expression. But freedom of expression is not a right of amplification. You are free to say what you want, within the bounds of hate speech and libel law. But you do not have the right to use technology to artificially amplify your voice.
These platforms are not neutral. Algorithms decide what people see and what they don’t. Nothing in this report restricts your ability to speak your mind. What we are addressing is the artificial amplification of misinformation and the manipulation of these platforms.
Who decides what counts as misinformation?
I think this comes down to something quite basic: do you believe in truth? There are claims spreading rapidly on Facebook right now that cannot logically be true - for example, that Covid does not exist, or that vaccines are actually meant to control people’s minds. These things are demonstrably false, and you can prove it.
Our democratic institutions and public discourse rest on the assumption that we can at least agree on what is true. We can debate how to respond to a particular issue, or how much it matters to us. But we share an understanding that some things are demonstrably true.
Would this regulation undermine the free flow of ideas and people’s right to believe what they choose?
If we assumed people had a legal right to manipulate and defraud others, we would have no laws against fraud or undue influence. And there are tangible harms from manipulating people: the public health response to Covid-19 in the United States has been damaged by widespread misinformation about the existence of the virus and by false claims about treatments that are not in fact cures.
Do people have the right to believe what they want? Yes, of course. No one I know is suggesting any kind of regulation of hearts and minds.
But we have to focus on the responsibilities of the platforms. Facebook, Twitter and YouTube run on algorithms that promote and amplify information. That is an active engineering decision.
When that results in disrupting the public health response to a pandemic, or undermines trust in our democratic institutions because people are being misinformed, there has to be some accountability for these platforms.
But Facebook says it is working hard to tackle misinformation and does not profit from hate speech.
That is like an oil company saying, “We don’t profit from pollution.” Pollution is a by-product - a harmful by-product. Whether or not Facebook profits from hateful content, it is a harmful by-product of the platform’s current design, and it harms society.