Social Media
Social media should have “duty of care” towards kids, UK MPs urge
Social media platforms are being urged to be far more transparent about how their services operate and to make “anonymised high-level data” available to researchers so the technology’s effects on users — and especially on children and teens — can be better understood.
The calls have been made in a report by the UK parliament’s Science and Technology Committee which has been looking into the impacts of social media and screen use among children — to consider whether such tech is “healthy or harmful”.
“Social media companies must also be far more open and transparent regarding how they operate and particularly how they moderate, review and prioritise content,” it writes.
Concerns have been growing about children’s use of social media and mobile technology for some years now, with plenty of anecdotal evidence and also some studies linking tech use to developmental problems, as well as distressing stories connecting depression and even suicide to social media use.
The committee writes that its dive into the topic was hindered by “the limited quantity and quality of academic evidence available”. But it also asserts: “The absence of good academic evidence is not, in itself, evidence that social media and screens have no effect on young people.”
“We found that the majority of published research did not provide a clear indication of causation, but instead indicated a possible correlation between social media/screens and a particular health effect,” it continues. “There was even less focus in published research on exactly who was at risk and if some groups were potentially more vulnerable than others when using screens and social media.”
The UK government expressed its intention to legislate in this area, announcing a plan last May to “make social media safer” — promising new online safety laws to tackle concerns.
The committee writes that it’s therefore surprised the government has not commissioned “any new, substantive research to help inform its proposals”, and suggests it get on and do so “as a matter of urgency” — with a focus on identifying people at risk of experiencing harm online and on social media; the reasons for the risk factors; and the longer-term consequences of children’s exposure to the technology.
It further suggests the government consider what legislation is required to improve researchers’ access to this type of data, given that platforms have not provided adequate access of their own accord.
The committee says it heard evidence of a variety of instances where social media could be “a force for good” but also received testimonies about some of the potential negative impacts of social media on the health and emotional wellbeing of children.
“These ranged from detrimental effects on sleep patterns and body image through to cyberbullying, grooming and ‘sexting’,” it notes. “Generally, social media was not the root cause of the risk but helped to facilitate it, while also providing the opportunity for a large degree of amplification. This was particularly apparent in the case of the abuse of children online, via social media.
“It is imperative that the government leads the way in ensuring that an effective partnership is in place, across civil society, technology companies, law enforcement agencies, the government and non-governmental organisations, aimed at ending child sexual exploitation (CSE) and abuse online.”
The committee suggests the government commission specific research to establish the scale and prevalence of online CSE — pushing it to set an “ambitious target” to halve reported online CSE in two years and “all but eliminate it in four”.
A duty of care
A further recommendation will likely send a shiver down tech giants’ spines, with the committee urging that a duty of care principle be enshrined in law for social media users under 18, to protect them from harm on those platforms.
Such a duty would considerably raise the legal stakes for user-generated content platforms that don’t bar children from accessing their services.
The committee suggests the government could achieve that by introducing a statutory code of practice for social media firms, via new primary legislation, to provide “consistency on content reporting practices and moderation mechanisms”.
It also recommends a requirement in law for social media companies to publish detailed Transparency Reports every six months.
It also calls for a 24-hour takedown law for illegal content, saying that platforms should have to review reports of potentially illegal content and take a decision on whether to remove, block or flag it — and relay that decision to the individual or organisation that reported it — within 24 hours.
Germany already legislated for such a law, back in 2017 — though in that case the focus is on speeding up hate speech takedowns.
In Germany, social media platforms can be fined up to €50 million if they fail to comply with the law, known by its truncated German name as NetzDG. (The EU executive has also been pushing platforms to remove terrorist-related material within an hour of a report, suggesting it too could legislate on this front if they fail to moderate content fast enough.)
The committee suggests the UK’s media and telecoms regulator, Ofcom, would be well-placed to oversee how illegal content is handled under any new law.
It also recommends that social media companies use AI to identify and flag to users (or remove as appropriate) content that “may be fake” — pointing to the risk posed by new technologies such as “deep fake videos”.
More robust systems for age verification are also needed, in the committee’s view. It writes that these must go beyond “a simple ‘tick box’ or entering a date of birth”.
Looking beyond platforms, the committee presses the government to take steps to improve children’s digital literacy and resilience, suggesting PSHE (personal, social, health and economic) education should be made mandatory for primary and secondary school pupils — delivering “an age-appropriate understanding of, and resilience towards, the harms and benefits of the digital world”.
Teachers and parents should also not be overlooked, with the committee suggesting training and resources for teachers and awareness and engagement campaigns for parents.