Social media firms told algorithms must not recommend harmful content to children
- Ofcom's draft code of conduct aims to improve child safety online through age checks, complaints procedures, and rules on harmful content recommended by algorithms, with a possible 10% fine for non-compliance.
- The UK internet regulator, Ofcom, is urging tech companies such as Instagram and YouTube to enhance child safety by implementing stronger age checks, filtering content, and assessing harmful topics to reduce under-18s' access to them.
- The Children's Safety Code could significantly change how internet companies approach online safety, potentially forcing them to adopt new protective measures.
Coverage Details
- Sources: 13 (3 leaning left, 7 center, 3 leaning right)
- Bias distribution: 23% left, 54% center, 23% right
- Last updated: 11 days ago