Culture minister says ‘serious conversation’ needed about AI systems and news media
A McGill report found that AI models failed to attribute Canadian journalism 82% of the time, raising concerns about compensation and the sustainability of news media.
- At a national summit on AI and culture, Miller said: 'Having the news cannibalized and regurgitated undermines the spirit of the use of that news in the first place and the purpose for which it's used, and we have to have a serious conversation with the platforms that purport to use it, including AI shops.'
- McGill researchers tested 2,267 Canadian news stories on ChatGPT, Gemini, Claude and Grok and found the models lacked source attribution about 82 per cent of the time, accelerating journalism's economic decline.
- Under the Online News Act, Meta and Google must compensate outlets; Meta pulled news from its platforms while Google continues payments, and a coalition including The Canadian Press and CBC/Radio‑Canada sued OpenAI in late 2024 for training ChatGPT on their work without permission.
- The minister argued regulators should focus on corporate responsibility, saying 'it's not about opening up the Online News Act but ensuring companies act responsibly and people pay their fair share.'
- Legal and policy timelines remain uncertain: the House of Commons heritage committee heard last year from creative industries calling for licensing, and McGill's brief argues AI's effect differs from social media's because it can make visiting the original source unnecessary.