Original link: https://news.ycombinator.com/item?id=44197932
This Hacker News thread discusses YouTube's content moderation policies, sparked by the removal of a video about self-hosting media. Commenters express concern about censorship creep, citing past COVID-19 content removals and the UK's Online Safety Act as examples of vaguely defined rules leading to over-moderation.
Many debate the balance between freedom of speech and platform responsibility, questioning whether large platforms like YouTube should have unchecked moderation power. Some propose government funding for open-source, self-hosted alternatives and advocate for stronger antitrust measures to break up monopolies.
The discussion also touches on YouTube's unique advantages, such as its vast CDN and monetization system. Commenters acknowledge the difficulty of competing with YouTube and explore various potential solutions, including stronger regulations, community moderation, and decentralized hosting. Ultimately, the thread highlights the complex challenges of content moderation in the digital age.
From another comment: "Looks like some L-whateverthefuck just got the task to go through YT's backlog and cut down on the mention/promotion of alternative video platforms/self-hosted video serving software."
This is exactly what YT did with COVID-related content.
Here in the UK, Ofcom held their second day-long livestreamed seminar on their implementation of the Online Safety Act on Wednesday this week. This time it was about keeping children "safe", including with "effective age assurance".
Ofcom refused to give any specific guidance on how platforms should implement the regime they want to see. They said this is because giving specific advice might restrict their ability to take enforcement action later.
So it's up to the platforms to interpret the extremely complex and vaguely defined requirements and impose a regime which Ofcom will find acceptable. It was clear from the Q&A that some pretty big platforms are really struggling with it.
The inevitable outcome is that platforms will err on the side of caution, bearing in mind the potential penalties.
Many will say this is good: children should be protected. The second part of that is true. But the way this is being done won't protect children, in my opinion. It will result in many more topic areas falling below the censorship threshold.