Designated social media providers, live-streaming services and adult sites that allow users to upload content would have to scrutinize and delete objectionable messages, images and videos if the Liberal government’s proposed Online Harms Act is passed. The legislation also includes the creation of a Digital Safety Commission to hear complaints.
The act, Bill C-63, which was introduced today, says designated services must remove two categories of content within 24 hours: material that sexually victimizes a child or re-victimizes a survivor, and intimate content posted without the consent of the individual.
The 24-hour deadline would be subject to oversight and review.
The law would also make it clear that anyone who provides an internet service, including social media platforms, has to report to a designated law enforcement agency if child porn is posted on their service. To support those reports, service providers would have to hold user content for one year, up from the current 21 days, so that it remains available for criminal investigations.
Designated providers would also have to put child safety first when designing products and features, including by offering parental controls, labelling content with warnings for children, and setting rules around targeted content or ads directed at children.
In addition, the government plans to amend the Criminal Code to fight hate by creating a new hate crime offence punishable by up to life imprisonment, and by raising the maximum penalties for the four existing hate propaganda offences.
The Canadian Human Rights Act would be amended to specify that posting hate speech online is discrimination, and to allow the Human Rights Tribunal to handle hate speech complaints, including by giving it the power to order the removal of such content.
Only online services with a large enough number of users would be covered under the Online Harms Act. The threshold would be set in yet-to-be-announced regulations.
The proposed law would cover seven categories of harmful content:
— content that sexually victimizes a child or re-victimizes a survivor;
— content that could be used to bully a child;
— content that induces a child to harm themselves;
— content used to incite violent extremism or terrorism;
— content that incites violence;
— content that foments hatred;
— intimate content communicated without consent, including deepfaked audio, images and videos.
The Digital Safety Commission would be composed of five people appointed by the government, with the power to order providers to remove content that sexually victimizes a child. It would also be responsible for setting standards for online safety.
The proposed legislation would also create a Digital Safety Ombudsperson, also appointed by the government.
The goals, the government says, are to reduce Canadians’ exposure to harmful content; provide special protections for children and stronger reporting of child pornography; ensure public oversight of, and accountability from, online services; and deliver “improved safety over time.”
The legislation would allow people to request the quick removal of child porn, submit complaints to the Digital Safety Commission, contact the Digital Safety Ombudsperson to receive support and be directed to the right help resources, and file complaints with the Human Rights Commission when facing online hate.
More to come…
The post Proposed Canadian law puts burden on large internet providers to police child porn, hate first appeared on IT World Canada.