Monday 17 February 2020

Internet censorship coming to New Zealand


Jacinda Ardern picks up where John Key left off in the destruction of democracy.

Eventually it will be anything that counters the government line. Maybe not today or tomorrow, but it is coming.

Takedown notices and internet filters coming to NZ


30 January 2020

The Government is planning legislation that will give it increased powers to tackle the spread of terrorist content online, proactively released documents from the Department of Internal Affairs show.

A new suite of reforms will make it illegal to livestream objectionable content and give the Government the ability to issue takedown notices to online content hosts, among other powers.


The proposed legislation was authorised by Cabinet in December and will be introduced by March, according to a proactively released Cabinet paper from the Department of Internal Affairs (DIA).


The legislation seeks to respond to gaps in New Zealand's regulatory framework for online content that were revealed by the March 15 terror attack. It will also create a regime for the introduction of mandatory or voluntary internet filters, such as the one the Government currently operates to block websites hosting child sexual exploitation content, known as the Digital Child Exploitation Filtering System (DCEFS).


Regulatory regime outdated

As Newsroom has previously reported, the response to the Christchurch attack was ad hoc and inefficient.
Internet Service Providers (ISPs) butted heads with DIA officials who wanted content blocked but lacked the statutory authority to demand it. The list of URLs to be blocked was hosted on a Google spreadsheet and, on at least one occasion, an email full of website addresses to be censored was deleted by a spam filter.


The December Cabinet paper acknowledges the impromptu nature of the digital response to Christchurch. "While these efforts were effective," it states, "the experience highlighted the inefficiencies and ambiguities in our censorship system for responding to objectionable online content, such as that depicting an act of violent extremism or terrorism."


The paper identified five key limitations: that livestreaming is not a medium regulated by the Films, Videos and Publications Classification Act 1993 (FVPCA), the central legislation setting up New Zealand's censorship system; bureaucratic delays in the Chief Censor's decision-making process; the inability of the Government to order removal of objectionable online content; unclear responsibilities of ISPs and online platforms; and the lack of a regulatory framework for the existing DCEFS internet filter.


Livestreaming terror attacks would be illegal

In order to rectify this, Minister of Internal Affairs Tracey Martin will introduce legislation in March granting the Government significant new powers to counter online terrorist content. None of the reforms will create new definitions for objectionable content. The definition of an objectionable publication as laid out in the 1993 FVPCA - a publication that "describes, depicts, expresses, or otherwise deals with matters such as sex, horror, crime, cruelty, or violence in such a manner that the availability of the publication is likely to be injurious to the public good" - will remain in place.


Smaller modifications to the regulatory framework will allow the Chief Censor to make faster, interim classification decisions and ensure that the new law doesn't conflict with the "safe harbour" provision of the Harmful Digital Communications Act, which renders some online content hosts immune from legal responsibility for content posted on their platforms.


Four more significant reforms help modernise the existing censorship system. The first will make it illegal to "knowingly livestream objectionable content". Violators of this measure will be subject to the same punishments as the existing offence of "knowingly distributing an objectionable publication", including up to 14 years imprisonment for an individual or a $200,000 fine for a body corporate.


Takedown powers introduced

Second, Government agencies authorised to review and confiscate objectionable content, such as the DIA or the Police, will be empowered to issue takedown notices to online content hosts. These notices would only be issued for objectionable content and "would only be used in situations where other options for seeking removal of objectionable content would be ineffective", the Cabinet paper states.


"The current collaborative practice of requesting online content hosts to voluntarily remove identified objectionable content would continue to be the first approach adopted. DIA would be required to publish the numbers of takedown notices issued, and the reasons for their issue (type of content) to ensure this power is used transparently."

Third, the Government would be able to punish hosts that refuse to comply with a takedown notice "as soon as reasonably practicable".
"This change would bring online content hosts in line with the expectations of businesses operating in New Zealand as they relate to physical content classified as objectionable."


Fines of up to $200,000 could be issued and, because they would be civil pecuniary penalties, international partners that have mutual agreements with New Zealand could also enforce them. In a regulatory impact statement, DIA officials cited Australia as an example of a country that would be able to enforce such a penalty.


Filtering the internet a last resort


Lastly, the Government would be granted the authority to establish internet filters in the future if required. This would bring the existing DCEFS filter for child exploitation content into a defined regulatory framework and open the door for more filters.


The paper acknowledges that "filtering is not a silver bullet" and "should constitute the final step in enforcement after all other options are exhausted".


The paper states that the DCEFS and ISP actions directly after Christchurch "are an ad-hoc solution to a long-term problem. If we see internet filtering as a legitimate policy response, an authorising framework is needed. Any new or even existing proposal should have a robust regulatory basis that provides for executive authorisation and public discussion, given the incursion on a free and open internet any filter may represent."


Newsroom reported in October that the Government was exploring the possibility of a filter for violent extremist content and the Cabinet paper confirms that Martin will "direct officials to commence work on a potential filter for terrorist and violent extremist content, including targeted consultation. I will report back to Cabinet in late 2020 on the progress of this workstream."






