Wednesday 27 May 2020

New Zealand's move towards a police state


New bill comes with online takedown powers



MSN, 27 May 2020

The Government has introduced a bill that will allow it to issue takedown notices and create internet filters, with a focus on combatting terrorism and violent extremism, Marc Daalder reports

New legislation will enable the Government to issue takedown notices and create internet filters for content deemed objectionable by the Chief Censor, with an eye towards terrorist and violent extremist content.


Newsroom first reported the Government was moving ahead with the suite of reforms in January. The bill was meant to be introduced in March but seems to have been delayed by Covid-19.


The bill does not appear to have changed significantly from the proposal Internal Affairs Minister Tracey Martin took to Cabinet in December.


"The internet brings many benefits to society but can also be used as a weapon to spread harmful and illegal content and that is what this legislation targets," Martin said in a statement.


"Our laws need to reflect the digital age and the Government has worked with industry partners to create this bill, which will ensure law enforcement and industry partners can rapidly prevent and fight harm from illegal online content. 

This bill is part of a wider government programme to address violent extremism. This is about protecting New Zealanders from harmful content they can be exposed to on their everyday social media feeds."


A December Cabinet paper acknowledges the impromptu nature of the digital response to Christchurch. "While these efforts were effective," it states, "the experience highlighted the inefficiencies and ambiguities in our censorship system for responding to objectionable online content, such as that depicting an act of violent extremism or terrorism".


The paper identified five key limitations: that livestreaming is not a medium regulated by the Films, Videos and Publications Classification Act 1993 (FVPCA), which is the central legislation setting up New Zealand's censorship system; bureaucratic delays in the Chief Censor's decision-making process; the inability of the Government to order removal of objectionable online content; unclear responsibilities of ISPs and online platforms; and the lack of a regulatory framework for the existing child exploitation material internet filter.


Takedown notices and enforcement powers

Alongside a number of small regulatory tweaks to ensure the amendment to the FVPCA doesn't conflict with the Harmful Digital Communications Act 2015, the legislation has four main components.


First, the legislation introduces an ability to issue takedown notices and legally punish non-compliant websites.


As it stands, the Government is unable to direct websites to remove objectionable content. After the March 15 terror attack, police were put in the position of having to politely ask websites to take down footage of the shootings.


Now, select Government agencies - including the police - will be able to "issue a takedown notice relating to a particular online publication to an online content host" if the content is objectionable or if the person "believes, on reasonable grounds, that the online publication is objectionable".


For the purposes of this measure, "online content hosts" refers to companies "both in New Zealand and overseas that provide services to the public".


Companies issued a takedown notice must remove the content "as soon as is reasonably practicable", although they may be asked to securely and privately retain a copy for later investigation.


Non-compliant hosts can be taken to court and made to pay a fine of up to $200,000.


Because these fines would be civil pecuniary penalties, international partners that have mutual agreements with New Zealand could also enforce them. In a regulatory impact statement, Department of Internal Affairs (DIA) officials cited Australia as an example of a country that would be able to enforce such a penalty.


These notices would only be issued for objectionable content and "would only be used in situations where other options for seeking removal of objectionable content would be ineffective", the December Cabinet paper on the legislation states.


"The current collaborative practice of requesting online content hosts to voluntarily remove identified objectionable content would continue to be the first approach adopted. DIA would be required to publish the numbers of takedown notices issued, and the reasons for their issue (type of content) to ensure this power is used transparently."


Framework for internet filters

The second component of the legislation allows the DIA to create internet filters for objectionable content.


"The Department of Internal Affairs may operate an electronic system to prevent access by the public to objectionable online publications," the bill states.


The system may "prevent access by the public to a particular online publication" or "the website, a part of the website, the online application, or similar on which an [objectionable] online publication [...] is accessible".

Before launching any filter, the DIA must consult Internet Service Providers (ISPs), "technical experts and online content hosts to the extent the Secretary thinks necessary" and the public.


There are few other limits on the filtering power in the bill itself. The legislation states that "when deciding on the design and form of the system, the Secretary [of Internal Affairs] must consider", among other things, "the need to balance [...] any likely impact on public access to non-objectionable online publications; and the protection of the public from harm from objectionable online publications".


However, it then notes that this balancing "need be considered only to the extent that it is relevant in the Secretary’s view".


As it stands, the Government only operates one internet filter, the Digital Child Exploitation Filtering Service (DCEFS), which blocks access to a number of sources of child sexual exploitation material. The Cabinet paper indicated that the DCEFS would likely be brought within this new legal framework.


Participation in the DCEFS is voluntary for ISPs, but the Cabinet paper made clear that mandatory filters could also be created under the new framework.


Filtering proposal controversial

Although the legislation does not call for specific filters to be created, Martin told Cabinet in December she would "direct officials to commence work on a potential filter for terrorist and violent extremist content, including targeted consultation. I will report back to Cabinet in late 2020 on the progress of this workstream."

Newsroom first reported in October that the Government was exploring the possibility of a filter for violent extremist content. The revelation in January that the Government was going ahead with the idea sparked a clash between civil society groups and Internet Service Providers, the latter anxious to be relieved of the burden of choosing what content to block and what to let through.


Since the March 15 terror attack, ISPs have repeatedly asked the Government to create a framework for ordering the blocking or filtering of websites. In the immediate aftermath of the attack, ISPs butted heads with DIA officials who wanted content blocked but didn't have the statutory authority to demand that.


The list of URLs to be blocked was hosted on a Google spreadsheet and on at least one occasion, an email full of website addresses for censoring was deleted by an email spam filter.


Meanwhile, InternetNZ came out strongly against the proposed filtering regime.


"We do not think filtering at the ISP level is a viable option (whether optional or mandatory for the ISP)," chief executive Jordan Carter told Newsroom in February.


"Motivated users can find ways to get around a filter. Internet filtering can also introduce security vulnerabilities into a network and, as currently scoped, will not prevent harms occurring on platforms. The risk of overreach and therefore blocking legal and legitimate content is also high.


"The evidence and analysis in the Cabinet papers do not justify the introduction of a filter."


The Cabinet paper acknowledges that "filtering is not a silver bullet" and "should constitute the final step in enforcement after all other options are exhausted".


In particular, the paper does not specify whether the potential terrorism filter would be voluntary for ISPs to join (like the DCEFS) or mandatory, for which there is no precedent in New Zealand.


Chief Censor empowered to act quickly

The third component allows the Chief Censor to make interim classifications of objectionable material if he "believes that there is an urgent need" to "notify the public that the content of the publication is likely to be objectionable (on the basis of the interim assessment)" and to "limit harm to the public".


In such a situation, the Censor would rule that the information is "likely to be objectionable". This ruling would have the same legal force as an official determination that the material was objectionable, and an official decision would need to be returned within 20 working days.


This measure seems inspired by the delay in deeming the March 15 attack video and associated manifesto objectionable. It took three days for Chief Censor David Shanks to rule the livestream objectionable and another five days for the manifesto.


Lastly, the legislation makes it an offence to livestream objectionable content. 

This measure appears to address concerns that such actions were not already illegal under the FVPCA, a notion some legal experts have disputed. The new offence would be punishable by up to 14 years in prison for individuals or a fine of up to $200,000 for corporations.


Jacinda Ardern: 'We will continue to be your single source of truth'

