Publications & Resources

“A Matter of Security, Privacy and Trust: A study of the principles and values of encryption in New Zealand”

Researchers/Authors: Dr Michael Dizon, Professor Ryan Ko, and Associate Professor Wayne Rumbles

Released 12 December 2019

New Zealand has a reasonably comprehensive and well-grounded legal regime and strategy for dealing with cybersecurity matters. However, there is one area that deserves further attention and discussion – encryption. Encryption is at the heart of and underpins many of the technologies and technical processes used for computer and network security, but current laws and policies do not expressly cover this significant technology.

Brief overview of conclusions and recommendations:

1. Laws and policies that undermine or weaken information security (whether intentionally or as an unintended effect) should be avoided.

2. Any laws and policies that seek to curb the development and use of encryption, or to limit the choice or availability of encryption technologies, should not be pursued.

3. The main issue is how the powers and measures that apply to encryption can be improved to better balance law enforcement and public order values against human rights and freedoms.

4. The right against unreasonable search and seizure, and the right against self-incrimination, represent the final or ultimate line of protection or defence against potential abuse or unreasonable outcomes. The right against unreasonable search and seizure is particularly relevant to the issue of network operators and services being under the duty to offer reasonable assistance to intercept or collect the communications sought, while the right against self-incrimination is impacted by the forced disclosure of access information and passwords.

5. A principles- and values-based approach can serve as an overarching framework for assessing the validity, legitimacy or utility of existing or proposed laws, powers and measures concerning encryption.

6. Encryption fundamentally relies on trust; therefore, trust can act as an essential standard or criterion for evaluating whether a balance can be or has been struck among the competing private and public issues and concerns.

Full report in PDF – 200 pages
Media advisory, 12 December 2019
Link to the Principal Investigator’s webpage

“Government Use of Artificial Intelligence in New Zealand”

Researchers/Authors: Colin Gavaghan, James Maclaurin, Ali Knott, John Zerilli and Joy Liddicoat

Released 27 May 2019

This is the first major report from the Artificial Intelligence and the Law Project. The overall focus of the report is on the regulatory issues surrounding uses of artificial intelligence (AI) in New Zealand.

There are many types of AI systems, and many spheres within which AI systems are used (in New Zealand and beyond).

Phase 1 of the project focuses on regulatory issues surrounding the use of predictive AI models in New Zealand government departments. As discussed in the report, while there are many types of AI model, the concept of a “predictive model” picks out a reasonably well-defined class of models that share certain commonalities and are fairly well characterisable as a regulatory target.

The report focuses specifically on the use of predictive models in the public sector because the researchers want to begin by discussing regulatory options in a sphere where the New Zealand Government can readily take action: it can relatively easily effect changes in the way its own departments and public institutions operate.

The report identifies and discusses a number of primary concerns:

• Accuracy

• Human control

• Transparency and a right to reasons/explanations

• Bias, fairness and discrimination

• Privacy

Individual rights are vital for any democracy but exclusive reliance should not be placed on individual rights models that depend on affected parties holding predictive algorithms to account. Often, individuals will lack the resources to do so. Furthermore, individual rights models might offer limited efficacy in monitoring group harms.

With regard to oversight and regulation, one of the key recommendations of the report is that Government should consider the establishment of a regulatory/oversight agency. Several possible models for the new regulatory agency are proposed in the report. The new regulator could serve a range of other functions, including:

• Producing best practice guidelines;

• Maintaining a register of algorithms used in government;

• Producing an annual public report on such uses;

• Conducting ongoing monitoring of the effects of these tools.

The report indicates a preference for a relatively “hard-edged” regulatory agency, with the authority to demand information and answers, and to deny permission for certain proposals. However, even a light-touch regulatory agency could serve an important function.

The researchers stress the need for consultation with a wide range of stakeholders across New Zealand society, especially with populations likely to be affected by algorithmic decisions, and with those likely to be under-represented in the construction and training of these systems. This is likely to include those in lower socio-economic groups, and Māori and Pacific Island populations. Quite simply, they are likely to have insights, concerns and perspectives that will not be available to even the most well-intentioned of outside observers.

Full report in PDF – 92 pages
Media advisory, 27 May 2019
Link to researchers’ webpage

“Perception inception: Preparing for deepfakes and the synthetic media of tomorrow”

Researchers/Authors: Tom Barraclough and Curtis Barnes

Released 21 May 2019

Contemporary audio-visual effects technologies now allow for the creation and manipulation of information in challenging ways never before encountered. They can be used to put words in people’s mouths, portray people doing things they never did, copy their faces and voices, or even create entirely new faces and voices that appear thoroughly human.

This research report considers the wide-ranging social, legal and policy issues arising. Synthetic media technologies have huge potential benefits, but they also have risks. Public awareness of this risk of deception has grown through discussion of one kind of emerging audiovisual technology known as “deepfakes”. The existence of such technologies may undermine general trust in audiovisual information to some degree.

The researchers anticipate that synthetic media will continue to improve, becoming better and more accessible. They think it likely that in the near future, consumers and citizens will be regularly exposed to audio, images and video that look or sound like reliable representations of factual events, even though they are not. This is information that gives the impression it was “captured”, when in fact it was “constructed” to a greater or lesser extent. Much of this information will be benign or beneficial, but some of it will be harmful.

The researchers are not convinced that enacting substantial new law is either necessary or the best way to address the harms that may be generated by synthetic media. They also identify a risk that, where new law goes beyond existing law, it may abrogate rights of freedom of expression. Synthetic media is a means of expression like many others.

One of the more significant gaps in New Zealand law is not so much a gap as a boundary. In most cases, New Zealand law is jurisdictionally limited to its own sovereign borders, which constrains its application to overseas actors, whether individual internet users or large social media platforms. Importantly, this issue is not unique to New Zealand or to synthetic media technologies.

On pages 122 to 129 of the report, the researchers set out their conclusions, identify specific gaps in New Zealand law and make specific recommendations, as well as providing concluding remarks.

Full report in PDF – 136 pages
Media advisory, 21 May 2019
Link to the researchers’ webpage

“Digital Threats to Democracy”

Researchers/Authors: Marianne Elliott, Dr Jess Berentson-Shaw, Dr Kathleen Kuehn, Dr Leon Salter, and Ella Brownlie

Released 8 May 2019

Smart regulation and “human responses”, rather than a narrow focus on content moderation alone, are needed to counter the threat to democracy posed by digital media platforms like Facebook as they currently operate, a newly launched study has found.

As well as regulating social media platforms, the study team, led by Marianne Elliott, calls for several far-reaching and as yet untested “human responses” to rein in the ill-effects of “platform monopolies”, the dominance of the social media market by a few players.

The research team’s proposals include collective action to influence the major platforms, through groups like technology workers and digital media users using their leverage to demand ethical product design. The study argues that fake news can be countered by investing more in public interest media and alternative platforms, leading to a more democratic internet.

It also points to evidence that online platforms enabling citizen participation in decision-making can improve public trust and lead to more citizen-oriented policies.

The study recommends action in the following areas:

  1. Restore a genuinely multi-stakeholder approach to internet governance, including meaningful mechanisms for collective engagement by citizens/users;
  2. Refresh antitrust and competition regulation, taxation regimes and related enforcement mechanisms to align them across like-minded liberal democracies and restore competitive fairness;
  3. Recommit to publicly funded democratic infrastructure including public interest media and the online platforms that afford citizen participation and deliberation;
  4. Regulate for greater transparency and accountability from the platforms including algorithmic transparency and accountability for verifying the sources of political advertising;
  5. Revisit regulation of privacy and data protection to better protect indigenous rights to data sovereignty and redress the failures of a consent-based approach to data management; and
  6. Recalibrate policies and protections to address not only individual rights and privacy but also collective impact and wellbeing.

Full report in PDF – 246 pages
Media advisory, 8 May 2019
Link to the Principal Investigator’s webpage

“Regulating Cryptocurrencies in New Zealand”

Researchers/Authors: Associate Professor Alexandra Sims, Dr Kanchana Kariyawasam, Professor David Mayes

Released 28 September 2018

New Zealand needs to jump on the blockchain train: a central bank-issued cryptocurrency, thriving cryptocurrency exchanges and the ability for businesses to trade in GST-free cryptocurrency are needed if Aotearoa New Zealand is to enjoy the vast potential benefits of this technology, a new report finds.

A team of legal and banking experts have recommended a regulatory framework for blockchain. Their report warns against the Government attempting to ban the use of cryptocurrencies and argues the Government should instead actively support New Zealand becoming a blockchain and financial technology (fintech) hub.

The report’s recommendations are:

  • The New Zealand Government should continue to allow cryptocurrencies to be traded as well as used for the payment of goods and services within and outside New Zealand
  • Greater advice and protection for consumers on cryptocurrencies by the Financial Markets Authority (FMA), the Department of Internal Affairs (DIA) and others
  • The Reserve Bank of New Zealand (RBNZ) trials the creation and issuance of a New Zealand cryptocurrency
  • New Zealand-based cryptocurrency exchanges be encouraged, with clear and detailed guidance provided as to their anti-money laundering/counter-terrorism financing obligations by both the DIA and FMA
  • Cryptocurrency exchanges that comply with these safeguards must have access to bank accounts with New Zealand banks
  • Merchants must be able to accept cryptocurrency payments of under NZD100, or payment through a compliant exchange
  • GST is removed from cryptocurrencies used to pay for goods and services
  • The Inland Revenue Department accepts cryptocurrencies for the payment of taxes
  • New Zealand should follow countries such as the UK and Australia in creating a regulatory sandbox and ensure that the regulators work alongside fintech companies

Full report in PDF – 179 pages
Media advisory, 28 Sept 2018
Radio NZ Nine to Noon podcast – Kathryn Ryan interviews Assoc Prof Alex Sims, 28 September 2018 – 22 mins
Link to the Principal Investigator’s webpage

“Realising the Potential of Driverless Vehicles – recommendations for law reform”

Researcher/Author: Michael Cameron

Released April 2018

This major study finds law change is needed soon to ensure driverless vehicles can be used legally on New Zealand roads. Study author Michael Cameron says a complete overhaul of law and policy around driverless vehicles is required.

Issues raised in Michael Cameron’s study relate to the research themes and focus of the Law Foundation’s Information Law and Policy Project. Michael was awarded the Law Foundation’s 2016 International Research Fellowship to undertake research on driverless vehicles.

Full report in PDF – 192 pages
Hard copies are available at Unity Books
Brief overview of the report – pdf 2 pages
Links to photos from Unity Books launch on 19th April

“Regulation of New Technology: Institutions and Processes”

Researcher/Author: James Every-Palmer

Released March 2018

The rapid development and uptake of information technology has had a significant impact on how New Zealanders work and socialise, and on how our economy and occupations are organised. This paper contributes to a conversation about the implications of technological change for good regulatory practice in terms of our institutions and processes. It asks whether some sorts of institutions and processes are likely to be better suited than others to accommodating and regulating technological change.

The author identifies six broad policy issues that often arise in relation to new digital technologies and need to be considered by our regulatory systems. In addition, the author sets out and discusses five steps that we should be taking now to be in the best position to adjust our regulatory settings to accommodate and/or regulate new technologies in the future.

Full report in PDF – 23 pages
Link to James Every-Palmer’s webpages for ILAPP grants

