
Lisa Owen interviews Justice Minister Amy Adams

Lisa Owen interviews Justice Minister Amy Adams and law professor Ursula Cheer about the Harmful Digital Communications Bill

Adams defends anti-cyber-bullying bill, rejecting criticism that free speech will be criminalised or suppressed, and insists not enacting new legislation “would be failing the public”

Minister says the Bill, due back in the House next week, has few changes but will now allow web hosts to “opt out” of the ‘safe harbour’ clause and follow their own terms and conditions.

“That’s completely up to the content host how they choose to work through that process. And that’s important, because we don’t want to compel hosts to take down information, because that would be a very unreasonable limit on the freedom of speech.”

If online content hosts choose the ‘safe harbour’ option, involving a 48-hour notice period, they won’t be held liable for any action they take, but they will lose legal protection if they opt out.

Says only in extreme cases will the Bill result in criminal action being taken: “…we have made sure that it is only where the person intends to cause significant harm to the victim and actual harm is caused.”

People who are offended by online satire will be able to complain and ask for take-downs under the new law, but Adams says “the terms under the legislation make it quite clear that having a joke at someone’s expense will not meet the threshold”.

Playing down criticism, Adams says the same goes for journalism: “I have no concern at all that genuine investigative information revealing things in the normal course of news media would be captured by this law”.


There will be no defence of truth under the new law. But citing revenge porn, Adams says “the fact that it happened doesn’t mean it’s not very damaging and inappropriate for that to be put up on the internet”.

Says the definition of harm as serious emotional distress will be unchanged.

Law professor Ursula Cheer says she still has concerns about the Bill and that the media could get dragged into complaints.

Amy Adams: Well, obviously when you’re dealing in the cyber medium, you’re not just dealing with a one-to-one conversation which can be hurtful and might be overheard by a few people. As we all know, these things can go viral and reach audiences literally of millions very, very quickly, so the damage to people’s reputation and to their mental health from this sort of abuse is significant. And, of course, it lasts forever. Once something’s on the Internet, unless it’s taken down and dealt with very carefully, it can be there for a long time, and the damage is repeated.
Lisa Owen: This bill is back in the House next week. Can you tell us what kind of changes you’re looking to make?
Well, really, over the course of the bill, we’ve just been looking to clarify some of the drafting and provisions, and of course there’s always some technical tidy-ups that the drafters find. So there’s going to be nothing staggering that’s new in the changes. There are a few areas where we want to, as I say, clarify, for example, how the safe harbour works, clarify that the offence regime covers a number of different options. So there’s a few tidy-ups of that nature.
Can you tell us a bit more about that - the safe harbour clause that you talked about then?
Yeah, well, the bill provides a safe harbour provision in it which makes it very clear that if you are what we call an online content host - so think your Trade Me message boards or the ISPs who provide the net service - we don’t want them to be criminalised in this, so we’ve created a very specific process that they can follow - they don’t have to, but they can follow - of receiving the complaint, giving notice to the person who put the message up, and if they follow through that process, then they can’t be held in any way liable for any action they take. That’s not to say, of course, that the content host can’t do what they would want to do anyway. So take it down immediately if they think it’s offensive or follow their own terms and conditions. But it just provides a very clear legislative process that if they follow that, there can be no question at all of any liability for them.
All right. I want to talk through some examples a bit later, but can you tell me whether the definition of harm - because this is all about - whether a tweet or something that’s put out in cyberspace harms someone - is the definition of harm going to be the same? To cause emotional distress?
Yeah, look, we’re not changing that. This is a piece of legislation that we have to balance very carefully, and the select committee, I think, did a very good job of testing exactly where that balance needs to sit between protecting people from unfair cyber bullying but not going so far as to be an unreasonable limit on freedom of speech and free expression.
But in terms of the harm, causing harm, isn’t that kind of loose, subjective language that’s open to interpretation? How do you measure that? Because one person’s horseplay is another person’s bullying.
Yeah, that’s true, but it’s not actually unusual in the law to talk about concepts like reasonableness and whether a reasonable person would be upset by it, so that’s something the law is used to dealing with. And, as I said, the vast majority of cases will be dealt with by the approved agency who can’t compel anyone to do anything but to try and mediate between people to remove content that’s inappropriate. And it only becomes a legal issue if it really is at the very serious end of the spectrum. And harm in that sense can go from significant emotional distress right through to people potentially wanting to take their own lives. We’ve certainly seen that in cases like Charlotte Dawson and the like.
All right. Well, to help us understand this, let’s use an example. So, there is a post on The Nation’s Facebook page that calls Steven Joyce a lazy bludger, and it accuses him of racism and discrimination. So, let’s say Mr Joyce rings the host of our webpage under this law and says that it’s caused him harm or emotional distress, what happens under your law?
Well, there’s a couple of things. First of all, a politician is probably not the best example to use because we have an expectation that in our jobs, we are subject to a lot more abuse and criticism than the average person, and our thresholds adjust to that.
But there’s nothing in the bill that specifies that, is there?
No, that’s exactly right.
Right, so let’s use this example then. He rings in and says it’s caused him harm.
Okay, so there’s three players involved here. There’s the person who put up the information. So, the poster of the information. There’s the host - as you say, the site it’s on - and the complainant. So, the complainant can go to the online host and lodge a complaint. Now, the host, at that point, can choose to go down the safe harbour provision, which is a 48-hour notice process - receiving the information and then dealing with it. Or the online content host can choose to follow their own processes, which may be taking it down straight away, as they do now. It may be doing nothing further. So, that’s completely up to the content host how they choose to work through that process. And that’s important, because we don’t want to compel hosts to take down information, because that would be a very unreasonable limit on the freedom of speech. So there is an option there for the content host, and what happens is the approved agency would work with the parties to see if it can be resolved amicably between them.
You talked about the safe harbour provisions, but a host could conceivably take it down to avoid grief of the matter going any further. So that in itself, as the default position, could be censorship.
Well, that’s the position now. Right now, if you don’t like something that’s up on Facebook about you, you can ring Facebook and ask them to take it down. And right now, they’ll make their own assessment about whether they will or won’t. This doesn’t change that. What it does say is that there is a specific process that the host can use if they want to. And if they want to, then they can’t be held criminally liable for their decision.
And if they use that specific process, they can contact, can’t they, the person who has produced that item that’s caused the offence.
That’s right.
And those people have, at the moment, 48 hours to respond. That’s correct?
That’s right. Absolutely.
And if they don’t, if they’re unable to contact those people or they don’t get a response, the default position, if they want to use this clause to protect themselves, is they’ve got to take the material down, don’t they?
If they want to follow through the safe harbour provisions, that’s exactly right. But, as I’ve said, that’s entirely optional for them, and they can choose to use or not use that process.
So they can use their own terms and conditions, and then they’re on their own in terms of liability if they opt for that. If they use safe harbour, they get no response within 48 hours or can’t contact the person who has produced that material, then in order to be protected, they must take it down.
That’s right. They need to work with the approved agency to do that, but we have to have a process whereby it’s not an out to simply not return a call. So, if you take a situation of revenge porn where someone’s put up very personal, very intimate photos or information about you on a website, you make a complaint under these provisions, and the online content host can’t get hold of anyone, you can’t have that remaining there for extended periods of time. We want there to be an adequate method of response, and I have every confidence that the approved agencies and the online content hosts will use that appropriately.
Okay. You talk about not wanting to suppress free speech, and Thomas Beagle from Tech Liberty has said this offence will criminalise all speech that causes harm, regardless of whether that speech has any other value. He’s right, isn’t he? It will criminalise-?
No, I don’t think he is at all. No, look, I don’t accept that at all. The criminal offence in the bill is for the very extreme end of harmful digital communications, and we have made sure that it is only where the person intends to cause significant harm to the victim and actual harm is caused. So you have to set out to want to hurt that person, and that person is in fact harmed.
Okay. Well, can we talk through a couple of examples, then. Jeremy Corbett and Paul Ego will come on this show shortly, and they will make someone the butt of their jokes. I think this week it’s Nick Smith. It goes up on our webpage. Now, what’s to stop someone asking for that to be taken down because they feel it’s harmed them or referring it to the agency or taking it to court? And then a lot of time and money is spent working out whether that’s a joke or it’s bullying.
So, you’re right. As I said before, people can complain, and people actually take cases to court under all sorts of provisions at the moment that there isn’t a proper basis for and they don’t proceed further. There’s always that balance between should they not be able to raise the issue, or should it be able to be raised and tested. I’m very comfortable that, actually, the terms under the legislation make it quite clear that having a joke at someone’s expense will not meet the threshold of causing the sort of harm that the bill anticipates.
Critics say that this is one of the types of cases that will get caught up in this bill and will waste time and money.
Well, I simply don’t agree with them. And, of course, if that is the case, then the law can always be looked at.
Can you say, Minister, that cartoons- the likes of cartoons, Minister. Can you say that those will be protected as well, like cartoons about Muhammad and Christ? And they cause a lot of people a lot of distress.
Look, it really- Whether there is a satirical drawing or some sort of other information is not the test; the test is the content of the information. So if the information in cartoon form or any other form created serious harm, incited racial hatred or the like, the same principles apply.
Well, for example, the Charlie Hebdo cartoons that obviously resulted in a number of lives being lost - sometimes the intent of cartoons is to stir, to create public debate.
Well, this bill isn’t going to criminalise anything that doesn’t set out to cause serious harm and serious harm is caused. And it will be for the court in each case to work out whether that test has been made. But what I can assure you is that that threshold is very high.
Will there be a defence of honestly held opinion or truth?
So, when you have offences that require- that prosecute people for intentionally causing serious harm to someone, I don’t think it’s okay to simply say, ‘Well, I think it’s reasonable,’ or ‘It’s my right to say it.’ The courts actually say if you’ve gone out to cause serious harm-
But what if it’s true? What if it’s true, Minister? What if the thing published in cyberspace is true?
Well, you have to be very careful, I think, Lisa. A lot of what we’re talking about will be true in so far as that someone might have taken a recording of you doing something incredibly intimate. Now, the fact that it happened doesn’t mean it’s not very damaging and inappropriate for that to be put up on the internet.
But what about let’s say- Let’s use an example of a politician who may be married and has campaigned on family values is then found to be having an affair, and that material is put up in a story in cyberspace. While it might be harmful to that individual, it is truthful. And the intent may be to undermine their political career, but it’s truthful, and it has some public value, perhaps, the story.
Look, it’s very difficult to start going through every single situation and trying to guess how a court will assess it. The point being that we need to have a system for dealing with people who actively set out to cause serious harm by inciting violence, by inciting hatred, by posting intimate information about people. That is not the same as investigative journalism, and I have no concern at all that genuine investigative information revealing things in the normal course of news media would be captured by this law. That is not the intent. I think the Law Commission and the select committee tested those things very carefully. And I think if we were to say, ‘Well, look, we’re just not going to take any action at all, and this harm being caused by harmful cyber bullying is not something we’re going to worry about,’ then I think we would be failing the public.
All right. Thank you very much. Justice Minister Amy Adams. Very interesting to talk to you this morning. Thank you.
You’re welcome.

Lisa Owen: Ursula Cheer is a law professor at Canterbury University. She says the bill is a risk, and she joins me live from our Wellington studio. Good morning, Professor.
Ursula Cheer: Good morning, Lisa.
We all want to stop bullying, but there have been some concerns raised about this Bill — some from you — that it may unintentionally undermine free speech. Are any of your concerns relieved by what you've heard there from the Minister?
Well, I guess I wouldn't be quite so sanguine and certain as the Minister about what the effects of the legislation will be. This is the next piece of legislation that may have quite serious impacts on speech in New Zealand. So, certainly, the aims are very worthy and I think worth pursuing, but just because other countries are doing things like this doesn't mean that they're all going to work out necessarily. In particular, we can't assume they're going to work well and not take in speech, and that there is no risk to speech that shouldn't be, perhaps, criminalised or covered by a civil regime. So, this is new legislation, and all around the world, countries are trying to deal with the issue of cyber harm, and they're all trying it without really knowing what the effects might be. So I think there are risks there, and they do need to be kept in mind both when the agency is being appointed to deal with the lower level of this regime, and then by all the individuals and civil servants and so on who will be asked to administer this scheme.
Because the reality is that free speech can cause harm.
That's absolutely right, and I think our Bill of Rights contains freedom of expression, and it has to be borne in mind wherever any legislation is passed and wherever it's administered. So I guess my first concern is about the agency that is supposed to run that low level part of the scheme where an individual who thinks they've been harmed can go and complain and then seek to have mediation, something of that kind, carried out for them. So we don't know who this agency's going to be yet, and although the Law Commission talked about NetSafe being the appropriate body, there's two things about that agency. First, they're going to have to be really well resourced because this is likely to be a regime I think that will be quite popular. Speech on the internet now is exploding so...
Professor, do you think that agency, whoever it turns out to be, could be inundated with, perhaps, nuisance complaints?
Well, not only nuisance complaints but just lots of complaints, and it's like the Privacy Commissioner has become very popular in that way. And your agency has to be resourced really well to deal with that, and you don't want to be like the Privacy Commissioner's office which has a backlog and needs more money. So that's the first real issue. But, of course, there will be, possibly, what will be seen as vexatious or frivolous complaints as well. And they have tried to deal with those, but it's always tricky to weed those out and work out whether you should deal with certain complaints. And that takes time to work those things out, and it takes resources.
Do you have any concerns about the media not being exempt from this law?
That's unusual for legislation of this kind, so what will happen is there will be compliance costs that may impact on media, and there might be the occasional complaint in relation to web pages. All the media outlets run web pages. So it's possible people will be offended by, perhaps, a religious cartoon or some other sort of coverage that impacts on somebody's privacy. Then the media have to get involved in the process if a complaint is made, and then if it, perhaps, looks as if it's upheld or there can't be a mediation around it and it goes on to the court, then there have to be methods there within the scheme to allow for the media position. They have tried to...
The Minister was very strong about saying there that you need to have intended harm, but if we look at a case this morning: we've done an interview about Colin Craig, and we intended to reveal some material that presumably could cause him and his family distress. Could that get us in trouble under that law?
OK, I think what the Minister was talking about there was the criminal offence, and that threshold has been set very high as to whether behaviour will be criminalised or not, so I think it is unlikely the media will be caught by that. There has to be the intention to cause harm, and the harm, including emotional harm, has to actually have been caused. So I think that's not so much an issue, but I think the civil regime, whereby if it goes to a court, a district court, there might be an order for a 'take down' or something like that. That's more likely to maybe impact on the media. They have tried to build in the public interest being taken into account, but it's not a defence. It's not a specific defence. It's just one of those things that will be weighed up along with everything else by a court. And in the meantime, the media has the cost and the time involved in dealing with that complaint. So the media may well be dragged into this process.
All right. Thank you so much for joining me this morning, Professor Ursula Cheer.
Thank you.
Transcript provided by Able. www.able.co.nz

ENDS

© Scoop Media
