Episode 25

Special edition: Australia’s Privacy Act reforms introduced to parliament

Cassie Findlay, Principal at elevenM Consulting, and Chris Brinkworth, Director at Civic Data, join Anthony and Kris for an analysis of the first tranche of reforms to Australia's Privacy Act. During their conversation, they discuss the new statutory tort and its many implications for Australian organizations, the Children's Online Privacy Code, the requirement for transparency about automated decision-making, and the steps every organization should be taking to prepare for the reforms in both tranche one and the upcoming tranche two.

Topics discussed:

  • What’s covered in tranche one and what’s been reserved for tranche two  
  • How the new reforms position Australia on a world scale of privacy protections  
  • Predictions on how the new tort may be used and when it will go into effect  
  • Where the ‘fair and reasonable’ test fits into all of it  

Transcript:

Anthony: Welcome to FILED, a monthly conversation with those at the convergence of data privacy, data security, data regulation, records, and governance. I'm Anthony Woodward, CEO of RecordPoint. And with me today is my co-host, Kris Brown, RecordPoint's VP of product management. Hey, how are you, Kris?

Kris: I am good, mate, how are you?  

Anthony: Yeah. Good. Good. Super looking forward to today's podcast. We have a different podcast than normal: it's a roundtable discussion. I have some fantastic guests assembled to look at the first reading of the amendments to the Australian Privacy Act that occurred yesterday.

Kris: And I'm going to go possibly a couple of firsts.  

So I think we've got our first repeat guest. And I also think we've got our first more than one guest. So could be a couple of  

Chris: I'm good, thank you. I think like everyone else, we've been desperately waiting for what we heard yesterday in regards to the reforms. So, I don't know about Cassie, but I'm kind of a bit numb from less sleep and lots of focus.

Anthony: Yes. And look, just to remind listeners, Chris, you're a director there at Civic Data and a member of the IAPP, and you can go back and listen to the episode Chris had with us six or seven episodes ago on FILED. It was a fantastic conversation. And as Chris has already let out of the bag, we also have Cassie Findlay, who is a principal at elevenM. I remember Cassie from many moons ago working in the records management space, but these days you're in privacy, cybersecurity, and data risk, and you do a bunch of consultancy here in Australia. And I know you were in San Francisco as well at one point.

Cassie: Thanks for having me. And yes, I've been back in Australia for four years now. I was a COVID refugee out of San Francisco, where I was head of information governance at the Gap group of companies, which was a fun gig; lots of discounts on the retail, which I enjoyed. And I joined elevenM not long after I got back, and I've been blending data, information, privacy, and cyber ever since.

Anthony: No, fantastic. It's really great to have you on Cassie and bring, you know, not just the Australian perspective, but some of those global perspectives as well.  

Kris: Congratulations folks. As I said to Anthony, sort of the first multi-guest, I think this is going to be really interesting to see how we handle this.

So I apologize if we get a bit wild and crazy, but Chris, let me start with yourself. Great to have you back on FILED. A little bit of a refresher for the group: maybe just a quick bit of background on your experience and how you came into that intersection of martech, technology, and privacy.

Chris: Sure. So I think I'm a little bit different to everyone else. I've got an op-ed coming out, actually, I think later today or tomorrow, where I explain that I've tracked and targeted hundreds of millions of people using billions of behavioral data points: cookies, IP addresses, everything I could over my 23 or 24 years working in digital media and technology.

So I come at privacy from a very different angle, where I understand the data flows of different pixels, different tags, different customer data platforms, anything that's used in a digital economy today for advertising, marketing, or measurement. I understand how that all works because I've done it.

So my background and interest in privacy comes from the fact that, clearly, there have been so many changes globally, changes Australia is very far behind on, that have already impacted how those technologies work. So you need to understand where privacy has already stopped certain technologies from working in the first place at a technical level, let alone a regulatory level.

But then a lot of those technologies need to be replaced by agencies, marketers, and so on, because you can no longer use a cookie to identify or target people in certain respects at a technical level. So businesses are starting to use more durable identities and other solutions, such as a hashed ID that they'll send to a social media network. And while an agency or a marketer might think that's a great idea, quite often it's done absent the knowledge of anyone from data governance or any privacy expert within the business. So the other part of that challenge is you need to understand: can you still use those new technologies from a regulatory perspective as well? So that's my interest.
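To make the mechanics concrete, here is a minimal sketch (TypeScript, for Node.js) of what sending a "hashed ID" usually involves. The normalization steps and the idea that the result goes to an ad platform's matching endpoint are illustrative assumptions; each platform documents its own rules.

```typescript
import { createHash } from "node:crypto";

// Normalize the address the way ad platforms' matching features commonly
// expect (trimmed, lowercased), then hash it with SHA-256. Exact
// normalization rules vary by platform; this is illustrative only.
function hashedEmailId(email: string): string {
  const normalized = email.trim().toLowerCase();
  return createHash("sha256").update(normalized).digest("hex");
}

// The same person always yields the same durable identifier, which is why
// the receiving network can match it, and why sending it is a disclosure
// of personal information rather than an anonymization step.
console.log(hashedEmailId(" Jane.Doe@Example.com "));
```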

Our core clients are very much those that have never been exposed to GDPR: big Australian businesses that deal with Australian data and haven't had to worry about anything like this before, but are now starting to realize there's a bunch of tools they're using across their websites that people like Carly Kind, the Privacy Commissioner, have been talking about a lot recently; how they're collecting data and sharing that data with various social media networks in ways the consumer really has no idea are happening.

So that's kind of where we focus: just understanding how those technologies could put businesses at risk. We don't get into the stuff that Cassie does, which is why it's great that Cassie is joining us. There's a bunch of areas where I know Cassie and elevenM are really, really good. We're very much the team that goes in and cleans up that technical aspect, but when it gets into everything else, I'm really glad Cassie's here.

Kris: What a great segue there. So Cassie, again, thank you so much for joining us. As Anthony's already mentioned, you're a principal there at elevenM, and it's a really cool, interesting career, bouncing around from the US. I'm sure my wife would have enjoyed said discounts as well; we spent way too much money at Gap when living there ourselves. Now that you're engaged in this privacy sector, what are the challenges that you're helping clients with today?

Cassie: I definitely learned a lot when I was at GAP, not only about the joys of shopping at Banana Republic and GAP, but also they were going through that GDPR and CCPA journey when I was there.

And really it was a very intense period of getting systems, practices, and processes up to speed with those laws. And since I've been back in Australia, probably two areas have really stood out more recently as things that clients are looking to us for. Number one is understanding collections, uses, and disclosures of personal information across their business, at quite a granular level, and doing that in a way that is about the business purpose of the collection, the use, and the disclosure of information.

So you're not coming at it from a technical perspective, you're not scanning repositories for categories of PI; you're actually going in there and talking to teams about why do you collect this information, what do you use it for, what sorts of volumes are you handling per year, and all these sorts of questions, which help you to articulate a risk level across all of those processes and give you a really rich understanding of the data handling landscape that you're operating in.
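For readers who like a concrete shape, here is one hypothetical way such a data-handling map entry could be recorded. None of these field names come from elevenM's methodology; they simply illustrate the kind of business-purpose detail described above.

```typescript
// One hypothetical entry in a data-handling map, built by interviewing teams
// about business purpose rather than by scanning repositories for PI.
interface ProcessingActivity {
  team: string;
  purpose: string;                      // why the information is collected
  piCategories: string[];               // what personal information is involved
  uses: string[];                       // what it is used for
  disclosures: string[];                // who it is shared with
  annualVolume: number;                 // rough records handled per year
  riskLevel: "low" | "medium" | "high"; // articulated per process
}

const example: ProcessingActivity = {
  team: "Customer Support",
  purpose: "Resolve billing enquiries",
  piCategories: ["name", "contact details", "billing history"],
  uses: ["identity verification", "case notes"],
  disclosures: ["outsourced call centre"],
  annualVolume: 120_000,
  riskLevel: "medium",
};

console.log(example);
```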

So that's one area where we do a lot of work with clients, both government and large corporate. And this is endlessly entertaining to me as somebody who was a record keeping person for a very, very long time (still am): having corporate clients come to us and say, 'Oh, we need to do something about retention of personal information and we need to dispose of stuff.'

And for many of them this is a completely foreign concept, this idea that you might actually establish rules and do things like routine disposal, deletion, and de-identification. And so for those pieces of work, I'm drawing on my record keeping background, but I'm also having to adapt certain principles into these data rich environments and make them sensible and practical for the businesses that are using us.

So that's really rewarding, both from my long-standing professional interests, but also because it's really fun to see the penny drop and to see somebody realize that just because it's digital, it doesn't mean you have to keep it forever. And in fact, there are techniques and approaches to making sure you keep your house in order on that front.
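As a sketch of what "establishing rules" can look like in practice, here is a hypothetical retention schedule entry and a due-date check. The categories, periods, and triggers are invented for illustration, not advice.

```typescript
// A hypothetical retention rule: a category of personal information, the
// trigger event the clock runs from, and what happens when time is up.
type DisposalAction = "delete" | "de-identify";

interface RetentionRule {
  category: string;    // e.g. "customer contact details"
  trigger: string;     // e.g. "account closure"
  retainYears: number; // period after the trigger event
  action: DisposalAction;
}

const rules: RetentionRule[] = [
  { category: "customer contact details", trigger: "account closure", retainYears: 7, action: "delete" },
  { category: "support call recordings", trigger: "call date", retainYears: 2, action: "de-identify" },
];

// A routine disposal job would test each record's trigger date against its rule.
function isDue(triggerDate: Date, rule: RetentionRule, now = new Date()): boolean {
  const due = new Date(triggerDate);
  due.setFullYear(due.getFullYear() + rule.retainYears);
  return now >= due;
}

console.log(isDue(new Date("2015-06-30"), rules[0])); // true: more than 7 years ago
```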

Kris: Hallelujah.  

Cassie: I thought you'd like that as RecordPoint people, I thought you'd enjoy it. But yeah, it's really been quite noticeable, and I will say the link to the large data breaches is clear. So there are leadership teams coming to us and saying, look, we don't want to be the next Optus, we don't want to be the next Medibank. And look, I'll take it, but it's not just the fear factor that should be driving them. It is truly about minimizing the amount of PII that you're holding about your customers so that you can maintain that trust, which is so essential today: the trust between the customer and the business.

Anthony: No, absolutely, and thank you. That's certainly very much in line with what we talk about here on FILED: that convergence of these worlds, where there is this really strong interrelationship. Which probably brings us on to the core purpose of the conversation today: the new amendments to the Privacy Act and the review that's been ongoing for some time. And I think, Chris and Cassie, you both contributed to the various waves of feedback that the government asked for, and you've been analyzing the changes that were announced yesterday in the first reading of the bill. It'd be great, probably starting with you, Cassie, to break down what we saw in those amendments, and what has Mark Dreyfus put forward based on what we were expecting?

Cassie: For sure. So I was talking to a colleague about this yesterday and he called it the Claytons reforms: the reforms you have when you're not really having reforms. But I think that was a little harsh, and also possibly a bit aging for those of us that remember those ads. There are some really important items on the agenda that Dreyfus introduced.

So, for example, the tort for serious invasions of privacy, which brings in the possibility for individuals to seek redress for harms as a result of really serious intentional or reckless breaches of their privacy or misuse of their information.

And that covers both intruding upon their seclusion, physically into their space, or indeed the more online versions of that, around cyberstalking. And it extends to individuals and companies misusing information in a way that is reckless and brings harms. That's something that's been on the cards for a long time, so I don't think it was a big shock to anyone that that one came through.

The Children's Online Privacy Code as well is a welcome announcement. This is something we've seen in other jurisdictions around the world: putting in place some requirements, in addition to the baseline requirements of the Privacy Act, that will address certain ways of designing, for example, the educational technologies that kids are being presented with in schools, and put a bit more of the onus onto the developers and providers of those technologies, rather than the poor schools and parents who are trying to navigate this stuff and work out: is my child's data safe being ingested into this tool, being used by this tool? That code is going to be a bit of a way off, but it's a very welcome introduction.

Interestingly, one of the only reforms that is actually going to be reflected in organisations' privacy programs in a very specific way is the requirement to be transparent about automated decision making.

For that one, the requirement will be that you describe in your privacy policy if and how you are using automated decision making. And of course that one's in response in part to things like the robodebt case. And again, that trust concept: the first step to being upfront with consumers and users of your services is to make sure they understand what you're doing with their data.

And automated decision making is a particularly sensitive one, particularly where it has the potential to result in real harms around someone's health or their credit rating or whatever it may be, so that's certainly very good. There is also a range of reforms that relate to the OAIC and its powers; the stick has gotten a bit longer and has more pointy-out bits, I don't know, something like that.

There are some additional levels of civil penalties that would apply not to the super egregious misuses of information or privacy issues, but to the more mid-range ones, plus monitoring and investigation powers for the OAIC. And they're getting a bit more money, but mainly to do with the development of the Children's Code, I believe.

So it'll be interesting to see if they do get support and more funds and resources for all of these powers that have now been introduced. In the interest of time I won't go through everything, but there are a few other bits and bobs around overseas disclosures being more transparent, and the Attorney-General sharing information about data breaches as well.

But that's my quick gallop through some of the main ones.

Kris: And I think a mace is the club with the sticky-outy bits. So maybe they're going from a club to a mace. There's my...

Cassie: Or a cat-o'-nine-tails.

Kris: Yeah, there you go. It's just a little bit stickier, a little bit owier. I like the analogy. I might push this one over to Chris then.

One aspect I actually would really like to focus on, Chris, and obviously either you or Cassie can answer this, is that new statutory tort, especially around serious invasions of privacy. How do you think this raises risk for businesses when it comes to data breaches and ransomware, and what are the must-dos? You said earlier that you're at the edge, cleaning up as these sorts of things come in; what are the must-dos for businesses that are worried about that privacy and security posture?

Chris: Coming back to the big stick thing, the big mace, whatever it is with pointy bits: I was actually writing about this the other day. There's both a carrot and a stick here, right? I think the carrot is the nice part, where it's a great opportunity for businesses to suddenly realize they do need to tidy up their act now.

But it is a very, very big stick, this tort, and it brings us very much in line with many jurisdictions across the world where we've seen class action lawsuits. How long has the tort been in discussion now? It must be at least 10 to 12 years, Cassie, I think. The tort's been coming for a very, very long time.

What it means, I'm not going to lie: if we look at the way our business has operated for the past three years, and I don't know about Cassie's business, but from a privacy perspective, it was only those people who really wanted to get on the front foot that had been using our service to tidy up their stack and everything else.

Then when it was announced in May that the reforms would come in August, a lot of people just said, let's wait and see what happens first. And then when this tort became known, there were suddenly some very, very serious phone calls and emails coming through going, right, we get it now. We understand.

I'll break that down. If you look at the type of class action lawsuits that are happening, there was one the other day in the AFR based on the VPPA, which is an antiquated old law from the US around videos and personal data. And there's a class action lawsuit involving the NRL, the AFL, and Foxtel that was mentioned, just because they have a Meta pixel on their website.

And there are so many related examples of those types of cases. Because if you think about the way these technologies work on a daily basis, most people on this call would think of a breach as 100,000 records being stolen in X case, or someone got hacked, or someone leaked this information.

But if you think about how many people visit a large website or a large app on a daily basis, you can think of 100,000 ants walking away with your data, slowly but surely. And that in itself is a breach, because over a number of months it could be a considerable amount of data: personal information and other data that you didn't cover in your privacy policy, that you were disclosing to certain parties without realizing it. As an example, if I have a pharmacy-based website and the URL structure shows that I am looking at certain health-related conditions, and that is going to third-party vendors as well, such as a Meta, it becomes quite an interesting challenge.
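A rough sketch of the mechanism being described: the vendor endpoint, parameter names, and pixel ID below are invented, but the shape (a third-party request carrying the full page URL) is how most tracking pixels report a page view.

```typescript
// A hypothetical page URL on a pharmacy site. Tracking pixels typically send
// the full page URL (document.location) to the vendor as a query parameter,
// so a health condition encoded in the path travels with every page view.
const pageUrl =
  "https://pharmacy.example.com/conditions/diabetes/insulin-prices";

// "tracker.example.net", the parameter names, and the pixel ID are all
// made up for illustration; real pixels differ in detail but not in shape.
const pixelRequest = new URL("https://tracker.example.net/tr");
pixelRequest.searchParams.set("id", "PIXEL-ID-123");
pixelRequest.searchParams.set("ev", "PageView");
pixelRequest.searchParams.set("dl", pageUrl); // the leaking field

console.log(pixelRequest.toString());
```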

It becomes quite an interesting challenge. So, so this tort it's actually creating more angst in the clients that I've been speaking to than the basic targeting principles and advertising principles a lot of the advertising industry are woeful about not seeing. Because it's a serious matter if you are suddenly realizing the amount of data that you've been sending left, right and center through these different technologies that could put you at risk.

Now that tort, as far as I'm aware, and correct me if I'm wrong on this, Cassie, I actually see becoming a problem within a six-month period too. When we talk about the ADM stuff shortly, there's about a 24-month readiness period for that. But the tort itself could bite within a six-month period.

Look at some examples of where you may already be in breach but not aware of it: if, two or three seconds after the tort becomes live, someone at a class action law firm, because they've already done their research over the six months previously, gets screen grabs of the evidence...

That's going to be a major problem. So there's been a bunch of people reaching out about that tort aspect. But coming back to the other piece, the infringement notices that Cassie mentioned: I found that really interesting from the perspective, again, of my world. If you look at the fact that you're collecting data, disclosing data, and using data in certain ways in your web-based or digital environments, but that doesn't match up to your privacy policy, your disclosure agreements, and everything else, those infringements have actually become a lot easier based on this change.

It's much easier for an Information Commissioner to pursue based on these new proposed laws as well. So it's going to be rapid and fast, and I saw a number the other day of up to $300,000 for one of those infringements. These are really considerable numbers that I think people haven't quite grasped yet: if they don't understand the data flow, the customer data lifecycle, where data is coming from, how they're using it, how it's covered in their privacy policy and so on, both the tort and these infringements alone could put a lot of pressure on businesses.

Anthony: Just to drill in a little bit: are you drawing on the experience in the UK there, particularly what we saw in Campbell v Mirror Group, the human rights elements and the tortious claims that exploded straight after that? Is that the comparison you're looking to draw, Chris, or are you seeing other comparisons?

Chris: I'm looking at wording. Look, please, everyone remember I'm not a lawyer and I've come at it from a very different world, but "is it reasonable" is the question here: if I go to a website, is it reasonable that that website is sharing information about my browsing habits, what I'm doing, what I'm buying, the sensitive information I'm looking into?

Is it reasonable that they're sharing that with disclosure parties they've not told me about? Is that reasonable? And again, I pay a lot of attention to the op-eds and podcasts that Carly Kind, the Privacy Commissioner, has been doing recently, and her wording around this entire area of pixels, tags, and everything else is very similar: it's very unreasonable that this is happening, and it's unreasonable that health information, sensitive information, et cetera, is being disclosed.

So that's what I actually look at and go: there's an angle there that could be used quite easily, on those millions of ants walking away with data. If you're sharing something about your health with a social media network... if I go to a health-based website, a hospital website, a government website, or whatever it is, is it fair that that's being shared with a social media network to add to their vast graph about me?

That's what I'm looking at as interesting in that tort piece.  

Anthony: To be fair to what Carly Kind and others have been talking about, and listening to the conversation she had this morning, it really is about serious violations. So you would need to be able to prove that under the new legislation, should it go forward; under the bill, more correctly.

It isn't just the showing of the pixel. Clearly it is the distress of actually losing an asset through that violation, resulting in loss of reputation or other harm to that party. So I think there's a lot to unpack around that language, because what's in the new bill doesn't give a lot of clarity to that direction just yet. It's very open to the courts' interpretation.

Cassie: Carly Kind and other commentators, Katherine Kemp, whom I was reading in The Conversation as well, have made the point that one of the things that has been deferred to the second tranche is the fair and reasonable test. And that goes to exactly what you're talking about, Chris, which is that under the current arrangements, you need to collect personal information for a purpose that is appropriate and relevant.

And you have a certain degree of control over that process, but ultimately, after that happens, your uses and disclosures are more of a matter for you. What the fair and reasonable test would be doing is establishing that no, no, no, all of those processes need to be measured against this test.

And of course, what is fair and reasonable will depend on the context of what your business is, given the particular thing that you do. So various of the people that I just mentioned have said, well, it's a shame really that that hasn't come with this tranche. That's not to say, though, that businesses can't get on with doing the right thing now.

And in fact, what my colleagues Melanie Marks and Jordan Wilson-Otto said in their blog yesterday on the release of the bill was that we need to stop waiting around. A lot of companies, a lot of our clients, have been saying, look, we're just going to wait and see what happens with the reforms. And now we've got this additional wait, and of course there's an election in between.

In today's digital economy and in the risk environment that we're in, you can't keep waiting. You've really got to get on top of these things, so understanding what your uses and disclosures, as well as your collections, are in light of a fair and reasonable test is just a good news story regardless of who you are.

Anthony: I think you brought up an interesting point that I'd like to just touch on. This is the first tranche; we've seen the OAIC and others acknowledge it is the first tranche. And so we are expecting more, aren't we? Obviously elections and other things may get in the way of these processes.

At the end of the day, there is more to come here, isn't there?  

Cassie: Yeah, I mean, they've kept some of the really impactful stuff for that second tranche, and I'm talking about the fair and reasonable test, the employee records exemption being changed, and the small business exemption potentially changing. And for all of us living in the world, we would like to know that the real estate agents we're dealing with, for example (not to point the finger too much at them), and those small to medium businesses, are actually required to exercise due caution with our personal information. That would be good.

And then there's also the whole area of the right to erasure, which I did a lot of work on when I was in the US. Depending on how big your company and your consumer base are, that can be quite a substantial piece of work. But yeah, there's a lot still in that second tranche, which has been foreshadowed.

But I guess we'll just have to wait and see. Again, I'll just circle back to my point that now is the time to get on with uplifting your information handling processes, regardless.

Anthony: It was really interesting, or ironic, and maybe this was just an Anthony Woodward issue, but I'm not sure if you followed: the day prior, the government put out the new draft of the AML/CTF amendment bill, which is primarily targeted at real estate agents and small-office solicitors, looking at money being transferred for terrorism and other money laundering activities.

Yet we didn't see them having to apply some enforcement over the core piece of data that enables all that, from a privacy perspective.

Cassie: It seems like a mismatch, doesn't it?  

Anthony: Yeah.  

Cassie: A missed opportunity for sure. Just to say, it's been interesting to see the flavor of the announcements this week: a proper focus on children's privacy, children and social media.

That's an interesting one; I want to see how it gets operationalized. But anyway, of course the other part of the announcements yesterday was the anti-doxxing provisions as well, so clearly they're very alive to the perils and risks of the cyber world and living online as we all do. But yeah, there's definitely more to come.

Anthony: Probably one for you, Chris, to quickly spring to. Of the things we didn't see in the legislation that were expected in the bill, what really stood out for me was the consent elements. The bill currently sits in the same place it did with implied consent, which has some really large ramifications as we flow that down.

Are you seeing any conversation in that area around the lack of further definition that's really required to make consent less ambiguous?  

Chris: I'm still licking my wounds on the "reasonable" conversation, because I'm still looking at it with this question in my mind: if the term is a reasonable expectation of privacy, and you go to a news website or any type of website, and the data you're giving is being used to create behavioral profiles about you and used very much elsewhere, I still strongly think that will become an issue.

I've put this out to a couple of lawyer friends as well and they're mulling it, because I'm really interested in how that plays out; it has played out that way overseas. There are a number of complaints around building identities and behavioral profiles based on, I guess, derivative products of other people's data.

But that then plays into this conversation around the consent aspect as well. So yes, there's a lot of angst around what consent will be in the future. We haven't got that visibility into it, but I think even these infringement notices now become interesting. Again, if you look at the ACCC report... did we discuss the ACCC stuff last time?

I can't remember, but the ACCC data broker report, which Katherine Kemp, whom Cassie mentioned earlier on, Anna Johnston, and others put a lot of thought and response into as well, highlighted where data is currently being used beyond the consent of the notice that was given.

It's being commercialized in ways that consent wasn't given for. It's being disclosed to parties that consent wasn't given for. So that comes back to this infringement piece, that big stick with the pointy bits on it; from an infringement perspective, I think even that's going to come under it.

And that also solves for that kind of angst around consent: you need to understand this stuff anyway. Within that infringement aspect, I think there's also something specifically around the ability to opt out. And again, how do you do that in a digital environment? Let me come back to my reasonable expectation.

If your data is being used by another business you had no idea had it, and your identity is part of this kind of derivative product, how do you opt out of that? How do you get rid of that? How do you even change that preference? So those are the things that are in my mind in regards to consent.

I don't know if I've given you the answer you wanted, but that's what's in my mind when I think about this stuff.

Kris: And I think, Chris, probably the unique thing for yourself here, being on that marketing or martech side, and I know we spoke earlier about the MI3 piece that covered how marketers are in limbo.

What are you hoping will change? Obviously the next round of changes is coming at some point; we didn't get it this time. What should happen there for those marketers? You've touched a little on the "I'm worried" aspect, like, I'm worried about that fair and reasonable piece, but what would you hope comes out of that MI3 piece, where we're specifically saying the marketers have been left to wait?

Chris: Let's break this down into different groups: anyone that makes money from data. My job is not to represent, or hope for, whatever version of consent works best for someone to make money from that data, because obviously a lot of companies would probably prefer to keep very lax consent-based rules.

They can get away with anything. There'll be a bunch of people in the advertising industry who would like things to remain the same so they can keep doing things where they see the benefit as far outweighing the costs associated with the risk. So there's that aspect, but the problem is a lot of people just don't know what to do, because there's no clear, set legislation.

There have been conversations around it, but without clear, set legislation you can't actually aim towards a certain goal. So really, I think what everyone wants is just some kind of idea of where to focus. It's a bit like this damn cookie thing that's been going on for ages: is Google going to deprecate cookies, or not?

It just went on for years and years, and people just wanted to know what the hell to get on with. It's the same thing: if you don't know what to aim for, it's hard to balance. So that's what I would say is the challenge for people in the marketing industry, from a publisher perspective, agency perspective, advertiser perspective.

They just want to know what they can actually do that's not going to get them in trouble, but that's not really clearly defined in the new legislation. I also think that with these new areas of litigation, from the infringement perspective and everything else, people are going to have to start looking at what they're doing right now anyway, just to get on top of it.

Kris: So let me dig into that just a little bit more. We know that the legislation that's been tabled doesn't give you what you need in terms of clear direction, but they're going to have to do something anyway. What should they be doing? Because as Cassie said a moment ago, the time is now.

You just can't keep waiting; this other stuff's going to come. What should they do? You've alluded to it, they need to start looking, but what should they really do?

Chris: I'm on another panel next week, I think it's next week or the week after, and I know I'm going to use the same thing I always do, where someone says exactly what you're saying: so what should people do to prepare?

And I ask everyone in the room: put your hand up if you use any type of customer data to target people, to measure them and so on, such as email addresses or Facebook marketing. Everyone puts their hand up, and then I say, keep your hand up if you can tell me the name of your privacy leader. And then there are probably only two hands left up out of three or four hundred people.

To compound that: in regards to what you need to do, Kris, I had another meeting yesterday where they said, can you come in? We don't want to bring the legal people in just yet, because we want to tidy up what we're doing first before we let the legal people know. The key thing that people need to do, Kris, is just truly understand, or let the legal and privacy experts know, what they've been doing all this time.

And unfortunately, a lot of what you've seen in, for example, the ACCC report from the APP 3.6 aspect is that people just don't involve their legal team or their general counsel in a lot of these discussions. They just get willy-nilly with a bunch of stuff. Again, that's in my world, right?

In the customer experience layer; I'm not talking about your world of employee records and everything else. But again, when you talk about the amount of data leaving these websites on a daily basis, death by a thousand paper cuts, or marching ants carrying data away, it's a considerable amount of data.

Kris: It's interesting, because the type of data governance we're talking about in our world at RecordPoint day to day, this is a different type of data. The way I've read what they're trying to achieve, they are focusing on that piece, right? It is about everything else, the things that we don't really know about.

We want to understand more; people should be more responsible; no time like the present. We can do a bunch of cliches here, Cassie. I do want to push on a little bit, though: the bill itself starts the process, and you drew on this in your earlier introduction, for that Children's Online Privacy Code.

I'm wondering if you have some thoughts on how those businesses who do deal with minors' data might handle it, and what they should be doing to prepare.

Cassie: First of all, the question is: who is the target of this code? Who will need to be following it? It's not likely to be just any sort of business or website that children may find themselves on, but rather products, services, online websites, and applications that are built for children.

And so they need to be cognizant of the fact that this code is coming. Actually, as a group at elevenM we have occasional, actually weekly, whitepaper sessions where we dig into topics, and happily my wonderful colleague Brett did a whitepaper last week on the Children's Online Privacy Code.

And he looked at what's happening elsewhere around the world, such as in the UK. The concept of the best interests of the child is something that comes up again and again in these types of codes and arrangements. And that is obviously going to be very contextual as to what it is that you're doing with the child's data.

But hopefully there will be guidance that comes through along with the code about how you make that assessment. My colleague then looked at some of the other provisions we're likely to see in the code and grouped them. There's privacy best practice: data minimization, not collecting what you don't need to, transparency about what you're doing, not enticing children with fun games and toys in exchange for giving more data, that slightly dark-pattern-y sort of thing that might creep into some of these apps.

There's also building technologies where certain things are just off by default. If they are directed towards children, don't have geolocation on by default, for example. Don't do the kind of profiling, Chris, that you were talking about, where you're sending hashed identifiers off to a broker so that you can then target that individual as they go about their lives. Regardless of what the Privacy Act says, just don't do it.

It's creepy, and I'm sure with the Children's Code coming in, it will be against that code. And then there are other ways of dealing with this, around parental controls and avoiding what are called nudge techniques, which encourage the child to play more and get more rewards, and really use some of that psychology, which we see for adults as well, of being really glued to the screen.

So there'll be a lot of detail that obviously needs to come out, but having a look at some of those other codes, like the UK's Age Appropriate Design Code, for example, would be a good place to start.
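As a small illustration of the "off by default" principle, a hypothetical settings object for a child-directed app might start from defaults like these. The flag names are invented; the eventual code will spell out its own requirements.

```typescript
// Hypothetical privacy defaults for a child-directed app, following the
// "off by default" posture of codes like the UK's Age Appropriate Design Code.
const childDefaults = {
  geolocation: false,        // no location tracking unless actively enabled
  personalizedAds: false,    // no behavioral profiling for advertising
  thirdPartySharing: false,  // no hashed IDs sent off to brokers or networks
  nudgeRewards: false,       // no streaks or rewards engineered to extend play
  parentalControls: true,    // on from first run
} as const;

console.log(childDefaults);
```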

Anthony: A super interesting space that's going to have to build out. Tangentially linked to that, one of the things we at RecordPoint were shocked not to see a lot of detail on, and I do hope it's coming in further tranches, is the right to be forgotten.

Now, obviously there's something of it, to some extent, in Australian Privacy Principles 12 and 13, I think, but we've really not seen a strong application of it in any form. And certainly I've posted on our website and written some blogs looking at that and at other jurisdictions that we operate in.

You know, even India has some really strong fundamental rights in that area. I was also living in the US up until COVID, much like yourself, Cassie, so I was involved in Washington State's privacy piece and their right to be forgotten, and you're seeing that happen in other US states. Where do you think that goes?

It's a really fundamental piece of this area, I think, and we're still a little gray on it here in Australia.

Cassie: We have rights of access and correction, and some businesses will voluntarily delete your data if you send a request through, but you don't currently have a right to have that deletion done. When it is in place, it can be the avenue of last resort for customers who are dealing with a company that hasn't got its house in order when it comes to things like marketing consents. So if you've tried to opt out of marketing multiple times, you've been on the call, you've rung up, then in those circumstances you have the right to say: delete every last bit of information that you hold about me.

Hopefully, in most cases, it wouldn't have to come to that. Then there's also the use of it in the context of social media sites and online providers and that sort of thing, where you've got an interest in not having that profile available. And that's related to the right to be forgotten online set of laws that came out in Europe a few years ago.

So yeah, it's important, and I think it should hopefully come through in the second tranche. It does present technical problems, and as records people and information people you would know this: to recite a meme, one does not simply delete customer data out of a database without potentially wrecking said database.

Anthony: Having spoken to lots of records managers in the last 12 months, I think those technical challenges have diminished considerably. So they're less of a discussion than they used to be.  

Cassie: This links to another thing that I hope is going to come through in the second tranche, which is the promised further help with de-identification techniques for businesses. Many of our clients are trying to implement de-identification for categories of personal information in all sorts of repositories, and it's a learning journey. Some are quite sophisticated and understand that de-identification is not just pulling a few identifiers out of a set of data, but involves a whole ecosystem that sits around it; but many are still very much at the beginning of that journey.

So I'm hoping that that is part of what comes next as well.
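A toy illustration of the trap just mentioned, using a made-up record shape: stripping the direct identifiers still leaves quasi-identifiers that can be linked back to a person, which is why the surrounding ecosystem of controls matters as much as the field deletion itself.

```typescript
// A made-up customer record: direct identifiers (name, email) plus
// quasi-identifiers (postcode, birth year) that can still single someone out.
interface CustomerRecord {
  name: string;
  email: string;
  postcode: string;
  birthYear: number;
  purchases: string[];
}

// The naive approach: strip the obvious identifiers and call it done.
function naiveDeidentify(record: CustomerRecord) {
  const { name, email, ...rest } = record; // drops direct identifiers only
  return rest; // postcode + birth year + purchase history remain linkable
}

console.log(
  naiveDeidentify({
    name: "Jane Doe",
    email: "jane@example.com",
    postcode: "2000",
    birthYear: 1984,
    purchases: ["prenatal vitamins"],
  })
);
```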

Anthony: Yeah, fantastic.  

Chris: So, take anyone that has a tick box on their website. When I'm signing up to that website, to a service or whatever it is, the general counsel or the head of privacy or whoever it is, and the data governance team, have all agreed that everything in the data disclosure terms, the privacy policy, and everything else matches up to that box being ticked in regards to data being collected. Say I sign up but I don't tick that box, and I get that lovely red notice saying, please tick the box, because all of our team has said you have to have that box ticked before we keep hold of that data.

Now, if you've sent a copy of an email address externally to a third-party vendor, such as a Google or a social media network, through these pieces of technology that I've been talking about, and you've not kept a copy yourself because you didn't have the box ticked, how do you allow for that data to be erased from where you sent it? That's one of the challenges we're starting to identify with a bunch of the clients whose setups we review.

They actually facilitated the ability for that data to leave their control, but they're in no position to identify where it's gone, because by their own rules they didn't keep it internally either. So there's a big mismatch there. So when you start to look at this right to erasure aspect, that piece is fascinating to me.
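One way to picture a fix for that mismatch is to gate the vendor call itself on the recorded consent. Everything in this sketch is hypothetical naming; it just shows the shape of the check.

```typescript
// A hypothetical consent record captured at sign-up.
interface ConsentState {
  marketingOptIn: boolean; // true only if the user actually ticked the box
}

// Gate every third-party call on the recorded consent before it fires.
// If tags fire unconditionally, data leaves your control even while your
// own systems, honouring the unticked box, keep no copy: the mismatch above.
function maybeShareWithVendor(consent: ConsentState, send: () => void): void {
  if (!consent.marketingOptIn) {
    return; // no consent: nothing sent, nothing to chase for erasure later
  }
  send();
}

// Usage: the vendor call only ever runs for opted-in users.
maybeShareWithVendor({ marketingOptIn: false }, () => {
  console.log("would send hashed email to vendor");
});
```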

Anthony: Isn't that really where the pub test applies to that? Because there are those scenarios. Isn't that the point of the pub test?

Chris: I'd say it comes back to the reasonable expectation of privacy. That's that part. So yeah, that's the piece around the right to erasure. And that's the same with just updating your consent preferences as well.

How do you update consent if you don't know what consent's been given? And also the journalist exemption, I don't know if we touched on that earlier, I think we did: the aspect where an accredited journalist can use someone's name and information. That is in this particular tranche, I believe.

If someone has a right to erasure, does that mean you can delete that in the future from websites, as an example, where a search engine has picked it up and so on? So that stuff's interesting to me.

Anthony: And look, really interesting questions that we'll need the next sets of tranches, or at least the further thinking, to answer.

As we close out, it'd be good to understand what you think is going to be difficult to operate under in this existing bill. Are there any gotchas in there that people should start thinking about? We've talked about how to be prepared.

What do you think is the trap for young players?

Chris: I don't think it's the young players that have got the issue. I think it's the people that just don't really understand it. One of the things we haven't discussed properly is the automated decision making aspect. I think that's a fascinating play.

There's really 24 months for people to prepare for that. Say, as an example, I'm a business that does food delivery, and my entire business model is based on understanding the distance to a consumer, where they live, what's around them in regards to other options from a pricing perspective, and I'm using all of that to get the best possible price, best possible margin, and so on.

That in itself would be fascinating to prepare for, and 24 months would seem like nothing if your entire stack is built on everything being automated. I think that's really interesting. Same with any bank or anyone that routes a user. So if I'm a customer, I go to a bank, and you decide that actually I'm not worth your cash, you're not going to make much money from me.

So you'll just route me to this particular customer service agent instead. You've not been transparent about how you're doing that; are you in some way discriminating, and so on? So that whole automation thing is a really interesting one to get on top of now, because 24 months seems a long way away, but it can have quite an impact if you're not transparent about how you're doing it.

Now, I really dived into Article 22 from a GDPR perspective, which is also worth looking at on automated decision making, and this doesn't go as far as that does. But I do believe more information is required about defining what automated decision making is in this legislation.

That is not very well defined.  
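The transparency requirement itself only obliges a description in the privacy policy, but meeting it implies knowing where automated decisions happen. A hypothetical internal register, nothing here prescribed by the bill, might look like this:

```typescript
// A hypothetical register of automated decisions, of the sort a privacy
// team might maintain in order to write the disclosure into the policy.
interface AdmRegisterEntry {
  system: string;             // internal system name
  decision: string;           // what is decided automatically
  personalInfoUsed: string[]; // categories of PI feeding the decision
  significantEffect: boolean; // pricing, credit, routing, eligibility...
}

const admRegister: AdmRegisterEntry[] = [
  {
    system: "delivery-pricing-engine",
    decision: "Sets the delivery fee offered to each customer",
    personalInfoUsed: ["delivery address", "order history"],
    significantEffect: true,
  },
  {
    system: "support-routing",
    decision: "Routes customers to service tiers by predicted value",
    personalInfoUsed: ["account balance", "product holdings"],
    significantEffect: true,
  },
];

// Entries with a significant effect are the ones the privacy policy
// would need to describe under the transparency requirement.
const disclosable = admRegister.filter((e) => e.significantEffect);
console.log(disclosable.map((e) => e.decision));
```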

Cassie: Traps for young players. I think one area that privacy managers and teams might want to look at is their complaints handling process, in light of the tort. On one hand, there are very good reasons for people to bring a case, because they've had really substantial harms as a result of reckless or intentionally malicious handling of information by a company or an individual.

But where it's companies, in some instances, and this is borne out in other examples where people have a direct right of action, you do get the vexatious and the litigious coming after you. So this is a very specific example, but I think it means having a very tightly managed complaints pipeline, having the right people triaging those complaints, and then, where there are matters that are probably not in scope for the tort but that people are complaining about, getting onto them quickly. Just get your house in order in that regard so that you don't get dragged into things you don't want to be dragged into.

Yeah, I'd call that one out.

Kris: Crystal ball time. Let's do the rubber-hits-the-road bit. The reforms are going to come. What's the prediction? Ignore elections and the next set of tranches and whatever else. What's your one big prediction for how the business world's going to respond here?

Chris: To these tranches or future tranches? Sorry, because 2025 is when the next set of tranches would even happen. Are you talking about these ones?

Kris: Yeah, I'm giving you a crystal ball. You can look five years from now. Where are we? What's happened and how did they respond?  

Chris: I really do feel that. I mean, credit where credit's due. Yes, a lot of people think the government may not have done an amazing job on this and they've dropped a bunch of things, but trying to push everything through wholesale right now, before an election, I don't think would have been a great idea anyway.

Yet this combination of automated decision making, the tort, and infringements has, from what I'm seeing, put a lot of people on edge, going, 'right, we do need to get on top of this.' And I think that alone will help to clean up a bunch of stuff that's been happening, which will prepare people for the other stuff anyway.

I actually think that in five years' time we'll be very much in line; well, maybe we'll be in a position more like the CCPA rather than the CPRA, but we'll definitely be more in line for sure. But hey, in five years' time we'll probably all be owned by the AI people anyway. So we'll be the servants.

Kris: That's a conversation for another whole day.

And Cassie, let me park the same question with you; again, pick any time frame. But I do like that comment, Chris. Really, we've got to start somewhere, and they've kicked us off. I think the comment you made, that people are worried, is a good one; that convergence of privacy, security, and governance has to get tightly bound into organizations, and they need to understand it. That's my two cents. Cassie, what do you think is going to happen?

Cassie: Yeah, it's funny, Chris, you mentioned AI, and I think that is going to drive a lot of the uplift in data governance that we need to see for privacy purposes; it's going to be pushed along by the introduction of more AI governance regimes. We've seen a little bit of that recently from the government, with some principles to guide high-risk AI applications, and some of our clients are coming to us to say, how can we extend our governance out over these AI initiatives?

So I think, and it's not just a hope, I genuinely think that data quality and data provenance are essential to good AI governance, so those things are going to continue to improve across the board. And as we've all said, this is a partial set of reforms, but it does not in any way diminish the need to get on with understanding your business and understanding what information you're handling and why.

And, frankly, just doing it in a way that is going to engender the trust of your customers and the people that you deal with. Trust is becoming a very tangible currency in the corporate world these days. If you'd asked me five or ten years ago whether board-level conversations would be about customer trust, I would have been surprised, but they are happening now. So this is why privacy has to get up to that level of visibility. And I think it's all good news.

Anthony: That's fantastic, and we love the trust thing; it's so integrated with everything we talk about here on FILED. I want to thank you both for joining us today. I'm sure we could continue this conversation for some time, but I really appreciate you jumping straight on so quickly with this analysis.

It was really fine-tuned, and getting those two different views from two very connected but different worlds was amazing.

Chris: Oh, I'd say Cassie's predictions are more likely to be right than mine. I'm only new at this stuff, I wouldn't worry; I just look at things from different angles.

Anthony: Thanks for joining us today, Chris and Cassie. It was fantastic having you on the podcast. It is an evolving issue: there's going to be more analysis coming out around the tort, and we're going to see a bunch of other conversations happening across a whole different set of jurisdictions. I know we've got you on speed dial.

Tranche two is going to be around the corner after the election here in Australia, and we'll definitely get you back on to continue the conversation between now and then, as we see the law in operation. Thank you very much for giving us the time; it was some great analysis. And thank you all for listening. I'm Anthony Woodward.

Kris: And I'm Kris Brown. And we'll see you next time on FILED.
