GA4 in the EU with Romar Van Der Leij and Emma Gordon - EP016

This episode was recorded live at the Digital Analytics Summit on the 13th of October 2022 in Amsterdam. In this first session I was joined by Emma Gordon, who is a Digital Data & Analytics Lead, and Romar van der Leij, who is Legal Counsel at the DDMA. We discussed whether Google Analytics 4 is (or is not) legal in the EU, and what the status is right now. We also look at how we got here, and what the future might hold for us from a legislative perspective.

Make sure you follow Emma on LinkedIn and Romar on LinkedIn and Twitter.

Some of the resources mentioned in this podcast:

Make sure you follow the show:

If you want to help us out, please share the link to this episode page with anyone you think might be interested in learning about Digital Marketing in a Post-GDPR world.

Talk to you next week!

-Rick Dronkers

https://lifeaftergdpr.eu/episode-016/

Transcription Disclaimer PLEASE NOTE LEGAL CONDITIONS: Data to Value B.V. owns the copyright in and to all content in and transcripts of the Life After GDPR Podcast, with all rights reserved, as well as the right of publicity.

WHAT YOU’RE WELCOME TO DO: You are welcome to share the below transcript (up to 500 words but not more) in media articles, on your personal website, in a non-commercial article or blog post (e.g., Medium), and/or on a personal social media account for non-commercial purposes, provided that you include attribution to “Life After GDPR” and link back to the https://lifeafterGDPR.eu URL. For the sake of clarity, media outlets with advertising models are permitted to use excerpts from the transcript per the above.

WHAT IS NOT ALLOWED: No one is authorized to copy any portion of the podcast content or use the Life After GDPR Podcast name, image or likeness for any commercial purpose or use, including without limitation inclusion in any books, e-books, book summaries or synopses, or on a commercial website or social media site (e.g., Facebook, Twitter, Instagram, etc.) that offers or promotes your or another’s products or services without explicit written consent to do so.

Transcripts are based on our best efforts but will contain typos and errors. Enjoy.

[MUSIC SOUND EFFECT BEGINS AND FADES]

[00:00:00] Thank you for tuning in to the Life after GDPR podcast, where we discuss digital marketing in a post GDPR world. Today's podcast is a special episode, which was recorded at the Digital Analytics Summit on the 13th of October, 2022. It was a great event. We recorded a couple of podcasts with some speakers before the event for promotional purposes and we recorded two

[00:00:33] discussion sessions as an audio-only podcast format at the summit. We also opened it up for questions, which didn't really happen a lot. So that's a learning point for next time, but it was a lot of fun to do. This podcast is audio only, except for this intro, perhaps.

[00:00:53] You will hear that it was recorded live. There's some background noise. We tried to clean it up a bit, but this is as good as it gets. So hopefully that's good enough. This first round table session was with Emma Gordon and Romar Van Der Leij and we discussed Google Analytics 4 in the EU.

[00:01:15] Yes, no, maybe. That was the premise of the discussion. I think we went into some pretty interesting topics and we explored some avenues on what is personal data, why you can or cannot use it, and how we have to change as an industry to support that. One disclaimer,

[00:01:39] although Romar technically is a legal specialist, nothing that is said on this podcast is legal advice. Take that for what it is, and enjoy the podcast.

[00:01:53] Rick Dronkers: Welcome everybody to a podcast / presentation. Yeah. So this is a new unique format. We'll see how it pans out. If you don't catch everything afterwards, you can listen to the podcast. My name is Rick. I'm the host of Life After GDPR. We released a couple episodes before this event with some of the speakers, and today we're doing a couple of podcasts with speakers from the event about privacy, GDPR, all the fun topics. I'm joined in this session by Emma Gordon and by Romar Van Der Leij. You've both already had sessions, right? Yeah, yeah. How did that go, Emma?

[00:02:30] Emma Gordon: It was my first time speaking post-coronavirus. And I was pretty happy with it. I had to rush the end, so I got the timing a bit off, but I can work on that.

[00:02:41] Rick Dronkers: We can dive into more details and questions in the podcast.

[00:02:45] Romar van der Leij: Yeah. For me it went well as well. It was my first time presenting in English, so that was a first. But I had fun.

[00:02:51] Rick Dronkers: It went well. I was there. Yeah. Thank you. Maybe can you both give a short introduction of who you are and what you do, and then we can dive into the details?

[00:03:00] Emma Gordon: Sure thing. I'm Emma Gordon. I'm a Digital Data & Analytics Lead, working in Product Analytics for a software company, Hotjar. I'm Irish. I've been living in Amsterdam for a couple of years now. I also tend to mumble, but I'm hoping that this big microphone fixes that problem. [Laughs]

[00:03:19] Earlier today I gave a talk about getting attribution for digital analytics teams and reporting on ROI, and how that becomes increasingly important in order to get visibility on the value that data teams are driving for the business.

[00:03:34] Romar van der Leij: I'm Romar Van Der Leij and I'm a legal counsel at DDMA, the organization that is behind the event we're at today. We have a pretty special position as legal counsel because we sit pretty much in between the practical side, so the marketing side and the analytical side, and the laws and regulations side, which gives us a special position because

we have technical knowledge but also a lot of knowledge of the law, which makes us, yeah, a bit more practical to work with as legal counsel, maybe a bit less strict. And we do that in all kinds of ways, because DDMA is a branch association. We organize events, legal events as well, but we also write documents for our members and run a legal help desk. A very broad service side.

[00:04:24] Rick Dronkers: And your presentation today was actually a nice prelude to what we're gonna talk about in this podcast. Who witnessed your presentation before?

[00:04:32] Romar van der Leij: No. You're seeing like 100 hands. Yeah. There's no video, so you can lie.

[00:04:38] Rick Dronkers: Yeah, cuz you talked about the impact of the GDPR, right? Or of privacy regulation?

[00:04:44] Romar van der Leij: Yeah. Yeah. I talked about the legal basics you need to know when processing data for analytical purposes. Which is very complicated, because I only had 20 minutes to cover the whole GDPR. And I also wanted to give a legal update about everything that's going on right now with Google Analytics,

[00:05:03] especially the broader view on data transfers to the United States, which is a really hot topic right now. And people are even talking about a ban on Google Analytics. So that's what I tried to discuss.

[00:05:15] Rick Dronkers: Yeah. So I think in this podcast we can dive a bit deeper on specifically the Google Analytics part. Google Analytics 4 in the EU? Yes, no, maybe. That's the title

[00:05:26] we ended up with. And I think it's very relevant for a lot of people right now in digital analytics. A lot of people have been using Google Analytics, mainly because it was a free tool, so the adoption was very high and people got used to it, and now we have to deal with these potential issues. And it's not very clear, like you already highlighted in your presentation. It would be nice if we got a clear yes or no. Yeah. It seems to be a maybe, right? From a regulation point of view.

[00:05:53] Romar van der Leij: Maybe it isn't even a maybe, it's just: we don't know. Because it's a very complicated and much broader problem than just Google Analytics.

[00:06:04] It's a problem for all tools you use that transfer data to the United States. And that's why we can't really say if it's a yes or a no. As I mentioned in my presentation, it's a little bit that you have to see what the compliance expectation is for that day. So every day can be a different day.

[00:06:23] One day we might be saying, okay, there's a new Privacy Shield, we can use Google Analytics completely right now. But two days later the Court of Justice could say something else. It's probably going to be a little bit longer before we come into that situation, but yeah, it's a little bit fluctuating, I guess. So it's not a yes, it's not a no, but it's also not a maybe.

[00:06:46] Rick Dronkers: So anybody that hoped to get an answer is outta luck.

[00:06:49] Romar van der Leij: I'm sorry. I'm sorry.

[TIMECODE WILL BE OFF BY 00:00:27]

[00:07:07] Emma Gordon: I think it's a tricky question, and it comes down to the same way that Google Analytics is kind of disseminating: how you approach implementing privacy comes down to the responsibility of the user to ensure that they are implementing in a compliant manner. But then the data is also quite anonymized. But the focus isn't on that at the moment, so it's not really a hot topic yet. Yeah. Because the focus is all on Google Analytics, and Google Analytics has got the brunt of this.

[00:07:40] There was this moment after the announcement of the move to GA4 where Google Analytics was in a little bit of a vulnerable position, which was interesting because Google Analytics is so ubiquitous in analytics. It was the precursor to the digital analytics industry when it made that free product, coming out of Urchin into classic Google Analytics and then Universal Analytics, and it opened things up and was a big key driver in digital analytics. So even the question yes, no, or maybe on Google Analytics, it's kind of incredible that this is coming up as a topic, that this is a vulnerable position for Google Analytics to be in. And there was a moment there where Adobe and other companies were kind of marketing on that moment, where people were stepping back and looking at their stack and thinking, do we need to go forward with GA4?

[00:08:33] And that's a question that hasn't ever been raised, because if you do marketing or if you do digital analytics, you have Google Analytics. So a lot of that was Adobe directly marketing to this vulnerability. But the core laws that are being processed and argued also apply to those tools.

[00:08:54] Yeah. Right now the focus is on Google Analytics, but it's coming up for other tools as well. I don't think anyone is going to escape this spotlight.

[00:09:05] Romar van der Leij: The question is if, in the near future, other tools will also be receiving complaints about the fact that they are transferring data to the US.

[00:09:14] And if DPAs are investigating that and taking decisions on that, then we would know more. But until then, nobody really knows if Hotjar or any other marketing or analytics tool that sends data to the US is compliant or not. Yeah, because DPAs, so data protection authorities, can only

take decisions on specific circumstances. They can only assess whether the processing of personal data in those circumstances is in line with the GDPR or not. So that's what makes it difficult to say yes, no, or maybe, because it's not like the DPA makes one broad, overarching decision. So,

[00:09:57] Rick Dronkers: So we'll end up with, it depends.

[00:09:59] Romar van der Leij: Yeah. Always.

[00:10:01] Rick Dronkers: Yeah. I think we raised a couple of important points. So one is the vulnerability of Google. Also the timing of introducing GA4, which in my opinion was not really ready to come out of beta at that moment, and arguably maybe even today it's not.

[00:10:18] And then getting these, yeah, these privacy issues at the same time. Like you said, it was so ubiquitous that it's almost strange to think, will many people now switch? To Matomo, or whatever kind of other solution. But they really opened it up for that. And I feel like Google is being picked on to set an example, which probably makes sense. Like, if you're gonna do it, might as well go for Google. Right. So that's probably the logic of noyb, None of Your Business, as well. Like, file complaints against Facebook and Google.

[00:10:48] And then if it applies to them, it probably should apply to others. But like you said, it's all on a case-by-case basis. Yeah. Which makes it really hard for us practitioners who are not interested in legal battles. We wanna know if what we are doing is allowed or not, and how we should adapt.

[00:11:06] Romar van der Leij: Maybe better said, you just wanna do stuff. Yeah. And not even be thinking about whether it's legally possible, but just, and we discussed it before on the podcast, whether it's more of an ethical question internally in your organization: do you think we can do this or not? And that's not just a legal question.

[00:11:25] Emma Gordon: I think with server-side as well, and with these other technologies, the responsibility for enacting privacy has decentralized back to the implementers of Google Analytics. Because it's happening server-side, which is not quite a black box, but you can do a lot of enrichment of the data as it's sent server-side, and that puts it outside the client's view.

[00:11:50] It becomes your responsibility to ensure that you are compliant in that. And then this kind of decentralizes the responsibility, because everyone's kind of hoping for structured guidance that they can just adhere to, but it's not there; it's coming down to individual implementations.

[00:12:06] Yeah. And then it becomes a question that you have to figure out yourself. First of all, we have to determine: what is personal data, what is privacy? And I know you had a guest on the podcast, I can't recall who, that made a really good observation of what personal data, or what privacy, is.

[00:12:24] Rick Dronkers: Yeah. Hannes Kuhl from Trakken. Yeah.

[00:12:26] Emma Gordon: Yeah. About it opening and closing doors in the future. But that kind of interpretation is falling back to the broader public and to users, to us, to kind of get a grip on. It's not solidified. It's quite a liquid definition. We have to think about the ethics of that and about what we are doing. If we were to look at the ethics of data, and I'm gonna open this topic a little bit, it's a big topic, right? And there's a lot of conversation on it. If you're writing AI algorithms, you're going to have data ethics.

[00:12:59] You should have a data ethics mindset and you should be looking into topics like Race After Technology by Ruha Benjamin, and that kind of reading, which kind of opens up how we compound biases into AI. That's data ethics very heavily at the forefront of your mind. But there's also the data ethics that we have as marketeers and digital analysts, that we don't think we are in control of, but we are.

[00:13:23] So we compound biases with the segments we create. We compound biases with the way we advertise, the way we market, and then also with how we control use. And then there are the bigger issues of how we use data, how we control it, and also, if we're selling it, sending it to third parties. That's kind of a smaller version of data ethics, but something that we are responsible for: understanding, knowing, acknowledging, and bringing it into our workflows. And now, with server-side, it's become something that we are in control of.

[00:13:56] Romar van der Leij: From a legal perspective, to add to that: at DDMA we often say that things that are an ethical discussion right now will probably become laws and regulations in, yeah, four or five years. We're seeing that right now with the AI regulation. There's a lot going on in Holland, in the Netherlands; we had some problems with AI systems at our tax office. So there's a lot of attention right now. And earlier this year the European Commission did a proposal for an AI regulation for the whole European area, based on the risk for individuals that an AI system can have. And depending on that risk: the higher the risk, the more rules and regulations apply, and the lower the risk, the fewer. But that's always from an ethical perspective. If there's an ethical discussion right now, you can probably say for sure that in four or five years there will be a law or regulation. That's the normal flow.

[00:14:53] Rick Dronkers: Let's zoom in on that a little bit. So for the people in the room right now: who works in a marketing organization where there are more than a hundred people working on digital marketing within your company? More than 50? More than 10? And fewer than that? So the majority, although this is not a really great sample size, right, but the majority is

[00:15:21] like, let's say, between one and 25. How reasonable is it, these things like thinking about data ethics, right? So we're coming from a world where we had an analytics tool that helped us optimize our digital marketing campaigns and spend and return on investment. And for a lot of companies this feels like an added job on top of it. Yeah. Right. So it feels like things are being taken away and we have to do more.

[00:15:50] Romar van der Leij: It’s like a setback.

[00:15:51] Rick Dronkers: Yeah, exactly. So how do you see, like I could argue for a large bank or a large corporation. Okay. They just have to employ him. I'm looking at you.

[00:16:02] They have to employ people for that, right? They have the budget for it. They have to look into it. But for small, medium size businesses, how do you see this playing out?

[00:16:11] Romar van der Leij: Yeah, it's a bit difficult for me to say, because I'm not really involved in shaping teams and shaping organizations in their organizational structures, with the persons they attract and employ. But I think there should at least be emphasis on the fact that it's shifting.

[00:16:29] We see the shift. We see more marketing-qualified people also get the added job of privacy officer, or privacy professional, or privacy champion; I think we all know the terms for that. And I think that's an important shift. But we also need to take a look at whether it's okay if we just gave that task to one person, because maybe it's not effective enough to just give someone another task.

[00:16:57] But maybe you should ask: should I just add another person to the team, qualified with enough expertise on this subject? And then also keep the ethical issues in mind. Yeah.

[00:17:07] Emma Gordon: Can you load up that question for me again? Yeah. Cause I was thinking about it and then drifted off. [Laughs]

[00:17:13] Rick Dronkers: Yeah. So the issue I could see for a lot of small and medium-sized businesses: they have been using analytics to measure marketing return on investment, and now they basically have to add an FTE.

[00:17:27] Let's call it what it is. They have to add an FTE that is fully focused on privacy, just to be able to use that data. I think that's gonna be the end result of all this. Every company that wants to use data will have to have an FTE dedicated to that, because otherwise you're exposed to too much risk.

[00:17:45] Emma Gordon: I don't know if that's a bad thing in any way. I think that taking this on as a responsibility that we have to be aware of and take care of in how we process our workflows is important. With it, though, there's this enormous gray area emerging right now, and it can be quite confusing.

[00:18:12] It's shifting. The goalposts are moving; they will have to have that FTE in order to keep on top of it and to ensure that compliance. I don't know if we can say that it's necessarily a drag or a pull on the budget, or on the way we have to shape teams if we have to open this new role.

[00:18:30] It's just part of the dissemination of data teams from being, like in 2012, one or two web analysts, to now being these tribes, according to the Spotify model [laughs], which are data engineering, growth analytics, product analytics, and data privacy and compliance. It's just part of the whole of producing a data product.

[00:18:56] Rick Dronkers: I agree. I think this is the hard sell. So I consult small and medium-sized businesses, and this is basically the hard sell. People feel like they're losing out, right? Because what they had, they can still have, but then they're basically exposed to risk, or they have to minimize, right? So they have to eliminate all personal data and accept that they can do less with it, or they have to get this FTE in.

[00:19:22] Emma Gordon: But this is, I think, where we've been a little bit spoiled. Yeah, definitely. Because Google Analytics' quite well-rounded reporting interface was free. Yeah. And with that it kind of got hard to sell the value of data, because we were used to having it at such a low budget.

[00:19:42] Rick Dronkers: It kind of distorted the expectations that you should have actually.

[00:19:46] Emma Gordon: Yeah. And now, to get the value, and that's what I was talking about earlier, processing ROI and pushing it to the VP and director level proactively, it's because the budget requirement for getting that value has now increased. Yeah. That doesn't mean that you're deriving less value from it. It doesn't mean that it's not still driving value and revenue for your business. You just have to cut a little bit out of the pie.

[00:20:12] Rick Dronkers: Yeah. And that could result in deciding that in this new reality the ROI is negative and maybe you shouldn't do it. Right. That's also a possible outcome.

[00:20:22] Emma Gordon: It's a possible outcome. Yeah. That's a possible outcome. Or you have a data set where you have tried to use the settings in GA4 to make it as compliant as possible, obfuscate the IP addresses or remove them completely, and then you just have a more generalized data set for more generalized trends. And if that's acceptable to drive your business, then, yeah, that's what you do.
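
[Editor's note: a minimal sketch of the kind of stripping and generalizing Emma describes here, as it might look in a hypothetical server-side collection proxy. The field names and the /24 truncation are illustrative assumptions, not GA4 settings, and hashing an ID is pseudonymization rather than anonymization, so the GDPR can still apply.]

```python
# Illustrative only: remove or coarsen identifiers before an event is forwarded.
import hashlib
import ipaddress


def generalize_event(event: dict, salt: str) -> dict:
    """Return a copy of the event with direct identifiers removed or coarsened."""
    out = dict(event)

    # Drop the full IP address; keep only a /24 prefix if coarse geography is needed.
    ip = out.pop("ip", None)
    if ip is not None:
        out["ip_prefix"] = str(ipaddress.ip_network(f"{ip}/24", strict=False).network_address)

    # Replace the persistent client ID with a salted hash so events can still be
    # grouped per visitor without forwarding the raw identifier.
    client_id = out.pop("client_id", None)
    if client_id is not None:
        out["visitor_hash"] = hashlib.sha256((salt + client_id).encode()).hexdigest()[:16]

    # Keep only coarse geography, never postal codes or house numbers.
    out["region"] = out.pop("geo", {}).get("region")
    return out


if __name__ == "__main__":
    raw = {"event": "page_view", "ip": "203.0.113.42",
           "client_id": "GA1.2.12345.67890",
           "geo": {"region": "East London", "postal_code": "E1 6AN"}}
    print(generalize_event(raw, salt="rotate-me-regularly"))
```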

[00:20:48] Rick Dronkers: I wanna also open it up for questions. We have a microphone on a chair here. If anybody wants to jump in with a question about GA4. Meanwhile we can just keep it going, but if you, if you wanna ask a question, feel free.

[00:21:02] Romar van der Leij: Free shot to be in the podcast. That's awesome. Yeah.

[00:21:05] Rick Dronkers: No company promotion. So for you, from the DDMA's point of view, you try to help organizations navigate through this, right? Mm-hmm. I know it's a hot topic. The website is full of articles about this. What do you feel is the main challenge for companies that are now using Google Analytics? And what are the questions that you get?

[00:21:28] Romar van der Leij: The main challenge is understanding what's happening and getting a grip on what the problem actually is. And then you have to place it in context: how important is it that we do something with this? I see you nodding, but I hope you can understand what I mean.

[00:21:48] So there are a lot of companies that know, okay, there's something going on with Google Analytics, there's something going on with data transfers. But what is going on, in detail, and what do we need to do with it? Do we need to take action? Do we need to stop using Google Analytics? And what are the most important steps we need to take right now?

[00:22:06] The problem is, as you said before, you would like a guideline: here are six steps you need to take and then you're fine. It's not that easy. At this moment it's just a balance of risks inside your own organization. So, knowing that at this moment in time transferring data to the United States isn't compliant according to Schrems II and everything that happened in that case,

[00:22:29] Rick Dronkers: Personal data, right?

[00:22:30] Romar van der Leij: Personal data, yeah. Knowing that there is a risk that a DPA receives a complaint about you using any tool, not just Google Analytics, any tool transferring that personal data: is it a big risk for you? Is it a big risk for your image if it comes out that you transfer the data?

[00:22:47] And that's what I meant with compliance of the day. You just have to take a look: okay, what's the status right now? What's our balance of risk? And if we think, okay, it's going to cost us a lot of money if we stop using Google Analytics, yeah, maybe you should keep using it, knowing that it's not 100% compliant.

[00:23:05] But the other way around, if you see that there are no big risks, then maybe you should stop using Google Analytics and look for an alternative. I also talked about it in my presentation: there are a lot of companies using Universal Analytics, and probably now Google Analytics 4,

[00:23:21] that maybe use 10 to 15% of the features that are in there, because it's free, it's easy to use, it's simple. Maybe Google Analytics 4 a little bit less. But those companies could probably use a tool that uses and processes less data, so that would be a better fit, and then maybe also one just inside the EU. So it really differs per organization.

[00:23:46] Rick Dronkers: I feel like for a lot of organizations it's going from this place of not knowing and being scared, to assessing value and risk. Yeah. You have these two axes, right? One is how big is the risk, and I think under the GDPR that could be up to 4% of revenue. Yeah.

[00:24:06] Romar van der Leij: Yeah. Depending on the breach of the GDPR, the violation, it can be 4% or 2%. Right.

[00:24:10] Rick Dronkers: So you could create a scale based upon that, take some sort of monetary value. Yeah. And then on the value side, you have to answer the question that all analysts wanna answer for their boss: how valuable is analytics to us? Which is always a really hard question to really answer, but that's the other one you need to figure out. Yeah.
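
[Editor's note: a toy illustration of the two-axis weighing discussed here, risk versus value. The GDPR's fine ceilings (up to EUR 20 million or 4% of worldwide annual turnover, whichever is higher, for the higher tier; EUR 10 million or 2% for the lower tier) are real, but the turnover figure, the enforcement probability and the idea of simply multiplying them are made-up simplifications, not a legal method.]

```python
# Illustrative only: a crude "risk vs. value" comparison, not legal advice.

def gdpr_fine_ceiling(annual_turnover_eur: float, higher_tier: bool = True) -> float:
    """Maximum possible GDPR fine: 20M EUR / 4% of turnover (higher tier),
    10M EUR / 2% (lower tier), whichever amount is higher."""
    if higher_tier:
        return max(20_000_000, 0.04 * annual_turnover_eur)
    return max(10_000_000, 0.02 * annual_turnover_eur)


def expected_exposure(annual_turnover_eur: float, enforcement_probability: float) -> float:
    """Very rough expected exposure: ceiling times an estimated probability of enforcement."""
    return gdpr_fine_ceiling(annual_turnover_eur) * enforcement_probability


if __name__ == "__main__":
    turnover = 50_000_000        # EUR, hypothetical company
    analytics_value = 400_000    # EUR/year the team believes analytics contributes
    exposure = expected_exposure(turnover, enforcement_probability=0.01)
    print(f"fine ceiling:      {gdpr_fine_ceiling(turnover):>12,.0f} EUR")
    print(f"expected exposure: {exposure:>12,.0f} EUR")
    print(f"estimated value:   {analytics_value:>12,.0f} EUR")
```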

[00:24:30] Emma Gordon: I think when a lot of companies are making a risk assessment like that, there are a lot of fuzzy areas. So it's a case of, oh, we think we covered everything, and then, oh, you missed this part and now you're at risk for this huge fine. I guess in that fuzziness and that liquidity you kind of have a bit of a question, and that is: what is the legal definition of privacy and what is the legal definition of personal data, at this time? Yeah.

[00:24:58] Romar van der Leij: Well, you're not cutting into a fairly. [Laughs]

[00:25:02] Emma Gordon: So, digressing a bit, I'm super interested in this, cuz this is at the core of everything, right? Because that's where the fuzziness comes in. That's where it's like, oh, did we cover everything? Are we done?

[00:25:20] Romar van der Leij: Well, to start with, and not to get too deep into the legal technical issues: the EU fundamental right to privacy is laid down in the Charter of the EU. It is a right to privacy and protection of your personal data and things like that. But that right can also be limited, and that's what we're doing every day, processing data. You can use some personal data for some purposes because you can limit the fundamental right of privacy.

[00:25:39] The question of what is privacy and what is personal data, you always need to place in a context. I think yesterday I spoke to someone about the expectation you have when talking about privacy. So, right now, we just saw somebody shoot some photos of us, because we're in this setting.

[00:25:56] It's fine. In this context, our privacy expectation is that people can take pictures of us, people can listen to us or record our voice. But if I would do the same thing tomorrow at your house, it would be a different expectation, because your expectation is not that somebody records your voice or takes pictures of you. In the context of personal data, the GDPR actually, in my opinion, has a somewhat stricter interpretation of personal data than is being used right now by data protection authorities and maybe even the European Court of Justice, because

[00:26:29] Emma Gordon: Interesting.

[00:26:29] Romar van der Leij: personal data always depends on the context. If you made a photo of someone and the police uses it, it's probably personal data, because they use it to identify you. But if I use it for a different purpose, I can't think of one right now, but a different purpose that is not to identify someone, it's probably not personal data, it's just a photo.

[00:26:51] But all the DPAs and the European Court of Justice right now are saying that personal data in the GDPR needs to be interpreted in a broad way: any in-theory-possible way to single out an individual makes something personal data. So that means that in the Schrems case with Google Analytics and everything that's going on, the DPAs say, yeah, because Google has a lot of data and because the website has a lot of data, combining that data together makes it possible, in some magical way, to in the end single out an individual, which makes it personal data. And the question is: how far do we go?

[00:27:30] Rick Dronkers: It's not even a magical way, right? In their case, the argument was: if you are logged into the Chrome browser, so yeah, let's assume you use Gmail, YouTube, Google Maps, right, any of Google's many properties, which is highly likely, and you are logged in while on the website. Yeah. At that moment you have a Google Analytics client ID. To us, it's just a random number and does not really identify you. Yeah. But you could argue that Google could lay those two data sets over each other and figure out who you are on that website, cause they know your Gmail. And that's the high-level argument, which probably makes sense, that Google could do that.

[00:28:07] Romar van der Leij: Yeah, sure, it does make sense, because you can connect all that data, yeah, all the data, and probably identify someone. But there's still the question of how far we go and which context we choose. And right now it's just said that if it's possible to identify someone in whichever way we can imagine, then it's personal data. We got a question. Let's see if this works.

[00:28:30] Guest - Iris: Hi, my name is Iris. What if I collect personal data in order to build an AI for personalization? Sorry, maybe not that; what if I use the data to create an AI in order to calculate the next move of my customer? Yeah. Do we consider that as PII?

[00:28:53] Romar van der Leij: Yeah, any processing of data, for whatever purpose: if, like I just said, you can single out a person, so it's identifiable, that makes it personal data and the GDPR applies.

[00:29:07] Guest - Iris: But what if we deploy that, but we don't identify the customer? When our customers, say specifically from London, move around, we don't identify that customer, we just serve that customer specific content. Is that again PII, processing personal data?

[00:29:29] Romar van der Leij: I don't think I totally understand your question, but.

[00:29:32] Guest - Iris: We have customers, yeah,

[00:29:35] specifically in London. We process the data and we know that customers from East London usually go to A, and customers from West London usually go to B, there on the site. Yeah. When we identify the customers from East London, we can, instead of providing two journeys, provide only the journey that they like. Yeah. Is that again PII?

[00:30:01] Romar van der Leij: Okay, now I understand it. Thanks for rephrasing. There is a difference, I've talked about this before, between PII and personal data, because PII in most cases is personal data, but personal data isn't always PII. Yeah. Am I saying it right?

[00:30:18] Yeah, I guess so. Which means that some information that is classified as PII, so people from West London, doesn't necessarily need to be personal data, and then the GDPR doesn't necessarily apply. But if you have other metrics, so not just West London, but also maybe postal code, house number, stuff like that.

[00:30:36] So it depends on the granularity: the deeper you get, the more probable it is that it's personal data. And the problem with Google Analytics is that they say, okay, we've got people from West London and we're processing that for this purpose right now. But in the back of Google's massive database they've also got the postal codes, and they've also got email addresses, and this and this and this. And the possibility that they can combine all those things together with a Google account makes it possible to identify a person. So I hope that answers your question a bit.
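
[Editor's note: an illustrative sketch of the linkage argument made above. On its own the analytics client ID is "just a random number", but joined against another table that holds the same ID plus an email address, it singles out a person. All records and field names below are invented.]

```python
# Illustrative only: a pseudonymous ID becomes identifying once it can be joined
# to a richer dataset held elsewhere.

analytics_hits = [
    {"client_id": "GA1.2.12345.67890", "page": "/pricing", "region": "West London"},
    {"client_id": "GA1.2.99999.11111", "page": "/blog", "region": "East London"},
]

# A separate dataset someone else might hold (accounts, logins, CRM, ...).
account_table = {
    "GA1.2.12345.67890": {"email": "jane@example.com", "postal_code": "E1 6AN"},
}


def link(hits: list[dict], accounts: dict[str, dict]):
    """Join hits to accounts on the shared identifier; each match is a singled-out person."""
    for hit in hits:
        account = accounts.get(hit["client_id"])
        if account:
            yield {**hit, **account}


if __name__ == "__main__":
    for person in link(analytics_hits, account_table):
        print(person)  # the "random number" now resolves to an email address
```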

[00:31:09] Guest - Iris: Yeah, that was helpful. Thanks.

[00:31:10] Emma Gordon: So it's the compounding of these less sensitive PII pieces of information together that is forming a more identifiable profile. Yeah. So it's not the one individual piece.

[00:31:23] Rick Dronkers: And it's like the privacy-by-design principle: it's about trying to make sure that something that could happen cannot happen, by not having the building blocks for it, basically. So even in Google's case, in Google Analytics we check the boxes: don't use the data for advertising, don't use it.

[00:31:45] But the thing is, Google could, in theory. Like, if they were nefarious, they could, and that's what it's about. Not necessarily that it's actually technically already happening. It might; we don't know what happens at Google's end. It's about the fact that they could.

[00:31:59] Romar van der Leij: Yeah. And the problem, well, it's not really the problem, but Google always comes out with statements that we don't do that. Yeah. We don't receive requests from a national security agency, so everything is fine, leave us alone. But the EDPB, which is like the umbrella board of all DPAs in Europe, also laid out in a guideline that statements by companies that they don't do stuff, because they have organizational measures,

[00:32:28] as a piece of information on their own are not enough to say it's not happening. Yeah. Which I can understand. I think, with the question just asked, differentiating East London and West London users and basically doing website personalization based on whether you're from east or west: I think if you have enough visitors and they're all from London, then the group becomes large enough that you can't identify single users.

[00:32:52] Rick Dronkers: So you could have a user ID East London and a user ID West London and aggregate all users under that. Yeah. But then if you only have two visitors, then it already becomes an issue, right? Like yeah.

[00:33:01] Romar van der Leij: Right. So if it's more aggregated, it's less sensitive. But the main issue is that to get that information, you have data; you're already processing that data to know where these people are living, yeah, stuff like that. And then you put it in an aggregated bucket and then it's just East and West London. Yeah. But the processing used to get there is already subject to the GDPR.

[00:33:23] Rick Dronkers: Yeah. But let's, for example, say that you didn't do it by IP lookup. You asked them: on this website they can select, I live in East London or West London. Right. They can do that in a form in that case.

[00:33:33] Romar van der Leij: Without an IP address.

[00:33:35] Rick Dronkers: Yeah. Without the IP address. Right. Yeah. So we're taking all that out. We just know whether they're from East or West London. Yeah. Would you still argue that it's personal data at that moment?

[00:33:45] Romar van der Leij: For me it's not, because in the context you're collecting and processing it, it's probably just used to divide those two groups. Yeah. And you can't single out an individual. Yeah. But I can imagine that the DPA, or maybe the European Court of Justice, would say that if Google does this and combines all this data with all the rest of the data, and someone logging into his account, then it's maybe personal data.

[00:34:12] Rick Dronkers: Yeah. And I think that's where the struggle lies: all these technical what-ifs versus the reality of what is currently happening.

[00:34:21] Emma Gordon: Exactly. And it's like that second layer on top. So you do the base layer, but then you have the compounding identifiers which together are forming a profile.

[00:34:29] And then, okay, so now it's East and West London. And I think you had this before on one of your podcasts also. But what if it's just a village of one person and another village of one person? It becomes about the number of observations per data point as well. Yeah. Because then, even though you have a very wide geographic area, it's now focused in on one point, because there aren't enough other observations to obfuscate it.
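
[Editor's note: the "number of observations per data point" idea Emma raises is often handled with a minimum group size, sometimes called a k-anonymity threshold. The sketch below suppresses any region with fewer than k visitors; the data and the k=10 cut-off are hypothetical.]

```python
# Illustrative only: report a region's journey split only if enough visitors fall in it.
from collections import Counter

K_MIN = 10  # minimum group size before a region may be reported


def reportable_regions(visits: list[tuple[str, str]]) -> dict[str, Counter]:
    """visits is a list of (region, journey) pairs; regions below K_MIN are suppressed."""
    per_region: dict[str, Counter] = {}
    for region, journey in visits:
        per_region.setdefault(region, Counter())[journey] += 1
    return {r: c for r, c in per_region.items() if sum(c.values()) >= K_MIN}


if __name__ == "__main__":
    visits = ([("East London", "A")] * 40 + [("West London", "B")] * 25
              + [("Tiny Village", "A")])          # a single visitor: would be identifiable
    print(reportable_regions(visits))             # "Tiny Village" is suppressed
```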

[00:34:49] Rick Dronkers: Yeah. I think we are out of time. I wanna thank you both for joining us on the podcast. Yes. Thank you very much. Have fun at the event.

[00:34:57] Romar van der Leij: Yeah. We'll speak to each other probably some more.

[00:35:00] Rick Dronkers: Thank you. Thanks very much. Thank you all.

[00:35:02] Emma Gordon: Bye now.

[00:35:04] [MUSIC SOUND EFFECT BEGINS AND FADES]