DataTopics Unplugged

#43 The Privacy Paradox - The Tug of War Between Data Privacy and Safety

DataTopics

Welcome to the cozy corner of the tech world where ones and zeros mingle with casual chit-chat. DataTopics Unplugged is your go-to spot for relaxed discussions around tech, news, data, and society.

Dive into conversations that should flow as smoothly as your morning coffee (but don't), where industry insights meet laid-back banter. Whether you're a data aficionado or just someone curious about the digital age, pull up a chair, relax, and let's explore the complex intersections of data, unplugged style!

In episode #43, titled "The Privacy Paradox - The Tug of War Between Data Privacy and Safety," our special guest Sofie Van Den Eynden guides us through a labyrinth of data privacy issues that challenge our notions of security, consent, and progress.

  • The DNA Privacy Conundrum: Sofie Van Den Eynden unravels the tension between the incredible potential of DNA for solving crimes (inspired by Sofie Claerhout's research and historic cases like the Golden State Killer) and the ensuing privacy concerns, exemplified by the 23andMe data breach and the broad implications of such incidents.
  • Privacy on the Road: A deep dive into the ethical implications of employing smart cameras for road safety and the debate on privacy trade-offs, with a glance at policies in Germany and the Netherlands.
  • The Cost of Cyber Safety: How much privacy are we willing to sacrifice for safety? Debating the cost of privacy in the fight against cybercrime. Should transaction data be more accessible to prevent fraud?
  • GenAI and Copyright: We cap off the episode with a discussion on copyright issues in the age of AI, ignited by Kid Rock's interview with Theo Von.

 

Speaker 1:

Let's do it. Thanks, Alex.

Speaker 2:

You have taste. In a way that's meaningful to software people. Hello, I'm Bill Gates. I would recommend TypeScript. Yeah, it writes a lot of code for me, and usually it's slightly wrong. I'm reminded, incidentally, of Rust here. Rust. Rust, Congressman. iPhone is made by a different company. And so, you know, you will not learn Rust while skydiving. Well, I'm sorry, guys, I don't know what's going on.

Speaker 1:

Thank you for the opportunity to speak to you today about large neural networks.

Speaker 3:

It's really an honor to be here. Rust. Data Topics. Welcome to the data...

Speaker 2:

Welcome to the Data Topics podcast.

Speaker 1:

Hello and welcome to Data Topics Unplugged, your casual corner of the web where we discuss what's new in data every week, from GenAI-generated songs to data and privacy, everything goes. We're also live on YouTube, LinkedIn, Twitch, X, you name it, we're there. Feel free to leave your questions and to join us there as well; we'll try to address the questions we get here. Today is the 29th of March of 2024. My name is Murillo, I'm your host for today. I'm joined by my sidekick. Actually, I think it's a bit weird to call you sidekick, but the one that is always there.

Speaker 2:

I'm the Robin to your Batman.

Speaker 1:

It's a bit strange to say that, but sure, I'll take it, Bart. Hi. And we also have a special guest today. We have Sofie. I don't know if I should attempt your last name. Sofie Van Den Eynden?

Speaker 3:

Sure.

Speaker 1:

Okay, sure, that was very polite. Data analyst in the data strategy unit here at Data Roots. Expert snowboarder, as we were covering before the official intro. Also born on Friday the 13th of February, and a self-proclaimed good-luck symbol despite her birthday. But anyways, how are you doing, Sofie?

Speaker 3:

I'm very good. I'm very good. I had a lovely day. I'm excited for this podcast. I've been a huge fan of yours.

Speaker 1:

It's recorded. You're going to hear it on the intro next week. Cool. Really happy to have you here. My pleasure. Yes, on this nice day. So, what do we have here? I saw you have some topics about data privacy.

Speaker 3:

Yeah, one of the things I really like is looking at data and what we can do with it, how many possibilities there are. But in a lot of cases it clashes with how far we can go on the personal side. How do we navigate through regulations, through privacy concerns, through all of that? I feel there are endless possibilities with data and AI and ML, and it's such a nice research domain, but we're often held back by regulations, by things that have nothing to do with science but are more there to protect us as humans. And lately I saw something very interesting about DNA research. So, in a lot of cold cases... Is it this one?

Speaker 3:

Yeah, yeah. So actually, no, let me introduce Dr. Sofie Claerhout. She's a very interesting person, because she's been doing research into DNA and Y-chromosome investigations, and she found a way to trace the Y chromosome back over the generations, to the first male ancestor. So she can go down all the generations from the first male to you and Bart. Not me, because I don't have that chromosome; it's only male-based. And she found a way so that if, at a crime scene, the person that committed the crime left some DNA, she can, through the Y chromosome in the DNA left at the scene, very easily find the murderer without having the actual...

Speaker 1:

DNA? Without having the actual DNA of the murderer?

Speaker 3:

She has the DNA of the murderer, but she didn't get it from him, because we don't know who he is.

Speaker 1:

Yeah, okay. So then I guess... Well, through the Y chromosome, right?

Speaker 3:

But she can't do it. Legislation keeps her from doing the actual things she could do with the Y chromosome, because of privacy regulations. I see.

Speaker 2:

Yeah, difficult to honor.

Speaker 3:

Yeah, and we have a cold case of 30 years, about a woman who was violently murdered in a small village at the seaside. And now, a month ago, new legislation has been approved so that she can actually do this.

Speaker 1:

This is all in Belgium.

Speaker 3:

This is in Belgium, so it's Belgian legislation that was old, and it kept her from doing her work and actually solving it. She's very confident that with this new law she can actually get the person that did this murder 30 years ago.

Speaker 2:

Okay, yeah, interesting. But it's not there yet, like, now they're starting? Yeah, they're starting, but...

Speaker 3:

It's a gray zone, like with everything with legislation. But what she can do is ask all the people that live in that village on the coast of Belgium to give their DNA on a voluntary basis, and when she finds a match with the Y chromosome of one of these people, she can actually find that family. But it can be a third, fifth, sixth nephew; it can go very, very far. And then, okay, the police can do their investigation, but for the moment they are at a dead end. And I guess the only way, from what I understand, to be able to use that data was if you were officially a suspect? Yeah, that was the only way.

Speaker 3:

Yeah, like until now for now yeah, you think this is going to change in the near future yeah, I mean now she's allowed to use family's data, she doesn't need to put them as an official suspect oh, ok, it's a new legislation.

Speaker 1:

Yeah, it's a tricky one, right? It's the whole data privacy question versus what you could do if there weren't such strict privacy laws.

Speaker 3:

I think the tricky thing here is that some people will maybe work with her in this experiment, and then, by accident, their sixth nephew is actually the murderer. And because they helped this investigation, okay, they put away the murderer, but it's also their family. And did they give consent for that? Today it's not super clear how consent is given when it comes to actually putting people behind bars.

Speaker 1:

Yeah, it's a bit of a... Because there's also the argument of: if you have nothing to hide, then why do you need so much privacy, right? And I think in the US the privacy laws are way less strict.

Speaker 3:

It's a very.

Speaker 2:

It's a big, big difference, yeah. There's actually this case, right. But I wonder, when we map this to privacy today, where is the cutoff? What do you mean by the cutoff? Like, I don't know very much about this domain. If we move this to messaging platforms: WhatsApp used to not be end-to-end encrypted; it is now, and there's still some debate whether or not they can actually read it if, let's say, law enforcement requires it. Signal is fully end-to-end encrypted; no one can read it. And I think you can make the argument that privacy is super important and no one should be able to read your personal messages. You can also make the argument that for legal reasons, for criminal investigations, for example, there should be an opening to read these messages. And I think that is a bit the threshold: where do you still want an opening for legal purposes,

Speaker 2:

to investigate something versus blocking everything.

Speaker 1:

I think the other thing too is the proactive versus reactive, right? Like, if you know that there's communication about, I don't know, something very dramatic, like a terrorist attack, and if you could just monitor it, then maybe you can proactively mitigate that. But that's even one step further. Exactly, exactly. What's the threshold then?

Speaker 1:

Yeah, exactly, it's a slippery slope. I'm not for that, just to be very clear for everyone, but it is an argument that you do hear. Thanks, Bart.

Speaker 1:

This is talking sideways into the mic. I was trying to be cool, you know, when you hold the thing sideways like the rappers do, but it's a podcast. Anyways, so indeed, I'm not for that, but I think that's an argument that you hear, right? You can be proactive, you have nothing to hide, why are you so afraid, etc., etc.

Speaker 1:

And I think there's also the other dimension to this, because we're talking about medical data, like DNA. Yeah. Because arguably, right, if you monitor my messages, maybe you can blackmail me, or maybe there's something you can cancel me for, or whatever. This reminds me a bit: there was... 23andMe, I think, is the name of the thing. 23andMe, yeah.

Speaker 2:

Yeah, yeah, a big data leak, like, I want to say, three months ago. Let me see if I can find it. 23andMe, for the people that don't know, is a service where you can send a DNA sample.

Speaker 3:

Yeah, commercial data yeah.

Speaker 2:

And they do an analysis of your DNA, and you get back a bit of analysis of what your ancestry is, exactly. And they can also link that to other people, if you opt in for that. It also allows you to find relatives on 23andMe.

Speaker 1:

Yeah, and then what happened? This was in December of 2023. December 4th, that's my mom's birthday, by the way, shout out. "23andMe confirms hackers stole ancestry data from 6.9 million users." And, well, it's not good, right? But then, being a bit of a devil's advocate here: what's the worst thing they can do with it? There was also ancestry data on there. Yeah, I think that is maybe, like, I don't know, if you have a son or something.

Speaker 2:

You know, that can be a bit sensitive. Like, maybe you're on there, and then these hackers find out that you actually have a son somewhere that your current girlfriend doesn't know about. Yeah, I think that's one, right.

Speaker 1:

But, for example, if you just have access to your DNA, right, your DNA data, without the connections, what's the worst that they can do? Again, being a devil's advocate.

Speaker 2:

Right. We're not doing much with DNA yet, but from the moment it starts being used for insurance purposes, these types of things, it becomes super sensitive.

Speaker 1:

That's true, and that's also the conclusion I came to, a bit. Maybe you have, I don't know, a predisposition to Alzheimer's or something, and maybe that's something you can map on your DNA. And having that information is very powerful, right? You can have a lot of leverage over someone. Like you said, insurance companies: if they know you have a predisposition for this, maybe they'll be less likely to insure you, or you'll have to pay much more, or I don't know.

Speaker 2:

I mean, let's say, 10 years from now, you want to become the prime minister of Belgium. Murillo for prime minister. And someone knows that you have a predisposition for Alzheimer's. That's very much blackmail material. Yeah, destroy your career. No one will vote for you.

Speaker 1:

You think? I think I could convince them. No? Even with the Alzheimer's, I think I can be charming. I'll charm my way into people's hearts. Yeah, okay, maybe not. But yeah, you are very charming. You heard it here.

Speaker 3:

Actually, this should also be in the intro, but these commercial databanks, that's what they use in the USA to actually investigate murders. So, what they did... Okay, yeah, so it's actually this whole case. There's a big serial killer from the 70s, and they found a cigarette butt with DNA on it.

Speaker 3:

Yeah, the Golden State Killer. He killed over 13 women, almost 50 years ago. And now, because of these commercial databanks like 23andMe, the police investigators got, let's say, creative: they put his DNA in these databanks and actually got multiple matches.

Speaker 2:

Oh, wow.

Speaker 3:

Yeah, so they used.

Speaker 2:

Those were family members?

Speaker 3:

Yeah, okay, and cousins and all that. And then, okay, with a lot of effort in drawing the family trees, they actually found him and he's now arrested.

Speaker 2:

He was still alive.

Speaker 3:

Yeah, yeah, and it's all because of the creativity of these policemen. In Belgium you can't do that. The policeman just acted like it was his own DNA, because you have to fill in all these details, and then he just put in the sample from the cigarette butt that he found next to the victim. They just pretended to be a consumer, like you and me would do to find out who our ancestors are. Okay, wow, very creative.

Speaker 3:

It's not illegal there, so they can do it. There's not too much regulation, yeah. But in Belgium, impossible.

Speaker 1:

So we don't really have these. But, like, I'm drinking my drink now...

Speaker 3:

Yeah.

Speaker 1:

So what you're saying is, I could take this bottle after... He has a Ziploc bag next to him, he's just going to zip it up. So what you're saying is, you could send this to 23andMe as if it's yours.

Speaker 3:

Yeah.

Speaker 1:

And it'd just be like 50% Japanese or something.

Speaker 3:

Yeah, no, but I mean, I know that a lot of these police officers actually chased alleged murderers. They followed them to restaurants, and when they were done eating, they went to their plate, took their napkin or something, and that's how they could prove that the DNA on that napkin was the same DNA as at the crime scene. And then they could put it in these commercial databanks and they knew who that...

Speaker 1:

person was.

Speaker 3:

That's very creative, but then I mean who's controlling this and who?

Speaker 1:

Yeah. Are you for it? What is your opinion on this?

Speaker 3:

I mean, the family of these people that put their data in, they just wanted to know if they're 50% Japanese or 50% Brazilian. They didn't give consent for the investigation.

Speaker 1:

Yeah.

Speaker 3:

So their data is used for other reasons than what they put it in the database for. They just wanted to know where they're from, or to find their biological family, or something like that. And now suddenly their brother or their sister or their cousin is arrested.

Speaker 1:

So yeah, it's a bit... It feels very fishy.

Speaker 3:

I feel like if you give your data, you have to know for what purposes. I think that's one of the basics of privacy. Yeah, you need to consent. And now, because of creativity and a lack of legislation, they could just do what they want. Yeah, I see your point.

Speaker 1:

I think it's.

Speaker 2:

I think that is a discussion point, like I think the default is that you need to consent to everything it will be used for, but the question is should that also be the case for legal investigations? For serious legal investigations, or should they just be allowed to do it?

Speaker 1:

Yeah, it's a gray zone. But I think most people would agree that in an ideal world that's okay, right? If there is a crime, something very serious that you need to... And the question is, what is serious, right?

Speaker 2:

That is a threshold, and also, who's managing it, right?

Speaker 1:

Like, if you say there is someone looking at it, but you know for sure that it will stay there, that it's not going to go anywhere else. So it's the governing body, plus what actually counts as a serious offense, such that this is actually justifiable. And I think that's where people are probably not always going to agree, right? What's your stance on this, Bart?

Speaker 2:

On whether or not this should be allowed, yes or no? Or in which...

Speaker 1:

How should it be allowed, if in any way?

Speaker 2:

I think what Sofie was explaining, that it was done in the US by pretending to be a customer and using someone else's DNA, that should not be allowed in a formal legal context. I think it's maybe a bit of a controversial stance in our world, where privacy is holy, for good reasons. But if there are very strong arguments from a legal point of view that there is a serious crime going on, and because of that you basically ignore the consent law and open up someone's data, I think there should be an opening to do this, but with very clear legislation and a framework. And of course, that becomes super hard in a world where boundaries between countries become very virtual. So what

Speaker 1:

does that mean then in each country? Yeah, it's a tricky one. But you do think there should be a way? I think there should be a way. Okay. And you think so too, Sofie?

Speaker 3:

I think in Belgium we are now evolving towards that. Before, the only thing we could do was this: we have a database full of DNA from previous murderers, from people who were previously arrested, and the DNA at the crime scene could only be matched as a perfect match. That was the only way to match DNA. But that means it would only ever be about people that already committed a crime; new people were never in those databases. So that is just very limited, because we can see a lot in DNA. It's also very personal; no one has the same DNA as you, except your identical twin, maybe.

Speaker 1:

Yeah, yeah, yeah.

Speaker 3:

But it's something so powerful, and because of all these rigid legislations I think we're losing a lot of the research that has been done. I think we can actually go towards a more open framework for it, because it's for the greater good; I mean, it's to put away people that actually did bad things. For that, I stand by the idea that there could be a bit more flexibility.

Speaker 1:

Yeah. And maybe making this discussion a bit broader: doing machine learning and data science on medical data is also not always easy, because of all the privacy concerns, right? I feel like there's so much potential in medical applications for data science, but the legislation and the privacy aspect of it make it super hard to really get productive in that space.

Speaker 3:

But would you give yours? Because some people just say, let's all give our DNA, everyone, and then we have this whole database, and then every time there's a crime scene we always have a match. Would you, if they asked you right now, would you give your DNA?

Speaker 1:

I feel like the correct answer is yes, but I wouldn't.

Speaker 3:

You wouldn't?

Speaker 1:

Right. But I think that's the natural next question: why wouldn't you?

Speaker 2:

I don't think it will be secure enough that it will not get leaked. Okay, that's a good answer.

Speaker 1:

I was expecting, "I plan to commit a crime, Bart." But also, we all know how these things work.

Speaker 2:

There's bias in the system, yeah. More and more machine learning models are being used for decisions. Let's say I commit a crime; via my DNA, I'm linked to my kids. Will it impact decisions about my kids, for example?

Speaker 1:

But you mean people knowing that they are your kids.

Speaker 2:

People knowing that they are my kids.

Speaker 1:

Yeah, I think so, right. I mean, it becomes part of the feature set.

Speaker 2:

I see what you're saying. Like, try to apply for a government position. I mean, that could be the case today.

Speaker 1:

Right, like if my father is a convict or something.

Speaker 2:

Well, the thing is with you.

Speaker 1:

For example, we don't have a track record, because I'm from the exotic land of Brazil where everything goes. But that means also data quality is important.

Speaker 2:

Let's take this hypothetical example that you just told me, where I'm going to take your bottle just to know what your ancestry is. I'm going to log in, create an account on 23andMe as Bart Smeets, and use your DNA, just to get a view on you. Big data quality issue. Yeah, that's true. 15 years from now, 20 years from now, my kids will create an account on 23andMe and they will be, like, super confused. Are you Brazilian?

Speaker 1:

Japanese? Yeah, I can see that. Have you done the 23andMe, actually? No.

Speaker 2:

Would you do it, after this leak? I've always been a bit hesitant because it's a US company. I'm interested in the information that would come out of it, and it would be an easier decision if it were European-based.

Speaker 1:

But also because of the data privacy legislation, no? In the EU.

Speaker 2:

Yeah, exactly, yeah, that's the reason.

Speaker 1:

Which links back to what we were talking about. What about you, Sofie?

Speaker 3:

I'm pretty sure my ancestors are from Belgium. Yeah, it's pretty boring. But also, I know that these 23andMe databases are very American; there are not too many other ethnicities in them. Yeah, but there are a lot of American people that come from elsewhere. True, true. But that's why it's so popular there, because they want to know, "I'm 10% Irish, 20% Italian," because that's also how it is. But I'm not too interested in that for the moment.

Speaker 1:

You know, on the ski trip I was talking to Nick, because a lot of last names in Belgium are like Van Mechelen or something, right, which means "from Mechelen," or Van Heel, and Nick is from Heel. And Mechelen is a place, yeah.

Speaker 1:

It's a city here. And we have a colleague, Van Leuven, right? High level of creativity, yeah. But I mean, it's funny, not making fun of anyone, but in Brazil as well, if you go play football or something and there are two people with the same name, they usually give a nickname from the city you're from. For example, there's a football player called Alexandre Pato, and pato means duck, but actually it's because he's from a city called Pato Branco.

Speaker 1:

If you see his shirt, it says Alexandre Pato; Pato almost became his name. It's kind of the same thing, right? And I was telling Nick, there are a lot of people in my city, Rio, so that would be a Van Rio, and they've been staying there for generations, you know; it's kind of the same thing. In Brazil it's not common, of course, because we have a lot of people that came from other continents, including Japan, including Africa. But here in Belgium I find it very interesting that you still see that people have been here for generations and generations. So, just a fun fact.

Speaker 2:

I don't know how we got to this. No, because she said that her family's from here, and I was like, yeah, I believe you. Yeah, I think it was something that I noticed when I came to Belgium: people go to the same supermarket as their parents, and their parents before them, and their parents before them.

Speaker 1:

Oh really.

Speaker 2:

Like the same branch, not necessarily in the same place.

Speaker 1:

Like, for example, once you go Delhaize, you don't go back.

Speaker 1:

Maybe just another fun fact on the name thing. I heard this from a high school teacher in Brazil, maybe even before, so I haven't fact-checked it, but he was saying the word "name" comes from Latin or Greek, from the word "numen," which means destiny. And there are also last names that are professions, like Schumacher, right, like shoemaker, in Germany as well. I see Alex nodding; yes, Alex is my reference here. Which makes sense, right, because back in the day your profession was probably going to be the family profession. So it was your destiny to become a baker, a shoemaker, whatever.

Speaker 1:

So just dropping some maybe-true, maybe-not-true knowledge here. But it's fine.

Speaker 3:

You had trouble pronouncing my last name, right? But actually, in English it's very easy. It means "from the end." Supposedly it's because my family used to live at the end of the street. It's simple.

Speaker 1:

Oh, okay, even by the heart attack.

Speaker 3:

Yeah, no, but it can be very simple.

Speaker 1:

Okay, but it almost feels like nicknames, you know.

Speaker 2:

I just asked ChatGPT what Murillo means.

Speaker 1:

You know... Maybe it's very deep.

Speaker 2:

It's of Portuguese origin, often associated with Brazil.

Speaker 1:

Really? It says that?

Speaker 2:

Yeah, it is derived from the Latin word murus, which basically just means wall.

Speaker 1:

Yeah, murus is wall. But actually, I heard something else from someone about what Murillo means.

Speaker 2:

So "wall," to me, is a bit of a letdown, right? I expected more. But symbolically, the name Murillo can be interpreted to mean protection or strength, as walls are typically built to protect and enclose spaces.

Speaker 3:

Wow.

Speaker 2:

Can we get an applause there?

Speaker 1:

It's okay, just call me Murillo, it's fine. But actually, I heard that Murillo... I heard from someone, like in French... Does it mean anything, Alex? Yeah? No? Actually, I heard that it means something like "small wall," which to me felt a bit useless, but okay. So I actually knew that, but I just didn't want to put it out there for everyone to know. More on the data privacy part: computer vision data to make our roads safer. Sofie, do you happen to know anything about that?

Speaker 3:

Maybe. Yeah, I know there's been a really nice project with smart cameras and computer vision to detect whether people are using their phones or mobile devices behind the steering wheel while driving. We all know... I think every one of us has done it, and we know that it's not okay. No, no, no.

Speaker 1:

I've never done it. I've never done it. I don't know what you're talking about.

Speaker 3:

Okay then, okay then. And there was a really nice project with a really good image recognition model that worked really well, but again, Belgian policy, or Belgian legislation, didn't allow it, because it was very clear who was in the car, what they were doing, what they were eating, who they were with. It could detect if I was using my mobile, which was good, but it could also see everything else. And then they didn't let it fly. And who is the owner of the data?

Speaker 3:

How do we store it? Do we blur the faces? All these kinds of questions popped up. The organization was like, yes, of course we follow all the privacy rules. And still... Or it's a political thing, because in Germany and in the Netherlands they use it, and it works very well. It's very good for enforcement, and the fines are high.

Speaker 3:

Yeah, in the Netherlands, in Germany. You're going to Germany later, right? Just watch out, be careful, just saying. And it's actually very good as a campaign too. Would this image be flagged? No, no, we're looking at an image of a guy holding his phone. The images are not from within the car, of course. But this would be flagged if it came from the cameras, I think. Yeah, this is a good example.

Speaker 2:

This is also someone who is not holding his phone to his ear but in front of his chest, like the phone is on speaker.

Speaker 3:

Yeah, yeah. But someone else is driving the car. So this image here, this video, this would be the kind of image the smart camera makes. But you can see everything, right? Yeah, everything. Who is with you, who is not with you.

Speaker 2:

It feels a bit weird. But this is also...

Speaker 1:

This is okay. Oh look, this is really cool. I'm just kidding. So now we're showing an image; this is part of a YouTube video, and the links will all be in the description if you're curious. It's a picture from almost a helicopter view, so from above, and you see this person really went all out: the person is on the phone with both hands, and the person next to them is holding the steering wheel for them. So this is easy detection; the person is pretty extra, I would say. But yeah, I guess the issue here is that you can see the make of the vehicle, you can probably connect this information with more stuff, you see the people they're with, right?
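For the technically curious, the flagging step itself is not exotic. A rough, hypothetical Python sketch (the actual project's model is not public) using an off-the-shelf COCO-pretrained detector such as Ultralytics YOLOv8, which already ships "person" and "cell phone" classes; the naive version reduces to a co-occurrence check, and everything else the frame reveals is precisely what the privacy debate is about:

    # Hypothetical sketch, not the real enforcement system: flag a frame when
    # a pretrained COCO detector sees both a person and a cell phone in it.
    from ultralytics import YOLO

    model = YOLO("yolov8n.pt")  # small pretrained model; downloads on first use

    def flag_phone_use(image_path: str, conf: float = 0.4) -> bool:
        """Return True if the frame contains both a 'person' and a 'cell phone'."""
        result = model(image_path, conf=conf, verbose=False)[0]
        labels = {result.names[int(c)] for c in result.boxes.cls}
        return "person" in labels and "cell phone" in labels

    # A production system would also need to isolate the driver's seat, read the
    # plate, and, crucially, decide what NOT to store: passengers, faces, context.
    print(flag_phone_use("frame.jpg"))  # "frame.jpg" is a placeholder image path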

Speaker 2:

So I guess it's a bit more... To me this feels weird. Why, I can't really define, because when you go over a certain speed limit, a picture of your car is also taken. That's true. But that's, you know, the license plate.

Speaker 1:

Actually, this feels more personal. This is very personal. But what if the person was speeding right now? They would still take the picture, and the image would be similar, no?

Speaker 2:

When you're speeding... I know someone that got a ticket the other day, and there was actually a picture of this person.

Speaker 1:

Yeah, it's new here in Belgium. Oh, but what kind of picture? In the car, like, the person behind the wheel?

Speaker 2:

It wasn't a good picture. Yeah, it wasn't from the right side of the person. But even that felt a bit weird. But maybe that's the same as this, right? Like...

Speaker 3:

That's the same level of... But did it also show the whole dashcam, the whole windshield, who was next to you? Because... No, no, it zoomed in. I feel that is a lot of the issue here, that it's everything. Imagine you're with a person that is not the person you're normally with, and this data gets leaked, and all that. It's too much information. Well, also...

Speaker 1:

Yeah, so a friend of mine also got a fine in Germany. Apparently, according to this friend, the speed limit changed within a very short distance and there was a camera right after. It took a picture, and they sent the fine with the picture, and it was exactly the same... for me.

Speaker 3:

For you?

Speaker 2:

Oh, no, no, no, I'm not that friend.

Speaker 3:

Very smart, smooth.

Speaker 1:

That's what we do here. Police! Get him. But actually, there was a picture of the person, and it was exactly like that. The person was very serious; it looked very mean, very scary. And there was someone next to them in the car, but there was a big block.

Speaker 2:

A big cross-out. But that's the thing, you still have the data, right?

Speaker 1:

You have the data, yeah. But the thing is also, that can be very sensitive if they didn't block it out. Like, you send the fine to someone's house and then someone else opens it, right? And it's like, no, you know, I think you have something to hide. No, not my friend. My friend never had anything to hide; he was a straight-up fella, I would say.

Speaker 2:

But are there other use cases where this would be a problem?

Speaker 1:

I don't know, but that's the thing, right? The issue there was that you have the make of the car, you have what the person is doing, who they're with, maybe other things, right? I feel like this is less of a... I'm not sure.

Speaker 2:

I'm not sure if I feel like this is less of an issue. I feel a bit like, now it's cars, tomorrow it's crossing at the pedestrian light without...

Speaker 3:

But it is a very deadly use. I read that it's something like 50 deadly accidents just because of using your phone behind the wheel. So that always brings it a bit into balance. I'm like, yeah, sure, then maybe we can.

Speaker 2:

It's a slippery slope again. Where is the threshold?

Speaker 3:

You can also always think of it like that: imagine it's your brother that gets killed because someone is using their phone.

Speaker 1:

That's true. And then you can take all the data. I think here, on this thing, it says in 2021 alone there were 1,000 accidents caused by distraction, among other things by cell phone; this is in Germany. And then, if you go to the Belgian article, it says that according to recent estimates, mobile phone usage behind the wheel causes 50 road deaths and more than 4,500 injuries in Belgium every year. I feel that's a lot. Yeah, it's a lot. It is.

Speaker 1:

Yeah. But I think it can be a slippery slope, right? They're using the cameras, and then they're like, oh, maybe they're not talking on the phone, but maybe they're doing something else, or maybe they're not wearing the seatbelt, for example.

Speaker 1:

Right. And then, oh yeah, they're not wearing the seatbelt; maybe you can put a camera here, maybe you can put one there. Once you have the data available, I feel like it's, quote-unquote, easy to have a slippery slope there. I don't know. I'm all for reducing road accidents, of course, but I can understand.

Speaker 3:

There was one example where the person was petting his dog on his lap.

Speaker 1:

Driving.

Speaker 3:

While driving, yes. And then he also got a fine for it, and he was like, yeah, but was this spoken about? I mean, was this communicated?

Speaker 1:

Is there a law that you cannot pet your dogs while driving? I don't know.

Speaker 3:

It looks like it. Otherwise, how can you get a fine?

Speaker 1:

Yeah, I don't know.

Speaker 3:

I've never heard of that. Or it's something like, you have to have both hands free, or on the steering wheel, or something. Yeah, I heard that, actually.

Speaker 1:

Yeah, I think that's the thing. Which I always do, of course, but my friend, the one that got the fine, sometimes he doesn't. But yeah, these cameras, in one hour they can catch more...

Speaker 3:

Yeah, they see more people using their phone in one hour than a police officer sees in a whole year, because the police officer has to really see it happening while driving next to you, stuff like that.

Speaker 1:

So it's very... I mean, yeah, but that's how it works in Belgium, right? You have the speed cameras, which are automated, or you can have a police officer that can give fines if there are other infractions. I guess he has to see you using your phone, yeah, or not wearing the seatbelt, or making a turn without the turn signal or something, so it doesn't happen that often. And with these cameras it's so efficient. Yeah. In the US, I lived there for four years, and the...

Speaker 1:

The way they actually do traffic control is also with police officers, and they do it quite a lot.

Speaker 1:

That's the main way. So they have a lot of police officers, and, I may be wrong here, but I think they even have some quote-unquote quotas, like to give warnings and stuff. So they expect police officers to be on the road and keep the road safe, which I thought was a bit inefficient before I moved there, but now I actually think it's pretty good. Because in Brazil it's mainly speed cameras as well, and with Waze and all these other apps you know exactly where they are.

Speaker 1:

So in Brazil people speed, they slow down for the camera, and then they speed again, and all the other infractions just go unaccounted for. Actually, in Brazil it's a bit different, because it's not a regular police officer that gives the fines; there's a special officer, a traffic control guy, and they're the only ones that give fines. Regular police officers don't.

Speaker 3:

But then, I think it still gets very messy. But for this one, your verdict on the cameras for phone usage behind the wheel? I would say go. You're pro. Yeah, because also, you're on a public road. You're not in your house, you're not on your private domain, and you're actually putting other people in danger.

Speaker 1:

That's for me a hard line. But then, talking about the data privacy part of the discussion: you need to give consent.

Speaker 2:

You're not giving consent at all here, right? But that's law enforcement.

Speaker 3:

If a police officer is standing next to the road, it's exactly the same thing. Do you give consent then?

Speaker 1:

But the police officer is not a system, right? There's a difference.

Speaker 2:

Yeah, it's not stored anywhere. So if there's a police officer standing on every corner, that's fine with you?

Speaker 1:

Well, the Brazilian in me wants to say no, but yes. No, I think the whole issue is that the data is recorded and can be shared with people, right. With a police officer watching... Potentially leaked. Potentially leaked, exactly. If it's a police officer, he's not going to share his memories with someone, right?

Speaker 2:

Potentially used for different things.

Speaker 1:

Exactly right. What about you, bart? Are you pro or?

Speaker 2:

Well, after this person that I know who received this picture, it's too late.

Speaker 1:

If I just need to give a blanket yes-or-no statement, I'd say yes, I'm fine with it. Okay. I feel like you were a bit...

Speaker 3:

You know, strong-armed into that?

Speaker 1:

Uh, yeah. And you, would you?

Speaker 2:

Oh, that's not how this podcast works. To me it's very much a slippery slope. I think I'm fine with, next to a speed camera, a camera that also looks at whether or not you're holding your phone; maybe it's even the same camera. But from the moment it's at the pedestrian light, I'm like, maybe this one's too far. That becomes real-time surveillance of people, right, the Big Brother kind of thing. I don't know where the start and stop should be, to be honest.

Speaker 1:

What about seatbelts? I think that's a controversial one, because arguably….

Speaker 2:

I think it's only controversial for people from Brazil.

Speaker 1:

No, no, but I feel like the argument is, if I don't wear a seatbelt, I'm only putting myself in danger.

Speaker 2:

Seatbelts today, you cannot not wear them. Like, your car starts beeping, you go crazy. Yeah, unless you wear your earplugs, you know.

Speaker 1:

But don't get me wrong, I always wear a seatbelt. Yeah, my friend... No, just kidding. I'm fully for it, right. But if someone comes to me and says, look, if something happens, it's on me, I'm not hurting anyone... If you're on the phone, you may cause an accident, you may hurt someone else, right? If you're not wearing the seatbelt, whether there's an accident or not, you're just risking yourself. Yourself, exactly.

Speaker 1:

It's more of a self thing, and I should have control over my body, over what risks I'm taking, right? You want to go ski off-piste? That's your risk, you're making the decision. And I think that argument has some merit to it. I mean, assuming you can't fly through the windshield and hurt someone because you flew through the windshield, given that. But then you're saying that you would not appreciate it if they do it for the seatbelt, but yes for the phone?

Speaker 1:

Well, I think it should be enforced. I'm still for it, right. But if someone comes to me and gives me that argument, it wouldn't be very easy for me to just discredit them, let's say, or to rebut it. Actually, in Brazil, I think my history teacher told me as well, it's illegal to commit suicide. So if you try and you fail, you can be arrested. That's what I heard.

Speaker 2:

A bit of a hard one. Like, harm to yourself being the main risk. To put that in this situation, maybe like this: I have a speed bike, an electric speed bike, a speed pedelec, I think it's called.

Speaker 2:

You are required to wear a certain type of helmet, otherwise you can get a fine; there are special helmets for speed pedelecs. This would be a case where you'd say we're not allowed to monitor this with a camera next to a stoplight, maybe. Yeah, I mean, again, because it's only you that's at risk, something like that. I agree, I agree, but to me it's very hard to define that. But I still think it should be a regulation.

Speaker 1:

I still think you should enforce this. I still think so.

Speaker 3:

But you can enforce it, you should enforce this?

Speaker 1:

I still think so. But you can enforce it, maybe not with cameras, right, but the same way: you get a fine if you don't wear the helmet, a fine if you don't wear the seatbelt. If I was making the decisions, I would make the same decisions as well. And maybe you can argue that people who don't wear seatbelts don't understand the increased risk, so you just make legislation; the same thing with the helmets. Then you can say, yeah, but you're treating people like kids; you can go on and on. But if I was making the decisions, I would still pass a law, whatever, so that if you don't do this, there's a fine, there's this, there's that.

Speaker 2:

Not sure. We also have a question here: "Assuming the pictures taken are mailed to the address the car is registered to, along with the fine." I think that is what we would assume. I think the problem with this is that it will probably be mailed to the address the car is registered to, which might not be the driver's. And the thing is, of course it gets mailed, but it also gets stored somewhere, right? And who knows what happens to that storage.

Speaker 1:

My friend that got the German fine, it was virtual, right. But yeah, basically they just sent the fine.

Speaker 2:

The picture was in the fine. And we also don't know what will be illegal in the future. I don't know how long ago this was, two years ago? This is really out of my domain of knowledge, but I want to say that in Texas, abortion got banned. Ah yeah, yeah. And there is...

Speaker 2:

I'm not sure how it was implemented in the end, but at one point they were talking about: if you have a tip on someone actually going for an abortion, you can get a sort of bounty, you can get paid for these tips.

Speaker 2:

Oh, wow. Which immediately, for all the major tech players, means that, say, your database administrator becomes a potential leak, because someone is paying for this data. Ah, yeah. And when they were creating this database of which person on Google Maps was close to this abortion clinic, Google Maps probably never thought this would at some point be a problem. True. But suddenly there's a new state that you're in, and this is suddenly super high-risk data for the people involved. And I think that is also something that is very hard to grasp: even if you say, now, for this specific situation, this is okay, the data is there. And what does that mean a few years from now?

Speaker 1:

Yeah. Let me see if I can find it... All right: "Whistleblower website for snitching on abortion shut down." Yeah, it's a bit... It's what you started with, right? Like, you don't know what the future holds.

Speaker 2:

Well, for now, I'm very happy that I live in Europe.

Speaker 1:

Me too. But yeah, indeed.

Speaker 2:

It's like the rules of the game change in the middle. There's a follow-up question on the mailing of photos to your address, and I think it's a very fair one: how would it comply with GDPR? If you're on a public roadway, in public, there's no guarantee of privacy, but when they mail your photo to potentially someone else's address, the owner of the car, does it comply with GDPR? Good question. To be fair, I don't know. I would very much doubt it, actually.

Speaker 3:

But this person never consented to it. Yeah, indeed. But I think police or law enforcement, I'm not sure if they need that kind of consent.

Speaker 2:

Yeah well, I don't know, I can't really say. To be honest, it's a very good question.

Speaker 1:

We need a GDPR expert on here. Okay, in the interest of time, there's one more here on data and privacy.

Speaker 2:

Just to mention, because it's 10 minutes to five: if I become silent at five, it means that I snuck out, or that I had a heart attack.

Speaker 1:

The heart issues got the best of Bart, and we just carried on with the podcast. No. Maybe just one last one. I don't think we have a link for this, but it's more about cybercrime, fraudulent transactions, right. Oh, and by the way, you have a fan, Bart. Yay! I'm with Bart on this one. Whoop! Go, Bart. All right, you want to pause, or...

Speaker 2:

no, no, that's fine. My whoop whoop was fine, that's good.

Speaker 1:

That was not about the cybercrime, right? So: fraudulent transactions, phishing. Sofie, I think what you had brought up is, again: if there was a safe way, a safe framework, to monitor transactions and all these things, we could be preventive on phishing and fraud as well. It's a bit of what we mentioned before, right? You can be preventive, but opening up to that level... Would you be open to giving your bank the authority to analyze all your transactions and report to the police if they see something? They already do, by the way.

Speaker 3:

For severe cases, yeah. But it's more like: how comfortable are you yourself with that, for the greater good, for the safety of the community, for the safety of society? It's just that balancing act that is so interesting. But also, I don't know how it goes.

Speaker 1:

Yeah. For me, again, I think Bart also touched on this: do you trust that there is a system that is secure enough that you can disclose all these things and everything will be fine? And again, it's not the idea, it's putting the idea into practice that makes me a bit iffy.

Speaker 3:

I think we just need a lot of control on the control of your data.

Speaker 1:

Yeah.

Speaker 3:

That's just a whole.

Speaker 1:

Yeah.

Speaker 3:

And it's very expensive.

Speaker 1:

Yeah, and I think for things like financial data, it's very easy to find malicious intent, right? So that's why I think it's extra sensitive. This is the one that I'm, quote-unquote, the most against. Not sure. How do you feel about this, Sofie?

Speaker 3:

I'm always like: if it could prevent, for example, my grandmother from being phished anymore... Because, you know, all these elderly people are so easily phished, and it's so hard to track, because they're tricked into voluntarily doing the transactions to these malicious persons. If I could prevent that by giving the banks the authority to analyze, I would always go: yes, let's go. But on the other side, you also have people with bad intentions toward your own data, and that's just the balancing act.

Speaker 1:

Yeah, no, I see what you're saying. I think also, maybe the circumstances would change, right, if I had a personal experience with this. But yeah, I see your point. What about you, Bart? Want me to repeat the question? Yeah, maybe. If you could, would you disclose your financial transactions to stop phishing attacks and all this cybercrime? Especially for older people, I feel like they're more susceptible to these things. Is this something that you'd be okay with?

Speaker 2:

I'm not sure how it would stop cybercrime if I open up my transactions. What kind of transactions do you think I do? All these phished people's money comes to me?

Speaker 1:

No, I mean, I don't know. I'm wondering, if I make a transaction to you, it'll show up on my end and your end, right? And they would know that you're not a legitimate business or something, right? Poof.

Speaker 2:

No, it would not. Okay. But this does get done on a more macro scale, between banks, between clearing houses. There are blacklists and whitelists and monitoring on these things, and we are actually doing this. So, yeah, very much for it.
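A toy illustration of what that institutional monitoring can look like in code: a minimal, hypothetical sketch using scikit-learn's IsolationForest over two made-up transaction features, amount and hour of day. Real systems combine far richer features, rules, blacklists, and human review; nothing here is a bank's actual pipeline:

    # Hypothetical sketch of transaction monitoring: unsupervised anomaly
    # detection over toy features. Features and threshold are illustrative.
    import numpy as np
    from sklearn.ensemble import IsolationForest

    rng = np.random.default_rng(0)
    # Mostly routine daytime payments: (amount in EUR, hour of day)...
    normal = np.column_stack([rng.normal(60, 25, 500), rng.normal(14, 3, 500)])
    # ...plus a few large transfers at odd hours, the classic phishing pattern.
    suspicious = np.array([[4800.0, 3.0], [9900.0, 2.0]])
    transactions = np.vstack([normal, suspicious])

    detector = IsolationForest(contamination=0.01, random_state=0).fit(transactions)
    flags = detector.predict(transactions)  # -1 = anomaly, 1 = normal

    # The odd-hour large transfers land among the flagged rows.
    print(transactions[flags == -1])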

Speaker 2:

That's at the institutional level. Okay, but at the individual level... I mean, to me, the trend in all these things is: the moment it becomes too personal, it enters the privacy bubble of a person. Where is the line between "this is a good use case" and mass surveillance?

Speaker 1:

Yeah, yeah. All right. Maybe, Bart, you have a hot take? We have five minutes; I'm trying to be mindful of your time as well here. Shall we already do the hot take?

Speaker 2:

Well, I think we're wrapping up. No, yeah, okay, okay, okay, it's a hot take.

Speaker 1:

It's a snippet heard from a podcast of a very controversial person. By the way, we need a special beat or sound for the hot take. You know, we need that.

Speaker 2:

Bit of a side note, you know. Yeah, I see what you mean, but next time, next time. It's not that difficult, actually.

Speaker 1:

Let me try. You're going to do it now? Not sure if it works. No... Maybe the people listening to us will hear it. So, the hot take.

Speaker 2:

It's a snippet of a podcast from a very controversial person, so we should not focus on the person: it's from Kid Rock. Again, not to focus on the person. I heard a snippet this past weekend of him with Theo Von, he was apparently on there, and he said something that I did find interesting, because we've talked about this in the past, about GenAI and the impact on music and stuff like that.

Speaker 1:

Is this the podcast?

Speaker 2:

Yeah, I think so, this one. Past weekend, I assume. Unless he was there multiple times. And he said: "I don't give a f*ck about GenAI." He was talking about GenAI-generated music using his voice. So, basically saying: use my voice, I don't give a f*ck.

Speaker 2:

Whatever that means. Yeah, okay. Which I find interesting, right, because we talk about this a lot, and one of the things that comes up is: who has the copyright? Which is already very difficult when it comes to image generation and a certain style, but there you can still apply copyright law. When it comes to deepfakes, copyright law doesn't even apply; you need to be very creative to bring this to court under copyright law. And this is more or less a deepfake of his voice. And we see this a lot; the performance of these things is also becoming very good.

Speaker 2:

I think his stance is probably not the average stance in the music world. I think people are concerned about their voice, or whatever of theirs, getting used without any agreement or without any royalties. And he explained a little bit why he thinks so. He said, a bit jokingly, it would be cool if someone does the work for me and creates a good song, because no one makes shit from royalties. The only thing we as musicians make money from is performances, live performances.

Speaker 1:

So basically, if you get more popular because people created GenAI music for you people are going to come to your shows and you're going to make more money. Exactly.

Speaker 2:

That was his reasoning, yeah. And he was saying, I mean, no one else is going to give a live performance for me. Are you going to buy a ticket for me, or for the Kid Rock cover band? He's right there, right?

Speaker 1:

There was a virtual artist or something... They had holograms of ABBA, the four people of ABBA.

Speaker 2:

People go every day to their shows. Oh wow. That probably goes to the estate, or the fund behind it. But I thought it was an interesting take. It's probably an edge take, for sure, but it's an interesting way to look at it, especially in the music world, because for the individual artist, probably most of the money comes from performance.

Speaker 1:

But when it comes to, let's say, images, paintings, this type of digital art, it very much comes from the work itself, right. Yeah. But then, I think, if someone figures out how to make money with his GenAI songs, he would care at that point.

Speaker 2:

Well, probably he has too much money to really care.

Speaker 1:

But yeah, if it would just be a lot of money... Yeah.

Speaker 3:

What if the generated song is actually very popular, and then he's forced to sing it at his performances?

Speaker 2:

Well, his stance was: that would be easy, because then I don't need to write a song. Okay, but what if it's the opposite?

Speaker 1:

What if it's a song that reflects very negatively on his image?

Speaker 2:

I think that's harder for Kid Rock. But okay, yeah, for other artists that could be the case, yeah.

Speaker 1:

Interesting, never thought of it from that point of view, yeah.

Speaker 2:

And it's interesting to see how this will evolve, because YouTube is also working on this, generating voices and songs via GenAI, but very much in close collaboration with artists, with explicit consent. Sam Altman was also on the podcast with Lex Fridman, talking about how, if your content, like your images, is used to generate something in that style, you should in some way profit from that. They're thinking about: how can we link the output back to the original content, so that in some way you can remunerate the creator? I think we're still far from that, because it's very hard to link these things. But it's interesting to see how this field will evolve. Yeah, I don't think so either; it evolved so quickly that today it's a bit of a free-for-all.
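How generated output could be linked back to source material is an open problem; one direction people explore is similarity in an embedding space. A toy, hypothetical sketch: embed the generated clip and each artist's catalog with some audio model (random vectors stand in for real embeddings here), then split royalties by similarity. The artists, vectors, and split rule are all illustrative, not any platform's actual scheme:

    # Toy sketch of attribution-by-similarity; the vectors are random stand-ins
    # for embeddings from a real audio model, and the split rule is illustrative.
    import numpy as np

    rng = np.random.default_rng(42)
    catalog = {"artist_a": rng.normal(size=128), "artist_b": rng.normal(size=128)}
    generated_clip = catalog["artist_a"] + 0.3 * rng.normal(size=128)  # derived from A

    def cosine(u: np.ndarray, v: np.ndarray) -> float:
        return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

    scores = {name: max(cosine(generated_clip, emb), 0.0) for name, emb in catalog.items()}
    total = sum(scores.values())
    royalty_split = {name: s / total for name, s in scores.items()}
    print(royalty_split)  # artist_a dominates, since the clip was derived from them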

Speaker 2:

When it comes to royalties and copyright law and these things, it's a bit free-for-all. I don't think it will stay that way, I agree.

Speaker 1:

And talking about generating music from other people's likeness, maybe we can already hit the outro sound. Yes. "You have taste... In a way that's meaningful to someone."

Speaker 2:

Sofie, thanks a lot for joining us. You're very welcome. Thanks, Murillo. I'm Bill Gates. Thanks everyone for listening.

Speaker 2:

I also just want to say, I would recommend TypeScript. Yeah, it writes a lot of code for me, and usually it's slightly wrong. I'm reminded, incidentally, of Rust here. Rust, Congressman. iPhone is made by a different company. And so, you know, you will not learn Rust while skydiving. Well, I'm sorry, guys, I don't know what's going on.

Speaker 1:

Thank you for the opportunity to speak to you today about large neural networks. It's really an honor to be here. Rust.

Speaker 3:

Rust, Rust, Rust. Data Topics. Welcome to the data...

Speaker 2:

Welcome to the Data Topics podcast. Ciao, bye everyone.
