DataTopics Unplugged

#45 Tech Check: Amazon's AI, Rust vs. Go vs. C++ and the Intricacies of AI in Coding

April 12, 2024 DataTopics

Welcome to the cozy corner of the tech world where ones and zeros mingle with casual chit-chat. Datatopics Unplugged is your go-to spot for relaxed discussions around tech, news, data, and society.

Dive into conversations that should flow as smoothly as your morning coffee (but don't), where industry insights meet laid-back banter. Whether you're a data aficionado or just someone curious about the digital age, pull up a chair, relax, and let's get into the heart of data, unplugged style!

In this episode, titled "#45 Tech Check: Amazon's AI, Rust vs. Go vs. C++ and the Intricacies of AI in Coding," we are joined by special guest Lukas Valatka as we peel back the layers of AI's reliability and dive into the coding languages shaping our digital futures: 

  • Amazon's "Just Walk Out" Technology: A deep dive into the challenges and human supervision required behind Amazon's cashier-less shopping.
  • Rust vs. Go vs. C++: Is it time to join the Rust rush? We dive into the language loyalty debate and compare the productivity and features of Rust versus traditional languages like C++ and Go.
  • The API Balancing Act: The trade-offs of updating PyO3's API with user convenience in mind.
  • Generative AI and Refactoring: Weighing the upsides and downsides of AI-driven codebase refactors.
  • Tech Debt and Rust Libraries: Turning tech debt into an asset by internalizing open-source libraries.
  • Cursor - AI-Powered Programming: Exploring the intrusive potential of AI's filesystem access in code editing.
  • Airbnb's Open-Source Gambit: Should we take an interest in Airbnb's Chronon, or is it just another big fish in the open-source sea?
Speaker 1:

You have taste in a way that's meaningful to software people.

Speaker 2:

Hello, I'm Bill Gates. I would recommend TypeScript. Yeah, it writes a lot of code for me and usually it's slightly wrong.

Speaker 1:

I'm reminded, incidentally, of Rust here.

Speaker 2:

Rust. Rust. Congressman, iPhone is made by a different company, and so, you know, you will not learn Rust while you're at it.

Speaker 1:

Well, I'm sorry, guys, I don't know what's going on. Thank you for the opportunity to speak to you today about large neural networks. It's really an honor to be here. Rust. Rust. Data Topics. Welcome to the Data Topics...

Speaker 2:

Welcome to the Data Topics Podcast. Welcome to the Data Topics Podcast. Um, we're on YouTube. This episode is unfortunately not live streamed, but if everything goes well, you should be seeing this on YouTube later on. We're not live on the other platforms like LinkedIn and whatnot this time, but feel free to check us out there, and feel free to leave a comment on the video as well. We'll try to reply to you. Today is the 9th of April of 2024. My name is Murilo, I'm your host for today. I am not joined by Bart. We should have a button set. Oh yeah. Um, we are joined by Lukas. Hey, Lukas, hello, how are you? Um, we'll get back to you in a second.

Speaker 2:

And behind the camera, the one and only sound engineer, Alex. Hello. All right, Lukas, maybe you want to introduce yourself real quick for the people. You have been on the show before. Yes, you are a friend of the pod. Um, yeah. You want to introduce yourself to the people that missed the first episode, people that don't know who you are yet? Yeah, I think it was a month and a half ago. Oh really, already?

Speaker 1:

Yeah, time flies. Time really flies. Yes, so I'm Lukas, hi, and, uh, I'm an ML engineer here at Dataroots and, uh, yeah, now a second-time podcast guest. Yes. Very, uh...

Speaker 2:

Python-savvy developer. Yes, yes, that's cool, cool. Uh, Alex, actually, were you here last time when Lukas was a guest, or not?

Speaker 1:

no, no, I don't think so it was much more shabby before.

Speaker 2:

Yeah, we had the... Alex, you know, it looks professional. I don't feel like... I don't feel weird, like, you want me to be there? You sure? Yeah, cool, cool. And, uh, what's new with you? Any life updates since last time? Anything?

Speaker 1:

Yeah, actually, when you said that... yeah, what did you do last week or something? Yeah, so last week I was back in Lithuania, so I flew a couple of hours north and spent a couple of weeks there, so it was nice getting back with family and all that. Nice, nice. And you said the weather there was pretty crazy.

Speaker 2:

no, yeah, it was 22 degrees.

Speaker 1:

then suddenly it started snowing, so you can imagine the contrast. In one evening you just get out and it's like wow, you're having drinks outside.

Speaker 2:

Yeah, yeah, yeah, man, I don't feel so good.

Speaker 1:

I don't know what's happening. I think I drank too much. Cool, cool. Good food, family? Yeah, the food is good, very greasy, but good. Actually, I'm not even sure what Lithuanian food is, what's traditional.

Speaker 2:

If I had to try one dish from Lithuania, what would you tell me to try?

Speaker 1:

Damn. These questions are usually very tough.

Speaker 2:

I mean, but if you ask, me.

Speaker 1:

I think we have these potato dumplings called zeppelins.

Speaker 2:

Zeppelins.

Speaker 1:

They're not fully Lithuanian, but they're kind of yeah Enough.

Speaker 2:

Yeah, yeah.

Speaker 1:

There's also borscht. You know borscht, the Ukrainian red beet soup. Yes, so the Lithuanian version is the cold one.

Speaker 2:

Oh yeah, it's pretty neat. So maybe even that, maybe I'll try it someday. I'll try it someday. What about you, alex? Anything exciting last weekend, I know.

Speaker 1:

Yeah, yeah. I said I didn't have any plans last week, but my parents came to visit me and we went to my sister's football game, or soccer game, depending on where you're listening from.

Speaker 2:

Okay, she plays in Belgium. Is it, like, amateur, professional, semi-professional? I want to say semi-professional.

Speaker 1:

Oh wow, so she's pretty good then.

Speaker 2:

Oh wow, okay, really really cool. My weekend? Oh, thanks for asking. Oh yeah, sorry. It's okay, it's fine, I'm used to it. I was in Venice, actually, for the weekend. Oh yeah, yeah. Don't... don't brag. But you know, whatever, I just enjoy my life. You know, whatever, my life is dope, what can I do? I'm just kidding. Um, but cool, what do we have for this week? Let's see, let's see. Amazon. Uh, you have something here, Lukas, from Amazon: Just Walk Out. Yes. What is it about?

Speaker 1:

Yeah, I think we saw it last week. So apparently Amazon has these self-checkout shops where, you know, you take an item, you just leave the shop, I guess, and then they kind of charge you. It sounds super exciting.

Speaker 2:

This is not that new, I think. I used to live in the US, and when I lived there they were announcing it already. I think it used to be called Amazon Go or Go Shopping or something. So it's like, you scan a QR code on your phone when you walk in, and then the idea is that you just go in, you pick whatever, you put it in your basket and then you just walk out, and then they charge it on your Amazon account.

Speaker 2:

My brother also works for AWS, and he happened to go to Seattle, which is the headquarters of AWS, after they announced it, and they actually were encouraging my brother to go to these shops. I think they gave him vouchers and stuff, because they were really testing the algorithm. Because, for example, I'm not short, but I'm not the tallest guy either, right, let's just say it like that. So, for example, if there is something on the very top shelf and I cannot reach it, I can say, hey, Lukas, can you grab it for me? And you grab it and give it to me. The algorithm should be able to detect that I should be the one charged, all right.

Speaker 2:

Or, for example, if you have parents with kids and the kids run and pick something up, you know, that could happen. Or what if you return an item to the wrong place? Right, there are a lot of edge cases, and they were encouraging my brother to give it a try so they could hash this out. And I think the technology is called Just Walk Out.

Speaker 1:

So there's no way you can shoplift right.

Speaker 2:

Yeah, that is a way. It's unshopliftable, I guess, unless you don't scan the QR code, right. But I think I even saw a sketch from Saturday Night Live or something, where they were interviewing marginalized groups, right, and they're like, yeah, you just take it and just walk out. It's like, are you sure? Nice try, no, I'll pay for it, it's fine. Um, but yeah, true, that's... I think that's the premise. That's the premise. So what is this about, Lukas?

Speaker 1:

Yeah, and so... I think it's been seven years now, based on your story, in the US, for this technology.

Speaker 2:

Oh, how old do you think I am? No, it's been a while.

Speaker 1:

Yeah. So given that, you might think, ah, but the technology now covers all the edge cases. And what they discovered was that actually they have real humans kind of making sure that you're charged, kind of offshore somewhere, I think. And that was kind of weird, because it was a bunch of people in India, like, really? And Amazon said that it was for training data or something. It sounds like it's just to train the algorithms, essentially, to make them better.

Speaker 2:

It's been like seven years as well. Yeah, it's a bit weird.

Speaker 1:

Exactly, to me it sounds weird, because seven years is a long time to get enough data already. It's like, yeah, why do you need those? So my thought was like, yeah, you always have these edge cases. Essentially they haven't covered all the edge cases and they need those guys, yeah.

Speaker 2:

Yeah, it also speaks to... I'm thinking to myself here as well: how, even in AI, we actually see that a lot of the use cases are not ones where the AI makes the decision, quote unquote. Usually it's a supervised process, right. In this case, I think it's saying here that Just Walk Out sales are reviewed by real people in India. So I mean, maybe the algorithm works really well and they're just reviewing some cases, right?

Speaker 1:

But since they're reviewing, it gives an indication that... I mean, there is a certain extent at which you can say it's kind of perfect, yeah, but if they're still reviewing, and the numbers say it's quite a lot of people, is it really good?

Speaker 2:

Yeah, it's like... maybe. Yeah, I think here it's saying that 700 of every 1,000 Just Walk Out sales had to be reviewed by Amazon's team in India in 2022.

Speaker 1:

That's a lot, that's quite a bit. So it's like, yeah...

Speaker 2:

Internally, Amazon wanted just 50 out of every thousand sales to get a manual check, which I think is way more feasible. But indeed, yeah, 700 out of 1,000. I just did some quick maths here in my head, that's like what, 70%? Yeah, pretty much.

Speaker 1:

It's pretty big.

Speaker 2:

It's just a manually supervised process. Yeah, but I'm also wondering if this is really about accountability. There's also the philosophical side of AI.

Speaker 2:

Right, like you have a self-driving car, yeah, it gets into an accident, it hits someone, someone goes to the hospital or something worse happens. Who is liable for this? I think today, even with Tesla, you cannot just let the Tesla drive, right, you still need to have someone behind the wheel, and that person is always accountable. In that sense, the driver is the supervisor. But in a world where everything is fully automated, what's the accountability there? Right. I'm gonna be philosophical today, so just stay with me, bear with me. Um, there's also the trolley problem. Have you ever heard of the trolley problem?

Speaker 1:

Is it the one that you need to choose the rails?

Speaker 2:

basically yes.

Speaker 1:

So one person, or seven people, or something. Yes, so yeah.

Speaker 2:

The trolley problem is a thought experiment in ethics about a fictional scenario in which an onlooker has the choice to save five people in danger of being hit by a trolley by diverting the trolley to kill just one person. So that's the thought experiment, right. And most people... or, I don't know if it's really most people, but there's an argument to say that killing one person is better than killing five persons, so you should take the action. But then there's also a line of thinking that by engaging you become part of it, and there are also the personal feelings involved.

Speaker 2:

You can change the trolley problem a bit to say, okay, instead of pulling a lever (there's an image here for the people following the video), what if you have to push someone off a bridge, for example, so the train hits them first before getting to the five people? People change their opinions, etc., etc. Which is a very philosophical thing. But if you're talking about self-driving cars and AI, arguably you could program those choices in, right? Like, imagine there's a self-driving car in one lane and for some reason something falls into the lane, right, so the car can swerve left, swerve right or just hit the object, right. The response time for an algorithm... machines are faster than humans, and it's premeditated, right, you can think about this beforehand and program that in.

Speaker 2:

Now, if you say, on the left you're gonna hit one person, on the right you're gonna hit five people, or you're just gonna hit the object and put the driver's safety at risk, which one should you choose, right? So then it becomes a bit of a philosophical problem, and if that happens, then is it the developer's fault? Does the company have a policy on this, right? So I don't have answers, but I have heard different sides of this discussion. What do you think, Lukas?

Speaker 1:

Yeah, I think it's... yeah, it's exactly that. Like you said, you can probably program these cases, but what can happen is that there's this other edge case in which, if you have, I don't know, a pattern recognition machine or something, it doesn't really know what to do, but then a human could even make a completely wildly different choice or something, I don't know.

Speaker 1:

It's like, if you could cover all the edge cases, then the developer would probably be kind of liable, yeah, for the choice made. But if there is a third choice or something, then yeah, I don't know who is liable then, right?

Speaker 2:

Yeah, it's very philosophical. Yeah, indeed. But I'm wondering how much of the Amazon Just Walk Out technology is also about liability, yeah, wrong charges or something. Yeah, I mean, I think in practice it's a combination of things, right. But if you have the perfect algorithm and there is a mistake, or there is, I don't know, some philosophical edge case or something... not sure. Did they actually mention in the article whether this is improving or not?

Speaker 1:

Or not really. They said, yeah, there was something about training data, so it's like they're trying to avoid the topic. Related to this, though... what's the word, I don't know. Basically, it's not just Amazon who develops this kind of thing. So if I can shamelessly plug in something... Do it. There's this Lithuanian company, Pixavia, I think, that develops these self-checkout shops, now in Lithuania and Germany, and they're doing quite well, but they're a bit overshadowed by Amazon and all the big players.

Speaker 2:

Yeah.

Speaker 1:

I was just saying that maybe there are these tiny players that could spin off something even more robust.

Speaker 2:

True.

Speaker 1:

We're about to see.

Speaker 2:

True, but I'm also, I feel like to me, I don't know, I don't know. I mean, I think it's a cool technology.

Speaker 1:

Yeah.

Speaker 2:

It's a cool idea, but I'm wondering how much it's needed, you know?

Speaker 1:

Saves a bit of uh yeah, I wonder.

Speaker 2:

The main idea is that it saves money by eliminating cashiers and all that, and I guess it saves time for consumers, right? I think that's actually probably the biggest thing. No, because I'm just wondering, like with the self-checkouts and whatnot, you still have people in the store, right? You're not eliminating people.

Speaker 1:

Yes, it's like with these... do you also call these self-checkout thingies, in the supermarkets? Well, yeah, not with an AI, but the ones where you scan yourself. Yeah, you scan yourself. So you also need supervisors for that. Yeah, a couple of people for...

Speaker 2:

Yeah, one person for 20 machines or something. Yeah, this one should be the same. So one out of 20... what's the percentage of that? Way less than...

Speaker 1:

Five percent or something, yeah way less than what they're paying now. So it's much more efficient just to keep these.

Speaker 2:

True, true, true, true. And speaking of efficiency, have you heard of this programming language called Rust? Rust, Rust, Rust?

Speaker 1:

No, where is Bart when we need him?

Speaker 2:

Where is Bart when we need him? What about Rust, for the people that have never heard of Rust? I mean, if you listen to the show, you've probably heard of Rust at least once, and twice when Bart's in the episode, because he repeats it a couple of times. Rust is a programming language that is super efficient. That's the premise, right? It's very efficient. Traditionally, C++ was on the throne, maybe you've heard of C++ if you're listening, but Rust has a different philosophy on code that basically makes it more reliable, like bug-free, quote unquote, with memory safety and all these things. Why are we bringing it up? Not too long ago, but not last week, there was Rust Nation UK, and someone from Google, which is also the creator of the Go programming language, gave a talk. Actually, let's see who this person is again.
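To make that memory-safety claim a bit more concrete, here is a small, hedged Rust sketch (not from the talk, just an illustration with invented names): the compiler refuses code that could read invalidated memory, which is the class of bugs Rust is said to remove.

```rust
fn main() {
    let languages = vec![String::from("Rust"), String::from("Go")];
    let first = &languages[0]; // immutable borrow into the vector

    // Making `languages` mutable and adding the line below does not compile:
    // a push may reallocate the vector and invalidate `first`, so the borrow
    // checker rejects it at compile time instead of letting it become a
    // use-after-free style bug at runtime.
    // languages.push(String::from("C++"));

    println!("first language: {first}");
}
```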

Speaker 1:

He's the director of engineering at Google Android Very controversial.

Speaker 2:

Yes, and in his talk, among other things, he was comparing the productivity of using Rust versus Go and C++. And productivity here means: as a developer, how fast can you write a new feature, how fast can you write the application, right? And the premise is usually that Rust is faster after you write it, but the process of writing it takes much longer, right? Um, and I have experience with Python and a bit with Rust. In my experience so far, yeah, I'm more proficient with Python, and there are fewer things you need to worry about when you're programming Python, for sure.

Speaker 2:

But in this talk the speaker defends that... and I think the speaker's name is, let's see, Lars Bergstrom, this guy, and you see here, Director of Engineering at Android. So he works for Google. But he was still making the claim that Rust teams at Google are as productive as ones using Go and more than twice as productive as teams using C++. So basically they did an experiment. So, first pointer: he is a director of engineering at Google, but, doing some quick research, he's also chair of the Rust Foundation board of directors and a previous Servo project member.

Speaker 2:

So yeah, take it with a grain of salt. Yeah, he's definitely involved with Rust. But basically there's a whole bunch of benchmarks. So Rust, because it's memory safe and all these things, apparently requires less maintenance, meaning there are fewer bugs, so you don't have to worry about it as much and work on it after you've deployed. But apparently in development as well, and I can share some timestamps from the talk: people feel they are as productive in Rust as they are in the other languages they were previously writing in after, I think they said, six months.

Speaker 2:

Yeah, there's a quote here: Google found that porting Go to Rust takes about the same size team and about the same time to build it, so there's no loss of productivity, and we do see some benefits from it; we see reduced memory usage and a decreased defect rate over time. So basically there are fewer bugs over time and memory usage is also lower. So it feels a bit like they're just advertising Rust, right? They're just saying: switch to Rust now, because the only argument for not using Rust was that it would take you longer to get up to speed, and that's not the case.

Speaker 1:

I wonder... So some languages are much more loved by developers than others, but it's a non-objective thing. People are kind of very passionate about Rust, you know, in general, because it's a very cool new concept, and I don't think people are as passionate about Go or even Python sometimes. Yeah, I think actually, like you mentioned, I would write Rust faster than, let's say, Go, maybe because it seems like a much more sexy language.

Speaker 1:

I don't know, that's kind of weird. That's why it doesn't feel like the working hours are too long if you write Rust or something, maybe.

Speaker 2:

Yes, I don't know. Yeah, I think there's a lot to it. It's very much about, yeah, how proficient you are with the language, right. There's a lot of stuff to it. I mean, you also mentioned the love for programming languages.

Speaker 1:

Yes, oh, it's a new cool thing.

Speaker 2:

You know, my job is so boring now... But you know that Rust, according to the Stack Overflow survey, for I think seven years in a row, is the most loved programming language. That's why there's always the thing about rewriting it in Rust, and the Rust fanboys, and all these things, right.

Speaker 2:

Yeah, Rust has been getting a lot of space, I feel. I think Linux also supports Rust now, so it's been getting more and more space in the industry. I do feel like, again, I'm not a Rust developer by any means, I am curious about Rust, but I do see people posting, like, oh yeah, the number of people that are looking for Rust developers is increasing over time.

Speaker 1:

So it may be definitely increasing, yeah, so I think it's a.

Speaker 2:

In a way, it's a safe bet. Yes, to bet on Rust, very much so.

Speaker 2:

And Rust also has some other cool things. So, for example, Rust has FFI, the foreign function interface, meaning that you can have a function written in another language, the foreign function, and bring it to Rust. Rust natively supports C, and that's actually what most programming languages support, and through C you can get to Python. Actually, the most popular implementation of Python is in C, it's CPython, right? So Rust actually became a very popular choice for Python packages. Whether it's exactly because of that I'm not sure, but I know that the way Python and Rust interact is through the C layer. Did you know that?
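As a rough illustration of that C bridge (function and library names here are invented, not from any real project): Rust can export a plain C symbol, and CPython, itself written in C, can call it through something like ctypes.

```rust
// Hypothetical example: exposing a Rust function over the C ABI.
// Built as a `cdylib`, the resulting shared library has a plain C symbol
// that Python can load, e.g. via ctypes:
//
//   import ctypes
//   lib = ctypes.CDLL("./target/release/libadder.so")
//   print(lib.add(2, 3))  # 5
#[no_mangle]
pub extern "C" fn add(a: i32, b: i32) -> i32 {
    a + b
}
```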

Speaker 1:

No, no, no, but it makes kind of sense, it clicks.

Speaker 2:

Yeah. The reason why I'm also bringing it up is not to show off my knowledge of Rust and Python, but because I came across this article that talks about PyO3, and PyO3 is a Python package and a Rust crate, I guess, yes, that basically makes it very easy for people to take a Rust package and expose it in Python, like an extension.
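For a sense of what that looks like, here is a minimal PyO3-style binding sketch. The module and function names are invented, and the exact macro signatures depend on the PyO3 version you use; treat it as an illustration rather than the article's example.

```rust
use pyo3::prelude::*;

// Hypothetical example: a Rust function exposed to Python through PyO3.
#[pyfunction]
fn count_words(text: &str) -> usize {
    text.split_whitespace().count()
}

// After building the crate (for example with maturin), Python can simply
// `import fastwords` and call `fastwords.count_words("hello world")`.
#[pymodule]
fn fastwords(_py: Python<'_>, m: &PyModule) -> PyResult<()> {
    m.add_function(wrap_pyfunction!(count_words, m)?)?;
    Ok(())
}
```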

Speaker 2:

Yeah, like an extension, like a binding, they call it, right. And for whoever's familiar with TensorFlow, PyTorch, I think even NumPy, and I think cryptography is an old one: these libraries are actually written in C or C++, I want to say, and then there's a thin layer of Python on top. So basically, whenever you run computations, it's actually running at the C level, which is very optimized. And Rust has been gaining popularity and starting to, quote unquote, replace some of these things. So, for example, Polars is a framework that is written in Rust, Pydantic is written in Rust, cryptography is actually in Rust now as well, Hugging Face tokenizers is actually in...

Speaker 2:

Rust as well. So there's quite a lot, and I think all of them actually use this PyO3 for binding, for creating the binding. Actually, I've used it before; it's actually quite easy, surprisingly easy, to do something in Rust and bring it to Python with this. There are a lot of ways I could take this article that I read, so, for example, breaking changes and all these things, because I know Bart was throwing me under the bus some weeks ago. Um, but basically, what is this article about? It's a bit technical, so everyone take a deep breath with me. Basically, they are changing the API, right.

Speaker 2:

They're changing the way the code needs to look for it to work as expected, basically, and in this article they're motivating why they're making this change. One of the things they say is that by doing this the Rust code will actually be faster and use less memory, right? So they show some stuff here, and they show the differences in the API. When I say differences in the API: for example, here it says PyList::new, and on the next one you see PyList::new_bound. So that's the change in the API. Not a lot, but it may change things. Still with me, Lukas? Yeah, yeah, okay. The reason why they do this is lifetimes, and I'm going to try not to get too much into it, but it's about what the lifetime attached to these references means in Rust today.

Speaker 2:

This lifetime means two things. One, it basically tells Python that you can play with it and it's not going to break anything memory-wise; you can interact with Python safely, right? That's one thing. So this is, to quote the article, the lifetime 'py, meaning that Rust code can safely interact with the Python interpreter. Okay, but it also means something else. It also means that this is a lifetime...

Speaker 2:

...for which Rust code owns a reference to the Python object in question. And because it means two things, it makes things way less flexible, right? In practice, if anyone's still following me on this one, yeah, by decoupling these two things, and that's why they introduce this Bound notation here, right, because basically you're binding how long this object will live, quote unquote... by decoupling these two things, what you can do now, starting in PyO3 0.21, is delete objects from memory at any time, quote unquote. You can basically choose when you want to drop something from memory, which makes things way more memory efficient, which is actually what the previous item was talking about, Rust memory efficiency, right. Yeah, I'm not going to go through all the details, so if you're interested in Rust and PyO3, I would encourage you to have a look.
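To make that concrete, here is a rough before/after sketch of the constructor change mentioned above. It is a hedged reading of the idea (assuming PyO3 0.21's Bound API), not the article's exact example, and the values are invented.

```rust
use pyo3::prelude::*;
use pyo3::types::PyList;

fn main() -> PyResult<()> {
    // Needed so this standalone example can start the interpreter itself.
    pyo3::prepare_freethreaded_python();
    Python::with_gil(|py| {
        // Old "GIL ref" style (pre-0.21): `PyList::new(py, ...)` returns a
        // `&'py PyList` tied to the whole `with_gil` scope, so PyO3 keeps the
        // object alive until the closure ends.
        // let list = PyList::new(py, [1, 2, 3]);

        // New Bound API (0.21+): `Bound<'py, PyList>` is an owned handle, so
        // the Python reference count is released as soon as it is dropped.
        // In other words, you choose when the object goes away.
        let list = PyList::new_bound(py, [1, 2, 3]);
        drop(list);
        Ok(())
    })
}
```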

Speaker 2:

It also talks a lot about lifetimes and whatnot, but they do admit that this will make it more complicated for people to learn.

Speaker 1:

So it's like you get the performance gain.

Speaker 2:

Exactly.

Speaker 1:

You have the performance gain, which is cool and nice, but then... Exactly.

Speaker 2:

So basically, also for completeness here: they even noticed that there was one package in Python, which uses coroutines and whatnot, that this assumption of bundling these two things together would make unsound. So I think it's interesting. I also like that they mention this is going to make things more complicated for developers and people that are coming from Python to Rust. This article is also interesting because it really highlights the difference between Rust and Python: how in Python you don't have to worry about a lot of these things and in Rust you do, right. And in Rust you already had to worry about more things.

Speaker 2:

But now, with this change, you have to worry about another thing, because you're decoupling two meanings, yeah, right. Um, and my question is basically: there's a trade-off between giving the user more control and something that is easier to use. Where do you stand on this trade-off?

Speaker 1:

I think whenever you develop a package or something, you eventually have to face this dilemma: how abstract your API is, so how much control you want to give out versus how easy it is to use. And if you make your interface more and more detail-oriented, so you give out control, then it becomes more complex to be digested, essentially, by users.

Speaker 1:

So yeah, I don't know, it depends, really. I would just... you have to do field tests, really.

Speaker 2:

do you mean by field tests?

Speaker 1:

I mean really test with users, yeah, what they need. Because sometimes, especially engineers, they like to generalize. Each of us has a vision of what the perfect interface is, like control versus ease of use, you know.

Speaker 2:

Yeah.

Speaker 1:

But then you do a test and you realize that your audience is, I don't know, completely... let's say they're all Python devs, but they're kind of web people in the sense that they are very high level in this tech, yeah, and then you say, ah, but these changes, they completely won't fly, you know. True, I don't know.

Speaker 2:

I think, yeah, it depends a lot on your user base, indeed, right. I'm also wondering... for example, you can choose to have more control, yeah, but that comes with the overhead of understanding how everything works. I think in this case they mentioned they thought it was an okay trade-off, because if you're learning Rust, at some point you're going to have to learn about memory management anyway, right. But I also feel like some packages...

Speaker 2:

They only become popular because of how easy they are to use, or not necessarily how easy they are to use, but also the lack of customization, yeah, there's just this one interface you need. That's exactly right.

Speaker 2:

I mean, and I'm not sure if this really fits here, but I like that Rust, even they say it, is more opinionated. They, quote unquote, force you to do things a certain way. Actually, even in Python, the Zen of Python says that there should be only one way of doing things. Yeah, that's right, like the formatter Black, right? Yeah, it became very popular because you cannot customize it at all, right.

Speaker 2:

So yeah, I think... I mean, personally, I like this change. I think if you're going for Rust, it's probably because you really want to squeeze out every bit of performance. Yeah, I think either way you're venturing into compiled-languages territory.

Speaker 1:

Yeah, just take your time, absorb the concepts and, I don't know, that's going to be your new reality. Don't take shortcuts, yeah, if you're venturing down that road. Yeah, indeed, indeed.

Speaker 2:

But yeah, I agree, I agree. I think I like this change. I'm excited to try this. I'm gonna try this out... this evening, right now, yeah. And I also think it's a relatively small change; it's not like they're completely changing the API and you have to learn something completely new that is way more complex, right? So I'm still looking at this, but yeah: the trade-offs of package maintainers. And still on Rust...

Speaker 2:

Rust and technical debt. Yeah, you shared an article here as well? Yes, on tech debt.

Speaker 1:

"My Rust library is now a CDO". It's a very weird article that we shared in our tech channels, yeah, that kind of made me think, and I'm not even sure the author fully thought over the entire idea he tried to communicate. He just kind of wrote it down; he probably just woke up one morning and wrote down whatever was in his head.

Speaker 2:

Maybe about the author real quick: it's Armin Ronacher, right? He's fancy. Yeah, he's popular. We actually talked about him a few times before... oh, I don't want to listen to this. He's also the writer of Jinja and a lot of packages, Flask... and does he have it here? Flask, Rye? Oh, he doesn't have Rye listed.

Speaker 1:

It's probably too new, it's just a baby.

Speaker 2:

Maybe it's not... yeah, maybe it's too early. But we have talked about Rye a few times, so we were talking about him as well. He's a fancy guy. He is a fancy guy, he's a well-known guy, and he's also a good Python developer. Yeah, yeah, like most Python developers. Yeah, yeah. But what is this article about?

Speaker 1:

Yeah, so he had this issue, basically, that there was this lib that he used in his project, I think it was a YAML parsing lib or something, and the maintainer just kind of... oh yeah, the maintainer just stopped maintaining it.

Speaker 2:

Yes.

Speaker 1:

Which is yeah.

Speaker 2:

Which is the way of open source sometimes, right? Yeah, like, maybe, I don't know, they got tired, they got burned out, they didn't want to do this anymore. But basically, the setup is: I'm writing something and I'm like, oh, I need a YAML parser.

Speaker 2:

Yeah, Lukas created a YAML parser, so I don't need to do it myself. Yeah, so I'm just gonna take Lukas's. But Lukas, for some reason, is not working on this anymore, and I still need it. So I still need updates: whenever there's a new version of Rust or Python or whatever, I need a new version of the parser, but it's not maintained.

Speaker 1:

Yeah, right, so what happens next? Yeah, so what he noticed was that some of these packages get showcased in the community and they get a so-called rating. He compared it to debts, basically. So, you know, one debt is not equal to another debt: they have their ratings from agencies, based on how likely they are to be repaid. So he compared open-source packages to debts, and he said that the place where that package got exposed was the RustSec database. I think it's a package advisory database or something, I'm not a guru, but they gave it a really bad rating, in the sense that a lot of tests were failing, there were a lot of feature requests, like, hey, are you going to fix this or that. But the author was, you know, somewhere off on a tropical island, whatever. Nobody cares; he's not the maintainer anymore, formally, you know. Yeah, yeah, yeah.

Speaker 1:

So what he said is: the package is now junk, essentially, and I have to do something about it because it's not going to be maintained. So what he did was, he managed to make the debt kind of perfect again by just copying the package over into his project, like literally copy-pasting the open-source package into his project as a subfolder. Yeah, vendoring. Old-school techniques, like when somebody had built that package for you some time ago.

Speaker 2:

And then, from his perspective, the package is, you know, flawless again. Yeah, it has a good rating because he can change it whenever he wants. Yeah, I think the interesting thing is: he wanted the package to be maintained, yes, but he didn't want to do it himself, because by maintaining that package, I imagine, he would also need to maintain other parts of the package that he's not using. Yeah, right. If there are five functions in the package and you're only using one... yeah, if he were to take it over, because the project is unmaintained and he could potentially fork it and republish it,

Speaker 2:

then he would need to maintain all five.

Speaker 1:

Yeah, because it's not a binding contract. You could become another author that says, I quit, basically. But in the community, especially from a guy like him, they would expect him to maintain it. So he just silently copied over whatever he needed, and he's going to maintain whatever he actually uses, kind of, you know. Yeah, and so he's using what he needs.

Speaker 2:

Yeah, if there's an issue, he can fix it, and it's good again, you know. Yeah, it's a perfect package. Yeah. Actually, have you ever vendored packages at all?

Speaker 1:

No. It sounds like a very old-school thing. I think even before open source, at one company that I worked at, they had a bunch of packages built by other companies, which is a very rare thing nowadays.

Speaker 2:

But like not open source, no closed source.

Speaker 1:

But yeah, this is the thing they would call a vendor lib, because it's not open source, it's built by some external company, you know. So to me it sounds very old school, because, you know, the open-source movement has been going for like 10 years now.

Speaker 2:

Yeah, yeah, yeah.

Speaker 1:

Everything is open source.

Speaker 2:

Yeah, yeah, maybe they have their reasons. I vendored a package once, but it was because it was actually Rust. So there was, like, a one-line change, yeah, and I wasn't sure if that would make a big difference. I actually raised an issue, yeah, saying, hey, is there a reason for doing this? But I wanted to make progress on my work, so I actually vendored that Git repo, I guess, that Git project, really.

Speaker 2:

So in Rust you can really do this, and I changed that, um, and it was working fine. But then every time there was a new bug fix or whatever new feature, I would have to basically copy it again and make the change. And then in the end they actually made the change themselves and published it in version two of that package I was using, and before it was merged, because I still didn't want to wait, I actually added the dependency as a Git dependency. So that was also fine. I also think I didn't make the pull request right away because I was like, man, this is a one-line change, should I make a pull request?
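For listeners who haven't done this: pointing Cargo at a Git repository instead of keeping a vendored copy is a one-line manifest change, roughly like the sketch below. The crate name, URL and branch are placeholders, not the real project from this story.

```toml
# Cargo.toml (hypothetical example)
[dependencies]
# Published release, once the fix lands on crates.io:
# some-parser = "2.0"

# Temporary Git dependency while waiting for the fix to be merged and released;
# Cargo builds the crate straight from this repository and branch.
some-parser = { git = "https://github.com/example/some-parser", branch = "fix-one-liner" }
```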

Speaker 1:

for one line.

Speaker 2:

I think it's my insecurities.

Speaker 1:

Like sometimes people are going to be judging me.

Speaker 2:

It's like what's this guy doing?

Speaker 1:

I think from other people's perspective, if you have a chance to actually contribute to open source via one line change, wow, that's a golden ticket.

Speaker 2:

Yeah, actually, yeah, yeah, yeah. Maybe I'll share this as well. This is not related to it... well, it's related to one-line changes. You'll be happy to hear that I've gotten over this social anxiety that I have about one-line changes. So this is an open-source... this is the engine that I use to create slides.

Speaker 2:

They did a new release and I noticed that there was an issue for me, so I literally just changed one line, I just deleted one line, and I created a pull request and it was merged recently. So I'm officially a contributor. Wow, nice. Yeah, yeah, can we have the applause, please, Alex? Yes, thank you. Stop, it's fine, I'm just kidding. Anyways, but I also talked to Sam Debruyn about this, because he's also a core contributor of dbt, and he's like, yeah, why not? It's one line, but now no one else needs to do it, right? Why wouldn't you do it? So yeah, I talked to him to get a bit of confidence. I still have a bit of social anxiety, that's... it's fine, I'm okay. Okay, alrighty, alrighty.

Speaker 2:

What should we talk about next? Let's see. Maybe let's talk about AI, because, you know, we haven't heard about it at all these days. Cursor. What is Cursor, what is this?

Speaker 1:

I have no idea, to be honest. I haven't used it at all, but my friends started using it. It's the thing in Lithuania.

Speaker 2:

Yeah, yeah.

Speaker 1:

One friend was talking about it. So it's an AI-first code editor. Sounds very fancy. So, what is it?

Speaker 2:

"Build software faster in an editor designed for pair programming with AI." Okay, what about it? What does it...

Speaker 1:

...do? So that's why it's AI-first. So, Copilot already feels like your sidekick, essentially.

Speaker 2:

Yes, which is?

Speaker 1:

Good, you know, you paste some questions into it and it gives you answers, whatever. But this is built from the ground up to let AI, you know, do things like refactor code, or access the file system, you know, to find something. So it's very, very intrusive, but it sees everything and can help with anything.

Speaker 2:

How does it do with privacy?

Speaker 1:

I have no idea. Probably they don't really care about it.

Speaker 2:

Probably send it to India again.

Speaker 1:

I think it's just initial steps for them.

Speaker 2:

But then it's more like the. So it has probably similar features to VS Code with Copilot, but in a different UI that makes it nicer to use.

Speaker 1:

Yeah, I think so. It doesn't feel like a plugin, essentially. I think the file-system thing, Copilot doesn't have it, yeah. Because I think if you wanted to plug that into a plugin, then it's kind of hard. So it's worth a shot, I'm going to try it. But, like, I see here...

Speaker 2:

I clicked on features on the website. It says Command K. Command K lets you edit and write code with AI by selecting edit. Okay, so I guess it's just a shortcut.

Speaker 1:

It's like a tighter integration with your code.

Speaker 2:

Yeah, but I'm wondering if you could just create a... well, let me just read everything before I comment. "Copilot++ is Cursor's native autocomplete feature. You can enable it on the bottom status bar. It's a very powerful..." I can use mine, that's the same as Copilot. Yeah, it's the same as Copilot. There's the chat. This is also similar.

Speaker 1:

No, yeah, that's similar very much this is symbols.

Speaker 2:

Huh, this is new: @ symbols you can use. So apparently there are some things here. My question is: would it be easier to just create a VS Code plugin or extension instead of a new editor?

Speaker 1:

Yeah, I wonder if they even asked themselves this question, because sometimes... Ah, but look. Now, for the people listening

Speaker 2:

only: we went to the pricing, and you see here the basic free plan. It actually has 50 slow GPT-4 uses. So I think they have their own, so I don't need to have Copilot myself, I just use theirs. Copilot, yeah... they're probably hosting it, or it's GP...

Speaker 1:

Or it's it's GPT.

Speaker 2:

So it's nice, you have GPT-4 fast and GPT-4 slow, okay. Maybe... I mean, there is a basic free plan, right, so maybe it's worth a try. Maybe we can try it out, because then the next time you come back you can tell us all about it, yeah, and why we should all switch to Cursor, or not. Yeah, or not, or not. But yeah, it's a good question, though.

Speaker 1:

Why would you build a new IDE from the ground up and not extend VS Code, which is actually a good enough IDE?

Speaker 2:

Yeah, it's like... I feel like they're winning the battle of IDEs. No, yeah, VS Code is the way to go.

Speaker 1:

I feel, yeah, because the entire ecosystem is so... but I think, yeah, I'm not sure, I have to try it. It has something to do with VS Code.

Speaker 2:

Is it a?

Speaker 1:

fork of VS Code or something I'm not sure, but don't quote me on this, but that's okay.

Speaker 2:

But I'm wondering, for IDEs... I know PyCharm is very popular for Python as well, yeah, but I still feel like... actually, JetBrains, they have one IDE per language. Yeah, that's crazy, right. And I used to be like this: actually, I used to have one for Python and one for everything else. But, um, I think it's very hard to justify these days.

Speaker 1:

Yeah, just go with VS Code. Yeah, because it gets 80% of the work done. Even more, probably. Sometimes I feel these JetBrains products are kind of clunky.

Speaker 2:

Yeah, I think PyCharm is good, but the thing is, I don't want more than one IDE.

Speaker 1:

Yeah.

Speaker 2:

And also there is Zed these days, which is like a very fast editor focused on sharing, like code sharing, but yeah, to be seen; verdict still out, verdict still out. And talking about AI and AI-assisted coding, you had a food for thought here: what if we had mass refactors with Gen AI?

Speaker 1:

Yeah. It's like, technically, before Gen AI, refactoring equaled migration, in the sense that if you have a really big code base, it's not a quarter-long job, it's probably a year-long job at big companies or something.

Speaker 1:

You know, it's something to be planned ahead for, and it's usually not an easy undertaking. But now, with ChatGPT and everything, you can already refactor code; could you actually just kickstart it? I wonder if anyone's tried that. Probably they have, but it still seems like... if you could kickstart it, I don't know. I remember you once talked on your podcast about whether there is even a point in having, like, a standard SQL, basically.

Speaker 2:

Yeah, yeah.

Speaker 1:

Restricting yourself to use only standard SQL and not like specific dialects, because with the generative models to refactor it's really easy nowadays. Is there even a point?

Speaker 2:

Yeah.

Speaker 1:

They could do it for you. Just kickstart it... the pros of letting it handle...

Speaker 2:

All those edge cases. So, like, refactoring... yeah, refactoring is generic, but now we can refactor with Gen AI, for all code bases.

Speaker 1:

Yeah, let's say Python. Yeah, I remember, even a year ago we were doing this Redshift to Databricks migration, and we had to literally change SQL queries here and there.

Speaker 2:

Yeah.

Speaker 1:

I don't think ChatGPT was even that big then, a year and a bit ago.

Speaker 2:

I'm not sure, but I haven't used it. You hadn't back then, yeah?

Speaker 1:

So with this, you'd just shove in the query and... but again, the edge cases are there. If it doesn't work with them, you know, maybe you create a loop. Then you have this Devin thing that can probably do that.

Speaker 2:

Yeah, true. I feel like today there are a lot of options. For example, if I had to refactor a Python code base today, probably what I would do is use Gen AI to write a lot of tests.

Speaker 2:

So just write tests and make sure all these tests pass, yes, and then have the Gen AI change the code without changing the tests. So like a two-step process. Yeah, the first time you don't touch the code, you just write a lot of tests. So basically you cover everything, exactly, a lot of test cases; you really try to... if you have a space of inputs, a space of what could happen, you try to fill that out as much as possible. Also put in type hints and all these things.

Speaker 2:

Right, with Gen AI maybe, because I guess you can always iterate on these things, yeah. And then, once you have that input space covered as much as possible, you change the internals and then you see. So yeah. But I'm wondering, even if you refactor, I guess the question is: is this going to be better? And who...
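A hedged, language-agnostic sketch of that two-step idea, written in Rust here to match the rest of the episode's examples (the function and values are invented): first pin the current behaviour with tests, and only then let a person or a model rewrite the body.

```rust
// Step 1: characterization tests that pin today's behaviour. They are written
// (possibly with Gen AI help) before any refactor and must keep passing.
pub fn format_price(cents: i64) -> String {
    format!("{}.{:02}", cents / 100, cents % 100)
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn covers_typical_and_edge_inputs() {
        assert_eq!(format_price(0), "0.00");
        assert_eq!(format_price(5), "0.05");
        assert_eq!(format_price(1999), "19.99");
    }
}

// Step 2: only once this net of tests covers the input space do you regenerate
// or refactor the internals of `format_price` and re-run `cargo test`.
```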

Speaker 1:

...better? And if you have a question about how this works, who's going to answer that? Yeah, exactly. How are you going to measure the gain of the refactor?

Speaker 2:

But also, I guess, the thought as well: what if the Gen AI refactors something and then no one can debug it? Yeah, so basically developer experience degrades. Yeah, indeed, right. And I guess it could happen today as well, like if I write something and then I leave the company and someone else needs to maintain it; that could also be an issue, right. But I guess as long as I'm in the company, there is always a backup plan, I guess, or something.

Speaker 1:

Yeah, right, so you can always call ChatGPT and ask. Yeah, yeah, that's true.

Speaker 2:

Maybe you can use Gen AI to explain what the code does. Indeed, Cursor. So yeah, true, yeah, Cursor. But, uh, yeah, I don't know, it just feels... I'm not sure. Maybe we could Google it sometime. But I guess the thing is, if you're refactoring code, you probably want it to look a certain way. No, yeah, exactly, so you already have something in mind.

Speaker 2:

I don't know if Gen AI is good for that. No, it can explain any ugliness whatsoever, yeah. Because, you know, if you have ChatGPT and you say, refactor this code, it's probably not gonna do what you want. You'd probably say, refactor this code into these five functions, yeah, and then it's probably gonna do something. And it's like, okay, refactor this code into five functions, and the first function... maybe you just want to do it yourself, you know, maybe you're just going to go ahead and just do it and not ask the GPT.

Speaker 1:

Yeah.

Speaker 2:

You know Kind of like yeah, I'm not sure, I'm just thinking out loud here Just my philosophical side.

Speaker 1:

It's like I would use it if the refactoring undertaking is like yeah, I don't know, it depends on the type of refactoring.

Speaker 2:

Yeah.

Speaker 1:

Like, if you're trying to squeeze out I don't know milliseconds, then that's very very narrow focus.

Speaker 2:

Yeah, performance refactoring.

Speaker 1:

So it's like, you cover it with tests, like you said, right, yeah, and then you try to nag ChatGPT to squeeze out the last bits, which... it's not always good at that, I noticed.

Speaker 2:

But, like, you could probably use it... what if you want to go from Python to Rust?

Speaker 1:

Yeah, it's like this.

Speaker 2:

Well, this is basically like a true transpilation between languages. Yeah, so I think for that it could be cool, because usually it's a pretty dumb...

Speaker 1:

...like, it's not changing the architecture. You wouldn't change the architecture. Let's say you have Go and then you want to port to Rust because you want to squeeze out even more milliseconds; you wouldn't change the architecture that much. It's a bit of a dumb rewrite.

Speaker 2:

I would say yeah, unless your code really relies on Python's dynamic nature, yeah, then it can be an issue, then it needs more care. It's an interesting thought experiment. I mean, even if you have a two-step thing: you have the code, you have ChatGPT explain what the code does, and the next prompt is, okay, based on your explanation, rewrite it in Rust or something. Maybe it can look completely different, right?

Speaker 1:

But maybe... I think it's yet to be seen, but these things will pop up for sure. Yeah, for sure. Or at least, like we talked about in the beginning, maybe there's always going to be a human agent on top, but they're just going to use ChatGPT meticulously to rewrite everything.

Speaker 2:

Yeah.

Speaker 1:

Like we do now.

Speaker 2:

Basically, the world is not going to change; it's going to be the same as it was. There's always going to be a human operator saying yes, no, yes, okay. Yeah, let's see, it will be an interesting thought experiment. So if you ever give it a try, come back and discuss it. You have two homework items already. That's a lot. All right.

Speaker 1:

uh, I think we have time for a hot take, okay, yes, what do you got?

Speaker 2:

Airbnb open-sources its feature engineering platform. What is it about?

Speaker 1:

Yeah, I don't know, I don't care, so that's why it's a hot take. Again, I saw it on Twitter.

Speaker 2:

All right, thanks everyone. See you next time.

Speaker 1:

Yeah, it's like... no, I just saw it on Twitter and, like, I don't know, should we care? That's why it's a hot take. Because I remember, a couple of years ago, Uber released their ML platform. Yeah, they have a bunch of companies open-sourcing these mega projects, yeah, not some really cool util like PyTorch, which is not small nowadays, but it used to be small, yeah, or like TensorFlow; something you could plug into anything and it kind of does its job, does a very specific thing.

Speaker 1:

Well, yeah, now, instead, they just release these monsters. Yeah, what for? Is anyone gonna use it? I'm yet to see any company that... I don't know. That's true. Actually, there is a... especially for feature stores; this is a feature store engineering thingy, so it's like, again... Yeah, no, I agree, I agree.

Speaker 2:

I'm actually looking here, because I see this is not even Python, right? No: Map<String, String>, key map... This is like C# or something, Java or C#. Or Java, yeah, let's see... no, Java, Java. So, C#? No, Java, indeed. Yeah, I think this links a bit to... this is actually a hot take from a Bart episode. You remember, on Data Topics, Bart mentioned: open source equals... oh, it's not even that long ago.

Speaker 2:

Open source equals marketing. And I do feel in this case it is a bit of marketing, yeah. Because I think it's also like they're not creating a tool that does one thing well and intends to... I mean, I'm not saying... I don't know, I don't know. But a lot of these times, like you said, with Uber, with this... Netflix also had Metaflow, and they use, like, notebooks in production.

Speaker 1:

I have no idea.

Speaker 2:

And again, there are some notebook lovers as well, so I think this is the one. Oh no, sorry, not this one, sorry, I think you shared the link?

Speaker 2:

Yeah, I shared the link here, this one: "Beyond Interactive: Notebook Innovation at Netflix", where they even go as far as deploying notebooks in production. So it always makes you think, right? Like, oh, should we revisit this, should we use notebooks in production? I think usually people say no, but when you see Netflix being successful with notebooks in production, then you kind of question it a bit yourself, right? But I haven't met anyone that has been really happy with notebooks in production, for example. I haven't met anyone that has been using Uber's... what's the name of the Uber framework, do you remember? It's Michelangelo. Yeah, Uber Michelangelo.

Speaker 1:

Uber, which is very fancy, like the name, the tech, everything is fancy indeed, indeed, I haven't met.

Speaker 2:

I mean again there's also yeah, no, but I think it's like I haven't met people using. One thing I have seen people use is the, the kendro, I think, which I think is from quantum labs. Kendra, yeah, okay, joe Kedro, sorry, okay is this? Here, ah, this McKinsey's, oh yeah, this one I have seen people use that's very cool.

Speaker 2:

McKinsey has huge exposure to clients. Yes, it's like they already have a good... indeed, right, and that's a good point. I feel like the nature of McKinsey is that they already see different clients, so if they build a framework, it's probably more flexible than something that Netflix built for their use case, or Uber built for their use case, or Airbnb built for their use case. And sometimes I wonder if it's that they have a very specific need, or they want to optimize for something very specific, and that's why they create this framework internally. Yeah, it's not like it's hard-coded, right, you still need to handle different use cases, but it's the same setup.

Speaker 2:

I guess. And people build stuff in-house and then they say, oh yeah, let's open-source this. And now it's open source, but it's not really open source; it's almost like "source open". They're just making transparent what they have, yeah, but they're not necessarily building something for the community.

Speaker 1:

I yet to see, but uh, that's what's up if anyone's listening from airbnb.

Speaker 2:

I want to plug Lukas: what's your LinkedIn? What's your Twitter, Lukas, so they can tweet at you? They're never gonna hire me, that's my take. Yeah, I don't know if I disagree. I mean, I don't know this framework, right, but it feels too big to plug into,

Speaker 1:

You know a lot of use cases like yeah, I think good open source tools are kind of a small ish yeah, they're specific very specific, they do yeah they get even company like I don't know. I think a lot of good open source tools that are very specific have been open sourced, you know, from the, from the companies, not from research but from private companies, like deep learning frameworks and all that. So I'm not saying that you know, companies cannot do open source. They do it really well.

Speaker 1:

But when they open-source these monsters... I think it's really hard to create a general framework. Something like Ruby on Rails, right, that thing is just brilliant, because, like, half of websites run on it. Yeah, yeah, and the rest is on WordPress or something, I don't know. It's like, if it comes from a company, well, it's probably just five guys that have been building it internally for five years; what do they know about the outside world?

Speaker 2:

yeah, that's true, I mean it's.

Speaker 2:

Yeah, I do see companies, like big companies, even clients that we have, that tend to build their own solutions, right, just to pick and choose, you know. Yeah, and a lot of the time I feel like it's because you already have a team with expertise in A or B or C, and then it's easier to just go with that and ride with that. But yeah, I see your point, I see your point. I think sometimes, when we have these really big companies... do they have their GitHub here?

Speaker 2:

Yeah, the project's GitHub. If it gets traction, then yeah, cool. Yeah, yeah. But I think it also depends on who uses this, right. So, Chronon: this is Airbnb's, it has 126 stars. So I think they're going to use it.

Speaker 1:

I'd ask the question, though: do you think they're going to use the open-source version internally as well?

Speaker 2:

Let's see. I want to say something... I've heard rumors that whatever big tech open-sources, yeah, they already use something newer internally. So yeah, in those cases... if that is true, it's just a rumor, but if that is true, then open source really is exclusively marketing in this case.

Speaker 2:

Yeah, right. I mean, actually, maybe not; I'm saying all this, but there's still some good. You can probably look at the code and get some good ideas, and maybe... So there are still other benefits, right. But I see your point, I see your point. I don't disagree. I'm not sure I fully agree either, but the fact that I haven't seen people, I haven't met people that really use these frameworks end to end makes me think you're right.

Speaker 2:

There are also so many of them, which also makes it a bit hard to keep track. But yeah, we have more items, but I think we'll leave it here for today, and next week Bart will be back and then I'll have something to talk to him about, because I'm also going on holidays this weekend. Yeah, what are you gonna do this weekend?

Speaker 1:

Not sure, okay, but yeah, a road trip or something, we'll see. What about you, Alex, again?

Speaker 2:

I don't know, we'll ask next week. Yeah, your road trip, you know, the road trip. But where are you going?

Speaker 2:

Because I am going to Romania. Yes, I'm going to take holidays on Thursday and Friday, and I'll be in Romania until Sunday. So yeah, this is also why today's episode was not live streamed, so it was just a video. But we're also gonna be live streaming on Tuesdays from now on, for the people following us. Um, but I think I'll leave it here for today. Thanks a lot, Lukas, for joining. Thanks, Alex. Thank you.

Speaker 1:

All right, get it. You have taste in a way that's meaningful to software people.

Speaker 2:

Hello, I'm Bill Gates. I would recommend TypeScript. Yeah, it writes a lot of code for me and usually it's slightly wrong.

Speaker 1:

I'm reminded, incidentally, of Rust here, rust.

Speaker 2:

Congressman, iPhone is made by a different company, and so you will not learn Rust while skydiving.

Speaker 1:

Well, I'm sorry, guys, I don't know what's going on. Thank you for the opportunity to speak to you today about large neural networks. It's really an honor to be here. Rust Rust Data topics. Welcome to the data. Welcome to the data topics podcast.

Speaker 2:

Ciao.

Amazon's Self-Checkout Technology and Challenges
Ethical Dilemmas in Self-Driving Technology
Rust as a Productive Programming Language
The Rise of Rust in Industry
Package Maintenance and Collaboration in Rust
AI Refactoring and IDE Comparison
Open Source Frameworks in Data Engineering