Thank you so much. Yeah — last talk of the day. We'll talk about machine learning for very, very small systems, especially compared to what's all the rage today, with giant language models, generative AI, and so on. This will be at the completely opposite end of the spectrum: how we can do machine learning that is useful for different kinds of tasks on very small microsystems.

First, a little about us: at Soundsensing we monitor ventilation systems, such as the ones shown here, and our customers operate buildings such as this one. We use machine learning to automatically analyze the sensor data and detect anomalies.

Today we'll talk about why machine learning on microcontrollers is relevant — how can we do useful things on such small systems? This niche is called TinyML; "edge AI" is also getting popular as a term. We'll talk about the emlearn project, the open source project that I maintain, and some real-world projects that have been made with emlearn, and we'll cover a quick how-to as well.

So, machine learning on microcontrollers and embedded devices — what is it used for? It's primarily used for sensor data analysis. We have some sort of sensor node.
We have the sensor itself — it can be a camera, accelerometer, microphone, radar, maybe lidar, et cetera. We want to process all that data and extract the useful information from that large amount of data, and then either act on it directly — in robotics, for example — or transfer it to a larger system, or potentially to a cloud. But just the relevant information, not all the useless data.

This enables some key benefits. We can make standalone systems that operate with low latency — guaranteed latency. We can make highly power-efficient systems, where the sensors can run for multiple years on a single battery. We can make privacy-compatible systems, because we don't transfer all the potentially sensitive data. And we can make things low-cost, which is relevant because it enables massive scale in terms of the number of units we can deploy.

This is already used in many areas. In consumer technology, for example: keyword spotting — detecting "Hey Google" or "Hey Siri" and so on — runs on a microcontroller, which then wakes up the rest of the system. Or if you have a sleep quality tracker or a fitness tracker, that will also use this kind of technology.
In an industrial setting, for example, you can use this to track the health of animals; we use it to track the health of machines. Or you can just use it for fun projects at home. For example, you can make a smart doorbell that opens the door for your cat — but only if it's your specific cat, not any cat. Or you can make a magic wand that lets you control devices in your home with a gesture.

These microcontrollers are essentially a full computer, but a very, very small one. You have, for example, maybe one kilobyte of RAM, or maybe the largest systems have 1000 kilobytes of RAM, and similar amounts of program space. All the machine learning, all the code, needs to fit in there. And we're mostly doing inference on the edge, not so much learning on the edge. The price point might be as low as 10 cents, or as much as 10 dollars, for a microcontroller.

And this is a huge area. Any electronics will have one or more — your car will have 100 microcontrollers these days. Over 20 billion devices are shipped each year. So these are tiny chips, but at massive scale. And they are increasingly accessible to hobbyists: you can program these kinds of things with the Arduino IDE, for example, or with MicroPython.
But efficiency is key. So, the emlearn project: I started it in 2018, as part of my master's in data science — before that my background was in electronics — and I wanted to combine those skills.

There are two aspects to these kinds of systems. First there's the training process. That's quite standard: it happens on your computer, you use familiar tools such as scikit-learn or Keras, and you have a pipeline that spits out the model to deploy. Then there's the emlearn project: a Python library that you install, with a one-line export to an efficient model for the device.

On the device side, you can deploy this either with the emlearn C library, which is written in portable C99. It will run on any system that can run C — which is basically any embedded system; C is the lingua franca of that area — and it has even been used inside Linux kernel modules, for example. Or you can use it with emlearn-micropython, which is quite interesting because most people who do machine learning speak Python primarily. This lets you write all the application logic, even for these TinyML-type systems, in Python, while the emlearn library itself is implemented in C and exposes Python APIs.
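The "one-line export" idea can be illustrated with a toy converter. This is not emlearn's actual code or API — just a sketch of the principle: walk a trained decision tree (hand-written here as a nested dict) and emit standalone, dependency-free C source.

```python
# Toy sketch of what a model converter does: turn a trained decision tree
# into nested if/else C code with no runtime dependencies. emlearn's real
# converter operates on scikit-learn estimators; all names here are
# illustrative only.

def tree_to_c(node, feature_names, indent="    "):
    """Recursively emit nested if/else C code for one decision tree."""
    if "leaf" in node:
        return f"{indent}return {node['leaf']};\n"
    cond = f"{feature_names[node['feature']]} <= {node['threshold']}f"
    code = f"{indent}if ({cond}) {{\n"
    code += tree_to_c(node["left"], feature_names, indent + "    ")
    code += f"{indent}}} else {{\n"
    code += tree_to_c(node["right"], feature_names, indent + "    ")
    code += f"{indent}}}\n"
    return code

# A tiny tree: classify "moving" (1) vs "still" (0) from accelerometer RMS
tree = {
    "feature": 0, "threshold": 0.25,
    "left": {"leaf": 0},
    "right": {"leaf": 1},
}

c_source = "int predict(const float rms) {\n"
c_source += tree_to_c(tree, ["rms"])
c_source += "}\n"
print(c_source)
```

The generated function compiles anywhere a C compiler exists, which is exactly why this style of export covers "basically any embedded system".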
emlearn supports the most common tasks — such as classification, regression, and anomaly detection — that are relevant for these kinds of sensor-data use cases. It supports a range of simple, effective, embedded-friendly models. This is a subset: there is no large language model, nothing like that. But you have, for example, decision trees, random forest classifiers, and k-nearest neighbors; you can do simple on-device learning or adaptation; and there are also small neural networks — multilayer perceptrons and convolutional neural networks.

Here are some projects using emlearn. Of course, being open source, most of the projects out there we will never hear about, but it is at least referenced in over 40 publications by scientists across the world. Here is a selection of examples, to keep it concrete.

From Virginia Tech, they created a system to track the health of cattle, using an accelerometer mounted on the cow. The accelerometer data is automatically analyzed with a decision tree model to determine whether the cow is lying, walking, standing, grazing, or ruminating — these are the stages of food processing for the cow. Then this data is transmitted over LoRaWAN.
Not the raw data — just the activities, the classes: what is this cow up to. This is then used to check for abnormal behavior. If a cow is not eating for a long time, that's concerning — something you might want to check out. Doing all of this on the sensor lets it run on under one milliwatt, which is 50 times lower power consumption than sending the raw data. That's really important in this kind of case, because it basically gives you 50 times the battery life.

Here's a project from Samsung Research, where they estimate the breathing rate of the person wearing an earbud-like device. They use a combination of audio data and accelerometer data to determine how quickly you're breathing, which is very important to track for people with respiratory health problems. This is a practical solution that can be worn every day, not just in a clinical context. Again, it matters that the power consumption is not too high, so you can use it for a couple of days without having to worry about battery life.

And here's a more fun project of mine: an accelerometer-based timer that can be attached to a toothbrush, so you automatically track how long you're actively brushing your teeth.
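Detecting "actively brushing" from an accelerometer can be as simple as a windowed RMS threshold. This is a sketch of the kind of signal processing such a timer needs — the sample rate, window size, and threshold below are made-up values, not the ones from the actual project:

```python
# Count seconds of "active" vibration: compute RMS over short accelerometer
# windows and count windows that exceed a threshold. All constants are
# illustrative assumptions.
import math

def active_seconds(samples, rate_hz=50, window_s=1.0, threshold=0.3):
    """Return seconds worth of windows whose RMS exceeds the threshold."""
    n = int(rate_hz * window_s)
    active = 0
    for i in range(0, len(samples) - n + 1, n):
        window = samples[i:i + n]
        rms = math.sqrt(sum(x * x for x in window) / n)
        if rms > threshold:
            active += 1
    return active * window_s

still = [0.01] * 100         # 2 s of near-zero signal (brush resting)
brushing = [0.5, -0.5] * 50  # 2 s of strong vibration (brushing)
print(active_seconds(still + brushing))  # → 2.0
```

A model this small runs comfortably within a few kilobytes of RAM, which is why such projects need only a couple of hours of labeled data.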
Because you're supposed to brush your teeth for at least two minutes every time, and most of us don't do it — so this device can help you. I made this over a Christmas, just for fun, and one thing I'd like to highlight is that I collected and labeled the data for it in just a couple of hours. These kinds of projects are possible with a small amount of data, because the problem is not that complex — unlike the large research-grade datasets you'd collect at a university, this can be done by just recording about five sessions of training data of two minutes each. That makes it really fun to get into: you don't need massive datasets, and you can go directly and solve things you're interested in around your house, home, business, or whatever. Full example code for it is available online.

Here's more of a research project of mine, trying to find out how cheap we can make such a system while keeping it useful. I posed the question: can we make a useful ML sensor system for under one dollar in total component cost? The preliminary answer is yes. You can do motion analysis — an accelerometer, a battery, and BLE transmission, all for under a dollar; the BOM is around 70 cents for that.
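Several of these examples hinge on power arithmetic. The 50x battery-life factor mentioned for the cattle collar follows directly from an energy budget; the numbers below (a 1000 mAh cell at 3.0 V, 1 mW vs 50 mW average draw) are illustrative assumptions for the sketch, not figures from the cited work:

```python
# Back-of-envelope battery-life arithmetic: battery energy divided by
# average power draw. Cell size and draws are assumed example values.

battery_wh = 1.0 * 3.0  # 1000 mAh (1.0 Ah) * 3.0 V = 3.0 Wh of energy

def battery_life_days(avg_power_mw):
    """Days of runtime for a given average power draw in milliwatts."""
    hours = battery_wh * 1000 / avg_power_mw
    return hours / 24

on_device = battery_life_days(1.0)    # classify locally, send only labels
raw_stream = battery_life_days(50.0)  # stream raw sensor data instead
print(round(on_device), "days vs", round(raw_stream, 1), "days")  # → 125 days vs 2.5 days
```

Months of runtime versus days: that ratio is why inference happens on the sensor rather than in the cloud for battery-powered nodes.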
And the model needs to fit in the microcontroller we have for that, which has 8 kilobytes of RAM and 64 kilobytes of flash. That's actually quite a lot, from my perspective. We should also be able to do audio analysis with that, using small neural networks — that is also possible. You can't do both, though: there isn't enough budget for both sensors, but you can do either.

So, emlearn has this C library. That's the easiest way to deploy on the vast majority of devices, because C is the most common. You do the training using the standard Python ML libraries — for example, Keras or scikit-learn; we support a selection of different models there. Then you use the emlearn library's convert function to turn the trained model into an efficient model — we do a couple of simple optimizations there — and generate C code from it. The sample output shown here is for a neural network, and it looks quite similar for the other kinds of models.

You then include the generated header file, construct your input data — typically read from your sensor, possibly with some feature engineering or preprocessing — and pass the data to the predict function that we provide.
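What such an "efficient model" looks like internally can be sketched in pure Python: the tree is flattened into arrays with one row per node, and prediction is a tight loop with no recursion and no heap allocation — exactly the shape that fits a few kilobytes of RAM. The node layout below is invented for illustration; it is not emlearn's actual storage format.

```python
# Decision tree stored as flat arrays and evaluated iteratively.
# node row: (feature_index, threshold, left_child, right_child)
# feature_index == -1 marks a leaf; threshold then holds the class id.
NODES = [
    (0, 0.5, 1, 2),   # root: is feature 0 <= 0.5 ?
    (-1, 0, 0, 0),    # leaf: class 0
    (1, 0.2, 3, 4),   # is feature 1 <= 0.2 ?
    (-1, 1, 0, 0),    # leaf: class 1
    (-1, 2, 0, 0),    # leaf: class 2
]

def predict(features):
    """Walk the flat node table from the root until a leaf is reached."""
    i = 0
    while True:
        feat, thresh, left, right = NODES[i]
        if feat < 0:
            return int(thresh)
        i = left if features[feat] <= thresh else right

print(predict([0.9, 0.1]), predict([0.9, 0.7]), predict([0.1, 0.9]))  # → 1 2 0
```

The same table-driven loop translates line-for-line into C, which is why tree ensembles are such a good match for microcontrollers.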
And that's all you need in order to run machine learning models on these kinds of devices.

The alternative to the emlearn C library is the emlearn-micropython library. As I mentioned, many who do machine learning speak primarily Python, and it's very interesting to be able to use those skills directly on these kinds of devices. MicroPython implements a subset of Python 3, and it's a really nice, practical way of getting into these kinds of microcontroller devices. It also has a really nice feature: you can load C modules at runtime. Similar to a real computer, you can just do `mip install` of an `.mpy` native module, and drop in efficient C modules that you can use from Python. That's quite unique in the embedded space — very useful.

So we provide a range of modules. They're standalone, so you can install just what you need. For example: infinite impulse response (IIR) filters for digital filtering, which is very common; random forest and tree-based ensemble models, which are very power-efficient, small, and useful; and fast Fourier transform — we saw that it was useful to provide some DSP functionality too, because it was a bit lacking in the MicroPython space. And you can run convolutional neural networks; there are examples doing image classification, et cetera, available in the repository.
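The IIR filtering mentioned above, in its simplest form, is a one-pole low-pass filter (exponential smoothing): one word of state and a multiply-add per sample. emlearn-micropython ships a C module for higher-order IIR filters; this pure-Python version just shows the principle.

```python
# One-pole IIR low-pass filter: y[n] = y[n-1] + alpha * (x[n] - y[n-1]).
# Smooths noisy sensor readings using only a single state variable.

def lowpass(samples, alpha=0.1):
    y = 0.0
    out = []
    for x in samples:
        y += alpha * (x - y)  # move a fraction alpha toward the new sample
        out.append(y)
    return out

# Feed a unit step: the output rises smoothly toward 1.0 instead of jumping,
# so short spikes in a real sensor signal are damped.
smoothed = lowpass([1.0] * 50)
print(round(smoothed[-1], 3))
```

Smaller `alpha` means heavier smoothing and slower response — a trade-off tuned per sensor.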
Doing this is at least 5 to 20 times faster than, for example, generating Python code and executing it with MicroPython. So you get the convenience of using Python, but the speed and size benefits of C modules.

The process of using it is very similar. You train your model and use the convert function. In this case, we often save the model as a file, because MicroPython has a file system, so we can copy the model definition over as a loadable file. Installing the library is just one line as well. You create the model — you have to specify its capacity — and load the model definition from the file system. Then, again, you read your sensor data, pass it to the model's predict function, and get the classification, regression, or anomaly detection output.

So that's all we had. In summary: machine learning is used in embedded systems to automatically analyze sensor data — that's the predominant use case. We can make practical applications in this space with just a few kilobytes of RAM, a few milliwatts of power, and a few dollars in bill of materials or hardware cost. emlearn is an open source project that helps deploy machine learning models to microcontrollers, and you can use it either as a C library or as a MicroPython library. So, here are some resources.
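The "model as a loadable file" pattern described above can be sketched like this: serialize the model to a simple text format on the host, copy it to the device's file system, and parse it back at startup. The file format here is invented for illustration — emlearn-micropython uses its own format.

```python
# Round-trip a flat-array model through a file-like object, standing in for
# the device's file system. Format (one comma-separated node per line) is
# a made-up example.
import io

def save_model(nodes, fp):
    for row in nodes:
        fp.write(",".join(str(v) for v in row) + "\n")

def load_model(fp):
    return [tuple(float(v) for v in line.split(","))
            for line in fp if line.strip()]

nodes = [(0, 0.5, 1, 2), (-1, 0, 0, 0), (-1, 1, 0, 0)]
buf = io.StringIO()       # stands in for a file on the device
save_model(nodes, buf)
buf.seek(0)
loaded = load_model(buf)
print(len(loaded))  # → 3
```

Keeping the model in a file means it can be updated by copying one file over, without reflashing the firmware.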
You'll find the slides online, of course, with more information, including links to some other talks that go a little more into depth on particular aspects of this niche and of the library. So, thank you so much.

[Host] Thank you. Thank you very much. This toothbrush — do I have to buy it like this?

[Speaker] No — it's open source, so you just download the 3D files and print it yourself. If not, you send it to someone to print, and you download the code.

[Host] Okay, that's cool. Any questions? Yes, please.

[Audience] (partly inaudible) How common is it that these small devices have some kind of coprocessor — like a vector unit — that can accelerate neural networks?

[Speaker] Yeah, I'll repeat the question: are there devices that have coprocessors you can use, for vector processing and things like that — is that common or not? It's getting more common, because doing machine learning on these devices is more and more common and relevant. But hardware takes time, right? So it's not like 99% of devices will have it. But there are some recent instruction sets: Arm has standardized SIMD instructions for the Cortex-M4.
Those are quite useful — they can give you something like a 4x benefit on neural-network-type problems. Then there's Helium, which is coming; it was standardized around four years ago, but devices aren't really out there yet. And on the ESP32 side there's the ESP32-S3, which has a set of small vector instructions. So it's becoming possible to get 4 to 10x type speedups. And ST just announced a new microcontroller, just before Christmas, with a dedicated NPU — a neural processing unit. So this is definitely becoming more and more common.

On the high end — if you're doing heavy image classification or object detection, for example — you want to use that. But if you're working with accelerometer data, it's usually overkill, and sometimes it can even be less power-efficient, because you have this coprocessor and you need to shift data in and out. So it's definitely getting there, but you can do quite a lot without it right now. We will see how much of those instructions we implement as they become common. Right now we prefer just doing plain C: it runs everywhere, it's simple, and it's usually enough.
[Host] That answers the question. One question from my side: do you see a time when these microcontrollers will be running LLM inference?

[Speaker] LLMs — oh yes, there's a surprising amount of buzz about LLMs on microcontrollers. I don't quite understand the use cases yet — maybe it's relevant for robotics. There's a lot of hype around it; we'll see in a few years whether useful things come out of it or not. I don't really know.

[Host] Maybe it could talk to you and tell you that you need to do more — like a personal trainer, I don't know. Thank you. Thank you again. And that was the last talk — let me pass the mic.