Using Existing Edge Hardware for New AI Capabilities - with Roeland Nusselder of Plumerai
This is Daniel Faggella, and you're listening to the AI in Business podcast. The focus of this week's episode is leveraging artificial intelligence at the edge: why are we moving computation closer to the edge, and what use cases does that enable? This is a topic that has been of interest for me in our work in heavy industries such as mining, transportation, and manufacturing, but also in retail. One of our largest market research projects last year — as many of you are aware, at Emerj you can think about us like a boutique Forrester or Gartner — focused on ROI: helping companies pick high-ROI projects and refine their AI strategy. That's most of our work here at Emerj, and that's most of my advisory work with enterprise innovation leaders. One of our bigger clients from last year was a retailer focusing on what their largest competitors — in this case, the Walmarts and Targets of the world — were doing with computer vision in store. I would never have guessed it at the outset of this project, but maybe a third of the written pages from that report had to do with the considerations of leveraging AI at the edge: the particular issues with hardware and software, even battery life, and what these applications actually look like. There's a lot to think about when we're putting AI at the edge in different environments. In heavy industry this was patently obvious to me, but in computer vision in retail it also became patently obvious. We talk about a little bit of all those things in this particular interview, so this was a fun one for me, because it touched on where a lot of last year's focus was with one of our larger clients here at Emerj.

Our guest this week is Roeland Nusselder, CEO and co-founder of Plumerai, which focuses on leveraging TinyML at the edge with existing hardware — so not coming up with new hardware to run AI, but running AI on existing chips. Roeland talks to us about what new kinds of use cases can be enabled on existing hardware, and also what it takes to take a machine learning model that might often run in the cloud on GPUs and translate that down to deliver some value — whether it's detecting if a person is in the frame or detecting if a product is on the shelf — doing that kind of processing on a much older, off-the-shelf bit of hardware. As it turns out, that's its own technical problem. We talk more about the use cases than we do the technical considerations, but we do cover the technical side at a conceptual level, for those of you who need to think about what it might take to get some of these use cases actually done within your business.

Roeland is one of many presenters at the AI Hardware Summit put on by Kisaco Research. The summit is taking place from September 29 through October 7, and it's entirely virtual. We partnered with Kisaco Research last year to promote this event when it was out in California, and now, obviously due to COVID, the entire event is virtual — September 29 through October 7. If you want to learn more about the AI Hardware Summit and Kisaco Research, who's the sponsor of this episode, you can simply go to Google and type in "AI Hardware Summit"; you can learn more about their event and grab yourself a ticket if you're interested in these themes. Without further ado, though, we're going to roll into this Tuesday AI use-case episode with Roeland of Plumerai, here on the AI in Business podcast.

So, Roeland, glad to have you on the program. I know we're going to be talking about AI at the edge, and I think in order to have that conversation, based on where your firm is focused, we should talk about microcontrollers and TinyML. This is, for you folks, a really, really big opportunity for AI at the edge, and maybe a good place to set up what we're talking about today. Yes, sure thing — thanks very much for having me, of course.
So TinyML is machine learning, or AI, on really cheap, low-power hardware — and then usually microcontrollers. Microcontrollers are very cheap, low-power chips, and they're literally everywhere: there are hundreds of billions of microcontrollers in the world, which is also why they can be so cheap. But it's very challenging to deploy and run machine learning on microcontrollers. Maybe it's good if I first say a bit about why it's so important to run machine learning on microcontrollers. You could think that you can just send the data back to the cloud and process it there on very heavy and expensive GPUs, but this is often not a very good idea. First of all, there are bandwidth limitations: if you have a camera that's connected to a WiFi network and it sends the whole camera feed to the cloud, and you have multiple cameras connected to the same WiFi network, that just doesn't work — your WiFi is down immediately. Then there are things like latency: it takes time to send data to the cloud, process it, and send it back. There are reliability issues: if your internet is down, you still want to make sure that your product works. There are privacy issues: you don't want to send video data or audio data through the cloud. And then there's energy consumption — that's actually a big issue, because sending data to the cloud, even if you use WiFi, consumes a lot of energy, and that's not good, especially not if you have a battery-powered device. TinyML solves this by running the machine learning workloads on the device itself, on a very cheap, low-power chip. But the thing is, it's very difficult to run machine learning on such a chip, and that's what our company focuses on.

Got it. When you say chip in this case, you're talking about microcontrollers? Yeah, exactly. Okay, got it. So you've walked through a couple of instances, and I'm familiar with the edge as sort of an idea, and the intersection of IoT and ML — you focus on this space pretty ardently, though. Maybe we can talk about some of the cases where piping data to the cloud makes sense, and some of the cases where it doesn't. You brought up security, you brought up bandwidth — there are a lot of these practical concerns. Can we tie this to potential business cases? "Hey, Dan, here's an example where it completely makes sense — it might be at the edge, but we've got to send this stuff up to the cloud," and, "here's an example where we really should not be doing that." Do you have any we can talk about?

The main thing here is that you want to have devices that are battery-powered. That's an important issue, because it makes devices much cheaper and much easier to install. So, for example, if you have a small camera in a grocery shop to detect if a shelf is empty or not, you want to make the device battery-powered, and you can do that if the machine learning workload is running on a microcontroller and the device just sends a small signal to the cloud saying whether the shelf is empty or not. And you always want to do tasks on the edge if that's possible. The reason you would not want to do it on the edge is if the model is so complex that it requires a lot of energy to run and requires very expensive, large chips such as GPUs. For example, for very complex NLP models — complex NLP tasks — you might want to do it in the cloud. But if it is possible, if you can run it locally, you generally want to do that, because of bandwidth issues, latency issues, reliability issues, and privacy issues.

So a good example of a task that you want to run locally is, for example, an HVAC system — a heating and air conditioning system — where you have a small camera which detects if there's a human in the room, and if there is a human in the room, the heating or air conditioning automatically turns on or off. You don't want to send your whole video feed to the cloud; it's not great for the bandwidth of your WiFi network. Another great example is in retail. You're starting to see devices that are battery-powered, that have a small camera and a small microcontroller, and that run a little deep learning model to detect if the shelf is empty or not — and if the shelf is empty, they signal to the store manager that someone needs to fill up the shelf again. Or, for example, a small camera detects how many people are waiting in a queue, and the store managers can more effectively allocate their staff. Or you can do gaze detection of shoppers, detect what kinds of products a shopper is generally interested in at the shop, and do better product placement — those kinds of things. But if you're running these on larger chips that are more energy-consuming, you have to connect them to the electricity net somehow, and that makes it much more expensive to install and much more painful for the store owner. So if you can make this battery-powered, you can just clip it on a shelf, or you can just glue it in place, for example. That's what you'd really like: if you can make these devices battery-powered, you can put these kinds of devices everywhere.

Got it. Okay, so that makes sense. I see battery power as kind of a key threshold — it's one of the factors here that you're talking about for when it might be better to do the processing on the edge as opposed to the cloud. In the retail example, just to be clear — and I might be on the right page, I might not.
It seems like if we definitely want to know, for these aisles, whether they're stocked or not, and where the gaze of the customer is, it may make sense to just straight-up install those cameras permanently, and then they would have a power source. But I think what you're getting at is that that's a very painful adoption process, right? It would take a long time, and it's more expensive. Also, maybe if aisles are moved, or if we want to take a different set of angles — a different angle, a different spot — and see if we have a better read on inventory, a better read on customers, whatever the use cases are, then it's very pliable: it's removable, and we don't have to start running extension cords all over the place. We can just be doing ML without the need for that.

Exactly. For an Amazon Go type of store, which has cameras everywhere and which is completely built up for this, it's not a thing. But for a much smaller shop, which doesn't have the resources to install these devices, it's much better if you can just clip it on the shelf, install it very quickly, change positions, et cetera. And it's not just about battery-powered devices — for many products, margins are extremely important, so if you suddenly have to install an NVIDIA Jetson GPU in there, for example, which can easily cost hundreds of dollars, that's not great. If you can do it on a very cheap chip, that makes products much more attractive for customers. It also works, for example, for sound detection or for simple audio tasks — you can do those locally too, for privacy reasons and for energy consumption reasons.

Got it. Okay, so that makes sense. A couple of examples: there was the HVAC system detecting if people are in the room, as to whether or not we want to use power or keep the lights on. Again, these kinds of clip-on devices doing relatively simple tasks — we're sort of not doing the most robust processing in the universe.
We're not taking an image and doing some monumental processing task on it. It's kind of, "hey, is there a person here?" or "hey, where is this person's gaze focused?" — and then doing that processing right then and there. In terms of getting that information from the device itself and porting it somewhere where we can make sense of it, I'm imagining — let's say I run a big grocery store — that that information would certainly be streaming out of these various and sundry devices into somewhere central, where I could get a general picture of all of this. Even though it's not hooked up to the electrical system or what have you, and the processing isn't being done in the cloud, I can kind of take the processing that's been done and just pipe those results into some kind of a dashboard. This is what I would imagine — but you let me know.

Yes. So only the metadata should be sent out of the device — not the video. You don't want to send out your whole video feed, just the metadata, so it just says "shelf is empty, yes or no." Note that sending data out consumes a lot of energy, and it would drain your battery — that's what you want to prevent. So you want to process locally, and just send out the result — just send out "shelf empty: yes or no" to somewhere more central.

Got it. Okay. I'm going to see if maybe we can touch on one or two other small use cases, before we talk about how this is technically done, which is obviously what you folks are working on. I'm interested in maybe painting a little bit of a mental picture for the listeners with one or two other examples. We've got the grocery store camera. We've got potentially a security camera — is there a car in the parking lot, is there somebody walking somewhere, whatever — that's one. We've got this HVAC thing: very simple, kind of yes/no type processing going on, where we're not, like, scanning an image and then re-factoring Elon Musk's face onto this person's body — we're not doing fancy things, we're just making a simple decision. What are other real instances where those simple yes/nos, with some computer vision or audio, can be really, really valuable? What are the ones you're excited about?

So one thing that I'm very excited about is, for example, hand gesture detection — gesture recognition. A simple camera detects movements of your hands, so, for example, it can detect if you swipe to the right or swipe to the left. This basically enables any display to become as intuitive as a touchscreen: you can pinch to zoom with your fingers, swipe right and left, make scroll movements, those kinds of things. This is still a bit too complex for a microcontroller — we're working very hard on making our models very small and very efficient, to do this on very cheap chips.

Yeah, because I was going to say: "is there a person in the frame or not?" seems pretty viable to me. "Is there a car entering the parking lot or not?" seems pretty viable to me. Even when you said gaze detection, I kind of thought to myself, "oh jeepers, that's a little bit more complex" — that's maybe eight or ten, I don't know how many, orders of magnitude more complicated than "is the cereal in the slot or not." It seems like, to your point, some of these tasks are a little bit more complicated. Maybe that's a nice transition into how we're getting this done. Obviously, these chips were not built for this task, but there are so many of them — they're being created every day, in a million devices, from washing machines to little cameras and microphones or whatever the case may be — and now we're sort of bending artificial intelligence, if you will, to pack into these little devices. What does that look like? What's the technical process to make that happen?
Yes, it is very challenging, because we want to make sure that these microcontrollers can do the most challenging, most exciting possible tasks, and to do this we use binarized neural networks. Okay, so I'll try to not make it too technical, but I need to give some explanation. Sure. Generally, if you have a deep learning model, you have a network with millions or hundreds of millions of parameters. Traditionally, people used thirty-two bits to encode each of those parameters. To make things faster and more efficient, people went to sixteen bits, and now pretty much everyone is using eight bits for each of those parameters — each weight and each activation. We looked at this and we thought: how can we make this even more efficient? How can we really push these microcontrollers to the extreme? We thought, well, why don't we use just one single bit to encode each weight and each activation — just one bit for each parameter, instead of eight bits. This makes your model much smaller, because instead of using eight bits you only need one bit, and it also makes your model much faster — all operations become much faster. But maybe this is getting a bit too technical. Using binarized networks is very difficult, very challenging, and we've been doing a lot of work to make this work. To be able to use binarized neural networks, you need to get several things in place. The first thing is that binarized networks require new training algorithms, and we've been doing — and are still doing — a lot of research on how these binarized neural network models can be trained, because you still want to make sure that these models are accurate — so that they don't miss, for example, a human who is actually in the camera frame — while still being fast and small. So we've been doing research on that. The second part is that you need to develop the software to train these binarized neural networks.
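To make the one-bit idea concrete, here is a minimal, illustrative Python sketch — not Plumerai's actual implementation — of the two tricks binarized networks rely on: encoding each weight or activation as a sign bit, and replacing a floating-point dot product with an XNOR-style comparison plus a popcount on the packed bits. All function names here are invented for illustration.

```python
# Illustrative sketch of binarized-network arithmetic (not Plumerai's code).
# Weights/activations are constrained to {-1, +1}; we encode +1 as bit 1
# and -1 as bit 0, packing a whole vector into one Python integer.

def binarize(values):
    """Map real-valued weights to {-1, +1} by taking the sign."""
    return [1 if v >= 0 else -1 for v in values]

def pack_bits(signs):
    """Pack a {-1, +1} vector into an integer: bit i is 1 iff signs[i] == +1."""
    bits = 0
    for i, s in enumerate(signs):
        if s == 1:
            bits |= 1 << i
    return bits

def binary_dot(a_bits, b_bits, n):
    """Dot product of two packed {-1, +1} vectors via XOR + popcount.

    Per position, the product is +1 when bits agree and -1 when they differ,
    so dot = (#agreements) - (#disagreements) = n - 2 * popcount(a XOR b).
    """
    return n - 2 * bin(a_bits ^ b_bits).count("1")

weights = binarize([0.7, -1.2, 0.1, -0.3])      # -> [+1, -1, +1, -1]
activations = binarize([-0.5, -0.9, 0.4, 0.2])  # -> [-1, -1, +1, +1]

# Reference: plain multiply-accumulate over the sign vectors.
reference = sum(w * a for w, a in zip(weights, activations))

# Same result from a single XOR and a popcount on the packed bits.
fast = binary_dot(pack_bits(weights), pack_bits(activations), len(weights))

print(reference, fast)  # -> 0 0
# Memory: 32-bit floats -> 1-bit signs is a 32x reduction in weight storage.
```

This identity is what lets binary layers replace multiply-accumulate units with XNOR and popcount instructions, which is where the speed and energy savings on tiny chips come from.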
So you need to implement those training algorithms in software, and we've built those software tools. Part of this work is actually also open-source: Larq — L-A-R-Q. And the third component you need is something we call a compute engine. This is the piece of software that takes the trained model and executes it very efficiently on a microcontroller. It's basically like an app on your smartphone — for example, the Snapchat app: you can't use it if you don't have Android or iOS; it needs an operating system. The compute engine is basically the operating system for deep learning on the microcontroller, and to build it to be very efficient is quite difficult — it's actually very challenging. Our team has done great work and has worked very hard to make it efficient and fast. So those are the three components you need to build — that's what we've built and what we're still improving. And actually, we're also covering another layer of the stack, which is chip design for reconfigurable chips called FPGAs. But that's getting technical.

Yeah, I don't think we should go further there — for a particular audience, sure, but the conceptual understanding is what's important for our folks: regardless of the use cases, the relative cost and the relative applications for business value — that's certainly relevant. One last thing that kind of floats to mind, Roeland, as you talk about this — maybe we can end on this point — is what's going to happen in this ecosystem of hardware and software at the edge. There's so much more that's going to be happening. There are some people, as you're well aware, who are trying to figure out what new kinds of hardware we're going to want to have — whether it be in self-driving cars, drones, or any kind of handheld device or cell phone — that will be better able to handle the kinds of machine-learning-oriented tasks we want to handle on the edge, without having to pipe to the cloud. There are other ecosystems that are really about adapting to the existing landscape of hardware and saying, okay, how can we effectively take the cutting edge of what ML is able to do and bring that into that world? Do you see, over the course of the next decade, just a bloom of expansion on both of those sides of the camp? What's your thought about the future here?

You mean both on the hardware side and —? I guess — let me frame it a different way. There are people who are trying to reinvent the wheel: "hey, look, if we're going to be at the edge, it's got to be these kinds of chips, this kind of processing" — trying to overhaul everything. Versus folks like yourselves, at least at the present time, saying, "hey, there's a huge ecosystem of existing hardware — let's make the cutting edge work there." Do you see as much explosion happening on both sides of the fence, or maybe do you think about it differently?

Yeah, I think both sides are pushing hard, and there are lots of different approaches, both on the hardware side and on the software side. I think that's also necessary, because to do more exciting things on these low-power, battery-powered devices, we really need much cheaper chips that run much more efficient software. I think it's very important that both sides keep pushing very hard.

Yeah, well, and it'll be interesting to see how you folks develop. Roeland, obviously there will be new algorithms, new approaches for AI — lord knows, five years from now, what will be the most popular computer vision approaches? I imagine you guys will be adopting and adapting to all new technical ways of getting this stuff done as things move forward, and we can't even predict all that stuff. But it sounds like, for you, there have to be people innovating on the hardware itself, and also people adapting the current hardware to get more stuff done — that those are both viable approaches.

Yeah, I agree, and you need to keep changing both sides. If, right now, the companies and the research teams that are designing better or more efficient software are purely looking at existing chips, you will not end up with the most efficient solution. And the same thing on the hardware side: if people are making chips only for existing deep learning algorithms, again, you end up at a local maximum. For example, take transformers — transformers are very efficient on GPUs. GPT, for example, which is very popular, has been designed to be very efficient on GPUs. And the eight-bit deep learning models that are very popular — those have also been designed for GPUs. The models that are currently out there, the most efficient and best models, have been designed for the chips people have. So if you design new deep learning algorithms — new models — together with new chips, you can end up with a very efficient and very powerful solution. So both sides, hardware and software, need to keep innovating.

The future is going to be an exciting one, and I think there's a lot of viability for being able to have some nimble, battery-powered solutions in the early days, to fill out these use cases and deliver some value. So I'm certainly rooting for you guys and seeing how things go. And I know that's all we have time for, Roeland — thank you so much for being able to join us here on the show today. Thank you very much for having me; it was great to join.
That's all for this episode of the AI in Business podcast. If you're interested in knowing more about the use cases that we cover here at Emerj, and if you'd like to have a visual explorer of use cases across retail — which we talked about today — across financial services, including insurance and banking, and across defense and heavy industry, then be sure to check out Emerj Plus. Emerj Plus is our premium subscription for folks that really want to put AI into action — whether you're a small consultant who needs to guide your clients with the best insight out there and really understand what their next steps should be, or whether you're an enterprise leader who wants access not only to use cases that you can use in your own business, but also best practices about measuring ROI, about adopting and deploying AI, and about building a team successfully. If you want to save yourself the hassle of reinventing the wheel and learn from some of the best guests that we've had here, including heads of AI at public companies, be sure to check out Emerj Plus. You can learn more about the subscription at emerj.com/p1 — that's "p," the number one: emerj.com/p1. That's all for this episode. I'll catch you for Thursday's "making the business case" episode, here on the AI in Business podcast.