AI at the Edge - When is it Better Than Centralized Compute? - with Geoffrey Tate of Flex Logix

Automatic Transcript

This is Daniel Faggella, and you're listening to the AI in Business Podcast. One of the findings from our market research work at Emerj Artificial Intelligence Research is that AI executive fluency is really the linchpin to return on investment for AI projects. If we have leadership that conceptually understands what AI can do and what it takes to make a project successful in terms of deployment, and if we have executives who understand a representative set of use cases so they know realistically what might fit where, we would be able to avoid a lot of the early stumbling blocks where AI kind of fell on its face in the enterprise. Fortunately, executives are becoming more fluent, and we obviously want to perpetuate that with our work here at Emerj and with this podcast. There is a topic about what it looks like to put AI into use that we don't cover a lot but that is worth having on the radar, and that is your choice around hardware and compute. We're not going to get technical; this is not a show for people who write code for a living. It's a show for people who decide on projects, build strategies, and manage budgets. That's who you folks are, and that's who we're delivering for. But hardware is still a useful and interesting topic depending on what industry you're in, and it's something everybody is going to have to think a bit more about in the decade ahead. Our guest this week is Geoffrey Tate. He is the co-founder of Flex Logix, and Flex Logix makes AI hardware. Geoffrey is out in Mountain View, where I used to live myself. We speak this week about when it makes sense to do AI at the edge and when it makes sense to have it centralized, the advantages of both, and the instances when AI at the edge might be the smarter move to play. We've talked in the past about AI in retail for computer vision, and about AI in utilities and transportation at the edge, but now we talk about it conceptually, and this is one of our "making the business case" episodes.
We get a bit more conceptual here on Thursdays. We're going to talk about when it makes sense to have your algorithms doing their work far away from your central compute, versus when you want to pipe that data somewhere else to have the work done there. Having a simple rule of thumb for dealing with that is awfully useful if you're thinking about use cases that involve complex compute needs, and I think Geoffrey does a great job of simplifying some of those insights, so I hope you find this episode helpful. If you want to support the show, and so many of you have been kind enough to do so over the course of the last six months, it means a lot to me: please drop a five-star review on iTunes and let us know what you like most. All of our best ideas about the show in the last year or year and a half have come from listeners like you, either contacting me on LinkedIn or dropping a kind review on iTunes and letting us know what kinds of episodes you like and what you like about the show. That has helped us to mold our material, and it also really helps to support us. So if you want to support Emerj, head over to the AI in Business Podcast on Apple Podcasts, drop a five-star review, and share any particular episode you like or any themes you want to see more of. We'd love to hear from you. We really do want to focus on the community this year, and you as a listener are a big part of that. So thanks so much to those of you who already have, and otherwise, without further ado, let's hop right into this episode. This is Geoffrey with Flex Logix here on the AI in Business Podcast. So Geoff, I want to start off with the core difference between doing AI in the data center versus doing AI at the edge. I know the hardware you folks are working on is, well, when most people think hardware, they think about the big racks sitting somewhere in a data center. The edge is different; the edge is new, and it's burgeoning. How do you define the edge when you talk to people?
Because I think people always think about those racks, but it's clearly a blooming ecosystem. Yeah, well, different people define it differently, but where we basically draw the line is that the edge is any system outside of the data center. There can be things like cell phone base stations, Verizon stations, that are kind of in between; what we're looking at is robots in the field, cars, ultrasound systems in the field. These are systems that are separate from and well removed from the data center. Okay, and obviously that's a pretty wide berth as to what that could be. This could be applied in almost any industry: retail, you've got cameras; energy, you've got, I don't know, some big turbine out there, you know, generating some power, killing the occasional bird. It's a pretty vast swath of what this can imply. Does that broad world of edge cluster in any interesting ways? I think industry would be one cut that makes sense, and maybe you can talk a bit about that, but also use case, what you see the edge sort of used for in a certain way. How do you think about this whole new space? For those of us at home, it's sort of new, it's novel, but how do we want to break it up? Well, we're just touching the top of the iceberg. We've engaged with a lot of customers and see a lot of market segments, and they have different potential sizes. One obvious one is cameras; there are cameras all over the place. You mentioned the Walmarts and the Wells Fargos; there are cameras today wired into servers in the back offices of these places, and those are servers not in the data center. Right now those cameras are just recording video in case somebody shoplifts something, so you've got it on the tape. Now they can add inference and start tracking their stores: checking buying behavior along the aisles, how long it takes to get through the lines, things like that. So that's an application where you need object detection and recognition. Similarly, when you're talking about robots, robots moving around in a distribution center or a warehouse.
They need to know, you know, where's the rack to put things on, and where's the person, to make sure you don't hit them. Yeah. So you're detecting and recognizing objects, then taking action appropriately. The same thing happens with cars. Those are all object detection and recognition models, like YOLOv3, which does an excellent job of that and which people are deploying now. And we see medical imaging, and there are many types of imaging: there are expensive MRI machines, there are much less expensive and more numerous ultrasound machines, X-ray, CT scanners, and stuff in between. There, what people are using models for is more specialized object detection and recognition. Typically you're scanning, say, a knee: it's stuck in there, it's not moving, but you're looking to detect some anomaly in the X-ray or the ultrasound. Is the baby okay? Has he got a busted ACL? So it's helping the radiologist do a better job of diagnosing. That's what those kinds of models need to be doing. Then there are things like scientists doing genomic imaging or life sciences work, and in many cases what they're doing is looking to clean up images using neural network approaches, removing extraneous information to clarify the image. If you've ever seen an ultrasound... I just recently had surgery; the doctors were trying to find a vein in my shoulder, and I could see the ultrasound they were looking at. It was a teaching hospital, and I couldn't tell what they were seeing. When they started, the doctors couldn't tell what was on it either, but eventually they figured it out. So these ultrasounds are hard to make out, and computers can help make better judgments, which results in better outcomes. So there's a wide range of applications we're seeing for inference models, and I think we're just scratching the surface. As inference gets more powerful and cheaper, it's going to go into more and more systems. Big time, yeah, and clearly, you know, the hardware will also get cheaper with time as well. Cameras have gotten a lot cheaper.
You know, you're talking a lot about vision, certain kinds of equipment components, and obviously the core hardware like what you folks are working on. You mentioned a lot of vision applications. For the two latter ones, I would actually presume maybe we don't necessarily have to be at the edge. If I'm at the Mayo Clinic and they're scanning me, is that one of those not-necessarily-at-the-edge ones? With those latter examples, is healthcare more traditionally sort of in the data center, or do you see plenty of healthcare applications outside of the data center that maybe we can talk about as well? Well, we're talking to people who are already designing AI into their systems, so for whatever reason, they chose to have the function inside their box, not connected to the data center. Connecting to the data center adds latency; you need a network interface, which always has to be up and be reliable; and they charge money. It's not free, unlike Google search: if you want to use AWS or something, you're paying. So their decision is that, at the right price, it makes more sense to deploy inference inside their box. Interesting, okay. So you're seeing that. And, you know, who knows, as prices shift, use cases shift, and workflows shift, whether more and more medical devices will be billed not as "hey, we're going to pipe this over to your data center" but as "hey, we're actually doing the calculations on board; you just kind of jack it in and you can see the results." So it sounds like it's possible that some of this stuff stays out of the data center. Sure, we've got nothing against the data center. We see a lot of applications where latency is important. If you're in a car and you're driving, you know, you can't be waiting for the data center. Definitely not. And you can't say, "well, sorry I hit you, the data center wasn't available." So there are clearly applications where real time is critical, and others where it's perhaps less critical. We see lots of applications where the customers want real time. Yeah, so.
Let's talk maybe about those factors that encourage the edge. Sure, and on sensors, by the way: I did mention a bunch of vision, but we see lidar, infrared, X-ray, laser; we've seen every kind of electromagnetic sensor you can think of used in various applications. So it's more than just visual imaging. Yeah. Well, I guess I'll tack two questions onto a few things you said. One of which, again: the healthcare example struck me as potentially not as urgent as autonomous vehicles, and maybe that would be data center work, but it wouldn't surprise me if the ecosystem evolves so that some of this stuff is just done at the edge. Those products might end up succeeding more, because, you know, for hospitals, setting up the level of maturity they would need to actually pipe this into their own data centers might just be too much of a heavy lift compared to doing it on board, if the hardware is cheap enough to do so. What are the factors? One you mentioned was latency. When you think about what's growing the demand for the edge, in other words for being outside of the data center, what's expanding that demand? Physical distance is one, okay: if I'm driving in some obscure part of Alaska with some kind of transportation vehicle, maybe, you know, I'm not going to be piped into a data center, so physical distance, maybe. You mentioned the need for low latency: if we need to make snap decisions, we absolutely can't have even a lick of lag, and that's going to be another factor that encourages us to lean in the edge direction. What else do you see as those kinds of magnetic poles pulling toward making the edge a bigger deal? Okay, let's talk about real costs. It costs money to run a neural network model in the data center, and I don't know exactly what the costs are, but it's not free. In fact, we had a customer tell us, back when we started into this.
I remember we talked to a maker of smart doorbells. Their doorbells could recognize people, but they actually had to have an internet connection: there had to be a data center connection, and the images went up and the results bounced back. They were looking for chips to replace that and put the inference into the unit itself, because as long as they were using the data center, they had to charge a monthly fee to cover the monthly data center costs, and their customers just wanted to buy the doorbell and install it. Exactly, yeah. So if you send every image to the data center, you're paying for every image; if you buy a chip, you pay for it once and it runs for the next fifteen years. So there is a cost to the data center. You can do a lot with the data center cheaply and fast, but, you know, in our world today it's not a black and white thing. The computing we use is a blend of local compute and the data center: when you need to find some information, you go to Google and you find it in their data center, but your Excel spreadsheets and all those things are running on your local compute. So, you know, it's not either/or; in our regular lives we use both, and the split is made based on responsiveness, cost, and other factors. Totally, and it seems to me like that same kind of blend is going to be natural. You know, if we think about where the compute is being used in different industries, it'll be splayed out differently, right? If I run a company with a lot of energy equipment out there in the world, maybe I'll have X percent of my compute that I could technically categorize as at the edge. If I'm a financial services firm just doing underwriting and accounting type stuff, maybe we're going to be looking at vastly more in the data center. So cost is another factor. Yeah, and there is a certain intermediate, which is, as I mentioned earlier: the Wells Fargos and Walmarts have cameras, and those cameras don't have any intelligence. There's a wire going back to a server in the back room, and that's where they capture the images and store them for later use.
So in that case, they're aggregating the workload into a server rather than having intelligence in the cameras. So we see customers who want to buy our boards to put in servers, so that one server can control many cameras, and that's the right tradeoff for them. It's still at the edge, but it's not all the way out to actually sitting in each individual camera. Again, the exact hardware fit is going to be different. Right, or, you know, on some assembly lines there might be eight robots controlled by one inference server. There are lots of different tradeoffs, and customers make the ones that are best, you know, for their application. Yeah, and again, this is not a developed enough space that we have all the best practices. At some point, X number of years in the future, if I'm Best Buy, using a completely arbitrary example (I don't shop there that much, so it's fun having them be the example here), and Best Buy opens a new store and I want to, you know, detect inventory levels, there's going to be a pretty tried-and-true, orchestrated way of cutting the mustard with my cameras and having that stuff set up hardware-wise, at least a couple of cookbook ways of doing it that a million other brick-and-mortar folks have used. Right now we're feeling that out, and like you said, it might be in the server, it might be in the camera: hey, we're going to experiment in the real world and find what's going to be right for the client application. So we have physical distance, we have latency, we have cost. I'm thinking from the perspective of the customer here when I think about whether this is going to cost more. I loved your doorbell example; that was a great example, Geoff, where the customer doesn't want to pay every month: I want to sell this thing, I want to outsell competitors by being able to offer an accessible price, so I don't want to bill my customers every month. I want to have smarts but pay for them one time, and that's going to help me grow my business.
That was one example where cost made sense, and it clicked in my head. When I'm a business leader, a functional business leader in retail, in energy, in whatever sector, and I'm thinking, okay, this particular use case is going to be pricier to run in the data center, or pricier with quote-unquote "edge computing," for lack of a better term, and of course we can't give anybody blanket advice, but are there ways you like to think through this, to kind of find those pockets where the edge often makes sense? One other thing to note is that the data center has way more power, but the products that run inference there are built to run really, really big models, and they're very expensive. At the edge, you want a much less expensive solution, so it's not necessarily clear that the products in the data center will be very good at running edge applications. In the data center, they're running models that have billions of weights; at the edge, we see people running models that are more like sixty-two million weights, which is still a lot. And data center chips are optimized for large batch sizes, if you're familiar with that term: in a data center there are thousands of servers, so they can aggregate a whole bunch of images, run them in parallel, and process, say, sixty-four images at a time. At the edge, images are coming in from one camera, one at a time, and you have to process them on the fly. So it's a different kind of inference that you need to do at the edge than what you do in the data center. The data center solutions are all very powerful, but they're optimized for data center problems, which are different problems. Got it; that's an interesting distinction as well, and it really makes me think here: you know, as we go into the future, will there be data centers? Almost certainly the answer is yes, but neither you nor I know the ratios.
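Geoffrey's batch-size point, that data center accelerators amortize work across big batches while an edge device must handle frames one at a time, can be illustrated with a toy latency model. This is only a sketch with made-up timing numbers; the frame rate, per-batch time, and per-frame time below are assumptions for illustration, not figures from the conversation:

```python
# Toy model of batch-64 vs. batch-1 inference latency.
# A data-center accelerator amortizes fixed overhead across a big batch,
# but a frame must wait for the batch to fill from a single camera; an
# edge chip processes each frame as it arrives. Numbers are hypothetical.

FRAME_INTERVAL_MS = 33.3   # one camera at roughly 30 frames per second

def datacenter_latency_ms(batch_size: int, per_batch_ms: float) -> float:
    """Worst-case latency: the first frame waits for the whole batch."""
    fill_time = (batch_size - 1) * FRAME_INTERVAL_MS
    return fill_time + per_batch_ms

def edge_latency_ms(per_frame_ms: float) -> float:
    """Each frame is processed on the fly, so latency is just compute time."""
    return per_frame_ms

dc = datacenter_latency_ms(batch_size=64, per_batch_ms=50.0)
ed = edge_latency_ms(per_frame_ms=20.0)
print(f"data center, batch 64: ~{dc:.0f} ms worst-case per frame")
print(f"edge, batch 1: ~{ed:.0f} ms per frame")
```

Even with optimistic compute times, a batch-of-64 pipeline fed by a single 30 fps camera makes the first frame wait roughly two seconds just for the batch to fill, which is why batch-1, on-the-fly inference matters for real-time edge use cases like the cars and robots discussed above.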
I'm sure it will be different per industry and geographic region and everything else, but will there be data centers that have entire chunks built to process a much more limited number of features, for particular kinds of problems, to be more energy efficient? Maybe there will be, soon: hey, this chunk of the data center is running this stuff because we have a lot of it, but we don't want to spend as much money on it. But as you're saying, right now they're not optimized for that; they're not optimized for edge problems, and that's part of what makes the cost argument so strong. You guys are in this space, and you're coming out with a new chip to operate at the edge, so clearly for you this is a bet worth making, and the edge is going to bloom. You talked about how we're just at the tip of the iceberg. Any quick closing notes for folks who are wondering, hey, what's going to make the edge take off? Are there going to be transitions, some sort of thresholds we're going to cross, where we really start this snowball of the edge becoming more and more of the compute ecosystem? Anything you can leave people with today? The solutions people are using now from the established leaders work, but they don't run as fast as people want and they cost too much, so edge market adoption is still relatively modest. People are predicting the market is going to grow to ten billion dollars from half a billion today, but the way markets grow in semiconductors is that you deliver equally good performance at a tenth of the price, and that's what our new chip does: it delivers performance like the market leaders' today, but at a fraction of the cost. That will enable people not just to make their current solutions better, but to put inference into systems where they can't afford high-quality, high-performance inference today because it's too expensive. And that's what will expand the market dramatically. Okay, so for you.
And again, I can't make any judgments on your particular product, but it sounds like the forces in play here are: as the use cases expand and become more popular, it's just going to become evident that the cost factors are holding us back from actually adopting things that we know are going to work in the industry, and so we're going to have to jump to the edge. There's going to have to be a shift in people's way of thinking about and managing their hardware. The analogy I'd use is that today the options for giving a computer high-performance inference are a Maserati and a Mercedes-Benz, okay? And we'd all like a Mercedes, but there are lots of people who only have enough money to buy a Toyota. Got it. And if you can give them a good product at a much lower price point that gives good performance, maybe not quite as good but almost as good, at a lot lower price, then all of a sudden a lot more people can put in high-performance products. That's what we're trying to do. Yeah, there would be far, far fewer cars on the road if only Maseratis were available. The other analogy I'll use here is that right now there just aren't that many roads. The United States is pretty well routed out with highways, but the ecosystem of what this stuff can do that will actually deliver value is not all that well routed out. Right now there isn't a clear-cut "hey, we all know how to solve fraud at checkout," "hey, we all know how to do facial recognition efficiently." Those roads just don't exist yet, so these playbooks need to develop so that people actually have them for things like what you guys are working on. But luckily, the use cases are not stopping anytime soon. You guys are in an exciting space, and hopefully, for those of you listening in, some of Geoff's ideas about what's going to get this to tip, and where the edge can make an impact, will be useful for you as you think about your own business. So Geoff, I know that's all the time we have on this interview, but thanks so much for being able to join us on the show.
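The doorbell economics Geoffrey described, paying a cloud fee every month versus paying once for an on-device inference chip, reduce to a simple break-even calculation. This is a minimal sketch with hypothetical prices; the $15 chip cost and $2/month fee are invented for illustration, not figures from the episode:

```python
# Break-even point for a one-time edge-inference chip versus a recurring
# cloud-inference subscription. All prices below are hypothetical.

def breakeven_months(chip_cost: float, monthly_cloud_fee: float) -> float:
    """Months of service after which the one-time chip becomes cheaper."""
    return chip_cost / monthly_cloud_fee

# Suppose the chip adds $15 to the bill of materials, while cloud
# inference would require a $2/month subscription per doorbell.
months = breakeven_months(chip_cost=15.0, monthly_cloud_fee=2.0)
print(f"The edge chip pays for itself after {months:.1f} months of service")
```

Over the fifteen-year device lifetime Geoffrey mentions, the same hypothetical doorbell would accumulate $360 in cloud fees against the one-time $15 part, which is the asymmetry that pushed that customer toward on-device inference.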
That's all for this episode of the AI in Business Podcast. Thanks for listening all the way through; I hope you've enjoyed this particular episode. It was Kisaco Research who put us in touch with Geoffrey a while ago. Kisaco runs a number of AI hardware summits, and Kisaco is also a company we've worked with in the past on letting people know about the AI Hardware Summit, but Geoffrey was actually a connection through them. So I want to give them an extra pat on the back and a big thank you for introducing us to somebody smart who made it on for another episode here on the program. Definitely check out Kisaco Research if you're interested in learning more about AI hardware. Otherwise, stay tuned right here next Tuesday if you want to hear more about use cases, because that's what we do every Tuesday here on the AI in Business Podcast. We look forward to catching you then.
