# A highlight from Prof. Ghassan AlRegib, Professor in the School of Electrical and Computer Engineering at the Georgia Institute of Technology.

### Automatic Transcript

Cheaper silicon and improving mathematics have brought this to a level that is becoming more interesting every day, but obviously there are some mathematical challenges. You know, when we look at the brain, it is able to do things with very little data, and it is really good at detecting anomalies. Deep neural networks typically require a lot of data for training, they don't necessarily generalize that well, and anomaly detection is a more complex problem, right? So this idea that you can use the gradients in backpropagation, that you can use gradients to detect anomalies: could you describe a little bit how you came to that?

Yes, absolutely, and I think you gave a very nice foundation for this work. If I may add some background about backpropagation and gradients, let's look at neural networks in general. What happens, as you described, is that there is a training phase where you have certain data and you have this network with a bunch of nodes in different layers and, essentially, parameters called the weights and biases of the network. The whole idea is that you feed in the data, images let's say in this case, and it propagates all the way through with some kind of initial weights, and at the end you basically have some prediction, estimation, or recognition, and so on. You have the ground truth in the training data, and now you start to compare how good the network's predictions are against that ground truth. Now you have an error, and you use backpropagation; in the backpropagation mechanism you do an optimization to really come up with good weights. Basically, what training is, is to find these parameters, these weights and biases, and in doing so you are doing gradient descent, meaning you rely on the gradients you compute.
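The training loop sketched above (forward pass, comparison with the ground truth, backpropagation of the error, and a gradient step on the weights) can be illustrated with a minimal toy example. This is a hypothetical logistic-regression sketch, not the networks discussed in the interview; the data, learning rate, and names here are all made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy labeled data: a stand-in for the training images and ground truth.
X = rng.normal(size=(100, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

# "Some kind of initial weights" and a bias, plus a learning rate.
w = np.zeros(2)
b = 0.0
lr = 0.5

def forward(X, w, b):
    # Forward pass: propagate inputs through to a prediction (a probability).
    return 1.0 / (1.0 + np.exp(-(X @ w + b)))

def loss(p, y):
    # Cross-entropy: how far the predictions are from the ground truth.
    eps = 1e-12
    return -np.mean(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))

losses = []
for _ in range(100):
    p = forward(X, w, b)
    losses.append(loss(p, y))
    # Backpropagation: gradients of the loss w.r.t. the parameters.
    grad_w = X.T @ (p - y) / len(y)
    grad_b = np.mean(p - y)
    # Gradient descent: step the weights toward lower error.
    w -= lr * grad_w
    b -= lr * grad_b

print(losses[0], losses[-1])  # the loss decreases as training proceeds
```

The same pattern (compute an error, backpropagate, update along the gradient) is what deep networks do at much larger scale.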
These gradients generate for you the optimal chain of values for your weights or parameters. And one of the most challenging aspects, as you mentioned, is that we train the networks on a specific type of data, and then the expectation is that they can generalize this knowledge to cover other data that is not part of the training. The challenge, usually, is that the network doesn't know when it doesn't know. It always has this kind of built-in forcing to make a decision, but not all of its decisions are correct, because the input may not even fit the model itself. So that's basically where a big part of our research, starting four or five years ago, has been looking into ways we can empower the network to do a couple of things. First, we would like to equip the network to know when it doesn't know, which I think is very important. Second, we would like it to differentiate between its knowledge space, its learned space, and compare that with data that lies outside that learned space. That's where anomaly detection comes into play, and, as you mentioned, there are tremendous challenges.

If I take as an example a very famous dataset, MNIST, which has images of the handwritten digits zero to nine: if you train on the digits zero to eight, for example, you finish the training and you have a good model ready. Now you input a test image of the handwritten digit nine. The network hasn't seen a nine during training, and now the question is how it will detect and recognize what it is versus what it hasn't seen before. The challenge here is for the network to say, "I haven't seen this."
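The intuition that backpropagated gradients can separate the learned space from anomalous inputs can be sketched with a deliberately simplified toy. This is not the lab's actual method, just an illustrative linear autoencoder: inputs that fit the learned space need almost no parameter update (small gradient), while inputs outside it demand a large one. All data and names here are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# "Training" data lying on the z = 0 plane in R^3: a stand-in for the
# network's learned knowledge space.
X_train = np.zeros((200, 3))
X_train[:, :2] = rng.normal(size=(200, 2))

# Fit a linear autoencoder in closed form via SVD/PCA: the encoder E
# projects onto the top-2 principal directions, the decoder D maps back.
_, _, Vt = np.linalg.svd(X_train, full_matrices=False)
E = Vt[:2]   # encoder, shape (2, 3)
D = E.T      # decoder, shape (3, 2)

def gradient_anomaly_score(x):
    """Anomaly score = Frobenius norm of the gradient of the per-sample
    reconstruction loss ||x - D E x||^2 with respect to the decoder D.
    In-distribution inputs reconstruct well, so backpropagation asks for
    almost no update; anomalous inputs produce a large gradient."""
    code = E @ x                              # latent representation
    residual = x - D @ code                   # reconstruction error
    grad_D = -2.0 * np.outer(residual, code)  # dL/dD for this sample
    return np.linalg.norm(grad_D)

in_dist = np.array([1.0, 2.0, 0.0])  # on the learned plane: tiny score
anomaly = np.array([1.0, 2.0, 5.0])  # off the plane: large score
print(gradient_anomaly_score(in_dist), gradient_anomaly_score(anomaly))
```

In the MNIST example from the interview, the analogous move would be to backpropagate the loss for a test digit and inspect the gradient's magnitude: a nine presented to a model trained only on zero through eight should, by this logic, demand an unusually large update.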