Sentiment Analysis with Rana Gujral, Behavioral Signals CEO

Automatic Transcript

I'm the CEO of Behavioral Signals, Rana Gujral. Behavioral Signals was born out of research at USC's SAIL research labs. What we do is use specialized algorithms to analyze human emotions and behaviors in voice, and that's the primary focus area: the tone of voice, the pitch and the tonal variance. From that we gather a lot of insights, we transform that data into usable and actionable information, and then we apply it towards various business KPIs. If you think about the landscape out there from a comparative standpoint, most of the NLP and voice interaction offerings you see tend to focus on what is being said. What we do is introduce the ability to understand how something is being said, in addition to the "what" part, by focusing on the pitch and tonal variance and deciphering from that emotions, speaking styles, and behaviors. But we've gone a step further, and some recent work has been around prediction, so we can actually predict what one of the participants we're focusing on is going to do in the near future, what actions the person is going to take, and we can do that with a very high level of accuracy. So it's almost getting into the really minute aspects of the state of mind, and then predicting how or what actions will come out of this conversation. That's the very high level; we can dive deeper into it, but that's kind of what we do.

That's great, very cool. So I guess I'm going to start by asking you a dumb question: is it a given? I think it's a given; most people follow voice assistants and have kept up with some of these innovations. It seems like it's a given that a voice assistant that can conduct sentiment analysis, that understands, you know, the way the words are being said, which is as important if not more important than the actual words being said, it seems like a voice assistant that can do that will be able to outperform one that can't. Is there ever a scenario where we wouldn't want sentiment analysis, or is it just a given that for any voice assistant to get to that next level, it needs to have that ability?

You're absolutely right, and I think the answer is, well, yes. But one way to answer that is to take a step back and see how the whole arena of voice interactions has progressed over the last decade or so. If you were to go back, not that far back, say five to seven years ago, we were speaking of capabilities and technologies such as NLP and NLU and, you know, the speech-to-text aspects of it as cutting edge, bleeding edge. We were very, very enamored by it: you could take audio and translate, or transcribe, it into actual words. That's fascinating, and you can do it while incorporating various accents, various types of speakers, cultural dialects and all of that, and multiple languages as well. But fast forward to today, and it's no longer cutting edge; it's a commodity capability, a commodity tool that companies offer, and some of the larger players are really throwing that capability into the market literally for free. So that's one thing, right, that's changed in the dynamics of interaction.
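For a sense of how commoditized that basic speech-to-text layer has become, a minimal Python sketch might look like the following, using the open-source SpeechRecognition package and the free Google Web Speech endpoint it wraps; the package choice and the audio file name are illustrative assumptions, not tools the speaker names.

```python
# Minimal commodity speech-to-text sketch (illustrative only).
# Requires: pip install SpeechRecognition
import speech_recognition as sr

recognizer = sr.Recognizer()
with sr.AudioFile("call_recording.wav") as source:  # hypothetical file
    audio = recognizer.record(source)  # read the whole file into memory

try:
    # Free, rate-limited web endpoint bundled with the library
    transcript = recognizer.recognize_google(audio)
    print(transcript)
except sr.UnknownValueError:
    print("Audio was unintelligible")
except sr.RequestError as err:
    print(f"Speech service request failed: {err}")
```

A few lines like these cover the "what is being said" layer the offerings above compete on; the "how it is being said" layer discussed next needs acoustic features rather than words.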
The second thing is: can you understand the specific context behind the voice conversations? For example, if there's a conversation between a doctor and a patient, you can convert the speech to text, but do you understand what's being said? Do you understand the lingo? Do you know the domain knowledge behind that conversation? There's been a lot of work done in those specific areas: there are people who have focused on doctor-patient conversations, others who have focused on spousal conversations, others who have focused on conversations between a salesperson and a client. So we understand those things now, and we can apply that context.

What hasn't changed is understanding state of mind, right? So, for example, you and I are talking, and we're not just, you know, focused on understanding what the other is saying; I'm really very interested in how you're saying something, because it's very important for me, and for you, to relate to that. What you're going to say next is not just going to be based on what I'm saying but on how I'm saying something, whether I mean it or not, and, you know, that's incredibly important. That's what makes a conversation a conversation; otherwise it's a single transcript of one-sided transactions: I tell you to do something, you either do it or you don't, and you respond back with whether you did it or not. That's how we talk to them today, right? And the promise was that virtual assistants would become real assistants. I mean, they'd replace that human we need in our lives, whether it's a secretary or a business assistant or a social assistant, or, you know, a companion we can talk to. That hasn't happened yet, right? And it doesn't... I mean, billions of dollars have been spent, more and more languages are supported by these assistants, and there are more and more skills or aptitudes these assistants have: they can open doors, they can show us the cameras, they can do complex math, they can tell us jokes and play music. But it's one-sided. There's no conversation happening, and that's because we don't like to converse with things that don't understand, you know, how we feel.
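As a rough illustration of the "how something is being said" signal described above, and not Behavioral Signals' actual models, the sketch below extracts simple prosodic cues (pitch level, pitch variability, loudness variability) from one utterance with the open-source librosa library; the function name, file name, and feature choices are assumptions made for illustration.

```python
# Rough prosodic-feature sketch, not Behavioral Signals' pipeline.
# Requires: pip install librosa numpy
import numpy as np
import librosa

def prosodic_features(path: str) -> dict:
    """Summarize pitch and loudness variation for a single utterance."""
    y, sample_rate = librosa.load(path, sr=None)  # keep native sample rate

    # Frame-level fundamental frequency (pitch) via the pYIN estimator
    f0, voiced_flag, voiced_prob = librosa.pyin(
        y,
        fmin=librosa.note_to_hz("C2"),  # ~65 Hz lower bound
        fmax=librosa.note_to_hz("C7"),  # generous upper bound
        sr=sample_rate,
    )
    f0 = f0[~np.isnan(f0)]  # keep voiced frames only

    rms = librosa.feature.rms(y=y)[0]  # frame-level loudness (RMS energy)

    return {
        "pitch_mean_hz": float(np.mean(f0)) if f0.size else 0.0,
        "pitch_std_hz": float(np.std(f0)) if f0.size else 0.0,  # tonal variance
        "pitch_range_hz": float(np.ptp(f0)) if f0.size else 0.0,
        "energy_std": float(np.std(rms)),
    }

# With labeled utterances, vectors like these could feed any off-the-shelf
# classifier (e.g. scikit-learn) to estimate emotion or likely next actions,
# along the lines the speaker describes.
# features = prosodic_features("utterance.wav")  # hypothetical file
```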
