Fraud, Facebook, Basketball discussed on Marketing Today with Alan Hart


Yes, so that campaign launched after Dara had been at Uber for about nine months, and we were sort of waiting for the right time. The name of the campaign was "Moving Forward," though for a while it was called "Fresh Start." It was like, you know, what's the right time to say, "Hey, we understand these mistakes we've made, and how do we move forward?" And there was a lot of debate: do we acknowledge the mistakes, do we refer to them, or do we just talk about moving forward? The usual stuff. It's actually funny how our ad came out: it came out at the same time Facebook was running its own little redemption ads, and so was Wells Fargo. I think it was during a basketball game, I don't remember which one, where the three ads showed back to back, literally in the same segment. Like, okay, well, so we had that. But anyway, so we were doing this ad, and again, Uber is a very performance- and analytics-driven culture, and we're basically trying to do this brand-ish type of campaign. And I'm spending all this money, I think it was a hundred and fifty million or so all told, so it's a decent amount. So it's like, well, how do we measure this and stay true to the Uber roots of measuring everything you do? When we think about TV, there are a couple of ways to measure it. The first, and I would say the most common, is what I call the squint test, where you look at some trendline, whether it's of sentiment or of growth or whatever it is, and there's the vertical dotted line where your campaign begins, and you say, "See? Ignore the fact that it went down, that's seasonal, but look at the pickup." You squint at the graph and use it to make a point, like, "Here's what it did." But obviously that's not my favorite. Somewhat better is test and control cities.
And that's the kind of thing where you find a bunch of cities, test in some of them and not in others, and that works reasonably well. But especially for Uber, you have a lot of noise across cities, because each city is very different. There are regulatory issues going on, there's a strike in Los Angeles, things like that, so it's very hard to have clean test and control cities, because the cities move very independently from each other based on local conditions. So we wanted to try to do it a better way, and we ended up using addressable TV. I'll just explain what it is. It's a little bit creepy, but not compared to everything else that's going on. Not everyone has a cable box, but about forty percent of the cable boxes are such that, to various extents, you can target a particular ad at a particular household and know whether that household had the TV on and presumably saw that ad. So instead of just buying demographics and getting general ratings for how many people watched a particular show, you can basically say, "Okay, I know this many households saw my ad, and here was the frequency distribution," and you can know that essentially at the household level. So we wanted to use this for the Moving Forward campaign. We had a big national TV buy, and then we looked at folks, whether they were prospects, active customers, or churned customers, and which of them had a cable set-top box that was addressable-TV accessible. Then we ran a split test among those: half of them got a bunch of additional ad viewings, we could show them the ad via their addressable TV, and half didn't. So you had a true random split test on getting a bunch more exposures to these commercials. And what's nice is, the national buy got basically a sixteen-x frequency on the ads, and folks that were in this addressable test cell
got thirty to forty x, right? So a big lift, and it's a true random split. And the other thing is, you're not relying on "did you see our ad?" survey things; you know they saw the ad, so you can just look at their behavior. On the con side, though, it was technically complex. Addressable TV, and again, this was two years ago, so it's probably gotten better: they were actually fairly good at reporting what happened, but they weren't really set up for us to say, "Hey, here's five million names, we want you to show them these ads at this frequency." They were good at reporting, but they weren't so good at "here's a list, we want you to show them this ad." The technology wasn't quite there yet, but it still worked reasonably well. We relied heavily on a partner, 605, a TV DMP, which really helped us stitch this whole thing together. Like I said, five million in test, five million in control, and we ran it. And it was interesting: we could actually track these individuals who had seen the ads to see what their behaviors were. So, some of the results. The campaign was primarily targeted at riders, though drivers saw it too. You can split the riders into three groups: those that were prospects, i.e., they're not riders yet, but you know, everyone should be an Uber rider at some point, so pretty much everyone's a prospect; those who were active; and those who had churned. There was some splitting we did within each of those buckets, but that's basically the biggest breakdown. And then you can think about, okay, we can see sentiment movement and we can see business-results movement, right? On sentiment, we saw a fair amount of movement in the churned folks, and a lot of these were folks we lost due to #DeleteUber. We also saw some sentiment change in the active group. We saw very little impact on sentiment in the prospects.
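The split-test design described above can be sketched in a few lines. This is a hypothetical illustration only: the household IDs, segment labels, and assignment logic are all assumptions, and the real test was executed through the TV DMP partner (605), not in-house code like this.

```python
import random

def assign_split(household_ids, seed=42):
    """Randomly assign addressable households to a test or control arm."""
    rng = random.Random(seed)  # fixed seed so the assignment is reproducible
    return {hh: ("test" if rng.random() < 0.5 else "control")
            for hh in household_ids}

# Toy universe of addressable households (the real test used ~10M).
households = [f"hh-{i}" for i in range(10)]
split = assign_split(households)

test_cell = [hh for hh, arm in split.items() if arm == "test"]
control_cell = [hh for hh, arm in split.items() if arm == "control"]

# Only the test cell receives the extra addressable impressions on top of
# the national buy (~16x frequency), lifting it to roughly 30-40x; the
# control cell sees only the national buy, so any behavioral difference
# between the arms can be attributed to the incremental exposures.
```

Because the assignment is random at the household level, comparing outcomes (sentiment, trips, reinstalls) between the two cells gives a clean causal read, without relying on survey recall of whether someone saw the ad.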
They're just apparently not really tuned in to all the drama, right? They didn't know Uber was bad, they didn't know we'd gotten better, they're just not really listening, right? Okay. And what was interesting, though, especially in the churned group: we saw pretty good sentiment movement pretty quickly, but it was about four months or so before we saw any movement in their actual activity. So there was a lag, and we dug into why that was. It seemed like most of these churned users had not actually stopped using ridesharing; they'd basically gone to Lyft. And what it turned out was happening: so the sentiment changed, great, but they were used to using Lyft now. What happened is we'd start to get them back if they had a problem with Lyft, oh, you know, they had a really bad ride experience, or there was a lot of surge pricing, whatever it is, and now they'd think, "You know what, maybe Uber isn't that bad after all, let me reinstall the app," or whatever it is. That's where we saw the effect, but it was four months out, so it would have been very hard to detect by normal means. Right, I love the approach, because it's rare that you get a clean look at things for a brand campaign like you just described. I've never heard of this use case for addressable TV, so that's pretty fascinating. Did it surprise anybody at Uber, the results you saw? So we saw the results I just told you about. I think the folks on the more brand side of the Uber world were a little disappointed that there wasn't more of a movement in the business metrics. Like I said, there was some, but in terms of the efficiency of spend, and to be clear, this was after we'd cut out the fraud, versus the efficiency of spending on performance marketing, it wasn't there. And then the question was, okay,
is there a way that we should value this movement in sentiment beyond the business metrics that we can now track? And maybe the answer is yes. For example, you could imagine, if we'd had these things in place before, would we have had more of a heat shield when #DeleteUber happened, when Susan Fowler happened, when the TK video broke? That's really hard to measure, even with addressable TV. So this doesn't completely solve the "okay, now we've cracked how to measure brand spend" problem; it's just one more step in the right direction. Gotcha, gotcha. We've talked a lot about, I guess, riders to this point, and I know you did probably even more marketing, or had done more marketing, toward the driver side of the equation. Anything come to mind in terms of lessons learned in trying to acquire drivers? I mean, it's hard when you talk about drivers, it's very confusing, like when we talk about the drivers of something, the drivers of changes, right? Like, no, no, I'm not talking about drivers of growth, I'm talking about the actual drivers. Yeah, we would occasionally use the term "driver partners" internally just to help with that. So there were a few things; one of them was very interesting. You're right, drivers were always the bigger challenge. Even pre the fraud reduction, we were spending in the neighborhood of two to three x more on drivers, and then post the fraud-related reduction, it was five to six x more on drivers. There was one interesting trade-off, and I can't say we really resolved it. So I was acquiring drivers and looking at the CAC-to-LTV ratio, and again, these were the heady days of growth, so of course CAC was higher than LTV. "Okay, what are you talking about?" Because the debate was whether it should be two x or three x. And I was spending a decent amount of money doing that. And then there was the driver incentive team, which is a marketplace team, which was spending money on things to say,
"Hey, driver, if you do at least a hundred trips this week, we'll give you an extra two hundred dollars." So on the incentive side, they were spending, I would say, about three x more than I was, a huge spend, and there was ongoing debate.
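The CAC-to-LTV comparison in that debate is simple arithmetic, sketched below. The dollar figures are invented for illustration; the interview only says that CAC exceeded LTV during the growth phase, with debate over whether the ratio should be two x or three x.

```python
def cac_to_ltv_ratio(total_acquisition_spend, drivers_acquired, ltv_per_driver):
    """CAC is spend per acquired driver; the ratio compares it to lifetime value.

    A ratio above 1.0 means paying more to acquire a driver than that
    driver's expected lifetime value, which the interview says was the
    norm in the "heady days of growth".
    """
    cac = total_acquisition_spend / drivers_acquired
    return cac / ltv_per_driver

# Hypothetical numbers: $6M spent to acquire 10,000 drivers worth $300 each.
ratio = cac_to_ltv_ratio(total_acquisition_spend=6_000_000,
                         drivers_acquired=10_000,
                         ltv_per_driver=300.0)
# CAC works out to $600 per driver, i.e. a ratio of 2.0 (two x LTV).
```

The open question in the interview, whether a two x or three x ratio is acceptable, is a business judgment this formula can frame but not answer.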