"A lot of good analysis is wasted doing the wrong thing." Anyone who has worked with data on business problems is probably aware of this adage. And this past week, I was reminded once again of this fact while analyzing a marketing program. This example is so striking because the difference between doing the "right" thing and the "almost-right" thing ended up being more than a factor of 10 -- a really big variance on a financial calculation.
Some background. One of my clients does a lot of prospecting on the web. They have various campaigns to increase leads to their web site. These campaigns cost money. Is it worth it to invest in a particular program?
This seems easy enough to answer, assuming the incoming leads are coded with their source (and they seem to be). Just look at the leads coming in. Compare them to the customers who sign up. And the rest, as they say, is just arithmetic.
Let's say that a customer who signs up on the web has an estimated value of $300. And, we can all agree on this number because it is the Finance Number. No need to argue with that.
The first estimate for the number of leads brought in was around 160, produced by the Business Intelligence Group. With an estimated value of $300, the pilot program was generating long term revenue of $48,000 -- much more than the cost of the program. No brainer here. The program worked! Expand the program! Promote the manager!
The second estimate for the number of leads brought in was 12. With an estimated value of $300, the pilot was generating $3,600 in long term revenue -- way less than the cost of the program. Well, we might as well burn the cash and roast marshmallows over the flame. No promotion here. Know any good recruiters?
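The arithmetic behind both estimates is identical; only the lead count differs. A minimal sketch of the calculation, using the $300 Finance Number and the two lead counts from the discussion above:

```python
# Sketch of the revenue arithmetic behind both estimates.
CUSTOMER_VALUE = 300  # the agreed-upon Finance Number, in dollars


def long_term_revenue(num_leads, value_per_lead=CUSTOMER_VALUE):
    """Estimated long-term revenue attributed to a campaign."""
    return num_leads * value_per_lead


print(long_term_revenue(160))  # first estimate:  48000 -- expand the program!
print(long_term_revenue(12))   # second estimate:  3600 -- roast marshmallows.
```

The formula is trivial, which is exactly the point: the financial conclusion is entirely hostage to the lead count fed into it.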
Both these estimates used the same data sources. The difference was in the understanding of how the "visitor experience" is represented in the data.
For instance, suppose a visitor has come to the site 300 times in the past. The 301st visit comes through the new marketing program. Then, two weeks later on the 320th visit, magic happens and the visitor becomes a customer. Is the lead responsible for the acquisition? This problem is called channel attribution. If the customer had signed up on the same visit that brought him or her in as a lead, then yes, you could attribute all or most of the value to that marketing program. But two weeks and 20 visits later? Not likely. The visitor was already interested.
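That intuition can be made concrete as a simple attribution rule: only credit the campaign when the visitor was nearly new at the time of the lead visit and converted soon afterward. This is only a sketch -- the thresholds and field names below are invented for illustration, not taken from the actual analysis:

```python
from datetime import datetime, timedelta

# Illustrative thresholds (not from the original analysis).
MAX_PRIOR_VISITS = 5      # a "new" visitor, not a 300-visit regular
MAX_DAYS_TO_CONVERT = 1   # signed up on (roughly) the same visit


def attribute_to_campaign(lead_visit_number, lead_time, signup_time):
    """Crude attribution rule: credit the campaign only if the visitor
    was nearly new when the lead arrived AND converted quickly after."""
    nearly_new = lead_visit_number <= MAX_PRIOR_VISITS
    converted_fast = (signup_time - lead_time) <= timedelta(days=MAX_DAYS_TO_CONVERT)
    return nearly_new and converted_fast


# The example from the text: the lead was the 301st visit, and the
# signup happened two weeks (and ~20 visits) later.
lead_time = datetime(2024, 1, 1)
signup_time = lead_time + timedelta(days=14)
print(attribute_to_campaign(301, lead_time, signup_time))  # False
```

Real attribution models are far more nuanced (first-touch, last-touch, multi-touch weighting), but even a crude rule like this one would have rejected the 301st-visit scenario.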
And then . . . the visitor keeps using the same browser (or whatever identifies him or her). Later, s/he decides to log in. At that point, the visitor is identified as a customer. And, more importantly, the VisitorId associated with the visitor is now tied to a customer. But that doesn't mean the lead created the customer. Logging in merely identified an existing customer.
Guess what? This happened more times than you might imagine. In many, many cases, the 160 "customers" generated by the leads had been customers for months and years prior to this marketing campaign. It doesn't make sense to attribute their value to the campaign.
The moral of this story: it is important to understand the data and, more importantly, to understand what the data is telling you about the real world. Sometimes, in our eagerness to get answers, we miss very important details.
As a final note, we found the problem through a very simple request. Instead of just believing the number 160 in the report generated by the Business Intelligence Group, we insisted on the list of leads and account numbers created by the program. With the list in hand, the problems were fairly obvious.
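The audit check itself can be as simple as joining the campaign's lead list to account data and flagging accounts that already existed before the campaign began. A sketch, with invented field names and dates standing in for the real data:

```python
from datetime import date

# Hypothetical lead list from the campaign: (lead_id, account_number).
leads = [("L1", "A100"), ("L2", "A200"), ("L3", "A300")]

# Hypothetical account-creation dates from the account table.
account_created = {
    "A100": date(2019, 5, 1),   # a years-old account
    "A200": date(2024, 3, 10),
    "A300": date(2024, 3, 12),
}

campaign_start = date(2024, 3, 1)

# An account created before the campaign started cannot have been
# created BY the campaign -- attribute no value to those leads.
pre_existing = [(lead, acct) for lead, acct in leads
                if account_created[acct] < campaign_start]
print(pre_existing)  # [('L1', 'A100')]
```

No sophisticated analysis needed: one comparison of dates surfaces exactly the "customers for months and years prior" problem described above.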