How To Tell If Your Lead Scoring Works

Clients often ask me whether their lead scoring model works as intended. In this post, I’ll walk through some common problems I see with lead scoring models, and what you should look for when evaluating your model.

How often should you review your model?

Done right, lead scoring can be an unfair advantage in your market. It’s completely reasonable to see $1M in incremental revenue and a 10x improvement over alternative campaigns, as DocuSign recently found. The gains from lead scoring often come as quick wins, so you can also see higher velocity in your pipeline.

While this kind of result is the impetus to create your lead scoring model, it won’t happen if your model remains static. Even if you build an effective data-driven model that yields immediate success, you should still update it. How often? It depends on your specific circumstances, but in general, about as often as your product/market and talent change.

Changes in Product/Market

A lead scoring model reflects your assumptions about your target market: who can buy your product? Who is more likely to buy it? One example of when you should revisit your scoring model is when you shift your product focus downstream toward a lower-priced SaaS model. More of your prospects will now be able to afford your product, so you should account for this with an updated model.

Changes in Talent

Less intuitive is a change in your sales team. For complex sales, a lead scoring model is as much a function of your sales team’s capabilities as it is of your product. When your sales team changes, your model should change as well.

To understand this: imagine you have a sales team with deep expertise selling to financial services. Any model you’ve designed will reflect a preference for financial services, as these will be their preferred leads.

Then one day, you hire a rainmaker with deep expertise in biotechnology. Sitting in your database are hundreds of biotech leads that rarely made it to an appointment.

If you miss the chance to re-score these leads and surface them to your new sales rep, your scoring model will no longer be relevant.
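
To make that re-scoring concrete, here is a minimal sketch of what it could look like, assuming a simple rules-based model whose industry weights change when the new rep joins. The field names, weights, and leads below are hypothetical.

```python
# Minimal sketch: re-score dormant leads after a new industry weight is added.
# Field names, weights, and the leads themselves are hypothetical.

def score_lead(lead, industry_weights):
    """Very simple rules-based score: industry fit plus a seniority bonus."""
    score = industry_weights.get(lead["industry"], 0)
    if lead["title_seniority"] in ("VP", "C-level"):
        score += 20
    return score

# Original model built around a financial-services-heavy sales team.
weights_before = {"financial_services": 50, "biotechnology": 5}
# After hiring a rep with deep biotech expertise, raise that industry's weight.
weights_after = {"financial_services": 50, "biotechnology": 45}

dormant_leads = [
    {"id": 101, "industry": "biotechnology", "title_seniority": "VP"},
    {"id": 102, "industry": "financial_services", "title_seniority": "Manager"},
]

for lead in dormant_leads:
    old = score_lead(lead, weights_before)
    new = score_lead(lead, weights_after)
    if new > old:
        print(f"Lead {lead['id']}: {old} -> {new}  (surface to the new rep)")
```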

Minimum Viable Iteration

At a minimum, you should give more attention to your lead scoring model early on, as you build the initial data set to validate the model against your business.

In reality, however, few businesses I meet update their lead scoring model in a predictable, routine way. More often, they wait for shifts in marketing to justify a general review of their processes:

  1. Change of Marketing admin? Check the scoring
  2. Change of Head of Sales? Check the scoring
  3. Change of Head of Marketing? Check the scoring

While you should update your scoring model more often than this, these are also good opportunities to review your model and make sure it still meets your business needs.

What makes a great lead scoring model?

Let’s start with the basics: what makes a good lead scoring model? Here are four attributes you may not have considered before:

1. Successfully predicts best leads

A great lead scoring model successfully predicts your warmest and best leads. This is easy to test and often easier still to correct (more details in a subsequent post).
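
To make “easy to test” concrete, here is a minimal sketch of one common check: group historical leads by score band and compare conversion rates. The scores and outcomes below are hypothetical; in practice you would pull (score, converted) pairs from your CRM or marketing automation platform.

```python
# Minimal sketch: check whether higher score bands actually convert better.
# The historical data below is hypothetical.

from collections import defaultdict

history = [
    (92, True), (88, True), (85, False), (71, True),
    (64, False), (55, False), (48, True), (30, False),
]

bands = defaultdict(lambda: [0, 0])  # band -> [conversions, total]
for score, converted in history:
    band = (score // 20) * 20          # 0-19, 20-39, 40-59, 60-79, 80-99
    bands[band][1] += 1
    if converted:
        bands[band][0] += 1

for band in sorted(bands, reverse=True):
    won, total = bands[band]
    print(f"Scores {band}-{band + 19}: {won}/{total} converted "
          f"({100 * won / total:.0f}%)")

# If conversion rates do not rise with the score band, the model is not
# predicting your best leads and needs attention.
```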

2. Recognizes your assumptions

A great lead scoring model documents the assumptions behind it. Suppose, for example, your scoring model assumes all leads with a Gmail address are bad leads. If you then start running a Twitter ad campaign, you’ll find most of your leads have personal email addresses such as Gmail, even if they are great target prospects. If you haven’t documented your assumptions, the Gmail rule can turn into a sacred cow, and you won’t be able to easily find and update it to match your new circumstances.
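
As a minimal sketch of what documented assumptions can look like in practice, the rules below carry the reasoning behind each weight alongside the rule itself, so it can be found and revisited later. The rule names, points, and fields are hypothetical.

```python
# Minimal sketch: keep the assumption behind each rule next to the rule
# itself, so it can be found and revisited. Rules and weights are hypothetical.

SCORING_RULES = [
    {
        "name": "free_email_penalty",
        "points": -15,
        "applies": lambda lead: lead["email"].endswith("@gmail.com"),
        "assumption": "Gmail addresses indicate consumers, not business buyers. "
                      "Revisit if campaigns (e.g. Twitter ads) start attracting "
                      "real prospects on personal email.",
    },
    {
        "name": "enterprise_bonus",
        "points": 25,
        "applies": lambda lead: lead["employees"] >= 1000,
        "assumption": "Product is priced for enterprise; revisit if we move "
                      "downstream to a lower-priced SaaS tier.",
    },
]

def score(lead):
    return sum(rule["points"] for rule in SCORING_RULES if rule["applies"](lead))

lead = {"email": "jane@gmail.com", "employees": 2500}
print(score(lead))  # 10: both the penalty and the bonus are documented above
```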

3. Understands the limitations of behavioral data

A common mistake marketers make with lead scoring is to treat the score as an exact measure of a lead’s quality. In doing so, they promote a view of the world that salespeople know isn’t accurate, even if they cannot always explain why it’s wrong.

Suppose, for example, you weight email engagement heavily, but your top prospects dislike engaging by email and prefer phone calls or print materials. Your model will give them an artificially low score because their digital body language doesn’t match your model.

Another example we’ve seen with clients is a buyer segment that doesn’t click on emails for security reasons. This one shift in behavior can have profound effects on your lead scoring model, because you simply won’t be able to provide an accurate number representing their engagement.
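
As an illustration, here is a minimal sketch contrasting an email-heavy behavioral score with one that credits a lead’s strongest channel instead. The channel weights and the prospect below are hypothetical.

```python
# Minimal sketch: an email-heavy behavioral score vs. one that credits the
# lead's strongest channel. Channel weights and the prospect are hypothetical.

def email_heavy_score(lead):
    # Original model: engagement is essentially email opens and clicks.
    return 10 * lead["email_opens"] + 20 * lead["email_clicks"]

def channel_aware_score(lead):
    # Alternative: take the strongest signal across channels so a
    # phone-preferring or security-conscious segment is not zeroed out.
    email = 10 * lead["email_opens"] + 20 * lead["email_clicks"]
    phone = 40 * lead["calls_answered"]
    events = 30 * lead["events_attended"]
    return max(email, phone, events)

prospect = {"email_opens": 0, "email_clicks": 0,
            "calls_answered": 3, "events_attended": 1}

print(email_heavy_score(prospect))    # 0   - looks cold to the old model
print(channel_aware_score(prospect))  # 120 - engaged, just not by email
```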

4. Gets buy-in from the sales team

Finally, the best scoring model is the one your sales team trusts. That means the sales team both understands the scoring model and believes it will work. Too often, I find salespeople who ignore a scoring model or reject the idea of scoring outright. This happens when the sales team doesn’t understand what the scores are based on, or when the scores simply aren’t accurate.

A major part of the rollout is communicating the model, so that your sales team sees how lead scoring will improve their results in calls and emails. That said, you can’t just build a bad model and sell it to your sales team as gold. By the third time a salesperson calls a hot lead only to get shut down, they’ll blame your model and go back to their own ritualized targeting.

In general, best-in-class companies hold weekly review meetings between sales and marketing, where they audit every lead rejected by sales along with a sampling of leads that haven’t quite reached the threshold.

This is the easiest and most effective way to validate the model and maintain the confidence of your sales team.
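
As a minimal sketch of how that weekly audit list might be pulled, the snippet below collects every sales-rejected lead plus a small random sample sitting just under the threshold. The field names, threshold, and leads are hypothetical; in practice this data comes from your CRM.

```python
# Minimal sketch of the weekly audit pull: every lead sales rejected, plus a
# small random sample just below the MQL threshold. Values are hypothetical.

import random

THRESHOLD = 70
NEAR_MISS_WINDOW = 10   # audit leads scoring within 10 points of the cutoff

leads = [
    {"id": 1, "score": 82, "status": "rejected_by_sales"},
    {"id": 2, "score": 75, "status": "working"},
    {"id": 3, "score": 66, "status": "nurture"},
    {"id": 4, "score": 63, "status": "nurture"},
    {"id": 5, "score": 41, "status": "nurture"},
]

rejected = [l for l in leads if l["status"] == "rejected_by_sales"]
near_misses = [l for l in leads
               if THRESHOLD - NEAR_MISS_WINDOW <= l["score"] < THRESHOLD]
sample = random.sample(near_misses, k=min(3, len(near_misses)))

print("Audit list:", rejected + sample)
```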

Lead scoring is often implemented and rarely leveraged. While it’s not valuable for every company, those with the volume to justify it can still see tremendous outcomes.

Steven Moody

Steven Moody is an expert on growing through strategic content and marketing automation. He was a 2013 Marketo Champion who is interested in the challenges faced by B2B sellers and buyers alike. Tweet with him at @sjmoody and learn more at: http://beachheadmarketing.com