
Predictive Scoping Overview

Our Methodology

 

Leadspace predictive models apply advanced artificial intelligence (AI) to find people, match leads to accounts, segment by buyer personas, and surface the next-best leads and accounts to target. Our scoring models are dynamic by design to help you meet the challenges of your target audience. The Leadspace platform continuously evaluates audience engagement with your campaigns to learn from what works and what doesn’t. Those insights are then used to further refine your audience intelligence, enabling you to get the latest view of your market, create up-to-date segments, and optimize future campaigns.

The diagram below outlines the Leadspace process for creating and deploying an effective Audience Model.

 

During the scoping workshop, we will cover and discuss the following nine topics across three tracks:

 

 

Below are the questions that will guide the discussion.

 

Models Scope and Roadmap

 

Business Cases

  1. Are you looking for a scoring solution for outbound targeting, inbound lead scoring, or something else? (examples: cross-sell / upsell / opportunity scoring)
  2. Will your primary goal for inbound scoring be to provide additional qualified leads to sales or to increase the quality of what’s being delivered today?
  3. Will your primary goal for outbound scoring be identifying net new target accounts and/or prioritizing your account DB?
  4. Which stage of the funnel are you trying to optimize? (examples: lead to MQL, named account to closed-won)
  5. Which engagement channel are you trying to optimize? (examples: inside sales team, email nurturing, website conversion)
  6. Do you currently use an existing scoring / filtering solution? If yes, what are its gaps?

 

Business Environment

  1. Which department/s are involved in implementing a scoring model? Who are the key stakeholders?
  2. What is the priority by business unit and/or product lines?
  3. What is the priority by segment? (examples: SMB vs ENT, Americas vs EMEA)
  4. What is the priority by sales and marketing channel? (example: direct selling vs partner, internal sales team vs outsourced call center)
  5. Is the scoring solution focused on optimizing existing go-to-market initiatives that can be modeled from historical data?

 

Metrics & Benchmarks

  1. What is the key metric you expect to influence? (examples: increase average lifetime value for the SMB segment by 10%, increase click rate by 20%, increase average pipeline creation per rep by 15%)
  2. Do you have current benchmarks relative to the metric above? (a simple way to compute such a baseline is sketched after this list)
  3. Are there qualitative metrics that are used to evaluate the scoring solution?
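 
A current benchmark is usually derived from historical funnel data. The sketch below is a minimal example, assuming a hypothetical lead export with illustrative column names (created_date, status, owner_rep, opportunity_amount rather than your actual field names); it shows how a baseline lead-to-MQL conversion rate and average pipeline per rep might be computed so that a target lift has something to be measured against.

    # Minimal sketch: computing baseline metrics from a hypothetical lead export.
    # Column names (status, owner_rep, opportunity_amount) are placeholders, not actual Leadspace or CRM fields.
    import pandas as pd

    leads = pd.read_csv("leads_last_12_months.csv", parse_dates=["created_date"])

    # Baseline lead-to-MQL conversion rate.
    mql_rate = leads["status"].isin(["MQL", "SAL", "SQL"]).mean()

    # Baseline average pipeline created per rep.
    pipeline_per_rep = leads.groupby("owner_rep")["opportunity_amount"].sum().mean()

    print(f"Lead-to-MQL conversion rate: {mql_rate:.1%}")
    print(f"Average pipeline per rep:    ${pipeline_per_rep:,.0f}")

    # A target such as "increase average pipeline per rep by 15%" is then simply:
    target_pipeline_per_rep = pipeline_per_rep * 1.15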

 

Training Set and Model Approach

 

Business Process & Systems

  1. What is the current flow and stages of leads through the funnel?
  2. What systems are being used to support the flow above?
  3. What triggers movement through the funnel stages? Are these triggers manual or automatic? (a simple automated trigger rule is sketched after this list)
  4. What are the key lead sources that fill your pipeline?
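 
As a point of reference for question 3, an automatic trigger is typically a simple rule evaluated whenever a lead record changes. The sketch below is a hypothetical example; the field names and the 70-point threshold are illustrative placeholders, not a recommended configuration.

    # Hypothetical automatic trigger: promote a lead to MQL when a score threshold is crossed.
    # Field names and the threshold value are illustrative placeholders.
    from dataclasses import dataclass

    @dataclass
    class Lead:
        status: str
        behavioral_score: int
        has_valid_email: bool

    def apply_mql_trigger(lead: Lead, threshold: int = 70) -> Lead:
        """Automatically move an open lead to MQL once its score clears the threshold."""
        if lead.status == "Open" and lead.has_valid_email and lead.behavioral_score >= threshold:
            lead.status = "MQL"
        return lead

    lead = apply_mql_trigger(Lead(status="Open", behavioral_score=82, has_valid_email=True))
    print(lead.status)  # MQL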

 

Data Objects

  1. Which data objects will be included for constructing the training set and model deployment? (examples: lead, contact, opportunity, account, custom objects)
  2. What time-window will be used to construct the training set? (example: all leads created in the last 12 months)
  3. Which fields and values are critical for determining the positive/baseline records for training the model? (examples: opportunity stage [closed-won, closed-lost], lead status [MQL, SAL, SQL]; see the sketch after this list)
  4. Which 1st party data fields will be used to match against 3rd party data? (examples: company website, company country)
  5. Which 1st party data fields can serve as additional signals in the model? (examples: lead source, activities, existing behavioral score)
  6. Are there expectations regarding specific 3rd party signals that should be included in the model? (examples: custom list of complementary security vendors)
  7. How are lead-to-account linkage and company hierarchies defined and processed?
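 
To make questions 2, 3, and 7 concrete, the sketch below shows one common way a training set is assembled from CRM exports: a time window filters the records, lead status / opportunity stage defines positive versus baseline examples, and an email-domain-to-website match links leads to accounts. All field names and label rules here are hypothetical placeholders; the actual definitions are agreed on during scoping.

    # Minimal sketch of training-set construction; field names and label rules are placeholders.
    import pandas as pd

    leads = pd.read_csv("leads.csv", parse_dates=["created_date"])
    accounts = pd.read_csv("accounts.csv")  # assumed to include a company_website column

    # 1. Time window: all leads created in the last 12 months.
    cutoff = pd.Timestamp.today() - pd.DateOffset(months=12)
    window = leads[leads["created_date"] >= cutoff].copy()

    # 2. Positive vs. baseline records, defined from lead status / opportunity stage.
    positive_statuses = {"MQL", "SAL", "SQL", "Closed Won"}
    window["label"] = window["status"].isin(positive_statuses).astype(int)

    # 3. Lead-to-account linkage: match the lead's email domain to the account website domain.
    window["email_domain"] = window["email"].str.split("@").str[-1].str.lower()
    accounts["website_domain"] = (
        accounts["company_website"]
        .str.replace(r"^https?://(www\.)?", "", regex=True)
        .str.split("/").str[0]
        .str.lower()
    )
    training_set = window.merge(
        accounts, left_on="email_domain", right_on="website_domain", how="left"
    )

    print(training_set["label"].value_counts())  # rough positive / baseline balance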

 

Data & Reporting Biases

  1. Are the following common data and reporting biases relevant to your process? (simple checks for a few of these are sketched after this list)
    1. Key fields are being populated with default values that could bias the model
    2. Unqualified / qualified leads are being deleted from the system by partners, sales reps, etc.
    3. Unqualified leads are auto-qualified by existing workflows
    4. Fields that are only being populated / modified in late funnel stages
    5. One-time, large-scale lead / account uploads
  2. Are there any significant changes in go-to-market strategy that will contradict historical data? (example: a business initiative to focus only on selling to Enterprise companies despite historical success with SMBs)
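 
A few of these biases can be surfaced with simple profiling before modeling starts. The sketch below assumes a hypothetical lead export with illustrative field and stage names; it flags fields dominated by a single default value and fields that are only populated in late funnel stages.

    # Minimal sketch of pre-modeling bias checks; field and stage names are placeholders.
    import pandas as pd

    leads = pd.read_csv("leads.csv")
    late_stages = {"SQL", "Closed Won", "Closed Lost"}

    # Bias 1: key fields dominated by a default value (e.g., 95%+ of rows share one value).
    for col in ["industry", "employee_count", "country"]:
        top_share = leads[col].value_counts(normalize=True, dropna=False).iloc[0]
        if top_share > 0.95:
            print(f"{col}: {top_share:.0%} of records share one value - possible default-value bias")

    # Bias 4: fields populated mostly in late funnel stages.
    is_late = leads["status"].isin(late_stages)
    for col in ["revenue_range", "buying_timeline"]:
        early_fill = leads.loc[~is_late, col].notna().mean()
        late_fill = leads.loc[is_late, col].notna().mean()
        if late_fill - early_fill > 0.5:
            print(f"{col}: filled {late_fill:.0%} late vs {early_fill:.0%} early - late-stage population bias")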

 

Architecture and Integration Plan

 

Architecture & Security

  1. Which systems will the Leadspace scoring solution integrate with?
  2. Which organizational function/s will own the integration process?
  3. What are the real-time / on-demand vs batch / recurring scoring scenarios? (see the sketch after this list)
  4. What are the security requirements for getting API or offline access to data objects?
  5. Are there specific constraints for getting PII data and / or customer data?
  6. What will be the integration stages before going live? (example: sandbox first)
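 
For question 3, the two scoring patterns usually look something like the sketch below: a real-time call scores a single record at form submission or lead creation, while a batch job scores a recurring export. The endpoint URL, payload fields, and authentication shown are hypothetical placeholders, not the actual Leadspace API; the real integration details are defined during the architecture review.

    # Hypothetical illustration of real-time vs. batch scoring patterns.
    # The endpoint, payload fields, and auth header are placeholders, not the actual Leadspace API.
    import requests
    import pandas as pd

    API_URL = "https://scoring.example.com/v1/score"   # placeholder endpoint
    HEADERS = {"Authorization": "Bearer <token>"}       # placeholder auth

    # Real-time / on-demand: score one lead as it is created or submits a form.
    def score_lead_realtime(lead: dict) -> float:
        response = requests.post(API_URL, json=lead, headers=HEADERS, timeout=5)
        response.raise_for_status()
        return response.json()["score"]

    # Batch / recurring: score a nightly export and write the results back for upload.
    def score_leads_batch(input_csv: str, output_csv: str) -> None:
        leads = pd.read_csv(input_csv)
        leads["leadspace_score"] = [
            score_lead_realtime(row) for row in leads.to_dict(orient="records")
        ]
        leads.to_csv(output_csv, index=False)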

 

Activations Workflow

  1. In which systems and funnel stage will the Leadspace scoring field/s be deployed?
  2. Will the Leadspace score determine hard thresholds (filters) or soft thresholds (prioritization)?
  3. Will the Leadspace score be activated independently or combined with an existing score / business logic?
  4. Will there be a need for score normalization in specific low-converting segments? (example: scores in low-converting sales territories will be normalized to 0-100; see the sketch after this list)
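 
Questions 2 and 4 come down to how raw scores are transformed before activation. The sketch below, using hypothetical segment names and score values, shows a per-segment min-max rescaling to 0-100 and the difference between a hard threshold (filter) and a soft threshold (prioritization).

    # Minimal sketch: per-segment 0-100 normalization, plus hard vs. soft threshold use.
    # Segment names and the threshold of 60 are illustrative placeholders.
    import pandas as pd

    scores = pd.DataFrame({
        "lead_id":   [1, 2, 3, 4, 5, 6],
        "territory": ["EMEA", "EMEA", "EMEA", "LATAM", "LATAM", "LATAM"],
        "raw_score": [0.91, 0.72, 0.55, 0.40, 0.31, 0.22],  # LATAM converts lower overall
    })

    # Normalize each territory's scores to 0-100 so low-converting segments remain rankable.
    def minmax_0_100(s: pd.Series) -> pd.Series:
        return 100 * (s - s.min()) / (s.max() - s.min())

    scores["normalized"] = scores.groupby("territory")["raw_score"].transform(minmax_0_100)

    # Hard threshold (filter): only leads above the cutoff are routed to sales.
    routed = scores[scores["normalized"] >= 60]

    # Soft threshold (prioritization): all leads are routed, ordered by score.
    prioritized = scores.sort_values("normalized", ascending=False)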

 

Insights & Analytics

  1. Which insights will be important for marketing strategy purposes? (examples: who is the buyer of product A, which markets should I enter)
  2. Which insights will be important for funnel optimization? (examples: a single source provides low-quality leads, ‘A’ leads are getting stuck in early stages)
  3. Which insights will be important for sales adoption? (examples: multiple scoring ‘facets’ such as person vs company scores, signal importance)
  4. Are there specific reporting requirements for model performance?