Manual Review

Understand manual review—when human analysts must evaluate business verification cases that can't be automatically decided.

5 min read

Manual review is the process by which a human analyst evaluates a business verification case that couldn’t be resolved through auto-verification. It applies human judgment to ambiguous data, complex situations, and cases that require investigation.

When Manual Review Is Needed

Auto-Verification Escalations

Cases escalate to manual review when:

  • Data doesn’t match cleanly across sources
  • Risk rules trigger but aren’t definitive
  • Required information is missing or unclear
  • Conflicting signals need interpretation

Mandatory Review Scenarios

Some situations require human judgment by design:

  • High-risk industries or transaction volumes
  • Enhanced due diligence requirements
  • Adverse media or watchlist matches
  • Complex ownership structures
  • Regulatory requirements for human oversight

Common Escalation Triggers

| Trigger | Example |
| --- | --- |
| Name mismatch | “ABC Corp” vs. “ABC Corporation LLC” |
| Address discrepancy | Different addresses across sources |
| Status uncertainty | Recently changed status, unclear records |
| Risk signal | Industry flag, geographic concern |
| Thin data | Micro-business with limited records |
| Ownership complexity | Multiple layers, international entities |
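As an illustration, triggers like these can be expressed as simple rule checks that collect every reason a case should escalate. This is a minimal, hypothetical sketch; the field names and the thin-data threshold are assumptions, not any particular vendor's logic.

```python
# Hypothetical escalation-trigger checks; field names and the
# record-count threshold are illustrative assumptions.
def escalation_reasons(case: dict) -> list[str]:
    """Return the triggers that would route this case to manual review."""
    reasons = []
    # Name mismatch: submitted name differs from the registry record
    if case["submitted_name"].lower() != case["registry_name"].lower():
        reasons.append("name_mismatch")
    # Address discrepancy across sources
    if case["submitted_address"] != case["registry_address"]:
        reasons.append("address_discrepancy")
    # Risk signal, e.g. a flagged industry
    if case.get("high_risk_industry"):
        reasons.append("risk_signal")
    # Thin data: too few independent records found
    if case.get("record_count", 0) < 2:
        reasons.append("thin_data")
    return reasons

case = {
    "submitted_name": "ABC Corp",
    "registry_name": "ABC Corporation LLC",
    "submitted_address": "1 Main St",
    "registry_address": "1 Main St",
    "record_count": 5,
}
print(escalation_reasons(case))  # ['name_mismatch']
```

An empty list would mean no trigger fired; one or more reasons would route the case to an analyst queue, with the list itself explaining why.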

The Manual Review Process

Typical Workflow

  1. Queue assignment: Case enters analyst queue
  2. Initial assessment: Analyst reviews available data
  3. Investigation: Additional research as needed
  4. Decision: Approve, decline, or request more information
  5. Documentation: Record reasoning for audit trail
  6. Action: Trigger downstream processes
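The six steps above can be modeled as a small state machine, which makes invalid transitions (such as actioning a case before its reasoning is documented) detectable. A minimal sketch, with invented state names:

```python
# Illustrative state machine for the review workflow; the states and
# allowed transitions are a simplification, not a reference design.
ALLOWED_TRANSITIONS = {
    "queued": {"in_review"},           # 1. queue assignment -> 2. assessment
    "in_review": {"investigating", "decided"},
    "investigating": {"decided", "awaiting_info"},
    "awaiting_info": {"in_review"},    # applicant responded, resume review
    "decided": {"documented"},         # 5. record reasoning for audit trail
    "documented": {"actioned"},        # 6. trigger downstream processes
}

def advance(state: str, next_state: str) -> str:
    """Move a case to the next state, rejecting invalid transitions."""
    if next_state not in ALLOWED_TRANSITIONS.get(state, set()):
        raise ValueError(f"invalid transition {state} -> {next_state}")
    return next_state

state = "queued"
for step in ("in_review", "investigating", "decided", "documented", "actioned"):
    state = advance(state, step)
print(state)  # actioned
```

Encoding the workflow this way also guarantees the audit requirement structurally: a case cannot reach "actioned" without passing through "documented".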

What Analysts Review

Submitted information:

  • Application data provided by the business
  • Supporting documents (if collected)
  • Applicant communication history

Retrieved data:

  • Records pulled from registries and other external sources
  • Watchlist and adverse media screening results
  • Freshness and coverage of the retrieved records

Risk indicators:

  • Why the case escalated
  • Red flags identified
  • Comparison to similar cases

Decision Options

| Decision | Outcome |
| --- | --- |
| Approve | Business passes verification |
| Approve with conditions | Additional monitoring, limits |
| Request information | Need documents or clarification |
| Decline | Business fails verification |
| Escalate further | Needs senior review or legal input |
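Each decision typically triggers a different downstream process. A hedged sketch, where the decision names follow the table above but the action names are invented for illustration:

```python
# Hypothetical mapping from review decision to downstream action;
# the action names are invented, not any real system's API.
from enum import Enum

class Decision(Enum):
    APPROVE = "approve"
    APPROVE_WITH_CONDITIONS = "approve_with_conditions"
    REQUEST_INFO = "request_info"
    DECLINE = "decline"
    ESCALATE = "escalate"

def downstream_action(decision: Decision) -> str:
    """Return the process to trigger once a decision is documented."""
    return {
        Decision.APPROVE: "activate_account",
        Decision.APPROVE_WITH_CONDITIONS: "activate_with_limits",
        Decision.REQUEST_INFO: "notify_applicant",
        Decision.DECLINE: "close_application",
        Decision.ESCALATE: "route_to_senior_queue",
    }[decision]

print(downstream_action(Decision.REQUEST_INFO))  # notify_applicant
```

Using an enum plus an exhaustive mapping means a new decision type fails loudly until a downstream action is defined for it.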

Challenges in Manual Review

Consistency

Human reviewers may decide similar cases differently:

  • Subjective interpretation of ambiguous data
  • Varying risk tolerance among analysts
  • Inconsistent application of guidelines
  • Decision fatigue affecting quality

Speed vs. Thoroughness

Tension between:

  • Completing reviews quickly (customer experience)
  • Investigating thoroughly (risk management)
  • Documentation requirements (compliance)

Information Gaps

Analysts often work with incomplete information:

  • Sources don’t cover all businesses
  • Data may be stale or conflicting
  • Cannot independently verify some claims
  • Must make judgments under uncertainty

Building Effective Manual Review

Clear Guidelines

Effective review processes include:

  • Decision criteria for common scenarios
  • Examples of approve/decline cases
  • Escalation triggers and paths
  • Documentation requirements

Quality Control

Maintaining consistency through:

  • Sampling and audit of decisions
  • Calibration sessions across team
  • Feedback loops on outcomes
  • Performance metrics beyond speed

Tooling

Analysts need:

  • Consolidated view of all data sources
  • Easy access to additional research tools
  • Clear workflow and queue management
  • Documentation templates and audit trails

Training

Ongoing development on:

  • New fraud patterns and risk indicators
  • Regulatory changes and requirements
  • Industry-specific considerations
  • Using available tools effectively

Manual Review Metrics

Efficiency Metrics

| Metric | What It Measures |
| --- | --- |
| Review time | Average time per case |
| Throughput | Cases completed per analyst |
| Queue depth | Backlog waiting for review |
| First-touch resolution | % resolved without re-queuing |

Quality Metrics

| Metric | What It Measures |
| --- | --- |
| Approval rate | % of manual reviews approved |
| Reversal rate | % of decisions later changed |
| False positive rate | Good businesses declined |
| False negative rate | Bad actors approved |
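Both metric families can be computed from a log of decided cases. A toy example with invented record fields (four cases, purely for illustration):

```python
# Toy case log; the fields and values are invented for illustration.
cases = [
    {"minutes": 12, "approved": True,  "reversed": False},
    {"minutes": 30, "approved": False, "reversed": True},
    {"minutes": 18, "approved": True,  "reversed": False},
    {"minutes": 45, "approved": False, "reversed": False},
]

# Efficiency: average review time per case
avg_review_time = sum(c["minutes"] for c in cases) / len(cases)

# Quality: share approved, and share of decisions later changed
approval_rate = sum(c["approved"] for c in cases) / len(cases)
reversal_rate = sum(c["reversed"] for c in cases) / len(cases)

print(avg_review_time, approval_rate, reversal_rate)  # 26.25 0.5 0.25
```

True false-positive and false-negative rates need ground truth that arrives later (chargebacks, enforcement actions, successful appeals), so in practice they are computed over a lagged window rather than from the decision log alone.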

Balancing Act

Optimizing one metric often hurts others:

  • Faster review → potentially lower quality
  • Higher approval rate → potentially more risk
  • More thorough → longer queue times

Manual Review in Compliance

Documentation Requirements

Regulators expect:

  • Clear reasoning for decisions
  • Evidence of investigation performed
  • Consistent application of policy
  • Retrievable audit trail

Escalation Governance

Effective programs define:

  • When to escalate to senior review
  • When legal or compliance must be involved
  • How to handle edge cases
  • Appeals and reconsideration process

The Manual Review Funnel

In a well-tuned KYB program:

100% of applications
   ↓ auto-verification
60-80% auto-approved or auto-declined
20-40% → manual review, most resolved by the analyst
 5-10% → senior or escalated review
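As a worked example of these bands, take 10,000 applications and the midpoint of each quoted range (the exact percentages vary by program):

```python
# Funnel arithmetic using the midpoints of the quoted bands;
# the 10,000-application volume is an arbitrary example.
applications = 10_000
auto_resolved = int(applications * 0.70)   # 60-80% band, midpoint 70%
manual = applications - auto_resolved      # remainder enters manual review
senior = int(applications * 0.075)         # 5-10% band, midpoint 7.5%
analyst_resolved = manual - senior         # resolved without escalation

print(auto_resolved, manual, senior, analyst_resolved)  # 7000 3000 750 2250
```

So at the midpoints, analysts handle 3,000 cases and only a quarter of those need senior attention; shifting even a few points of volume into auto-verification removes hundreds of cases from the queue.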

The goal is minimizing manual review volume while catching all cases that truly need human judgment.

Key Takeaways

  • Manual review applies human judgment to cases auto-verification can’t resolve
  • Cases escalate for data mismatches, risk signals, or mandatory review requirements
  • Consistency is a challenge—clear guidelines and quality control help
  • Speed and thoroughness are in tension—balance based on risk tier
  • Documentation matters for compliance—decisions must be auditable
  • The goal is reducing manual review through better auto-verification while preserving quality

Related: Auto-Verification | Enhanced Due Diligence | Entity Verification