Expert Judgment

Expert judgment is the use of qualified individuals or groups to provide estimates based on their knowledge and experience. It complements data-driven methods by adding context, identifying risks, and shaping ranges and confidence levels.

Key Points

  • Uses the knowledge and experience of subject matter experts to inform estimates of effort, duration, cost, and risk.
  • Most valuable when data is limited, uncertainty is high, or work is novel.
  • Bias is reduced by engaging multiple experts, using structured methods (for example, Delphi), and asking for ranges.
  • Always document assumptions, drivers, rationale, and confidence levels.
  • Combine with historical data and models; do not rely on expert judgment alone when evidence is available.
  • Facilitation, clear inputs, and traceable outputs improve estimate quality and credibility.

When to Use

  • Early in the project when scope is emerging and detailed data is scarce.
  • When the work is unique, complex, or involves new technology or vendors.
  • To validate or calibrate analogous, parametric, or bottom-up estimates.
  • When time is constrained and a rapid, defensible estimate is needed.
  • To assess uncertainty, identify major cost/schedule drivers, and determine contingency.

How to Estimate

  • Identify required expertise (domain, delivery, technology, compliance) and select diverse, credible SMEs.
  • Prepare clear inputs: scope baseline or backlog items, WBS/work package descriptions, constraints, and definition of done.
  • Select a structured approach: interviews, facilitated workshops, Delphi, planning poker, or three-point estimating.
  • Ask experts for ranges (optimistic, most likely, pessimistic) and underlying assumptions, risks, and drivers; a three-point rollup sketch follows this list.
  • Triangulate with available data (historical records, benchmarks) and reconcile differences transparently.
  • Aggregate results, quantify uncertainty, and record the basis of estimate, including sources and confidence.
  • Review and refine estimates as more information emerges or risks change.
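
Three-point inputs are commonly combined with the PERT weighting: expected value E = (O + 4M + P) / 6 and standard deviation roughly (P - O) / 6. The sketch below is a minimal illustration with hypothetical SME names and numbers, not a prescribed tool: it converts each expert's range for one work package into a PERT estimate and averages across experts.

    # Minimal sketch: PERT estimate and spread per expert, averaged across experts.
    # All SME names and numbers are hypothetical.

    def pert_estimate(optimistic, most_likely, pessimistic):
        """Return (expected, std_dev) using the PERT (beta) weighting."""
        expected = (optimistic + 4 * most_likely + pessimistic) / 6
        std_dev = (pessimistic - optimistic) / 6
        return expected, std_dev

    # Three SMEs give (optimistic, most likely, pessimistic) effort in person-days
    # for the same work package.
    expert_inputs = [
        (8, 12, 20),   # SME A
        (10, 14, 25),  # SME B
        (9, 13, 22),   # SME C
    ]

    estimates = [pert_estimate(o, m, p) for o, m, p in expert_inputs]
    mean_expected = sum(e for e, _ in estimates) / len(estimates)
    mean_spread = sum(s for _, s in estimates) / len(estimates)

    print(f"Expected effort: {mean_expected:.1f} person-days (spread ~{mean_spread:.1f})")

Keeping each expert's range visible before averaging also makes divergent views easy to spot and discuss in a follow-up Delphi round.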

Inputs Needed

  • WBS or decomposed backlog items with acceptance criteria and definitions of done.
  • Scope descriptions, constraints, dependencies, and key milestones.
  • Historical data, benchmarks, or reference classes if available.
  • Known risks, assumptions, and uncertainties affecting the estimate.
  • Resource calendars, team capacity, skills, and productivity factors.
  • Estimation templates, checklists, and guidelines for consistency.

Outputs Produced

  • Effort, duration, and/or cost estimates, often as ranges with distributions.
  • Confidence levels and recommended contingency or buffers (see the simulation sketch after this list).
  • Documented assumptions, basis of estimate, and identified drivers.
  • List of key risks, constraints, and open questions affecting estimates.
  • Updates to planning artifacts (schedule, budget, backlog sizing) and estimation logs.
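
To illustrate "ranges with distributions" and contingency sizing, the sketch below (hypothetical numbers, Python standard library only) treats each work package's three-point input as a triangular distribution, simulates the project total many times, and reads off the P50 and P80 values; the gap between them is one common way to size a schedule buffer. Treat it as a simplified illustration of the idea rather than a mandated method.

    # Simplified sketch: Monte Carlo rollup of expert ranges into P50/P80 totals.
    # Work package names and numbers are hypothetical.
    import random

    # (optimistic, most_likely, pessimistic) duration ranges in weeks.
    work_packages = {
        "analysis":    (1.0, 1.5, 3.0),
        "development": (3.0, 4.0, 7.0),
        "testing":     (1.5, 2.0, 4.0),
    }

    def simulate_totals(packages, runs=10_000, seed=42):
        """Sample each package from a triangular distribution and sum per run."""
        rng = random.Random(seed)
        totals = []
        for _ in range(runs):
            totals.append(sum(rng.triangular(lo, hi, mode)
                              for lo, mode, hi in packages.values()))
        return sorted(totals)

    totals = simulate_totals(work_packages)
    p50 = totals[len(totals) // 2]
    p80 = totals[int(len(totals) * 0.80)]
    print(f"P50 total: {p50:.1f} weeks, P80 total: {p80:.1f} weeks")
    print(f"Suggested schedule contingency (P80 - P50): {p80 - p50:.1f} weeks")

A P50 commitment with a P50-to-P80 buffer makes the confidence behind the schedule explicit and easy to revisit as estimates are refined.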

Assumptions

  • Experts have relevant, recent experience comparable to the current work.
  • Inputs provided to experts are accurate, current, and unambiguous.
  • Experts can allocate sufficient time and will disclose assumptions and uncertainties.
  • Facilitation minimizes bias and avoids undue influence or anchoring.
  • The team will revisit estimates as new data becomes available.

Example

  • A project must integrate with a new third-party platform with limited precedent. The PM convenes architecture, security, QA, and vendor integration SMEs.
  • Using a short Delphi round, the group provides three-point duration estimates for analysis, development, and testing, with assumptions and risks noted.
  • After reconciling differences and comparing against one similar past effort, the team agrees on a 6-9 week range with 70% confidence and identifies scope drivers (a rollup sketch follows this example).
  • The PM records the basis of estimate, adds schedule contingency, and plans risk responses for integration and certification delays.
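
As a hedged illustration of how such a range and confidence level can be derived (the figures below are hypothetical, not the actual numbers from this example): phase means add directly, phase variances (not standard deviations) are summed before taking the square root, and a band of roughly one standard deviation either side of the total corresponds to about 68% confidence under a normal approximation, close to the 70% quoted above.

    # Illustrative rollup of hypothetical Delphi outputs in weeks; these are
    # not the actual figures from the example above.
    from math import sqrt

    # (optimistic, most likely, pessimistic) per phase after the Delphi round.
    phases = {
        "analysis":    (1.0, 1.5, 2.5),
        "development": (3.0, 4.0, 6.0),
        "testing":     (1.5, 2.0, 3.5),
    }

    total_mean = sum((o + 4 * m + p) / 6 for o, m, p in phases.values())
    total_sigma = sqrt(sum(((p - o) / 6) ** 2 for o, m, p in phases.values()))

    # Roughly a 68% band under a normal approximation; widen it for higher confidence.
    print(f"Expected total: {total_mean:.1f} weeks")
    print(f"~68% range: {total_mean - total_sigma:.1f} to {total_mean + total_sigma:.1f} weeks")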

Pitfalls

  • Relying on a single expert or unverified opinion without triangulation.
  • Failing to document assumptions, drivers, and sources, reducing traceability.
  • Overconfidence and narrow ranges that ignore uncertainty and risk.
  • Anchoring on the first number, groupthink, or allowing seniority to dominate.
  • Ignoring historical data or benchmarks when they exist.
  • Confusing effort with duration or overlooking resource constraints and dependencies.

PMP Example Question

A project lacks reliable historical data, but the sponsor needs early duration estimates. To reduce bias and increase credibility, what should the project manager do?

  A. Ask the most senior developer for a single-point estimate for each activity.
  B. Run a Delphi session with cross-functional SMEs to produce three-point estimates and document assumptions.
  C. Perform detailed bottom-up estimating for all tasks without SME involvement.
  D. Use earned value analysis to forecast durations from current CPI and SPI.

Correct Answer: B — Run a Delphi session with cross-functional SMEs to produce three-point estimates and document assumptions.

Explanation: Structured expert judgment with multiple experts and ranges reduces bias and increases reliability. The other options either rely on a single opinion, exclude SMEs, or use tools unsuitable for early estimating.
