Will AI Replace Digital Pathology Scientist Jobs?

Mid-level (3-7 years, NHS Band 7 or equivalent) · Laboratory · Live Tracked: this assessment is actively monitored and updated as AI capabilities change.
GREEN (Transforming)
48.9/100

Score at a Glance

Overall: 48.9/100 -- PROTECTED
Task Resistance (how resistant daily tasks are to AI automation; 5.0 = fully human, 1.0 = fully automatable): 3.45/5
Evidence (real-world market signals: job postings, wages, company actions, expert consensus; range -10 to +10): +3/10
Barriers to AI (structural barriers preventing AI replacement: licensing, physical presence, unions, liability, culture): 4/10
Protective Principles (human-only factors: physical presence, deep interpersonal connection, moral judgment): 3/9
AI Growth (does AI adoption create more demand for this role? 2 = strong boost, 0 = neutral, negative = shrinking): +1/2

Score Composition (48.9/100): Task Resistance (50%), Evidence (20%), Barriers (15%), Protective (10%), AI Growth (5%)

Where This Role Sits (0 = At Risk, 100 = Protected)
Digital Pathology Scientist (Mid-Level): 48.9

This role is protected from AI displacement. The assessment below explains why — and what's still changing.

The Digital Pathology Scientist builds and validates the AI infrastructure that pathologists use -- whole slide imaging workflows, algorithm validation, LIS integration, and quality assurance for digital diagnostics. AI accelerates sub-tasks (image QC, data pipeline automation, report drafting) but cannot own the validation judgments, regulatory compliance decisions, or cross-disciplinary translation that define the role. The NHS pathology digitisation programme creates structural demand. Safe for 5+ years with active transformation.

Role Definition

Job Title: Digital Pathology Scientist
Seniority Level: Mid-level (3-7 years, NHS Band 7 or equivalent)
Primary Function: Manages digital pathology workflows in NHS hospital and reference laboratories. Operates and optimises whole slide imaging (WSI) scanners (Philips IntelliSite, Leica Aperio, Hamamatsu NanoZoomer), validates AI diagnostic algorithms (Paige AI, Ibex Medical Analytics) for clinical deployment, integrates digital pathology platforms with laboratory information systems (LIS/LIMS), and maintains quality assurance for digital diagnostics under ISO 15189 / UKAS accreditation. Works at the intersection of histopathology, computational science, and clinical IT -- translating AI research outputs into validated, regulation-compliant clinical tools. Part of NHS England's National Pathology Imaging Co-operative (NPIC) and wider pathology digitisation programme.
What This Role Is NOT: Not a Physician Pathologist (MD who signs out diagnoses -- 58.0 Green Stable). Not a Histotechnologist (tissue preparation and sectioning -- 36.4 Yellow Urgent). Not a Clinical Bioinformatician (genomic pipeline validation, variant classification -- 52.9 Green Transforming). Not a Biomedical Scientist (bench-level testing under IBMS/HCPC -- scored separately). Not a Research Bioinformatician (no clinical accountability or regulatory framework). Not a Medical Physicist (radiation/imaging physics, different modality).
Typical Experience: 3-7 years. BSc/MSc in biomedical science, computer science, medical physics, or related field; PhD in AI/digital pathology desirable. HCPC registration or IBMS membership typical but not always mandatory depending on post structure. Proficiency in Python/R, image analysis (QuPath, ASAP), WSI platforms, and LIS integration standards (DICOM, HL7). NHS Band 7: GBP 43,742-50,056 (2025/26 Agenda for Change). Private sector/industry: GBP 45,000-70,000+.

Seniority note: A junior Digital Pathology Scientist (Band 5-6, 0-2 years) performing routine scanning, basic QC, and documentation would score lower Yellow (~38-42) -- more operational, less validation judgment. A senior/lead Digital Pathology Scientist (Band 8a+, 8+ years) owning programme-level digitisation strategy, AI governance frameworks, and regulatory compliance sign-off would score higher Green (~54-58) due to strategic authority and accountability.


Protective Principles + AI Growth Correlation

Human-Only Factors
Embodied Physicality: minimal physical presence
Deep Interpersonal Connection: some human interaction
Moral Judgment: some ethical decisions
AI Effect on Demand: AI slightly boosts jobs
Protective Total: 3/9
Embodied Physicality (1/3): Some physical laboratory presence required -- operating WSI scanners, handling slide cassettes, troubleshooting hardware, performing scanner calibration. But the majority of work is computational: image analysis, algorithm validation, data pipeline management. Structured laboratory environment.
Deep Interpersonal Connection (1/3): Cross-disciplinary collaboration with pathologists, IT teams, laboratory managers, and AI vendors. Must translate complex technical AI validation results into clinically meaningful language for pathologists who may be sceptical of digital workflows. Functional professional relationships, not therapeutic. Trust-building matters for adoption but is not the core deliverable.
Goal-Setting & Moral Judgment (1/3): Exercises judgment on whether AI algorithms meet clinical validation thresholds -- sensitivity, specificity, edge case performance. Designs validation protocols and QA frameworks. But works within established regulatory standards (ISO 15189, MHRA/UKCA) and under the clinical authority of consultant pathologists who make final diagnostic decisions. Does not set clinical direction or bear diagnostic liability.
Protective Total: 3/9
AI Growth Correlation (+1): Weak positive. NHS pathology digitisation programme (NPIC, GBP 50m+ investment) creates structural demand for digital pathology scientists. Every new AI diagnostic tool entering clinical use requires human validation before deployment. As AI proliferates in pathology (Paige AI prostate, breast; Ibex gastric; Lunit colon), each tool creates new validation and integration work. But AI also automates portions of the scientist's own workflow (automated QC, pipeline orchestration, report generation). Net: weak positive -- demand grows while routine tasks compress.

Quick screen result: Protective 3/9 + Correlation +1 = likely at the Yellow-Green boundary. The AI growth correlation may push the score above Yellow. Proceed to quantify.


Task Decomposition (Agentic AI Scoring)

Work Impact Breakdown: Displaced 5% · Augmented 80% · Not Involved 15%
AI algorithm validation & performance evaluation (25% time, score 2/5, weighted 0.50, AUG): Designing and executing validation studies for AI diagnostic tools -- comparing AI outputs against pathologist ground truth, calculating sensitivity/specificity/AUC, identifying failure modes and edge cases (rare tissue types, poor-quality slides, artefacts). AI auto-generates performance metrics but the scientist designs validation protocols, interprets clinical significance of performance gaps, and makes the recommendation on clinical readiness. Human-led, AI-assisted.
WSI scanner operation, optimisation & image QA (20% time, score 3/5, weighted 0.60, AUG): Operating Philips IntelliSite, Leica Aperio, Hamamatsu scanners -- calibration, parameter optimisation for tissue types, focus quality assessment, colour normalisation. Automated QC tools (HistoQC, PathProfiler) handle routine image quality screening. Human troubleshoots scanner hardware failures, optimises scanning protocols for difficult specimens, and validates QC thresholds. Physical scanner interaction required but routine QC increasingly automated.
LIS/LIMS integration & digital workflow design (15% time, score 2/5, weighted 0.30, AUG): Integrating WSI platforms with LIS (Sunquest CoPathPlus, CliniSys WinPath, Sectra PACS) via DICOM and HL7 standards. Designing end-to-end digital workflows from specimen accession through scanning, AI analysis, and pathologist review. AI assists with interface mapping and data flow documentation. Human leads workflow design decisions, troubleshoots integration failures, and navigates vendor-specific complexities.
Quality management, accreditation & regulatory compliance (15% time, score 2/5, weighted 0.30, AUG): Maintaining ISO 15189 / UKAS accreditation for digital pathology workflows. Writing and reviewing SOPs for scanning, AI deployment, and digital reporting. Preparing for and supporting regulatory audits. Ensuring AI tools meet MHRA/UKCA medical device regulations. AI assists with documentation generation and compliance tracking. Human owns the quality system design and regulatory interpretation.
Data management, annotation & AI training support (10% time, score 3/5, weighted 0.30, AUG): Curating digital slide datasets for AI training and validation -- managing petabyte-scale image storage, coordinating pathologist annotations, ensuring data governance (patient consent, anonymisation). AI handles automated annotation suggestions and data pipeline orchestration. Human manages data quality, resolves annotation disagreements, and ensures regulatory compliance for data use. Some sub-tasks (metadata tagging, storage management) are automatable.
Cross-disciplinary collaboration & training (10% time, score 1/5, weighted 0.10, NOT): Training pathologists and laboratory staff on digital workflows, AI tool usage, and scanner operation. Presenting validation results to clinical governance committees. Bridging the gap between technical AI capabilities and clinical adoption. Human-to-human knowledge transfer and change management that AI cannot replicate.
Documentation, reporting & administration (5% time, score 4/5, weighted 0.20, DISP): Validation reports, SOPs, meeting minutes, project documentation, procurement specifications. AI generates drafts, automates formatting, and manages version control. Human reviews and signs off but drafting is largely automatable.
Total: 100% time, weighted 2.30

Task Resistance Score: 6.00 - 2.30 = 3.70/5.0

Assessor adjustment to 3.45/5.0: Downward adjustment (-0.25). The raw 3.70 overstates protection because the role's core computational work (algorithm validation, image QA, data management) is more exposed to AI tooling acceleration than comparably scored roles. Unlike the Consultant Clinical Scientist (4.00) who bears personal diagnostic liability, or the Clinical Bioinformatician (3.60) who classifies patient-level variants under ACMG/AMP, this role's validation work operates at the system level -- testing tools rather than making patient diagnoses. AI validation frameworks (automated benchmarking suites, continuous monitoring dashboards) are maturing and will compress the human effort per validation cycle. Adjusted to 3.45 to sit appropriately between the Histotechnologist (3.05) and Clinical Bioinformatician (3.60).
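The task-resistance arithmetic above can be reproduced in a few lines. This is a sketch of the scoring scheme implied by the table (time-weighted mean of per-task automatability scores, inverted onto the 1-5 resistance scale, then the assessor's -0.25 adjustment); the variable names are illustrative, not taken from any published methodology.

```python
# Per-task (time_share, automatability_score 1-5) from the decomposition table.
tasks = {
    "AI algorithm validation & performance evaluation": (0.25, 2),
    "WSI scanner operation, optimisation & image QA": (0.20, 3),
    "LIS/LIMS integration & digital workflow design": (0.15, 2),
    "Quality management, accreditation & compliance": (0.15, 2),
    "Data management, annotation & AI training support": (0.10, 3),
    "Cross-disciplinary collaboration & training": (0.10, 1),
    "Documentation, reporting & administration": (0.05, 4),
}

# Time-weighted automatability, then inversion onto the resistance scale.
weighted = sum(share * score for share, score in tasks.values())  # 2.30
resistance_raw = 6.00 - weighted                                  # 3.70
resistance_adj = resistance_raw - 0.25                            # assessor adjustment

print(f"weighted automatability: {weighted:.2f}")
print(f"raw resistance: {resistance_raw:.2f}, adjusted: {resistance_adj:.2f}")
```

Running this reproduces the 2.30 weighted total, the raw 3.70, and the adjusted 3.45 used in the composite below.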

Displacement/Augmentation split: 5% displacement, 80% augmentation, 15% not involved.

Reinstatement check (Acemoglu): Strong new task creation. AI proliferation in pathology creates a validation treadmill: every new AI tool, every algorithm update, every new tissue type or stain protocol requires human-led validation before clinical deployment. Emerging tasks include: AI continuous performance monitoring in production, AI governance framework design for pathology departments, federated learning coordination across NHS trusts, synthetic data generation oversight, and regulatory submission support for AI-as-medical-device (MHRA SaMD pathway). The role expands from "manage digital pathology systems" to "govern the AI diagnostic ecosystem within pathology."


Evidence Score

Market Signal Balance: +3/10
Job Posting Trends: +1
Company Actions: +1
Wage Trends: 0
AI Tool Maturity: 0
Expert Consensus: +1
Job Posting Trends (+1): NHS Jobs: active Digital Pathology AI Scientist postings at Leeds Teaching Hospitals (NPIC programme, Band 7, GBP 43,742-50,056). Glassdoor: 19 digital pathology AI jobs in the UK; 31 specialist digital pathology roles. Indeed UK: pathology AI postings including remote AI Scientist roles at Source BioScience. Growing but from a small base -- this is an emerging specialism, not an established mass-market occupation.
Company Actions (+1): NHS England investing GBP 50m+ in NPIC (National Pathology Imaging Co-operative) to digitise cellular pathology across England. Leeds Teaching Hospitals, Manchester University NHS FT, and other trusts actively hiring digital pathology scientists. Philips, Leica, Sectra deploying WSI infrastructure across NHS. Paige AI gaining UKCA marking for clinical use. No trusts cutting digital pathology roles -- the programme is expanding.
Wage Trends (0): NHS Band 7: GBP 43,742-50,056 (2025/26). Band 8a: GBP 50,952-57,349. Private sector: GBP 45,000-70,000+. Tracking Agenda for Change awards (3.3% in 2026/27). No significant premium surge beyond standard NHS pay progression. Competitive but not commanding outsized premiums relative to comparable Band 7 scientific roles.
AI Tool Maturity (0): AI tools augment the scientist's workflow: HistoQC for automated image quality, automated scanner calibration, AI-assisted annotation (Indica Labs HALO AI), pipeline orchestration tools. But no tool replaces the human validation judgment -- AI cannot validate itself for clinical use. Tools handle 30-40% of routine sub-tasks (QC, data pipeline management) with human oversight. Production-grade for augmentation; no autonomous validation capability. Neutral.
Expert Consensus (+1): RCPath and NPIC position digital pathology scientists as essential to the digitisation programme. NHS Long Term Workforce Plan identifies healthcare science as a growth area. No expert body predicts displacement of digital pathology validation roles -- consensus is that AI proliferation increases demand for human validation scientists. The role did not exist at scale five years ago; it is being created by digital transformation.
Total: +3

Barrier Assessment

Structural Barriers to AI: Moderate (4/10)
Regulatory: 1/2 · Physical: 1/2 · Union Power: 1/2 · Liability: 1/2 · Cultural: 0/2

Reframed question: What prevents AI execution even when programmatically possible?

Regulatory/Licensing (1/2): ISO 15189 / UKAS accreditation mandates qualified scientific oversight of laboratory workflows including digital pathology. MHRA classifies AI diagnostic tools as medical devices requiring human-validated clinical evidence. No formal individual licensure specifically for digital pathology scientists (unlike HCPC-registered Clinical Scientists), but the laboratory regulatory framework requires documented human accountability for validation and quality systems. HCPC or IBMS registration typical but not always mandatory for these posts.
Physical Presence (1/2): Must be present in the laboratory for scanner operation, hardware troubleshooting, slide handling, and scanner calibration. Some work (data analysis, report writing, algorithm evaluation) is remote-capable. Hybrid role -- not fully remote, not fully physical.
Union/Collective Bargaining (1/2): NHS Agenda for Change provides structural employment protection. Unite and Unison represent healthcare scientists. Change management processes and redeployment policies (as noted in NPIC job postings) provide modest institutional protection.
Liability/Accountability (1/2): AI validation errors can lead to unsafe diagnostic tools entering clinical use -- missed cancers, false positive alerts disrupting workflows, patient harm at scale. The digital pathology scientist bears professional accountability for validation conclusions, though ultimate clinical liability sits with the consultant pathologist. Institutional liability for deploying inadequately validated AI tools creates organisational demand for human validation expertise.
Cultural/Ethical (0/2): Digital pathology work is behind the scenes. No significant public-facing cultural resistance to AI in laboratory workflows -- the role enables AI adoption rather than competing with it. Pathologists may resist digital workflows culturally, but this creates demand for the scientist who manages the transition, not protection of the role itself.
Total: 4/10

AI Growth Correlation Check

Confirmed at +1 (Weak Positive). The NHS pathology digitisation programme creates policy-driven demand for digital pathology scientists. Every AI algorithm deployed in clinical pathology requires human validation -- Paige AI prostate, Ibex gastric cancer, Lunit colon screening, and emerging tools for breast, lung, and dermatopathology each generate validation work. As AI proliferates, the validation workload grows. But AI tools also automate portions of the scientist's own workflow (automated QC, pipeline orchestration, performance monitoring dashboards), partially offsetting headcount growth. Net: weak positive. Not +2 because the role is not fundamentally about AI security or governance -- it is about enabling AI adoption in a clinical setting, which is a growing but bounded demand.


JobZone Composite Score (AIJRI)

Score Waterfall: 48.9/100
Task Resistance: +34.5 pts
Evidence: +6.0 pts
Barriers: +6.0 pts
Protective: +3.3 pts
AI Growth: +2.5 pts
Total: 48.9
Task Resistance Score: 3.45/5.0
Evidence Modifier: 1.0 + (3 x 0.04) = 1.12
Barrier Modifier: 1.0 + (4 x 0.02) = 1.08
Growth Modifier: 1.0 + (1 x 0.05) = 1.05

Raw: 3.45 x 1.12 x 1.08 x 1.05 = 4.3818

JobZone Score: (4.3818 - 0.54) / 7.93 x 100 = 48.4/100
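The composite arithmetic above can be checked directly. The sketch below implements the formula exactly as printed, taking the normalisation constants (0.54 and 7.93) as given from the document rather than from any independent source:

```python
# Inputs from the table above.
task_resistance = 3.45
evidence_mod = 1.0 + 3 * 0.04  # 1.12
barrier_mod  = 1.0 + 4 * 0.02  # 1.08
growth_mod   = 1.0 + 1 * 0.05  # 1.05

# Multiply modifiers onto the resistance score, then normalise to 0-100.
raw = task_resistance * evidence_mod * barrier_mod * growth_mod  # ~4.3818
score = (raw - 0.54) / 7.93 * 100                                # ~48.4

print(f"raw: {raw:.4f}, JobZone score: {score:.1f}/100")
```

This reproduces the raw 4.3818 and the pre-adjustment 48.4; the published 48.9 then reflects the assessor's +0.5 uplift described below.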

Assessor adjustment to 48.9/100: Minor upward adjustment (+0.5). The raw 48.4 sits 0.4 points above the Green boundary, which is uncomfortably narrow and could imply borderline status. The role's structural demand from the NHS digitisation programme, the AI validation treadmill (each new tool requires human validation), and the emerging nature of the specialism (roles being created, not displaced) justify a modest uplift. Adjusted to 48.9 -- still close to the boundary but reflecting genuine Green-zone positioning rather than statistical noise. The role is demonstrably more protected than the Histotechnologist (36.4) and sits appropriately below the Clinical Bioinformatician (52.9) and Consultant Clinical Scientist (55.3).

Zone: GREEN (Green >=48, Yellow 25-47, Red <25)

Sub-Label Determination

% of task time scoring 3+: 35%
AI Growth Correlation: +1
Sub-label: Green (Transforming) -- AIJRI >=48 AND >=20% of task time scores 3+

Assessor override: None -- formula score accepted. The 48.9 calibrates well against the laboratory domain. Sits 12.5 points above Histotechnologist (36.4 Yellow) -- justified by the shift from physical slide preparation (automatable) to AI validation and digital workflow governance (protected by judgment and regulatory requirements). Sits 4.0 points below Clinical Bioinformatician (52.9 Green) -- appropriate because the bioinformatician bears patient-level variant classification liability under ACMG/AMP, while the digital pathology scientist validates systems rather than making individual patient diagnoses. Sits 6.4 points below Consultant Clinical Scientist (55.3 Green) -- correct seniority gap reflecting the Consultant's FRCPath, HCPC liability, and service ownership. Sits 9.1 points below Physician Pathologist (58.0 Green) -- appropriate given the pathologist's medical licensing, malpractice liability, and diagnostic authority.
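The zone thresholds and sub-label rule can be expressed as a small classifier. This is an illustrative sketch, not the site's actual code; the "Stable" fallback for Green roles below the 20% threshold is an assumption inferred from the other role labels in this document.

```python
def classify(aijri: float, pct_time_3plus: float) -> str:
    """Map an AIJRI score and share of task time scoring 3+ to a zone label.

    Thresholds as stated in the document: Green >= 48, Yellow 25-47, Red < 25;
    Green roles with >= 20% of task time scoring 3+ are sub-labelled
    'Transforming' (otherwise assumed 'Stable').
    """
    if aijri >= 48:
        sub = "Transforming" if pct_time_3plus >= 0.20 else "Stable"
        return f"Green ({sub})"
    if aijri >= 25:
        return "Yellow"
    return "Red"

print(classify(48.9, 0.35))  # Green (Transforming)
```

Applying it to the comparator roles cited here, 55.3 with low transforming task share would come out Green (Stable) and 36.4 would come out Yellow, matching the labels in the text.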


Assessor Commentary

Score vs Reality Check

The 48.9 Green (Transforming) classification sits 0.9 points above the Green boundary. This is close but honest. The role's protection comes from three reinforcing factors: (1) the AI validation treadmill -- every new pathology AI tool requires human-led validation, creating demand that scales with AI proliferation; (2) NHS policy-driven demand -- the GBP 50m+ NPIC programme and broader pathology digitisation create a structural demand floor; (3) cross-disciplinary translation -- bridging AI/computational science and clinical pathology requires domain expertise in both fields that pure AI engineers and pure pathologists lack. Strip the AI growth correlation to zero and the score drops to ~45.8 Yellow -- confirming that the Green classification depends partly on the AI-driven demand dynamic, which is genuine but not guaranteed to persist indefinitely.

What the Numbers Don't Capture

  • Programme-dependent demand. Much of the current demand is driven by the NPIC programme and NHS England's pathology digitisation initiative. If political priorities shift, funding reduces, or the programme is declared "complete," demand for new digital pathology scientists could plateau. The role has not yet reached self-sustaining organic demand independent of central programme funding.
  • Convergence risk with Clinical Bioinformatics. As pathology becomes more computational (AI diagnostics, multi-omic integration, computational pathology), the boundaries between digital pathology scientist and clinical bioinformatician may blur. Roles may merge into a broader "computational pathology scientist" position, consolidating headcount even as scope expands.
  • Fixed-term contract pattern. The NHS postings scraped (NPIC Band 7, NPIC Band 5) are both fixed-term 12-month contracts -- reflecting programme-funded rather than established positions. This creates employment instability for individuals even as aggregate demand grows. The transition from programme-funded to substantive NHS posts is not yet complete.
  • Small absolute numbers. Glassdoor shows 19 digital pathology AI jobs and 31 specialist digital pathology roles in the entire UK. This is a niche emerging specialism, not a mass-market occupation. Small absolute numbers mean the role is insulated from large-scale displacement but also vulnerable to individual programme decisions.

Who Should Worry (and Who Shouldn't)

If you validate AI diagnostic tools for clinical deployment, design digital pathology workflows, integrate WSI platforms with LIS, and manage quality assurance under ISO 15189 -- you are well-positioned. Your work requires judgment that AI cannot self-apply: deciding whether an AI tool is clinically safe, designing validation protocols for edge cases, and navigating the regulatory pathway from research tool to medical device. Each new AI entrant creates demand for your expertise.

If you primarily operate WSI scanners, perform routine image QC, and manage digital slide archives without involvement in AI validation or workflow design -- your position is more exposed. Automated QC tools (HistoQC, vendor-integrated quality checks) and simplified scanner interfaces are reducing the human effort required for routine digitisation operations. This operational layer will compress as the technology matures.

The single biggest separator: whether you validate and govern AI tools or simply operate the digital infrastructure they run on. The scientist who designs validation studies, interprets performance metrics against clinical thresholds, and makes recommendations on clinical readiness occupies a growing niche. The operator who scans slides and manages storage occupies a shrinking one.


What This Means

The role in 2028: Digital Pathology Scientists will spend less time on routine scanner operation and manual image QC as automated quality systems mature. They will spend more time on: AI continuous performance monitoring (detecting algorithm drift in production), validating next-generation AI tools (multi-modal AI combining WSI with genomics and proteomics), designing federated learning frameworks across NHS trusts, and supporting MHRA regulatory submissions for AI-as-medical-device. The role shifts from "implement digital pathology" to "govern AI diagnostics within pathology." A 2-person digital pathology team with AI tooling delivers what 3-4 people did in 2024.

Survival strategy:

  1. Master AI validation methodology -- learn to design analytical validation studies, calculate clinically meaningful performance metrics, and identify algorithm failure modes; the scientist who can determine whether an AI tool is safe for clinical use occupies the most protected niche in digital pathology
  2. Build regulatory expertise -- understand MHRA SaMD classification, UKCA marking, ISO 15189 requirements for AI-assisted diagnostics, and the emerging AI regulatory landscape (EU AI Act, NHS AI governance frameworks); the intersection of AI validation and regulatory compliance is where durable demand concentrates
  3. Develop computational pathology skills -- move beyond scanner operation into Python/R-based image analysis, deep learning fundamentals, and multi-omic data integration; the scientist who can evaluate AI at a technical level (not just run vendor validation kits) commands a premium

Where to look next. If you are considering a career shift, these Green Zone roles share transferable skills:

  • Clinical Bioinformatician (AIJRI 52.9) -- computational biology, clinical pipeline validation, and regulatory compliance expertise transfer directly into genomics
  • Healthcare Data Interoperability Architect (AIJRI 49.8) -- LIS integration, healthcare IT standards, and cross-system workflow design skills apply to broader health data exchange
  • Consultant Clinical Scientist (AIJRI 55.3) -- with HSST and FRCPath, digital pathology expertise positions you for senior clinical scientific leadership

Browse all scored roles at jobzonerisk.com to find the right fit for your skills and interests.

Timeline: 5-7+ years. NHS pathology digitisation programme creates a structural demand floor through at least 2030. AI validation treadmill sustains demand as new tools enter the pipeline. The transition from programme-funded to substantive posts will determine long-term stability. Routine scanner operation and QC roles face 3-5 year compression; AI validation and governance specialists are structurally protected by the regulatory requirement for human oversight of clinical AI.


Other Protected Roles

Clinical Bioinformatician (Mid-Level)

GREEN (Transforming) 52.9/100

Clinical bioinformaticians occupy a more protected position than their research counterparts due to patient-level accountability, regulatory frameworks (CLIA/CAP), and the clinical judgment required for ACMG/AMP variant interpretation. AI augments 70% of task time but cannot bear liability for diagnostic decisions. Safe for 5+ years with ongoing transformation.

Also known as clinical bioinformatics scientist

Healthcare Data Interoperability Architect (Senior)

GREEN (Transforming) 49.8/100

Senior-level role designing enterprise health data exchange architectures, implementing interoperability standards (HL7 FHIR, openEHR, SNOMED), and owning regulatory compliance strategy (TEFCA, 21st Century Cures, NHS interoperability). Strategic architectural judgment, regulatory accountability, and cross-organisational governance resist automation even as AI accelerates standard mapping and interface generation.

Consultant Clinical Scientist (Senior)

GREEN (Stable) 55.3/100

The Consultant Clinical Scientist's core work -- clinical leadership, diagnostic sign-off, service governance, and workforce development -- is structurally protected by HCPC registration, FRCPath fellowship, and personal professional liability for diagnostic conclusions. AI augments data analysis and routine reporting but cannot bear the regulatory accountability or provide the clinical judgment that defines this senior role. Safe for 5+ years.

Forensic Pathologist (Mid-to-Senior)

GREEN (Transforming) 81.7/100

Among the most AI-resistant physician specialties — hands-on autopsy, courtroom testimony, and manner-of-death determination are irreducibly human. AI tools remain research-stage only. Safe for 20+ years; documentation workflow transforming.
