Role Definition
| Field | Value |
|---|---|
| Job Title | Systems Engineer — Radar |
| Seniority Level | Mid-Level (independently managing radar system requirements, integration, and V&V; 3-7 years' experience) |
| Primary Function | Applies systems engineering discipline to radar programmes at defence contractors. Core work includes requirements elicitation and decomposition using DOORS/JAMA, MBSE modelling in Cameo/Rhapsody using SysML, system-level trade studies, interface management between radar and platform systems (avionics, EW, C2), integration and test planning, and verification and validation execution. Operates at the system/platform level — ensuring the radar meets allocated requirements and integrates correctly into the broader weapon system or sensor suite. Typical employers include Lockheed Martin, Raytheon/RTX, Northrop Grumman, BAE Systems, Thales, L3Harris, and Saab, across ground-based, airborne, shipborne, and space-based radar programmes. |
| What This Role Is NOT | NOT a Radar Systems Engineer (focused on radar design, signal processing, and waveform development — scored 53.9 Green). NOT a general Systems Analyst (desk-based IT role — scored Red). NOT a DSP/Signal Processing Engineer (domain-agnostic algorithm development). NOT a senior chief systems engineer setting multi-programme technical direction. The distinction is breadth vs depth: this role manages radar as a subsystem within a larger platform; the Radar Systems Engineer designs the radar internals. |
| Typical Experience | 3-7 years. BS/MS in systems engineering, electrical engineering, or aerospace engineering. INCOSE CSEP/ASEP certification valued. Security clearance (Secret or TS/SCI) typically required. Proficiency in DOORS, Cameo/Rhapsody (SysML), MATLAB, and DoD acquisition processes. |
Seniority note: Junior systems engineers (0-2 years) running standard requirement traces and writing test procedures under supervision would score Yellow. Chief/principal systems engineers directing system architecture across multi-billion-dollar programmes would score deeper Green (58+ range).
Protective Principles + AI Growth Correlation
| Principle | Score (0-3) | Rationale |
|---|---|---|
| Embodied Physicality | 2 | System integration and test execution requires physical presence — integration labs, anechoic chambers, field test events on military platforms (aircraft, ships, ground vehicles). System-level acceptance testing involves hands-on work with classified hardware in controlled environments. |
| Deep Interpersonal Connection | 1 | Cross-functional coordination with radar designers, software engineers, hardware teams, and military customers. Requirements negotiation and design reviews (SRR, PDR, CDR) are collaborative but transactional. |
| Goal-Setting & Moral Judgment | 2 | System-level trade studies involve consequential decisions — balancing radar performance against platform constraints (power, cooling, weight, EMI), programme risk, schedule, and cost. Go/no-go decisions on system verification with national security implications. |
| Protective Total | 5/9 | |
| AI Growth Correlation | 0 | Radar programme demand driven by defence modernisation and geopolitical threat evolution, not AI adoption. MBSE tools make systems engineers more productive but do not proportionally create or eliminate positions. |
Quick screen result: Protective 5/9 with neutral growth — likely Green (Transforming). Proceed to quantify.
Task Decomposition (Agentic AI Scoring)
| Task | Time % | Score (1-5) | Weighted | Aug/Disp | Rationale |
|---|---|---|---|---|---|
| Requirements engineering & MBSE modelling | 20% | 3 | 0.60 | AUGMENTATION | Eliciting operational requirements, decomposing to radar subsystem, maintaining traceability in DOORS/JAMA, building SysML models in Cameo. AI generates standard requirement decompositions and populates model templates. Engineer owns interpretation of stakeholder needs, conflict resolution between competing requirements, and derived requirement identification. |
| System architecture & trade studies | 15% | 2 | 0.30 | AUGMENTATION | Defining radar system architecture within platform context, performing trade studies (performance vs SWaP-C), allocating requirements to subsystems. AI assists with parametric analysis but architectural decisions — how the radar integrates with EW, C2, and weapons systems — require deep understanding of operational context and platform constraints. |
| System integration & test planning | 15% | 2 | 0.30 | AUGMENTATION | Developing integration test procedures, defining test architectures, planning system-level verification campaigns. Requires understanding of physical test environments (labs, ranges, field sites) and interaction effects between radar and platform systems that AI cannot anticipate from specifications alone. |
| Verification & validation execution | 15% | 2 | 0.30 | AUGMENTATION | Executing system-level V&V — integration testing, interface verification, performance evaluation of radar within the broader system. Physical presence required. Interpreting anomalous results when measured performance diverges from models requires cross-domain expertise. |
| Interface management & coordination | 10% | 2 | 0.20 | AUGMENTATION | Managing ICDs between radar and platform systems (power, cooling, data links, mechanical mounting). Cross-functional coordination with radar designers, software teams, hardware engineers, and military customers. Human judgment on interface conflict resolution. |
| Performance modelling & simulation | 10% | 3 | 0.30 | AUGMENTATION | System-level performance modelling, scenario simulation, requirements verification through analysis. AI accelerates model setup, parameter sweeps, and standard analysis. Engineer designs simulation architecture and interprets results against operational requirements. |
| Technical documentation & compliance | 10% | 4 | 0.40 | DISPLACEMENT | CDRLs, test reports, compliance matrices against MIL-STD requirements, system engineering management plans. Highly structured and templated. AI generates most of this from models and test data with human review. |
| Customer liaison & design reviews | 5% | 2 | 0.10 | AUGMENTATION | Presenting at SRR/PDR/CDR, briefing military customers, translating operational needs into engineering specifications. Requires trust, domain credibility, and real-time judgment. |
| Total | 100% | | 2.50 | | |
Task Resistance Score: 6.00 - 2.50 = 3.50/5.0
Displacement/Augmentation split: 10% displacement, 90% augmentation (per the task table above).
Reinstatement check (Acemoglu): Moderate reinstatement. AI creates new tasks: validating AI-generated requirement decompositions, managing digital thread integrity across MBSE models, defining V&V approaches for AI/ML-enabled radar subsystems (cognitive radar, ML-based target classification), and ensuring DoD AI ethics compliance for radar systems in the kill chain.
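The task-weighted arithmetic above can be sketched in a few lines. This is an illustrative check only, using the Time % and 1-5 task scores from the table; the 6.00 offset converts the automation-weighted mean into a resistance score on the stated 5.0 scale.

```python
# Each entry: (time fraction, agentic AI task score 1-5), from the table above.
tasks = {
    "Requirements engineering & MBSE modelling": (0.20, 3),
    "System architecture & trade studies":       (0.15, 2),
    "System integration & test planning":        (0.15, 2),
    "Verification & validation execution":       (0.15, 2),
    "Interface management & coordination":       (0.10, 2),
    "Performance modelling & simulation":        (0.10, 3),
    "Technical documentation & compliance":      (0.10, 4),
    "Customer liaison & design reviews":         (0.05, 2),
}

# Time-weighted mean of task scores, then the resistance conversion.
weighted_mean = sum(t * s for t, s in tasks.values())              # 2.50
task_resistance = 6.00 - weighted_mean                             # 3.50 / 5.0
# Share of task time at score 3+, used later for the sub-label check.
time_scoring_3_plus = sum(t for t, s in tasks.values() if s >= 3)  # 40%

print(f"{weighted_mean:.2f} {task_resistance:.2f} {time_scoring_3_plus:.0%}")
```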
Evidence Score
| Dimension | Score (-2 to 2) | Evidence |
|---|---|---|
| Job Posting Trends | +1 | Active postings from Lockheed Martin, RTX, Saab, Northrop Grumman, GD Mission Systems for Systems Engineer — Radar roles across US and UK. BLS projects 6% growth for Aerospace Engineers (17-2011), 7% for Electronics Engineers (17-2072). Defence radar market projected $22.8B by 2029. Growing steadily but not surging >20%. |
| Company Actions | +1 | Major defence contractors actively hiring systems engineers for radar programmes (LTAMDS, SPY-6, LRDR, Typhoon ECRS Mk2, Saab G1X). No companies cutting radar systems engineers citing AI. MBSE adoption driving demand for SysML-proficient systems engineers. Security clearance requirement limits candidate pool. |
| Wage Trends | +1 | ZipRecruiter average $129,787; Glassdoor $167,937. Mid-level range $100K-$160K+ at defence primes. Growing above inflation. Clearance and MBSE expertise command premiums. PwC reports AI-skilled engineers see up to 56% salary uplift. |
| AI Tool Maturity | 0 | MBSE tools (Cameo, Rhapsody) gaining AI assistance for requirement generation and model population but not replacing systems engineering judgment. DOORS AI features emerging for requirement analysis. Anthropic observed exposure: Aerospace Engineers 7.53%, Electronics Engineers 9.99% — both very low. Tools augment, not replace. |
| Expert Consensus | +1 | INCOSE and defence industry consensus: MBSE and digital engineering transform the systems engineer role but increase demand for systems thinking skills. Defence procurement complexity growing with multi-domain integration (JADC2), driving need for more systems engineers, not fewer. |
| Total | +4 | |
Barrier Assessment
Reframed question: What prevents AI execution even when programmatically possible?
| Barrier | Score (0-2) | Rationale |
|---|---|---|
| Regulatory/Licensing | 1 | PE license rarely required in defence industry. However, ITAR/EAR export controls restrict AI tool access for classified programmes. DoD acquisition oversight (DCMA) requires named responsible engineers for system verification. MIL-STD compliance requires human engineering judgment and sign-off. |
| Physical Presence | 2 | System integration labs, anechoic chambers, field test events on military platforms require physical presence. System-level acceptance testing involves classified hardware in controlled environments. Cannot be performed remotely or by AI. |
| Union/Collective Bargaining | 0 | Defence engineering is a largely non-unionised, at-will employment sector. No collective bargaining protections. |
| Liability/Accountability | 1 | System verification failures have severe consequences — missed requirements flow to operational failures in military systems. Programme reviews (SRR, PDR, CDR, TRR) trace system-level decisions to named engineers. Institutional liability rather than personal (no PE stamp), but consequences are existential for programmes. |
| Cultural/Ethical | 1 | Defence industry conservative about AI in weapons-adjacent system verification. Radar systems are sensors in the kill chain — cultural resistance to removing human oversight from system-level verification of detection and tracking capabilities. DoD AI Ethics Principles create friction. |
| Total | 5/10 |
AI Growth Correlation Check
Confirmed at 0 (Neutral). Demand for radar systems engineers is driven by geopolitical threat evolution, defence modernisation cycles (LTAMDS, SPY-6, JADC2), and NATO spending increases — not AI adoption. MBSE and digital engineering tools make systems engineers more productive but do not proportionally create or eliminate positions. This is Green (Transforming), not Accelerated.
JobZone Composite Score (AIJRI)
| Input | Value |
|---|---|
| Task Resistance Score | 3.50/5.0 |
| Evidence Modifier | 1.0 + (4 x 0.04) = 1.16 |
| Barrier Modifier | 1.0 + (5 x 0.02) = 1.10 |
| Growth Modifier | 1.0 + (0 x 0.05) = 1.00 |
Raw: 3.50 x 1.16 x 1.10 x 1.00 = 4.4660
JobZone Score: (4.4660 - 0.54) / 7.93 x 100 = 49.5/100
Zone: GREEN (Green >=48, Yellow 25-47, Red <25)
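The composite calculation above can be sketched as a small function. This is a sketch under the stated assumptions: the modifier coefficients (0.04, 0.02, 0.05) and the normalisation constants (0.54, 7.93) are taken as given from the formulas above.

```python
def jobzone_score(task_resistance, evidence, barriers, growth):
    """AIJRI composite per the formulas above; normalisation
    constants 0.54 and 7.93 are taken as given."""
    evidence_mod = 1.0 + evidence * 0.04   # 1.16 for evidence = 4
    barrier_mod = 1.0 + barriers * 0.02    # 1.10 for barriers = 5
    growth_mod = 1.0 + growth * 0.05       # 1.00 for growth = 0
    raw = task_resistance * evidence_mod * barrier_mod * growth_mod
    return (raw - 0.54) / 7.93 * 100

score = jobzone_score(3.50, evidence=4, barriers=5, growth=0)
# Zone thresholds as stated: Green >= 48, Yellow 25-47, Red < 25.
zone = "GREEN" if score >= 48 else "YELLOW" if score >= 25 else "RED"
print(f"{score:.1f} {zone}")  # 49.5 GREEN
```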
Sub-Label Determination
| Metric | Value |
|---|---|
| % of task time scoring 3+ | 40% |
| AI Growth Correlation | 0 |
| Sub-label | Green (Transforming) — AIJRI >=48, 40% >= 20% threshold, Growth != 2 |
Assessor override: None — formula score accepted. At 49.5, this role sits 1.5 points above the Green threshold. The score calibrates correctly within the aerospace/defence engineering cluster: lower than Radar Systems Engineer (53.9) because systems engineering tasks (requirements, MBSE, documentation) have higher AI automation potential than radar design tasks (signal processing, waveform development, antenna design).
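The sub-label test in the table reduces to three conditions. A minimal sketch, implementing only the rule stated here — other sub-labels (e.g. Accelerated when growth correlation equals 2) are not fully specified in this section and are deliberately omitted.

```python
def is_green_transforming(aijri, pct_time_3plus, growth):
    """True when all three conditions from the sub-label table hold:
    AIJRI >= 48, at least 20% of task time at score 3+, and
    AI growth correlation != 2."""
    return aijri >= 48 and pct_time_3plus >= 0.20 and growth != 2

# This role: 49.5 AIJRI, 40% of task time at score 3+, growth 0.
result = is_green_transforming(49.5, 0.40, 0)  # True
```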
Assessor Commentary
Score vs Reality Check
The Green (Transforming) classification at 49.5 is honest but borderline — 1.5 points above the threshold. Protection comes primarily from physical integration and test requirements (V&V on classified military platforms) and defence industry barriers (clearances, ITAR, programme accountability). If MBSE automation matures rapidly and AI generates reliable requirement decompositions, the requirements engineering task could shift from score 3 to score 4, pushing this role toward Yellow. The physical moat holds the line.
What the Numbers Don't Capture
- Security clearance as structural moat. Most radar systems engineering roles require Secret or TS/SCI clearances. AI cannot hold clearances. Classified programmes restrict AI tool deployment, creating a barrier not fully captured in the 5/10 score.
- MBSE tool maturity is accelerating. Cameo and Rhapsody are adding AI-assisted modelling features faster than most engineering tools. The 40% task time at score 3+ could expand as AI generates requirement decompositions and populates system models more autonomously.
- Defence procurement cycle smoothing. Multi-decade programmes (LTAMDS, SPY-6, LRDR) provide demand stability that civilian technology markets lack.
- JADC2 as demand driver. Joint All-Domain Command and Control creates new system-of-systems integration complexity that requires more systems engineers, not fewer.
Who Should Worry (and Who Shouldn't)
Systems engineers on classified defence programmes with active clearances, leading system integration and test campaigns on military platforms, and managing complex multi-domain interfaces are safer than the label suggests. The combination of cleared status, physical test presence, and platform-level integration judgment creates a strong moat. Systems engineers whose work is primarily desk-based MBSE modelling, requirement tracing in DOORS, and compliance documentation generation are more exposed — these structured, process-driven workflows are exactly what AI tools target. The single biggest separator is physical integration involvement: if you spend significant time in integration labs and on field test events, you are well-protected. If your work is populating SysML models and generating traceability matrices from your workstation, AI is already accelerating that work and will increasingly automate it.
What This Means
The role in 2028: Mid-level radar systems engineers spend significantly less time on routine requirement decomposition, model population, and compliance documentation as AI-enhanced MBSE tools mature. More time shifts to managing AI-generated requirement sets, leading physical integration and test campaigns on increasingly complex multi-domain platforms, and defining V&V approaches for AI/ML-enabled radar subsystems. Engineers who combine INCOSE-level systems thinking with deep radar domain knowledge and AI/ML integration experience become the premium profile.
Survival strategy:
- Maximise physical integration and test exposure. System-level I&T on classified military platforms is the AI-resistant core. Seek assignments that put you in integration labs and field test sites, not just behind the MBSE workstation.
- Master AI-enhanced MBSE tools. Become the engineer who leverages AI to generate requirement decompositions and model populations 3x faster — then validates and refines the output with systems engineering judgment.
- Build cross-domain integration expertise. JADC2 and multi-domain sensor fusion create growing demand for systems engineers who can integrate radar with EW, C2, communications, and weapons systems across air, land, sea, space, and cyber domains.
Timeline: 5-7+ years for physical integration and test roles. 3-5 years for desk-based MBSE and requirements management workflows as AI tools mature. Defence procurement cycles provide structural demand stability through 2035+.