The X500 Is Our Test Platform
We develop Overwatch on a workshop fleet of three Holybro X500 V2 quadcopters. PX4 autopilot, Pixhawk 6C flight controller, SiK telemetry radios, the whole platform sitting on a bench a few metres from our desks. We fly them, we crash them, we rebuild them. Every feature that ships in Overwatch has been beaten up against that fleet before it goes anywhere near a customer airframe.
We want to be transparent about this because buyers who see our flight footage sometimes ask, reasonably, why we are running demos on a €500 research quadcopter rather than the industrial frames we sell against. The honest answer is that this is exactly how you are supposed to build flight software. The X500 fleet is a development rig, not the operational aircraft. This post explains why that distinction matters, and how we bridge from bench to deployment without cutting corners.
Why We Develop on X500s
Iteration speed. We break drones. That is the work. A new collision-avoidance patch lands, we fly it, and sometimes an edge case we didn't anticipate puts a motor into the ground. On an X500 that is a €150 ESC, a €40 motor, an evening in the workshop, and we are airborne again the next morning. On a €25,000 Freefly Astro Max it is a weeks-long procurement and RMA cycle and a hole in the schedule. A fleet that recovers in hours rather than weeks is the difference between shipping three iterations a week and shipping three a quarter.
MAVLink parity. The X500 runs PX4, which speaks MAVLink v2. So does the Astro Max. So do Sky-Hopper, Quantum Systems' Trinity and Vector, and every other serious industrial MAVLink frame. From Overwatch's perspective, these airframes are indistinguishable at the SDK surface — the same COMMAND_LONG arms them, the same mission protocol uploads waypoints, the same GLOBAL_POSITION_INT reports telemetry, the same PARAM_SET tunes behaviour mid-flight. The stack we validate against our X500s is the stack that runs against the industrial frame. Everything above the autopilot — mission planning, grid generation, Fleet Advisor, Safety Gate, detection pipeline — does not know or care which airframe is underneath it.
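To make that parity concrete, here is a minimal sketch using pymavlink, the open-source MAVLink library (the UDP endpoint below is a stand-in for whatever link a given aircraft exposes; swap in a serial port for a SiK radio). Nothing in it names an airframe, which is the whole point:

```python
# Minimal pymavlink sketch. The same three calls work against an X500 on
# the bench or an Astro Max in the field; only the connection string
# changes. The endpoint below assumes a PX4 SITL or local telemetry bridge.
from pymavlink import mavutil

# Connect over MAVLink v2.
link = mavutil.mavlink_connection("udp:127.0.0.1:14550")
link.wait_heartbeat()

# Arm with the same COMMAND_LONG every MAVLink autopilot accepts.
link.mav.command_long_send(
    link.target_system, link.target_component,
    mavutil.mavlink.MAV_CMD_COMPONENT_ARM_DISARM,
    0,                     # confirmation
    1, 0, 0, 0, 0, 0, 0,   # param1 = 1 -> arm
)

# Telemetry arrives as the same GLOBAL_POSITION_INT message on any frame.
msg = link.recv_match(type="GLOBAL_POSITION_INT", blocking=True)
print(msg.lat / 1e7, msg.lon / 1e7, msg.relative_alt / 1000.0)
```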
Safety culture. We want our engineers cavalier enough to fly aggressive test regimes — wide wind envelopes, intentional GPS dropout, deliberate communication loss, multi-drone handoff sequences at close proximity, the full catalogue of failure modes a real SAR mission might hit. Engineers will not test aggressively on aircraft they are afraid of breaking. Low stakes per crash is what enables honest testing. If every flight costs a month of calendar time when something goes wrong, the tests get conservative and the bugs stay hidden.
Cost. A three-aircraft X500 workshop fleet — drones, batteries, chargers, telemetry radios — cost us about €1,780 total. The equivalent industrial test fleet would run north of €75,000. At that price we would have one test drone, not three, and fleet-relay development — which by definition requires multiple drones flying simultaneously — would be impossible. The X500 fleet is what made multi-drone relay orchestration tractable to develop in the first place.
What This Means for Operational Deployment
Overwatch's software stack is airframe-agnostic above the MAVLink abstraction. The mission planner does not know whether it is talking to a Pixhawk on an X500 or a Pixhawk on an Astro Max. The Fleet Advisor's decision engine reasons about battery state, wind, GPS quality, and fleet depth — none of which are airframe-specific. The Safety Gate's inviolable constraints — critical-battery RTH, wind abort, altitude separation, swap-threshold floors — apply identically regardless of the aircraft beneath.
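As an illustration of how little the gate needs to know about the aircraft, here is a simplified sketch of the idea (the type and function names, and the thresholds, are ours for this post, not the production code): every input is a quantity any MAVLink autopilot reports, and the one airframe-specific number arrives as a parameter.

```python
# Simplified sketch of a Safety-Gate-style check. Names and thresholds
# are illustrative, not Overwatch's production code.
from dataclasses import dataclass

@dataclass
class FleetState:
    battery_pct: float     # remaining capacity, 0..100
    wind_ms: float         # measured wind speed, m/s
    link_ok: bool          # telemetry link currently healthy
    drones_airborne: int   # aircraft currently on station

def safety_gate(state: FleetState, max_wind_ms: float) -> list[str]:
    """Return the inviolable actions the gate forces, if any."""
    actions = []
    if state.battery_pct < 20.0:      # illustrative critical-battery floor
        actions.append("RETURN_TO_HOME")
    if state.wind_ms > max_wind_ms:   # the only airframe-specific input
        actions.append("WIND_ABORT")
    if not state.link_ok:
        actions.append("COMM_LOSS_FALLBACK")
    return actions
```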
What does differ between airframes is a small set of flight-envelope constants: maximum operating wind, battery endurance, cruise speed, payload limits, camera FOV for grid-overlap mathematics. These are pluggable configuration. Swapping between a 15 m/s-capable X500 and an 18 m/s-capable Astro Max is a per-airframe profile, not a code change. We have one codebase, and airframe profiles live alongside it.
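A sketch of what such a profile might look like (the 15 and 18 m/s wind limits are the ones quoted above; the endurance, cruise, and FOV figures are placeholders invented for illustration):

```python
# Per-airframe profile sketch. Field names are ours; endurance, cruise,
# and FOV values are illustrative placeholders, not published specs.
from dataclasses import dataclass

@dataclass(frozen=True)
class AirframeProfile:
    name: str
    max_wind_ms: float      # hard wind-abort limit
    endurance_min: float    # usable battery endurance, minutes
    cruise_ms: float        # planning cruise speed, m/s
    camera_hfov_deg: float  # horizontal FOV, drives grid-overlap maths

X500 = AirframeProfile("Holybro X500 V2", max_wind_ms=15.0,
                       endurance_min=18.0, cruise_ms=8.0,
                       camera_hfov_deg=78.0)
ASTRO = AirframeProfile("Freefly Astro Max", max_wind_ms=18.0,
                        endurance_min=35.0, cruise_ms=12.0,
                        camera_hfov_deg=70.0)

# The same planner call, a different envelope underneath, e.g.:
# plan = plan_grid_search(area, profile=ASTRO)
```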
This is the point skeptical buyers sometimes miss: when we demo grid search on an X500, we are not demoing X500-specific behaviour. We are demoing the Overwatch mission logic. The same mission logic runs on a customer's Astro Max with a different flight-envelope profile loaded. What you see on the bench is what runs in the field, with a different aircraft underneath.
What We Verify on the Industrial Frame
We do not ship on X500 flight hours alone. Before any operational deployment, we run a flight campaign on the customer's chosen airframe — whether that is the Freefly Astro Max (NDAA-compliant, the standard for US federal and allied public-safety work), Sky-Hopper (Israeli-built, SAR-native, proven in maritime and mountain operations), Quantum Systems' Trinity or Vector (German, EU sovereignty, fixed-wing and VTOL options for long-range SAR), or the Parrot ANAFI UKR via our AirSDK partnership.
The campaign covers flight-envelope validation at the airframe's real operating limits, mission-protocol compatibility against the specific autopilot firmware the customer runs, end-to-end grid search and fleet relay at the altitudes and areas the customer actually operates in, and safety-gate behaviour — critical-battery RTH, wind abort, communication-loss fallback — exercised against the real aircraft rather than the development rig. Only after that campaign passes does the customer take delivery.
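One of those safety-gate checks, expressed as a pytest-style sketch that reuses the safety_gate and AirframeProfile sketches above (the harness is hypothetical, and a real campaign exercises the physical aircraft, not a unit test):

```python
# Illustrative campaign check: the wind-abort gate must fire above the
# Astro Max's 18 m/s limit and stay quiet inside it. Hypothetical harness;
# depends on the FleetState, safety_gate, and ASTRO sketches above.
import pytest

@pytest.mark.parametrize("wind_ms, expected", [
    (12.0, []),               # inside the envelope: no forced action
    (19.0, ["WIND_ABORT"]),   # above 18 m/s: the gate must abort
])
def test_wind_abort_threshold(wind_ms, expected):
    state = FleetState(battery_pct=80.0, wind_ms=wind_ms,
                       link_ok=True, drones_airborne=2)
    assert safety_gate(state, max_wind_ms=ASTRO.max_wind_ms) == expected
```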
If the customer runs a MAVLink fleet they already own, the software-only (BYOD) tier applies and we run the campaign on their aircraft at their operating site. If they take one of our Integrated Fleet configurations, we run the campaign at our facility before the aircraft ship. Either way, the rule is the same: no operational deployment on an airframe Overwatch has not been flown on.
The Bigger Principle
Every serious software company develops on hardware it can afford to break. Game engines are built on development kits, not retail consoles. Automotive firmware runs on dynamometers and test benches, not on production vehicles. Aerospace avionics are validated on iron birds — non-flying rigs that replicate the aircraft's electrical and hydraulic systems — before they ever see a flight certification programme. The development rig is not a compromise; it is the place where the real work happens and where bugs go to die.
What matters is not whether the development platform is identical to the deployment platform. It is whether the abstraction between them is honest, and whether the verification campaign on the deployment platform is real. We believe both are true for Overwatch. The MAVLink abstraction is thin and well-specified. The per-airframe campaign is non-negotiable. The software that flies in the field has been flown against the field aircraft before it gets there.
That is the discipline we are committed to. An operator who buys Overwatch on an Astro Max is not buying software that was only ever proven on an X500. They are buying software whose core logic has been hammered against thousands of X500 flight hours, and whose airframe-specific behaviour has been verified on the aircraft they will actually fly.
Deployment Options
Overwatch ships in two procurement models. The Integrated Fleet tier pairs the software with an operational airframe of the customer's choice — Freefly Astro Max, Sky-Hopper, Quantum Systems, or Parrot ANAFI UKR — with pricing starting at €15,000 per drone including the flight-campaign verification on the selected platform. The software-only tier is for operators who already run a MAVLink fleet and want to add Overwatch's autonomous mission planning, multi-drone relay, and live ground station on top of their existing aircraft.
If you want to see the full tier breakdown, the pricing page has the detail. If you want to talk about which airframe fits your operation or walk through the verification campaign we would run for your fleet, get in touch. We are direct about how we build, and we are direct about how we deploy.