What It Is

Overwatch Core is a browser-based mission planning and autonomous flight system for the Parrot ANAFI UKR. No application to install. No pilot certification required. No command-line tools or SDK configuration.

The operational sequence is: arrive on scene, open the web application on any device with a browser, draw the search area on the satellite map, configure altitude and overlap, click generate. The system computes a boustrophedon grid search pattern, packages a complete AirSDK flight mission, and uploads it to the drone over WiFi. Click launch. The drone takes off, flies the grid autonomously, runs onboard human detection on every camera frame, and returns to the takeoff point when the search is complete.

Under five minutes from vehicle arrival to drone airborne. That is the target, and in field testing we consistently hit it.

This is the product we have been building toward over the past four blog posts. Each post described a component of the system. This post describes the whole.

The Ground Side

The ground side is a web application. It runs in any modern browser — Chrome, Firefox, Safari, Edge. No plugins, no native code, no dependencies. The interface is a dark tactical theme designed for outdoor use with reduced screen brightness: high-contrast text on dark backgrounds, orange accent colour for critical elements, minimal visual clutter.

The core workflow is a map. The operator sees a satellite view of the area (using Mapbox or similar tile provider, cached for offline use in areas with no cellular connectivity). They draw a polygon on the map defining the search boundary. The polygon can be any shape: convex, concave, L-shaped, rectangular, whatever matches the terrain and the search hypothesis.

As the polygon is drawn, the system computes the search grid in real time. The boustrophedon sweep pattern appears as an overlay on the map, showing the exact flight path the drone will follow. Mission statistics update live: total flight distance, estimated flight time, number of sweep lines, area covered, effective overlap percentage. The operator can adjust altitude, overlap, and flight speed using sliders, and the grid recomputes instantly.
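The spacing and statistics math behind that live overlay can be sketched roughly as follows. This is an illustrative reconstruction, not the shipped planner: the formula assumes a nadir camera, a ground footprint of 2·h·tan(HFOV/2), and an example HFOV value rather than the real camera spec.

```python
import math

def sweep_spacing(altitude_m: float, hfov_deg: float, overlap: float) -> float:
    """Distance between adjacent sweep lines.

    The camera's ground footprint width at nadir is 2 * h * tan(HFOV / 2);
    adjacent lines are spaced so footprints overlap by `overlap` (0..1).
    hfov_deg is an example value, not the actual camera specification.
    """
    footprint = 2.0 * altitude_m * math.tan(math.radians(hfov_deg) / 2.0)
    return footprint * (1.0 - overlap)

def mission_stats(num_lines: int, line_length_m: float,
                  spacing_m: float, speed_ms: float) -> dict:
    """Rough live statistics for the planner overlay."""
    # Sweep legs plus the short cross-legs between adjacent lines.
    distance = num_lines * line_length_m + (num_lines - 1) * spacing_m
    return {
        "distance_m": distance,
        "flight_time_s": distance / speed_ms,
    }
```

Because both functions are pure arithmetic, recomputing them on every slider change is effectively free, which is what makes the instant-recompute interaction possible.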

When the operator is satisfied with the mission, they click "Generate Mission." The system packages the flight plan into a complete AirSDK mission — not just a waypoint list, but a full autonomous flight project including the flight supervisor, vision detection service, and safety monitor. The operator connects to the ANAFI's WiFi network, clicks "Upload," and the mission transfers to the drone in seconds. One more click: "Launch."

The Air Side

What uploads to the drone is not a simple waypoint file. It is a complete AirSDK project — a compiled autopilot application that takes full control of the aircraft.

The flight supervisor is a 7-state finite state machine. The states are: idle, takeoff, transit to grid start, grid search (the main sweep), transit to home, landing, and emergency. Transitions between states are deterministic based on mission progress and safety conditions. The supervisor monitors battery level, GPS fix quality, communication link status, and obstacle detection throughout the mission. If any safety threshold is breached, the supervisor transitions to the appropriate state — typically return-to-home or emergency land.
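The supervisor's structure can be sketched as a minimal state machine. The state names follow the list above; the transition function, thresholds, and parameter names are illustrative assumptions, not the AirSDK implementation.

```python
from enum import Enum, auto

class State(Enum):
    IDLE = auto()
    TAKEOFF = auto()
    TRANSIT_TO_GRID = auto()
    GRID_SEARCH = auto()
    TRANSIT_HOME = auto()
    LANDING = auto()
    EMERGENCY = auto()

# Nominal forward path through the mission.
NOMINAL = {
    State.IDLE: State.TAKEOFF,
    State.TAKEOFF: State.TRANSIT_TO_GRID,
    State.TRANSIT_TO_GRID: State.GRID_SEARCH,
    State.GRID_SEARCH: State.TRANSIT_HOME,
    State.TRANSIT_HOME: State.LANDING,
}

def next_state(current: State, battery_pct: float, step_done: bool,
               rth_threshold: float = 30.0) -> State:
    """Deterministic transition: safety conditions preempt mission progress."""
    if battery_pct < 10.0:
        return State.EMERGENCY                    # critical battery: land now
    if battery_pct < rth_threshold and current in (
            State.TRANSIT_TO_GRID, State.GRID_SEARCH):
        return State.TRANSIT_HOME                 # abort the grid, return home
    if step_done:
        return NOMINAL.get(current, current)      # advance the nominal path
    return current                                # hold state until step completes
```

The key property is that safety checks are evaluated before mission progress, so a breached threshold always wins over the nominal sequence.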

Navigation uses the hybrid GPS/VIO architecture we described previously. The entire flight plan is stored as relative vectors — sequences of {dx, dy, dz, dpsi} displacements from each waypoint to the next. The drone executes these using moveBy commands, tracking displacement using whatever position estimation is available. When GPS is good, it anchors the VIO baseline. When GPS degrades, VIO continues the mission autonomously.
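Converting an absolute waypoint list into those relative vectors looks roughly like this. For clarity the sketch keeps displacements in a local world frame and only wraps the heading delta; the real moveBy command expects body-frame displacements, so treat the frame convention here as an assumption.

```python
import math

def to_relative_vectors(waypoints):
    """Convert absolute waypoints (x, y, z, psi) in a local frame into the
    {dx, dy, dz, dpsi} displacements executed between consecutive waypoints.

    dpsi is wrapped to [-pi, pi) so the drone always takes the short way
    around when changing heading.
    """
    vectors = []
    for (x0, y0, z0, p0), (x1, y1, z1, p1) in zip(waypoints, waypoints[1:]):
        dpsi = (p1 - p0 + math.pi) % (2.0 * math.pi) - math.pi
        vectors.append((x1 - x0, y1 - y0, z1 - z0, dpsi))
    return vectors
```

Because each vector depends only on the previous waypoint, the plan has no absolute-position dependency: whichever estimator is tracking displacement at that moment (GPS-anchored or pure VIO) can execute the next leg.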

The detection pipeline runs continuously during the grid search phase. Every camera frame is resized and fed through the TFLite model. Detections above the confidence threshold are geolocated, cropped, and published as alerts. The operator sees them appear on the map in real time, or receives them in a batch when connectivity resumes.
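The geolocation step of that pipeline can be sketched as a pixel-to-ground projection. This is a simplified model under stated assumptions: nadir-pointing camera, flat ground directly below the drone, square pixels, a small-offset equirectangular conversion, and an illustrative HFOV value.

```python
import math

EARTH_R = 6_371_000.0  # mean Earth radius, metres

def geolocate(det_px, img_size, drone_lat, drone_lon, altitude_m,
              heading_rad, hfov_deg=69.0):
    """Project a detection's pixel centre to an approximate ground lat/lon.

    Assumes a nadir camera over flat ground; hfov_deg is an example value,
    not the real camera spec. heading_rad is measured clockwise from north.
    """
    (px, py), (w, hgt) = det_px, img_size
    # Metres of ground spanned per pixel at this altitude.
    m_per_px = (2.0 * altitude_m * math.tan(math.radians(hfov_deg) / 2.0)) / w
    # Pixel offset from image centre, in body frame: image up = direction of travel.
    right = (px - w / 2.0) * m_per_px
    fwd = (hgt / 2.0 - py) * m_per_px
    # Rotate the body-frame offset into east/north components.
    east = fwd * math.sin(heading_rad) + right * math.cos(heading_rad)
    north = fwd * math.cos(heading_rad) - right * math.sin(heading_rad)
    # Small-offset approximation: metres to degrees.
    lat = drone_lat + math.degrees(north / EARTH_R)
    lon = drone_lon + math.degrees(east / (EARTH_R * math.cos(math.radians(drone_lat))))
    return lat, lon
```

Over sloped terrain the flat-ground assumption introduces error proportional to the slope, which is one reason alerts carry a cropped image as well as a coordinate: the operator confirms visually.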

The safety monitor runs as an independent thread, watching for conditions that should override normal mission execution. Battery below the return-to-home threshold: abort grid, return immediately. Communication loss exceeding a configurable timeout: continue mission (the drone knows its full flight plan), but flag for operator awareness when the link recovers. Obstacle detected by the drone's native avoidance system: pause, assess, resume or reroute.

What We've Built

Over the past three months, we published four technical posts, each describing a component of this system.

The first post, "Why SAR Teams Need Autonomous Drones, Not Better Pilots," described the problem: manual drone flights create coverage gaps, video feed monitoring suffers from human attention degradation, and the combination means SAR drones today are less effective than they could be. The argument was that the solution is not better piloting but autonomous execution with onboard detection.

The second post, "GPS-Denied Navigation," addressed the navigation architecture. SAR operations happen in environments where GPS is unreliable. We described the hybrid GPS/VIO approach with relative-vector flight plans that survive GPS dropout — the same architecture that runs inside the mission builder's generated flight code.

The third post, "Boustrophedon Search Patterns," covered the grid generation mathematics: computing sweep spacing from camera FOV and altitude, aligning the sweep to the polygon's principal axis, clipping sweep lines to arbitrary polygon boundaries, and converting the grid to relative waypoint vectors. This is the algorithm that runs in the mission builder's grid planner.

The fourth post, "Building Onboard Human Detection," described the detection pipeline: frame capture, resize, TFLite inference, geolocation, and alert publication. We discussed the confidence threshold tradeoff and honestly stated the current limitations around thermal, night operation, and canopy penetration.

Overwatch Core is all four of these components integrated into one system. The ground-side web application handles the mission planning and grid generation. The air-side AirSDK package handles autonomous navigation, search execution, and onboard detection. The operator defines intent; the system handles execution.

Who It's For

This system is built for professional SAR teams who need systematic aerial search capability and cannot afford the time cost of manual flight planning and execution.

Coastguard. Coastal search operations for persons overboard, vessel distress response, and shoreline search. The environments are GPS-challenged (cliffs, sea stacks, multipath from signals reflected off the water), wind is a constant factor, and the search areas are often long and narrow along the coastline. The boustrophedon planner handles non-rectangular polygons natively.

Military. Battlefield casualty search, disaster response in contested environments, and perimeter surveillance. GPS denial may be intentional (jamming) rather than environmental. The relative-vector flight plan architecture makes this system functional in GPS-denied environments by design, not as an afterthought.

Police SAR. Missing person search in rural and wilderness terrain. Time pressure is extreme — the first 24 hours are critical. The system's under-five-minute deployment time means the drone is searching while a manual approach would still be flight planning. The detection pipeline means a single operator can run the search without a dedicated video monitor.

Any team that currently uses drones for SAR and finds that manual flight planning takes too long, manual piloting creates coverage gaps, and video monitoring misses detections. If you have a Parrot ANAFI UKR and a browser, you have what you need.

What's Next

Overwatch Core is the foundation. It handles single-mission autonomous search — one drone, one area, one flight. But SAR operations rarely stop at one battery cycle, and some missions require continuous coverage that no single drone can provide.

Multi-drone coordination is here. We built it as a separate product: Overwatch Orchestrate. It is a bolt-on fleet relay layer that adds battery-aware handoff scheduling, waypoint resume, continuous patrol logic, and exception-only alerting. Two to ten drones, 24/7 coverage, one operator. Orchestrate treats Core as the execution engine for each drone and layers fleet coordination on top.

Thermal camera integration. The most requested feature and the highest-impact planned improvement. Thermal imaging allows detection at night, through light vegetation, and in conditions where visible-spectrum detection fails. This requires a compatible thermal payload for the ANAFI UKR and a model trained on aerial thermal imagery.

Compliance and certification. Operating autonomous drones in national airspace requires regulatory approval that varies by country and use case. We are working toward certifications that allow SAR teams to deploy the system within their existing operational frameworks.

We are looking for early adopter partners. If you run a SAR team — coastguard, military, police, volunteer — and you want to be part of shaping what autonomous aerial search looks like in practice, we want to hear from you. Not a sales pitch: a partnership. We build the technology; you bring the operational expertise. The system gets better from real-world feedback in ways that laboratory testing cannot replicate.

Get in touch.