Different Missions, Different Targets

Different SAR operations search for different things. Coastguard teams scan for vessels in distress. Police units track vehicles. Mountain rescue looks for persons on foot. The detection system on the drone needs to match the mission — you should not need to retrain a model or swap firmware to switch from searching for people to searching for boats.

Until now, the Overwatch detection pipeline treated all objects equally. If the onboard model saw something above the confidence threshold, it flagged it. That works for general surveillance, but it generates noise in targeted operations. A coastguard team searching for a missing fishing vessel does not need alerts for every car on the coast road.

Detection Target Configuration

Overwatch Orchestrate now includes a Detection Target dropdown in the mission planner. Four options:

Person Search — COCO class 0. The model filters for human figures only. Optimised for overboard recovery, missing person searches, and casualty location.

Vessel Detection — COCO class 8. Boats, ships, kayaks. The primary mode for maritime patrol and coastguard operations.

Vehicle Search — COCO classes 2, 5, 7. Cars, buses, trucks. For road incident response, stolen vehicle searches, and traffic monitoring along coastal routes.

General — all classes above the confidence threshold. The original behaviour, now explicitly selectable. Useful for area surveillance where the threat type is unknown.

The selection flows into the mission config that gets deployed to the drone at dispatch time. The onboard SSD MobileNet V2 model already knows all of these classes — it was trained on the full COCO dataset. The target configuration simply filters which detections trigger an alert. No retraining, no model swap, no firmware change. The operator selects the target type, packages the mission, and the drone searches for exactly what the operation requires.
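The filtering step described above can be sketched in a few lines. The mode names, function, and threshold default here are illustrative, not the actual Orchestrate API; the class IDs follow the zero-indexed COCO labels used by SSD MobileNet V2 detectors.

```python
# Map each Detection Target mode to the COCO class IDs it allows.
# None means no class filter (General mode).
TARGET_CLASSES = {
    "person_search": {0},          # person
    "vessel_detection": {8},       # boat
    "vehicle_search": {2, 5, 7},   # car, bus, truck
    "general": None,
}

def should_alert(class_id, confidence, mode, threshold=0.5):
    """Return True if a raw detection should raise an operator alert."""
    if confidence < threshold:
        return False
    allowed = TARGET_CLASSES[mode]
    return allowed is None or class_id in allowed
```

A coastguard mission in Vessel Detection mode would suppress a high-confidence car detection (`should_alert(2, 0.9, "vessel_detection")` is False) while still passing any boat above the confidence threshold.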

Realistic Detection in the Demo

The Orchestrate demo previously used a hardcoded timer to simulate detections — every N seconds, a detection event fired at the drone's current position. It demonstrated the alert pipeline, but it had no relationship to what the drone was actually looking at.

The demo now places simulated targets near the patrol route. Boats anchored offshore, persons on the coastline. These appear on the map as semi-transparent markers — visible to the operator, positioned where real targets might be found in the operational area.

Detection is now geometry-based. The system calculates each drone's camera field of view based on altitude, gimbal angle, and the ANAFI UKR's known sensor parameters. When that FOV polygon covers a target position, a detection fires. Confidence varies with distance from the FOV centre — a target dead-centre scores 0.82 to 0.85, a target near the edge of the frame scores 0.65 to 0.72. This matches real-world detection behaviour where objects at frame edges produce lower-confidence classifications.
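The confidence bands above can be modelled as a simple interpolation over the target's normalised distance from the FOV centre. This is a sketch of how the demo's scoring might work; the band endpoints come from the figures quoted above, but the linear interpolation and function shape are assumptions.

```python
import random

def simulated_confidence(dist_from_centre, fov_radius, rng=random):
    """Map a target's distance from the FOV centre to a detection score.

    A ratio of 0.0 means dead-centre, 1.0 means the frame edge.
    Scores fall in the 0.82-0.85 band at centre and the 0.65-0.72
    band at the edge, interpolating linearly in between.
    """
    t = min(max(dist_from_centre / fov_radius, 0.0), 1.0)
    lo = 0.82 + t * (0.65 - 0.82)   # lower edge of the score band
    hi = 0.85 + t * (0.72 - 0.85)   # upper edge of the score band
    return rng.uniform(lo, hi)
```

Targets beyond the FOV radius are clamped to the edge band rather than scored lower still, since in practice they would simply fall outside the FOV polygon and never fire a detection.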

When a detection fires, the target marker lights up on the map, an alert appears in the alert feed, and the Fleet Advisor responds. The Advisor increases relay overlap to re-cover the detection zone on the next handoff — ensuring the incoming drone re-sweeps the area where the target was spotted. A person in the water drifts. A vessel under power moves. Re-covering 200 metres of route around a detection point costs 90 seconds of flight time but catches movement since the last pass.

GPS-Resilient Patrol

GPS loss is a reality in SAR operations. Coastal cliffs block satellite signals. Urban canyons between buildings create multipath interference. Electromagnetic interference from industrial sites or military installations degrades fix quality. A drone that stops patrolling when it loses GPS is a drone that fails in exactly the conditions where SAR operations happen.

The Parrot ANAFI UKR navigates using Visual Inertial Odometry (VIO) when GPS is unavailable. The onboard cameras and IMU track position relative to visual features in the environment. VIO is not as accurate as GPS — position drift accumulates over time — but it keeps the drone flying a controlled path rather than hovering in place or triggering an emergency return.

Overwatch Orchestrate now simulates GPS dropout and recovery during the demo. The GPS indicator in the weather status bar transitions between three states: FIXED (green) when full satellite lock is available, DEGRADED (yellow) when fix quality drops below an acceptable threshold, and LOST (VIO) (red) when the drone is navigating purely on visual-inertial tracking.

During GPS loss, VIO drift accumulates. The drone continues its patrol route but its reported position diverges slowly from its actual position. When GPS recovers, the system corrects the drift — the reported position snaps back to the true GPS fix. The operator sees this correction happen in real time on the map.
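One tick of that drift-and-correction behaviour can be sketched as follows. The function name and the per-tick drift model are illustrative assumptions about how the demo simulates VIO; the snap-back on recovery is the behaviour described above.

```python
def propagate_position(true_pos, reported_pos, gps_available, drift_step):
    """Advance the reported position by one simulation tick.

    Positions are (x, y) tuples in metres. While GPS is lost, the
    reported position accumulates a small per-tick drift offset;
    when GPS recovers, it snaps back to the true fix.
    """
    if gps_available:
        return true_pos                       # correction: snap to GPS fix
    return (reported_pos[0] + drift_step[0],  # drift accumulates on VIO
            reported_pos[1] + drift_step[1])
```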

The key point: relay handoffs continue through GPS loss. The fleet does not stop patrolling when satellites go dark. The orchestrator tracks estimated positions via VIO, the standby drone dispatches on schedule, and the handoff executes at the best-estimate location. When GPS recovers, position accuracy restores. The patrol never stopped.

What This Means for Operators

These are not demo-only features. The detection target configuration flows directly to the real drone mission config. The onboard class filter reads detection_classes from mission_config.json — the same file that defines waypoints, altitude, and camera parameters. When the operator selects "Vessel Detection" in the planner, the packaged mission tells the drone's vision pipeline to report COCO class 8 only. Everything else is suppressed at the edge, before it reaches the ground station.
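For illustration, a mission config in Vessel Detection mode might look like the fragment below. Only `detection_classes` is named in the text above; the other field names and values are assumptions about the file's shape, shown to indicate where the filter sits alongside waypoints, altitude, and camera parameters.

```json
{
  "waypoints": [
    {"lat": 50.7192, "lon": -1.8808, "alt_m": 60}
  ],
  "camera": {"gimbal_pitch_deg": -30},
  "detection_classes": [8]
}
```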

GPS resilience is built into the ANAFI UKR's flight supervisor. The VIO navigation system is not something we added — it is part of the drone's core flight controller. What Overwatch adds is the ground station's ability to maintain relay operations through GPS loss events. The orchestrator does not require GPS-quality position data to sequence handoffs. It works with the best available position estimate, whether that comes from satellites or from visual-inertial tracking.
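The orchestrator's indifference to the position source reduces to a small selection rule. A minimal sketch, assuming the ground station exposes the GPS fix and the VIO dead-reckoned estimate separately:

```python
def best_position_estimate(gps_fix, vio_estimate):
    """Return the position the relay scheduler sequences handoffs against.

    Prefer a valid GPS fix; fall back to the VIO estimate when GPS is
    lost. Either way the scheduler always receives some position, so
    handoffs are never blocked waiting on satellite lock.
    """
    return gps_fix if gps_fix is not None else vio_estimate
```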

The demo shows exactly what the real system does. The physics are the same. The decision logic is the same. The only difference is time compression — the demo runs a 24-hour patrol in 120 seconds. The detection geometry, the GPS state machine, the relay sequencing through signal loss — all of it runs the same code path as a live deployment.

Available Now

Configurable detection targets and GPS-resilient patrol ship in the current release of Overwatch Orchestrate. If you are running Orchestrate today, update your ground station software to access both features. If you are evaluating Overwatch for SAR operations, request a demo — we will run the Orchestrate simulation and walk through detection targeting and GPS resilience on a live patrol.