
AUGMENTING CANINES
Human Computer Interaction
Systems Thinking
Research & Strategy
Wearable Device Design
Signal Processing
Prototyping
Bidirectional EMG haptic telemetry for search-and-rescue K9s
Figma, Python, MATLAB, Arduino, EMG toolchain (LSTM/CNN), After Effects
Augmenting Canines is a communication system for urban search-and-rescue dogs and handlers, designed to close critical gaps in real-time awareness, physiology monitoring and silent command during high-risk operations.
Developed by a cross-disciplinary team of four, the project prototypes a symbiotic network that links handler muscle signals, wearable haptics, and multi-sensor telemetry into a shared interface.

Understanding the NSARAG workflow
Dogs are deployed in the search-and-rescue stage for 20 minutes, then rest for ~40 minutes, cycling repeatedly across long operations. Handlers make decisions with partial information: GPS traces, occasional barks, sporadic visual contact, and experience.
The Problem
When the dog disappears into rubble...
In large scale disasters, canine units enter unstable rubble zones long before humans can safely follow.
Dogs work under extreme, rapidly changing conditions: heat, debris, dust, noise and shifting microclimates all affect scent quality, stamina and mobility.
- Handlers frequently lose line of sight within minutes of deployment, relying on barks and intuition to infer what the dog has found.
- Confirmation protocols often require two separate dogs to independently indicate the same location before resources are committed, adding time in environments where minutes matter.
- Physiological decline (overheating, fatigue, dehydration) begins well before visible symptoms, and each dog's thresholds are highly individual.
A multidimensional communication failure: ambiguous behavioural cues, no line of sight, limited void visualisation, and no continuous physiology monitoring.

System Architecture
Downlink: Handler → Dog
Enabling silent commands using surface EMG from the handler's forearm:
- A wrist EMG band records muscle activity during discrete gestures such as wrist flexion and fist closure.
- Pre-processed EMG signals (bandpass 5–100 Hz, normalization, artifact rejection) feed an LSTM/CNN classifier trained on labelled gestures (Go, Stay, Return); a preprocessing sketch follows this list.
- Decoded commands trigger brief, patterned vibration cues (e.g., 2 Hz pulses) delivered through actuators embedded in the dog's harness.
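To make the pre-processing stage concrete, the sketch below shows a minimal Python version of the band-pass, normalization, artifact-rejection and windowing step. Only the 5–100 Hz band and the gesture classes come from the spec above; the sampling rate, filter order, window length and artifact threshold are illustrative assumptions, and the resulting windows would feed the LSTM/CNN classifier rather than the toy print statement.

# Minimal sketch of the EMG pre-processing stage described above.
# Assumptions (not specified in the project): 1 kHz sampling rate,
# 4th-order Butterworth band-pass, 200 ms analysis windows.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 1000           # assumed sampling rate (Hz)
LOW, HIGH = 5, 100  # band-pass corners from the spec above (Hz)
WIN = 200           # assumed window length in samples (200 ms)

def preprocess(raw_emg: np.ndarray) -> np.ndarray:
    """Band-pass filter, reject clipped artifacts, and amplitude-normalise."""
    b, a = butter(4, [LOW / (FS / 2), HIGH / (FS / 2)], btype="band")
    filtered = filtfilt(b, a, raw_emg)
    # Simple artifact rejection: zero out samples beyond 5 standard deviations.
    filtered[np.abs(filtered) > 5 * filtered.std()] = 0.0
    # Normalise to unit peak so the classifier sees a consistent range.
    return filtered / (np.abs(filtered).max() + 1e-8)

def segment(signal: np.ndarray) -> np.ndarray:
    """Split the stream into fixed windows for the LSTM/CNN gesture classifier."""
    n = len(signal) // WIN
    return signal[: n * WIN].reshape(n, WIN)

# Example: 2 s of synthetic noise stands in for a recorded gesture.
windows = segment(preprocess(np.random.randn(2 * FS)))
print(windows.shape)  # (10, 200) -> ten windows ready for classification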
The haptic channel is tuned to Meissner and Pacinian corpuscles, skin receptors sensitive to low- and high-frequency vibration, so cues are detectable without overwhelming the dog.
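As a sketch of how a decoded command could become one of these patterned cues, the snippet below maps each gesture class to a short train of 2 Hz pulses. Only the 2 Hz rate and the command set come from the design above; the pulse counts, the 50% duty cycle and the actuator interface are illustrative assumptions.

# Hypothetical mapping from decoded commands to harness vibration patterns.
import time

PULSE_HZ = 2.0   # low-frequency pulse rate from the spec above
PATTERNS = {     # command -> number of pulses (assumed encoding)
    "GO": 1,
    "STAY": 2,
    "RETURN": 3,
}

class PrintActuator:
    """Stand-in for the harness motor driver used during bench testing."""
    def on(self):  print("vibrate on")
    def off(self): print("vibrate off")

def send_haptic(command: str, actuator) -> None:
    """Drive the actuator with a short train of low-frequency pulses."""
    period = 1.0 / PULSE_HZ
    for _ in range(PATTERNS[command]):
        actuator.on()
        time.sleep(period / 2)   # 50% duty cycle within each pulse
        actuator.off()
        time.sleep(period / 2)

send_haptic("RETURN", PrintActuator())  # three pulses signal Return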

Uplink: Dog → Handler
The uplink carries telemetry from the dog’s environment and body back to the handler:
- A multisensor collar streams GPS position, body temperature, and other physiological indicators.
- A GeoVision pod on the harness integrates thermal imaging and head-tracked video to visualize voids and potential survivor locations.
- Data arrives on the handler's display (tablet or AR glasses) as an integrated situation view: temperature map, location drops, status alerts, and live video when needed; a sketch of one telemetry packet follows this list.
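The sketch below illustrates what one uplink packet might carry before it is rendered on the handler's display. Field names, units and the JSON encoding are assumptions for illustration, not the project's actual wire format; heart rate stands in for the "other physiological indicators" mentioned above.

# Illustrative schema for one uplink telemetry packet (assumed fields/units).
from dataclasses import dataclass, asdict
from typing import Optional
import json, time

@dataclass
class K9Telemetry:
    timestamp: float        # seconds since epoch
    lat: float              # GPS latitude (degrees)
    lon: float              # GPS longitude (degrees)
    body_temp_c: float      # core body temperature (°C)
    heart_rate_bpm: int     # assumed physiological indicator
    alert: Optional[str]    # e.g. an overheat-risk flag raised on the display

packet = K9Telemetry(time.time(), 34.0522, -118.2437, 39.2, 118, None)
print(json.dumps(asdict(packet)))  # what the handler-side display would consume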
The combined system forms a closed loop: muscle signals become haptics, dog responses and conditions become structured telemetry, and the handler sees a live, interpretable picture of what the dog is experiencing.
Design Goal
Symbiotic Command Network: an integrated system linking wrist EMG decoding, haptic feedback, multi-sensor collar telemetry and an AR interface for handlers.
FEATURE 1: Preserve the dog's autonomy and natural search behaviour.
FEATURE 2: Real-time insight into location, environment and physiological conditions.
FEATURE 3: Silent commands that cut through noise and rubble without adding cognitive load.

Final Design

Key Components
A SAR K9 Kit that treats the dog not just as a sensor but as a partner with measurable needs.
FEATURE 1: GEOVISION POD - VOID VISUALISATION AND LOCATION