Instrumented Crutches
This system collects and analyzes biomechanical data from instrumented crutches, integrated with eye-tracking technology for simultaneous gait and visual-behavior analysis.
Key Features
📊 Multi-Sensor Data Acquisition
Real-time collection from load cells in the tip and handle of both crutches.
👁️ Eye Tracking Integration
Synchronized eye gaze tracking using Pupil Neon glasses for comprehensive behavioral analysis.
♥️ Physiological Monitoring
PPG (photoplethysmography) and heart-rate monitoring via MAX30100 pulse-oximeter sensors.
💾 HDF5 Data Storage
Efficient hierarchical storage format with automatic indexing and metadata.
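A hierarchical layout of this kind can be sketched with `h5py`; the group and dataset names below (`right_crutch/tip_load`) and the attribute names are illustrative assumptions, not the system's actual schema:

```python
import os
import tempfile

import h5py
import numpy as np

# Hypothetical session file path (the real storage location is not shown here).
path = os.path.join(tempfile.mkdtemp(), "session.h5")

with h5py.File(path, "w") as f:
    f.attrs["subject_id"] = "S01"        # session-level metadata (assumed name)
    grp = f.create_group("right_crutch")
    grp.attrs["sample_rate_hz"] = 80     # per-sensor metadata (assumed name)
    # Append-friendly dataset: unlimited first axis, chunked for fast writes.
    ds = grp.create_dataset(
        "tip_load", shape=(0,), maxshape=(None,),
        dtype="f4", chunks=(1024,),
    )
    samples = np.array([10.2, 11.0, 10.8], dtype="f4")
    ds.resize((len(samples),))
    ds[:] = samples

# Reading back by hierarchical path:
with h5py.File(path, "r") as f:
    loads = f["right_crutch/tip_load"][:]
```

Chunked, resizable datasets let each sensor stream be appended incrementally during a session while keeping metadata attached to the group it describes.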
System Overview
The system is built on a master/slave architecture. The right crutch acts as master: it hosts the web_server, coordinator, data storage, and eye-tracking control while also collecting sensor data. The left crutch acts as slave and focuses on sensor data acquisition.
All components communicate via the MADS message broker using a publish/subscribe pattern.
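The publish/subscribe flow can be illustrated with a minimal in-process broker. This is a generic sketch of the pattern only, not the MADS API; the topic name `loadcell_right` is a hypothetical example:

```python
from collections import defaultdict


class MiniBroker:
    """Minimal in-process publish/subscribe broker.

    Illustrates the pattern only; the real system uses the MADS
    message broker, whose API differs.
    """

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        # Register a callback invoked for every message on `topic`.
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        # Deliver `message` to all callbacks registered for `topic`.
        for callback in self._subscribers[topic]:
            callback(message)


broker = MiniBroker()
received = []
# Hypothetical topic for the right crutch's tip load cell.
broker.subscribe("loadcell_right", received.append)
broker.publish("loadcell_right", {"t": 0.01, "force_n": 112.5})
```

Decoupling producers (sensor agents) from consumers (storage, web server) this way is what lets the master and slave crutches exchange data without direct references to each other.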
Documentation Sections
📚 Available Documentation
- System Architecture - Technical design and agent structure
- Installation Guide - Setup instructions for master and slave
- Usage Manual - Day-to-day operating instructions
System Requirements
| Component | Specification |
|---|---|
| Platform | Raspberry Pi Zero 2 W |
| OS | Raspberry Pi OS |
| MADS | v2.0.0 or above |
| Python | 3.9+ |
| Storage | microSD 32GB+ with fast write speed (UHS-II recommended) |
Supported Sensors
- Load Cells: FX292X-100A-XXX-L button load cells read via HX711 strain-gauge amplifiers at the tip, and a high-precision AD HAT for the handle load cell
- PPG/Heart Rate: MAX30100 pulse oximeter
- Eye Tracking: Pupil Neon (via Wi-Fi)
- Power: UPS HAT for Raspberry Pi
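The HX711 clocks out 24-bit two's-complement samples, so each raw reading must be sign-extended before calibration is applied. A minimal sketch (the `offset` and `scale` constants are hypothetical placeholders for per-cell calibration values):

```python
def hx711_to_newtons(raw_24bit, offset=0, scale=1.0):
    """Convert a raw 24-bit HX711 sample to a force value.

    raw_24bit: unsigned 24-bit integer as clocked out of the chip.
    offset:    zero-load (tare) reading, from calibration.
    scale:     counts per newton, from calibration.
    Both calibration constants are hypothetical placeholders.
    """
    # Sign-extend: the HX711 output is 24-bit two's complement.
    if raw_24bit & 0x800000:
        raw_24bit -= 0x1000000
    return (raw_24bit - offset) / scale


# 0xFFFFFF is -1 in 24-bit two's complement.
force = hx711_to_newtons(0xFFFFFF)  # → -1.0
```

In practice `offset` and `scale` would be determined per load cell by taring at zero load and applying a known reference load.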
Contributing & Support
For issues, questions, or contributions, please refer to the GitHub repository.