Instrumented Crutches

This system collects and analyzes biomechanical data from instrumented crutches and integrates eye-tracking technology for simultaneous gait and visual-behavior analysis.

Latest Version: built on MADS v2.0.0 or later. The system targets the Raspberry Pi Zero 2 W or a similar platform.
Documentation Notice: This documentation was generated with GitHub Copilot and is currently under review and correction.

Key Features

📊 Multi-Sensor Data Acquisition

Real-time collection from load cells in the tip and handle of both crutches.
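Raw load-cell readings typically arrive as ADC counts and must be converted to force with a per-sensor calibration. A minimal sketch, assuming a linear calibration; the gain and offset values below are hypothetical placeholders, not the system's actual calibration:

```python
def counts_to_newtons(raw_counts: int, gain: float, offset: float) -> float:
    """Convert a raw ADC reading to force using a linear calibration.

    gain   -- Newtons per ADC count (from a per-sensor calibration)
    offset -- ADC reading at zero load (tare value)
    """
    return (raw_counts - offset) * gain

# Hypothetical calibration for one tip load cell.
TIP_GAIN = 0.0045    # N per count (placeholder)
TIP_OFFSET = 8_231   # counts at zero load (placeholder)

force_n = counts_to_newtons(10_231, TIP_GAIN, TIP_OFFSET)
print(f"{force_n:.1f} N")  # (10231 - 8231) * 0.0045 = 9.0 N
```

In practice the gain and offset come from a calibration procedure with known weights and are stored per sensor.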

👁️ Eye Tracking Integration

Synchronized eye gaze tracking using Pupil Neon glasses for comprehensive behavioral analysis.

♥️ Physiological Monitoring

PPG (pulse oximetry) and heart rate monitoring via MAX30100 sensors.

💾 HDF5 Data Storage

Efficient hierarchical storage format with automatic indexing and metadata.
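As an illustration of the hierarchical layout, here is a minimal h5py sketch that stores a batch of load-cell samples together with metadata attributes. The group and dataset names and the attribute keys are hypothetical, not the system's actual schema:

```python
import h5py
import numpy as np

# Hypothetical schema: one group per crutch, one dataset per sensor.
with h5py.File("session.h5", "w") as f:
    grp = f.create_group("right_crutch/tip_load_cell")
    grp.attrs["units"] = "N"
    grp.attrs["sample_rate_hz"] = 100
    # Append-friendly dataset: unlimited first dimension, chunked storage.
    ds = grp.create_dataset(
        "force", shape=(0,), maxshape=(None,), chunks=(1024,), dtype="f4"
    )
    samples = np.array([12.5, 13.1, 12.9], dtype="f4")
    ds.resize(ds.shape[0] + len(samples), axis=0)
    ds[-len(samples):] = samples

with h5py.File("session.h5", "r") as f:
    ds = f["right_crutch/tip_load_cell/force"]
    print(len(ds), ds.parent.attrs["units"])
```

Chunked, resizable datasets allow samples to be appended during a session without rewriting the file.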

System Overview

The system is built on a master/slave architecture. The right crutch acts as master: it hosts the web_server, coordinator, data storage, and eye-tracking control, while also collecting sensor data. The left crutch acts as slave and is dedicated to sensor data acquisition.

All components communicate via the MADS message broker using a publish/subscribe pattern.
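Each agent publishes sensor readings under a topic, and subscribers filter by topic. A hedged sketch of this message flow using only the standard library; the topic string and payload field names below are illustrative, not the actual MADS wire format:

```python
import json
import time

def make_message(topic: str, payload: dict) -> tuple[str, bytes]:
    """Serialize a sensor reading the way a publisher agent might:
    a topic string for subscriber-side filtering plus a JSON body."""
    body = {"timestamp": time.time(), **payload}
    return topic, json.dumps(body).encode()

def handle_message(topic: str, raw: bytes) -> dict:
    """Subscriber side: decode the JSON body and attach the topic
    so a dispatcher can route the message."""
    msg = json.loads(raw)
    msg["topic"] = topic
    return msg

# Hypothetical topic and field names.
topic, raw = make_message("right_crutch.tip", {"force_n": 12.5})
msg = handle_message(topic, raw)
print(msg["topic"], msg["force_n"])  # right_crutch.tip 12.5
```

The decoupling matters here: the slave crutch can keep publishing even if the storage agent on the master restarts, because publishers never address subscribers directly.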

Documentation Sections


System Requirements

| Component | Specification |
| --- | --- |
| Platform | Raspberry Pi Zero 2 W |
| OS | Raspberry Pi OS |
| MADS | v2.0.0 or above |
| Python | 3.9+ |
| Storage | microSD 32 GB+ with fast write speed (UHS-II recommended) |

Supported Sensors

- Load cells in the tip and handle of each crutch
- MAX30100 PPG / heart-rate sensors
- Pupil Neon eye-tracking glasses

Contributing & Support

For issues, questions, or contributions, please refer to the GitHub repository.

Research Use Only: This system has been developed for research purposes. Ensure proper ethics approval and informed consent before use with human subjects.