Device Sensors Hub

Access device camera and learn about other sensor integrations.

Device Camera Access
Enable your device camera. The video feed can be used to capture visual data for various assessments.

Click "Enable Camera" to start the video feed.

Microsoft Kinect v2
Full-body skeletal tracking and posture modeling.

Assessments Enabled:

  • Gait symmetry and step cycle
  • Sit-to-stand performance
  • Joint range of motion (ROM) for hips, knees, shoulders
  • Postural sway and center-of-mass stability
  • Ergonomic RULA scoring (auto-calculated posture strain)
  • Facial expression recognition (AI-based mood assessment, opt-in)
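Joint ROM from skeletal tracking reduces to an angle computation over tracked joint positions. A minimal sketch, assuming Kinect-style 3D joint coordinates; the joint names and values here are illustrative, not real sensor output:

```python
import math

def joint_angle(a, b, c):
    """Angle at joint b (degrees) formed by 3D points a-b-c,
    e.g. hip-knee-ankle for knee flexion/extension ROM."""
    ba = [a[i] - b[i] for i in range(3)]
    bc = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(ba, bc))
    norm = math.dist(a, b) * math.dist(c, b)
    # Clamp to [-1, 1] to guard against floating-point drift before acos.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

# Hypothetical joint positions (metres) for a fully extended leg:
hip, knee, ankle = (0.0, 1.0, 0.0), (0.0, 0.5, 0.0), (0.0, 0.0, 0.0)
print(round(joint_angle(hip, knee, ankle)))  # prints 180
```

The same three-point computation applies to hips and shoulders by swapping in the corresponding tracked joints.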

Infrared Sensors (Kinect IR + Smartphone IR)
Passive thermal scanning for physiological insights.

Use: Passive thermal scanning.

Assessments Enabled:

  • Localized inflammation detection (e.g., joint heat mapping)
  • Skin temperature differentials for wound surveillance
  • Muscle contraction zones (monitor active vs. dormant zones via thermal load mapping; conceptual)
  • Peripheral cold sensitivity (useful for diabetic neuropathy or vascular disorders; conceptual)
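Localized inflammation detection from thermal data often compares a site against its contralateral counterpart. A minimal sketch of that comparison; the 1.0 °C threshold and the temperatures are illustrative assumptions, not clinical cutoffs:

```python
def flag_inflammation(left_c, right_c, threshold_c=1.0):
    """Flag possible localized inflammation when the skin-temperature
    difference between contralateral sites exceeds a threshold.
    The 1.0 deg C default is an illustrative assumption, not a clinical cutoff."""
    delta = abs(left_c - right_c)
    return {"delta_c": round(delta, 2), "flagged": delta > threshold_c}

# Hypothetical left/right knee surface temperatures from an IR frame:
print(flag_inflammation(33.8, 35.1))  # prints {'delta_c': 1.3, 'flagged': True}
```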

User Instructions in App
Guidance for performing assessments.
  • Each test includes an animated demonstration with audio guidance.
  • Explanations provide the clinical context: e.g., “This task helps detect early balance impairment that may lead to falls.”
  • Users receive real-time feedback, corrective tips, and AI-generated summaries after each test.

Together, this hardware ecosystem allows ABI Promedical to function as a comprehensive digital rehabilitation lab, capable of autonomously performing multimodal assessments with clinical-level accuracy, guided interpretation, and actionable output.

Smartphone Sensors
Gyroscope, Accelerometer, Camera, Microphone for mobile-based analysis.

Use: Mobile-based motion, audio, and image analysis.

Assessments Enabled:

  • Hand tremor quantification (neurological screening)
  • Arm swing balance (e.g., Parkinson’s indicators)
  • Voice-based respiratory rate detection
  • Camera-based upper limb ROM estimation
  • Sit-to-stand and balance tasks (real-time motion tracking of trunk movement)
  • Gait and step cadence (gyro/accelerometer analysis of foot impact and stride timing)
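Step cadence from accelerometer data can be approximated by counting threshold crossings in the acceleration magnitude. A simplified sketch; the 1.2 g threshold and the synthetic trace are illustrative, not tuned values:

```python
def step_cadence(accel_mag, sample_rate_hz, threshold=1.2):
    """Estimate steps per minute from an accelerometer magnitude trace (g).
    Counts upward threshold crossings as heel strikes; the 1.2 g default
    is an illustrative assumption, not a tuned clinical value."""
    steps = sum(
        1 for prev, cur in zip(accel_mag, accel_mag[1:])
        if prev < threshold <= cur
    )
    duration_min = len(accel_mag) / sample_rate_hz / 60.0
    return steps / duration_min if duration_min else 0.0

# Synthetic 6-second trace at 10 Hz with six simulated heel strikes:
trace = [1.5 if i % 10 == 5 else 1.0 for i in range(60)]
print(round(step_cadence(trace, sample_rate_hz=10)))  # prints 60
```

A production version would filter the signal and adapt the threshold per user; this sketch only shows the crossing-count idea.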

Eye Tracking (Front Camera-based + SDKs)
Visual engagement and oculomotor control testing.

Assessments Enabled:

  • Fixation and pursuit tracking (cognitive fatigue index)
  • Reaction time with visual target tests
  • Saccades, smooth pursuit, fixation holding (detects cranial nerve function and cognitive fatigue; conceptual)
  • Visual balance control (paired with posture tasks; conceptual)

Wearable Sensors (Optional Integration)
Support for smartwatches, fitness bands, and EMG armbands.

ABI modules support connection to smartwatches, fitness bands, and EMG armbands.

Used for: Continuous HRV, sleep tracking, limb muscle activation, and step cadence.
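Continuous HRV from a wearable is commonly summarised with RMSSD, the root mean square of successive differences between RR intervals. A minimal sketch over RR intervals in milliseconds; the sample values are illustrative:

```python
import math

def rmssd(rr_ms):
    """RMSSD, a standard time-domain HRV metric: the root mean square of
    successive differences between consecutive RR intervals (milliseconds)."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Illustrative RR intervals from a hypothetical wearable stream:
print(round(rmssd([800, 810, 790, 805]), 1))  # prints 15.5
```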

Advanced Control & Interaction
Enable alternative interaction methods (conceptual; requires further integration).

Enable voice commands for app navigation and control. This would typically require microphone access.
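Voice-command navigation can be sketched as a mapping from recognised transcripts to app actions. The command phrases and action identifiers below are hypothetical; a real implementation would obtain the transcript from a platform speech-to-text API, with explicit microphone consent:

```python
# Hypothetical command phrases and action identifiers, for illustration only.
COMMANDS = {
    "open dashboard": "navigate:dashboard",
    "start test": "action:start_assessment",
    "stop": "action:stop",
}

def match_command(transcript):
    """Map a recognised speech transcript to an app action, or None.
    The transcript would come from a platform speech-to-text API."""
    text = transcript.lower().strip()
    for phrase, action in COMMANDS.items():
        if phrase in text:
            return action
    return None

print(match_command("Please open dashboard"))  # prints navigate:dashboard
```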

Enable eye-tracking for attention monitoring or gaze-based controls. This relies on the camera feed.

These features are illustrative. Full implementation requires specific APIs and/or backend processing. User consent for camera/microphone access is paramount.

Manual Data Log & AI Analysis
Log observations or notes related to sensor data or symptoms for backend AI processing.