
Mind Controlled Wheelchair

timeline: 2026

tech stack: Python, OpenBCI, EEG, Arduino, CAD, 3D Printing

Overview

A universal brain-controlled wheelchair system that retrofits onto any existing wheelchair with no permanent modifications. Users control it through thought alone: motor-imagery EEG signals are classified in real time and translated into physical motion through custom 3D-printed motor mounts, wheel hubs, and herringbone gears.

Why

Millions of people rely on wheelchairs, but most control systems still require hand, breath, or voice input. For people with paralysis or limited motor control, true independence is out of reach. Traditional joysticks need hand dexterity. Sip-and-puff systems demand consistent breath. Voice control breaks down in noisy rooms. The mind, though, stays sharp. So the question was simple: what if thinking were enough to move?

How It Works

A 16-electrode EEG headset captures motor-imagery brain signals. When the user imagines a movement (pushing forward, turning, stopping), a signal-processing pipeline on a Raspberry Pi classifies the pattern in real time. The resulting command goes to an Arduino, which drives the wheels through IBT-2 motor drivers and the custom 3D-printed motor mounts, wheel hubs, and herringbone gears. AI-powered object detection runs in parallel for collision avoidance and automatic safety overrides. Total latency: under 100 ms.
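
A minimal sketch of what the classification step could look like, assuming a band-power feature pipeline with a classifier trained offline on calibration data. The sampling rate, window length, band choices, and `classify` helper here are illustrative assumptions, not the project's actual implementation:

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Illustrative parameters; the real pipeline may differ.
FS = 250          # sampling rate (Hz), typical for a 16-channel OpenBCI setup
N_CHANNELS = 16   # electrode count
WINDOW_S = 1.0    # classification window length (s)

def band_power(window, low, high, fs=FS):
    """Mean log band power per channel for one EEG window.

    window: (n_channels, n_samples) array of raw EEG.
    """
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, window, axis=1)
    return np.log(np.mean(filtered ** 2, axis=1))

def extract_features(window):
    # Motor imagery modulates the mu (8-12 Hz) and beta (13-30 Hz)
    # rhythms over sensorimotor cortex, so band powers are natural features.
    mu = band_power(window, 8.0, 12.0)
    beta = band_power(window, 13.0, 30.0)
    return np.concatenate([mu, beta])  # 32-dim feature vector

def classify(window, classifier):
    # `classifier` is a model fit offline on the user's calibration data,
    # e.g. scikit-learn's LinearDiscriminantAnalysis.
    label = classifier.predict(extract_features(window)[None, :])[0]
    return label  # e.g. "forward", "left", "right", "stop"
```

Short windows keep decisions responsive; a one-second window at 250 Hz leaves most of the 100 ms budget for filtering, inference, and the serial hop to the Arduino.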

Architecture

EEG headset (16 electrodes)
→ Raspberry Pi (signal classifier)
→ Arduino (motor commands)
→ IBT-2 motor drivers
→ 3D-printed gears + wheel hubs
→ BDC (brushed DC) wheelchair motors
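
A sketch of how the Pi side of this chain might forward classified commands to the Arduino over serial while letting the object-detection thread force a stop. The single-character protocol, port name, and baud rate are assumptions for illustration, not the project's actual interface:

```python
import threading
import serial  # pyserial

# Hypothetical single-byte command protocol for the Arduino.
COMMANDS = {"forward": b"F", "left": b"L", "right": b"R", "stop": b"S"}

# Set by the object-detection thread when an obstacle is too close.
obstacle_detected = threading.Event()

# Port and baud rate are illustrative.
ser = serial.Serial("/dev/ttyACM0", 115200, timeout=0.05)

def send_command(label: str) -> None:
    """Forward a classifier decision to the Arduino, with safety override.

    The detection thread wins: an obstacle forces a stop regardless of
    what the user is imagining. Unknown labels also fail safe to stop.
    """
    if obstacle_detected.is_set():
        label = "stop"
    ser.write(COMMANDS.get(label, COMMANDS["stop"]))
```

On the Arduino side, each command byte would map to a pair of PWM duty cycles on the IBT-2 drivers; keeping commands to single bytes keeps serial overhead negligible within the latency budget.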

Results

Awarded Best Technical Difficulty at Nex Hacks at CMU, with a $1,000 cash prize. The system reliably translates motor imagery into directional control with sub-100 ms latency and works on any standard wheelchair without permanent modifications.