TRACER is a fully custom differential-drive robot built from scratch — hardware, firmware, and software. It features real-time autonomous path following, sensor fusion–based pose estimation, and a live web dashboard for monitoring and control. The system spans four interconnected layers: a SvelteKit web frontend, a Python relay backend, a Raspberry Pi robot controller, and an ESP32 firmware core.
See `docs/` for detailed architecture and hardware information, and `docs/demos` for video and screenshot examples.
```
┌──────────────────────────────────────────────────────────┐
│  Web Dashboard                                           │
│  SvelteKit + TypeScript + Socket.IO + Konva.js           │
└────────────────────────────┬─────────────────────────────┘
                             │  Socket.IO (WebSocket)
┌────────────────────────────▼─────────────────────────────┐
│  Relay Backend (Laptop)                                  │
│  Flask + Socket.IO + PyGame (controller input)           │
└────────────────────────────┬─────────────────────────────┘
                             │  Socket.IO (WebSocket)
┌────────────────────────────▼─────────────────────────────┐
│  Robot Controller (Raspberry Pi)                         │
│  FastAPI + asyncio + state estimation + path following   │
└────────────────────────────┬─────────────────────────────┘
                             │  UART Serial (115200 baud)
┌────────────────────────────▼─────────────────────────────┐
│  Firmware (ESP32 / Arduino)                              │
│  FreeRTOS + PID motor control + sensor acquisition       │
└──────────────────────────────────────────────────────────┘
```
```
TRACER/
├── frontend/                  # SvelteKit web dashboard
│   └── src/
│       ├── lib/
│       │   ├── components/    # UI components
│       │   │   ├── PathDrawer.svelte       # Path editor + robot overlay
│       │   │   ├── RobotControls.svelte    # E-Stop + Manual mode buttons
│       │   │   ├── UltrasonicGraph.svelte  # Distance history chart
│       │   │   ├── ControlPad.svelte       # On-screen joystick
│       │   │   └── ...
│       │   ├── types/         # Zod schemas + Svelte rune classes
│       │   │   ├── QuinticHermiteSpline.svelte.ts
│       │   │   ├── SplinePath.svelte.ts
│       │   │   └── ...
│       │   └── api/
│       │       └── socket.ts  # Socket.IO client singleton
│       └── routes/
│           └── +page.svelte   # Main dashboard page
│
├── backend/                   # Relay server (runs on laptop/server)
│   ├── main.py                # Flask + Socket.IO relay + joystick input
│   ├── Controller.py          # Xbox controller handling via PyGame
│   └── GestureController.py   # ESP32 IMU gesture sensor integration
│
├── rpi/                       # Raspberry Pi robot controller
│   └── src/
│       ├── server.py          # FastAPI + asyncio Socket.IO server
│       └── models/
│           ├── Robot.py                 # Main robot class + state machine
│           ├── StateEstimator.py        # Odometry + velocity estimation
│           ├── HeadingFilter.py         # EKF heading filter (gyro + mag)
│           ├── PoseFilter.py            # EKF position filter (encoder dead-reckoning)
│           ├── QuinticHermiteSpline.py  # Spline math + arc length LUT + velocity profile
│           ├── Path.py                  # Multi-segment path + trajectory builder (subprocess)
│           ├── RAMSETE.py               # RAMSETE trajectory tracking controller
│           ├── PurePursuit.py           # Pure pursuit controller for freehand paths
│           ├── SerialManager.py         # UART serial communication with ESP32
│           ├── SensorData.py            # Pydantic sensor data model
│           ├── RobotState.py            # Pydantic robot state model (pose + velocity)
│           ├── Config.py                # All physical robot constants
│           └── Mode.py                  # State machine modes
│
├── arduino/                   # ESP32 firmware
│   └── main/
│       ├── main.ino                  # RTOS task creation + hardware init
│       ├── mainLoop.cpp              # 100 Hz loop: sensors, PID, serial TX
│       ├── serialTask.cpp            # Serial RX + command queue
│       ├── commandProcessorTask.cpp  # Motor + LCD command execution
│       ├── motors.cpp                # PWM motor driver (TB6612FNG)
│       ├── sensors.cpp               # IMU + magnetometer reading
│       ├── encoders.cpp              # Hardware PCNT encoder interface
│       ├── ultrasonicTask.cpp        # 20 Hz ultrasonic trigger/read
│       ├── PID.cpp / PID.h           # PID controller implementation
│       └── config.h                  # Pin assignments + tuning constants
│
└── docs/                      # Architecture and protocol documentation
```
| Event | Direction | Payload | Description |
|---|---|---|---|
| `joystick_input` | → | `{ left_y, right_x }` | Drive command from UI or controller |
| `set_state` | → | `{ state, path_type?, path? }` | Change robot mode; include `path` for `PATH_FOLLOWING` |
| `query` | → | `{ query }` | Natural language command string |
| `start_recording` | → | — | Begin recording joystick macro |
| `stop_recording` | → | — | Stop and save macro |
| `play_recording` | → | `{ timestamp }` | Replay a saved macro |
| `precision_mode` | → | — | Toggle precision drive mode |
| `stop` | → | — | Emergency stop |
| `sensor_data` | ← | sensor + state + mode | 10 Hz telemetry from robot |
| `path_complete` | ← | `{ status }` | Robot finished executing path |
| `active_command` | ← | command object | Current AI command being executed |
| `obstacle_detected` | ← | `{ distance }` | Ultrasonic obstacle alert |
| `cliff_detected` | ← | `{ ir_front, ir_back }` | IR cliff detection alert |
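The emitted payloads are plain JSON objects. As a sketch, client-side payload builders for the two most common events might look like this (the helper names and the clamping behavior are assumptions for illustration, not part of the protocol):

```python
# Hypothetical payload builders for the table's Socket.IO events.
# Field names come from the Payload column; clamping to [-1, 1] is assumed.
from typing import Optional, Union

def joystick_input(left_y: float, right_x: float) -> dict:
    """Drive command; axes assumed normalized to [-1, 1]."""
    clamp = lambda v: max(-1.0, min(1.0, float(v)))
    return {"left_y": clamp(left_y), "right_x": clamp(right_x)}

def set_state(state: str, path_type: Optional[str] = None,
              path: Optional[Union[list, dict]] = None) -> dict:
    """Mode change; path_type and path are only needed for PATH_FOLLOWING."""
    payload: dict = {"state": state}
    if path_type is not None:
        payload["path_type"] = path_type
    if path is not None:
        payload["path"] = path
    return payload
```

With a `python-socketio` client these would be sent as, e.g., `sio.emit("joystick_input", joystick_input(0.5, 0.0))`.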
set_state Payload — PATH_FOLLOWING

Freehand (Pure Pursuit):

```
{
  "state": "PATH_FOLLOWING",
  "path_type": "freehand",
  "path": [{ "x": 0.0, "y": 0.0 }, ...]
}
```

Spline (RAMSETE):

```
{
  "state": "PATH_FOLLOWING",
  "path_type": "spline",
  "path": {
    "splines": [{
      "start": [x0, y0],
      "end": [x1, y1],
      "start_velocity": [dx0, dy0],
      "end_velocity": [dx1, dy1],
      "start_acceleration": [ddx0, ddy0],
      "end_acceleration": [ddx1, ddy1]
    }]
  }
}
```
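The spline payload carries exactly the boundary conditions of a quintic Hermite segment: position, velocity, and acceleration at each end. A minimal evaluation sketch using the standard quintic Hermite basis (illustrative only, not taken from the repo's `QuinticHermiteSpline.py`):

```python
# Standard quintic Hermite basis functions; weights match the payload's
# start/end position, velocity, and acceleration fields.
def quintic_hermite(p0, v0, a0, p1, v1, a1, t):
    """Evaluate one 2-D quintic Hermite segment at parameter t in [0, 1]."""
    t2, t3, t4, t5 = t * t, t ** 3, t ** 4, t ** 5
    h = (
        1 - 10 * t3 + 15 * t4 - 6 * t5,                  # weight of p0
        t - 6 * t3 + 8 * t4 - 3 * t5,                    # weight of v0
        0.5 * t2 - 1.5 * t3 + 1.5 * t4 - 0.5 * t5,       # weight of a0
        10 * t3 - 15 * t4 + 6 * t5,                      # weight of p1
        -4 * t3 + 7 * t4 - 3 * t5,                       # weight of v1
        0.5 * t3 - t4 + 0.5 * t5,                        # weight of a1
    )
    pts = (p0, v0, a0, p1, v1, a1)
    return tuple(sum(w * pt[i] for w, pt in zip(h, pts)) for i in range(2))
```

By construction the curve interpolates the endpoints and matches the given first and second derivatives there, which is why the payload needs nothing beyond those six vectors per segment.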
Sensor packet (ESP32 → RPi), 37 bytes, 100 Hz:

```
<       little-endian
B       start byte (0xAA)
B       packet sequence number
f       ultrasonic distance (cm, -1 = too far, -2 = too close)
hhhhhh  IMU: ax, ay, az, gx, gy, gz (raw int16)
f       temperature (°C)
fff     magnetometer x, y, z (µT)
ii      left/right encoder ticks (int32, delta per loop)
B       flags: bit0 = IR front, bit1 = IR back, bit2 = new mag data
B       battery percentage
I       timestamp (µs, uint32)
B       checksum (sum of all bytes mod 256)
```
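Under that layout the packet can be parsed with Python's `struct` module. The format string below is inferred from the field list (packed, little-endian) and should be verified against the firmware — `struct.calcsize(SENSOR_FMT)` gives the exact on-wire size for whatever layout the firmware actually emits:

```python
import struct

# Format inferred from the field list above; verify against the firmware.
SENSOR_FMT = "<BBf6hf3f2iBBIB"

def parse_sensor_packet(raw: bytes) -> dict:
    """Unpack one ESP32 sensor packet and verify start byte + checksum."""
    fields = struct.unpack(SENSOR_FMT, raw)
    if fields[0] != 0xAA:
        raise ValueError("bad start byte")
    # checksum convention assumed: sum of all preceding bytes mod 256
    if sum(raw[:-1]) % 256 != raw[-1]:
        raise ValueError("checksum mismatch")
    (start, seq, ultrasonic, ax, ay, az, gx, gy, gz,
     temp, mx, my, mz, enc_l, enc_r, flags, battery, ts, _ck) = fields
    return {
        "seq": seq, "ultrasonic_cm": ultrasonic,
        "accel": (ax, ay, az), "gyro": (gx, gy, gz),
        "temp_c": temp, "mag": (mx, my, mz),
        "encoders": (enc_l, enc_r),
        "ir_front": bool(flags & 1), "ir_back": bool(flags & 2),
        "battery_pct": battery, "timestamp_us": ts,
    }
```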
Command packet (RPi → ESP32):
| Component | Part |
|---|---|
| Microcontroller | ESP32 (Arduino framework) |
| Motor driver | TB6612FNG (dual H-bridge) |
| Motors | JGB37-520 gear motors with quadrature encoders (56:1, 178 RPM max) |
| IMU | MPU6050 (6-axis accel/gyro, I2C) |
| Magnetometer | QMC5883L (3-axis, I2C) |
| Distance sensor | HC-SR04 ultrasonic (dual, interrupt-driven) |
| Display | SSD1306 128×64 OLED (I2C) |
| Cliff detection | IR sensors (front + rear) |
| Battery monitor | Voltage divider on ADC pin |
| Main computer | Raspberry Pi (runs path following + state estimation) |
| E-Stop | Hardware interrupt pin (falling edge ISR) |
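The Raspberry Pi's state estimation dead-reckons from those encoder tick deltas. A minimal sketch of the standard differential-drive odometry update — the geometry constants here are placeholders, the real values live in `rpi/src/models/Config.py`:

```python
import math

# Placeholder geometry (assumed values, NOT the real robot constants).
TICKS_PER_REV = 56 * 11      # assumed: 56:1 gearbox x 11 counts per motor rev
WHEEL_RADIUS = 0.034         # meters (assumed)
WHEEL_BASE = 0.15            # meters, track width (assumed)
M_PER_TICK = 2 * math.pi * WHEEL_RADIUS / TICKS_PER_REV

def odometry_step(x, y, theta, d_ticks_l, d_ticks_r):
    """Advance the pose by one encoder delta (standard diff-drive model)."""
    d_l = d_ticks_l * M_PER_TICK
    d_r = d_ticks_r * M_PER_TICK
    d_center = (d_l + d_r) / 2
    d_theta = (d_r - d_l) / WHEEL_BASE
    # evaluating at the midpoint heading keeps the update more accurate
    x += d_center * math.cos(theta + d_theta / 2)
    y += d_center * math.sin(theta + d_theta / 2)
    return x, y, theta + d_theta
```

The `WHEEL_BASE_CORRECTION` and per-wheel correction constants in the calibration section below exist precisely because the nominal values of these constants never match the physical robot exactly.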
Upload `arduino/main/main.ino` to the ESP32 via the Arduino IDE.

Raspberry Pi controller:

```sh
cd rpi
python -m venv .venv && source .venv/bin/activate
pip install -r requirements.txt
python -m main
```

Relay backend:

```sh
cd backend
python -m venv .venv && source .venv/bin/activate
pip install -r requirements.txt
python main.py
```

Connect an Xbox controller before starting if physical joystick input is needed.

Frontend:

```sh
cd frontend
npm install
npm run dev
```

Open http://localhost:5173 in a browser.
Several constants in rpi/src/models/Config.py and arduino/main/config.h require physical calibration:
| Constant | Tool | Description |
|---|---|---|
| `WHEEL_BASE_CORRECTION` | `rpi/src/calibrate_wheelbase.py` | Correct odometry heading for actual track width |
| `LEFT/RIGHT_CORRECTION` | `rpi/src/calibrate_wheel.py` | Correct per-wheel distance from encoder ticks |
| `kS_LEFT/RIGHT` | `rpi/src/calibrate_kS.py` | Static friction feedforward for each motor |
| Magnetometer offsets | `backend/mag_calibration.py` | Hard/soft iron calibration via ellipsoid fitting |
| PID gains | `arduino/main/config.h` | `P_LEFT`, `I_LEFT`, `D_LEFT` (and right equivalents) |
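The `kS` constants compensate the static-friction dead zone: the motor command combines a feedforward term that just overcomes friction with a PID correction on velocity error. A sketch of that combination — the gains and the simple PID class are illustrative, not copied from `config.h`:

```python
# Illustrative per-motor control law: kS feedforward + PID velocity trim.
class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, error, dt):
        self.integral += error * dt
        deriv = (error - self.prev_err) / dt
        self.prev_err = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv

def motor_output(pid, ks, target_vel, measured_vel, dt):
    """Feedforward overcomes static friction; PID trims the remainder."""
    err = target_vel - measured_vel
    sign = (target_vel > 0) - (target_vel < 0)
    return ks * sign + pid.update(err, dt)
```

Calibrating `kS` separately per side matters because the two drivetrains rarely have identical friction, which otherwise shows up as a steady heading drift.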
Technology: SvelteKit, TypeScript, Socket.IO, TailwindCSS

Key Components:

- Main dashboard page (`src/routes/+page.svelte`)
- Joystick status (`src/lib/components/JoystickStatus.svelte`)
- Command list (`src/lib/components/CommandList.svelte`)
- Recordings (`src/lib/components/Recordings.svelte`)
- Obstruction status (`src/lib/components/ObstructionStatus.svelte`)
- Battery percentage (`src/lib/components/BatteryPercentage.svelte`)
- Status (`src/lib/components/Status.svelte`)
- Logs (`src/lib/components/Logs.svelte`)
Technology: Python, Socket.IO, Flask, PyGame, WebSockets

Components:

Controller Backend (Laptop/Server):

- `backend/main.py` - Flask server with Socket.IO
- `backend/Controller.py` - Joystick handling and state management
- `backend/ControllerState.py` - Control mode enumeration

Robot Controller (Raspberry Pi):

- `rpi/src/server.py` - Main WebSocket server
- `rpi/src/models/Robot.py` - Core robot functionality
- `rpi/src/models/SerialManager.py` - Serial communication
- `rpi/src/models/SensorData.py` - Sensor data processing
- `rpi/src/ai/get_commands.py` - Natural language command integration
Technology: C++, Arduino Framework

Key Components:

- Main sketch (`arduino/main/main.ino`)

Hardware Components: see the parts table above.
User Input (Web UI/Controller) → Backend Socket → Raspberry Pi → Serial → Arduino → Sensors

Telemetry Display (Web UI) ← Backend Socket ← Raspberry Pi ← Serial Data ← Arduino
Frontend ↔ Backend:

- `joystick_input`: Send joystick control values to backend
- `joystick_mode`: Control mode changes (TWO_ARCADE, ONE_ARCADE, TANK, CAR)
- `precision_mode`: Toggle for precise movement control
- `start_recording`: Begin recording joystick movements
- `stop_recording`: End recording and save macro
- `play_recording`: Play back a saved joystick macro
- `sensor_data`: Streaming sensor updates from robot
- `active_command`: Current command being executed

Backend ↔ Raspberry Pi:

- `query`: Send natural language commands for AI processing
- `joystick_input`: Forward controller commands
- `sensor_data`: Telemetry updates
- `rumble`: Trigger controller haptic feedback
- `stop`: Emergency stop command
Running the System:

1. Run `npm install` in the `frontend` directory
2. Run `pip install -r requirements.txt` in the `backend` directory
3. Run `pip install -r requirements.txt` in the `rpi` directory
4. Upload `arduino/main/main.ino` to the Arduino board using the Arduino IDE
5. Navigate to the `rpi` directory and run `python -m main`
6. Connect a controller (use `jstest` or similar tools to verify)
7. Navigate to the `backend` directory and run `python main.py`
8. Navigate to the `frontend` directory and run `npm run dev`
9. Open `http://localhost:5173`