Building RunCompanion: An AI-Powered Running Coach

Training for a half-marathon without feedback is like coding without a debugger. You know something is wrong, but you have no idea where. That frustration is what pushed me to build RunCompanion — a cross-platform running coach that lives in your pocket and talks to a robot on the treadmill.

The Problem

Every training app I tried had the same flaw: static plans. Run 5 km on Tuesday, 8 km on Thursday, rest on Sunday. No adaptation, no feedback loop, no context about how yesterday's session actually went. I wanted something that adjusted in real time.

The Stack

RunCompanion is a Flutter app targeting iOS, Android, Windows, and Web from a single codebase. Firebase Firestore handles real-time sync so your plan is available on every device simultaneously. The AI layer lives in a Cloud Function that receives session metrics and returns an updated training recommendation.

  • Flutter 3 — single codebase for all platforms
  • Firebase Firestore — real-time, offline-first sync
  • Cloud Functions — AI session analysis endpoint
  • BLE (Bluetooth Low Energy) — connects to the robot treadmill firmware
  • Riverpod — state management with watch-based reactivity
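The "watch-based reactivity" that Riverpod provides can be sketched in a few lines of plain Python: a widget watches a piece of state and is re-run whenever that state changes. This is an illustrative model of the pattern, not Riverpod's actual API (class and method names here are invented).

```python
# Minimal sketch of watch-based reactivity: listeners "watch" a value
# and are called again every time it changes.
class Notifier:
    def __init__(self, value):
        self._value = value
        self._listeners = []

    def watch(self, listener):
        self._listeners.append(listener)
        listener(self._value)          # fire immediately with current state

    def set(self, value):
        self._value = value
        for listener in self._listeners:
            listener(value)            # notify every watcher on change

seen = []
pace = Notifier("5:30/km")
pace.watch(seen.append)                # UI subscribes to the pace state
pace.set("5:10/km")                    # treadmill speeds up; watcher re-runs
print(seen)  # → ['5:30/km', '5:10/km']
```

In the app this is what keeps every screen in step with Firestore: the UI never polls, it just watches.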

The BLE Integration

The most interesting engineering challenge was bridging the Flutter app to custom robot firmware over Bluetooth. The firmware runs on a microcontroller attached to the treadmill and exposes a GATT service with characteristics for speed, incline, cadence, and heart-rate zone. A Flutter BLE plugin handles discovery and subscription — the app renders live metrics the moment a device connects.
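A GATT characteristic delivers raw bytes, so the app has to decode each notification into metrics. The post doesn't publish the firmware's actual packet layout, so the one below is a hypothetical example — little-endian uint16 speed in 0.01 km/h, int16 incline in 0.1 %, uint8 cadence, uint8 heart-rate zone — just to show the shape of the decoding step:

```python
import struct

# Hypothetical characteristic layout (not the real firmware schema):
# <HhBB = little-endian uint16 speed, int16 incline, uint8 cadence, uint8 HR zone
def decode_metrics(payload: bytes) -> dict:
    speed_raw, incline_raw, cadence, hr_zone = struct.unpack("<HhBB", payload)
    return {
        "speed_kmh": speed_raw / 100,   # firmware sends 0.01 km/h units
        "incline_pct": incline_raw / 10,  # signed, 0.1 % units (decline allowed)
        "cadence_spm": cadence,
        "hr_zone": hr_zone,
    }

packet = struct.pack("<HhBB", 1050, 15, 172, 3)   # simulated notification
print(decode_metrics(packet))
# → {'speed_kmh': 10.5, 'incline_pct': 1.5, 'cadence_spm': 172, 'hr_zone': 3}
```

Keeping the decode in one place is what makes the BLE adapter swappable later.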

Session Planning with AI

After each workout the app sends a summary to the Cloud Function: distance, average pace, heart-rate zones, perceived effort, and how closely the session matched the plan. The function folds those metrics into an LLM prompt and parses the model's reply into structured JSON the app can apply directly — e.g. extend Tuesday's long run by 1.5 km, or reduce intensity after back-to-back hard days.
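Because the function returns structured JSON rather than free text, applying a recommendation on the client is mechanical. The field names below are assumptions for illustration, not the actual RunCompanion schema:

```python
import json

# Illustrative plan and response shapes (assumed, not the real schema).
plan = {"tuesday_long_run": 10.0, "thursday_intervals": 6.0}  # distances in km

# A response like the Cloud Function might return after a strong easy run:
response = json.loads(
    '{"session": "tuesday_long_run", "delta_km": 1.5, '
    '"reason": "strong aerobic session at low effort"}'
)

# Apply the adjustment directly to the stored plan.
plan[response["session"]] += response["delta_km"]
print(plan["tuesday_long_run"])  # → 11.5
```

The win of this contract is that the app never has to interpret prose: if the JSON fails to parse or names an unknown session, the plan is simply left unchanged.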

What I Learned

Building a mobile + robotics hybrid taught me to think in layers. The presentation layer has no business logic; the BLE adapter is fully isolated from the Firestore adapter; the AI function knows nothing about the UI. Each piece can be replaced independently. That modularity is now a design principle across all MLEbotics projects.
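The layering above amounts to ports-and-adapters: each transport hides behind a small interface, so the planning logic never touches BLE or Firestore directly. A minimal sketch, with invented class and method names:

```python
from abc import ABC, abstractmethod

# Each adapter implements a narrow interface; the core logic depends
# only on the interface, never on the transport behind it.
class MetricsSource(ABC):
    @abstractmethod
    def latest(self) -> dict: ...

class PlanStore(ABC):
    @abstractmethod
    def save(self, plan: dict) -> None: ...

class FakeTreadmill(MetricsSource):      # stand-in for the BLE adapter
    def latest(self) -> dict:
        return {"speed_kmh": 10.5, "cadence_spm": 172}

class MemoryStore(PlanStore):            # stand-in for the Firestore adapter
    def __init__(self):
        self.saved = None
    def save(self, plan: dict) -> None:
        self.saved = plan

def record_session(source: MetricsSource, store: PlanStore) -> None:
    # Core logic: knows nothing about Bluetooth or Firestore.
    store.save({"last_metrics": source.latest()})

store = MemoryStore()
record_session(FakeTreadmill(), store)
print(store.saved["last_metrics"]["cadence_spm"])  # → 172
```

Swapping the treadmill firmware or the sync backend then means writing one new adapter, with the core untouched — which is the modularity principle the paragraph describes.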

RunCompanion is open-source and lives in the MLEbotics monorepo on GitHub.
