Feature Updates: How We Fixed Our Biggest Blind Spot: Coaching for Solo Performers


We rebuilt how Performance Coach AI analyzes solo acoustic performances. Here's what changed and why it matters.


Last week, I got detailed feedback from a professional music coach who tested our app with a solo acoustic cover of "Great Balls of Fire."

His verdict? The tempo stability detection was spot on. The homework was pedagogically sound. But one metric was "mathematically correct but musically unfair."

That metric was % Time on Beat.

The Problem: Grid-Based Thinking

Here's what was happening: when you uploaded a solo performance, our system would detect the average BPM (say, 129) and then judge every note against that rigid grid.

If you sped up during the chorus for energy and slowed down for an emotional ad-lib — which is exactly what good musicians do — you'd get penalized. The AI would tell you that you were "off the beat" 53% of the time.

But here's the thing: in a solo performance, you are the rhythm section. If your guitar and voice move together, you're in time with yourself — even if you're not locked to a metronome.

The coach put it perfectly:

"A score of 47% implies the user is 'wrong' more than half the time. However, musically, the user was mostly synchronized with his own guitar."

The Fix: Soloist Mode

We rebuilt the prompting system to detect when there's no backing track or drum stem available. When that happens:

  1. We stop judging "Time on Beat" against a grid. Instead, we focus on tempo stability — how consistent you are with yourself.
  2. The AI understands rubato. Expressive timing fluctuations are now recognized as artistic choices, not errors.
  3. Homework adapts to your context. No more "practice with a backing track" when you don't have one. Instead: "Use a metronome" or "Tap your foot."
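To make the shift concrete, here is a minimal sketch of what "consistency with yourself" can mean in code: instead of scoring each note against a fixed BPM grid, you look at the gaps between a performer's own onsets. The function name, inputs, and metrics below are illustrative assumptions, not our production implementation.

```python
import numpy as np

def tempo_stability(onset_times):
    """Score how steady a soloist is relative to themselves.

    Rather than comparing onsets to a rigid BPM grid, measure the
    inter-onset intervals (IOIs). A slow, smooth tempo change (rubato)
    produces low jitter; erratic timing produces high jitter.
    Metric names and this whole approach are a sketch, not the app's code.
    """
    iois = np.diff(onset_times)      # gaps between consecutive notes (s)
    local_bpm = 60.0 / iois          # instantaneous tempo for each gap
    mean_bpm = float(np.mean(local_bpm))
    # Coefficient of variation: overall spread of the performer's tempo.
    tempo_cv = float(np.std(local_bpm)) / mean_bpm
    # Point-to-point jitter: how abruptly the tempo changes note to note.
    jitter = float(np.mean(np.abs(np.diff(local_bpm)))) / mean_bpm
    return {"tempo_cv": tempo_cv, "jitter": jitter}
```

A perfectly even performance scores zero on both metrics, and a gradual ritardando scores far better here than it would against a fixed 129 BPM grid.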

This single change dramatically improves the perceived intelligence of the coaching for anyone playing solo acoustic sets, singer-songwriter material, or any performance without a fixed tempo reference.

Instrument-Specific Coaching

We also shipped focus instrument selection. When you start an analysis, you now choose what you want feedback on: Vocals, Guitar, Bass, Drums, or Keys.

Why does this matter? Because different instruments have different priorities:

| Instrument | Top Priority | Secondary | Tertiary |
|---|---|---|---|
| Vocals | Pitch & intonation (35%) | Timing & phrasing (25%) | Dynamics (20%) |
| Guitar | Rhythm & timing (30%) | Chord accuracy (25%) | Dynamics (20%) |
| Drums | Timing & groove (40%) | Dynamics (25%) | Consistency (20%) |
| Bass | Timing & groove (35%) | Note accuracy (25%) | Dynamics (20%) |
| Keys | Chord voicing (30%) | Timing (25%) | Dynamics (20%) |

A vocalist needs different feedback than a drummer. Now the AI weights its coaching accordingly.
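The weighting in the table above can be sketched as a simple lookup plus a weighted average. The remaining percentage points go to metrics not shown in the table; the function and dictionary names here are hypothetical, shown only to illustrate the idea.

```python
# Weights from the table above (remaining weight covers metrics not shown).
# These names are illustrative; the app's internal keys may differ.
FOCUS_WEIGHTS = {
    "vocals": {"pitch": 0.35, "timing": 0.25, "dynamics": 0.20},
    "guitar": {"timing": 0.30, "chords": 0.25, "dynamics": 0.20},
    "drums":  {"timing": 0.40, "dynamics": 0.25, "consistency": 0.20},
    "bass":   {"timing": 0.35, "notes": 0.25, "dynamics": 0.20},
    "keys":   {"voicing": 0.30, "timing": 0.25, "dynamics": 0.20},
}

def weighted_score(instrument, metrics):
    """Blend per-metric scores (0-100) using the focus instrument's weights."""
    weights = FOCUS_WEIGHTS[instrument]
    total = sum(weights.values())
    return sum(metrics[m] * w for m, w in weights.items()) / total
```

The same raw timing score moves a drummer's overall grade far more than a vocalist's, which is exactly the point.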

Rushing vs. Dragging: Knowing Which Way You Drift

Previously, we could tell you that your timing was off. But we couldn't tell you which direction.

This led to awkward situations. The AI might say "you're rushing" when you were actually dragging behind the beat. Not helpful.

We added a new metric: timing direction. The system now calculates both rushing_tendency_pct and dragging_tendency_pct, then labels your overall tendency as:

  • Rushing — consistently pushing ahead of the beat
  • Dragging — consistently falling behind
  • Inconsistent — fluctuating both ways

This means the homework is now accurate. If you're dragging, you get exercises for dragging. If you're rushing, you get exercises for rushing.
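The classification above boils down to counting which side of the beat each note lands on. A minimal sketch, assuming signed per-note offsets (negative = early, positive = late) and an illustrative 60% threshold that is not our production value:

```python
def timing_direction(offsets_ms, threshold_pct=60.0):
    """Label a performer's timing drift from signed per-note offsets.

    Negative offsets mean the note landed ahead of the beat (rushing);
    positive offsets mean it landed behind (dragging). The threshold
    is an assumption for illustration, not the app's actual cutoff.
    """
    n = len(offsets_ms)
    early = sum(1 for o in offsets_ms if o < 0)
    late = sum(1 for o in offsets_ms if o > 0)
    rushing_pct = 100.0 * early / n
    dragging_pct = 100.0 * late / n
    if rushing_pct >= threshold_pct:
        label = "rushing"
    elif dragging_pct >= threshold_pct:
        label = "dragging"
    else:
        label = "inconsistent"
    return {"rushing_tendency_pct": rushing_pct,
            "dragging_tendency_pct": dragging_pct,
            "label": label}
```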

Better Audio Analysis with Stem Separation

Under the hood, we integrated Music.AI for stem separation. When you upload a video, we now isolate:

  • Vocals
  • Guitars
  • Bass
  • Drums
  • Keys

This gives us cleaner analysis:

  • Key detection now comes from guitars/keys (harmonic instruments) instead of vocals. This matters because vocalists often use "blue notes" — intentionally flat 3rds and 7ths that are stylistically correct but technically "out of key."
  • BPM detection comes from drums when available, giving us ground-truth tempo instead of guessing from the full mix.
  • Focus instrument metrics are calculated from the isolated stem, not the muddy room mix.

When stems aren't available, we gracefully fall back to room mix analysis. But when they are, the accuracy improvement is significant.
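The routing logic can be sketched as a small source-selection step: prefer the cleanest stem for each detector, and fall back to the room mix when a stem is missing. The stem keys and function name below are hypothetical placeholders, not Music.AI's actual output schema.

```python
def pick_analysis_sources(stems, room_mix):
    """Choose which audio source feeds each detector, with fallback.

    `stems` maps stem names to file paths; missing or failed stems are
    absent or None. Keys like "guitars" are assumptions for illustration.
    """
    # Key detection prefers harmonic instruments over vocals, so blue
    # notes in the vocal line don't skew the detected key.
    harmonic = stems.get("guitars") or stems.get("keys")
    return {
        "key_detection": harmonic or room_mix,
        # BPM prefers the drum stem as a ground-truth tempo source.
        "bpm_detection": stems.get("drums") or room_mix,
    }
```

Using `or` for fallback keeps the degraded path (room mix only) and the ideal path (full stems) in one code path.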

The UX Overhaul

We also redesigned the results page:

  • Psychology-informed ordering: Strengths first (positive reinforcement), then your focus area
  • Full homework exercises: Duration, step-by-step instructions, success criteria, and equipment needed
  • Separate Daily Practice Plan section: Clear distinction between the AI's assessment and your action items
  • "Start Next Session" CTA: Keep the practice loop going

What's Next

We're continuing to refine the coaching based on real user feedback. Coming soon:

  • Multi-instrument analysis — how do your parts work together as a band?
  • Progress tracking — see your metrics improve over time
  • Reference comparison improvements — better YouTube integration

If you haven't tried the app recently, give it another shot. The coaching is meaningfully better than it was two weeks ago.

And as always — reply to any email or DM me directly. Your feedback shapes what we build.

— Eric Neff, Founder