Why Healthtech Products Fail in Clinics (And How Engineers Can Do Better)

We’ve built a lot at TIBU Health over three years - mobile apps, EHRs, queueing systems. Some features are used every day; others were abandoned within weeks. The difference is usually not technical quality. It’s whether we designed for clinical reality or for our own assumptions about how clinics work.

The most common failure modes

1. Solving problems no one has

Engineers see inefficiency and assume clinics want to fix it. What looks “broken” might be a deliberate workaround that clinicians actually prefer.

  • Example: We built auto-generated discharge summaries. Doctors ignored them because they wanted narrative, customised summaries - not robotic templates.
  • What we changed: We now shadow nurses and observe actual workflows before scoping any feature. Asking “what’s your most frustrating task right now?” surfaces problems worth solving.

2. Adding cognitive load

Clinicians are overwhelmed. Any product that requires new logins or complex workflows just adds to the burden.

  • Example: Our first queue system required manual “start/end” clicks to track consultation status. Nurses kept forgetting because their mental model is the patient in the room, not the status in a system.
  • What we changed: We automated status changes via QR code scans and device-level updates. Adoption jumped immediately.
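As an illustration of that automation, here is a minimal sketch of scan-driven status transitions. The class, status names, and function are hypothetical, not our actual implementation - the point is that the system infers state from events the nurse already generates:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical sketch: consultation status is inferred from QR scan
# events instead of manual "start/end" clicks.

@dataclass
class Consultation:
    patient_id: str
    status: str = "waiting"  # waiting -> in_consultation -> done
    history: list = field(default_factory=list)

def on_qr_scan(consultation: Consultation, room_id: str) -> str:
    """Advance the consultation automatically on each scan.

    First scan at the room door: patient enters -> in_consultation.
    Second scan: patient leaves -> done. No buttons involved.
    """
    transitions = {"waiting": "in_consultation", "in_consultation": "done"}
    new_status = transitions.get(consultation.status, consultation.status)
    consultation.history.append(
        (consultation.status, new_status, room_id, datetime.now(timezone.utc))
    )
    consultation.status = new_status
    return new_status
```

A scan at the consultation-room door advances the status and leaves an audit trail; the nurse’s mental model (the patient in the room) and the system’s state stay in sync without anyone remembering to click anything.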

3. Ignoring existing workflows

Clinics have established routines. Forcing new ones without clear benefits causes resistance.

  • Example: We tried making payment mandatory before consultation. Clinic managers overrode it constantly because patients often negotiate payment plans after seeing the doctor - it had always worked that way.
  • What we changed: We design around existing constraints and optimise within them before proposing workflow changes.

4. Building for data, not for humans

Engineers love dashboards; clinicians love patients.

  • Example: An early EHR version had 30 required fields. Doctors revolted.
  • What we changed: We cut required fields to 8. Documentation time dropped to 3 minutes per visit, and data completeness actually improved because doctors stopped resenting the form.

5. No clinical validation

Healthcare is evidence-based. If you can’t show improved outcomes or reduced wait times, clinics won’t prioritise your tool over what they already have.

  • What we changed: We run pilots before broad rollouts and measure concrete metrics. Showing that our queue system reduced wait times by 35% convinced managers who were initially sceptical.
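A pilot comparison like that can be reduced to one number managers care about. This is an illustrative sketch with made-up minutes, not our actual pilot data:

```python
from statistics import median

# Illustrative sketch: quantify a pilot's effect on patient wait times.
# The values below are invented minutes, not real clinic data.

def wait_time_reduction(before: list[float], after: list[float]) -> float:
    """Percent reduction in median wait time, pre- vs post-pilot."""
    b, a = median(before), median(after)
    return round((b - a) / b * 100, 1)

before = [40, 55, 35, 60, 45]  # minutes waited before the pilot
after = [25, 30, 20, 38, 28]   # minutes waited during the pilot
```

We use the median rather than the mean because a single emergency that blows out one patient’s wait shouldn’t dominate the headline figure.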

What we’ve changed in how we work

These failures changed how I think about the engineering process itself, not just the product decisions:

  • Shadowing before scoping: We spend days in clinics watching handoffs and emergencies, not just scheduled consultations. This is where the real edge cases live.
  • Designing for the exhausted user: I ask my team to imagine a nurse at 4:00 AM on hour 10 of a shift. If the interface doesn’t work for her, it’s not ready.
  • Defaulting to “no”: Every field and every notification has to justify its existence. The question isn’t “should we add this?” - it’s “why would we?”
  • Measuring adoption, not just usage: There’s a difference between staff choosing to use something and being forced to. We track both and treat low voluntary adoption as a signal to investigate.
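One way to make the adoption-versus-usage distinction concrete is to split usage events by whether the tool was optional at the time. The event shape and the `mandated` flag here are assumptions for illustration, not our actual schema:

```python
# Hypothetical sketch: separate voluntary adoption from mandated usage.
# Each event records whether a staff member used the tool, and whether
# policy required them to at the time.

def adoption_rates(events: list[dict]) -> dict:
    """Share of active staff who used the tool when it was optional
    versus when it was required."""
    def rate(subset: list[dict]) -> float:
        used = {e["staff_id"] for e in subset if e["used_tool"]}
        total = {e["staff_id"] for e in subset}
        return len(used) / len(total) if total else 0.0

    return {
        "voluntary_rate": rate([e for e in events if not e["mandated"]]),
        "mandated_rate": rate([e for e in events if e["mandated"]]),
    }
```

A high mandated rate with a low voluntary rate is exactly the signal we treat as an invitation to go back to the clinic and watch what staff do instead.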

Success looks like software that fits into clinical life so seamlessly that staff forget they’re using it.