Healthdirect Symptom Checker

Healthdirect’s Symptom Checker required a complete overhaul of design and user experience.

The completion rate of this web-based health assessment tool averaged 45%, and Healthdirect aimed to improve it.

A roadmap was laid out, and together with the DevOps team we set out to achieve the business goals and integrate Infermedica’s AI, a sophisticated medical inference engine.


Phase 1

Approach

  • Applied insights from existing research to guide new design decisions

  • Studied Australian government policies and clinical standards

  • Audited the Infermedica AI engine for a comprehensive understanding of its capabilities and limitations

  • Carried out a comparative scan of other products in the health industry to inspire ideation

  • Mapped the user journey and information architecture (user flow) to facilitate team discussion, identify gaps and prioritise features

  • Conducted moderated usability tests in collaboration with an external user research agency

  • Developed low-fidelity sketches to gather feedback and foster ideation with the engineering team and business stakeholders

  • Shaped content strategy to help people understand medical terms, questions and instructions

 

Scope

UX facilitation

Cross-collaboration

Content strategy

User research

User interface and detailing

 

Auditing and adapting a global AI triage system for Australia

I studied Infermedica’s triage engine to understand how a globally designed AI symptom checker could be safely used within Australia’s health context through Healthdirect.

My role was to evaluate its behaviour, limits, and fit against Australian clinical expectations, legal constraints, and Healthdirect’s non-diagnostic role.

 

Key findings

  • Strong clinical reasoning with structured symptom assessment

  • Some outputs can sound diagnostic or overly certain without supervision

  • Recommendations need reframing for Australian clinical and policy context

  • Users need clearer signals on when to move from AI guidance to self-care or human support

Design impact

Ensure the system supports people’s decision-making without feeling like it’s deciding for them, while remaining appropriate for Australia’s clinical and legal context.
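To make the “reframing” finding concrete, the sketch below shows one way a thin mapping layer could translate engine triage levels into non-diagnostic, Australian-context wording. The triage level names and advice strings here are illustrative assumptions, not Infermedica’s actual API or Healthdirect’s published copy.

```typescript
// Illustrative sketch only: triage levels and wording are hypothetical,
// loosely modelled on common symptom-checker engines. The goal is to keep
// language supportive and non-diagnostic while pointing to a clear action.
type TriageLevel = "emergency" | "consultation_24" | "consultation" | "self_care";

const reframedAdvice: Record<TriageLevel, string> = {
  emergency: "Call Triple Zero (000) or go to an emergency department now.",
  consultation_24: "See a doctor within the next 24 hours.",
  consultation: "Make an appointment with a GP in the coming days.",
  self_care: "You can likely care for this at home. See a GP if symptoms worsen.",
};

// Look up the reframed, action-oriented advice for a given triage level.
function adviseFor(level: TriageLevel): string {
  return reframedAdvice[level];
}
```

A layer like this keeps the engine’s reasoning intact while the presentation stays within a non-diagnostic role: the user is told what to do next, not what condition they have.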

 

Content audit and glossary

 

User Journey

 

Landscape review

 

Information architecture

 

Low fidelity sketches

 

User testing microcopy and content

 

Accessibility

  • Conducted heuristic reviews to identify usability issues and iterate designs before proceeding to more resource-intensive usability testing

  • Conducted user research with diverse groups, including people with disability and tech-wary individuals; we tested for user-friendliness and microcopy, and iterating designs through each round of testing helped achieve a clear, concise and usable experience

  • Adhered to Web Content Accessibility Guidelines (WCAG) to ensure interfaces are usable by people with disabilities, including screen reader compatibility and keyboard navigation

  • Implemented soothing colour schemes that are sensitive to user emotions


Solving user problems

Encouraging users as they progress through a lengthy questionnaire

Usability testing revealed that the encouragement screen added a human voice and propelled users to keep progressing.

The steppers and clear instructions helped craft a supportive user experience and ease user anxiety.

 

Actionable care advice

Other key user problems we aimed to solve were the cognitive overload of information and presenting care advice in a way that is actionable.

The solutions:

  • Help users make decisions by answering the questions they have in mind, so they can compare and choose the type of clinic that suits their needs, e.g. wait times, appointment requirements and cost at different clinics

  • Number each type of clinic to show the order of recommendation

  • Bold important information for easy scanning

  • Surface key information on the service card when a user looks for a medical service, such as “No fee”, “Bulk Billing” or “3.0km” to the nearest clinic
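As a sketch of how the service-card bullet above might translate into a data model, the TypeScript below shows one way the key details could be structured and surfaced as short, scannable labels. Field names, labels and the shape of the card are assumptions for illustration, not Healthdirect’s actual schema.

```typescript
// Hypothetical service-card model; names are illustrative only.
interface ServiceCard {
  clinicName: string;
  rank: number; // order of recommendation
  distanceKm: number; // distance from the user to the clinic
  billing: "No fee" | "Bulk Billing" | "Fees apply";
  appointmentRequired: boolean;
}

// Build the short labels surfaced on the card for easy scanning.
function cardHighlights(card: ServiceCard): string[] {
  return [
    card.billing,
    `${card.distanceKm.toFixed(1)}km`,
    card.appointmentRequired ? "Appointment required" : "Walk-ins welcome",
  ];
}
```

For example, a bulk-billing clinic 3 km away with no appointment needed would surface as “Bulk Billing”, “3.0km” and “Walk-ins welcome” on its card.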

 

Interaction design

Below is a prototype implementing motion design to improve user engagement. The idea was pitched to the business as a case for hiring a developer who specialises in interactive animations.


Achievements

  • Increased user completion rates to 68% by the end of 2023, compared to the previous design’s average of 45%.

  • Further increased completion rates to 86.5% for the 2024 release.

  • Met high consumer demand for digital triage across jurisdictions in Australia

  • Facilitated workshops for workflow improvement and conflict resolution for the cross-functional team, reducing project delays by approximately 20%

Scroll on to view artefacts of the end-to-end design process


Phase 2

Identifying Gaps and Opportunities Across a Complex Service System

Following Phase 1 delivery, the Symptom Checker redesign became an input into a broader, time-boxed CX and service design initiative focused on understanding where gaps and opportunities existed across Healthdirect’s service ecosystem.

Rather than optimising a single product, Phase 2 stepped back to examine how people experienced digital services, GP advice, and nurse helplines together, particularly at moments of uncertainty when users were trying to understand what help was available and what to do next.

 

Our goals

  • Identify system-level gaps between consumer needs, service intent, and actual behaviour

  • Surface future-state opportunity areas across services and channels

  • Use direct consumer research, synthesis, and prototyping to inform strategic decisions and funding discussions

  • Explore how emerging approaches, including AI-assisted support, could responsibly enhance human-led services

 

What I worked on

Working alongside CX leadership and designers, I contributed by:

  • Participating in consumer interviews across prioritised cohorts to understand how people interpreted eligibility, advice, and next steps

  • Synthesising qualitative insights to highlight recurring patterns, misunderstandings, and moments of uncertainty

  • Feeding these insights back into journey maps and ecosystem views to make system gaps visible

  • Prototyping new service and product concepts, including AI-assisted ideas, to provoke discussion and uncover insights during user interviews

  • Sharing Phase 1 research findings to strengthen evidence and continuity across phases

  • Contributing to research presentations and cross-functional workshops, aligning future-state thinking with user needs and service realities

This work helped teams move from incremental improvement toward a shared understanding of where change and investment could deliver the greatest value, and informed the product roadmap.

 
 

What we learned

How these learnings shaped our design decisions

  • If we design from what the system can do instead of what people can cope with, we risk getting it wrong.

  • Design AI outputs to explain why a pathway is recommended, not just what to do

  • Avoid diagnostic language in unsupervised contexts to prevent anxiety escalation

  • Keep human support visible to prevent self-service from feeling like abandonment

  • Design for the least-resourced user, not the most confident one

Designing AI for health decisions under stress

Australian consumers often turn to digital tools when they feel anxious or uncertain. In these moments, AI needs to communicate clearly to build trust and guide people safely.

  • Self-care and digital confidence are unevenly distributed across Australians

  • Consumers accept that AI is part of the future, but don’t trust it without human oversight

  • More information does not equal more confidence during symptom escalation

 
 

From insight to opportunity

Consumer interviews and synthesis revealed a small number of recurring patterns.
These were translated into clear opportunity themes to guide future design and service decisions.

 

Outcome and impact

The outputs of this phase:

  • Clarified future-state opportunity areas across Healthdirect services

  • Informed government funding proposals with evidence-based insights

  • Provided leadership with tangible artefacts to evaluate strategic options

  • Created alignment across CX, product, and service design teams on priorities