
Australian Payment Plus Testing Portal


AP+ Testing Portal - Phase 2

List of certification templates

 

Context

The AP+ Testing Portal is the first web application built to replace manual, spreadsheet-heavy certification workflows.

It enables banks, fintechs, and merchants to self-serve certification against AP+ payment products, while giving AP+ admins structured tools for workflow templates, attestations, exceptions, and reporting.

By digitising certification, the portal reduced manual effort, improved transparency, and accelerated member onboarding across the AP+ ecosystem.

Problem

Before the Testing Portal, certification relied on manual spreadsheets and back-and-forth communication.

  • Participants: faced slow, drawn-out certification processes with little visibility into status.

  • AP+ Testers: struggled with tracking attestations, exceptions, and progress through manual processes.

  • AP+ overall: suffered from integration delays and higher operational costs.

The challenge: build a scalable, self-service web application that could handle complex certification workflows while meeting compliance standards (e.g., Reserve Bank of Australia).


 

Approach to discovery

I joined the team when the project was entering Phase 2 — certification and organisation workflows. Phase 1, Test Case and Test Suite Management, had already been completed.

There was no budget for discovery and user research. The following details the approach I took within that constraint.

User Journey Mapping

  • Facilitated stakeholder workshops to align on goals, constraints, and user needs for Phase 2 of the project

User feedback workshops

  • Interviewed AP+ Testers to uncover inefficiencies they faced in Phase 1 (e.g., no bulk delete or cloning of items), both to improve those features and to inform similar patterns in Phase 2

  • Quantified satisfaction improvements (e.g., via qualitative comments and stated preference for the new UI over the old workflow)

  • Assessed error rates and mis-linking of test cases pre- and post-implementation (e.g., duplicates, missed links) as a usability/unintended-outcome metric.

Landscape review

If there’s no budget for deep user research, I dig into the market instead. I study how others solve similar problems, gather insights from competitors, and workshop the best ideas with stakeholders to make sure we’re still designing with intention.

Scope

Stakeholder management

UX facilitation

Cross-collaboration

Content strategy

Information architecture

User interface and detailing

Figma-to-code workflow

 

User Journey

User feedback workshop

Facilitating Feedback to Inform Phase 2

As part of Phase 2, I facilitated feedback and pain point gathering workshops with internal testers to learn from their usage of Phase 1 features (Test Case Management and Test Suite Management).

The goal was to uncover real-world frustrations and repetitive tasks from Phase 1 that could guide us in designing similar, but more complex workflows in Certification Workflow Management.

From these sessions, I captured:

  • Bulk actions and efficiency gaps: testers wanted faster ways to clone, copy/paste, or delete multiple test steps.

  • Navigation and usability pain points: the system resetting to the top after edits, or difficulty scrolling through large lists of test cases.

  • Information visibility: need to see test names in list view, and toggle the visibility of certain fields.

  • Performance expectations: pagination and load times needed to be smoother when working with large data sets.

  • Validation clarity: clearer separation between blocking errors vs warnings or alerts.

At the same time, testers shared what worked well — e.g., test case cloning and test suite creation were seen as efficient and intuitive.

By capturing these insights, I ensured that Phase 2 designs built on proven successes while addressing gaps, creating a seamless bridge between test case/test suite management (Phase 1) and certification workflows (Phase 2).

 

Approach to design

Information Architecture

  • Based on the findings, mapped out how content and functions are organised before deciding how they look. The purpose was to prevent design from being driven by visuals instead of logic. This also ensured new features could fit into the system logically without breaking the hierarchy.

  • Identified navigation gaps and redundancies early by gathering feedback from the cross-functional team, especially engineers.

  • Gave engineers a clear structural map of the product, so design and development decisions stemmed from the same logic.

Design System Implementation

  1. Experimented with and refined a Figma-to-code workflow (Figma MCP server, GitHub Copilot)

  2. Standardised variables and design system best practices, enabling smooth handover to engineers

 

IA - certification workflow

Solution

Contracted to deliver Phase 2 of the Testing Portal, I helped deliver two key pillars:

  • Certification Workflow Template Management

    • AP+ Testers can create, edit, and manage templates with steps, tasks, test case linkages, dependencies, and reports.

  • Member Certification

    • Participants can view assigned workflows, complete attestations, raise exceptions, and track progress.

    • Built secure access hierarchies for managing permissions and documentation.


Solving user problems

Collapsed view of steps, tasks and sub-tasks

Problems

  • Complex hierarchy and business logic: Linking rules depended on the structure — if a step had sub-tasks, test cases could not be attached at that level.

  • High cognitive load: Once test cases were linked, AP+ Testers had to navigate to the lowest level of each step to view them.

 

Solution - multi-level certification hierarchies

Designed a scalable structure for adding and removing steps, tasks, and sub-tasks, and for linking test cases, integrated with complex business logic.

Balanced clarity for AP+ Testers with system scalability, ensuring workflows didn’t overwhelm participants while still capturing necessary detail.
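The leaf-only linking rule described above (test cases cannot be attached to a step that has sub-tasks) could be sketched roughly as follows. This is an illustrative model only — the node shape, names, and validation helper are hypothetical, not the actual AP+ data model.

```typescript
// Hypothetical sketch of the leaf-only linking rule; the node shape
// and names are illustrative, not the production AP+ data model.
type WorkflowNode = {
  id: string;
  children: WorkflowNode[];   // steps -> tasks -> sub-tasks
  linkedTestCases: string[];  // linked test case IDs
};

// Test cases may only be attached at the lowest level of a branch.
function canLinkTestCase(node: WorkflowNode): boolean {
  return node.children.length === 0;
}

// Walk the hierarchy and report any node that violates the rule.
function findInvalidLinks(root: WorkflowNode): string[] {
  const invalid: string[] = [];
  const visit = (node: WorkflowNode) => {
    if (node.linkedTestCases.length > 0 && !canLinkTestCase(node)) {
      invalid.push(node.id);
    }
    node.children.forEach(visit);
  };
  visit(root);
  return invalid;
}
```

Encoding the rule as a single structural check like this is what lets the UI disable linking controls at non-leaf levels instead of surfacing errors after the fact.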



 

An individual certification template where testers can link test cases. This certification template would then be assigned to external organisations to execute test cases.

 

Problems on linked test cases

  • Risk of errors: Without logical guardrails, test cases could be duplicated, mis-linked, or missed entirely.

  • User frustration: The process was previously managed in spreadsheets, making it error-prone and hard to validate.

The solutions

  • Introduced a guided stepper interface that broke the task into manageable stages

  • Provided progress indicators so users always knew where they were in the process.

  • Added a review screen where admins could confirm all associations before finalising, reducing mistakes.

  • Used distinct error messages and alerts to separate blocking issues from warnings, addressing duplicated and mis-linked test cases.

  • Improved efficiency by allowing AP+ Testers to load test cases from a test suite, ready to be inserted into a certification workflow.
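The guardrails above — blocking duplicates while only warning on softer issues — could be sketched as a small validation step run when test cases are loaded from a suite. The classification rules here (duplicates block; reuse elsewhere warns) are hypothetical illustrations, not the production logic.

```typescript
// Illustrative sketch of the linking guardrails; the classification
// rules below are hypothetical, not the actual AP+ validation logic.
type LinkCheck = {
  blocking: string[]; // e.g., duplicates within the same workflow
  warnings: string[]; // e.g., test cases already linked elsewhere
};

function checkTestCaseLinks(
  alreadyLinked: Set<string>,   // IDs already in this workflow
  linkedElsewhere: Set<string>, // IDs used in other templates (illustrative)
  incoming: string[],           // IDs loaded from a test suite
): LinkCheck {
  const blocking: string[] = [];
  const warnings: string[] = [];
  for (const id of incoming) {
    if (alreadyLinked.has(id)) blocking.push(id);        // duplicate: block
    else if (linkedElsewhere.has(id)) warnings.push(id); // reuse: warn only
  }
  return { blocking, warnings };
}
```

Separating the two lists is what lets the UI render blocking issues as errors that halt the stepper, and warnings as dismissible alerts.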

Finding and loading test cases to link them to a certification workflow.

An example of one of the steps that creates variants of a single test case in the next step. The variants are required for assigning to organisations with different institution roles.

 

IA - Organisation certification

 

Information architecture of the organisation workflow

 

Problem - Lack of visibility into certification progress across organisations

Previously, certification workflows for each participant were tracked manually, often through emails and spreadsheets.

AP+ admins had no centralised way to see where each organisation was in their certification lifecycle — whether they were “In Progress,” “In Review,” or awaiting attestation or exception approval.

This created delays, confusion, and duplicated effort between internal testers and external participants.

Solution

  • We introduced a unified Organisation Certification Dashboard, consolidating all participants’ workflows and surfacing live status updates directly from the system (e.g., New → In Progress → In Review → Certified). Each status transition now aligns with the underlying workflow logic defined in the system.

  • AP+ Testers can instantly identify which steps are awaiting review or finalisation, removing the need for manual tracking.
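The status lifecycle surfaced on the dashboard (New → In Progress → In Review → Certified) could be modelled as a simple transition map. This is a hypothetical sketch — the real workflow logic in the system is more involved, and the refer-back transition shown here is an assumption drawn from the review flow described later.

```typescript
// Hypothetical sketch of the certification status lifecycle; the
// actual AP+ workflow logic defined in the system is more involved.
type Status = "New" | "In Progress" | "In Review" | "Certified";

// Each status may only advance along the lifecycle; "In Review" can
// also be referred back to "In Progress" (assumed, based on the
// refer-back review flow).
const transitions: Record<Status, Status[]> = {
  "New": ["In Progress"],
  "In Progress": ["In Review"],
  "In Review": ["Certified", "In Progress"],
  "Certified": [],
};

function canTransition(from: Status, to: Status): boolean {
  return transitions[from].includes(to);
}
```

Driving the dashboard from a single transition map like this keeps the displayed status aligned with the underlying workflow state, rather than with manually updated trackers.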

User Interface Principle

  • Visibility of system status — users can now see real-time progress and workflow state transitions without relying on external communication, reducing uncertainty and administrative overhead.

 

External Organisations

User Problems

  • Manual email-based attestation caused confusion and delays.

  • Unclear distinction between when to attest vs. raise an exception.

  • No structured way to upload or track evidence, leading to version control issues.

Solutions

  • Introduced an online Completion Certificate with structured fields for attestation, evidence upload, and comments.

  • Linked Test Cases directly to milestones for clarity on scope and progress.

  • Enabled versioned resubmissions and status visibility (e.g., Completed, Exception–Approved/Rejected).

UI Principles

  • Clarity: Clearly grouped sections (Test Cases, Evidence, Response).

  • Feedback: Real-time status indicators and success confirmations after submission.

  • Guidance: Inline help and concise field labels to reduce uncertainty.

 

Raising exceptions

AP+ Testers and reviewers

User Problems

  • No single view of attested milestones, uploaded evidence, or exceptions.

  • Inconsistent review actions and comment tracking across reviewers.

  • Limited visibility on member progress and feedback history.

Solutions

  • Built a centralised review interface showing attested Test Cases, Exceptions, and Evidence in one view.

  • Added inline decision controls – Approve, Reject, Refer Back – with mandatory comments for traceability.

  • Introduced a refer-back workflow allowing members to revise submissions without losing version history.

UI Principles

  • Efficiency: Key actions surfaced inline to reduce navigation and review time.

  • Transparency: Shared visibility of comments, status, and evidence between both parties.

  • Consistency: Unified layout and states across attestation and review flows for a seamless experience.

 

Attestation

 

Impact

  • 20–30% reduction in manual effort for internal testers by replacing spreadsheets with structured workflows.

  • Enabled self-service certification for banks, fintechs, and merchants, reducing onboarding friction.

  • Improved cross-functional collaboration by aligning design outcomes with team OKRs.

  • Designed end-to-end certification workflows supporting regulatory compliance.


Achievements

  • Refined and improved a Figma-to-code process using the Figma MCP server (learn more on my Substack)

  • Delivered production-ready Figma files, ensuring smooth handover to front-end engineers.

  • Demonstrated best practices and systems thinking to designers and front-end engineers to improve the above workflows (Dev Mode, instructing AI models)

  • AI-enabled workflows:

    • Custom OpenAI agent - a UX QA partner that produces reports from manual testing

    • Employed Playwright in the codebase to test acceptance criteria.

    • AI-optimised specifications and a context-driven coding workflow