Training activity information

Details

Perform user testing, including end-to-end testing.

Type

Entrustable training activity (ETA)

Evidence requirements

Evidence that the activity has been undertaken by the trainee repeatedly, consistently, and effectively over time, in a range of situations. This may include occasions where the trainee has not successfully achieved the outcome of the activity themselves, for example because it was not appropriate to undertake the task in the circumstances, or because the trainee recognised their own limitations and sought help or advice to ensure the activity reached an appropriate conclusion.

Reflection at multiple time points on the trainee's learning journey for this activity.

Considerations

  • Alpha and beta testing
  • User acceptance testing (UAT)
  • Regression testing
  • Considerations for safety of patient data
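An end-to-end test exercises a whole user workflow rather than a single function. The sketch below illustrates the idea against a stubbed booking system; all names (`BookingSystem`, `register_patient`, `book_appointment`) are hypothetical stand-ins, not any real product's API, and the data is deliberately synthetic and non-identifiable, reflecting the patient-data-safety consideration above.

```python
# Illustrative end-to-end user test: register a patient, book an
# appointment, then confirm the booking appears in the schedule.
# BookingSystem is a minimal stub; a real test would drive the actual
# application through its UI or API.

class BookingSystem:
    """Stand-in stub for the system under test (hypothetical)."""
    def __init__(self):
        self.patients = {}
        self.schedule = []

    def register_patient(self, patient_id, name):
        self.patients[patient_id] = name

    def book_appointment(self, patient_id, slot):
        if patient_id not in self.patients:
            raise KeyError("unknown patient")
        self.schedule.append((patient_id, slot))

    def appointments_for(self, patient_id):
        return [slot for pid, slot in self.schedule if pid == patient_id]


def test_end_to_end_booking():
    system = BookingSystem()
    # Synthetic, non-identifiable test data only -- never real patient data.
    system.register_patient("TEST-001", "Test Patient")
    system.book_appointment("TEST-001", "2030-01-01 09:00")
    assert system.appointments_for("TEST-001") == ["2030-01-01 09:00"]


test_end_to_end_booking()
```

A regression suite would re-run tests like this after each upgrade, while user acceptance testing would have real users walk through the same flow against agreed acceptance criteria.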

Reflective practice guidance

The guidance below supports reflection at different time points by offering questions to prompt your reflection on this training activity. The questions are guidance only and should not be treated as a mandatory checklist; trainees are not expected to answer every question listed.

Before action

  • What does success look like for performing user testing? How does this activity relate to evaluating and managing a system development project and implementing new applications or upgrades using controlled methodology? What specific aspects of the system/application need to be tested, and what are the expected outcomes of the tests?
  • What is your prior experience with software testing, especially user testing? Are you familiar with test plans, test cases, or bug reporting? What potential challenges might you face, such as identifying realistic test scenarios, documenting issues clearly, or coordinating with users? When would you need to seek advice, e.g., on test methodology or severity classification of issues? How do you feel about conducting user testing?
  • What specific skills do you want to develop in designing or executing test cases and documenting results? What insights do you hope to gain about the user perspective and the practicalities of ensuring system quality before deployment?
  • Have you reviewed the system requirements and design documents to inform your testing? Who are the target users for testing, and how will you recruit or schedule them? What tools or templates will you use for recording test results and issues?
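One lightweight answer to the "tools or templates" question above is a structured record for each test case and its result. The sketch below is purely illustrative: the field names are assumptions, not a prescribed template, and many organisations will instead use a spreadsheet or test-management tool.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class TestCaseRecord:
    """Hypothetical template for recording one user-test case and its result."""
    case_id: str              # short identifier, e.g. "E2E-01"
    description: str          # what the case covers
    steps: List[str]          # ordered user actions
    expected: str             # expected outcome, traced to requirements
    actual: str = ""          # observed outcome, filled in during the session
    passed: Optional[bool] = None  # None until the case has been run
    notes: str = ""           # tester/user observations, issues raised

# Example entry for an end-to-end scenario
case = TestCaseRecord(
    case_id="E2E-01",
    description="Register a patient and book an appointment",
    steps=["open registration form", "enter synthetic patient details",
           "save record", "book appointment slot", "check schedule"],
    expected="Appointment appears in the schedule for the test patient",
)
case.actual = "Appointment shown in schedule"
case.passed = True
```

Linking `expected` back to requirements or acceptance criteria makes it easier to judge later whether the tests demonstrated those criteria were met.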

In action

  • As you are performing user testing, specifically focusing on an end-to-end process, are you encountering anything that feels surprising or different from what you anticipated? For instance, is the system performing unexpectedly, are users interacting with it in ways you didn’t anticipate, or are unexpected issues or behaviours arising during the test run? How does this user testing experience compare with your previous experiences of similar activities?
  • As you encounter an unexpected system behaviour or user interaction during testing, how are you reacting and adapting your approach to performing the user test? For example, if the system crashes or a critical error occurs, how are you adjusting the test in the moment? If a user gets confused or takes an unexpected path, how are you adapting your guidance or observation? Are you considering other ways you could approach this task, such as modifying your test script or focusing on different test cases? Is this affecting your ability to undertake the user testing independently? Are you feeling confident you can reach a successful conclusion despite the unexpected issue, or are you finding it difficult to adapt? Are you recognising when you might need to document an issue immediately or perhaps pause the test session to get help?
  • What new insights or lessons are becoming apparent to you as a result of the unexpected development and your reaction to it during the activity? For example, what are you learning about the system’s functionality or user experience as you test? Are you recognising how you are working within your scope of practice, for example, by knowing when to pause or seek immediate advice regarding testing procedures or critical issues?
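When an unexpected behaviour does need documenting on the spot, a minimal severity-graded issue record keeps the observation usable later. The grades below (critical/major/minor) are one common convention, not a mandated scheme; local policy may define different severity levels.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from enum import Enum
from typing import List

class Severity(Enum):
    # One common convention; local policy may define different grades.
    CRITICAL = "critical"   # data loss, safety risk, or total blocker
    MAJOR = "major"         # core function broken, workaround exists
    MINOR = "minor"         # cosmetic or low-impact issue

@dataclass
class Issue:
    """Hypothetical in-session issue record captured during a test run."""
    summary: str
    severity: Severity
    steps_to_reproduce: List[str]
    raised_at: str = ""     # timestamp, filled in automatically if blank

    def __post_init__(self):
        if not self.raised_at:
            self.raised_at = datetime.now(timezone.utc).isoformat()

issue = Issue(
    summary="System error when saving an appointment with an empty slot field",
    severity=Severity.MAJOR,
    steps_to_reproduce=["book appointment", "leave slot blank", "press save"],
)
```

Capturing the reproduction steps at the moment the issue occurs is usually the difference between a fixable bug report and an unreproducible one.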

On action

  • Summarise the end-to-end testing process you performed or observed. What were the key steps or user interactions that felt most critical or revealed unexpected behaviour? Recall any moments during the testing where you had to change your approach or guidance based on what was happening.
  • What did you learn about planning or executing user testing, particularly end-to-end scenarios, and identifying issues? What strengths did you demonstrate in setting up tests, observing users, documenting findings, or interacting with participants? What skills or knowledge gaps were evident, perhaps regarding test script design, handling unexpected user behaviour, or using testing tools? How did this testing experience compare to previous testing activities? Did you incorporate learning from prior reflections on testing? What challenges did you encounter during testing (e.g., unclear user tasks, system instability, difficulty documenting issues)? How did you react to these? Did you need to seek advice on designing tests or classifying issues?
  • What actions or next steps will you take to improve your user testing skills, especially end-to-end testing? What aspects of test methodology, user interaction, or documentation do you need to practise or learn more about? How will you approach planning and conducting user tests differently next time?

Beyond action

  • With your current understanding of the relationship between requirements, acceptance criteria, and system testing, how would you evaluate the user testing you performed? Were the tests sufficient to demonstrate that requirements and acceptance criteria were met? Have you been involved in other testing activities (user, system, etc.) since this training activity? How does your current approach to testing, particularly end-to-end testing, compare to your initial experience? If you discussed this training activity with peers or your training officer, did their feedback or subsequent learning highlight different approaches to user or end-to-end testing? Revisit your reflect-on-action notes for this training activity. What aspects related to identifying issues, managing the testing process, or assessing system readiness now seem more or less significant in light of subsequent testing experiences?
  • How has your direct experience with user testing influenced your ability to critically evaluate the testing phases within a systems development project? How has it contributed to your understanding of the importance of rigorous testing for successful implementation of new applications and upgrades? Did this experience help you appreciate the user perspective and the value of involving users in the testing process? Has it influenced how you think about test planning and documentation?

Relevant learning outcomes

  • Outcome 5: Critically evaluate and manage a system development project within the context of a formal project management methodology.
  • Outcome 6: Implement new applications and upgrades to an existing application using controlled methodology.