Training activity information
Details
Review performance levels of image display devices and carry out performance testing
Type
Entrustable training activity (ETA)
Evidence requirements
Evidence that the activity has been undertaken by the trainee repeatedly, consistently, and effectively over time, in a range of situations. This may include occasions where the trainee did not successfully achieve the outcome of the activity themselves, for example because it was not appropriate to undertake the task in the circumstances, or because the trainee recognised their own limitations and sought help or advice to ensure the activity reached an appropriate conclusion.
Reflection at multiple timepoints on the trainee's learning journey for this activity.
Considerations
- Hard and soft copy display systems
- QA of imaging display systems
- National guidance
- Test patterns
Reflective practice guidance
The guidance below is provided to support reflection at different time points, offering questions to aid your reflection on this training activity. These questions are for guidance only and should not be treated as a mandatory checklist; trainees are not expected to provide answers to each of them.
Before action
- What does success look like for this activity?
- Identify what is expected of you in relation to reviewing and testing image display devices.
- Discuss with your training officer to gain clarity on specific performance metrics, testing protocols, and reporting standards expected for this task.
- What is your prior experience of testing or assessing image display devices?
- Think about what you already know regarding display technology, image quality parameters, and QA procedures. Have you used relevant testing software or phantoms before?
- Consider possible challenges you might face during the review and testing, such as calibration issues, ambient light interference, or detecting subtle display artefacts. Think about how you might handle them.
- Do you know when you will need to seek advice or help, and from whom (e.g., IT support for networking issues, medical physicists for complex display calibration, or clinical staff for user feedback)?
- Do you feel confident in your ability to perform the tests accurately and interpret the results in terms of clinical impact?
- Consider the specific skills you want to develop, drawing upon previous experiences with imaging systems or QA.
- This could include skills in using specialised display testing tools, diagnosing subtle display faults, or effectively communicating display performance issues and their impact on image quality to clinical users.
- Identify the specific insights you hope to gain from engaging with this activity. For example, do you hope to understand the relationship between display performance and diagnostic accuracy, the impact of various display settings, or the practical application of QA standards for imaging displays?
- What additional considerations do you need to make before reviewing and testing image display devices?
- Consult actions identified following previous experience of QA activities or working with imaging equipment.
- Are there any known issues with the specific display devices or their environment that need to be addressed?
- Identify important information you need to consider, such as national or local guidelines for display QA, the clinical purpose of the displays being tested, or any specific software or hardware requirements for the testing.
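Some of the quantitative checks touched on above reduce to simple calculations once measurements are in hand. As a minimal sketch, one common acceptance check in TG18-style display QA is the ambient-adjusted luminance ratio; the threshold used here is an illustrative assumption, not a substitute for the national or local guidance that applies to your displays:

```python
def luminance_ratio(l_max: float, l_min: float, l_amb: float) -> float:
    """Ambient-adjusted luminance ratio L'max / L'min.

    l_max, l_min: display luminance at full white / full black (cd/m^2)
    l_amb: ambient luminance reflected off the screen face (cd/m^2)
    """
    return (l_max + l_amb) / (l_min + l_amb)

# Illustrative acceptance level only -- consult the standard you are
# working to for the value appropriate to the display's clinical use.
MIN_RATIO_PRIMARY = 250

measured = luminance_ratio(l_max=400.0, l_min=0.8, l_amb=0.4)
print(f"Luminance ratio: {measured:.0f}",
      "PASS" if measured >= MIN_RATIO_PRIMARY else "FAIL")
```

Note how even a small ambient contribution raises the effective black level and so lowers the ratio, which is why ambient light interference is worth considering before testing begins.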
In action
- Are you noticing anything surprising or different from what you anticipated during the process of reviewing and testing image display devices?
- Are you encountering situations such as:
- Unexpected deviations from standard test patterns (e.g., SMPTE, TG18-QC) that defy initial expectations.
- An unusual type of display artefact that you hadn’t anticipated or seen before.
- Conflicting results when testing different display parameters (e.g., luminance uniformity vs. spatial resolution).
- An anticipated display issue (e.g., poor contrast) that turns out to be incorrect or to be caused by an unexpected factor.
- How does this experience compare with previous experiences of similar activities, such as any prior observations of display calibration or basic display settings?
- How is any unexpected development being resolved as you progress during the activity?
- Are you successfully managing the situation yourself, or do you need support because it is beyond your current scope (for example, if a display requires complex hardware intervention or software recalibration beyond your current permissions)?
- What are you learning in this moment as a result of any unexpected development? For example, are you learning a new approach to interpreting a specific test pattern, or a more robust method for troubleshooting display issues?
- How is this impacting your actions? For example, are you responding to the situation appropriately?
- Are you adapting or changing your approach to the testing procedure, such as using an alternative test pattern or software tool? Is it affecting your ability to undertake the activity independently?
- Consider the steps you are taking in the moment, such as:
- Are you re-checking display settings or software configurations immediately?
- Are you consulting relevant national and international standards for medical display QA (e.g., DIN, AAPM) or departmental work instructions more thoroughly than planned?
- Are you seeking advice from your training officer or IT support to understand an anomaly or to address complex hardware/software issues?
- Are you changing your initial approach to documenting observations or formulating recommendations based on new insights?
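Several of the parameters mentioned above, such as luminance uniformity, become simple calculations once the measurements are taken. As a sketch, assuming five-point luminance readings from a uniform test pattern (e.g., TG18-UNL80) and an illustrative 30% acceptance threshold (check the standard you are working to), the maximum luminance deviation can be computed as:

```python
def luminance_nonuniformity(readings: list[float]) -> float:
    """Maximum luminance deviation (%) across measurement points:
    200 * (Lmax - Lmin) / (Lmax + Lmin).
    """
    l_max, l_min = max(readings), min(readings)
    return 200.0 * (l_max - l_min) / (l_max + l_min)

# Centre plus four corners (cd/m^2) -- example figures, not real data.
five_point = [172.0, 165.5, 168.3, 161.2, 170.9]

deviation = luminance_nonuniformity(five_point)
# Illustrative threshold; departmental or national guidance takes precedence.
print(f"Non-uniformity: {deviation:.1f}%",
      "PASS" if deviation <= 30.0 else "FAIL")
```

Scripting checks like this alongside manual pattern review can make it easier to document observations consistently and to spot when a result warrants escalation.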
On action
- Begin by summarising the key points of the image display device review and testing experience, from setting up the test patterns to evaluating the display performance.
- Consider any specific events, actions, or interactions that felt important, including your own feelings during the experience.
- Include any ‘reflect-in-action’ moments where you adapted to the situation as it unfolded. For example, recall if there were unexpected deviations from standard test patterns (e.g., SMPTE, TG18) or an unusual type of display artefact.
- How did you respond to these in the moment (e.g., re-checking display settings, consulting manuals)?
- Identify what learning you can take from this experience. What strengths did you demonstrate during the display performance review, and what skills and/or knowledge gaps were evident (e.g., interpreting complex test patterns, troubleshooting software issues)?
- Compare this experience against previous engagement with similar activities, such as any prior observations of display calibration or basic display settings. Were any previously identified actions for development achieved?
- Has your practice improved in areas like recognising subtle artefacts or assessing luminance uniformity?
- Identify any challenges you experienced (e.g., conflicting results from different parameters, diagnosing an unknown artefact), and how you reacted to these. Did this affect your ability to deal with the situation, and were you able to overcome the challenges?
- Identify anything significant about the activity. Did you need to seek advice or clarification from your training officer or IT support to understand an anomaly or address complex hardware/software issues? Or did you need to escalate to ensure that you were working within your scope of practice if the display required complex hardware intervention or software recalibration beyond your current permissions?
- Identify the actions / ‘next steps’ you will now take to support the assimilation of what you have learnt, including from any feedback you have received. For example, what will you do differently next time to interpret a specific test pattern or to troubleshoot display issues more robustly?
- Has anything changed in terms of what you would do if you were faced with a similar situation again, such as using an alternative test pattern or software tool?
- Do you need to practise any aspect of the activity further, such as refining your approach to documenting observations or formulating recommendations based on new insights?
Beyond action
- Have you revisited the experiences?
- Have you reviewed your actions from your previous reflections for reviewing and testing image display devices?
- What actions did you identify you would need to take to improve your practice (e.g., understanding specific display standards, improving artefact identification, or refining reporting of findings)?
- Have you completed these actions, and are you ready to demonstrate this new learning in practice?
- Consider whether your view of the situation has changed as a result of analysing it with others. For example, has discussing difficult-to-diagnose display artefacts with experienced colleagues provided new insights into their causes or solutions?
- How have these experiences impacted upon current practice?
- Consider how the learning from reviewing and testing image display devices will support you in preparing for observed ‘in-person’ assessments for the module.
- Consider how your practice in image display QA has developed and evolved over time. Have you become more proficient in using display testing software, more astute at visually assessing image quality, or better at recognising when a display issue requires IT or vendor intervention, indicating it’s beyond your current scope of practice?
Relevant learning outcomes
| # | Outcome |
|---|---|
| 2 | Perform and appraise quality assurance on equipment across a range of diagnostic radiology modalities. |
| 3 | Identify common image artefacts and make recommendations for rectification. |