Training activity information
Details
Perform an inter-comparison between two radiation dose measurement devices used in diagnostic radiology x-ray quality assurance (QA)
Type
Entrustable training activity (ETA)
Evidence requirements
Evidence the activity has been undertaken by the trainee repeatedly, consistently, and effectively over time, in a range of situations. This may include occasions where the trainee has not successfully achieved the outcome of the activity themselves, for example because it was not appropriate to undertake the task in the circumstances, or because the trainee recognised their own limitations and sought help or advice to ensure the activity reached an appropriate conclusion.
Reflection at multiple time points on the trainee's learning journey for this activity.
Considerations
- Traceability and national standards
- Inter-comparison methods between different types of radiation dosimeter
- Measurement error and uncertainty
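As a worked illustration of the measurement-uncertainty consideration above, the sketch below shows one simple way an inter-comparison result might be quantified: the mean ratio of paired readings expressed as a percentage difference, with the devices' relative uncertainties combined in quadrature. This is a minimal, hypothetical example, not a prescribed protocol; the readings, uncertainty values, and the assumption that the two devices' uncertainties are uncorrelated are all illustrative, and local work instructions and national standards take precedence.

```python
import math

def compare_readings(readings_a, readings_b, u_a_pct, u_b_pct):
    """Simple inter-comparison of paired dose readings from two devices.

    readings_a, readings_b: paired readings taken under matched exposure
        conditions (e.g. in uGy) -- illustrative values only.
    u_a_pct, u_b_pct: relative standard uncertainties (%) of each device,
        assumed uncorrelated for this sketch.
    """
    # Ratio of device A to device B for each paired exposure
    ratios = [a / b for a, b in zip(readings_a, readings_b)]
    mean_ratio = sum(ratios) / len(ratios)
    # Mean percentage difference of device A relative to device B
    pct_diff = (mean_ratio - 1.0) * 100.0
    # Combined relative uncertainty of the ratio (quadrature sum,
    # valid only if the two uncertainties are uncorrelated)
    u_comb_pct = math.sqrt(u_a_pct**2 + u_b_pct**2)
    return pct_diff, u_comb_pct

# Hypothetical example: five paired readings under matched conditions
a = [101.2, 100.8, 101.5, 100.9, 101.1]
b = [100.0, 99.8, 100.2, 99.9, 100.1]
diff, u = compare_readings(a, b, u_a_pct=1.5, u_b_pct=2.0)
print(f"A reads {diff:+.2f}% relative to B (combined uncertainty {u:.2f}%)")
```

A result like this would then be judged against the combined uncertainty and any locally agreed tolerance before concluding that the devices genuinely disagree.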
Reflective practice guidance
The guidance below is provided to support reflection at different time points, offering questions to aid your reflection on this training activity. These questions are provided for guidance only and should not be treated as a mandatory checklist; trainees are not expected to answer every question listed.
Before action
- What does success look like for this activity?
- Identify what is expected of you in relation to performing an inter-comparison of radiation dose measurement devices.
- Discuss with your training officer to gain clarity on what is expected of you regarding the accuracy, methodology, and reporting of the inter-comparison results.
- What is your prior experience of performing inter-comparisons or similar QA activities?
- Think about what you already know regarding radiation dose measurement devices, calibration, and QA methodologies. Have you used these specific devices before, or similar ones?
- Consider possible challenges you might face during the inter-comparison, such as device setup, environmental factors, data collection, or unexpected readings, and think about how you might handle them.
- Recognise the scope of your own practice for this activity. For example, do you know when you will need to seek advice or help, and from whom (e.g., a more experienced physicist, a calibration laboratory, or the equipment manufacturer)?
- Are you confident in your ability to perform the technical aspects and interpret the results?
- Consider the specific skills you want to develop, drawing upon any previous experiences with QA or calibration.
- This could include skills in precise device handling, data analysis for comparing measurements, or troubleshooting inconsistencies.
- Identify the specific insights you hope to gain from engaging with this activity. For example, do you hope to understand the practical nuances of dose meter calibration, the sources of measurement uncertainty, or the implications of inter-comparison results for clinical practice?
- What additional considerations do you need to make before performing the inter-comparison?
- Consult actions identified following previous experience of similar QA activities or equipment handling. Are there any lessons learned from past reflections that you need to apply here?
- Identify important information you need to consider, such as specific protocols for the devices, environmental conditions required for accurate measurements, or safety precautions related to radiation sources or equipment operation.
In action
- Are you noticing anything surprising or different from what you anticipated during the process of performing the inter-comparison?
- Are you encountering situations such as:
- A significant unexpected discrepancy in readings between the two dose measurement devices that defies initial expectations.
- Unstable environmental conditions (e.g., temperature, humidity fluctuations) impacting the consistency of measurements.
- Difficulty in ensuring precise and reproducible positioning of both devices within the x-ray field.
- An anticipated calibration issue that turned out not to occur during your measurements.
- How does this experience compare with previous experiences of similar activities, such as any observations of device calibration or routine QA checks?
- How is any unexpected development being resolved as you progress during the activity?
- How are you working within your scope of practice? Are you successfully managing the situation yourself, or do you need support because it is beyond your current scope (for example, if a device shows major discrepancies suggesting a fault, or if recalibration is required)?
- What are you learning in this moment as a result of any unexpected development? For example, are you learning a new approach to ensuring environmental stability, or a more robust method for identifying systematic differences between devices?
- How is this impacting your actions? For example, are you responding to the situation appropriately?
- Are you adapting or changing your approach to the procedure, such as modifying your measurement protocol or sequence? Is it affecting your ability to undertake the activity independently?
- Consider the steps you are taking in the moment, such as:
- Are you re-checking connections, settings, or measurement parameters immediately?
- Are you consulting relevant device manuals or local work instructions for inter-comparisons more thoroughly than planned?
- Are you seeking advice from your training officer or a more experienced colleague to understand an anomaly or unexpected outcome?
- Are you changing your initial approach to data recording or analysis based on new insights?
On action
- Begin by summarising the key points of the inter-comparison experience, from setting up the devices to obtaining the final readings.
- Consider any specific events, actions, or interactions that felt important, including your own feelings during the experience.
- Include any ‘reflect-in-action’ moments where you adapted to the situation as it unfolded. For example, recall if there was a significant unexpected discrepancy in readings between the two devices, or if you encountered unstable environmental conditions. How did you respond to these in the moment (e.g., re-checking connections, seeking immediate advice)?
- Identify what learning you can take from this experience. What strengths did you demonstrate during the inter-comparison, and what skills and/or knowledge gaps were evident (e.g., understanding device calibration curves, troubleshooting environmental factors)?
- Compare this experience against previous engagement with similar activities, such as any prior observations of device calibration or routine QA checks. Were any previously identified actions for development achieved?
- Has your practice improved in areas like precise positioning or anomaly identification?
- Identify any challenges you experienced (e.g., difficulty ensuring reproducible positioning, identifying the cause of a discrepancy), and how you reacted to these. Did this affect your ability to deal with the situation, and were you able to overcome the challenges?
- Identify anything significant about the activity. Did you need to seek advice or clarification from your training officer or a more experienced colleague, for instance, when a device showed major discrepancies? Or did you need to escalate to ensure that you were working within your scope of practice if the issue was beyond your capabilities (e.g., requiring recalibration)?
- Identify the actions / ‘next steps’ you will now take to support the assimilation of what you have learnt, including from any feedback you have received. For example, what will you do differently next time to ensure environmental stability or to more effectively identify systematic differences between devices?
- Has anything changed in terms of what you would do if you were faced with a similar situation again, such as a different approach to modifying your measurement protocol?
- Do you need to practise any aspect of the activity further, such as refining your data recording and analysis based on new insights?
Beyond action
- Have you revisited the experiences?
- Have you reviewed your actions from your previous reflections for performing inter-comparisons of radiation dose measurement devices?
- What actions did you identify you would need to take to improve your practice (e.g., refining setup procedures, improving uncertainty calculations, or addressing device discrepancies)?
- Have you completed these actions, and are you ready to demonstrate this new learning in practice?
- Consider if your view of the situation has changed because of analysing this with others. For example, has discussing device calibration challenges with a senior physicist or another trainee offered new perspectives on troubleshooting or reporting?
- How have these experiences impacted upon current practice?
- Consider how the learning from performing inter-comparisons will support you in preparing for observed ‘in-person’ assessments for the module.
- Consider how your practice in performing QA activities, specifically inter-comparisons, has developed and evolved over time.
- Have you become more efficient in your methodology, more confident in your analysis, or better at recognising when a device issue is beyond your scope of practice and requires escalation to a specialist?
Relevant learning outcomes
| # | Outcome |
|---|---|
| 2 | Perform and appraise quality assurance on equipment across a range of diagnostic radiology modalities. |