Training activity information
Details
Perform and document a validation or verification for solid cancers
Type
Developmental training activity (DTA)
Evidence requirements
Evidence the activity has been undertaken by the trainee.
Reflection on the activity at one or more time points after the event, including learning from the activity and/or areas of the trainee's practice for development.
An action plan to implement learning and/or to address skills or knowledge gaps identified.
Reflective practice guidance
The guidance below provides questions to support your reflection at different time points for this training activity. The questions are offered as guidance only and should not be treated as a mandatory checklist; trainees are not expected to answer every question listed.
Before action
What does success look like?
- Identify what is expected of you in relation to performing and documenting a validation or verification for solid cancers.
- Consider how the learning outcomes apply, specifically in relation to analysing and interpreting reports of clinically relevant findings and interpreting QC data against ISO 15189 standards.
- Discuss with your training officer to clarify what is expected of you in terms of experimental design and required performance criteria.
What is your prior experience of this activity?
- Think about what you already know about validation and verification principles, experimental design, data analysis for performance metrics (e.g., sensitivity, specificity, accuracy), and documentation requirements.
- Consider possible challenges you might face during the activity, such as designing the experiment, obtaining suitable samples, analysing results, or documenting findings comprehensively, and think about how you might handle them.
- Recognise the scope of your own practice for this activity, i.e., know when you will need to seek advice or help, and from whom. Seek advice from your training officer when required, for example if the validation data for a new assay show performance metrics (e.g., limit of detection) that fail to meet the manufacturer's claims.
- Acknowledge how you feel about performing a validation or verification.
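If it helps to make the performance metrics mentioned above concrete, the following minimal Python sketch shows how sensitivity, specificity, and accuracy are derived from confusion-matrix counts. The counts are hypothetical, not taken from any real validation:

```python
# Hypothetical confusion-matrix counts from a verification run
# comparing the new assay against a reference method.
tp, fn = 48, 2   # reference-positive samples: detected / missed
tn, fp = 45, 5   # reference-negative samples: correctly negative / false calls

sensitivity = tp / (tp + fn)                 # proportion of true positives detected
specificity = tn / (tn + fp)                 # proportion of true negatives detected
accuracy = (tp + tn) / (tp + tn + fp + fn)   # overall agreement with reference

print(f"Sensitivity: {sensitivity:.2%}")
print(f"Specificity: {specificity:.2%}")
print(f"Accuracy:    {accuracy:.2%}")
```

In a real validation or verification report these point estimates would normally be accompanied by confidence intervals and compared against the pre-defined acceptance criteria.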
What do you anticipate you will learn from the experience?
- Consider the specific skills you want to develop, such as planning, executing, analysing, and documenting validation or verification studies.
- Identify the specific insights you hope to gain into the importance of validation/verification in ensuring the reliability and quality of genomic testing for solid cancers.
What additional considerations do you need to make?
- Consult actions identified following previous experiences with validations, verifications, or assay troubleshooting.
- Identify important information you need to consider before embarking on the activity, such as the specific assay or procedure being validated/verified, required performance criteria, relevant guidelines or standards (e.g., ISO 15189), and the documentation template.
In action
Is anything unexpected occurring?
- Are you noticing anything surprising or different from what you anticipated whilst performing a validation or verification experiment or documenting the process? How does this compare to previous validation/verification experiences?
- Are you encountering situations such as:
- Unexpected assay performance (e.g., lower-than-expected sensitivity at the limit of detection)?
- Difficulties with data analysis in real-time due to software errors or data format issues?
- Challenges in documenting deviations (e.g., temperature excursions) as they happened?
How are you reacting to the unexpected development?
- How is this impacting your actions? For example, are you responding to the situation appropriately? Are you troubleshooting immediately or seeking guidance from a technical specialist?
- Consider the steps you are taking in the moment, such as immediately documenting the unexpected performance metric with specific details or halting the experiment to re-calibrate equipment to rule out technical error.
- How are you feeling in the moment? For instance, is encountering this issue affecting your confidence in performing validation/verification independently? Are you feeling positive that you can reach a successful conclusion?
What is the conclusion or outcome?
- Identify how you are working within your scope of practice. For example, are you successfully documenting the unexpected assay performance deviation and proposing mitigation steps? Or are you needing support because the validation data suggests the assay fails to meet clinical requirements and requires immediate review by the lead scientist?
- What are you learning as a result of the unexpected development? For example, are you gaining insight into the critical importance of meticulous documentation when deviations occur during a validation study?
On action
What happened?
- Begin by summarising the key steps you took when performing and documenting the validation or verification experiment for the specific solid cancer assay.
- Consider specific events, actions, or interactions which felt important, such as how you analysed the data to determine performance metrics (e.g., analytical sensitivity), or how you comprehensively documented all experimental parameters and observations.
- Include any ‘reflect-in-action’ moments where you had to adapt to the situation as it unfolded, for instance, immediately pausing the experiment and checking reagent stability when initial quality control samples failed, necessitating real-time troubleshooting.
- How did you feel during this experience, e.g., did you feel focused on technical precision or stressed by the potential that the validation criteria would not be met?
How has this experience contributed to your developing practice?
- Identify what learning you can take from this experience regarding validation and verification. What strengths did you demonstrate, e.g., meticulous documentation of experimental deviations and accurate analysis of performance metrics?
- What skills and/or knowledge gaps were evident, e.g., unfamiliarity with the statistical analysis required to definitively set the assay’s limit of detection?
- Compare this experience against previous engagement with similar activities – were any previously identified actions for development achieved? Has your practice improved in interpreting technical metrics in the context of clinical requirements?
- Identify any challenges you experienced, such as needing to seek advice or clarification on scope of practice regarding whether the observed performance metrics justified the clinical use of the assay, and how you reacted to this.
What will you take from the experience moving forward?
- Identify the actions or ‘next steps’ you will now take to support the assimilation of what you have learnt, including from any feedback you have received, with regards to improving your planning and execution of technical studies.
- What will you do differently next time you approach validation, for instance, by proactively performing a more rigorous calculation of required sample size to ensure statistical power for verification metrics?
- Do you need to practise any aspect of the activity further, such as reviewing guidelines on validation design or key learning outcomes related to assay performance characteristics and quality documentation?
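As one way to approach the sample size point above, the sketch below uses the textbook normal-approximation formula for the number of samples needed to estimate a proportion (such as sensitivity) to a chosen confidence-interval half-width. The target values are purely illustrative, not prescribed performance criteria:

```python
import math

def sample_size(p_expected: float, margin: float, z: float = 1.96) -> int:
    """Normal-approximation sample size so that a 95% confidence interval
    around an expected proportion (e.g. sensitivity) has a half-width
    no larger than `margin`."""
    n = (z ** 2) * p_expected * (1 - p_expected) / margin ** 2
    return math.ceil(n)

# Illustrative target: expect ~95% sensitivity and want the 95% CI
# half-width to be at most 5 percentage points.
n = sample_size(p_expected=0.95, margin=0.05)
print(f"Positive samples required: {n}")  # 73 under these assumptions
```

Note that the normal approximation is unreliable for proportions near 0 or 1 with small n; exact binomial methods (e.g., as described in CLSI-style evaluation protocols) may be more appropriate in practice.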
Beyond action
Have you revisited the experiences?
- How have your subsequent experiences of performing validation or verification since completing this training activity led you to revisit your initial approach or decisions? For example, did a subsequent validation of a complex assay requiring specific statistical metrics force you to re-evaluate the statistical planning and documentation you applied during your first attempt at this training activity?
- Considering what you understand about validation principles, performance criteria, and quality documentation now, were the actions or considerations you identified after your initial reflection on this training activity sufficient? How have you since implemented or adapted improvements in your validation execution and documentation methodology based on further learning and experience? For example, have you proactively reviewed and integrated the statistical requirements for limit of detection determination into your verification protocols?
- Has discussing validation failures, complex statistical analyses, or the impact of verification on the release of diagnostic assays with colleagues, peers, or supervisors changed how you now view your initial experience in this training activity? For example, consider how professional storytelling with a technical lead about an initial validation that failed due to poor sample selection refined your understanding of the critical importance of meticulous experimental design and controls.
How have these experiences impacted upon current practice?
- How has the learning from this initial training activity, in combination with subsequent experiences, contributed to your overall confidence and competence in validation and quality documentation, particularly in preparing for assessments such as DOPS or OCEs where you demonstrate technical and quality skills? For example, consider how your accumulated experience in analysing performance metrics now enables you to discuss the analytical validity and reliability of assays with confidence.
- How has reflecting on this specific training activity, combined with everything you have learned since, shaped your current approach to validation/verification? How does this evolved understanding help you identify when something is beyond your scope of practice or requires escalation? For example, consider how your evolved approach means you now routinely seek advice immediately if validation data suggest the assay fails to meet clinical requirements or the manufacturer's claims.
- Looking holistically at your training journey, how has this initial experience, revisited with your current perspective, contributed to your development in interpreting QC data, including EQA and ISO 15189 requirements, and in ensuring the analytical validity of reports?
Relevant learning outcomes
| # | Outcome |
|---|---------|
| 2 | Analyse, interpret and prepare interpretive reports of clinically relevant findings for patients with central nervous system (CNS) and sarcoma tumours, and somatic and germline variants in ovarian and breast cancer. |
| 4 | Interpret QC data including bioinformatic and NGS quality metrics in relation to assay performance, EQA and ISO 15189 standards. |