

3.1 Audit results

Compliance with standards         N = 50*

7   (14%) centres report compliance of >95%

30 (60%) centres report compliance of >80%

8   (16%) centres report compliance of >70%

3   (6%) centres report compliance of >60%

1   (2%) centre reports compliance of >50%

1   (2%) centre reports compliance of <50%

* Includes data from the centre that did not complete the audit tool on Survey Monkey™
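The percentage breakdown above follows directly from the counts against N = 50. A minimal sketch to verify it (the counts are taken from the list; the dictionary structure is purely for illustration):

```python
# Counts from section 3.1 (N = 50, including the centre that did not
# complete the audit tool on Survey Monkey).
counts = {
    ">95%": 7,
    ">80%": 30,
    ">70%": 8,
    ">60%": 3,
    ">50%": 1,
    "<50%": 1,
}

n = sum(counts.values())  # should total 50
percentages = {band: 100 * c / n for band, c in counts.items()}
print(n, percentages[">80%"])  # 50 60.0
```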

3.1.1 Compliance within categories                   N = 49

Survey Monkey™ allocated a score of 1 for strongly agree, 2 for agree, 3 for disagree and 4 for strongly disagree. In the summary of responses, each statement was awarded an average score; therefore, an average Likert rating > 2.0 implies a tendency to disagree.
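The averaging described above can be sketched as follows. The response values are hypothetical, not from the audit; only the 1–4 scoring and the 2.0 threshold come from the text:

```python
# Survey Monkey scoring used in the audit summary:
# 1 = strongly agree, 2 = agree, 3 = disagree, 4 = strongly disagree.

def mean_likert(responses):
    """Average Likert rating for one statement across responding centres."""
    return sum(responses) / len(responses)

def tendency(score):
    """An average above 2.0 implies a tendency to disagree."""
    return "disagree" if score > 2.0 else "agree"

# Hypothetical statement answered by 10 centres:
responses = [1, 2, 2, 2, 3, 2, 3, 4, 2, 1]
score = mean_likert(responses)
print(round(score, 2), tendency(score))  # 2.2 disagree
```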

3.1.2 Organisational policies and procedures (questions 1-5)

Mean average score: 2.01

Range: 1.92 – 2.46

Q1 – Policy for managing placements, including numbers = 2.10

Q2 – Policy to manage multiple numbers of students in a placement = 2.46

3.1.3 Relationship with HEI (questions 6 – 14)

Mean average score: 1.87

Range: 1.60 – 2.34

Q6 – Service Level Agreement in place = 2.26

Q12 – Feedback sought and action plans agreed re students = 2.34

3.1.4 Radiotherapy centre practices (questions 15 – 31)

Mean average score: 1.86

Range: 1.54 – 2.88

Q18 – Regular clinical tutorials = 2.00

Q19 – VERT used for placement learning = 2.88

Q27 – Clinical supervision embedded = 2.22

3.2 Analysis and discussion

The category with the highest mean average score was Organisational policies and procedures. This category also demonstrated a lack of evidence ranging from 38% to 74%. Interestingly, 18 out of 50 centres also reported that there was no formal service level agreement (SLA) with their HEI.

The dissonance between agreement and evidence in this section of the audit tool was explored during visits. It became evident that the score should, in fact, be higher: some RSMs had agreed within the audit tool that policies were in place but, when questioned, stated that they were not written policies. The apparent lack of clarity about the overall student numbers a department was approved for, and the arrangements for deploying them, was explored during interviews.

Within the category Relationship with HEI(s), the lack of a formal service level agreement is highlighted. During visits, it was confirmed that there is often a good personal relationship between the RSM and university staff but it may not be formalised or, if it is, it is part of the local Trust SLA with its HEIs. The dialogue was described as positive in the sense of being constructive, although the quality of the partnership appeared variable.

Forty-seven out of 50 respondents agreed or strongly agreed that there were procedures establishing the lines of communication between centres and HEIs and that timely and appropriate information is received before, during and after placement. Comments about this suggest that educational lead/practice educator roles are important in achieving this and, during visits, RSMs and others confirmed that having people in these roles is seen as crucial to student support and wellbeing.

Question 12 (feedback being sought and action plans agreed about students within staff meetings) attracted a high score of 2.34. The written comments suggest that some RSMs did not think that staff meetings were the correct environment for this discussion, not that it did not occur.

Involvement of clinical staff in student recruitment was stated to occur in 38 out of 50 Trusts. This issue was tested during visits and it was identified that at least one HEI does not interview prospective students for radiotherapy programmes. It was also commented that staff shortages and the timing of interviews caused some centres not to participate in interviewing.

In the RT Centre Practices section, questions about the use of VERT™ and the occurrence of clinical supervision have the highest scores. Thirty-four out of 50 centres disagree or strongly disagree that VERT™ is used routinely, and this has skewed the average score in this section.

With regard to mentoring (questions 21-24 and 29), all centres stated that staff who mentor students receive formal training and, in question 44, that routine updates occur. However, the comments make it clear that the term ‘formal’ is interpreted widely. For some staff it means completing a formal academic module to prepare facilitators for learning and assessment in practice. For others, it is a day’s training that may take place at the HEI or in the department, and may or may not involve university staff directly.

A variety of terms is used to describe those who support student learning: mentor, clinical assessor, supervisor, appraiser, practice educator. Additionally, only 39 centres agreed that students always have a designated mentor. When this topic was explored during visits, it was noted that mentoring is sometimes separated from assessment and that there is variation in how assessment of practice takes place. The importance of dedicated practice educator roles in managing students’ placement learning was referred to frequently.

Comments made about question 25 (All practitioners are committed to having students and promoting their wellbeing) reveal some ambivalence and contradiction. One manager stated that some staff see this as an optional extra and highlighted the culture change needed, while another suggested that there is scope for improvement. In one case, the manager stated that supporting student learning and development is included in Personal Development Review (PDR) objectives. This issue was probed during visits and there was general acknowledgement that, while the majority of staff enjoy having students and recognise their professional responsibility in this regard, there are staff who do not value having students, are perceived as difficult to work with and whom students avoid.

Linked to this is the question of how much support students should expect from placement staff. A clear distinction was apparent between those staff who empathise with students learning in a complex and difficult clinical environment and those who believe that students expect to be ‘spoon-fed’ and fail to grasp that patient care, not the students themselves, is at the heart of the radiotherapy service.

3.3 Conclusions

The tool has been found to be reliable within the constraints identified above. Evidence was sampled at each visit and no discrepancies were identified.

There is a lack of written policy to manage the number of students that can be accommodated within centres at any one time and, in particular, when numbers entail having 2 or more students in a placement. The number of commissioned students places significant pressure on centres and promotes a perception of overcrowding, which may impact on students’ experiences of placement learning.

With regard to the ways in which students may expect to be treated, these are generally not identified specifically but are viewed as being covered by local Trust policies, especially in regard to equality and diversity. This lack of visibility in policies may impact on student support and wellbeing. In addition, there is evidence of cultural differences in how students are perceived and whether their expectations are legitimate and deserve to be met.

The relationship between radiotherapy centres in England and their education providers is viewed positively by the former, although it depends to an extent on good personal relationships and being able ‘to pick up the phone’ if there is a problem. The relationship appears to work best when it is equitable and academic and clinical components of programmes are well integrated.

Arrangements for student support and assessment are very variable. There is no consistent view of what mentoring is and how it should be employed to support student learning and development. There was a variety of opinions about whether mentoring (seen as pastoral support) and assessment (seen as objective skills development) should be undertaken by the same individual or kept separate.
