The Delphi method

The Delphi method is a consensus technique that allows the systematic collection of informed judgements from a panel of experts. Research carried out in other allied health professions has used the Delphi method to establish research priorities.8, 10, 11, 13

A modified Delphi technique was used to establish and prioritise research priorities for the radiography profession for 2016-2021, using a three-round iterative process. The first round consisted of questionnaires administered through the SurveyMonkey online software (SurveyMonkey LLC, California, USA). Data from the first round was summarised and fed back to participants in rounds two and three to enable consensus to be established and the suggested research areas to be prioritised. All data fed back was anonymised to maintain participant confidentiality. Demographic data was also collected, including age, gender, ethnic group, country of work, years of radiography experience and main specialties, along with how recently panel members had been service users or enrolled on a higher education course.


Prior to recruitment, the research team and representatives from the SCoR identified key areas within radiotherapy and diagnostic imaging from which to recruit participants for the expert panel. It was important to ensure all main specialties were represented. The expert panel included those from advanced clinical practice, research, management, clinical service provision, education and users of radiography services, as well as a range of radiography specialties including:

  • paediatric radiography;
  • ultrasound;
  • mammography;
  • reporting radiographers;
  • radiotherapy;
  • education.

The expert panel was recruited via an email request for volunteers sent to SCoR expert group members, higher education institution (HEI) course leaders, radiography service managers and service users. Course leaders from HEIs were also invited to suggest student representatives to participate in the study. A call for participants was posted on the SCoR research radiographers’ network webpage, and key individuals were also identified and contacted individually by the SCoR and by the research team. Snowball sampling was also used, with recruited panel members suggesting others whose expertise would add value to the study.

Potential participants had to fulfil the pre-specified criteria for selection (see Table 1). A recruitment target of approximately 0.4% of SCoR membership (n=90) was set to mirror other Delphi studies.8 Those who met the criteria in Table 1 were invited to be members of the expert panel (n=128). The panel included experts from all four UK countries and from a range of settings, including:

  • public healthcare;
  • private sector healthcare; and
  • Higher Education Institutions (HEIs).

As much as was reasonably practicable, the panel reflected a balance of experts across specialisms, experience and geographic locations. 

To maintain confidentiality and anonymity all panel members were assigned a unique participant ID number. 

Data was stored in a secure electronic site file on a password-protected network hosted by Sheffield Hallam University; the network was regularly backed up to ensure the security and safety of the data.



  • have managed diagnostic or therapeutic radiography services (i.e. head of department, deputy head of department or superintendent II);
  • have managed diagnostic or therapeutic radiography courses in a higher education institution (i.e. head of department/school/team, deputy head of department/school/team);
  • have published papers about diagnostic or therapeutic radiography services in peer-reviewed journals;
  • have conducted research or a practice development initiative into diagnostic or therapeutic radiography services;
  • are currently, or have been, a senior practitioner specialising in the area of diagnostic or therapeutic radiography services (consultant practitioner or advanced practitioner);
  • are currently, or have been, a user of diagnostic or therapeutic radiography services;
  • are currently, or have been, in a role that contributes to the development of health policy;
  • are currently enrolled on an undergraduate or postgraduate diagnostic or therapeutic radiography course;
  • are willing to participate in all rounds of the Delphi prioritisation process.

Table 1 Participant selection criteria

Procedure and data analysis

Ethical approval was obtained from the Health and Social Care Research Ethics Committee at Sheffield Hallam University. Once individuals had been identified as meeting the inclusion criteria for the expert panel, they were sent an invitation to complete an expression of interest (EoI) form, designed using SurveyMonkey software. The EoI form included a participant information sheet with details about the study, including its purpose, the study sponsors, confidentiality and the complaints procedure. The EoI form also included sections to input contact details and to give consent to participate in the study. Reminder emails were sent to non-responders on two occasions, five and eight working days after the original EoI form had been distributed. All rounds of the Delphi were conducted using the SurveyMonkey online questionnaire software.

Round one

Participants were requested to list up to five research priorities for the radiography profession, along with supporting statements explaining the rationale behind their choices. Participants were asked to consider prioritisation criteria when selecting their research priority areas; these were adapted by the research team from criteria used in the Chartered Society of Physiotherapy Delphi study8 (Table 2). Content analysis was used to identify themed areas, using NVivo qualitative data analysis software (QSR International Pty Ltd, Version 10, 2012). Two researchers (HP and SH) independently analysed all the priorities listed in the completed round one questionnaires. These were then collated, and any disagreement over themes was discussed until agreement was reached. In the round one analysis, care was taken to maintain the wording used by participants, to retain validity and to ensure that the themes reflected the panel’s original perspectives. Topics identified as not being research topics were removed at this stage.

  • Does the topic address a significant need or gap in the evidence for radiography practice and/or service delivery?
    e.g. Consider evidence of clinical effectiveness, risk and cost effectiveness.
  • Will the research area impact the quality of care and experience for patients, their carers, service users and members of the public?
    e.g. Consider the burden of the disease, number of people likely to benefit, likelihood of implementation of findings, and patient benefit.
  • Will the research potentially impact radiography practice?
    e.g. Consider the likelihood of implementation of findings and how many Society and College of Radiographers members are likely to utilise the evidence.
  • Will the research area potentially have an impact on managers, service providers, students and practitioners, and be relevant to government policy and priorities?
    e.g. Consider the current evidence base in relation to service delivery and cost effectiveness, likelihood of implementation of findings and how radiography services and education are likely to benefit.
  • Does the topic area address existing or emerging technology or techniques, how those might be used currently and any potential they may have for future practice?
    e.g. Consider the effectiveness of new techniques over current standard practice, or new applications of old modalities/techniques.

Table 2 Prioritisation criteria

Round two

In round two, feedback on round one was provided to the expert panel in the form of the research topics grouped into themes, along with the supporting statements in their original format. Round two required participants to rate the importance of each research topic using a 1 to 5 Likert scale (1 = very unimportant, 2 = unimportant, 3 = neither important nor unimportant, 4 = important, 5 = very important). Participants were asked to consider the prioritisation criteria as well as the supporting statements in the context of their own expert knowledge. Participants were given the option to score all returned priorities; where they felt their expertise was not sufficient to make a judgement, they could select ‘not qualified to assess this topic’. Participants were also given the option of selecting ‘not an area for research’ instead of a rating score.

Consensus was set as being achieved when the following were met:

  • a mean rating of ≥4.0;
  • a coefficient of variation (CV) of ≤30%;
  • ≥75% agreement (% of panel members scoring 4 = important or 5 = very important on the Likert scale).
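These three thresholds are straightforward to compute from a topic's ratings. The following is a minimal illustrative sketch only, not the study's actual analysis (which used SurveyMonkey and SPSS); the function name is hypothetical, and it assumes ‘not qualified to assess this topic’ and ‘not an area for research’ responses have already been excluded from the ratings.

```python
def reached_consensus(ratings):
    """Check a topic's 1-5 Likert ratings against all three consensus criteria."""
    n = len(ratings)
    mean = sum(ratings) / n
    # Coefficient of variation: standard deviation expressed as a % of the mean
    variance = sum((r - mean) ** 2 for r in ratings) / n
    cv = (variance ** 0.5) / mean * 100
    # % agreement: share of panel members scoring 4 (important) or 5 (very important)
    agreement = sum(1 for r in ratings if r >= 4) / n * 100
    return mean >= 4.0 and cv <= 30.0 and agreement >= 75.0

# For example, ratings of [5, 4, 4, 5, 4, 4, 5, 4] give a mean of about 4.4,
# a CV of about 11% and 100% agreement, so the topic reaches consensus.
```

Note that all three conditions must hold simultaneously: a high mean alone is not sufficient if the panel's ratings are widely dispersed (high CV) or if fewer than 75% of members rated the topic as important or very important.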

Round three

In round three, participants were provided with a list of the research topics that had reached consensus, along with all supporting statements, and were asked to rate each research priority area again. The consensus criteria were then reapplied to the new scores. All analyses were undertaken using SurveyMonkey software, IBM SPSS version 21 and QSR International NVivo version 10.

Research themes

Following analysis of all three rounds, the final priority areas that had reached consensus (n=133) were presented to the SCoR research group members. The research group considered which key themes embodied the priority topics identified through the Delphi consensus process. Five key themes were identified that appeared to encompass the consensus research statements (shown in Table 3).


1. Technical innovations
2. Patient and public experience
3. Service and workforce transformation
4. Accuracy and safety
5. Education and training

Table 3 Research themes identified by members of the SCoR research group following consideration of the 133 research priorities

A mapping exercise was undertaken to identify if each priority that reached consensus in the Delphi process could be matched with one of the five themes listed in Table 3.

Two members of the research group independently matched thirty of the priorities to the research themes and then reached agreement on the matching; the purpose was to ensure that each priority could easily be linked to one of the five themed areas.
