Rating general practitioner consultation performance in cancer care: does the specialty of assessors matter? A simulated patient study
dc.contributor.author | Jiwa, Moyez | |
dc.contributor.author | Halkett, Georgia | |
dc.contributor.author | Meng, Xingqiong (Rosie) | |
dc.contributor.author | Berg, Melissa | |
dc.date.accessioned | 2017-01-30T13:04:17Z | |
dc.date.available | 2017-01-30T13:04:17Z | |
dc.date.created | 2015-03-02T00:00:54Z | |
dc.date.issued | 2014 | |
dc.identifier.citation | Jiwa, M. and Halkett, G. and Meng, X. and Berg, M. 2014. Rating general practitioner consultation performance in cancer care: does the specialty of assessors matter? A simulated patient study. BMC Family Practice. 15 (152). | |
dc.identifier.uri | http://hdl.handle.net/20.500.11937/28312 | |
dc.identifier.doi | 10.1186/1471-2296-15-152 | |
dc.description.abstract |
Background: Patients treated for prostate cancer may present to general practitioners (GPs) for treatment follow-up, but may be reticent to have their consultations recorded. The use of simulated patients therefore allows practitioner consultations to be rated. The aim of this study was to determine whether the specialty of the assessor has an impact on how GP consultation performance is rated. Methods: Six pairs of scenarios were developed for professional actors in two series of consultations by GPs. The scenarios included: chronic radiation proctitis, Prostate Specific Antigen (PSA) ‘bounce’, recurrence of cancer, urethral stricture, erectile dysfunction, and depression or anxiety. Participating GPs were furnished with the patient’s past medical history, current medication, prostate cancer details and treatment, and details of physical examinations. Consultations were video recorded and assessed for quality by two sets of assessors, a team of two GPs and a team of two Radiation Oncologists, each deploying the Leicester Assessment Package (LAP). LAP scores awarded by the GP and Radiation Oncologist assessors were compared. Results: Eight GPs participated. In Series 1 the range of LAP scores was 61%-80% for GP assessors and 67%-86% for Radiation Oncologist assessors. In Series 2 the range was 51%-82% for GP assessors and 56%-89% for Radiation Oncologist assessors. Within-GP-assessor correlations for LAP scores were 0.31 and 0.87 in Series 1 and 2 respectively; within-Radiation-Oncologist-assessor correlations were 0.50 and 0.72 in Series 1 and 2 respectively. Radiation Oncologist and GP assessor scores differed significantly for four doctors and for some scenarios. Anticipatory care was the only domain in which GP assessors scored participants higher than Radiation Oncologist assessors. Conclusion: The assessment of GP consultation performance is not consistent across assessors from different disciplines, even when they deploy the same assessment tool. | |
dc.publisher | BioMed Central Ltd. | |
dc.subject | Assessment | |
dc.subject | Radiation therapy | |
dc.subject | Prostate cancer | |
dc.subject | Consultation | |
dc.subject | General practitioner | |
dc.subject | Side effects | |
dc.title | Rating general practitioner consultation performance in cancer care: does the specialty of assessors matter? A simulated patient study | |
dc.type | Journal Article | |
dcterms.source.volume | 15 | |
dcterms.source.issn | 1471-2296 | |
dcterms.source.title | BMC Family Practice | |
curtin.note |
This article is published under the Open Access publishing model and distributed under the terms of the Creative Commons Attribution License | |
curtin.department | Department of Medical Education | |
curtin.accessStatus | Open access | |