    espace - Curtin’s institutional repository

    Do "lie scales" measure lying? Evaluating applicant honesty on social desirability scales

    76786 Abstract.docx (21.08Kb)
    76786.pptx (3.740Mb)
    Access Status
    Open access
    Authors
    Hughes, Angus
    Holtrop, Djurre
    Dunlop, Patrick
    Steedman, Grace
    Chan, Joan
    Date
    2019
    Type
    Conference Paper
    
    Citation
    Hughes, A. and Holtrop, D. and Dunlop, P. and Steedman, G. and Chan, J. 2019. Do "lie scales" measure lying? Evaluating applicant honesty on social desirability scales. In: APS 13th Industrial and Organisational Psychology Conference 2019, 11th Jul 2019, Adelaide.
    Source Conference
    APS 13th Industrial and Organisational Psychology Conference 2019
    Faculty
    Faculty of Business and Law
    School
    Future of Work Institute
    URI
    http://hdl.handle.net/20.500.11937/76543
    Collection
    • Curtin Research Publications
    Abstract

    Aim: Between 30% and 50% of job applicants are thought to 'fake' on personality measures, which challenges the validity of these assessments when practitioners use them in selection (Griffith, Chmielowski, & Yoshita, 2007). Social desirability (SD) scales, sometimes known as 'lie scales', are frequently embedded in self-report personality measures to identify job applicants who respond dishonestly (Goffin & Christiansen, 2003). These scales are scored by tallying the number of positive responses to excessively virtuous and unlikely items (e.g. "There has never been an occasion when I took advantage of someone else"). However, debate remains in the literature as to how effective SD scales really are at identifying dishonest responding. For example, De Vries et al. (2018) have argued that items in SD scales may not be read literally, but are instead endorsed in good faith to signal authentic virtue. We therefore addressed three central questions using a novel think-aloud protocol methodology: A) Are item responses keyed as "dishonest" by SD scales evaluated as less honest? B) Are overall SD scale scores associated with lower honesty in responses to personality scale items? C) Are these effects stronger in high-stakes job application settings?

    Design: We collected data in two phases. In the first phase, we recruited participants to complete a questionnaire comprising twelve personality items (from the HEXACO-PI-R; Lee & Ashton, 2004) and a 17-item SD scale (SDS-17; Stöber et al., 2001), 29 items in total. Participants were randomly assigned to either a low-stakes honest-responding condition or a high-stakes simulated job application condition (a psychology clinic receptionist role). Participants in the high-stakes condition were told that the most successful applicant would receive a voucher worth $150 AUD. While participants completed the questionnaire, we applied a think-aloud protocol: all participants were instructed to verbalise their reasoning and response processes, and these spoken thoughts were recorded. In the second phase, the recordings were transcribed verbatim and presented to three to five independent judges per questionnaire participant. The judges were asked to rate each speak-aloud participant's honesty on a five-point scale based on 1) the item, 2) the spoken thoughts, and 3) the chosen response to the item.

    Method: After removing speak-aloud participants who failed the manipulation check in the high-stakes condition, and judges who exhibited careless responding, 46 speak-aloud participants (age M = 19.6, SD = 3.4; 76% female; 24 low-stakes, 22 high-stakes) completed the questionnaire while verbalising their answers, and 174 judges (3-5 per participant, M = 3.82) evaluated their responses. The three central questions were tested using multilevel models in 'lme4' for R (Bates, Maechler, Bolker & Walker, 2015). The models specified judges' honesty ratings per item as the dependent variable and condition (high vs. low stakes) as a fixed-effect independent variable, with hierarchical random effects accounting for variance between judges clustered within participants, and crossed random effects accounting for variance between items. Significance tests for the independent variables used Satterthwaite approximations for degrees of freedom.
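
    For readers unfamiliar with this kind of specification, the following is a minimal sketch in R (not the authors' code) of how such a model could be fitted. It assumes a hypothetical long-format data frame named ratings with one row per judge rating of one item response; the column names, and the use of the 'lmerTest' package to obtain Satterthwaite-approximated degrees of freedom, are illustrative assumptions.

    # Minimal sketch, not the authors' code. Assumes a hypothetical data frame
    # `ratings` with columns: honesty (a judge's 1-5 rating of one item response),
    # condition (low- vs high-stakes), and identifiers judge, participant, item.
    library(lme4)       # mixed-effects models (Bates, Maechler, Bolker & Walker, 2015)
    library(lmerTest)   # assumed here so summary() reports Satterthwaite-approximated df

    fit <- lmer(
      honesty ~ condition            # fixed effect: high- vs low-stakes condition
        + (1 | participant/judge)    # judges clustered within speak-aloud participants
        + (1 | item),                # crossed random effect for items
      data = ratings
    )

    summary(fit)  # with lmerTest loaded, fixed-effect t-tests use Satterthwaite df

    Models addressing questions A and B would presumably add the keyed "dishonest" status of each SD response and the participant's overall SDS-17 score (and their interactions with condition) as further fixed effects.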

    Results: A) Speak-aloud participants in either condition who gave responses keyed as "dishonest" did appear to receive lower honesty evaluations from judges than those who gave "honest" responses, equating to roughly .77 points less on the five-point honesty scale (t = -7.714, p < .001). B) Speak-aloud participants' overall SDS-17 scores did not appear to be associated with judges' evaluations of their honesty on responses to the personality items in either condition (β = -.027, SE = .019, t = -1.42, p = .157). C) There was no evidence of an effect of stakes on evaluated honesty (p = .132), nor of an interaction between the high-stakes condition and either 'keyed' SD responses (p = .248) or overall SDS-17 score (p = .100) on honesty.

    Conclusion: SD scales are commonly included in personality questionnaires for job applicants to screen for deceptive responding. The evaluations of the independent judges in our study, who assessed individuals' actual thought processes and rationalisations while responding to SD scales, suggest that these scales may not be effective for this purpose. While participants who gave keyed "dishonest" responses to SDS items (following the logic of the SD scale) were evaluated as less honest by judges than participants who gave keyed "honest" responses, the overall SD scale score did not predict judges' perceptions of participants' honesty on the personality questionnaire in either the low-stakes or the high-stakes applicant condition. Our results may be limited by a potentially weak manipulation, which produced few differences between the high-stakes and low-stakes conditions, although previous think-aloud evidence suggests that generally honest responding to personality measures is not uncommon in job application scenarios (Robie, Brown & Beaty, 2007). Practitioners using SD scales should therefore administer them and interpret their results with a high degree of caution, given these scales' limited ability to predict dishonest responding.

    Related items

    Showing items related by title, author, creator and subject.

    • The dynamics of Guanxi in the business context under China's economic transition
      Nie, Katherine Su (2007)
      Numerous popular business publications and academic literature have highlighted that the Chinese cultural phenomenon of guanxi has made noticeable impacts on the economic efficiency in China’s economic transition. Despite ...
    • Honest People Tend to Use Less—Not More—Profanity: Comment on Feldman et al.’s (2017) Study 1
      de Vries, R.E.; Hilbig, B.E.; Zettler, I.; Dunlop, Patrick; Holtrop, Djurre; Lee, K.; Ashton, M.C. (2018)
      © The Author(s) 2017. This article shows that the conclusion of Feldman et al.’s (2017) Study 1 that profane individuals tend to be honest is most likely incorrect. We argue that Feldman et al.’s conclusion is based on a ...
    • A randomised comparison trial to evaluate an in-home parent-directed drug education intervention
      Beatty, Shelley Ellen (2003)
      The long-term regular use of tobacco and hazardous alcohol use are responsible for significant mortality and morbidity as well as social and economic harm in Australia each year. There is a need for more cost-efficient ...