
dc.contributor.author  Venable, John
dc.contributor.author  Pries-Heje, J.
dc.contributor.author  Baskerville, Richard
dc.contributor.editor  Peffers, K.
dc.contributor.editor  Rothenberger, M.
dc.contributor.editor  Kuechler, B.
dc.date.accessioned  2017-03-15T22:06:08Z
dc.date.available  2017-03-15T22:06:08Z
dc.date.created  2017-02-24T00:09:04Z
dc.date.issued  2012
dc.identifier.citation  Venable, J. and Pries-Heje, J. and Baskerville, R. 2012. A Comprehensive Framework for Evaluation in Design Science Research, in Peffers, K., Rothenberger, M. & Kuechler, B. (ed), 7th International Conference on Design Science Research in Information Systems (DESRIST 2012), May 14 2012, pp. 423-438. Las Vegas, USA: Springer-Verlag.
dc.identifier.uri  http://hdl.handle.net/20.500.11937/49584
dc.description.abstract

Evaluation is a central and essential activity in conducting rigorous Design Science Research (DSR), yet there is surprisingly little guidance about designing the DSR evaluation activity beyond suggesting possible methods that could be used for evaluation. This paper extends the notable exception of the existing framework of Pries-Heje et al. [11] to address this problem. The paper proposes an extended DSR evaluation framework together with a DSR evaluation design method that can guide DSR researchers in choosing an appropriate strategy for evaluation of the design artifacts and design theories that form the output from DSR. The extended DSR evaluation framework asks the DSR researcher to consider (as input to the choice of the DSR evaluation strategy) contextual factors of goals, conditions, and constraints on the DSR evaluation, e.g. the type and level of desired rigor, the type of artifact, the need to support formative development of the designed artifacts, the properties of the artifact to be evaluated, and the constraints on resources available, such as time, labor, facilities, expertise, and access to research subjects. The framework and method support matching these in the first instance to one or more DSR evaluation strategies, including the choice of ex ante (prior to artifact construction) versus ex post evaluation (after artifact construction) and naturalistic (e.g., field setting) versus artificial evaluation (e.g., laboratory setting). Based on the recommended evaluation strategy(ies), guidance is provided concerning what methodologies might be appropriate within the chosen strategy(ies).

dc.publisher  Springer-Verlag
dc.subject  Information Systems Evaluation
dc.subject  Research Methodology
dc.subject  Evaluation Strategy
dc.subject  Design Science Research
dc.subject  Evaluation Method
dc.title  A Comprehensive Framework for Evaluation in Design Science Research
dc.type  Conference Paper
dcterms.source.startPage  423
dcterms.source.endPage  438
dcterms.source.title  The 7th International Conference on Design Science Research in Information Systems (DESRIST) 2012 Proceedings
dcterms.source.series  The 7th International Conference on Design Science Research in Information Systems (DESRIST) 2012 Proceedings
dcterms.source.isbn  9783642298622
dcterms.source.conference  7th International Conference on Design Science Research in Information Systems (DESRIST 2012)
dcterms.source.conference-start-date  May 14 2012
dcterms.source.conferencelocation  Las Vegas, USA
dcterms.source.place  USA
curtin.department  School of Information Systems
curtin.accessStatus  Fulltext not available

