A COMPONENT BASED CONCEPTUAL METHODOLOGY AND HOLISTIC FRAMEWORK FOR IT SYSTEM IMPLEMENTATION AND MANAGEMENT: COMBINING BOTH ACTION RESEARCH AND DESIGN SCIENCE METHODOLOGIES
Journal: MIS Quarterly
Manuscript ID: 2015-TR-13891
Category: Theory and Review
Keywords: implementation, socio-technical system, method, decision support systems, information technology management, explanatory, IS development, technical innovation, modeling, software design
In the author's opinion all conceptual aspects of IT implementation and IT management have now
been covered in one study or another. At the time of writing this paper the only
holistic model that exists which addresses all the relevant issues and offers
standardised solutions is Bulgacs's (2013) software-based Information Technology
Implementation Program (ITIP) project.
“An important distinction between lo-fi and hi-fi prototypes is that lo-fi prototypes are “passive” models (paper based) while hi-fi prototypes (software based) are dynamic models with the purpose to expose some kind of behaviour of the future
system.” (Goldkuhl, 2013, p. 15).
It must be noted that comparable methodologies to the ITIP do exist, minus the
interactive software element, but these have been created by individual companies to
target specific contextual problems and have not been tested in different environments
(Peffers et al. 2008). "One of the differences with the ITIP project approach is that it
has equalled the importance of an individual to that of a workgroup or entire
organisation and vice versa.” (Bulgacs, 2015, p. 20). This creates as objective a lens
as possible to survey the field. Sheffield (2005) stated that any further studies into
current ISs had to combine the three established streams of IS research, namely
technology, behaviour and organisation (Cole et al., 2005; Hevner & Chatterjee,
2010). Hence any balanced investigation and study of the subject has to integrate and
in some part consolidate these three subject subsets. It also means any further study
will require researchers to have multiple skill sets (Hevner & Chatterjee, 2010). Any
study that ‘black boxes’ or sidelines any of the subsets could lead to misleading
results (Fichman, 1992; Wixom & Todd, 2005). As already highlighted there are
sufficient bodies of work in each of these streams for a researcher to reference and
begin holistic model design and/or modification.
As mentioned, one issue noted in historic projects is researchers' abilities
(Bulgacs, 2014b). Historically there have been many studies that have sidelined
essential factors (Hevner et al. 2004; Stenmark, 2013). Here it is suggested that in
most cases this is because past contributors lacked knowledge in the missing fields,
e.g. why would a social scientist with a background in psychology be interested in
how the electronics inside a computer are designed or function? In relation to
human-technology interaction basic knowledge of this is essential as design and
technical characteristics have to be included in any analysis (Hevner & Chatterjee,
2010). “Even professional researchers, who ought to be ready to welcome change in
taken-as-given structures of thinking, show the same tendency to distort perceptions
of the world rather than change the mental structures we use to give us our bearings.”
(Checkland, 2000, p.18.). Cole et al. (2005) stated that any type of research must
make a dual contribution to both academia and practice. In the following sections it
will be discussed how two different paradigmatic approaches, one technical (design
science engineering), the other conceptual (action research) (Iivari et al. 1998), can be
combined to create a new process model for IT implementation and management.
There have been several papers which compare action research with design
science engineering (see Goldkuhl, 2013 for an in depth investigation) however the
aim of this paper is to create the building blocks for a workable model. In the
following sections the conceptual framework and domain limits will be established.
“Action Research aims to solve current practical problems while expanding
scientific knowledge. Unlike other research methods, where the researcher seeks to
study organizational phenomena but not to change them, the action researcher is
concerned to create organizational change and simultaneously to study the process”
(Baskerville & Myers, 2004, p.329).
Which means
"The researcher is actively involved, with explicit benefit for both researcher
and organization. Second, the knowledge obtained can be immediately applied;
there is not the sense of the detached observer, but that of an active participant
wishing to utilize any new knowledge based on an explicit, clear conceptual
framework. Third, the research is a (typically cyclical) process linking theory and practice." (Cole et al. 2005, p. 329).
IS based Action Research sits more comfortably in the organisational science domain
of the IT implementation subject (Wood-Harper, 1985). Some studies have stated that
it is more akin to a consultation methodology (Avison et al., 2007; Baskerville,
1999). This is because it is a reactive technique. “This is a major criticism of Action
Research–it does not easily produce generalisable learning” (Wood-Harper & Avison,
2003). Here it is proposed that new components can be added that will formalise the
"open-ended" techniques incorporated, making it a more standardised methodology
overall. However, a competitive advantage via IT is only sustainable while the IT and
its usage cannot be replicated (Melville et al., 2004).
“As a sustainable source of competitive advantage benchmarking is inherently
suspect since it appears to emphasize the systematic observation and replication of competitive resources rather than the design of specific applications" (Powell
and Dent-Micallef, 1997, p.383).
As stated the fundamental methodological constructs of Action Research are based in
psychology (Baskerville & Myers, 2004). This is not conducive to creating a holistic
IS implementation model when factoring in technological characteristics. The
problems this creates can clearly be seen when deconstructing Davis’s (1986)
Technology Acceptance Model (TAM) and its subsequent revisions and modifications
(Abugabah & Sanzogni, 2014; Bagozzi, 2007; Benbasat and Barki, 2007;
Venkatesh et al. 2003). "Information systems is a multi-perspective discipline and
should have a pluralism of research methods.” (Wood-Harper, 1985, p. 165). Bulgacs
(2014a, p.154) states “attempting to view technological implementation from an
acceptance perspective gives the researcher an impression of a multidimensional if
not infinite subject. It is posited that this is the wrong way to deconstruct IT
implementation methodology.” Due to the human element of technology management
a balance (or re-balance) had to be found between flexible and
prescribed/standardised techniques. In response to these unbalanced approaches
Avison & Wood-Harper (1990b) proposed the Multi-view framework:
“The main motivation for Multi-view was to include these human or organisational
aspects fully into information systems development. But we also wanted to suggest
that information systems development was not a step-by-step, prescriptive process,
but iterative and sometimes applied differently as circumstances dictated.”
(Wood-Harper & Avison, 2003. p. 6)
In practice the multi-view framework is applied using Action Research techniques
(Bell & Wood-Harper, 2003). However, as an implementation methodology the
Multi-view concept lacks definitive domains and is fuzzy in its determinants and
limits (Iivari et al. 1998). This is due to the effect context has on the applied 'soft
systems' method. Also, diagnosis paths through the Multi-view model are unclear and
even confusing due to the model's non-sequential nature. Although it included
conceptual technology maintenance and modification facets, they were
underdeveloped and, as in the TAM and comparable models, it provides no hard or
quantitative technical solutions. As mentioned, these are factors that are essential to
human-computer interaction analysis. Later, Multi-view 2 was developed in an attempt
to address these issues (Avison et al., 1998). However, again it was interpretive in its
approach (Baskerville, 1999) and did not offer standardised solutions.
"The Multi-view framework has been applied in a number of situations. None
describes Multi-view working perfectly in an organization according to prescription, but all have delivered lessons furthering Multi-view development.
Multi-view includes tools and techniques blended into a common approach,
each used on a contingency basis, that is, as appropriate for each problem situation." (Avison et al., 1999, p. 95).
Some researchers such as Hevner et al. (2004) and Peffers et al. (2008) have
attempted to take a more structured approach to the subject and utilised
engineering principles. For instance, Bulgacs (2013) incorporated the human decision
making process into an interactive software program. “Design Science research is
poised to take its rightful place as an equal companion to natural science research in
the Information Systems (IS) field.” (Hevner, 2007, p.87).
“Design Science, as conceptualized by Simon (1996), supports a pragmatic research paradigm that calls for the creation of innovative artefacts to solve
real-world problems. Thus, Design Science Research combines a focus on the IT
artefact with a high priority on relevance in the application domain.” (Hevner & Chatterjee, 2010, p. 9).
Hevner proposed that an ‘artefact’ could be designed that incorporated processes that
allowed standardised practical solutions to be offered for IT implementation and
management problems. Peffers et al. (2008) agreed and stated that there was enough
published research for a conceptual model to be created. They also stated that the
Design Science Research Methodology (DSRM) could be used in different contextual
settings. The approach generated a standardised method, which is an issue Action
Research practitioners have been struggling with (Sein et al. 2011). However
"Current DR methods are based on stage-gate models in that they separate and
sequence building and evaluation. Thus, they do not support the conditions necessary
• Should the new model/process being designed be an action research methodology
that includes system design factors, or be a system design method that incorporates
socio/political aspects?
Hevner & Chatterjee (2010, p. 18) state “The risk comes when experts in other
research paradigms attempt to apply their standards of rigor to design research
projects in which creative inspiration or gut instinct may lead to design decisions.”
“From a practical standpoint, it is expected that if a new system or method is
constructed it should provide better solutions to IS problems than existing systems or methods – that is, it should be faster, more efficient, more elegant, or
have some other feature that makes it a superior solution.” (Burstein & Gregor,
1999, p.128)
As stated, there are many conceptual models which have been utilised in IS-based
research; this is confounding, as most of these work on an expository basis (Peffers et
al. 2008), i.e., they show how a system works but offer no way of improving them or
any solutions to 'hard' issues (Straub & Burton-Jones, 2007). Conceptual models are
quantified in some part and presented as guidelines, processes and procedures to
follow. These “guidelines” are attempts to standardise methodology. This is useful to
engineers because standardisation facilitates smooth integration of both off-the-shelf
and custom components during initial development and subsequent maintenance
phases (Hasselbring, 2002).
Terminology use within the IS discipline has become a problem due to
researchers from different academic disciplines using words and terms which,
although familiar to those from the same discipline, can mean something completely
different to a researcher from another. This ultimately creates confusion and
misunderstanding (Bulgacs, 2015). For instance implemented technology has been
referred to as an artefact (Orlikowski and Iacono, 2001) but then the methodology has
also been described as an artefact (Lee, 2007). Here this is considered a mistake: the
technology is exactly that, technology or Information Technology; you could go even
further and specifically state that the technological "artefacts" are in fact personal
computers! In relation to using computers to solve problems, Iivari et al. (1998, p.
175) state "As artefacts, they do not describe any existing reality rather they create a
new one". Hence descriptive analysis and explanations are useful for theorizing and
finding reference points when evaluating which notation and conceptual tools to
include in practical models (Iivari et al. 1998). So clearly, selecting as general a
terminology as possible is preferable, but when you add new factors and
explanations, which discipline's language is best to use? Establishing a standardised
language is one of the first steps in defining a discipline. However will a combined
Action Research/Design Science model require the creation of more conceptual tools,
techniques and tasks? Any words that describe these would have to be selected
carefully so as not to add to the confusion.
As the manner in which IT systems function are reliant on the contextual
environment said environment needs to be investigated so specific context is
understood by the researcher. In many cases it is likely that after an initial
investigation ‘off the shelf’ solutions could be offered rather than a customized one.
Clients – customers: 'beneficiaries or victims affected by the system's activities'.
Actors – actors: 'agents who carry out, or cause to be carried out, the main activities of the system, especially its main transformation'.
Transformation – transformation process: 'the means by which defined inputs are transformed into defined output' (where input is the current situation and output is the desired situation).
World view – Weltanschauung: 'an outlook, framework or image that makes this particular root definition meaningful'.
Owner – ownership of the system: 'some agency having a prime concern for the system and the ultimate power to cause the system to cease to exist'.
Environmental constraints – environmental constraints: 'features of the system's environments and/or wider systems which it has to take as "given"'.
(Basden & Wood-Harper 2006, p. 62.)
CATWOE is also comparable to object-oriented analysis. Both methods use the notions of
system, subsystem and wider system (Checkland, 2000). We will not go too deeply into SSM
here as this would distract from the overall purpose of this paper. Transformation is
especially interesting in the CATWOE model as this relates to strategic alignment
between company goals and intended use of technology.
1. You must accept and act according to the assumption that social reality is socially
constructed, continuously.
2. You must use explicit intellectual devices consciously to explore, understand
and act in the situation in question.
3. You must include in the intellectual devices ‘holons’ in the form of
systems models of purposeful activity built on the basis of declared
worldviews. (Checkland, 2000, p.38.)
Any change/improvement process in a company that relates to IT is a dynamic
process that changes as more information is discovered. The human element obviously
includes debates with those involved in the situation being investigated, "but do not
expect the debate to be tidy or predictable; be deft, light on your feet, ready to follow
where the debate leads, unready to follow any dogmatic line." (Checkland, 2000, p.
33.) In some respects Action and Design Science research are both qualitative and quantitative. It is the point where these two techniques meet that integration between
the two methodologies can begin.
4.2 Practical framework, architecture and boundaries
Here a method is taken from soft systems analysis (Checkland, 1993; Checkland
2000). This initially requires the person carrying out the analysis to view the system
being evaluated holistically. This is because the componentisation process cannot
begin until the limits or domain of the issue have been defined. The objective of the
evaluation is to improve an IT/IS system in some defined respect. This means viewing the
problem being addressed in the "rich" sense and attempting to include as many facets
and factors as possible (Monk & Howard, 1998), keeping in mind the three most
important aspects: human behaviour/activity, organisation and technology. Soft
systems methodology moves the initial view from a specific problem to “the idea of a
situation which some people, for various reasons, may regard as problematical.”
(Checkland, 2000, p. 15). However the rich picture itself could create problems later
in the analysis as the researcher is already beginning to define where the issue may be.
This can be solved by allowing each “component” of the diagram to be modifiable.
The initial framework includes the three main aspects of IS management, those being
the aforementioned technological, behavioural and organisational. The first question
that needs to be asked is "what is the problem that needs addressing?". Here it is
suggested that the analyst uses the definitions of technology, organisation and
behaviour defined by Bulgacs's (2013) ITIP project. The first step is to define the
problem domain and parameters; here this is IT technology management. The second
is to identify the primary elements; in this case they are behaviour, technology and
organisation. Then issues can be subdivided into components and added to the
specific elements under which the problem is defined. Furthermore, Bulgacs (2014a,
p.154) stated “A reference point needed to be identified that would include all actors
(developers, managers and users) the domain limits were found to be at the point of
sale, i.e., intersection of a person/company selling IT and a person/company wishing
to buy IT”.
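The three steps just outlined (define the problem domain, identify the primary elements, subdivide issues into components) can be sketched programmatically. In this minimal sketch all issue names are illustrative assumptions, not taken from the ITIP:

```python
# Step 1: define the problem domain.
domain = "IT technology management"

# Step 2: identify the three primary elements.
elements = {"behaviour": [], "technology": [], "organisation": []}

def add_component(element, issue):
    """Step 3: attach a subdivided issue to the element it is defined under."""
    elements[element].append(issue)

add_component("technology", "legacy system integration")
add_component("behaviour", "user resistance to the new interface")
add_component("organisation", "misalignment between IT spend and strategy")

for element, issues in elements.items():
    print(domain, "->", element, "->", issues)
```

Each component remains modifiable and can itself be subdivided further, reflecting the observation below that componentisation can continue virtually indefinitely.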
To create a set of guidelines or standard process you have to follow the
definitions set out in the domain description/framework. "An application domain
consists of the people, organizational systems, and technical systems that interact to
work toward a goal." (Hevner & Chatterjee, 2010, p. 17). Once the study framework
has been set, the componentization of all incorporated factors can go on virtually
indefinitely, starting at the broad domain subject area down to, let's say, how the brain
functions electro-chemically. Once the "problem/issue" has been defined, create
possible solutions to it. Sein et al. (2011) suggested building a conceptual
solution/model to a problem initially (alpha version), then using the information
created in this stage to design an actual system prototype (beta version), essentially a
simulation.
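The staged build attributed above to Sein et al. (2011), a conceptual alpha version whose decisions feed a prototype beta version, can be caricatured as follows. The class names and design decisions are purely illustrative assumptions:

```python
class AlphaModel:
    """Conceptual solution built from the initial problem analysis."""
    def __init__(self, problem):
        self.problem = problem
        self.design_decisions = []

    def record(self, decision):
        """Capture a design decision made during the alpha stage."""
        self.design_decisions.append(decision)

class BetaPrototype:
    """Working prototype derived from the alpha-stage decisions."""
    def __init__(self, alpha):
        # The alpha's recorded decisions become the beta's requirements.
        self.requirements = list(alpha.design_decisions)

alpha = AlphaModel("slow order processing")
alpha.record("automate order validation")
alpha.record("integrate with the existing stock database")

beta = BetaPrototype(alpha)
print(beta.requirements)
```

The point of the sketch is the information flow: nothing enters the beta prototype that was not first made explicit in the alpha stage.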
In AR, while carrying out an exploratory activity it is very easy to miss an
essential evaluative factor, such as not interviewing the most important person
involved in the company's IT processes. Hence a researcher has to see past the
scenery to view the real environment. The researcher needs to create an
inter-subjective model (which includes both subjective and objective factors).
“Design research projects are often performed in a specific application context and the resulting designs and design research contributions may be clearly
influenced by the opportunities and constraints of the application domain.
Additional research may be needed to generalize the research results to broader domains." (Hevner & Chatterjee, 2010, p. 15).
Researchers and practitioners have to be included in an implementation project as this
allows the inclusion of practical as well as technical issues (Bulgacs, 2014a).
"Action design researchers bring their knowledge of theory and technological
advances, while the practitioners bring practical hypotheses and knowledge of
organizational work practices." (Sein et al. 2011, p. 43). Action Research breaks
down into activities that have to be carried out to solve the problem situation
contextually. "Good design science research often begins by identifying and
representing opportunities and problems in an actual application
are coming up with completely different solutions (Wood-Harper & Avison, 2003). It
is essentially an anti-standardisation technique that, holistically, only propagates IT
standardisation and connectivity issues. The classic definition of AR is localised and
does not create generalisable knowledge (Goldkuhl, 2013).
“AR views organizations as a configuration of interacting variables, some of
which are highly interdependent; to introduce change into this configuration,
one begins with several possible points of intervention and discovers that change may require manipulation of several variables” (Cole et al. 2005, p. 329).
In Action Research it is assumed that you cannot know the best solution to a problem
until an analysis has been carried out (Baskerville, 1999; Coughlan & Coghlan,
2002). However, as a researcher will be dealing with some form of standardised IT
system, and there have been many publications on the subject, it is very unlikely they
will come up with a completely original solution. Hence an experienced researcher
will have a general idea of what will fix the problem before it is diagnosed. Sein et
al. (2011, p. 45) stated "At the time of publication, we reasoned that the methodology
used, canonical action research, provided little support for interweaving the building
of the IT artefact, intervening in the organization, and evaluating."
A noticeable problem is those from a social science background choosing
social, cultural and political solutions to problems over technical ones. “Action
research in local situations is concerned not with social facts but with study of the
myths and meanings which individuals and groups attribute to their world and so
make sense of it.” (Checkland, 2000, p. 42.) Some researchers and practitioners state
that Action Research is a “flexible” technique whereas others argue that it is too open
to interpretation and lacks cohesive rules and processes (Avison et al., 1999; Bulgacs,
2015; Holwell, 1997). When you consider that the entire issue revolves around using
technology to solve problems it soon becomes apparent how bizarre a position this is
to take. Essentially if you minimise the technological component of IT management,
which exact technology based problems are going to be solved? Soft Systems/Action
Research is already a quantitative methodology at heart, e.g. try to explain to someone
what an in-house IT system is without using technical jargon. Even though some
Action Researchers have tried to avoid a technical focus it clearly isn't possible.
Indeed this is one of the problems with the methodology, it doesn’t really seem to like
itself! Action Research practitioners use mental models to define what individuals are
doing within a process (Checkland, 2000), as mentioned, this is componentising the
issue. Many of those who have published on the subject appear to have lost sight of
this. For a comprehensive investigation no fundamental aspect of the IS being
evaluated can be sidelined or ‘black boxed’. In a way Action Research black boxes
everything initially and only opens the boxes when they are found to be relevant.
“Action research should be conducted in such a way that the whole process
is subsequently recoverable by anyone interested in critically scrutinizing the
research. This means declaring explicitly, at the start of the research, the intellectual frameworks and the process of using them which will be used to
define what counts as knowledge in this piece of research." (Checkland, 2000).
“Under the CPM, the researcher conducts an independent diagnosis of the
organization, plans actions based on that diagnosis, and then implements and
evaluates those change actions. Following a change intervention, the researcher
reflects on intervention outcomes and makes an explicit decision whether to proceed through an additional change cycle." (Cole et al. 2005, p. 331).
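The change-process methodology (CPM) cycle quoted above (diagnose, plan, act, evaluate, reflect, decide whether to continue) can be expressed as a simple loop. This is a minimal sketch; the stopping rule used here, a fixed improvement threshold, is an assumption for illustration only:

```python
def cpm_cycle(diagnose, plan, act, evaluate, max_cycles=5, threshold=0.05):
    """Run diagnose -> plan -> act -> evaluate, reflecting after each cycle."""
    outcomes = []
    for cycle in range(max_cycles):
        diagnosis = diagnose()
        actions = plan(diagnosis)
        act(actions)
        improvement = evaluate()
        outcomes.append(improvement)
        # Reflection: proceed only while interventions still pay off.
        if improvement < threshold:
            break
    return outcomes

# Toy run: each intervention yields diminishing returns.
gains = iter([0.4, 0.2, 0.04])
result = cpm_cycle(
    diagnose=lambda: "bottleneck in order entry",
    plan=lambda d: ["retrain staff", "modify the system"],
    act=lambda a: None,
    evaluate=lambda: next(gains),
)
print(result)
```

In this toy run the third cycle's improvement falls below the threshold, so the researcher's explicit decision is to stop rather than proceed through a further change cycle.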
The continual evaluation of the effects that environment and context (particularly
human-human and human-technology interaction) have on the design process is
something that is lacking in typical design led approaches (Greenberg & Buxton,
2008; Redish, 2007). "The iterations stop either when the organization decides to
adopt or reject the ensemble artefact, and/or when the contributions of additional
cycles are marginal." (Sein et al. 2011, p. 42). Most notable is the fact that in Action
Research the change cycle generally involves either training of staff or modifying the
technology, both of which are relatable to design methodology. However, from a
design perspective the change cycle also includes the introduction of a possible prototype IT artefact/system
(McLaren & Buijs, 2011).
4.4 Design Science
Design Science "is fundamentally a problem-solving paradigm. It seeks to
create innovations that define the ideas, practices, technical capabilities, and
products through which the analysis, design, implementation, and use of
information systems can be effectively and efficiently accomplished".
(Hevner and Chatterjee 2010, p. 11).
Design Science research emanates from the engineering disciplines (Goldkuhl, 2013).
“Engineering research culture places explicit value on incrementally effective
applicable problem solutions" (Peffers et al. 2008, p. 49). "One of the defining features of the design science approach is that the evaluation of the new IS artefact
involves broadly answering the question 'How well does it work?' rather than merely
how valid or reliable it is" (McLaren & Buijs, 2011, p. 3). One of the advantages of
the Design Science approach is that it is a method familiar to those involved in design
of any description such as engineering, architecture and the arts (Hevner, 2007).
Which is to say the designer generally has some idea what the final artefact will be
(even a painter knows that the final outcome will not be a song but be a painting of
some description).
"[While] design, the act of creating an explicitly applicable solution to a problem, is an accepted research paradigm in other disciplines, such as engineering, it has been
employed in just a small minority of research papers published in our own best
journals to produce artefacts that are applicable to research or practice." (Peffers et al. 2008, p. 49.)
Design science research aims to create prescribed solutions to problems utilising
standardised processes and techniques (Goldkuhl, 2013). “This articulation of the
class of problems, the class of solutions, and the design principles for this class
directly satisfied Action Design Research's generalization principle." (Sein et al.
2011, p. 50). Historically the Action Research and Design Engineering paradigms
have evolved separately from each other (Cole et al. 2005).
“Justifying the value of a solution accomplishes two things: it motivates the
researcher and the audience of the research to pursue the solution and to accept
the results and it helps to understand the reasoning associated with the
researcher’s understanding of the problem. Resources required for this activity
include knowledge of the state of the problem and the importance of its solution.”
(Peffers et al. 2008, p.60).
"In the early 1990's the IS community recognized the importance of Design Science
Research to improve the effectiveness and utility of the IT artefact in the context of
solving real-world business problems." (Hevner & Chatterjee, 2010, p. 9). As with
many engineering projects, initially Hevner et al. (2004) did not include a theorize
and justify phase. "The main result from design research is the artefacts designed"
(Goldkuhl, 2013, p. 10). However, how is an engineer supposed to conceptualise a
technical solution to a problem without "theorizing"? This is a major oversight in
Goldkuhl's study. The difference between an engineer and a social scientist
conceptualising a problem is that an engineer generally does not consider abstract
concepts. In many respects conceptual models are a form of knowledge engineering.
In the case of designing standardised IT implementation formulas and techniques
abstract concepts are nigh on useless. That is to say an engineer can only analyse
variable abstract factors; they cannot evaluate them. Hence they will view any kind of
abstraction in designing a standardised model as pointless. “The main reason for this
lack of attention is probably due to the fact that conceptual modelling is more of an
‘art’ than a ‘science’ and therefore it is difficult to define methods and procedures”
(Robinson, 2008, p. 281). However, design engineering is fed by behavioural research
information/data for the evaluation to process; hence two more questions, Q2 and Q3,
are asked. Once the user has input the information, the function can carry out the
evaluation and then open the next question set. This process is repeated until the
program converges on a solution. As can be seen, this connects the decision-making
diagram in Figure 1 to the componentisation diagram in Figure 7.
Essentially, each function is asking what decision needs to be made and what
information is required to make the decision. The researcher/analyst needs to order
the questions from the highest to the lowest perceived relevance.
However, the order may need to be changed, or an evaluation/component modified or
redefined, later in the analysis. Essentially the information gained from asking one set
of questions selects the next set of questions asked (sequential). The iterative IS/IT
design cycle needs to be adaptable to the contextual socio-political environment the
IT “artefact”/system is going to function within. The results of any system
intervention can be analysed and used to guide the design process and aid in
generating innovations. "Various forms of the organizational context are thus
inscribed into the artefact during its development and use." (Sein et al. 2011, p. 40).
"The emerging artefact, as well as the theories ingrained in it, are continuously instantiated and repeatedly tested through organizational intervention and
subjected to participating members' assumptions, expectations, and
knowledge. This highly participatory process builds organizational commitment and guides the eventual design of the ensemble artefact."
(Sein et al. 2011, p. 42).
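The sequential question-set mechanism described above, in which the answers to one set select the next set until the program converges on a solution, can be sketched as follows. The question wording and routing table are hypothetical, not taken from the ITIP software:

```python
# Hypothetical question sets, ordered from highest to lowest perceived relevance.
questions = {
    "Q1": "What decision needs to be made?",
    "Q2": "What information is required to make it?",
    "Q3": "Is an off-the-shelf solution adequate?",
}

# (current set, answer) -> next set; an answer with no entry ends the walk.
routing = {
    ("Q1", "select an IT solution"): "Q2",
    ("Q2", "technical constraints"): "Q3",
}

def evaluate(answers):
    """Walk the question sets; each answer selects the next set until none remains."""
    trail, current = [], "Q1"
    while current is not None:
        answer = answers[current]
        trail.append((current, answer))
        current = routing.get((current, answer))
    return trail

path = evaluate({"Q1": "select an IT solution",
                 "Q2": "technical constraints",
                 "Q3": "yes"})
print([q for q, _ in path])
```

Because the routing entries, like the components they correspond to, can be modified or reordered later in the analysis, the diagnosis path is adaptable to the contextual environment.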
Any model created has to be understood and usable by both technologists and
managers (Hevner & Chatterjee, 2010). It is no good being entirely technological when an
average business manager needs to understand the explanations, and vice versa.
5 Methodology comparison.
Every company has a business architecture that can be used to describe all the processes, tasks, activities, hierarchies, departments, functions, etc. that exist within it (McMillan, 2002). As stated, each of these individual descriptions can be componentised and connected using object-oriented techniques. However, consensus on how the steps and inter-connections of the activities related to IT management and implementation should be ordered remains elusive (Sein et al. 2011). Peffers et al. (2008, p. 46) conceptually componentised the steps of design science research as: (1) problem identification and motivation, (2) definition of the objectives for a solution, (3) design and development, (4) demonstration, (5) evaluation, and (6) communication.
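To make the componentised-steps idea concrete, the sketch below wires design science research activities together as objects whose output feeds the next, in the object-oriented style described above. The class name, step labels, and placeholder work functions are our own illustrative assumptions, not Peffers et al.'s notation.

```python
# Illustrative sketch: each design-science step as a component whose output
# feeds the next. Names and placeholder behaviour are hypothetical.

from typing import Callable, List


class StepComponent:
    def __init__(self, name: str, work: Callable[[dict], dict]):
        self.name = name
        self.work = work

    def run(self, context: dict) -> dict:
        context = self.work(dict(context))        # each component transforms a shared context
        context.setdefault("trace", []).append(self.name)
        return context


def run_pipeline(steps: List[StepComponent], context: dict) -> dict:
    for step in steps:                            # sequential: one step's output is the next's input
        context = step.run(context)
    return context


# Six design-science activities as components (work functions are placeholders).
dsrm = [
    StepComponent("identify_problem", lambda c: {**c, "problem": "legacy system"}),
    StepComponent("define_objectives", lambda c: {**c, "objective": "reduce cost"}),
    StepComponent("design_and_develop", lambda c: {**c, "artefact": "prototype"}),
    StepComponent("demonstrate", lambda c: c),
    StepComponent("evaluate", lambda c: {**c, "evaluation": "fit for purpose"}),
    StepComponent("communicate", lambda c: c),
]
```

Because each step is a self-contained object, individual components can be modified, redefined, or re-ordered later in the analysis without rewriting the rest of the chain, which is the practical appeal of componentisation.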
References

Bulgacs, S. (2015) The Fourth Phase of Creating a Standardised International Information Technology Implementation, Modification and Maintenance Software Application: The ITIP Project. The International Journal of Business and Systems Research, Vol. 9, No. 1, pp. 1-31.
Burstein, F. & Gregor, S. (1999) The Systems Development or Engineering Approach to Research in Information Systems: An Action Research Perspective. Proceedings of the 10th Australasian Conference on Information Systems, Australia, pp. 122-135.
Cater-Steel, A., Rout, T. and Toleman, M. (2005) Addressing the Challenges of Replication
Surveys in Software Engineering Research, IEEE.
Checkland, P. (1993) Systems Thinking, Systems Practice, John Wiley and Sons, West Sussex, England.
Checkland, P. (2000) Soft Systems Methodology: A Thirty Year Retrospective. Systems Research and Behavioural Science, Vol. 17, No. 1, pp. 11-58.
Cole, R., Purao, S., Rossi, M. & Sein, M. (2005) Being Proactive: Where Action Research meets Design Research. Proceedings of the Twenty-Sixth International Conference on Information Systems, Las Vegas, pp. 325-336.
Conradi, R., Fernstrom, C. and Fugetta, A. (1994) A Conceptual Framework for Evolving Software Processes, in Finkelstein, A., Kramer, J. and Bashar, A.N. (Eds.): Software Process Modelling and Technology, John Wiley and Sons, West Sussex, England.
Coughlan, P. & Coghlan, D. (2002) Action research for operations management. International Journal of Operations & Production Management, Vol. 22, No. 2, pp. 220-240.
Davis, F.D. (1986) A Technology Acceptance Model for Empirically Testing New End-User Information Systems: Theory and Results. Doctoral dissertation, MIT Sloan School of Management, Cambridge, MA.
DeSanctis, G. & Gallupe, R.B. (1987) A foundation for the study of group decision support systems. Management Science, Vol. 33, No. 5, pp. 589-609.
Fichman, R.G. (1992) Information Technology Diffusion: A Review of Empirical Research,
ICIS (92) Proceedings of the Thirteenth International Conference on Information
Systems, University of Minnesota, Minneapolis, USA, pp.195–206.
Goldkuhl, G. (2013) Action research vs. design research: using practice research as a lens for comparison and integration. 2nd Workshop on IT Artefact Design & Workpractice Improvement, Tilburg.
Gössler, G., Graf, S., Majster-Cederbaum, M., Martens, M., & Sifakis, J. (2007). An
approach to modelling and verification of component based systems. In SOFSEM
2007: Theory and Practice of Computer Science (pp. 295-308). Springer Berlin
Heidelberg.
Greenberg, S., & Buxton, B. (2008). Usability evaluation considered harmful (some of the time). In Proceedings of the SIGCHI Conference on Human
Factors in Computing Systems, pp. 111-120.
Guazzelli, A., Stathatos, K., & Zeller, M. (2009). Efficient deployment of predictive analytics
through open standards and cloud computing. ACM SIGKDD Explorations
Newsletter, Vol. 11, No. 1, pp. 32-38.
Hasselbring, W. (2000) Information system integration. Communications of the
ACM, Vol. 43, No. 6, pp. 32-38.
Hasselbring, W. (2002) Component-Based Software Engineering. In S.K. Chang (ed.):
Handbook of Software Engineering and Knowledge Engineering, Vol. 2, World Scientific, Singapore.
Henderson, J.C. and Venkatraman, N. (1993) ‘Strategic alignment: leveraging
information technology for transforming organizations’, IBM Systems Journal,
Vol. 32, No. 1, pp.4–16.
Hevner, A. (2007) A Three Cycle View of Design Science Research. Scandinavian
Journal of Information Systems, Vol. 19, No. 2, pp. 87-92.
Hevner, A. & Chatterjee, S. (2010) Design Research in Information Systems: Theory and Practice. Springer, New York.
Hevner, A.R., March, S.T., Park, J. & Ram, S.(2004) Design Science in Information Systems
Research. MIS Quarterly, Vol. 28, No. 1, pp. 75-105.
Holwell, S.E. (1997) Soft systems methodology and its role in information systems. Ph.D. Dissertation, Lancaster University, UK.
Hsu, C.H. and Lin, J.C.C. (2008) Acceptance of Blog usage: the roles of technology acceptance, social influence and knowledge sharing motivation. Information and Management, Vol. 45, pp. 65-74.
Hu, X., Zeigler, B.P. & Mittal, S. (2005) Variable Structure in DEVS Component-Based Modelling and Simulation. Simulation, Vol. 81, No. 2, pp. 91-102.
Iivari, J., Hirschheim, R. & Klein, H. (1998) A Paradigmatic Analysis Contrasting
Information Systems Development Approaches and Methodologies.
Information Systems Research, Vol.9, No.2, pp. 164-193.
Järvinen, P. (2007) Action research is similar to design science. Quality & Quantity, Vol. 41, pp. 37-54.
King, R. and He, J. (2006) A Meta-analysis of the Technology Acceptance Model. Information and Management, Vol. 43, No. 6, pp. 740-755.
Lee, A. (2007) Action is an artifact: What action research and design science offer each other. In Kock, N. (Ed.) Information Systems Action Research: An Applied View of Emerging Concepts and Methods, Springer.
Leek, T. & Peng, R. (2015) What is the Question? Science, Vol. 347, No. 6228, pp. 1314-1315.
Legris, P., Ingham, J. and Collette, P. (2002) Why do People Use Information Technology? A Critical Review of the Technology Acceptance Model. Information & Management, Vol. 40, pp. 191-204.
Liew, A. and Sundaram, D. (2005) Complex decision making processes: their model and support. Proceedings of the 38th Hawaii International Conference on System Sciences, 2005.
Lin, C.T. & Lee, C.S.G. (1991) Neural-network-based fuzzy logic control and decision system. IEEE Transactions on Computers, Vol. 40, No. 12, pp. 1320-1336.
Mamdani, E. H., & Assilian, S. (1975). An experiment in linguistic synthesis with
a fuzzy logic controller. International Journal of Man-Machine Studies,
Vol. 7, No. 1, pp. 1-13.
March, S. & Hevner A. (2007) Integrated Decision Support Systems: A Data
Warehousing Perspective. Decision Support Systems, Vol. 43,
pp. 1031-1043.
March, S. & Storey, V. (2008) Design Science in the Information Systems Discipline: An Introduction to the Special Issue on Design Science Research. MIS Quarterly,
Vol. 32, No. 4, pp. 725-730.
McLaren, T. & Buijs, P. (2011). A design science approach for developing information
systems research instruments. [online] http://www.rug.nl/staff/p.buijs/design_science
Melville, N., Kraemer, K. and Gurbaxani, V. (2004) Information Technology and
Organizational Performance: An Integrative Model of IT Business Value. MIS
Quarterly, Vol. 28, No. 2, pp.283–322.
Mitroff, I & Silvers, A (2008) Dirty Rotten Strategies: How We Trick Ourselves into
Solving the Wrong Problems Precisely, Stanford, California: Stanford
Business Books.
Monk, A. & Howard, S. (1998) The rich picture: a tool for reasoning about work context. Interactions, Vol. 5, No. 2, pp. 21-30.
Myers, M.D. (1995) A Disaster for Everyone to See: An Interpretive Analysis of a Failed IS Project. Accounting Management and Information Technology, Vol. 5, No. 4, pp. 185-201.
O'Brien, R. (2001), “An Overview of the Methodological Approach of Action Research”, In Roberto Richardson (Ed.) “Theory and Practice of Action Research” Brazil:
Universidade Federal da Paraíba http://www.web.ca/~robrien/papers/arfinal.html,
(Accessed 20/4/2015)
Orlikowski, W.J. and Iacono, C.S. (2001) ‘Research commentary: desperately seeking the
‘IT’ in IT research – a call to theorizing the IT artefact’, Information Systems
Research, Vol. 12, No. 2, pp.121–134.
Peffers, K. Tuunanen, T., Rothenberger, M. & Chatterjee, S. (2008) A Design Science
Research Methodology for Information Systems Research. Journal of
Management Information Systems, Vol. 24, No. 3, pp. 45-78.
Pries-Heje, J & Baskerville, R. (2008) The Design Theory Nexus. MIS Quarterly.
Vol. 32, No. 4, pp. 731-755.
Powell, T.C. and Dent-Micallef, A. (1997) Information Technology as Competitive Advantage: The Role of Human, Business and Technology Resources. Strategic
Management Journal, Vol. 18, No. 5, pp.375–405.
Rapoport, R.N. (1970) Three Dilemmas in Action Research with Special Reference to the Tavistock Experience. Human Relations, Vol. 23, No. 6, pp. 499-513.
Robey, D. and Boudreau, M.C. (1999) Accounting for Contradictory Organizational Consequences of Information Technology: Theoretical Directions and Methodological Implications. Information Systems Research, Vol. 10, No. 2, pp. 167-185.
Robinson, S. (2008). Conceptual modelling for simulation Part I: definition and
requirements. Journal of the Operational Research Society, Vol. 59, No. 3,
pp. 278-290.
Sawang,S. (2008) Innovation Implementation Effectiveness: A Multi-organizational Test of
Klein, Conn and Sorra's Model. Ph.D. Thesis, Queensland University of
Technology.
Schepers, J. and Wetzels, M. (2006) A Meta-analysis of the Technology Acceptance Model:
Investigating Subjective Norm and Moderation Effects. Information and
Management, Vol. 44, pp. 90-103.
Shmueli, G. & Koppius, O. (2011) Predictive analytics in information systems research. MIS Quarterly, Vol. 35, No. 3, pp. 553-572.
Sein, M., Henfridsson, O., Purao, S., Rossi, M. & Lindgren, R. (2011) Action Design Research. MIS Quarterly, Vol. 35, No. 1, pp. 37-56.
Sheffield, J. (2005) Systemic knowledge and the V-model. International Journal of Business Information Systems, Vol. 1, Nos. 1/2, pp. 83-101.
Stahl, G. (2000) Collaborative Information Environments to Support Knowledge Construction by Communities. AI & Society, Vol. 14, No. 1, pp. 1-27.
Stenmark, D. (2013) Distrust in Information Systems Research: A Need for Stronger Theoretical Contributions to Our Discipline. IEEE Computer Society, Proceedings:
46th Hawaii Conference on System Sciences, pp. 4532-4540.
Straub Jr., D.W. and Burton-Jones, A. (2007) Veni, Vidi, Vici: Breaking the TAM Logjam. Journal of the Association for Information Systems, Article 5, Vol. 8, No. 4.
Turner, M., Kitchenham, B., Brereton, P., Charters, S. and Budgen, D. (2010) Does the Technology Acceptance Model Predict Actual Use? A Systematic Literature Review. Information and Software Technology, Vol. 52, pp. 463-479.
Redish, J. (2007) Expanding usability testing to evaluate complex systems. Journal of Usability Studies, Vol. 2, No. 3, pp. 102-111.
Venkatesh, V., Morris, M.G., Davis, G.B. and Davis, F.D. (2003) User Acceptance
of Information Technology: Toward a Unified View. MIS Quarterly, Vol. 27, No. 3, pp.425–478.
Vidgen, R., Wood-Harper, T. & Wood, R. (1993) A Soft Systems Approach to Information
Systems Quality. Scandinavian Journal of Information Systems, Vol. 5, No.1, pp 97-112.
Volta, A. (1800). On the electricity excited by the mere contact of conducting substances of
different kinds. In a letter from Mr. Alexander Volta, FRS Professor of Natural Philosophy in the University of Pavia, to the Rt. Hon. Sir Joseph Banks, Bart.
KBPRS. Philosophical transactions of the Royal Society of London, pp. 403-431.
Walsham, G (1995) Interpretive Case Studies in IS Research: Nature and Method.
European Journal of Information Systems, Vol. 4, pp. 74-78.
Wells, M.A. (2012) Organisational Change and Acceptance: Perspectives of the Technology
Acceptance Model. in Vaidya, K. (Ed.): Inter-Organizational Information
Systems and Business Management: Theories for Researchers, IGI Global,
Pennsylvania, USA.
White, S.E., Dittrich, J.E. & Lang, J.R. (1980) The effects of group decision-making process and problem situation complexity on implementation attempts. Administrative Science Quarterly, Vol. 25, No. 3, pp. 428-440.
Wieringa, R. & Morali, A. (2012) Technical action research as a validation method in information systems design science. Proceedings DESRIST 2012, LNCS 7286, Springer, Berlin.
Wixom, B.H. and Todd, P.A. (2005) A Theoretical Integration of User Satisfaction and
Technology Acceptance. Information Systems Research, Vol. 16, No. 1, pp.85–102.
Wood-Harper, T. (1985) Research Methods in Information Systems: Using Action Research. Mumford et al. (Eds.), Elsevier Science Publishers, Holland.
Wood-Harper, A.T. & Avison, D.E. (2003) Bringing Social and Organisational Issues into Information Systems Development: The Story of Multi-view. In Clarke, S., Coakes, E., Hunter, M. & Wenn, A. (Eds.) Socio-technical and Human Cognition Elements of Information Systems, Information Science Publishing, London, pp. 5-21.
Wood-Harper, T. & Wood, B. (2005) Multi-view as Social Informatics in action: Past,
Present and Future. Information Technology and People, Vol. 18, No. 1,
pp. 26-32.
Xiangrong, L. (2010) Hierarchical Decision Making with Supply Chain Applications,
PhD thesis, Drexel University, North America.
Zadeh, L.A. (1996) Fuzzy logic = computing with words. Fuzzy Systems, IEEE