
Applied Ergonomics 44 (2013) 1004–1014

Contents lists available at SciVerse ScienceDirect

Applied Ergonomics

journal homepage: www.elsevier.com/locate/apergo

Evaluating ecommerce websites cognitive efficiency: An integrative framework based on data envelopment analysis

Corrado lo Storto*

University of Naples Federico II, Department of Industrial Engineering, Piazzale V. Tecchio n. 80, 80125 Naples, Italy

Article info

Article history:
Received 25 March 2011
Accepted 28 March 2013

Keywords:
Cognition
Human–computer interaction
Electronic commerce
Website
Efficiency
Data envelopment analysis
Uncertainty
Ambiguity

* Tel.: +39 081 768 2932; fax: +39 081 768 2154.
E-mail address: [email protected].

0003-6870/$ - see front matter © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.
http://dx.doi.org/10.1016/j.apergo.2013.03.031

Abstract

This paper presents an integrative framework to evaluate ecommerce website efficiency from the user viewpoint using Data Envelopment Analysis (DEA). The framework is inspired by concepts drawn from theories of information processing and cognition, and considers website efficiency as a measure of its quality and performance. When users interact with the website interfaces to perform a task, they engage in a cognitive effort, sustaining a cognitive cost to search, interpret and process information, and experiencing either a sense of satisfaction or dissatisfaction for that. The amount of ambiguity and uncertainty they perceive, and the search (over-)time during navigation, determine the size of the effort and, as a consequence, the cognitive cost they have to bear to perform their task. Conversely, task performance and result achievement provide the users with cognitive benefits, making interaction with the website potentially attractive, satisfying, and useful. In total, 9 variables are measured, classified in a set of 3 website macro-dimensions (user experience, site navigability and structure). The framework is implemented to compare 52 ecommerce websites that sell products in the information technology and media market. A stepwise regression is performed to identify the cognitive costs and benefits that most affect website efficiency.

© 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.

1. Introduction

The World Wide Web (WWW) has dramatically changed the way consumers buy goods and services, collect information to compare products, and companies conduct their business. Consumers can choose among several websites when surfing the net, and hence websites compete to grasp their attention and receive visits in the future. Recent statistics indicate that although there is a growing number of people that use the WWW to search for product information, to do price comparisons, and to collect useful information in order to make their purchasing decision, the number of actual online purchases remains relatively small (Moe and Fader, 2004). Findings from research carried out by IPSOS Belgium in 2006 (IPSOS, 2008a,b) showed that a not negligible percentage (38%) of consumers did not buy online because they found it difficult to make a choice among a great range of available products, features and parameters. Poor website design can be a major factor that negatively affects consumer purchasing decisions; as a consequence, the profitability of ecommerce businesses suffers, and retailers may be losing a large percentage of potential sales simply because their websites are confusing and difficult to use (Lais, 2002; Nielsen, 2001). Several scholars found that website characteristics may affect the perception of website quality developed by the users and, as a consequence, their purchasing behavior and decision outcome (Dholakia and Rego, 1998; Ho and Wu, 1999; Jarvenpaa and Todd, 1997; Ranganathan and Ganapathy, 2002; Wolfinbarger and Gilly, 2002). Good or improved website design may even enhance the credibility of information content, supporting, for instance, the navigation of anxious users who are more susceptible to deception (Singh and Iding, 2011). Thus, for those businesses that use the WWW as a marketing channel it is crucial to assess the appreciation rate for their website against either the competitors' websites or more conventional marketing tools and channels, and to determine what makes a consumer satisfied with the website, as well as what might be potential causes of dissatisfaction (Djamasbi et al., 2010).

In the literature, several methods and approaches to website evaluation have been proposed. They differ as to the goal of the evaluation (usability measurement, website effectiveness evaluation, interface features definition, website quality assessment), how information and data useful for the evaluation are collected (observations and log activity, heuristics, focus groups, questionnaires, brainstorming, website features enumeration, cognitive walk-through and user testing), and the type and nature of the information collected (navigation, website content, customer behavior, searching mechanisms, website security, technical features) (see, for instance: Bauer and Scharl, 2000; Chen and Macredie, 2005; Cox and Dale, 2002; Cunliffe, 2000; Day, 1997; Grandon and Ranganathan, 2001; Jordan, 1998; Koufaris et al., 2002; Magoutas et al., 2010; Mich et al., 2003; Nielsen, 1995; Nielsen and Loranger, 2006; Palmer, 2002). In particular, scholars and practitioners who focused their efforts on the evaluation of website design quality and usability provided a number of guidelines and criteria for this aim, stressing the importance of usability assessment in the study of online purchasing behavior as a measure of website quality and use (Agarwal and Venkatesh, 2002; Nielsen, 2000; Palmer, 2002). But, as some researchers emphasize, even when a website has low usability, other features, e.g. its visual appeal, may be important factors that affect the user's positive judgment of that website (Djamasbi et al., 2010; Lindgaard et al., 2006; Schenkman and Jönsson, 2000). In addition, traditional website usability evaluation privileges the use of qualitative data, and usability approaches are in general unable to capture users' emotions and experiences (Chamorro-Koc et al., 2009). As Hahn and Kauffman (2004) claim, a number of issues make the approaches that are successfully adopted to assess electronic interfaces not well suited for ecommerce website evaluation, particularly: 1) as websites are continuously redesigned and updated, costly experts have to be frequently involved in a continuous evaluation process, and 2) users of ecommerce websites may be very heterogeneous customers whose habits, preferences, and attitudes differ. As a consequence, website design has to match the needs of a differentiated population of users, and website performance assessment should be conducted involving a variegated sample of them.

1 That is consistent with the view of cognition as a phenomenon distributed between individuals and the artifacts used to perform a task (e.g., paper notes, documents, more or less complex technological devices, electronic interfaces, etc.) (Flor and Hutchins, 1992; Hutchins, 1995b; Hutchins and Klausen, 1996; Rogers, 2004; Suchman, 1987).

Although in recent years many scholars have provided frameworks and a number of methods to evaluate ecommerce websites specifically (Boyd, 2002; Merwe and Bekker, 2003), there is generally a lack of theoretical justification for the frameworks and evaluation criteria they adopt. Moreover, these approaches also neglect, or do not effectively model, the cognitive process of online consumers that determines how they perceive the quality of the websites they are experiencing. Indeed, information processing and cognition are central activities when consumers interact with websites (Zhang and von Dran, 2000).

This paper presents an integrative framework, inspired by concepts drawn from various theories of cognition and information processing, i.e., cognitive load, distributed cognition, cognitive fit, and media richness (Clark and Chalmers, 1998; Daft and Lengel, 1986; Hutchins, 1995a,b; Lave, 1988; Rogers and Ellis, 1994; Ramarapu et al., 1997; Vessey, 1991), that helps to assess and compare ecommerce websites. The framework adopts the concept of cognitive efficiency as a measurement of website quality and performance, measured by the ratio of user perceived cognitive benefits to cognitive costs. Further, it uses Data Envelopment Analysis to calculate the website cognitive efficiency score. The framework is implemented to compare 52 ecommerce websites of businesses selling goods in the information technology and media market (PC equipment, music CDs, DVDs, books). A stepwise regression analysis is performed to identify the cognitive costs and benefits that most affect website cognitive efficiency. The paper has the following structure. Firstly, information processing and cognition issues related to website usage are discussed. Secondly, the user–website interaction cognitive efficiency is presented. Thirdly, the sources of cognitive costs and benefits are analyzed. Next, the methodology for the measurement of website cognitive efficiency and the implementation of the framework are illustrated. Finally, results are reported and discussed.
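The stepwise regression mentioned here can be sketched as a greedy forward-selection loop that, at each step, adds the regressor most improving adjusted R-squared. This is an illustrative reconstruction, not the paper's actual procedure; the variable names, the toy data, and the adjusted-R-squared stopping rule are assumptions:

```python
import numpy as np

def adj_r2(X, y):
    """Adjusted R-squared for a design matrix X that already includes an intercept."""
    n, k = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    r2 = 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
    return 1.0 - (1.0 - r2) * (n - 1) / (n - k)  # k counts the intercept column

def forward_stepwise(X, y, names, tol=1e-6):
    """Greedily add the candidate regressor that most improves adjusted R-squared."""
    selected, remaining, best = [], list(range(X.shape[1])), -np.inf
    while remaining:
        scores = []
        for j in remaining:
            design = np.column_stack([np.ones(len(y)), X[:, selected + [j]]])
            scores.append((adj_r2(design, y), j))
        score, j = max(scores)
        if score - best <= tol:          # stop when no candidate helps
            break
        best = score
        selected.append(j)
        remaining.remove(j)
    return [names[j] for j in selected]

# toy data: the response depends only on two of four candidate variables
rng = np.random.default_rng(0)
X = rng.random((60, 4))
y = 2.0 * X[:, 0] + 3.0 * X[:, 2]        # exact dependence, no noise
picked = forward_stepwise(X, y, ["A", "U", "T", "S"])
```

With the exact dependence above, the loop selects the two true regressors and then stops, since no further variable can raise adjusted R-squared.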

2. The website evaluation framework

2.1. Information processing and cognition of website users

Ecommerce websites, acting as decision support systems, allow consumers to perform problem-solving and decision-making more efficiently along the stages of online purchasing, providing them with an aid capable of reducing their cognitive effort and increasing their information processing and cognitive ability (Helander and Khalid, 2000). The interaction occurring between the user and the website forms a unique coupled cognitive system in which the website, providing the user with patterns of action and an additional short-term memory, supports the latter by keeping the internal mental information processing activity less complex (Clark and Chalmers, 1998; Halverson, 1994). In such a way, external resources (i.e., personal notes, paper catalogs, webpages, etc.) become a part of the user's cognitive system (Edmondson and Beale, 2008).1 The representations of information and knowledge by means of a website interface have a number of properties that may greatly affect the user's cognitive effort when performing a task on the WWW (Carroll, 1987; Wright et al., 2000; Zhang, 1991):

- firstly, as underlined, they can provide memory aids: because information resides in front of the user, it does not have to be remembered. The website supports cognition not just passively by merely representing itself, but actively by registering and storing user activities for future use, thus functioning like an external memory (Clark, 1997; Kirsh, 1996; Kirsch and Maglio, 1994). When the user surfs the WWW, the browser, as an external representation, can advise whether a page has already been visited, remembering which pages the user has already visited and which not yet. For instance, links to pages that have not been seen by the user are blue, while links to previously seen pages appear purple or red. Hence, when the user interacts with the website interface, information and knowledge are continuously transformed through mental, external and technological representational states, and in terms of distributed cognition achievements are characterized as the assembling of various representational states (Staggers and Norcio, 1993; Zhang and Norman, 1994);

- secondly, they can anchor and structure the user's cognitive behavior, because physical structures in the external representations constrain the range of possible cognitive behaviors, allowing some of them and prohibiting others. The website indeed provides filtering devices and structures to represent and store information and knowledge through the use of symbols, constraints and working rules which are complementary to the user's own (see Table 1);

- thirdly, they change the nature of the task, making its performance easier by simplifying the problem and reducing the user's information load. To a certain extent, the website influences how user cognition evolves during interaction, as the website dynamically assumes new configurations that better fit the user's information processing capability;

- fourthly, technology makes website interfaces very flexible decision support systems. Hence, by removing or adding different features and functions such as text, audio, colors, interactivity, customization, etc., websites may assume several configurations and, consequently, have different degrees of richness, becoming more or less responsive to the information needs of the user and to task characteristics (Palmer, 2002; Carlson and Zmud, 1999);

- finally, they provide information that can be directly perceived and used without being interpreted and formulated explicitly, as it emerges from the physical constraints. For instance, easily identifiable and meaningful icons on a website page may reduce perceived complexity and support user navigation unequivocally (Cheng and Patterson, 2007).

Table 1
The distributed information and knowledge structures.

Website interface:
- Plans (the sequence of actions, events and states that can occur to perform a certain task)
- Goals (a defined state of the website)
- Possible actions (the set of actions that the user intuitively believes can be undertaken to achieve a goal)
- History (the sequence of actions, events and states occurred during user–interface interaction)
- Action–effect relation (a causal relation between an action/event and a state)
- State (the value of the objects at every instant of the interaction)

User:
- Schemes and mental models
- Situation knowledge
- Cognitive style

2 Literature differentiates these two cognitive states of individuals. In particular, uncertainty is a consequence of the perceived lack of information (Daft and Lengel, 1986; Galbraith, 1973). This lack of information increases with task complexity, i.e. the variety of its constituting actions and their interrelationships. When a great number of actions have to be executed, it is not easy to define accurately all the details before starting task execution, and new information has to be acquired during interaction as unexpected events produce a gap in the knowledge. On the other side, ambiguity emerges when there might be several different interpretations, even contrasting ones, of a certain situation (Daft and Weick, 1984). When either tasks or actions cannot be correctly framed, their execution cannot be easily planned and has ambiguous aspects.
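As a purely illustrative sketch, the website-side structures of Table 1 (plans, goals, possible actions, history, the action–effect relation, and state) could be modeled as a small data structure; every name below is hypothetical and not part of the paper's framework:

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List, Set

@dataclass
class InterfaceState:
    """Rough, hypothetical model of the website-side structures of Table 1."""
    goal: str                         # a defined state of the website
    plan: List[str]                   # sequence of actions toward the goal
    possible_actions: Set[str]        # actions the user believes available
    history: List[str] = field(default_factory=list)        # past actions/events
    state: Dict[str, object] = field(default_factory=dict)  # current object values

    def act(self, action: str, effect: Callable[[Dict], Dict]) -> None:
        """Action-effect relation: an allowed action transforms the state."""
        if action not in self.possible_actions:
            raise ValueError(f"action not available: {action}")
        self.history.append(action)
        self.state = effect(self.state)

# hypothetical usage: a user working toward a checkout goal
s = InterfaceState(goal="order placed",
                   plan=["search", "add_to_cart", "checkout"],
                   possible_actions={"search", "add_to_cart", "checkout"})
s.act("search", lambda st: {**st, "results": 12})
s.act("add_to_cart", lambda st: {**st, "cart": 1})
```

The `possible_actions` set plays the role of the constraints that allow some cognitive behaviors and prohibit others, while `history` corresponds to the external-memory function discussed above.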

2.2. The website cognitive efficiency in the user–website interaction

As human short-term working memory capacity is limited, a website should be capable of eliminating or reducing the working memory overload associated with the need to mentally retain and integrate several pieces of information and knowledge but, at the same time, should provide the user with a pleasant and satisfying experience (Sweller, 1994). Cognitive efficiency, then, may be an effective measure of how much cognitive work the users have to perform outside of their working memory in a given task, given the constraining nature of the website interfaces. Theories of cognitive efficiency suggest that, when the user's cognitive tasks can be performed easily and quickly, the cognitive effort is minimized and the search performance and the user's gratification are maximized. If information is processed almost automatically with the support of the website, the user's cognitive efficiency may substantially increase. Thus, better websites allow the user to achieve greater cognitive efficiency during interaction. Following this discussion, an efficiency index can be constructed as the ratio of the user's perceived cognitive benefits to the cognitive costs incurred during the interaction with the website:

efficiency = Σ cognitive benefits / Σ cognitive costs    (1)

2.3. Sources of cognitive costs and benefits

During the interaction with the website, the user has to bear a cognitive cost because of the efforts made to search, interpret and process information (Suchman, 1987). If the website user is either unable to locate critical information or to manage the information overload, and feels disoriented because the information already collected is either redundant or ambiguous, proceeding with the search and retrieving the supplementary information necessary to accomplish the task will be difficult (Germanprez and Zigurs, 2003). That will increase the user's perception of task complexity and bewilderment, causing the search activity to become psychologically and cognitively costly (Chevalier and Kicka, 2006; Sweller, 1988, 1994; Wilkie, 1994). The amount of ambiguity and uncertainty perceived by the user is a measure of the cognitive costs to bear when using the website.2 Literature suggests that individuals perceive and process uncertain and ambiguous information differently, depending on their cognitive capabilities, past experience, and attitudes (Grisé and Gallupe, 2000; Kuhlthau and Tama, 2001; Tversky, 1972), as their cognitive and learning style works as a sort of internal mechanism filtering or activating their cognitive capabilities (Anderson, 2006). In the distributed cognition view, the perceived uncertainty and ambiguity are not only related to the type of task to be accomplished using the website or to the user's information processing capabilities, but also to the website features themselves. Poor website design requires greater attention from the user to grasp the logic and the way the website works. Vice versa, a well designed website interface may simplify access to information and reduce both ambiguity and uncertainty. The time needed to search for information also affects the amount of cognitive cost the user has to bear in the interaction with the website. The more the user has to think about how to use the website interface, the fewer cognitive and perceptual resources will be available to perform the main task. Time perception is thus associated with mental workload (Baldauf et al., 2009). A lower amount of time consumed to achieve a task is usually associated with more efficient information processing and effective website design (Hong et al., 2004). When the website includes both highlighted and non-highlighted options, or differently colored options, the search time for collecting useful information depends both on the time necessary to find the target when there is a highlighted option and on the time necessary to find the target when there is no highlighted option or this option is differently colored (Crosby et al., 2003). Search time for information is strongly related to website features, i.e. the density and visual complexity of the background (Drury et al., 1978; Tuch et al., 2009). Moreover, a low speed at which website pages load and commands are carried out contributes to making the user frustrated. However, when the interaction with the website is really enjoyable, the user may become so engaged in the interaction, so fully immersed in what they are doing and focused on the task, as to be totally unaware of the time passing. In this case, the user's perception of time becomes distorted, and an objective measure of it might not correspond to the real cognitive cost sustained. Indeed, much more time may be spent completing the task if the interaction with the website is pleasing and the satisfied website user has fallen into a state of flow (Hoffman et al., 2000; Hoffman et al., 2002). Thus, an objective measurement of time may even be misleading in the assessment of website quality.

Interacting with the website interface is for the user also a source of gratification and satisfaction, which are associated both with the quality of the interaction itself and the way it develops, and with successful goal attainment. Online shopping is intrinsically more uncertain and risky than conventional shopping for the consumer,


because of the inability to check the product or service quality before buying and receiving it at home, to find enough information about the seller, or to monitor the safety of sending personal and financial information. Further, it provides the consumer with a different experience from shopping in a physical retail store and, as the web store is unable to reproduce the environment and atmosphere of a physical retailer due to the technical limitations of the electronic interface, the website design has to compensate for their loss (East, 1997; Engel et al., 1995). Thus, users will generally be more attracted by websites that put them at ease, nourishing their trust. Scholars found that a positive experience during online shopping is an important determinant of retaining online consumers (Rice, 1997), and the quality of a retailing website is a dominant antecedent of user satisfaction within the online shopping environment (Wolfinbarger and Gilly, 2002). Csíkszentmihályi (1990) uses the concept of "states of optimal experience" or "flow" to refer to a cognitive state of the user which is characterized as intrinsically enjoyable, accompanied by a loss of self-consciousness, and self-reinforcing.3 Vellido et al. (2000) found that the ease of use of the online store website and the user's perceived control over it are major determinants of the user satisfaction that explains purchasing on the web. As empirical studies reveal (Csíkszentmihályi, 1990; Hoffman et al., 2000), a state of flow is generally associated with a good balance between how challenging the users perceive the interaction with the website to be and to what extent their skills match the website's complexity. This means that interaction with a website which is apparently extremely easy to use might be as frustrating as interaction with an ambiguous or technically deficient website. That is a major concern that should be taken into account when using conventional approaches to the assessment of website usability.

3. The implementation of the framework

3.1. The website cognitive efficiency measurement

The measurement of the cognitive efficiency of a website implies the development of a complex model that relates a set of input and output variables associated with measurements of the user's perceived cognitive costs and benefits. For this aim, scholars have suggested the implementation of Data Envelopment Analysis (DEA), a mathematical non-parametric multi-criteria technique based on a seminal idea proposed by Farrell (1957) that provides a measure of the relative efficiency of a number of units based on a not necessarily known or pre-defined conversion process of inputs into outputs (Charnes et al., 1978). In particular, DEA adopts linear programming to identify a production frontier as the locus of efficient units. In the original formulation by Charnes et al. (1978) (CCR model), units which are placed on the production frontier are 100% efficient, while units that are displaced away from the frontier are considered not efficient. If a unit is less than 100% efficient, a linear combination of other units yields the same level of outputs consuming a lower amount of inputs than is actually used by that unit. This technique makes it possible to avoid some of the critical assumptions made to measure efficiency, assessing the relative efficiency of units without the need to introduce any a priori assumption about the functional relationships between input and output factors in these units to build the efficiency frontier. As the standard development of the DEA model produces an efficiency measure which is between 0 and 1 and does not generate a ranking of units, Andersen and Petersen (1993), modifying the original CCR model, introduced the concept of super-efficiency, which makes the efficiency analysis more discerning and provides a full, uncensored ranking of unit efficiency (Lovell and Rouse, 2003; Zhu, 1996). Details on how DEA works and on the re-formulated model measuring super-efficiency are illustrated in the Appendix.

3 In the WWW context, flow describes an online experience where the users are completely engaged with their online interaction with the website, judging it intrinsically interesting to the extent that they often perceive a sense of control over the website interaction, losing track of time passing and of the immediate physical surroundings (Hoffman et al., 2002; Trevino and Webster, 1992; Webster et al., 1993).
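As a rough illustration of the input-oriented CCR envelopment model just described, and of the Andersen and Petersen super-efficiency variant (which simply excludes the evaluated unit from the reference set), the linear program can be solved with an off-the-shelf LP solver. The data, the variable names, and the scipy-based formulation below are illustrative assumptions, not the paper's implementation:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o, super_eff=False):
    """Input-oriented CCR efficiency of unit o.
    X: (n, m) inputs (cognitive costs); Y: (n, s) outputs (cognitive benefits).
    With super_eff=True, unit o is dropped from the reference set
    (Andersen-Petersen super-efficiency)."""
    n = X.shape[0]
    peers = [j for j in range(n) if not (super_eff and j == o)]
    # decision variables: [theta, lambda_1, ..., lambda_k]; minimise theta
    c = np.r_[1.0, np.zeros(len(peers))]
    # inputs:  sum_j lambda_j * x_ij <= theta * x_io
    A_in = np.hstack([-X[o][:, None], X[peers].T])
    # outputs: sum_j lambda_j * y_rj >= y_ro   (rewritten as <=)
    A_out = np.hstack([np.zeros((Y.shape[1], 1)), -Y[peers].T])
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(X.shape[1]), -Y[o]],
                  bounds=[(0, None)] * (len(peers) + 1),
                  method="highs")
    return res.fun  # optimal theta

# invented data: 4 websites, 2 cost inputs (e.g. A, T), 1 benefit output (e.g. S)
X = np.array([[2.0, 4.0], [3.0, 3.0], [4.0, 2.0], [5.0, 5.0]])
Y = np.array([[1.0], [1.0], [1.0], [1.0]])
scores = [ccr_efficiency(X, Y, o) for o in range(len(X))]
super_scores = [ccr_efficiency(X, Y, o, super_eff=True) for o in range(len(X))]
```

In this toy dataset the first three units lie on the frontier (score 1), the fourth is dominated (score 0.6), and super-efficiency scores above 1 discriminate among the frontier units.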

3.2. The input and output variables

Alpar et al. (2001) propose a productivity model for websitescomparison treating page views as output, and webpages, scripts,and some web constructs as inputs. Alpar and Donthu (2007)conceptualize shopping websites as a computer information sys-tem and sales channel. Data measuring input and output variablesin their model are collected from three different sources: siteanalysis by means of a webcrawler, website visiting statisticsretrieved by a market research company, and the measurement ofvisitor’s perceptions. Hahn and Kauffman (2004) also provide aconceptualization of a website as a production function thattransforms inputs (functions) into outputs (purchase transactions).Adopting this model, they evaluate one single ecommerce website.Rather than benchmarking different websites, the scholarscompare completed purchase transactions performed by a greatnumber of customers that use the same ecommerce website. Intheir DEA model, inputs consist of customers use of a set of websitefunctions, while the only output is the product type basket atcheckout. Finally, Benslimane and Yang (2006) implemented DEAto identify the minimum function set that generates the maximumvalue for buyers that use commercial websites for electronic pro-curement. Particularly, they consider four website functions(identification, selection, execution, and post sale support) as in-puts, and three dimensions of website commercial value (useful-ness, reduced research and processing cost) as outputs. Literatureon website usability has suggested three macro-dimensions (orareas) useful for measuring performance of awebsite (Jordan,1998;Nielsen, 1995, 2000; Nielsen and Loranger, 2006): user experience,website navigability and structure. These dimensions are used inthe proposed framework tomeasure the user cognitive benefits andcosts (see Fig. 1). 
In particular, the "experience" dimension includes those cognitive benefits that a user receives in terms of perceived satisfaction (S), usefulness (US) and attractiveness (AT) from the use of the website. The remaining two dimensions, "navigability" and "structure", are related to website characteristics that can originate cognitive costs for the user during the interaction with it. These characteristics can generate ambiguity (A) and uncertainty (U), and determine an over-consumption of time (T) in the search for useful information.

Fig. 1. A measure of website efficiency. (The figure maps the benefit variables, usefulness, satisfaction and attractiveness, to the "experience" dimension, and the cost variables, ambiguity (A), uncertainty (U) and time (T), to the "structure" and "navigability" dimensions.)


C. lo Storto / Applied Ergonomics 44 (2013) 1004-1014

Thus, the website efficiency index assumes the following shape:

\[
\text{efficiency} = \frac{\sum \mathrm{Benefit}_{\mathrm{Experience}}(US, S, AT)}{\sum \mathrm{Cost}_{\mathrm{Structure}}(A, U, T) + \sum \mathrm{Cost}_{\mathrm{Navigability}}(A, U, T)} \tag{2}
\]
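Read literally, with all weights fixed to one, Eq. (2) is a simple benefits-to-costs ratio; in the DEA model the weights are instead chosen by the optimization. A toy sketch with hypothetical 9-point-scale scores (all numbers below are invented for illustration):

```python
def cognitive_efficiency(us, s, at, structure_costs, navigability_costs):
    """Unweighted sketch of Eq. (2): sum of experience benefits over the
    sum of structure- and navigability-related cognitive costs.
    In the actual DEA model each term carries an optimized weight;
    here all weights are fixed to 1 for illustration."""
    benefits = us + s + at
    costs = sum(structure_costs) + sum(navigability_costs)
    return benefits / costs

# Hypothetical scores for one website: costs ordered as (A, U, T)
ratio = cognitive_efficiency(us=6.5, s=6.4, at=6.3,
                             structure_costs=[3.5, 3.1, 3.2],
                             navigability_costs=[3.3, 3.2, 3.4])
```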

3.3. The sample

The framework is implemented to compare 52 websites of businesses that sell their products online (PC equipment, music CD-ROMs, DVDs, books).

The literature emphasizes how, generally, usability cannot be conceptualized independently of the context where it should be assessed: its definition and implementability depend both on the specific goals and tasks that can be performed by means of the website and on the website user target (Lecerof and Paterno, 1998). Scholars also found that the user perception of website ease of use and of control over the user-website interface interaction varies with the kind of task that has to be accomplished (Guriting and Ndubisi, 2006; Laforet and Li, 2005). Hence, in the preparatory stage of the study, particular attention was paid to defining and circumscribing a specific task and sample of users. Consistently with the tradition and methods of usability research, which suggest that website assessment be performed through intense user involvement, the evaluation framework was implemented in an experimental setting according to the following steps:

a) customization of the questionnaire to collect judgments relative to the 52 websites of the sample.4 The original questionnaire was designed and tested in a previous study by the author (lo Storto, 2004a,b). The questionnaire was designed to measure all nine variables of the framework, which provide proxy measurements of the user cognitive benefits and costs (i.e., the cognitive cost induced by a sense of ambiguity during website navigation, the cognitive cost induced by the website structure, the cognitive cost associated with the perception of time passing while navigating in search of useful information, the experience linked to website attractiveness, etc.). Constructs measuring the framework variables were built using a list of statements, adopting scales reported in the literature and already tested (see lo Storto, 2004a,b). The users were requested to express their agreement/disagreement with the statement content using a 9-level measurement grid. After the questionnaires were filled in, the reliability of the scales was measured both by calculating the Cronbach α index and by performing a factor analysis.5 Values were acceptable and consistent with previous studies (lo Storto, 2004a,b) (see Table 2). The first section of the questionnaire included a scale to classify the cognitive and learning style of the user.6 This scale adopted the PLSI (Paragon

4 The number of websites in the sample is consistent with the sum of input and output variables. Andersen and Petersen (1993) underline that, unless the sum of the input and output variables is small relative to the number of units, DEA implementation generally identifies a large number of units as 100% efficient. Further, input and output variables are usually highly correlated with one another. Thus, the greater the number of input and output variables, the less discerning the analysis (Jenkins and Murray, 2003).

5 At this stage, the sample used to check scale reliability included 84 questionnaires.

6 Taking into account the influence of the cognitive and learning style of website users is of paramount importance to design successful websites. Indeed, scholars claim that information processing capabilities are influenced by personal characteristics of the consumers (McGuire, 1976), by their past experience (Tversky, 1972), and by cognitive ability (Grisé and Gallupe, 2000). Newell and Simon (1972) suggest that every individual processes information in a unique way, showing a different capability to do so. Jones et al. (1998) found that males and females react differently to images, while Meyers-Levy and Maheswaran (1991) point out that males focus their attention on a more limited number of issues when processing information. Similar findings emerged in eye-tracking experiments (Lorigo et al., 2006).

Educational Consulting Student Learning Style Inventory) model developed by Shindler (1992), which is based on the Kolb (1984), Jung (1971), and Myers-Briggs and McCaulley (1992) frameworks.7 Information on the cognitive and learning style of the students who were involved in the study as website users was therefore used to select a reduced number of website evaluators so as to preserve a high degree of homogeneity;

b) on the front page, the questionnaire also requested the user to provide information on the technology adopted (i.e., speed of connection, PC motherboard chip frequency, etc.) and on the evaluation process (i.e., amount of time dedicated to the website assessment, number of steps into which the process was fragmented, etc.). This information was utilized to select the questionnaires composing the sample used to implement DEA, making it more homogeneous and assuring high internal validity;

c) the questionnaire was administered to a sample of 135 computer science engineering, biomedical engineering and automation engineering undergraduate students at the beginning of the term, providing them with all information on the aim of the study and on questionnaire filling, and inviting them to return it no later than 4 weeks. Each student was requested to assess all 52 websites of the sample and to fill in the first section of the questionnaire, which contained the PLSI scale. Participants were also informed that they had to assess website features rather than content. At the due date, slightly more than 30% of the questionnaires had been filled in. After soliciting the return of the questionnaires, new questionnaires were filled in over the next 4 weeks. The final amount of questionnaires available for the study after 8 weeks was 92 units. However, at this step 8 questionnaires were rejected because they had several unfilled scales and, as a consequence, were not acceptable for the study;

d) in the next step, the 84 questionnaires were classified and clustered according to the student's cognitive and learning style. Eleven questionnaires were excluded from the analysis as they had been filled in by means of technologies and an assessment process that were not compatible with the rest of the sample (i.e., the user had a slow connection speed modem, or the questionnaire was filled in by fragmenting the evaluation of the websites into an exaggerated number of steps). Furthermore, because it was not possible to have a univocal identification of the cognitive and learning style of the user for 22 questionnaires, these were discarded and not considered in the next step of the study. The final sample used in the next steps includes 51 questionnaires. Fig. 2 reports the outcome of questionnaire grouping according to cognitive and learning style. As it shows, the cognitive and learning style labeled "ESFJ" includes the largest number of questionnaires. From this group, 5 questionnaires associated with 5 students were randomly chosen and used in the calculation and comparison of website cognitive efficiency. The ESFJs are particularly interesting as a reference group for the test of the framework, both for their large diffusion in the population (it is the second most common type, close to 12%) and for the cognitive characteristics of these users. The ESFJs are generally uncomfortable with uncertain and ambiguous contexts, but enjoy routine and working methodically with attention to procedures and specifications, focusing on details rather than the overall

7 The scale of the Shindler model includes 48 statements. It classifies individuals into 16 categories, according to their cognitive and learning characters, combining 4 dimensions: Introvert (I)/Extrovert (E), Sensing (S)/Intuitive (I), Feeler (F)/Thinker (T), Judger (J)/Perceiver (P) (see http://www.oswego.edu/plsi/plsinfo.htm).


Table 2
Internal reliability of scales used to assess cognitive dimensions of websites.

Variable         Cognitive dimension                   Scale                                             # items   Cronbach α
AMB_STRU (I1)    Website structure                     Ambiguity induced by website structure            9         0.90
UNC_STRU (I2)    Website structure                     Uncertainty induced by website structure          3         0.92
TIME_STRU (I3)   Website structure                     Time consumption induced by website structure     3         0.87
AMB_NAV (I4)     Website navigability                  Ambiguity generated during website navigation     13        0.94
UNC_NAV (I5)     Website navigability                  Uncertainty generated during website navigation   4         0.85
TIME_NAV (I6)    Website navigability                  Time consumption during website navigation        5         0.79
USE (O1)         Website-user interaction experience   Website usefulness                                7         0.91
SAT (O2)         Website-user interaction experience   User satisfaction for website usage               4         0.89
ATT (O3)         Website-user interaction experience   Website attractiveness                            5         0.83

Fig. 2. Cognitive and learning styles identified.


picture. They are often unable to deal with unstructuredproblems without the support of others;

e) for every scale, the judgments given by the 5 selected students were averaged to obtain a single evaluation. These values were used to run DEA. The DEA technique implemented in the study adopted a radial measure of efficiency and an input orientation (cognitive costs), in order to quantify the input reduction necessary to make an inefficient website 100% efficient while keeping the same output value (the cognitive benefits) (Charnes et al., 1978). It was assumed that scale effects are not influential in the study, and hence the CCR DEA model was implemented. In order to have a ranking for the 100% efficient websites too, the re-formulated CCR model was implemented to calculate super-efficiency measures.8 Finally, a stepwise backward regression analysis was performed to find the cognitive costs and benefits that most affect website efficiency.

4. Results

Table 2 displays the values of the Cronbach α index calculated to measure the internal reliability of the scales used to measure the cognitive variables of the framework. The α value is always greater than 0.70, a threshold considered acceptable for using a measurement scale without risking poor internal consistency. The table also reports information on scale size in terms of number of items.9
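The reliability check above rests on Cronbach's α, which can be computed directly from an item-score matrix. A minimal sketch (the respondent scores below are made up for illustration and are not the study's data):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of respondents' totals
    return k / (k - 1) * (1 - item_vars / total_var)

# Hypothetical 9-point judgments from 4 respondents on a 3-item scale
scores = [[7, 8, 7], [3, 4, 3], [5, 5, 6], [8, 9, 8]]
alpha = cronbach_alpha(scores)
```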

Table 3 shows the Pearson correlations between variables. As expected, the cognitive dimensions of the framework associated both with the user cognitive costs perceived during website navigation and with the cognitive costs induced by the website structure are negatively correlated with the cognitive dimensions (benefits) relative to user experience. Moreover, the correlations between the cognitive cost dimensions have positive sign and are strong, particularly for ambiguity and uncertainty, indicating that these sources of cognitive cost influence each other. These findings confirm the logical consistency of the framework developed. Correlations between variables that measure specific aspects of the user experience in the interaction with the website are also very strong and positive.

Table 4 summarizes the outcome of DEA. According to the CCR model, websites considered inefficient have an efficiency measure lower than 100%. A website is inefficient if a virtual website can be generated as a linear combination of some websites in the sample,

8 For inefficient websites, efficiency and super-efficiency have the same value (Andersen and Petersen, 1993).

9 The indications provided by the Cronbach α calculation are consistent with the factor analysis outcome. For the sake of brevity, the results have not been included.

and this virtual website offers at least the same amount of cognitive benefits to the user while charging a lower amount of cognitive costs compared to the real website under examination. Data in Table 4 indicate that 48 websites are relatively inefficient. The efficiency evaluation provided by the CCR DEA model shows the inefficiency score of a website and its reference set, i.e. the set of efficient websites that generate the virtual website on the frontier used as a benchmark; but it does not provide any ranking among all websites. From this analysis it can be inferred, for instance, that the website dell is 78.94% efficient compared to the website primestore, which is its reference set, while the website strabilia is 39.11% efficient compared to a virtual website that combines the websites ebest and primestore, which are in its reference set. The website strabilia should therefore reduce the cognitive cost suffered by the user by approximately 60%, without reducing the perceived cognitive benefits, to increase its efficiency score. Table 4 also reports the super-efficiency scores for the websites that resulted 100% efficient in the CCR DEA model.
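The peer-benchmarking logic can be illustrated with a toy calculation: a convex combination of efficient peers delivers at least the same benefits at a fraction of the inefficient unit's costs. All vectors and weights below are invented for illustration (the website names are used only as labels; these are not the study's data):

```python
import numpy as np

# Hypothetical cognitive-cost inputs (A, U, T) and benefit outputs (US, S, AT)
# for two efficient peers and one inefficient website; values are made up.
ebest      = {"inputs": np.array([2.0, 1.8, 1.5]), "outputs": np.array([7.8, 8.0, 8.2])}
primestore = {"inputs": np.array([2.2, 2.0, 1.6]), "outputs": np.array([7.5, 7.9, 8.0])}
strabilia  = {"inputs": np.array([5.5, 5.0, 4.2]), "outputs": np.array([7.6, 7.9, 8.1])}

lam = {"ebest": 0.6, "primestore": 0.4}   # hypothetical peer weights
virtual_in  = lam["ebest"] * ebest["inputs"]  + lam["primestore"] * primestore["inputs"]
virtual_out = lam["ebest"] * ebest["outputs"] + lam["primestore"] * primestore["outputs"]

# Input contraction needed so that theta * inputs covers the virtual peer:
# theta < 1 means the virtual website dominates, so the unit is inefficient.
theta = max(virtual_in / strabilia["inputs"])
```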

Table 5 presents the main statistics of the input and output variables of the DEA model, and of the efficiency and super-efficiency scores. Efficiency measurements are in the range 25.22-100%, while super-efficiency ranges from 25.22% to 127.29%. These findings point out that the sample websites largely differ as to their efficiency rate (std. dev. is 21.40%), and only a small number of websites are really cognitively 100% efficient. Average CCR efficiency is indeed only 63.52%.

Inefficient websites over-utilize specific inputs (e.g., they force the users to make a greater cognitive effort during navigation) or under-produce outputs (e.g., they provide the users with scarce gratification when used). The performance of an inefficient website might be improved either by increasing the user cognitive benefits or by decreasing the cognitive costs borne when it is used. The extent to which cognitive costs should be reduced to make the website cognitively efficient for the user can be indicated by a percentage; the extent to which cognitive benefits have to be increased to move the website to the efficiency frontier can be measured by a percentage, too. Table 6 reports statistics showing the relative reduction of cognitive costs or increase of benefits that would be necessary to make inefficient websites 100% efficient according to the input-oriented CCR DEA model. In particular, the column "Mean" in this table displays the average relative amount of potential improvements, while the following two columns, "Min" and "Max", respectively indicate the smallest and the greatest improvements necessary to increase the efficiency score of a website to 100%. Hence, this table shows the potential improvements of


Table 3
Pearson correlations between the website cognitive dimensions.

Variable           1       2       3       4       5       6       7      8      9
1 AMB_STRU (I1)    1
2 UNC_STRU (I2)    0.934   1
3 TIME_STRU (I3)   0.718   0.679   1
4 AMB_NAV (I4)     0.893   0.871   0.745   1
5 UNC_NAV (I5)     0.866   0.845   0.696   0.896   1
6 TIME_NAV (I6)    0.683   0.706   0.898   0.770   0.721   1
7 USE (O1)        -0.695  -0.683  -0.639  -0.653  -0.660  -0.682   1
8 SAT (O2)        -0.753  -0.767  -0.696  -0.684  -0.739  -0.702   0.926  1
9 ATT (O3)        -0.760  -0.771  -0.694  -0.727  -0.763  -0.702   0.909  0.918  1


"three virtual websites" generated from the sample, where the first one is a medium-efficiency website, the second one is the best inefficient website (i.e., built considering the smallest improvements for all inputs and outputs), and the third one is the worst

Table 4
DEA efficiency measures and the reference set of inefficient websites.

Code    Website             CCR efficiency (%)   Superefficiency (%)   Reference set
WS 1    frael                66.41                66.41                13, 34
WS 2    mediaworld           26.48                26.48                13, 29, 34
WS 3    chl                  61.65                61.65                13, 34
WS 4    vobis                26.32                26.32                13, 29, 34
WS 5    computerdiscount     30.33                30.33                13, 29, 34
WS 6    computerunion        30.35                30.35                13, 29, 34
WS 7    strabilia            39.11                39.11                13, 34
WS 8    wellcome             72.74                72.74                13
WS 9    compy                25.22                25.22                13, 29, 34
WS 10   eprice               33.29                33.29                13, 29, 34
WS 11   zetabyte             64.73                64.73                13, 29
WS 12   wireshop             78.22                78.22                13, 29
WS 13   ebest               100.00               127.29                -
WS 14   eplaza               76.58                76.58                13, 29, 34
WS 15   oeminformatica       84.94                84.94                20, 29
WS 16   prime                94.88                94.88                20, 29
WS 17   dualpower-pc         62.48                62.48                13
WS 18   e-comp               81.04                81.04                13, 20, 29
WS 19   aroundstore          89.85                89.85                13, 34
WS 20   bitstore            100.00               110.95                -
WS 21   spartano             76.73                76.73                13, 29
WS 22   miocomputer          67.33                67.33                13, 29, 34
WS 23   telpc                27.53                27.53                13, 29
WS 24   input-computer       98.88                98.88                29, 34
WS 25   acaminformatica      70.41                70.41                13, 29, 34
WS 26   websight             45.03                45.03                13, 34
WS 27   inforeashop          76.64                76.64                29, 34
WS 28   pcservicesrl         83.35                83.35                29, 34
WS 29   computerstore       100.00               113.49                -
WS 30   buzville             75.85                75.85                13, 29, 34
WS 31   fraelpoint           82.33                82.33                13, 34
WS 32   mytechline           39.59                39.59                13, 29, 34
WS 33   dell                 78.94                78.94                34
WS 34   primestore          100.00               119.61                -
WS 35   dgsystem             57.54                57.54                13, 34
WS 36   gigamatic            41.16                41.16                13, 29, 34
WS 37   oicom                64.78                64.78                13, 34
WS 38   help-informatica     66.38                66.38                13, 34
WS 39   worldcenter          59.71                59.71                13, 20
WS 40   bow                  78.10                78.10                13, 34
WS 41   cdsound              54.52                54.52                13, 29, 34
WS 42   massivemusicstore    51.54                51.54                13, 29
WS 43   lifegatemusicshop    50.36                50.36                13, 29, 34
WS 44   cdbox                62.81                62.81                29, 34
WS 45   linus-records        50.20                50.20                13, 29, 34
WS 46   musicstore           53.50                53.50                13, 29, 34
WS 47   cdmusic              58.85                58.85                13, 29, 34
WS 48   topten               54.59                54.59                13, 29, 34
WS 49   cdshopdvd            48.74                48.74                13, 29
WS 50   discoland            45.44                45.44                13, 29
WS 51   dvdland              72.58                72.58                13
WS 52   internetbookshop     65.07                65.07                13, 29, 34

inefficient website (i.e., built considering the greatest improvements for all inputs and outputs). As expected, as a consequence of adopting an input-oriented DEA model, the reduction of cognitive costs weighs more than the increase of benefits and appears more critical to achieve efficiency. On average, a cognitive cost reduction higher than 40% is necessary to make efficient a sample website that is not 100% efficient. In particular, the navigability macro-dimension of the website seems even more critical, as the minimum relative improvement required to make a website efficient is not less than about 10% (%TIME_NAV), unlike the structure macro-dimension. For the latter, the improvement needed is close to 5% (%AMB_STRU) to reduce the amount of perceived ambiguity, and about 1% for the amount of cognitive effort linked to perceived time and uncertainty. In theory, this means that, in the best case (i.e., the virtual best inefficient website), it is more expedient for the web designer to improve the website's perceived cognitive benefits-to-costs ratio by focusing on the website structure rather than on navigability, as a greater effort is necessary for the latter. The cognitive cost due to the arising of ambiguity during website navigation is particularly great: the minimum improvement required for achieving full efficiency remains greater than the improvement needed to reduce uncertainty and time consumption. As to the cognitive benefits, unlike the cognitive costs, the differences between their improvement rates are more evident. The average improvement of website usefulness needed to make a website efficient is rather irrelevant (only 1.79%), while the maximum improvement is close to 10%. The mean improvement of the attractiveness dimension of the cognitive benefit necessary to increase website efficiency is largely higher (7.64%), and the maximum improvement is a bit less than 37%.

Table 7 shows the outcome of the regression analysis that adopted the super-efficiency measurement as the dependent variable. In particular, a backward stepwise regression was performed, including all variables measuring cognitive costs and benefits in the

Table 5
Framework variables and efficiency statistics.

Variable / efficiency measure   Cognitive dimension                   Mean (SD)         Min      Max
AMB_STRU (I1)                   Website structure                     3.560 (0.960)     1.933    6.289
UNC_STRU (I2)                   Website structure                     3.103 (0.991)     1.733    6.533
TIME_STRU (I3)                  Website structure                     3.202 (1.164)     1.400    5.833
AMB_NAV (I4)                    Website navigability                  3.347 (0.959)     1.862    5.485
UNC_NAV (I5)                    Website navigability                  3.171 (0.846)     1.750    5.500
TIME_NAV (I6)                   Website navigability                  3.448 (0.970)     1.640    5.800
USE (O1)                        Website-user interaction experience   6.497 (0.882)     4.286    7.886
SAT (O2)                        Website-user interaction experience   6.438 (1.046)     3.750    8.050
ATT (O3)                        Website-user interaction experience   6.380 (1.130)     3.680    8.200
Efficiency (CCR)                                                      63.52% (21.40%)   25.22%   100.00%
Superefficiency                                                       64.89% (24.22%)   25.22%   127.29%


Table 6
Statistics relative to websites' potential improvements.

Variable improvement   Cognitive dimension                   Mean (SD)         Min      Max
%AMB_STRU (I1)         Website structure                     41.97% (17.79%)   5.10%    74.80%
%UNC_STRU (I2)         Website structure                     43.99% (18.49%)   1.10%    78.80%
%TIME_STRU (I3)        Website structure                     47.52% (19.61%)   1.10%    78.70%
%AMB_NAV (I4)          Website navigability                  44.51% (17.33%)   16.20%   77.20%
%UNC_NAV (I5)          Website navigability                  45.78% (17.08%)   10.60%   76.70%
%TIME_NAV (I6)         Website navigability                  45.84% (16.94%)   10.20%   76.30%
%USE (O1)              Website-user interaction experience   1.79% (2.88%)     0.00%    10.90%
%SAT (O2)              Website-user interaction experience   5.55% (8.22%)     0.00%    40.20%
%ATT (O3)              Website-user interaction experience   7.64% (9.91%)     0.00%    36.80%


regression model as independent variables at the initial step. An F value equal to 11.00 was adopted as the threshold to select variables to enter or remove from the regression model. At step 7, the regression model includes only 2 variables, the first one measuring the cognitive cost that website users bear when they perceive ambiguity during navigation, and the second one the cognitive benefit coming from the perceived website usefulness. The signs, negative for the first variable and positive for the second one, are consistent with the logic of the website evaluation framework. Cognitive efficiency seems mostly influenced by the ambiguity perceived during website navigation and by the cognitive benefit associated with the amount of usefulness the user perceives. These findings support what emerged from the improvement analysis in Table 6, which identified the specific component of cognitive costs generated by navigation ambiguity as critical, and website usefulness as a variable requiring almost no adjustment.
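The backward elimination described above (a variable is dropped when its partial F falls below the 11.00 threshold) can be sketched in a few lines. This is a removal-only illustration in plain NumPy, not the author's implementation, and the variable names are placeholders:

```python
import numpy as np

def backward_stepwise(X, y, names, f_remove=11.0):
    """Backward stepwise OLS: repeatedly drop the predictor whose
    partial F-statistic for removal is below f_remove."""
    keep = list(range(X.shape[1]))
    n = len(y)

    def rss(cols):
        # Residual sum of squares of OLS with intercept on the given columns
        A = np.column_stack([np.ones(n), X[:, cols]])
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        return ((y - A @ beta) ** 2).sum()

    while len(keep) > 1:
        full = rss(keep)
        dof = n - len(keep) - 1
        # Partial F for dropping each remaining predictor
        fstats = [(rss([c for c in keep if c != j]) - full) / (full / dof)
                  for j in keep]
        worst = int(np.argmin(fstats))
        if fstats[worst] >= f_remove:
            break          # every remaining predictor is worth keeping
        keep.pop(worst)    # drop the least useful predictor
    return [names[j] for j in keep]
```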

5. Discussion and conclusion

The study has illustrated how the proposed framework, based on the theories of cognition and the implementation of Data Envelopment Analysis, can be fruitfully used to carry out the evaluation and comparison of ecommerce websites, with the aim of identifying improvement trajectories. This framework models the user-website interaction as a black box that provides the user with a set of functions that allow task performance. When these functions are exploited, a cognitive effort has to be made by the user, more or less balanced by a sense of gratification associated with goal achievement. In this framework, the cognitive effort depends either on the amount of ambiguity and uncertainty perceived, or on the time

Table 7
Stepwise regression with super-efficiency as dependent variable and cognitive dimensions as independent variables.

Beta coefficients at steps 0-2:

Variable          Step 0    Step 1    Step 2
AMB_STRU (I1)      0.034     0.021
UNC_STRU (I2)     -0.020
TIME_STRU (I3)    -0.100    -0.100    -0.090
AMB_NAV (I4)      -0.290    -0.290*   -0.280*
UNC_NAV (I5)      -0.180    -0.180    -0.180
TIME_NAV (I6)     -0.080    -0.080    -0.090
USE (O1)           0.183     0.177     0.176
SAT (O2)           0.076     0.081     0.076
ATT (O3)           0.140     0.144     0.145

Adjusted R-squared   0.830     0.834     0.838
F                   28.669    33.015    38.588
p                    0.000     0.000     0.000

* Indicates values that are significant with p < 0.10.

necessary to perform the task during the interaction with the website. The cognitive benefits associated with website usage include attraction for the website, and a feeling of satisfaction and usefulness for a full or partial goal achievement.

As the framework provides a holistic view of website efficiency and preliminary insights for further refinement of the assessment, complementary methods and techniques such as eye-tracking and A/B split testing can be implemented to help investigate more deeply the users' cognitive behavior during their interaction with the websites, and to identify specific page areas or parts that have an impact on the users' cognitive workload or benefit (Bojko, 2006; Chatham et al., 2004; Djamasbi et al., 2010; Kohavi et al., 2009). For instance, if the time-related cognitive cost greatly affects the website's cognitive efficiency, eye-tracking experiments could be used to assess which page, or component of a page, makes it difficult to locate the right starting point for an action. Indeed, the cognitive psychology literature has shown how eye behavior is linked to user decision-making, reasoning and cognitive processing (Di Stasi et al., 2011; Rayner, 1998). Hence, several metrics used in eye tracking (fixation duration, gazing time, saccade rate, pupil dilation, scanpath differences, etc.) may provide further insights into how website users process information and how cognition develops, suggesting how to make search more efficient (e.g., identifying links and control buttons more easily) or effective (e.g., understanding the meaning of objects, and finding effective patterns of action to achieve the target), and even discovering the causes of a failed search. However, eye-tracking experiments are generally costly and time consuming, because the assessment study must necessarily be conducted in a lab and only one user at a time can be involved in the evaluation activity. On the contrary, the proposed framework provides clues about how users behave when they interact with a number of websites at a lower cost. As a third stage of a more comprehensive evaluation analysis, A/B split testing could be further implemented as a diagnostic tool to corroborate the findings from the proposed framework and from eye-tracking studies.

The implementation of the framework preliminarily requires knowledge of the technical system, the users, their goals, and their cognitive and learning style. The performance of the cognitive activity during human-computer interaction is indeed the outcome of a number of factors, both objective and subjective (Ping Zhang, 2004). The objective factors that have an effect on the performance are related to the technical features of the technological system used (connection speed, monitor quality, browser version, computer performance, etc.). The subjective factors are generally related to the user's capabilities, personal traits, disabilities, attitudes and motivation (capability to use a computer effectively, interest in

Table 7 (continued)
Beta coefficients at steps 3-7:

Variable          Step 3    Step 4    Step 5    Step 6    Step 7
TIME_STRU (I3)    -0.110    -0.170*   -0.180*   -0.180*
AMB_NAV (I4)      -0.270*   -0.280*   -0.270*   -0.470*   -0.570*
UNC_NAV (I5)      -0.200    -0.200    -0.250*
TIME_NAV (I6)     -0.090
USE (O1)           0.165     0.152                          0.347*
ATT (O3)           0.220     0.241*    0.377*    0.426*

Adjusted R-squared   0.841     0.843     0.843     0.834     0.823
F                   45.840    55.668    69.471    86.619   119.923
p                    0.000     0.000     0.000     0.000     0.000

* Indicates values that are significant with p < 0.10.



the site content, emotional state, concentration capacity, capability to work under ambiguity and uncertainty, etc.). For instance, some individuals do not like computers, do not feel familiar with Internet navigation, or are made anxious by them (Singh et al., 2001). Other individuals, vice versa, are strongly attracted by computers and have acquired a great capability in using the Internet. Even this latter group of individuals may have different preferences for interaction styles, colors, graphic presentations, data presentations, information structure, character type, etc. Kang and Yoon (2008) have demonstrated that the age of users may seriously affect their behavior when they interact with complicated electronic devices. The task typology may substantially influence the information needs of the user and the way information is effectively structured (Gay et al., 2001; Radha and Murphy, 1992). Cultural, ethnic and racial background also affects the user's attitudes toward the website assessment and the information need. So, for instance, users who have a cognitive and learning style more oriented to reflection might prefer websites that differ from those preferred by users having a cognitive and learning style more oriented to action (Kotzé, 2000). Individuals differ as to their capability to process information. Each individual possesses a certain capability to process information which is independent of the task to perform (Scandura, 1971). Hence, the identification of the information processing capability of a typical user of a website is important for designing more efficient websites.

A number of minor limitations emerged from the study which, however, do not call into question the importance and quality of the framework developed:

- even though a well-structured questionnaire was designed and used to collect the perceptual judgments for evaluating the websites, the assessment framework based on cognition remains fundamentally qualitative and, as a consequence, data subjectivity and context dependence can influence the output of the assessment. This issue should be taken into due consideration when assessment analyses are carried out;

- the study was conducted in a setting of the users' choice, with the aim of simulating a real user-ecommerce website interaction context. This prevented rigorous control of the setting and of the potential for interruptions and bias.

This framework has provided useful insights, and similar studies can be useful either to design a new website or to redesign an old one so as to improve the perception that users develop about it. In particular,

- the measurements of cognitive costs and benefits that have been developed are indeed consistent with the logic underlying the framework, as the correlation analysis points out a negative association between costs and benefits;

- the implementation of DEA has shown that two model variables have a major influence on website cognitive efficiency: the cognitive cost linked to the ambiguity perceived during website navigation, and the cognitive benefit associated with the website's perceived usefulness when the user interacts with it. The stepwise regression analysis outcome supports these findings;

- as to the sample used in the efficiency analysis, reducing cognitive costs for the users is more important than increasing their perceived benefits; furthermore, the navigability dimension of the framework seems more critical;

- the framework is extremely easy to use and can be adopted to evaluate and compare ecommerce websites that sell different products and services, without requiring any particular ability or knowledge of the users filling in the questionnaire;

- as websites are compared only from the user perspective, it is unnecessary to collect any specific and confidential information about the business (i.e., the number of contacts, online purchases, etc.);

- the framework is flexible, and further measurement scales can be introduced to investigate in more depth the determinants affecting the users' cognition and their mental attitude toward the website, i.e. the perceived credibility of the website content, trust development, etc.;

- by including information on the cognitive and learning style of users, Data Envelopment Analysis could also be used as an investigative technique for exploring the structure of the judgments, relative to the users' cognitive process dimensions, that generate website rankings, in order to identify variables that can influence these structures of preferences.

Appendix

A.1. Mathematical formulation of data envelopment analysis

Charnes et al. (1978) defined the efficiency of a unit k as follows:

\[
h_k = \frac{\sum_{y=1}^{s} v_{ky} O_{ky}}{\sum_{x=1}^{r} u_{kx} I_{kx}}
\]

where I_{kx} is the quantity of input x utilized by unit k; O_{ky} is the quantity of output y supplied by unit k; u_{kx} is the weight associated with input x; v_{ky} is the weight associated with output y; r is the number of inputs; s is the number of outputs; h_k is the efficiency of unit k; and n is the number of units.

The values of the weights u_kx and v_ky for a generic unit k are determined with the goal of maximizing its efficiency, bringing it as close as possible to 100%, while keeping the efficiency of the other units, computed with the same weights, no greater than 100%.

In the basic DEA formulation (the CCR model), the efficiency of unit k is calculated by determining the values of u_k1, u_k2, ..., u_kr and v_k1, v_k2, ..., v_ks that maximize

h_k = ( Σ_{y=1}^{s} v_ky O_ky ) / ( Σ_{x=1}^{r} u_kx I_kx )

subject to:

h_i = ( Σ_{y=1}^{s} v_ky O_iy ) / ( Σ_{x=1}^{r} u_kx I_ix ) ≤ 1   for i = 1, 2, ..., k, ..., n

This latter constraint imposes that no unit may have an efficiency measure greater than 100% with the same weights.

The original formulation can be conveniently rewritten, through an algebraic manipulation of the constraint, to maximize

h_k = Σ_{y=1}^{s} v_ky O_ky

subject to:

Σ_{x=1}^{r} u_kx I_kx = 1

and

( Σ_{y=1}^{s} v_ky O_iy ) - ( Σ_{x=1}^{r} u_kx I_ix ) ≤ 0   for i = 1, 2, ..., k, ..., n


C. lo Storto / Applied Ergonomics 44 (2013) 1004–1014

u_kx ≥ 0, v_ky ≥ 0, for every value of k, x and y

where the first constraint normalizes the denominator of the original efficiency measure to 100%.

The basic formulation is thus transformed into a set of n linear programming problems, one for each unit k.
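To make the linear programming form concrete, the multiplier-form LP above can be sketched as follows. This is a minimal illustration, not the paper's own implementation: the use of NumPy/SciPy and the toy input/output data are assumptions introduced here for demonstration.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(inputs, outputs, k):
    """Multiplier-form CCR efficiency of unit k.
    inputs: (n, r) array of I_ix; outputs: (n, s) array of O_iy."""
    n, r = inputs.shape
    s = outputs.shape[1]
    # decision variables z = [u_1..u_r, v_1..v_s]; linprog minimizes,
    # so maximize v . O_k by minimizing its negative
    c = np.concatenate([np.zeros(r), -outputs[k]])
    # normalization constraint: u . I_k = 1
    A_eq = np.concatenate([inputs[k], np.zeros(s)])[None, :]
    b_eq = [1.0]
    # for every unit i: v . O_i - u . I_i <= 0
    A_ub = np.hstack([-inputs, outputs])
    b_ub = np.zeros(n)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (r + s), method="highs")
    return -res.fun

# toy data: 3 units, 2 inputs, 1 output (hypothetical values)
I = np.array([[2.0, 3.0], [4.0, 1.0], [4.0, 4.0]])
O = np.array([[1.0], [1.0], [1.0]])
scores = [ccr_efficiency(I, O, k) for k in range(3)]
```

Solving the LP once per unit reproduces the set of n problems described above; units 0 and 1 lie on the efficient frontier of this toy data, while unit 2, which uses more of both inputs for the same output, scores below 1.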

Andersen and Petersen (1993) suggest calculating the super-efficiency of units by modifying the constraint in the original formulation of the CCR model as follows:

( Σ_{y=1}^{s} v_ky O_iy ) - ( Σ_{x=1}^{r} u_kx I_ix ) ≤ 0   for i = 1, 2, ..., n with i ≠ k

This means that the constraint is omitted for the unit i = k.
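Under the same assumptions as before (SciPy, hypothetical toy data), the Andersen–Petersen modification amounts to dropping the i = k row from the constraint matrix, so an efficient unit's score is no longer capped at 100%:

```python
import numpy as np
from scipy.optimize import linprog

def super_efficiency(inputs, outputs, k):
    """CCR multiplier LP for unit k with the i = k constraint omitted
    (Andersen and Petersen, 1993): efficient units can exceed 1."""
    n, r = inputs.shape
    s = outputs.shape[1]
    c = np.concatenate([np.zeros(r), -outputs[k]])        # maximize v . O_k
    A_eq = np.concatenate([inputs[k], np.zeros(s)])[None, :]
    keep = np.arange(n) != k                              # drop unit k's own row
    A_ub = np.hstack([-inputs[keep], outputs[keep]])      # v . O_i - u . I_i <= 0
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n - 1),
                  A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (r + s), method="highs")
    return -res.fun

# same toy data: units 0 and 1 are CCR-efficient, unit 2 is not
I = np.array([[2.0, 3.0], [4.0, 1.0], [4.0, 4.0]])
O = np.array([[1.0], [1.0], [1.0]])
se = [super_efficiency(I, O, k) for k in range(3)]  # se[0] and se[1] exceed 1
```

For inefficient units the omitted constraint is not binding, so their super-efficiency coincides with their CCR efficiency; only the efficient units are re-ranked above 100%.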

References

Agarwal, R., Venkatesh, V., 2002. Assessing a firm's web presence: a heuristic evaluation procedure for the measurement of usability. Information Systems Research 13 (2), 168–186.

Alpar, P., Donthu, N., 2007. Productivity of internet shops. International Journal of Electronic Business 5 (3), 243–262.

Alpar, P., Porembski, M., Pickerodt, S., 2001. Measuring the efficiency of website traffic generation. International Journal of Electronic Commerce 6 (1), 53–74.

Andersen, P., Petersen, N.C., 1993. A procedure for ranking efficient units in data envelopment analysis. Management Science 39, 1261–1264.

Anderson, T.D., 2006. Uncertainty in action: observing information seeking within the creative processes of scholarly research. Information Research 12 (1). October.

Baldauf, D., Burgard, E., Wittmann, M., 2009. Time perception as a workload measure in simulated car driving. Applied Ergonomics 40, 929–935.

Bauer, C., Scharl, A., 2000. Quantitative evaluation of web site content and structure. Internet Research: Electronic Networking Applications and Policy 10 (1), 31–43.

Benslimane, Y., Yang, Z., 2006. Analysis of functionalities of commercial websites used for procurement: implications on design and operations. International Journal of the Information Systems for Logistics and Management 2 (1), 35–42.

Bojko, A., 2006. Using eye tracking to compare web pages design: a case study. Journal of Usability Studies 1 (3), 112–120.

Boyd, A., 2002. The goals, questions, indicators, measures (GQIM) approach to the measurement of customer satisfaction with e-commerce web sites. Aslib Proceedings 54 (3), 177–187.

Carlson, J.R., Zmud, R.W., 1999. Channel expansion theory and the experiential nature of media richness perceptions. Academy of Management Journal 42 (2), 153–170.

Carroll, J.M., 1987. Interfacing Thought: Cognitive Aspects of Human–Computer Interaction. The MIT Press, Cambridge, MA.

Chamorro-Koc, M., Popovic, V., Emmison, M., 2009. Human experience and product usability: principles to assist the design of user–product interactions. Applied Ergonomics 40, 648–656.

Charnes, A., Cooper, W.W., Rhodes, E., 1978. Measuring efficiency of decision making units. European Journal of Operational Research 2 (6), 429–444.

Chatham, B., Temkin, B.D., Amato, M., 2004. A Primer on A/B Testing. Forrester Research Inc., Cambridge.

Chen, S.Y., Macredie, R.D., 2005. The assessment of usability of electronic shopping: a heuristic evaluation. International Journal of Information Management 25 (6), 516–532.

Cheng, H.I., Patterson, P.E., 2007. Iconic hyperlinks on e-commerce websites. Applied Ergonomics 38, 65–69.

Chevalier, A., Kicka, M., 2006. Web designers and web users: influence of the ergonomic quality of the web site on the information search. International Journal of Human–Computer Studies 64, 1031–1048.

Clark, A., 1997. Being There: Putting Brain, Body, and World Together Again. The MIT Press, Cambridge, Mass.

Clark, A., Chalmers, D.J., 1998. The extended mind. Analysis 58, 10–23.

Cox, J., Dale, B.G., 2002. Key quality factors in web site design and use: an examination. International Journal of Quality & Reliability Management 19 (7), 862–888.

Crosby, M.E., Iding, M.K., Chin, D.N., 2003. Research on task complexity as a foundation for augmented cognition. In: Proceedings of the 36th Hawaii International Conference on System Sciences (HICSS'03).

Csíkszentmihályi, M., 1990. Flow: the Psychology of Optimal Experience. Harper and Row, New York.

Cunliffe, D., 2000. Developing usable web sites: a review and model. Internet Research: Electronic Networking Application and Policy 10 (4), 295–307.

Daft, R.L., Lengel, R.H., 1986. Organizational information requirements, media richness and structural design. Management Science 32, 554–571.

Daft, R.L., Weick, K.E., 1984. Toward a model of organizations as interpretation systems. Academy of Management Review 9, 284–295.

Day, A., 1997. A model for monitoring web site effectiveness. Internet Research: Electronic Networking Applications and Policy 7 (2), 1–9.

Dholakia, U.M., Rego, L.L., 1998. What makes commercial web pages popular? An empirical investigation of web page effectiveness. European Journal of Marketing 32 (7/8), 724–736.

Di Stasi, L.L., Antolí, A., Cañas, J.J., 2011. Main sequence: an index for detecting mental workload variation in complex tasks. Applied Ergonomics 42, 807–813.

Djamasbi, S., Siegel, M., Tullis, T., 2010. Generation Y, web design, and eye tracking. International Journal of Human–Computer Studies 68 (5), 307–323.

Drury, C., Clement, M., Clement, R., 1978. The effect of area, density, and number of background characters on visual search. Human Factors 20 (5), 597–602.

East, R., 1997. Consumer Behavior: Advances and Applications in Marketing. Prentice Hall, London.

Edmondson, W.H., Beale, R., 2008. Projected cognition – extending distributed cognition for the study of human interaction with computers. Interacting with Computers 20, 128–140.

Engel, J.F., Blackwell, R.D., Miniard, P.W., 1995. Consumer Behavior. Dryden Press, Fort Worth, TX.

Farrell, M.J., 1957. The measurement of productive efficiency. Journal of the Royal Statistical Society, A CXX, 253–281.

Flor, N.V., Hutchins, E., 1992. Analyzing distributed cognition in software teams: a case study of collaborative programming during adaptive software maintenance. In: Koenemann-Belliveau, J., Moher, T., Robertson, S. (Eds.), Empirical Studies of Programmers: Fourth Workshop. Ablex, Norwood, NJ, pp. 36–64.

Galbraith, J.R., 1973. Designing Complex Organizations. Addison-Wesley, Reading, MA.

Gay, G., Stefanone, M., Grace-Martin, M., Hembrooke, H., 2001. The effects of wireless computing in collaborative learning environments. International Journal of Human–Computer Interaction 13 (2), 257–276.

Germanprez, M., Zigurs, I., 2003. Causal factors for website complexity. Sprouts: Working Papers on Information Environments, Systems and Organizations 3 (2), 107–121. Spring.

Grandon, E.E., Ranganathan, C., 2001. The impact of content and design of web sites on online sales. In: Proceedings of the 7th Americas Conference on Information Systems, vol. 5, pp. 920–926.

Grisé, M., Gallupe, R.B., 2000. Information overload: addressing the productivity paradox in face-to-face electronic meetings. Journal of Management Information Systems 16 (3), 157–185.

Guriting, P., Ndubisi, N.O., 2006. Borneo online banking: evaluating customer perceptions and behavioural intention. Management Research News 29 (1/2), 6–15.

Hahn, J., Kauffman, R.J., 2004. A methodology for business value-driven website evaluation: a data envelopment analysis approach. In: Proceedings of the Third Annual Workshop on HCI Research in MIS, Washington, D.C., December 10–11.

Halverson, C.A., 1994. Distributed Cognition as a Theoretical Framework for HCI: Don't Throw the Baby Out with the Bathwater – the Importance of the Cursor in Air Traffic Control. Report 9403. Department of Cognitive Science, University of California, San Diego.

Helander, M.G., Khalid, H.M., 2000. Modeling the customer in electronic commerce. Applied Ergonomics 31, 609–619.

Ho, C.F., Wu, W.S., 1999. Antecedents of customer satisfaction on the internet: an empirical study of online shopping. In: Sprague Jr., R.H., Nunamaker Jr., J.F. (Eds.), Proceedings of the Thirty-Second Annual Hawaii International Conference on System Sciences, Maui, Hawaii, Jan. 4–7. IEEE Computer Society Press, Los Alamitos, CA, USA.

Hoffman, D.L., Novak, T.P., Duhachek, A., 2002. The influence of goal directed and experiential activities on online flow experiences. Journal of Consumer Psychology 13.

Hoffman, D.L., Novak, T.P., Yung, Y.F., 2000. Measuring the customer experience in online environments: a structural modeling approach. Marketing Science 19, 22–42.

Hong, W., Thong, J.Y.L., Tam, K.T., 2004. The effects of information format and shopping task on consumer's online shopping behavior: a cognitive fit perspective. Journal of Management Information Systems 21 (3), 149–184.

Hutchins, E., 1995a. Cognition in the Wild. The MIT Press, Cambridge.

Hutchins, E., 1995b. How a cockpit remembers its speeds. Cognitive Science 19, 126–189.

Hutchins, E., Klausen, T., 1996. Distributed cognition in an airline cockpit. In: Middleton, D., Engeström, Y. (Eds.), Communication and Cognition at Work. Cambridge University Press, Cambridge, Mass., pp. 15–54.

IPSOS, 2008a. Overall satisfaction with retailer-ICT, survey commissioned by DG Health and Consumers, IPSOS Belgium.

IPSOS, 2008b. Overall satisfaction with entertainment and leisure goods, survey commissioned by DG Health and Consumers, IPSOS Belgium.

Jarvenpaa, S.L., Todd, P.A., 1997. Consumer reactions to electronic shopping on the world wide web. Journal of Electronic Commerce 1 (2), 59–88.

Jenkins, L., Murray, A., 2003. A multivariate statistical approach to reducing the number of variables in data envelopment analysis. European Journal of Operational Research 147, 51–61.

Jones, M.Y., Stanaland, A.J., Gelb, B.D., 1998. Beefcake cheesecake: insights for advertisers. Journal of Advertising 27 (2), 33–51.

Jordan, P.W., 1998. An Introduction to Usability. Taylor & Francis, London.

Jung, C.G., 1971. Psychological Types (Collected Works of C. G. Jung, Volume 6), third ed. Princeton University Press, Princeton, NJ.


Kang, N.E., Yoon, W.C., 2008. Age- and experience-related user behavior differences in the use of complicated electronic devices. International Journal of Human–Computer Studies 66, 425–437.

Kirsch, D., Maglio, P., 1994. On distinguishing epistemic from pragmatic action. Cognitive Science 18, 513–549.

Kirsh, D., 1996. Adapting the environment instead of oneself. Adaptive Behavior 4 (3/4), 415–452.

Kohavi, R., Longbotham, R., Sommerfield, D., Henne, R.M., 2009. Controlled experiments on the web: survey and practical guide. Data Mining and Knowledge Discovery 18, 140–181.

Kolb, D., 1984. Learning Styles Inventory. McBer & Co., Boston.

Kotzé, P., 2000. Defining and specifying graphs using formal model-based techniques. South African Computer Journal 26, 217–221.

Koufaris, M., Kambil, A., LaBarbera, P.A., 2002. Consumer behavior in web-based commerce: an empirical study. International Journal of Electronic Commerce 6 (2), 115–138.

Kuhlthau, C.C., Tama, S.L., 2001. Information search process of lawyers: a call for 'just for me' information services. Journal of Documentation 57 (1), 25–43.

Laforet, S., Li, X., 2005. Consumers' attitudes towards online and mobile banking in China. International Journal of Bank Marketing 23 (5), 362–380.

Lais, S., June 17, 2002. How to stop web shopper flight. Computerworld, 44–45.

Lave, J., 1988. Cognition in Practice: Mind, Mathematics and Culture in Everyday Life. Cambridge University Press, Cambridge.

Lecerof, A., Paterno, F., 1998. Automatic support for usability evaluation. IEEE Transactions on Software Engineering 24 (10), 863–888.

Lindgaard, G., Fernandes, G., Dudek, C., Brown, J., 2006. Attention web designers: you have 50 milliseconds to make a good first impression! Behaviour & Information Technology 25 (2), 115–126.

lo Storto, C., 2004a. Un approccio cognitive-based per la valutazione dell'efficienza ed il confronto dei siti web, MTISD'04. Facoltà di Scienze Economiche ed Aziendali, Università del Sannio, 24–25 giugno 2004.

lo Storto, C., 2004b. Processi cognitivi, interazione utente-computer e valutazione dei siti web: un approccio metodologico. 42° Convegno AICA (CD-ROM). Università del Sannio, Benevento, 28–30 settembre 2004.

Lorigo, L., Pan, B., Hembrooke, H., Joachims, T., Granka, L., Gay, G., 2006. The influence of task and gender on search and evaluation behavior using Google. Information Processing and Management 42 (4), 1123–1131.

Lovell, K., Rouse, P., 2003. Equivalent standard DEA models to provide super-efficiency scores. Journal of the Operational Research Society 54, 101–108.

Magoutas, B., Schmidt, K.U., Mentzas, G., Stojanovic, L., 2010. An adaptive e-questionnaire for measuring user perceived portal quality. International Journal of Human–Computer Studies 68, 729–745.

McGuire, J., 1976. Some internal psychological factors influencing consumer choice. Journal of Consumer Research 2, 302–319.

Merwe, R., Bekker, J., 2003. A framework and methodology for evaluating e-commerce web sites. Internet Research: Electronic Networking Applications and Policy 13 (5), 330–341.

Meyers-Levy, J., Maheswaran, D., 1991. Exploring differences in males' and females' processing strategy. Journal of Consumer Research 18, 63–70.

Mich, L., Franch, M., Gaio, L., 2003. Evaluating and designing web site quality. IEEE MultiMedia 10 (1), 34–43.

Moe, W.W., Fader, P.S., 2004. Capturing evolving visit behavior in clickstream data. Journal of Interactive Marketing 18 (1), 5–19.

Myers-Briggs, I., McCaulley, M., 1992. Manual: a Guide to the Development and Use of the Myers-Briggs Type Indicator. Consulting Psychologists Press.

Newell, A., Simon, H., 1972. Human Problem Solving. Prentice Hall.

Nielsen, J., 1995. Multimedia and Hypertext: the Internet and Beyond. AP Professional, Boston.

Nielsen, J., 2000. Designing Web Usability: the Practice of Simplicity. New Riders Publishing, Indianapolis, IN.

Nielsen, J., August 19, 2001. Did poor usability kill e-commerce? Alertbox. Available on Internet: http://www.useit.com/alertbox/20010819.html.

Nielsen, J., Loranger, H., 2006. Prioritizing Web Usability. New Riders Press, Berkeley, CA.

Palmer, J.W., 2002. Web site usability, design, and performance metrics. Information Systems Research 13 (2), 151–167.

Ping Zhang, N.L., 2004. An assessment of human–computer interaction research in management information systems: topics and methods. Computers in Human Behavior 20, 125–147.

Radha, R., Murphy, C., 1992. Searching versus browsing in hypertext. Hypermedia 4 (1), 1–31.

Ramarapu, N.K., Frolick, M.N., Wilkes, R.B., Wetherbe, J.C., 1997. The emergence of hypertext and problem solving: an experimental investigation of accessing and using information from linear versus nonlinear systems. Decision Sciences 28 (4), 825–849.

Ranganathan, C., Ganapathy, S., 2002. Key dimensions of business-to-consumer websites. Information and Management 39, 457–465.

Rayner, K., 1998. Eye movements and information processing: 20 years of research. Psychological Bulletin 124 (3), 372–422.

Rice, M., 1997. What makes users revisit a web site. Marketing News 31 (6), 23.

Rogers, Y., 2004. New theoretical approaches for human–computer interaction. Annual Review of Information Science and Technology 38, 87–143.

Rogers, Y., Ellis, J., 1994. Distributed cognition: an alternative framework for analysing and explaining collaborative working. Journal of Information Technology 9, 119–128.

Scandura, J.M., 1971. Deterministic theorizing in structural learning: three levels of empiricism. Journal of Structural Learning 3, 21–53.

Schenkman, B.N., Jönsson, F.U., 2000. Aesthetics and preferences of web pages. Behaviour & Information Technology 19 (5), 367–377.

Shindler, J., 1992. The Paragon Learning Style Indicator. Paragon Educational Consulting, Seattle, WA.

Singh, M., Iding, M., 2011. Does credibility count? Singaporean students' evaluation of social studies web sites. Psychology and Learning 1 (4), 19–35.

Singh, S., Erwin, G., Kotzé, P., 2001. Electronic Business Accepted Practices (e-BAP): Standardization of HCI for E-Commerce in South Africa. Technical Report, UNISA-TR-2001-22 and SAICSIT 2001 Postgraduate Research Symposium. University of South Africa.

Staggers, N., Norcio, A.F., 1993. Mental models: concepts for human–computer interaction research. International Journal of Man–Machine Studies 38, 587–605.

Suchman, L.A., 1987. Plans and Situated Actions: the Problem of Human Computer Interaction. Cambridge University Press, Cambridge.

Sweller, J., 1988. Cognitive load during problem solving: effects on learning. Cognitive Science 12, 257–285.

Sweller, J., 1994. Cognitive load theory, learning difficulty, and instructional design. Learning and Instruction 4, 295–312.

Trevino, L.K., Webster, J., 1992. Flow in computer-mediated communication. Communication Research 19 (5), 539–573.

Tuch, A.N., Bargas-Avila, J.A., Opwis, K., Wilhelm, F.H., 2009. Visual complexity of websites: effects on user's experience, physiology, performance, and memory. International Journal of Human–Computer Studies 67, 703–715.

Tversky, A., 1972. Elimination by aspects: a theory of choice. Psychological Review 79, 281–299.

Vellido, A., Lisboa, P.J.G., Meehan, K., 2000. Quantitative characterization and prediction of on-line purchasing behavior: a latent variable approach. International Journal of Electronic Commerce 4 (4), 83–104.

Vessey, I., 1991. Cognitive fit: a theory-based analysis of the graphs versus tables literature. Decision Sciences 22 (2), 219–240.

Webster, J., Trevino, L.K., Ryan, L., 1993. The dimensionality and correlates of flow in human–computer interaction. Computers in Human Behavior 9, 411–426.

Wilkie, W.L., 1994. Consumer Behaviour, third ed. John Wiley and Sons.

Wolfinbarger, M., Gilly, M.C., 2002. ComQ: dimensionalizing, measuring and predicting quality of the e-tail experience. In: Proceedings of the American Marketing Association Conference, February 22–25, vol. 13.

Wright, P.C., Fields, R.E., Harrison, M.D., 2000. Analyzing human–computer interaction as distributed cognition: the resources model. Human–Computer Interaction 15 (1), 1–41.

Zhang, J., Norman, D.A., 1994. Representations in distributed cognitive tasks. Cognitive Science 18, 87–122.

Zhang, J., 1991. The interaction of internal and external representations in a problem solving task. In: Proceedings of the Thirteenth Annual Conference of the Cognitive Science Society. Erlbaum, Hillsdale, NJ.

Zhang, P., von Dran, G.M., 2000. Satisfactors and dissatisfactors: a two-factor model for website design and evaluation. Journal of the American Society for Information Science 51 (4), 1253–1268.

Zhu, J., 1996. Robustness of the efficient DMUs in data envelopment analysis. European Journal of Operational Research 90, 451–460.