
This article appears in: Reid, A. J. (2014). A case study in social annotation of digital text. Journal of Applied Learning Technology, 4(2), 15-25.

A Case Study in Social Annotation of Digital Text

Alan J. Reid

Coastal Carolina University

[email protected]


Abstract

Recent learning technologies in digital text have transformed the reading process from a private activity into a social and collaborative experience. This research addressed the use of social annotation (SA) tools, which allow readers to interact asynchronously with other readers through shared highlights and comments embedded directly in the text. In this case study, participants (N = 32) were divided into three treatment groups: Group 1 read a digital text and annotated it synchronously with fellow readers. Group 2 read the text with the existing annotations but did not contribute. Group 3, the control group, read the narrative text with no annotations and did not mark the text. Results indicated significant differences between groups in academic achievement, motivation, and mental effort. These findings suggest that the use of SA tools is beneficial to readers in a variety of ways and should be explored in larger settings.

Keywords: Social annotation, eReading, digital text


Introduction

Traditionally, reading has been a private, isolated activity: a one-way conversation between the author and the reader. The digital revolution, however, has transformed reading into a sociable experience that can be shared with friends, family, and even complete strangers. Facilitating this shift away from print-based text and towards digital text is the popularity of eBooks. Defined as electronic versions of printed books (“eBook,” 2013), eBooks are becoming increasingly pervasive in the public and educational sectors, where four times more people read eBooks on a typical day now than was the case two years ago (Zickuhr, Rainie, Purcell, Madden, & Brenner, 2012). Because of the influx and affordability of eBooks and the devices on which they are consumed, it is natural that reading digital text, or eReading, would embrace the social model that is so engrained in Web 2.0 technologies. Sharing and collaboration have become hallmark features of online technologies, and the traditionally personal experience of reading a book is no exception.

Social Annotation

In general, research has shown the benefits associated with annotating a text. Methods of textual annotation, sometimes referred to as “marginalia,” include (a) writing brief summaries in the margins, (b) giving examples of concepts, (c) generating possible test questions, and (d) noting puzzling or confusing ideas (Simpson & Nist, 1990). Shared annotation is not a recent invention. Before online technologies facilitated the sharing of documents with others, readers often experienced shared annotations through existing notes and comments made in textbooks by previous readers (Nokelainen, Miettinen, Kurhila, Floreen, & Tirri, 2005). Social annotation (SA) tools enable readers to annotate a document online and share their comments with others privately or publicly. A number of sites currently exist that could be used as SA tools, though they are intended more as social bookmarking sites, including Diigo (www.diigo.com) and tumblr (www.tumblr.com). Dedicated SA tools are less common, though researchers have developed programs such as HyLighter (Archibald, 2010), VPen (Hwang, Wang, & Sharples, 2007), SpreadCrumbs (Kawase, Herder, & Nejdl, 2009), and Educosm (Nokelainen, Kurhila, Miettinen, Floreen, & Tirri, 2003) for the specific purpose of social annotation.

It has been argued that Google Drive (formerly known as Google Docs) should not be considered a SA tool because it “provides an online social support platform but does not allow annotating of newly uploaded electronic materials/documents created via other tools” (Novak, Razzouk, & Johnson, 2012, p. 40). However, in the present study, a Google document was created and shared using Google Drive, and as such, it meets the criteria for a proper SA tool according to Novak et al. (2012): the ability to add annotations and highlights and to operate as a collaborative, information-sharing online platform. The only essential feature of a SA tool that a Google Drive document lacks is the ability to make annotations privately (Glover, Xu, & Hardaker, 2007).

Rhetorical Interface

At its most fundamental level, an “interface” is “a point where two systems, subjects, organizations, etc., meet and interact” (“Interface,” n.d.). Carnegie (2009) defines three rhetorical modes of interactivity within new media: multi-directionality, manipulability, and presence. The first mode, multi-directionality, refers to the ability to interact with outside objects by linking to them within the interface. A Google Drive document, when implemented as a SA tool, allows for hypertextuality within the commenting tools. Further, a higher level of interactivity is fostered because readers can comment directly on existing comments (see Figure 1). The second mode of interactivity, manipulability, is more restricted in the Google Drive environment: few options are customizable, which limits the tool in this sense. The last mode of rhetorical interactivity is presence, which is fully supported in a shared Google Drive document, where readers can interact synchronously (as in Group 1) or asynchronously (as in Group 2) with fellow readers.

Figure 1. An example of modes of rhetorical interactivity: multi-directionality and presence.

The rhetorical interface of a SA tool should communicate to the reader a higher level of interactivity, which in turn fosters a more engaging and productive learning environment. Carnegie (2009) argues that in new media, the interface “works continually to engage the audience not simply in action but in interaction” (p. 171). The popularization of SA tools and the integration of social features into online and digital text warrant further investigation into the effectiveness and benefits of using a SA tool in an academic climate. It remains unclear whether the use of SA tools promotes a collaborative and social reading experience or distracts from learning and places strain on cognitive load. As a result, this study investigated the following research questions:

1. Does the use of a SA tool affect perceived interest or learner affect towards the text?
2. Does the use of a SA tool lead to an improvement in academic achievement?
3. Does the use of a SA tool increase the reader’s invested mental effort?

Method

Participants

This study was conducted at a small community college (approximately 1,600 students enrolled) in the mid-Atlantic region of the United States. The researcher used a convenience sample (N = 32) of students enrolled in three different sections of Developmental English courses during the Fall 2013 semester. All participants were undergraduates, and the majority were aged 18-20 and in their first semester of college. Further, the majority of participants described their reading habits as “occasional” (see Figure 2). There were 17 males and 15 females.


Figure 2. The demographic survey revealed that the majority of the participants were aged 18-20, in their first semester of college, and identified themselves as reading “occasionally.”

Material and Procedure

Narrative text. The narrative short story “A Sound of Thunder” by Ray Bradbury (approximately 4,300 words, Lexile level 540) was presented to all participants in the form of a shared Google Drive document. Students were seated in front of a laptop computer and were provided a link to the shared document during class time. They were given one hour to complete the treatment, and none of the participants had read the narrative text previously.

Procedure. Prior to the study, all participants had extensive experience with annotating texts in PDF form, as this was a standard weekly assignment. Participants were comfortable with highlighting important information and adding comments to the text. Although the use of PDFs was common in the course, none of the participants had used a social annotation tool before the implementation of this study.

Each class was assigned to a different group. Group 1 (n = 13) was instructed to read the online text and make annotations during reading. Because a Google Drive document was used, these comments appeared in the right-hand margin of the document synchronously, allowing students to see comments being added in real time (see Figure 3). Students in Group 1 could also add responses to other students’ posted comments. Group 2 (n = 9) was drawn from a different course section, and participants were provided the link to the Google Drive document in class. They were instructed to read the text, but editing of the document was disabled, so they could not add, edit, or delete any comments. Figure 3 shows an example screen that was annotated synchronously by Group 1 and was viewed statically by Group 2. The third class, Group 3 (n = 10), simply read the online text and did not see the comments, nor were they able to add any comments to the document.

All participants were provided a print-based packet that included the demographic survey, perceived interest questionnaire, learner affect questionnaire, mental effort survey, and the 10-question comprehension posttest. Participants were instructed to read the online text prior to opening the packet and completing the questionnaires, surveys, and test. All groups were allowed one hour to complete the treatment. One week after the treatment, all groups were asked to complete a brief questionnaire that asked for general comments on the presentation of the material. No personal identifying information was collected.


Figure 3. Screenshot from the shared Google Drive document. Group 1 annotated the text synchronously. Group 2 saw this page as it is displayed but did not have editing capabilities, and Group 3 saw only the text with no annotations.


Instruments

Following the text, the readers completed a demographic survey, a perceived interest questionnaire, a learner affect questionnaire, a mental effort survey, and a 10-question comprehension posttest, in that sequence.

Perceived interest questionnaire. This 10-item questionnaire was adapted from Schraw, Bruning, and Svoboda (1995) and asked the reader to self-report levels of interest in the text. The Likert-type survey (Appendix B) asked readers ten questions, with responses ranging from Strongly Disagree to Strongly Agree (1 to 5, respectively). Each participant received a total perceived interest score based on his or her responses.

Learner affect questionnaire. Participants also reported a score (“not at all,” “a little,” “somewhat,” “quite a bit,” “very,” “extremely”) for seven different adjectives that described their state during reading (Razon, Turner, Johnson, Arsal, & Tenenbaum, 2012). Respondents used the same 6-point scale to answer a question on motivation and a question on how likely they were to read similar texts in the future. These questions were, “How motivated were you while reading this text?” and “To what degree do you wish to read more texts like this one?” (see Appendix C).

Mental effort survey. The survey used to measure mental effort during reading was adapted from Mackersie and Cones (2011). It asked participants a total of five questions sub-categorized into mental demand, temporal demand, performance, effort, and frustration (see Appendix D). Possible scores ranged from 0 (low) to 100 (high). Participants received an overall score on the survey at the end of the instruction, and the mental effort question, “How hard did you have to work in your attempt to understand the contents of this text?”, was also embedded at the bottom of each page of the narrative text; participants recorded their responses to this question during reading.
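To make the scoring of these instruments concrete, the following is a minimal, hypothetical Python sketch of how a perceived interest total (ten 1-5 ratings summed) and a mental effort total (five 0-100 subscale ratings summed) might be computed. The function names and example responses are illustrative assumptions, not the study’s data or scoring code.

```python
# Hypothetical scoring sketch: sums Likert-type responses per participant.
# The example responses below are placeholders, not data from the study.

def perceived_interest_total(responses):
    """Sum ten perceived-interest items rated 1 (Strongly Disagree) to 5 (Strongly Agree)."""
    assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
    return sum(responses)

def mental_effort_total(responses):
    """Sum five mental-effort subscales (mental demand, temporal demand,
    performance, effort, frustration), each rated 0 (low) to 100 (high)."""
    assert len(responses) == 5 and all(0 <= r <= 100 for r in responses)
    return sum(responses)

# Example (placeholder) participant:
interest_items = [4, 5, 3, 4, 4, 5, 4, 3, 4, 4]    # 1-5 Likert ratings
effort_items = [40, 30, 70, 35, 20]                 # 0-100 subscale ratings
print(perceived_interest_total(interest_items))     # 40
print(mental_effort_total(effort_items))            # 195
```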

Comprehension posttest. The posttest consisted of ten multiple-choice questions at the recall level (see Appendix E). The items were adapted from an online web activity (“Quia,” n.d.). Participants were not permitted to return to the text after beginning the posttest.

Results

This section reports the findings from the statistical analyses conducted to determine the effects of SA tools on learner interest, achievement, and mental effort (N = 32). Table 1 presents the overall mean results collapsed across conditions for each of the dependent variables.

Table 1
Mean results collapsed across conditions

                 Perceived Interest     Mental Effort         Comprehension Posttest
          n      M       SD             M         SD          M        SD
Group 1   13     34.0    8.45           191.31    105.60      76.92    14.94
Group 2    9     37.0    3.74           239.44     82.41      84.28    16.18
Group 3   10     33.6    4.81           328.11     88.54      53.00    18.89

Research Question: Interest and Affect

The first research question investigated whether or not the presentation of the text using the SA tool would impact learners’ perceptions towards the text. A one-way between-groups analysis of variance (ANOVA) was conducted, and no significant difference between groups existed on the perceived interest questionnaire. In terms of reader affect, there were no significant differences detected in the total scores between groups. However, in response to the question on motivation, “How motivated were you while reading this text?”, a statistically significant difference did exist, F(2, 29) = 5.67, p = .008. A Tukey post hoc test indicated that Group 1 (M = 3.92, SD = 1.04) reported a significantly higher level of motivation while reading the text when compared to the control group (M = 2.5, SD = .97).

Table 2
One-way between-groups analysis of variance for motivation

Source    SS        df    MS      F        p
Group     12.30      2    6.15    5.674    .008*
Error     31.423    29    1.08
Total     43.72     31

* p < .05, two-tailed
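For readers who want to see how this kind of analysis is typically carried out, here is a brief, hypothetical Python sketch of a one-way between-groups ANOVA followed by a Tukey HSD post hoc comparison, using scipy and statsmodels. The scores in the example are placeholders and do not reproduce the study’s data or results.

```python
# Hypothetical analysis sketch: one-way ANOVA and Tukey HSD post hoc test.
# The ratings below are placeholder values, not the study's data.
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Placeholder motivation ratings (1-6 scale) for three groups
group1 = [4, 5, 4, 3, 5, 4, 4, 3, 5, 4, 4, 3, 4]   # SA tool, annotating (n = 13)
group2 = [4, 3, 4, 3, 4, 3, 4, 4, 3]                # SA tool, reading only (n = 9)
group3 = [2, 3, 2, 3, 2, 3, 2, 3, 2, 3]             # control (n = 10)

# One-way between-groups ANOVA
f_stat, p_value = stats.f_oneway(group1, group2, group3)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")

# Tukey HSD post hoc comparison of all group pairs
scores = group1 + group2 + group3
labels = (["Group 1"] * len(group1) + ["Group 2"] * len(group2)
          + ["Group 3"] * len(group3))
print(pairwise_tukeyhsd(scores, labels, alpha=0.05))
```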

Research Question: Achievement

This research question asked whether a difference would exist between treatments in terms of academic achievement. A one-way between-groups ANOVA revealed a statistically significant difference between groups, F(2, 27) = 8.92, p = .001. A Tukey post hoc comparison revealed that the control group scored significantly lower (M = 53.00, SD = 18.89) on the comprehension posttest compared to Group 1 (M = 76.92, SD = 14.94) and Group 2 (M = 84.28, SD = 16.18).

Table 3
One-way between-groups analysis of variance for posttest score

Source    SS          df    MS         F       p
Group     4928.32      2    2464.16    8.92    .001*
Error     7458.35     27    276.24
Total     12386.67    29

* p < .05, two-tailed

Research Question: Mental Effort

The final research question sought differences in mental effort between the treatment groups. The total score on the mental effort survey was analyzed for differences using a one-way between-groups ANOVA, and a statistically significant difference was identified, F(2, 28) = 5.58, p = .009. The control group (Group 3) reported significantly higher levels of mental effort (M = 328.11, SD = 88.54) when compared to Group 1 (M = 191.31, SD = 105.60).

Table 4
One-way between-groups analysis of variance for mental effort

Source    SS           df    MS          F       p
Group     99922.12      2    49961.06    5.58    .009*
Error     250851.88    28    8959.00
Total     350774.00    30

* p < .05, two-tailed


Mental effort was also reported throughout the text. At the end of every page of text, all participants reported their levels of mental effort in response to the question, “How hard did you have to work in your attempt to understand the contents of this text?” Possible scores ranged from 0 (low) to 100 (high). Figure 4 shows the linear trend of the mean results from each group for each of the twelve trials. Group 1 reported the lowest levels of mental effort in all but one of the trials. Group 3 (control) reported the highest levels of mental effort in all but the first two trials.

Figure 4. A linear chart depicting the mean scores of the repeated measure of mental effort across twelve trials for each group.
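As an illustration of how such a repeated-measures trend chart can be produced, here is a minimal matplotlib sketch; the per-trial means below are invented placeholders, not the values plotted in the original figure.

```python
# Hypothetical plotting sketch for per-trial mean mental effort by group.
# The mean values below are placeholders, not the study's measurements.
import matplotlib.pyplot as plt

trials = range(1, 13)  # twelve pages/trials
# Placeholder per-trial group means (0-100 mental effort scale)
group_means = {
    "Group 1": [30, 28, 25, 24, 22, 23, 21, 20, 22, 21, 20, 19],
    "Group 2": [32, 30, 29, 28, 27, 28, 26, 25, 26, 25, 24, 24],
    "Group 3": [29, 31, 34, 36, 38, 40, 41, 43, 44, 45, 46, 47],
}

for label, means in group_means.items():
    plt.plot(trials, means, marker="o", label=label)

plt.xlabel("Trial")
plt.ylabel("Mean reported mental effort (0-100)")
plt.legend()
plt.show()
```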



Discussion

The results of this study found that the most robust treatment (Group 1), which actively used the SA tool, had significantly lower levels of mental effort, scored significantly higher on the comprehension posttest, and reported significantly more motivation to read the text when compared to the control. Although participants in Group 2 only read existing annotations and did not contribute with the SA tool, this treatment group still outperformed the control group on the comprehension posttest at a statistically significant level. This finding is consistent with previous research that showed an increase in performance on comprehension questions, but not on higher-level critical thinking questions (Hwang et al., 2007; Johnson, Archibald, & Tenenbaum, 2010; Razon et al., 2012). It would appear that the use of the SA tool increased reader interactivity during reading and led to greater motivation and more successful academic achievement. Surprisingly, this interactivity lowered mental effort and did not place extra cognitive strain on the reader.

Interface Design

These findings reinforce Carnegie’s (2009) interpretation of new media as a networked, interactive, and rhetorical device that operates as an “exordium,” which is continuously present and demands engagement from the learner. In this case study, the use of a SA tool in the form of a shared Google Drive document did not yield a significant difference between groups in perceived interest or learner affect, which is contrary to other studies in which students preferred the use of a SA tool (Gao & Johnson, in press; Kawase, Herder, & Nejdl, 2009). The social nature of the interface allowed readers to experience the text in a social setting, a powerful motivational tool. Likewise, Inman (2003) emphasizes the “shape of the page” as being as important as any other element in the text (p. 24). A social interface provides the reader with scaffolding for understanding and a context by which he or she can measure comprehension, while reinforcing a sense of social presence and camaraderie. Newer technologies such as the eReading applications Kobo and Readmill are beginning to see the value in implementing these social annotation features in digital text.

Conclusion

The results of this case study suggest that social annotation is a powerful strategy for readers in terms of reading comprehension and motivation. Further, the use of a shared Google Drive document demonstrates the ease of use and accessibility of a basic but effective SA tool. Because of the small sample size of this study, these findings are not generalizable to all college undergraduates, though they warrant further investigation into the benefits of social annotation while reading digital text. Future research should also explore the impact of the use of a SA tool on different types of readers and for different levels of question types.

The intention of this case study was to investigate the impact of a SA tool, in the form of a shared Google Drive document, on college undergraduates in terms of interest, affect, achievement, and mental effort. Preliminary results were positive for those experiencing the narrative text with both synchronous and asynchronous commenting. The cultural shift from print to digital text also affords the opportunity for sharing and collaboration among readers. As suggested by this case study, there is little trade-off and much to gain in enhancing reader motivation and achievement through a shared digital text. As such, the use of SA tools should be examined in greater depth at a time when our digital lives, including reading, are becoming more sociable experiences rather than individual, isolated acts.


References

Archibald, T. N. (2010). The effect of the integration of social annotation technology, first principles of instruction, and team-based learning on students' reading comprehension, critical thinking, and meta-cognitive skills (Doctoral dissertation). Florida State University, Tallahassee, FL.

Carnegie, T. (2009). Interface as exordium: The rhetoric of interactivity. Computers and Composition, 26, 164-173. doi: 10.1016/j.compcom.2009.05.005

eBook. (2013). In Oxford Dictionaries. Retrieved November 12, 2013, from http://www.oxforddictionaries.com/us/definition/american_english/e--book

Gao, F. (2013). A case study of using a social annotation tool to support collaboratively learning. Internet and Higher Education, 17, 76-83. doi: 10.1016/j.iheduc.2012.11.002

Gao, F., & Johnson, T. E. (in press). Learning web-based materials collaboratively with a web annotation tool.

Glover, I., Xu, Z., & Hardaker, G. (2007). Online annotation – Research and practices. Computers and Education, 49, 1308-1320.

Hwang, W.-Y., Wang, C.-Y., & Sharples, M. (2007). A study of multimedia annotation of web-based materials. Computers & Education, 48, 680-699.

Inman, J. (2003). Electronic texts and the concept of close reading: A cyborg anthropologist’s perspective. In J. R. Walker & O. O. Oviedo (Eds.), TnT: Texts and Technology. Cresskill, NJ: Hampton Press.

Interface. (n.d.). Oxford English Dictionary online. Retrieved November 12, 2013, from http://oxforddictionaries.com/us/definition/american_english/interface?q=interface

Johnson, T. E., Archibald, T. N., & Tenenbaum, G. (2010). Individual and team annotation effects on students’ reading comprehension, critical thinking, and metacognitive skills. Computers in Human Behavior, 26(6), 1496-1507. doi: 10.1016/j.chb.2010.05.014

Kawase, R., Herder, E., & Nejdl, W. (2009). A comparison of paper-based and online annotations in the workplace. Paper presented at the 4th European Conference on Technology Enhanced Learning: Learning in the Synergy of Multiple Disciplines, Nice, France.

Nokelainen, P., Kurhila, J., Miettinen, M., Floreen, P., & Tirri, H. (2003). Evaluating the role of a shared document-based annotation tool in learner-centered collaborative learning. Paper presented at the 3rd IEEE International Conference on Advanced Learning Technologies.

Nokelainen, P., Miettinen, M., Kurhila, J., Floreen, P., & Tirri, H. (2005). A shared document-based annotation tool to support learner-centered collaborative learning. British Journal of Educational Technology, 36(5), 757-770.

Novak, E., Razzouk, R., & Johnson, T. E. (2012). The educational use of social annotation tools in higher education: A literature review. Internet and Higher Education, 15(1), 39-49. doi: 10.1016/j.iheduc.2011.09.002

Quia. (n.d.). A Sound of Thunder. Retrieved November 12, 2013, from http://www.quia.com/quiz/1593324.html

Razon, S., Turner, J., Johnson, T. E., Arsal, G., & Tenenbaum, G. (2012). Effects of a collaborative annotation method on students’ learning and learning-related motivation and affect. Computers in Human Behavior, 28(2), 350-359. doi: 10.1016/j.chb.2011.10.004

Schraw, G., Bruning, R., & Svoboda, C. (1995). Sources of situational interest. Journal of Literacy Research, 27(1). doi: 10.1080/10862969509547866

Simpson, M. L., & Nist, S. L. (1990). Textbook annotation: An effective and efficient study strategy for college students. Journal of Reading, 34(2), 122-129.

Zickuhr, K., Rainie, L., Purcell, K., Madden, M., & Brenner, J. (2012, April 4). The rise of e-reading. Pew Internet and American Life Project. Retrieved October 29, 2013, from http://libraries.pewinternet.org/2012/04/04/the-rise-of-e-reading/


Appendix A
Demographic Survey

What is your gender?  Male  Female

What is your age range?  18-20  21-25  26-30  31-35  36+

How would you describe your reading habits?  Rare  Limited  Occasional  Frequent

How many semesters have you been enrolled in college?  1  2  3-5  6 or more


Appendix B
Perceived Interest Questionnaire

Response scale: Strongly Disagree (1), Disagree (2), Neither Agree nor Disagree (3), Agree (4), Strongly Agree (5)

I thought this text was very interesting.
I’d like to discuss this text with others at some point.
I would complete this text again if I had the chance.
I got caught up in the text without trying to.
I’ll probably think about the implications of this text for some time to come.
I thought the text’s topic was fascinating.
I think others would find this text interesting.
I would like to learn more about this topic in the future.
This text was one of the most interesting things I’ve learned in a long time.
The text really grabbed my attention.

Adapted from: Schraw, G., Bruning, R., & Svoboda, C. (1995). Sources of situational interest. Journal of Literacy Research, 27(1). doi: 10.1080/10862969509547866


Appendix C
Learner Affect Questionnaire

Group 1: Students differ in how they feel about reading and annotating. Please use the scale below to rate how you feel about reading and digitally annotating this text.

Groups 2 & 3: Students differ in how they feel about reading text. Please use the scale below to rate how you feel about reading this text.

                  Not at all   A little   Somewhat   Quite a bit   Very   Extremely
Excited               1           2           3           4          5        6
Worried               1           2           3           4          5        6
Optimistic            1           2           3           4          5        6
Distressed            1           2           3           4          5        6
Happy                 1           2           3           4          5        6
Uncertain             1           2           3           4          5        6
Pessimistic           1           2           3           4          5        6
Other: ________       1           2           3           4          5        6

How motivated were you while reading this text?

Not at all A little Somewhat Quite a bit Very Extremely

1 2 3 4 5 6

To what degree do you wish to read more texts like this one?

Not at all A little Somewhat Quite a bit Very Extremely

1 2 3 4 5 6

Adapted from Razon, S., Turner, J., Johnson, T.E., Arsal, G., & Tenenbaum, G. (2012). Effects of a collaborative annotation method on students’ learning and learning-related motivation and affect. Computers in Human Behavior, 28(2), 350-359. doi: 10.1016/j.chb.2011.10.004


Appendix D
Mental Effort Survey

Mental Effort (Repeated Measure): How hard did you have to work in your attempt to understand the contents of this text? (0 = Low, 100 = High)

Mental Demand: How mentally demanding was the task? _______ (0 = Low, 100 = High)

Temporal Demand: How hurried or rushed was the pace of the task? _______ (0 = Low, 100 = High)

Performance: How successful were you in accomplishing what you were asked to do? _______ (0 = Low, 100 = High)

Effort: How hard did you have to work to accomplish your level of performance? _______ (0 = Low, 100 = High)

Frustration: How insecure, discouraged, irritated, stressed, and annoyed were you? _______ (0 = Low, 100 = High)

Adapted from Mackersie, C., & Cones, H. (2011). Subjective and psychophysiological indices of listening effort in a competing-talker task. Journal of the American Academy of Audiology, 22(2), 113-122. doi: 10.3766/jaaa.22.2.6


Appendix E
Comprehension Posttest

1. Just before returning to the present, Travis orders Eckels to:
   Take a photo of the dead dino
   Bury the dino
   Retrieve the bullets
   Pay $10,000

2. What is the first thing Eckels notices when he returns from the time safari?
   A dead butterfly
   The sign on the office door
   The mud on his boots
   A strange smell in the air

3. How do hunters know which dinosaurs they can shoot?
   They are all dinosaur experts
   They have seen pictures of the dino they can kill
   The tour guides point out the correct dino
   The correct dino has been marked with red paint

4. What has Eckels done to change the course of time?
   He did not follow the rules given to him by Travis
   He removed the bullets from the dead dino
   He killed a butterfly
   He left the path

5. Travis blames Eckels for the change in the world. How does Travis respond?
   He kills him
   He decides to quit time travel
   He wants the world to know about Eckels’s mistake
   He decides to support Deutsher

6. When facing the mighty dinosaur, Eckels is
   Terrified and wants out of the safari
   Sorry he wasted his money
   Determined to shoot it on its own
   Disappointed because it is an easy kill

7. Why is it so important that the hunters remain on the path?
   There is a heavy fine for leaving the path
   The hunters are protected from harm only if they remain on the path
   The guides are afraid someone will get lost if they leave the path
   By disturbing anything in the past, they could change the future

8. Some dinosaurs are more difficult to kill than others because they
   Are able to outwit their attackers
   Have two brains
   Run extremely fast
   Can blend in with the jungle

9. How does Eckels feel about Deutsher’s election as president at the end of the story?
   He feels terrified by this change
   He thinks Deutsher will be good for the country
   He doesn’t care who is president
   He is happy about it

10. The story’s title, “A Sound of Thunder,” refers to what, exactly?
   Answers may include: a reference to the dinosaurs and the sound of their footsteps, or a reference to Eckels being shot by Travis at the end of the story.

Adapted from Quia. Retrieved from http://www.quia.com/quiz/1593324.html


About the Author

I earned my Ph.D. in Instructional Design & Technology from Old Dominion University, an M.A. in Teaching from the University of North Carolina Wilmington, and a B.A. in English from The Ohio State University.

Currently, I am an Assistant Professor of First-Year Writing & Instructional Technologies at Coastal Carolina University in Myrtle Beach, SC. I also teach a variety of ID courses in the Instructional Design & Technology doctoral program at Old Dominion University, and I hold a part-time position as an Educational Research Associate in the Center for Research and Reform in Education (CRRE) at Johns Hopkins University.

My research interests lie within the areas of metacognition and self-regulation, particularly in new media. I have written extensively on generative learning strategy use in digital text, the implementation of social media in higher ed, and a variety of other topics including the effects of social annotation and perceptual span on reading in digital media. Eventually, I would like to explore the constraints on cognitive efficiency as a result of multitasking.

Presently, I am designing a digital badge initiative for Coastal Carolina University, Coastal Composition Commons, in which undergraduates earn digital badges for demonstrating mastery of writing competencies. An institutional ethnography and an experimental study are being conducted. My passions include surfing, running with my dog, and impromptu beach trips with my son (Phoenix), my daughter (Stella), and my loving, supportive, and brilliant wife, Alison. I also enjoy photography; a portfolio of my ongoing work can be seen here: Salty Lens Photography.

Alan Reid
@alanjreidphd