
12015 Waterfront Drive // Playa Vista, CA 90094-2536 // 310.574.5700 tel // 310.574.5725 fax // [email protected] // facebook.com/USCICT // Twitter: @USC_ICT // youtube.com/USCICT

At the University of Southern California Institute for Creative Technologies (ICT), leaders in artificial intelligence, graphics, virtual reality (VR) and narrative advance low-cost immersive techniques and technologies to solve problems facing service members, students and society.

Established in 1999, ICT is a DoD-sponsored University Affiliated Research Center (UARC) working in collaboration with the U.S. Army Research Laboratory. UARCs are aligned with prestigious institutions conducting research at the forefront of science and innovation.

ICT brings film and game industry artists together with computer and social scientists to study and develop immersive media for military training, health therapies, education and more. Research projects explore and expand how people engage with computers, through virtual characters, video games and simulated scenarios. ICT is a recognized leader in the development of virtual humans who look, think and behave like real people.

ICT prototypes provide engaging experiences to improve skills in decision-making, cultural awareness, leadership and coping, to name a few. They allow veterans to go online and speak anonymously to an interactive virtual coach who can remotely recognize signs of depression, PTSD and suicide risk. They provide training in how to address cases of performance or personal issues through practice with a computer-generated virtual human education system. They can simulate what goes wrong when Soldiers don’t consider the cultural sensitivities and indirect consequences of even their smallest interactions.

Being based in Los Angeles facilitates collaboration with major movie and game makers. ICT graphics innovations help create realistic computer-generated characters in Hollywood blockbusters and also enhance virtual characters for museum and military projects. ICT’s groundbreaking research and advanced technology demonstrations are both making an impact today and paving the way for what is possible in the future.

Facts and Figures

• Approximately 50 installations, hospitals and clinics use ICT’s virtual reality therapy for treating PTSD.
• $34 million saved by applying ICT’s natural language system to a single U.S. Army project.
• 1 Academy Award received for developing ICT’s Light Stages, used in Avatar and Spider-Man 2.
• 120 ICT-authored academic journal articles and conference papers published in 2012.

6/2013


Ada and Grace: Virtual Human Museum Guides

Bringing Science and Technology to Life

Ada and Grace, ICT’s virtual human museum guides, debuted at the Boston Museum of Science in 2009 and have interacted with close to 200,000 visitors there. Designed to advance the public’s awareness of, and engagement in, computer science and emerging learning technologies, the virtual guides make a museum visit richer by answering visitor questions, suggesting exhibits and explaining the technology that makes them work. Named for Ada Lovelace and Grace Hopper, two female computer science pioneers, these digital docents are among the first and most advanced virtual humans ever created to speak face-to-face with museum visitors. As both examples and explainers of technical scientific concepts, they represent a new and potentially transformative medium for engaging the public in science.

A collaboration between ICT and the Museum of Science, Boston, this NSF-funded project highlights the educational and research potential of virtual characters by getting them out of the lab and interacting with people in meaningful and memorable ways. At the museum, they serve not just as guides but as a technology exhibit too. Displays placed next to the characters further educate visitors by showing the underlying processing the virtual humans perform in areas such as automatic speech recognition and natural language processing, which allows the 19-year-old twins to move, listen and talk just like real young adults.

Museum visitors not only observe science, they also participate in the process of science: data acquired from visitor interactions with the virtual humans is used on an ongoing basis to improve the guides’ performance and knowledge. This rich database can also benefit other virtual human applications in areas such as training, education, medical interventions and entertainment. In addition, by moving a research project into a museum, the Virtual Museum Guides project transforms the museum from a place where science is merely displayed into a place where science is actually done.
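To suggest how such face-to-face question answering might work at its very simplest, here is a hypothetical keyword-overlap sketch. The question-answer pairs and the matching scheme are invented for illustration; the deployed guides use statistical language understanding over a far larger response set.

```python
# Hypothetical sketch of routing a recognized visitor utterance to the
# closest prerecorded answer by keyword overlap. All Q&A pairs below
# are invented; this is not the museum system's actual data or logic.
ANSWERS = {
    frozenset({"who", "are", "you"}): "We're Ada and Grace, your virtual guides!",
    frozenset({"how", "do", "you", "work"}): "Speech recognition turns your words into text, then we pick a reply.",
    frozenset({"what", "should", "i", "see"}): "Try the robotics exhibit on the second floor.",
}

def answer(utterance):
    """Pick the answer whose keyword set overlaps the utterance most."""
    words = set(utterance.lower().strip("?!. ").split())
    best = max(ANSWERS, key=lambda keys: len(keys & words))
    # fall back to a clarification request when nothing overlaps at all
    return ANSWERS[best] if best & words else "Could you rephrase that?"

print(answer("Who are you?"))
```

In the real system, data logged from these exchanges feeds back into improving the guides, as described above.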

Related project: Coach Mike

1/2013

2009-present


Advanced Prototype Demonstrations: Turning ICT Research into Real-World Impact

Using science and storytelling, ICT prototypes provide engaging experiences to improve skills in decision-making, cultural awareness, leadership and coping, to name a few.

Imagine a veteran going online to speak anonymously to SimCoach, an interactive virtual guide who can remotely recognize signs of depression, PTSD and suicide risk and point the person in need to resources that can help.

Picture a soldier learning how to address a case of sexual harassment through practice with ELITE and INOTS, computer-generated virtual human education systems that can role play conversations and teach a classroom of participants the best way to handle tricky personnel issues.

Consider BiLAT and UrbanSim, computer games now available to troops stationed around the globe that can simulate what goes wrong when they don’t consider the cultural sensitivities and indirect consequences of even their smallest interactions.

Explore Virtual Iraq/Afghanistan, which provides relief from post-traumatic stress through virtual reality exposure therapy. Patients, with the help of clinicians, confront their trauma memories in a virtual world.

Meet Ada and Grace, responsive virtual human museum guides who promote STEM education by answering questions, suggesting exhibits and explaining the technology that makes them work.

Experience MCIT and DICE-T, which provide counter-IED training via narrative storylines, immersive environments and live multiplayer video game-based practice tools.

These inventive examples represent some of the groundbreaking research and advanced technology demonstrations coming out of the University of Southern California Institute for Creative Technologies that are both making an impact today and paving the way for what is possible in the future.

6/2012


Authoring Realistic Learning Environments with Stories (ARLES)

Narrative Evidence for Author Retrieval (NEAR)

Nonfiction stories of personal experience play an important role in effective training and education, both in the interaction between instructors and students and in the development of immersive computer-based learning environments. In this research, we develop new technologies for finding nonfiction stories relevant to specific learning objectives among the millions of personal stories that people post to their public weblogs. We have developed a pipeline for the automatic collection of tens of millions of stories from streams of weblog data, and we search this collection using innovative text- and image-based tools. To support the use of stories in the development of U.S. Army training systems, we identify stories that describe the execution of skills analogous to Army tasks.
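A minimal sketch of the retrieval step, assuming a simple bag-of-words cosine similarity between a learning objective and candidate stories. The story texts and scoring scheme below are invented for illustration; the actual ARLES pipeline is far richer.

```python
# Hypothetical ARLES-style story ranking: score candidate weblog
# stories against a learning objective by bag-of-words cosine
# similarity. Illustrative only; not ICT's actual retrieval system.
import math
from collections import Counter

def vectorize(text):
    """Lowercased bag-of-words term counts."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a if t in b)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def rank_stories(objective, stories):
    """Return stories with nonzero similarity, best match first."""
    obj_vec = vectorize(objective)
    scored = [(cosine(obj_vec, vectorize(s)), s) for s in stories]
    return [s for score, s in sorted(scored, reverse=True) if score > 0]

stories = [
    "our squad leader had to negotiate with the village elder about the checkpoint",
    "I baked bread all weekend and the sourdough finally rose",
    "planning the convoy route we had to balance speed against checkpoint delays",
]
print(rank_stories("negotiating with local leaders at a checkpoint", stories)[0])
```

Here the baking story is filtered out entirely, while the two checkpoint stories are ranked by overlap with the stated objective.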

In Government FY2014, ARLES research will be incorporated into a new project:

This project will develop interactive information retrieval prototypes in which training developers articulate, in natural language (English text), criteria for the authors whose content they seek; these criteria are then used to locate, among the millions of authors contributing narrative content to public weblogs, those who meet them. In this work, we capitalize on recent advances in abductive language interpretation, which enable new approaches to information retrieval that are sensitive to both explicit and implicit evidence in authors’ writings.

These projects are funded by the U.S. Army as part of the core mission of the USC Institute for Creative Technologies.

9/2012


Automatic Analysis of Discourse Structure

With the amount of information available as electronic text increasing at a staggering pace, machine understanding of natural language stands as a key barrier in the creation of computer systems that can take advantage of the knowledge encoded in written documents. Current approaches to automatic extraction of facts and other information from text focus mostly on what can be learned from single sentences, one sentence at a time, examining word-to-word relationships. This often corresponds to the notion of who did what to whom.

Constraining computation to single sentences has allowed researchers to explore increasingly sophisticated machine learning models that capture complex structure. In contrast, when going beyond sentence boundaries, the field is dominated by simpler models that largely ignore language structure.

This research project will explore models that target relationships holding among larger phrases and sentences: capturing, for example, statements of cause and effect that span multiple sentences; addressing the additional questions of why, how and when in broader contexts than individual sentences provide; and accounting for the overall rhetorical structure of texts. This approach involves new strategies for taming the exponentially large search space of the task, strategies that both address fundamental barriers to handling language structure in a principled way and open new research directions involving rhetorical relations such as causality, attribution, elaboration and comparison.
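As a toy illustration of relations that cross sentence boundaries, the baseline below labels a sentence's rhetorical relation to its predecessor from an opening cue phrase. The cue lists are invented for demonstration; the structured models this project pursues go far beyond such surface cues.

```python
# Illustrative cue-phrase baseline for discourse relation labeling.
# Real discourse parsers use structured models over much richer
# features; these cue lists are invented for the sketch.
CUES = {
    "causality":   ("because", "therefore", "as a result", "consequently"),
    "comparison":  ("however", "but", "in contrast", "whereas"),
    "elaboration": ("for example", "in particular", "specifically"),
}

def label_relation(sentence):
    """Guess the relation a sentence holds to its predecessor from its opening cue."""
    s = sentence.lower()
    for relation, cues in CUES.items():
        if any(s.startswith(c) for c in cues):
            return relation
    return "unknown"  # no overt cue; deeper inference would be needed

print(label_relation("As a result, the bridge was closed."))
```

Sentences with no overt connective fall into the "unknown" bucket, which is exactly the case where models of implicit rhetorical structure are needed.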

The proposed work aims, on one hand, to improve our understanding of efficient and effective computational modeling of natural language structure and, on the other hand, to enable deeper natural language understanding in applications such as automatic knowledge-base construction and conversational virtual human systems, allowing for the creation of smarter artificial agents with more robust language capabilities.

This research is funded by the U.S. Army as part of the core mission of the USC Institute for Creative Technologies.

9/2013


AXL: Army Excellence in Leadership

AXL provides an engaging and memorable way to transfer tacit knowledge and develop critical thinking through case-method teaching, filmed storytelling and interactive training. AXL includes filmed cases created in collaboration with Hollywood talent to address specific leadership issues and an easily modifiable website, AXLnet.

It is estimated that over 10,000 Soldiers have trained using the AXL videos, which have also been shown to West Point cadets. A business ethics version was developed for the USC Marshall School of Business.

Filmed Cases
ICT has developed five filmed leadership cases addressing complex decision-making skills for the U.S. Army. The films, all based on real-life situations, were brought to life by experienced Hollywood screenwriters and professional actors. The first case, Power Hungry, a 13-minute film set against the backdrop of a food distribution operation in Afghanistan, addresses lessons on how to think like a commander. Trip Wire uses the leadership challenges posed by the threat of IEDs in Iraq to consider the balance between force presence and mission accomplishment, and Red Tight addresses interpreting threat levels in a Patriot battery operation. Working with the U.S. Army Chaplaincy, ICT developed Fallen Eagle, a two-part film series for squad-level training told from the perspectives of both enlisted and officer ranks. It focuses on moral and ethical decision-making on the battlefield.

AXLnet
AXLnet provides a dynamic and interactive experience for students and easy-to-use tools for instructors to author customized lessons. The system draws on ICT’s research in natural language processing to allow students to interview characters from the cases through free-text questions. The system can also provide feedback and tailor the learning experience based on student responses.

External Collaborators
U.S. Army Research Institute Leader Development Research Unit, USC Rossier School of Education, U.S. Army Air Defense Artillery School, U.S. Corps of Chaplains, United States Military Academy, USC Marshall School of Business

6/2012

2003-2010


BiLAT: Bilateral Negotiation Trainer

BiLAT is a portable, PC-based training program designed with a specific objective in mind: to give students an immersive and compelling training environment in which to practice their skills in conducting meetings and negotiations in a specific cultural context. The application won a 2008 U.S. Army Modeling and Simulation Award and has been deployed as part of a training curriculum for officers assigned to foreign posts. BiLAT has transitioned to the U.S. Army and is available for download from the Army’s MilGaming website.

In BiLAT, students assume the role of a U.S. Army officer who needs to conduct a series of meetings with local leaders to achieve the mission objectives. Students must establish their own relationships with these characters and be sensitive to the characters’ cultural conventions. Any misstep could set the negotiations back or end them completely. Students must also apply sound negotiation strategies such as finding win-win solutions and properly preparing prior to the meeting.

USC’s Game Innovation Lab was involved in the game design as well as creating a compelling set of scenarios with realistic characters that would be appropriate for the training objectives identified. Also, the BiLAT infrastructure uses research technologies including a dialogue manager, SmartBody animation technology and the PsychSim social simulation system from ICT’s virtual human research project, as well as an intelligent coach and tutor to provide the student with run-time coaching and in-depth feedback during after action reviews.

BiLAT AIDE is a complementary web-based course created with the USC Rossier School of Education. The course further enhances learning by providing instruction on the theories behind the practice of negotiation and cultural understanding.

BiLAT was a part of the Learning with Adaptive Simulation and Training (LAST) Army Technology Objective (ATO). The project was a collaboration between the University of Southern California’s Institute for Creative Technologies (ICT), U.S. Army Research Institute for the Behavioral and Social Sciences (ARI), U.S. Army Research Laboratory Human Research and Engineering Directorate (ARL-HRED) and U.S. Army Simulation and Training Technology Center (STTC).

6/2012

2004-2008


Captivating Virtual Instruction for Training

The Captivating Virtual Instruction for Training (CVIT) project is a two-year research effort seeking to produce a blueprint for mapping effective instructional techniques used in a live classroom setting to core enabling technologies, which may then be used for the design and development of engaging virtual and distributed learning (dL) applications. This 6.3 project will use an existing Army program of instruction (POI) as the research platform which has demonstrated a history of quality instruction, accomplished student learning, and positive feedback from course participants.

The output of CVIT will be a dL version of the POI which will serve two primary purposes:

1. supplement existing classroom instruction by providing students with engaging, interactive material, consistent practice and performance feedback

2. potentially replace the need for certain live or resident instruction with a dL component

USC ICT will partner with a broad cross-section of organizations, both within and outside of the Army, to leverage and expand upon existing areas of research related to codifying effective instructional strategies. Partners include TRADOC, the Maneuver Center of Excellence (MCOE), the Army Research Institute (ARI) and USC’s Rossier School of Education. We will analyze how to effectively incorporate specific instructor methodologies into the USC ICT evidence-based instructional and system design processes. In doing so, the essence and style of military instructors will be captured and codified in a manner that allows them to be delivered digitally without the need for a human instructor in the loop. The CVIT project is multi-disciplinary and will bridge research from the fields of virtual humans, intelligent tutoring, cognitive and learning sciences, military instruction and entertainment.

This research is funded by the U.S. Army as part of the core mission of the USC Institute for Creative Technologies.

9/2013


Cerebella and Generating Virtual Character Performance from Audio
Stacy Marsella

Cerebella was designed to fundamentally change the economics of creating virtual humans, lowering the barriers to entry for creating virtual human applications. It automates the generation of physical behaviors for virtual humans, including nonverbal behaviors (such as gestures, posture shifts and facial expressions) accompanying the virtual human’s dialog, responses to perceptual events (such as gazing at objects that pass by) and listening behaviors (such as nodding to signal that one is attending to the speaker). Modular processing pipelines transform the input into behavior schedules, written in the Behavior Markup Language (BML), which are then passed to a character animation system.

Designed as a highly flexible and extensible component, Cerebella realizes a robust process that supports a variety of use patterns. For example, to generate a character’s nonverbal behavior for an utterance, Cerebella can take as input detailed information about the character’s mental state (e.g., emotion, attitude) and communicative intent. In the absence of such information, Cerebella will instead analyze the utterance text and prosody to infer it. It can be used online to generate behavior in real time, or offline to generate nonverbal behavior schedules that are cached for later use. Offline use has allowed Cerebella to be incorporated into behavior editors that support mixed-initiative, iterative design of behavior schedules with a human author, whereby Cerebella and the author iterate over a cycle of automatic schedule generation and manual modification of the schedule.

At its simplest, one only need pass Cerebella the text of what the virtual human should say. Cerebella analyzes the dialog audio and text and then generates appropriate nonverbal behavior, thereby greatly reducing development time.
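To make the input and output concrete, here is a hypothetical sketch (not Cerebella's actual code) of turning utterance text into a minimal BML-style behavior schedule. The affirmative-word rule and the sync-point naming are invented for illustration; Cerebella's analysis of text and prosody is far more elaborate.

```python
# Toy illustration of the kind of output a BML-producing behavior
# generator emits: a speech element plus a head nod synchronized to
# an affirmative cue word. Rules and timing names are invented.
import xml.etree.ElementTree as ET

def schedule_behaviors(utterance):
    """Emit a minimal BML block: speech plus a nod on affirmative cue words."""
    bml = ET.Element("bml")
    speech = ET.SubElement(bml, "speech", id="s1")
    ET.SubElement(speech, "text").text = utterance
    # naive rule: affirmative words trigger a head nod at that word's position
    for i, word in enumerate(utterance.lower().split()):
        if word.strip(".,!?") in ("yes", "absolutely", "definitely"):
            ET.SubElement(bml, "head", type="NOD", start=f"s1:T{i}")
    return ET.tostring(bml, encoding="unicode")

print(schedule_behaviors("Yes, I can show you the exhibit."))
```

A schedule like this is what gets handed off to a character animation system such as SmartBody for realization.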

This basic research effort is funded under ICT’s basic UARC contract – along with funding from DARPA (as part of ICT’s work on the DCAPS project).

For questions about Cerebella and usage, please contact Stacy Marsella, Ph.D. at [email protected].

3/2013


CHAOS: Combat Hunter Action and Observation Simulation

The Combat Hunter Action and Observation Simulation (CHAOS) was part of ICT’s work on the Future Immersive Training Environment (FITE) Joint Capabilities Technology Demonstration (JCTD) and its goal of developing next-generation training for infantry small units. Located at the Infantry Immersion Trainer (IIT) at Camp Pendleton, CHAOS demonstrates advanced capabilities in immersive training for a mixed reality environment.

CHAOS incorporated mixed reality and immersive techniques, including the ability to interact with virtual characters and use of a storyline to drive participants towards specific choreographed experiences.

ICT developed a multi-room installation for the CHAOS environment that represents a house compound in Helmand Province, Afghanistan. The interior and exterior settings include a mix of real and virtual elements. ICT also developed virtual characters that can interact with the infantry squad and each other in one area of the compound. The virtual characters are part of a scenario that requires the squad to apply the techniques of tactical questioning, information gathering and acute observation in order to be successful in the CHAOS mission.

One of the key issues dealt with is decision-making under stress and chaos on the battlefield, when squads must be prepared for both lethal and non-lethal engagements. Instead of dictating right and wrong times to use lethal vs. non-lethal tactics, techniques and procedures (TTPs), squads are asked to apply good judgment about rules of engagement, escalation of force, and shoot-no shoot choices depending on the situation and the appropriateness. Key to the learning experience is the after action review (AAR) conducted outside of the CHAOS environment. The AAR is designed to help the squads understand their performance and give them confidence for future missions.

ICT collaborated with military subject matter experts, government personnel, training developers, consultants and contractors to develop the FITE scenarios, including the CHAOS scenario, which runs 5-10 minutes in length and is replayable to allow for multiple paths through the scenario. The scenario is designed to be set up in the IIT beforehand and continue in the IIT afterwards, as appropriate.

6/2012

2010-2011


Coach Mike: Virtual Human Museum Guide

Summary
A National Science Foundation-funded collaboration between the University of Southern California Institute for Creative Technologies and the Boston Museum of Science, Coach Mike is a virtual human who “works” at the museum’s Robot Park exhibit teaching visitors how to program a robot. A recent evaluation conducted by the Institute for Learning Innovation found that Coach Mike’s presence in Robot Park leads to more productive interactions with the exhibit.

Background
Coach Mike was inspired by Professor Michael Horn at Northwestern University, who created Robot Park so that museum visitors could have a fun and intuitive way to learn programming. Visitors to Robot Park who had the help of museum volunteers tended to stay longer and do more programming than those who did not have a guide. So, working with museum staff, ICT researchers built Coach Mike to simulate some of these interactions.

Technologies
A pedagogical manager acts as the hub by monitoring physical inputs from the exhibit, triggering virtual human actions, assessing user actions and providing learning support. Coach Mike’s animations run on ICT’s SmartBody system and in the Gamebryo game engine. He speaks via synthesized speech.

Coach Mike uses the techniques of artificial intelligence to support visitors: he estimates their knowledge, can judge when programs are correct (or not), and is willing to give feedback and suggestions. Pedagogical decisions are driven by a rule-based cognitive model of coaching that models a frequently changing world state. Built to simulate museum staff’s strategies, the model encodes a variety of tutoring and motivation tactics to orient people to the exhibit, encourage them to try new things, suggest specific problems, and give knowledge-based feedback on their programs. A general aim is to balance the importance of exploration and play with the goal of giving feedback and guidance for specific challenges. Coach Mike’s help is always delivered in entertaining and encouraging ways that seek to maximize visitor engagement.
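A rule-based coaching policy of this general flavor can be sketched as follows; the states, thresholds and messages are invented for illustration and are not Coach Mike's actual rules.

```python
# Hypothetical rule-based coaching policy in the spirit of a
# pedagogical manager: match the exhibit state against ordered rules
# and return a coaching move. All conditions and messages are invented.
def choose_move(state):
    """Return a (tactic, message) pair for the visitor's current state."""
    rules = [
        # ordered: the first matching rule wins
        (lambda s: s["idle_seconds"] > 30,
         ("orient", "Try snapping a command block onto the program strip!")),
        (lambda s: s["program_runs"] > 0 and not s["goal_reached"],
         ("feedback", "Close! Check which direction your robot turned first.")),
        (lambda s: s["goal_reached"],
         ("encourage", "Nice work! Want to try a harder challenge?")),
    ]
    for condition, move in rules:
        if condition(state):
            return move
    return ("observe", None)  # stay quiet and let the visitor explore

tactic, msg = choose_move({"idle_seconds": 5, "program_runs": 2, "goal_reached": False})
print(tactic)
```

Falling through to "observe" reflects the stated aim of balancing guidance against free exploration and play.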

6/2012

2010-present


Cognitive/Virtual Human Architecture
Paul S. Rosenbloom

Professor, USC Viterbi School of Engineering Computer Science Department
Project Leader, USC Institute for Creative Technologies

The goal of this effort is to develop a functionally elegant, grand unified, cognitive architecture in support of virtual humans (and hopefully intelligent agents/robots – and eventually even a new form of unified theory of human cognition – as well).

A cognitive architecture is a hypothesis about: (1) the fixed structures that provide a mind, whether in natural or artificial systems; and (2) how they work together – in conjunction with knowledge and skills embodied within the architecture – to yield intelligent behavior in a diversity of complex environments.

A functionally elegant architecture yields a broad range of capabilities from the interactions among a small general set of mechanisms, holding out the possibility of combining deep science with highly practical outcomes. A grand unified architecture integrates across higher-level thought processes plus any other aspects critical for successful behavior in human-like environments, such as perception, motor control, and emotions.

Our focus at this point is on the development of the Sigma (∑) architecture, which provides a hybrid (discrete+continuous) mixed (symbolic+probabilistic) approach, based on graphical models and piecewise-linear functions. To date, Sigma has generated results across memory and learning, problem solving and decision making, mental imagery and perception, and natural language.
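Sigma's use of piecewise-linear functions as a uniform representation can be illustrated in miniature: both a step-like "symbolic" function and a continuous one are stored as linear segments over regions, and combining them pointwise is the kind of operation message passing in a graphical model performs. This is a toy illustration of the representational idea only, not Sigma's implementation.

```python
# Toy illustration: a 1-D function stored as piecewise-linear segments
# (region start, region end, slope, intercept). Regions are half-open.

def evaluate(pieces, x):
    """Evaluate a piecewise-linear function at x (0 outside all regions)."""
    for lo, hi, slope, intercept in pieces:
        if lo <= x < hi:
            return slope * x + intercept
    return 0.0

# A "symbolic" indicator (1 on [2,3), else 0) and a continuous ramp both
# fit the same representation -- the point of a hybrid mixed approach.
indicator = [(2.0, 3.0, 0.0, 1.0)]
ramp      = [(0.0, 4.0, 0.25, 0.0)]

def product_at(f, g, x):
    """Pointwise product of two such functions, sampled at x."""
    return evaluate(f, x) * evaluate(g, x)

print(product_at(indicator, ramp, 2.5))  # 0.625: ramp value inside the region
print(product_at(indicator, ramp, 1.0))  # 0.0: outside the indicator
```

Because discrete symbols and continuous distributions share one representation, the same combination machinery can serve both memory retrieval and probabilistic inference, which is the "functionally elegant" aspiration described above.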

Until now, ICT’s integrated virtual human systems have been developed within version 7 of the Soar cognitive architecture. Soar has served the virtual human research efforts well, and the SASO system to this day remains one of the most sophisticated integrated virtual human systems worldwide. However, the underlying cognitive architecture is showing its age, and despite its considerable strengths it is becoming increasingly difficult to adapt the system to current-day needs.

The research and development of Sigma will allow us to leverage advances in research, knowledge and technology in order to create a novel cognitive architecture that is free of legacy drawbacks and ideally suited for the further advancement of virtual humans, to yield systems that are broadly, deeply and robustly cognitive, interactive with their physical and social worlds, and adaptive given their interactions and experience.

This effort, funded under ICT’s basic UARC contract – along with funding from AFOSR and ONR – supports several TRADOC Warfighter Outcomes (WFO) directly and indirectly, in particular where training systems call for virtual characters.

3/2013

COSMOS/SIM
Computational Simulation and Modeling of Society / Social Intelligence Modeling

The Computational Simulation and Modeling of Society (COSMOS) project researches algorithms and models for social simulations that can faithfully model real-world social interactions. Beginning in Government FY2014, this work will be known as the Social Intelligence Modeling (SIM) project.

COSMOS has explored the modeling and simulation of both small- and large-scale social interaction, based on the hypothesis that social entities, whether individuals or groups, can be modeled as goal-seeking decision-makers that have beliefs about other entities. Based on this hypothesis, PsychSim, a framework for crafting social simulations, has been developed and used in a range of transitioned applications such as ICT’s UrbanSim and BiLAT projects.
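The hypothesis above, social entities as goal-seeking decision-makers with beliefs about other entities, can be sketched as an agent that chooses its action by simulating its mental model of the other party. The scenario, actions and payoffs below are invented for illustration; PsychSim's actual machinery (POMDPs, recursive theory of mind) is far richer.

```python
# Minimal sketch of a goal-seeking agent that decides by simulating its
# *belief* about another agent's response (one-level theory of mind).
# All names, actions and payoff numbers are hypothetical.

class Agent:
    def __init__(self, name, payoff, belief_about_other=None):
        self.name = name
        self.payoff = payoff              # payoff[(my_act, their_act)]
        self.belief = belief_about_other  # my mental model of the other agent

    def best_response(self, their_act):
        acts = {a for a, _ in self.payoff}
        return max(acts, key=lambda a: self.payoff[(a, their_act)])

    def decide(self):
        """Pick the action with the best payoff, assuming the other agent
        (as I model them) best-responds to it."""
        acts = {a for a, _ in self.payoff}
        def value(a):
            their = self.belief.best_response(a)
            return self.payoff[(a, their)]
        return max(acts, key=value)

# Hypothetical negotiation: "cooperate" vs. "defect" payoffs per side.
leader_payoff = {("cooperate", "cooperate"): 3, ("cooperate", "defect"): 0,
                 ("defect", "cooperate"): 4, ("defect", "defect"): 1}
model_of_elder = Agent("elder-model",
                       {("cooperate", "cooperate"): 3, ("cooperate", "defect"): 4,
                        ("defect", "cooperate"): 0, ("defect", "defect"): 1})
leader = Agent("leader", leader_payoff, belief_about_other=model_of_elder)
print(leader.decide())  # -> "defect": as modeled, the elder cooperates either way
```

The interesting behavior comes from the belief model: change the leader's model of the elder and the leader's own decision changes, which is what lets such simulations explore the indirect social consequences of actions.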

New research directions in the SIM project will:

• Develop general-purpose algorithms that can suggest to an author possible changes to the scenario models that remove the discrepancies between the model and the pedagogy or real-world data.

• Compare models of action effects against real-world data in order to facilitate data-driven approaches to modeling.

• Model the cognitive consequences of emotion and coping behavior.

• Integrate with Cerebella, ICT’s new and more capable behavior-generating system that replaces the previously used Non-Verbal Behavior Generator.

• Provide agent modeling for the Cognitive Gym concept, wherein future trainees would be able to engage in a series of standard leadership exercises (such as those used in both business and military leadership courses) but with cognitively, socially and emotionally competent virtual role players.

Due to the increased use of simulation methods in the study of social systems, our efforts aim to benefit fundamental research in both social science and computer science while leading to improved simulations.

This research is funded by the U.S. Army as part of the core mission of the USC Institute for Creative Technologies.

9/2013

DICE-T
Dismounted Interactive Counter-IED Environment for Training

The Dismounted Interactive Counter-IED Environment for Training (DICE-T) effort is focused on developing prototype training applications designed to introduce, reinforce and assess dismounted training concepts and principles in an engaging and immersive environment before live training and deployment. The DICE-T experience uses a combination of narrative video and immersive gameplay to deliver over-arching “first principles” related to threat assessment, especially as they relate to IEDs.

The DICE-T effort provides two prototype deliverables: a single-player touch-screen tablet version (Android or iPad) and a multiplayer Red versus Blue environment (PC-based). The tablet version can be used standalone or can train a squad at one time in a kiosk setup. The multiplayer version accommodates a squad.

The system sends trainees on various interactive missions that emphasize critical components of dismounted patrol: planning a route, executing a patrol and countering threats, and mission debrief/AAR. The game scenarios represent real-world dismounted patrol situations, and trainees receive a video mission brief describing the current threats in the area. As training progresses, difficulty and complexity of the missions escalate as more information is provided in the brief. Each mission begins with a video that introduces threat-assessment concepts and highlights specific lessons for each phase. This material is based on established learning objectives and represents the “crawl” phase of a crawl-walk-run training continuum.

One of the goals for DICE-T is to help novices think like experts before they are deployed. Trainees use what they have learned in the classroom, think about what they would do during live training, and get a deeper understanding of the underlying principles of dismounted counter-IED behavior. Using evidence-based practices and assessment techniques for adult instruction, DICE-T provides an engaging element to traditional classroom instruction, and better prepares trainees for live exercises.

An initial prototype was delivered in December 2011. A more advanced version with DSTS content integration was fielded for the Bold Quest multinational training exercise at Ft. Benning in September 2012, with the Red versus Blue multiplayer prototype targeted for December 2013.

This project is funded by the Joint Improvised Explosive Device Defeat Organization.

5/2013

2011-present

ELITE
Emergent Leader Immersive Training Environment

The Emergent Leader Immersive Training Environment (ELITE) targets leadership and basic counseling for junior leaders in the U.S. Army. The ELITE experience incorporates a virtual human, classroom response technology and real-time data tracking tools to support the instruction, practice and assessment of interpersonal communication skills.

While the Army recognizes that communication skills are important, junior leaders often receive little or no opportunity to practice important interpersonal skills. If they do receive practice, live role-play sessions may be used. In an effort to provide a structured framework for teaching and practicing communication skills, ELITE replaces one human role-player with a life-sized virtual human. The virtual human component addresses issues inherent to live role-play practice sessions, which cannot be easily standardized, tracked and assessed following the interaction.

After receiving up-front instruction and example demonstrations on basic strategies for helping a subordinate with a performance problem or a personal issue, one student from a class of 50 is selected to speak to the virtual human, and the rest of the students participate by selecting the option they would choose using remote-controlled clickers. ELITE’s instructional framework, interactive response system and visual data tracking are engaging tools to facilitate practice and class discussion, including an instructor-led After Action Review, before junior leaders reach their first assignments. Ultimately, the ELITE experience provides several interactive case studies and a framework for learning how to employ interpersonal skills related to basic counseling. To allow for greater access to ELITE training, a laptop version called ELITE Lite has been developed.

ELITE is a partnership with the U.S. Army Simulation and Training Technology Center (STTC), Orlando, Florida, and the Maneuver Center of Excellence (MCoE), Fort Benning, Georgia. The first ELITE prototype was installed at Fort Benning’s MCoE in late 2011. An ELITE demonstration was part of the Army Exhibit at AUSA in October 2011 and the STTC exhibit at I/ITSEC in December 2011. ICT developed a similar Navy prototype, the Immersive Naval Officer Training System (INOTS), which is in use at Officer Training Command Newport. ELITE’s technology and instructional approach have been leveraged by the USC School of Social Work’s Center for Innovation and Research on Veterans & Military Families (CIR) to train Motivational Interviewing through the Motivational Interviewing Learning Environment and Simulation (MILES) effort.

9/2013

2010-present

ELITE Lite

The Emergent Leader Immersive Training Environment (ELITE) Lite provides a laptop training capability to teach interpersonal skills to United States (US) Army junior leaders by presenting real-world instructional scenarios in an engaging, self-reinforcing manner. The purpose of the training experience is to provide junior leaders with an opportunity to learn, practice and assess interpersonal communication skills for use in basic counseling. The ELITE content incorporates Army-approved leadership doctrine (FM 6-22), evidence-based instructional design methodologies and ICT research technologies, such as virtual humans and intelligent tutoring, to create a challenging yet engaging training experience.

The ELITE Lite software has three Officer scenarios and three NCO scenarios. All of the scenarios are based on real-world counseling issues such as financial troubles, post-deployment readjustment and alcohol-related performance issues. These scenarios offer students a chance to practice the interpersonal communication skills they learn during the ELITE Lite instruction. The package includes three phases: Up-front Instruction, Practice Environment and After Action Review (AAR).

The total training time for ELITE Lite is one to two hours, varying with student experience level, performance and engagement. Some students may take time to review missed concepts based on how well they respond to quiz questions; some may choose to watch all suggested training vignettes and comparisons; some may thoroughly engage in the AAR after an interaction in the practice environment; and some may choose to practice all three scenarios for their given rank.

In the contemporary US Army, leaders must be prepared not only for the tactical side of leadership but also for its personal, softer side. Effective communication between leaders and their subordinates is paramount to maintaining the combat effectiveness of the force, and ELITE Lite offers young US Army leaders a unique opportunity to learn and practice interpersonal skills so they are better prepared for the interactions they will encounter.

11/2013

2012-present

Figure 1: Up-front Instruction

Figure 2: Practice Environment

Figure 3: After Action Review

Enhanced Environment for Communication and Collaboration (E2C2)

E2C2 is funded by ONR Swampworks and executed by the USC Institute for Creative Technologies. The E2C2 project seeks to answer a simple question with complicated answers: what will a future office/work environment look like in 2020, and how will it enable people to function better? E2C2 is essentially creating new models for what could be termed “Communication Operating Systems” (COS). The initial E2C2 COS (codename BlueShark) will be a mix of conceptual frameworks, hardware and software that will better enable decision makers and help optimize operator performance.

BlueShark is an experiment exploring a variety of ways to collaborate more efficiently. Some experiments may be aspirational, since the required technology has not yet been invented; some may use cutting-edge technology not quite ready for prime time; and some may apply new-to-market technologies that simply have not been brought to the workforce experience yet. The best may be experiments driven by “the audience,” combining different technologies in different ways to better meet a given objective.

7/2013

The overall objective of BlueShark is to combine the creative minds of the Institute for Creative Technologies (the U.S. Army University Affiliated Research Center for virtual systems), USC, Hollywood studios, and U.S. Navy and Marine Corps personnel. The project takes advantage of our country’s youth’s recent education experience, both formal (schools) and informal (self-taught), and the rapid advancement of information technologies, applying them to existing naval environments in an experimental space. Findings are fed back to technology creators and designers, ship designers, and mission planners on better, more efficient ways to collaborate, train and work. Individual spin-off experiments may take place on a case-by-case basis on naval assets for detailed long-term evaluations.

Incorporating the best tools and techniques ICT has to offer, this exciting project will utilize virtual and augmented reality, virtual humans, artificial intelligence, human-computer interfaces and other research and prototypes.

Learn more at e2c2.ict.usc.edu.

Before

After (not a computer rendering)

Game-Based Rehabilitation
Game-based Rehabilitation Tools using Microsoft Kinect

The Game-Based Rehabilitation (“Games for Rehab”) research effort at ICT, led by Dr. Belinda Lange, explores the research and development of interactive tools for therapists and their clients who are recovering from trauma (such as stroke or traumatic brain injury), or are members of high-risk groups, such as the elderly at risk for falls. Games for Rehab consists of a series of projects that draw upon video game mechanics, techniques and hardware to leverage the potential of personalized, interactive environments to improve therapy, both in a clinical setting and in a patient’s home.

Anyone who has experienced traditional physical therapy knows that, after a visit to the therapist, patients are usually sent home with a list of exercises to practice. These prescriptions provide limited motivation and engagement, and they give little feedback on how well a person is performing the exercises. It is also difficult for the clinician to ensure that the patient is actually adhering to the program.

Game-Based Rehab can address these gaps and offer potential solutions. It is increasingly common to find commercial video game systems and game titles used for rehabilitation in nursing homes and physical therapy clinics. It is important to keep in mind, however, that these games are designed for entertainment and are not always appropriate for people in a rehabilitation context. The games often ramp up in difficulty quickly, present feedback to the user that may not be relevant (“You failed!”), and do not allow activities to be customized to a user’s individual abilities or circumstances.

The Game-Based Rehab group develops applications and exercises that a clinician can use with their patients based on their individual goals and abilities. Using affordable, off-the-shelf devices like the Microsoft Kinect, Dr. Lange and her team have created game-like rehabilitation and exercise environments that:

• set goals based on the user’s range of motion and ability

• are engaging and motivating to the user

• allow health care professionals to precisely deliver, control, and monitor their clients’ efforts and progress
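The goal-setting idea, targets scaled to a user's measured range of motion, can be sketched simply. The calibration values and the 80% challenge policy below are illustrative assumptions, not the actual system's behavior.

```python
# Illustrative sketch: place game targets inside a patient's measured
# range of motion (ROM), so difficulty adapts to individual ability.
# The sample angles and the 80% challenge policy are hypothetical.

def calibrate(samples_degrees):
    """From a short calibration capture (e.g., skeleton-tracked joint
    angles), take the comfortable ROM as the observed min and max."""
    return min(samples_degrees), max(samples_degrees)

def place_targets(rom, n_targets, challenge=0.8):
    """Spread n_targets evenly across `challenge` (e.g., 80%) of the ROM,
    centered, so goals stay achievable but still demand effort."""
    lo, hi = rom
    span = (hi - lo) * challenge
    start = lo + (hi - lo - span) / 2.0
    step = span / (n_targets - 1)
    return [round(start + i * step, 1) for i in range(n_targets)]

rom = calibrate([12, 20, 35, 60, 88, 95])  # degrees of shoulder flexion
print(place_targets(rom, 4))
```

Because the same calibration can be rerun at each session, target placement tracks a patient's recovery, and the measured angles double as progress data the clinician can monitor remotely.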

2010-present

2/2013

Graphics Lab
Graphics Research

The ICT Graphics Lab develops new techniques for scanning 3D objects and for creating and displaying photorealistic graphics of people, objects and environments. Research includes a specular object scanner for capturing reflectance properties, game-ready photoreal facial animation with pore-accurate detail, and an automultiscopic 3D display for immersive holographic viewing.

SELECTED RESEARCH PROJECTS

Glasses Scanner
Digitally recording realistic models of real-world objects is a long-standing problem in computer graphics, with applications in online commerce, visual effects, and interactive entertainment. The ICT Graphics Lab has developed a novel technique for acquiring the geometry and spatially-varying reflectance properties of specular 3D objects by observing them under continuous spherical harmonic illumination conditions. This technique has been used to digitize the shape and reflectance of a variety of objects difficult to acquire with other techniques.

Digital Ira
In collaboration with Activision, the Graphics Lab created a real-time, photoreal digital human face which can be seen from any viewpoint and in any lighting condition. The virtual human face moves realistically and weathers inspection even in a tight closeup. In addition, the facial animation demo runs in a game-ready production pipeline. The lab continues to focus on simulating eyelid bulge, displacement shading, ambient transmittance and other dynamic effects.

Pico Array
The USC ICT Graphics Lab has developed a dense projector display prototype that is optimized in size and resolution to display an automultiscopic, life-sized 3D human face. The lab is also developing the next-generation automultiscopic projector array for full-body holographic display. The goal is to recreate a better virtual person-to-person communication experience and to inform current and future work in interactive holographic displays. Applications include 3D teleconferencing and characters for education and entertainment.

These projects are funded by the U.S. Army as part of the core mission of the USC Institute for Creative Technologies and also through collaborations with industry partners including Activision and 1-800CONTACTS.com.

9/2013

Gunslinger

Imagine stepping into a portal to another place and time. You appear in a darkened room, hearing the tinkling of an upright piano, the clinking of glasses, and horses and coaches in the distance. As your eyes focus, you make out a long, wooden bar full of glasses and whiskey bottles. Feeling a weight around your hips you realize you are wearing a holster with a six-shot revolver. A conspicuous metal star sits pinned to your chest. The star says “U.S. Ranger.” Suddenly you realize someone is staring at you from across the room. He looks like a bartender out of an old American western movie. “Howdy Ranger,” he says. “You’re here to rid our town of that evil bandit, Rio Laine, right?” You feel several eyes turn to you, waiting expectantly for an answer…

Welcome to Gunslinger, an interactive-entertainment application of virtual humans that transforms this iconic movie scene into a vivid semblance of reality.

Gunslinger combines virtual human technology with Hollywood storytelling and set building to create an engaging, mixed-reality, story-driven experience in which a single participant can play the hero in a Wild West setting, interacting both verbally and non-verbally with multiple virtual characters.

The Gunslinger project also pushes the frontier of virtual human research by proposing a new architecture for story-driven interaction. The system combines traditional question-answering dialogue techniques with a capability for biasing question understanding and dialogue initiative through an explicit story representation. The system incorporates advanced speech recognition techniques and visual sensing to recognize multimodal user input. It further extends existing behavior generation methods such as BEAT and SmartBody to drive tightly coupled dialogue exchanges between characters. Together, these capabilities seek a balance between open-ended dialogue interaction and carefully crafted narrative.
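The idea of biasing question understanding through an explicit story representation can be sketched as follows: the same ambiguous utterance resolves to different interpretations depending on the current story beat. The story beats, intent names and scores here are invented for illustration and do not reflect Gunslinger's actual models.

```python
# Hypothetical sketch: rank candidate interpretations of a user utterance
# by a base language score *plus* a bias from the current story state,
# so the narrative steers ambiguity resolution.

# An ambiguous utterance with two equally likely base interpretations.
CANDIDATES = {
    "where is he": [
        ("ask_location_bandit",    0.5),
        ("ask_location_bartender", 0.5),
    ],
}

# Story representation: each beat boosts the interpretations it makes salient.
STORY_BIAS = {
    "bandit_hunt":  {"ask_location_bandit": 0.3},
    "intro_saloon": {"ask_location_bartender": 0.3},
}

def interpret(utterance, story_beat):
    """Return the intent with the highest biased score."""
    scored = [(score + STORY_BIAS[story_beat].get(intent, 0.0), intent)
              for intent, score in CANDIDATES[utterance]]
    return max(scored)[1]

print(interpret("where is he", "bandit_hunt"))   # -> ask_location_bandit
print(interpret("where is he", "intro_saloon"))  # -> ask_location_bartender
```

The same additive-bias scheme can also drive initiative: when no interpretation scores well even after biasing, the story representation can nominate a character-initiated line that moves the narrative forward instead.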

6/2012

2007-present

The goal of the Immersive Commanders Environment (ICE) project is to develop a suite of tools that will help future commanders practice the art and science of mission command. At the heart of ICE is a comprehensive, multi-layered story line that will link the various training applications into a single, simulated command experience. A central student model will ‘over-watch’ the entire experience and provide assessment, feedback and mentoring to help new commanders grow and develop enhanced skills from the training.

The U.S. Army Command and General Staff College (CGSC), School for Command Preparation (SCP), part of the Combined Arms Center (CAC) at Fort Leavenworth, has defined the requirements for ICE as part of its effort to implement the Army Learning Model (ALM) 2015 and revolutionize the way future commanders are trained in an institutional setting; however, it will also be adaptable as a part of continued individual training in operational unit settings.

The effort in FY13 focused on developing a design for one of the ICE modules: the MultiLat tool. ICT developed a paper-and-pencil prototype that could serve as the basis for a software tool. This research was funded by the U.S. Army as part of the core mission of the USC Institute for Creative Technologies. At present, there is no additional funding for ICE or MultiLat development.

9/2013

ICE
Immersive Commanders Environment

Immersive & Cognitive Training Aids
ICT leverages the capital of the entertainment and academic communities to develop immersive training for a range of domains. This unique approach has led to the creation of an array of innovative applications that have transitioned Army-wide.

SELECTED PROJECTS

Joint Fires and Effects Trainer System (JFETS)
JFETS is a suite of state-of-the-art immersive virtual reality environments designed to help Soldiers make critical decisions under stress by recreating the conditions of current operational settings, including heat, wind, explosions, human distress noise, and snipers. JFETS also provides artificial intelligence behaviors for insurgent forces and realistic civilians. Installed at Ft. Sill in 2004, JFETS has trained tens of thousands of troops and is currently used by members of the U.S. Army and Marine Corps.

Cognitive Air Defense – Training System (CAD-TS)
The CAD-TS Engagement Control Station Simulation prepares Soldiers to use the engagement operations center of a U.S. Army Patriot missile defense firing unit. Installed at Ft. Sill, it is designed to help bridge the gap between recognizing the 2D information on the radar interface and understanding that information through realistic visualizations of the 3D airspace. The CAD-TS ECS2 trains and assesses Soldiers’ abilities to recognize and respond to perceived threats with complete situational awareness.

Distribution Management Cognitive Training Initiative (DMCTI)
Winner of a 2008 Army Modeling and Simulation Award for Army-wide team training, the DMCTI prototype application trains U.S. Army logistical planners and supports understanding of the Army distribution management process. The DMCTI promotes the development of strategies for best exploiting the capabilities of logistics management systems. A post-exercise review provides students with an evaluation as well as a comparison of their performance against experts in the field.

These projects were sponsored by the U.S. Army Simulation, Training and Technology Center (STTC), along with industry partners Game Production Services, Quicksilver Software, Research Analysis and Maintenance, and Stranger Entertainment.

6/2012

INOTS
Immersive Naval Officer Training System

The Immersive Naval Officer Training System (INOTS) targets leadership and basic counseling for junior leaders in the U.S. Navy. The INOTS experience incorporates a virtual human, classroom response technology and real-time data tracking tools to support the instruction, practice and assessment of interpersonal communication skills. While the Navy recognizes that communication skills are important, junior leaders often receive little or no opportunity to practice important interpersonal skills prior to deployment. If they do receive practice, live role-play sessions may be used. In an effort to provide a structured framework for teaching and practicing communication skills, INOTS replaces one human role-player with a life-sized virtual human. The virtual human component addresses the issues inherent to live role-play practice sessions that cannot be easily standardized, tracked and assessed following the interaction. After receiving upfront instruction and demonstrations on basic strategies for helping a subordinate with a performance problem or a personal issue, one student from a class of 50 is selected to speak to the virtual human, and the rest of the students participate by selecting the option they would choose using remote-controlled clickers. INOTS’ instructional framework, interactive response system and visual data tracking are engaging tools to facilitate practice and class discussion, including an instructor-led after action review, before junior leaders reach their first assignments. Ultimately, the INOTS experience provides several interactive case studies and a framework for learning how to employ interpersonal skills related to basic counseling. INOTS is a partnership with the Office of Naval Research (ONR), Naval Service Training Command (NSTC) and the Officer Training Command Newport (OTCN). 
INOTS was installed at OTCN in August 2011, where it currently supplements the Division Officer Leadership Course (DOLC) in support of the Officer Candidate School (OCS) and Officer Development School (ODS). ICT developed a similar Army prototype, the Emergent Leader Immersive Training Environment (ELITE), which is in use at the Maneuver Center of Excellence (MCoE) at Fort Benning, Georgia.

2010-present

6/2012

Learning Sciences

Learning Sciences involves the research and application of artificial intelligence and video game techniques to solve educational challenges, including fostering complex problem solving skills and promoting engagement. The goal of these efforts is computer-mediated instruction that understands and leverages how people learn and what teaching and tutoring methods are most effective.

Research efforts encompass the design, authoring, guidance, mentoring and assessment involved in computer-mediated instruction, including educational video games, and how people best recall and retain the lessons they are taught.

SELECTED RESEARCH AREAS

Intelligent Tutoring Systems
This work seeks to maximize the value of virtual human-based training experiences through the use of explainable artificial intelligence, guidance and feedback, automated after-action reviews, and intelligent control of virtual human behaviors. It also seeks to develop a suite of authoring components specifically for use with virtual humans that is designed with learning in mind from the outset.

Informal Science Education
Coach Mike is a National Science Foundation-funded collaboration between the University of Southern California Institute for Creative Technologies and the Boston Museum of Science that teaches visitors how to program a robot. Not only can he guide people to get the robot turning, buzzing, singing and more, but he can also describe how the exhibit actually works and create specific challenges for guests to solve. He is there to explain, encourage and give help when needed. A general aim is to balance the importance of exploration and play with the goal of giving feedback and guidance for specific challenges. Thus, Coach Mike's help is always delivered in entertaining and encouraging ways that seek to maximize visitor engagement.

6/2012

The Mixed Reality Lab (MxR) at the USC Institute for Creative Technologies explores techniques and technologies to improve the fluency of human-computer interactions and create visceral synthetic experiences.

Mark Bolas, the MxR Lab's director, is also a professor in the Interactive Media Division at the USC School of Cinematic Arts. His research and prototypes focus on immersive systems for education, training and entertainment that incorporate both real and virtual elements. Projects push the boundaries of immersive experience design through virtual reality and alternative controllers.

MxR's suite of low-cost immersive viewers, including the Socket HMD, the Socket Mobile (FOV2GO) and the iNVerse immersive reader, enables the creation of 3-D, immersive virtual and augmented reality experiences using smartphones and tablets. These low-cost, lightweight systems can be used to create portable virtual reality applications for training, education, health and fitness, entertainment and more. These software and hardware platforms follow an open-source design philosophy that helped inform the design of the new Oculus Rift HMD.

1/2013

Low-Cost Immersive Viewer
ICT Mixed Reality Lab

MCIT: Mobile Counter-IED Interactive Trainer

With MCIT, ICT took a “first principles” approach to counter-IED training, focusing on understanding how terrain can be used as a weapon, reading atmospherics, and helping trainees make predictions and act proactively rather than reactively. The target audience includes lower-level enlisted personnel (E1-4,5) and lower-level officers (O1-2).

MCIT consists of a series of four modified 40-foot Conex boxes (CBs). The first three CBs introduce the various types of IEDs and familiarize troops with how insurgents utilize them, from both the BLUFOR and OPFOR viewpoints. Narrative story vignettes from an insurgent bomb maker and a U.S. Soldier or Marine help deliver the training materials and guide trainees through the self-paced experience. In the last box, trainees take on the roles of an insurgent ambush team and a mounted patrol in an interactive red vs. blue exercise. A debrief follows the experience to assess lessons learned.

Working with educational psychologists and military subject matter experts, ICT also developed the Experiential Counter-IED Immersive Training Environment (ExCITE) that combines a rich suite of physical, visual, aural, and virtual elements including narrative video and multiplayer red vs. blue game for deliberate practice.

Several prototypes were delivered in 2009. ExCITE content was deployed to follow-on production systems (not built by ICT), with another nine systems deployed. ICT continues to work on new scenarios, content for training around dismounted counter-IED issues, and further enhancements and improvements. ICT also designed and built a multi-cultural, multi-language variant (MCIT-MC) that was deployed in Germany in December 2010, localizing the video and software with versions in Bulgarian, Polish and Romanian. To date, over 30,000 troops have been trained with MCIT and over 1,300 Polish troops have been trained with MCIT-MC.

MCIT was accomplished through ICT's strategic partnership with A-T Solutions, developer of the original MCIT concept, in support of JIEDDO JCOE and the Army Simulation, Training and Technology Center (STTC), with industry partners Psychic Bunny, Blind Spots Content, Isolated Ground, Quicksilver Software and Stranger Entertainment.

6/2012

2009-present

Medical Virtual Reality
The ICT MedVR Lab explores and evaluates areas where VR can add value over traditional assessment and intervention approaches. Areas of specialization include using VR for mental health therapy, motor skills rehabilitation, cognitive assessment and clinical skills training.

SELECTED RESEARCH PROJECTS

SimCoach
SimCoach is a web-based virtual human designed to provide an anonymous and accessible way to overcome some of the existing resistance to seeking care, to facilitate communication about mental health issues, and to help soldiers, veterans and their families to realize that there are resources available for them. SimCoach can ask a series of questions about the user's symptoms and provides access to relevant resources.

Virtual Iraq/Afghanistan
Virtual Iraq/Afghanistan delivers virtual reality exposure therapy for treating post-traumatic stress. Currently in use at over 60 clinical sites, including VA hospitals, military bases and university centers, the Virtual Iraq/Afghanistan exposure therapy approach has been shown to produce a meaningful reduction in PTS symptoms.

Stress Resilience In Virtual Environments (STRIVE)
STRIVE is a pre-deployment approach to understanding and training troops for combat stress. It includes a realistic combat experience portrayed within a virtual reality story and an interaction with an intelligent virtual mentor that can explain how the brain and the body react to stress and present relevant exercises for managing it.

Games for Rehabilitation
ICT's Games for Rehab Lab focuses on the creation of virtual reality and game-based tools that can improve both assessment and training. Current prototypes include Jewel Mine, a rehabilitation therapy tool designed to motivate patients recovering from stroke, traumatic brain injury or spinal cord injury.

Virtual Patients
This effort builds virtual standardized patient applications for clinician training that integrate models of emotion and personality into the language and state of the character, and investigates the use of dramatic interactive narratives involving virtual patients in order to elicit engagement in learning.

6/2012

Summary
The University of Southern California Institute for Creative Technologies' (ICT) pioneering efforts within DARPA's Detection and Computational Analysis of Psychological Signals (DCAPS) project encompass advances in the artificial intelligence fields of machine learning, natural language processing and computer vision. These technologies identify indicators of psychological distress such as depression, anxiety and PTSD, and are being integrated into ICT's virtual human application to provide healthcare support.

Goals
This effort seeks to enable a new generation of clinical decision support tools and interactive virtual agent-based healthcare dissemination and delivery systems that can recognize and identify psychological distress from multimodal signals. These tools would give military personnel and their families better awareness of and access to care while reducing the stigma of seeking help.

For example, the system’s early identification of a patient’s high or low distress state would generate the appropriate information that could help a clinician diagnose a potential stress disorder. User-state sensing can also be used to create long-term patient profiles that would be used to assess change over time.

Capabilities
ICT is expanding its expertise in automatic human behavior analysis to identify indicators of psychological distress in people. Two technological systems are central to the effort.

Multisense automatically tracks and analyzes, in real time, facial expressions, body posture, acoustic features, linguistic patterns and higher-level behavior descriptors (e.g. attention and fidgeting). From these signals and behaviors, Multisense infers indicators of psychological distress that directly inform SimSensei, the virtual human.

SimSensei is a virtual human platform able to sense real-time audio-visual signals captured by Multisense. It is specifically designed for healthcare support and builds on more than a decade of ICT expertise in virtual human research and development. The platform enables an engaging face-to-face interaction in which the virtual human automatically reacts to the perceived user state and intent through its own speech and gestures.

DCAPS is not aimed at providing an exact diagnosis, but at providing a general metric of psychological health.

10/2013

Multisense and SimSensei
A Multimodal Research Platform for Real-Time Assessment of Distress Indicators

Mixed Reality Lab (MxR)
Mixed Reality Research and Development

The Mixed Reality Lab (MxR) at ICT researches and develops tools and techniques to improve fluency of human-computer interaction and create visceral synthetic experiences, immersive educational systems and training simulations that incorporate real, virtual and augmented elements. The lab conducts basic and applied research in areas of mixed, virtual and augmented reality, with attention toward creating prototype solutions as well as transition points for commercial and research vectors. A “skunkworks” design and development approach is guided by studies of human perception, cognition, and social interactions. Projects include:

Sharing Space
MxR develops immersive displays to transport virtual characters into the real world and make virtual humans seem like real people who occupy real physical space: virtual characters make direction-correct eye contact and sound. The lab designs, develops and evaluates innovative immersive systems that incorporate advanced visual, audio and haptic sensors and transducers. This approach leads to innovative display technologies and techniques that induce users to react in the realistic and naturalistic ways needed for effective training and learning experiences.

Stretching Space
This project seeks to overcome the physical space limitations of natural locomotion through research and development of redirected walking techniques: perceptual illusions that decouple a user's virtual motion from their path in the real world. Results from our perceptual studies have demonstrated that redirected walking can preserve the feeling of moving naturally through a stable virtual world while simultaneously keeping the user physically constrained within the boundaries of the real workspace, and have led us to explore and develop more flexible methodologies for applying these techniques in practical training environments.
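
The steering logic behind redirected walking can be sketched in a few lines. This is an illustrative toy, not ICT's published algorithm: the function name `redirected_rotation` and the gain values are hypothetical placeholders, and real systems choose gains from perceptual-threshold studies.

```python
def redirected_rotation(real_delta_deg, turn_is_toward_center,
                        g_toward=0.85, g_away=1.2):
    """Map a real head-yaw change (degrees) to the yaw change rendered in VR.

    The gain is the ratio virtual/real. When the user's physical turn is
    toward the tracked area's center, the virtual scene under-rotates
    (gain < 1), so the user physically turns farther toward the center
    than they perceive; turns away from the center are over-rotated
    (gain > 1). Kept small enough, the manipulation goes unnoticed while
    the physical path curls back into the workspace. The gain values
    here are illustrative, not measured detection thresholds.
    """
    gain = g_toward if turn_is_toward_center else g_away
    return gain * real_delta_deg
```

For example, a real 10-degree turn toward the center would be rendered as an 8.5-degree virtual turn, nudging the user to physically rotate a little farther in the helpful direction.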

Emerging Tech
The MxR Lab leverages consumer technologies and prototypes innovative solutions to provide quality immersive training that is both accessible and low-cost. The lab's open-source initiative has led to the use of smartphone- and tablet-based immersive viewers for training and data visualization, as well as the commercial release of the low-cost, high field-of-view Oculus Rift HMD.

The Mixed Reality Research and Development group receives mission and customer funding through ICT’s UARC contract to do basic and applied research and advanced technology demonstrations.

9/2013

Narrative

The Narrative Group at ICT investigates storytelling and the human mind, exploring how people experience, interpret, and narrate the events in their lives. We pursue this research goal using diverse interdisciplinary methods, including the large-scale analysis of narrative in social media, the logical formalization of commonsense knowledge, and the creation of story-based learning environments.

Narrative analysis
The rise of social media has created new opportunities for an empirical science of storytelling. Over the last few years, we have collected tens of millions of personal stories from Internet weblogs for use in a wide variety of analyses. We have studied the health information needs expressed by parents of children with cancer, gender differences in the way people describe strokes and heart attacks, and cross-cultural differences in the way people frame the events of their lives in terms of sacred values.

Narrative intelligence
A central engineering challenge in the creation of human-like artificial intelligence is enabling commonsense reasoning about the everyday world. At ICT, we pursue two contrasting approaches to the problem of acquiring commonsense knowledge. On one hand, we employ traditional knowledge engineering methods to author formal commonsense content theories in first-order predicate logic. On the other, we attempt to harvest commonsense knowledge directly from the millions of personal stories that people post to their Internet weblogs.

Story-based learning environments
Immersive training simulations provide environments where learners can acquire and practice cognitive skills through guided, interactive experiences. Crafting effective simulations is still more of an art than a science, requiring the collaborative efforts of writers, experts, instructors, technologists, and learning science researchers. We support these creative efforts through the development of authoring tools and methodologies, helping teams articulate instructional objectives and construct scenario content through the analysis of real-world experiences narrated by practitioners.

6/2012

Natural Language Dialogue Group

The primary goal of the ICT Natural Language Dialogue Group is to create computational models of purposeful communication between individuals. These models can be used for analyzing the structure and content of human conversation and to create artificial agents who can engage in human-like interaction with people and other agents. The group has research and development expertise in a range of enabling areas, including dialogue systems, spoken and natural language understanding, dialogue management, natural language generation, speech synthesis, and evaluation of dialogue systems.

Selected Research Topics

Typology of Dialogue Genres – We investigate a broad range of dialogue situations, which differ in the content, structure and meanings expressed, as well as the roles, relationships and individual and joint purposes of the participants. We have created virtual agents who act as interviewers, interview subjects, collaborative partners, training exercise role players, and non-cooperative negotiators. We also examine the best-performing algorithms and system architectures for each of these genres.

Incremental Dialogue – We create models of dialogue that allow understanding of and response to a person's utterance while they are still speaking, which reduces overall processing latencies and allows richer and more human-like feedback.
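
A minimal sketch of incremental understanding: consume the recognizer's growing partial hypothesis and commit to a dialogue act as soon as there is enough evidence, rather than waiting for the end of the utterance. Everything here (the `incremental_interpreter` name, the keyword-to-act table) is a hypothetical stand-in for ICT's actual incremental models, and the toy assumes hypotheses only grow rather than being revised.

```python
def incremental_interpreter(keyword_acts):
    """Return a feed() function called with each successive partial ASR
    hypothesis (a list of words). It commits to a dialogue act the
    moment a trigger word appears, enabling a response while the
    speaker is still talking. Assumes monotonic (append-only)
    hypotheses; real incremental systems also handle revisions.
    """
    seen = []

    def feed(partial_words):
        for word in partial_words[len(seen):]:   # inspect only the new words
            seen.append(word)
            act = keyword_acts.get(word.lower())
            if act is not None:
                return act                        # commit early, mid-utterance
        return None                               # keep listening

    return feed

feed = incremental_interpreter({"medevac": "request_medevac"})
feed(["i", "need"])             # None: keep listening
feed(["i", "need", "medevac"])  # "request_medevac", before the utterance ends
```

The point of the design is the early commit: the system can begin formulating feedback or a response before the speaker finishes, which is what cuts the perceived latency.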

Machine Learning for Dialogue – We make use of a variety of discriminative and generative models for natural language understanding, as well as reinforcement learning for dialogue management.
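
As a hedged illustration of the reinforcement-learning half, a single tabular Q-learning update over abstract dialogue states and system actions might look like the following. The state and action names are invented for the example; ICT's actual learners and state representations are far richer.

```python
from collections import defaultdict

def q_update(Q, state, action, reward, next_state, actions,
             alpha=0.5, gamma=0.9):
    """One tabular Q-learning step for a dialogue policy.

    States abstract the dialogue context (e.g. "slot_uncertain");
    actions are system moves (e.g. "ask", "confirm", "answer").
    next_state=None marks the end of the dialogue.
    """
    best_next = max(Q[(next_state, a)] for a in actions) if next_state else 0.0
    Q[(state, action)] += alpha * (reward + gamma * best_next - Q[(state, action)])

# Toy training signal: confirming an uncertain slot ended the dialogue well.
Q = defaultdict(float)
actions = ["ask", "confirm", "answer"]
q_update(Q, "slot_uncertain", "confirm", 1.0, None, actions)
policy = max(actions, key=lambda a: Q[("slot_uncertain", a)])  # -> "confirm"
```

Over many logged or simulated dialogues, updates like this shape a policy that picks the system move with the best expected long-term outcome rather than a hand-scripted one.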

In addition, the group collaborates with others at ICT and elsewhere on integrated virtual humans, and transitioning natural language dialogue capability for use in training and other interactive applications. Sample applications include:

The Twins – Ada and Grace are life-sized virtual humans who have been at the Boston Museum of Science since 2009, answering visitors' questions about the museum, the technologies behind the exhibit, and their own personal backgrounds.

SimCoach – a web portal where veterans can anonymously communicate with a virtual human about PTSD and find helpful resources.

INOTS – an educational platform where Navy officers can practice active listening skills.

7/2012

Learn more at projects.ict.usc.edu/nld/group.

Naturalistic Avatar Interactions in Virtual Environments

Virtual environments (VEs) are increasingly being used for training complex functional skills and activities, and also in rehabilitation settings to address physical and cognitive impairments. New low-cost 3D tracking hardware provides opportunities to use interaction techniques that replace a keyboard or mouse with physical movements or gestures. The fidelity of a user's experience can be further enhanced through their representation (avatar) within the VE, their perspective on the screen, and the type of display on which the interaction is presented. However, it is not clear how experiencing a virtual environment through these naturalistic interactions affects the user experience, particularly performance and engagement.

This research aims to explore how a person controls an avatar on the screen and interacts with virtual objects in the scene to provide more naturalistic interaction within virtual environments, as well as what effects (if any) the type of avatar and perspective may have on performance and outcome of a functional task.

The findings of this research have the potential to improve future training tools and capabilities by informing the development of realistic immersive training systems, increasing the potential to use simulation for basic skill development, and providing knowledge to assist in the creation of flexible, tailored content within VEs.

This research is funded by the U.S. Army as part of the core mission of the USC Institute for Creative Technologies.

9/2013

New Dimensions in Testimony
2012-present

New Dimensions in Testimony is an initiative to record and display testimony in a way that will continue the dialogue between Holocaust survivors and learners far into the future.

A collaboration between the USC Shoah Foundation and the USC Institute for Creative Technologies, in partnership with Conscience Display, New Dimensions in Testimony will yield insights into the experiences of survivors through a new set of interview questions, some of which survivors are asked on a regular basis and many of which have not been asked before.

The project uses ICT's Light Stage technology, recording interviews with seven cameras for high-fidelity playback, as well as natural language technology that will allow people to engage with the testimonies conversationally by asking questions that trigger relevant spoken responses. ICT is also pioneering display technology that will enable the testimonies to be projected in 3D.

The goal is to develop interactive 3-D exhibits in which learners can have simulated, educational conversations with survivors through the fourth dimension of time. Years from now, long after the last survivor has passed on, the New Dimensions in Testimony project can provide a path for young people to listen to a survivor and ask their own questions directly, encouraging them, each in their own way, to reflect on the deep and meaningful consequences of the Holocaust.

The project also advances the age-old tradition of passing down lessons through oral storytelling, but with the latest technologies available.

The technologies in New Dimensions in Testimony build upon ICT basic research in graphics and natural language. These efforts are funded by the UARC contract.

3/2013

SimCoach: Reducing Barriers to Care for Military Personnel and Families

The SimCoach project develops virtual human support agents to serve as online guides for promoting access to psychological healthcare information and for assisting military personnel, veterans and family members, particularly those who might not otherwise seek help, in breaking down barriers to initiating care, including mental health support.

The SimCoach goal is to motivate users to take the first step: to seek information and advice regarding their healthcare, including psychological health, traumatic brain injury and addiction, and their general personal welfare, including external stressors such as economic or transition issues. Where appropriate, SimCoach encourages users to take the next step toward more traditional resources.

The SimCoach virtual support agents do not deliver diagnosis or treatment, nor do they aim to replace human providers and experts. Rather, SimCoach characters give users an accessible and anonymous way to engage in a dialogue about healthcare concerns. By guiding the user through a sequence of user-specific exercises and assessments, SimCoach characters can solicit basic anonymous background information about the user's history and clinical/psychosocial concerns. With this information they can provide advice and support, direct the user to relevant online content, and potentially facilitate the process of seeking appropriate care with a live clinical provider.

The project features ICT's first web-based interactive virtual human and is currently being rolled out to select users for data collection and analysis, in order to improve the application and refine the process for public online virtual humans. A customized, regional SimCoach is also currently running on the Braveheart website, a veteran support initiative of the Atlanta Braves and Emory University, at http://braveheartveterans.org.

This work is currently funded by the U.S. Army Research Laboratory's Army Research Office (ARO). Previous support came from the Defense Centers of Excellence for Psychological Health and Traumatic Brain Injury (DCoE) and the Telemedicine & Advanced Technology Research Center (TATRC).

9/2013

2009-present

SimCoach ProQoL Delivery Enhancement

What is it? The SimCoach for ProQOL project is a prototype web-based experience for care providers in the military (e.g., combat medics, physicians, nurses, administrators), where users can:

• take the Professional Quality of Life (ProQOL) survey and receive immediate feedback on their results
• learn about provider fatigue, burnout and secondary traumatic stress as they may relate to the work they do
• learn about self-care and develop their own self-care plan to maintain their wellness

Who is it for? Care providers in the military, both active-duty and civilian, all of whom face the potential of burnout, provider fatigue and secondary traumatic stress in the course of the physically and psychologically challenging work they perform.

When? Because the system is web-based, users can visit the site and interact with the SimCoach any time they have access to a computer and a network connection, in a private and confidential manner.

How? Users interact with the SimCoach via any standard web browser at home or at work. Users can enter free text, answer multiple-choice questionnaires, view videos, and download a PDF template for their own self-care plan.

Why? The project aims to extend the reach of the U.S. Army Surgeon General's mandate that all military care providers take the ProQOL by providing additional relevant resources, self-care techniques, and the opportunity to actively create a realistic self-care plan with the assistance of a SimCoach, a “virtual human” research technology developed at USC's Institute for Creative Technologies.

This project was funded by the U.S. Army Medical Department (AMEDD).

9/2013

Situated Pedagogical Authoring
For Virtual Human-based Training

Summary: The long-term vision for the Situated Pedagogical Authoring (SPA) project at ICT is to simplify the process of creating knowledge for automated assessment and feedback in virtual environments. We are building and evaluating software tools that enable non-technical users to create content for immersive, virtual learning environments. Specifically, these tools support (1) defining learning objectives, (2) writing effective feedback content, (3) performing informative assessments, and (4) designing appropriate scaffolding for reflection and self-directed learning. Our hypothesis is that authoring in an environment that emulates the actual learner's experience eases the technical burdens normally associated with content creation for intelligent learning environments and improves the efficiency of creating learning content. Additional visualization and tracking tools provided by the system seek to promote the creation of pedagogically effective and thorough learning content. The current prototype targets the ELITE system and learning with virtual humans for leadership and counseling skills.

Motivation: Advanced learning technologies are playing a central role in the evolution and modernization of educational practices around the world, both in civilian and military training contexts. Unfortunately, costs associated with building and maintaining such systems have not decreased at a sufficient rate, and so uptake by end users (e.g., schoolhouses, students, training developers) has been slowed. The pursuit of authoring tools for advanced learning technologies, like intelligent tutoring systems, represents a prominent research trend to address these challenges. SPA seeks to provide tools for instructors and subject-matter experts to create and customize the guidance and assessment learners receive while practicing in virtual learning environments.

Future work: SPA is a fully implemented prototype and is undergoing multiple forms of evaluation. The focus of these evaluations is to (1) identify and meet end user needs in the classroom and out, (2) investigate the capability of SPA tools to promote better content creation for learning, and (3) elaborate on the design and creative skills needed during the act of authoring pedagogical content. Further technical advances seek to ease deployment for end users and increase the richness of assessments provided by the system.

This research is funded by the U.S. Army as part of the core mission of the USC Institute for Creative Technologies.

9/2013


The Stress Resilience In Virtual Environments (STRIVE) project aims to create a set of combat simulations that are part of a multi-episode interactive narrative experience. Users can be immersed within challenging combat contexts and interact with virtual characters within these episodes as part of an experiential learning approach for delivering psychoeducational material, stress management techniques and cognitive-behavioral emotional coping strategies. The STRIVE project aims to present this approach to service members prior to deployment as part of a program designed to better prepare military personnel for the types of emotional challenges that are inherent in the combat environment.

This effort is based on two scientific principles: (1) pre-exposure to traumatic events within a safe environment provides some degree of protection for those exposed to subsequent trauma; and (2) resilience, or the rate and effectiveness with which someone returns to normal after stress (a process termed allostasis), can be strengthened through systematic training.

To provide training consistent with these principles, STRIVE features six virtual reality scenarios developed with advanced game development software and cinematically designed lighting, sound and narrative that maximize character development, emotional engagement and clinical appropriateness. Each scenario consists of a combat segment with a pivotal trauma, an event frequently reported to be the emotional source of post-traumatic stress disorder (PTSD) ruminations, such as witnessing the death of a child or the loss of a comrade. A virtual human mentor delivers a resilience training segment within the traumatic context. The resilience training techniques used in STRIVE are directly derived from the dimensions of resilience identified in the Headington Institute Resilience Inventory (HIRI), the first multi-dimensional assessment of resilience. Current training focuses on adaptability, emotional regulation, behavioral regulation, cognitive behavioral therapy appraisal methods, social support, empathy, hardiness and meaning in work. These factors guide the curriculum because they effectively summarize current theories of resilience.

The effectiveness of these scenarios will be tested in a study to be conducted at Camp Pendleton in mid-2013. The work is funded through ICT's Army UARC contract and the Office of Naval Research. It is based upon ICT's virtual reality exposure therapy system for treating PTSD.

9/2013

STRIVE
Stress Resilience in Virtual Environments
2011-present


TOPSS-VW
Transitional Online Post-deployment Soldier Support in Virtual Worlds

TOPSS-VW, also known as “Coming Home,” explores the potential of persistent, easily accessible virtual worlds for delivering 3D tele-health experiences. Using the Second Life platform, we have created a specialized online virtual world for the benefit of post-deployment soldiers who are reintegrating into civilian life. This is often a difficult time for many soldiers due to both physical and psychological challenges. “Coming Home” provides a dedicated space where soldiers can interact, find camaraderie, and connect to resources for healing and transition. Thus, this environment serves a function similar to the twentieth century VFW Halls.

The project builds on two of ICT’s research strengths: intelligent virtual humans and immersive virtual world expertise. ICT’s virtual humans “live” in this virtual world, where they serve as helpers, guides and storytellers, enhancing player experience and providing helpful information and wayfinding.

This work expands current virtual world capabilities by creating activities that provide physical – “real” – world benefits to participants. For example, inspired by the breathing techniques of many relaxation therapies, a veteran can breathe in a controlled way into an ordinary headset microphone to make their avatar run on a virtual jogging path. Results from a study of this activity showed a significant reduction in stress markers for participants.
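A breath-driven activity of this kind could be implemented by mapping microphone amplitude to avatar speed. The sketch below is a hypothetical illustration of that mapping; the threshold, scaling factor and function names are assumptions, not the project's actual implementation.

```python
import math

def rms(frame):
    """Root-mean-square amplitude of one audio frame (a list of samples in [-1, 1])."""
    return math.sqrt(sum(s * s for s in frame) / len(frame))

def avatar_speed(frames, breath_threshold=0.1, max_speed=3.0):
    """Map sustained breath amplitude at the microphone to an avatar jogging speed.
    Frames below the threshold (silence or noise floor) contribute no speed."""
    if not frames:
        return 0.0
    levels = [rms(f) for f in frames]
    active = [lv for lv in levels if lv >= breath_threshold]
    if not active:
        return 0.0
    # Average breath intensity, scaled and clamped into [0, max_speed]
    return min(max_speed, (sum(active) / len(active)) * 10.0)

# A steady exhale (constant 0.2 amplitude across ten frames) drives a moderate pace
exhale = [[0.2] * 64 for _ in range(10)]
speed = avatar_speed(exhale)
```

Because speed depends on sustained, controlled amplitude rather than loudness spikes, the activity rewards exactly the slow, deliberate breathing the relaxation exercise is meant to train.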

“Coming Home” implements evidence-based Mindfulness-Based Stress Reduction (MBSR) within the virtual world. With experts from the UC San Diego Center for Mindfulness, two experimental eight-week sessions were held. Virtual world mindfulness classes will also be held through our partnerships with AMEDD at Ft. Sam Houston, and the National Intrepid Center of Excellence in TBI and Psychological Health (NICoE) in Bethesda, MD.

In addition, veterans can experience a Warrior’s Journey that consists of interactive narratives about the life and ideals of classic warriors, designed to be relevant to today’s soldiers. The capstone of the Warrior’s Journey is an interactive authoring system where veterans can create their own stories.

Visit cominghomecenter.org to learn more about TOPSS-VW.

6/2012

2007-present


UrbanSim

UrbanSim is a PC-based virtual training application for practicing the art of mission command in complex counterinsurgency and stability operations. It consists of a game-based practice environment, a web-based multimedia primer on doctrinal concepts of counterinsurgency and a suite of scenario authoring tools.

The UrbanSim practice environment allows trainees to take on the role of an Army battalion commander and to plan and execute operations in the context of a difficult fictional training scenario. After developing their commander’s intent, identifying their lines of effort and information requirements, and selecting their measures of effectiveness, trainees direct the actions of a battalion as they attempt to maintain stability, fight insurgency, reconstruct civil infrastructure and prepare for transition. UrbanSim targets trainees’ abilities to maintain situational awareness, anticipate second- and third-order effects of actions and adapt their strategies in the face of difficult situations.

UrbanSim is driven by an underlying socio-cultural behavior model, coupled with a novel story engine that interjects events and situations based on the real-world experience of former commanders. UrbanSim includes an intelligent tutoring system, which provides guidance to trainees during execution, as well as after-action review capabilities.

In April 2011, UrbanSim transitioned to the Army and is available at the MilGaming portal. It was named a program of record for two Army programs, Games for Training and the Army Low Overhead Training Toolkit (ALOTT), and has seen widespread application across institutional and operational settings. Key deployment sites include the School for Command Preparation, Ft. Leavenworth, KS; the Maneuver Captain’s Career Course, Ft. Benning, GA; and the Warrior Skills Training Center, Ft. Hood, TX.

The UrbanSim project is being performed under the ICT contract managed by the United States Army Simulation and Training Technology Center (STTC).
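A story engine that interjects events based on world state can be sketched very simply: each authored event carries a trigger condition and an effect, and fires once when the simulation state matches. This is a hypothetical illustration; the event names, state variables and numbers are invented, not UrbanSim's actual model.

```python
# Each authored event fires once, when its world-state condition is met.
# Event names, conditions and effects here are purely illustrative.
EVENTS = [
    {"name": "market_protest",
     "when": lambda w: w["stability"] < 40 and not w["curfew"],
     "effect": {"stability": -5}},
    {"name": "tribal_leader_meeting",
     "when": lambda w: w["rapport"] >= 60,
     "effect": {"stability": +10}},
]

def step(world, events):
    """Advance one turn: inject every not-yet-fired event whose condition holds."""
    fired = []
    for ev in events:
        if ev["name"] not in world["history"] and ev["when"](world):
            for key, delta in ev["effect"].items():
                world[key] += delta
            world["history"].append(ev["name"])
            fired.append(ev["name"])
    return fired

world = {"stability": 35, "rapport": 65, "curfew": False, "history": []}
fired = step(world, EVENTS)  # both conditions hold on this turn
```

Authoring new scenario content then amounts to adding entries to the event list, which is the kind of separation between engine and authored experience the scenario authoring tools rely on.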

6/2012

2006-present


Virtual Human Toolkit

The ICT Virtual Human Toolkit is a collection of modules, tools, and libraries designed to aid and support researchers and developers with the creation of virtual human conversational characters. The Toolkit is an on-going, ever-changing, innovative system fueled by basic research performed at the University of Southern California (USC) Institute for Creative Technologies (ICT) and its partners.

Designed for easy mixing and matching with a research project’s proprietary or third-party software, the Toolkit provides a widely accepted platform on which new technologies can be built. The goal is to make creating virtual humans easier and more accessible, and thus expand the realm of virtual human research and applications.

The ICT Virtual Human Toolkit is built upon a common modular architecture which enables users to utilize all modules as is, one or more modules coupled with proprietary components, or one or more modules in other existing systems.

Our technology emphasizes natural language interaction, nonverbal behavior, and perception and is broken up into the following main modules:

• AcquireSpeech: A tool to send audio, or text, to speech recognizers and to relay the information to the entire system.

• MultiSense: A perception framework that enables multiple sensing and understanding modules to inter-operate simultaneously.

• Non-Player Character Editor (NPCEditor): A suite of tools which work together to create appropriate dialogue responses to users’ inputs.

• Nonverbal Behavior Generator (NVBG): A rule-based behavior planner that infers communicative functions from the surface text and selects appropriate behaviors that augment and complement the characters’ dialogue.

• Rapport 1.0: An agent that provides nonverbal feedback based on human nonverbal and verbal input.

• SmartBody (SB): A modular, controller-based character animation system that uses the Behavior Markup Language.

• vhtoolkitUnity: A renderer with custom authoring and debug tools based upon the Unity game engine.
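The modular architecture above depends on modules exchanging messages rather than calling each other directly, so any module can be swapped for a proprietary component. The sketch below illustrates that publish/subscribe pattern in miniature; it is a generic illustration, not the Toolkit's actual messaging API, and the topic names are invented.

```python
from collections import defaultdict

class MessageBus:
    """Minimal publish/subscribe hub: modules exchange typed messages
    instead of calling each other directly, so any module can be replaced."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)

    def publish(self, topic, payload):
        for handler in self.subscribers[topic]:
            handler(payload)

bus = MessageBus()
log = []

# A dialogue module reacts to recognized speech and publishes a reply...
bus.subscribe("speech_recognized", lambda text: bus.publish("character_reply", f"Echo: {text}"))
# ...while a behavior module turns replies into animation commands for the renderer.
bus.subscribe("character_reply", lambda reply: log.append(("animate", reply)))

bus.publish("speech_recognized", "hello there")
```

Under this pattern, replacing the dialogue module with, say, a proprietary classifier only requires subscribing the new component to the same topics; nothing else in the pipeline changes.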

For more information visit: https://vhtoolkit.ict.usc.edu/

11/2013

2006-present


Virtual Humans

Interactive virtual worlds provide a powerful medium for experiential learning. The overarching goal is to enrich such worlds with virtual humans—autonomous agents that support face-to-face interaction with people in virtual environments—thereby making them applicable to a wide range of training tasks that currently require labor-intensive live exercises or role playing, or are taught non-experientially. ICT’s virtual human work promotes fundamental advances in artificial intelligence, graphics and animation. Agents must perceive and respond to events in the virtual world, they must have and express realistic emotions, and they must be able to carry on spoken dialogues with humans and other agents, including all the non-verbal communication that accompanies human speech. The virtual human effort consists of eight closely linked sub-efforts:

Cognition and Emotion: Research and development relating to the cognitive reasoning of a virtual human, emphasizing the close connection between cognition and emotion posited by current psychological and neuroscience findings (Gratch & Marsella, Project Leaders).

Natural Language Processing: Research and development of sophisticated natural language processing (NLP) capabilities to allow virtual humans to both understand and produce English speech in the context of a coherent ongoing task (Traum, Project Leader).

Nonverbal Perception and Learning: Research and development efforts to recognize, model and predict human nonverbal behavior in the context of interaction with virtual humans, robots and/or other human participants (Morency, Project Leader).

Virtual Human Embodiment: Research and development of virtual human physical behaviors including when behaviors are exhibited, their communicative function, and how to effectively realize the motion in a virtual human body (Marsella, Project Leader).

Integrated Virtual Humans: Development to support the integration of basic research efforts into a coherent, common and shared architecture for virtual humans. This effort contributes primarily to the challenge of building a modular and authorable virtual human through the development of core tools and infrastructure (Hartholt, Project Leader).

A New Breed of Cognitive Architectures: Research and development of a new breed of virtual human architecture that is broad-spectrum, tightly integrated, prediction-oriented and functionally elegant (Rosenbloom, Project Leader).

Assessing the Social Effects of Virtual Humans: Evaluation of virtual humans with the aim to understand the relationship between virtual human fidelity (both visual and behavioral) and learning outcomes within the domain of interpersonal-skills training (Gratch, Project Leader).

Character Animation and Simulation: Investigate, discover and develop methods for the synthesis of motion on a virtual character. Efforts include development of a large motion database, primarily from motion capture; identification of animation and synthesis methods suitable for photorealistic characters; and control over hand postures and gestures (Shapiro, Project Leader).

This research is funded by the U.S. Army as part of the core mission of the USC Institute for Creative Technologies.

9/2013


Virtual Patients

The Virtual Patient project uses virtual human technology to create realistic, lifelike characters to train future clinicians in therapeutic interview skills. Virtual patient technology is not meant to replace human standardized patients but to augment live actor programs with virtual characters that are available 24/7 and can portray a multitude of characters and conditions that might be difficult for actors to represent or repeat reliably.

The project began as an offshoot from the virtual human project when researchers from ICT and the USC Keck School of Medicine Department of Psychiatry won a USC Provost Teaching with Technology Grant. The success of this effort led to additional research efforts involving virtual patients.

Current efforts include the development of virtual patients for military-specific scenarios for the U.S. Army Simulation and Training Technology Center (STTC) and a collaboration with the Center for Innovation and Research for Veterans and Military Families at the USC School of Social Work and the U.S. Army Telemedicine and Advanced Technology Research Center (TATRC) to apply virtual patients to train social workers in military-specific issues. These can include struggles with family life, return to service and post-traumatic stress.

Future efforts include a virtual patient system that can be delivered over the web and mobile devices and the development of a virtual patient toolkit to allow for the creation of customized characters for clinical skills training and practice.

6/2012

2006-present


Virtual Reality Exposure Therapy
Post-Traumatic Stress Disorder Treatment

ICT’s virtual reality exposure therapy is aimed at providing relief from post-traumatic stress.

Currently found at approximately 50 sites, including VA hospitals, military bases and university centers, ICT’s Virtual Iraq/Afghanistan exposure therapy approach has been shown to produce a meaningful reduction in PTS symptoms. Additional randomized controlled studies are ongoing.

Exposure therapy, in which a patient – guided by a trained therapist – confronts their trauma memories through a retelling of the experience, is now endorsed as an “evidence-based” treatment for PTS. ICT researchers added to this therapy by leveraging virtual art assets originally built for the commercially successful Xbox combat tactical simulation game Full Spectrum Warrior. The current applications consist of a series of virtual scenarios specifically designed to represent relevant contexts for VR exposure therapy, including Middle-Eastern themed city and desert road environments. In addition to the visual stimuli presented in the VR head-mounted display, directional 3D audio, vibrations and smells can be delivered into the simulation. Now, rather than relying exclusively on imagining a particular scenario, a patient can experience it again in a virtual world under very safe and controlled conditions. Young military personnel, having grown up with digital gaming technology, may actually be more attracted to and comfortable with a VR treatment approach as an alternative to traditional “talk therapy”.

The therapy requires well-trained clinical care providers who understand the unique challenges they may face with service members and veterans suffering from the wounds of war. Stimulus presentation is controlled by the clinician via a separate “Wizard of Oz” interface, with the clinician in full audio contact with the patient. ICT researchers are also adapting the system as a tool for stress resilience training and PTS assessment.
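A clinician-controlled stimulus interface of this kind essentially queues commands for the simulation across the available channels (audio, vibration, smell), with every delivery initiated manually. The sketch below is a hypothetical illustration of that design; the class, channel names and clamping behavior are assumptions, not the actual system.

```python
class StimulusController:
    """Clinician-facing 'Wizard of Oz' controller (illustrative sketch):
    each method call queues one command for the simulation, so stimulus
    delivery stays entirely under the therapist's manual control."""
    def __init__(self):
        self.queue = []

    def trigger(self, channel, name, intensity=1.0):
        # Intensity is clamped to [0, 1] so a mis-typed value
        # cannot deliver an overwhelming stimulus to the patient.
        level = max(0.0, min(1.0, intensity))
        cmd = {"channel": channel, "name": name, "intensity": level}
        self.queue.append(cmd)
        return cmd

console = StimulusController()
console.trigger("audio", "distant_gunfire", intensity=0.3)
console.trigger("vibration", "vehicle_rumble", intensity=1.5)  # clamped to 1.0
```

Keeping the clinician in the loop for every stimulus, rather than scripting the scenario, is what lets exposure intensity be titrated to the individual patient's tolerance.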

Collaborators include JoAnn Difede, Weill Cornell Medical Center; Greg Reger, Madigan Army Medical Center; Barbara Rothbaum, Emory University; and Virtually Better, Inc. This basic and applied research effort is currently funded through TATRC.

7/2013

2005-present


VRCPAT
Virtual Reality Cognitive Performance Assessment Test

ICT has developed an adaptive virtual environment for assessment and rehabilitation of neurocognitive and affective functioning. This project brings together a team of researchers to incorporate cutting-edge neuropsychological and psychological assessment into state-of-the-art interactive virtual Iraqi/Afghan scenarios, including a simulated city, checkpoint and Humvee.

The Army’s Needs: An adaptive VRCPAT based upon individual Soldier differences can be used to greatly enhance assessment and training:

1. Assessing a Soldier’s performance within VRCPAT allows for the establishment of a baseline that is reflective of individual differences.

2. Neurocognitive and psychophysiological profile data may be used for real-time adaptation of the VRCPAT.

3. Evolution of these profiles developed for use in VRCPAT could lead to direct training of military operations in the real world.

How ICT Met Those Needs: Findings from our research have provided the military with the following:

1. A neurocognitive and psychophysiological interface modeled on trainees interacting in a virtual environment that mimics Iraqi and Afghan settings, for modeling a trainee’s adaptive responses to environmental situations.

2. A system for military trainers to develop more reliable and valid measures of training performance.

3. Civilian dual-use capability in conditions involving psychophysiological correlates to neurocognitive function and emotion regulation in persons immersed within a virtual environment.

Future: ICT is extending the VRCPAT findings by examining performance not simply by a single user, but by teams of Soldiers.

Facts and Figures

• VRCPAT is being used to run subjects at Tripler Army Medical Center, Ft. Lewis, Madigan Army Medical Center, West Point, USC and UCSD.

• VRCPAT has been used in studies with over 400 subjects, including both Soldiers and civilians.

6/2012

2006-present