
Ambiculus: LED-based Low-Resolution Peripheral Display Extension for Immersive Head-Mounted Displays

Paul Lubos, Gerd Bruder, Oscar Ariza, Frank Steinicke
Department of Informatics

University of Hamburg, Hamburg, Germany

{lubos,bruder,ariza,steinicke}@informatik.uni-hamburg.de

ABSTRACT
Peripheral vision in immersive virtual environments is important for application fields that require high spatial awareness and veridical impressions of three-dimensional spaces. Head-mounted displays (HMDs), however, use displays and optical elements in front of a user's eyes, which often do not natively support a wide field of view to stimulate the entire human visual field. Such limited visual angles are often identified as causes of reduced navigation performance and sense of presence.

In this paper we present an approach to extend the visual field of HMDs towards the periphery by incorporating additional optical LED elements structured in an array, which provide additional low-resolution information in the periphery of a user's eyes. We detail our approach and technical realization, and present an experiment in which we show that such far peripheral stimulation can increase subjective estimates of presence and has the potential to change user behavior during navigation in a virtual environment.

KeywordsHead-mounted displays; LED displays; peripheral vision

1. INTRODUCTION
Head-mounted displays (HMDs) can provide visual information of a virtual environment (VE) over a large portion of the human visual field. For instance, the Oculus Rift DK1 HMD provides a visual field of 90 degrees and the Fakespace Labs Wide5 HMD 150 degrees horizontally. However, the currently used optics and flat display technologies do not support stimulation of human far peripheral vision [11]. The total horizontal binocular visual field of the human eyes approximates on average 200 degrees, with the vertical field approximating 120 degrees [9]. Restrictions of the field of view (FOV) in the real world are known to affect human perception and change behavior, such as navigation and maneuvering performance [4] and spatial estimation [10].

ACM ISBN 978-1-4503-2138-9.

DOI: 10.1145/1235

In the field of virtual reality (VR), stimulation of peripheral vision has been shown to elicit more natural behavior, perception and performance, i. e., reducing differences between immersive virtual environments (IVEs) and the real world [8]. However, providing visual information of a VE over the entire human visual field imposes challenges and difficulties not only on HMD display technologies, but also on computer graphics rendering techniques based on planar geometric projections [11]. Moreover, recent display trends suggest that even simple extensions of ambient light in the periphery of screens [12] or HMDs [13] might help users feel more present in VEs.

In this paper we present a peripheral low-resolution visual field extension for immersive HMDs based on Light-Emitting Diodes (LEDs). We describe our approach as well as our hardware and software implementation. We present an evaluation of the effects of such peripheral stimulation and discuss the results.

2. RELATED WORK
While most early HMD technologies only supported comparably small diagonal FOVs, i. e., about 30 to 60 degrees, and current-state HMDs support medium peripheral stimulation of about 90 degrees, e. g., the Oculus Rift DK2, only a few HMDs were built that stimulate more, such as the Fakespace Labs Wide5 with 150 degrees or Arthur's 176 degree HMD [1], neither of which encompassed the whole 200 degrees of the human total horizontal visual field. Far peripheral vision is important due to a higher sensitivity to optic flow and motion detection than in foveal or near-peripheral vision, as well as higher detectability of temporal changes and an increased sensitivity to luminance in low-light situations, although visual resolution in such regions is largely reduced [9].

A large body of research has focused on identifying effects of the FOV of HMDs on spatial perception, task performance, presence and behavior. For instance, Bolte et al. [2] found that a reduced FOV can change the user's behavior and increase the amount of necessary head rotations. Moreover, in an experiment using a Wide5 HMD to emulate different FOVs, Jones et al. [5] found that a 150 degree visual field significantly improved distance judgments in a VE compared to a reduced FOV of 60 degrees, and later showed that even static peripheral stimulation with white light in the far periphery of HMDs resulted in more accurate distance and size judgments [6].

Investigating how presence is affected by the visual field of virtual stimulation, experiments revealed a correlation between increased FOV and increased presence, but also increased simulator sickness [7]. Even related work in non-immersive display environments seems to agree that a larger FOV increases presence and engagement, even if the FOV is only artificially increased by using ambient lighting in the periphery of screens [12].

3. AMBICULUS PERIPHERAL DISPLAY EXTENSION

In this section we first detail our conceptual approach of extending the FOV of current-state HMDs, and then describe our current implementation based on an Oculus Rift DK2.

3.1 Concept
Our hardware concept for the Ambiculus peripheral display extension is to mount arrays of RGB LEDs around the central display units of current-state HMDs (cf. [13]). Due to the predominance of flat rectangular screens in recent HMDs, similar or equal to cellphone screens, we assume that the HMD has a rectangular shape in which optical magnification lenses for the left and right eye are mounted in front of the screen, thus partitioning the screen into a left and a right side. We place the LEDs in arrays with a width of one to n LEDs in rectangular shapes around the screen area, and add diffusors above the LEDs to diffuse the light and to compensate for the spatially discrete light sources. Hence, users wearing the HMD see the VE on the main display in front of their eyes, and they receive light from the LEDs in the far periphery. As the peripheral light is not transmitted through the lenses of the HMD, it is not subject to the optical distortions usually found with non-pupil-forming lenses [11]. By incorporating multiple layers of LEDs around the central display it is possible to increase the peripheral resolution.
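To make the layout concrete, a single one-LED-wide rectangular ring around the screen area can be enumerated as in the following sketch. This is a hypothetical illustration in Python; the function name and grid model are ours, not the authors' code.

```python
def led_ring(cols, rows):
    """Grid coordinates of a one-LED-wide rectangular ring enclosing
    a cols x rows area (assumes cols, rows >= 2). Multiple concentric
    rings would correspond to the multi-layer variant described above."""
    ring = []
    for x in range(cols):
        ring.append((x, 0))          # top edge
        ring.append((x, rows - 1))   # bottom edge
    for y in range(1, rows - 1):
        ring.append((0, y))          # left edge
        ring.append((cols - 1, y))   # right edge
    return ring
```

For instance, `led_ring(15, 12)` yields 50 positions, coincidentally matching the LED count of the prototype in Section 3.2, although the actual arrangement used there may differ.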

Usually, Ambilight setups in the home cinema market intercept the input image using video input signals and repeat visual information from the screen's edges towards the periphery using single LEDs or strips [3]. We tested this approach by attaching LED strips inside HMDs and were surprised to notice that even this simple approach seemed to suffice to increase presence, which we observed anecdotally during informal pilot tests.
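The edge-repetition scheme amounts to sampling the border pixels of the rendered frame, block-averaging them, and assigning one color per LED. The following is a simplified stand-in for that computation; the function names and the plain-list frame format are illustrative, not the actual capture code.

```python
def edge_expand(frame, leds_per_side):
    """Map the border pixels of a rendered frame to LED colors by
    block-averaging, mimicking an Ambilight-style edge expansion.
    `frame` is a list of rows, each a list of (r, g, b) tuples."""
    def average(pixels):
        n = len(pixels)
        return tuple(sum(p[i] for p in pixels) // n for i in range(3))

    def strip(pixels, count):
        # Split a border strip into `count` blocks and average each block.
        block = max(1, len(pixels) // count)
        return [average(pixels[i * block:(i + 1) * block])
                for i in range(count)]

    top = strip(frame[0], leds_per_side)
    bottom = strip(frame[-1], leds_per_side)
    left = strip([row[0] for row in frame], leds_per_side)
    right = strip([row[-1] for row in frame], leds_per_side)
    return top + right + bottom + left
```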

3.2 Realization
As an exemplary implementation of a low-resolution peripheral display, we decided to utilize RGB LEDs with an integrated WS2812B controller¹. We implemented the peripheral light capture as a Unity3D plugin, and we used our own protocol to transmit the rendered pixel data through a virtual serial port to an Arduino Nano board. The Arduino Nano board uses the Adafruit NeoPixel library² to transmit the color data to the LEDs (also called pixels). This allows the transmission of any signal and any color, which, for example, can include a warning light in case a user comes close to a collision with a physical obstacle in a head-tracked IVE.
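The paper does not specify the serial protocol, so the following is only a plausible sketch of how per-LED colors could be framed for transmission over the virtual serial port; the start byte, length field, and XOR checksum are all assumptions, not the authors' actual format.

```python
START_BYTE = 0xAB  # hypothetical frame delimiter, not documented in the paper

def pack_led_frame(colors):
    """Serialize a list of (r, g, b) tuples into one serial frame:
    start byte, LED count, raw RGB payload, and an XOR checksum
    over the payload so the microcontroller can reject torn frames."""
    payload = bytearray()
    for r, g, b in colors:
        payload.extend((r, g, b))
    checksum = 0
    for byte in payload:
        checksum ^= byte
    return bytes([START_BYTE, len(colors)]) + bytes(payload) + bytes([checksum])
```

On the Arduino side, a matching loop would read one frame, verify the checksum, and forward each 3-byte triple via `setPixelColor` before calling `show`.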

In our current implementation we used RGB LEDs from a strip with 144 LEDs per meter at 10 mm width, with 5 V and 60 mA at maximum brightness. We included 50 LEDs (see Figure 1b) in the frame of an Oculus Rift DK2 HMD, and used an external power supply to power the LEDs (see Figure 1a). Additionally, the scripts allow decoupling the LED output from the visual output, giving developers the possibility to display warnings or change the atmosphere of a scene with the illumination.

¹ http://www.adafruit.com/datasheets/WS2812B.pdf
² https://github.com/adafruit/Adafruit_NeoPixel

Figure 1: a) The Oculus Rift DK2 with peripheral display extension prototype and diffusing foil. b) Schematic illustrating the placement of the LEDs.

The current prototype supports far peripheral visual output with a resolution corresponding to the number of LEDs in the grid (see Figure 1a), increasing the FOV to approximately 118 degrees. The LEDs were diffused, and color values were optically calibrated to ensure that the LED brightness does not surpass the HMD screen brightness.
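As a rough sanity check of such FOV figures, the horizontal angle subtended by a flat element of width w centered at eye distance d is 2·atan(w/2d). The helper below illustrates the formula with made-up dimensions; it is not the measured geometry of the prototype.

```python
import math

def subtended_angle_deg(width_m, eye_distance_m):
    """Horizontal visual angle (in degrees) subtended by a flat
    element of the given width centered at the given eye distance."""
    return math.degrees(2 * math.atan(width_m / (2 * eye_distance_m)))
```

For example, an element twice as wide as the eye distance subtends 90 degrees; wider LED frames at small eye relief quickly approach the far periphery.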

4. EXPERIMENT
In this section we describe the experiment that we conducted to evaluate the peripheral display extension described in Section 3. We address the following research questions:

Q1 Can peripheral stimulation increase the user’s senseof presence?

Q2 Does light in the periphery change the user’s behaviorwhile navigating through a VE?

4.1 Participants
We recruited fifteen participants from our department for our experiment (2 female, ages 20-52, M = 30.1). All of them had normal or corrected-to-normal vision. Seven subjects reported that they had participated in an HMD experiment before. The mean duration of the trials wearing the HMD was about ten minutes (30 min including questionnaires and instructions).

4.2 Materials
As illustrated in Figure 2a, users wore an Oculus Rift DK2 HMD, with the Oculus DK2 IR camera providing six-degrees-of-freedom optical tracking. The participants were seated in an MWE Lab Emperor 1510 chair and used a computer mouse as input with their dominant hand. We implemented view-directed steering with the speed controlled by the mouse. Pressing the left mouse button moved participants forward, the right mouse button moved them backward, and pressing no button allowed them to stop and look around with head tracking. The Oculus Rift DK2 offers a nominal diagonal FOV of approximately 100 degrees at a resolution of 1920 × 1080 pixels (960 × 1080 for each eye).
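The view-directed steering described above can be sketched as a per-frame position update along the current gaze direction. This is a hypothetical reconstruction; the speed value and the 75 Hz timestep are illustrative, not the experiment's actual parameters.

```python
import math

def steer(position, yaw_deg, forward_pressed, backward_pressed,
          speed=1.4, dt=1 / 75):
    """Advance the (x, z) viewpoint along the head's yaw direction.
    Left button moves forward, right button backward, neither stops.
    speed (m/s) and dt (the DK2 refreshes at 75 Hz) are assumptions."""
    direction = 0
    if forward_pressed:
        direction = 1
    elif backward_pressed:
        direction = -1
    x, z = position
    yaw = math.radians(yaw_deg)
    x += direction * speed * dt * math.sin(yaw)
    z += direction * speed * dt * math.cos(yaw)
    return (x, z)
```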

The visual stimulus consisted of a 3D forest scene in bright daylight (see Figure 2b), which was rendered with Unity3D on an Intel computer with a Core i7 hexa-core CPU at 3.5 GHz, 32 GB RAM and two Nvidia GeForce GTX 980 graphics cards in an SLI array. We used the traditional edge-expansion approach to compute the ambient light for the LEDs as described in Section 3.

Figure 2: a) Photo of a participant during the experiment with annotations. b) The visual stimulus, including one of the gates the participants were instructed to traverse.

4.3 Methods
Half of the participants started with the peripheral LEDs turned on and then completed the second part without peripheral LEDs, and vice versa. To complete a trial, the participants were instructed to pass through ten gates in the virtual forest scene. Upon traversing the last gate the trial was completed. The participants were informed that it was not a requirement to complete the task as quickly as possible. Since the participants were seated and not able to turn 360 degrees comfortably, the gates were placed to allow the users to walk in a general forward direction with a maximum of 45 degrees of head turning (see Figure 4). The dependent variables were the time needed to complete the trial by passing the final gate, the distance traveled in the virtual world, the head movements in meters, and the overall angle of head rotations.

As a measure of presence and simulator sickness, users were asked to fill in a set of subjective questionnaires. Before completing the trials, participants were asked to fill in Kennedy's Simulator Sickness Questionnaire (SSQ) and an Immersive Tendencies Questionnaire (ITQ). Afterwards, participants were instructed to put the HMD on and follow the on-screen task instructions. After reaching the final gate, participants had to take the HMD off and answer a post SSQ, a Slater-Usoh-Steed (SUS) presence questionnaire as well as a Witmer-Singer Presence Questionnaire (WS-PQ). Then, participants had to put the HMD on again, and after the second trial they had to fill in another post SSQ, SUS, and WS-PQ, as well as demographic data, and we collected additional subjective impressions and comments.

Figure 3: Results of the presence questionnaires: The y-axis shows the presence score and the x-axis shows the conditions. a) Witmer-Singer PQ. b) Slater-Usoh-Steed PQ.

4.4 Results
We had to remove one participant due to simulator sickness and two participants because they did not understand the task and performed erratic movements; as they commented, it was their first time wearing an HMD and they wanted to test the device's limitations.

Behavioral Measures
We analyzed the results with a paired-samples t-test at the 5% significance level.

Measure           LED   M          SD        t(11)   p
time              on    224.13 s   43.74 s   1.958   .076
                  off   202.54 s   29.18 s
distance          on    578.28 m   61.17 m   2.407   < .05
                  off   541.12 m   14.15 m
head translation  on    9.07 m     5.87 m    2.223   < .05
                  off   6.33 m     3.39 m
head rotation     on    4071.5°    2454.4°   2.009   .070
                  off   3026.8°    489.8°
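The t values above come from a paired-samples t-test over the twelve remaining participants. A minimal pure-Python sketch of the statistic is shown below, applied to synthetic data, since the raw per-participant values are not reported in the paper.

```python
import math

def paired_t(xs, ys):
    """Paired-samples t statistic and degrees of freedom (n - 1)
    for two matched samples, as used for the behavioral measures."""
    diffs = [x - y for x, y in zip(xs, ys)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    se = math.sqrt(var / n)                               # standard error
    return mean / se, n - 1
```

The resulting t would then be compared against the critical value of the t distribution with n - 1 degrees of freedom at the chosen significance level.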

Questionnaires
We found no significant difference in the average increase in SSQ scores during the trials with Ambiculus (M = 13.09, SD = 34.74) and without Ambiculus (M = 2.49, SD = 8.18) in the experiment (Z = .65, p = .52).

We analyzed the WS-PQ, SUS and SSQ questionnaires with a non-parametric Wilcoxon signed-rank test at the 5% significance level. The WS-PQ indicated that there was no statistically significant difference between the Ambiculus enabled (M = 73.50, SD = 12.62) and disabled (M = 66.25, SD = 12.30) presence scores, but a trend (Z = 62, p = .07). The SUS questionnaire indicated that there was no statistically significant difference between the Ambiculus enabled (M = 4.94, SD = 1.23) and disabled (M = 3.99, SD = .96) presence scores, but a trend (Z = 62, p = .07). We analyzed the ITQ with a non-parametric Mann-Whitney U test but found no significant difference in ITQ scores (start without Ambiculus: M = 61, SD = 11.40; with Ambiculus: M = 57.86, SD = 23.34; U = 17.00, n1 = 5, n2 = 7, p = 1.00).

Figure 4: The paths the participants walked in the experiment. (Red) Ambiculus deactivated. (Green) Ambiculus active.

4.5 Discussion
We collected additional informal comments, which overall indicated that participants enjoyed moving through the virtual forest more if the peripheral light was turned on and less if it was turned off. The quantitative behavioral results support this notion, in particular considering the time spent in the virtual world and the distance traveled; participants also performed more movements with their head in the real world, i. e., translations and rotations. This is an indicator that the user's behavior is influenced by the added peripheral stimuli, as illustrated by Figure 4. When asked to reflect upon their behavior and indicate reasons after the experiment, some participants commented that they had a stronger sense that the virtual world actually spanned entirely around their head, such that it felt more natural to move around in it. Another participant mentioned that the peripheral light encouraged them to look around, and that especially the difference between bright and dark places was very noticeable and contributed to the ambiance.

We assume that by adding even low-resolution peripheral visual stimuli, the participants' desire to explore the scene was stimulated, likely due to the increased sense of presence. The results of the presence questionnaires support these comments and show a trend that the participants felt more present in the Ambiculus condition.

During the debriefing, two participants mentioned that the additional light in the periphery increased simulator sickness symptoms. The results of the SSQ questionnaires support this statement, although the differences between the two conditions were not significant and have to be investigated further in future work.

5. CONCLUSION
In this paper we presented an approach to provide peripheral visual stimulation for HMDs which is based on RGB LEDs structured in an array around the main HMD screen. The implemented solution is intended to be open source, allowing interested parties to use the plugin and sketches to replicate the device. Although the approach does not support a high resolution comparable to that in the central fields of an HMD, we found in an experiment that it nevertheless had benefits for subjective estimates of presence. Moreover, we observed that the peripheral light changed the behavior of users while navigating through a realistic virtual 3D scene.

In the described experiment we computed the peripheral light by repeating the border colors of the displayed images towards the periphery. Although the results are encouraging, we believe that spatial awareness and presence might be further supported by using visually faithful rendering for the LEDs in the periphery, on which future work should focus. Moreover, future work may focus on the potential to provide optic flow stimuli with multiple layers of LEDs in the far periphery, which might be sufficient to affect self-motion velocity estimates while moving through a virtual environment due to the high sensitivity of the periphery of the human eye to visual motion.

6. REFERENCES
[1] K. Arthur. Effects of Field of View on Task Performance with Head-Mounted Displays. In Proceedings of ACM Conference on Human Factors in Computing Systems (CHI), pages 29–30, 1996.

[2] B. Bolte, G. Bruder, F. Steinicke, K. Hinrichs, and M. Lappe. Augmentation techniques for efficient exploration in head-mounted display environments. In Proceedings of ACM Symposium on Virtual Reality Software and Technology (VRST), pages 11–18, 2010.

[3] E. Diederiks, E. Meinders, L. Van, R. Peters, J. Eggen, and K. Van. Method of and system for controlling an ambient light and lighting unit, Jan. 15, 2004. WO Patent App. PCT/IB2003/002,875.

[4] S. Hassan, J. Hicks, H. Lei, and K. Turano. What is the minimum field of view required for efficient navigation? Vision Research, 47(16):2115–2123, 2007.

[5] J. Jones, E. Suma, D. Krum, and M. Bolas. Comparability of Narrow and Wide Field-Of-View Head-Mounted Displays for Medium-Field Distance Judgments. In Proceedings of Symposium on Applied Perception (SAP), page 119. ACM, 2012.

[6] J. Jones, J. E. Swan II, and M. Bolas. Peripheral stimulation and its effect on perceived spatial scale in virtual environments. IEEE Transactions on Visualization and Computer Graphics (TVCG), 19(4):701–710, 2013.

[7] J.-W. Lin, H. Duh, D. Parker, H. Abi-Rached, and T. Furness. Effects of field of view on presence, enjoyment, memory, and simulator sickness in a virtual environment. In Proceedings of IEEE Virtual Reality (VR), pages 164–171, 2002.

[8] J. Loomis and J. Knapp. Visual Perception of Egocentric Distance in Real and Virtual Environments. In L. Hettinger and M. Haas, editors, Virtual and Adaptive Environments, pages 21–46. Erlbaum, 2003.

[9] S. Palmer. Vision Science: Photons to Phenomenology. A Bradford Book, 1999.

[10] R. Patterson, M. Winterbottom, and B. Pierce. Perceptual issues in the use of head-mounted visual displays. Human Factors: The Journal of the Human Factors and Ergonomics Society, 48(3):555–573, 2006.

[11] C. E. Rash. Helmet-Mounted Displays: Sensation, Perception, and Cognition Issues. U.S. Army Aeromedical Research, 1st edition, 2009.

[12] P. Seuntiens, I. Vogels, and A. van Keersop. Visual experience of 3D-TV with pixelated Ambilight. In Proceedings of PRESENCE, 2007.

[13] J. Tang and A. Fadell. Peripheral treatment for head-mounted displays. U.S. Patent US 20080088936 A1, Apple Inc., 2012.
