Transcript

Journal of Computing and Information Science in Engineering: Copy of e-mail Notification

Journal of Computing and Information Science in Engineering Published by ASME

Dear Author,

YOUR PAGE PROOF IS AVAILABLE IN PDF FORMAT; please refer to this URL address

http://115.111.50.156/jw/AuthorProofLogin.aspx?pwd=1e9714316576

Login: your e-mail address

Password: 1e9714316576

The site contains 1 file. You will need to have Adobe Acrobat Reader software to read these files. This is free software and is available for user download at http://www.adobe.com/products/acrobat/readstep.html.

This file contains:

Adobe Acrobat Users - NOTES tool sheet, a copy of your page proofs for your article

Please read the page proofs carefully and:

1) indicate changes or corrections using e-annotation;

2) answer all queries;

3) proofread any tables and equations carefully;

4) check that any special characters have translated correctly.

Special Notes:

Your prompt attention to and return of page proofs will help expedite publication of your work. Thank you for your cooperation.

Return to Isabel Castillo, Journal Production Manager, E-mail: [email protected]

If you have any questions regarding your article, please contact me. PLEASE ALWAYS INCLUDE YOUR ARTICLE NO. (JCISE-11-1453) WITH ALL CORRESPONDENCE.

This e-proof is to be used only for the purpose of returning corrections to the publisher.

Sincerely,

Isabel Castillo, Journal Production Manager, E-mail: [email protected]


AUTHOR QUERY FORM

Journal: J. Comput. Inf. Sci. Eng.

Article Number: JCISE-11-1453

Please provide your responses and any corrections by annotating this PDF and uploading it to ASME's eProof website as detailed in the Welcome email.

Dear Author,

Below are the queries associated with your article; please answer all of these queries before sending the proof back to Cenveo. Production and publication of your paper will continue after you return corrections or respond that there are no additional corrections.

Location in article / Query or Remark: click on the Q link to navigate to the appropriate spot in the proof. There, insert your comments as a PDF annotation.

AQ1 Please check the edit made to article title.

AQ2 Please provide city, postal code, and country location for both the affiliations given.

AQ3 Please provide definition for “AR”.

AQ4 Please note that “id” has been changed to “ID” in the text. Please check.

AQ5 Please check: does the “following section” refer to Sec. 5 here?

AQ6 Please check: does the “next section” refer to Sec. 6 here?

AQ7 Please check the presentation of Ref. 3, and please check the edit made to its volume number.

AQ8 Please check the location of the conference in Ref. 4.

AQ9 Please provide city location of the publisher for Refs. 6 and 9.

AQ10 Please check the abbreviation of the journal title in Ref. 12.

Thank you for your assistance.


Augmented Technical Drawings: A Novel Technique for Natural Interactive Visualization of Computer-Aided Design Models

Michele Fiorentino, e-mail: [email protected]
Antonio E. Uva, e-mail: [email protected]
Giuseppe Monno, e-mail: [email protected]
Dipartimento di Ingegneria Meccanica e Gestionale, Politecnico di Bari, 70126 Bari, Italy

Rafael Radkowski
Heinz Nixdorf Institute, University of Paderborn, 33102 Paderborn, Germany, e-mail: [email protected]

Engineering 3D models have become so complex that paper-drawing representation and monitor/mouse navigation in a computer-aided design (CAD) system are difficult and often frustrating. We present a novel interface paradigm for natural interactive visualization. It incorporates augmented-reality-enhanced engineering drawings and natural and tangible interfaces. We embedded ID markers in technical drawings to provide CAD data retrieval and tangible navigation. A user gets access to the CAD navigation functions using the drawing itself as a tangible device. In addition, CAD functions are triggered with gestures recognized by a video camera. We performed a pilot test assessment with industrial engineers. The results demonstrate the feasibility and the effectiveness of these techniques. [DOI: 10.1115/1.4006431]

1 Introduction

Nowadays, 3D CAD facilitates the representation and visualization of almost every industrial product. Beyond the simple geometrical data, virtual models, denoted by the term digital master (DM), can potentially store the entire engineering knowledge created by engineers and others during the product development process, such as assembly sequences, material properties, costs, simulation data, and other related information.

Although the digital master can theoretically replace paper-based technical drawings, drawings are still the first choice for technical data exchange between different enterprises or departments. As a consequence, industry generates and handles stacks of drawings. They need to be managed and continuously updated in the revision process, generating an unwanted waste of time, cost, and paper.

Industry is starting to use new formats like 3D-PDF [1] or Collada [2] for exchanging technical data, mainly in the form of 3D meshes for geometry visualization. Nevertheless, in general, industry has not implemented the digital master workflow yet.

In our opinion, the aspect that most limits industrial adoption of the paperless approach is the desktop user interface and its interaction techniques. Current CAD interaction metaphors rely on standard graphical user interfaces (GUIs) created for mouse/keyboard PCs, and they are optimized for precise geometry creation, parameterization, constraint definition, etc.

Two limits are encountered when navigating complex and data-rich 3D digital masters: (i) usability and (ii) the combination of real drawings/mock-ups and virtual models.

The interface design of current CAD GUIs uses nonstandard means to navigate the 3D geometry: it requires a sequence of mouse buttons or a combination of keyboard and mouse entries to activate pan, zoom, and rotate. Untrained users (in marketing, production, etc.) need time to learn and to adapt to different CAD systems. Meanwhile, converting a simple 3D positioning into specific CAD commands requires a cognitive overhead. Moreover, designers commonly discuss and annotate paper drafts, and also handle and observe real prototypes or defective components from different points of view. The second main limitation of current interfaces is the difficulty of combining virtual designs and real objects in one effective workspace.

To overcome these problems, we propose a novel concept for accessing 3D models, called augmented technical drawings (ATD) [3,4]. The ATD incorporates augmented reality (AR), i.e., technology that enhances the user's real-world perception with interactive and spatially referenced computer-generated graphics or data, together with tangible interaction techniques, to provide an easy-to-use interface for DM navigation.

Tangible interaction and tangible user interfaces (TUIs) make it possible to interact with digital information through the physical environment [5]. TUIs have emerged as an alternative paradigm to conventional GUIs, allowing users to manipulate objects in virtual space using real, thus "tangible," objects.

We used an AR human-computer interface to augment paper drawings with superimposed DM information. AR can contextually present highly interactive technical information (CAD models, text, charts) and register (i.e., colocate) it with the drawings. In addition, AR can also augment a physical manufactured object such as a prototype or partially manufactured parts.

The main goal of this research work is to enhance the ATD concept by using natural user interfaces. A user interface is described as natural when it is designed to "reuse existing skills for interacting directly with content" [6].

Figure 1 shows the setup of the AR workspace: the operator wears a head mounted display (HMD), a goggle-like device that superimposes computer-generated content on the user's visual perception of the physical paper drawing. The user accesses an enriched digital model of the product simply by showing the related drawing to a video camera. The digital content appears on the virtual display, colocated with the drawing.

The combination of hand gesture-based interactions and tangible interfaces facilitates effective and easy-to-use manipulation. In fact, an engineer or an operator can interact with the technical drawings in a natural way, as she/he is used to working with them.

Fig. 1 Natural and tangible interfaces using the augmented technical drawings

Contributed by the Computers and Information Division of ASME for publication in the JOURNAL OF COMPUTING AND INFORMATION SCIENCE IN ENGINEERING. Manuscript received October 31, 2011; final manuscript received March 13, 2012; published online xx xx, xxxx. Assoc. Editor: Shuming Gao.


Note (mikeflower): ok
Note (mikeflower): 70126, Bari, Italy
Note (mikeflower): 33102, Paderborn, Germany
Note (mikeflower): Augmented reality is, according to the author, a well-known concept. I tried to reduce AR to one line: "Augmented reality (AR): technology to enhance the user's real-world perception with interactive and spatially referenced computer-generated graphics or data."

In addition, our AR-enhanced drawings can display digital technical content with the following advantages: (i) the gesture- and tangible-based interface can trigger CAD functions; (ii) the DM is colocated on the drawing and the data are up to date; and (iii) deployment in industry is low cost.

In Sec. 2, we present the related work. Section 3 introduces the novel concept of the natural and tangible augmented technical drawings. Section 4 describes the technical system necessary to apply this concept. Section 5 explains our interaction techniques in detail. Finally, we present a case study to demonstrate the implementation and draw the conclusions.

2 Related Work

Previous research has presented a variety of industrial pilot studies of AR, as reported in the ARVIKA project [7]. They have demonstrated that human factors in interface design and software and hardware standards are two of the major issues for the application of AR in industry. Owing to their novelty, interfaces using augmented reality for DM navigation are not supported by current standardization.

The S1000D [8], a standard on technical information developed by the aerospace industries for military aircraft maintenance, defines two types of documentation: fixed page presentation on paper or screen, and interactive electronic technical publications on screen. It supports desktop-based GUIs (mouse, keyboard, touch screen, etc.) and defines in detail screen layout, title/menu bars, dialog boxes, etc. The S1000D is rather limited in the definition of 3D CAD model navigation: it restricts 3D visualization to 2D static or animated graphical figures (perspective and orthographic), fly-through (3D model navigation), and helper applications (e.g., Adobe Acrobat or Arbortext IsoView). In particular, S1000D does not support tangible user interfaces.

The first generation of TUIs dates back to the early 1980s, when Frazer explored different approaches to parallel physical and digital interactions with his 3D data input devices [9].

The ARIEL system by Mackay et al. [10] is one of the first examples in the literature of a TUI applied to technical augmented reality environments. ARIEL relies on a graphics tablet for capturing user gestures and on a projector to overlay 2D multimedia annotations on real technical drawings. Their experiments demonstrate the benefits of a tangible medium (i.e., the paper drawings) when superimposed with digital content.

Terry et al. [11] present JUMP, a tangible interface for augmented paper-based architectural documents. They present a novel set of tangible tools for navigation and interaction using 2D augmented technical drawings. In JUMP, so-called filter tokens are placed on the paper to modify the visualization of electrical, mechanical, and structural information. They have also developed a physical rectangular selection tool: the selection is conducted by framing a section of the drawing with physical brackets.

Tory et al. [12] have shown by user tests that gesturing, navigation, annotation, and viewing are the four primary interactions with design artifacts in technical meetings. They encountered bottlenecks in the collaborative design process when meeting participants attempted to navigate digital information, interact with wall displays, and access information individually and as a group.

Wang and Dunston [13] have developed an AR application that superimposes technical drawings with 3D models of mechanical and electrical parts. They used HMDs to allow a collaborative review of those parts and found that HMDs and this kind of visualization facilitate a design review. Nevertheless, they did not focus on interaction.

In our previous work [3], we presented an AR application that enhances paper technical drawings with the digital master. We performed tracking with optical markers embedded in the drawing template using the ARToolkit [14]. The ARToolkit determines the pose of the drawing using a video camera that captures the physical workspace enriched with physical markers, and it superimposes digital contents onto the paper drawing (e.g., a 3D CAD model or an updated 2D drawing). We used the paper drawing as a tangible interface to navigate and to access the digital master data, and we demonstrated the advantages of AR-supported drawings for engineering applications. The major constraint is the visibility of the markers, which need to be in the line of sight of the camera. We observed frequent unintended marker occlusion that caused latency or freezing in tracking. These effects reduce the interactive experience and the effectiveness of this interface. In addition, the limited variety of commands available with TUIs was also a problem of the interface: common functions like pan/zoom and rotate can be mapped using multiple markers, but the combinations are quite limited. Therefore, our previous solution required a hybrid interface based on a flexible mixture of tangible items and GUI operations.

In summary, most research in the literature consists of proof-of-concept prototypes for user interface design. For this reason, commercial engineering software producers place increasing emphasis on user-friendly interfaces to deal with growing data complexity (e.g., geometries, components, information). Dassault Systèmes, for example, with its latest engineering commercial products, lets users browse, zoom, select, and inspect the product definition intuitively using virtual 3DLive turntables [15]. The 3DLive turntable is an innovative interface specifically developed for 3D model understanding and tangible rotation using a touch screen.

This research contributes to ATD user interface design by introducing natural interfaces as a novelty. The term "natural interface" denotes a specific class of human-computer interfaces that are "transparent" (i.e., not perceived) to the user: the user can operate the system without being aware of the technology behind it. This kind of interaction needs to be highly intuitive: the user should not have to think about gestures, buttons, etc. Furthermore, no technical devices can be attached to the user's hands. As with real objects, the user can grasp a virtual object with his fingers, pick it up, and place it at any desired place. Several approaches exist.

Buchmann et al. [16] introduced one of the first systems. Their system, called FingARtips, is a gesture-based system for the direct manipulation of virtual objects. They used fiducial markers on each finger to track the gestures of a person. The markers are tracked, and a computer-internal model is calculated; this model allows picking up virtual objects.

Reifinger et al. [17] introduced a similar system. The authors utilized an infrared tracking system for hand and gesture recognition. Infrared markers are applied to the fingers of a user, the tracking system detects the markers on the fingers, and a computer-internal hand model is built. Thus, the user is able to grasp virtual objects like real objects.

Compared with this existing work, our solution aims to improve access to digital master data by extending the ATD concept with natural interaction.

3 Our Idea

The augmented technical drawing concept is depicted in Fig. 2. The user sees a paper drawing superimposed by a 3D model of the digital master. The significant aspects of this idea are the paper drawing and the markers printed on the paper drawing template. Each marker has a unique ID, and this ID is associated with the 3D model. To obtain access to this model, the user has to move the paper drawing into the field of view of a video camera. The system detects the marker in the image and determines the ID. Thus, the AR application can load the requested data (e.g., the 3D model) and show it on top of the paper drawing.

In everyday engineering work, the engineer deals with different paper drawings. To retrieve the requested information from the entire digital master, the user chooses the related paper drawing and shows it to the video camera.


Note (mikeflower): ok

We decided to use paper drawings for three main reasons. First, drawings are currently used in many engineering stages. Second, technical users are familiar with handling, storing, and sorting sheets of paper. Third, the simplicity and the low-cost deployment allow a wide usage of this technique.

The continuous product redesign required by global competitiveness causes rapid paper-drawing obsolescence. Our approach can considerably reduce the need to reprint obsolete drawings: drawings serve just as a reference for tracking and as a tangible interface, while all revisions remain digitally available through AR. To work with these digital contents efficiently, navigation and inspection functions for design review are necessary. For instance, navigation functions are orbit, pan/tilt, or layer control. In addition, inspection functions, like assembly explosion or section planes, are used to examine more features of the models. We propose to control these functions with a combination of two types of interaction techniques: tangible interfaces and natural gesture-based interaction.
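To make the marker-to-model association concrete, the following is a minimal C++ sketch of the lookup step; the record fields and function names are illustrative assumptions, not the paper's actual data schema.

#include <map>
#include <string>

// Hypothetical digital-master record, keyed by the marker ID printed on a drawing.
struct DigitalMasterRecord {
    std::string modelPath;    // path to the revision-current 3D model
    std::string drawingName;  // the paper drawing this marker belongs to
};

// Resolve a detected marker ID to its digital-master record;
// an unknown marker simply augments nothing.
const DigitalMasterRecord* lookupDigitalMaster(
    const std::map<int, DigitalMasterRecord>& database, int markerId)
{
    auto it = database.find(markerId);
    return (it == database.end()) ? nullptr : &it->second;
}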

4 Setup and Implementation

We implemented the proposed concept in the system depicted in Fig. 3. Two input devices are used: a Creative Live! Cam Video IM Ultra webcam (1280 × 960 pixels at 30 fps) and the Microsoft Kinect [18]. In our application scenario, the webcam is used for drawing detection and tracking, while the Kinect is used for user tracking and gesture recognition. The Creative Live! Cam is fixed on the HMD of the user. The Kinect and the user are located face-to-face, and the Kinect location is fixed. The Kinect integrates two video camera sensors: one sensor captures red-green-blue (RGB) color images with a resolution of 640 × 480 pixels; the second sensor provides 12-bit depth images. The output device is a video see-through HMD: we use a Canon VH-2002 with a resolution of 640 × 480 pixels and two embedded video cameras.

The software system incorporates two components: a visualization component and an interaction component. The objective of the visualization component is to combine and register the 3D models with the real drawings. For this purpose, two modules are utilized: tracking and renderer. The ARToolkit marker-based tracking library [14] is used as the tracking module. It searches for visual markers in the webcam image and recognizes each marker's ID and spatial position. The second module, the renderer, is based on the open source scene graph library OpenSceneGraph [19]. The visualization of 3D models is managed as follows: if a marker is found in the scene, its unique ID is used to query the engineering database for the related DM. The renderer generates an image of the 3D model spatially aligned to the ATD using the tracking data (i.e., the position and orientation of the marker in the camera reference frame).

The interaction component provides user skeleton tracking and hand gesture recognition. We use the OpenNI (Open Natural Interaction) framework [20] to retrieve images from the Kinect and to calculate the user's skeleton. The OpenNI framework incorporates a set of application programming interfaces that provide a middleware component for accessing technical devices like the Kinect. From the skeleton data, we retrieve the 3D hand position.

The hand gesture recognition is based on an open source computer vision library [21]. It provides functions for image manipulation, object segmentation, recognition, and classification. Our hand gesture recognition system detects five gestures: fist, open hand, closed hand, index finger, and a waving gesture. The starting point of the hand gesture recognition is the RGB color image from the Kinect camera. The system detects the hand on the basis of skin color using a trained segmentation algorithm. The algorithm uses a trained histogram of the color distribution, which is calculated in an off-line training step. To identify the hand by segmentation, the system computes the probability of each pixel belonging to the trained skin color. The output of this step is the shape of the hand in the retrieved image. This shape is classified using the Hu set of invariant moments, which are invariant to translation, rotation, and scale, making the system robust. To identify the hand gesture, we compare the estimated Hu set of moments with a predefined set using a support vector machine, a nonprobabilistic binary linear classifier. The predefined Hu set of moments of each gesture is calculated from 200 images, and the average values are stored as a template. Further details of the implementation of the interaction component can be found in Ref. [22].

The interaction component and the visualization component communicate via network messages; the visualization component triggers a function according to the content of the message. The application runs on a Windows 7 PC with an Intel Xeon 3.6 GHz processor, 6 GB RAM, and an NVIDIA Quadro 5500 GPU.

In Sec. 5, we describe the interaction techniques for the digital master.
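The recognition pipeline just described (skin-color back-projection, largest-contour extraction, Hu moments, SVM classification) can be sketched with OpenCV [21] roughly as follows. This is an illustrative reconstruction, not the authors' code: the thresholds, histogram channels, and function names are assumptions, and the log-scaling of the Hu moments is a common practice that the paper does not state.

#include <opencv2/opencv.hpp>
#include <opencv2/ml.hpp>
#include <algorithm>
#include <cmath>
#include <vector>

// Segment the hand by back-projecting a skin-colour histogram trained off-line.
cv::Mat segmentHand(const cv::Mat& bgr, const cv::Mat& skinHist) {
    cv::Mat hsv, backproj, mask;
    cv::cvtColor(bgr, hsv, cv::COLOR_BGR2HSV);
    const int channels[] = {0, 1};                  // hue and saturation
    float hue[] = {0, 180}, sat[] = {0, 256};
    const float* ranges[] = {hue, sat};
    cv::calcBackProject(&hsv, 1, channels, skinHist, backproj, ranges);
    cv::threshold(backproj, mask, 50, 255, cv::THRESH_BINARY);  // assumed threshold
    cv::morphologyEx(mask, mask, cv::MORPH_OPEN, cv::Mat());    // remove speckle
    return mask;
}

// Describe the hand silhouette (largest contour) by its seven Hu invariant moments.
bool huFeatures(const cv::Mat& mask, cv::Mat& features) {
    std::vector<std::vector<cv::Point>> contours;
    cv::findContours(mask, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);
    if (contours.empty()) return false;
    auto hand = std::max_element(contours.begin(), contours.end(),
        [](const std::vector<cv::Point>& a, const std::vector<cv::Point>& b) {
            return cv::contourArea(a) < cv::contourArea(b); });
    double hu[7];
    cv::HuMoments(cv::moments(*hand), hu);
    features.create(1, 7, CV_32F);
    for (int i = 0; i < 7; ++i)   // log scale keeps the moments numerically comparable
        features.at<float>(0, i) =
            (float)(-std::copysign(1.0, hu[i]) * std::log10(std::fabs(hu[i]) + 1e-30));
    return true;
}

// Classify the feature vector with a pre-trained SVM
// (labels, e.g., 0 = fist, 1 = open hand, 2 = index finger, ...).
int classifyGesture(const cv::Ptr<cv::ml::SVM>& svm, const cv::Mat& features) {
    return (int)svm->predict(features);
}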

Fig. 2 Visualization of the digital master using AR and the augmented technical drawing

Fig. 3 Overview of the visualization and interaction system


Note (mikeflower): ok

5 Engineering Contents Interaction

In our system, engineering contents can be handled in two modalities. The first is the tangible approach: it consists of grasping and moving the ATDs, mainly on the table. The second is hand gesture-based interaction: it is carried out by hand gestures in space. The user can benefit from both approaches independently or simultaneously. We optimized the interaction for 3D model engineering tasks: data evaluation, design review, and inspection. We provide interaction functions for the following tasks: (i) scene navigation (pan, zoom, orbit), (ii) layer control, (iii) object selection, (iv) operation browsing, (v) assembly explosion, and (vi) 3D interactive sectioning. We did not want to implement a full set of CAD modeling functions, which require very precise pointing, continuous numerical input, and extensive working sessions.

5.1 Tangible Interaction Design. An ATD, tracked by computer vision algorithms, can theoretically provide a 6 DOF input to the user. Nevertheless, our tests have shown that holding the drawing in space with one or two hands affects the stability of the tracking, because the system has to acquire a planar marker. Another limitation due to marker visibility occurs when the drawing normal (and thus the marker normal) is nearly orthogonal to the camera direction. We found that, when the normal of the ATD and the camera viewpoint direction form an angle above 70 deg, the tracking is imprecise or can even be lost.

To address these issues and to use ATDs effectively in an engineering environment, we designed the main tangible interaction with the ATD kept on a horizontal planar surface (table, workbench, etc.). This guarantees stability and precision but reduces the ATD DOF to three: sliding in two directions and rotation.
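The 70 deg visibility limit can be monitored at run time. A minimal sketch follows, assuming the common marker-pose convention in which the marker's Z axis is its outward normal and the camera looks along its own negative Z axis; the function name and the default angle are illustrative.

#include <algorithm>
#include <cmath>

// Reject marker poses whose normal is tilted more than maxDeg away from the
// camera viewing direction (~70 deg was the practical limit in our tests).
// R is the 3x3 marker-to-camera rotation; its third column is the marker
// normal expressed in camera coordinates.
bool poseReliable(const double R[3][3], double maxDeg = 70.0) {
    const double normalZ = R[2][2];  // z component of the marker normal
    // Angle between the normal and the view direction (0, 0, -1).
    const double cosAngle = std::clamp(-normalZ, -1.0, 1.0);
    const double angleDeg = std::acos(cosAngle) * 180.0 / 3.14159265358979;
    return angleDeg <= maxDeg;
}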

5.1.1 ATD Pan. The pan function is a very common operation in CAD: it allows the movement of the 3D model in two directions. Since the scene is referenced to the ATD, the pan function is automatically provided by our system by simply moving the drawing. We tested this interaction and found that it is stable and effective only when the paper lies on a flat surface, such as a table, and the ATD is moved on this surface. When the user lifts the ATD to pan the scene along the vertical axis, the interaction is possible but can be unstable.

5.1.2 ATD Orbit. Another common task in CAD model navigation is the orbit function: it allows the rotation of the 3D model. This form of navigation moves the camera viewpoint around an ideal center in the scene. Like pan, this function is inherently provided by the ATD, because the user can simply rotate the sheet about three axes. Owing to the marker visibility limitations, this metaphor is mostly effective when the ATD is placed on a flat surface, which limits scene orbiting to the vertical axis.

As in most CAD systems, we offer the possibility to move the center of rotation from the standard position (the center of the drawing) to any point in space using the hand gesture interaction described in Sec. 5.2.

An intrinsic limitation of this metaphor is the impossibility of inspecting the bottom of the 3D model. We solve this issue by simply flipping the ATD, an operation we call backside navigation. The drawing is printed on both sides, with a different set of markers on the back of the sheet. The system recognizes these new markers and displays the opposite side of the model by orbiting the camera by 180 deg (see Fig. 4).
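Backside navigation can be expressed compactly in the renderer. A sketch with OpenSceneGraph (on which the renderer is based) follows; the function name and the choice of flip axis are assumptions for illustration.

#include <osg/MatrixTransform>
#include <osg/Math>

// When a back-side marker ID is recognized, show the opposite side of the
// model by flipping it 180 deg about the sheet's in-plane X axis before
// applying the tracked marker pose.
void updateModelPose(osg::MatrixTransform* model,
                     const osg::Matrixd& markerPose,
                     bool backSideMarkerDetected)
{
    const osg::Matrixd flip = backSideMarkerDetected
        ? osg::Matrixd::rotate(osg::PI, osg::Vec3d(1.0, 0.0, 0.0))
        : osg::Matrixd::identity();
    model->setMatrix(flip * markerPose);  // the model follows the sheet either way
}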

5.1.3 Layer Control. We implemented a layer visualization system with tangible interfaces. Layers facilitate the separation of data into different categories and are useful in CAD visualization for pruning the data of interest. To avoid the use of keyboard buttons, the activation/deactivation of layers is available in two ways: by query chips and by folding patterns. Query chips are disks with a printed marker that triggers a specific data query: dropping or flipping a chip onto the AR working area activates the associated query (Fig. 5, left). Query tabs work similarly but are embedded in the technical drawings: the user can modify the query by simply folding or unfolding the tabs (Fig. 5, right).
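One plausible realization, sketched here under the assumption that each layer is a child of an osg::Switch node and each chip marker ID is bound to one layer (the binding structure is hypothetical):

#include <osg/Switch>
#include <map>
#include <set>
#include <utility>

// A chip visible in the camera frame enables its layer; hiding or flipping
// the chip (so its marker is no longer detected) disables it.
void applyQueryChips(
    const std::set<int>& visibleMarkerIds,
    const std::map<int, std::pair<osg::Switch*, unsigned int>>& chipToLayer)
{
    for (const auto& entry : chipToLayer) {
        osg::Switch* layers = entry.second.first;
        const unsigned int child = entry.second.second;
        layers->setValue(child, visibleMarkerIds.count(entry.first) > 0);
    }
}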

5.2 Hand Gesture-Based Interfaces. The hand gesture-based interaction provides a natural way to interact with engineering 3D models. To support the interaction, we visualize a proxy element (a semitransparent sphere) as a 3D cursor for each hand.

Fig. 4 Back side navigation by flipping the drawing

Fig. 5 Layer chips (left) and folding patterns (right) are utilized as tangible interface for data retrieval


Note (mikeflower): ATTENTION: this refers to Sec. 5.2, not Sec. 6.

This virtual element indicates the 3D position of the hand in the working volume. In addition, it operates as a pointing device for natural interaction.

5.2.1 Natural Selection. The natural interface selection mimics real touching via collision detection between the 3D cursor and the selectable objects (Fig. 6). The selected 3D models are highlighted for visual feedback. This function is useful for CAD model inspection, retrieving information on a distinct assembly, subassembly, or component of the DM.
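The touch test itself can be as simple as a sphere-sphere intersection. A sketch, assuming OpenSceneGraph bounding spheres as the selectable volumes (the function name and cursor radius are illustrative):

#include <osg/Node>

// "Touch" selection: the hand cursor (a sphere at the tracked hand position)
// selects a part when it intersects the part's bounding sphere.
bool cursorTouches(const osg::Vec3f& handPosition, float cursorRadius,
                   const osg::Node* part)
{
    const osg::BoundingSphere& bs = part->getBound();
    const float distance = (bs.center() - handPosition).length();
    return distance <= cursorRadius + bs.radius();
}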

5.2.2 Natural Pan. This function facilitates the movement of the point of view. It is realized by a grasping metaphor when no objects are selected. If the user clenches a fist, the whole scene is "grasped" and translated in 3D space (see Fig. 7). The user opens the hand to release the scene.
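The clutch logic amounts to a small state machine driven by the classified gesture and the tracked hand position; the following sketch is an assumed formulation, not the paper's code.

#include <osg/Vec3f>

// While the fist is held, the scene offset accumulates the frame-to-frame
// hand displacement; opening the hand releases ("unclutches") the scene.
struct ClutchPan {
    bool grabbing = false;
    osg::Vec3f lastHand;
    osg::Vec3f sceneOffset;

    void update(bool fistDetected, const osg::Vec3f& hand) {
        if (fistDetected && grabbing)
            sceneOffset += hand - lastHand;  // drag the whole scene with the hand
        grabbing = fistDetected;
        lastHand = hand;
    }
};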

5.2.3 Natural Orbit. The natural orbit function facilitates the rotation of the point of view. The scene orbit function is implemented as a two-hand interaction: it is activated by two clenched-fist gestures. The user virtually grasps the scene at two points and rotates it in 3D space using both hands (see Fig. 8). To release the model, the user opens the hands.
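One way to derive the rotation, sketched here as an assumption about the underlying math: per frame, rotate the scene by the quaternion that maps the previous inter-hand vector onto the current one, so the scene follows the imaginary bar held between the two fists.

#include <osg/Quat>
#include <osg/Vec3f>

// Incremental scene rotation from the two tracked hand positions.
osg::Quat orbitDelta(const osg::Vec3f& leftPrev, const osg::Vec3f& rightPrev,
                     const osg::Vec3f& leftCurr, const osg::Vec3f& rightCurr)
{
    osg::Quat q;
    q.makeRotate(rightPrev - leftPrev, rightCurr - leftCurr);
    return q;
}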

5.2.4 Natural Zoom. The zoom function increases or decreases the size of the visual appearance of the scene. It is implemented as a two-hand interaction: performing a clenched fist and an open hand activates this function. To change the zoom value, the user increases or decreases the distance between the hands, visualized as 3D cursors (Fig. 9). Zooming ends when the user opens both hands.
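A sketch of the corresponding scale update, under the assumption that the scale follows the ratio between the current and the initial inter-hand distance:

#include <osg/Vec3f>
#include <algorithm>

// Two-hand zoom: the scene scale follows the ratio between the current and
// the initial distance of the two hand cursors; opening both hands freezes it.
struct TwoHandZoom {
    bool active = false;
    float startDistance = 1.0f;
    float baseScale = 1.0f;
    float scale = 1.0f;

    void update(bool fistPlusOpenHand, const osg::Vec3f& left, const osg::Vec3f& right) {
        const float d = std::max((left - right).length(), 1e-3f);  // avoid divide-by-zero
        if (fistPlusOpenHand && !active) {  // gesture just started: remember references
            startDistance = d;
            baseScale = scale;
        }
        if (fistPlusOpenHand)
            scale = baseScale * d / startDistance;
        active = fistPlusOpenHand;
    }
};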

5.2.5 Natural Assembly Explosion. The assembly explosion function visually separates the distinct components of a DM in space to provide an interactive exploded view of the assembly. It is one of the most important functions in technical design review and assembly inspection.

Fig. 6 Natural selection of a 3D model

Fig. 7 Example of clutch pan: fist gesture activates 3D camera pan

Fig. 8 Example of clutch orbit: two fist gestures activate camera orbit

Fig. 9 One fist and one open hand gesture activates the zoom

Fig. 10 The shaking gesture activates the CAD assembly explosion



The user has to navigate the CAD object structure (i.e., assemblies, subassemblies, and parts). To activate this function, the user selects the (sub)assembly of interest and shakes his/her hand (Fig. 10). To reassemble the exploded view, the user waves his/her hand. The exploded model can be used for further nested navigation: because the parts are spatially distributed, the user can easily select and inspect specific parts.
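A simple radial explosion is one way to realize the separation; the sketch below is an assumed formulation (the paper does not specify the explosion geometry): every part is pushed away from the assembly centroid proportionally to an explosion factor, and animating the factor from 0 to 1 yields the animated explosion.

#include <osg/MatrixTransform>
#include <vector>

// Radial exploded view: factor = 0 means assembled; larger factors push each
// part further along its direction from the assembly centroid.
void explodeAssembly(const std::vector<osg::MatrixTransform*>& parts,
                     const std::vector<osg::Vec3f>& restPositions,
                     float factor)
{
    osg::Vec3f centroid(0.0f, 0.0f, 0.0f);
    for (const osg::Vec3f& p : restPositions)
        centroid += p;
    centroid /= static_cast<float>(restPositions.size());

    for (size_t i = 0; i < parts.size(); ++i) {
        const osg::Vec3f offset = (restPositions[i] - centroid) * factor;
        parts[i]->setMatrix(osg::Matrixd::translate(restPositions[i] + offset));
    }
}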

6 Application Case

The presented DM interface is a novel approach compared with industrial practice. It was necessary to investigate whether the presented interaction techniques facilitate the review and the inspection of a complex 3D model. In this section, we describe an application case that we used to assess, first, the feasibility of the interface and, second, to collect qualitative feedback from the users.

Figure 11 shows the digital model of the application case: a foldable electrically assisted bike designed by an engineering student workgroup. This 3D model consists of five subassemblies and more than 200 parts in total. Figure 11(a) shows a subassembly of this bike; Fig. 11(b) shows the paper drawing used as a tangible interface on the working table. Figure 12 shows the user's view of the paper drawing through the HMD: the 3D model is superimposed on the augmented technical drawing.

Figure 13 shows the navigation tasks, in particular the pan and orbit operations. The user can move and rotate the paper template to survey the 3D model from different points of view. In addition, the user can zoom the model to inspect details.

Figure 14 shows the selection and translation of a subassembly using the hand gesture-based interaction. This interaction is used to modify one or more parts, subassemblies, and assemblies of the 3D model. The user selects the object(s) using the 3D cursor.

Fig. 11 (a) The example case 3D subassembly and (b) its augmented technical drawing

Fig. 12 The augmented technical drawing as seen in the HMD

Fig. 13 Pan and orbit operations facilitate the inspection from different points of view



After clenching the fist, the user is able to translate the object by moving his/her hand. Thus, the user can disassemble and assemble the virtual technical system step by step.

In addition to the manual handling of parts, the user can trigger an automatic subassembly explosion. After selecting a (sub)assembly using the 3D cursor, the user shakes her/his hand quickly (Fig. 15, left).

Fig. 14 The user can select single subassemblies and parts and move them individually

Fig. 15 Assembly explosion function

Fig. 16 The user zooms a 3D model by changing the distance between his/her hands



The system shows an animated assembly explosion of the selected subassembly, while the other high-level subassemblies disappear (Fig. 15, right).

We carried out a pilot test with 14 students of mechanical engineering and four industrial engineers. All the recruited subjects (five female and 13 male) are medium-to-expert CAD users. The subjects were asked to test the interface by performing a design review of the electric bike assembly, thereby evaluating each of the implemented interaction techniques: ATD pan/orbit/layer, natural selection, natural manipulation (pan/orbit/zoom), and assembly explosion. After 20 min of usage, each subject was asked to fill in a questionnaire. Our qualitative survey revealed no critical issues during selection and navigation using the 3D cursor. After a few minutes of training, all users were able to select and to manipulate single subassemblies and parts. This short learning phase is the direct result of an interface that mimics real-object behavior. The survey also confirmed user acceptance of the interface in terms of comfort. One limitation observed during the experiments concerns the precision of the selection: in some cases the 3D cursor exceeded the size of a subassembly. This problem was easily solved using the scene zooming function, as depicted in Fig. 16.

In summary, we can assert that tangible interfaces and hand gesture-based interaction work together symbiotically to improve access to DM data.

7 Conclusions

This paper introduces a novel interaction paradigm that incorporates engineering digital models, augmented technical drawings, and a combination of natural and tangible interfaces. All interaction techniques have been designed to emulate the manipulation of real objects.

The main added value of our research is a demonstration of the feasibility of the integrated interface in the specific field of application. We address the following functions of CAD model navigation: (i) scene navigation (pan, zoom, orbit), (ii) layer control, (iii) object selection, (iv) operation browsing, (v) assembly explosion, and (vi) 3D interactive sectioning.

An explorative test has been carried out. The tangible approach using the augmented technical drawing demonstrates its feasibility for 2D translation and 2D rotation, while hand gesture-based interactions demonstrate their feasibility for 3D activities and manipulation along the vertical direction. Thus, the hybrid interface demonstrates its feasibility. Moreover, the natural user interface eases the selection of the best possible interaction technique for DM comprehension during design review, maintenance, and assembly tasks, and the hand gesture-based techniques proved to be effective and well received by the users. This result supports the inevitable transition in engineering practice from the current usage of paper drawings to the upcoming fully paperless digital master. In future work, we will focus on two main aspects. First, we will exploit more complex gestures, also involving interaction with real objects. Second, user tests will be carried out to statistically prove the effectiveness of the proposed interaction techniques.

References

[1] Adobe 3D PDF technology, http://www.adobe.com/manufacturing/solutions/3d_solutions/, accessed December 3, 2012.
[2] COLLADA file specification, https://collada.org/, accessed December 3, 2012.
[3] Uva, A. E., Cristiano, S., Fiorentino, M., and Monno, G., 2010, "Distributed Design Review Using Tangible Augmented Technical Drawings," Comput.-Aided Des., 42(5), pp. 364–372.
[4] Fiorentino, M., Uva, A. E., and Monno, G., 2011, "Product Manufacturing Information Management in Interactive Augmented Technical Drawings," ASME 2011 World Conference on Innovative Virtual Reality, Milan, Italy.
[5] Ullmer, B., and Ishii, H., 2000, "Emerging Frameworks for Tangible User Interfaces," IBM Syst. J., 39(3–4), pp. 915–931.
[6] Blake, J., 2010, Natural User Interfaces in .NET, Manning Publications Co., ISBN 9781935182818.
[7] Friedrich, W., 2002, "ARVIKA-Augmented Reality for Development, Production and Service," Proceedings of the 1st International Symposium on Mixed and Augmented Reality (ISMAR '02), Darmstadt, Germany, Sept. 30–Oct. 1, pp. 3–4.
[8] S1000D International Specification for Technical Publications, http://www.s1000d.org/, accessed December 3, 2012.
[9] Frazer, J., 1995, An Evolutionary Architecture (Themes VII), Architectural Association Publications, London.
[10] Mackay, W. E., Pagani, D. S., Faber, L., Inwood, B., Launiainen, P., Brenta, L., and Pouzol, V., 1995, "Ariel: Augmenting Paper Engineering Drawings," Conference Companion on Human Factors in Computing Systems (CHI '95), ACM, New York.
[11] Terry, M., Cheung, J., Lee, J., Park, T., and Williams, N., 2007, "Jump: A System for Interactive, Tangible Queries of Paper," Proceedings of Graphics Interface 2007 (GI '07), ACM, New York, Vol. 234, pp. 127–134.
[12] Tory, M., Staub-French, S., Po, B. A., and Wu, F., 2008, "Physical and Digital Artifact-Mediated Coordination in Building Design," Comput. Supported Coop. Work, 17(4), pp. 311–351.
[13] Wang, X., and Dunston, P. S., 2005, "System Evaluation of a Mixed Reality-Based Collaborative Prototype for Mechanical Design Review Collaboration," Proceedings of the 2005 ASCE International Conference on Computing in Civil Engineering, L. Soibelman and F. Pena-Mora, eds., American Society of Civil Engineers (ASCE), Cancun, Mexico, July 12–15.
[14] Kato, H., and Billinghurst, M., 1999, "Marker Tracking and HMD Calibration for a Video-Based Augmented Reality Conferencing System," 2nd International Workshop on Augmented Reality (IWAR '99), San Francisco, CA, pp. 85–94.
[15] Dassault Systèmes 3DLive, http://www.3ds.com/products/3dlive/, accessed December 3, 2012.
[16] Buchmann, V., Violich, S., Billinghurst, M., and Cockburn, A., 2004, "FingARtips: Gesture Based Direct Manipulation in Augmented Reality," Proceedings of the 2nd International Conference on Computer Graphics and Interactive Techniques (GRAPHITE '04), Singapore, June 15–18, ACM, pp. 212–221.
[17] Reifinger, S., Wallhoff, F., Ablassmeier, M., Poitschke, T., and Rigoll, G., 2007, "Static and Dynamic Hand-Gesture Recognition for Augmented Reality Applications," HCI: Intelligent Multimodal Interaction Environments, J. A. Jacko, ed., Springer, Berlin/Heidelberg, pp. 728–737.
[18] Xbox Kinect, http://www.xbox.com/kinect/, accessed December 3, 2012.
[19] OpenSceneGraph open source project, http://www.openscenegraph.org/projects/osg, accessed December 3, 2012.
[20] OpenNI (Open Natural Interaction), http://www.openni.org.
[21] OpenCV open source computer vision library, http://opencv.willowgarage.com/, accessed December 3, 2012.
[22] Radkowski, R., and Stritzke, C., 2011, "Comparison Between 2D and 3D Hand Gesture Interaction for Augmented Reality Applications," Proceedings of the ASME 2011 International Design Engineering Technical Conferences & Computers and Information in Engineering Conference (IDETC/CIE 2011), Washington, DC, Aug. 28–31.


Note (mikeflower): OK
Note (mikeflower): please remove New York: the conference was in Milan :)
Note (mikeflower): official citation taken from ScienceDirect. Feel free to modify according to journal conventions. Thanks.