Page 1
{jpsml, gfg, gds, jmxnt, ebmx, vt, jk} @cin.ufpe.br
ARCam: an FPGA-based Augmented Reality Framework
Petrópolis, May 2007
SVR 2007
João Paulo Lima, Germano Guimarães,
Guilherme Silva, João Marcelo Teixeira,
Emanoel Xavier, Veronica Teichrieb, Judith Kelner
Page 2
Motivation
• Many AR tracking techniques require image processing procedures
• Usually done by software
– General purpose processing
– Operating system overhead
– Impact on frame rate and image resolution
– Increase in clock frequency and power consumption
Page 3
Motivation
• Solution: embedded image processing
– Dedicated hardware
– Better performance
– Real parallelism
– Low power consumption
– High resolution images
Page 4
Goals
• ARCam: framework for the development of embedded AR systems
• Library of common AR functions
• Development model based on components
Page 5
Related work
• No flexible embedded solution for AR applications was found
• Features of existing solutions
– Rely on hybrid hardware-software approaches
– Dedicated to specific applications
• ARCam contribution
– Entirely hardware based
– General component based framework for developing embedded AR applications
Page 6
ARCam development environment
Digital image sensor
Altera Stratix II FPGA
Page 7
Framework architecture
ARCam architecture
Page 8
Framework architecture
ARToolKit Pipeline
Search for markers → Identify markers → Find marker 3D position and orientation → Position and orient objects → Render 3D objects in video frame
Page 9
Framework architecture
ARCam architecture
Search for markers → Identify markers → Find marker 3D position and orientation → Position and orient objects → Render 3D objects in video frame
Page 10
Framework architecture
ARCam architecture
Page 11
Implemented components
• Image binarization and gray scaling
Labeling
Generic convolution
Mean filter
Edge detection
Centroid estimation
Quad detection
Hardwire
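The binarization and gray scaling components are implemented in hardware on the FPGA; as a rough illustration only, the per-pixel arithmetic can be sketched as a software reference model. The RGB weights and the threshold value below are common defaults, not values taken from ARCam:

```python
# Software reference model of gray scaling and binarization.
# Integer-only arithmetic, as would suit a hardware datapath.

def to_gray(r, g, b):
    """Convert an RGB pixel to gray using integer luminance weights
    (0.299/0.587/0.114, scaled by 1000 to stay in integers)."""
    return (299 * r + 587 * g + 114 * b) // 1000

def binarize(gray, threshold=128):
    """Map a gray value to 0 (dark) or 255 (bright).
    The threshold of 128 is an assumed default."""
    return 255 if gray >= threshold else 0
```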
Page 12
Implemented components
Image binarization and gray scaling
• Labeling
Generic convolution
Mean filter
Edge detection
Centroid estimation
Quad detection
Hardwire
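The slides do not detail the labeling algorithm; a classic two-pass connected-component labeling with an equivalence (union-find) table, sketched below as a software model under the assumption of 4-connectivity, illustrates what such a component computes:

```python
def label_image(binary):
    """Two-pass connected-component labeling (4-connectivity) of a
    binary image given as a list of rows of 0/1 values."""
    h, w = len(binary), len(binary[0])
    labels = [[0] * w for _ in range(h)]
    parent = [0]                          # union-find table; index 0 unused

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    next_label = 1
    # Pass 1: provisional labels from the already-seen up/left neighbours.
    for y in range(h):
        for x in range(w):
            if not binary[y][x]:
                continue
            up = labels[y - 1][x] if y > 0 else 0
            left = labels[y][x - 1] if x > 0 else 0
            if up == 0 and left == 0:
                parent.append(next_label)
                labels[y][x] = next_label
                next_label += 1
            elif up and left:
                ru, rl = find(up), find(left)
                labels[y][x] = min(ru, rl)
                parent[max(ru, rl)] = min(ru, rl)  # record equivalence
            else:
                labels[y][x] = up or left
    # Pass 2: replace provisional labels by their representatives.
    for y in range(h):
        for x in range(w):
            if labels[y][x]:
                labels[y][x] = find(labels[y][x])
    return labels
```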
Page 13
Implemented components
Image binarization and gray scaling
Labeling
• Generic convolution
Mean filter
Edge detection
Centroid estimation
Quad detection
Hardwire
N11 N12 N13     C11 C12 C13
N21 Pin N23     C21 C22 C23
N31 N32 N33     C31 C32 C33

Pout = C11×N11 + C12×N12 + C13×N13 +
       C21×N21 + C22×Pin + C23×N23 +
       C31×N31 + C32×N32 + C33×N33
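The generic convolution component computes, for each pixel, the weighted sum of its 3×3 neighborhood N and the coefficient matrix C shown above. A minimal software model of that arithmetic (the border policy below, skipping edge pixels, is an assumption; the slide does not state how the hardware handles borders):

```python
def convolve3x3(image, C):
    """Apply the generic 3x3 convolution from the slide:
    each interior output pixel is sum(C[i][j] * neighborhood[i][j])."""
    h, w = len(image), len(image[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            acc = 0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    acc += C[dy + 1][dx + 1] * image[y + dy][x + dx]
            out[y][x] = acc
    return out
```

With the identity kernel (C22 = 1, all other coefficients 0) the output equals the input pixel Pin, which is a quick sanity check of the formula.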
Page 14
Implemented components
Image binarization and gray scaling
Labeling
Generic convolution
• Mean filter
Edge detection
Centroid estimation
Quad detection
Hardwire
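The mean filter is the special case of the 3×3 convolution in which all coefficients are equal, followed by division by 9. A self-contained sketch (integer division and a copy-through border are assumptions about the hardware behavior):

```python
def mean3x3(image):
    """3x3 mean (averaging) filter; borders are passed through unchanged."""
    h, w = len(image), len(image[0])
    out = [[image[y][x] for x in range(w)] for y in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            s = sum(image[y + dy][x + dx]
                    for dy in (-1, 0, 1) for dx in (-1, 0, 1))
            out[y][x] = s // 9
    return out
```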
Page 15
Implemented components
Image binarization and gray scaling
Labeling
Generic convolution
Mean filter
• Edge detection
Centroid estimation
Quad detection
Hardwire
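The slide does not name the edge operator used; a Sobel gradient with the magnitude approximated as |Gx| + |Gy| (which avoids a square root, a common hardware-friendly choice) is shown here purely as an assumed example:

```python
# Assumed Sobel kernels; the actual ARCam operator is not specified.
SOBEL_X = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
SOBEL_Y = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]

def sobel(image):
    """Gradient magnitude |Gx| + |Gy| at each interior pixel."""
    h, w = len(image), len(image[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = gy = 0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    p = image[y + dy][x + dx]
                    gx += SOBEL_X[dy + 1][dx + 1] * p
                    gy += SOBEL_Y[dy + 1][dx + 1] * p
            out[y][x] = abs(gx) + abs(gy)
    return out
```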
Page 16
Implemented components
Image binarization and gray scaling
Labeling
Generic convolution
Mean filter
Edge detection
• Centroid estimation
Quad detection
Hardwire
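Centroid estimation reduces a labeled region to the mean of its pixel coordinates, which in hardware maps naturally to two coordinate accumulators plus a pixel counter per region. A software sketch of that computation (integer division is an assumption about the hardware's rounding):

```python
def centroid(labels, target):
    """Centroid (x, y) of all pixels carrying the given region label."""
    sx = sy = n = 0
    for y, row in enumerate(labels):
        for x, lab in enumerate(row):
            if lab == target:
                sx += x
                sy += y
                n += 1
    return (sx // n, sy // n)
```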
Page 17
Implemented components
Image binarization and gray scaling
Labeling
Generic convolution
Mean filter
Edge detection
Centroid estimation
• Quad detection
Hardwire
Page 18
Implemented components
Image binarization and gray scaling
Labeling
Generic convolution
Mean filter
Edge detection
Centroid estimation
Quad detection
• Hardwire
Page 19
Performance analysis
Process         sw/hw ratio (100 MHz)
Binarization            18.89
Gray Scale              16.10
3x3 Filter          30,428.57
Mean Filter          1,751.15
Edge Detection       1,951.06
Labeling             3,623.64
Centroid                 5.26
Quad Detection          27.63
Page 20
Results
• Two case studies
– Pong
  • Prototype AR application rapidly created
  • No need to worry about modularization
– Object recognition
  • Makes use of the componentized design model
Page 22
Results :: Object recognition
Dataflow: Centroid estimation → (x, y) → Hardwire
Page 23
Lessons learned
• Software to hardware translation
• Recursion to iteration
• New possible optimizations
Page 24
Conclusions
• An architecture was implemented to support the development of AR embedded solutions
• A pre-existent infrastructure makes the development of hardware based AR applications easier and faster
• Performance obtained from the hardware implementation was shown to be satisfactory
Page 25
Future work
• Performance analysis
• Finish QuadDetector
• Hardwire extension
– Z-buffer
– Textures
• Creation of an authoring tool for hardware based AR applications
• More complex AR case studies
• Different AR approaches
– Markerless AR
• External memory access
Page 26
Virtual Reality and Multimedia Research Group
{jmxnt, ds2, gsm, lhcbc, vt, jk} @cin.ufpe.br
miva: Constructing a Wearable Platform Prototype
Petrópolis, May 2007
SVR 2007
João Marcelo Teixeira, Daliton Silva,
Guilherme Moura, Luiz Henrique Costa,
Veronica Teichrieb, Judith Kelner
Page 27
Introduction
• Virtual and augmented reality applications hosting
• Mobile and autonomous execution
Page 28
Introduction
• High performance hardware requirement
– 3D applications
– Visual quality/response time result in high cost!
• Minimal impact on user's mobility
– Wearable computer
Page 30
miva Prototype
• Intel Pentium-M, 1.4GHz
• 512MB DDR (up to 1024MB)
• Intel 855GME graphics, 16-64MB
• Intel 10/100 Mbit/s Ethernet
Page 31
miva Prototype
• 4 USB 2.0 ports
• Stereo analog output and S/PDIF audio input, microphone and S/PDIF
• COM1/COM2 RS232 serial ports
• 5V DC, 1.5Ah
Page 32
miva Prototype
Dimensions: 36 cm × 31 cm; weight: 3.4 kg; battery life: 4 hours
Page 33
miva Prototype
• Software setup
– Windows XP Embedded
  • Reduced storage space required
  • High performance
Page 34
Software Architecture
Page 35
Software Architecture
• Application layer modules
Page 36
Software Architecture
• Service layer
– Hardware abstraction API
– Can be customized and extended by the user
– Provides a high level abstraction for communication and persistence services
Page 37
Software Architecture
• Support layer modules
Page 38
Potential Applications
mivaDesk
Page 39
Potential Applications
mivaDesk
Page 40
Potential Applications
mivaTherm
Page 41
Conclusions and Future Work
• VR/AR platform
• Easily extensible
• Developed without a previous project design
Page 42
Conclusions and Future Work
• miva platform physical evaluation
– Size must be reduced
– A more usable container will be developed
– Use of more accurate pointing devices (data glove and tracker)