A Sketch Interface for Mobile Robots
Marjorie Skubic, Craig Bailey, George Chronis
Computational Intelligence Research Lab, University of Missouri-Columbia
Outline
• Motivation and context
• Route maps
• The PDA sketch interface
• Experimental study and results
• Conclusions and future work
Spatial Reasoning with Guinness
References
Acknowledgements
Route Maps
• Tversky’s work
– Depictions vs. descriptions
– Extraction of route descriptions
– 1-to-1 correlation
• Michon and Denis
– Landmarks and critical nodes
The Sketch Interface
• Objects
• Labels
• Paths
• Delete
• Start
• Move
• Undo
• Send
Objects
• Closed polygons
• Any shape or size
• Thresholds to determine gap closure
• Feedback on recognition
– Sound
– Color
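The gap-closure test above can be sketched as follows. This is a minimal illustration, not the deck's implementation: the ratio threshold and the point-list representation of a stroke are assumptions.

```python
import math

def is_closed_polygon(points, gap_ratio=0.15):
    """Treat a sketched stroke as a closed polygon if the gap between its
    first and last points is small relative to the stroke's total length.
    The gap_ratio threshold is illustrative, not the value used in the talk."""
    if len(points) < 3:
        return False
    # Total arc length of the stroke.
    length = sum(math.dist(points[i], points[i + 1])
                 for i in range(len(points) - 1))
    # Distance the pen would have to travel to close the shape.
    gap = math.dist(points[0], points[-1])
    return length > 0 and gap / length < gap_ratio

# A roughly square stroke whose endpoints nearly meet:
square = [(0, 0), (10, 0), (10, 10), (0, 10), (0, 1)]
print(is_closed_polygon(square))                        # closed: small gap
print(is_closed_polygon([(0, 0), (10, 0), (10, 10)]))   # open stroke
```

A relative threshold (gap as a fraction of stroke length) scales naturally with object size, which matters on a small PDA screen where objects can be drawn at any scale.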
Labels
• Default numbering for object labels
• Tap on screen to edit
• Can use Palm OS Graffiti recognition or a software keyboard
Paths
• Limit of one path
• A minimum length required
• Color feedback
Path Direction
• Default direction is the direction the path is drawn
• User can specify the direction with a sketched “blob” to denote the start of the path
• Recognized by
– Number of points
– Average distance of all points
– Proximity to path endpoint
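The three blob criteria can be combined in a short heuristic like the one below. The thresholds, the centroid-based spread measure, and the function name are illustrative assumptions; the slide only names the three features.

```python
import math

def is_start_blob(stroke, path, min_points=10, max_spread=5.0,
                  max_endpoint_dist=10.0):
    """Heuristic test for a start-of-path blob (all thresholds illustrative):
    many points, points clustered tightly around their centroid, and the
    cluster lying close to an endpoint of the sketched path."""
    if len(stroke) < min_points:          # criterion 1: number of points
        return False
    cx = sum(x for x, _ in stroke) / len(stroke)
    cy = sum(y for _, y in stroke) / len(stroke)
    # Criterion 2: average distance of all points (here: from the centroid).
    spread = sum(math.dist((cx, cy), p) for p in stroke) / len(stroke)
    if spread > max_spread:
        return False
    # Criterion 3: proximity to a path endpoint.
    return (math.dist((cx, cy), path[0]) <= max_endpoint_dist or
            math.dist((cx, cy), path[-1]) <= max_endpoint_dist)
```

A tight scribble next to the first path point would pass all three tests, while a long stroke or a blob far from either endpoint would not.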
Delete
• An intuitive delete: cross out an object
• Recognized by
– Two consecutive strokes
– Both lengths shorter than a path
– The strokes cross
• Color feedback
• Search for closest object or path
Determining Crossed Marks
• Use the slope equations of lines
• The endpoints of the strokes determine the lines
• A pair of decision parameters can be computed
• If both parameters lie between 0 and 1, then the two strokes must have an intersection

For strokes with endpoints (X1,Y1)-(X2,Y2) and (X3,Y3)-(X4,Y4):

ua = [(X4-X3)(Y1-Y3) - (Y4-Y3)(X1-X3)] / [(Y4-Y3)(X2-X1) - (X4-X3)(Y2-Y1)]
ub = [(X2-X1)(Y1-Y3) - (Y2-Y1)(X1-X3)] / [(Y4-Y3)(X2-X1) - (X4-X3)(Y2-Y1)]

IF (0 < ua < 1) AND (0 < ub < 1) THEN the strokes intersect
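The decision-parameter test translates directly into code. This is a straightforward rendering of the parametric segment-intersection formula above; the function name and the zero-denominator guard are additions.

```python
def strokes_cross(p1, p2, p3, p4):
    """Parametric segment-intersection test used to detect a cross-out:
    segments (p1,p2) and (p3,p4) cross iff both decision parameters
    ua and ub lie strictly between 0 and 1."""
    (x1, y1), (x2, y2), (x3, y3), (x4, y4) = p1, p2, p3, p4
    denom = (y4 - y3) * (x2 - x1) - (x4 - x3) * (y2 - y1)
    if denom == 0:
        return False  # parallel or collinear strokes: no single crossing
    ua = ((x4 - x3) * (y1 - y3) - (y4 - y3) * (x1 - x3)) / denom
    ub = ((x2 - x1) * (y1 - y3) - (y2 - y1) * (x1 - x3)) / denom
    return 0 < ua < 1 and 0 < ub < 1

print(strokes_cross((0, 0), (10, 10), (0, 10), (10, 0)))  # True: an X shape
print(strokes_cross((0, 0), (10, 0), (0, 5), (10, 5)))    # False: parallel
```

ua and ub are the fractional positions of the intersection point along each segment, which is why both must fall inside (0, 1) for the strokes themselves (not just their infinite lines) to cross.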
Menu Commands
• Also accessible through Graffiti strokes
– m: Move
– u: Undo
– c: Clear
– t: Transmit
– f: Configure
“Digitizing” the Sketch
User Evaluation
• Tested how well the interface performed with real users
• Pre-experimental questionnaire
• Tasks
– Sketch tasks
– Re-sketch tasks
– Task scores
• Post-experimental questionnaire
• Questionnaires contain Likert-style statements (Likert, 1932) along with several open-ended questions
Statistical Analysis
2 groups, 2 scenes:
• Compared by scene sketched
• Compared by course level of participant
• Means compared with the t test
• Null Hypothesis: there are no differences when compared by sketched scene or course level
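The group comparison above can be sketched with a two-sample t statistic. This is a minimal Welch's t computation for illustration; the sample ratings below are invented placeholders, not the study's data, and the study may have used the pooled-variance variant instead.

```python
from statistics import mean, variance

def welch_t(a, b):
    """Welch's two-sample t statistic: difference of means divided by
    the standard error built from each group's own variance."""
    se = (variance(a) / len(a) + variance(b) / len(b)) ** 0.5
    return (mean(a) - mean(b)) / se

# Invented example ratings for two scene groups (not the study's data):
scene_a = [4.2, 4.5, 4.0, 4.8, 4.3]
scene_b = [4.6, 4.9, 4.7, 4.4, 4.8]
t = welch_t(scene_a, scene_b)
```

The resulting t value is then compared against the t distribution to obtain the p values reported in the following slides; a p at or below 0.05 would reject the null hypothesis for that comparison.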
Participants
• 26 students from CS courses
• One participant’s scores were not used
• Only 5 owned a PDA
• Students of Scene B rated themselves significantly better at giving directions (p = 0.02)
• No differences when compared by course level
Scene A
Example Sketches of Scene A
Scene B
Example Sketches of Scene B
Post-Experimental Survey: Landmark Scores
1 = very difficult; 5 = very easy
• Creating landmarks: 4.6 ± 0.6
• Deleting landmarks: 4.2 ± 0.9
• Usefulness of deleting: 4.7 ± 0.6
• Usefulness of labeling: 4.8 ± 0.6
Post-Experimental Survey: Path Scores
1 = very difficult; 5 = very easy
• Creating a path: 4.4 ± 1.0
• Deleting a path: 4.4 ± 1.0
• Usefulness of deleting: 4.7 ± 0.7
• Usefulness of the starting point: 4.2 ± 0.9
Post-Experimental Survey: Overall Scores
• Usefulness of changing the sketch: 4.8 ± 0.4
• Usefulness of deleting the sketch: 4.2 ± 1.0
• How well the sketch represents the environment: 83.6 ± 7.4
• Overall ease of the interface: 4.4 ± 0.6
Usability Results
• Only two significant differences (p ≤ 0.05) were found among the scores
– Usefulness of deleting, by scene (p = 0.0)
– Final sketch rating, by scene (p = 0.05)
• In both cases, students in Scene B rated higher
– The same group that rated themselves better at giving directions
• No differences were found when compared by course level
• The null hypothesis is not rejected
Task Score Results
• Collected sketches were scored
– +1 for starting landmark
– +1 for each correct turn
– +1 for landmark at turn
– +1 for each correct straight segment
– +1 for ending landmark
– −1 for extra turns or straight segments
• No significant differences found (p = 0.12)
– Sketch task score: 0.91 ± 0.11
– Re-sketch task score: 0.82 ± 0.26
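The rubric above can be expressed as a small scoring function. The point values are from the slide; the event-tag names and the normalization by a route's maximum score (which would yield fractional scores like 0.91) are assumptions for illustration.

```python
# Hypothetical event tags; the point values are the slide's rubric.
SCORES = {
    "start_landmark": 1, "turn": 1, "landmark_at_turn": 1,
    "straight": 1, "end_landmark": 1,
    "extra_turn": -1, "extra_straight": -1,
}

def task_score(events, max_score):
    """Sum rubric points over the graded sketch events and normalize by
    the route's maximum possible score, so a perfect sketch scores 1.0."""
    return sum(SCORES[e] for e in events) / max_score

events = ["start_landmark", "straight", "turn", "landmark_at_turn",
          "straight", "end_landmark"]
print(task_score(events, max_score=6))  # 1.0 for a perfect sketch
```

An extra turn or segment subtracts a point, so a sketch with one spurious turn on this six-point route would score 5/6.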
Conclusions
• Created a new sketch-based interface on a handheld computer
• Intuitive, with little reliance on traditional menus and icons
• User evaluation found the interface as easy to use as pencil and paper, by a 2:1 margin
Future Work
• Continue integration into the Guinness system
• Recognition of more sketched symbols
• Recognition of turning rate
• Creation of 3D virtual environments with libraries of objects
Email: [email protected]
Web: www.cecs.missouri.edu/~skubic
Funded by the Naval Research Lab
ARCHITECTURE

[Architecture diagram: the PDA sketch/gesture interface and GUI (EUT), along with speech commands, connect through Cortex to the SR server, image server, and map server. Spatial behaviors, obstacle avoidance (vfh), path planning (trulla), continuous localization and pose, and short- and long-term maps link Cortex to the robot, exchanging sensor data, encoder readings, robot commands, robot pose, map corrections, queries and labels, sketch directives and feedback, and user commands and responses.]
User:  How many objects do you see?
Robot: I am sensing four objects.
User:  Object 2 is a table.
User:  Describe the scene.
Robot: There are objects on my front right. The object number 4 is mostly in front of me. The table is behind me.
User:  Go behind the table.
Behind the table
SR server
“Between object 1 and object 2”:
• using the midpoint between closest points
• using the midpoint between centroids
• using the CFMD
Image Server
PATH DESCRIPTION GENERATED FROM THE SKETCHED ROUTE MAP
1. When table is mostly on the right and door is mostly to the rear (and close) Then Move forward
2. When chair is in front or mostly in front Then Turn right
3. When table is mostly on the right and chair is to the left rear Then Move forward
4. When cabinet is mostly in front Then Turn left
5. When ATM is in front or mostly in front Then Move forward
6. When cabinet is mostly to the rear and tree is mostly on the left and ATM is mostly in front Then Stop
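The generated rules can be viewed as condition–action pairs over qualitative spatial relations. The sketch below is a simplified first-match evaluator, not the deck's controller: the relation strings stand in for the histogram-of-forces relations computed by the SR server, and the real system steps through the rules in sequence as the robot traverses the route.

```python
# Simplified condition-action rules mirroring the generated path description.
rules = [
    ({"table": "mostly right", "door": "mostly rear"}, "move forward"),
    ({"chair": "mostly in front"}, "turn right"),
    ({"table": "mostly right", "chair": "left rear"}, "move forward"),
    ({"cabinet": "mostly in front"}, "turn left"),
    ({"ATM": "mostly in front"}, "move forward"),
    ({"cabinet": "mostly rear", "tree": "mostly left",
      "ATM": "mostly in front"}, "stop"),
]

def next_action(state, rules):
    """Fire the first rule whose qualitative conditions all hold in the
    robot's current perceived state (a landmark -> relation mapping)."""
    for conditions, action in rules:
        if all(state.get(obj) == rel for obj, rel in conditions.items()):
            return action
    return None

state = {"chair": "mostly in front", "table": "mostly left"}
print(next_action(state, rules))  # "turn right"
```

Representing the route as qualitative When/Then rules lets the robot follow the sketch without metric accuracy: only the ordering of landmark relations has to match the drawing.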
Understanding Sketched Route Maps
[1] M. Skubic, P. Matsakis, G. Chronis and J. Keller, "Generating Multi-Level Linguistic Spatial Descriptions from Range Sensor Readings Using the Histogram of Forces," Autonomous Robots, Vol. 14, No. 1, Jan. 2003, pp. 51-69.
[2] M. Skubic, D. Perzanowski, S. Blisard, A. Schultz, W. Adams, M. Bugajska and D. Brock, "Spatial Language for Human-Robot Dialogs," IEEE Transactions on SMC, Part C, to appear in the special issue on Human-Robot Interaction.
[3] M. Skubic, S. Blisard, C. Bailey, J.A. Adams and P. Matsakis, "Qualitative Analysis of Sketched Route Maps: Translating a Sketch into Linguistic Descriptions," IEEE Transactions on SMC, Part B, to appear.
[4] G. Chronis and M. Skubic, "Sketch-Based Navigation for Mobile Robots," In Proc. of the IEEE 2003 Intl. Conf. on Fuzzy Systems, May 2003, St. Louis, MO.
[5] G. Scott, J.M. Keller, M. Skubic and R.H. Luke III, "Face Recognition for Homeland Security: A Computational Intelligence Approach," In Proc. of the IEEE 2003 Intl. Conf. on Fuzzy Systems, May 2003, St. Louis, MO.
References
From left to right: George Chronis, Grant Scott, Dr. Marge Skubic, Matt Williams, Craig Bailey, Bob Luke, Charlie Huggard and Sam Blisard. Missing: Dr. Jim Keller
Guinness and Gang
Sketch-Based Navigation
The sketched route map
The robot traversing the sketched route
Sketch-Based Navigation
The digitized sketched route map
The robot traversing the sketched route
This work has been supported by ONR and the U.S. Naval Research Lab. Natural language understanding is accomplished using a system developed by NRL, called Nautilus [Wauchope, 2000]. We also want to acknowledge the help of Dr. Pascal Matsakis.
Acknowledgements
NRL’s Multimodal Robot Interface