
Integrating Motion and Voice Control with Kinematics of a SCARA Robotic System

William Disotell

Colin Mitchell

Final Draft

April 26, 2015

CEN4935 Senior Software Engineering Project

Instructor: Dr. Janusz Zalewski

Software Engineering Program

Florida Gulf Coast University

Ft. Myers, FL 33965


Table of Contents

1. Introduction
  1.1 SCARA System
  1.2 UMI RTX
  1.3 Kinematics
  1.4 Kinect for Windows
    1.4.1 Overview
    1.4.2 Voice
    1.4.3 Motion
2. Software Requirements Specification
  2.1 Project Objectives
  2.2 Physical Diagram
  2.3 Context Diagram
  2.4 Software Requirements
3. Design Description
4. Implementation and Testing
  4.1 Control Module
  4.2 RTX Communication Module
    4.2.1 RTX Initialization
    4.2.2 RTX Motion
  4.3 Kinect Module Implementation
    4.3.1 Kinect Movement Implementation
    4.3.2 Kinect Voice Implementation
  4.4 Testing
    4.4.1 RTX Testing
    4.4.2 Kinect Testing
5. Conclusion
6. References
Appendix I – Repairing the RTX
  Removing RTX Arm
  Replacing Broken Elbow Encoder Belt
Appendix II – Testing Results


1. Introduction

This project utilizes the UMI RTX robot shown in Figure 1.1. The RTX is a SCARA robot, defined in section 1.1. The robot was designed with six degrees of freedom (six controllable joints) and a movable gripper. It is controlled over a single serial cable using the RS-232 protocol, which is still widely used in industrial settings. The limitation of the existing software controller for the RTX is that it is restricted to simple buttons on a screen [5]. The primary goal of this project is to give the RTX a more interactive feel through voice and motion tracking. After considering multiple methods of adding these features, the Kinect for Windows, described in section 1.4, was chosen. An additional goal of the project is to control the RTX through kinematics, as defined in section 1.3. The integration of kinematics is important because it is an essential feature for robots with multiple degrees of freedom and would be required for most real-world applications.

Figure 1.1: The UMI RTX Robot

1.1 SCARA System

SCARA is an acronym for Selective Compliance Assembly Robot Arm, a class of robot systems invented by Hiroshi Makino. The distinctive features of a SCARA system are three independent motion translations and a single rotation about a fixed axis of orientation [1]. In the RTX's case, the rotation is about its vertical axis. The rigid structure about the two tilting axes (the axes normal to the axis of rotation) gives a SCARA system the ability to perform accurate cyclic operations with a heavy payload capacity. This is because the rotary axes are mounted vertically rather than horizontally, minimizing the robot's deflection when it carries its payload while moving; the load is carried by the joint frame rather than the motor.

1.2 UMI RTX

The robotic system selected for the integration of kinematic, voice, and motion control is the RTX educational robot, manufactured by Universal Machine Intelligence Limited in 1986 [2]. This SCARA system provides seven degrees of freedom: six from the main structure, with an additional degree of freedom provided by the gripper positioned at the end effector. The RTX is designed to be controlled via an RS-232 serial link connected to a computer. The system is driven by seven 24 VDC motors: a single 20 W (3 A) motor for the Z axis and six 3 W (750 mA) motors for the remaining axes. To record the position of the RTX, each motor has a two-phase optical incremental encoder. The device is controlled by two processing units called intelligent peripherals (IPs), identified as IP0 and IP1. Each IP can control the position, speed, acceleration and force of up to five motors [2]. The configuration of the IP-motor relationship is outlined in Table 1.2.1.

Table 1.2.1 IP Responsibility for Specific Motors [3]

Motor      Controlled by
ZED        IP1
SHOULDER   IP1
ELBOW      IP1
YAW        IP1
WRIST1     IP0
WRIST2     IP0
GRIPPER    IP1


The communication between the RTX and the PC was originally driven by a Pascal library. The library was later translated into a Win32 library in the C programming language, named RT100 [3]. The correlation between RTX tasks and the RT100 library functions is shown in Table 1.2.2.

Table 1.2.2 RTX Task vs RT100 Library Functions [3]

Task                                             RT100 Library Functions
Communication tasks                              arm_init_comms, arm_shutdown
Initializing the robot                           arm_init
Preparing the arm at the start of an operation   arm_restart, arm_define_origin, arm_version
Defining how the arm will move                   arm_set_mode, arm_read, arm_write
Initiating movement of the arm                   arm_go, arm_interpolate
Getting reports on the arm's current status      arm_general_status, arm_motor_status
Stopping the arm                                 arm_stop
Resetting IP default settings                    arm_reload_pids
Using the IPs directly                           arm_raw_command, arm_raw_response

1.3 Kinematics

Kinematics is the branch of classical mechanics which describes the motion of points, bodies (objects) and systems of bodies (groups of objects) without consideration of the causes of motion. These calculations provide knowledge of the position of any point of the arm with respect to the external coordinate system. The calculations are broken into two parts: forward kinematics, which translates joint positions into the coordinates of the end effector, and inverse kinematics, which reverses the forward kinematics, as shown in Figure 1.3.


Figure 1.3 Kinematic Relations
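As a concrete illustration of the two directions in Figure 1.3, the following is a minimal planar sketch of forward and inverse kinematics for a two-link SCARA arm. The link lengths L1 and L2 are illustrative placeholders, not the RTX's actual dimensions, and the function names are not part of the project's code.

#include <cmath>

// Illustrative link lengths in meters (not the RTX's actual dimensions).
const double L1 = 0.25;
const double L2 = 0.25;

// Forward kinematics: shoulder angle t1 and elbow angle t2 (radians)
// to the end effector position (x, y).
void ForwardKinematics(double t1, double t2, double &x, double &y)
{
    x = L1 * cos(t1) + L2 * cos(t1 + t2);
    y = L1 * sin(t1) + L2 * sin(t1 + t2);
}

// Inverse kinematics: target (x, y) to one of the two possible joint
// solutions, via the law of cosines. Returns false when out of reach.
bool InverseKinematics(double x, double y, double &t1, double &t2)
{
    double c2 = (x * x + y * y - L1 * L1 - L2 * L2) / (2.0 * L1 * L2);
    if (c2 < -1.0 || c2 > 1.0)
        return false;                // the point is outside the workspace
    t2 = acos(c2);                   // "elbow-down" branch
    t1 = atan2(y, x) - atan2(L2 * sin(t2), L1 + L2 * cos(t2));
    return true;
}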

1.4 Kinect for Windows

1.4.1 Overview

Figure 1.4.1.1 Kinect Components

The Microsoft Kinect is a line of motion sensing input devices originally designed for the

Microsoft Xbox 360 video game console and Windows PCs. The Kinect’s webcam-style add-on

peripheral enables users to control and interact with their console or computer through a natural

user interface using gestures and spoken commands without the need for a game controller, a

mouse, or a keyboard.


The Kinect's components are divided into five main sensor groups: the RGB camera, the infrared (IR) emitter, the IR depth sensor, the multi-array microphone, and the 3-axis accelerometer, depicted in Figure 1.4.1.1. The RGB camera stores three-channel data at a 1280x960 resolution, making it possible to capture a color image. The IR emitter emits infrared light beams, and the IR depth sensor reads the beams reflected back to the sensor; the reflected beams are converted into depth information measuring the distance between an object and the sensor, making it possible to capture a depth image. The multi-array microphone contains four microphones for capturing sound. Because there are four microphones, it is possible to record audio as well as find the location of the sound source and the direction of the audio wave. The 3-axis accelerometer is configured for a 2G range, where G is the acceleration due to gravity, and can be used to determine the current orientation of the Kinect. The detailed specifications for these components are displayed in Table 1.4.1.1.

Table 1.4.1.1 Kinect Specifications

Kinect                                Specification
Viewing angle                         43° vertical by 57° horizontal field of view
Vertical tilt range                   ±27°
Frame rate (depth and color stream)   30 frames per second (FPS)
Audio format                          16-kHz, 24-bit mono pulse code modulation (PCM)
Audio input characteristics           A four-microphone array with 24-bit analog-to-digital converter (ADC) and Kinect-resident signal processing including acoustic echo cancellation and noise suppression
Accelerometer characteristics         A 2G/4G/8G accelerometer configured for the 2G range, with a 1° accuracy upper limit


1.4.2 Voice

The voice control for the RTX functions as illustrated in Figure 1.4.2.1. All voice commands start with the phrase 'R'-'T'-'X'. From there, simple commands such as 'HOME' can be used to signal the RTX robot back to its origin position, as specified in Figure 4.2.1. The other capability is for the user to move a degree of freedom to a specified position. For example, 'RTX Zed 500' will move the Zed to position 500.

Figure 1.4.2.1 Kinect Voice Commands

1.4.3 Motion

Motion commands for the RTX are given using two-arm gestures. A gesture is defined by a hand movement parallel to the Kinect. The user's left hand selects a degree of freedom, and the user's right hand adjusts the value of the selected degree. For example, if 'Zed' is the currently selected degree of freedom, the user's right hand starts at the current value for the Zed, and moving it downward without changing the user's distance from the Kinect will decrease the value for the Zed (Figure 1.4.3.1).

Figure 1.4.3.1 Kinect Motion Commands


2. Software Requirements Specification

2.1 Project Objectives

The project objective can be broken into four parts:

Create a functional user interface in C++ with direct control over the RTX joints

Extend the user interface so the RTX can be controlled with kinematic input

Control the RTX with simple voice commands read through the Kinect for Windows

Control the RTX joints with arm gestures recognized through the Kinect for Windows

Previous projects at Florida Gulf Coast University have worked with the RTX in LabView and Java [4]; however, the goal for this project is to write the software in C++. Therefore, this project does not use the code or resources from any past FGCU project involving the RTX.

2.2 Physical Diagram

The physical diagram in Figure 2.2.1 shows the connected hardware used for the project and how the user physically interacts with the system. As depicted, the RTX robot is connected via an RS-232 cable to a computer that hosts the designed software. The computer receives commands from the user through external I/O devices such as a Microsoft Kinect, mouse and keyboard, and displays pertinent information back to the user. The designed software also receives input from the Kinect, which is connected via a USB cable. The user has the option to use this connection through motion and voice, as described in Figure 1.4.2.1 and Figure 1.4.3.1.


Figure 2.2.1 Physical Diagram

2.3 Context Diagram

The context diagram for the software, named RTX Kinect, is shown in Figure 2.3.1. This diagram shows the connections between the software and the outside world. As shown in the figure, all pieces of the RTX are encapsulated and the software communicates with it via an RS-232 interface. The software's main components, shown in the center oval, read the data stream from the Kinect and communicate with the user. The user also interacts with the Kinect one-way, in the form of motion and voice commands.


Figure 2.3.1 Context Diagram of the RTX Kinect Software Controller

2.4 Software Requirements

Specific software requirements for the software controller are stated below.

1) When a command sequence is received from the user, the RTX Kinect software shall relay it via an RS-232 interface to the RTX robot.

2) The RTX Kinect software shall send the signals to the RTX to move it to the location specified by the user.

3) The range of motion of the RTX shall be displayed by the RTX Kinect software with seven manageable fields, one for each degree of freedom with the addition of the gripper.


4) The RTX Kinect software shall accept input in the form of an inverse kinematic equation taking the variables X, Y, Z, Roll, Pitch, and Yaw.

5) The Kinect Module shall handle motion commands from the user's right hand by adjusting the current value of the selected degree of freedom.

6) The Kinect Module shall handle motion commands from the user's left hand by rotating through the available degrees of freedom.

7) The Kinect Module shall handle voice commands from the user as specified in section 1.4.2.


3. Design Description

The architecture of the RTX Kinect software is shown in Figure 3.1. The Control Module is at the center of the diagram and makes the decisions for the software. It directs commands from the user via the User Interface Module and uses the RTX Communication Module to change the position of the robot. The Kinect Voice/Motion Module handles gesture interpretation and notifies the Control Module when a gesture has been recognized. The Control Module catches commands from the Kinect Voice/Motion and User Interface Modules to send to the RTX. The RTX Communication Module handles the messages to and from the RTX via an RS-232 interface.

Figure 3.1 Structural Diagram of the RTX Kinect Software Controller


An important aspect of designing software is looking at it from a dynamic perspective, showing the behavior of the software. Figure 3.2 illustrates the motion sensing portion of the system. The system begins with the robot uninitialized. From there, it waits for the user to give the initialize signal, which runs the robot through a default test of all of its parts and motors. Once the robot has reached its ready position, the Kinect features need to be enabled through the User Interface. Error handling checks that a valid Kinect for Windows is connected to the application. If the Kinect is not found, or cannot be attached, an error message describing this is shown to the user.

The Kinect waits until it can recognize a skeleton frame through the camera; once it does, the software in the Kinect Voice/Motion Module starts tracking the movement, watching for a recognized gesture. When a gesture is recognized, the result is sent to the Control Module, which may then send a change in position to the RTX.


Figure 3.2 Behavioral Diagram for Kinect Motion Detection


Figure 3.3 Behavioral Diagram for Control Module


4. Implementation and Testing

4.1 Control Module

The Control Module is implemented in the RTX_KinectDlg.cpp file. It contains an instance of the RTXControl class, named rtx, which encapsulates the RTX Communication Module. At the top of the RTX_KinectDlg.cpp file, a pointer to a CMotionThread is created; the CMotionThread runs the Kinect Module on a separate thread for better performance.

CMotionThread* adapter;

Code Snippet 4.1.1 Kinect Module Thread Pointer

The Control Module is therefore the central module of the program, holding instances of the other modules. It also links the MFC user interface with the functionality of the RTX Communication Module. When a value is changed in the user interface, the value (millimeters for the Zed, degrees for the others) is sent to the RTX Communication Module to be translated into the units the RTX accepts, encoder counts.

4.2 RTX Communication Module

The first step in connecting and communicating with the RTX is establishing a serial connection. The configuration for this is shown in Table 4.2.

Table 4.2 - The RS232 Serial Configuration for the RTX

COM Port #   1
Data Rate    9600
Data Bits    8
Parity       Even
Stop Bits    1
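For reference, the following sketch shows what this configuration corresponds to in the Win32 serial API. In the actual project this step is performed by the RT100 library's arm_init_comms call, so the sketch is purely illustrative.

#include <windows.h>

// Open COM1 with the settings from Table 4.2 (9600 baud, 8 data bits,
// even parity, 1 stop bit). Returns INVALID_HANDLE_VALUE on failure.
HANDLE OpenRtxSerialPort()
{
    HANDLE h = CreateFileA("COM1", GENERIC_READ | GENERIC_WRITE,
                           0, NULL, OPEN_EXISTING, 0, NULL);
    if (h == INVALID_HANDLE_VALUE)
        return h;

    DCB dcb;
    ZeroMemory(&dcb, sizeof(dcb));
    dcb.DCBlength = sizeof(dcb);
    GetCommState(h, &dcb);
    dcb.BaudRate = CBR_9600;    // Data rate: 9600
    dcb.ByteSize = 8;           // Data bits: 8
    dcb.Parity   = EVENPARITY;  // Parity: even
    dcb.StopBits = ONESTOPBIT;  // Stop bits: 1
    SetCommState(h, &dcb);
    return h;
}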


4.2.1 RTX Initialization

The RTX initialization must be completed before any other function can be performed. This process includes establishing RS-232 communication between the robot and the user interface, initializing each motor's speed, driving each joint to the home position, and setting the initial position fields. The software accomplishes this task after the connect button is clicked, as defined in Code Snippet 4.2.1.

void CRTX_KinectDlg::OnBnClickedConnect()
{
    // Establish the RS-232 link and run the RTX initialization sequence.
    arm_init_comms("COM1", &rCode);
    arm_init(&rCode);

    // Set every motor to its top speed.
    arm_write(ELBOW, SPEED, TOP_SPEED, &rCode);
    arm_write(SHOULDER, SPEED, TOP_SPEED, &rCode);
    arm_write(ZED, SPEED, TOP_SPEED, &rCode);
    arm_write(WRIST1, SPEED, TOP_SPEED, &rCode);
    arm_write(WRIST2, SPEED, TOP_SPEED, &rCode);
    arm_write(YAW, SPEED, TOP_SPEED, &rCode);
    arm_write(GRIP, SPEED, TOP_SPEED, &rCode);

    // Start the move, wait for it to finish, then halt the arm.
    arm_go(NUMERIC_GO, 0x1555, &rCode);
    waitUntilDone(&rCode);
    arm_stop(DEAD_STOP, &rCode);
}

Code Snippet 4.2.1 Initialize RTX

4.2.2 RTX Motion

The RTX's movement is driven by seven motors, enabling six degrees of freedom plus the gripper across its seven joints. The RTX stores each motor's position in on-board RAM [2], measured in encoder counts. After the RTX is initialized, the encoder count for each motor is zero; this position is named Home, and its orientation is displayed in Figure 4.2.1. Each joint has a corresponding motor, except for pitch and roll, which share the two motors in the wrist end effector, labeled wrist1 and wrist2.


Figure 4.2.1 RTX Initial Position (Home) [2]

The roll and pitch movement of the wrist end effector is implemented by a combination of the speeds and rotation directions of the two wrist motors. When both motors are driven at the same speed in the same direction, the wrist moves only in pitch. When they are driven at the same speed in opposite directions, the wrist moves only in roll. To produce both roll and pitch motion simultaneously, the motors are driven at different speeds. A summary of these wrist motor configurations is displayed in Table 4.2.1 [2].

Table 4.2.1 Roll and Pitch Wrist Motors Drive Configurations [2]

Wrist Movement        Wrist Motors Speed and Rotation
Pitch                 Same speed, same direction
Roll                  Same speed, opposite directions
Both pitch and roll   Different speeds

To convert the position of the RTX between SI units and encoder counts, a constant scale factor is derived from the gear ratio and the number of counts per motor revolution for each motor. The specific constant for each motor is listed in Table 4.2.2. For every joint except pitch and roll, SI units are converted to encoder counts by multiplying by the joint's constant, and encoder counts are converted back to SI units by dividing by it.

Table 4.2.2 Encoder Counts for each Unit of Motor Movement [2]

Motor      Encoder Counts per Unit
ELBOW      14.611 (counts/degree)
SHOULDER   29.221 (counts/degree)
ZED        3.750 (counts/mm)
YAW        13.487 (counts/degree)
WRIST1     13.487 (counts/degree)
WRIST2     9.740 (counts/degree)
GRIPPER    14.706 (counts/mm)

The pitch and roll unit conversions are calculated simultaneously because both wrist motors are shared. If both motors are driven at the same speed in the same direction, the RTX moves only in pitch; when they are driven at the same speed in opposite directions, the result is a roll rotation; and both pitch and roll can be produced together by driving the motors at different speeds. The roll and pitch conversion formulas are listed in Table 4.2.3 and use a constant factor k = 0.07415 (degrees per encoder count) [3].

Table 4.2.3 Roll and Pitch Encoder Count and Degree Conversion

Encoder Counts to Degrees            Degrees to Encoder Counts
Pitch = ((WRIST1 + WRIST2)/2)*k      WRIST1 = (PITCH + ROLL)/k
Roll = ((WRIST1 - WRIST2)/2)*k       WRIST2 = (PITCH - ROLL)/k
Note: k = 0.07415
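To make the conversions concrete, the following sketch applies the constants from Tables 4.2.2 and 4.2.3. The function names are illustrative and are not part of the RT100 library.

// Independent joints: multiply by the constant to get encoder counts,
// divide by it to get back to SI units (Table 4.2.2).
const float ZED_COUNTS_PER_MM    = 3.750f;
const float ELBOW_COUNTS_PER_DEG = 14.611f;

int   ZedMmToCounts(float mm)      { return (int)(mm * ZED_COUNTS_PER_MM); }
float ElbowCountsToDeg(int counts) { return counts / ELBOW_COUNTS_PER_DEG; }

// Coupled wrist motors: pitch and roll are converted together (Table 4.2.3).
const float K = 0.07415f;  // degrees per encoder count

void PitchRollToCounts(float pitchDeg, float rollDeg, int &w1, int &w2)
{
    w1 = (int)((pitchDeg + rollDeg) / K);
    w2 = (int)((pitchDeg - rollDeg) / K);
}

void CountsToPitchRoll(int w1, int w2, float &pitchDeg, float &rollDeg)
{
    pitchDeg = ((w1 + w2) / 2.0f) * K;
    rollDeg  = ((w1 - w2) / 2.0f) * K;
}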

The RTX's geometric limits are determined by the end stops designed into every joint. The range of every joint is displayed in Figure 4.2.2. The corresponding encoder counts and SI units, derived from the conversions above, are displayed in Table 4.2.4.


Figure 4.2.2 RTX Orientation Limits [2]

Table 4.2.4 Maximum Range for Each Unit of Motor/Joint Movement [2]

Joint      End Stop to End Stop (counts)   Total Range (counts)   End Stop to End Stop   Total Range
ELBOW      2206 to -2630                   4836                   90 to -90 (deg)        331 (deg)
SHOULDER   2630 to -2630                   5260                   180 to -151 (deg)      180 (deg)
ZED        0 to -3554                      3303                   -915 to 0 (mm)         915 (mm)
YAW        108 to -2642                    2750                   2 to -98 (deg)         102 (deg)
PITCH      4882 to -3560                   8442                   181 to -151 (deg)      313 (deg)
ROLL       1071+E/3 to -(1071+E/3)         2142                   110 to -110 (deg)      220 (deg)
GRIPPER    1200 to -30                     1200                   0 to 90 (mm)           90 (mm)


The RTX Kinect software sends movement instructions to the RTX using the JointMove function displayed in Code Snippet 4.2.2. The order of movement is as follows: Zed, Shoulder, Elbow, Yaw, Wrist1, Wrist2 and Grip. A new motor is not powered until the previous one has finished its task. Each motor corresponds to its respective joint movement, except for pitch and roll, which are obtained from the combined action of the wrist1 and wrist2 motors.

void RTXControl::JointMove(JOINT_T j)
{
    // Convert the requested SI-unit positions into encoder counts.
    EncoderUpdate();
    switch (j) {
    case ELBOW_J:
        arm_write(ELBOW, NEW_POSITION, ecVal.ELBOW, &rCode);
        break;
    case SHOULDER_J:
        arm_write(SHOULDER, NEW_POSITION, ecVal.SHOULDER, &rCode);
        break;
    case ZED_J:
        arm_write(ZED, NEW_POSITION, ecVal.ZED, &rCode);
        break;
    case PITCH_J:
    case ROLL_J:
        // Pitch and roll are coupled: both wrist motors must be written.
        arm_write(WRIST1, NEW_POSITION, ecVal.WRIST1, &rCode);
        arm_write(WRIST2, NEW_POSITION, ecVal.WRIST2, &rCode);
        break;
    case YAW_J:
        arm_write(YAW, NEW_POSITION, ecVal.YAW, &rCode);
        break;
    case GRIP_J:
        arm_write(GRIP, NEW_POSITION, ecVal.GRIP, &rCode);
    }
    // Start the move and block until the arm reports it has finished.
    arm_go(NUMERIC_GO, 0x1555, &rCode);
    waitUntilDone(&rCode);
}

Code Snippet 4.2.2 RTX Move Function

4.3 Kinect Module Implementation

4.3.1 Kinect Movement Implementation

Implementing the Kinect was simple once we gained familiarity with the Kinect SDK. The Kinect for Windows can track up to six skeletons at 30 frames per second. For the purposes of this project, only one person should control the RTX at a time, so additional skeleton data is not used and is not drawn on the screen. The software determines which skeleton to follow by comparing each skeleton's perceived physical distance from the Kinect.

Although skeleton data is readily available with the Kinect SDK, there is no built-in gesture recognition in SDK v1.8. For this reason, we had to create our own recognition on top of the skeleton data. Since each person moves their hands at different heights and different speeds, these variations need to be accounted for when detecting a gesture. The detection uses the constants defined in Code Snippet 4.3.1.

// vector size: number of points needed to be consistent for a gesture
// e.g. 20 points in a row must each move between minMovement and
// maxMovement in one direction
unsigned int maxVectorSize = 20;

// min distance per check for it to be considered a gesture
float minMovement = 9e-4;

// max distance per check for it to be considered a gesture
float maxMovement = 1e-2;

Code Snippet 4.3.1

The maxVectorSize variable, 20, is the number of consecutive frames required for a gesture. A larger number would require the user to move at a constant rate for a longer period of time. The next two numbers, minMovement and maxMovement, bound the distance traveled between frames for a gesture; the values are in meters. With a frame rate of 30 Hz, these values require the user to maintain a speed between roughly 0.03 m/s and 0.3 m/s. These values were chosen through trial and error to find something that felt comfortable when using the Kinect.


Because we designed the gestures to be easily triggered without straining motion, the variable maxDelta in Code Snippet 4.3.2 was added. When only the targeted axis was compared, gestures were found to trigger before the user's hand was completely raised. To solve this problem, we also examine the secondary axis and require its overall change to stay below maxDelta, 0.05 meters or approximately 2 inches. This addition filters out falsely flagged gestures by restricting them to straight movements along the axis.

// max change from beginning to end to be considered a gesture
// e.g. if the right hand moves along the Y axis, this is the most the
// X value can change
float maxDelta = 5e-2;

Code Snippet 4.3.2

The other dilemma discovered through continued use of the Kinect is that there is no relaxed position for the user: as long as the user is in front of the Kinect, a gesture has a chance of being detected. To solve this problem, an additional check was added requiring the user's hand to be above the user's elbow along the Y axis. With this change, the user must have raised hands and so cannot accidentally trigger a gesture. Code Snippet 4.3.3 shows the method checkForGesture, which implements these ideas. When a gesture is detected, the corresponding method is called, which for the right hand moves the current joint and for the left hand switches the current joint.

// Right Hand
if (rightHandY > rightElbowY) {
    // tracking new values
    addToVector(vRightHandY, rightHandY);
    addToVector(vRightHandX, rightHandX);

    // if we have enough points for a full gesture and if the user is
    // adhering to a straight path
    if (vRightHandY.size() == maxVectorSize &&
            checkWithinDelta(vRightHandX, maxDelta)) {
        // check for a downward movement
        if (checkNegative(vRightHandY, minMovement, maxMovement)) {
            printRightHandDown();
            vRightHandY.clear();
            vRightHandX.clear();
        }
        // check for an upward movement
        else if (checkPositive(vRightHandY, minMovement, maxMovement)) {
            printRightHandUp();
            vRightHandY.clear();
            vRightHandX.clear();
        }
    }
}
// if they lower their hand, clear the values
else {
    vRightHandY.clear();
    vRightHandX.clear();
}

// Left Hand
if (leftHandY > leftElbowY) {
    // tracking new values
    addToVector(vLeftHandX, leftHandX);
    addToVector(vLeftHandY, leftHandY);

    // if we have enough points for a full gesture and if the user is
    // adhering to a straight path
    if (vLeftHandX.size() == maxVectorSize &&
            checkWithinDelta(vLeftHandY, maxDelta)) {
        // check for a leftward movement
        if (checkNegative(vLeftHandX, minMovement, maxMovement)) {
            printLeftHandLeft();
            vLeftHandX.clear();
            vLeftHandY.clear();
        }
        // check for a rightward movement
        else if (checkPositive(vLeftHandX, minMovement, maxMovement)) {
            printLeftHandRight();
            vLeftHandX.clear();
            vLeftHandY.clear();
        }
    }
}
// if they lower their hand, clear the values
else {
    vLeftHandX.clear();
    vLeftHandY.clear();
}

Code Snippet 4.3.3
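The snippet above relies on four small helpers whose definitions are not reproduced in this report. The following is a sketch of one way they could be implemented, consistent with the comments in Code Snippets 4.3.1 and 4.3.2; the bodies are assumptions, not the project's exact code.

#include <vector>
#include <cmath>

extern unsigned int maxVectorSize;  // from Code Snippet 4.3.1

// Append a sample, keeping at most maxVectorSize points (sliding window).
void addToVector(std::vector<float> &v, float value)
{
    v.push_back(value);
    if (v.size() > maxVectorSize)
        v.erase(v.begin());
}

// True if the secondary axis changed less than delta from start to end,
// i.e. the hand stayed on a straight path.
bool checkWithinDelta(const std::vector<float> &v, float delta)
{
    return std::fabs(v.back() - v.front()) < delta;
}

// True if every frame-to-frame step moved in the negative direction by an
// amount between min and max (a steady downward or leftward gesture).
bool checkNegative(const std::vector<float> &v, float min, float max)
{
    for (size_t i = 1; i < v.size(); ++i) {
        float step = v[i - 1] - v[i];
        if (step < min || step > max)
            return false;
    }
    return true;
}

// Mirror image of checkNegative for upward or rightward gestures.
bool checkPositive(const std::vector<float> &v, float min, float max)
{
    for (size_t i = 1; i < v.size(); ++i) {
        float step = v[i] - v[i - 1];
        if (step < min || step > max)
            return false;
    }
    return true;
}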


Switching the current joint on a left-hand gesture is done simply by wrapping the joint index around at the ends of its range, forming a circular list, as shown in Code Snippet 4.3.4. The joint variable is initialized to 0, which sets the default RTX joint to the Zed.

if (goingRight) {
    if (joint == 6)
        joint = -1;
    joint++;
}
else {
    if (joint == 0)
        joint = 7;
    joint--;
}
current_joint = CRTX_KinectDlg::JOINT_T(joint);

Code Snippet 4.3.4
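The same wrap-around can be written more compactly with the modulus operator; assuming seven joints indexed 0 through 6, the following is equivalent:

// +1 steps right through the joints; +6 is -1 modulo 7, stepping left.
joint = (joint + (goingRight ? 1 : 6)) % 7;
current_joint = CRTX_KinectDlg::JOINT_T(joint);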

4.3.2 Kinect Voice Implementation

Voice was implemented with the Kinect using a Speech Recognition Grammar Specification file (.grxml). In these files, as shown in Code Snippet 4.3.5, tags are created and items are assigned to them. Whenever the microphone array hears any of the items for a tag, the tag is mapped to a SpeechAction and handled accordingly. All items have the prefix 'RTX' to avoid any unintentional movement of the RTX.

<item>
  <tag>RTXZedPlus</tag>
  <one-of>
    <item> RTX Zed up </item>
    <item> RTX Zed right </item>
    <item> RTX Zed positive </item>
    <item> RTX Zed plus </item>
  </one-of>
</item>
<item>
  <tag>RTXZedMinus</tag>
  <one-of>
    <item> RTX Zed down </item>
    <item> RTX Zed left </item>
    <item> RTX Zed negative </item>
    <item> RTX Zed minus </item>
  </one-of>
</item>

Code Snippet 4.3.5

The complete SpeechAction enum is shown in Code Snippet 4.3.6. There are two actions for each joint, one for each direction of movement, along with a no-action value.

enum SpeechAction {
    ZedPlus, ZedMinus,
    ShoulderPlus, ShoulderMinus,
    ElbowPlus, ElbowMinus,
    YawPlus, YawMinus,
    PitchPlus, PitchMinus,
    RollPlus, RollMinus,
    GripperPlus, GripperMinus,
    RTXActionNone
};

Code Snippet 4.3.6

4.4 Testing

After completion, the project was tested against the requirements described in section 2.4. The testing is divided into test cases with identification numbers 01 through 08. The results for each test can be found in Appendix II. Each test case comprises the following:

Test Case ID

Objective: the purpose of the test

Description: the process for completing the test

Test Conditions: the prior conditions required to complete the test

Expected Results: the expected results of the test, assuming it passes


4.4.1 RTX Testing

Test Case ID: 01

Objective: Initialize the RTX

Description: After the program is first opened, press the "Initialize" button in the GUI

Test Conditions: The program is opened successfully and this is the first test performed

Expected Results: A message will be displayed to the user in the display box stating that the RTX was successfully initialized and is ready to use. The RTX will be in the home position.

Test Case ID: 02

Objective: Test the joint movement sliders in the GUI

Description: For each joint, move the slider to one extreme and press the joint button. Then move it to the other end and press the joint button again

Test Conditions: Test Case ID 01

Expected Results: Each joint will move from one extreme to the other, which can be identified visually on the RTX

Test Case ID: 03

Objective: Test the kinematic positioning in the GUI

Description: Move the sliders to the given positions and observe the coordinate position reported by the GUI

Test Conditions: Test Case ID 01

Expected Results: The coordinate positions will be X = 405mm, Y = 0mm, and Z = -1004mm

4.4.2 Kinect Testing

Test Case ID: 04

Objective: Test the initialization of the Kinect

Description: In the opened GUI, press the "Kinect Control" button

Test Conditions: Successfully opened program

Expected Results: The center screen in the GUI will switch to black and the closest detected skeleton will be displayed in green. A message will also be printed in the output box stating that the Kinect was initialized successfully

Test Case ID: 05

Objective: Test the joint movement using the Kinect

Description: After the Kinect is tracking your skeleton and displays your right hand as a solid green line, perform a Kinect hand gesture moving downward with your right hand. Wait until the RTX has stopped moving, then perform a Kinect hand gesture moving upward with your right hand.

Test Conditions: Test Case ID 01, Test Case ID 04, and the user positioned at a detectable location relative to the Kinect. This test also expects that Test Case ID 06 has not been performed

Expected Results: After the initial downward movement, the Zed, the default selected joint, will move downward by 100mm, which can be visually identified. After the second movement, the Zed will return to its original position

Test Case ID: 06

Objective: Test the joint selection using the Kinect

Description: After the Kinect is tracking your skeleton and displays your left hand as a solid green line, perform a Kinect hand gesture moving right with your left hand. Then perform a Kinect hand gesture moving left with your left hand.

Test Conditions: Test Case ID 01, Test Case ID 04, and the user positioned at a detectable location relative to the Kinect. This test also expects that this test case has not been previously performed

Expected Results: After the first movement is detected, a message will be displayed stating that the currently selected joint is now the Shoulder. The second gesture will change the selected joint back, and Zed will be displayed to the user in the output box

Test Case ID: 07

Objective: Test the voice command to move a joint

Description: Say "RTX Zed Down" within microphone range of the Kinect. Wait until the RTX has stopped moving, then say "RTX Zed Up".

Test Conditions: Test Case ID 01 and Test Case ID 04

Expected Results: The RTX will move the Zed down 100mm after the first voice command, and will move it back up 100mm to its original position after the second command


Test Case ID: 08

Objective: Test the voice home command using the Kinect

Description: Say "RTX Home" within microphone range of the Kinect

Test Conditions: Test Case ID 01 and Test Case ID 04

Expected Results: The RTX will run its initialization sequence and move back to its home position


5. Conclusion

This project concluded with a completed software product that can accurately and responsively control a UMI RTX robot with on-screen controls, gestures through the Microsoft Kinect, and voice commands through the Microsoft Kinect. The RTX robot group in the FGCU robotics lab was left with two fully functional RTX robotic arms. The third arm has a broken rubber belt, as shown in Figure 5.1. With this belt broken in the shoulder, the robot locks up during its initialization sequence and becomes unusable. For the remaining two robotic arms, the RTX Kinect program was run on two separate computers controlling them to ensure their functionality. A more detailed account of the physical work done on the RTX can be found in Appendix I.

Figure 5.1 Broken Belt in RTX robot

For any future projects working with the RTX, it is recommended that the RTX100.lib library, which has notable documentation [3], be used for any control of the RTX robot instead of the polling methods utilized in the past Java project [5]. For future work expanding this project, there is still functionality that was originally desired but never implemented. With moving the arm relatively straightforward, as described in section 4.2, the focus can shift to some of the more interesting things the robot can do. For instance, there are hard-coded sequences in the GUI; with some interface work, user-defined sequencing could be added, which would bring much more functionality to the robot.

On the Kinect side of the project, the gestures, defined in section 2.4, are straightforward as well and easy for the user to perform. This could be extended with a more complex and interactive system; for instance, in the case of movement by gestures, control could be changed to reflect the user's current position more directly.


6. References

[1] J. Angeles, Fundamentals of Robotic Mechanical Systems: Theory, Methods, and Algorithms, 3rd Edition, Springer Science and Business Media, New York, NY, 2007.

[2] Universal Machine Intelligence Limited, Inside RTX, London, England, 1987.

[3] G. Knight, UMI Robot User and Programmer's Manual, Hitsquad, Sydney, Australia, 1999.

[4] J. Zabala and F. Bejerano, The UMI RTX Robot: Simulated Human Anatomy, 2008.

[5] W. Disotell and C. Mitchell, UMI RTX Controller, Florida Gulf Coast University, CEN 4065, 2014.


Appendix I – Repairing the RTX

Removing RTX Arm

1. Remove Shoulder Joint plastic cover shown in figure 1.

Figure 1

2. Remove screw on lower shoulder joint cover shown in figure 2.


Figure 2

3. Remove the lower two screws on the shoulder joint plate, shown in figure 3.

Figure 3


4. Remove the upper two screws on the shoulder joint plate, shown in figure 4.

Caution: When the screws are undone, the arm will be disconnected from the body, but the wires are still connected.

Figure 4

5. Disconnect the wires from the body, shown in figure 5.


Figure 5

6. Reassemble RTX by following the steps in reverse.

Replacing Broken Elbow Encoder Belt

1. Remove the top protective cover on the Shoulder Link by unscrewing the 4 Phillips head screws, shown in figure 6.


Figure 6

2. Remove the top protective cover on the Elbow Link by unscrewing the 4 Phillips head screws, shown in figure 7.

Figure 7


3. Remove the rubber drive belt on the Shoulder Link, shown in figure 8.

Figure 8

4. Remove the Elbow Drive gear by removing the set screw and prying upward, shown in figure 9.


Figure 9

5. Disconnect the 3 Elbow Link wire harness connections shown in figure 10 and push the slack down the Elbow Joint tube to allow the Elbow Joint to drop.

Figure 10

6. Loosen the 5 screws holding the Elbow Encoder Belt on and slide it inward toward the joint, shown in figure 11.


Figure 11

7. Drop the Elbow Joint down carefully, then slide the old Elbow Encoder Belt off by navigating it past the gripper, shown in figure 12.


Figure 12

8. Reassemble the RTX by following the steps in reverse, adjusting the slack on the Elbow Encoder Belt so that it is snug.


Appendix II – Testing Results

Testing Team: Carson Kirkpatrick, Fernando Gonzales, James Beans

Testing Project Of: Colin Mitchell & William Disotell

Test Date: 4/21/2015


Test Case 1

Title Initialize the RTX

Related Requirements Not provided

Test Conditions The program is opened successfully and this is the first test performed

Inputs Pressing the Connect button

Expected Result A message will be displayed to the user in the display box which will state that the RTX was successfully initialized and ready to use. The RTX will be in the home position.

Actual Result The RTX was initialized and the arm was reset to its home position. The success message was shown in the display box on the screen.

Success/Fail Success: The RTX started as expected

Suggestions None


Test Case 2

Title Test the joint movement sliders in the GUI

Related Requirements Not provided

Test Condition Test Case ID 01

Inputs Move the slider to one extreme and press the joint button

Expected Result Each joint will move from one extreme to the other which can be identified visually on the RTX

Actual Result All joints moved all the way from one extreme to the other as expected.

Success/Fail Success: The joints moved as expected

Suggestions None


Test Case 3

Title Test the kinematic positioning in the GUI

Related Requirements Not provided

Test Conditions Test Case ID 01

Inputs Move the slider to the given positions

Expected Result The Coordinate positions will be X will equal 405mm, Y will equal 0mm, and Z will equal -1004mm.

Actual Result The coordinate positions were X = 405, Y = 0, and Z = -1004 as expected.

Success/Fail Success: The coordinates matched the expected result

Suggestions None


Test Case 4

Title Test the initialization of the Kinect

Related Requirements Not provided

Test Conditions Successfully opened program

Inputs Press the “Kinect” button

Expected Result The center screen in the GUI will be switched to black and the closest detected skeleton will be displayed in green. There will also be a message printed in the output box which states the Kinect was initialized successfully

Actual Result The Kinect was initialized and the output box displayed that the initialization was successful.

Success/Fail Success: The skeleton appeared on the screen.

Suggestions None


Test Case 5

Title Test the joint movement using the Kinect

Related Requirements Not provided

Test Conditions Test Case ID 01, Test Case ID 04, and the user positioned at a detectable location relative to the Kinect. This test also expects that Test Case ID 06 has not been performed

Inputs Perform a hand gesture moving downward with your right hand, wait until the RTX has stopped moving, then perform a hand gesture moving upward with your right hand.

Expected Result After the initial downward movement, the Zed, the default selected joint, will move downward by 100mm, which can be visually identified. After the second movement, the Zed will return to its original position

Actual Result The Zed moved down 100mm after the initial downward movement, then returned to the original position after the second movement.

Success/Fail Success: The movements were matched exactly

Suggestions None


Test Case 6

Title Test the joint selection using the Kinect

Related Requirements Not provided

Test Conditions Test Case ID 01, Test Case ID 04, and the user positioned at a detectable location relative to the Kinect. This test also expects that this test case has not been previously performed

Inputs Perform a hand gesture moving right with your left hand. Then, perform a hand gesture moving left with your left hand.

Expected Result After the first movement is detected, a message to the user will be displayed stating that the currently selected joint is now the Shoulder. The second gesture will change the selected joint back and Zed will be displayed to the user in the output box

Actual Result After the first movement, a message was displayed saying the currently selected joint is the shoulder. After the second movement, a message was displayed saying the currently selected joint is the Zed.

Success/Fail Success: The movements were matched exactly

Suggestions None


Test Case 7

Title Test the voice command to move a joint

Related Requirements Not provided

Test Conditions Test Case ID 01 and Test Case ID 04

Inputs Say “RTX Zed Down”, then say “RTX Zed Up”.

Expected Result The RTX will move the Zed down 100mm after the first voice command, and will move it back up 100mm to its original position after the second command

Actual Result When "RTX Zed Down" was said, the Zed moved down 100mm, and when "RTX Zed Up" was said, the Zed moved up 100mm.

Success/Fail Success: The command was recognized and performed

Suggestions None


Test Case 8

Title Test the voice home command using the Kinect

Related Requirements Not provided

Test Conditions Test Case ID 01 and Test Case ID 04

Inputs Say “RTX Home”

Expected Result The RTX will run its initialization sequence and move back to its home position

Actual Result When “RTX Home” was said, the RTX returned to its home position.

Success/Fail Success: The RTX followed the command

Suggestions None