Senior Design Documentation Library
Detailed Design Specification
Department of Computer Science and Engineering
The University of Texas at Arlington

Chapter 2: Documenting the Architecture

Eagle Eye
Team Members: Berkins, Nicholas; Sanders, Baron (TL); Shrestha, Amit; Tran, Nguyen
1.1 Introduction

This document specifies the detailed design for the Eagle Eye project. Its purpose is to explain in detail all components of the system listed in the Eagle Eye Architecture Design Document, as well as each component's functions and its relations to other components.
1.2 Outline
The basic outline of the detailed design description is:

o Introduction
o Architecture Overview
o Component Design
o Quality Assurance
o Requirements Traceability Matrix
o Acceptance Plan
1.3 Purpose
The purpose of this project is to provide a software system on a ground control station to receive telemetry data and a video stream from an outside source. The software station will identify targets from the video stream and also locate targets using telemetry data. This system will be called Eagle Eye.
1.4 Scope
The system is designed to compete in the Association for Unmanned Vehicle Systems International (AUVSI) student competition. This system can also be beneficial if it is used in the senior design lab, and with modifications it could be used in other target recognition projects. The system is designed to recognize basic shapes, alphanumeric characters, and colors.
The AUVSI competition 2009 scenario was to support a company of US Marines conducting a patrol. An Unmanned Aerial System (UAS) will support their sweep with intelligence, surveillance, and reconnaissance. In order to support them, our UAS must comply with the Air Tasking Order special instructions for departure and arrival procedures, and then remain within the assigned airspace. It will be tasked to search a large area for typical targets, and may be tasked to conduct an immediate route reconnaissance for convoy support or a point reconnaissance if requested. Our software will be the crucial link to identifying the ground targets.
The project scope is to design a system that assists the UAS in mission completion. The system will recognize targets based on the live video stream provided to the user. The system hardware will be a computer serving as a ground station. The system software will detect targets in the webcam/camera's (WC) field of view, and the user will manually control the camera to get the best quality pictures. The system output will be a form based on the recognized targets.
1.5 Acronyms and Abbreviations

HC - Helicopter
WC - Webcam/Camera
CSE - Computer Science and Engineering
UTA - University of Texas at Arlington
AVL - Autonomous Vehicle Lab
AUVSI - Association for Unmanned Vehicle Systems International
UAS - Unmanned Aerial System
ISR - Intelligence, Surveillance and Reconnaissance
1.6 System Requirements
This section details the order of implementation of the key requirements listed in the Eagle Eye System Requirements Document. They are grouped logically from both the development and other stakeholders' points of view, and the groupings are the result of consultation and negotiation with all stakeholders.
Highest Priority
o The system shall have a Graphic User Interface (GUI)
o The system must accept a video stream
o The system shall recognize targets using the video stream
o The system shall be able to recognize the geometric shape of the target
o The system shall be able to recognize the target's alphanumeric character
o The system shall be able to recognize the background color of the target
o The system shall be able to recognize the target’s alphanumeric color
o The system shall be able to store target information into a file
o The system shall allow a user to retrieve all accepted targets and their individual attributes
o The system shall have a playback mode for all the recorded video
Intermediate Priority
o The system shall accept telemetry data
o The system shall find the target location using telemetry data
o The system shall allow the user to accept or decline a target based on a snapshot from the video stream and the telemetry data provided by the HC
2 Architecture Overview

This section describes the overall architecture of Eagle Eye. The system is composed of five layers: the Main Control Layer, User Interface Layer, Data Capture Layer, Target Identification Layer, and Storage Layer. All of the layers work independently, except the Main Control Layer, which interacts with the other four layers.
Figure 2 - Architecture Layer Diagram of Eagle Eye System
2.1 Layer-Name and Description
2.1.1 Main Control Layer

The Main Control Layer is primarily responsible for controlling the Eagle Eye System.
The controller will take in the video stream and the telemetry data from the data capture layer. It will pass this video and telemetry data to the storage and user interface layers. The Control layer will send a single frame from the video to the target identification layer for identification. It will also initiate listeners for the user interface and calls for the appropriate actions.
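The fan-out described above can be sketched as follows. This is an illustrative sketch only; the class and method names (`RecordingLayer`, `onCapture`, and so on) are hypothetical stand-ins, not names from the actual implementation:

```java
import java.util.ArrayList;
import java.util.List;

public class MainControlSketch {
    // Minimal stand-ins for the layers the controller talks to.
    interface Layer { void receive(String data); }

    static class RecordingLayer implements Layer {
        final List<String> received = new ArrayList<>();
        public void receive(String data) { received.add(data); }
    }

    final Layer storage;
    final Layer userInterface;
    final Layer targetIdentification;

    MainControlSketch(Layer storage, Layer ui, Layer targetId) {
        this.storage = storage;
        this.userInterface = ui;
        this.targetIdentification = targetId;
    }

    // Fan captured data out exactly as the text describes: video and
    // telemetry go to the Storage and User Interface layers, while a
    // single still frame goes to Target Identification.
    void onCapture(String videoFrame, String telemetry) {
        storage.receive(videoFrame + "|" + telemetry);
        userInterface.receive(videoFrame + "|" + telemetry);
        targetIdentification.receive(videoFrame); // one still frame only
    }
}
```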
2.1.2 User Interface Layer

The User Interface Layer is primarily responsible for presenting information to the user and taking input from a user.
2.1.3 Data Capture Layer

The Data Capture Layer is primarily responsible for acquiring the video stream from a camera (wireless or USB cable) and telemetry data from an outside source, i.e. from a helicopter or a text file.
This section describes the type of data that flows among layers.
| From | To | Data Flow Description |
| --- | --- | --- |
| Main Control Layer | Storage Layer | The Main Control Layer will send requests to retrieve or save data from the Storage Layer. |
| Main Control Layer | Data Capture Layer | The Main Control Layer will send requests to start/end data capturing. |
| Main Control Layer | User Interface Layer | The Main Control Layer will send information to display on the User Interface Layer. |
| Main Control Layer | Target Identification Layer | The Main Control Layer will send a still frame from the video stream to the Target Identification Layer. The Target Identification Layer will identify targets in the still frame and send the results back to the System Control Subsystem. |
| User Interface Layer | Main Control Layer | The User Interface Layer will send user inputs to the Main Control Layer. |
| Storage Layer | Main Control Layer | The Storage Layer will provide the Main Control Layer the option to retrieve stored videos and the identified targets based on the AVL format. |
| Data Capture Layer | Main Control Layer | The Data Capture Layer will provide video streams and telemetry data. |
| Target Identification Layer | Main Control Layer | The Target Identification Layer will send the potential targets found in a still frame. |
Now that the inter-subsystem data flows are explicitly defined above, each data element flowing between subsystems is described in the table below.
Data Element Descriptions
| Data Element | Description |
| --- | --- |
| 1 | The signal from the System Control Subsystem, based on the user input, to start receiving telemetry and video data. The reverse flow of data is the video stream, along with the telemetry data, going back from the Capture Controller Subsystem to the System Control Subsystem. The flow also contains a still captured frame with its given telemetry data when requested by the user. |
| 2 | The telemetry data that comes from the outside source (UAS or a text file) to the Telemetry Acquisition Subsystem. |
| 3 | A live video stream that comes from a camera to the Video Stream Acquisition Subsystem. |
| 4 | The telemetry data that was received by the Telemetry Acquisition Subsystem and is now being sent to the Capture Controller Subsystem. |
| 5 | A live video stream or a video file that was received by the Video Stream Acquisition Subsystem and is being sent to the Capture Controller Subsystem. |
| 6 | When the System Control Subsystem has received the target along with its attributes, the System Control sends it to the GUI to be displayed. The reverse flow of data is the user input to the Main Control Subsystem. |
| 7 | The flow of data from the System Control Subsystem to the Storage Manager Subsystem is the user action to retrieve saved data, plus the constantly recorded video stream. The flow of data from the Storage Manager Subsystem to the System Control Subsystem occurs if the user decides to retrieve stored targets or video streams from the Storage Layer. |
| 8 | The video stream and telemetry data provided to the Still Frame Capture Subsystem when desired by the user. Once a target has been identified (with its still frame and attributes), the System Control receives the results from the Target Identification Subsystem. |
| 9 | The Target Identification Subsystem will call the Attributes Training Set Subsystem to provide the attributes to look for in the still frame. |
| 10 | The video stream that was received from the Video Processing Subsystem by the Storage Manager Subsystem. It is also the stored video that the Storage Manager Subsystem sends back to the System Control Subsystem. |
| 11 | The stored target info that was received from the Target Identification Subsystem by the Storage Manager Subsystem. It is also the stored target info that the Storage Manager Subsystem sends back to the System Control Subsystem. |
| 12 | The user input to the system. |
Table 3.1 – Inter-Subsystem Data Element Descriptions
Table description
Data Element - The name by which the data element is referred to in the rest of the document. "Data element" is also used for objects.
Description - A written description of the data element.
Producer Subsystem - The name of the layer or subsystem that creates the data elements specified. Note that this is the layer that invokes the constructor for the data element, not necessarily the layer that owns the constructor.
Consumer Subsystem - The name of the layer or subsystem that uses the data element created by the producer layer.
Table Cells - The intersection of the two layers. At the intersection of the two layers is a list of data element numbers referring to the Inter-Subsystem Data Element Description Table (Table 3.1). Elements listed in the intersection are created by the subsystem or layer on the left and consumed by the subsystem or layer above.
4 Component Design

This section details the individual functions that will be utilized in each subsystem. For each of these functions, inputs, outputs, data structures, and important dependencies are defined. Each layer is broken down into subsystems and then into components, all of which are described in detail below. Also included are the assumptions and responsibilities for each subsystem and component.
4.1 Data Capture Layer
4.1.1 Telemetry Acquisition Subsystem
4.1.1.1 General
The Telemetry Acquisition Subsystem sends a message to an outside source to send telemetry data. Telemetry data from an outside source includes the speed and altitude of the HC, the GPS location, the position of the HC with respect to the ground, and the position of the camera with respect to the HC and the camera axes. For a single axis, the data provides a floating-point number locating the WC on the X-Y plane.
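The requirement to locate targets from telemetry can be illustrated with a deliberately simplified, flat-ground calculation: given the HC's altitude and the camera's tilt measured from straight down, the target's forward ground offset follows from basic trigonometry. This is a sketch under those assumptions, not the system's actual geolocation method; the class and method names are hypothetical.

```java
public class GroundOffset {
    // altitudeM: height above ground in meters.
    // tiltRad: camera tilt from nadir (0 = pointing straight down), radians.
    // Returns how far ahead of the HC the viewed ground point sits, in
    // meters, assuming flat ground directly below the aircraft.
    static double forwardOffsetMeters(double altitudeM, double tiltRad) {
        return altitudeM * Math.tan(tiltRad);
    }
}
```

For example, at 100 m altitude with the camera tilted 45 degrees from vertical, the viewed point is roughly 100 m ahead of the HC; the real system would additionally fold in the HC attitude angles from the telemetry record.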
4.1.1.2 Assumptions
Telemetry data received from the outside source will be specific, such as the speed and altitude of the HC, the GPS location, and the positions of the HC and WC with respect to the ground and the target.
4.1.1.3 Responsibilities
Record the speed of the HC
Record the altitude of the HC
Record the GPS location
Record the position of the HC with respect to the ground
Record the position of the WC with respect to the HC, based on the degrees of freedom of the WC
Send the received telemetry data to the Telemetry Data Preparation Subsystem
4.1.1.4 Subsystem Inter-layer Interfaces
| Method | Description | Information Required | Information Returned |
| --- | --- | --- | --- |
| get_Raw_TD | Retrieves telemetry data from outside sources. | Source Id | Telemetry data |

Table 4.1.1.1: Subsystem Inter-layer Interfaces to Telemetry Acquisition Subsystem
4.1.1.5 Public Methods

get_Raw_TD(/* the state variables described in the table below */)
{
    // assign the TD variables to a local variable for returning to the
    // capture controller
}
TD-TABLE

| Name | Col. Offset | Type | Description |
| --- | --- | --- | --- |
| Phi | 0 | Double | Roll angle in radians, -π < φ < π. Zero is level with the horizon and increasing φ is right wing down. |
| Theta | 8 | Double | Pitch angle in radians, -π/2 < θ < π/2. Zero is level with the horizon and increasing θ is nose up. |
| Psi | 16 | Double | Heading (yaw) angle in radians, -π < ψ < π. Zero is due north and increasing ψ is rotating clockwise when viewed from above. East is ψ = π/2, west is ψ = -π/2, and south is ψ = ±π. Be sure to account for the discontinuity when facing south; it can cause problems if you compute Δψ = ψ - ψ'. Use the turn_direction routine instead. |
| P | 24 | Double | Body frame roll rate in radians (unbiased). Positive p is rolling right wing down. The actual range of the reading should be ±90°, although very high readings can result from bad gyro bias calibrations. |
| Q | 32 | Double | Body frame pitch rate in radians (unbiased). Positive q is pitching nose up. |
| R | 40 | Double | Body frame yaw rate in radians (unbiased). Positive r is rotating clockwise when viewed from above. |
| X | 48 | Double | North position in local tangent plane in m (filtered). |
| Y | 56 | Double | East position in local tangent plane in m (filtered). |
| Z | 64 | Double | Down position in local tangent plane in m (filtered). Remember that negative z positions are above ground level. |
| Vx | 72 | Double | Forward velocity in body frame in m/s. |
| Vy | 80 | Double | Sideways velocity in body frame in m/s. |
| Vz | 88 | Double | Downward velocity in body frame in m/s. Remember that negative vz is actually a climb when the aircraft is upright. |
| Mx | 96 | Double | Forward magnetometer reading (unscaled, filtered). The readings are signed and corrected for the hard iron calibration values that are in the AFCS's configuration file, but they are not scaled to any units like Tesla. This is not a problem since they are only used in proportion to each other, not in comparison to any outside magnetic force. |
| My | 104 | Double | Sideways magnetometer reading (unscaled, filtered). |
| Ax | 120 | Double | Forward accelerometer reading in m/s/s (filtered). The acceleration values are roughly scaled to m/s/s and zeroed for when the IMU is level, although there may be some slight scale error. As with the magnetometer readings, this is not significant since the accelerometers are used in proportion to each other. |
| Ay | 128 | Double | Sideways accelerometer reading in m/s/s (filtered). |
| Az | 136 | Double | Downward accelerometer reading in m/s/s (filtered). |
| Raw_p | 144 | Double | Raw body frame roll rate in radians (biased). These raw values are typically not used by any application, but may be used for post-flight analysis of gyro bias drift during the flight. |
| Raw_q | 152 | Double | Raw body frame pitch rate in radians (biased). |
| Raw_r | 160 | Double | Raw body frame yaw rate in radians (biased). |
| Crc_err | 168 | Double | Number of CRC errors on the IMU serial link. |
| Voltage | 176 | Double | IMU bus voltage. This is computed from the band gap reference inside the IMU. 4.9 - 5.1 is good; if it drops below 4.8 volts the battery is nearly dead and not producing sufficient current for the regulator to maintain supply voltage. Expect the artificial horizon to begin tumbling and other problems to occur if this happens. |
| Resets | 184 | Double | Number of IMU resets. If this value ever changes, the IMU has received a significant shock and is likely to be producing bad data for a while. It is best to discontinue the flight as soon as possible. |
| Rpm0 | 186 | int | Engine RPM |
| Rpm1 | 188 | int | Main-rotor RPM |
| Rpm2 | 190 | int | Tail-rotor RPM |
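As a sketch of how the TD-TABLE offsets might be consumed, the fragment below reads the three attitude angles out of a raw record and computes a heading change that respects the south-facing discontinuity noted for Psi. The little-endian byte order and the `TdRecord`/`headingDelta` names are assumptions, not part of the documented AFCS format; `headingDelta` stands in for the turn_direction routine the table mentions.

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class TdRecord {
    final double phi, theta, psi;

    TdRecord(byte[] raw) {
        // Byte order is an assumption; check the actual wire format.
        ByteBuffer b = ByteBuffer.wrap(raw).order(ByteOrder.LITTLE_ENDIAN);
        phi   = b.getDouble(0);   // roll,    -pi..pi
        theta = b.getDouble(8);   // pitch,   -pi/2..pi/2
        psi   = b.getDouble(16);  // heading, -pi..pi, zero = north
    }

    // Shortest signed heading change from psiFrom to psiTo, wrapped into
    // (-pi, pi]. The naive psi - psi' subtraction breaks when facing
    // south, which is why the table warns against it.
    static double headingDelta(double psiTo, double psiFrom) {
        double d = psiTo - psiFrom;
        while (d > Math.PI)   d -= 2 * Math.PI;
        while (d <= -Math.PI) d += 2 * Math.PI;
        return d;
    }
}
```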
4.1.1.6 Private Methods
None
4.1.2 Video Stream Acquisition Subsystem
4.1.2.1 General
The Video Stream Acquisition Subsystem will receive video from the WC. It is also responsible for sending all of the received video to the Capture Controller Subsystem.
4.1.2.2 Assumptions
The camera will have high enough resolution to identify targets.
The video stream will maintain a standard, consistent pixel resolution and format.
4.1.3 Capture Controller Subsystem

4.1.3.3 Responsibilities

Receive telemetry data from the Telemetry Acquisition Subsystem
Prepare telemetry data received from the outside source
Send/transfer telemetry data to the System Control Subsystem
Receive and process video data from the Video Stream Acquisition Subsystem
Send/transfer video data to the System Control Subsystem
Send/transfer video data to the Storage Manager Subsystem
Capture the still frame from the video stream
Calculate the telemetry data of the captured frame
Send the captured frame and telemetry data to the Target Identification Subsystem
4.1.3.4 Subsystem Inter-layer Interfaces
| Method | Description | Information Required | Information Returned |
| --- | --- | --- | --- |
| start_Video_Acquisition | Start the Video Capture Subsystem. | Video source | Pass/Fail |
| start_TD_Acquisition | Message from the User Input Subsystem to start retrieving telemetry data from the UAS. | User input message | Pass/Fail |
| get_TD | Send/transfer two sets of telemetry data to the System Control Subsystem and the Still Frame Capture Subsystem. | Time | 2 sets of telemetry data |
| get_Video | Send video to the main controller. | Time | Video |
| capture_Frame | Capture the time of the user input and call functions to get the frame at that time from the video stream, as well as calculate the telemetry data of that frame. | Time | Still frame from the video |

Table 4.2.1.1: Subsystem Inter-layer Interfaces to Capture Controller Subsystem
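One plausible way capture_Frame could "calculate the telemetry data of that frame" is to linearly interpolate each telemetry value between the two samples that bracket the frame's timestamp. The sketch below shows this for a single scalar; it is a hedged illustration under that assumption, not the implemented method, and the names are hypothetical.

```java
public class TelemetryInterp {
    // t0/v0 and t1/v1 are the telemetry samples bracketing the frame;
    // tFrame is the frame's capture time. Returns the value linearly
    // interpolated to tFrame.
    static double interpolate(double t0, double v0,
                              double t1, double v1, double tFrame) {
        if (t1 == t0) return v0;                 // degenerate: same timestamp
        double alpha = (tFrame - t0) / (t1 - t0);
        return v0 + alpha * (v1 - v0);
    }
}
```

Angular quantities such as Psi would need the wrap-around handling discussed with the TD-TABLE rather than plain linear interpolation.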
4.1.4 Target Identification Subsystem

4.1.4.1 General

The Target Identification Subsystem is responsible for identifying a target's geometric shape, alphanumeric character, alphanumeric color, and background color. This subsystem receives data from the Still Frame Capture Subsystem and the User Inputs Subsystem. It is also responsible for sending the information (identified targets with their attributes) to the Storage Layer Subsystem and the System Controls Subsystem.
4.1.4.2 Assumptions
None
4.1.4.3 Responsibilities
Receive data from Still Frame Capture Subsystem
Identify targets
o Identify target’s basic geometric shapes
o Identify target’s background color
o Identify target’s alphanumeric character
o Identify target’s alphanumeric color
o Identify target's location
Send the identified target to the System Control Subsystem
4.1.4.4 Subsystem Inter-layer Interfaces
| Method | Description | Information Required | Information Returned |
| --- | --- | --- | --- |
| Identify_Targets | Receive data from the Still Frame Capture Subsystem and identify targets. | Still frame and telemetry: an object of the "image" class and an object of the "telemetryData" class | Pass/Fail |

Table 4.2.4.1: Subsystem Inter-layer Interfaces to Target Identification Layer Subsystem
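A toy illustration of the kind of decision Identify_Targets must make is mapping a detected contour's approximated vertex count to one of the basic geometric shapes the requirements call for. Real shape recognition (contour extraction, polygon approximation, color segmentation) is well beyond this sketch, and the class is hypothetical.

```java
public class ShapeClassifier {
    // Classify a polygon approximation of a target outline by its
    // number of vertices; many-sided approximations are treated as
    // circle-like, matching the basic shapes in the requirements.
    static String classify(int vertexCount) {
        switch (vertexCount) {
            case 3:  return "triangle";
            case 4:  return "quadrilateral";
            case 5:  return "pentagon";
            case 6:  return "hexagon";
            default: return vertexCount > 6 ? "circle-like" : "unknown";
        }
    }
}
```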
4.1.5 Storage Manager Subsystem

4.1.5.1 General

The Storage Manager Subsystem will store the video from the video source as well as the targets from the Target Identification Layer. The subsystem will allow for retrieval of targets and video.
4.1.5.2 Assumptions
None
4.1.5.3 Responsibilities
Store all the recorded videos
Store all the target information, including their respective attributes
Make video and target information available for the user
4.1.5.4 Subsystem Inter-layer Interfaces
| Method | Description | Information Required | Information Returned |
| --- | --- | --- | --- |
| store_Videos | Store the video received from the Video Processing Subsystem (Target Identification Layer). | Video data | Pass/Fail |
| store_Targets | Store targets and their attributes received from the Target Identification Subsystem. | Identified targets and their attributes | Pass/Fail |
| Retrieve_Videos | Send stored video to the System Controls Subsystem (User Interface Layer). | Video file name | Video |
| Retrieve_Targets | Send stored information to the System Controls Subsystem (User Interface Layer). | Stored targets and their attributes | Stored targets and their attributes |

Table 4.3.1.1: Subsystem Inter-layer Interfaces to Storage Manager Layer Subsystem
4.1.5.5 Public Methods
store_Videos(/* parameter for video received from the Main Control Subsystem */)
{
    // retrieve the "telemetryData" object from one file name
    // send the picture + telemetryData to the Storage Controller
    // decrease counter
}
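A hedged sketch of what store_Targets might do is shown below: append one comma-separated line per identified target (shape, alphanumeric character, background color, character color) to a plain-text file. The field layout and class name are illustrative assumptions; the real system stores data in the AVL format, which this sketch does not reproduce.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

public class TargetStore {
    // Write each target's attribute array as one comma-separated line.
    static void storeTargets(Path file, List<String[]> targets) throws IOException {
        StringBuilder sb = new StringBuilder();
        for (String[] t : targets) {
            // fields: shape, alphanumeric, background color, character color
            sb.append(String.join(",", t)).append('\n');
        }
        Files.writeString(file, sb.toString());
    }
}
```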
4.1.7.6 Private Methods
None
4.1.8 Attributes Training Set Subsystem
4.1.8.1 General
The Attributes Training Set Subsystem is responsible for defining a target's attributes, such as its geometric shape, alphanumeric character, alphanumeric color, and background color. This subsystem sends training data to the Target Identification Subsystem when an object needs to be identified.
The System Control acts as a manager and checks everything that comes into its subsystem. The System Control receives the video stream, the HC/WC telemetry data, the target attributes from the Target Identification Layer, and anything that is received from the Storage Layer. Once all of this is received, the System Control sends the information to the View Subsystem to be displayed to the user.
| Method | Description | Information Required | Information Returned |
| --- | --- | --- | --- |
| actionListener | Listens for the user input from the GUI layer; then, based on that user input, performs the equivalent action desired by the user. | The action event or button chosen in the GUI | None |

Table 4.4.3.1: Public Interface to System Control Subsystem
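The actionListener dispatch described above can be sketched with the standard java.awt.event types: the System Control registers one handler per GUI action command and runs the matching one when an event arrives. The command names ("capture", "playback") and the `ControlDispatch` class are hypothetical; only the `ActionListener`/`ActionEvent` interfaces are standard.

```java
import java.awt.event.ActionEvent;
import java.awt.event.ActionListener;
import java.util.HashMap;
import java.util.Map;

public class ControlDispatch implements ActionListener {
    final Map<String, Runnable> handlers = new HashMap<>();
    String lastAction = null;   // recorded for illustration

    ControlDispatch() {
        handlers.put("capture", () -> lastAction = "captured frame");
        handlers.put("playback", () -> lastAction = "started playback");
    }

    @Override
    public void actionPerformed(ActionEvent e) {
        Runnable h = handlers.get(e.getActionCommand());
        if (h != null) h.run();   // ignore commands with no handler
    }
}
```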
4.3.1.5 Public Methods
public void display_TD(/* a string that contains the type of data file */)
{
    get_TD();
}

public void display_VideoStream(/* a string that contains the type of video file */)
{
    get_Video();
}
public void display_IdentifiedTargets(/* what choice was made */)
{
    if (/* the user wants video that has been stored */) {
        retrieve_Videos();
    } else if (/* the user wants targets that have been identified and stored */) {
        retrieve_Targets();
    }
}
7 Acceptance Plan

This section describes the conditions under which the final product will be deemed acceptable by the stakeholders. It is a duplicate of the acceptance criteria section from the System Requirements Document.
7.1 Criteria
7.1.1 The Eagle Eye system shall accept a video stream
7.1.1.1 Description
The system will take video as input from the video recorder on the autonomous helicopter.
7.1.1.2 Source:
AV-Competition Rules 2009
7.1.1.3 Applicable Constraints
Video must be at least 800x600 resolution.
7.1.1.4 Applicable Standards
N/A
7.1.2 The Eagle Eye system shall recognize the characteristics of a target with a given video stream and other data from the HC.
7.1.2.1 Description
The system will be able to recognize targets' alphanumeric characters, colors, and geometric shapes from a given video stream and other given data.
7.1.2.2 Source:
Dr. Reyes and the AV-challenge documentation
7.1.2.3 Applicable Constraints
The object is no bigger than 8 square feet, the colors used must be from the standard color chart, and the video must be at least 800x600 resolution.