
ECE791 Final Report

PROJECT TITLE: Quadsat C
PROJECT TEAM: Nick Frederico, Drew Stock, Jesse Mailhot, Sean Patry, Tim Patterson, Brendan Martin, Brad Poegel, Mike Boyd
ECE FACULTY ADVISOR: Tom Miller
COURSES INVOLVED: ECE 543, ECE 583, ECE 562, ECE 541, ECE 548, ECE 649, ECE 772, ECE 796
CURRENT DATE: December 2013
PROJECT COMPLETION DATE: April 2014


Abstract

Control algorithms are an important part of many systems. This project's goal is to model the four unmanned satellites of NASA's MMS mission as well as to provide a test bed for future formation flying and control algorithms. Controlling attitude and position is necessary to achieve autonomous flight. Linear Quadratic Regulator (LQR) control and Proportional Integral Derivative (PID) control were both used to control the attitude and position.

A vision system was designed to track the three-dimensional location of the quadcopter in real time. The vision system software sends its data to a Graphical User Interface (GUI) application, where the actual location of the quadcopter is calculated and sent to the on-board microprocessor. This data feeds the negative feedback control loop so that position can be controlled. The GUI also graphs the attitude and position data in real time and sends commands to start, stop, or land the quadcopter.

The on-board microprocessor communicates with the accelerometer and gyroscope via the I2C protocol. The data obtained from each sensor is sent to the negative feedback control loop to control the rate and position of yaw, roll, and pitch. Several different control implementations were used in this process: cascaded PID control in the beginning stages of the project, then LQR, and finally a combination of both PID and LQR.

The quadcopter successfully achieved altitude control using the data from the vision positioning system. Roll and pitch were successfully controlled using PID and LQR on the uni-axial rig, which allowed testing of roll and pitch separately. The final stage placed the quadcopter on a two-pole rig so that it cannot yaw; the rig also significantly limits how much it can roll and pitch. This implementation shows that the quadcopter is close to autonomous flight.

Figure 1: System layout and flow.


Background

NASA is launching the Magnetospheric Multiscale Mission (MMS), an unmanned space mission that will send four satellites into space in a tetrahedral formation. Each satellite has 8 booms used for taking measurements, and each satellite must achieve a 3 rpm spin rate. This project is meant to model NASA's MMS satellites and provide a grounded test bed for future formation flying control systems and control algorithms.

Figure 2: Representation of NASA's MMS with four satellites in tetrahedral formation.

Method

Controller

To achieve autonomous flight we needed two forms of control: attitude and position. The attitude controller went through several stages before reaching its current state. At the start of the project we planned on using Linear Quadratic Regulator (LQR) control for attitude. This is a slightly more complex algorithm, but it is meant to provide a better control response than the alternatives. After a period of time in which we were not seeing good results from the LQR controller, we switched to Proportional Integral Derivative (PID) control. PID control is very easy to implement and very easy to tune when it is not quite correct, but it can give a slightly less desirable response than LQR. After considerable gain tuning of the PID controller, we had achieved some level of attitude control, but not enough for sustained flight. We then decided to use a hybrid method: a PID controller for the attitude angle of the quadcopter, and an LQR controller for the attitude rate. This gave us even better results, but still not quite good enough for sustained flight.
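A minimal sketch of this hybrid scheme is shown below: a PID loop on the angle produces a desired rate, and an LQR-derived gain on the rate error produces the motor correction. The gains and names are illustrative placeholders, not the values tuned on the quadcopter.

```cpp
// Sketch of the hybrid attitude controller: PID on the angle produces a
// desired rate, and an LQR-derived gain on the rate error produces the
// motor correction. Gains and names are illustrative assumptions.
struct Pid {
    double kp, ki, kd;
    double integral = 0.0;
    double prevError = 0.0;

    double update(double error, double dt) {
        integral += error * dt;
        double derivative = (error - prevError) / dt;
        prevError = error;
        return kp * error + ki * integral + kd * derivative;
    }
};

// One axis (e.g. roll) of the cascaded controller, run every control cycle.
double attitudeCommand(Pid &anglePid, double angleSetpoint,
                       double angle, double rate,
                       double kLqr, double dt) {
    double desiredRate = anglePid.update(angleSetpoint - angle, dt);
    return kLqr * (desiredRate - rate);  // rate loop with an LQR-derived gain
}
```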

We have only gone as far as implementing a height control algorithm, because stable attitude control is required before translational control can be implemented. PID control is used for the height controller. When testing in a height-only rig (the quadcopter is attached to two metal poles and can only move up and down), the PID controller worked reasonably well. We are currently working on implementing a sliding mode controller, which encompasses attitude as well as position control.
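The height loop follows the same pattern. A hedged sketch, reusing the Pid struct above with an assumed hover-thrust offset and thrust range:

```cpp
// Hedged sketch of the height controller: a PID correction on the
// vision-supplied altitude is added to an assumed hover-thrust estimate.
// hoverThrust and the [0, 1] thrust range are placeholders.
double heightCommand(Pid &heightPid, double zSetpoint, double zVision,
                     double hoverThrust, double dt) {
    double thrust = hoverThrust + heightPid.update(zSetpoint - zVision, dt);
    if (thrust < 0.0) thrust = 0.0;  // clamp to the motors' usable range
    if (thrust > 1.0) thrust = 1.0;
    return thrust;
}
```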


Vision System

To control its three-dimensional position, the quadcopter needs to know where it is in space. To give it this information, we developed a vision system for tracking. This system was implemented using two PSEYE cameras tracking an LED-lit ping pong ball, an image processing program written in C++ using the OpenCV library, and wireless communication via XBee. The quadcopter with the LED-lit ping pong ball is shown in Figure 3, and the design of the vision system is shown in Figure 4.

The image processing program constantly takes snapshots of the image streaming from the PSEYE for processing. It then takes each snapshot and filters it based on a range of RGB values that we provide for the color we want to track (we used blue, but kept the option of tracking different colors for when we track multiple quadcopters at once). After we have filtered the image, we create a greyscale image based on the filtered data, where everything within the range of values is colored in a shade of white and everything that fell outside of the range is colored black. This image is then scanned starting from the top left, searching for pixels above a certain threshold of brightness (because the brighter the white, the closer the match) until the first matching pixel is found. We then consider this the "pixel location" of the ball relative to the camera. The two images from the vision software are shown in Figures 5 and 6. Initially, we used a Hough Transform algorithm provided by the OpenCV library to create a best-fit circle around the ball. This, however, turned out to be very inaccurate: it would often pulsate, making it appear that the ball was moving when it was stationary, or create circles around other objects as well. After struggling with the Hough Transform for a while, we decided to do that portion of the processing on our own, in a slower but much more accurate way.
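A hedged sketch of this filter-and-scan step using the OpenCV C++ API is shown below. The report does not give the exact color range or threshold, and cv::inRange produces a plain binary mask rather than graded shades of white, so the values here are assumptions.

```cpp
#include <opencv2/opencv.hpp>

// Sketch of the filter-and-scan step; the blue range and brightness
// threshold are assumed values, not the ones used in the project.
cv::Point firstMatchingPixel(const cv::Mat &frame) {
    cv::Mat mask;
    // OpenCV stores color images as BGR; keep pixels in an assumed blue range.
    cv::inRange(frame, cv::Scalar(120, 0, 0), cv::Scalar(255, 100, 100), mask);

    // Scan from the top-left for the first pixel above the threshold.
    for (int row = 0; row < mask.rows; ++row) {
        const uchar *p = mask.ptr<uchar>(row);
        for (int col = 0; col < mask.cols; ++col) {
            if (p[col] > 200) return cv::Point(col, row);
        }
    }
    return cv::Point(-1, -1);  // ball not found in this frame
}
```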

The pixel location from each camera is then sent to our Java application (which also includes a graphical interface, described later). The Java application is told before the flight has started how far away the ping pong ball is from each camera. This is vital, because the physical distance the ball has actually moved when it moves a certain number of pixels on the image depends on how far away it is from the camera. So when the Java application receives this information, it first cross-references the data to determine how much closer to or further from each camera the ball has moved, so that it can recalibrate the "pixels-per-inch" value for the next captured image. It then also determines how far the ball has moved from its original location, which we set as location (0, 0).
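The report does not give the exact recalibration formula, but under a simple pinhole-camera assumption pixels-per-inch scales inversely with distance, which is enough to sketch the conversion. The application itself is written in Java; the sketch below is in C++ for consistency with the other listings, and all constants are hypothetical calibration values.

```cpp
#include <cstdio>

// Pinhole-camera assumption: apparent size shrinks linearly with distance,
// so pixels-per-inch at distance d is ppiRef * dRef / d, where ppiRef was
// measured at the calibration distance dRef before flight (both assumed).
double pixelsPerInch(double ppiRef, double dRef, double distance) {
    return ppiRef * (dRef / distance);
}

int main() {
    const double ppiRef = 12.0;  // pixels per inch at the calibration distance
    const double dRef   = 60.0;  // calibration distance, inches
    double distance = 72.0;      // current distance, cross-referenced from the other camera
    double dxPixels = 30.0;      // pixel displacement since the last frame

    // 30 pixels / (12 * 60 / 72) pixels-per-inch = 3.00 inches of motion
    std::printf("moved %.2f inches\n",
                dxPixels / pixelsPerInch(ppiRef, dRef, distance));
    return 0;
}
```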

When this location has been calculated, it is sent in a wireless command via XBee to the on-board Arduino. Currently, only the height value is sent, because we have not yet implemented translational control. The Arduino then uses this height value in its height controller.
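A sketch of what the receive path on the Arduino might look like: with the XBee acting as a transparent serial link, the height can arrive as a small framed packet. The 'H' framing byte and the 0.1-inch units are assumptions, not the actual protocol.

```cpp
// Arduino sketch fragment; Serial is the XBee's serial link.
void readHeight(float &heightInches) {
    if (Serial.available() >= 3 && Serial.read() == 'H') {
        int hi = Serial.read();                  // high byte of the height
        int lo = Serial.read();                  // low byte of the height
        heightInches = ((hi << 8) | lo) / 10.0;  // assumed 0.1-inch units
    }
}
```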


Figure 3: The quadcopter with the LED-lit ping pong ball for tracking.

Figure 4: Vision system design: two PSEYE cameras are set up orthogonally, each receiving a 2D pixel location.

Figure 5: Original image of the quadcopter.

Figure 6: Processed greyscale image from the vision software with filtered RGB values; white is what is being tracked.


Graphical User Interface

The other portion of the Java application is the graphical interface (Figure 7). This was created so that we can more easily interact with the quadcopter and view real-time data in a meaningful manner.

The reason we initially created the GUI was so that we could send wireless commands to the quadcopter by just pressing buttons. Up until this point, we had been starting the quadcopter with just a 10 second delay on the start-up of the Arduino, so we would have to power cycle it every time we wanted to fly it again, and we had to set a time limit on the flight because we could not manually shut it off. Because of this, we wanted to create start and stop buttons that would simply send a command via XBee to the Arduino to start and stop flight. We then also added a "spin up" button, which simply told the Arduino to spin the propellers at the slowest possible speed, so that we could make sure the propellers were working, mounted in the correct spots, and spinning in the correct directions. This greatly improved the safety of our tests.
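On the Arduino side, a one-byte command scheme like this is easy to dispatch. The sketch below is illustrative: the byte values and handler names (startFlight, stopMotors, spinUp, beginLanding) are assumptions, not the protocol used on the quadcopter.

```cpp
// Arduino sketch fragment. The handlers are hypothetical:
void startFlight();   // begin the control loop
void stopMotors();    // kill all motor output
void spinUp();        // spin propellers at minimum speed
void beginLanding();  // cap thrust just below hover

void pollCommands() {
    while (Serial.available() > 0) {
        switch (Serial.read()) {          // each GUI button sends one byte
            case 'g': startFlight();  break;
            case 'x': stopMotors();   break;
            case 'u': spinUp();       break;
            case 'l': beginLanding(); break;
        }
    }
}
```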

We then wanted to see the motor thrust values in a better way than just printing them to the interactions window in real time, so we added a "save" button. The Java program constantly receives thrust data wirelessly from the Arduino and stores it in array lists. When the "save" button is pressed, the program exports this data to a Microsoft Excel spreadsheet so that we can better analyze it.

We also added a few other buttons throughout the project: a "control" button, which tells the quadcopter to transition from spin up to full flight, and a "stop" button, which stopped the program timer and was later replaced by a "land" button, which lowers the total allowable force output by the motors to just below the amount required to lift the quadcopter, so that it will slowly float back down to the ground. The land button was implemented well into our full flight test period, because the chassis of the quadcopter had been slightly distorted by the full-force crash landings it had taken.

When we began creating the GUI, we also realized that we had a great opportunity to show real-time data being streamed from the on-board Arduino. We created a Java class that creates a sliding graph, so that as we continue to send new pieces of information to it, it adds the newest point to the right of the graph and removes the oldest point from the left side of the graph, so we can always look at the past 14 seconds of information.
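The sliding behavior amounts to a fixed-capacity window. Our class is a Java GUI component; the underlying data structure can be sketched as follows (C++ for consistency with the other listings; the capacity covers roughly 14 seconds at an assumed sample rate).

```cpp
#include <deque>

// Fixed-capacity window behind a sliding graph: new samples enter on the
// right, and once the window is full the oldest sample drops off the left.
class SlidingWindow {
    std::deque<double> points;
    std::size_t capacity;
public:
    explicit SlidingWindow(std::size_t cap) : capacity(cap) {}
    void push(double v) {
        points.push_back(v);
        if (points.size() > capacity) points.pop_front();
    }
    const std::deque<double>& data() const { return points; }
};

// e.g. 14 seconds of data at an assumed 20 samples per second:
// SlidingWindow rollWindow(14 * 20);
```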

We also created a three-dimensional graph class that shows where the quadcopter is in space. In addition, we created a "cam panel" class so that we can graph the two-dimensional location of the quadcopter relative to each camera.


Figure 7: The Graphical User Interface. There are six buttons on the top row; a "land" button has since been implemented but is not shown here. There are three graphs showing roll, pitch, and yaw; these graphs show the raw data before it passes through a filter, as well as the filtered data.

Contribution

The original LQR controller was designed by Brendan Martin, with assistance from Tim Patterson. The flight code using this control scheme was implemented by Jesse Mailhot, with help from Nick Frederico and Drew Stock. The physical motor model was found experimentally using a force transducer by Brad Poegel.

The PID controller, along with the hybrid PID and LQR controller, was again designed by Brendan Martin, with assistance from Tim Patterson, Drew Stock, and Nick Frederico. XBee communication was implemented primarily by Drew Stock, with assistance from Nick Frederico and Jesse Mailhot. The SolidWorks model of the system was created by Sean Patry. The rigs for controls testing were primarily designed by Mike Boyd.

The original vision system using the Hough Transform was designed primarily by Nick Frederico. The filtered image scanning method which replaced the Hough Transform was created by Drew Stock, with assistance from Nick Frederico. All of the custom graph classes for the GUI were designed by Nick Frederico, as well as the Java vision system processing, with some assistance from Drew Stock.

The sliding mode controller is currently being designed by Brendan Martin, with assistance from graduate school adviser Thomas Fuller. The new flight code on the BeagleBone Black is being written by Jesse Mailhot, with assistance from Drew Stock and graduate school adviser Thomas Fuller.

Discussion & ConclusionsThe purpose of this project is to create a test bed for future formation flying as

well as developing control algorithms for autonomous flight. This purpose included many tasks such as a working vision system, communication between the on-board microprocessor and computer, a Graphical User Interface ( GUI ), communication between the accelerometer and gyro and the Arduino, and control algorithms on the Arduino.

The vision system provides the pixel location of the quadcopter to a great degree of accuracy, thanks to the ability to change the exposure of the cameras as well as to filter on the specified color range. The Java application receives this information and correctly calculates the three-dimensional position. This software works as intended, based on the specified goals.

The communication using XBee works as intended for the most part. There were some problems in sending data, because the Arduino code was not parsing the data efficiently. We changed this so that the Java application sends only one byte of data to the Arduino, allowing it to quickly receive a command such as start, stop, or land.

The main portion of this project that did not work as intended was the control algorithms. The quadcopter performed well on the uni-axial rig, but once it was in full tethered flight the system became very non-linear and much harder to control.

There are a few things that could have been done differently. One is using a Kinect for the vision system, although this was not necessary given the progress of the other aspects of the project. Keeping track of all the hardware and software changes would have saved a lot of time, since there were many setbacks. We spent a significant amount of time gain tuning, and this became a sort of "guess and check" scenario; if we had spent our time more efficiently here, perhaps more could have been accomplished.

The next step for this project is to change microprocessors from an Arduino to a BeagleBone Black. This allows for a much faster processor, but also causes complications, such as getting wireless communication to still work. Another next step is to use a rod with a ball on it to control the altitude of the quadcopter: a person holding the rod would control the height of the quadcopter through the vision system by moving the ball to a desired z position. This requires more code to be written in the vision system software as well as the Java application. It could be significant, since it would show the capabilities of the vision system as well as how well the quadcopter can control its altitude.

We successfully accomplished a working vision positioning system for an object in three-dimensional space. Software in Java was designed to obtain this information from the vision software, create a visual representation of the quadcopter in three-dimensional space, and send the on-board microprocessor the current altitude data. Communication between the on-board microprocessor and the Java application was established using XBee. Altitude control was achieved, and roll and pitch control were achieved to some extent, mainly on the uni-axial rig. Yaw control was achieved using a hanging test rig, where the quadcopter was hung by a rope and tethered to the ground. The project succeeds in creating a great starting point for future generations of work.