1
Utilizing Science & Technology and Innovation for Development
Real-Time 3D First-Person View for Unmanned Systems
Osamah A. Rawashdeh, PhD, PE
Associate Professor, Oakland University
Belal Sababha, PhD
Assistant Professor, PSUT
Marriott Hotel, Amman, Wed. 12/8/2015
2
Project Team - Jordan
Nathir Rawashdeh, PhD
Assistant Professor, Exchange Coordinator
Mechatronics Engineering Dept.
German Jordanian University
Amman, Jordan
Areas of Expertise: Image Processing, Controls, Mobile Robots
Belal Sababha, PhD
Assistant Professor, Chair
Computer Engineering Dept.
Princess Sumaya University for Technology
Amman, Jordan
Areas of Expertise: UAVs, Automotive Controls, Embedded Systems
3
Project Team - USA
Osamah Rawashdeh, PhD, PE
Associate Professor, Academic Coordinator
Electrical and Computer Engineering
Oakland University
Rochester, Michigan, USA
Areas of Expertise: Unmanned Systems, Reliability and Fault Tolerance, Embedded Systems Design
Samir Rawashdeh, PhD
Assistant Professor
Electrical and Computer Engineering
University of Michigan - Dearborn
Dearborn, Michigan, USA
Areas of Expertise: Stereo Vision, Small Satellites, Biomedical Signal Processing
4
Embedded Systems Research Lab at Oakland University
• Multi-core engine and transmission control for Ford
• Hardware-in-the-Loop (HIL) simulations for Chrysler
• Internet weather connectivity for Chrysler
• Infotainment prototyping for GM
• PI for a 6-year NSF-funded Research Experiences for Undergraduates (REU) program
• Founding of Autobike, Inc.
• An applied project was selected that provides practical value for participants.
• Developing scientific research infrastructure by establishing research labs in Jordan (PSUT & GJU) through equipment purchases and joint projects with colleagues in the USA.
• Building the capacity of Jordanian human resources and developing expertise in Jordan through mutual lab visits and planned meetings.
• Publishing joint research.
• Providing students in Jordan with Master's-level opportunities in the areas of mobile robots, embedded systems, and image processing.
12
Scope of work/Duration
Scope of work:
- Embedded systems hardware and software development
- Real-time image processing algorithm to implement a software gimbal
- Mobile robots
- Optics and 3D vision
Duration: 24 Months
13
Methodology of Implementation
1. Component Selection:
• Begin by investigating different wide-angle lens and video camera combinations.
• The optical characteristics of fisheye lenses will be researched; ideally, a lens with evenly distributed distortion will be selected.
• A small, lightweight video camera compatible with the chosen lens must also be found.
• In addition, a wireless AV transceiver system capable of transmitting high-quality live video to the ground station with low latency must be obtained.
• Finally, virtual reality goggle options and image processing hardware will need to be studied, selected, and purchased.
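Lens selection hinges on the projection model of the fisheye optics. As a minimal illustrative sketch (not part of the proposal itself; the focal-length values are hypothetical), the common ideal fisheye models can be compared numerically. The "evenly distributed distortion" mentioned above corresponds to the equidistant model, where image radius grows linearly with the incidence angle:

```python
import numpy as np

# Ideal lens projection models: incidence angle theta (radians) -> image
# radius r, for focal length f. Focal lengths here are illustrative only.

def rectilinear(theta, f):
    """Ordinary pinhole mapping, shown for comparison; diverges near 90 deg."""
    return f * np.tan(theta)

def equidistant(theta, f):
    """r = f * theta: image radius is linear in the incidence angle."""
    return f * theta

def stereographic(theta, f):
    """r = 2 f tan(theta / 2)."""
    return 2 * f * np.tan(theta / 2)

def equisolid(theta, f):
    """r = 2 f sin(theta / 2); common in commercial fisheye lenses."""
    return 2 * f * np.sin(theta / 2)
```

Under the equidistant model, a feature at twice the off-axis angle lands at twice the image radius, which is what makes the distortion "even" across the field of view and simplifies the later dewarping step.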
14
2. Image Processing Algorithms and Embedded Software R&D
• After the different components are purchased, software will be developed to interface with the virtual reality goggles and to perform image processing on the received video.
• The goggles’ software APIs will be used to obtain positioning data from the headset.
• An efficient image processing algorithm will be researched and implemented to transform the wide-angle videos to flattened images by removing the fisheye distortion.
• From the flattened images, a “window” from each video stream will be further processed with respect to the orientation of the operator’s head, as determined by the virtual reality goggle APIs.
• Finally, the processed videos will be shown through the virtual reality headset.
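The dewarping-plus-windowing steps above can be sketched as follows. This is a minimal illustrative implementation, not the project's actual algorithm: it assumes an ideal equidistant fisheye (r = f·theta), nearest-neighbor sampling, and hypothetical parameter values; a production version would use calibrated lens parameters and interpolation.

```python
import numpy as np

# Sketch of a "software gimbal": render a rectilinear window from a fisheye
# frame, steered by the operator's head yaw/pitch. Assumes an ideal
# equidistant fisheye model and nearest-neighbor sampling (illustrative only).

def rotation(yaw, pitch):
    """Rotation matrix for head yaw (about y) then pitch (about x), radians."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])
    return ry @ rx

def render_window(fisheye, f_fish, yaw, pitch, out_size=200, fov_deg=60.0):
    """Extract a virtual pinhole view from an equidistant fisheye image."""
    h, w = fisheye.shape[:2]
    cx, cy = w / 2.0, h / 2.0
    f_out = (out_size / 2.0) / np.tan(np.radians(fov_deg) / 2.0)
    # Pixel grid of the output window -> viewing rays in the headset frame.
    u, v = np.meshgrid(np.arange(out_size) - out_size / 2.0,
                       np.arange(out_size) - out_size / 2.0)
    rays = np.stack([u, v, np.full_like(u, f_out)], axis=-1)
    rays /= np.linalg.norm(rays, axis=-1, keepdims=True)
    # Rotate the rays by the head orientation into the camera frame.
    rays = rays @ rotation(yaw, pitch).T
    # Project each ray with the equidistant model: radius = f_fish * theta.
    theta = np.arccos(np.clip(rays[..., 2], -1.0, 1.0))
    phi = np.arctan2(rays[..., 1], rays[..., 0])
    r = f_fish * theta
    xs = np.clip((cx + r * np.cos(phi)).astype(int), 0, w - 1)
    ys = np.clip((cy + r * np.sin(phi)).astype(int), 0, h - 1)
    return fisheye[ys, xs]  # nearest-neighbor lookup into the fisheye frame
```

Because the window is computed per frame from the head orientation, panning requires no moving parts, which is the point of replacing a mechanical gimbal with software.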
15
3. Testing and System Evaluation
• An appropriately sized UAV and/or UGV will also be purchased to test the 3D FPV system.
• The cameras and wireless transceivers will be mounted onto the UAV/UGV, which will be remotely controlled using the 3D FPV system.
• Basic range testing will be done on the wireless video links to evaluate the system’s performance and characteristics.
• Finally, the functioning system will be evaluated for viability in different UAV/UGV applications.
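The range-testing bookkeeping can be sketched as a small helper. All names and numbers here are hypothetical, not from the proposal; the sketch assumes each frame carries an ID and a send timestamp so that latency and frame loss can be tabulated per test distance.

```python
import statistics

# Illustrative helper for the range tests: given send and receive timestamps
# (seconds) for the frames captured at one test distance, summarize link
# latency and frame loss. Data and field names are hypothetical.

def link_stats(sent, received):
    """sent/received: dicts mapping frame id -> timestamp in seconds."""
    delays = [received[i] - sent[i] for i in sent if i in received]
    loss = 1.0 - len(delays) / len(sent)
    return {
        "mean_latency_ms": 1000 * statistics.mean(delays),
        # Crude 95th-percentile estimate via the sorted-delay index.
        "p95_latency_ms": 1000 * sorted(delays)[int(0.95 * (len(delays) - 1))],
        "frame_loss": loss,
    }
```

Logging these statistics at increasing distances gives the latency and dropout curves needed to judge whether the link stays within FPV-usable bounds.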
16
Expected Output
• Improve and extend the features of current software-based gimbals for wide-angle FPV applications on UAVs and UGVs.
• Implement a 3D FPV system with depth perception for unmanned systems.
• Exchange of knowledge and experience in the fields of embedded real-time image processing between peers in the USA and Jordan.
• Publication of conference and journal papers.
• Technological demonstrations to local institutions in Jordan and USA.
17
Impact
The results may be beneficial for the following governmental and industrial institutions:
• Public Security
• Border Security
• Civil Aviation
• The King Abdullah II Design and Development Bureau (KADDB)
• The visual multimedia industry
It will also result in building the capacity of researchers in Jordan in the fields of unmanned systems and embedded real-time image processing.
18
Sustainability
• Continued joint project works between colleagues in Jordan and the USA
• Referral and mobility of experts and Master's-level students between institutions
• Complementary cooperation between partner labs
• Sharing of expertise with local partners in Jordan and USA
19
Action Plan
Sequence of planned activities
1. Two researchers from Jordan will visit the partner laboratories in the USA to define the hardware and software designs and agree on the work distribution.
2. The partner institutions in Jordan will proceed with the equipment purchasing.
3. Prototype development will start in Jordan in collaboration with US partners.
4. Two researchers from the USA will visit the labs in Jordan to review the progress, adjust objectives, and outline research publications.
5. Initial progress will be presented at conferences in both countries sponsored by the partner universities.
6. The labs in Jordan will finalize the prototype and conduct applied experiments.
7. The labs in Jordan will collect the data necessary for publications.
8. All partners will collaborate on writing journal research publications.
9. Partners in Jordan will demonstrate the system to interested local entities via an