WORCESTER POLYTECHNIC INSTITUTE
Computer Science, Mechanical Engineering,
And Robotics Engineering Programs
NASA Lunabotics 2019-2020
A Major Qualifying Project
submitted to the Faculty of
WORCESTER POLYTECHNIC INSTITUTE
In partial fulfillment of the requirements for the Degree in Bachelor of Science by:
Kevin Bimonte, Computer Science
Harrison Burack, Computer Science
Cara Freedman, Mechanical Engineering
Jack Hogan, Computer Science
Mark Hogan, Robotics Engineering
Nicole Kuberka, Robotics Engineering
Project Advisors:
Professor Nicholas Bertozzi
Professor Joshua Cuneo
Professor Therese Smith
Date: 05/18/2020
This report represents the work of WPI undergraduate students submitted to the faculty as
evidence of a degree requirement. WPI routinely publishes these reports on its website without
editorial or peer review. For more information about the projects program at WPI, see
https://www.wpi.edu/project-based-learning
Appendix A. Lunabotics Awards
Appendix B. Sketches
Appendix C. Sinkage Calculations
Appendix D. Four Bar Calculations
Appendix E. Bucket Free Body Diagram Results
Appendix F. Sequence Diagrams
Equations

Equation 1. The probabilistic law characterizing the evolution state (Mahroos, Hassan, & Shaaban, 2011)
Equation 2. Determining the distance taken based on Figure 38
Equation 3. Velocity required to complete the longest path in 5 minutes
Equation 4. Torque required for drive base motor
Equation 5. Calculation of resultant friction force, assuming no resistant force (Fr)
Equation 6. Force required for the passive piston to lift the deposit conveyor
Equation 7. Goal Heat Resistance of the System
Equation 8. Thermal Resistance from Conduction
Equation 9. Equation to determine row size
Equation 10. Bekker Pressure-Sinkage Equation
Figures

Figure 1. NASA's Lunabotics Logo
Figure 2. Markhor
Figure 3. The five most abundant gasses in the Martian atmosphere plotted logarithmically (Dunbar & Greicius, The Five Most Abundant Gases in the Martian Atmosphere, 2012)
Figure 4. The solar wind (beige streaks) rips molecules from Mars' atmosphere. Orange lines represent high energies of outgoing ions, blue low (Shirah, 2015)
Figure 5. A smooth curve of atmospheric pressure over the same time frame.
Figure 6. Icy Regolith simulant (gravel), at various sizes (Heiney & Johanboeke, RMC Icy-Regolith Simulant, 2018)
Figure 10. 3D-Printed guide for excavator wheel
Figure 11. CAD model comparison of current LOADER bucket design (left) and the previous
Figure 13. CAD model of LOADER wheel design.
Figure 14. CAD model of the base design, with electronics and battery side-support.
Figure 15. Image of LOADER, indicating the position of the battery mounts and conveyor support
Figure 16. CAD model of the LOADER conveyor subsystem.
Figure 17. Chart for 775pro motor, showing operational levels of multiple aspects (VEX Robotics, n.d.).
Figure 18. Side view of LOADER in the starting, compact configuration.
Figure 19. CAD model of the full LOADER in its driving configuration.
Figure 20. CAD model of the full LOADER system in the digging position.
Figure 21. Diagram showing the connection of all the electronics on the robot.
Figure 22. Initial Structure of the Robot's Software Environment
Figure 23. Initial ROS Diagram
Figure 24. Example of a ROS2 Action
Figure 25. Image captured from Intel RealSense displaying the distance gradient in meters, with an RGB image for comparison.
Figure 26. The dynamic Bayes network that characterizes the evolution of controls, states, and
Figure 27. Implementation of OpenVSLAM for obstacle detection and field mapping
Figure 28. Example of fiducial that would be placed on collection bin
Figure 29. Entity Relationship Diagram (ERD) for LOADER
Figure 30. The Component table as represented in the ERD
Figure 31. The Component and Subsystem tables and their relationship
Figure 32. The Run and Subsystem tables and their relationship
Figure 33. Individual Components and their relationships
Figure 34. Laser cut wooden prototype assembly
Figure 35. 3D printed internal insert for the prototype buckets.
Figure 36. Prototype CAD Model of 0-degree bucket (left) and 30-degree bucket (right).
Figure 37. Conveyor Excavator Prototype to calculate digging force in flat-bucket orientation (left), and a closeup of the bucket at approximately 10 degrees (right).
Figure 38. Potentially longest path for robot navigation, used to calculate robot speed.
Figure 39. Free body diagram of torque required to drive one wheel.
Figure 40. Free body diagram of the top view of the chassis, used to calculate resultant friction
Figure 41. Four bar free body diagram, summed around point D
Figure 42. Four bar link one free body diagram, summed around point A
Figure 43. Four bar link three free body diagram, summed around point D
Figure 44. Four bar motion study settings
Figure 45. Four bar motion study one results
Figure 46. Four bar motion study two results
Figure 47. Bucket forces free body diagram, summed around point A
Figure 48. Bucket forces free body diagram, summed around point B
Figure 49. Allowable Forces per Insert in N (Perpendicular to Belt Surface)
Figure 50. Free body diagram of attachment point of the conveyor piston
Figure 51. Heat Sink Flow Chart
Figure 52. Example of PBS, breakdown of systems, subsystems, and parts.
Figure 53. Example of the material order list, showing the electronic components section.
Figure 54. Sketch of various details regarding external and internal auger components
Figure 55. Sketch of potential auger design
Figure 56. Sketch of potential drill design with delivery hole at the top
Figure 57. Sketch of potential auger mechanism
Figure 58. Sketch of potential auger mechanism being put in the ground, and depositing contents
Figure 60. The Sequence diagram for the orientation phase.
Figure 61. Sequence diagram for the navigational phase.
Figure 62. Sequence diagram for the digging phase.
Figure 63. Sequence diagram for the return navigation phase.
Figure 64. Sequence diagram for the dump phase.
Tables

Table 1. Arena dimensions for 2020 competition.
Table 2. Robot dimensions for 2020 competition.
Table 3. Robot Runtime for 2020 competition.
Table 4. Distance from axle to bucket teeth in two configurations for digging conveyor testing, the average-peak and steady forces read on the gauge, and the calculated forces at the bucket teeth.
Table 5. Sinkage Calculation for Robot Wheels
Table 6. Variables used to calculate resultant friction force.
Table 7. Four-Bar variable lengths (in millimeters)
Table 8. Equations of equilibrium for Four-Bar Free Body Diagram analysis
Table 9. Lengths from bucket free body diagrams
Table 10. Equations of Static Equilibrium for bucket free body diagram, summed around point A
Table 11. Equations of Static Equilibrium for bucket free body diagram, summed around point B
Table 12. Values for Gas Springs on Conveyor
Table 13. Power Budget for LOADER
Table 14. Amount of data collected per table and over twenty runs
Table 15. Risk Mitigation Table for each problem
Table 16. The budget from the cost plan, showing all contributions.
Table 17. Performance strengths of different mobile power sources (Suppes & Storvick, 2016).
Table 18. Sinkage Calculations for Robot Wheels
Table 19. Four bar static analysis force results
Table 20. Bucket static analysis force results
Introduction
We live in an exciting time and place for space exploration. NASA is in the midst of several space
exploration programs, including Artemis and Moon to Mars. Artemis seeks to establish a strategic
lunar presence, landing the next American astronauts on the Moon by 2024 and building commercial
partnerships by 2028. The Moon will then act as a proving ground for the engineering feats needed
to make the jump to Mars. Between modern engineering advancements, innovation, and commercial
partnerships, NASA has the potential to rapidly usher in a new, interplanetary age.
Not only is the Lunabotics competition a valuable experience for all involved, but it is also a way
for NASA to collect innovations for future use in its push for a sustainable lunar installation. For
example, the discovery of water just beneath the lunar surface has made it possible to imagine
sustainable research on the Moon and has shaped the rules for this year's competition. Small
concentrations of water are present under just a few centimeters of lunar dust, so the main goal of
this year's Lunabotics challenge is to design robots that can dig through several centimeters of
sand and retrieve the larger rocks, two to four centimeters in size, beneath.
The goal of the project is to build a robot to all the specifications of the competition, with the
capability to perform well at the given tasks. The engineering project the team faced to address
this goal included challenges from multiple disciplines, including mechanical, electrical, and
software engineering. The interdisciplinary nature of the project necessitates high levels of
sub-team communication and coordination. Design choices were made to optimize compatibility and
transferability, as the project will be passed down to future teams, and a key to incremental
improvement in each iteration is a swift onboarding process. Because the rover is intended to mine
in locations like the Moon and Mars, the team considered both the realistic constraints of
the simulated competition environment and the future constraints of those extreme environments.
Having provided information on the environments within which NASA robots may one day
operate on both the Moon and Mars, we primarily use the defined specifications of the actual
operating environment of the competition. We documented the design decisions alongside all the
competition specifications. Finally, all the testing and results are presented at the end.
Background
NASA Robotic Mining Competition
Figure 1. NASA’s Lunabotics Logo
“NASA is called to land American astronauts, including the first woman and the next man on the
Moon by 2024. We’re committed to achieving this bold goal. Through the Artemis program, we will
go to the Moon in a way we have never gone before – with innovative new partnerships,
technologies, and systems to explore more of the lunar surface than ever before. Then we will use
what we learn on the Moon to take the next giant leap – sending astronauts to Mars.”
-NASA Administrator Jim Bridenstine
The NASA Robotic Mining Competition (NASA RMC) began in 2010 and is now known as the
Lunabotics competition. The competition gathers student teams from 50 universities and tests
robots designed to mine on the Moon and Mars. The challenge simulates the lunar surface
and requires each robot to dig, collect, and deposit as much regolith as possible within a 15-minute
time frame. Robots may operate via teleoperation or with any blend of autonomy; teams are
encouraged to operate autonomously and are scored higher if they do. Along with the main
competition, there are awards in a variety of categories. Appendix A lists the available awards and
their requirements, which range from collection points to the Rookie Award for best performance
by a new team.
These awards gave the team additional objectives to strive for while designing the robot for the
main challenge. Lunabotics has evolved over the years. This past year, an overall reduction in the
maximum dimensions and mass of the robot has forced many teams to rebuild from scratch.
Through this competition, NASA continues to challenge college students to engineer solutions for
some of the hardest problems involved in the sustainable exploration of the Moon and Mars.
Previous WPI Entries
WPI has been competing in the Lunabotics competition for years, each year improving upon the
last. This allows the team to stay competitive by improving different aspects of the robot based
on the team's skillset from year to year. The rule changes this year have forced a complete
mechanical design overhaul and required the team to design under a much more stringent size budget.
Software

In previous years, several WPI teams have based their robot design around Markhor, having forked
its software for their own needs on GitHub. This system followed a model-view-controller (MVC)
architecture. The view, or the GUI, was programmed in Java; its Swing library was used when the
robot needed manual control. The controllers, the interface between the models and the view, were
programmed in C#. Finally, the model, or the logic structured by the data in the problem, was
written in Python.
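The MVC separation described above can be sketched in a single language. The following is a minimal illustration in Python; the class and method names are hypothetical, not taken from the Markhor codebase.

```python
# Minimal model-view-controller sketch mirroring the architecture described
# above. Class and method names are illustrative, not from the Markhor code.

class RobotModel:
    """Model: holds robot state and the logic that updates it."""
    def __init__(self):
        self.speed = 0.0

    def set_speed(self, value):
        # Clamp operator input to a plausible normalized range before storing.
        self.speed = max(-1.0, min(1.0, value))


class ConsoleView:
    """View: renders state for the operator (a Swing GUI in the real system)."""
    def render(self, model):
        return f"speed: {model.speed:+.2f}"


class DriveController:
    """Controller: translates operator input into model updates."""
    def __init__(self, model, view):
        self.model = model
        self.view = view

    def handle_input(self, value):
        self.model.set_speed(value)
        return self.view.render(self.model)


controller = DriveController(RobotModel(), ConsoleView())
print(controller.handle_input(0.5))   # → speed: +0.50
print(controller.handle_input(9.0))   # out-of-range input clamped → speed: +1.00
```

The value of the split, whatever the language mix, is that the view can be replaced (console, Swing, web) without touching the model logic.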
Past Strengths and Weaknesses

The past WPI teams have excelled in different areas of the challenge. Markhor had an elegantly
designed dumping mechanism that moved the digging mechanism completely out of the way
(see Figure 2).
Figure 2. Markhor
The engineering was elegant, but the digging mechanism was not quick or robust enough to
match the delivery system. The digger became the throughput bottleneck for the whole system
even though the delivery system could handle much more. Markhor also had scoops engineered to
break up gravel and collect it, but the chain system the buckets were attached to struggled to dig
to a valuable depth. Last year, the key aspect of the robot that needed improvement, the
digging mechanism, was not the focus; instead, the vision system was improved. There was a small
change to the digging mechanism that did not improve the overall performance of the system.
Small mechanical errors caused many of the problems during the 2018-2019 project. Although
each team solved many issues, the digging mechanism was not improved. Learning from these
mistakes, this year's team put more effort into the digging mechanism so that it might succeed
where previous designs fell short.
Mars
Mars has been a target for humanity for decades. Not only is it relatively close to Earth, but it
also has an atmosphere, and previous missions have discovered water there. The feasibility of
life on Mars is relatively high compared to other environments in space. A journey to Mars to
establish the infrastructure for long-term research would require mining the water
below the surface, and this infrastructure would contribute to future settlements on the celestial
body.
Martian Atmosphere

There are distinct differences between the atmospheres of Earth and Mars. The composition
breakdown is illustrated in Figure 3:
Figure 3. The five most abundant gasses in the Martian atmosphere plotted logarithmically (Dunbar &
Greicius, The Five Most Abundant Gases in the Martian Atmosphere, 2012)
The most prominent gas in the Martian atmosphere is CO2, a potent greenhouse gas. Although it
is the dominant component of the atmosphere, CO2 does not produce a greenhouse effect equal in
magnitude to that of Earth. On Earth, CO2 makes up only 0.04% of the total volume of gasses;
however, Earth is warmed by ~33°C by the greenhouse effect. By contrast, Mars only experiences
~5°C of warming from the greenhouse effect, due in part to the lack of water vapor in Mars's
atmosphere and to the atmosphere's low density. The mean atmospheric pressure on the
Martian surface is ~0.6% that of Earth's. This discrepancy is largely due to
the lack of a magnetosphere around Mars. Without a magnetosphere, the solar wind "blows"
particles from the atmosphere into space (see Figure 4).
the lack of a magnetosphere around Mars. Without the magnetosphere, solar winds “blow”
particles from the atmosphere and into space (see Figure 4).
Figure 4. The solar wind (beige streaks) rips molecules from Mars’ atmosphere. Orange lines represent high
energies of outgoing ions, blue low (Shirah, 2015)
As a result of the low atmospheric density, CO2 molecules are much less efficient at absorbing
radiation rebounding off the surface of Mars, hence the weak greenhouse effect (Planetary Sciences,
Inc, n.d.). The atmospheric pressure is also highly dependent on the Martian seasons. Mars
experiences extreme seasons due to its highly elliptical orbit and an axial tilt about 1.5 degrees
greater than Earth's. These two factors vary the amount of sunlight hitting different parts
of the planet, which in turn drastically changes the temperature, as there is very little temperature
moderation by the atmosphere, as previously addressed. For example, during southern winter,
when Mars is near aphelion, temperatures can reach -125°C at the southern pole. In these winter
conditions, atmospheric CO2 deposits directly from gas to solid and forms polar caps of dry ice.
So much CO2 freezes out that the atmospheric pressure drops ~25% or more, having been
measured ranging from 8.7 mbar to 4.0 mbar (Caplinger, 1994). The trend can be seen in Figure 5.
Figure 5. A smooth curve of atmospheric pressure over the same time frame.
Another notable feature of the Martian atmosphere is its scale height of ~10 km and its total
mass of ~2.5 × 10^16 kg (Williams, 2018). Dust is responsible for
the red hue of the atmosphere, but the dust can be much more detrimental given more violent
weather patterns. Dust storms are common in the southern hemisphere during the spring and
summer, often covering wide swaths of land and lasting for several days. However, much larger
storms have been observed; these span the planet and can last for months. The
first observed global dust storm happened in 1971, seen by Mariner 9. Since then, scientists have
observed similar storms in 1977 (twice), 1982, 1994, 2001, 2007, and 2018. The 2018 storm
was responsible for decommissioning the Opportunity rover. These storms present an
unpredictable, dangerous factor in all Mars operations. Furthermore, they may be responsible for
the loss of molecular water over billions of years (Shekhtman, 2019).
Operational Challenges

In addition to the significant atmospheric and temperature differences, operating on Mars presents
other challenges unlike any experienced on Earth. Mars is, at any given time, between
54.6 million and 401 million kilometers from Earth, although humans have never observed Mars
at its theoretical minimum separation. This separation makes direct teleoperation of a Martian
robot impossible: the one-way signal delay ranges between four and twenty-four minutes,
averaging around thirteen minutes. Operationally, that means a command-and-response round trip
takes about 26 minutes on average. Therefore, robotic operations must be autonomous and must
avoid relying on teleoperation for critical systems (Ormston, 2012).
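The delay figures follow directly from distance divided by the speed of light. A quick sketch, using the separations quoted above, shows the theoretical extremes work out to roughly three and twenty-two minutes one way; the four-to-twenty-four-minute range cited presumably reflects the geometries actually encountered.

```python
# One-way light-time between Earth and Mars at the extreme separations quoted
# above. The round-trip time is what bounds how quickly operators can react.

SPEED_OF_LIGHT_KM_S = 299_792.458

def one_way_delay_minutes(distance_km):
    """Light travel time in minutes over the given distance."""
    return distance_km / SPEED_OF_LIGHT_KM_S / 60.0

closest_min = one_way_delay_minutes(54.6e6)    # theoretical minimum, ~3.0 min
farthest_min = one_way_delay_minutes(401.0e6)  # maximum separation, ~22.3 min
round_trip_avg_min = 2 * 13                    # ~26 min at the average delay
```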
2020 RMC: Lunabotics
As mentioned, the robot's dimensions were reduced in every direction to emulate an actual
payload that would be deployed on the Moon or Mars. Because the robot is designed for off-world
plausibility, all of its physical processes, gases, fluids, and consumables must be capable of
working in extreme conditions beyond the Earth's atmosphere. Other design considerations include
safety, communication, and navigation.
Navigational aids may not extend higher than a quarter of a meter above the sieve frame, may not
be permanently attached, and may not cause alterations. A forty-millimeter diameter, visible, red
kill switch is also required, just one example of the safety considerations taken.
The robot must be able to mine through approximately thirty centimeters of a lunar dust simulant
called BP-1, which will expose a bed of approximately fifteen centimeters of an icy-regolith
simulant below, which is gravel, as seen in Figure 6.
Figure 6. Icy Regolith simulant (gravel), at various sizes (Heiney & Johanboeke, RMC Icy-Regolith Simulant,
2018)
Current Rules and Restrictions
For a robot to qualify and be awarded points in this year's competition, it must meet the dimension
specifications in Table 2. If the robot exceeds them, it may still compete as long as all of
the other rules, such as safety and communication, are met. A team must also complete a
systems engineering plan and paper and conduct a public outreach program with an accompanying
report; not completing these requirements bars the team from this year's competition. Along
with the changes to the robot's dimensions, the competition runtime (see Table 3) was changed
from a ten-minute setup and ten-minute mining time to a five-minute setup and fifteen-minute
mining time. The arena dimensions in Table 1 remain unchanged from the previous year.
Arena Dimensions
Length (meters) ~5.4
Width (meters) ~3.6
BP-1, regolith simulant depth (cm) ~30.0
Gravel, icy-regolith simulant depth (cm) ~15.0
Table 1. Arena dimensions for 2020 competition.
Robot Dimensions
Maximum Length (meters) 1.00
Maximum Width (meters) 0.50
Maximum Height (meters) 0.50
Mass (kilograms) 60.00
Table 2. Robot dimensions for 2020 competition.
Robot Runtime
Robot Arena Set-Up (minutes) 5
Robot Mining Time (minutes) 15
Robot Extraction (minutes) 5
Table 3. Robot Runtime for 2020 competition.
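Tables 2 and 3 define hard limits on the design. A small sketch of how a candidate design could be checked against the Table 2 limits; the limits come from the table above, while the sample design values are hypothetical.

```python
# Check a candidate design against the Table 2 limits. The limits come from
# the table above; the sample design values are made up for illustration.

LIMITS = {"length_m": 1.00, "width_m": 0.50, "height_m": 0.50, "mass_kg": 60.00}

def violations(design):
    """Return the names of any quantities that exceed their limit."""
    return [name for name, limit in LIMITS.items() if design[name] > limit]

candidate = {"length_m": 0.98, "width_m": 0.49, "height_m": 0.48, "mass_kg": 61.0}
print(violations(candidate))  # → ['mass_kg']
```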
Robot Deliverables
The robot deliverables address three main things: meeting the new parameters set by the
challenge, a new digging and driving design, and collecting more than one kilogram of regolith
in a single run. Along with the new designs, there will be analysis of the drivetrain, including
sinkage calculations, the necessary angle for the excavator system, the placement and speed of the
deposit conveyor, and the torque required to reliably collect regolith simulant.
ROS will control the robot using a system of abstracted nodes and packages. It will manage the
communication within the robot, handle any error conditions that may arise, and allow for easy
readability and modification by future groups. A circuit diagram will be created to lay out the
hardware components and show the necessary power draw. Also, heat sink calculations are
available for the electronic components inside a dust-free enclosure to ensure they won't overheat
and malfunction.
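The node-and-topic decomposition that ROS provides can be illustrated with a toy publish/subscribe bus. This is plain Python for illustration only, not the ROS or rclpy API, and the topic name is a conventional example rather than a confirmed part of this robot's software.

```python
# Toy publish/subscribe bus illustrating the node/topic decoupling that ROS
# provides. Plain Python for illustration only; this is not the ROS API.

from collections import defaultdict

class Bus:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        """Register a callback to receive every message on a topic."""
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        """Deliver a message to every subscriber of a topic."""
        for callback in self._subscribers[topic]:
            callback(message)

received = []
bus = Bus()
# A "drive" node listens for velocity commands without knowing the sender.
bus.subscribe("cmd_vel", received.append)
# A "teleop" node publishes a command without knowing the consumers.
bus.publish("cmd_vel", {"linear": 0.2, "angular": 0.0})
```

This decoupling is what lets future teams replace one node (say, teleoperation with an autonomy planner) without modifying the nodes that consume its messages.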
Additionally, possible uses of regolith as a renewable energy source are discussed in detail. In
preparation for the continuation of this project, the documents collected and created during this
year will be organized to ensure as few difficulties as possible for future teams. This means having
all of (but not limited to) the following documents available and organized: SolidWorks models,
analysis documentation, forms for ordering and tracking parts, and a list of available parts and
materials and their locations. The team also hopes to provide insight into what a project of this
size takes to complete, through artifacts such as timesheets and skill matrices.
System Requirements Review
The team worked on and presented a system requirements review. This review ensured the team
had an exhaustive list of goals moving forward. It was divided into NASA-provided
constraints, functional requirements, non-functional requirements, and software requirements.
The robot must meet the specifications of the competition guidelines provided by NASA,
including weight, size, bandwidth, power, and dust management.
Functional requirements are as follows:
• The robot shall complete two full cycles of operations within 15 minutes.
• The robot should be able to dig continuously.
• The digging wheel shall collect 1 kg of icy-regolith in less than 1 minute.
• The digging wheel shall operate in clockwise and counterclockwise directions.
• The containment shall be able to hold a minimum of 1 kg of regolith.
• The delivery conveyor shall deploy passively.
• The delivery conveyor shall separate regolith from icy regolith.
• The drivetrain shall not stall under a 61 kg load.
• The digging wheel shall be durable enough to accomplish ten 15-minute collects.
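The cycle-time requirements above can be sanity-checked against the 15-minute mining window. In the sketch below, the window and the 1 kg-per-minute digging requirement come from the text; the other per-phase time estimates are hypothetical placeholders.

```python
# Sanity check on the functional requirements above: two full dig-and-deposit
# cycles must fit in the 15-minute mining window. The per-phase estimates
# other than digging are assumptions, not measured values.

MINING_WINDOW_MIN = 15.0
CYCLES_REQUIRED = 2

phase_estimates_min = {
    "traverse_to_dig_site": 2.0,  # assumed
    "dig_1kg": 1.0,               # per the 1 kg in under 1 minute requirement
    "traverse_to_bin": 2.0,       # assumed
    "deposit": 1.0,               # assumed
}

cycle_min = sum(phase_estimates_min.values())
margin_min = MINING_WINDOW_MIN - CYCLES_REQUIRED * cycle_min
assert margin_min >= 0, "cycle budget exceeds the mining window"
```

Under these assumed numbers, each cycle takes 6 minutes, leaving a 3-minute margin for contingencies.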
Non-functional requirements of the robot include the following:
• Base
  • The base shall not have a mass greater than 45 kg.
  • The drivetrain shall have at least 2 motors.
  • The power distribution shall contain a clearly visible kill switch.
  • All electronics shall be environmentally sealed against conductive dust.
  • The containment shall sense when 1 kg is present.
  • The delivery shall minimize dust production while operating.
• Digging Wheel
  • The wheel shall not move the CG outside the bounds of the drive chassis.
Software requirements of the robot include:
• The robot shall not utilize a compass or GPS for path calculations.
• The robot shall be able to run using teleoperation.
• The robot should be able to run fully autonomously.
• The robot should be able to continuously monitor all systems.
Design
Mechanical
The NASA RMC: Lunabotics 2020 competition revised its rules and requirements to align more
closely with, and support the parameters of, the Artemis lunar exploration program (previously
detailed). There were substantial changes to the parameters of the robot, including reductions in
both weight and size, requiring the team to design a new model. The previous robot had a much
larger digging system than can be implemented under the new rules. To fulfill the new
requirements of the competition, the team decided to investigate different, smaller methods to both
maneuver and collect regolith through prototyping. The results indicated that a completely new
robot design, build, and software base was necessary, thus warranting a new testing phase.
On top of designing a new robot, the team also restored the previous year's robot to operating
condition for a groundbreaking event and put additional focus on other competition requirements
such as outreach, documentation, and team organization. With all of this in mind, a schedule of
due dates was generated and followed throughout the design process. The first stage
included in-depth research into new digging mechanisms and other critical decisions: the
programming language, components that could be reused from the previous year's robot, and new
components that would be needed. Following this research were the prototyping and CAD
design phases, which allowed for analysis and testing. In parallel with these steps, the team
pursued outreach, program design, and manufacturing considerations.
Excavation Designs
A major focus of the new robot design was developing an efficient and competitive digger. Since a new digging design was necessary for the team to meet the more rigorous competition requirements, multiple design iterations were explored before the final design was chosen.
Auger
The first digging mechanism the team considered was an auger. Compared to the previous conveyor-style digging mechanism, a drill-type digging mechanism could improve the excavating efficiency of the robot while conforming to the new constraints (see Figure 7). Drill digging has been used for boring since the 18th century. Typically consisting of a screw-type attachment to a drilling motor, the digger is encased in a cylindrical body to contain dug material. In the case of regolith mining, it should include filtration layers to sort material by size. Below are sketches of the initial concept for the auger-type drill piece.
The desired depth of the digging mechanism is forty centimeters, comprising thirty centimeters of regolith and ten centimeters of icy regolith. This strikes a balance between a favorable regolith-to-sand ratio and limiting the depth the drill must reach.
Figure 7. Sketch of proposed digging mechanism, showing the internal chamber (top right), and the outer
drill shell opening to deposit material into the collection bin (bottom left).
The main issue with this design, however, was manufacturing something so unique. The vast amount of expensive custom machining required to build the auger made it infeasible; it was beyond the team's capability, cost range, and subsystem weight limit. The design would also complicate other subsystems, which would have to accommodate its unique geometry. Ultimately, it was decided to abandon this design for the 2020 competition.
Bucket Wheel
After ruling out the auger design, an excavator wheel design was investigated. This concept includes an angled digging wheel with toothed buckets, spaced about the center, to collect material (see Figure 8 and Figure 9). Compared to the auger, this design seemed more feasible in terms of manufacturing.
Figure 8. Excavator Wheel Assembly
Figure 9. Excavator Wheel Assembly expanded
One issue the team was able to solve with the bucket wheel excavator was how to guide the
material through and out of the wheel onto the conveyor while moving against gravity. To do this,
the team optimized the design of a “bucket insert” which could be 3D printed for each bucket
compartment inside the wheel as shown in Figure 10. The shape of this insert is curved to allow
material to pick up velocity, and it ends at a 30-degree angle to be sure the material does not slow
at the exit. It allows the material to gain speed on the curved edge, and by the time the material is
at the top, it exits the wheel horizontally.
Figure 10. 3D-Printed guide for excavator wheel
Although the team moved forward with prototyping and fully modeling the excavator wheel design in CAD, the design was not chosen. Most notably, the design would not collect enough material, due to its size and the power limitations of the motors; a larger motor would not have fit on the chassis. Another reason was the difficulty of maintaining a 30-degree angle within such a small robot. The geometry required to accommodate this angle consumed an excessive amount of space given the starting size restrictions. A four-bar mechanism was chosen to position the digging mechanism, but the team had difficulty positioning the four-bar on the robot due to interference with the containment conveyor, which could not be fixed without decreasing
the size of the digger. Overall, this could be a useful design if developed further, but the team ruled
this to be out of the scope for this year’s project.
Conveyor
LOADER was ultimately designed with a motorized digging conveyor to excavate material. This digger consists of the conveyor system, its motor, ten excavating buckets, and secure fasteners. The belt chosen is a sturdy polyamide profiled belt and pulley with customizable size. The decision to go with a belt instead of a chain was based on the amount of surface contact between the driven pulley and the belt, which addresses issues with chain systems such as slipping or falling off sprockets due to debris buildup. An issue with past conveyor excavators was the small radius of the pulleys, which is difficult for the buckets to maneuver about; the pulley chosen for LOADER was therefore made as large as possible. The conveyor is powered by a Falcon 500 motor. Due to the high power this motor provides and its customizable gear stages, it can be geared based on the required torque determined during testing. The bucket design was based on the tested Ibex buckets, which were already built and available for use. Reused features of this design include the durable machined bucket teeth and the general shape of the sheet metal. A comparison is shown in Figure 11.
Figure 11. CAD model comparison of current LOADER bucket design (left) and the previous year’s robot
bucket design (right).
The fasteners on the buckets include a hinge-and-bump system to aid in revolving around corners, as well as strong screws designed for the custom belt that can withstand 170 N of perpendicular force. Since the bucket is flat, when it revolves about the pulleys on the conveyor, one side has to "lift" up off of the belt. The bump provides support at the back of the bucket, and the hinge includes a spring preventing it from swinging forwards, as shown in Figure 12.
Figure 12. Excavator Conveyor with internal view
Wheels vs. Treads
The previous robot used tank treads to traverse the field. Treads have more contact area with the ground than typical wheels, distributing the weight of the robot over a larger area, which makes them appropriate for terrain composed of the lunar simulant. Because of this, treads prevent heavier robots from sinking into the fine simulant, which can create drag and even immobilize the robot. Treads can also manage obstacles without the use of a suspension. Unfortunately, treads are not as effective at maneuvering as conventional wheels, and this was
one of the numerous reasons the team considered when deciding to go with wheels. Along with greater maneuverability, wheels are lighter than treads, which became a key factor given the new mass constraints. Using wheels therefore allows weight to be allocated to other subsystems such as the digging mechanism. The second advantage of wheels is space: given the size of the drive chassis, treads would use up over half of its width. Finally, wheels provide speed, light weight (compared to treads), and overall maneuverability. The team also believed that wheels would be easier and faster to produce.
To improve maneuverability, grousers were added to each wheel, as this increases forward-motion performance. Research on grousers for rover wheels found that fifteen-degree spacing, ten-millimeter grouser height, and 1.5-millimeter thickness was preferable for a 200- to 400-millimeter-diameter wheel (Liu, Gao, & Deng, 2008). The diameter of the robot's wheels is 254 millimeters, which indicates that twenty-four grousers should be added to each wheel. However, the cited study used completely solid wheels; since the LOADER wheel is mostly empty and lightweight, it was decided that twelve grousers would be attached. An angle of 30 degrees between each grouser helps keep multiple grousers in contact with the soil at once (see Figure 13). Changing from treads to wheels requires sinkage calculations, torque analysis, and stress analysis to determine how the wheel will perform in the simulated environment.
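The grouser count follows from simple angular arithmetic; below is a short sketch of the calculation described above, using the spacing from the cited study and the team's halving decision:

```python
# Grouser count from angular spacing: a full circle divided by the spacing.
def grouser_count(spacing_deg: float) -> int:
    return round(360 / spacing_deg)

solid_wheel = grouser_count(15)   # study recommendation for a solid wheel: 24
loader_wheel = grouser_count(30)  # halved density chosen for the lighter, open wheel: 12
```

Doubling the spacing to 30 degrees halves the count to twelve while still keeping more than one grouser in contact with the soil.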
Figure 13. CAD model of LOADER wheel design.
Linkage
To actuate the digging conveyor, a four-bar linkage was chosen based on its versatility. It was
determined the four-bar links would be made from a strong material - aluminum tubing - to bear
loads in both operating positions and shear stresses on the driving link. It was determined early on
that a worm gear motor would be ideal for providing the desired anti-back drive condition while
in the static digging position due to its high gear ratio. To connect the four-bar links to the digger,
sturdy bearings and torque transfer couplings were chosen to ensure the best connection at the
links and the least amount of friction while rotating about the connecting points.
Base
Due to the reduction in maximum size discussed in Section 3.1, the entire robot needed to be
redesigned. Few components of the old robot could be reused, as they were all specific to the
previous robot and its goals. The team chose a lightweight, open frame design for the base to
accommodate the current design and allow for future modifications. The U-shaped base was
designed to maximize subsystem space and an operational range of motion (see Figure 14). The
main structure consists of two rectangular tubes that are each four hundred millimeters long with
a 50.8-millimeter by 25.4-millimeter cross-section. These tubes also serve as the housing for a
timing belt-and-sprocket drivetrain system. Enclosing those components inside ensured minimal
dust would come in contact with the belt. To further reduce dust build-up, 3D printed caps will go
on both ends of tubes.
Figure 14. CAD model of the base design, with electronics and battery side-support.
The front of the base has the wheels attached with Vex VersaBlock (Vex Robotics, n.d.), which
allows adjustment to the toothed belts. Room was allocated for the fasteners for other subsystems
and interactions with the pulley system have been avoided by using short screws that do not extend
past the threaded holes.
In addition to providing support, fastened brackets at the rear of the base house the drivetrain
motors and the electronics box. Since these motors are positioned under the containment system,
they also help to counterbalance the digging mechanism. On each exterior side of the frame is a
mount that holds one of the two batteries, a diagram of this is shown below in Figure 15. These
are placed between the wheels to give the motors and electronics box enough room to stay cool,
and to give the robot an even weight distribution so the robot is not front or back heavy. A post is
attached near the back of the base that gives the extra support to the conveyor once it is in its final,
extended configuration. This reduces the constant force applied to the conveyor pistons and adds
extra stabilization.
Figure 15. Image of LOADER, indicating the position of the battery mounts and conveyor guide rail.
Passive Lift Containment
To meet the requirement of delivering collected material to a 0.6-meter-high collection bin, the team designed a passive-lift conveyor. When designing the entire robot, it was determined that the containment system does not have to move after it reaches its final, extended position. A passive lift is thus the most advantageous, as it reduces both weight and power draw. The passive lift mechanism utilizes a sealed, damped spring, which is much lighter than the motor alternative. Since there is no reason for the containment to return to its original position, this is the best place in the design to eliminate electronics and power draw. The containment system only needs to fit within the size constraints at the beginning of the competition, because once it is deployed, the size constraints from the competition rules no longer apply. The full design (shown in Figure 16) is lightweight, with polycarbonate sides and front, and a conveyor made from a lightweight treadmill belt.
Figure 16. CAD model of the LOADER conveyor subsystem.
The front piece has cutouts for the teeth of the digging mechanism to pass through; these also help sift the regolith so that only icy regolith is collected. The slides at the bottom of the subsystem, seen in Figure 16, are what make the passive lift work: they carry the conveyor as the springs extend it, with very little friction, so the full movement of the containment conveyor is back and up. A Vex 775pro motor powers the conveyor and moves the material from the containment area to the delivery area. It was chosen because the motor had been used in previous robots and was already in the inventory. The 775pro has enough power to move the icy regolith from the robot up to the 0.6-meter drop-off point using the grousers on the belt. In operation, the motor will not be run past the first twenty percent of its performance curve, shown in Figure 17.
Figure 17. Chart for 775pro motor, showing operational levels of multiple aspects (VEX Robotics, n.d.).
Final CAD
Over numerous iterations, a final SolidWorks model was created. Three different configurations were modeled, with the figure below (see Figure 18) showing the starting configuration.
Figure 18. Side view of LOADER in the starting, compact configuration.
This configuration is used only at the beginning of the match, to stay within the starting restrictions of the competition. After the match has started, LOADER switches to the driving configuration (see Figure 19).
Figure 19. CAD model of the full LOADER in its driving configuration.
The driving configuration is used throughout the match to travel to and from the dig site and dumping location. As shown in the figure, the digger is slightly extended and the drop-off conveyor is positioned in a collection manner. Finally, Figure 20 shows an isometric view of the terminal digging configuration. At its terminal point, the digger can reach the deepest elements of the field, allowing for optimal collection of the icy regolith.
Figure 20. CAD model of the full LOADER system in the digging position.
Mass Budget
Per the requirements of the competition, a mass budget is required to track the weight of each part and of the entire robot. To accomplish this, material properties are added to each part of the CAD model or taken from specifications, as with motors, for example. As the materials are updated, the weights are added to the Product Breakdown System (PBS), which is explained in more depth in Section 6.2. In short, the PBS is a master list of parts for the entire system, including part counts and weights, and it keeps track of the overall weight. Additionally, after the estimated weight is determined and the robot is completed, the actual total must be measured to ensure the system meets the competition requirements.
The three main considerations when deciding on the weight of the robot were: 1) maximum load, 2) competition points, and 3) future changes to the system. The maximum weight the robot may be is sixty kilograms, and since the robot only needs to deposit one kilogram, the robot's operational load is calculated using sixty-one kilograms. The second consideration is competition points. The total weight of the robot shall be forty kilograms rather than the maximum, because a robot that weighs sixty kilograms scores no weight-based points and leaves no room for future additions. For every kilogram the robot weighs, the team is deducted 8.00 points; the less the robot weighs, the fewer points are deducted, with a maximum deduction of 480.00 points. A forty-kilogram robot costs the team 320.00 points from the overall score.
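The scoring arithmetic above is linear in mass and can be checked directly; a small sketch of the stated 8.00-points-per-kilogram rule:

```python
def mass_penalty(mass_kg: float, points_per_kg: float = 8.0) -> float:
    """Weight-based point deduction, per the 2020 scoring rule as described."""
    return mass_kg * points_per_kg

# A 60 kg robot takes the full 480-point deduction; the 40 kg target takes 320.
max_deduction = mass_penalty(60)
target_deduction = mass_penalty(40)
```

The 160-point difference between the maximum-mass robot and the 40 kg target is the direct scoring payoff of the lighter design.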
Future changes to the system were taken into consideration as well. Giving the next team a mass
allowance to make changes to the design is crucial for successfully continuing this project.
Allowing for up to twenty kilograms of mass to be added to the system creates a reasonable buffer
for improvements to any subsystem, electronics, power, or digging/depositing.
Electronics
The base power distribution system consists of two twelve-volt, twenty-two-amp-hour sealed lead-acid batteries in series. This capacity has an 11% margin over the 467 watt-hours required for the fifteen-minute regolith collection. The electronics of LOADER are distributed into different voltage lines and data transfer lines, as shown in Figure 21. The Jetson, Raspberry Pi 4, and HERO Boards will be encased in an aluminum enclosure with a removable lid for protection against the regolith dust; the enclosure will also act as a heat sink for the electronics. The energy consumed by the robot will be recorded with a commercial off-the-shelf (COTS) electronic data logger and be visible to the judges after the competition. Per the competition rules, an emergency stop switch is also required.
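The quoted 11% margin follows from the pack's nameplate figures; a back-of-the-envelope sketch (assuming the full 22 Ah rating is usable):

```python
def pack_energy_wh(cells: int, volts: float, amp_hours: float) -> float:
    # Series connection adds voltage; the amp-hour rating is unchanged,
    # so pack energy is (cells * volts) * amp_hours.
    return cells * volts * amp_hours

capacity = pack_energy_wh(2, 12.0, 22.0)   # 528 Wh nameplate
required = 467.0                           # Wh for a 15-minute collection run
margin = (capacity - required) / capacity  # ~0.115, the ~11% margin cited
```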
Figure 21. Diagram showing the connection of all the electronics on the robot.
The robot contains five motors: two 24 V motors and three 12 V motors. The two 24 V motors control the drivetrain. Each of the three 12 V motors controls one of the following subsystems: the bucket conveyor, the four-bar linkage, and the offload conveyor. The drivetrain motors are both controlled using an ODrive motor controller, which allows fine-tuning of PID gains, easy compatibility with ROS, and limiting of the current provided to each motor. The bucket conveyor motor is controlled through a CAN bus, beginning at a HERO Board and ending at the voltage regulator module. Two Talon controllers regulate the voltage to the four-bar and conveyor motors and send data from the encoders back to the HERO Board.
The robot also contains four sensors: a potentiometer, an IMU, and two IR sensors. The potentiometer is connected to one link of the four-bar mechanism and can read the position of the four-bar at any given time. The IMU is used to determine whether the robot's tilt angle is too great, in situations such as driving over a rock or entering a crater that is too steep. Finally, the two IR sensors are used to check how much material is in the collection bin. If neither sensor provides a signal for a given amount of time, the bucket is assumed to be empty; if the upper sensor provides a signal, the bucket is full of icy regolith.
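The fill-level logic described above can be sketched as a small decision function (a hypothetical helper; the argument names and the timeout value are assumptions, not the team's actual code):

```python
def bin_state(upper_signal: bool, lower_signal: bool,
              seconds_since_signal: float, timeout: float = 2.0) -> str:
    """Infer collection-bin fill level from the two IR sensors."""
    if upper_signal:
        return "full"       # upper beam sees material: bin filled to the top
    if lower_signal:
        return "partial"    # only the lower beam sees material
    if seconds_since_signal >= timeout:
        return "empty"      # neither sensor has fired recently
    return "unknown"        # too early to declare the bin empty
```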
Software
This year, the Robot Operating System 2 (ROS2) was used to control the robot. With ROS2, an action-based communication environment was used to send and receive data from the sensors on the robot. Several components of the software structure operate outside of the ROS network. Finally, the software has a sequence-based design and uses a database to track the robot's states and save them across hard system restarts. This functionality enables the robot to pick up a given sequence of actions from any point along the process, even after a complete shutdown.
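The save-and-resume behavior could be sketched with Python's built-in sqlite3 module (an illustrative sketch only; the table and column names here are assumptions, not the team's actual schema):

```python
import sqlite3

def save_step(conn, sequence: str, step: int) -> None:
    """Persist the last completed step so a restart can resume mid-sequence."""
    conn.execute("CREATE TABLE IF NOT EXISTS progress "
                 "(sequence TEXT PRIMARY KEY, step INTEGER)")
    conn.execute("INSERT OR REPLACE INTO progress VALUES (?, ?)", (sequence, step))
    conn.commit()

def resume_step(conn, sequence: str) -> int:
    """Return the step to resume from (0 if the sequence has never run)."""
    conn.execute("CREATE TABLE IF NOT EXISTS progress "
                 "(sequence TEXT PRIMARY KEY, step INTEGER)")
    row = conn.execute("SELECT step FROM progress WHERE sequence = ?",
                       (sequence,)).fetchone()
    return row[0] if row else 0

# In-memory database for illustration; the robot would use a file on disk
# so the record survives a hard restart.
db = sqlite3.connect(":memory:")
save_step(db, "dig_cycle", 3)
```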
ROS2
The decision to use ROS was supported by several factors. First, ROS is a standard in robotics, so there is widespread support for many of its capabilities, making it much easier to work with. Several teams in the past have worked with ROS, and the team was more familiar with the environment. It will also be easier for future teams to onboard, as they are more likely to have worked with ROS than with any other framework.
ROS2 supports a variety of libraries geared towards common robotics tasks, such as navigation. These libraries can be leveraged to accomplish the complex tasks the competition requires.
ROS2 is not optimal for real-time operations; this is the framework's biggest drawback. However, the competition does not require real-time response for success. Measures were also taken to reduce the bandwidth usage across the ROS network at any given time during operation. The high-level ROS2 network design is shown in Figure 22.
Figure 22. Initial Structure of the Robot’s Software Environment
The decision to implement ROS2 has driven many of the following design choices. The initial
ROS structure was much more complex, including many more components of the robot. The initial
diagram is shown in Figure 23.
Figure 23. Initial ROS Diagram
To maintain efficient use of bandwidth, the team encapsulated much of the complexity seen above. First, the visual subsystem was removed from the ROS network, encapsulating all raw visual data handling into a node that communicates only processed data across the network. Furthermore, all services and topics were replaced by actions, which are discussed in the following section. Similarly, no Raspberry Pi is represented in the initial graph; this change is also discussed in later sections.
Actions
Actions are a compound communication method implemented in ROS and improved upon in ROS2. An action initializes a command, receives status updates while the command is executing, and then receives a completion message. This communication protocol replaces both the ROS service and the ROS topic. Topics operate on a publish-subscribe model and are used for continuous data transmission, which the robot requires for the motor controllers and the IMU sensor. Services operate on a request-response model and are used for non-continuous data transmission, which the robot requires for the RealSense camera, potentiometer, and database system. Actions can fulfill both needs. An example of ROS2 action communication is detailed in Figure 24.
Figure 24. Example of a ROS2 Action
The first message sent is a goal: one node (the action client) asks another (the action server) to perform a task and report when the goal is reached. Once the goal is accepted, the client waits for feedback, which can return continuously until the goal has been accomplished. Once the goal is completed, or an error case arises, the server returns a result to the client. The goal and result take the place of ROS services, and the feedback takes the place of ROS topics.
The types of data accepted across each different action can be configured to the specific needs of
that communication channel.
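The goal/feedback/result exchange can be illustrated with a plain-Python generator standing in for the action server (a conceptual sketch of the protocol only, not actual rclpy code; the dig_action task and its numbers are invented for illustration):

```python
def dig_action(target_cm: int):
    """Simulated action server: yields feedback, then returns a result."""
    depth = 0
    while depth < target_cm:
        depth += 10
        yield {"feedback": depth}   # periodic status, like a ROS topic
    # Final reply, like a service response.
    return {"result": "reached", "depth": depth}

# Action client: send the goal, consume feedback, capture the result.
feedback_log, result = [], None
gen = dig_action(30)
while True:
    try:
        feedback_log.append(next(gen)["feedback"])
    except StopIteration as done:
        result = done.value
        break
```

The generator's yields play the role of feedback messages, and its return value plays the role of the result, mirroring how one action channel covers both communication patterns.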
ROS1 vs ROS2
ROS1 has been the standard for a long time; only recently has ROS2 become prominent. ROS, being an open-source project, thrives because of its community, and the community has not had as long to develop useful, robust packages for ROS2 as it has for ROS1. This is ROS2's biggest drawback: user support and package availability. However, ROS2 introduced a ROS Bridge that allows cross-communication between the two releases, making it possible to use packages available for version one but not version two; this allays the concern. The team decided to switch to ROS2 for two main reasons. First, ROS2 can run on Windows, whereas ROS1 must be hosted on a Debian-based machine. Although this is not a design consideration, since Ubuntu is used as the operating system on the Pi, it is an ease-of-use factor that was deemed important. Second, ROS2 has a much better implementation of actions. Although ROS1 does have actions, they function as more of an afterthought in the documentation and in many open-source packages, whereas actions are a central communication method in ROS2.
Raspberry Pi 4
At the beginning of this project, the main processor on the robot was the NVIDIA Jetson Nano. The Jetson is optimized for image processing and has a dedicated GPU; however, aspects of the Jetson prevented the computer from being fully utilized. When it became clear that the Jetson was not capable of handling image processing as well as central communication, the team looked for a replacement. During the formation of this solution, it was decided that the Jetson would still be
used to process the data coming from the cameras. To decentralize the ROS network, optimize the
connections to the rest of the sensors, and encapsulate the area of highest data traffic, the team
decided the Raspberry Pi 4 was a better main processor.
The Raspberry Pi 4 was a clear choice, as it solved several issues. ROS is much easier to implement on the Pi, and the Pi has more general-purpose input/output (GPIO) pins. Although the processing specifications of the Jetson and the Pi are very similar, the Pi is more than capable of handling the ROS2 network without the image processing load. Secondarily, when the Jetson was the main processor, many of the sensors were to be connected over USB. Since the Pi has 40 GPIO pins, the team no longer needs to rely on USB, which is beneficial because each USB device may be assigned a different serial port every time the board is powered on.
Teleoperation vs. Autonomy
The Lunabotics competition offers three different ways to complete a trial: teleoperation, partial autonomy, and full autonomy. Teams can earn five hundred more points by competing with full autonomy instead of teleoperation. Not only does this motivate a fully autonomous robot, it also reflects real-world applications when communicating with rovers on other planets: due to the distances between Earth and other planets, the signal delay makes it impossible to operate the robot properly in real time. Autonomy allows the robot to control itself and make decisions that could not be made within the timeframe of teleoperation, thereby operating under more realistic conditions. This does not, however, negate the need for a teleoperation system: if autonomy fails, teleoperation must be used, and teleoperation is also useful when testing mechanical subsystems before full deployment.
Sequence Methodologies
The first step in designing a sequence-based software structure is brainstorming events. The events are then rated based on how critical they are to completing the overall task. A critical aspect of event generation is cross-team communication, which is equivalent to requirements gathering for a typical software deliverable. The Software Sequence Model contains all the event cases that the team initially tried to address before extensive testing took place (Bimonte, et al., Software Sequence Model, 2020). These cases aided in creating the sequence diagrams that outline the flow of events, how they will be handled, and the order in which they must execute (see 0). Together, those two documents make up the robot's high-level design.
Vision and Navigation
According to the rules of the Lunabotics competition, the robot must travel to the digging site while avoiding obstacles on the field. This requires multiple tracking and image recognition techniques. For this, NVIDIA's Jetson Nano will be used, along with the OpenCV and OpenVSLAM libraries, to map the environment.
To process the depth images from the two RealSense cameras, a program identifies the pixels of the image that represent a boundary. The program begins by opening a pipeline that allows the camera to record data to the Jetson Nano. Once this pipeline is open, the camera can begin imaging; after waiting five to ten frames for the image to initialize, the camera captures the image. Using the depth sensor field and distance gradient image, the image processing program captures the total width and height of the image in pixels. The program then iterates through each pixel, placing the distance reading into a multidimensional array and a CSV file. In this step, the program compares each pixel to its surrounding pixels, looking for a large difference; if a large difference in distance is detected, the closer pixel is flagged as an obstacle and recorded in the CSV accordingly. In theory, this marks the outline of all obstacles, simplifying the SLAM pathfinding algorithm. Upon completion of the data transfer into the CSV file, the pipeline closes, turning off data collection from the camera. Finally, the Jetson Nano sends the collected CSV file via ROS to the Raspberry Pi 4, where the data is pushed into LOADER's sensor database and to the SLAM pathfinding algorithm. The image processing program can be called as rapidly as necessary to collect sufficient data for SLAM pathfinding.
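The neighbor-comparison step described above can be sketched in pure Python (illustrative only; the real program reads frames through the RealSense pipeline, and the 0.5-meter jump threshold used here is an assumption):

```python
import csv, io

def flag_obstacles(depth, threshold=0.5):
    """Mark pixels whose depth differs sharply from a right/down neighbor.

    depth: 2-D list of distances in meters; returns a same-shaped 0/1 grid
    where 1 marks the *closer* pixel of each large jump (an obstacle edge).
    """
    rows, cols = len(depth), len(depth[0])
    flags = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            for dr, dc in ((0, 1), (1, 0)):  # compare with right and below
                rr, cc = r + dr, c + dc
                if rr < rows and cc < cols and \
                        abs(depth[r][c] - depth[rr][cc]) > threshold:
                    # Flag whichever of the pair is nearer to the camera.
                    if depth[r][c] < depth[rr][cc]:
                        flags[r][c] = 1
                    else:
                        flags[rr][cc] = 1
    return flags

# Write the flags to CSV, as the pipeline does before handing off to SLAM.
grid = [[2.0, 2.0, 2.0],
        [2.0, 0.8, 2.0],   # a close object in the middle of the frame
        [2.0, 2.0, 2.0]]
buf = io.StringIO()
csv.writer(buf).writerows(flag_obstacles(grid))
```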
NVIDIA Jetson
The NVIDIA Jetson Nano is a micro-computer commonly used for image processing because of its onboard GPU, which other micro-computers, such as the Raspberry Pi 4, do not possess. Without a GPU, image processing is noticeably slower, and speed is critical for autonomous navigation: a delay in image processing could result in obstacle collision or complete system failure. The Jetson's task is to produce a map from the images of the two cameras, to be used for pathfinding.
Intel RealSense
To view the field, two Intel RealSense D435i depth cameras will be used, one on the front and one on the back of the robot. This system takes images from three different kinds of cameras: RGB, infrared, and depth. The RGB image displays what would be perceived by the human eye, allowing the robot to be teleoperated, while the depth camera allows calculation of the distance to a point on the field. The depth aspect of this camera will be used for the localization and mapping algorithms that determine the pathing of the robot. This is achieved in part by using a library from Intel, called LibRealsense, to gather the depth information from each of the cameras. The data is then processed into a user interface that displays distances
in meters via the color spectrum. Objects closer to the camera are displayed as blue while farther
objects are displayed as red, as shown in Figure 25.
Figure 25. Image captured from Intel RealSense displaying the distance gradient in meters, with an RGB
image for comparison.
Simultaneous Localization and Mapping
Simultaneous Localization and Mapping (SLAM) keeps track of both where the robot is and the environment around it. The input to the algorithm is a sequence of states. At each state, the robot knows the commands given to reach the current state and the observations made while in that state. The goal of the output is to map the environment and accurately know the path the robot has taken. Once the robot reaches the next state, the previous outputs are used to approximate the next set of goals at the new state. In practice, however, sensors can err; this is handled using a probability function (see Equation 1), which is explained in detail in the next paragraph.
p(x_t | x_{0:t-1}, z_{1:t-1}, u_{1:t}) = p(x_t | x_{t-1}, u_t)
Equation 1. The probabilistic law characterizing the evolution state (Mahroos, Hassan, & Shaaban, 2011)
Figure 26. The dynamic Bayes network that characterizes the evolution of controls, states, and measurements
(Mahroos, Hassan, & Shaaban, 2011)
The state at time t is dependent on the state at time t – 1 and command ut. The measurement at zt
depends on the state at time t (Mahroos, Hassan, & Shaaban, 2011). A visual representation of this
can be found in Figure 26. From this, it follows that with each successive state x, the probability
distribution widens and the uncertainty amplifies. At the next state, the robot can re-observe an
obstacle that it has previously observed to reduce the uncertainty of that obstacle's location. Since
position xt was dependent on position xt-1 and the observation at that position depended on the
distribution of the obstacle, the algorithm can refine the probability distributions of all the variables
that depended on the old distribution for the location of the obstacle. When the SLAM algorithm
has generated a full map, simple navigation could then be used with the map. This leads to greater
movement speed due to all obstacles and locations being mapped.
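The behavior described above, uncertainty growing with each motion step and shrinking when a landmark is re-observed, can be sketched with a one-dimensional Gaussian filter. The noise values below are illustrative, and this is a simplification of a full SLAM back end:

```python
def predict(mean, var, u, motion_noise_var):
    """Motion update p(x_t | x_{t-1}, u_t): uncertainty grows each step."""
    return mean + u, var + motion_noise_var

def correct(mean, var, z, meas_noise_var):
    """Measurement update: re-observing a landmark shrinks uncertainty."""
    k = var / (var + meas_noise_var)            # Kalman gain
    return mean + k * (z - mean), (1 - k) * var

# Illustrative run: two forward commands, then one landmark observation.
mean, var = 0.0, 0.01
for u in (1.0, 1.0):
    mean, var = predict(mean, var, u, 0.04)     # variance grows: 0.05, 0.09
mean, var = correct(mean, var, 2.1, 0.02)       # observation pulls variance back down
```

After the correction step the variance is well below the 0.09 reached by prediction alone, which is the refinement the paragraph above describes.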
OpenVSLAM and OpenCV
To achieve the goals of SLAM pathfinding, the team implemented the OpenVSLAM library. This
library provides the ability to use different types of cameras and models to implement a SLAM
algorithm, allowing the algorithm to utilize all camera modules on the RealSense cameras as
needed. OpenVSLAM also allows maps to be easily stored and loaded to the Raspberry Pi. This
simplicity makes localizing new images straightforward, as previously stored maps are
continuously updated with newer images, as seen in Figure 27. Through its implementation and
testing, OpenVSLAM proved to be integral to pathfinding and field mapping.
Figure 27. Implementation of OpenVSLAM for obstacle detection and field mapping
Along with the OpenVSLAM library, the OpenCV library played an important role in field
mapping. The OpenCV library allowed objects to be located using fiducial images. A fiducial is
an icon that, when viewed by a camera, can give alignment data to the robot. The fiducials each
have identifying marks that the library can track (see Figure 28). Since
each fiducial is unique, the OpenCV library can identify each fiducial used, and store that
information for future use. This library can also determine orientation based on the robot's location.
Using the dedicated GPU on the Jetson, the robot can use the image of a fiducial to determine its
orientation on the field. Combining the orientation features given by the fiducials with the depth
processing of the RealSense cameras and map of the field from OpenVSLAM, the robot can
determine its exact position relative to the fiducial. This process makes alignment with the
collection bin significantly more reliable, to ensure LOADER can deposit the collected material.
For LOADER to align with the collection bin, it will use the rear RealSense camera's RGB imaging
to locate a fiducial that will be placed on the bin.
Figure 28. Example of fiducial that would be placed on the collection bin.
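The pose recovery described above can be illustrated with simplified two-dimensional geometry. The function below is a hypothetical sketch: it assumes the fiducial's field position is known and that range and bearing to it have already been extracted from the camera data, whereas the actual system combines OpenCV pose estimation with RealSense depth:

```python
import math

def robot_position_from_fiducial(fid_x, fid_y, range_m, bearing_deg):
    """Recover the robot's 2-D field position from one fiducial sighting.

    fid_x, fid_y: the fiducial's known position on the field.
    range_m, bearing_deg: measured distance to the fiducial and the angle
    of the line of sight in the field frame. A simplified 2-D sketch of
    the geometry; names and inputs are illustrative assumptions.
    """
    theta = math.radians(bearing_deg)
    # Step back from the fiducial along the measured line of sight.
    return (fid_x - range_m * math.cos(theta),
            fid_y - range_m * math.sin(theta))
```

For example, a fiducial at the origin seen 2 m away along a 0-degree bearing places the robot at (-2, 0) in the field frame.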
Communication
With ROS being the method of communication between each subsystem, it was necessary to
provide hardware and electronic methods to send commands from the Raspberry Pi. This involved
multiple different computer systems and sensors to collect and transmit data. ODrive motor
controllers allow for communication with the drive motors. A HERO Development Board from
Cross The Road Electronics provided a communication channel to the digging and delivery
conveyor, utilizing CAN communication to send precise data. Lastly, Ethernet connection and
IMU data provided sensor support and communication to the WiFi bridge to ensure sufficient data
transmission and collection.
HERO Development Board
To control all non-drive motors on LOADER, the team opted to use a HERO Development Board.
This board, programmed in C#, allowed for the use of already-owned speed controllers, Talon
SRXs. These speed controllers, generally used in the FIRST Robotics Competition, allow for
precision control of brushed motors. To complement these, Falcon 500 motors from VexPro were
chosen, which are brushless motors that have a built-in speed controller.
Serial communication will be used to transmit data to and from the HERO development boards. With
GPIO pins available on the Raspberry Pi, along with the simplicity of data transfer,
serial communication seemed to be the most beneficial option. This led to testing two different methods
of serial communication: UART (Universal Asynchronous Receiver-Transmitter) and I2C (Inter-
Integrated Circuit). Both UART and I2C are commonly used in master-slave software
environments, like the relationships between ROS nodes, and there is documentation for this
method of serial transmission over ROS. Using the ROS library, rosserial, the data from the
Raspberry Pi main node can be transmitted to the HERO boards and vice versa. The HERO boards
will then be able to send commands to the attached motors using CAN. Controller Area Network,
or CAN for short, is a protocol that has been widely used in the FIRST Robotics Competition over
the past few years and by the industry for longer. Compared to the Pulse Width Modulation (PWM)
protocol, CAN allows for the sending and receiving of messages to and from each component on
LOADER, whereas PWM can only send integer values (0 to 255 generally). This was an important
feature due to the Talon SRXs having built-in encoder ports that the team wanted to utilize. Also,
CAN could be used to send an operating voltage or amperage for each motor instead of a speed
percentage to run at.
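A minimal sketch of framing such a command for serial transfer is shown below. The frame layout (sync byte, motor ID, signed 16-bit setpoint, checksum) is an illustrative assumption, not the actual rosserial or CAN wire format:

```python
import struct

SYNC = 0xA5  # illustrative start-of-frame marker

def pack_motor_command(motor_id, value):
    """Frame a motor command for serial transfer to the HERO board.

    motor_id: 0-255 target motor; value: signed 16-bit setpoint, e.g. an
    operating voltage in millivolts rather than a raw PWM duty cycle.
    Layout (sync byte, motor id, big-endian int16, one-byte checksum) is
    an assumed format for illustration only.
    """
    body = struct.pack(">BBh", SYNC, motor_id, value)
    checksum = sum(body) & 0xFF
    return body + bytes([checksum])

def unpack_motor_command(frame):
    """Validate sync byte and checksum, then return (motor_id, value)."""
    sync, motor_id, value = struct.unpack(">BBh", frame[:-1])
    assert sync == SYNC and frame[-1] == sum(frame[:-1]) & 0xFF
    return motor_id, value
```

Because the value field is a signed 16-bit integer rather than a single byte, it can carry a millivolt or milliamp setpoint, which is the advantage over PWM's 0-255 range noted above.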
IMU
An inertial measurement unit (IMU) allows the Raspberry Pi to collect orientation data for the
robot. Through the use of an IMU, LOADER can determine its initial orientation on the field, as
well as the direction of motion and other forces applied to the robot. There are IMUs in both
RealSense cameras which will not be turned off with other functions of the camera per rules of the
competition. The main IMU for the robot will be located at the center of mass. Similar to the
HERO Boards, the IMU communicates with the Raspberry Pi through serial communications,
specifically I2C. The Raspberry Pi powers the IMU and communicates via the second and third
GPIO pins. Unfortunately, due to the COVID-19 pandemic, the IMU sensor was not fully
implemented nor tested.
Ethernet
An Ethernet communication system was chosen to tie each component together and connect to the
WiFi bridge. The team chose the Ethernet standard over other standards, like WiFi or Bluetooth,
to diminish the possibility of signal interference, and to avoid increased data transfer during the
competition. In addition to avoiding signal interference, theoretical transfer speeds are greatly
superior to both WiFi and Bluetooth, due to the nature of Ethernet. Lastly, in the team's limited
testing, the ROS system was able to connect with greater ease to each of the components compared
to the wireless counterparts.
Database
State monitoring is an important part of any well-thought-out robot design. The team decided to
implement a MySQL database that keeps track of various information about the robot, broken
down by individual trials, subsystems, and components. When designing a database, the creation
of an Entity Relationship Diagram (ERD) is a key step. It allows for the thoughtful planning of
each table and their relations to each other. The ERD for LOADER, as shown in Figure 29,
represents how every component of the robot relates to each other.
Figure 29. Entity Relationship Diagram (ERD) for LOADER
Component Relationships
One of the most important aspects of the above ERD is the Component table. The Component
table holds all the various components that exist on the robot that can be tracked. This can range
from the batteries and motors to the cameras and sensors. Each component is given a unique ID, a
name, and a device type to go along with it. Figure 30 not only depicts this but also shows how
motors and sensors pair to each other.
Figure 30. The Component table as represented in the ERD
The “Pairs” relationship indicates how a sensor pairs to a motor. In the diagram above, a Sensor
can only be paired with one motor; however, a motor can be paired with multiple sensors. This is an
important relationship to note as an encoder, for example, cannot be paired to more than one motor.
However, with that same logic, a temperature sensor and an encoder can both be paired to a single
motor to monitor various aspects of it.
Subsystems
Figure 31. The Component and Subsystem tables and their relationship
A subsystem is a part of the robot that performs some sort of task, like the digger mechanism. A
subsystem is made up of a unique ID, a name, and a description. As shown above, in Figure 31, a
component can only belong to one subsystem. However, a subsystem can have multiple
components. This is shown by the “Contains” relationship, which takes the ID of the component
and pairs it with the ID of the relevant subsystem. This allows for both high- and low-level views
to monitor the state of the robot by subsystem or as one system.
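The "Contains" and "Pairs" relationships described above can be sketched in SQL. Table and column names here approximate the team's MySQL schema and are demonstrated with in-memory SQLite; the PRIMARY KEY on sensor_id is what enforces the one-motor-per-sensor rule:

```python
import sqlite3

# A sketch of the ERD's Subsystem/Component tables; names approximate
# the report's MySQL schema, demonstrated here with in-memory SQLite.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE Subsystem (
    id INTEGER PRIMARY KEY,
    name TEXT NOT NULL,
    description TEXT
);
CREATE TABLE Component (
    id INTEGER PRIMARY KEY,
    name TEXT NOT NULL,
    device_type TEXT NOT NULL,
    subsystem_id INTEGER REFERENCES Subsystem(id)  -- "Contains": one subsystem per component
);
-- "Pairs": sensor_id is the primary key, so a sensor can appear in at
-- most one row (one motor per sensor), while a motor may appear in many.
CREATE TABLE Pairs (
    sensor_id INTEGER PRIMARY KEY REFERENCES Component(id),
    motor_id  INTEGER NOT NULL REFERENCES Component(id)
);
""")
db.execute("INSERT INTO Subsystem VALUES (1, 'Digger', 'Bucket conveyor')")
db.execute("INSERT INTO Component VALUES (10, 'Conveyor motor', 'Motor', 1)")
db.execute("INSERT INTO Component VALUES (11, 'Encoder', 'Sensor', 1)")
db.execute("INSERT INTO Component VALUES (12, 'Temp sensor', 'Sensor', 1)")
db.execute("INSERT INTO Pairs VALUES (11, 10)")
db.execute("INSERT INTO Pairs VALUES (12, 10)")  # two sensors on one motor: allowed
```

Attempting to pair sensor 11 with a second motor would violate the primary key and be rejected, mirroring the constraint described above.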
Trials
Adding a Trial table allowed saving historical data for debugging and improvement tracking.
Without a Trial table, data specific to a previous trial would be cleared each time the
system started. A Trial is first composed of a unique ID, a Type (Practice vs. Competition), and a
Start Time. After the Trial, the Stop Time and Completion Status are then updated. By taking the
difference (delta) of the two times, it is possible to determine whether the robot completed the
intended goals in the fifteen-minute time frame and whether it had improved since the last trial.
Figure 32. The Run and Subsystem tables and their relationship
In addition to calculating the start and end time of the trial, keeping track of the trial times for each
subsystem was equally important. Since a subsystem could run multiple times in each trial, the
relation was kept as many-to-many, meaning that multiple trials could be linked to multiple
subsystems and vice versa (see Figure 32).
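Computing the delta described above is straightforward once the Start and Stop times are read back from the Trial table. A small sketch, assuming the timestamps are stored as ISO-8601 strings (the actual column types in the team's schema may differ):

```python
from datetime import datetime, timedelta

def trial_within_limit(start_iso, stop_iso, limit=timedelta(minutes=15)):
    """Return (duration, fits) for a trial's Start and Stop times.

    Timestamps are assumed to be stored as ISO-8601 strings; this is an
    illustrative assumption about the schema.
    """
    delta = datetime.fromisoformat(stop_iso) - datetime.fromisoformat(start_iso)
    return delta, delta <= limit

# Hypothetical trial row: started at 10:00:00, stopped at 10:12:30.
duration, fits = trial_within_limit("2020-05-18 10:00:00", "2020-05-18 10:12:30")
```

Here the trial ran for 12 minutes 30 seconds, inside the fifteen-minute window.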
Message Relationships
Figure 33. Individual Components and their relationships
Keeping track of the runs and the times of each subsystem is important, but this alone is not
sufficient. This is where individual component tracking must be utilized. As shown in Figure 33,
everything is broken down based on the following component types: Battery, Camera, Motor, and
Sensor. The tables for each of these components, apart from the Sensor table, get a steady stream
of information. All the tables have three foreign keys assigned to them, except the Battery table
which has two, to relate the data in the tables to higher-level views. These foreign keys are the
Component ID, the Subsystem ID, and the Run ID. The Sensor table does not follow the same
style as the other three tables since there are different types of sensors that are present on
LOADER. Shown in the ERD subset in Figure 33, the IMU table uses the relationship “IsA” to
denote that it is a sensor. The “IsA” relationship denotes that an input is a certain type, such as a
sensor in this case. That table, along with any other sensor-typed table, is the table that will
continuously receive information.
Analysis
Development and Prototyping
For prototyping, designs were first modeled in SolidWorks to portray the current design as
accurately as possible. Prototypes were made with available materials that were laser cut and
assembled. The initial design was for the excavation wheel and conveyor, which
was built out of wood (see Figure 34).
Figure 34. Laser-cut wooden prototype assembly
The purpose of the excavator wheel prototype was to allow for the angle of the conveyor's arm to
be changed. Two prototype buckets were designed and 3D printed, along with an internal insert to
allow for a change in angle to guide the collected material through the bucket (see Figure 35).
Figure 35. 3D printed internal insert for the prototype buckets.
The two plausible 3D printed bucket designs could be attached in up to four spots, and the angle
of the excavator relative to the ground could be altered from 0 degrees to 30 degrees with 2-degree
increments. As the angle increased, the collected material passed through the bucket more
effectively. Changing from a straight bucket to an angled bucket that matched the excavator wheel
angle proved to be a vast improvement in both collection rate and amount (see Figure 36). This
allowed for a variety of test situations and data. Each bucket had a different tooth style, head
angle, and intake opening. Videos were taken of each test to show the different collection rates of the
buckets.
Figure 36. Prototype CAD Model of the 0-degree bucket (left) and 30-degree bucket (right).
Overall, this test method was successful in providing the team information about how much
material will be mined per revolution and the feasibility of the design. Unfortunately, the overall
lack of digging payload led to the elimination of the wheel and attachment of the digging buckets
to a conveyor belt instead.
After the design phase of the excavator conveyor, the CAD was updated to reflect the new
mechanisms involved and was further developed to support motion analysis. During the CAD
stage, component interactions were troubleshot; the center of mass properties were adjusted
through iteration; overall mass was calculated to ensure it was within limits; and the team modeled
all additional fasteners, tubing, and small parts that would be necessary.
To find the force required to dig through sand and rocks at the bottom of the digging conveyor, a
prototype (seen in Figure 37) was created. This consisted of two pieces of wood that were used for
the base and a third piece of wood with a bucket (from the previous team) secured with duct tape,
all connected with a single axle. The base wood pieces were used to secure the structure in the
sand, with the bucket-mechanism positioned to be able to collect a full load of material. To get the
force required, a spring scale was also attached to the central piece of wood. The bucket was
attached at two configurations: flat, and approximately 10 degrees. These configurations were
chosen to represent the two general orientations the buckets will be in when traveling around the
end of the conveyor belt. It was necessary to test both configurations as the force
required would vary, as seen in Table 4. The spring scale was located 314 millimeters from the
axle, and the tip of the bucket was located at a different distance from the axle in each
configuration.
Configuration    Distance (mm)    Peak Force Gage (N)    Peak Force Calculated (N)    Steady Force Gage (N)    Steady Force Calculated (N)
Flat             101              11.5                   36                           3.2                      10
Angled           143              4.5                    9.7                          1.2                      2.6
Table 4. Distance from axle to bucket teeth in two configurations for digging conveyor testing, the average
peak and steady forces read on the gage, and the calculated forces at the bucket teeth.
Figure 37. Conveyor Excavator Prototype to calculate digging force in flat-bucket orientation (left), and a
closeup of the bucket at approximately 10 degrees (right).
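The calculated forces in Table 4 follow from a torque balance about the axle: the force at the bucket teeth equals the spring-scale reading scaled by the ratio of the two lever arms. A quick check of the flat configuration:

```python
def bucket_force(gage_force_n, gage_dist_mm=314, bucket_dist_mm=101):
    """Torque balance about the axle.

    F_gage * r_gage = F_bucket * r_bucket, so the force at the bucket
    teeth is the spring-scale reading scaled by the lever-arm ratio.
    Defaults are the flat-configuration distances from Table 4.
    """
    return gage_force_n * gage_dist_mm / bucket_dist_mm

# Flat configuration: 11.5 N peak and 3.2 N steady on the gage at 314 mm,
# with the bucket teeth 101 mm from the axle.
peak_flat = bucket_force(11.5)    # about 36 N, matching Table 4
steady_flat = bucket_force(3.2)   # about 10 N, matching Table 4
```

The angled-configuration row follows the same relation with the 143 mm distance, though small differences in the measured geometry shift the result slightly.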
Once the digging force was determined from the prototype, it was used to develop static analyses
on the bucket conveyor and four-bar mechanism. These analyses were used to determine the power
requirements needed from motors before ordering them, and also strength requirements for the
fasteners of the bucket on the pulley. The prototyping phase did not include every mechanism in
the robot, so the SolidWorks Analyses were a major part of showing how the mechanisms would
likely perform in real life. Overall, the analysis phase covered each aspect of the robot using either
real-life models or SolidWorks data and theoretical calculations.
Sinkage Calculations
Sinkage calculations were conducted to evaluate how deep the wheels would sink into the regolith
with the design. The results from these calculations were determined to be negligible, which
assisted in the determination of the wheel dimensions. The sinkage calculations used the following
equations: rigid wheel sinkage on an extraterrestrial surface, the Bekker pressure sinkage equation,
and the Reece pressure sinkage equation (shown in Equation 10 and Equation 11). Table 5 shows
the results of the sinkage calculations. The wheel dimensions were verified using the pressure
sinkage equations.
Using the surface area from the wheel equation, as well as the properties of the regolith simulant,
the final acceptable sinkage for the overall design was determined. The sinkage turned out to be
well under the ten-millimeter cutoff set by the system requirements, and was therefore deemed
negligible; it would not need to be considered unless the wheels rotated at an extreme speed.
Variable                  Number      Units
Sinkage                   1.10E-03    Meters
Mass on each Wheel        15          kg
Cohesion of soil used     1           kPa
Table 5. Sinkage Calculation for Robot Wheels
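The Bekker pressure-sinkage relation has the form p = (k_c/b + k_phi) * z^n, which can be inverted for the sinkage z given the ground pressure under one wheel. The sketch below uses placeholder soil parameters and an assumed contact patch, not the team's regolith-simulant data, so it only illustrates the shape of the calculation:

```python
def bekker_sinkage(load_n, contact_area_m2, width_m, k_c, k_phi, n):
    """Invert the Bekker relation p = (k_c/b + k_phi) * z**n for sinkage z.

    load_n: wheel load in newtons; contact_area_m2: assumed contact patch;
    width_m: wheel width b; k_c, k_phi, n: soil parameters. All soil and
    geometry values used below are illustrative placeholders, not the
    team's measured simulant properties.
    """
    pressure = load_n / contact_area_m2            # ground pressure p
    return (pressure / (k_c / width_m + k_phi)) ** (1.0 / n)

# 15 kg on one wheel (per Table 5) over an assumed 0.02 m^2 contact patch,
# with placeholder soil parameters of roughly lunar-regolith magnitude.
z = bekker_sinkage(load_n=15 * 9.81, contact_area_m2=0.02, width_m=0.15,
                   k_c=1.4e3, k_phi=8.2e5, n=1.0)
```

Even with these placeholder numbers the sinkage comes out below the ten-millimeter cutoff, consistent with the negligible result reported above.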
Speed Calculations
The robot's speed is based on the size of the field (5.4 by 3.6 meters), the allotted time available
(fifteen minutes), and a conservatively estimated path taken (seen in Figure 38). The N-shaped
path was chosen to ensure the speed of the robot would be more than sufficient to navigate the
field, collect material, and deposit it in the given time limit. Although the robot would not travel
along the walls of the field, using these values adds a factor of safety to the calculations as well.
Figure 38. Potentially longest path for robot navigation used to calculate robot speed.
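The 12.96 meters-per-minute figure can be reproduced if the N-shaped path is approximated as three legs of the 5.4-meter field length, an assumption on our part about how the path length was taken:

```python
# Assumed path length: three legs of the 5.4 m field length (about 16.2 m);
# the diagonal of the "N" is approximated here as a full-length leg.
path_m = 3 * 5.4            # 16.2 m per lap
laps = 4                    # two digging and two depositing traversals
travel_minutes = 5.0        # travel budget out of the 15-minute trial

speed_m_per_min = path_m * laps / travel_minutes   # 12.96 m/min
lap_seconds = path_m / speed_m_per_min * 60        # 75 s per lap
```

Under this assumption, four laps at 12.96 meters per minute fit exactly into the five-minute travel budget, with each lap taking seventy-five seconds.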
The calculation of the distance of the least optimal path is shown in Equation 2. The travel time of the
robot will be less than the fifteen minutes allotted, as there are two digging and two depositing
stages that will occur. If the time allotted for travel is reduced to five minutes, with this path needing
to be completed four times, the speed at which the robot must travel to complete the N-shaped path
is 12.96 meters per minute (shown in Equation 3). Each lap would be completed in seventy-five