American Institute of Aeronautics and Astronautics
1
Small-Scale Payload Operations Simulator for Proximity
Operations
Kristia K. Harris1, Jared M. Cokley2, Sean R. Holden3, Bogdan Udrea4, Shane T. Stebler2 and Blake E. Williams2
Embry-Riddle Aeronautical University, Daytona Beach, Florida, 32114
Michael R. McGarvey5
VisSidus Technologies, Inc., Daytona Beach, Florida, 32114
and
Michael V. Nayak6
Red Sky Research, LLC, Albuquerque, New Mexico, 87108
The process and studies that make up the design, construction, and testing of a desktop-size spacecraft simulator are detailed. The spacecraft simulator is essential for validating and verifying the requirements and science algorithms of the Embry-Riddle Aeronautical University (ERAU) cubesat, Application for Resident Space Object (RSO) Autonomous Proximity Analysis and Imaging (ARAPAIMA). The simulator is designed with simple commercial-off-the-shelf (COTS) hardware integrated in a way that allows for accuracy and feedback control in a small form factor.
Nomenclature
J2 = second zonal harmonic of the Earth's gravity field
R = steps per revolution
n = step mode (microstep divisor)
D = duty cycle
I = on time
O = off time
Vm = motor power supply voltage
Vref = current-limit reference voltage
I. Background
ARAPAIMA is the basis for the design and testing of the desktop spacecraft simulator. ARAPAIMA is a 6U cubesat that autonomously navigates and maneuvers in close proximity to an RSO to perform visible, infrared, and 3D imaging of the RSO, without prior knowledge of its shape and attitude state, with sufficient accuracy to autonomously plan rendezvous and docking maneuvers with the RSO. For this mission, the RSO is planned to be the spent upper stage of the launch vehicle.
1 MS Candidate, Department of Mechanical Engineering, Embry-Riddle Aeronautical University.
2 Senior Student, Department of Aerospace Engineering, Embry-Riddle Aeronautical University.
3 Junior Student, Department of Electrical, Computer, Software & Systems Engineering, Embry-Riddle Aeronautical University.
4 Professor, Department of Aerospace Engineering, Embry-Riddle Aeronautical University, and AAS Member.
5 Chief Financial Officer, VisSidus Technologies, Inc., Daytona Beach, FL.
6 Principal Research Scientist, Red Sky Research, Albuquerque, NM, and AIAA Member.
Figure 1. Image of the ARAPAIMA cubesat
The satellite payload consists of two commercially available cameras, one for imaging in the visible spectrum and
the other for imaging in infrared (IR), and a miniature laser rangefinder (LRF) with a range of a few kilometers. The
three instruments are installed on the nanosat so that their apertures point in the same direction. The specifications of
the payload instruments are shown in Table 1. Using the three instruments, the nanosatellite performs visible and IR imaging of the RSO for passive optical relative navigation and takes distance measurements with the LRF to generate 3D point clouds of the RSO.
Table 1. ARAPAIMA payload instruments
Instrument   Size (mm)   Mass (kg)  Power (W)  Cost (1000 $)
MLR2K        34x54x90    0.115      1.5        13.0
GA640C       26x26x26a   0.027      1.5        28.9
STC-CL202A   28x28x37a   0.090      4.0        1.5
Values for power show peak consumption. a Without optics.
The ARAPAIMA mission consists of three mission objectives which define the success criteria of the mission:
1) Determine the 3-D shape of the RSO without catalogued comparison.
2) Autonomously navigate and safely maneuver in close proximity to the RSO, in LEO.
3) Estimate the attitude state of the RSO by remote observation.
The mission objectives of ARAPAIMA are achieved in five steps of increasing complexity. During the first two steps, the cubesat is commanded by ground control to maneuver within LRF range of the RSO and acquire a relative circular orbit with respect to it. The third step consists of ARAPAIMA maneuvering autonomously to reduce the size of the relative orbit to a few hundred meters by applying Angles Only Navigation (AON) techniques. The fourth step will perform visible and IR passive imaging of the RSO. During the fifth step, a combination of chaser attitude motion and relative motion between the cubesat and the RSO is employed to perform 3D imaging of the RSO by combining LRF measurements and knowledge of the cubesat's inertial attitude and position. Successful completion of the mission validates a range of technologies that can be used for debris removal from low Earth orbit by demonstrating robust, affordable, and responsive rendezvous of cubesats with uncooperative RSOs, on a budget two orders of magnitude lower than previous observer missions such as XSS-11 (AFRL) and Orbital Express (DARPA). The science Concept of Operations (ConOps) can be seen in Figure 2.
These five steps are described in detail below.
Step 1: Maneuver within LRF range (less than 2 km) of the RSO using pre-loaded commands. After separation from the launcher, detumbling, and systems check, ARAPAIMA is authorized to perform the first step. During this step, the cubesat approaches the RSO to a distance just below 2 km. The cubesat is commanded to point the payload at the RSO and take visible and IR images and LRF ranges to confirm the successful execution of the step.
Figure 2. The mission storyboard for the ARAPAIMA mission
Step 2: Acquire a relative circular orbit, with respect to the RSO, of less than 2 km radius using pre-loaded
commands. Once the verification of the relative distance is completed, an Authorization To Proceed (ATP) from the
ground station is issued, and the cubesat uses pre-loaded commands to acquire a circular relative orbit with the RSO.
The radius of relative orbit is within LRF range, and similarly to the first step, after completion of the maneuvers the
cubesat uses its cameras for RSO imaging and the LRF to confirm its range. Additionally, the attitude is commanded
so that the payload tracks the RSO as ARAPAIMA orbits it. The ground control team issues an ATP after
confirmation of relative orbit acquisition, and the cubesat proceeds to the next step.
Step 3: Maneuver autonomously to reduce the size of the relative orbit to below a few hundred meters. The third
step starts with the cubesat acquiring the RSO with its visible and IR cameras and using the LRF to perform periodic
ranging. The orbits of both the RSO and the cubesat are propagated on board the cubesat, and a propellant optimal
maneuver is computed to take the cubesat into a tighter relative circular orbit. During the inactive, nonthrusting arcs
of the reconfiguration trajectory the cubesat periodically acquires the RSO with the cameras, and it takes LRF
measurements and GPS solutions to verify the accuracy of the OMT maneuvers and ensure operational safety.
Operations during the third step are defined as autonomous because the orbital and attitude maneuver commands are
generated on board the cubesat instead of being pre-loaded by ground control. The team emphasizes that the
autonomous maneuvers performed during the proximity operations will be designed for simplicity and robustness.
The maneuvers will be fully validated on a high fidelity real-time mission simulation test-bed throughout the
lifetime of the mission during preparatory sessions. At the end of the third step, the cubesat lies in a relative circular orbit of 250 m diameter about the RSO.
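On-board relative-orbit propagation of the kind described in Step 3 is commonly done with the Clohessy-Wiltshire (Hill) equations. The closed-form solution below is a standard textbook sketch, not the flight propagator; x is radial, y along-track, z cross-track, and n is the mean motion of the reference orbit.

```python
import math

def cw_propagate(x0, y0, z0, vx0, vy0, vz0, n, t):
    """Closed-form Clohessy-Wiltshire solution for relative position at time t."""
    s, c = math.sin(n * t), math.cos(n * t)
    x = (4 - 3 * c) * x0 + (s / n) * vx0 + (2 / n) * (1 - c) * vy0
    y = 6 * (s - n * t) * x0 + y0 + (2 / n) * (c - 1) * vx0 + (4 * s - 3 * n * t) / n * vy0
    z = c * z0 + (s / n) * vz0
    return x, y, z

# With vy0 = -2*n*x0 the along-track drift cancels and the relative orbit is closed:
# after one full orbit the chaser returns to its starting relative position.
n = 0.0011  # rad/s, roughly a LEO mean motion
x, y, z = cw_propagate(100.0, 0.0, 0.0, 0.0, -2 * n * 100.0, 0.0, n, 2 * math.pi / n)
print(x, y, z)  # back near (100, 0, 0)
```

The drift-free initial condition illustrated here is exactly the kind of bounded relative circular orbit the mission maintains around the RSO.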
Step 4: Perform autonomous visible and IR imaging and LRF reflectivity measurements of the RSO. After
successful completion of the third step and receiving the ATP, the cubesat proceeds with the fourth step during
which it is tasked to perform autonomous imaging of the RSO and to autonomously plan relative orbit maintenance
maneuvers to offset the effects of differential drag, J2, and solar radiation pressure (SRP). Images taken in the visible
and IR spectrum will be used by the team to inspect the RSO and identify any notable features. Once a certain number of observations are made, the cubesat enters its telecom mode to download the data to the ground station. Upon analysis of the imaging data, the ground control team decides whether to issue the ATP for the fifth step.
Step 5: Perform 3D imaging of the RSO using a combination of attitude motion and relative motion, with
respect to the RSO, and combine LRF measurements and knowledge of the cubesat’s inertial attitude and position to
generate point clouds. Open outer loop control of attitude: Pre-programmed attitude profiles are used, which
command the cubesat to perform an up-and-down scanning motion or a slow spiral with respect to the RSO. A
coarse attitude state of the RSO with respect to the chaser body frame can be estimated and transformed to an
inertial frame based on the attitude solution of the chaser. Point clouds of higher resolution resolve the features of the RSO and can be used to determine their relative locations with respect to the chaser body frame. Based on the information extracted from the point clouds, rendezvous (RV) and docking paths can be planned on board, and the chaser is
commanded to follow them up to a safe distance to the RSO. End-to-end simulations of the scanning phase of the
mission will be employed to determine which parts are better performed autonomously and which are better
performed by pre-loaded commanding. These methods are also applicable to 3D imaging of tumbling or
maneuvering RSOs [1].
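The point-cloud construction in Step 5 can be sketched as follows: each LRF return at a measured range along the instrument boresight is rotated from the chaser body frame into the inertial frame and offset by the chaser's inertial position. The rotation-matrix attitude representation and the function name here are illustrative assumptions, not the flight algorithm.

```python
import numpy as np

def lrf_point_inertial(range_m, boresight_body, R_body_to_inertial, r_chaser_inertial):
    """Map one LRF range measurement to an inertial point-cloud point.

    range_m            : measured range along the boresight (m)
    boresight_body     : unit vector of the LRF boresight in the body frame
    R_body_to_inertial : 3x3 rotation matrix from body to inertial frame
    r_chaser_inertial  : chaser position in the inertial frame (m)
    """
    p_body = range_m * np.asarray(boresight_body, dtype=float)
    return np.asarray(r_chaser_inertial, dtype=float) + R_body_to_inertial @ p_body

# Example: chaser at the origin, identity attitude, 100 m return along +x.
p = lrf_point_inertial(100.0, [1.0, 0.0, 0.0], np.eye(3), [0.0, 0.0, 0.0])
print(p)
```

Accumulating such points over the scanning motion described above yields the 3D point cloud of the RSO.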
The desktop spacecraft simulator is an excellent foundation for the design and validation of ARAPAIMA. The advanced maneuvers and on-board algorithms necessary for the success of the ARAPAIMA mission provide a challenging basis against which the simulator is tested. As the mission description shows, the stages of the mission provide a full range of maneuvers around which to design the testbed capability. The testbed is designed to meet the requirements, accuracy, and feedback necessary to accurately simulate the required ARAPAIMA maneuvers.
II. Introduction
The desktop spacecraft simulator and payload testbed, also known as Chronos, is a concept developed primarily to test the actual payload of ARAPAIMA. The simulator is named after the Greek god Chronos, often referred to as the personification of time: the success of the simulator, and of the testing built upon it, holds the schedule of the ARAPAIMA mission in its hands. In addition to the payload itself, the algorithms being developed for use with the ARAPAIMA payload also need to be tested, which completes the concept. The purpose of Chronos is to serve as a 1/24 scale maneuver testbed for the ARAPAIMA mission. The 1/24 scale was chosen because it is a very common hobby model scale, giving easy access to model parts for use in the simulator.
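As a quick illustration of the 1/24 scale, a full-size dimension maps to the testbed by dividing by 24. The 2.4 m upper-stage diameter used below is a hypothetical round number for illustration, not a mission value.

```python
# Convert a full-scale dimension to the 1/24 testbed scale.
def to_testbed_scale(full_scale_m, scale_denominator=24):
    return full_scale_m / scale_denominator

# A hypothetical 2.4 m upper-stage diameter becomes 0.1 m (10 cm) at 1/24 scale.
print(to_testbed_scale(2.4))
```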
The goals of the ARAPAIMA testbed and payload simulator are to refine the design of ARAPAIMA, integrate
the hardware (HW) and software (SW) components, and perform laboratory tests of the optical payload of a
nanosatellite that performs proximity space-based space situational awareness (SSA) missions.
The work in progress is significant because it advances the state of the art in the integration and operation of
miniaturized, commercial off-the-shelf (COTS) optical instruments for nanosatellite space-based SSA, it reduces the
technical and schedule risk of the ARAPAIMA mission, it provides the undergraduate and graduate students
involved in the project hands-on training in space systems engineering, and it facilitates the development of unique
space systems engineering capabilities at the ERAU Department of Aerospace Engineering.
Successful completion of the testbed advances the technology readiness level of the ARAPAIMA payload and it
significantly reduces the technical and schedule risks associated with the development of the payload and its
integration with the nanosatellite subsystems.
Figure 3. The block diagram detailing the spacecraft simulator
A large factor in the design of the system is cost, and cost efficiency is kept a high priority. As the following sections show in more depth, the design combines low-cost hardware with software to form high-performance systems.
There are four main systems within Chronos: the spacecraft simulator robot, the RSO simulator robot, the motion tracking system, and the computer software. Each system is described in more detail throughout the paper, covering the design choices, the current status of the designs, the challenges that have been encountered, and the future plans for the system. The block diagram of Chronos can be seen in Figure 3.
Many challenges are present in developing a small, highly accurate, isochronal system for the simulation of spacecraft on the ground. To address some of these challenges, Chronos has gone through virtual design iterations, and many additional tests and studies have been completed or are planned. These designs and studies allow us to optimize the performance of the system without significantly increasing the cost, though at the expense of schedule slips and extensions.
III. Current Simulators in Use
Chronos is designed to replicate some of the abilities of the larger payload simulators, like the European
Proximity Operations Simulator (EPOS). The EPOS is located at the German Space Operations Center operated on
behalf of the European Space Agency (ESA). The EPOS can be seen in Figure 4.
There are also similar simulators made and operated by the United States government. In 2008, the U.S. Defense Advanced Research Projects Agency (DARPA) and the Naval Research Laboratory (NRL) designed and built the Proximity Operations Test Bed (POT). This dual-platform motion simulator conducts full three-dimensional spacecraft rendezvous and docking replication under realistic dynamic conditions. The POT platforms are capable of conducting operations with payloads of 350 kg (Target Platform) and 400 kg (Pursuer Platform). The two platforms have a maximum translation range of travel of 25 m, 10 m, and 3 m in the x, y, and z axes, respectively. The NRL facility also hosts other simulation test beds designed to replicate other space objectives. The Front-End Robotics Enabling Near-Term Demonstration (FREND) robotic arm is designed to test and demonstrate the ability to interact with and perform unaided grapple operations autonomously [2].
The downside to these testbeds is that they are large and expensive to own and operate, and obtaining permission to use one of them is difficult (unless working with the respective owners of the simulators). Providing access to a simulator, and the advantages a simulator adds to the design of a nanosatellite, is one of the future goals of Chronos. A simulator that can be easily purchased or replicated would provide many advantages for nanosatellite developers.
IV. Design Plan
The design of the spacecraft simulator and the testing of the ARAPAIMA payload and algorithms are broken
down into five tasks that are currently in progress. These five tasks form the basis for the schedule of the design and the work breakdown structure, and provide a basic overview of the design. The tasks are not dependent on each other and are therefore being worked on simultaneously.
The first task is to develop and test the software for the command and control of the payload. Drivers for the
payload are being developed for the Linux operating system. Then, the camera command
routines can be written and the storage of images and range measurements can be managed and optimized. This task
results in the initial release of the prototype flight software for the payload.
The second task is to design and manufacture a payload emulator to house the payload components. The cameras
and laser rangefinder are to be integrated with the emulator and the on-board computer. The emulator is simply a
mechanical enclosure that houses the instruments, provides mechanical interfaces for mounting, and provides
electrical interfaces to supply power to the payload.
The third task is designing the experimental apparatus. Various pieces of equipment are integrated into an
experimental apparatus that simulates the relative translation between the nanosatellite and a RSO or target and the
attitude motion of the nanosatellite. In addition, the jitter of the nanosatellite will be simulated by superimposing the jitter on the simulated attitude motion. This task results in a fully functional experimental apparatus and calibration
and operating procedures.
The fourth task is to manufacture scale models of various RSOs. In this case the RSOs are simply the upper
stages of various launch vehicles. The RSOs are assembled and painted to resemble the real-life object. In addition,
the RSOs will contain small heaters that emulate active components in preparation for radiometry studies.
The fifth task is designing the experiment plans for imaging the RSO models with the payload emulator while in relative motion with respect to the RSO. In addition, the experimental apparatus is calibrated, the software required to run the experiment is written, and the experiment is run with results collected.
These five tasks, once completed, allow the simulator to be designed and constructed, software to be written, the experiment to be run, and the payload and flight software to be integrated and tested. After completion of these tasks, the ARAPAIMA mission flight cameras and laser rangefinder will have undergone a partial characterization and the software will be tested and ready for debugging.
Figure 4. Image of EPOS. Courtesy of the European Space Agency
V. Present Status of the Simulator
A. Design Iterations
The first design approach for the
spacecraft simulation system utilized a
simple, COTS robotic arm integrated
with an in-house designed and
manufactured rail system. The
CrustCrawler Smart Robotic Arm was
selected for this task, featuring AX-18A
motors. This robotic arm was
purchased, assembled, programmed,
and tested in-house. A new system was
formed around this robotic arm. The
basic idea of Chronos version 1 was to
have a robotic arm with at least six
degrees of freedom on a linear rail
system. The six degrees of freedom in
the robotic arm was accomplished by
installing two upgrade packs for the
robotic arm, increasing the standard four axis arm to a six axis system. The movement and positioning of the
payload mounted on the arm was determined using a range indicator and a Leap Motion Controller. All the data
from the Leap, range indicator, and payload are processed through a control computer. Additionally, position
feedback was used for the simulation. The system layout can be seen in Figure 5.
After about three months of programming and tinkering with the arm, it was dismissed as an option for Chronos.
There were multiple issues encountered with using this particular arm for our purposes. The first was the requirement to use a low-level, packet-based communication style in order to move in multiple directions simultaneously. This meant that very long programs were required to make the simplest of movements.
The most critical problem with the long programs was the time they took to run: there would be long pauses between maneuvers due to the computer processing time required. The second issue was with the motors themselves. Without any additional weight, gear grinding could be heard from the motors. Eventually, motors would simply fail completely. A failure was identified when a motor stopped moving while commands were being sent and could be heard whining. After multiple motors experienced failure, we disassembled a motor and discovered at least part of the root cause of the issue. The gears inside were completely stripped. The main driving gear in the motors is metal, while the remaining gears are plastic. In our case, the maneuvers and the stress on the motors caused the metal gear to strip the plastic gears.
Figure 5. Diagram of version 1 setup for Chronos
Figure 6. Images of Chronos version 1 showing the trolley and the robotic arm
After these two large issues were discovered, the robot arm was retired from Chronos. The trolley for this design
was kept and still remains in the current version, though some
modifications have been made. Images of Chronos version 1
can be seen in Figure 6.
The next major version of the spacecraft simulator for Chronos consisted of an in-house designed robotic arm featuring stepper motors. This design was passed over and never prototyped due to the complexity of the coordinate transformations and the stress placed on the motors by holding the payload and other motors at odd angles for long periods of time. A related problem is the holding torque of the motors: although a motor may be rated for a particular holding torque, we noticed that over time the motors started to slip, causing inaccuracies in the positioning. In addition, the motors would always be either holding or moving, causing them to overheat quickly.
The third version of the spacecraft simulator resembles a 3D
printer in design. This design has been kept; however, a simpler
version, version four, is built for mock-up. As seen in Figure 7,
version three consisted of four sets of small rail pairs, excluding
the trolley railing system. The payload is mounted in the section seen in the middle of the image and moves using a
belt system driven by stepper motors. This design is still a current iteration because it has many advantages. One
large advantage is having complete support on the top and bottom of the rails providing a great deal of stability. This
lessens the impact of shaking and wobbling as the payload is moved around.
The current version of Chronos is similar to version three, but with a focus on having a lighter top. This allows
better performance out of the motors. This version is still a constant work-in-progress; however, a mock-up has been
rapid prototyped in order to speed up the process of testing and designing improvements. The mock-up can be seen
in Figure 8.
B. Experimental Apparatus
1. Motors
The spacecraft simulator emulates the in-orbit range of motion of the payload instruments using Arduino technology and position tracking from sensors. Movement of the spacecraft simulator robot requires motors to drive the robotic trolley. After researching motors that allowed accurate movements within Chronos's budget, stepper motors were chosen. The NEMA 17 stepper motors are manufactured with high precision and were originally designed for LulzBot 3D printers from Aleph Objects. NEMA stepper motors provide finite and exact movements when 3D printing, and their micro-movements allow Chronos's robotic system to achieve micro-stepping. This capability, used with Arduino microcontroller boards, makes the motors preferable to DC motors.
Figure 7. Chronos version 3 CAD model
Because of their two-wire polarity, DC motors can be connected to an Arduino board in two ways. The first configuration connects the positive red wire to a digital output on the Arduino board, while the ground wire connects to the corresponding ground pin. Depending on the motor selected, this produces a clockwise rotation, whereas reversing the polarity creates a counter-clockwise rotation. The second configuration is made by switching the wires to the opposing pins: the DC positive wire to ground and the DC ground wire to the digital output. Unfortunately, regular DC motors do not provide large amounts of torque, nor do they offer additional hardware components useful for the spacecraft simulation. This is why the stepper motor design is chosen: it has supporting hardware that can improve the accuracy of motor control.
Augmenting the NEMA 17 with a compatible motor driver produces more available steps within a full revolution. The compatible DRV8834 motor driver decreases the step size to a chosen amount, providing six step-mode configurations. These modes are known as microstep resolutions (step sizes) and can be seen in Table 2.
Table 2. The microstep resolution from the motor driver
M0 M1 Microstep Resolution
Low Low Full step
High Low Half step
Floating Low 1/4 step
Low High 1/8 step
High High 1/16 step
Floating High 1/32 step
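The M0/M1 settings in Table 2 can be captured as a simple lookup table. This is an illustrative sketch, not driver firmware; "floating" refers to leaving the M0 pin disconnected.

```python
# Microstep resolution selected by the DRV8834 M0/M1 pins (Table 2).
# Keys are (M0, M1); values are the microstep divisor n.
MICROSTEP_MODES = {
    ("low", "low"): 1,        # full step
    ("high", "low"): 2,       # half step
    ("floating", "low"): 4,
    ("low", "high"): 8,
    ("high", "high"): 16,
    ("floating", "high"): 32,
}

def microstep_divisor(m0, m1):
    """Return the microstep divisor n for a given M0/M1 pin configuration."""
    return MICROSTEP_MODES[(m0.lower(), m1.lower())]

print(microstep_divisor("floating", "high"))  # 32
```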
Precise movements are required when simulating zero-gravity environments. NEMA 17 stepper motors achieve 200 full steps within 360 degrees; for this motor, as for most others, one full step equals 1.8 degrees. The maximum number of steps available in one revolution, R, is therefore easy to determine, as shown in Eq. 1, where n represents the desired step mode (microstep divisor).

R = (360° / 1.8°) × n     (1)
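Eq. 1 can be evaluated directly; the 1.8° full-step angle is the NEMA 17 value from the text, and n is the microstep divisor from Table 2.

```python
# Steps per revolution for a stepper motor at a given microstep mode (Eq. 1).
def steps_per_revolution(step_angle_deg=1.8, n=1):
    """R = (360 / step_angle) * n, where n is the microstep divisor."""
    return int(round(360.0 / step_angle_deg * n))

# Full step gives 200 steps/rev; 1/32 microstepping gives 6400 steps/rev.
print(steps_per_revolution())       # 200
print(steps_per_revolution(n=32))   # 6400
```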
Arduino-compatible motors operate through pulse width modulation (PWM), and utilizing this modulation requires an understanding of how the process functions. PWM uses duty cycles to control the amount of power the motor receives. Variable on/off phases are controlled through the motors to supply micro-steps. These phases operate by emitting a signal for its respective period, the actual time it takes the signal to complete one on-and-off cycle. The duty cycle describes the running time versus the idle time of a device; by varying the on and off times of the emitted signal, different duty cycles can be achieved. This is important because it allows multiple micro-step modes to be achieved for fine-tuning movements. In Eq. 2 the duty cycle D equals the on time I divided by the full period (on time I plus off time O), expressed as a percentage.

D = I / (I + O) × 100%     (2)
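Eq. 2 is a one-line computation; the millisecond timings below are arbitrary illustrative values.

```python
# Duty cycle (Eq. 2): D = I / (I + O) * 100%, with I the on time and O the off time.
def duty_cycle_percent(on_time, off_time):
    return on_time / (on_time + off_time) * 100.0

# A signal that is on for 1 ms and off for 3 ms runs at a 25% duty cycle.
print(duty_cycle_percent(1e-3, 3e-3))  # 25.0
```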
Equipping the NEMA 17 motors with this capability requires proper current-limiting techniques; failure to follow them can result in irreparable damage to the motors or drivers. Achieving the step-rate goal requires Chronos to be able to move in 1/32 step mode, the finest step mode the motors and driver provide and the optimum for a space simulation mission. High microstep modes such as 1/32 require a motor power supply voltage that is higher than the motor's rated voltage; by ensuring the supply voltage for the motor driver exceeds the rated voltage of the stepper motor, the various micro-step modes can be achieved. The default supply ranges are given in Table 3.
Table 3. Supply ranges
                               Min     Max
Motor power supply (Vm)        2.5 V   10.8 V
Current limit (set via Vref)   1 A     2 A
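On typical carrier boards for this driver, the current limit is set by measuring the voltage at the Vref pin; the factor of two used below is the convention of the Pololu DRV8834 carrier and is an assumption that should be checked against the board actually in use.

```python
# Approximate current limit for a DRV8834 carrier board: I_limit ≈ 2 * Vref.
# The factor of 2 follows the Pololu carrier convention; verify for your board.
def current_limit_amps(vref_volts, scale=2.0):
    return scale * vref_volts

# A Vref of 0.75 V would correspond to roughly a 1.5 A current limit.
print(current_limit_amps(0.75))
```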
The DRV8834 is a current-limited chopper stepper driver, which makes adjusting the step modes an easier process. On the DRV8834 the stepper motors can be controlled in either phase/enable mode or indexer mode; these are settings the user selects by connecting to the CONFIG pin on the driver. Currently, a total of six motors, manipulated with Arduinos, are used in the spacecraft simulation to allow three-dimensional maneuvering.
C. Motor Controllers
Motor operation requires the use of a simple control board. Arduinos provide both sensing and control capability on a single microcontroller board, and the platform is backed by the open-source community, which makes a variation of nearly every product imaginable available to some extent. Arduino integration with Chronos enables simple control while providing compact and lightweight control boards. The Arduino programming language is an implementation of Wiring, based on the Processing multimedia programming environment. Two types of Arduinos are chosen for Chronos: the Arduino Mega 2560 and the Arduino Micro. The size and cost of each of these boards is detailed in Table 4. Both Arduinos run on the 1.0.1 integrated development environment (IDE), which supports a full range of coding from LED blinking to external communication between microcontroller boards. The latter feature is used extensively by Chronos.
Table 4. Arduino boards used in Chronos
Arduino Board      Cost (€)  Dimensions (L x W) cm
Arduino Mega 2560  39        10.16 x 5.34
Arduino Micro      18        4.8 x 1.77
Both the Arduino Mega and the Arduino Micro offer a relatively easy plug-and-play design. They support
numerous add-on hardware and software components and extensions. The appropriate components and extensions
used are mentioned in following sections.
The Arduino Mega 2560 acts as the master command module in the hierarchy between the two Arduino types, with the Arduino Micros as its slave modules. A command module ensures that all motor controllers receive the same command nearly simultaneously; in software, the Arduino Mega 2560 eliminates the possibility of different motors executing opposing or altered commands. The Arduino Micros are needed primarily for their size: because they are mounted with the stepper motors, they add weight and occupy space on the spacecraft simulation system, and mounting multiple Arduino Mega boards there would not only be an inefficient use of space, but also more expensive than using their Arduino Micro counterparts.
Figure 8. The block diagram detailing the Arduino/motor connections
Master commands via the Arduino Mega 2560 primarily execute in the MATLAB/Simulink environment, made possible by the downloadable Arduino support package for MATLAB. Upon successful Simulink code setup, the Arduino Mega communicates with the Arduino Micros, so there exists a fluid data communication link between the user and the spacecraft simulator.
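The master-to-slave broadcast can be sketched as a small framed command with a checksum, so that every Micro can verify it received the same bytes. The frame layout, byte values, and function names here are illustrative assumptions, not the actual Chronos protocol.

```python
# Hypothetical command frame: [0xA5, motor_id, steps_lo, steps_hi, checksum].
def encode_command(motor_id, steps):
    """Pack a motor id and 16-bit step count into a 5-byte frame."""
    lo, hi = steps & 0xFF, (steps >> 8) & 0xFF
    checksum = (motor_id + lo + hi) & 0xFF
    return bytes([0xA5, motor_id, lo, hi, checksum])

def decode_command(frame):
    """Validate the start byte and checksum, then unpack the frame."""
    assert frame[0] == 0xA5, "bad start byte"
    motor_id, lo, hi, checksum = frame[1], frame[2], frame[3], frame[4]
    assert (motor_id + lo + hi) & 0xFF == checksum, "checksum mismatch"
    return motor_id, lo | (hi << 8)

# The master would send the same frame to every slave so all motors step together.
frame = encode_command(3, 1600)
print(decode_command(frame))  # (3, 1600)
```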
Slave commands via the Arduino Micro execute through the Arduino IDE. Unfortunately, MATLAB/Simulink does not provide support for the Arduino Micro microcontroller; support is available only for the Arduino Mega, Uno, and Nano. The possibility of creating a compatible Arduino Micro package for Chronos is in progress. In order to execute commands, one of several methods must be chosen for the Arduino Micro (detailed in the MATLAB/Simulink section). The software testing phases consist of the following steps.
Software testing on the Arduino boards begins with the Arduino Micro. The standard LED blinking test is performed to check the integrity of the board and to verify ease of programming Arduino-based code. Completion of the LED phase moves testing to the next step: stepper motor control, with manipulation of direction, speed, and precise step-rotation increments. Successful program execution enables basic simultaneous motor control after troubleshooting and voltage-level adjustments.
D. Rapid Prototyping (3D Printing)
One of the first studies that took place for this project was the practicality of using rapid prototyping for the manufacture of the components for Chronos. The largest concern was the capability of the MakerBot 2 to accurately reproduce dimensions, with an emphasis on circular features.
The system designed for this study is a motor mount for a roll, tilt, and pan system for Chronos. The particular component for this study is designed and modeled in CATIA V5. The first version of the computer-aided design (CAD) model follows the specifications given by the manufacturer of the motor. For the second iteration, the motors are in house, so true values for the dimensions can be incorporated into the design, together with some features that are not in the manufacturer's drawings. The third iteration focuses on accounting for printing error, the CAD model now being accurate, and in the fourth and final iteration the last printing errors are accounted for. In each iteration, after the CATIA model is improved, the file is exported as a STereoLithography (STL) file with a tessellation tolerance of 0.001. This file is then imported into Autodesk Meshmixer for adjustments to the CAD file (this step does not occur in the first iteration), including the addition of supports and checking for faults in the STL file. Finally, the modified files are imported into the appropriate printer's accompanying software: MakerWare for the MakerBot 2 and Catalyst EX for the uPrint. This is the final step, where the part is optimally oriented and rafting or supports can be added; changes to the print quality can also be made here.
Figure 9. The 3D printed models and accompanying CAD models of the studied piece
The data is analyzed using both measurements and fit checks. Calipers are used to precisely measure the
dimensions of the printed prototype. These numbers are compared to the values of the original CAD model. A fit
test is performed by trying the motor in each of the printed prototypes. Using the fit test, elements that need to be
adjusted and accounted for can easily be spotted. The data from each of the measurements can be seen in Table 5.
Table 5. Data collected for the printed versus CAD model comparison

                              Center   Screw    Mount    Avg. Wall   Length   Width    Height
                              Radius   Radius   Radius   Thickness   (mm)     (mm)     (mm)
                              (mm)     (mm)     (mm)     (mm)
Version 1   CAD               2.500    0.750    1.000    2.000       46.30    46.30    36.00
            Printed           1.870    0.500    0.800    2.045       45.76    45.60    35.69
            % Difference      25.20    33.33    20.00    2.250       1.166    1.512    0.861
Version 2   CAD               2.500    0.750    1.000    4.000       50.30    50.30    36.00
            Printed           2.360    0.500    0.800    4.013       49.93    49.93    35.70
            % Difference      5.600    33.33    20.00    0.313       0.736    0.736    0.833
Version 3   CAD               2.500    0.750    1.000    4.000       51.67    51.67    51.45
            Printed           2.230    1.100    0.800    4.250       51.02    51.09    51.15
            % Difference      18.80    46.67    20.00    6.250       1.258    1.123    0.583
Average % Difference          13.87    37.78    17.33    2.938       1.265    1.123    0.759
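The % Difference rows in Table 5 are consistent with the usual relative-difference formula; a quick sketch (Python; the function name is illustrative) reproduces the Version 1 entries:

```python
def pct_difference(cad, printed):
    """Percent difference of a printed dimension relative to its CAD
    value (assumed formula; it reproduces the table's entries)."""
    return abs(cad - printed) / cad * 100.0

# Version 1 center radius: CAD 2.500 mm vs printed 1.870 mm
print(round(pct_difference(2.500, 1.870), 2))   # 25.2
# Version 1 screw radius: CAD 0.750 mm vs printed 0.500 mm
print(round(pct_difference(0.750, 0.500), 2))   # 33.33
```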
The biggest finding from this study is that circular features print with lower accuracy than their straight-sided
counterparts. The height of the part demonstrates the most accurate print quality. The overarching finding is that the
printed part is smaller overall than the original CAD model. This shrinkage depends on the feature geometry;
however, with the exception of a couple of outlying features, each feature is generally smaller than its specified
dimension. The average percent error is 22.99% for the holes and 1.52% for the generally straight features, yielding
an overall percent error of 10.72%. The software makes a noticeable difference in the print quality of the part. The
addition of Autodesk Meshmixer allows the original STL file exported from CATIA to be repaired: the program
fixes holes in the file and provides improved surface smoothness. This fix is obvious between the first and second
prints. The Catalyst program for the uPrint allows customization of the print setup, making it possible to specify
infill, support structures, and the tooling path directly. A few conclusions are apparent from this study. First, there is
a definite inaccuracy when printing with the MakerBot Replicator 2, and the part is typically smaller than modeled.
Second, non-essential support holes are better modeled larger so that screws still fit, since washers can be used for a
hole that is too large. The calculated average percent error of 10.72% closely matches the approximately 10%
shrinkage described for the MakerBot Replicator 2. The prototypes modeled here are actually usable parts for the
purposes of this study; they provide the strength and functionality required by the research this study supports.
E. Testing Setup (Prototype)
The first completed Chronos prototype utilizes a combination of hardware components and various software
programs. Chronos's hardware components are isolated and thoroughly tested on a breadboard before being
implemented on the fully configured mechanism. The Arduino wiring diagram can be seen in Figure 10. An
Arduino Micro and a DRV8834 stepper motor driver are connected to a single breadboard, giving an isolated
system in which a single Arduino Micro completely controls one stepper motor. Arduinos can be powered two
ways: by drawing power from the USB connection, which is normally used when uploading code, or by taking
power from a pin connection on the breadboard. The Mastech HY3005F-3 DC power supply is used throughout the
isolated prototyping phase. Expanding the isolated system yields six Arduino Micros, motor drivers, and stepper
motors configured on a breadboard.
Progression from the isolated system to a fully operational prototype introduces new Arduino-compatible
devices: Arduino sensors. The sensors are a fail-safe to prevent derailing of the trolley system. By acting as a
collision avoidance device, they make the trolley a safe movable system that protects the equipment and the
Chronos team, and they allow the focus to remain on operational software that easily incorporates the sensor data
into each program without hazard.
F. MATLAB/Simulink Integration
Chronos requires accurate control over all of its lower robotic systems in order to produce an effective spacecraft
attitude control simulation. Chronos will rely on precise movements to track the tumbling RSO. These movements
come from the combination of software that focuses on GUI control in the MATLAB / Simulink environment.
Successful software allows complete control of the stepper motors. Simulink provides quick GUI code
generation for stepper motor direction and speed control. The Arduino Mega 2560 and Arduino Micro provide
Arduino IDE support, but only the Arduino Mega has support for MATLAB / Simulink control. Therefore the
Arduino Mega must relay the commands to the Arduino Micros. There are numerous methods to ensure the relay
executes correctly, which makes this step a work in progress toward optimum performance. Two main methods are
detailed below.
One option uploads the Arduino Micro's code to the Arduino Mega by calling functions in Simulink over a Wi-Fi
network. This straightforward approach appears to present the fewest issues, since the user codes directly in the
Arduino environment. Variations of this method allow Wi-Fi to be interchanged with Ethernet or Bluetooth
communications; thanks to Arduino's many compatible hardware shields and components, swapping the
communication device to Bluetooth or Ethernet requires little to no extra lines of code. The flaw in this software
design lies within the Arduino side: by focusing on the Arduino environment, Simulink becomes an added medium
between the user and the motors. Since Simulink must convert the Arduino commands to custom Simulink GUI
functions, program writing becomes time consuming, and the more mediums present, the higher the chance of
failure.
Another option places motor control in the Arduino Mega entirely and sends the configured command signal out
to the Micros. In Simulink, the Mega schematic is configured and the exported commands are broadcast to the
Arduino Micros, which are programmed to act as receiver devices that translate the Simulink schematics into
executable Arduino code. This method is the preferred choice because it generates commands in real time more
quickly than working in the Arduino environment; however, it creates room for error once the code reaches the
Micros, since a support package for the Arduino Micro currently isn't available and the boards might not be capable
of interpreting and executing the commands successfully.
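The Mega-to-Micro relay described above reduces to framing a command on the master and parsing it on each receiver. A minimal sketch of such a scheme follows (Python; the `<M#:DIR:STEPS>` frame format is a hypothetical illustration, not the team's actual protocol):

```python
def encode_command(motor_id, direction, steps):
    """Frame a motor command as ASCII. The <M#:DIR:STEPS> format is a
    hypothetical illustration, not the team's actual protocol."""
    if direction not in ("CW", "CCW"):
        raise ValueError("direction must be CW or CCW")
    return f"<M{motor_id}:{direction}:{steps}>"

def decode_command(frame):
    """Receiver-side inverse of encode_command, as an Arduino Micro
    acting as a translator would run it."""
    motor, direction, steps = frame.strip("<>").split(":")
    return int(motor[1:]), direction, int(steps)

frame = encode_command(3, "CW", 12800)
print(frame)                  # <M3:CW:12800>
print(decode_command(frame))  # (3, 'CW', 12800)
```

Keeping the frame format this simple means the Micro-side parser stays small, which matters on a board without MATLAB support.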
G. RSO Simulation/Modeling
The RSO simulator utilizes the Stewart platform, or hexapod, design to provide a six degree of freedom platform.
In the mid-1900s the combined efforts of Gough and D. Stewart led to the creation of a parallel manipulator known
as a "hexapod." The platform maneuvers through the maximum degrees of motion available from its actuators:
motors that drive the moving platform through six movable struts. Incorporating this design in the simulation aims
to mimic the free-fall tumble of objects in low Earth orbit via translation along the x, y, and z axes while
simultaneously providing pitch, yaw, and roll capability.
The Stewart platform is an ideal robot for Chronos since a majority of its parts can be 3D printed and assembled
easily. The platform simulates an RSO tumbling on a stage controlled by Arduino technology. After assembly,
modeled rocket upper stages will be placed individually upon the platform so the tumbling simulation can occur.
These upper stages are modeled to scale; great care and time are dedicated to producing realistic and precise replicas
in CATIA with minimum human error. The available data for each upper stage are obtained from accredited sources
that link back to NASA's manuals and records. A picture of one of the upper stages that can be modeled, the Delta
IV, can be seen in Figure 11. Whenever certain component measurements are not provided, a combination of
formulas and engineering intuition is used to complete the model.
Figure 10. The wiring diagram for the Arduino boards
Figure 11. Image of Delta IV upper stage modeled in CAD
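The Stewart platform's inverse kinematics are what the Arduino controllers ultimately need: for a commanded pose, each leg length is the distance from a base anchor to the corresponding rotated and translated platform anchor. A sketch follows (Python; the anchor geometry is an illustrative assumption, not Chronos's actual dimensions):

```python
import math

def stewart_leg_lengths(base_pts, plat_pts, translation, rpy):
    """Inverse kinematics of a Stewart platform: leg i runs from base
    anchor b_i to the rotated + translated platform anchor p_i."""
    roll, pitch, yaw = rpy
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    # ZYX (yaw-pitch-roll) rotation matrix
    R = [[cy*cp, cy*sp*sr - sy*cr, cy*sp*cr + sy*sr],
         [sy*cp, sy*sp*sr + cy*cr, sy*sp*cr - cy*sr],
         [-sp,   cp*sr,            cp*cr]]
    tx, ty, tz = translation
    lengths = []
    for b, p in zip(base_pts, plat_pts):
        world = [sum(R[i][j] * p[j] for j in range(3)) + (tx, ty, tz)[i]
                 for i in range(3)]
        lengths.append(math.dist(world, b))
    return lengths

# Trivial check with an illustrative geometry: identical anchor rings,
# platform raised 0.2 m with no rotation -> every leg is 0.2 m long.
base = [(math.cos(a), math.sin(a), 0.0) for a in
        (0, 1.05, 2.09, 3.14, 4.19, 5.24)]
print(stewart_leg_lengths(base, base, (0, 0, 0.2), (0, 0, 0)))
```

In practice each computed leg length would be converted to a motor step count and relayed to the corresponding Arduino Micro.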
H. Payload Instruments/Emulator
In order to remain cost effective, the testbed features payload components that are similar (for test purposes) to,
but lower priced than, the ARAPAIMA flight hardware. The testbed consists of three instruments, the FLIR
MLR100, the Sentech STC-CL33A, and the Sentech STC-CL202A, as well as a PC/104-Plus computer board and
two Phoenix frame grabbers.
Version 1 of the payload emulator was designed to house the Sentech STC-CL33A, the MLR100 laser
rangefinder (LRF), and a Logitech webcam as a stand-in for the infrared camera. The LRF comes without a
protective box, which makes it very sensitive to shock; therefore, the entire MLR100 is housed inside the emulator.
In testing the payload emulator, having an infrared-spectrum camera is important since it can be used to spot the
LRF bloom on the target. With this capability, the camera can be used to ensure that the LRF is properly aligned
within the emulator and that the bloom is not larger than the target, which would produce erroneous measurements.
The CAD model of version 1 of the emulator can be seen in Figure 12.
The emulator was 3D printed in the Spacecraft Development Lab. Figure 13 shows the printed emulator with the
LRF and two cameras attached.
Figure 12 (left). CAD model of the first version of the payload emulator
Figure 13 (right). The printed first version of the payload emulator with the components installed
The second version of the payload emulator is a current work-in-progress. It will house the STC-CL33A, STC-
CL202A, MLR100, VersaLogic CPU PC/104-Plus module, and two Active Silicon Phoenix frame grabbers. This
version will be outfitted to mount directly to the spacecraft simulator. Space will additionally be allocated to allow
for both using the emulator with and without a separate computer to give inputs or analyze outputs.
Integration of the payload computer with the frame grabber and the Linux operating system has been addressed,
and the preliminary results are encouraging. An Active Silicon Phoenix frame grabber and its software driver have
been integrated with a VersaLogic CPU PC/104-Plus module. Preliminary tests with a Sentech STC-CL33A
monochrome camera, with a resolution of 648x494 pixels, are conducted to check the operation of the integrated
components. The STC-CL33A camera is used as a substitute for the IR camera in the payload integration work in
the laboratory.
V. Point Cloud Reconstruction Simulations
Simulations of the 3D imaging phase of the mission are run with scanning attitude motion and an open outer
attitude loop. The resulting point clouds are shown in Figure 15 along with the reconstructed RSO shapes
corresponding to each of them. The relative orbit is a circle of 250 m radius, and it is assumed that the LRF
measurements and the relative position measurements between the chaser nanosat and the target RSO are ideal. The
pulse repetition frequency (PRF) of the LRF was varied from 0.1 to 2 Hz to create point clouds of increasing sizes.
The total imaging time was fixed at 26 hours, regardless of PRF. The low, medium, and high resolution clouds
have 3306, 11241, and 89511 points respectively. It is interesting to note that the shape reconstructed from the low
resolution point cloud already captures the prominent features of the RSO. A coarse attitude state of the RSO with
respect to the chaser body frame can be estimated and transformed to an inertial frame based on the attitude solution
of the chaser. Higher resolution point clouds resolve the features of the RSO, which can then be used to determine
their relative locations with respect to the chaser body frame.
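The point-cloud sizes follow from the PRF and the fixed imaging window. A back-of-envelope sketch (Python; the `hit_fraction` parameter is an assumption, since the quoted counts imply only a fraction of pulses actually strike the RSO):

```python
def expected_points(prf_hz, imaging_hours, hit_fraction=1.0):
    """Upper bound on point-cloud size: one candidate point per LRF
    pulse over the imaging window. hit_fraction (an assumption) scales
    for pulses that miss the target."""
    return prf_hz * imaging_hours * 3600.0 * hit_fraction

# 26 h at the 0.1 Hz low end gives at most ~9360 pulses; the reported
# 3306-point cloud implies roughly a third of pulses strike the RSO.
print(expected_points(0.1, 26))
```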
In addition to point cloud reconstruction, a comparative study on RSO relative trajectories will take place. The
testbed will simulate the same trajectory as the MATLAB simulation to establish a correlation between images
taken on the testbed and images from the MATLAB code. As images are grabbed by the cameras in the testbed,
images taken at particular trajectories will be saved. At the same time, several images of the RSO will be extracted
from the MATLAB simulations at the proper relative Cartesian coordinates. The relative trajectories between the
RSO and the chaser (ARAPAIMA) in low Earth orbit (LEO) can be seen in Figure 14. These images will then be
compared to the images taken in the testbed to determine the validity of the algorithm as well as the motion tracking
and position accuracy of the testbed.
VI. Motion Tracking System
I. Microsoft Kinect
The Microsoft Kinect provides several useful input devices for object tracking and distance sensing, which are
used to monitor both the RSO and the simulation system as a whole. These input devices are shown below in Fig.
16.
Figure 15. Images of the point cloud reconstruction for each number of
strikes
Figure 14. Graph of the LEO proximity trajectory
Figure 16. The Microsoft Kinect has input devices for both visible light and infrared/depth capture. The
Kinect also has a microphone array below the sensing unit.
This experiment made use of the Color Sensor and the IR Depth Sensor, both of which have a resolution of
640x480 pixels. The range of the depth sensor in default mode is 0.8 to 4 meters (2.6 to 13.1 feet), and in near mode,
the range is 0.5 to 3 meters (1.6 to 9.8 feet). Regarding the accuracy of depth measurements, the error has been
experimentally shown to increase quadratically with respect to distance, and, “ranges from a few millimeters up to
about 4 cm at the maximum range of the sensor” [3]. Fig. 17, below, shows how the Kinect represents depth.
Figure 17. The Kinect measures depth from the plane parallel to the device.
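The quadratic error growth quoted above can be captured in a one-line model. A sketch follows (Python; the coefficient `k` is an assumption chosen so the error reaches roughly 4 cm at the 4 m default-mode maximum range, consistent with reference [3]):

```python
def kinect_depth_error_m(z_m, k=0.0025):
    """Quadratic depth-error model, error ~ k * z^2. The coefficient k
    is an assumption chosen so the error is ~4 cm at the 4 m
    default-mode maximum range quoted in the text."""
    return k * z_m ** 2

print(kinect_depth_error_m(4.0))  # about 0.04 m at maximum range
print(kinect_depth_error_m(1.0))  # a few millimeters up close
```

A model like this is useful for weighting Kinect measurements by how far away the RSO is at the time of capture.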
The depth image and color image feeds are fed to Simulink using a set of third-party Simulink blocks called
Simulink Support for Kinect. The objective was to use Simulink and the Kinect to track an RSO, which required the
use of several blocks from the Simulink Computer Vision System Toolbox.
The model shown in Fig. 18 flows as follows:
1) The first cluster of blocks uses Simulink Support for Kinect to get depth and color images from the Kinect.
2) The greenscale depth image is converted to a grayscale intensity image.
3) The registered depth intensity image is converted to a Boolean (black/white) image. The block in Simulink
responsible for creating this image uses Otsu's method to automatically detect the threshold.
4) A blob analysis is performed on the Boolean image. This particular block has a few custom properties: (a)
the blob area must be between 100 and 1000 pixels (to avoid detecting very small or very large objects),
and (b) blobs touching the image border are excluded. It is worth noting that the blob analysis block has
several outputs, including area, centroid, and perimeter, but this experiment only made use of the
bounding box (BBox).
5) The bounding boxes from the blob analysis are drawn on the color image. In Fig. 19 below, these boxes
are shaded red.
6) The last cluster of blocks displays the images on the screen. A sample of these images is shown below in
Fig. 19.
Figure 19. Images produced by Simulink, using the Kinect video feeds. From left to right: Registered Depth,
BW, and RGB + Depth Blobs. For clarity, each of these images has been modified with the RSO boxed in
green.
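Step 3's automatic thresholding uses Otsu's method, which picks the gray level that maximizes the between-class variance of the histogram. A standalone sketch follows (Python; this is an illustrative reimplementation, not the Simulink block itself):

```python
def otsu_threshold(hist):
    """Otsu's method on a grayscale histogram: return the bin index t
    that maximizes between-class variance, so pixels > t are treated
    as foreground (the blob area filter then runs on that mask)."""
    total = sum(hist)
    sum_all = sum(i * h for i, h in enumerate(hist))
    w_bg = sum_bg = 0.0
    best_t, best_var = 0, -1.0
    for t, h in enumerate(hist):
        w_bg += h
        if w_bg == 0:
            continue
        w_fg = total - w_bg
        if w_fg == 0:
            break
        sum_bg += t * h
        mu_bg = sum_bg / w_bg
        mu_fg = (sum_all - sum_bg) / w_fg
        var = w_bg * w_fg * (mu_bg - mu_fg) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t

# Bimodal 8-bin histogram: dark pixels in bins 0-1, bright in bins 6-7
hist = [10, 8, 0, 0, 0, 0, 9, 11]
print(otsu_threshold(hist))  # 1 (splits the dark and bright modes)
```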
As seen in the figure above, the RSO is in one of the shaded boxes identified by the blob analysis. The current
work is to isolate the RSO and to use depth data from the Kinect to acquire meaningful information about the
position of the RSO, to help verify the accuracy of the RSO simulation system.
J. RSO Suspension Apparatus
To isolate the RSO as shown above, an apparatus with a minimal cross-sectional area was required. This
experiment achieved that by using the Shakespeare MSSP46-1UL to suspend the RSO in mid-air. The device comes
equipped with a manual actuator for manipulating the RSO location.
K. Motion Tracking Limitations
The original setup went through several iterations before reaching the one presented here. One aspect of earlier
designs called for using several Kinect sensors to triangulate the location of the RSO and obtain more accurate
depth data. While the Kinect SDK supports multiple Kinects, the Simulink blocks that were used do not; thus, only
one Kinect was used.
Figure 18. Simulink block diagram for the Microsoft Kinect object detection system.
Figure 20. Shakespeare MSSP46-1UL RSO suspension apparatus. The manual actuator is boxed in red, and
the attached RSO is boxed in green.
The Leap Motion Controller was also to be incorporated in the design. The Leap uses infrared to determine the
position of an object, much like the Kinect; it has, however, several distinct differences. The Leap is exclusively an
IR tracking device; it does not have a color camera or microphone array like the Kinect. Additionally, the Leap is
designed to track hands/fingers and finger-like tools such as pens or wands (known as "pointables" in the API
documentation). Because many of our RSO models are not pointables, they cannot be tracked using the Leap
Motion.
VII. System Accuracy
Accounting for the accuracy of Chronos is a factor in the design choices, both hardware and software, throughout
the process. A central driver of accuracy is having a simulator with six degrees of freedom (DOF); a minimum of
six degrees allows each direction of spacecraft movement to be simulated. The search for stepper motors is based on
the accuracy and precision of the motor. The current NEMA 17 stepper motors have 200 steps per revolution, an
average value among stepper motors; however, paired with the motor driver, they allow 1/64 microstepping, so the
same motors can achieve 12800 steps per revolution. This provides Chronos with excellent motor accuracy.
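The step-resolution arithmetic above works out as follows (a quick sketch using the values quoted in the text):

```python
# Angular resolution from the motor and driver values quoted above
FULL_STEPS_PER_REV = 200   # NEMA 17 full steps per revolution
MICROSTEP = 64             # 1/64 microstepping via the motor driver

steps_per_rev = FULL_STEPS_PER_REV * MICROSTEP
deg_per_step = 360.0 / steps_per_rev
print(steps_per_rev)  # 12800
print(deg_per_step)   # 0.028125 degrees per microstep
```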
Another large factor in the accuracy is the ability to replicate the lighting conditions that would be experienced
in space. Through discussions with people connected with some of the large proximity operations simulators
previously mentioned, it is noted that replicating the lighting conditions is extremely difficult and that additional
processes are generally necessary to simulate the conditions more accurately. Thus, along with a motorized and
intensity-controlled lighting system, an additional image touch-up stage will take place. This will be an optional
function in which the software can alter the images to better mock the light conditions. These touch-ups would then
be validated using outside resources such as PRISMA images.
Checking and testing are performed constantly to reduce the number of error-causing sources throughout the
system. These sources include motor slippage, which is controlled through careful programming and placement of
large-mass items close to the centers of the motors. Another large source currently being controlled is within the
motion tracking system. Since in some cases feedback on the location of the payload will be sent back to the
simulator for maneuvers, among other things, the location data must be correct. Thus, to increase the accuracy of the
position data, multiple Kinects are used with data comparison algorithms to improve the overall results. As with any
mechanical device, there are plenty of places for errors to occur. The goal is to minimize errors where possible and
account for any other consistent errors, helping to maintain accuracy on a low-fidelity simulator.
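One standard form such a data-comparison algorithm can take is an inverse-variance weighted average of the sensors' estimates. A sketch follows (Python; this is an illustrative fusion rule under the assumption of independent Gaussian errors, not the team's actual algorithm):

```python
def fuse_positions(estimates):
    """Inverse-variance weighted average of (value, variance) pairs
    from several sensors: lower-variance measurements dominate."""
    num = sum(x / var for x, var in estimates)
    den = sum(1.0 / var for x, var in estimates)
    return num / den

# Two Kinects report slightly different depths; the lower-variance
# (more trusted) measurement pulls the fused estimate toward itself.
print(fuse_positions([(2.00, 0.0001), (2.10, 0.0009)]))  # 2.01
```

With a 9:1 weight ratio the fused depth lands a tenth of the way from the trusted sensor toward the noisier one, which is the behavior a multi-Kinect comparison would want.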
VIII. Requirements Verification
Simulations, both physical tests and software based simulations, are used for many reasons, primarily as tools to
verify mission requirements as well as the overall project design. A majority of these requirements are derived from
mission objectives. Chronos will be used as a verification tool for many of ARAPAIMA's upper (science) and lower
(subsystem) level requirements. The science requirements are derived from the mission objectives, and are what
drives the science algorithms that Chronos will be testing. Chronos will also be used to verify ARAPAIMA's
Payload, Attitude Determination & Control (ADCS)/Guidance Navigation & Control (GNC), and Software
requirements and design choices.
In order to verify the requirements, Chronos will conduct what is called a “day in the life” simulation for
ARAPAIMA. This simulation will conduct the same maneuvers, utilize the same components for the same amount
of time, and will conduct the same mission with the same parameters as ARAPAIMA will complete while in space
and on mission. The requirements for the science algorithms drive many of the requirements for Payload and
ADCS/GNC.
These subsystem requirements are focused on requiring minimum capabilities for the components. For Payload,
these capabilities focus on selecting components that have the ability to gather accurate information for the
algorithms for successful imaging results. The Software requirements are written in order to assist the design of all
software used on ARAPAIMA. Chronos will provide the Flight Software (FSW) team the ability to test the software
using hardware on a real time basis. These tests allow the FSW team to make accurate adjustments to the code to
improve the final version which will be loaded onto ARAPAIMA.
Chronos will also be used to verify the ADCS simulations run by the ADCS team. These simulations are
completed in software and simulate the ADCS/GNC algorithms that will maneuver ARAPAIMA. Chronos will
allow the results of the simulations to be tested using a physical proximity operations simulator. During tests,
position data, including pointing location, will be collected and compared to the results of the software simulations.
If both simulations yield the same results, the ADCS simulation results are verified as accurate and can continue to
be used. The ADCS simulations will progress from individual maneuvers to compiling them into the full mission
operation maneuvers; Chronos will have the ability to test both individual maneuvers and a list of maneuvers.
With respect to the verification of the design choices, Chronos will test the capabilities of each component, and
the software associated with each component. Doing so will determine how each component interacts and operates
in the “day in the life” simulation.
Chronos will also be used as a risk mitigation method to reduce risk associated with the subsystems. The
ARAPAIMA systems engineering team, in collaboration with the entire team, has compiled a list of off-nominal
events that would have detrimental effects on the mission. Some of these risks include partial or full failure of
payload hardware, as well as of the software associated with all components that will be tested with Chronos.
Simulation results will be used not only to reduce the likelihood and impact of some of these risks, but also to
uncover potential new risks to be added to the list.
The Concept of Operations (ConOps) for ARAPAIMA details the mission timeline from a science point of view.
The ConOps details the orbits that ARAPAIMA will conduct in order to achieve mission success. In between the
orbits, there are decision points where the spacecraft will wait for ground approval before continuing. Chronos will
be used to test how the science algorithms operate throughout the mission timeline as described in the ConOps. The
ConOps can be seen in Figure 21.
IX. Future Work
The team has made crucial developments and progress on the design of Chronos up to this point. Once the first
official, fully operational version of Chronos is finalized, there are many plans for future work with Chronos.
L. Isochronal Implementation
The original idea for Chronos is a system that operates in real time, with negligible delay in response, feedback,
and reaction times. A soft real-time standard is chosen over the hard real-time standard for practicality: with low-end
components, an occasionally delayed signal is reasonable. Thus it is determined that the focus should shift toward
developing an isochronal system, isochronal meaning uniform in time, of equal duration, or recurring at regular
intervals. Here the goal is to design a system in which the delays are equal and consistent rather than a system that
has no delays at all.
Figure 21. ARAPAIMA ConOps diagram.
Throughout the process of programming the Arduino boards with the motors, it became apparent that there is a
slight delay between when the signal is sent to the motor and when the motor moves. This led to the realization that
implementing a full real-time system may be out of reach with the current components.
To determine the effect of the delays in the system, an additional study is planned. This study will use six
Arduino Micro boards with motor drivers to control six individual motors. A few different conditions will be tested,
including the addition of components and changes in the programming itself. Current plans include testing the
signal delays with drastically different wire lengths, the delay when using XBee modules for wireless
communication between the boards, and using real-time clocks and time-stamped commands to try to reduce the
difference in motor response times. The delays in the system will be measured with an oscilloscope to study the
offsets of the electrical signals. The results of this study could lead to the implementation of electrical signal
synchronization in the design.
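The oscilloscope measurements reduce to comparing step-pulse edge timestamps across channels. A sketch of that post-processing follows (Python; the function name and sample data are illustrative assumptions):

```python
def channel_offsets(edge_times):
    """Per-channel mean offset from the earliest channel, given lists
    of step-pulse edge timestamps (seconds) captured per motor."""
    means = [sum(t) / len(t) for t in edge_times]
    ref = min(means)
    return [m - ref for m in means]

# Three motors stepping at the same rate, two with a constant lag
ch = [[0.000, 0.010, 0.020],
      [0.002, 0.012, 0.022],
      [0.005, 0.015, 0.025]]
print(channel_offsets(ch))  # roughly [0.0, 0.002, 0.005]
```

Consistent nonzero offsets like these would be the signature of an isochronal (equal, repeatable delay) system rather than a truly delay-free one.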
In addition, this study will allow the authors to determine the type of dynamical phenomena occurring within
the system, which could include lag synchronization, generalized synchronization, and phase synchronization. The
goal is to ensure that no lag synchronization is occurring, since the zero-lag state is unstable. Achronal
synchronization may also occur, which could prove difficult to account for throughout the system.
X. Plug-and-Play Spacecraft Simulations
Currently, we only have access to the ADCS simulations from the ARAPAIMA team; however, Chronos will be
designed to be a universal system. The idea is that once all of the Simulink programming is complete, we should be
able to use a wide variety of payloads on Chronos with the accompanying ADCS software. The software should be
simple to tie into Chronos, with minimal adjustment on both ends to allow them to work together virtually
seamlessly.
In order to achieve compatibility with general ADCS algorithms, the system designed for Chronos needs to be
robust while maintaining simplicity. The code also needs to be well documented to ensure that any problems that
arise can be quickly mitigated.
With Chronos developed as a universal system, different cubesat projects gain the ability to test payload
integration, flight software, and algorithms in a way that is not currently available to standard university students
and projects. This allows more complex missions to be designed and tested on the ground and thus, hopefully, to
succeed after launch.
XI. Flight Software Testing
One of the goals for Chronos is to be able to test the flight software for ARAPAIMA. The idea is to rapidly
iterate the design-and-test cycle for the major pieces of software, even on a low-fidelity testbed. The flight software
is currently in development for the ARAPAIMA cubesat, which proves advantageous: as each version of the
software is released, it can be tested on Chronos to identify problems that may not otherwise be noticed.
The flight software will be loaded on the tigerboard located in the payload emulator. This will allow the main
computer to give feedback data to the flight software just as the other subsystem components would in space. For
example, the computer will be able to feed the flight software location data that it would normally receive from a
star tracker or IMU.
Since we will be able to see the data that is being fed into the flight software, it will be easier to identify where
errors or inconsistencies are occurring. This should speed up the overall software development process.
XII. Conclusion
This paper has covered the goals, current progress, relevance, and future work of a desktop spacecraft simulator
and payload testbed. The purpose of this paper and the design of the testbed is both to prove the capability of a
desktop-scale COTS simulator and to validate the ARAPAIMA payload and algorithms. Much progress has been
made so far in the design and studies involved in the testbed. Through the use of COTS parts, rapid prototyping,
and standardized software, a good foundation has been built for the testbed. Currently, the designs and ideas are
still being finalized; however, it is expected that within the next year a fully operational testbed will be located at
ERAU. This testbed will be used to validate and test many missions, starting with ARAPAIMA.
Acknowledgements
The authors would like to give special thanks to Olena Manakina of the Aerospace Engineering department at
ERAU, Heidi Steinhauer of the Freshman Engineering Department at ERAU, Jonathan Hatcher and Seth Grablow
of Teledyne Oil & Gas, Caroline Day, and Francisco Franquiz for their contributions and efforts throughout this
research.
The authors would also like to give thanks to the Embry-Riddle Aeronautical University Ignite Initiative, the
Florida Space Grant Consortium, and the Air Force Research Laboratory University Nanosatellite Program for their
financial support.
A special thanks to the ARAPAIMA team; without their hard work and support this research and concept would
not exist.
References
1. Harris, K., Nayak, M., et al., "Application for RSO Automated Proximity Analysis and IMAging (ARAPAIMA):
Development of a Nanosat-based Space Situational Awareness Mission," presented at SmallSat 2013, Logan, Utah,
2013.
2. Debus, T. J., and Dougherty, S. P., "Overview and Performance of the Front-End Robotics Enabling Near-Term
Demonstration (FREND) Robotic Arm," presented at the AIAA Infotech@Aerospace Conference, Seattle,
Washington, 2009. <http://www.mdacorp-us.com/Robotics%20Papers/2009_1870_published.pdf>
3. Khoshelham, K., and Elberink, S. O., "Accuracy and Resolution of Kinect Depth Data for Indoor Mapping
Applications," Sensors, Vol. 12, No. 2, 2012, pp. 1437-1454.