

ShipBot
This project was part of the Mechatronics course I took as a master's student at Carnegie Mellon. The goal was to design and build an autonomous robot that navigates a test site and actuates breakers and valves to simulate cargo ship operation.
Project Overview Video
Problem Description
As freight autonomy technology advances, the demand for intelligent and integrated systems is increasing. The main transport mode for global trade is ocean shipping, with about 90% of traded goods carried by container ships. It takes roughly 25 to 50 crew members (depending on the size of the vessel) to operate a container ship, ensuring all systems function properly and making any necessary adjustments during the voyage.
​
The demands of successfully operating a large cargo ship present a big opportunity for robotics. An unmanned vehicle able to maneuver around a ship to check and adjust operating systems would be a very valuable resource and would help smaller crews effectively operate larger ships.
​
This project aims to address this challenge by providing a mobile robot that can activate switches, turn valves, and flip circuit breakers. The robot will be able to accept mission files, navigate autonomously, and perform tasks regardless of any physical disturbances from large waves or storms.
System Design - Performance Specifications
Explicit Requirements
The goal of the ShipBot project is to create a robot capable of performing various tasks aboard older boats and ships. It needs to be portable and avoid interference with the existing crew. Its total size may not exceed 1.5 feet (in width) x 1.5 feet (in depth) x 2.5 feet (in height). Prototyping can be done with wall power; however, the final product should have a dedicated power supply. Power tethers are permitted. The total project has a $1000 budget.
​
The robot will operate within a 3’ x 5’ testbed that consists of eight 1’-wide work stations. Our team will have 1 minute to place our robot and orient it on the testbed. From that point on, The Drunken Sailor must operate without any human intervention. Though the locations of the work stations are known, which tasks are present will change depending on the mission file. There are four types of tasks the robot will have to perform: turning spigot valves, wheel valves, and shuttlecocks, and flipping breakers. Each task will be set to an arbitrary state, and the robot will have to determine whether or not to interact with it based on the desired end state provided in the mission file. The mission file will also specify which workstations to visit and a target completion time. All tasks will be at a height of roughly 18’’±2’’. A guide rail will surround the devices, leaving them at a depth of roughly 6’’±2’’ from the operating area. The robot may use the guide rail for mobility guidance or physically attach to it during the setup phase.
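The mission-file handling described above can be sketched as a small parser. The file format used here is hypothetical, invented only for illustration (the writeup does not specify the actual syntax):

```python
from dataclasses import dataclass

@dataclass
class Task:
    station: str   # workstation label, e.g. "A"
    device: str    # "spigot", "wheel", "shuttlecock", or "breaker"
    target: str    # desired end state, e.g. "open", "90", "up"

def parse_mission(text: str):
    """Parse a mission string of the (hypothetical) form
    'A:breaker:up, C:wheel:90, 60'
    where the final field is the target completion time in seconds."""
    fields = [f.strip() for f in text.split(",")]
    *entries, time_s = fields
    tasks = [Task(*e.split(":")) for e in entries]
    return tasks, int(time_s)
```

A parser like this lets the robot iterate over tasks in file order, matching the requirement that stations be visited as listed unless a route is pre-specified.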
​
The gate and spigot valves will have to be turned to a target angle with an acceptable error of ±15°. Shuttlecocks will need to be rotated 90° to the open or closed position. Breakers are categorized as A or B types, indicating the "on" direction. The robot may optimize its path for time if our team specifies the route before the testing phase; otherwise, the expectation is that the robot follows the order given in the mission file. The operating area may contain an obtrusive section of pipe that the robot must navigate around. Additionally, the entire testbed will feature a rolling deck to simulate the rocking of a ship at sea.
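The ±15° acceptance band implies a small decision rule: given a valve's current and target angles, decide whether the robot needs to act at all. A sketch (the function names are mine, not from the project code):

```python
def angle_error(current_deg, target_deg):
    """Signed smallest angular difference, wrapped into (-180, 180]."""
    return (target_deg - current_deg + 180) % 360 - 180

def needs_turn(current_deg, target_deg, tol_deg=15):
    """True if the valve is outside the +/-15 degree acceptance band."""
    return abs(angle_error(current_deg, target_deg)) > tol_deg
```

Wrapping the error avoids a false positive when, say, the valve reads 350° and the target is 10°: the true error is only 20°, not 340°.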
​
It goes without saying that safety, as always, is paramount. Neither the manufacturing process nor the autonomous robot should pose a threat to property or life. The robot should be well constructed, being both aesthetically pleasing and structurally sound. "Rat's nest" wiring and duct tape will negatively affect reviews at the Design Expo and reduce the overall robustness of the system. The expectation is that the final prototype will resemble a completed portable system, so a laptop should not be the driving computer of the final system. Aside from these explicit requirements, the teaching team reserves the right to expand the list of requirements at any time.
Implicit Requirements
While there are many clear criteria the ShipBot must meet, a number of requirements are inherent to the nature of the project. Maintaining academic integrity is as important as ever. With multiple teams working on the same project, similar mechanisms will likely appear across designs; it is therefore critical to cite our sources and note where we derive our inspiration. We must also properly document our process and implementation.
​
Apart from these ethical expectations, there are additional logistics to consider. The robot must be structurally stable as it performs its actions so as not to collapse under strain. It must be able to successfully manipulate objects around it in order to accomplish its goals. Humans should be able to intervene in case of error or emergency. We also have to complete the ShipBot within the span of the semester. With the semester shorter than usual and the first two weeks virtual, we will have to be incredibly proactive in our design and implementation. Ordering and prototyping will have to be completed early so we have enough time to iterate on our design.
Coolness Factors
- Speed - We tried to perform our tasks as quickly as possible so that we could complete the trials faster than the given target times.
- Battery Power - We integrated Zeee 8.4V 3000mAh NiMH batteries into our electronics system so our robot would not need a power tether.
- Aesthetics - We tried to make our robot more unique. Given that our team’s theme is pirates, we called our robot The Drunken Sailor and engraved pirate imagery into the front and back of the robot. We also adorned it with pirate duckies. An additional speaker allows The Drunken Sailor to hum sea shanties as it goes about its work, including the one of its namesake.

Functional Architecture
Cyberphysical Architecture


System Design - Mobile Platform
The mobile platform is the main housing for all our systems. To prevent tipping and keep the center of gravity close to the ground, our design places the majority of the robot's weight near the bottom. Of the many ways to locomote, we decided mecanum wheels would serve our needs best. With the arrangement of the testbed, it is very convenient to be able to translate without rotating the entire robot. Omni wheels may be a bit cheaper, but the extra weight of the mecanum wheels and the angle of their rollers mitigate unwanted sliding, especially in a dynamic environment such as a rocking ship.
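The translate-without-rotating ability comes from standard mecanum inverse kinematics: each commanded body velocity maps to four wheel speeds. A sketch, with illustrative geometry values rather than the robot's measured dimensions:

```python
def mecanum_wheel_speeds(vx, vy, wz, lx=0.15, ly=0.15, r=0.05):
    """Inverse kinematics for an X-configuration mecanum base.
    vx: forward m/s, vy: leftward m/s, wz: CCW rad/s.
    lx, ly: half the wheelbase and half the track width (m); r: wheel radius (m).
    Returns (front_left, front_right, rear_left, rear_right) in rad/s.
    Dimensions here are placeholders, not the actual robot's."""
    k = lx + ly
    fl = (vx - vy - k * wz) / r
    fr = (vx + vy + k * wz) / r
    rl = (vx + vy - k * wz) / r
    rr = (vx - vy + k * wz) / r
    return fl, fr, rl, rr
```

Pure strafing, for instance, spins the front-left and rear-right wheels backward while the other pair spins forward, which is exactly what lets the robot slide sideways between workstations without turning.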

System Design - Gantry System
The gantry system provides a way for the robot to position the end effector in the appropriate location quickly and accurately. The system has two dimensions: X and Y, where X is across the width of the robot and Y is along the height. Each axis contains a stepper motor mated to a 300 mm lead screw that translates the end effector mounting plate via linear guide rails and bearings. Each gantry direction contains two guide rails and four bearings, two per rail, to ensure the stability of the end effector during all movements. The robot starts with the gantry at a home position of (0,0), which is set by the SKR board. When the stepper motor drivers receive a command, the motors spin the lead screws and send the end effector mounting plate to the desired coordinates.
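Since the SKR board speaks G-code, positioning the end effector reduces to formatting absolute moves. A minimal sketch of such a command builder, clamping to the 300 mm lead-screw travel (the function name and feed rate are illustrative, not the project's actual firmware interface):

```python
def gcode_move(x_mm, y_mm, feed_mm_min=1200, x_max=300.0, y_max=300.0):
    """Format an absolute G1 linear move for the gantry, clamped to the
    300 mm travel of each lead screw. Feed rate is a placeholder value."""
    x = min(max(x_mm, 0.0), x_max)
    y = min(max(y_mm, 0.0), y_max)
    return f"G1 X{x:.2f} Y{y:.2f} F{feed_mm_min}"
```

In practice a homing command (G28 in standard G-code) would first establish the (0,0) position that the board treats as home.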

System Design - End Effector
The end effector for our robot breaks down into two parts: the part-interfacing aspect and the wrist joint. The part-interfacing aspect is how the end effector engages with the various ship components on the testbed; there are several types of valves and circuit breaker switches, and our end effector was designed to be compatible with all of them. The other part is the wrist joint. Some valves on the testbed face upward and some face outward toward the robot, so the end effector can rotate 90 degrees to interact with both orientations; this is why this part of the end effector is referred to as the “wrist joint.”
In the horizontal orientation, the gantry X and Y directions can maneuver the end effector to rotate valves and flip circuit breakers. However, in the vertical orientation, the end effector loses a degree of freedom that contributes to the rotating motion. To solve this issue, the end effector also has a built-in stepper motor that moves the support arm in and out in the Z direction. This movement, in addition to the X movement of the gantry system, is how the end effector can rotate valves mounted in the vertical orientation on the ShipBot testbed. Due to this simple but effective design for the end effector, our robot can interface with all the testbed components. To change the state of these switches and valves the robot simply maneuvers the two-dimensional gantry system and the Z stepper motor once the end effector's finger is properly engaged.

System Design - Electronics/Firmware
The electronics and firmware consist of all electrical components and code that is running within our system. This subsystem requires a thorough knowledge of programming with microcontrollers, signal processing of sensors, and motor control. These are all skills that our team members have developed from previous coursework and refined in the labs during the first few weeks of the course. The microcontrollers run C++ and Python code that performs many tasks related to sensing and actuation.
Our electronics and firmware can be broken into three main subsystems. There is the sensor processing subsystem, which interfaces with sensors (namely the RealSense camera). Based on the sensor data, this subsystem then sends commands to the two actuation subsystems. The first actuation subsystem deals with locomotion, which receives commands from the central hub and then drives motors to move the robot. The second actuation subsystem deals with manipulation, which receives commands from the central hub and then directs the robot’s arm to a specific location in space.
Our robot has four different processors, each used to handle a specific purpose:
- A Jetson Nano functions as the central hub. It reads and processes sensor data and sends commands to the other processors. The Jetson Nano uses PyTorch, an ML framework, to handle image processing from the Intel RealSense camera.
- An Arduino Mega handles motor control to move the robot around the testbed. It receives commands from the Jetson Nano and drives the motors to guide the robot to a specific location. The Mega runs at 16 MHz and has a large number of PWM and interrupt pins, which we need to interface with our four DC motors, each with encoder feedback.
- An Arduino Uno drives the linear actuators, extending and retracting the end effector as desired. The Uno is a simple processor running at 16 MHz, but little processing power is needed: the actuators are only ever commanded to fully extend or fully retract. The Uno carries a compact motor shield that drives the actuators, and it receives its commands from the Jetson.
- An SKR 2.0 board allows us to use G-code, a computer numerical control programming language. The board has onboard stepper motor drivers, so we can easily control the stepper motors with G-code. Commands come from the Jetson Nano as needed.
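The hub-and-spoke layout above amounts to a routing table on the Jetson: each command type goes to one downstream board. A sketch of that dispatch pattern, with plain callables standing in for the actual serial links (names and command kinds are mine):

```python
def make_dispatcher(mega_send, uno_send, skr_send):
    """Route high-level commands to the right processor's send function.
    In the real system these would write to serial ports; here they are
    injected callables so the routing logic is testable on its own."""
    routes = {
        "drive":   mega_send,   # locomotion -> Arduino Mega
        "actuate": uno_send,    # linear actuators -> Arduino Uno
        "gantry":  skr_send,    # stepper G-code -> SKR 2.0
    }
    def dispatch(kind, payload):
        return routes[kind](payload)
    return dispatch
```

Injecting the transports keeps the central-hub logic independent of the hardware, so each link can be swapped for a stub during bench testing.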

System Design - Perception/Planning
This subsystem includes everything from the high-level interpretation of the command files down to the drive-kinematics sent to the motors. This involves working with the data from the stereo camera and possibly other sensors to both localize and identify the valves/breakers in front of the robot, and coordinate between all the different subsystems, such as the central hub, manipulator, and drive subsystems. Especially with the low-level code, this overlaps with the electronics/firmware subsystem, in that the kinematics/feedback control loop will need to be implemented in firmware on the microcontrollers themselves.

System Implementation - Mobile Platform & Gantry System
To ensure that our mobile platform would be strong and stable enough to support the entire robot, our team performed a weight test and a slipping test. After the mobile platform was constructed and the motors were attached, we added weight to the platform to see how much load it could manage. The platform successfully moved in all directions with 20 pounds of weight on it. This was 5 pounds over our estimated final robot weight, proving that our mobile platform construction, along with our chosen DC motors, would be more than strong enough for our application.
​
For our slipping test, we placed the robot on a flat sheet of plywood and lifted one side, introducing an angle to the mobile platform. We kept increasing this angle until the robot base either started to slide or began to tip over. We achieved 20 degrees of tilt before the mobile platform lost its grip and slid down the plywood. Since the design specifications were changed to exclude a rocking motion, we knew we would be dealing with level ground; this slip test proved our mobile platform would be stable.
​
The testing of our gantry system was very straightforward. Since it was designed like a 3D printer or CNC machine, all we needed to do was give an X and Y coordinate and make sure our gantry was moving to the correct position. After some adjustments with the current going to the stepper motors and the proper conversion from motor steps to distance, the gantry system achieved the desired position to within two millimeters of accuracy.
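The steps-to-distance conversion mentioned above can be sketched as follows, assuming common stepper defaults (200 full steps/rev, 16x microstepping, 8 mm lead) rather than the robot's exact hardware:

```python
def mm_to_steps(mm, steps_per_rev=200, microsteps=16, lead_mm=8.0):
    """Convert gantry travel in mm to stepper microsteps.
    Defaults are typical lead-screw values, not necessarily this
    robot's actual configuration."""
    return round(mm * steps_per_rev * microsteps / lead_mm)
```

Getting this ratio right is exactly the "proper conversion from motor steps to distance" adjustment: an error in any factor scales every commanded move by the same wrong amount.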
System Implementation - End Effector System
Our end effector system was put through two tests to ensure robustness: a position test and a strength test. The position test was quite simple: we extended and retracted the linear actuators until they hit their internal limit switches and checked the resulting orientations. When designing the end effector system, we used geometry and trigonometry to choose mounting locations that would produce 90-degree orientation changes over the full stroke of the actuator. This significantly simplified the programming, as no feedback was required: because we only needed vertical and horizontal orientations, we could drive the linear actuators to the fully extended and fully retracted positions every time. During testing, our calculations proved accurate and the linear actuators positioned the end effector system correctly. For our strength test, we applied forces to the tip of the end-effector finger. Flipping the breaker boxes was the task requiring the end effector to withstand the most force; we measured that about 6 pounds of force was required to successfully flip the circuit breakers, and our end effector repeatedly withstood it. We knew going into the finger design that it would have to be strong, so when setting up the 3D print, we selected 100 percent infill and chose a printing orientation that would maximize material strength.
System Implementation - Motor Subsystems
We tested each motor subsystem carefully before integration. For the DC motors, this meant designing a system that could command four DC motors with encoder feedback using PID control. For the linear actuators, we simply needed to drive the linear actuator pair and the solo linear actuator forward and back as needed. For the stepper motors, we needed to ensure that the SKR was configured appropriately so that each of the XYZ axes could be controlled using G-code commands. Once we programmed and tested each subsystem independently, we then integrated all the systems. We wrote a Python program to command each of the motor subsystems from the Jetson Nano, which ensured that all motors still functioned as desired. Thus, we were confident that we could control each component of the robot as needed for our tasks.
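A minimal version of the PID loop described for the DC motors might look like this in Python (the real loop ran as firmware on the Mega; the gains here are placeholders, not the tuned values):

```python
class PID:
    """Minimal PID controller of the kind used for each drive motor's
    encoder-feedback velocity loop. Gains are illustrative only."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, setpoint, measured, dt):
        """Return the control effort for one timestep of length dt."""
        err = setpoint - measured
        self.integral += err * dt
        deriv = (err - self.prev_err) / dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv
```

Each of the four wheels would run its own instance, with the encoder providing the measured speed every control tick.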


System Implementation - Perception Stack
The perception stack was designed with tests built in. The YOLO neural network used to detect the valves/breakers in the camera frame was trained on a dataset we collected ourselves. This dataset contained over 14,000 images, one-fifth of which were set aside for validation, so we could ensure the network was not overfitting to the training data. Additionally, each classical vision algorithm was developed against a suite of pre-recorded “test videos”; parameters were tuned until the code successfully detected the valve/breaker in every frame. Testing the planner was done quantitatively through simple measurement tests that worked up the stack. We first validated that our low-level PID loop controlled each motor’s speed correctly: we commanded each motor to turn at a specified rate (in degrees per second) and validated that the motors did indeed turn at that rate within ±5%. Next, we validated the accuracy of the trajectory generation/follower code. We commanded the robot to drive in a 1-meter square, then rotate 90 degrees and back, again tuning parameters until we achieved a 5% closed-loop error for this trajectory. These tolerances were tight enough to ensure that we could drive to a station and correct any small errors using the gantry system.
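The ±5% acceptance check used throughout these measurement tests can be expressed as a small helper (a sketch of the pass/fail criterion, not the actual test harness):

```python
def within_pct(measured, commanded, pct=5.0):
    """True if the measured rate or distance is within pct percent of
    the commanded value, as in the motor-speed and trajectory checks."""
    if commanded == 0:
        return measured == 0
    return abs(measured - commanded) / abs(commanded) * 100.0 <= pct
```

The same criterion applies at every level of the stack: per-motor speed in degrees per second, then closed-loop distance over the 1-meter square trajectory.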

System Implementation - Component Integration
With all individual subsystems implemented and tested, we integrated everything together to validate that everything worked in unison and make appropriate changes/adjustments.

System Performance
Mobile Platform
The mobile platform performed exactly as designed. It was a secure and sturdy base for the rest of our robot components. The 80/20 frame made connecting our motor mounts and gantry support frame very easy and the platform bed gave us a lot of room to mount all of our electronics. It was adaptable enough so that as our constraints were tweaked, we were able to easily adjust the wheel positioning and add additional platforms for electronics. The mobile platform also passed our testing criteria of being able to support at least 15 pounds and maintain stability with at least 15 degrees of tilt. The robust platform we created was able to support 20 pounds and withstand 20 degrees of tilt.
Gantry System
Our gantry system performed exactly as intended. The motion was smooth and consistent and properly aligned our end effector system every time. The SKR board did a great job controlling the gantry system, and the stepper motors we chose were the perfect strength for driving it. After some adjustments to the current supplied to the stepper motors and to the conversion from motor steps to distance, the gantry system achieved the desired position to within two millimeters.
End Effector
The end effector system also performed well considering the variety of components our robot needed to interface with. The finger was the perfect shape, size, and strength to successfully rotate all the valves and flip all the circuit breakers. The linear actuators were also a very good choice for our design, as they made adjusting the end-effector orientation simple and efficient. They kept a consistent travel distance, which corresponded to a constant end-effector finger location; this is why our end effector was able to interface with most of the testbed components so reliably. The position test essentially confirmed our geometry and trigonometry calculations: the full stroke length of the linear actuators should correspond to a 90-degree change in orientation. This test was successful, and both the finger and drawbridge were able to transition between the vertical and horizontal orientations. The strength test confirmed that the finger could repeatedly withstand the roughly 6 pounds of force needed to flip the circuit breakers. Printing the finger at 100 percent infill in an orientation that maximized material strength proved very effective: the end effector finger never broke.
Electronics/Firmware
The electronics design worked very well for our robot. Although it took several weeks to develop, once we had each of the electronic subsystems functioning properly, they were integrated onto our robot. The DC motors responded well for the locomotion of the robot, although we found it sometimes difficult to attain precise movements given our encoder feedback design. Besides this, the linear actuators and stepper motors worked great for actuation, and the Jetson Nano was more than fast enough to handle all the communication between subsystems.
Perception/Planning
The perception stack performed worse than we expected. During offline tests, our detection accuracy was well over 90%, with any missed detections caused by significant distortion in the captured frame. During live runs, however, we found that the neural network often misclassified its detections. Additionally, differences in lighting conditions threw off the classical methods used to detect the valve rotations. We worked around this by using a ring light and by fixing the exposure/white balance of the RealSense to more closely match the color profile of the test videos. Our planning stack worked exceptionally well. We could reliably move distances with a 5% error. We did have some trouble with turning, especially as the batteries drained, but despite the shaky nature of the mecanum wheels, the feedback controllers worked well to remove any tracking errors we encountered.
Final System Demo Performance
When looking at our final robot as a cohesive package of the individual systems described above, its performance was fantastic. The Drunken Sailor confidently visited every station given in the mission file and reliably actuated the ball valves and circuit breakers. The rotary valves proved a little more difficult, but our robot attempted them every time with about a 50 percent completion rate. The robot finished the course in 4 minutes and 35 seconds and earned 11 out of 15 points. Additionally, our robot was within the given size constraints, and our team finished the project $15 under budget. These results earned us first place in the ShipBot competition, along with the award for fastest run time. Our team was very proud of our final robot's performance.
System Demo Videos
Our Team Won First Place In The Competition!!!


Project Management - Schedule

Project Management - Issues Log

Project Management - Parts List
