Project Timeline
Our first step in this project was to collaboratively ideate around all of our Learning Goals. We came up with many promising and ambitious ideas, including an autonomous hot-air balloon and an automatic drumming robot. Eventually, we settled on an autonomous waiter robot that could navigate a human space to deliver food. We wanted to use the ROS2 architecture to control it, and we wanted its end-effector to be modularly swappable.
After deciding to build a waiter robot, we defined our high-level goals for this project. We outlined our mechanical systems, including the drivetrain and interchangeable module system, as well as our software systems, including the ROS2 topics, nodes, and modules that sense the world through sensors and interact with it through motors. Then we got to work!
Electrical/Firmware Progress:
On the electrical side, we did hand calculations to estimate our drivetrain motor requirements, and using these estimates, selected suitable motors. Using the electrical specifications of these motors, we selected a battery that could power them, with a margin of error large enough for an additional module motor. With these critical components selected, we ordered motor drivers, as well as a buck converter to power our Raspberry Pi and Arduinos from the battery. Once the components arrived, we assembled a barebones power distribution network for testing.
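The motor-sizing hand calculations above can be sketched as a short script. The numbers here are hypothetical placeholders for illustration, not our robot's actual mass, speed, or wheel size:

```python
import math

def drivetrain_requirements(mass_kg, top_speed_mps, accel_mps2,
                            wheel_radius_m, n_driven_wheels=2):
    """Estimate per-motor torque (N*m), mechanical power (W), and wheel RPM
    for a robot on flat ground, ignoring rolling resistance."""
    force_total = mass_kg * accel_mps2                        # F = m * a
    torque = force_total * wheel_radius_m / n_driven_wheels   # tau = F * r
    power = force_total * top_speed_mps / n_driven_wheels     # P = F * v
    rpm = top_speed_mps / (2 * math.pi * wheel_radius_m) * 60
    return torque, power, rpm

# Example with made-up values: 10 kg robot, 1 m/s top speed,
# 0.5 m/s^2 acceleration, 5 cm wheel radius, two driven wheels.
torque, power, rpm = drivetrain_requirements(10.0, 1.0, 0.5, 0.05)
```

Estimates like these give a lower bound; a real motor selection would add margin for ramps, carpet, and payload.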
Software Progress:
While waiting for these components to arrive, we examined the possibility of using micro-ROS with Raspberry Pi Picos, but determined that the learning curve would be too steep for us to meet our ambitious project goals, so we pivoted to using Arduino Unos as our microcontrollers instead. We also performed all of the computational setup for our Raspberry Pi, including downloading ROS2 packages, setting up a workspace, and assigning it a static IP address on the OLIN-ROBOTICS network. This included extensive setup for getting the Raspberry Pi to communicate with our Picos over micro-ROS, even though this ultimately wasn't part of our final system.
We established our codebase on Github and created a node for teleoperating the robot, which would later help us test the robot’s drivetrain hardware integrity. We created a motion execution node which performed basic proportional correction in the linear and angular dimensions to get from its current location to an arbitrary pose elsewhere. Finally, we began testing an open-source AprilTag recognition package developed for ROS2 systems like ours.
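The proportional correction in the motion execution node can be sketched as follows. This is a minimal standalone version of the idea (function name and gains are hypothetical, not our actual node's API): it computes linear and angular velocity commands from the distance and heading error to a goal point.

```python
import math

def p_correction(x, y, theta, goal_x, goal_y, k_lin=0.5, k_ang=1.5):
    """One step of proportional correction toward a goal point.

    Returns (linear_velocity, angular_velocity) commands from the robot's
    current pose (x, y, theta) and the goal position.
    """
    dx, dy = goal_x - x, goal_y - y
    distance = math.hypot(dx, dy)                     # linear error
    heading_error = math.atan2(dy, dx) - theta        # angular error
    # Wrap the heading error into [-pi, pi] so the robot turns the short way.
    heading_error = math.atan2(math.sin(heading_error), math.cos(heading_error))
    return k_lin * distance, k_ang * heading_error
```

In a ROS2 node, the returned pair would be published as a velocity command each control cycle until both errors fall below a tolerance.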
Mechanical Progress:
We created the preliminary CAD for our robot's frame, chassis, and drivetrain. We settled on two driven wheels mounted along the center axis of the robot, so that a two-wheel differential drive could still turn in place. We also sourced caster wheels to keep the robot balanced. After finalizing our CAD, we began sourcing the other materials we would need.
Demo and Debrief:
At the end of Sprint 1, we presented our progress to the rest of the class. We demonstrated our complete chassis, our power distribution system, and our AprilTag recognition through ROS. After our sprint review, we met as a team to reflect on our progress and team structure, and set goals for the upcoming sprint.
Our goals for Sprint 2 were to finish constructing the drivetrain, design the tray module and modular mechanical output system, start writing libraries for various robot sensors like the buttons and encoders, and develop and implement the architecture for our robot’s localization and navigation system.
During Sprint 2, we continued to develop our robot’s mechanical, electrical, and software systems.
Electrical/Firmware Progress:
To test our drivetrain and power distribution system, we wrote motor firmware that receives motor commands from the Raspberry Pi and sends PWM signals to the motor drivers. To let users interact with Walter, we added buttons and wrote firmware to support them. We also started writing firmware to calculate linear and angular velocity from both motor encoders, providing real-time feedback from the motors to the software system. For both the buttons and encoders, we wrote custom Arduino libraries optimized for use on Walter.
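The encoder-to-velocity math is standard differential-drive forward kinematics. A minimal sketch, with hypothetical encoder resolution and wheel geometry (our firmware runs the same math in Arduino C++):

```python
import math

# Hypothetical constants for illustration; the real values depend on
# the encoder resolution and the robot's wheel geometry.
TICKS_PER_REV = 360.0   # encoder ticks per wheel revolution
WHEEL_RADIUS = 0.05     # meters
WHEEL_BASE = 0.30       # meters between the two driven wheels

def wheel_speed(delta_ticks, dt):
    """Wheel surface speed (m/s) from encoder ticks counted over dt seconds."""
    revs = delta_ticks / TICKS_PER_REV
    return revs * 2 * math.pi * WHEEL_RADIUS / dt

def body_velocity(left_ticks, right_ticks, dt):
    """Robot linear velocity v (m/s) and angular velocity omega (rad/s)."""
    v_left = wheel_speed(left_ticks, dt)
    v_right = wheel_speed(right_ticks, dt)
    v = (v_left + v_right) / 2.0
    omega = (v_right - v_left) / WHEEL_BASE
    return v, omega
```

Equal tick counts on both wheels yield pure translation; opposite counts yield turning in place.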
Software Progress:
On the software side, we started to model our system as “state estimation” and “path-planning”. We settled on a path-planning system that would localize itself on a reference map and then navigate to predefined locations in the world. We developed a mapmaker node that utilized the ROS2 TransformManager tool to define static relationships between various key locations in the world, such as known AprilTag landmarks and table destinations. We also developed a serial adapter node that parsed serial communication between the microcontrollers and the ROS2 network, in place of using micro-ROS. Finally, we began testing navigation software in simulation using the ROS2 command-line tools.
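The serial adapter's parsing step can be sketched like this. The frame format shown (a tag followed by comma-separated numbers, e.g. `ENC,120,-45`) is a hypothetical stand-in for our actual protocol:

```python
def parse_line(line):
    """Parse one newline-terminated serial frame such as 'ENC,120,-45'.

    Returns (tag, values) on success, or None for malformed frames so the
    adapter can drop garbage from a noisy serial link without crashing.
    """
    parts = line.strip().split(",")
    if len(parts) < 2:
        return None
    tag, raw_values = parts[0], parts[1:]
    try:
        return tag, [float(v) for v in raw_values]
    except ValueError:
        return None
```

On the ROS2 side, the adapter node would republish each parsed frame on the topic matching its tag, and serialize outbound motor commands into the same text format.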
Progress on state estimation included extensive testing of the AprilTag package alongside new testing of the Raspberry Pi camera through a v4l2-based package. While visual pose estimation was in development, we built a dead-reckoning-based pose estimation system that allowed us to test our navigation software. This development, alongside our path-planning progress, allowed us to finally start testing the drivetrain hardware integrated with autonomous navigation right at the end of Sprint 2. This prepared the software team for a Sprint 3 that was almost entirely testing and debugging in the real world.
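Dead reckoning here just means integrating commanded (or measured) velocities over time to track pose. A minimal sketch of the update step:

```python
import math

def dead_reckon(pose, v, omega, dt):
    """Advance a 2D pose (x, y, theta) by linear velocity v (m/s) and
    angular velocity omega (rad/s) over a timestep dt (s)."""
    x, y, theta = pose
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return (x, y, theta)
```

The weakness, which motivated our visual pose estimation work, is that small velocity errors accumulate without bound; a landmark sighting is needed to reset the drift.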
Mechanical Progress:
On the mechanical side, we completed our robot’s drivetrain and frame, and began work on the tray module. Using ½” plywood and some basic woodshop tools, we cut out four panels - three walls and a top - and mounted them to our 1010 aluminum chassis using T-nuts.
Once we had a top surface to attach something to, we worked to create a first draft of the tray module. We designed a four-bar linkage, consisting of laser-cut acrylic links mounted with 3D-printed brackets. The links connected to an aluminum shaft held in place by bearings in 3D-printed mounts, allowing it to rotate freely. We mounted the tray, a laser-cut ⅛” plywood sheet, to the top of the four-bar.
Demo and Debrief:
At the end of Sprint 2, we presented our sprint review to the rest of the class, complete with a live demo of the robot being tele-operated to a sitting person, and the tray module being manually extended to deliver a bowl of fruit. Following our sprint review, we again met as a team to reflect on our team structure, organization, and our goals and deadlines for the upcoming final sprint.
In our final sprint, the final robot took shape as we completed and integrated all of our various systems.
Electrical/Firmware Progress:
On the electrical side, we picked a module motor and motor driver. Using the geometry of the four-bar mechanism, we did hand calculations to estimate the torque required of the motor. From these calculations, we decided to use a stepper motor for its precise positional control and holding torque. We also added an inertial measurement unit (IMU) and motor encoders to the sensor Arduino, allowing the software to make a more accurate pose estimate.
With the tray module assembled, we wanted to detect the presence of a plate on the tray, so we integrated strain gauges into the tray design. Each strain gauge was wired into a Wheatstone bridge, and to amplify the bridge's small differential output to readable levels, we built a subtracting operational amplifier (op-amp) circuit for each one.
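The bridge-plus-amplifier signal chain can be sketched numerically. This assumes a quarter-bridge configuration (one active gauge, three fixed resistors), which is an assumption for illustration rather than our exact circuit:

```python
def bridge_output(v_excitation, r_gauge, r_fixed):
    """Differential output (volts) of a quarter-bridge Wheatstone circuit:
    one strain gauge against three fixed resistors of value r_fixed.
    A balanced bridge (r_gauge == r_fixed) outputs 0 V."""
    return v_excitation * (r_gauge / (r_gauge + r_fixed) - 0.5)

def amplified(v_diff, gain):
    """Output of the subtracting (differential) op-amp stage, which scales
    the millivolt-level bridge signal up to the ADC's input range."""
    return v_diff * gain
```

Loading the tray changes the gauge resistance slightly, unbalancing the bridge; the op-amp gain is chosen so that the expected plate weights span the microcontroller's ADC range.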
Software Progress:
On the software side, we split up our “goal-driver” node, which managed error correction, driving, and path-planning, into two nodes such that path-planning was totally independent. The new path planning node included a priority queuing system for new destinations, and multiple configurations for how it would navigate between them. We also developed a new module node that actuated the four-bar and integrated it into our routing routine. The robot would now actuate the four-bar upon destination arrival and only start navigating towards a new goal when the four-bar had been retracted.
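The destination queuing idea can be sketched with Python's standard-library heap. The class name and interface here are hypothetical simplifications of our node's behavior:

```python
import heapq
import itertools

class DestinationQueue:
    """Priority queue of navigation goals.

    Lower priority numbers are served first; ties resolve in
    insertion order thanks to a monotonically increasing counter.
    """
    def __init__(self):
        self._heap = []
        self._counter = itertools.count()

    def add(self, name, priority):
        heapq.heappush(self._heap, (priority, next(self._counter), name))

    def next_goal(self):
        """Pop and return the highest-priority destination, or None if empty."""
        if not self._heap:
            return None
        return heapq.heappop(self._heap)[2]
```

The path-planning node would pop the next goal only after the module node reports that the four-bar has retracted, matching the arrival routine described above.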
We confirmed the functionality of our software pipeline in simulation, and also tested it on an alternate platform (a modified vacuum bot) to demonstrate the system on physical hardware. This allowed us to eliminate software errors while the physical robot was not yet ready for integrated testing. We also developed several launch files that allowed us to run tests of systems in isolation, rather than bringing up the full software network every time we tested something.
We continued developing our own node to convert pose estimates relative to an AprilTag into the world frame; however, we had a lot of trouble working out the math for this transformation. As a result, all footage of robot navigation before the end of Sprint 3 relied on dead reckoning rather than landmark pose estimates.
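The transformation we were after composes two rigid transforms: the tag's known pose in the world, and the inverse of the camera's detection of the tag relative to the robot. A 2D (SE(2)) sketch of that composition, using plain nested lists rather than our actual node's code:

```python
import math

def se2(x, y, theta):
    """2D homogeneous transform (3x3, row-major nested lists)."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, x],
            [s,  c, y],
            [0.0, 0.0, 1.0]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def invert(t):
    """Inverse of an SE(2) transform: transpose the rotation,
    rotate-and-negate the translation."""
    c, s = t[0][0], t[1][0]
    x, y = t[0][2], t[1][2]
    return [[ c, s, -(c * x + s * y)],
            [-s, c,  (s * x - c * y)],
            [0.0, 0.0, 1.0]]

def robot_in_world(world_from_tag, robot_from_tag):
    """world<-robot = (world<-tag) * (tag<-robot),
    where tag<-robot is the inverse of the detected robot<-tag."""
    return matmul(world_from_tag, invert(robot_from_tag))
```

For example, if a tag sits at world position (2, 0) and the camera sees it 1 m straight ahead, the robot must be at world position (1, 0). The full 3D version is the same idea with quaternions or 4x4 matrices, which is where the math gets harder to debug.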
Mechanical Progress:
On the mechanical side, we worked to design the module interfacing system and adapt the existing four-bar tray to it. We first designed a dovetail-joint-inspired mechanism with two parts: the module housing (attached to the robot frame) and the module receiver (part of a given module, which slides into the module housing). When the module receiver slides into the module housing channel, two gears mesh and transfer rotational power from a stepper motor. This allows a given module to be swapped without any fasteners. The module housing and receiver were both 3D printed using dissolvable support material, due to the internal geometry required to transfer power in this way.
The four-bar tray is actuated by the stepper motor through a vertical shaft from the module receiver and a set of bevel gears that actuate the linkage.
We also designed and CAD-modeled a potential second module to serve drinks, which would rotate a carousel designed to hold standard 12oz beverage cans. Due to time constraints, we did not end up building this module.
Post Sprint 3:
Following our final sprint review, we still had some final things to work out with our robot. One major issue was a delay in communication between the Arduinos and the Raspberry Pi. This delay meant that our software routines would become less accurate and slow down any testing.