ROBOOST – Project state summary

Let’s summarize what’s going on so far. Above you can see pictures of the different prototype iterations.

The one on the left was basically my first introduction to robotics and programming. It simply consisted of a purchased tank chassis and a few ultrasonic sensors, connected to an Arduino, performing some really fundamental obstacle avoidance.

Next up is what two of my colleagues and I built for our HTL graduation thesis (the HTL is an Austrian school form focused mainly on technology, in my case mechatronics, at a level similar to high school). The concept wasn’t bad; the execution, however, had many flaws. It was our first time using mecanum wheels (I’ll post a whole dedicated summary of the kinematics involved). We used far too heavy lead-acid battery cells in combination with low-torque stepper motors, resulting in really compromised driving capabilities (in fact completely destroying the benefits of the wheels). Also, the mechanical design did not allow it to carry any goods, in contrast to the unofficial goal of an autonomous beer crate carrier. To be fair, it did kinda work, and, considering the effort we also had to put into the documentation, we really did not do such a bad job.

This was also the first time I was introduced to LiDAR technology (which I will also describe in detail in a separate post). In my role as “the software guy”, I was dumb enough to reverse engineer the whole LiDAR protocol (as its documentation was in some sort of Chinese, I guess). And if this wasn’t enough, I decided to write my own visualization tool. In C++. Yeah well, 315 h of work later (this also includes the documentation and frequent coffee meetings with my really supportive supervisor), we had actually achieved something. The visualization of the point cloud (composed mainly of the LiDAR data) was working, and I was getting quite reliable landmarks for use in the SLAM algorithm (Simultaneous Localization And Mapping, which definitely also deserves a separate post). Afterwards I realized that all that programming had in fact been redundant, as a system called ROS exists (also coming up in a separate post), in which integrating this sensor wouldn’t be much more than the press of a button. Nice.

Well, after finishing HTL, I decided to take the whole project on again by myself, from the ground up, hopefully learning from the problems encountered (as will soon become clear, in many parts this was not the case). So the latest prototype emerged:

(banana for reference)

This post should serve as a summary of the current state of the project, in order to document the progress of the development of the next version. To keep it simple, I decided to separate it into the three main disciplines involved.

Electronics

For the electronics, there are a few different things to consider:

The robot has a maximum runtime of about 3 hours at full speed. This, however, is calculated rather than tested. The electronics apply a “clean” and controlled voltage to all components, while implementing various safety mechanisms.

The motor speed can be adjusted using a custom feedback controller; the motors are able to deliver (via the mecanum wheels) a maximum velocity of about 4 m/s. Using a custom-made sensor-shield, different interfaces for various sensors are available.
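To give an idea of what that feedback controller does, here is a minimal sketch of a discrete PID velocity loop driving a toy motor model. The gains, timestep and plant response are placeholders for illustration, not the values running on the actual motor-shield:

```python
class PID:
    """Minimal discrete PID controller (illustrative gains, not the robot's)."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Toy first-order motor model: wheel speed lags behind the commanded effort.
speed = 0.0
pid = PID(kp=0.8, ki=2.0, kd=0.0, dt=0.01)
for _ in range(1000):
    effort = pid.update(setpoint=4.0, measurement=speed)  # target: 4 m/s
    speed += (effort - speed) * 0.05  # crude plant response
```

The integral term is what lets the loop hold the 4 m/s setpoint despite the plant’s steady-state lag; tuning these gains on the real hardware is exactly the “fine-tuning” mentioned later on.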

For the power source, a 12 Ah LiPo battery is used. LiPos usually are the go-to choice, due to their light weight and high capacity.
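The “about 3 hours” runtime from above is a back-of-the-envelope calculation along these lines. The ~4 A average full-speed current draw is an assumption for this sketch, not a measured value:

```python
# Back-of-the-envelope runtime estimate for the 12 Ah pack.
capacity_ah = 12.0       # battery capacity (the 12 Ah LiPo)
avg_current_a = 4.0      # assumed average draw at full speed (not measured)

runtime_h = capacity_ah / avg_current_a
print(runtime_h)  # → 3.0
```

In practice the usable capacity is lower, since the BMS cuts out before the pack is fully drained, so the real runtime would come in somewhat under this figure.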

The downside of LiPo batteries is their tendency to catch fire or explode, usually caused by incorrect charging. To prevent this, a BMS is used to balance the individual cell voltages and monitor the temperature. Also, a buck converter is used to step down whatever input voltage comes in from outside (the charge plug) to 12.4 V. This adds another safety mechanism, so that the BMS itself does not run hot from having to step down the voltage itself.

From the outside, one simply has to find a fitting connector and ensure that the voltage is at least 12.4 V. So one can use 18 V, 24 V, 28 V, etc. chargers and doesn’t need to worry about charging safety.

LiPo batteries can also be damaged by discharging them to too low a voltage. Therefore, the total voltage is measured by the microcontroller on the sensor-shield and limited by the BMS.
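A typical way to do this measurement is a resistor divider into the microcontroller’s ADC. The sketch below shows the idea; the divider values, the 12-bit ADC assumption and the 9.0 V cutoff (3.0 V per cell for a 3S pack) are illustrative, not the robot’s actual values:

```python
# Illustrative pack-voltage measurement via resistor divider + ADC.
ADC_MAX = 4095                     # 12-bit ADC, as on an ESP32 (assumption)
V_REF = 3.3                        # ADC reference voltage
R_TOP, R_BOTTOM = 33_000, 10_000   # divider scales ~14 V down below 3.3 V
CUTOFF_V = 9.0                     # example 3S low-voltage limit (assumption)

def pack_voltage(adc_raw):
    """Convert a raw ADC reading back into the battery pack voltage."""
    v_adc = adc_raw / ADC_MAX * V_REF
    return v_adc * (R_TOP + R_BOTTOM) / R_BOTTOM

def undervoltage(adc_raw):
    """True if the pack has discharged below the safety cutoff."""
    return pack_voltage(adc_raw) < CUTOFF_V
```

With this divider, a full 3S pack (~12.6 V) maps comfortably below the 3.3 V ADC ceiling, while the firmware can flag an under-voltage condition long before the BMS hard cutoff.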

In this case, three main power levels are used. First, there is a 12 V rail regulated by a boost converter. This boost converter is needed to ensure a steady voltage, as the battery voltage drops while discharging. These 12 V are mainly used by the motors and the MPU. As the MPU requires a really “clean” voltage, big bypass capacitors are used to filter voltage spikes, usually induced by the motors’ back-EMF.

Another converter is fed from the 12 V rail and generates 5.4 V for the second rail.

For one, a nice big red emergency stop is embedded into the control panel. If pressed, the motors (and only the motors) are disconnected from power, halting the robot.

One of the major safety hazards has been mentioned above: the LiPo battery. In addition to the controlled charging voltage, the main power line is secured by a 20 A fuse. As the motors consume most of the power, they are also fused, limiting the current to 10 A.

To prevent signal distortion, (nearly) every signal line is secured with a ferrite core, and the power lines with bypass capacitors. To further reduce EMI (induced interference), shielding is used on the main cables and a common ground is established.

I don’t know if this counts as a safety feature, however: using the green switch next to the emergency stop, the whole system can be powered on. An emergent behavior is that, by both turning on the robot with the green switch and pressing the emergency stop, everything is powered on except the motors. This is pretty convenient for programming the robot, as it doesn’t drive away from you. 🙂

As the whole robot consists of about 300 parts, I’ll simply name the “most important” ones. If you need any further information make sure to contact me! πŸ™‚

The main processing unit (MPU) is a Jetson Nano (basically a Raspberry Pi on steroids). I chose this one, as it has a dedicated graphics processor onboard, perfect for machine learning and AI applications. The sensor- and motor-shield revolve around an ESP32; however, this is about to change in the next version.

Concerning the sensors: a dual IMX219-83 camera is connected to the MPU and used for stereo vision and object detection. The RPLiDAR A1 is the bang-for-your-buck LiDAR choice, as it’s really cheap and delivers quite nice sensor readings. (I’m also using an IMU, but this component is really not that special.)

The motors are 12 V worm-gear DC motors with a double shaft. One end is connected to WISAMIC 600 p/r rotary encoders, the other to the mecanum wheels. Again, the motors are about to change in the next version.

The sensor- and motor-shield are designed in EasyEDA, populated and soldered by me. 🙂

Informatics

The software is divided into three main segments, each running on a separate processor. First, there is the MPU (Jetson Nano), running the ROS master and performing the high-level processing, such as path planning, SLAM, user input, object detection and other packages, depending on the current conditions. This master program communicates via ROS-Serial with two subsidiary nodes running on the microcontrollers: one responsible for the motor controls, the other for the sensor data.

Well, the thing is, I haven’t finished the ROS setup so far; however, besides some fine-tuning of the PID motor controllers, everything should be ready for self-driving. Before and while implementing all this high-level stuff, I’d like to have solid hardware. I’m not really dependent on the hardware anyway, so (nearly) everything can be transferred to the new robot as is.

I won’t go into that much detail, as you can see all the source code in the GitHub repo, soon including a nice readme file ;). If you have any further questions, make sure to contact me!

Mechanics

Well well, but how is this bad boy constructed?

The main goal of this design was simply to fit all the components and serve as some sort of “proof of concept”. This was accomplished, but not more (as you can see in the “Problems” section). In its current state, the robot weighs about 24 kg and is able to move, as stated above, at a maximum speed of 4 m/s. It is about 44 cm in length and can carry a calculated load of 22 kg. Before manufacturing, every component (besides most of the electronics) was modeled in SolidWorks. The picture below is a rendering of the whole construction.

It is composed of three main sections, the key part of each being a laser-cut acrylic plate. The bottom one houses the motors, H-bridges, charging buck converter, encoders, battery and balancer. Components belonging to the drive train are held in place by orange 3D-printed parts, which also connect it to the second stage. This one holds most of the electronics, including the MPU, power terminals, fuses and the motor-shield. The big heat sink on the side of the robot belongs to the step-up converter that keeps the battery output at a steady 12 V. Another 12-to-5 V converter sits nearby, underneath the camera. The top layer is designed to hold various sensors and buttons. The antennas and the sensor-shield are also placed there.

In addition to the 3D-printed parts, aluminium rods are used to fasten the different sections together, while the main structural load is carried by aluminium extrusions right beneath the second acrylic plate.

Turned aluminium couplers are used to fasten the wheels to the motor shafts.

I’m not that pleased with the whole design, hence developing a new version. However, if you’re interested in the 3D files, contact me! πŸ™‚

The first thing that catches the eye are the special wheels, called mecanum wheels. These allow the robot full freedom of motion in the two-dimensional plane. The robot can, for example, simultaneously move in one direction and rotate around its axis. Also, movement perpendicular to the wheel rotation is possible, as you can see in the video below. The exact working principles of these wheels will be covered in an upcoming post.
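Without spoiling that post, the core of it is a small linear mapping from the desired body velocity to the four wheel speeds. Here is a rough sketch of that inverse kinematics; the geometry values are placeholder assumptions, not the robot’s actual dimensions:

```python
# Illustrative mecanum inverse kinematics: body velocity -> wheel speeds.
L = 0.22   # half the wheelbase in m (assumed)
W = 0.15   # half the track width in m (assumed)

def wheel_speeds(vx, vy, wz):
    """(vx forward, vy sideways, wz rotation) ->
    (front_left, front_right, rear_left, rear_right) rim speeds in m/s."""
    k = L + W
    return (
        vx - vy - k * wz,   # front left
        vx + vy + k * wz,   # front right
        vx + vy - k * wz,   # rear left
        vx - vy + k * wz,   # rear right
    )

# Pure sideways motion: the wheels spin in opposing diagonal pairs.
fl, fr, rl, rr = wheel_speeds(0.0, 1.0, 0.0)
```

Plugging in a pure strafe command shows why the rollers matter: the left and right wheels on each axle turn in opposite directions, and the 45° rollers turn that into sideways motion.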

As you can see, the positioning is not perfect yet, as the PID controller parameters need a bit of fine-tuning and the position correction is not implemented yet.

Problems

While pretty pleased with the outcome so far, the project still has a long way to go. The following points are the main problems to solve in the next version.

Currently, it is really labor-intensive to change small components in the inner section of the robot. To ensure the ability to add sensors in the future, there has to be an interface linking the load section to the inner controllers.

Right now, the drive train (including the geared motors, couplers and encoders) takes up a lot of space in the bottom section of the robot. Furthermore, the turned wheel couplers are not centered correctly, resulting in unpredictable skidding behavior. Also, the current gear reduction comes from a worm gear, which is self-locking, bringing the wheel to an abrupt halt if the motors are disconnected from the power supply. This is bad for the gear itself, for one, and it also prevents manually rotating the wheel for debugging purposes.

The current design is in fact really nice and sturdy; however, in order to be able to carry goods (especially beer crates), most of the hardware design has to change. This includes switching the frame to a metal construction, implementing interchangeable tires (for outdoor and indoor use) and moving components like the LiDAR and antennas away from the top, to make space for the beer crates. 😉

Right now, the boards are connected via USB to the MPU. Usually, the UART protocol is fine for low-bandwidth communication; however, when using the LiDAR, the data has to be transferred faster. The plan is to use some sort of parallel protocol to increase the update frequency of the sensor data.

As seen above, the controller boards on the motor- and sensor-shield are ESP32 microcontrollers. While they usually are a great choice, especially because they are fairly easy to use and quite fast, the WiFi is simply not needed on these boards. Having the WiFi module on there results in higher current draw and layout limitations and is, quite frankly, a feature that isn’t needed. Therefore, the next PCB design should use another microcontroller, for example an STM32.

Although I’m doing a summary here, many parts of the robot are not well documented. For example, the only description of the electronics system is a drawing on my whiteboard. Also, the mechanical components exist, besides the real ones, only in 3D form, so there are no drawings with measurements.
