Initialization

Our first step was to ensure we could spin and steer the tires. To sync the ESC with our Pi, we completed a motor initialization sequence, again provided on the Canvas website. Doing this allows the Pi to control the speed of the car through the PWM board. The sequence entailed setting the duty cycle of the modulation to 20 percent, 15 percent, 10 percent, and then 15 percent, in that order. The duty cycle is the fraction of each PWM cycle during which the signal is held high. For our car, a duty cycle of 15 percent corresponded to zero speed and stopped the tires altogether. The initialization sequence needed to be completed every time we turned the car on.
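The sketch below shows this arming sequence, assuming the ESC signal were driven directly from a Pi GPIO pin with the RPi.GPIO library at a 50 Hz PWM frequency. Our car actually sent the signal through the PWM board, so the exact calls and pin differ, but the duty-cycle sequence is the same.

```python
import time
import RPi.GPIO as GPIO

ESC_PIN = 18        # hypothetical GPIO pin wired to the ESC signal line
PWM_FREQ_HZ = 50    # assumed PWM frequency for the ESC signal

GPIO.setmode(GPIO.BCM)
GPIO.setup(ESC_PIN, GPIO.OUT)
esc = GPIO.PWM(ESC_PIN, PWM_FREQ_HZ)

# Arming sequence from the report: 20% -> 15% -> 10% -> 15% duty cycle.
# 15% is the neutral (zero-speed) signal for this ESC.
esc.start(20)
time.sleep(1)
for duty in (15, 10, 15):
    esc.ChangeDutyCycle(duty)
    time.sleep(1)
# The ESC is now armed; holding 15% keeps the car stopped until a drive command is sent.
```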

The steering angles range from 0 to 180 degrees, with 90 corresponding to straight ahead. Angles above 90 turn the car left, while angles below 90 turn it right. A substep of the initialization process was checking whether 90 degrees actually pointed the car straight down the path. There are a couple of reasons why this might not be the case, including the weight distribution of the car with its multiple sensors, varying tire pressure, and a path that is not perfectly straight. We repeated this check almost every time we drove the car and found that our 'straight' steering angle varied between 92 and 93 degrees.
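As a concrete illustration of this convention, the small helper below applies a steering offset around the calibrated center; the servo-write call is a hypothetical stand-in for the actual PWM-board command, and the center value is the one we typically measured.

```python
# Minimal sketch of the steering convention: 0-180 degrees, larger = left, smaller = right.
STRAIGHT_ANGLE = 92   # calibrated 'straight' angle we re-measured most runs (92-93 on our car)

def steer(offset_deg, write_servo):
    """offset_deg > 0 nudges left, < 0 nudges right, 0 holds the calibrated straight."""
    angle = max(0, min(180, STRAIGHT_ANGLE + offset_deg))
    write_servo(angle)   # hypothetical servo/PWM-board write
    return angle
```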

It should be noted that the rear tires drive the car's speed, while the front tires handle the steering.

Turning

The turn in this project relies on the RPLidar, and the process is fairly straightforward. When we analyzed our projected path, we found that there is only one large pole on the left side of the car along the first leg, and it sits at the very end, exactly where we want to start the turn. We therefore continuously read the distance value at the lidar's 90-degree angle. The image below shows the direction the car travels in red and the angle we read in green.

We initially measured the distance from the path to the pole at the point where we wanted to begin turning. We then coded the car so that if the lidar returned a reading within a small range above or below that measured distance, the tires would turn for a set period of time. It took some trial and error to achieve a smooth turn that placed the car in the middle of the second leg without over- or understeering. In our final design, we set the steering angle to 110 degrees and held it there for 3.8 seconds before returning to the original steering angle.
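To make the triggering concrete, the sketch below polls the 90-degree reading using the open-source rplidar Python package and, once a reading falls inside a window around the pre-measured pole distance, holds the wheels at 110 degrees for 3.8 seconds before re-centering. The serial port, pole distance, window size, and servo helper are assumptions for illustration, not the exact values or calls from our code.

```python
import time
from rplidar import RPLidar   # open-source RPLidar driver; assumed here

POLE_DISTANCE_MM = 1500   # hand-measured path-to-pole distance (assumed value)
WINDOW_MM = 200           # +/- tolerance around that measurement (assumed)
STRAIGHT_ANGLE = 92       # calibrated straight-ahead steering angle
TURN_ANGLE = 110          # wheel angle held through the corner
TURN_SECONDS = 3.8        # how long the turn is held before re-centering

def distance_at_90(scan, tolerance_deg=2.0):
    """Return the distance (mm) of the scan point closest to 90 degrees, or None."""
    hits = [(abs(angle - 90.0), dist) for _, angle, dist in scan
            if abs(angle - 90.0) <= tolerance_deg and dist > 0]
    return min(hits)[1] if hits else None

def watch_for_pole(write_servo, port='/dev/ttyUSB0'):
    """Poll the left-side distance and execute the timed turn when the pole appears."""
    lidar = RPLidar(port)
    try:
        for scan in lidar.iter_scans():
            d = distance_at_90(scan)
            if d is not None and abs(d - POLE_DISTANCE_MM) <= WINDOW_MM:
                write_servo(TURN_ANGLE)       # steer left through the corner
                time.sleep(TURN_SECONDS)
                write_servo(STRAIGHT_ANGLE)   # hand control back to the centering logic
                break
    finally:
        lidar.stop()
        lidar.disconnect()
```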

After the turn, the steering component would take back over to center the car. We also continued to read values from the lidar, and if another pole was detected after the turn, the car would stop, concluding its run on the track.

Steering

Steering entails keeping the car centered on the track. This is where we spent the majority of our time over the course of the semester. Unfortunately, in the end we were unable to get the car to steer perfectly.

In our final design, we placed the camera on the side of our car rather than the front, for reasons explained later in the paper. The figure below shows a sample image captured outdoors.

The main method for steering was to manipulate the camera images so that the car stayed a steady distance away from the grass on its left side. This involved several steps. The first was to take an image, similar to the one above. The next was to convert the image to the HSV color space, which stands for hue, saturation, and value. Hue encodes the color itself, while saturation and value describe the intensity of the color and the brightness of each pixel. Working in HSV allows us to mask the image so that pixels of a chosen color become white and everything else becomes black. We applied such a mask to the HSV image to pick out the green pixels. An example is shown below.
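A minimal sketch of this masking step with OpenCV is shown below; the HSV bounds are placeholder values that would need to be tuned to the actual grass color on the track.

```python
import cv2
import numpy as np

def grass_mask(frame_bgr):
    """Convert a BGR camera frame to HSV and mask green pixels: grass -> white, rest -> black."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    lower_green = np.array([35, 40, 40])     # assumed lower HSV bound for grass
    upper_green = np.array([85, 255, 255])   # assumed upper HSV bound for grass
    return cv2.inRange(hsv, lower_green, upper_green)
```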

After masking, we cropped the image to keep only the bottom portion, since nothing above the grass needed to be taken into account. Ideally, the result is an image that is white at the top, where the green grass is, and black at the bottom, where the road is.

We then scanned the image row by row, starting from the top and going down. The number of white pixels in each row was counted and compared to a threshold. As the scan reaches the road, which should be nearly all black, the white-pixel count drops off considerably. When we found the first row below the threshold, we returned that row number and stopped looping.
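The sketch below performs the crop and the row scan described above on the binary mask from the previous step; the crop fraction and white-pixel threshold are assumptions, not the tuned values from our code.

```python
import numpy as np

WHITE_THRESHOLD = 50   # minimum white pixels for a row to still count as grass (assumed)

def find_boundary_row(mask, crop_fraction=0.5):
    """Crop to the bottom portion of the mask, then return the first row whose
    white-pixel count falls below the threshold (the grass/road boundary)."""
    cropped = mask[int(mask.shape[0] * crop_fraction):, :]   # keep only the bottom portion
    white_per_row = np.count_nonzero(cropped, axis=1)        # white-pixel count per row
    for row, count in enumerate(white_per_row):
        if count < WHITE_THRESHOLD:
            return row                   # first row that is mostly road
    return cropped.shape[0]              # no boundary found inside the crop
```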

The steering logic follows directly from this: if the returned row number gets smaller, the road boundary sits higher up in the image, which means the car has drifted too far from the grass. The car then turns its tires to the left for a short period before returning to the original steering angle. The opposite correction is applied when the row number gets larger. By capturing an image, masking it, and reading its pixels, we were able to keep the car a set distance away from the grass as it drove down the track.
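The sketch below spells out this centering rule: a boundary row above the target means the car has drifted away from the grass, so nudge left; a row below the target means it has drifted toward the grass, so nudge right. The target row, deadband, nudge size, nudge duration, and servo helper are all assumptions for illustration.

```python
import time

TARGET_ROW = 120      # boundary row when the car sits at the desired distance (assumed)
DEADBAND = 10         # rows of tolerance before any correction is made (assumed)
NUDGE_SECONDS = 0.2   # how long each small correction is held (assumed)

def center_on_grass(boundary_row, write_servo, straight_angle=92):
    if boundary_row < TARGET_ROW - DEADBAND:
        write_servo(straight_angle + 8)   # too far from the grass: nudge left
    elif boundary_row > TARGET_ROW + DEADBAND:
        write_servo(straight_angle - 8)   # too close to the grass: nudge right
    else:
        write_servo(straight_angle)       # within tolerance: hold straight
        return
    time.sleep(NUDGE_SECONDS)
    write_servo(straight_angle)           # return to the calibrated straight angle
```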

ROS

Our final design utilized ROS1, which allowed us to run several sensors at once and have them interact with the car. We used six different nodes to run our car. We established talker (publisher) nodes for the camera and the lidar sensor, as well as one to start the car. We defined two listener (subscriber) nodes for the motor, as well as one for the steering.

The camera node published the row it identified as the edge of the sidewalk. The steering node subscribed to this topic and adjusted the wheel position based on the current sidewalk value. The lidar node published a 0 or 1 depending on whether it detected a pole on the left side of the car. The motor initialization node acted as a start button once all of the other nodes were up and running: when it ran, the motorstart node turned on the motor and the steering began. The other listener node subscribed to the lidar topic and overrode the steering to complete the hard-coded turn once it received a 1 from the lidar sensor. After the pole was out of range, the steering turned back on and a counter recorded that one pole had been passed. The same process repeated when the lidar detected the next pole, at which point the count increased to 2. Once the car reached the final pole at the end of the path, the count reached 3, which signaled the motor listener node to stop running, and the car stopped.
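The sketch below illustrates this publisher/subscriber pattern with rospy for two of the nodes: a camera talker publishing the boundary row and a steering listener reacting to it. The topic name, message type, and helper functions are assumptions for illustration rather than our exact node code, and each function would run as its own node in its own process.

```python
#!/usr/bin/env python
import rospy
from std_msgs.msg import Int32

def camera_talker():
    """Publish the detected sidewalk-boundary row at roughly 10 Hz."""
    rospy.init_node('camera_talker')
    pub = rospy.Publisher('sidewalk_row', Int32, queue_size=10)
    rate = rospy.Rate(10)
    while not rospy.is_shutdown():
        row = capture_and_find_boundary()    # hypothetical: camera grab + row scan from earlier
        pub.publish(Int32(data=row))
        rate.sleep()

def steering_listener():
    """Subscribe to the boundary row and issue a wheel correction for each message."""
    rospy.init_node('steering_listener')
    rospy.Subscriber('sidewalk_row', Int32,
                     lambda msg: center_on_grass(msg.data, write_servo))  # helpers from earlier sketches
    rospy.spin()
```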

Results

Although we put in a great deal of time and effort, we were unable to get the car to drive autonomously along the entire track without steering off into the grass. The car did work for part of the track, particularly when it started closer to the turning pole. We would have loved to say our car worked perfectly, but due to the problems and considerations outlined in this paper, our design was simply not consistent enough to fully complete the task.