
Applications for Computer Vision and Plant Identification Program
Function and Operation
The purpose of this project is to remotely control a robot that identifies different species of plants. The code is implemented on the NVIDIA Jetson Nano Jetbot by Waveshare. The Jetson Nano is a small computer primarily used for learning about AI and machine learning. The system is Linux-based and can be accessed wirelessly, via USB port, or through the desktop GUI with an HDMI cable. I worked on the Jetbot wirelessly by connecting it to the same Wi-Fi network as my laptop and navigating to the robot's IP address in a web browser. From there, the code can be edited.
The teleoperation is based on the original example code on the SD card image for the Jetbot. The code for identifying plants came with the SD card image for standalone projects on the Jetson Nano without extensions. This classification code was left unchanged for now.
Discussion and Conclusion
Teleoperation
One issue I had with this project was that the initial controls of the Jetbot were extremely sensitive. The example code used the left joystick of the Xbox controller to control the speed and rotation of both robot wheels. For both motors to be completely still, the joystick had to rest in the absolute center so that the controller coordinates for the joystick were (0, 0). If the joystick is to control movement, code needs to be written so that it is less sensitive to small inputs near center.
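One common way to tame this sensitivity is a deadzone: small deflections near center are clamped to zero, and the remaining range is rescaled so full deflection still gives full speed. The sketch below is illustrative, not from the project code; the function name, the 0.1 threshold, and the assumption of joystick values normalized to [-1, 1] are all my own.

```python
def apply_deadzone(value, threshold=0.1):
    """Clamp small joystick deflections to zero so the motors stay
    still when the stick rests near center (hypothetical helper).

    `value` is assumed to be a normalized axis reading in [-1, 1].
    """
    if abs(value) < threshold:
        return 0.0
    # Rescale the remaining range so the output still spans -1.0 to 1.0
    sign = 1.0 if value > 0 else -1.0
    return sign * (abs(value) - threshold) / (1.0 - threshold)
```

With a threshold of 0.1, a reading of 0.05 from a slightly off-center stick maps to 0.0, while a full deflection of 1.0 still maps to 1.0.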
Changing the controls to use the left and right triggers for the left and right motors, respectively, made turning the robot much easier than using the joystick or D-pad controls. The D-pad controls detected a change in input, such as a zero-to-one or one-to-zero transition. Each press therefore triggered two executions of the action, which made adjusting the robot's heading much more difficult. Using the triggers for turning was more precise in steering the robot where I wanted it to go. However, using D-pad controls for forward and backward movement allowed me to hold constant wheel speeds, rather than the varied wheel speeds produced by the trigger inputs.
For this project, I used an Xbox Series controller. The final decision for teleoperation controls was to use the left and right triggers primarily for turning the robot, and the up and down D-pad buttons for forward and backward movement, respectively.
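The control scheme above can be sketched as a simple mixing function: each trigger drives its own wheel for turning, while a D-pad press overrides both wheels with one constant cruise speed. This is a minimal sketch under my own assumptions (trigger values in [0, 1], boolean D-pad flags, the `cruise` constant); it is not the project's actual mapping code.

```python
def mix_controls(left_trigger, right_trigger, dpad_up, dpad_down, cruise=0.5):
    """Return (left_wheel, right_wheel) speeds from controller state.

    Illustrative mixing rule: the D-pad gives constant-speed driving,
    otherwise each trigger (0..1) drives its own wheel for turning.
    """
    if dpad_up:
        return cruise, cruise      # constant forward speed on both wheels
    if dpad_down:
        return -cruise, -cruise    # constant reverse speed on both wheels
    # Left trigger drives the left wheel, right trigger the right wheel
    return left_trigger, right_trigger
```

For example, holding only the right trigger spins the right wheel faster than the left, turning the robot left, while pressing D-pad up drives both wheels at the same fixed speed.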
Plant Identification
The code for this section was relatively simple to implement on the robot. The right bumper of the controller is used to take photos for the training dataset. On-screen controls were used to indicate which dataset each photo went to, how many epochs to train for, and whether the program was training or in live execution.
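Capturing a labeled photo on a bumper press amounts to writing the camera frame into a per-class folder with a unique filename. The helper below is a hypothetical sketch of that step, assuming a `dataset/<category>/` folder layout; the function name and signature are my own, not part of the example code.

```python
import os
import uuid

def save_snapshot(image_bytes, category, root="dataset"):
    """Store one training photo under <root>/<category>/ with a
    unique filename (folder layout assumed, not confirmed).

    `image_bytes` is the encoded JPEG frame from the camera.
    Returns the path the photo was written to.
    """
    folder = os.path.join(root, category)
    os.makedirs(folder, exist_ok=True)          # create the class folder on first use
    path = os.path.join(folder, f"{uuid.uuid4().hex}.jpg")
    with open(path, "wb") as f:
        f.write(image_bytes)
    return path
```

A bumper-press handler would then call `save_snapshot(frame, current_category)` with whichever dataset the on-screen controls currently select.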
The final decision for plant identification was to use the example code from the classification lab of NVIDIA's Jetson Nano DLI course, which was simple and easy to implement. One major addition for this part of the application would be to train the program automatically from stored data rather than labeling training data during code execution.
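Training automatically from stored data would start by scanning the saved dataset folders and pairing each image path with a class index, instead of labeling photos live during execution. The sketch below shows only that scanning step, assuming a `dataset/<class>/` layout; the function and the exact layout are my assumptions, not the course code.

```python
import os

def load_labeled_paths(root):
    """Scan <root>/<class>/ folders and return (path, class_index)
    pairs plus the sorted class names, so a training run can be
    driven entirely from stored images (illustrative sketch).
    """
    classes = sorted(
        d for d in os.listdir(root)
        if os.path.isdir(os.path.join(root, d))
    )
    samples = []
    for idx, cls in enumerate(classes):
        folder = os.path.join(root, cls)
        for name in sorted(os.listdir(folder)):
            samples.append((os.path.join(folder, name), idx))
    return samples, classes
```

The returned list could then feed a standard training loop (for example, a PyTorch dataset) for the chosen number of epochs, with no manual labeling at runtime.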