
Monday, June 19, 2017

Bot, my personal pet robot

It has been about two months since I started writing on my blog again, after a gap of maybe 2-3 years. I wish to keep writing from now on. So today I am posting about my personal pet robot, Bot.
I first started working on my personal robot probably about a year and a half ago. I wanted a robot that I could use for various experiments, and of course I have always dreamed of having my own pet robot. The picture posted here is the latest revision of the robot. It has undergone three revisions due to various issues and my personal preferences.
Latest revision of Bot. Yeah, it's untidy, but I still love it.

The first Bot only had an ESP8266 as its brain, and it was used only for an OpenCV experiment. Later on I bought another chassis with two wheels and encoders, but that platform turned out to be unstable and the motors were of bad quality.
Bot rev. 2.0 had a Raspberry Pi 2 as its brain and an Arduino UNO for more "realtime" things like controlling the motors, taking battery voltage readings, acquiring data from the IMU, etc. I also added a Raspberry Pi NoIR camera and a dot matrix display as its face.
Here is a video of that robot smiling when it saw a ball:
This was just a basic color-based object tracking demo. After facing numerous stability issues and a lack of space, I gave up on that platform and finally bought the current chassis you can see in the photo above. It has two motors built in.
This chassis is great as it doesn't overspeed, and both motors rotate at almost the same rate. My next objective was to make it go in a straight line. As this chassis has no encoders, I had to use an MPU6050 IMU.
This Inertial Measurement Unit has a 3-axis accelerometer and a 3-axis gyroscope, and it is an excellent bargain at $2-$3 (in my country). In order to move in a straight line, the robot must keep its yaw angle constant. Now, we all know that accelerometers can't measure yaw angle, and gyros have a drift problem: for the same orientation, their reading slowly changes over time. So I would normally need a magnetometer. Fortunately, the MPU6050 has a built-in Digital Motion Processor which runs some sort of weird and wonderful low-pass filtering algorithm that keeps the yaw angle stable enough (note this "enough") even without help from a magnetometer.
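To see why raw gyro integration drifts, here is a tiny Python sketch. All the numbers here are made up for illustration (not measurements from my robot): even a small constant bias in the rate reading makes the integrated yaw estimate wander while the robot sits perfectly still.

```python
# Hypothetical demo of gyro drift: integrating a biased rate signal.
# The robot is stationary, so the true yaw rate is 0 deg/s, but the
# gyro reports a small constant bias (invented value for this demo).

def integrate_yaw(rates, dt):
    """Dead-reckon yaw by summing rate * dt over each sample."""
    yaw = 0.0
    for rate in rates:
        yaw += rate * dt
    return yaw

bias = 0.05                       # deg/s, a made-up small offset
dt = 0.01                         # 100 Hz sample rate
samples = [0.0 + bias] * 60000    # 10 minutes of "stationary" readings

drift = integrate_yaw(samples, dt)
print(drift)   # roughly 30 degrees of drift while standing still
```

This is exactly the error a magnetometer (or the DMP's filtering) is there to cancel out.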
The Arduino runs a simple nested if...else... to keep the robot steady when going straight or when steering 90 degrees. I could have used PID there, but this if...else seems to be working, so I didn't.
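My actual Arduino sketch is C++, but the shape of that nested if...else can be sketched in Python like this. The thresholds and PWM numbers are made up for illustration; they are not my real calibration values.

```python
# Sketch of the nested if...else steering logic (hypothetical values).
# yaw_error = target_yaw - current_yaw, in degrees. Which motor speeds
# up for a positive error depends on your wiring; assume positive
# means the robot has veered left here.

BASE_PWM = 150   # nominal forward speed (made-up value)
TRIM = 20        # fixed correction step (made-up value)
DEAD_BAND = 2.0  # degrees of error we simply ignore (made-up value)

def motor_speeds(yaw_error):
    """Return (left_pwm, right_pwm) for the differential drive."""
    if yaw_error > DEAD_BAND:        # drifted noticeably to the left
        return BASE_PWM + TRIM, BASE_PWM - TRIM
    elif yaw_error < -DEAD_BAND:     # drifted noticeably to the right
        return BASE_PWM - TRIM, BASE_PWM + TRIM
    else:                            # close enough: drive straight
        return BASE_PWM, BASE_PWM
```

A PID controller would replace the fixed TRIM with a term proportional to the error (plus integral and derivative terms), but as long as the dead band is small, this bang-bang style correction keeps the heading close enough.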
Then comes another problem. This robot runs on a 7.4V 800mAh LiPo battery, and battery voltage is not a constant thing. When you charge it, it goes up to a certain value; as you use it, the voltage drops. As a result, the minor speed tweaking of the two motors of our differential drive system doesn't work anymore, and the robot cannot correct its orientation properly or turn efficiently.
Now, this issue can be solved in many ways, for example by taking a voltage reading and adjusting the calibration accordingly in code. But I did something that is not very efficient, but easy.
I connected a boost converter followed by a buck converter between the battery and the motor driver. The boost converter boosts the voltage to about 9V, and then the buck converter drops it to about 8V, which is the voltage I did my tweaks at. That way the supply remains pretty much constant, but since nothing is free in this world, I lose efficiency, which means my battery runs out faster. That's okay for now.
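For reference, the software alternative I mentioned (measuring the battery voltage and rescaling the PWM duty cycle) would look roughly like this. The function name and all the numbers are hypothetical, just to show the idea.

```python
# Hypothetical software alternative: scale the PWM command so the
# average motor voltage stays at the calibration voltage, no matter
# where the battery currently sits in its discharge curve.

V_CAL = 8.0   # volts: the supply voltage the speed tweaks were tuned at

def compensated_pwm(pwm_at_cal, v_battery):
    """Scale an 8-bit PWM value so avg motor voltage matches V_CAL."""
    scaled = pwm_at_cal * V_CAL / v_battery
    return min(255, max(0, round(scaled)))   # clamp to valid PWM range

# Freshly charged pack (~8.4 V): command less duty than at calibration.
print(compensated_pwm(150, 8.4))   # -> 143
# Partly drained pack (~7.0 V): duty is increased to compensate.
print(compensated_pwm(150, 7.0))   # -> 171
```

The boost-buck trick does the same job in hardware, at the cost of two conversion losses instead of a few lines of code.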
Dusty, very very dusty. On the left are the power input cables for the motor driver, and on the right are the battery cables.

Now let's take a look at the brain, the Raspberry Pi 2. It is running Raspbian Jessie. It's a great SBC with great community support, but it's not powerful enough to do anything beyond very basic image processing. So when I do image processing with this robot, I usually stream video from the Pi over WiFi and use OpenCV on a PC to process the stream.
The following link contains an excellent set of instructions on how to do this.
Reducing the image size to 320x240 resulted in near-instant streaming with almost zero lag, but since I am using a WiFi dongle, it sometimes suffers some lag.
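On the PC side, the color-based tracking from the earlier demo boils down to thresholding each frame and finding the blob's centroid. Here is a minimal NumPy sketch of that step; the color bounds and the synthetic frame are made up, and in practice this is what cv2.inRange plus cv2.moments would compute on frames pulled from the network stream.

```python
import numpy as np

def ball_centroid(frame_hsv, lower, upper):
    """Return (x, y) centroid of pixels inside the HSV bounds, or None.

    frame_hsv: HxWx3 uint8 array; lower/upper: 3-element bounds.
    Mirrors the cv2.inRange + cv2.moments combination.
    """
    mask = np.all((frame_hsv >= lower) & (frame_hsv <= upper), axis=2)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None                     # no ball in this frame
    return float(xs.mean()), float(ys.mean())

# Tiny synthetic 240x320 "frame" with a green-ish square around (100, 60).
frame = np.zeros((240, 320, 3), dtype=np.uint8)
frame[50:71, 90:111] = (60, 200, 200)   # hue, sat, val (made-up values)
print(ball_centroid(frame, (50, 100, 100), (70, 255, 255)))  # -> (100.0, 60.0)
```

On the real robot, the frames would come from the WiFi stream (e.g. cv2.VideoCapture pointed at the Pi's stream URL), and the centroid's horizontal offset from the frame centre becomes the steering input.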
I am working on adding autonomy, but for now I use it as a remote controlled car to test its driving performance. This is a very slow project, as I work on it maybe once a week or once a month. But work is progressing, and I hope it will be done someday.
I wanted to implement a ROS based control environment for this robot, so I installed ROS on the Raspberry Pi following the instructions on the ROS website.
After initializing etc., if you want to connect your Pi to your PC, change the ROS_MASTER_URI environment variable on the Pi to point at your PC's IP address.
Run these commands:
nano ~/.bashrc
Then, at the end of that file, paste the following (substituting your PC's IP address):
export ROS_MASTER_URI="http://<your-pc-ip>:11311"
Then close nano by pressing Ctrl+X, then Y, then Enter, and reload the file with:
source ~/.bashrc
Now you can run roscore on your PC, and the Pi will talk to it.
Right now, I am working on ROS integration. I will post more when it's done.
My ultimate goal is to make a robot that can move between rooms and find whoever called it. It may also notify me about important emails and such. It has a speaker, and I shall add a microphone; I added one, but the audio reception was really bad.
If any of you are wondering about the laser beside the camera, I used it to implement a laser rangefinder using image processing. I shall post about it soon.