Now that the robot is moving better (the extra L293D chips and capacitors have improved things), it is time to start estimating the robot state over time. I thought I would start simple, using just odometry and the ultrasonic sensors. This will also let me validate some of the odometry measurements. I have modest expectations for the quality of the results: everything I have read leads me to believe this will not be very accurate. First, here is the parts list (it is getting quite long):

- Arduino: Arduino Uno R3
- Robot platform: http://www.dfrobot.com/
- Motor shield: https://www.adafruit.com/products/81
- XBee wireless kit: http://www.amazon.com/CanaKit-Xbee-Wireless-Kit/dp/B004G1EBOU/ref=sr_1_9?ie=UTF8&qid=1331990327&sr=8-9
- HY-SRF05: https://www.virtuabotix.com/feed/?p=209
- Wheel encoders: http://www.dfrobot.com/index.php?route=product/product&filter_tag=encoder&product_id=98

Next I need a state model. Again, starting simple, the robot pose can be represented as X, Y, θ. This position / orientation model is discussed in several papers. A couple of key links that fully describe this approach are:

- http://rossum.sourceforge.net/papers/DiffSteer/DiffSteer.html
- http://www-personal.umich.edu/~johannb/Papers/pos96rep.pdf
- http://www.ecse.monash.edu.au/centres/irrc/LKPubs/ICRA97a.PDF

Obviously, the DFRobot has four wheels and will not be perfectly modeled by this differential steering model. More likely, it would be better modeled as a tracked robot (or crawler). However, I have to start somewhere and this is the easiest. Starting with this very simple differential steering model, we have two sensors (the encoders) producing measurements of how far each wheel has rotated. Using the nomenclature specified by Lucas (see the rossum reference), the encoders provide S_{l} and S_{r}, the distance each wheel has traveled since the last measurement.

Currently, the github code returns the counts provided by the optical encoder. To convert those counts to meters, we need to make a few adjustments. Based on the wheel encoder data sheet, the encoder produces 20 PPR (pulses per revolution), so 20 counts equals one full wheel revolution. The wheel is about 0.0635m in diameter (2.5 in). Converting this to the circumference, the S_{l} / S_{r} measurements are:

```cpp
inline float convert(int count)
{
  static const float pi = 3.1416f;
  static const float ppr = 20.0f;
  static const float diameter = 0.0635f;
  static const float circumference = pi * diameter;
  return circumference * count / ppr;
}
```
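As a quick sanity check on the conversion, one full revolution (20 counts) should come out to the wheel circumference, roughly 3.1416 × 0.0635 ≈ 0.1995 m, or about 0.00997 m per count. Here is a stand-alone copy of the conversion for checking those numbers:

```cpp
// Stand-alone copy of the count-to-meters conversion, for checking the numbers.
inline float convert(int count)
{
  static const float pi = 3.1416f;
  static const float ppr = 20.0f;        // encoder pulses per wheel revolution
  static const float diameter = 0.0635f; // wheel diameter in meters (2.5 in)
  static const float circumference = pi * diameter;
  return circumference * count / ppr;
}
```

With these constants, convert(20) is about 0.1995 m and convert(1) about 0.00997 m, matching the hand calculation.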

I inserted this code into the wheel encoder class, so the values returned are now distances in meters. Next is the representation of the state. I added a new class to the trunk at device/platform/dfrobot/state. This class will represent the state of the robot given encoder updates at regular intervals. Initially, this will be just a simple linear prediction model, but it will evolve as more sensors are added. First, the state can be represented explicitly using the equations given here:

*s* = (S_{r} + S_{l}) / 2

θ_{t} = (S_{r} - S_{l}) / b + θ_{t-1}

x_{t} = *s* cos(θ_{t}) + x_{t-1}

y_{t} = *s* sin(θ_{t}) + y_{t-1}

where b is the baseline between the wheels. This means the current state at time t is an incremental update of the previous state based on the current measurements. Obviously this integration is going to be very noisy and subject to drift, but it is a starting point. I started with a simple sketch that drove the robot forward, paused, then backward, repeating this several times with the robot heading toward a wall and then away from it. Using this with the ultrasonic range sensor, I should see the range decrease while the x position increases, and vice versa. Here is a short movie of the robot moving:

Next, I made a plot of the distance sensor output versus the robot x position:

which clearly shows the relationship I was expecting. Next, here is a plot showing the x/y position of the robot in the world:

Again, the code to make this plot is at the github site:

https://github.com/mark-r-stevens/Ardadv/tree/master/device/platform/dfrobot/state/test01

Next up, I will look at more complicated maneuvers with different wheel speeds and see how well the model works. I will also start validating the distances traveled by measuring the actual robot location.
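For reference, the incremental update equations above can be sketched as a small stand-alone C++ struct. The Pose name and update signature here are my own illustration, not the interface of the actual state class in the repository:

```cpp
#include <cmath>

// Minimal dead-reckoning sketch of the update equations above.
// Sl, Sr are the distances each wheel traveled since the last update
// (meters, from the encoder count conversion); b is the wheel baseline (meters).
struct Pose
{
  float x = 0.0f;     // meters
  float y = 0.0f;     // meters
  float theta = 0.0f; // radians

  void update(float Sl, float Sr, float b)
  {
    const float s = (Sr + Sl) / 2.0f; // s = (Sr + Sl) / 2
    theta += (Sr - Sl) / b;           // theta_t = (Sr - Sl) / b + theta_{t-1}
    x += s * std::cos(theta);         // x_t = s cos(theta_t) + x_{t-1}
    y += s * std::sin(theta);         // y_t = s sin(theta_t) + y_{t-1}
  }
};
```

Driving straight (Sl equal to Sr) advances x without changing theta, while equal and opposite wheel travel rotates the robot in place without moving it, which matches the behavior the equations predict.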
