So some time has passed and I’ve had some success with different aspects of my robot.
For simplicity, I’m using a test bed based on an Arduino Duemilanove connected to a NerdKits-sourced display. I’ve hooked the display up as if it were a Pololu Orangutan SV-328 and am using the Pololu libraries to write to it. I’ve also been working on the actual SVP-based robot, and both are working well.
The ATmega328p in the Duemilanove requires the use of Timer0 for the freeRTOS system tick, which means no Pololu motor library code is possible without conflicts. This isn’t an issue, though, as the Duemilanove doesn’t have motor drivers anyway. The actual DogBot has the ATmega1284p used in the SVP, which can run its tick from Timer3, creating no conflicts with any libraries that I know of.
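To sketch why Timer3 matters here: the tick can be generated from the 1284p’s spare 16-bit timer in CTC mode, leaving Timer0 untouched. The register names are from the ATmega1284p datasheet, but the tick rate and prescaler below are assumptions, not the port’s actual configuration.

```c
#include <avr/io.h>

/* Assumed tick rate; freeRTOS normally takes this from configTICK_RATE_HZ. */
#define TICK_RATE_HZ  1000UL

/* Configure Timer3 (16-bit, unused by the Pololu libraries) as the
 * system tick on the ATmega1284p, leaving Timer0 free.
 * The port's tick handler would be attached to TIMER3_COMPA_vect. */
void init_tick_timer3(void)
{
    TCCR3A = 0;                               /* CTC mode, no output pins */
    TCCR3B = _BV(WGM32) | _BV(CS31);          /* CTC, clk/8 prescaler     */
    OCR3A  = (F_CPU / 8 / TICK_RATE_HZ) - 1;  /* compare value for tick   */
    TIMSK3 = _BV(OCIE3A);                     /* interrupt on compare     */
}
```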
The freeRTOS code is posted on the Pololu Forum, mostly just as back-up, since the application code is still very immature.
At this stage I’ve got all of the I2C-bus-based sensors working, based on code developed by Peter Fleury. So I can read the thermal sensor for its 8 pixels, and, equally importantly, I can read the SRF10 ultrasonic sensor for distance in cm. One issue with the ultrasonic sensor is that its field of view is so wide that it detects essentially anything “in front” of it. Good for not running into things, but pretty useless as a fine directional capability. Luckily, the Sharp IR distance sensors are very directional, and sufficiently accurate, to serve as a complement. The analogue sensor readings are working well too, though I still have to create an ADC-to-cm regression.
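As a minimal sketch of the sonar read using Fleury’s i2cmaster API: the 0xE0 default address, the 0x51 “range in cm” command, and the result registers are from the Devantech SRF10 datasheet, while the busy-wait delay is my assumption (a real task would block with a freeRTOS delay instead).

```c
#include <util/delay.h>
#include "i2cmaster.h"   /* Peter Fleury's I2C master library */

#define SRF10_ADDR   0xE0   /* factory-default 8-bit address    */
#define CMD_RANGE_CM 0x51   /* "range in centimetres" command   */

/* Start a ranging cycle and return the measured distance in cm. */
uint16_t srf10_read_cm(void)
{
    /* Write 0x51 to command register 0 to start ranging in cm. */
    i2c_start(SRF10_ADDR + I2C_WRITE);
    i2c_write(0x00);
    i2c_write(CMD_RANGE_CM);
    i2c_stop();

    _delay_ms(70);   /* ranging takes up to ~65 ms */

    /* Read the 16-bit result from registers 2 (high) and 3 (low). */
    i2c_start(SRF10_ADDR + I2C_WRITE);
    i2c_write(0x02);
    i2c_rep_start(SRF10_ADDR + I2C_READ);
    uint16_t range = (uint16_t)i2c_readAck() << 8;
    range |= i2c_readNak();
    i2c_stop();

    return range;
}
```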
From the point of view of sensing, it looks like the Sharp IR sensors will be the reference, with the SRF10 sonar most relevant for creating a “zone of safety”: I can be assured that the nearest object within a cone of 120 degrees is measured, but can’t be sure exactly which direction the object is in. On the thermal side, I will get a vector (direction and temperature) from the sensing location, but no distance. But that I knew and expected.
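For the ADC-to-cm regression I still owe myself, the usual approach for the Sharp sensors is an inverse fit. The coefficients below are hypothetical placeholders to be replaced by calibration against known distances, not measured values:

```c
#include <stdint.h>

/* Convert a 10-bit ADC reading from a Sharp IR sensor to centimetres
 * using the common inverse-linear model d = K / (adc - OFFSET).
 * K and OFFSET are placeholders; real values come from fitting
 * readings taken at known distances. */
#define SHARP_K      6000L
#define SHARP_OFFSET 11L

uint16_t sharp_adc_to_cm(uint16_t adc)
{
    if (adc <= SHARP_OFFSET)
        return 0xFFFF;   /* out of range / no target detected */
    return (uint16_t)(SHARP_K / (adc - SHARP_OFFSET));
}
```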
Putting some effort into designing the motor control, or Transport Task, has taken up my thoughts recently. I don’t want to link the odometry available from the quadrature encoders back into the mapping or routing task. Similarly, I don’t want to link the inertial navigation available from the yaw and linear acceleration sensors into the motor task.
I think the transport task should simply take a vector, relative to the current pose of the robot, and execute these translation commands subject to feedback from odometry, leaving the inertial navigation to another task.
This fits well with the design of the hardware, as odometry can be queried from the Pololu SVP’s ancillary processor without blocking, and the motor PWM drivers can also be managed without blocking other tasks. This creates a self-contained task that does not need to share resources with other tasks.
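A minimal sketch of how I imagine that interface, as a freeRTOS task fed by a queue. The MotionCommand fields and the get_odometry_counts() and set_motor_pwm() helpers are hypothetical stand-ins for the real Pololu SVP calls:

```c
#include <stdint.h>
#include "FreeRTOS.h"
#include "queue.h"
#include "task.h"

typedef enum { PATH_LINE, PATH_ARC } PathType;

/* A motion request, expressed relative to the current pose. */
typedef struct {
    int16_t  distance_mm;   /* how far to travel               */
    int16_t  bearing_deg;   /* heading change relative to pose */
    PathType path;          /* straight line or circular arc   */
} MotionCommand;

static QueueHandle_t xTransportQueue;

/* Hypothetical stand-ins for the non-blocking query of the SVP's
 * ancillary processor and the PWM motor drivers. */
extern void get_odometry_counts(int16_t *left, int16_t *right);
extern void set_motor_pwm(int16_t left, int16_t right);

static void vTransportTask(void *pvParameters)
{
    MotionCommand cmd;
    int16_t left, right;

    (void)pvParameters;
    for (;;) {
        /* Block until the routing task requests a movement. */
        if (xQueueReceive(xTransportQueue, &cmd, portMAX_DELAY) == pdPASS) {
            /* Close the loop on odometry only: read the counts, run a
             * PID towards the commanded distance/bearing, drive motors. */
            get_odometry_counts(&left, &right);
            set_motor_pwm(left, right);     /* placeholder control step */
            vTaskDelay(pdMS_TO_TICKS(20));  /* fixed control period     */
        }
    }
}
```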
However, the inertial sensors are analogue readings, and the ADC will need to be shared with the Sharp IR distance sensors. This creates the need for a semaphore, with tasks blocking on the availability of the ADC.
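A sketch of the sharing I have in mind, with a freeRTOS mutex guarding the ADC between the inertial task and the Sharp IR reads (the read_adc_channel() helper is hypothetical):

```c
#include <stdint.h>
#include "FreeRTOS.h"
#include "semphr.h"

static SemaphoreHandle_t xADCMutex;

extern uint16_t read_adc_channel(uint8_t channel);  /* hypothetical */

/* Created once at start-up, before any task uses the ADC. */
void adc_mutex_init(void)
{
    xADCMutex = xSemaphoreCreateMutex();
}

/* Take the ADC for one conversion; callers block until it is free. */
uint16_t adc_read_shared(uint8_t channel)
{
    uint16_t value = 0;

    if (xSemaphoreTake(xADCMutex, portMAX_DELAY) == pdTRUE) {
        value = read_adc_channel(channel);
        xSemaphoreGive(xADCMutex);
    }
    return value;
}
```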
Because of the battery issues described in Post 4, I’ve had to remove the servo neck of the DogBot. Therefore, I will implement the option for motion along circular paths as well as along straight lines. Motion along a straight line, with stationary rotations to achieve the correct pose prior to departure, is the lowest-risk, shortest path to the destination. However, with a fixed sensor head, straight lines don’t fill the map with information, as they leave the sensors always pointing in the same direction.
If the DogBot proceeds from A to B via a circular route (if this is requested by the mapping or logistics task), then the sensors will be pointed in every direction from +90 to -90 degrees along the path to the destination, allowing the travel time to be used effectively for data acquisition.
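The arc itself reduces to a fixed wheel-speed ratio for a differential drive. With turn radius R to the robot’s centre and track width b (names here are just for illustration), the inner and outer wheel speeds follow from v = ω·r:

```c
/* For a differential drive following an arc of radius R (measured to
 * the robot's centre) at forward speed v, the wheels run at:
 *   v_left  = v * (R - b/2) / R   (inner wheel for a left turn)
 *   v_right = v * (R + b/2) / R   (outer wheel for a left turn)
 * where b is the track width. All values here are illustrative. */
void arc_wheel_speeds(float v, float R, float b,
                      float *v_left, float *v_right)
{
    *v_left  = v * (R - b / 2.0f) / R;
    *v_right = v * (R + b / 2.0f) / R;
}
```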
If I’m feeling smart, then I can create any number of route subdivisions, and force DogBot to describe a path of smooth semicircles to the destination, gathering sensor data along the route. Conveniently, a semicircle over a chord d has arc length πd/2, so however many subdivisions I choose, the total path is only about 1.57 times the straight-line distance.
The inertial sensors can be run in a parallel task (using the ADC along with the Sharp IR sensors), with the odometry (from the ancillary processor) used to cross-check that the expected distances and directions are travelled. Whilst I think the odometry is more likely to be accurate, the map will be updated constantly, so some inaccuracy should be expected and tolerated by the code.
My next step is to design this transport task. It should take a distance, bearing, and path description (straight line, circle, sinusoid, etc.) and carry the movement out to the best of its ability (on odometry PID data only). I expect resolving this effort will take the next few weeks, and perhaps longer.
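Since the transport task will close its loop on odometry alone, the core of it will be a PID on the encoder counts. A textbook sketch, with gains as placeholders to be tuned on the robot:

```c
/* Textbook PID on wheel encoder counts, called at a fixed period from
 * the transport task. Gains are placeholders to be tuned on the robot. */
typedef struct {
    float kp, ki, kd;
    float integral;
    float prev_error;
} PID;

float pid_step(PID *pid, float setpoint, float measured, float dt)
{
    float error = setpoint - measured;
    pid->integral += error * dt;
    float derivative = (error - pid->prev_error) / dt;
    pid->prev_error = error;
    return pid->kp * error + pid->ki * pid->integral + pid->kd * derivative;
}
```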
Later work will be to develop the logistics and routing task that will issue the navigation requests to the transport task.
Continued with Post 6 (one year later).