This journal is maintained by Winlab summer interns Vamsi and Aniket, who are working on robot mobility. Unless otherwise noted, the entries are typed up by me (Aniket), and I refer to myself in the first person for simplicity.
7/2/07 - Day 21
We are continuing to work on our algorithms to improve their speed and reliability. We have some time on the grid reserved for tomorrow morning as well as 2pm on Thursday and Friday.
We are currently focusing our efforts on finding the best curve fits for our data. I am measuring data in a triangle and fitting it to a plane, while Vamsi is measuring his data along a straight line and fitting it to a parabola. Both approaches have their strengths and weaknesses.
There are many useful tools in the numpy, scipy, and matplotlib packages. We can generate high-quality plots directly from Python, which is easing the data analysis process.
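As a rough illustration of the kind of fit Vamsi is doing (the numbers here are made up, and this is only a sketch, not our actual code), numpy can fit line-scan samples to a parabola in a couple of lines:

```python
import numpy as np

# Synthetic signal-strength samples taken along a straight line;
# the true peak is placed at distance 2.5 for this example.
distance = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
signal = -2.0 * (distance - 2.5) ** 2 + 40.0

# polyfit returns coefficients [a, b, c] for a*x^2 + b*x + c
a, b, c = np.polyfit(distance, signal, 2)

# The parabola's vertex, -b / (2a), estimates where the signal is strongest
peak = -b / (2 * a)
```

The vertex of the fitted parabola then serves as a one-dimensional estimate of where along the line the access point lies.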
7/3/07 - Day 22
We have begun working on a new grand solution to the access point locating challenge. We are working on an intelligent Python program which combines different types of regression analysis to locate the AP.
A previously written bit of code, bigtri.py, successfully located the access point to within a few meters. We are taking the idea from this code and working it into a larger project which measures the signal strength at various locations around the room and remembers them all. This new program will form multiple "guesses" about the location of the AP using a few different techniques and then try to narrow its guesses down to a single best guess.
Ivan has suggested three guessing possibilities:
- Have the robot make several triangles around the room and guess the location of the AP based on each one, then average the locations
- Sample some data points from each of the triangles visited and make a guess based on that
- Use every point observed and make a guess
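The first of these strategies is simple to sketch. In the hypothetical snippet below (the function name and sample coordinates are mine, not from our actual code), each triangle's guess is a point, and the combined estimate is just their mean:

```python
import numpy as np

def combine_guesses(guesses):
    """Average per-triangle AP location guesses into a single estimate.

    guesses: a list of (x, y) tuples, one guess per triangle.
    Returns the mean (x, y) position as a numpy array.
    """
    return np.asarray(guesses, dtype=float).mean(axis=0)

# e.g. three triangles each produced a slightly different guess
estimate = combine_guesses([(4.0, 2.0), (5.0, 3.0), (4.5, 2.5)])
```

Averaging should smooth out the noise in any single triangle's fit, which is the hope behind Ivan's first suggestion.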
We are using matplotlib to visualize the map the robot generates of the room and numpy to work with the data. So far we have written some fundamental functions to plot the robot's progress as it works, and the final product will let us see the robot's guesses on a map. We have also assembled code to fit the data to a 3D cone using scipy's leastsq() function.
Another fundamental difference between this project and previous algorithms is that we are making more careful decisions about data structures. I am trying to use numpy's arrays as much as possible. Since all of the collected data needs to be stored and recalled, it will be held in a single array. Functions are in place to add new data points to this array and to extract lists of X, Y, and S values from it.
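A minimal sketch of that layout (the function names here are illustrative; our real module differs in its details) keeps every measurement as a row [x, y, s] in one (N, 3) array:

```python
import numpy as np

# All measurements live in a single (N, 3) array of rows [x, y, s],
# where s is the signal strength observed at position (x, y).
data = np.empty((0, 3))

def add_point(data, x, y, s):
    """Append one measurement row and return the new array."""
    return np.vstack([data, [x, y, s]])

def columns(data):
    """Extract the X, Y, and S values as separate 1-D arrays."""
    return data[:, 0], data[:, 1], data[:, 2]

data = add_point(data, 1.0, 2.0, 35.0)
data = add_point(data, 3.0, 1.0, 28.0)
xs, ys, ss = columns(data)
```

Keeping everything in one array means any of the guessing strategies can slice out whatever subset of points it needs.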
7/5/07 - Day 23
We are underway coding our ultimate solution to the access point problem. The first thing we've begun to do is assemble different Python modules for different purposes, keeping the main code clean and readable. We have a module that handles the data, a module that handles plotting, and a module that takes care of statistical analysis.
7/6/07 - Day 24
The data we are gathering is very reluctant to fit the shapes we are trying to impose on it. The once-considered-reliable algorithm "bigtri.py" is producing erratic and inconsistent results. Using a least squares method to fit the data to a cone shape is frustratingly inaccurate.
I've written some code, "snake.py", which snakes along a hallway taking measurements. The computer is connected to a LinkSys router at the other end of the hall. I tried fitting the data to a cone with known slope and z-value at the apex. The leastsq procedure was trying to solve for the (x, y) location of the cone's apex (the access point), and the result was far from accurate. I plotted the gathered data against the expected cone in matplotlib, only to find that the data contained distinctly higher values than the cone I was hoping it would fit to.
The cone I was hoping to fit was the equation z = a * sqrt((x-x0)^2 + (y-y0)^2) + z0, where a is the cone slope and (x0, y0, z0) is the location of the apex. In my test run, with all of a, x0, y0, z0 known, all of the experimental data had much higher z values than the anticipated cone. To compensate, I raised the z0 value from 26.17 to 40, which looked like a much more appropriate fit. However, the least squares method still fails to locate the apex of this cone.
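For reference, the fit described above can be sketched as follows. This is a simplified, self-contained version with invented numbers (a synthetic noiseless grid rather than real hallway data); the slope a and apex height z0 are held fixed, and leastsq solves only for the apex location (x0, y0):

```python
import numpy as np
from scipy.optimize import leastsq

# Known cone parameters (values here are made up for illustration)
a, z0 = -2.0, 40.0
true_apex = (3.2, 4.1)

# Synthetic, noiseless measurements on a small grid of positions
xs, ys = np.meshgrid(np.arange(0.0, 8.0), np.arange(0.0, 8.0))
xs, ys = xs.ravel(), ys.ravel()
zs = a * np.sqrt((xs - true_apex[0]) ** 2 + (ys - true_apex[1]) ** 2) + z0

def residuals(p):
    """Difference between measured z and the cone with apex at p."""
    x0, y0 = p
    return zs - (a * np.sqrt((xs - x0) ** 2 + (ys - y0) ** 2) + z0)

(x0_fit, y0_fit), ier = leastsq(residuals, (0.0, 0.0))
```

On clean synthetic data like this the fit recovers the apex; the trouble we are seeing is that the real measurements do not actually follow a cone, so no choice of apex makes the residuals small.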
In order to determine the shape, we had one robot wander around the orbit room (we had some grid time today) taking measurements. The data is difficult to interpret, but I have begun some preliminary curve fitting on it. My hope is to crack the code and be able to reliably identify the location of the access point from a random sampling of the large data set. So far the prospects are dim - the data is very noisy.
7/9/07 - Day 25
Serious hardware complications are stifling our progress. Two of our three robots are out of commission, apparently due to a fault in the RCM unit. We have written to the Evolution team regarding the problem and hope to hear back soon.
I have begun porting our scanning_thread() Python script to C++. We have been able to compile a few C++ programs now, and I expect before long we will switch away from Python in favor of the more flexible and feature-rich C++ libraries.
7/10/07 - Day 26
The robots are still broken.
We are working on a small project that will go live tomorrow at 7pm, when we have some grid time scheduled. The plan is to run our scanning_thread() Python script on every node on the grid (except one, which will act as the access point; of course, many nodes are out of commission and unusable). The goal is to generate a map of the signal strength in the room and study it until we can find a shape that fits it well.
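Assembling that map could look something like the following sketch. The node coordinates and readings here are invented, and this is only a rough illustration of the bookkeeping, not our actual experiment code:

```python
import numpy as np

# A 2-D map of signal strength indexed by grid position; NaN marks
# nodes that are dead or unused, so they drop out of the analysis.
rows, cols = 20, 20
signal_map = np.full((rows, cols), np.nan)

# (row, col, strength) readings as each node's scan might report them
readings = [(0, 0, 31.0), (0, 1, 33.5), (5, 7, 42.0)]
for r, c, s in readings:
    signal_map[r, c] = s

# The strongest reading gives a first guess at which node is nearest
# the AP; nanargmax skips the NaN (missing) entries.
best = np.unravel_index(np.nanargmax(signal_map), signal_map.shape)
```

With the map in hand, we can plot it with matplotlib and eyeball what kind of surface the signal actually forms, rather than assuming a cone.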
We have collected data from the robot wandering around the orbit room. The data is incomprehensible, and no shape fits it very well. Hopefully our grid experiment will provide us with some better results.
7/13/07 - Day 29
We've been working on a couple of different projects for the past few days. The biggest is v3.py, which is going to lead the robot to the access point with high confidence. A second project is generating a map of signal strength vs. position using as many nodes on the grid as possible. The third thing I've been doing is playing around with Makefiles and g++, which are slowly becoming less mysterious.
One important observation I have made is that the ERSP sample code does not always compile with g++-4.1 (it fails with an "extra qualification" error). Fortunately, g++-3.3 is provided on the Debian Sarge CD and can exist alongside g++-4.1 without a forced downgrade. We need to change the CXX variable in each Makefile to g++-3.3, but once this is done the code compiles and runs.
Vamsi has built some Python scripts to build the map from the grid. We are using our basic scanning_thread() function …TBC