Monday, January 31, 2011

Wrap up from Friday Meeting and New Tasks

Current Status:

Completed items
  • Kill Switch
  • GPS read
  • Compass Read
  • Accelerometer Read
  • 1 axis gyro Read
  • Single Way Point (Click on the map and the robot goes to that point based on GPS and Compass)
In Progress
  • Wheel Encoders
  • Integrate Accelerometer and Gyro
  • Multiple Way points
  • Read Sonar Sensor
  • Read IR Sensor
  • Collision Avoidance
  • Color Cone Tracking
  • USB based horn / signal lights
Tasks to Do:
  • Allow a series of way points to be defined on the map and have the robot move to each one.
  • Create new way points based on collision detection.
  • Integrate wheel encoders into mapping.

Some preliminary things I would like to see are:
Sonar Sensor: Demonstrate that the robot can track an object (say a person walking in front of it) by chasing that person around.
Demonstrate the upgraded GPS with the current software
I would like to see a MODEL of the event loop that we will eventually use, i.e.:
  1. Read GPS
  2. Read Compass
  3. Find how far we traveled based on encoders
  4. Are we near a cone? If so go to 6
  5. Set new speed and heading for the platform and go back to 1
  6. Search for a cone.
  7. If we find a cone, move to within 5 feet of it.
  8. If we don't find a cone after 30 seconds go back to 1.

Think about the event loop in terms of the difference in timing between GPS updates, encoder updates, sonar updates, and compass updates. What are these data rates? How can we preserve as much information as possible?
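A minimal sketch of that loop in Python, with all the sensor reads and drive commands passed in as placeholder functions (read_gps, set_speed_heading, etc. are stand-ins I made up, not our real code):

```python
def model_event_loop(read_gps, read_compass, read_encoders,
                     near_cone, search_for_cone, set_speed_heading,
                     max_iterations=100):
    """Skeleton of steps 1-8 above. Sensor/drive functions are
    placeholders; max_iterations bounds the loop for testing."""
    for _ in range(max_iterations):
        pos = read_gps()                         # 1. read GPS
        heading = read_compass()                 # 2. read compass
        traveled = read_encoders()               # 3. distance from encoders
        if near_cone(pos):                       # 4. near a cone?
            if search_for_cone(timeout_s=30):    # 6. search (30 s limit)
                return "at_cone"                 # 7. move to within 5 ft
            continue                             # 8. not found: back to 1
        set_speed_heading(pos, heading, traveled)  # 5. new speed & heading
    return "running"
```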

Saturday, January 29, 2011

important gps code note

So I discovered the reason the GPS coordinates would often fail to update. In the code, I ask it to read the serial line, and if the line starts with $GPGGA, use that line's data. The issue is that the GPS returns multiple lines. So if the GPS is not sending that exact line at the moment the program reaches the if statement, the program moves on to the rest of the code without updating. The GPS coordinates only updated now and then, when the timing happened to be just right, so the time between acquisitions could be anywhere from the GPS unit's 1 Hz to NEVER.
What I've done is create another while loop inside the main one that focuses only on reading the serial line. Once the $GPGGA line is acquired, the code moves on. Now the time between acquisitions will be at MOST 1 second.
The old code looked something like this:
    line = ser.readline()
    line = line.split(",")
    if line[0] == '$GPGGA':
        if init == True:
            currenttime,Latitude,Longitude = parse_gps(line)
            coord_i = lat_long_into_xyz(Latitude,Longitude)
            print coord_i
            init = False
        currenttime,Latitude,Longitude = parse_gps(line)
        coord = lat_long_into_xyz(Latitude,Longitude)
        coord_x = coord[0] - coord_i[0]
        coord_y = coord[1] - coord_i[1]
    ~~~rest of code~~~

The WORKING code now looks like this:
while True:
    line = ser.readline()
    line = line.split(",")
    while line[0] != '$GPGGA': #Until it gets the $GPGGA line, stay here
        line = ser.readline()
        line = line.split(",")
    if init == True:
        currenttime,Latitude,Longitude = parse_gps(line)
        coord_i = lat_long_into_xyz(Latitude,Longitude)
        print coord_i
        init = False
    currenttime,Latitude,Longitude = parse_gps(line)
    coord = lat_long_into_xyz(Latitude,Longitude)
    coord_x = coord[0] - coord_i[0]
    coord_y = coord[1] - coord_i[1]

Thursday, January 27, 2011

Autonomous Navigation attempt

Over the weekend, I worked on some code to drive the platform using a clickable map, the digital compass, and the GPS. When you click different spots on the map, the platform kicks into gear and starts trying to get somewhere. However, it drove in completely the wrong direction and didn't go straight. Technically it was a failure, but it at least drove and tried to get somewhere.

Sonar Sensor

I put together the Sonar Sensor today.
Connect at 9600 8N1. It will output R### where the #'s are the distance in inches, at about 20 Hz. (Check calibration and timing on this.) Min range is about 6 inches; max range should be about 15 feet.

Play with this in Hyperterminal first.
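Once the EZ1 stream is coming in over the serial port, each line should parse with something like this sketch (the R### format and inches units are as described above; the function name is mine):

```python
def parse_sonar_line(raw):
    """Parse one EZ1 line of the form 'R###' into a range in inches.

    Returns None for malformed or partial lines so the caller can
    simply skip them.
    """
    raw = raw.strip()
    if not raw.startswith('R'):
        return None
    try:
        return int(raw[1:])
    except ValueError:
        return None
```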

Everything is soldered except the connection to the USB-to-TTL adapter. Be sure to connect the BLACK wire on the end of the connector to +5 on the CP2102 USB-to-TTL adapter. Hot glue the connector in place and add some hot glue to the wires on the EZ1 board. I did add a noise suppression cap to the inverter circuit, so the serial should be relatively noise free. (I.e. don't get rid of my connector!)

Andy/Brian (or both): work on some code to get the robot to travel forward until it is within 20 cm of an obstacle and then stop. Next would be to have the platform turn until it detects an obstacle, move toward the obstacle once detected, and then stop at 20 cm.
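A rough shape for that first behavior, assuming a read function that returns the sonar range in inches (all three callbacks here are placeholders for the real sonar and motor code, not anything we've written yet):

```python
def drive_until_obstacle(read_range_in, drive_forward, stop, stop_cm=20):
    """Drive forward until the sonar range drops to stop_cm, then stop.

    read_range_in returns the latest range in inches (the EZ1's units),
    or None for a bad reading; drive_forward and stop command the motors.
    """
    stop_in = stop_cm / 2.54  # convert the 20 cm threshold to inches
    while True:
        r = read_range_in()
        if r is not None and r <= stop_in:
            stop()
            return r
        drive_forward()
```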

More details on how it is put together at:

Thursday, January 20, 2011

Pictures of the new shell and magnet box

Here is the picture of the project box that is holding the magnet sensor. It is mounted on top of the new turtle shell which we mounted to the chassis using vex parts.

Here is the robot in its full protected glory. Now if it rolls the laptop will not be completely destroyed.

ONWARD to glory!!!!
IMU code is going well. Here is the video that shows how it works.

Wednesday, January 19, 2011

IMU in meters per second

Finished converting the IMU output data to meters per second. The noise in the signal dropped off drastically when I soldered the connecting wires directly to the sensor unit. My next task on the IMU is to convert the tilt-axis data into something relevant for our use, and to use that data to remove the effect of gravity's acceleration on our position estimate.
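For the gravity step, the basic idea is to project gravity through the tilt angle and subtract it from the measured acceleration. Here is a single-axis (pitch-only) sketch of that idea, not the full 3-axis solution and not our actual IMU code:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def remove_gravity(ax, az, pitch_rad):
    """Subtract gravity's projection from x and z accelerometer readings,
    assuming the only tilt is a pitch rotation of pitch_rad radians."""
    ax_lin = ax - G * math.sin(pitch_rad)  # gravity leaking into x when tilted
    az_lin = az - G * math.cos(pitch_rad)  # gravity along z when level
    return ax_lin, az_lin
```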

Weekly Update: Arduino and Shaft Encoders

  • Lesson 4
  • Lesson 5
  • Pair Shaft Encoder with Arduino
  • Basic Encoder Code
 Shaft Encoder and Code

int inputPin = 2;
int val;
int rotationState;
int inputSwitch = 0;        // spoke edges counted so far
float wheelRadius = 1.125;  // wheel radius in inches
int spokes = 44;            // spokes per wheel revolution
float distMet;              // distance traveled in meters
float rads;
int deg;
long numDeg;

void setup() {
  Serial.begin(9600);       // needed for the Serial prints below
  pinMode(inputPin, INPUT);
  rotationState = digitalRead(inputPin);
}

void loop() {
  val = digitalRead(inputPin);

  if (val != rotationState) {   // encoder state changed
    if (val == LOW) {           // count one spoke per falling edge
      inputSwitch++;
      deg = 360 / spokes;
      numDeg = (long)inputSwitch * deg;
      rads = radians(numDeg);
      Serial.print("Radians: ");
      Serial.println(rads);
      distMet = (wheelRadius * rads) / 39.3700787;  // inches -> meters
      Serial.print("Rotation Distance ");
      Serial.print(distMet);
      Serial.println(" Meters.");
    }
    rotationState = val;
  }
}

Saturday, January 15, 2011

New Week, new to do list

Hi Folks,

The progress reports look good. Keep the video coming. I am happy to see that the platform is driving around.

For this next week, I see our overall goal as being able to drive the platform around while logging the GPS values and having the platform update its position on the map in real time. Ideally we would see a video by the end of the week of the platform driving through the field with an overlay of the GPS track. It occurs to me that we will probably want to shift to Google Earth for the real-time mapping. This should make the real-time GPS tracking or replay a cinch. (I remember we did this on our near earth satellite project a few years ago.)

Things to do:
  • Secure the netbook to the platform so it will survive. (Maybe bend up some brackets out of vex parts. Feel free to drill holes in the top plastic plate to bolt the brackets on. Or zip ties?)
  • Get something working with the wheel encoders on the robot even if it is low res. Choose robustness over resolution. I am more concerned about hardware now than software. Feel free to take the platform home if you need to. (I think this is our biggest hardware problem now.)
  • Modify the sample arduino software to send out serial messages with the encoder values.
  • Do Calibration of your encoders with the real platform.
  • Work with Brian on getting the compass mounted on the platform. Andy needs it for his mapping work this week. The compass needs to be mounted at least 6 inches above plastic platform and on a NON metallic surface.
  • Take a look at the Maxbotics EZ1 and Sharp IR sensors. We will need to interface these to a different arduino next week for obstacle avoidance.
  • We need a map. I like your progress so far. Keep working on implementing the map code with the GPS and get the image to move around according to the GPS data.
  • Explore Google Earth as a platform for visualizing the GPS data. I don't think this can replace what you are doing, but it should allow for nice presentations.
  • Work with Mitch to agree on a messaging protocol from the encoders to the map software. I would suggest that the encoders send out a serial stream at 57600 8N1 which reports the current encoder value of a wheel on each side in a comma-delimited format terminated with a carriage return. Maybe a data rate of 10 Hz is a good goal since it matches the GPS.
  • We want to get autonomous navigation started. Using your map, allow the user to specify a target point on the map (perhaps by clicking or entering GPS values.) Calculate how far it is from the current position to the target position and how much the bot needs to turn. Generate a set of turn and speed commands to get the robot pointing at the target. Do this closed loop. (IE turn a little, measure from the compass, decide how to turn, repeat)
  • Test your autonomous navigation on the real platform over short distances using only the compass to make sure you can get the heading to work.
  • Start thinking about obstacle avoidance and how that will figure in to the map.
  • Tackle the WAAS and other issues to improve our GPS signal. I dug out an improved GPS, but had to build a USB-to-TTL interface and test it. Ok, that is done and I will have it for you on Tuesday. Look at for details.
  • Run the same tests on the new GPS that you ran on the previous and make any necessary changes to use the new output stream.
  • Read the Venus documentation and the comments on the sparkfun site. Currently it is setup with WAAS enabled, pedestrian mode, 10Hz, 57600 8N1. There may be some other settings to play with.
  • Get the compass on the platform; Andy needs it for his mapping work this week (mounting requirements as above). Talk to John about how he communicated with his compass in I2C mode and whether that is worth doing. (He has the same one we do.)
  • After this stuff is done, focus on the IMU. Make a start at integrating the accelerometers for position.
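If we go with the comma-delimited, carriage-return-terminated encoder protocol suggested above, the map-side parser could be as simple as this sketch (the function name and the (left, right) ordering are assumptions):

```python
def parse_encoder_message(raw):
    """Parse one '<left>,<right>\r' encoder message into an (int, int)
    pair of tick counts, or None if the line is malformed."""
    parts = raw.strip().split(',')
    if len(parts) != 2:
        return None
    try:
        return int(parts[0]), int(parts[1])
    except ValueError:
        return None
```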
I searched through my parts bin and couldn't find a sonar sensor, so I have ordered a Maxbotics EZ1 (these are great). This will give us front obstacle avoidance with a range of about 5 meters. I also have a pair of SHARP 2YOA02 proximity sensors (very narrow beam out to 1.5 meters). I would like this to be the last week we are remote-controlling the platform and to start moving to autonomous navigation.
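The closed-loop pointing idea from the to-do list (turn a little, measure from the compass, decide, repeat) can be sketched like this; the local flat x/y frame and the callback names are my assumptions, not our actual code:

```python
import math

def bearing_to_target(cur_x, cur_y, tgt_x, tgt_y):
    """Compass-style bearing in degrees (0 = +y 'north', clockwise)
    from the current position to the target in a local flat frame."""
    return math.degrees(math.atan2(tgt_x - cur_x, tgt_y - cur_y)) % 360

def turn_error(heading_deg, bearing_deg):
    """Signed shortest-path turn in degrees; positive means turn right."""
    return (bearing_deg - heading_deg + 180) % 360 - 180

def point_at_target(read_compass, turn_step, cur_pos, tgt_pos, tol_deg=5.0):
    """Turn in small steps until the compass heading is within tol_deg
    of the bearing to the target (closed loop: re-measure every step)."""
    bearing = bearing_to_target(cur_pos[0], cur_pos[1], tgt_pos[0], tgt_pos[1])
    while True:
        err = turn_error(read_compass(), bearing)
        if abs(err) <= tol_deg:
            return
        turn_step(err)  # positive err = turn right a bit
```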

Friday, January 14, 2011

Video Update: Driver Control

Weekly Progress update

So this first week I got several important things taken care of. The first thing I did was to solder and integrate the killswitch into the rest of the power circuitry. I also added an LED to indicate whether or not it was engaged.

The next thing I did was to integrate the joystick program I had made with the servo controls to get the joystick driving the platform remotely.
Right away we realized that something was wrong with the steering, as it pulled hard left, and no amount of programming seemed to fix it entirely. This issue plagued us for several days. After playing with the mechanical side of the steering mechanism, I discovered that how much the suspension is compressed changes how much the platform veers left. So I added some zip ties to compress the springs more, and now it drives nearly perfectly straight.

We started testing the driving again, this time at full power, but after about 5 minutes, the robot stopped working. The killswitch mosfet had overheated, detached from the heatsink and died. I needed to replace it and find a way to cool it better. Today, I replaced it, and attached it to a larger heatsink with a built in fan (It's actually a compact CPU cooler :P). After that, I tested it out by driving it around on the grassy hill next to bldg 60 at full power. I found that at full speed, tipping is a serious concern, as it flipped over twice. Next week I plan on modifying the driving code so that the steering sensitivity varies inversely with the speed to help reduce this risk.

The last thing I did was get a program running with pygame so an image of a robot could be moved around on a satellite-image background with the joystick. I also started implementing this code with the GPS and got the image to move around according to the GPS data, but not accurately.

Thursday, January 13, 2011

GPS visualizations

Using the Pharos GPS-500 we can get a latitude and longitude coordinate pair. If this pair is then converted to a coordinate system in python, you can observe the visual drift of the signal. This first image is a screen shot of the GPS signal drift while the connected laptop is stationary.

The scale on the image can be compared to the theoretical width of the ball, which is 2 meters. The average drift during the test was 3.3 meters, and the maximum was 7.25 meters. This is pretty accurate, but when the laptop moves it's a completely different story.
The following image represents a U-shaped walking path. It very nearly resembles the actual path I took, but it has jagged sections where it deviates from the real path.

The final image is a graphical representation of the drift in the x and y coordinates while the laptop is stationary. As you can see, there is considerable drift in both coordinate planes, and this poses a problem for precise robot navigation. It is my intention to look into the WAAS system to augment this signal.

In the above image the red line represents X axis drift, and the green line represents y axis drift.

Wednesday, January 12, 2011

Chassis Modifications

  • Adjusted the steering rods to compensate for drift
  • determined the maximum and minimum limits for the servo and set the corresponding numerical value on the controller
  • Found that drift is dependent on weight in chassis
  • springs need to be stronger or the suspension needs to be more stiff
  • finished soldering together the kill switch

    Parts list

    need hot glue
    charger for the Li-Po batteries
    pipe clamps to adjust spring tension

IMU experiment 1/12/2011

After much head scratching, the IMU began to spew forth loads of relevant data!
When the sensor is shaken in only the x direction, you can observe the waveform in the red data stream; when it is shaken in only the y direction, you can observe the waveform in the green data stream; and when it is shaken in the z direction, there is a waveform in the blue data stream.

First issue:
The baudrate for communication between the arduino and python was set at 115200 bits per second, and this caused an accumulation of lag in the data.

Resetting the baud rate to 4800 eliminated the lag.

Second issue:
The arduino code correlated pins to the accelerometer data incorrectly. It was searching for data on pins 1-5 while the data was being fed out on pins 0-4.

Reassign the pin numbers to the correct variables on the arduino.

Third issue:
The signal is choppy if the wires aren't connected to the board in a specific way. I think this is due to a poor quality wire connector.

Proposed solution:
Solder the wires directly to the sensor.

Tuesday, January 11, 2011

Pre Class Progress

  • Uninstalled Old Version Software
  • Installed
    • Python 2.6.6
    • Pyserial
    • Pygame
    • Arduino Software
    • Dropbox
    • On Screen Keyboard
      • Because the keyboard seems to be stuck in function mode and I have yet to figure out why. Therefore, certain keys are currently stuck on special characters.
  • Created Email Account (For Dropbox Account)
  • Arduino
    • Lesson 0
    • Lesson 1
    • Lesson 2
    • Lesson 3
    • Lesson 4
    • Lesson 5
    • Get Shaft Encoder Working
  • Ad Hoc Process
In Windows Vista or greater, navigate to Start >> Network >> Network and Sharing Center >> Manage Wireless Networks >> Add
  • Select "Create an Ad Hoc Network"
  • Network Name: "Robo"
  • Security: WPA2-Personal
  • Key: "***********"
Now that the Ad Hoc network has been created, all computers should select the new network from their list of available wireless networks and connect.

Next, to connect to the computer and initiate a remote desktop session, Navigate to Start >> Network >> Single Click "EEE" >> Right Click "EEE" >> Select "Connect with Remote Desktop Connection"

Monday, January 10, 2011

January 10th 2011

Summary of activities:

GPS usage was discussed.
  • Andy attached his laptop to the platform and sent drive and turn signals.
  • It was discovered that vex motors can be controlled by the servo control board.
  • A Vex battery charger was disassembled to understand the complexity of the electronics necessary to charge a Ni-Cad battery.
  • The batteries used for the Rock crawler were identified to be Li-Po type (Lithium-ion Polymer)
  • A project on the make website was found which details the construction of a Li-Po charge kit for $10
  • Andy disassembled the Vexplorer wireless camera to see if the data feed could be used to inform the Magellan platform.
  • The compass GUI program was hacked to the fewest lines of code necessary to feed the data into a larger program.
  • Identified goals for presentation of the bot at the first 99 meeting

Tuesday, January 4, 2011

GPS data

The highlighted data stream has the Prefix of $GPGGA.
This prefix denotes that the stream in question is a fix.
Experimenting with the Pharos GPS-500 caused some problems with my OS. There were some permissions problems... I will not go into the details.
Anyway, the data is sometimes junk and comes out looking like it's in Russian, but sometimes it's useful data.
The first number, 195827.782, is a time stamp for the data feed. The time is formatted to Greenwich Mean Time or UTC.
The second number, 3403.0995, is the latitude: 34 represents degrees, and 03.0995 represents minutes.
The following letter, N, represents that the latitude is north of the equator.
The third number, 11751.1995, is the longitude, in the same format as the latitude.
The following letter, W, shows that the measurement is west of the prime meridian.
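Those ddmm.mmmm fields convert to decimal degrees by splitting off the minutes and dividing by 60. A sketch (the function name is mine, not from our code):

```python
def nmea_to_decimal(value, hemi):
    """Convert an NMEA latitude/longitude field (ddmm.mmmm for latitude,
    dddmm.mmmm for longitude) plus its hemisphere letter into signed
    decimal degrees."""
    dot = value.index('.')
    degrees = int(value[:dot - 2])     # everything before the minutes
    minutes = float(value[dot - 2:])   # mm.mmmm
    dec = degrees + minutes / 60.0
    if hemi in ('S', 'W'):             # south/west are negative
        dec = -dec
    return dec
```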

The data is better if the GPS is run outside, and I think the next step is to get a map in place and follow the data from there.