Thursday, June 9, 2011

Society of Physics Students presentation

We attended the Society of Physics Students (SPS) regional conference for zone 18 and brought Ferdinand with us. We presented a poster along with some other Mt. SAC students. Here is the .jpg of our poster.
The conference was fun. The CEO of the SETI Institute gave a talk about their new radio telescope array and detailed their summer internship program. Other meetings discussed methods for maintaining a healthy student club, and there was a physics magic show.

Wednesday, June 8, 2011

CORE comes back with Silver

After a grueling competition, our Robomagellan platform took second place. The motor controller burnt out on our second-to-last run, and we were unable to identify the problem before the competition finished. The gold medal went to UCLA's GLaDOS project, which has taken the gold at Robogames for the past three years. It is unfortunate to lose, but to lose to a worthy opponent brings knowledge instead of shame.

Above: from left to right: Brian Walker, Andy Gabriel, Gabriel Chou.

Here is the platform in its moment of glory. It strikes me that most robots have names, and calling ours "the platform" repeatedly is unpalatable. I suggest that Mt. SAC's entry be referred to as Ferdinand, in honor of the explorer for whom the contest is named.

Next year we will be coming back with more robust hardware, and more advanced navigation software, so GLaDOS might have its work cut out for it.

Monday, June 6, 2011

Video from Robogames 2011

Here is a video clip of the platform in action at Robogames 2011.



The system mostly relied on dead reckoning, taking compass readings from point to point and using odometry to determine the distance traveled. This system generated consistent behavior, which we then tuned over successive trials. As a method for navigating a field, it has its limitations.

In order to improve on this system, a more advanced method for keeping track of position needs to be implemented. Our early attempts at tracking the robot graphically might work if the visual elements were removed and the data were handled without rendering anything. The graphical interface was excellent for debugging purposes, but the lag that accumulates in the robot's position may be due to the added processing load.
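For reference, the point-to-point scheme boils down to a position update like the sketch below; the function name and frame conventions are illustrative, not our actual code.

import math

x, y = 0.0, 0.0  # position estimate in meters, relative to the start

def dead_reckon_step(distance_m, heading_deg):
    # Advance the position estimate by one leg: a compass heading
    # (0 = north, clockwise positive) and an odometry distance.
    global x, y
    heading = math.radians(heading_deg)
    x += distance_m * math.sin(heading)  # east component
    y += distance_m * math.cos(heading)  # north component
    return x, y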

Friday, February 25, 2011

Cone detection

Crashing into a cone:




This video demonstrates the robot tracking an orange cone using a video camera. This is accomplished by using RoboRealm to process the video and having RoboRealm communicate with Python through an API server. The robot will attempt to crash its right wheel into the cone and stop once it arrives. The code is in its infancy, but it will consistently hit the cone, even if it doesn't yet turn the motors off afterward.
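In outline, the chase logic looks something like the sketch below. Here get_cone_x() stands in for the RoboRealm API call that reports the cone's x coordinate in the frame, set_motors() stands in for our servo controller interface, and the pixel constants are assumptions to be tuned.

FRAME_WIDTH = 320   # camera frame width in pixels (assumed)
RIGHT_OFFSET = 80   # aim right of center so the right wheel strikes the cone (assumed)
DEADBAND = 10       # pixel error we treat as lined up

def chase_cone(get_cone_x, set_motors):
    # Steer until the cone sits at the aim point, then drive into it.
    target = FRAME_WIDTH / 2 + RIGHT_OFFSET
    while True:
        cone_x = get_cone_x()    # None when no cone is in view
        if cone_x is None:
            set_motors(0, 0)     # lost the cone: stop for now
        elif cone_x > target + DEADBAND:
            set_motors(60, 30)   # cone is right of the aim point: turn right
        elif cone_x < target - DEADBAND:
            set_motors(30, 60)   # cone is left of the aim point: turn left
        else:
            set_motors(50, 50)   # lined up: drive straight at it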

-B

Thursday, February 24, 2011

Success!!!

Today we finished getting the new suspension attached to the chassis and got the steering aligned. It goes perfectly straight for about 20 meters and then starts to drift a little, but short of a full steering system overhaul, there's not much we can do about it. Tweaking the alignment simply involves adjusting the length of the tie rods from the servo to the steering column. I found that one full turn of the adjustment results in roughly 1 degree of change in the wheels' alignment.
The newly installed suspension works great! The old shocks are still going to need pre-compression, however. We can continue to do it with zip-ties or buy some real clips.

On a different note, I rewrote the GPS program to do the mapping with just the encoders and compass. As long as the encoders work as Mitch intended, it should be a very accurate way of keeping track of position. The robot_pos coordinates are no longer in pixels but in meters. This should make things easier in the future. Also, the robot coordinates don't rely on any image updating or processing. The program starts by waiting for a mouse click on the map and then sets the initial robot position to that spot (this will be replaced by GPS coordinates later). It then requests an input distance and heading, and generates a destination coordinate based on these inputs. It should be a cinch to add motor controls at this point.
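The flow just described looks roughly like the sketch below; the names are illustrative, and the pixel-to-meter conversion for the click is glossed over.

import math
import pygame

def wait_for_click():
    # Block until the user clicks the map; return the click position.
    while True:
        event = pygame.event.wait()
        if event.type == pygame.MOUSEBUTTONDOWN:
            return event.pos

robot_pos = wait_for_click()  # later: seed this from GPS coordinates instead
distance = float(raw_input("distance (m): "))
heading = math.radians(float(raw_input("heading (deg): ")))
destination = (robot_pos[0] + distance * math.sin(heading),
               robot_pos[1] + distance * math.cos(heading))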

Tuesday, February 22, 2011

Battery life: stress test


The above graph represents the voltage of the motor battery vs. time.
Conditions:
chassis suspended so the wheels do not touch the ground
motor power set to 150 units greater than stop condition
allowed to run without external torque on wheels

A natural exponential function fits the data closely and establishes a horizontal asymptote at 11.26 volts.
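The fit is of the form V(t) = Vinf + A*e^(-t/tau). Here is a sketch of how to extract the asymptote with SciPy; the data points below are placeholders for illustration, not our measurements.

import numpy as np
from scipy.optimize import curve_fit

def battery_model(t, v_inf, a, tau):
    # Decaying exponential with horizontal asymptote v_inf.
    return v_inf + a * np.exp(-t / tau)

# Placeholder data: minutes elapsed and measured pack voltage.
t = np.array([0.0, 5.0, 10.0, 15.0, 20.0, 25.0])
v = np.array([12.60, 12.07, 11.75, 11.56, 11.44, 11.37])

params, cov = curve_fit(battery_model, t, v, p0=(11.0, 1.5, 10.0))
print "asymptote: %.2f volts" % params[0]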

It is reasonable to assume that the battery should never be allowed to approach that voltage, as it may refuse to take a charge afterward.

20 minutes of navigation seems to be the limit for safe operation.

-B

Project Meeting - Feb 17th

On Thursday the 17th of February there was a project meeting to discuss remaining tasks.

The result is the following task list.

-Test the run time for a fully charged battery
-Develop graphs and charts to explain the scientific content of the platform
-Test the efficiency of numerous way points vs. sparse way points
-Test the accuracy and efficiency of traveling via a heading and distance scheme vs. a GPS and pixel scale scheme
-Check the behavior of the platform with constant compass update vs. periodic compass update.

Each of these tasks can be approached scientifically, so the next few weeks will involve gathering this data.

-Brian

Wednesday, February 9, 2011

The lag problem appears to be solved on my laptop.

I inserted one line in the call_gps() function, and it seems to have solved the problem.

The line was
gps.flushOutput()
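For context, here is roughly how the function might look with that line in place; everything except the flushOutput() call is an assumption, since the full function isn't reproduced in this post (gps is the open pyserial port).

def call_gps():
    # Read one NMEA sentence from the receiver (assumed body).
    line = gps.readline()
    # The added line: clear pyserial's output buffer so stale
    # bytes don't pile up between calls.
    gps.flushOutput()
    return line.split(",")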
The video shows how the GPS is performing in the space.

Monday, February 7, 2011

Front Encoders Mounted and Tested

 Encoders
Today I completed the fourth encoder and mounted encoders 2-4 on the front wheels. Next I created a power rail for the encoders and connected them to the Arduino. After testing each encoder to ensure it was working properly, I tested the accuracy of the encoders under motion. I noticed some variation between encoders, but even after 20+ meters the difference was minimal, unless one of the encoders stopped on the borderline of a segment; then it would trigger on/off repeatedly until moved again. I believe this error is due to the relative size of the segments in comparison to the actual encoders, and I feel that a form of averaging could be utilized to compensate for the error programmatically.
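As a sketch of that averaging idea (it would ultimately live on the Arduino, but Python is used here for illustration), one option is to accept a transition only after the reading has held steady for a short interval; the 20 ms window is an assumption to be tuned.

import time

DEBOUNCE_S = 0.02  # minimum time a reading must hold before we trust it (assumed)

class DebouncedEncoder(object):
    # Counts falling edges only after the input has been stable for DEBOUNCE_S.
    def __init__(self):
        self.stable_state = None
        self.candidate = None
        self.candidate_since = 0.0
        self.count = 0

    def update(self, raw_reading):
        now = time.time()
        if raw_reading != self.candidate:
            # New candidate value: start timing it.
            self.candidate = raw_reading
            self.candidate_since = now
        elif raw_reading != self.stable_state and now - self.candidate_since >= DEBOUNCE_S:
            # Held long enough: accept it, counting only falling edges.
            if raw_reading == 0 and self.stable_state is not None:
                self.count += 1
            self.stable_state = raw_reading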

TO DO
Now that these encoders are done, I will focus on establishing more robust code to handle both front and rear encoders. While I am waiting for the new encoders to arrive, I will split my focus between refactoring my code and setting up the Sharp Long Range IR Sensors.






Video

Up periscope

Here is the new look for the robomagellan platform:

The shell has had a couple of modifications. First, the magnet sensor has been moved up several inches. Andy claims that the proximity to the laptop caused magnetic interference, so he attached a board to the sensor and gave the platform a central periscope. I felt that the sensor was too exposed, and returned it to its project box, but utilized the wooden board to raise the whole box up into the air. The kill switch has been giving us trouble... read Andy's previous post for more info on that. So I thought the best solution was to extend the receiving antenna with some extra wire. I wrapped a double wire around the exterior of a carbon composite tube to create a helical antenna.

The range on the kill switch appears to be greater, but we will have to test it in the field to be sure.

Saturday, February 5, 2011

We need a FUNCTIONAL killswitch...

The killswitch is still ineffective. It will not pass inspection at the competition. I have no idea why on earth it still refuses to turn off. I can follow 5 feet behind it, press "OFF" or hold it in all kinds of ways until my thumb falls off, and it still does not die, even with this new switch with an antenna. This failure to stop resulted in irreparable damage to the ultrasonic range finder. (cont....)

Google Code Repository

There is a new Google Code repository at

Brian is taking the lead on organizing this. You should all get invites; please organize the Arduino code and Python code into separate areas.

Thursday, February 3, 2011

Digital range finder code

Here is some sample code for interacting with the digital range finder.



CODE BEGINS:
----------------------------------------------------------------------

import serial
import pygame
from pygame.locals import *
import time
import math
from visual import *
import threading

init = 1

try:
    # Port 7 is COM8 on Windows; opening the port happens in the constructor.
    sonic = serial.Serial(7, baudrate=9600, bytesize=8, parity='N',
                          stopbits=1, timeout=None, xonxoff=0, rtscts=0)
except serial.SerialException:
    print "Sonic range finder not connected or bad COM port number"
    quit()

def parsesonic():
    # Skip to the carriage return that ends the previous report,
    # drop the leading 'R', then read the three distance digits.
    hold = 'a'
    while hold != '\r':
        hold = sonic.read()
    sonic.read()
    distance = sonic.read(3)
    return int(distance)

class object_avoidance_thread(threading.Thread):
    def run(self):
        while True:
            if parsesonic() < 24:
                print "woah man you're too close\n"

while 1 == 1:
    if init == 1:
        object_avoidance_thread().start()
        init = 0
-----------------------------------------------------------------
CODE ENDS:


The above code causes the robot to output a string whenever an object is closer than 24 inches to it.

Current Tasks

  • Verify Functioning Encoder On Robot
  • Modify Program to Handle Four Encoders
  • Encoders 2 - 4
  • Connect to Arduino chip
  • Refactor Encoder Code
  • Incorporate Sharp Long Range IR Sensors

Encoder Success

Mounted Encoder
This video showcases a mounted and functioning wheel encoder and verifies that the programming is recording one full rotation correctly at 6.28 radians per rotation.

Remote Kill Switch

The saga of the remote kill switch really should be documented.

I am bringing in another remote kill switch with a 15 A relay inside.

(www.datasheetcatalog.org/datasheets2/10/104158_1.pdf)

I used spade connectors. Please use these when joining wires as it will save us lots of grief down the road. Place a bit of solder on the connection and then wrap them in electrical tape.

Let's get the remote kill switch back in action so we can start driving again!

Tuesday, February 1, 2011

Video Update: Autonomous Control

Encoder Update

Encoder Snag

I hit a minor snag in designing the encoder chip. I completed the circuit on Sunday only to find out on Monday that one out of four connectors functioned. It turns out I was reading the 5-band mini resistors incorrectly: instead of being 11 kΩ resistors they were 1 kΩ, and I blew two detectors. I learned my lesson about making sure to have my DMM with me. Fortunately, I had ordered some detectors from SparkFun a week ago, so I am moving forward with reconstruction of the sensors. This time I am focusing my efforts on fusing the chip to each encoder instead of on a single board to minimize confusion. In addition, I modified the code to allow for 4 sensors. Currently, I need to focus on learning how to write functions in the Arduino language because much of the current code is heavily redundant. I have read the data sheet on the Sharp Long Range IR sensors and expect that construction / implementation should be relatively easy after the encoders are done.

Quick Cone Detector

Here is a quick demo of a cone detector. It returns the x,y coordinates of the center of the cone and the vector from the center of the frame (presumably where the robot is pointing) to the cone.

I will post a Python snippet when we get to it, but this should give you guys an idea of the range of the cone detection (10 feet or so).

This reminds me that we need to get some cones. Please contact maintenance and public safety to see if we can borrow a couple of standard road cones. If not, I will buy some, but they are not cheap.


Computer vision requires massive CPU utilization so the cone detector will only run at 10 Hz or so.

Monday, January 31, 2011

Wrap up from Friday Meeting and New Tasks

Current Status:

Completed items
  • Kill Switch
  • GPS read
  • Compass Read
  • Accelerometer Read
  • 1 axis gyro Read
  • Single Way Point (Click on the map and the robot goes to that point based on GPS and Compass)
In Progress
  • Wheel Encoders
  • Integrate Accelerometer and Gyro
  • Multiple Way points
  • Read Sonar Sensor
  • Read IR Sensor
  • Collision Avoidance
  • Color Cone Tracking
  • USB based horn / signal lights
Tasks to Do:
  • Allow a series of way points to be defined on the map and have the robot move to each one.
  • Create new way points based on collision detection.
  • Integrate wheel encoders into mapping.

Some preliminary things I would like to see are:
Sonar Sensor: Demonstrate that the robot can track an object (say a person walking in front of it) by chasing that person around.
Demonstrate the upgraded GPS with the current software
I would like to see a MODEL of the event loop that we will eventually use, IE:
  1. Read GPS
  2. Read Compass
  3. Find how far we traveled based on encoders
  4. Are we near a cone? If so go to 6
  5. Set new speed and heading for the platform and go back to 1
  6. Search for a cone.
  7. If we find a cone, move to within 5 feet of it.
  8. If we don't find a cone after 30 seconds go back to 1.

Think about the event loop in terms of the difference in timing between GPS updates, encoder updates, sonar updates, and compass updates. What are these data rates? How can we preserve as much information as possible?
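A skeleton of that loop as code, with the sensor and drive modules as stand-ins for our actual GPS/compass/encoder code and motor controller; nothing here is final, it is just the model above made concrete.

import time

CONE_TIMEOUT_S = 30  # per step 8: give up on a cone after 30 seconds

def event_loop(sensors, drive):
    while True:
        position = sensors.read_gps()                 # 1. read GPS
        heading = sensors.read_compass()              # 2. read compass
        traveled = sensors.read_encoders()            # 3. distance from encoders
        if not sensors.near_cone(position):           # 4. near a cone?
            drive.set_speed_and_heading(position)     # 5. no: keep navigating
            continue
        start = time.time()
        while time.time() - start < CONE_TIMEOUT_S:   # 8. give up after 30 s
            cone = sensors.find_cone()                # 6. search for a cone
            if cone is not None:
                drive.approach(cone, stop_at_feet=5)  # 7. move to within 5 feet
                break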

Saturday, January 29, 2011

important gps code note

So I discovered the reason the GPS coordinates would often fail to update. In the code, I ask it to read the serial line and, if it starts with $GPGGA, use that line and its data. The issue is, the GPS returns multiple lines. So if the GPS is not sending that exact line RIGHT when the program reaches the if statement, the program moves on to the rest of the code without updating. As a result, the GPS coords only updated every now and then, when the timing happened to be just right. This time between acquisitions could be anywhere between the 1 Hz of the GPS unit and NEVER.
What I've done is create another while loop inside the main one that only focuses on reading the serial line. Once the $GPGGA line is acquired, it allows the code to move on. Now the time between acquisitions will be at MOST 1 second.
The old code looked something like this:
while(True):
    line = ser.readline()
    line = line.split(",")
    if line[0] =='$GPGGA':
       if init == True:
            currenttime,Latitude,Longitude = parse_gps(line)
            coord_i = lat_long_into_xyz(Latitude,Longitude) 
            print coord_i
            init = False
       else:
            currenttime,Latitude,Longitude = parse_gps(line)
            coord = lat_long_into_xyz(Latitude,Longitude)
            coord_x = coord[0] - coord_i[0]
            coord_y = coord[1] - coord_i[1]
   ~~~rest of code~~~


The WORKING code now looks like this:
while (True):
    line = ser.readline()
    line = line.split(",")
    while(line[0] != '$GPGGA'): #Until it gets the $GPGGA line, stay here
        line = ser.readline()
        line = line.split(",")
    if init == True:
        currenttime,Latitude,Longitude = parse_gps(line)
        coord_i = lat_long_into_xyz(Latitude,Longitude) 
        print coord_i
        init = False
    else:
        currenttime,Latitude,Longitude = parse_gps(line)
    coord = lat_long_into_xyz(Latitude,Longitude)
    coord_x = coord[0] - coord_i[0]
    coord_y = coord[1] - coord_i[1]

Thursday, January 27, 2011

Autonomous Navigation attempt


Over the weekend, I worked on some code to drive the platform using a clickable map, the digital compass, and the GPS. When you click different spots on the map, the platform kicks into gear and starts trying to get somewhere. However, it drove in completely the wrong direction and didn't go straight. Technically it was a failure, but it at least drove and tried to get somewhere.

Sonar Sensor

I put together the Sonar Sensor today.
Connect at 9600 8N1. It will output R### where the #s are the distance in inches, at about 20 Hz. (Check calibration and timing on this.) Min range is about 6 inches; max range should be out to about 15 feet.

Play with this in Hyperterminal first.

Everything is soldered except the connector to the USB-to-TTL. Be sure to connect the BLACK wire on the end of the connector to +5 on the CP2102 USB-to-TTL converter. Hot glue the connector in place and add some hot glue on the wires on the EZ1 board. I did add a noise suppression cap to the inverter circuit, so the serial should be relatively noise free. (IE don't get rid of my connector!)

Andy/Brian (or both): work on some code to get the robot to travel forward until within 20 cm of an obstacle and then stop. Next would be to have the platform turn until it detects an obstacle, move toward the obstacle once detected, and then stop when at 20 cm.
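A sketch of the first exercise, with the R### parsing handled the same way as parsesonic() in the range finder post above; the port number and set_motors() interface are placeholders.

import serial

STOP_CM = 20.0
CM_PER_INCH = 2.54

sonar = serial.Serial(7, baudrate=9600, timeout=1)  # port number is a placeholder

def read_inches():
    # Read one R### report from the EZ1: sync on the 'R', then read 3 digits.
    while sonar.read() != 'R':
        pass
    return int(sonar.read(3))

def forward_until_obstacle(set_motors):
    # Drive forward until an obstacle is within STOP_CM, then stop.
    set_motors(50, 50)
    while read_inches() * CM_PER_INCH > STOP_CM:
        pass
    set_motors(0, 0)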

More details on how it is put together at: http://profmason.com/?p=1472

Thursday, January 20, 2011

Pictures of the new shell and magnet box

Here is the picture of the project box that is holding the magnet sensor. It is mounted on top of the new turtle shell which we mounted to the chassis using vex parts.


Here is the robot in its full protected glory. Now if it rolls the laptop will not be completely destroyed.


ONWARD to glory!!!!
IMU code is going well. Here is the video that shows how it works.

Wednesday, January 19, 2011

IMU in meters per second

Finished converting the IMU output data to meters per second. The noise in the signal dropped off drastically when I soldered the connecting wires directly to the sensor unit. My next task on the IMU is to convert the tilt axis data into something relevant for our use, and to use that data to cancel out gravity's effect on our position estimate.
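The gravity compensation amounts to estimating tilt from the accelerometer and subtracting the gravity vector; roughly the math below, where the axis conventions are assumptions and this only holds when the platform isn't accelerating hard.

import math

G = 9.81  # m/s^2

def remove_gravity(ax, ay, az):
    # Estimate pitch and roll from the gravity direction, then subtract
    # gravity's contribution on each axis (accelerations in m/s^2).
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    roll = math.atan2(ay, az)
    gx = -G * math.sin(pitch)
    gy = G * math.sin(roll) * math.cos(pitch)
    gz = G * math.cos(roll) * math.cos(pitch)
    return ax - gx, ay - gy, az - gz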

Weekly Update: Arduino and Shaft Encoders

Arduino
  • Lesson 4
  • Lesson 5
  • Pair Shaft Encoder with Arduino
  • Basic Encoder Code
 Shaft Encoder and Code

int inputPin = 2;           // encoder photodetector input pin
int val;                    // current reading of the input pin
int rotationState;          // previous reading, used to detect transitions
int inputSwitch = 0;        // number of spoke transitions counted so far
float wheelRadius = 1.125;  // wheel radius in inches
int spokes = 44;            // encoder segments per revolution
float distMet;              // distance traveled, in meters
float rads;                 // angle traveled, in radians
float degPerSpoke;          // degrees per encoder segment
float numDeg;               // total degrees traveled

void setup() {
  pinMode(inputPin, INPUT);

  Serial.begin(9600);
  rotationState = digitalRead(inputPin);  // seed the transition detector
}


void loop(){
  val = digitalRead(inputPin);

  if (val != rotationState) {        // the pin changed state
    if (val == LOW) {                // count only falling edges
      inputSwitch++;
      degPerSpoke = 360.0 / spokes;  // float math: 360 / 44 in integers would truncate
      numDeg = inputSwitch * degPerSpoke;
      rads = radians(numDeg);
      Serial.print("Radians: ");
      Serial.println(rads);
      distMet = (wheelRadius * rads) / 39.3700787;  // inches to meters
      Serial.print("Rotation Distance ");
      Serial.print(distMet);
      Serial.println(" Meters.");
    }
  }
  rotationState = val;               // remember the state for the next pass
}

Saturday, January 15, 2011

New Week, new to do list

Hi Folks,

The progress reports look good. Keep the video coming. I am happy to see that the platform is driving around.

For this next week, I see our overall goal as being able to drive the platform around while logging the GPS values and having the platform update its position on the map in real time. Ideally we would see a video by the end of the week of the platform driving through the field with an overlay of the GPS track. It occurs to me that we will probably want to shift to Google Earth for the real time mapping. This should make the real time GPS tracking or replay a cinch. (I remember we did this on our near earth satellite project a few years ago.)

Things to do:
  • Secure the netbook to the platform so it will survive. (Maybe bend up some brackets out of vex parts. Feel free to drill holes in the top plastic plate to bolt the brackets on. Or zip ties?)
Mitch:
  • Get something working with the wheel encoders on the robot even if it is low res. Choose robustness over resolution. I am more concerned about hardware now than software. Feel free to take the platform home if you need to. (I think this is our biggest hardware problem now.)
  • Modify the sample arduino software to send out serial messages with the encoder values.
  • Do Calibration of your encoders with the real platform.
  • Work with Brian on getting the compass mounted on the platform. Andy needs it for his mapping work this week. The compass needs to be mounted at least 6 inches above plastic platform and on a NON metallic surface.
  • Take a look at the Maxbotics EZ1 and Sharp IR sensors. We will need to interface these to a different arduino next week for obstacle avoidance.
Andy:
  • We need a map. I like your progress so far. Keep working on implementing the map code with the GPS and get the image to move around according to the GPS data.
  • Explore Google Earth as a platform for visualizing the GPS data. I don't think this can replace what you are doing, but it should allow for nice presentations.
  • Work with Mitch to agree on a messaging protocol from the encoders to the map software. I would suggest that the encoders send out a serial stream at 57600 8N1 which reports the current encoder value of a wheel on each side in a comma delimited format, terminated with a carriage return (see the sketch after this list). Maybe a data rate of 10 Hz is a good goal since it matches the GPS.
  • We want to get autonomous navigation started. Using your map, allow the user to specify a target point on the map (perhaps by clicking or entering GPS values.) Calculate how far it is from the current position to the target position and how much the bot needs to turn. Generate a set of turn and speed commands to get the robot pointing at the target. Do this closed loop. (IE turn a little, measure from the compass, decide how to turn, repeat)
  • Test your autonomous navigation on the real platform over short distances using only the compass to make sure you can get the heading to work.
  • Start thinking about obstacle avoidance and how that will figure in to the map.
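For the messaging item above, the receiving side of the suggested stream could be as simple as this sketch; the format is still just a proposal.

def parse_encoder_message(line):
    # One message: "left,right\r" sent at 57600 8N1, roughly 10 Hz.
    left, right = line.strip('\r\n').split(',')
    return int(left), int(right)

# e.g. parse_encoder_message("1043,1051\r") returns (1043, 1051)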
Brian:
  • Tackle the WAAS and other issues to improve our GPS signal. I dug out an improved GPS, http://www.sparkfun.com/products/9133, but have to build a USB to TTL interface and test it. Ok, that is done and I will have it for you on Tuesday. Look at http://profmason.com/?p=1468 for details.
  • Run the same tests on the new GPS that you ran on the previous and make any necessary changes to use the new output stream.
  • Read the Venus documentation and the comments on the SparkFun site. Currently it is set up with WAAS enabled, pedestrian mode, 10 Hz, 57600 8N1. There may be some other settings to play with.
  • Get the compass on the platform. Andy needs it for his mapping work this week. The compass needs to be mounted at least 6 inches above plastic platform and on a NON metallic surface. Talk to John about how he communicated with his compass in I2C mode and if this is worth doing. (He has the same one we do.)
  • After this stuff is done, focus on the IMU. Make a start at integrating the accelerometers for position.
I searched through my parts bin and couldn't find a sonar sensor, so I have ordered a Maxbotics EZ1 (these are great). This will give us front obstacle avoidance with a range of about 5 meters. I also have a pair of SHARP 2YOA02 proximity sensors (very narrow beam out to 1.5 meters). I would like this to be the last week we are remote controlling the platform and start moving to autonomous navigation.

Friday, January 14, 2011

Video Update: Driver Control


Weekly Progress update

1-14-2011
So this first week I got several important things taken care of. The first thing I did was to solder and integrate the killswitch into the rest of the power circuitry. I also added an LED to indicate whether or not it was engaged.

The next thing I did was to integrate the joystick program I had made with the servo controls to get the joystick driving the platform remotely.
Right away we realized that something was wrong with the steering as it pulled hard left, and no amount of programming seemed to fix it entirely. This issue plagued us for several days. After playing with the mechanical side of the steering mechanism, I discovered that the amount that the suspension is compressed changes how much the platform veers left. So I added some zip ties to compress the springs more, and now it drives nearly perfectly straight.

We started testing the driving again, this time at full power, but after about 5 minutes the robot stopped working. The killswitch MOSFET had overheated, detached from the heatsink, and died. I needed to replace it and find a way to cool it better. Today I replaced it and attached it to a larger heatsink with a built-in fan (it's actually a compact CPU cooler :P). After that, I tested it out by driving it around on the grassy hill next to bldg 60 at full power. I found that at full speed, tipping is a serious concern, as it flipped over twice. Next week I plan on modifying the driving code so that the steering sensitivity varies inversely with the speed to help reduce this risk.
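The plan, roughly, in code: scale the steering command by a gain that falls off inversely with speed. The constants are placeholders to be tuned on the platform.

FULL_GAIN_SPEED = 30.0  # below this speed, allow full steering authority (placeholder)
MIN_GAIN = 0.3          # steering authority left at top speed (placeholder)

def scaled_steering(joystick_x, speed):
    # Steering sensitivity that varies inversely with speed.
    gain = FULL_GAIN_SPEED / max(abs(speed), FULL_GAIN_SPEED)  # 1.0 when slow
    return joystick_x * max(gain, MIN_GAIN)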

The last thing I did was to get a program running with pygame so an image of a robot could be moved around on a satellite image background with the joystick. I also got started on implementing this code with the GPS and got the image to move around according to the GPS data, though not accurately.

Thursday, January 13, 2011

GPS visualizations


Using the Pharos GPS-500 we can get a latitude and longitude coordinate pair. If this pair is then converted to a coordinate system in Python, you can observe the visual drift of the signal. This first image is a screen shot of the GPS signal drift while the connected laptop is stationary.


The scale on the image can be compared to the theoretical width of the ball, which is 2 meters. The average drift during the test was 3.3 meters, and the maximum was 7.25 meters. This is pretty accurate, but when the laptop moves it's a completely different story.
The following image represents a U-shaped walking path. It very nearly resembles the actual path I took, but it has jagged spots where it deviates from the real path.


The final image is a graphical representation of the drift in the x and y coordinates while the laptop is stationary. As you can see, there is considerable drift in both coordinate planes, and this poses a problem for precise robot navigation. It is my intention to look into the WAAS system to augment this signal.

In the above image the red line represents X axis drift, and the green line represents y axis drift.
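For reference, the latitude/longitude-to-x/y conversion can be done with an equirectangular approximation, as in the sketch below; this is one plausible body for a conversion routine, not necessarily the one in our code.

import math

EARTH_RADIUS_M = 6371000.0

def lat_long_to_xy(lat_deg, lon_deg, ref_lat_deg, ref_lon_deg):
    # Meters east (x) and north (y) of a reference point.
    # Plenty accurate over a few hundred meters.
    lat = math.radians(lat_deg)
    lon = math.radians(lon_deg)
    ref_lat = math.radians(ref_lat_deg)
    ref_lon = math.radians(ref_lon_deg)
    x = EARTH_RADIUS_M * (lon - ref_lon) * math.cos(ref_lat)
    y = EARTH_RADIUS_M * (lat - ref_lat)
    return x, y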

Wednesday, January 12, 2011

Chassis Modifications

1/12/11
  • Adjusted the steering rods to compensate for drift
  • Determined the maximum and minimum limits for the servo and set the corresponding numerical values on the controller
  • Found that drift is dependent on weight in the chassis
  • Springs need to be stronger or the suspension needs to be more stiff
  • Finished soldering together the kill switch


Parts list
  • hot glue
  • charger for the Li-Po batteries
  • pipe clamps to adjust spring tension

IMU experiment 1/12/2011

After much head scratching the IMU began to spew forth loads of relevant data!
When the sensor is shaken only in the x direction, you can observe the waveform in the red data stream; when it is shaken only in the y direction, you can observe the waveform in the green data stream; and when it is shaken in the z direction, there is a waveform in the blue data stream.

First issue:
The baud rate for communication between the Arduino and Python was set at 115200 bits per second, and this caused an accumulation of lag in the data.

Solution:
Resetting the baud rate to 4800 eliminated the lag.

Second issue:
The Arduino code correlated pins to the accelerometer data incorrectly. It was searching for data on pins 1-5 while the data was being fed out on pins 0-4.

Solution:
Reassign the pin numbers to the correct variables on the Arduino.

Third issue:
The signal is choppy if the wires aren't connected to the board in a specific way. I think this is due to a poor quality wire connector.

Proposed solution:
Solder the wires directly to the sensor.

Tuesday, January 11, 2011

Pre Class Progress

Tasks
  • Uninstalled Old Version Software
  • Installed
    • Python 2.6.6
    • Pyserial
    • Pygame
    • Arduino Software
    • Dropbox
    • On Screen Keyboard
      • Because the keyboard seems to be stuck in function mode and I have yet to figure out why, certain keys are currently stuck on special characters.
  • Created Email Account (For Dropbox Account)
    • robo.magellon@gmail.com
  • Arduino
    • Lesson 0
    • Lesson 1
    • Lesson 2
    • Lesson 3
    • Lesson 4
    • Lesson 5
    • Get Shaft Encoder Working
  • Ad Hoc Process
In Windows Vista or greater, navigate to Start >> Network >> Network and Sharing Center >> Manage Wireless Networks >> Add
  • Select "Create an Ad Hoc Network"
  • Network Name: "Robo"
  • Security: WPA2-Personal
  • Key: "***********"
Now that the Ad Hoc network has been created, all computers should select the new network from their list of available wireless networks and connect.


Next, to connect to the computer and initiate a remote desktop session, navigate to Start >> Network >> Single Click "EEE" >> Right Click "EEE" >> Select "Connect with Remote Desktop Connection"

Monday, January 10, 2011

January 10th 2011

Summary of activities:

GPS usage was discussed.
  • Andy attached his laptop to the platform and sent drive and turn signals.
  • It was discovered that Vex motors can be controlled by the servo control board.
  • A Vex battery charger was disassembled to understand the complexity of the electronics necessary to charge a Ni-Cad battery.
  • The batteries used for the Rock Crawler were identified to be Li-Po type (lithium-ion polymer).
  • A project on the Make website was found which details the construction of a Li-Po charge kit for $10.
  • Andy disassembled the Vexplorer wireless camera to see if the data feed could be used to inform the Magellan platform.
  • The compass GUI program was hacked to the fewest lines of code necessary to feed the data into a larger program.
  • Identified goals for presentation of the bot at the first 99 meeting

Tuesday, January 4, 2011

GPS data


The highlighted data stream has the Prefix of $GPGGA.
This prefix denotes that the stream in question is a fix.
Experimenting with the Pharos GPS-500 turned up problems with my OS. There were some permissions problems... I will not go into the details.
Anyway, the data is sometimes junk and comes out looking like it's in Russian, but sometimes it's useful data.
The first number, 195827.782, is a time stamp for the data feed. The time is formatted as Greenwich Mean Time, or UTC.
The second number, 3403.0995, is the latitude: 34 represents degrees, and 03.0995 represents minutes.
The following letter, N, indicates that the latitude is north of the equator.
The third number, 11751.1995, is the longitude, in the same format as the latitude.
The following letter, W, shows that the measurement is west of the prime meridian.
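Putting those fields together, converting the raw degrees-and-minutes numbers to signed decimal degrees looks roughly like this sketch; the field indices follow the $GPGGA layout described above.

def parse_gpgga(fields):
    # fields is a $GPGGA sentence already split on commas:
    # fields[1] = UTC time, fields[2]/[3] = latitude and N/S,
    # fields[4]/[5] = longitude and E/W.
    utc = fields[1]
    lat = int(fields[2][:2]) + float(fields[2][2:]) / 60.0
    if fields[3] == 'S':
        lat = -lat
    lon = int(fields[4][:3]) + float(fields[4][3:]) / 60.0
    if fields[5] == 'W':
        lon = -lon
    return utc, lat, lon

# e.g. parse_gpgga("$GPGGA,195827.782,3403.0995,N,11751.1995,W".split(","))
# returns ('195827.782', 34.051658..., -117.853325)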

The Data is better if the GPS is run outside, and I think the next step is to get a map in place and follow the data from there.