Bradley D. Farnsworth

Department of Electrical Engineering and Computer Science
Case Western Reserve University


Autonomous Robotics

In my studies at Case, I have been involved with several autonomous robotics projects. I will be updating this section with some detailed information on these projects.

Team Case - 2007 DARPA Urban Challenge

I was the sensors tech lead with Team Case in the 2007 DARPA Urban Challenge. Team Case built DEXTER, a fully autonomous robotic vehicle for urban environments. In this, our first year, Team Case was a national semifinalist.

I was responsible for selecting the sensors used on DEXTER, wiring them for power and data, and ensuring that all the sensors on DEXTER worked properly. I wrote a 500 kbps LIDAR driver with object detection in LabVIEW. I also wrote a LabVIEW wrapper for an Eaton VORAD radar driver and assisted in writing a stereo vision driver.
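The driver itself was written in LabVIEW, which is graphical, but the object-detection step boils down to a standard idea: consecutive scan returns at similar ranges belong to the same object, and a large jump in range starts a new one. Below is a minimal C sketch of that clustering; the beam count, thresholds, and test scan are made-up illustrations, not the actual driver's values.

    #include <math.h>
    #include <stdio.h>

    #define NUM_BEAMS   181     /* hypothetical 1-degree scan over 180 degrees */
    #define JUMP_THRESH 0.5f    /* range discontinuity (meters) that splits objects */

    /* Group consecutive beams into objects wherever the range changes smoothly,
     * and report the beam span and mean range of each detected segment. */
    static void detect_objects(const float range[NUM_BEAMS])
    {
        int start = 0;
        float sum = range[0];

        for (int i = 1; i <= NUM_BEAMS; i++) {
            int split = (i == NUM_BEAMS) ||
                        (fabsf(range[i] - range[i - 1]) > JUMP_THRESH);
            if (split) {
                printf("object: beams %d-%d, mean range %.2f m\n",
                       start, i - 1, sum / (i - start));
                if (i < NUM_BEAMS) {
                    start = i;
                    sum = range[i];
                }
            } else {
                sum += range[i];
            }
        }
    }

    int main(void)
    {
        float scan[NUM_BEAMS];
        for (int i = 0; i < NUM_BEAMS; i++)
            scan[i] = (i >= 80 && i <= 100) ? 3.0f : 10.0f;  /* a 3 m object against a 10 m background */
        detect_objects(scan);
        return 0;
    }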

I also maintained the computer systems on DEXTER and in the Team Case lab. I modified our five Mac Minis for DEXTER by specifying and installing solid-state drives and wiring the power-switch and status-LED lines out to an external control box.

Intelligent Ground Vehicle Competition 2006

I worked on vision algorithms for Case's entry to the 2006 IGVC. We wrote camera calibration software in MATLAB and C to generate top-down views of the field in front of the robot, then used color-classification algorithms to identify grass, white lines, and barrels in the field of view. My software returned a point cloud of detected objects from this top-down view that our robot could use for autonomous navigation. You can view more photos at the IGVC website.
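The calibration and classification code itself was MATLAB and C; the C sketch below shows only the per-pixel color-classification idea. The RGB rules and thresholds are illustrative placeholders, not the values our trained classifiers actually used.

    #include <stdio.h>

    typedef enum { GRASS, WHITE_LINE, BARREL, UNKNOWN } terrain_class;

    /* Classify one pixel of the top-down view by simple RGB rules.
     * Thresholds are placeholders, not the project's trained values. */
    static terrain_class classify_pixel(unsigned char r, unsigned char g, unsigned char b)
    {
        if (r > 200 && g > 200 && b > 200)
            return WHITE_LINE;              /* bright and unsaturated: painted line */
        if (g > r && g > b && g > 80)
            return GRASS;                   /* green-dominant */
        if (r > 150 && g < 100 && b < 100)
            return BARREL;                  /* orange/red construction barrel */
        return UNKNOWN;
    }

    int main(void)
    {
        /* Sample pixels: grass, line paint, barrel plastic. */
        printf("%d %d %d\n",
               classify_pixel(60, 140, 50),
               classify_pixel(230, 235, 228),
               classify_pixel(210, 80, 40));
        return 0;
    }

In the real pipeline, pixels classified this way in the top-down view were aggregated into the point cloud handed to the navigation code.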

LEGO Robot Autonomous Soccer 2006

In EECS 396L, we learned about digital image processing, TCP/IP communication, and biologically inspired algorithms, such as neural networks, for robotic control. We then built a small LEGO robot to play "soccer": the goal was for our robot to push a golf ball into a goal area on an indoor field of approximately 1 by 2 meters.

We used a camera mounted in the ceiling above the playing field as our only sensor to determine the current state of the game (all robots' positions, and the position of the ball).

We created robust, flexible image-processing software that could be trained to recognize various objects in the field of view. To accomplish this, we used background subtraction to isolate objects and then performed feature extraction (moment of inertia, RGB counts, perimeter, area, etc.) on each object of interest in several poses.
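As a rough illustration, here is a C sketch of background subtraction followed by two of the simplest blob features, area and centroid. The frame size, threshold, and synthetic test frame are made up for the example; the real software also measured perimeter, moments, and per-channel color counts.

    #include <stdio.h>
    #include <stdlib.h>

    #define W 64
    #define H 48
    #define DIFF_THRESH 30   /* intensity difference that counts as foreground */

    /* Mark pixels that differ from the static background frame, then
     * compute the area and centroid of the foreground. */
    static void blob_features(unsigned char cur[H][W], unsigned char bg[H][W])
    {
        long area = 0, sx = 0, sy = 0;

        for (int y = 0; y < H; y++)
            for (int x = 0; x < W; x++)
                if (abs((int)cur[y][x] - (int)bg[y][x]) > DIFF_THRESH) {
                    area++;
                    sx += x;
                    sy += y;
                }

        if (area > 0)
            printf("area=%ld centroid=(%.1f, %.1f)\n",
                   area, (double)sx / area, (double)sy / area);
        else
            printf("no foreground object\n");
    }

    int main(void)
    {
        static unsigned char bg[H][W], cur[H][W];   /* zero-initialized */

        /* Synthetic frame: a bright 10x10 square on an empty background. */
        for (int y = 20; y < 30; y++)
            for (int x = 25; x < 35; x++)
                cur[y][x] = 200;

        blob_features(cur, bg);
        return 0;
    }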

Once trained, our software would generate optimal motion trajectories for our robot based on the current game state. These movement commands were transmitted wirelessly over an 802.11b link to an HP iPAQ mounted on the robot. The iPAQ then relayed the motion commands directly to the motor controller on the robot.
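The exact wire format between the vision computer and the iPAQ isn't documented here, so this C sketch only illustrates the transport: sending one ASCII motion command over a TCP socket. The address, port, and "L<left> R<right>" wheel-speed format are hypothetical.

    #include <arpa/inet.h>
    #include <netinet/in.h>
    #include <stdio.h>
    #include <sys/socket.h>
    #include <unistd.h>

    /* Connect to the robot and send one motion command; return 0 on success. */
    static int send_motion_command(const char *ip, int port, int left, int right)
    {
        int fd = socket(AF_INET, SOCK_STREAM, 0);
        if (fd < 0)
            return -1;

        struct sockaddr_in addr = {0};
        addr.sin_family = AF_INET;
        addr.sin_port = htons(port);
        inet_pton(AF_INET, ip, &addr.sin_addr);

        if (connect(fd, (struct sockaddr *)&addr, sizeof addr) < 0) {
            close(fd);
            return -1;
        }

        char cmd[32];
        int n = snprintf(cmd, sizeof cmd, "L%d R%d\n", left, right);
        int ok = (write(fd, cmd, (size_t)n) == n) ? 0 : -1;
        close(fd);
        return ok;
    }

    int main(void)
    {
        /* Example: drive both wheels forward at half speed. */
        if (send_motion_command("192.168.0.42", 5000, 50, 50) != 0)
            perror("send_motion_command");
        return 0;
    }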

Our team won first place in this challenge with the most goals scored.

Case IEEE LEGO Challenge 2005

We fielded a robot using a subset of the Autonomous Egg Hunt robot kit (including the 68HC11 microcontroller board) whose goal was to park in front of switching light bulbs in a closed arena. The competition was playoff-style, with two robots head-to-head in each round. To start each round, one of several lights would be switched on. Both robots would then try to approach the light and park within a safety zone around it. After a robot successfully parked, that light would turn off and the next (random) light would turn on.

The biggest challenge was not detecting the direction of the light source, but parking within the safety zone. Obstacles were also strewn about the field, and the robot had to navigate through them with a combination of bump sensors and robust construction. Some of the obstacles tended to be parts of the other robots! Our light-sensing solution was to mount differential IR sensors on each side of the robot; we would then "parallel park" next to the light, ensuring that we were centered within the safety zone, as sketched below. Our team won first place.
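As a rough sketch of that differential approach, one control step might compare the two side-mounted IR readings and steer toward the brighter side, stopping once the readings balance and the light is close. The sensor scales, thresholds, and motor speeds below are illustrative, not our contest tuning.

    #include <stdio.h>
    #include <stdlib.h>

    #define BALANCE_TOL  20   /* max left/right difference that counts as centered */
    #define CLOSE_ENOUGH 600  /* combined brightness meaning we are at the light */

    /* One step of differential light seeking: turn toward the brighter side,
     * and park once the readings balance with the light nearby. */
    static void light_seek_step(int left_ir, int right_ir, int *l_motor, int *r_motor)
    {
        int diff = left_ir - right_ir;

        if (abs(diff) < BALANCE_TOL && left_ir + right_ir > CLOSE_ENOUGH) {
            *l_motor = *r_motor = 0;        /* centered in the safety zone: park */
        } else if (diff > 0) {
            *l_motor = 30; *r_motor = 60;   /* light to the left: turn left */
        } else {
            *l_motor = 60; *r_motor = 30;   /* light to the right: turn right */
        }
    }

    int main(void)
    {
        int l, r;
        light_seek_step(420, 300, &l, &r);  /* brighter on the left */
        printf("motors: L=%d R=%d\n", l, r);
        return 0;
    }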

LEGO Robot Autonomous Egg Hunt 2004

In EECS 375, we used LEGO beams, plates, gears, motors, various sensors, and a 68HC11 microcontroller board (the MIT Handy Board), programmed in C, to construct autonomous (i.e., self-contained, no direct human control) robots. The first half of the course consisted of structured exercises in practical programming, LEGO mechanics, sensor principles, and software design, all presented with a biological slant: autonomous robots were treated as model systems for the study of animal behavior. The second half of the course was spent designing robots to compete in an Egg Hunt competition.

Our robot was designed to capture pastel eggs using a trap door in the front of the robot and reject black-colored eggs. The pastel eggs were then transported one at a time to our "nest," which was marked by a polarized light source. The robot used inexpensive bump sensors, photodiodes, and phototransistors to accomplish this task.

In this course, I learned to write solid, lightweight code in a state-machine design. I also learned to use inexpensive sensors with non-ideal outputs to accomplish useful tasks.
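That state-machine style is easy to show in C: behaviors become enum states, and one switch per control tick decides transitions from sensor events. The state and flag names below are hypothetical stand-ins for the behaviors described above, not our actual code.

    #include <stdio.h>

    /* Hypothetical egg-hunt behaviors as states. */
    typedef enum { SEARCH, CAPTURE, HOME, DROP } state_t;

    typedef struct {
        int egg_seen;     /* sensors report a pastel egg ahead */
        int egg_inside;   /* trap-door switch confirms a captured egg */
        int at_nest;      /* polarized-light detector says we are home */
    } sensors_t;

    /* One control tick: return the next state given the current sensors. */
    static state_t step(state_t s, const sensors_t *in)
    {
        switch (s) {
        case SEARCH:  return in->egg_seen   ? CAPTURE : SEARCH;
        case CAPTURE: return in->egg_inside ? HOME    : CAPTURE;
        case HOME:    return in->at_nest    ? DROP    : HOME;
        case DROP:    return SEARCH;        /* egg released at the nest: hunt again */
        }
        return SEARCH;
    }

    int main(void)
    {
        sensors_t in = { .egg_seen = 1, .egg_inside = 0, .at_nest = 0 };
        printf("state after seeing an egg: %d\n", step(SEARCH, &in));  /* 1 = CAPTURE */
        return 0;
    }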

Our team won first place in this challenge with the most pastel eggs collected.


 