Prospect 12 is PAVE's ongoing fully-autonomous vehicle project that started as an entry in the 2007 DARPA Urban Challenge. The vehicle is a 2005 Ford Escape Hybrid, equipped with several sensors and modified to allow computer-controlled driving while still comfortably seating five and allowing human driver control with the flip of a switch.
The team is currently working on improving its vision and navigation software as well as augmenting the system’s robustness. Additionally, the project continues to undertake hardware improvements and expansion, such as adding new sensors to the vehicle.
Faculty Adviser: Professor Alain Kornhauser
Prospect 12 is computer-controllable, which means steering, throttle, transmission, and brakes can all be set in software. We connect to the engine computer to use the power steering and throttle (the gas pedal is electronic), and have mounted actuators to shift gears and apply the brakes. Control signals from the computer go to two LabJack UE9 devices for digital-to-analog conversion, are electrically isolated (to protect the computers and LabJacks), processed by custom circuitry on our main board, and carried over wiring run throughout the car.
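As a rough illustration of the kind of mapping involved, the sketch below converts a normalized steering command into an analog output voltage with saturation. The voltage range, center point, and function name are all hypothetical; the actual output goes through the LabJack driver library and our isolation circuitry, not this code.

```cpp
// Illustrative only: map a normalized steering command in [-1, 1] to a
// DAC output voltage, here assumed to span 0.5 V to 4.5 V with the
// wheel centered at 2.5 V. Out-of-range commands are saturated so a
// software fault cannot command a voltage outside the safe band.
double steeringToVolts(double cmd) {
    if (cmd > 1.0)  cmd = 1.0;   // clamp hard-left/hard-right commands
    if (cmd < -1.0) cmd = -1.0;
    return 2.5 + cmd * 2.0;      // linear map: -1 -> 0.5 V, +1 -> 4.5 V
}
```

Clamping before the linear map is the important part of the pattern: the analog side should never see a command the mechanical system cannot safely follow.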
Numerous system designs were tested before settling on something reliable and robust. For example, electronically controlled brakes proved too cumbersome and unreliable to implement, so the team decided to use a linear actuator instead. Much of the knowledge we have now was accumulated from reading manuals and reverse-engineering to get everything working and integrated.
At PAVE, we have focused on using stereo cameras as the main sensors for our robots. Compared to other sensors like LIDAR units, stereo cameras are inexpensive while still providing rich, detailed views of the environment. We believe this combination makes them excellent sensors for a generation of widely-available, reliable autonomous vehicles.
Prospect 12 used three black-and-white stereo cameras in the Urban Challenge, but to better prepare it for navigating environments with signs, traffic lights, and other detailed features, we are switching to color stereo cameras as our main sensors. The Videre STOC cameras will be mounted on a new roof rack which will give them an excellent view of the road, and will be connected to several computers mounted on a server rack in the back for image processing.
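The core geometry behind stereo vision is that depth is inversely proportional to disparity: Z = f·B/d, where f is the focal length in pixels, B the baseline between the two cameras, and d the pixel disparity between matched features. The sketch below shows that relationship; the numbers in the test are illustrative, not the STOC cameras' actual calibration.

```cpp
// Depth from stereo disparity: Z = f * B / d.
//   focalPx     - focal length in pixels (from camera calibration)
//   baselineM   - distance between the two camera centers, in meters
//   disparityPx - horizontal pixel offset of a feature between views
// A disparity of zero (or less) means the match is invalid or the
// point is effectively at infinity, so we return a sentinel value.
double depthFromDisparity(double focalPx, double baselineM, double disparityPx) {
    if (disparityPx <= 0.0) return -1.0;  // no valid depth
    return focalPx * baselineM / disparityPx;
}
```

Because depth grows as disparity shrinks, small matching errors matter much more for distant objects, which is one reason baseline and focal length choices are important when mounting the cameras.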
The vehicle also uses a GPS receiver, accelerometers, gyroscopes, and a wheel encoder for estimating position and velocity. We are considering using sonar sensors in the bumpers (as some commercial vehicles do now) to assist in low-speed, close maneuvers like parallel parking.
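The simplest piece of such an estimator is dead reckoning: advance a 2D pose by the distance the wheel encoder reports, along the heading obtained from the gyroscopes. The sketch below shows only that update step, under the assumption of straight-line motion between updates; the real estimator additionally fuses GPS fixes to keep the accumulated drift bounded.

```cpp
#include <cmath>

// A 2D pose: position in meters, heading in radians (0 = +x axis).
struct Pose {
    double x, y, headingRad;
};

// One dead-reckoning step: given the distance traveled since the last
// update (from the wheel encoder) and the current heading (from the
// integrated gyro), move the pose forward along that heading.
Pose deadReckon(Pose p, double distanceM, double headingRad) {
    p.headingRad = headingRad;
    p.x += distanceM * std::cos(headingRad);
    p.y += distanceM * std::sin(headingRad);
    return p;
}
```

Each step adds a little encoder and gyro error to the pose, so without an absolute reference like GPS the estimate drifts without bound; that drift is exactly what sensor fusion corrects.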
On the software front, we are porting all of our code to C++ in order to use the Inter-Process Communication framework (IPC) developed by Carnegie Mellon. In our experience, IPC requires much less overhead and is easier to implement than Microsoft Robotics Developer Studio. Learn more about our implementation of IPC.
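The pattern IPC provides is publish/subscribe message passing between modules. The toy broker below illustrates that pattern only; it is in-process and untyped, whereas CMU's IPC delivers marshalled, typed messages across processes through its own API, which this sketch does not reproduce.

```cpp
#include <functional>
#include <map>
#include <string>
#include <vector>

// A minimal in-process publish/subscribe broker. Modules register
// handlers for named topics; publishing a message on a topic invokes
// every handler subscribed to it. This mirrors the decoupling that a
// real inter-process framework like IPC gives across processes.
class MiniBus {
public:
    using Handler = std::function<void(const std::string&)>;

    void subscribe(const std::string& topic, Handler h) {
        handlers_[topic].push_back(std::move(h));
    }

    void publish(const std::string& topic, const std::string& msg) {
        for (auto& h : handlers_[topic]) h(msg);  // deliver to all subscribers
    }

private:
    std::map<std::string, std::vector<Handler>> handlers_;
};
```

The appeal of this structure for a robot is that the vision module can publish lane detections without knowing which planner, logger, or visualizer consumes them, so modules can be swapped or ported independently.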
A significant effort has been made to generalize our software so it may be easily implemented in and ported to other projects, such as our Intelligent Ground Vehicle Competition robots. To that end, the vision algorithms on Phobetor are similar to those used in Prospect 12. To learn more about them, please visit the technologies portion of our website.