<P> To navigate, Stanley used five roof-mounted Sick AG LIDAR units to build a 3-D map of the environment, supplementing the position-sensing GPS system. An internal guidance system using gyroscopes and accelerometers monitored the vehicle's orientation and also supplemented the GPS and other sensor data. Additional guidance data came from a video camera that observed driving conditions out to eighty meters (beyond the range of the LIDAR) and ensured enough room for acceleration. Stanley also had sensors installed in a wheel well to read a pattern imprinted on the tire, acting as an odometer in case of signal loss (such as when driving through a tunnel). Using the data from this sensor, the on-board computer could extrapolate how far the vehicle had traveled since the signal was lost. </P> <P> To process the sensor data and execute decisions, Stanley was equipped with six low-power 1.6 GHz Intel Pentium M-based computers in the trunk, running different versions of the Linux operating system. </P> <P> The Stanford School of Engineering developed the 100,000 lines of software that Stanley ran to interpret sensor data and execute navigation decisions. Using what Popular Mechanics calls a "common robot hierarchy", Stanley used "low-level modules [that] fed raw data from LIDAR, the camera, GPS sets and inertial sensors into software programs [controlling] the vehicle's speed, direction and decision making". </P> <P> Stanley was characterized by a machine-learning-based approach to obstacle detection. Data from the LIDARs was fused with images from the vision system to perform more distant look-ahead. If a path of drivable terrain could not be detected for at least 40 meters in front of the vehicle, speed was decreased and the LIDARs were used to locate a safe passage. </P>
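The wheel-well odometer's role during GPS loss can be sketched as simple dead reckoning. This is a hypothetical illustration only: the function name, the tick-counting scheme, and the assumption of a constant straight-line heading from the inertial unit are mine, not details of Stanley's actual implementation.

```python
import math

def dead_reckon(last_fix, heading_rad, wheel_circumference_m,
                tick_count, ticks_per_rev):
    """Estimate position after GPS loss from wheel-odometer ticks.

    last_fix            -- (x, y) of the last good GPS fix, metres, local frame
    heading_rad         -- vehicle heading from the inertial unit, radians
    wheel_circumference_m -- rolling circumference of the instrumented tire
    tick_count          -- pattern marks counted since the signal was lost
    ticks_per_rev       -- marks imprinted per full tire revolution

    Returns the extrapolated (x, y) position and the distance traveled.
    """
    distance = wheel_circumference_m * tick_count / ticks_per_rev
    x = last_fix[0] + distance * math.cos(heading_rad)
    y = last_fix[1] + distance * math.sin(heading_rad)
    return (x, y), distance
```

In practice such an estimate would be fused with the gyroscope and accelerometer data rather than assuming a fixed heading, but the core idea is the same: wheel revolutions give distance, and distance plus orientation gives position.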
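The 40-meter look-ahead rule described above can be sketched as a small speed-governing function. The specific speeds, the proportional scaling, and the minimum crawl speed here are illustrative assumptions; the source states only that speed was decreased when 40 m of drivable terrain could not be confirmed.

```python
def target_speed(drivable_distance_m, cruise_speed_mps=11.0,
                 min_speed_mps=3.0, lookahead_threshold_m=40.0):
    """Reduce speed when drivable terrain cannot be confirmed far enough ahead.

    If the fused LIDAR/vision system confirms at least 40 m of drivable
    terrain, hold cruise speed; otherwise scale speed down with the
    confirmed distance, but never below a crawl speed that still lets
    the LIDARs search for a safe passage. All numeric values here are
    assumed for illustration, not Stanley's actual parameters.
    """
    if drivable_distance_m >= lookahead_threshold_m:
        return cruise_speed_mps
    scaled = cruise_speed_mps * drivable_distance_m / lookahead_threshold_m
    return max(min_speed_mps, scaled)
```

The design point this illustrates: the long-range vision look-ahead buys the vehicle permission to go fast, while the shorter-range LIDAR remains the authority on what is safe at low speed.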
