TORC Robotics has partnered with the Robotics and Mechanisms Laboratory (RoMeLa) at the Virginia Polytechnic Institute and State University (Virginia Tech) to develop the next generation of National Federation of the Blind (NFB) Blind Driver Challenge vehicles.
The NFB created the Blind Driver Challenge, which calls upon developers and innovators to create interface technologies that allow people who are blind to drive a car independently. At the event, held at Daytona International Speedway as a pre-race demonstration for the annual Rolex 24 sports car endurance race, a blind driver was to drive the vehicle independently down the main straight and onto the road course.
Using a crossover SUV, TORC implemented its ByWire drive-by-wire conversion modules, SafeStop wireless emergency stop system, and PowerHub distribution modules on the vehicle. Drive-by-wire gives a driver electronic control of a vehicle. The premise comes from fly-by-wire aircraft, where the pilot’s controls produce electronic signals that are interpreted by computing systems connected to actuators that move the control surfaces of the wings and tail.
Jesse Hurdus, TORC’s project manager for the event, stated, “Cars are much further behind in taking this step. In order to have an autonomous vehicle, you need to have it so a computer can control the throttle, transmission, and braking systems. This is drive-by-wire.”
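To make the idea concrete, here is a minimal sketch of a drive-by-wire command loop in Python. The interface is entirely hypothetical; TORC’s actual ByWire modules and message formats are not described in this article.

```python
# Hypothetical drive-by-wire command interface (illustration only, not
# TORC's ByWire API): the computer issues electronic throttle, brake,
# and steering commands instead of the driver using mechanical linkages.
from dataclasses import dataclass


@dataclass
class DriveCommand:
    throttle: float  # 0.0 (released) to 1.0 (full throttle)
    brake: float     # 0.0 (released) to 1.0 (full braking)
    steering: float  # steering-wheel angle in radians, negative = left


def clamp(value: float, lo: float, hi: float) -> float:
    return max(lo, min(hi, value))


def send_command(cmd: DriveCommand) -> None:
    """Validate a command and forward it to the (hypothetical) actuator bus."""
    cmd.throttle = clamp(cmd.throttle, 0.0, 1.0)
    cmd.brake = clamp(cmd.brake, 0.0, 1.0)
    cmd.steering = clamp(cmd.steering, -8.0, 8.0)
    # A real system would encode this into a bus frame (e.g. CAN) and
    # write it to the vehicle network; here we simply print it.
    print(f"throttle={cmd.throttle:.2f} brake={cmd.brake:.2f} "
          f"steering={cmd.steering:+.2f} rad")


send_command(DriveCommand(throttle=0.2, brake=0.0, steering=-0.1))
```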
The team also used light detection and ranging (LIDAR), which measures distance by emitting laser pulses and analysing the reflected light, to detect the obstacles the driver must steer around. However, LIDAR struggles to classify obstacles and to differentiate objects such as vegetation from solid obstructions, and this is where Allied Vision’s Prosilica GC1290C camera provided the solution.
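The ranging side of LIDAR is straightforward physics; it is classification, not distance, that it struggles with. Range is half the pulse’s round-trip time multiplied by the speed of light, as this one-function sketch shows:

```python
# LIDAR ranging: distance is half the round-trip time of the laser pulse
# multiplied by the speed of light (illustrative sketch only).
SPEED_OF_LIGHT = 299_792_458.0  # metres per second


def lidar_distance(round_trip_seconds: float) -> float:
    """Return the range in metres for a measured pulse round-trip time."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0


# A pulse that returns after 200 nanoseconds corresponds to roughly 30 m.
print(f"{lidar_distance(200e-9):.1f} m")
```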
TORC used the camera to help overcome LIDAR’s limitations, feeding its sensor data into the software to build an understanding of what is around the vehicle and to detect lane markings. This information is fed back to the autonomous system, which provides input to the blind driver so that he or she can keep the vehicle centred within the lane.
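Lane-marking detection from camera frames commonly follows an edge-detection-plus-line-fitting pipeline. The OpenCV sketch below illustrates that generic approach, not TORC’s actual vision software, and the input filename is an assumption.

```python
# Generic lane-marking detection sketch (standard OpenCV pipeline; not
# TORC's software). Assumes "frame.png" is a forward-facing camera frame.
import cv2
import numpy as np

frame = cv2.imread("frame.png")                 # BGR image from the camera
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)  # intensity only
edges = cv2.Canny(gray, 50, 150)                # edge map of the road scene
# Probabilistic Hough transform: fit straight segments to the edge pixels,
# which picks up painted lane markings on a flat road.
lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=50,
                        minLineLength=40, maxLineGap=20)
if lines is not None:
    for x1, y1, x2, y2 in lines[:, 0]:
        cv2.line(frame, (x1, y1), (x2, y2), (0, 255, 0), 2)  # overlay in green
cv2.imwrite("frame_lanes.png", frame)
```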
The blind driver wears special DriveGrip gloves and sits on a SpeedStrip padded insert on the driver’s seat. The gloves contain small vibrating motors on top of each finger that relay steering information from the autonomous system: vibrations signal the direction in which the car needs to be turned. The seat padding contains vibrating motors stretching along the driver’s legs and back that relay the vehicle’s speed, vibrating to tell the driver to accelerate or brake.
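As an illustration of how such guidance might be encoded, the sketch below maps a normalised steering error to per-finger vibration intensities. The mapping is a plausible invention for illustration only; the actual DriveGrip encoding is not described in this article.

```python
# Hypothetical mapping from a steering correction to glove finger motors
# (illustration only, not the actual NFB/TORC DriveGrip encoding).


def glove_pattern(steering_error: float, fingers_per_hand: int = 4):
    """Map a normalised steering error (-1 full left .. +1 full right) to
    vibration intensities for the left and right hand's finger motors."""
    magnitude = min(abs(steering_error), 1.0)
    # Vibrate more fingers as a larger correction is needed.
    active = round(magnitude * fingers_per_hand)
    hand = [1.0 if i < active else 0.0 for i in range(fingers_per_hand)]
    idle = [0.0] * fingers_per_hand
    if steering_error < 0:  # turn left: buzz the left hand
        return hand, idle
    return idle, hand       # turn right: buzz the right hand


left, right = glove_pattern(-0.6)
print("left hand:", left, "right hand:", right)
```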
While TORC’s systems were built specifically for the Challenge, they can potentially be used in future solutions. Hurdus concluded, “This was an exploratory effort to see how we could use the cameras to achieve the goal. A person blind from birth was able to drive a vehicle outfitted with sensor technology to give him an understanding of the environment generated by a combination of Allied Vision’s cameras, LIDAR systems, and GPS localisation systems. The fusion of all this data was able to give this person the ability to ‘see’ the environment as a person would be able to see through their own eyes.”
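At its simplest, the kind of fusion Hurdus describes can be pictured as inverse-variance weighting of independent estimates of the same quantity; the sensors and variances below are invented purely for illustration.

```python
# Minimal sensor-fusion sketch: combine independent estimates of the same
# quantity, weighting each by the inverse of its variance (made-up numbers).


def fuse(estimates):
    """estimates: list of (value, variance) pairs -> (fused value, variance)."""
    weights = [1.0 / var for _, var in estimates]
    fused = sum(w * v for w, (v, _) in zip(weights, estimates)) / sum(weights)
    return fused, 1.0 / sum(weights)


# Lateral offset from the lane centre in metres, as seen by each sensor.
value, variance = fuse([(0.42, 2.0),    # GPS: coarse
                        (0.35, 0.2),    # LIDAR: good geometry
                        (0.33, 0.1)])   # camera: best on lane markings
print(f"fused offset: {value:.2f} m (variance {variance:.3f})")
```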