WASHINGTON: Automakers are using tiny cameras, sensors that track drooping heads, steering-wheel monitors and audible alerts to ensure drivers pay attention when using advanced driver assistance systems, like Tesla's Autopilot, that allow drivers to take their hands off the wheel.
In a report this week on the May 2016 crash of a Tesla Inc Model S that killed a driver who was using Autopilot, the National Transportation Safety Board found that drivers could largely keep their hands off the wheel for extended periods despite repeated warnings from the vehicle.
But the crash underscored a vexing problem for automakers that want to gain an edge by launching technology that completely automates driving tasks: unless a car is capable of driving itself safely in every situation, drivers will still have to remain alert and ready to take control even while the car is piloting itself.

The NTSB, the federal agency charged with investigating significant transportation accidents, said that during a 37-minute section of the 41-minute Tesla trip, the driver kept his hands on the wheel for just 25 seconds, placing them there in one- to three-second increments after repeated visual and audible warnings.

General Motors Co delayed the introduction of a driver assistance technology called Super Cruise, initially planned for late last year, saying it was not ready. The technology will now go on sale this fall. Barry Walkup, chief engineer of Super Cruise, said the company added "a driver attention function, to insist on driver supervision."

The system uses a small camera that focuses on the driver and works with infrared lights to track head position and determine where the driver is looking. If the system, which uses facial recognition software, detects that the driver is not paying attention, it prompts the driver to return attention to the road. If the driver does not respond, it escalates alerts, including a steering-wheel light bar, visual indicators, tactile alerts in the seat and audible alerts. If the driver still does not respond, the vehicle is brought to a controlled stop.
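As a rough illustration only, the tiered escalation Walkup describes can be sketched as a small state machine. The stage names, the single-step escalation and the `AlertStage` enum below are assumptions made for this sketch, not GM's actual design.

```python
from enum import Enum, auto


class AlertStage(Enum):
    # Illustrative stages; the names are assumptions, not GM terminology.
    ATTENTIVE = auto()        # driver is looking at the road
    PROMPT = auto()           # ask the driver to return attention to the road
    ESCALATED = auto()        # light bar, visual, tactile and audible alerts
    CONTROLLED_STOP = auto()  # vehicle is brought to a controlled stop


def next_stage(stage: AlertStage, driver_attentive: bool) -> AlertStage:
    """Advance one step through a tiered alert sequence like the one described above."""
    if driver_attentive:
        return AlertStage.ATTENTIVE  # attention regained, reset the sequence
    if stage is AlertStage.ATTENTIVE:
        return AlertStage.PROMPT
    if stage is AlertStage.PROMPT:
        return AlertStage.ESCALATED
    return AlertStage.CONTROLLED_STOP  # still no response after escalated alerts


# A driver who keeps ignoring the prompts walks through every stage.
stage = AlertStage.ATTENTIVE
for attentive in (False, False, False):
    stage = next_stage(stage, attentive)
    print(stage.name)  # PROMPT, then ESCALATED, then CONTROLLED_STOP
```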
Volkswagen AG's luxury Audi unit has a system that handles steering and braking at speeds of up to 40 miles per hour (about 64 kph). The system requires the driver to check in with the steering wheel every 15 seconds. Audi said the system will beep at the driver, and if the driver does not respond, it will bring the vehicle to a stop.
The National Highway Traffic Safety Administration, the lead agency for regulating self-driving cars, does not test or preapprove driver assistance systems before automakers install them. Instead, the agency responds to complaints or crashes, investigating whether a potential defect poses an unreasonable risk to driver safety. The May 2016 Tesla crash has raised concerns about how self-driving cars are regulated.
The NTSB will issue probable cause findings and may make recommendations to the NHTSA in the Tesla crash but does not plan to hold a public hearing on the incident, spokesman Keith Holloway said. In September 2016, Tesla unveiled new restrictions on Autopilot after widespread concerns that the system lulled users into a false sense of security with its "hands-off" driving capability. The update temporarily prevents drivers from using Autopilot if they do not respond to audible warnings to take back control of the car.
The car sounds warnings if drivers take their hands off the wheel for more than a minute at speeds above 45 miles per hour (72 kph) when there is no vehicle ahead, Tesla Chief Executive Elon Musk told reporters in September. If the driver ignores three audible warnings in an hour, the system temporarily shuts off until it is parked.
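Read purely as an illustration, the figures Musk cited (hands off the wheel for more than a minute above 45 mph with no vehicle ahead, and a temporary lockout after three ignored audible warnings in an hour) suggest a simple policy along the lines sketched below. The `AutopilotWarningPolicy` class, its method names and the rolling one-hour window are assumptions for this sketch, not Tesla's implementation.

```python
# Hypothetical constants taken from the figures quoted in the article.
SPEED_THRESHOLD_MPH = 45   # warnings apply above 45 mph with no vehicle ahead
HANDS_OFF_LIMIT_S = 60     # warn once hands are off the wheel for over a minute
MAX_IGNORED_WARNINGS = 3   # three ignored audible warnings ...
WINDOW_S = 3600            # ... within one hour trigger a temporary lockout


class AutopilotWarningPolicy:
    """Hypothetical model of the warning and lockout behaviour described above."""

    def __init__(self):
        self.ignored_warning_times = []  # timestamps (seconds) of ignored warnings
        self.locked_out = False

    def parked(self):
        # Parking the car clears the temporary lockout.
        self.locked_out = False
        self.ignored_warning_times.clear()

    def update(self, t, speed_mph, hands_off_s, vehicle_ahead, warning_ignored):
        """Return the action the policy would take at time t (seconds)."""
        if self.locked_out:
            return "system unavailable until the car is parked"
        if speed_mph > SPEED_THRESHOLD_MPH and not vehicle_ahead and hands_off_s > HANDS_OFF_LIMIT_S:
            if warning_ignored:
                self.ignored_warning_times.append(t)
                # Count only warnings ignored within the last hour.
                self.ignored_warning_times = [w for w in self.ignored_warning_times
                                              if t - w <= WINDOW_S]
                if len(self.ignored_warning_times) >= MAX_IGNORED_WARNINGS:
                    self.locked_out = True
                    return "system temporarily disabled until the car is parked"
            return "sound audible warning"
        return "no warning"


policy = AutopilotWarningPolicy()
print(policy.update(t=0.0, speed_mph=60, hands_off_s=90, vehicle_ahead=False, warning_ignored=True))
```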
Musk said at the time that Autopilot accidents are far more likely for expert users of the system, noting that some users got as many as 10 warnings in an hour under the prior system. “It's not the neophytes, it's the experts. They get very comfortable with it and repeatedly ignore the car's warnings and in effect it becomes like a reflex action,” Musk said.
Tesla monitors drivers through their interactions with the steering wheel, turn signal, and speed setting, NHTSA said in a separate report.
Alphabet Inc's Waymo unit, which is also working on self-driving technology, is taking a different approach, arguing that it is wrong to ask drivers to pay attention while the car drives itself. Waymo is focusing its efforts on fully autonomous vehicles in which humans take no part in driving, rather than on driver assistance. "We're not seeking to build a better car. Our goal is to build a better driver," Waymo Chief Executive Officer John Krafcik said earlier this year.