Connected and Automated Vehicles

FAQ

Q: Who is responsible for making sure cars are safe?

A: NHTSA sets the safety standards for vehicles and manages the vehicle self-certification process for manufacturers.

A: Drivers must know how to correctly use all the safety features in their vehicle and the limitations of those features.

A: Manufacturers must account for how drivers learn any new technology and for potential misuse of that technology.

Q: Who is responsible for making sure human drivers are safe and can follow the rules of the road?

A: State DMVs test and provide a driver’s license for human drivers.


Q: Who licenses the automation in a self-driving car?

A: That is currently undetermined. Most states that allow self-driving cars require a permit to be filed with the state before testing or operation, with or without a safety driver.


Q: Who gets a ticket for a self-driving car?

A: It is anticipated that the owner of the car would receive the ticket when the vehicle is operating in Level 4 fully automated mode, but such situations may be handled case by case given the intersection between this emerging technology and the current rules of the road.

A: Note: Tesla’s Autopilot and Full Self-Driving Beta, GM’s Super Cruise, and others are only Level 2 driver assist systems, so the driver remains responsible for the vehicle at all times.


Q: Who gets a ticket for a car with an automated feature that fails and causes a crash?

A: The driver of the car, if they initiated the automated feature (when operating at anything below Level 4 full automation). Otherwise, if there is no operator of the vehicle, it is anticipated that the owner would receive the ticket, just as with any other mechanical failure.


Q: Which Tesla drives itself?

A: None. Current versions of Full Self-Driving Beta (FSD Beta), Enhanced Autopilot, and Autopilot require active driver control and cannot drive the vehicle alone. Tesla has declared to the California DMV that FSD Beta is only a Level 2 driver assist system. FSD Beta and Autopilot require the driver to pay attention to the road and be ready to take control of the vehicle at all times.


Q: Are AVs better than human drivers?

A: In some ways they will be much better. AVs have better perception than humans: they can see a full 360 degrees, precisely compute the speed and deceleration of cars ahead of them, and react nearly instantly because of their rapid computing power. Fleet learning lets an AV gain experience from every other AV, while each human driver learns mostly from their own experience. And of course, AVs don’t break traffic laws or drive while impaired or distracted.

On the other hand, humans have better intuition and instincts to interpret novel situations. For example, they can perceive body language to determine if the person standing on the side of the road is likely to try to cross.


Q: Will AVs be cautious enough?

A: They will be programmed, at least initially, to be very cautious. They will generally drive no more than the speed limit, come to a full stop for several seconds at stop signs, stop cautiously for red lights, and defer to pedestrians even if the AV has the right of way. Most likely, they won't be aggressive drivers.


Q: When are autonomous vehicles coming?

A: Some automated features are in many cars right now. Fully automated cars and trucks are being tested in many states, and the public can access driverless taxi services in several cities. However, fully driverless cars are not yet available for private purchase and are likely many years away from production.

Q: Are automated vehicles safe from hackers?

A: Bluetooth and cellular connections could theoretically make vehicles vulnerable to hacking, but this has not been demonstrated at any scale. Cybersecurity has been a priority for manufacturers for decades, and DOTs are monitoring the issue. The fact remains that automated and connected features prevent more crashes every day.