
"You're driving too fast", "Don't turn here, the next one", "I think you missed a pothole back there", "We can walk to the curb from here" - All helpful suggestions from people not tasked with driving the car. How will an autonomous driving system interact, process, weigh and respond to passenger preferences?
Enter US10173667B2, "Occupant based vehicle control." Granted just last week, on 1/8/2019 (filed 6/19/2017), this ELWHA LLC patent monitors a vehicle's occupants and modifies the driving experience accordingly.
"Acoustic sensors 136 may detect voluntary (e.g., “slow down, please”) and involuntary sounds (e.g., sudden intake of breath, screams, etc.)"
The occupant experience may be determined explicitly or implicitly. The same nonverbal cues a human driver might observe in a passenger can also be picked up by the "robotic" driver. "...the processing circuit may receive occupant data of a passenger repeatedly tapping a floor of the vehicle. The processing circuit may determine that the passenger is irritated and provide a command to increase, for example, the vehicle speed"
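The cue-to-command logic quoted above can be sketched roughly as follows. This is a minimal illustration, not the patent's implementation; the `Cue` class, `respond_to_cue` function, and the specific thresholds and speed deltas are all my own assumptions.

```python
# Hypothetical sketch of implicit-cue handling: map an observed occupant
# behavior to a driving adjustment. All names and numbers are illustrative.
from dataclasses import dataclass


@dataclass
class Cue:
    occupant_id: int
    kind: str         # e.g. "foot_tap", "gasp", "verbal"
    intensity: float  # normalized 0.0-1.0


def respond_to_cue(cue: Cue, current_speed_mph: float) -> float:
    """Return an adjusted target speed based on an inferred occupant state."""
    if cue.kind == "foot_tap" and cue.intensity > 0.5:
        # Repeated tapping read as irritation -> modest speed increase
        return current_speed_mph + 5.0
    if cue.kind == "gasp":
        # Sudden intake of breath read as alarm -> slow down
        return max(current_speed_mph - 10.0, 0.0)
    # Unrecognized or weak cue -> no change
    return current_speed_mph
```

A real system would, of course, bound any such adjustment by road and safety constraints before acting on it.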
The implicit data may be acquired through any number of sensors. "The sensors of occupant monitoring system 130 may include cameras, body position sensors, force or pressure sensors, microphones, heart rate/pulse sensors, moisture sensors (e.g., for sweat detection), temperature sensors, a facial sensor (e.g., to detect frowns or facial features that indicate an occupant is uncomfortable)"
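One plausible way to use such a sensor suite is to fuse the readings into a single occupant-comfort estimate. The sketch below assumes normalized readings and equal weights; the function name, weighting scheme, and score scale are my own illustration, not anything specified in the patent.

```python
# A minimal sketch: combine normalized sensor readings (0.0 = calm,
# 1.0 = distressed) into one comfort score. Weights are assumed, not
# taken from the patent.
def comfort_score(readings: dict) -> float:
    """Return a comfort score where 1.0 = fully comfortable, 0.0 = distressed."""
    weights = {
        "heart_rate": 0.25,
        "sweat": 0.25,             # moisture sensor
        "grip_pressure": 0.25,     # force/pressure sensor
        "facial_discomfort": 0.25, # facial sensor (frowns, etc.)
    }
    # Missing readings default to 0.0 (no sign of distress)
    distress = sum(weights[k] * readings.get(k, 0.0) for k in weights)
    return 1.0 - distress
```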
Just as we may give more weight to one passenger's driving preferences over another, so too does the system described in the '667 patent. Occupants may be classified by proximity to the driving control area, saved profiles or further categorization corresponding to a level of control or privilege. For example, "an owner or other high classified occupant may have additional privileges, such as overriding passenger inputs, (e.g., “ignore passenger A; passenger A gets scared any time the vehicle is operated over 20 miles-per-hour”, or “passenger B is our guest; pay extra attention to passenger B”, etc.)"
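The privilege-and-override scheme can be sketched as a weighted vote. Everything here is an assumption for illustration: the `Occupant` class, the integer privilege levels, and the use of a privilege-weighted average are mine, not the patent's claimed mechanism.

```python
# Illustrative sketch of privilege-weighted preference arbitration:
# higher-classified occupants count for more, and flagged occupants
# (e.g. "ignore passenger A") are excluded entirely.
from dataclasses import dataclass


@dataclass
class Occupant:
    name: str
    privilege: int                # higher = more authority (e.g. owner > guest)
    ignored: bool = False         # set by an owner-level override
    preferred_speed_mph: float = 0.0


def arbitrate_speed(occupants: list) -> float:
    """Return a target speed as the privilege-weighted mean of active preferences."""
    active = [o for o in occupants if not o.ignored]
    total = sum(o.privilege for o in active)
    if total == 0:
        return 0.0  # no active preferences to honor
    return sum(o.privilege * o.preferred_speed_mph for o in active) / total
```

For example, an owner (privilege 3) preferring 40 mph and a guest (privilege 2) preferring 30 mph, with passenger A ignored, would yield a 36 mph target under this scheme.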
Learn More...
- Study Finds Backseat Drivers Nag Because They Just Don't Trust Your Driving. Rob Stumpf, December 20, 2018
- Estimating potential increases in travel with autonomous vehicles for the non-driving, elderly and people with travel-restrictive medical conditions. Corey D. Harper, et al., Transportation Research Part C 72 (2016) 1–9
- Affectiva launches emotion tracking AI for drivers in autonomous vehicles. Khari Johnson, VentureBeat, March 21, 2018
- Self-driving cars: from 2020 you will become a permanent backseat driver. Tim Adams, The Guardian, September 13, 2015