
Kia Motors is one of the rare automakers comfortably straddling the line between conventional cars and “new energy” vehicles. The company has no problem highlighting its electric vehicles (EVs) at trade shows, going as far as displaying them center stage. Now the company is looking even further ahead, to the autonomous vehicle (AV) era, with its holistic interactive system.
Watching Your Emotions — Kia Can Soon R.E.A.D. You
At CES 2019, Kia showed how it views interactive life inside AVs. The company has partnered with the Massachusetts Institute of Technology (MIT) Media Lab’s Affective Computing Group on its Real-time Emotion Adaptive Driving (R.E.A.D.) system, which will be used in its “SEED Car” concept alongside the company’s V-Touch technology. We’ll have more on the SEED Car in an upcoming article.
Kia’s R.E.A.D. system uses AI to recognize drivers’ and riders’ emotional states from physiological signals: facial expressions, electrodermal activity, and heart rate. It then responds in a (hopefully) helpful way. CES visitors could see how the sensory controls react in real time to their changing emotional states.
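Kia hasn’t published R.E.A.D.’s internals, but the general idea (fusing several bio-signals into an emotion estimate that then drives a cabin response) can be sketched in a few lines of Python. Every name, threshold, and mapping below is an illustrative assumption, not Kia’s implementation:

```python
# Hypothetical sketch of bio-signal fusion for emotion-adaptive cabin control.
# Kia has not published R.E.A.D.'s internals; every name, threshold, and
# mapping here is an illustrative assumption, not Kia's implementation.

from dataclasses import dataclass

@dataclass
class BioSignals:
    heart_rate_bpm: float    # e.g., from a steering-wheel or seat sensor
    electrodermal_uS: float  # skin conductance in microsiemens
    facial_valence: float    # -1.0 (negative) .. 1.0 (positive), from a camera model

def estimate_state(s: BioSignals) -> str:
    """Crude rule-based stand-in for a learned emotion classifier."""
    aroused = s.heart_rate_bpm > 95 or s.electrodermal_uS > 8.0
    if aroused and s.facial_valence < -0.2:
        return "stressed"
    if not aroused and s.facial_valence < -0.2:
        return "fatigued"
    if s.facial_valence > 0.3:
        return "content"
    return "neutral"

# Map each estimated state to a (hypothetical) cabin adjustment.
CABIN_RESPONSE = {
    "stressed": {"lighting": "soft_blue", "music": "calm",      "hvac_delta_c": -1.0},
    "fatigued": {"lighting": "bright",    "music": "upbeat",    "hvac_delta_c": -2.0},
    "content":  {"lighting": "unchanged", "music": "unchanged", "hvac_delta_c": 0.0},
    "neutral":  {"lighting": "unchanged", "music": "unchanged", "hvac_delta_c": 0.0},
}

if __name__ == "__main__":
    reading = BioSignals(heart_rate_bpm=102, electrodermal_uS=9.5, facial_valence=-0.4)
    state = estimate_state(reading)
    print(state, CABIN_RESPONSE[state])  # -> stressed, with a calming response
```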
R.E.A.D. uses a music-response vibration system in the seats, much like the game simulation chairs that let players feel control reactions.
This sensory-based signal-processing technology adapts seat vibrations to the frequency content of the music being played. My favorite part is that the seats can also be set to a massage mode.
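Kia hasn’t detailed the signal chain, but the described behavior (vibration intensity derived from the music’s frequency content) can be sketched with a simple band-energy split. The band edges, motor layout, and scaling factors below are illustrative assumptions:

```python
# Hypothetical sketch of a music-to-seat-vibration mapping: split one audio
# frame into frequency bands and drive each seat motor from a band's energy.
# Band edges, motor layout, and scaling are illustrative assumptions.

import numpy as np

def band_energies(samples: np.ndarray, sample_rate: int) -> dict:
    """Return normalized energy in low/mid/high bands for one audio frame."""
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    bands = {"low": (20, 250), "mid": (250, 2000), "high": (2000, 8000)}
    total = spectrum.sum() + 1e-9  # avoid division by zero on silence
    return {name: spectrum[(freqs >= lo) & (freqs < hi)].sum() / total
            for name, (lo, hi) in bands.items()}

def motor_levels(energies: dict) -> dict:
    """Map band energies to per-motor duty cycles (0..1); bass drives the seat base."""
    return {
        "seat_base": min(1.0, 2.0 * energies["low"]),   # strongest on bass hits
        "back_rest": min(1.0, 1.5 * energies["mid"]),
        "shoulders": min(1.0, 1.0 * energies["high"]),
    }

if __name__ == "__main__":
    rate = 44_100
    t = np.arange(rate // 10) / rate                   # one 100 ms frame
    frame = np.sin(2 * np.pi * 60 * t)                 # a pure 60 Hz bass tone
    print(motor_levels(band_energies(frame, rate)))    # seat_base dominates
```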
But mostly, Kia says the system is designed to enhance safety by providing haptic warnings from the vehicle’s advanced driver-assist systems.
The SEED Car concept is a pedal-electric hybrid with a 100 km (62 mile) range. It pairs pedal input from the driver with a high degree of electric power assistance to make pedaling effortless. For longer trips, Kia designed the BIRD Car, an AV shuttle with greater range. Both use R.E.A.D.
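For context on how such pedal-electric assistance typically works (Kia hasn’t published SEED Car drivetrain details), here is a minimal, hypothetical pedelec-style assist curve; the assist ratio, torque cap, and cutoff speed are all assumptions:

```python
# Hypothetical sketch of pedal-assist control in the spirit of "a high degree
# of electric power assistance." These numbers are not Kia's specifications.

def motor_torque_nm(rider_torque_nm: float, speed_kmh: float,
                    assist_ratio: float = 4.0, max_torque_nm: float = 60.0,
                    cutoff_kmh: float = 45.0) -> float:
    """Scale rider input by a large assist ratio, tapering to zero at a cutoff speed."""
    if speed_kmh >= cutoff_kmh:
        return 0.0
    taper = 1.0 - speed_kmh / cutoff_kmh  # fade the assist as speed climbs
    return min(max_torque_nm, assist_ratio * rider_torque_nm * taper)

if __name__ == "__main__":
    # Light pedaling at city speed still yields substantial motor help.
    print(round(motor_torque_nm(rider_torque_nm=10.0, speed_kmh=20.0), 1))  # 22.2
```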
Lastly, V-Touch is Kia’s virtual touch-type gesture control technology. It uses a 3D camera to monitor the driver’s and passengers’ eyes and fingertips, letting anyone manage in-car features through a head-up display. Hand and finger gestures are something many mobility makers are working on as an intuitive way to interact with the cabin environment, such as lighting, heating, ventilation and air-conditioning (HVAC), and the entertainment system. Ultimately, this translates to fewer buttons, fewer touch screens, and more open spaces.
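Kia hasn’t published a V-Touch API, but the gaze-plus-gesture pattern described (look at a control, pinch to select it, move your hand to adjust it) can be sketched as follows. The cabin zones, event fields, and pinch logic are hypothetical stand-ins:

```python
# Hypothetical sketch of gaze-plus-gesture selection, as in camera-based
# systems like V-Touch. The zone names, event fields, and adjustment step
# are illustrative assumptions, not Kia's design.

from dataclasses import dataclass

@dataclass
class Frame:
    gaze_target: str      # which cabin zone the 3D camera says the eyes rest on
    pinch_detected: bool  # thumb and index fingertip brought together
    hand_dy: float        # vertical fingertip motion since the last frame (+ = up)

# Cabin zones a gaze can land on, each with one adjustable parameter.
CONTROLS = {
    "hvac_vent":   "cabin_temperature",
    "media_panel": "volume",
    "dome_light":  "brightness",
}

def handle(frame: Frame, settings: dict) -> dict:
    """Pinch while looking at a zone 'grabs' its control; hand motion adjusts it."""
    control = CONTROLS.get(frame.gaze_target)
    if control and frame.pinch_detected:
        new_value = settings.get(control, 0.5) + 0.05 * frame.hand_dy
        settings[control] = max(0.0, min(1.0, new_value))  # clamp to 0..1
    return settings

if __name__ == "__main__":
    state = {"volume": 0.5}
    state = handle(Frame("media_panel", pinch_detected=True, hand_dy=2.0), state)
    print(state)  # volume nudged up while gazing at the media panel
```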
According to our last talks with James Bell, Director of Corporate Communications at Kia Motors America (KMA), “the R.E.A.D. System analyzes a driver’s emotional state in real-time through bio-signal recognition.” This is done using sensors that read facial expressions, heart rate, and electrodermal activity on the steering wheel. The gathered data helps the AI adjust the interior environment, altering conditions to appeal to the five senses inside the cabin. Think of it as a way to make the driving experience a positive one by enhancing the cabin environment.
Kia’s AI applies deep learning to its passengers and drivers: it establishes a baseline of user behavior, identifies patterns and trends, and can then customize the cabin into a better-adapted environment.
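To make that concrete, here is a minimal sketch of the baseline-and-deviation idea: keep a rolling history of one signal, then flag readings that depart from the rider’s norm. The window size and threshold are illustrative assumptions, not Kia’s values:

```python
# Hypothetical sketch of behavior-baseline learning: track a rolling history
# of one signal and flag readings that deviate notably from the user's norm.

from collections import deque

class Baseline:
    def __init__(self, window: int = 200, threshold: float = 1.5):
        self.readings = deque(maxlen=window)  # rolling history of one signal
        self.threshold = threshold            # deviations beyond this are "unusual"

    def update(self, value: float) -> bool:
        """Add a reading; return True if it deviates notably from the baseline."""
        unusual = False
        if len(self.readings) >= 20:          # need some history before judging
            mean = sum(self.readings) / len(self.readings)
            var = sum((x - mean) ** 2 for x in self.readings) / len(self.readings)
            std = var ** 0.5 or 1.0           # avoid over-flagging a flat baseline
            unusual = abs(value - mean) > self.threshold * std
        self.readings.append(value)
        return unusual

if __name__ == "__main__":
    hr = Baseline()
    for beat in [72, 74, 71, 73, 72] * 5:     # a calm commute builds the baseline
        hr.update(beat)
    print(hr.update(110))                     # True: a spike the cabin could react to
```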
How AI & AV Are Linked To Our Future Mobility Needs
It seems there isn’t a single auto press release these days that doesn’t use the words AI and AV. Yet both of these loosely defined terms are building blocks of our future mobility needs. And motorcyclists, don’t feel too smug: BMW is working on AI and AV motorcycles as well. The deeper question echoes an old one: the introduction of personal computers in the late 1980s was supposed to free up our time, so what will AVs actually do for us? How will this new, enhanced digital life make our experience better?
Mr. Albert Biermann, President and Head of the Research & Development Division of Hyundai Motor Group, answered those questions at CES like this:
“Kia considers the interactive cabin a focal point for future mobility, and the R.E.A.D. System represents a convergence of cutting-edge vehicle control technology and AI-based emotional intelligence. The system enables continuous communication between driver and vehicle through the unspoken language of ‘feeling’, thereby providing an optimal, human-sense oriented space for the driver in real-time.”
