Electronic Products & Technology

Western University researchers take self-driving cars to the next level

EP&T Magazine   

In-vehicle cameras monitor driver health and state of mind

All drivers know the importance of keeping their eyes on the road. Now, a leading research team at Western University is demonstrating that in the emerging world of self-driving cars, it’s just as important for the road (in other words, your car) to keep its eyes on you. And this summer, they’re tapping into the expertise of a top student from India to advance their technology: a simple in-vehicle camera that can detect a driver’s general state of being as well as their gaze.

Harshita Mangotra, an undergraduate student in electronics and communication engineering at Indira Gandhi Delhi Technical University for Women, is one of 651 international students in Ontario this summer – and 2,220 across Canada – who are helping to solve tough innovation challenges through a unique initiative called the Mitacs Globalink Research Internship program.

Mangotra is working under the guidance of Professor Soodeh Nikan in Western University’s Department of Electrical and Computer Engineering to advance the technology, which uses pupil measurements and changes in facial expression and skin colour to detect a driver’s current state, including their mental load, their emotion and their overall well-being. The goal is to advance Level 3 autonomous driving, a level where drivers can legally take their eyes off the road under certain conditions and focus on other tasks instead, such as using a laptop or tablet, or eating with both hands.

Still alert the driver to intervene

“The more control the automated system has, the more freedom the driver is given,” explained Nikan, noting that though rare, Level 3 cars are starting to appear in California and Japan, and are expected to continue to gain momentum. “Yet, when the car encounters something unexpected, it will still alert the driver to intervene, and our goal is to make sure the driver is in the right state to resume that control when it does happen,” she added.

Cameras currently used in Level 2 self-driving cars monitor a driver’s visual attention only. If their gaze veers off the road for too long, the car alerts them to put their hands back on the steering wheel.

The Western University technology is different because it is the first to use a simple, inexpensive camera to monitor pupil size and facial changes as indicators that a driver is under stress, angry, has high blood pressure or is too distracted to concentrate, and is therefore unable to safely resume driving. “If the cognitive load of the driver is high, for example, it means they are incapable of correctly analyzing the unexpected situation that caused their self-driving car to raise the alarm in the first place. That means the automated system should either maintain control, perform a safe emergency stop, or call 911, depending on the situation,” explained Nikan.
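
As a rough illustration of the decision Nikan describes, the Python sketch below encodes the choice between handing control back to the driver, keeping automated control, stopping safely and calling 911. The function, its inputs and the threshold are illustrative assumptions for this article, not the project’s actual implementation.

# Hypothetical sketch of a Level 3 handover decision; the names and
# the 0.5 load threshold are assumptions, not the Western team's code.
from enum import Enum, auto

class Action(Enum):
    HAND_OVER_TO_DRIVER = auto()
    MAINTAIN_AUTOMATED_CONTROL = auto()
    SAFE_EMERGENCY_STOP = auto()
    CALL_EMERGENCY_SERVICES = auto()

def choose_action(cognitive_load: float, medical_emergency: bool,
                  can_continue: bool) -> Action:
    """Decide what the car should do when it needs the driver to intervene."""
    if medical_emergency:          # e.g., signs of a health event behind the wheel
        return Action.CALL_EMERGENCY_SERVICES
    if cognitive_load < 0.5:       # assumed threshold: driver is fit to resume
        return Action.HAND_OVER_TO_DRIVER
    # The driver is too loaded to analyze the situation correctly, so the car
    # keeps control if it safely can, and otherwise performs a controlled stop.
    return (Action.MAINTAIN_AUTOMATED_CONTROL if can_continue
            else Action.SAFE_EMERGENCY_STOP)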

Accurately detect and calculate changes in pupil size

Mangotra’s contribution to the technology — which can be used with any self-driving car and is expected to be ready for simulated driver testing later this year — was to apply her expertise in computer vision to develop AI algorithms to accurately detect and calculate changes in pupil size in real time. Using a data set of more than 20 million video images of people driving under different scenarios, such as wearing glasses, blinking, changing their head position or squinting due to the sun, she was able to identify the best machine learning model to accurately estimate pupil size.
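
To give a flavour of the underlying measurement task, the sketch below estimates pupil radius from a cropped grayscale eye image using classical OpenCV operations. It is a deliberately simple baseline built on an assumed darkest-pixel heuristic; the team’s approach uses trained machine learning models rather than hand-tuned rules like these.

# Minimal, illustrative pupil-size baseline with OpenCV; not the
# machine learning model Mangotra evaluated.
import cv2
import numpy as np

def estimate_pupil_radius_px(eye_gray: np.ndarray) -> float:
    """Approximate pupil radius in pixels for a cropped grayscale eye image."""
    # The pupil is normally the darkest region of the eye; blur to suppress
    # glints, then keep only the darkest few percent of pixels.
    blurred = cv2.GaussianBlur(eye_gray, (7, 7), 0)
    dark_cutoff = float(np.percentile(blurred, 5))
    _, mask = cv2.threshold(blurred, dark_cutoff, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return 0.0
    # Fit the smallest enclosing circle to the largest dark blob.
    largest = max(contours, key=cv2.contourArea)
    _, radius = cv2.minEnclosingCircle(largest)
    return float(radius)

In real driving conditions, glasses, blinks, head pose and sunlight quickly break heuristics like this, which is exactly why the project benchmarked machine learning models on millions of varied driving images instead.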

A separate project is now using that model to associate changes in pupil size over time with changes in a person’s cognitive load. A third project is developing software to detect visual changes in a driver’s skin colour or expression, focusing on the forehead and cheek areas to look for indications of flushing, sweating or mood changes, for example.
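
In its simplest conceivable form, relating pupil dynamics to cognitive load could mean comparing the current pupil radius against a rolling baseline, as sketched below. The window length and the 15 per cent dilation threshold are invented for illustration and are not the project’s published parameters.

# Illustrative only: flags sustained pupil dilation relative to a
# rolling baseline as a crude proxy for elevated cognitive load.
from collections import deque

class PupilLoadEstimator:
    def __init__(self, window: int = 300, dilation_ratio: float = 1.15):
        self.history = deque(maxlen=window)   # ~10 s of radii at 30 fps (assumed)
        self.dilation_ratio = dilation_ratio  # assumed: 15% above baseline = high load

    def update(self, radius_px: float) -> bool:
        """Return True when the latest pupil radius suggests elevated load."""
        self.history.append(radius_px)
        if len(self.history) < self.history.maxlen:
            return False  # wait for enough samples to form a stable baseline
        baseline = sorted(self.history)[len(self.history) // 2]  # rolling median
        return radius_px > baseline * self.dilation_ratio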

It will interface with many of the driver monitoring systems

“Our goal is to have one camera to monitor a driver’s gaze, health and current state,” said Nikan, who is currently looking for an industry partner in Canada to help commercialize the technology. “When our technology is ready to go to market, we envision it will easily interface with many of the driver monitoring systems currently available,” she said.

For Mangotra, the opportunity to leave her home country for the first time and study abroad in a collaborative lab environment was an amazing experience she finds difficult to express in words. “Mitacs took care of everything, from the visa process to looking for accommodations, and I’ve felt 100% supported every step of the way,” she said. “The real difference between Canada and India was the teamwork I discovered in the research lab. It helped me to complete this project in such a short time, and I couldn’t have done it without the support of my peers.”
