
The AI in your non-autonomous car


Sorry. Your next car probably won’t be autonomous. But it will still have artificial intelligence (AI).

While most of the attention has been on advanced driver assistance systems (ADAS) and autonomous driving, AI will penetrate far deeper into the car. These overlooked areas offer fertile ground for incumbents and startups alike. So where, exactly, are the opportunities?

Inside the cabin

Inward-facing AI cameras can help prevent accidents before they occur. They are already widely deployed in commercial vehicles and trucks, where they monitor drivers for inebriation, distraction, drowsiness and fatigue and alert them in real time. ADAS, inward-facing cameras and driver coaching have been shown to drastically decrease insurance costs for commercial vehicle fleets.
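To make the drowsiness-monitoring idea concrete, here is a minimal sketch of a PERCLOS-style alert (the fraction of recent frames in which the driver's eyes are closed). It assumes a hypothetical upstream vision model that reports a per-frame `eyes_open` boolean; the class name and thresholds are illustrative, not any vendor's actual implementation.

```python
from collections import deque

class DrowsinessMonitor:
    """Toy PERCLOS-style check: alert when the fraction of recent
    frames with eyes closed exceeds a threshold. Assumes a
    hypothetical upstream vision model supplies eyes_open per frame."""

    def __init__(self, window: int = 30, threshold: float = 0.4):
        self.frames = deque(maxlen=window)  # rolling window of eye states
        self.threshold = threshold

    def update(self, eyes_open: bool) -> bool:
        """Ingest one frame's eye state; return True if an alert fires."""
        self.frames.append(eyes_open)
        closed = self.frames.count(False)
        return closed / len(self.frames) > self.threshold

monitor = DrowsinessMonitor(window=10, threshold=0.4)
# Driver alert for 5 frames, then eyes closed for 5 of the last 10:
alerts = [monitor.update(state) for state in [True] * 5 + [False] * 5]
print(alerts[-1])  # → True
```

A real system would fuse this with head pose, blink rate and steering input, but the rolling-window structure is the same.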

The same technology is beginning to penetrate personal vehicles to monitor driver behavior for safety purposes. AI-powered cameras also can identify when children and pets are left in the vehicle to prevent heat-related deaths (an average of 37 children die of vehicular heatstroke in the U.S. each year).

Autonomous ridesharing will need to detect passenger occupancy and seat belt engagement so that an autonomous vehicle can ensure passengers are safely aboard before driving off. These vehicles will also need to detect when items such as purses or cellphones are left behind upon departure.

AI also can help reduce crash severity in the event of an accident. Computer vision and sensor fusion will detect whether seat belts are fastened and estimate body size to calibrate airbag deployment. Real-time passenger tracking and calibration of airbags and other safety features will become a critical design consideration for the cabin of the future.

Beyond safety, AI also will improve the user experience. As consumer products, vehicles have lagged far behind laptops, tablets, TVs and mobile phones. Gesture recognition and natural language processing make perfect sense in the vehicle and will make it easier for drivers and passengers to adjust driving settings, control the stereo and navigate.

Under the hood

AI also can be used to help diagnose and even predict maintenance events. Today’s vehicle sensors produce a huge amount of data but spit out only simple trouble codes for a mechanic to interpret. Machine learning may be able to make sense of the widely disparate signals from all of these sensors to predict maintenance needs and prevent mechanical issues. This type of technology will be increasingly valuable for autonomous vehicles, which will not have a driver on hand to notice and interpret problems.
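A minimal sketch of the idea: learn a per-sensor baseline from healthy-vehicle data, then flag readings that deviate sharply from it. The sensor names and values below are invented for illustration, and a production system would use far richer models than a z-score, but the fit-then-flag structure is the same.

```python
from statistics import mean, stdev

def fit_baseline(readings):
    """Learn per-sensor (mean, std) from healthy-vehicle history."""
    return {name: (mean(vals), stdev(vals)) for name, vals in readings.items()}

def flag_anomalies(baseline, sample, threshold=3.0):
    """Return the sensors whose current reading sits more than
    `threshold` standard deviations from the healthy baseline."""
    flags = []
    for name, value in sample.items():
        mu, sigma = baseline[name]
        if sigma > 0 and abs(value - mu) / sigma > threshold:
            flags.append(name)
    return flags

# Hypothetical healthy-vehicle telemetry:
healthy = {
    "coolant_temp_c":   [88, 90, 89, 91, 90, 89],
    "oil_pressure_psi": [40, 42, 41, 39, 41, 40],
}
baseline = fit_baseline(healthy)
print(flag_anomalies(baseline, {"coolant_temp_c": 115, "oil_pressure_psi": 41}))
# → ['coolant_temp_c']
```

The payoff over raw trouble codes is that deviations can be caught while they are still trends, before a fault code ever fires.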

AI also can be used to detect software anomalies and cybersecurity attacks. Whether an anomaly stems from malicious activity or merely buggy code, the effect may be the same, so vehicles will need to identify problems quickly, before they can propagate across the in-vehicle network.

Cars as mobile probes

In addition to providing ADAS and self-driving features, AI applied to a vehicle’s perception sensors (e.g., cameras, radar, lidar) can turn the car into a mobile probe. It can build high-definition maps for vehicle localization; identify road features and building facades to supplement in-dash navigation systems; monitor traffic, pedestrian movement and crime; and support a variety of emerging use cases.

Efficient AI will win

Automakers and suppliers are experimenting to see which features are technologically possible and commercially feasible. Many startups are tackling niche problems, and some of these solutions will prove their value. In the longer term, so many features will be possible (some cataloged here and some yet unknown) that they will compete for space on cost-constrained hardware.

Making a car is not cheap, and consumers are price-sensitive. Hardware tends to be the cost driver, so these piecewise AI solutions will need to run simultaneously on shared hardware. Their power requirements will add up quickly and could contribute significantly to the vehicle’s total energy consumption.

It has been shown that for some computations, algorithmic advances have outpaced Moore’s Law for hardware. Several companies have started building processors designed for AI, but these won’t be cheap. Algorithmic development in AI will go a long way toward enabling the intelligent car of the future. Fast, accurate, low-memory, low-power algorithms, like those from XNOR.ai,* will be required to “stack” these features on low-cost, automotive-grade hardware.
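To illustrate why such algorithms can be so cheap, binarized networks of the kind XNOR.ai popularized constrain weights and activations to +1/-1, so an n-element dot product collapses into a single XOR plus a popcount instead of n floating-point multiply-adds. A minimal sketch of that core trick (the encoding convention here is illustrative):

```python
def binary_dot(a_bits: int, b_bits: int, n: int) -> int:
    """Dot product of two length-n {-1, +1} vectors packed as bits,
    where bit 1 encodes +1 and bit 0 encodes -1. Matching bits
    contribute +1, mismatching bits -1, so the whole product is
    n minus twice the number of mismatches -- one XOR, one popcount."""
    diff = (a_bits ^ b_bits) & ((1 << n) - 1)  # bits where the signs differ
    return n - 2 * bin(diff).count("1")        # matches - mismatches

# a = [+1, +1, -1, +1] -> 0b1101; b = [+1, -1, +1, +1] -> 0b1011
print(binary_dot(0b1101, 0b1011, 4))  # → 0 (two matches, two mismatches)
```

On hardware, the XOR and popcount run over 64 weights per machine instruction, which is what lets many such features share a cheap, low-power automotive processor.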

Your next car will likely have several embedded AI features, even if it doesn’t drive itself.

* Full disclosure: XNOR.ai is an Autotech Ventures portfolio company.
