
How Self Driving Cars Work


Self-driving cars have become a hot topic in the industry in recent years. In this article, Daniel Wesley explains how self-driving cars work and the technology behind them.

Lidar sensors

Receptive optics – A separate lens receives returning beams, and the unit processes beam “flight time” and angle to form a 3D point cloud reproduction of surroundings.

Dispersing beams – Newer “solid state” (no mechanical moving parts) units work by dispersing a source beam into a grid of horizontal and vertical beams.

Multiple sensors – Data from multiple lidar sensors is combined since individual sensors have a limited view angle.
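The flight-time-and-angle calculation above can be sketched in a few lines. This is an illustrative simplification (real lidar units report calibrated range and angle values through their own firmware), but it shows how one beam return becomes one point in the 3D point cloud:

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def lidar_point(flight_time_s, azimuth_deg, elevation_deg):
    """Convert one beam's round-trip flight time and firing angles
    into an (x, y, z) point relative to the sensor."""
    # Round trip: the beam travels out and back, so halve the distance.
    distance = SPEED_OF_LIGHT * flight_time_s / 2
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = distance * math.cos(el) * math.cos(az)
    y = distance * math.cos(el) * math.sin(az)
    z = distance * math.sin(el)
    return (x, y, z)

# A return received 200 nanoseconds after firing, straight ahead,
# corresponds to an object roughly 30 metres away.
print(lidar_point(200e-9, azimuth_deg=0.0, elevation_deg=0.0))
```

Repeating this for every beam in the grid, across every lidar unit on the car, produces the combined 3D point cloud described above.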

Ultrasonic sensors

Element – Ultrasonic sensors have a round disc called an “element” that converts mechanical force (vibration) into an electrical signal, and vice-versa, for sending and receiving signals.

Ping – The element sends out a sound (called a “ping”) and listens for the response (the echo) many times per second.
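The ping-and-echo timing translates directly into a distance. A minimal sketch, assuming a fixed speed of sound in air (real sensors must compensate for temperature, which is one reason they are temperature-sensitive):

```python
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 °C; varies with temperature

def echo_distance_m(round_trip_s):
    """Distance to an obstacle: half the echo's round-trip time
    multiplied by the speed of sound."""
    return SPEED_OF_SOUND * round_trip_s / 2

# An echo heard 5.8 milliseconds after the ping puts the
# obstacle roughly one metre away.
print(round(echo_distance_m(0.0058), 2))
```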

GPS

Satellites – Connection to positioning satellites provides basic location awareness and preconstructed maps for trip planning.

Image sensors (cameras)

Exterior cameras – Cameras interpret the driving scene by processing large amounts of image data.

Driver monitoring – An interior-facing camera can detect driver gaze and attention, and can warn an inattentive driver if intervention is necessary.

Radar sensors

Antennas – A grid of antennas made of metal and other special materials emits electromagnetic waves.

Focusing lens – A plastic lens focuses the emitted waves.

Radar waves bounce around the environment, and the unit is tuned to receive returning waves at specific frequencies. A picture of the environment is generated by measuring the returned wave angle and travel time.
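The travel-time measurement works like lidar's, and because radar waves return at shifted frequencies from moving objects, the same unit can also estimate relative speed via the Doppler effect. A rough sketch, assuming a 77 GHz carrier (a common automotive radar band, used here for illustration only):

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s
CARRIER_HZ = 77e9  # assumed automotive radar carrier frequency

def radar_range_m(round_trip_s):
    """Distance to a reflector: half the round-trip travel time
    multiplied by the speed of light."""
    return SPEED_OF_LIGHT * round_trip_s / 2

def relative_speed_mps(doppler_shift_hz):
    """Closing speed from the Doppler shift: f_d = 2 * v * f0 / c,
    rearranged to v = f_d * c / (2 * f0)."""
    return doppler_shift_hz * SPEED_OF_LIGHT / (2 * CARRIER_HZ)

# A return 400 ns after emission is roughly 60 m away; a 5 kHz
# Doppler shift indicates a closing speed of roughly 9.7 m/s.
print(round(radar_range_m(400e-9), 1))
print(round(relative_speed_mps(5000.0), 2))
```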

Radar Sensors

Limitations – Can’t detect small objects, discern road markings or read signs.

Strengths – A quick, lower resolution readout of the environment that can function well in varying light and weather conditions. Can track 30+ objects at a time.

Image Sensors (cameras)

Strengths – Cameras can detect colour and contrast for functions like reading street signs, traffic signals and lane markings. Cameras can further classify objects; for example, to discern cyclists from pedestrians.

Limitations – Camera data requires powerful processing capability. Weather and light conditions can dramatically affect performance. Cameras may falsely identify printed images as real objects.


GPS

Strengths – GPS is a good starting point for trip planning.

Limitations – Current GPS systems are accurate to within about 2 metres. GPS can’t discern which way an object is facing, though it can determine travel direction.
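GPS determines travel direction from the change between consecutive position fixes. A sketch of that calculation using the standard initial-bearing formula (a simplification; real receivers also use Doppler measurements from the satellite signals):

```python
import math

def travel_heading_deg(lat1, lon1, lat2, lon2):
    """Initial bearing in degrees (0 = north, 90 = east) from
    one GPS fix to the next."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    y = math.sin(dl) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    return math.degrees(math.atan2(y, x)) % 360

# Two consecutive fixes moving due east along the equator.
print(round(travel_heading_deg(0.0, 0.0, 0.0, 0.001)))  # 90
```

Note that this gives the direction the car is moving, not the direction it is facing, which is exactly the limitation described above.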


Ultrasonic Sensors

Strengths – Ultrasonic sensors can detect transparent materials and liquids, and work well in high-glare environments where light-based sensors like cameras or lidar may fail.

Limitations – Susceptible to temperature fluctuations and wind.

Lidar Sensors

Strengths – Lidar recreates a detailed 3D view of the current driving environment.

Limitations – Lidar beam strength must operate within eye-safe levels. Lidar can sense reflective paint, but can’t detect colour or contrast, or transparent objects like glass.

Low Visibility

Radar works well in both light and dark conditions and can see through bad weather like fog, rain, and snow.

Lidar is fully functional in darkness but may be highly affected by weather particles like moisture in fog or raindrops.

Cameras can be more perceptive than human eyes in some respects, but are still susceptible to varying weather and light conditions.

Ultrasonic sensors are susceptible to temperature fluctuations and wind.


Traffic signs, signals and road markings

Image sensor data is analyzed to discern traffic signs, signals, road markings, and car signals like blinkers or tail lights.
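At its simplest, the colour and contrast analysis above starts from per-pixel colour values. A deliberately toy sketch of classifying a pixel's dominant colour, as one tiny ingredient of signal detection (production systems use trained neural networks over whole image regions, not rules like this):

```python
def dominant_channel(r, g, b, margin=40):
    """Return 'red', 'green', or 'other' for an (r, g, b) pixel,
    using an illustrative brightness margin between channels."""
    if r > g + margin and r > b + margin:
        return "red"
    if g > r + margin and g > b + margin:
        return "green"
    return "other"

print(dominant_channel(220, 40, 30))   # e.g. a red stop signal
print(dominant_channel(30, 200, 60))   # e.g. a green signal
```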

Other road occupants

Pedestrians, cyclists, and other road occupants might first register as a radar signal. Lidar could confirm highly accurate size and position in 3D space, while image sensors further classify the object.
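That division of labour between sensors can be sketched as a simple fusion step. The structure and names below are hypothetical, chosen to mirror the paragraph above: radar flags that something is there, lidar refines its position, and the camera classifies it:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class Detection:
    distance_m: float                           # radar: quick range estimate
    position_3d: Optional[Tuple[float, float, float]] = None  # lidar refinement
    label: Optional[str] = None                 # camera classification

def fuse(radar_range_m, lidar_point=None, camera_label=None):
    """Combine per-sensor observations of one object into a
    single detection, keeping whatever each sensor contributed."""
    det = Detection(distance_m=radar_range_m)
    if lidar_point is not None:
        det.position_3d = lidar_point   # refine location with lidar
    if camera_label is not None:
        det.label = camera_label        # classify with the camera
    return det

obj = fuse(42.0, lidar_point=(41.8, 1.2, 0.3), camera_label="cyclist")
print(obj.label, obj.position_3d)
```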



Close proximity maneuvers

Ultrasonic sensors are very reliable at shorter distances, making them ideal for close-quarters maneuvers like parallel parking. Image sensors and lidar may also assist.
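A parking assistant built on those short-range readings can be as simple as mapping distance to a warning level, much like a parking sensor's beeper. The thresholds here are illustrative, not taken from any real system:

```python
def parking_warning(distance_m):
    """Map an ultrasonic distance reading to a warning level
    (thresholds are made-up values for illustration)."""
    if distance_m < 0.3:
        return "stop"
    if distance_m < 0.6:
        return "slow"
    if distance_m < 1.5:
        return "caution"
    return "clear"

print(parking_warning(0.5))  # slow
```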



Highway driving

Radar is ideal for highway driving tasks like tracking other cars and keeping an adequate following distance at speed.

Image sensors interpret road markings for lane keeping.

Ultrasonic sensors assist with close-proximity highway maneuvers like passing or merging.
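Keeping an adequate following distance comes down to comparing the radar-measured gap against a speed-dependent minimum. A sketch, assuming the common "two-second rule" (the gap should cover at least two seconds of travel at the current speed):

```python
GAP_SECONDS = 2.0  # assumed minimum time gap to the car ahead

def min_following_distance_m(speed_mps):
    """Minimum gap in metres at a given speed."""
    return speed_mps * GAP_SECONDS

def too_close(radar_gap_m, speed_mps):
    """True if the radar-measured gap is below the time-based minimum."""
    return radar_gap_m < min_following_distance_m(speed_mps)

# At 30 m/s (~108 km/h) the minimum gap is 60 m, so a 45 m
# radar reading means the car should ease off.
print(min_following_distance_m(30.0))
print(too_close(45.0, 30.0))
```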


Daniel Wesley – Founder of Quote.com

Dan Wesley is a columnist at Forbes and Entrepreneur.com, as well as the founder of Quote.com – a content-rich, free resource that educates consumers on making smarter decisions about their finances.
