How Self-Driving Cars Work
By BMaaS Contributor · Posted on May 11, 2018 · 8 min read

Self-driving cars have become a hot topic in the industry in the past few years. In this article, Daniel Wesley explains just how self-driving cars work, and the great technology behind them.

Lidar sensors
Receptive optics – A separate lens receives returning beams, and the unit processes beam “flight time” and angle to form a 3D point cloud reproduction of the surroundings.
Dispersing beams – Newer “solid state” units (no mechanical moving parts) work by dispersing a source beam into a grid of horizontal and vertical beams.
Multiple sensors – Data from multiple lidar sensors is combined, since individual sensors have a limited viewing angle.

Ultrasonic sensors
Element – Ultrasonic sensors have a round disc called an “element” that converts mechanical force (vibration) into an electrical signal and vice versa, for both sending and receiving signals.
Ping – The element sends out a sound (called a “ping”) and listens for the response (the echo) many times per second.

GPS
Connection to positioning satellites provides basic location awareness and preconstructed maps for trip planning.

Image sensors (cameras)
Exterior cameras – Cameras interpret the driving scene by processing large amounts of image data.
Driver monitoring – An interior-facing camera can detect driver gaze and attention, and can warn an inattentive driver if intervention is necessary.

Radar sensors
Antennas – A grid of antennas made of metal and other special materials emits electromagnetic waves.
Focusing lens – A plastic lens focuses the emitted waves.
Radar waves bounce around the environment, and the unit is tuned to receive returning waves at specific frequencies.
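Lidar, radar and ultrasonic units all share one ranging principle: time a signal's round trip to an object and halve it. A minimal sketch of that "flight time" calculation (constants and function names here are illustrative, not from any vendor SDK):

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s, for lidar and radar beams
SPEED_OF_SOUND = 343.0          # m/s in air at ~20 degrees C, for ultrasonic pings

def range_from_round_trip(round_trip_s, wave_speed):
    """Distance to an object from the round-trip 'flight time' of a signal.

    The signal travels to the object and back, so the one-way
    distance is half of speed x time.
    """
    return wave_speed * round_trip_s / 2.0

# A lidar return after ~66.7 ns puts the object about 10 m away.
lidar_range = range_from_round_trip(66.7e-9, SPEED_OF_LIGHT)

# An ultrasonic echo after 12 ms puts an obstacle about 2 m away.
ping_range = range_from_round_trip(12e-3, SPEED_OF_SOUND)
```

The same arithmetic underlies all three sensor families; they differ in the wave used and therefore in range, resolution and weather sensitivity, as the comparisons below show.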
A picture of the environment is generated by measuring the returned wave angle and travel time.

Radar sensors
Limitations – Can’t detect small objects, discern road markings or read signs.
Strengths – A quick, lower-resolution readout of the environment that can function well in varying light and weather conditions. Can track 30+ objects at a time.

Image sensors (cameras)
Strengths – Cameras can detect colour and contrast for functions like reading street signs, traffic signals and lane markings. Cameras can further classify objects; for example, to discern cyclists from pedestrians.
Limitations – Camera data requires powerful processing capability. Weather and light conditions can dramatically affect performance. Cameras may falsely identify printed images as real objects.

GPS
Strengths – GPS is a good starting point for trip planning.
Limitations – Current GPS systems are accurate to within about 2 metres. GPS can’t discern which way an object is facing, though it can determine travel direction.

Ultrasonic
Strengths – Ultrasonic sensors can detect transparent materials and liquids, and work well in high-glare environments where light-based sensors like cameras or lidar may fail.
Limitations – Susceptible to temperature fluctuations and wind.

Lidar sensors
Strengths – Lidar recreates a detailed 3D view of the current driving environment.
Limitations – Lidar beam strength must operate within eye-safe levels.
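The 3D point cloud that lidar builds comes from converting each beam's measured range and firing angles into a Cartesian point. A sketch of that conversion (the axis and angle conventions are an assumption; real units use their own coordinate frames):

```python
import math

def beam_to_point(distance_m, azimuth_rad, elevation_rad):
    """Convert one lidar beam's range and firing angles into an
    (x, y, z) point relative to the sensor origin."""
    horizontal = distance_m * math.cos(elevation_rad)  # projection onto the ground plane
    x = horizontal * math.cos(azimuth_rad)   # forward
    y = horizontal * math.sin(azimuth_rad)   # left
    z = distance_m * math.sin(elevation_rad) # up
    return (x, y, z)

# Sweeping a grid of beams (as a solid-state unit's dispersed grid does)
# and converting each return yields the point cloud.
cloud = [
    beam_to_point(10.0, math.radians(az), math.radians(el))
    for az in range(-45, 46, 15)   # horizontal sweep
    for el in range(-10, 11, 5)    # vertical sweep
]
```

Combining clouds from several sensors then means transforming each cloud by its sensor's mounting position before merging, which is how the limited view angle of an individual unit is overcome.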
Lidar can sense reflective paint, but can’t detect colour or contrast, or transparent objects like glass.

Low Visibility
Radar works well in both light and dark conditions and can see through bad weather like fog, rain, and snow.
Lidar is fully functional in darkness but may be highly affected by weather particles like moisture in fog or raindrops.
Cameras are many times more perceptive than humans but still susceptible to varying weather and light conditions.
Ultrasonic sensors are susceptible to temperature fluctuations and wind.

City
Traffic signs, signals and road markings – Image sensor data is analyzed to discern traffic signs, signals, road markings, and car signals like blinkers or tail lights.
Other road occupants – Pedestrians, cyclists, and other road occupants might first register as a radar signal. Lidar can confirm highly accurate size and position in 3D space, while image sensors further classify the object.
Close-proximity maneuvers – Ultrasonic sensors are very reliable at shorter distances, making them ideal for close-quarters maneuvers like parallel parking. Image sensors and lidar may also assist.

Highway
Radar is ideal for highway driving tasks like tracking other cars and keeping an adequate following distance at speed.
Image sensors interpret road markings for lane-keeping ability.
Ultrasonic sensors assist with close-proximity highway maneuvers like passing or merging.

Author
Daniel Wesley – Founder of Quote.com
Dan Wesley is a columnist at Forbes and Entrepreneur.com, as well as the founder of Quote.com – a content-rich, free resource that educates consumers on making smarter decisions about their finances.
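The layered detection described in the City section above – radar flags an object first, lidar pins down its size and position, the camera classifies it – can be sketched as a toy fusion pass. All class names, fields and labels here are illustrative, not any production system's API:

```python
from dataclasses import dataclass

@dataclass
class TrackedObject:
    # Radar provides the first, coarse detection: range and bearing.
    range_m: float
    bearing_deg: float
    # Lidar refines 3D size; the camera adds a class label.
    size_m: tuple = None
    label: str = "unknown"

def fuse(radar_hits, lidar_boxes, camera_labels):
    """Toy fusion pass: start from radar detections, then attach
    lidar geometry and camera classification by detection index."""
    objects = [TrackedObject(r, b) for r, b in radar_hits]
    for i, obj in enumerate(objects):
        if i < len(lidar_boxes):
            obj.size_m = lidar_boxes[i]
        if i < len(camera_labels):
            obj.label = camera_labels[i]
    return objects

tracks = fuse(
    radar_hits=[(18.0, -5.0), (32.0, 12.0)],
    lidar_boxes=[(0.6, 0.6, 1.7), (1.8, 4.5, 1.5)],
    camera_labels=["pedestrian", "car"],
)
```

Real systems associate detections by position and time rather than by index, but the division of labour between the three sensor types is the same.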