Self-driving cars: do you want one? A car that drives itself, so the driver doesn’t have to worry about anything. Sounds amazing, right?
I know driving a car is fun, I have learned it too. But there are times when we really need a self-driving car. Say you want to go on a very long drive, one that takes 24 hours; at some point you are going to get tired.
In this post I will walk you through how a self-driving car learns to drive, what components are used, and how technology has made the dream of an autonomous vehicle come true.
But before we dive in, it is important to keep one thing in mind: if a human can do a job or task, a machine can learn to do it too. We can teach machines how to do it under human supervision.
How do we drive cars?
I know most of you already know this ;). I just wanted to set the picture. These are the basic controls and senses a human uses while driving.
In a self-driving car, all of these are operated automatically by Electronic Control Units, or ECUs for short.
Not just self-driving cars: all modern cars are equipped with ECUs, which govern multiple functions such as doors, battery management, brake control, etc.
All the ECUs in a modern car are coordinated by a central computer, or master ECU. If the central computing unit fails, backup ECUs take over basic functionality, just to add safety.
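The failover idea above can be sketched in a few lines of Python. This is a hypothetical illustration: the `ECU` class and `active_ecu` helper are made-up names for this post, not real automotive software.

```python
# Hypothetical sketch of master-ECU / backup-ECU failover.
# Names and structure are illustrative, not from a real vehicle stack.

class ECU:
    def __init__(self, name, functions):
        self.name = name
        self.functions = functions  # what this unit can control
        self.healthy = True

def active_ecu(master, backup):
    """Pick the unit that should be in control right now."""
    return master if master.healthy else backup

master = ECU("master", ["doors", "battery", "brakes", "steering"])
backup = ECU("backup", ["brakes", "steering"])  # basic functions only

print(active_ecu(master, backup).name)  # master is healthy -> "master"
master.healthy = False                  # simulate a central-computer fault
print(active_ecu(master, backup).name)  # backup takes over -> "backup"
```

Real systems add watchdog timers and redundant buses on top of this, but the core idea is the same: a degraded-but-safe fallback is always available.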
Safety is the most important factor when it comes to autonomous vehicles. More on active safety is here.
Sensing and Detection Technologies That Make Self-Driving Cars Possible
An autonomous vehicle must be aware of the objects around it. The following are a few important, widely used sensing and detection modules that give a car those abilities.
Each of them has its pros and cons, and sensing ability depends on environmental factors such as weather.
Acoustic sensors are the ears of a modern car. They collect sound, pressure, and vibration data; you can think of them as microphones.
These sensors use piezoelectric crystals to convert sound waves into electrical signals. More details can be found here.
Acoustic sensors are used for on-road noise testing, e.g. brake-noise and engine-noise testing.
Ultrasonic sensors are mostly used at short range, for example in features like auto parking.
They use sound waves to precisely detect objects.
The job of these sensors is to measure parking distance: an ultrasonic sensor can detect objects in a range of about 4–6 metres.
Modern cars are equipped with around 10–12 ultrasonic sensors across the front and rear.
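The distance measurement itself is simple echo timing: the sensor emits a pulse and measures how long the echo takes to come back. A minimal sketch, assuming sound travels at roughly 343 m/s in air (the function name is mine, for illustration):

```python
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 °C

def echo_to_distance(round_trip_s):
    """Distance to an obstacle from an ultrasonic echo's round-trip time.
    The pulse travels to the object and back, so divide by two."""
    return SPEED_OF_SOUND * round_trip_s / 2.0

# A 23.3 ms round trip corresponds to roughly 4 m -- near the upper
# end of a typical parking sensor's range.
print(echo_to_distance(0.0233))  # ~4.0 m
```

This is also why the usable range tops out at a few metres: beyond that, the echo is too weak and too slow for reliable parking-speed updates.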
Radars are very popular because they have been in use for a very long time, in aircraft and other military defense systems.
Radar technology uses electromagnetic waves in the radio-frequency spectrum: a transmitter antenna sends out radio waves, and a receiver antenna picks up the reflections.
The best thing about radar is that bad weather cannot affect it much, whereas for a camera the weather is quite an important factor in object detection.
Radar detects the angle, velocity, and range of an object with great accuracy.
A self-driving car can carry from a minimum of 5 to a maximum of 21 radars, depending on the area to be covered.
Depending on their range, radars are classified into SRR, MRR, and LRR:
SRR – Short Range Radar | Range – approximately 30 m.
MRR – Mid Range Radar | Range – approximately 60 m.
LRR – Long Range Radar | Range – approximately 250 m.
Each radar type mentioned above has its own range and range-resolution characteristics, which make it useful for specific tasks.
For example, for parking assist and collision detection at both the front and rear of the vehicle, SRR and MRR are used, while LRR is used to detect upcoming obstacles and measure their velocity and range.
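As an illustration of how radar measures velocity: the reflected wave comes back with a Doppler frequency shift, and that shift maps directly to the target’s radial speed. A minimal sketch, assuming a typical 77 GHz automotive radar carrier (the numbers are illustrative):

```python
C = 3.0e8          # speed of light, m/s
F_CARRIER = 77e9   # typical automotive radar carrier, 77 GHz

def doppler_to_velocity(doppler_shift_hz):
    """Radial velocity of a target from the measured Doppler shift.
    The factor of 2 accounts for the out-and-back trip of the wave."""
    return doppler_shift_hz * C / (2 * F_CARRIER)

# A 15.4 kHz shift at 77 GHz corresponds to a closing speed of 30 m/s
# (about 108 km/h).
print(doppler_to_velocity(15400))  # 30.0 m/s
```

This is why radar is so good at velocity: the speed falls straight out of the frequency measurement, with no need to track the object over multiple frames.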
LiDAR is an acronym for Light Detection And Ranging.
Before teaching a car how to drive, cars are manually driven millions of miles under all kinds of road and weather conditions. The data generated during this manual driving is then processed by data analysts to write algorithms.
The data generated by LiDAR is the key ingredient in this whole self-driving recipe.
LiDAR generates 3D models of on-road objects. It uses lasers to create high-resolution 3D models of objects located up to roughly three football fields away, and yes, in 360 degrees.
LiDAR has one big disadvantage: it is highly expensive. On the other hand, it works in most weather conditions.
A self-driving car has 1–5 such LiDARs of different ranges.
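Each laser return is essentially a measured range plus the two angles the beam was fired at; turning that into a 3D point is plain spherical-to-Cartesian geometry. Millions of such points per second form the 3D model described above. A minimal sketch (the function name is mine, for illustration):

```python
import math

def lidar_return_to_point(range_m, azimuth_deg, elevation_deg):
    """Convert one laser return (range + beam angles) into an (x, y, z)
    point in the sensor's own coordinate frame."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)  # forward
    y = range_m * math.cos(el) * math.sin(az)  # left
    z = range_m * math.sin(el)                 # up
    return (x, y, z)

# A return at 100 m straight ahead, level with the sensor:
print(lidar_return_to_point(100.0, 0.0, 0.0))  # (100.0, 0.0, 0.0)
```

A spinning LiDAR sweeps the azimuth through 360 degrees while firing many beams at fixed elevations, which is exactly how the full-circle point cloud is built.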
Cameras are less expensive than LiDAR, but both have their own disadvantages.
A camera can detect traffic signs, traffic lights, pedestrian movement, and lane markers, and even temperature in the case of thermal cameras.
LiDAR uses laser light to detect objects, but in bad weather conditions such as fog, light is not a good solution.
Cameras provide high-resolution, real images of pedestrians and other objects, which a classifier can label as objects.
Objects can then be categorized as moving or static, living or nonliving. This data helps deliver a smooth autonomous driving experience.
These cameras are located in the side mirrors, the front bumper, and below the nameplates on both sides. A self-driving car has 5–14 such cameras.
The disadvantage of cameras is lighting: in low-light conditions a camera does not perform well.
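As a toy illustration of the moving/static distinction mentioned above, one can compare an object’s position across two consecutive frames. Real systems use trained classifiers and optical flow; the function and data below are made up purely to show the idea:

```python
# Hypothetical sketch: label detections as moving or static by comparing
# their estimated positions across two consecutive camera frames.

def label_motion(prev_positions, curr_positions, threshold_m=0.2):
    """prev/curr map an object id to its estimated (x, y) position in
    metres. Objects that shifted more than the threshold are 'moving'."""
    labels = {}
    for obj_id, (x, y) in curr_positions.items():
        if obj_id not in prev_positions:
            labels[obj_id] = "new"  # first time we see this object
            continue
        px, py = prev_positions[obj_id]
        moved = ((x - px) ** 2 + (y - py) ** 2) ** 0.5
        labels[obj_id] = "moving" if moved > threshold_m else "static"
    return labels

prev = {"ped1": (5.0, 2.0), "sign1": (12.0, -3.0)}
curr = {"ped1": (5.5, 2.1), "sign1": (12.0, -3.0)}
print(label_motion(prev, curr))  # pedestrian moving, sign static
```

Even this crude version shows why the distinction matters: a static sign can be mapped once, while a moving pedestrian has to be tracked and predicted.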
Each of these technologies, mainly radar, LiDAR, and camera, has its shortcomings.
A computing engine must put all the data together and process it in real time.
Sensor fusion is considered the secret sauce for combining this data in self-driving cars.
We will learn about sensor fusion in upcoming posts. For now you can watch this video:
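As a tiny taste before that post: one classic fusion idea is an inverse-variance weighted average, where each sensor’s estimate is weighted by how precise it is. The numbers below are invented for illustration; real fusion stacks use Kalman filters and far richer models.

```python
def fuse(estimates):
    """Inverse-variance weighted average of independent estimates.
    Each entry is (measured_value, variance); a more precise sensor
    (smaller variance) gets a proportionally larger weight."""
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    return sum(w * v for w, (v, _) in zip(weights, estimates)) / total

# Radar (noisy) and LiDAR (precise) both measure distance to the same car:
readings = [(50.8, 4.0),   # radar: 50.8 m, variance 4.0
            (50.1, 0.25)]  # lidar: 50.1 m, variance 0.25
print(round(fuse(readings), 2))  # ~50.14 m, pulled toward the lidar
```

The fused estimate sits much closer to the precise sensor, which is exactly the behavior you want when, say, fog degrades one modality but not the other.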
Conclusion and What’s Next?
- In this post, we learned the basics of how a car senses its environment.
- We covered the sensing technologies that help self-driving cars.
- Self-driving cars are not based on machine learning and deep learning alone; there are a lot of technologies underneath.
- We haven’t talked about the software side of self-driving cars yet. We will discuss it in great detail.
- The technology is still evolving, and there is a lot for machines to learn too.
Mobileye is an Intel company that provides autonomous driving solutions. Watch their full AV ride video.
Thanks for reading 🙂
Feel free to leave feedback. More articles are on their way.
Very useful, Shrikant!!
Hope you come up with more interesting topics
Thank you Shubham. Yes this will be a complete series. Stay tuned.
Very knowledgeable, Shrikant.
I am waiting for next topic.
Thank you Rishab. Next topic will be published soon 🙂
Damn it was awesome and I really want to learn more. Good work sir, keep it up!!
Glad you liked it
Useful information about external peripherals required for Autonomous Vehicle Driving………Keep it up Shrikant
Thank you Krishna. More articles on Self-driving car SW architecture are coming soon. 🙂
Interesting Blog. Would like to read more.
Awesome and knowledgeable. Eagerly waiting for next articles.
Thank you, Palash. The next article is on the way.
Hey Shrikant, Great Article!
I really like the work done by Comma AI, specifically open-sourcing OpenPilot. They have a very different approach to this problem.
Tesla is far ahead of the competition. Though Google has spent around $4B on Waymo, they want to go directly to L4. We are still at Level 2; it is an incremental process to reach Level 4. Mobileye is also amazing, but I hope they don’t start messing up like their parent.
Thanks! Looking for More Learning.
Yes, I agree the process is incremental. Machines will take time to learn. I hope we train them hard enough for Indian roads. Thank you for the feedback, Swapnil, really appreciated. Looking forward to adding more interesting articles like this.
Appropriate content, I love it.
Thanks for the feedback.