Tesla’s AI is about to get better, and not just for Full Self-Driving

Tesla’s use of artificial intelligence in its electric vehicles is about to get better in a new software update, according to CEO Elon Musk, and he is not just talking about the automaker’s Full Self-Driving effort.

While Tesla is famous for its investment in artificial intelligence for self-driving technology, the automaker has also been using its expertise in machine learning and other branches of AI to develop other features related to operating its vehicles.
The best example is its automatic wiper feature.
Tesla is certainly not the first automaker to implement automatic wipers, but most other automakers use a rain sensor to detect the intensity of the rain or snowfall in order to automatically adjust the speed of the wipers.
Instead, Tesla decided to put its computer vision system to the test by using its cameras to detect rain and snowfall intensity and automatically adjusting the wipers based on that information.
The automaker created a new “Deep Rain” neural net to handle the task.
It was rough at first and clearly not as reliable as a system based on a traditional rain sensor, but it has improved over the last few years to become more useful.
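The idea of driving the wipers from a vision model's output can be sketched in a few lines. This is a hypothetical illustration loosely in the spirit of the "Deep Rain" approach described above; the score range, setting names, and thresholds are assumptions, not Tesla's actual values.

```python
# Hypothetical sketch: map a vision model's rain-intensity score to a
# wiper setting. The 0.0-1.0 score range and the thresholds below are
# illustrative assumptions, not Tesla's actual implementation.

def wiper_speed_from_rain_score(rain_score: float) -> str:
    """Translate a rain-intensity score (0.0 = dry, 1.0 = downpour)
    into a discrete wiper setting."""
    if rain_score < 0.1:
        return "off"
    elif rain_score < 0.4:
        return "intermittent"
    elif rain_score < 0.7:
        return "low"
    else:
        return "high"
```

The hard part, of course, is producing a trustworthy `rain_score` from camera frames in the first place; the mapping itself is trivial, which is why Tesla's effort went into the neural net rather than the control logic.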
Now Musk has been teasing an important improvement to Tesla’s computer vision capability coming in the next Full Self-Driving Beta update.
Over the last few months, the CEO has been hinting that Tesla will move to a completely vision-based system, one that won’t even rely on the radar in its vehicles.
Musk explained his logic:

When radar and vision disagree, which one do you believe? Vision has much more precision, so better to double down on vision than do sensor fusion.
Sensors are a bitstream and cameras have several orders of magnitude more bits/sec than radar (or lidar). Radar must meaningfully increase signal/noise of bitstream to be worth the complexity of integrating it. As vision processing gets better, it just leaves radar far behind.
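Musk’s bandwidth point can be sanity-checked with rough arithmetic. The camera resolution, frame rate, and radar point counts below are illustrative assumptions for the sake of the comparison, not Tesla’s actual sensor specifications.

```python
# Back-of-envelope comparison of camera vs. radar data rates.
# All figures are illustrative assumptions, not Tesla specifications.

camera_bits_per_sec = 1280 * 960 * 8 * 36  # one 1280x960 8-bit camera at 36 fps
radar_bits_per_sec = 1000 * 64 * 20        # ~1,000 radar returns of 64 bits each at 20 Hz

ratio = camera_bits_per_sec / radar_bits_per_sec
print(f"camera/radar bit-rate ratio: ~{ratio:.0f}x")
```

Even a single assumed camera comes out hundreds of times higher in raw bits per second than the assumed radar, which is the gap Musk is pointing to when he argues radar must "meaningfully increase signal/noise" to justify the complexity of fusing it in.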

This update is expected to come to those who are using Tesla’s Full Self-Driving Beta, which is the automaker’s “feature complete” version of full self-driving.
While it automates all the tasks related to driving, the driver is actually still responsible for the vehicle and needs to be paying attention and be ready to take control at all times.
Now Musk says that the computer vision and AI improvements are also going to bring enhancements to Tesla’s other automated features, like the auto wipers:

Good point.
Next major software rev will do much better with automating wipers, seat heating & defrost.
Probable seat settings just based on occupant mass distribution should be possible.
— Elon Musk (@elonmusk) April 10, 2021

Tesla’s next major software revision is expected to be Tesla v11, which should arrive soon with the first deliveries of the refreshed Model S and Model X.
Those new vehicles require brand-new software, and a version of it leaked last month.