Tesla Deep Rain

It's late 2019 and Tesla have announced the release of Deep Rain, the first machine-learnt, neural-net solution to automatic windscreen wipers. You've got to ask the question, though... why?

Tesla press release: "Automatic wipers have been improved to be more likely to activate when it is lightly raining and respond to changes in rain intensity for more environments. The automatic wipers are now employing the first production deep neural network trained with over 1 million images for the detection of water droplets on a windshield and additional weather cues."

Quick history lesson

Before many of us were born, Cadillac experimented with rain sensors in the 1950s; GM installed them on convertibles, and by 1996 they were adjusting wiper speed to match conditions. It's tried-and-trusted technology that costs about $1 a car and has been fitted to most cars ever since. Even Tesla installed such a sensor on their cars until the launch of their own Autopilot hardware in late 2016.

What Tesla hadn't done back then was work out how to control the wipers automatically without the sensor. They planned to use the cameras to see if it was raining, but their early attempts took a year to arrive and simply didn't work.

The wipers were erratic, didn't work in the dark, and most owners gave up on them. It wasn't so much of a problem with the MS and MX as the control was on the stalk by the steering wheel, but the minimalist design of the M3 meant touching the large screen and finding the option to turn them on. As the joke goes, you can tell it doesn't rain much in Fremont.

The challenge

Tesla have big ideas but often go to market before they've fully resolved them. Auto wipers without a rain sensor are just one example. The challenge they faced is: how do you know the windscreen needs wiping? The old-fashioned sensor worked by detecting moisture on the glass, and one would think a camera could do the same, i.e. by seeing the moisture. Except the optics don't work that way: the focal length of the camera, the focus point and the depth of field are all set up to see into the distance, not what's on the glass only a few millimetres away. Hold a finger in front of your eyes and look into the distance; while you can just about make out that there's a finger there, you can still see the distance relatively clearly. This is ideal for Autopilot, as small specks of dirt on the camera don't impinge on its ability to work.
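As a rough illustration of why the droplets themselves are essentially invisible, here's a back-of-the-envelope thin-lens calculation. The camera parameters (focal length, f-number, pixel pitch, distances) are assumed for the sake of the example, not Tesla's actual optics:

```python
# Back-of-the-envelope defocus estimate with ASSUMED camera parameters.
# For a thin lens focused at infinity, a point at distance s (> f) from
# the lens blurs into a disc of diameter A * f / s on the sensor, where
# A is the aperture diameter and f the focal length.

def blur_disc_mm(focal_mm, f_number, distance_mm):
    aperture_mm = focal_mm / f_number
    return aperture_mm * focal_mm / distance_mm

# Assumed camera: 6 mm focal length, f/2 lens, 3 micron pixel pitch.
f_mm, n, pixel_mm = 6.0, 2.0, 0.003

# A droplet on the glass roughly 20 mm from the lens...
blur_near = blur_disc_mm(f_mm, n, 20.0)
print(round(blur_near / pixel_mm))   # smeared across ~300 pixels

# ...versus an object 10 m down the road.
blur_far = blur_disc_mm(f_mm, n, 10_000.0)
print(round(blur_far / pixel_mm))    # ~1 pixel: effectively sharp
```

Whatever the exact numbers on the real hardware, the ratio is the point: something on the glass is spread over hundreds of pixels while the road scene stays sharp, so there is no crisp droplet for the network to find.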

As a result, Tesla has to look at wider environmental factors to determine whether the wipers need to operate, and do so in all conditions and at all times of day. It's no longer a sensor that determines whether the screen is wet, but a system that works out whether it's likely to be wet. They claim they look for water droplets on the screen, but in our experience they can only be looking for the distortion those droplets cause to distant images.
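To make the distinction concrete, here's a toy sketch of what "likely to be wet" inference from indirect image cues might look like. It is entirely hypothetical: the features, weights and thresholds are invented for illustration and bear no relation to Tesla's actual network:

```python
# Hypothetical sketch of indirect rain inference (NOT Tesla's pipeline).
# Instead of sensing moisture directly, combine image statistics that
# rain tends to affect: it softens distant edges, lowers contrast, and
# smears bright light sources. All inputs are assumed to be in 0..1.

def rain_score(edge_sharpness, contrast, light_smear):
    # Lower sharpness/contrast and higher smear all push the score up.
    return (0.4 * (1 - edge_sharpness)
            + 0.3 * (1 - contrast)
            + 0.3 * light_smear)

def wiper_speed(score):
    if score < 0.3:
        return 0   # off
    if score < 0.6:
        return 1   # intermittent
    return 2       # continuous

print(wiper_speed(rain_score(0.9, 0.9, 0.1)))  # dry, sharp scene -> 0
print(wiper_speed(rain_score(0.1, 0.2, 0.9)))  # blurred, smeared -> 2
```

The fragility is visible even in the toy version: anything else that softens the image (fog, glare, a dirty lens, darkness) moves the score the same way rain does, which is exactly the failure mode owners reported.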

So why...?

There is a compelling argument that Tesla needs to understand the weather conditions around the car. This is a material input into the Autopilot software, as it has a bearing on how far the car can see, how fast things may be moving, and whether it's slippery, hot or foggy, and the car can use this as a parameter for self-driving. But none of those use cases are directly relevant to wiping the windscreen. Tesla, with their "must do things differently" approach, threw out of the window a $1 sensor that could have augmented the systems they had.

What should they have done?

Hindsight is a wonderful thing but here's what we think Tesla should have done if they really wanted to develop a better system.

These are simple steps that would result in a system that, in theory, would be as good as the sensors if not better. There is, however, still the open question of whether it is possible at all. There is a fundamental assumption that looking out of the window tells you whether the window needs wiping, and we're not sure that case is proven. It's like looking at a million pictures of people to determine whether they have high blood pressure: you would get many right, but wouldn't it be easier just to measure their blood pressure, which is what the sensor did? Tesla are making a virtue out of Deep Rain, but in reality it's a problem that didn't really need solving.