
Driver of Tesla killed in first fatal crash while in autopilot mode

July 1, 2016

By Paul Homewood




From the Telegraph:


The driver of an electric Tesla saloon car has been killed in a road accident while it was in autopilot mode.

It is believed that the crash, which took place in Florida on May 7, is the first fatality involving a “self-driving” car. The driver was named by several American news outlets as Joshua Brown from Ohio.

The Tesla’s autopilot mode deploys an array of safety technology including an automatic braking system, designed to bring the vehicle to a halt ahead of a possible collision.

It also uses cameras, sensors and radar to steer the car and prevent it from colliding with anything else on the road.

Mr Brown posted a number of videos of the car’s autopilot system in operation including one which was seen a million times after being tweeted by Elon Musk, Tesla’s chief executive.

According to an obituary, Mr Brown was a member of the Navy SEALs for 11 years and founder of Nexu Innovations Inc., working on wireless Internet networks and camera systems. In Washington, the Pentagon confirmed Brown’s work with the SEALs and said he left the service in 2008.

A spokesman for Tesla said the features were intended to keep the car in its lane and designed for motorway driving only.

When putting the car into autopilot, drivers are instructed to keep their hands on the steering wheel and maintain control of the vehicle at all times.

The accident is being investigated by the National Highway Traffic Safety Administration.

“What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S,” Tesla said in a statement.

“Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied.”


Would the driver still be alive if he had not been using the driverless technology? We don’t know, but it seems likely he would have been paying much more attention.

In my view, these cars are being rushed onto roads with obscene haste for reasons of political correctness.

Certainly, the Tesla autopilot should now be banned until it can be proven to be totally safe. 

  1. July 1, 2016 10:41 am

So it’s about “auto-pilot” and driverless cars, and not about Tesla as such! However, the following story was published today; Google “John Petersen and Tesla” for other great analyses

    • July 1, 2016 11:08 am

      mmm so basically they don’t know how often the battery will need replacing..
      please remember the green slime brigade are intent on degrowth,.. via a systematic undermining of the industrial base of modern primarily western and democratic nations..

      For this read permanent and deepening recessions and by necessity the curtailing of all
      mobility .. particularly air travel and road travel..
      in this brave new world we will be consigned to ‘zing railways’ and bicycles, well at least in Lincolnshire…

the rest of you will have to saddle up your shanks’ ponies; no doubt you will enjoy the ride and the very slow moving scenery, cause you won’t have much else to occupy your days…

      • Swisspeasant permalink
        July 1, 2016 12:00 pm

I love the comment buried in the document. “In a future economy labour costs may be much lower”. In other words, we will all be peasants then.

      • Sara Hall permalink
        July 1, 2016 5:36 pm

I frequently visit my elderly father, who lives near Spalding, and often see the regular double decker that runs from there to Boston…most often with just a couple of passengers aboard. I just don’t think many people want to go to Boston anymore, which is really sad, as it used to be such a beautiful place, full of historic buildings for a start. The Zing plans are just plain crazy.

  2. July 1, 2016 11:41 am

What I’ve heard sounds like very bad (criminally negligent) engineering. You don’t give a driver an unsafe auto-drive option with a section in the manual to say that it’s not really auto-drive. The BBC reports so far have been totally uncritical and fawning; just imagine if there had been a death at a fracking well.

    • July 1, 2016 5:42 pm

      If driving tests include manual supervision of auto-drive mechanisms then such mechanisms are OK, but I think it is criminally negligent to offer such mechanisms to drivers who have had no training in their use.

      Auto-pilots in aircraft are OK, because pilots are trained and examined in their use.

      • catweazle666 permalink
        July 3, 2016 4:50 pm

        “Auto-pilots in aircraft are OK, because pilots are trained and examined in their use.”

        And yet Air France Flight 447 crashed when the autopilot dropped out due to pitot tube icing and the co-pilot promptly pulled back the stick and held it there while the aircraft stalled at around 35,000 feet and pancaked into the Atlantic.

        That was an elementary mistake, even a kid playing with a game simulator knows that in a stall you push the stick forward!

Mind you, had the plane not been fly-by-wire, with pressure- as opposed to movement-operated flight controls, one of the other crew might have noticed the stick was right back and reacted accordingly.

In the accident in question, it seems the forward sensors were unable to cope with a large white surface and low sun, which raises the question of why it depends on visual inputs and not radar or ultrasonics.
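The failure mode described in the comment above — a vision system washed out by a large white surface against low sun — is exactly the case where cross-checking independent sensors helps. A minimal sketch of such a fusion rule (purely illustrative; this is not Tesla's actual logic, and all names here are invented):

```python
def should_brake(camera_sees_obstacle: bool,
                 radar_sees_obstacle: bool,
                 radar_confidence: float) -> bool:
    """Illustrative fusion rule: brake on either confident channel.

    A camera-only rule fails when glare washes out the image; requiring
    agreement from BOTH sensors fails the other way (radar may dismiss
    a high, flat return as an overhead sign). This sketch brakes on a
    single high-confidence detection from either channel.
    """
    if camera_sees_obstacle:
        return True
    # Radar alone can trigger braking if its return is strong enough.
    return radar_sees_obstacle and radar_confidence > 0.8

# Glare blinds the camera, but a strong radar return still brakes:
print(should_brake(camera_sees_obstacle=False,
                   radar_sees_obstacle=True,
                   radar_confidence=0.9))  # True
```

The trade-off the threshold encodes is false braking (phantom obstacles) versus missed obstacles; where to set it is the hard engineering problem, not the rule itself.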

  3. Steve Crook permalink
    July 1, 2016 12:04 pm

    The idea of being able to prove that driverless cars are totally safe is just daft. They only have to be better than having a human driving. The insurance business will arrange premiums to suit the risk just as they would with people.

    The accident does highlight potential problems with having a mix of driverless and human driven cars on the same road.

    It’ll be possible to attribute failures to the driverless systems and therefore companies will be held liable in lawsuits where human failings wouldn’t. I suspect the law on dangerous driving will need some finessing.

    Even if the units fail in some circumstances where humans wouldn’t there will be plenty more where it’s the other way round….

    • Alan permalink
      July 1, 2016 12:54 pm

Agreed Steve. And if we wanted to ban them until they are 100% safe, then shouldn’t we do the same with cars driven by humans? In which case, I’d better dust off my walking boots!
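The "better than a human" comparison above can be made concrete with back-of-envelope arithmetic. The figures below are the ones Tesla quoted in its June 2016 statement (one US road fatality per roughly 94 million miles, versus one in just over 130 million Autopilot miles); they are illustrative only, since Autopilot miles are disproportionately easy motorway miles, not a controlled comparison:

```python
# Rough fatality-rate comparison per mile driven, using the figures
# Tesla quoted in June 2016. Not a like-for-like comparison: Autopilot
# is engaged mostly on motorways, the safest kind of driving.
human_miles_per_fatality = 94e6
autopilot_miles_per_fatality = 130e6

human_rate = 1 / human_miles_per_fatality
autopilot_rate = 1 / autopilot_miles_per_fatality

print(f"Human:     {human_rate:.2e} fatalities per mile")
print(f"Autopilot: {autopilot_rate:.2e} fatalities per mile")
print(f"Ratio: {human_rate / autopilot_rate:.2f}x")  # ~1.38
```

On these raw numbers the human rate is about 1.4 times higher, but with a single Autopilot fatality the uncertainty is enormous — which is precisely why "prove it is totally safe" and "show it beats a human" are such different bars.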

  4. July 1, 2016 1:38 pm

The belief in technical fail-safe systems is quite naive. All car manufacturers tell drivers to rely on their assistant systems – I’d call them toy systems. Mothers turn around in their cars to look after their kids and don’t worry about what’s going on on the road.

Having been an electronics technician in the German naval air arm back in the ’70s, I was once confronted with a problem that was never solved and was unrepeatable except in one given configuration:
Two navigation devices – a TACAN transceiver built with valves and a VOR receiver built with transistors, each with an independent indicator and running on a separate current circuit – showed the following error: climbing above 50 ft(!) over ground, the indicators deviated 45° from the true direction, one going minus and the other plus 45°, that is 90° apart from each other. The aircraft, a Breguet Atlantic, is spacious, so a search for the error during flight was possible. As it turned out, only these two specific devices on this one plane reproduced the behaviour. Swapping just one component fixed it. And descending below 50 ft fixed it.
We never found out whether a third device was interfering: it was an ECM plane filled with hundreds of electronic devices of all flavours.

I sent the devices to the manufacturers. They couldn’t find any errors. All that remained was to ensure that these two devices were never again combined on any aircraft.

My conclusion then, and still, is: do not trust technical systems 100%. You may get into an inexplicable situation like the one described above and be sued for damage you aren’t (directly) responsible for.
    Maybe your next mobile in your next-generation car will…

Epilogue: had the pilots not noticed this erroneous behaviour, they might have been shot down at the inner German border, flying into GDR airspace at that time.

  5. Tom O permalink
    July 1, 2016 1:53 pm

    Actually, as long as they can say the driver is supposed to be paying attention, the company won’t have to take the blame, and it doesn’t matter if it is said only “somewhere in the manual.”

Governments will continue to push for the development of self-driving cars simply because it will give them yet another way to always know where you are and what you are doing. Self-driving cars are not intended, then, to make the highway safer, but to ensure that government has as much control over you as possible. And since self-driving cars are computer controlled, they will have circuitry in them that allows them to be taken over remotely by “authorities”, and you might find your self-driving car crossing the lane straight into an oncoming self-driving semi tractor-trailer IF you happen to be someone considered a troublemaker. Just a natural observation from the way government has trended over these past decades.

    • July 1, 2016 2:46 pm

Yep, true Tom O. Same as smart meters, which are in reality another underhand scheme: not to assist the consumer in monitoring their energy usage, but to have absolute power and control over when and how much power is available (also immediate adjustment of the tariffs charged). If it is decreed you will not have power, it can be switched off if your face doesn’t fit. Think hard before you permit your meter to be changed…

  6. Bloke down the pub permalink
    July 1, 2016 2:05 pm

They could do with the equivalent of a back-seat driver: a computer voice that has a moan at the human to pay more attention if the auto-drive is making too many corrections.

  7. Gamecock permalink
    July 2, 2016 12:17 am

    ‘When putting the car into autopilot, drivers are instructed to keep their hands on the steering wheel and maintain control of the vehicle at all times.’

    Then what is the autopilot for ???

    There have been a few Airbuses lost due to autopilot problems. It’s going to happen with cars, too. Sensor error, and situations the programmers didn’t anticipate are bound to happen.

I can even imagine teenagers seeing a car being auto-driven and starting to play with it to see what reaction they get. Like drifting into its lane to see what it does.

  8. Dave N permalink
    July 2, 2016 6:21 am

    I’m puzzled by the condition being described as being unable to discern a white tractor from a bright sky: does the design not include something as simple as some radar type of device for detecting objects nearby? Even older non-autopilot cars use that for warnings when reversing.

    I expect there’s more to this than what has been reported.

    • Ex-expat Colin permalink
      July 2, 2016 8:13 am

I think there must be more to it. There is a vid showing a vehicle cutting in on one of these auto vehicles at a highway feed-off (USA, I think). I suspect a sensor saw it overtake early on approach and cut in, but the auto system did not have a procedure for handling “move sideways/brake”. The driver steered it to the right slightly…JIT.

      Finite State Machines/Automata…ah yes!
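The finite-state-machine remark above points at a real failure class: if the designers never defined a transition for an event, the controller has no response to it at all. A toy sketch (illustrative only; the states and events are invented, not anything from a real vehicle):

```python
# Toy FSM for a lane-keeping controller. The transition table is the
# whole behaviour: any (state, event) pair missing from it falls
# through to "stay in the current state", i.e. do nothing.
transitions = {
    ("cruising", "car_ahead_slows"): "braking",
    ("cruising", "lane_drift"): "correcting",
    ("braking", "clear_road"): "cruising",
    ("correcting", "centred"): "cruising",
    # No entry for ("cruising", "vehicle_cuts_in_and_brakes") --
    # the unanticipated event simply produces no reaction.
}

def step(state: str, event: str) -> str:
    """Return the next state, or stay put if no transition exists."""
    return transitions.get((state, event), state)

print(step("cruising", "car_ahead_slows"))             # braking
print(step("cruising", "vehicle_cuts_in_and_brakes"))  # cruising (!)
```

The second call is the worrying one: an event nobody enumerated leaves the machine serenely "cruising", which is one plausible reading of the cut-in video described above.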

  9. Stonyground permalink
    July 2, 2016 10:38 am

I don’t think that the Zing railway concept is totally daft. The idea of running it on batteries is, though, because it is way more sensible to use overhead cables. But I have always wondered why regular passenger trains are so massively heavy; they seem to be hugely overengineered.

Comments are closed.
