Self-driving evolution and design challenges

By Paul Wildman, with David Waterworth

In this short article, we aim to address key issues about the philosophy behind ‘self-driving’ and its connections to aircraft cockpit design. Neither of us is an expert on the subject, as our studies are in the humanities. We both own Teslas, and we have requested input from a systems engineer on AI and neural networks. The various websites below are included so that you, dear reader, can ask your own questions – and please include your thoughts in the comments section below this article.

What happened to Air France flight 447? We are now aware of all the drama surrounding AF447. To us it is an example of the consequences of a certain type of human-automation interface, and of what happens when that interface fails. On June 1, 2009, on a flight from Rio de Janeiro to Paris, three pilots lost control of a fully functioning Airbus A330, which stalled while under their manual control and struck the Atlantic Ocean in a near-horizontal attitude at high vertical speed, with the loss of all 228 people on board. It took two years to find the black boxes, which lay at a depth of nearly 4 km. In May 2021, Airbus and Air France were ordered to stand trial on charges of involuntary manslaughter.

Loss of control usually occurs when pilots fail to recognize and correct a potentially dangerous situation, leaving a perfectly functional aircraft in an unstable, unrecoverable state. In aviation this is called Loss of Control In-flight (LOC-I) – distinct from Controlled Flight Into Terrain (CFIT), in which a crew unwittingly flies an airworthy aircraft into the ground.

Such a “loss of control” event, in our view, has huge implications for fully self-driving cars.

Where do AI and automation come into the picture? Artificial deep neural networks (DNNs) have recently seen great success in many key areas involving text, images, video, graphics, and so on. However, the black-box nature of DNNs has become one of the major obstacles to their wide adoption in mission-critical applications such as medical diagnosis, therapy, law, and social policy. Given the enormous potential of deep learning, increasing the interpretability of deep neural networks has therefore attracted a great deal of research attention recently.
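To illustrate what “black box” means here, consider a toy sketch of our own (in Python, with random weights standing in for a trained network – this is not Tesla’s code, just a minimal illustration): the output emerges from numbers flowing through numbers, and no individual weight encodes a rule a human could read off.

```python
import numpy as np

rng = np.random.default_rng(42)

# A toy two-layer network standing in for a trained DNN.
# The weights are random here, purely for illustration.
W1 = rng.normal(size=(4, 3))   # 3 inputs -> 4 hidden units
W2 = rng.normal(size=(1, 4))   # 4 hidden units -> 1 output

def tiny_dnn(x: np.ndarray) -> float:
    """Forward pass: matrix multiplies plus a nonlinearity.

    Nothing in W1 or W2 is individually meaningful, which is the
    'black box' problem in miniature - a real FSD-scale network
    has millions of such weights.
    """
    hidden = np.tanh(W1 @ x)
    return (W2 @ hidden).item()

print(tiny_dnn(np.array([0.2, -1.0, 0.5])))
```

Interpretability research asks how to explain outputs like this one after the fact, since the weights themselves will not tell you.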

The problem is that once DNNs are “released into the wild,” they are nearly impossible to correct. Tesla’s FSD is one such DNN. Without a human in the loop, automated decision systems of this kind can cause disasters like the Dutch childcare benefits scandal and Robodebt in Australia.¹

Furthermore, imagine a scenario next year when HASD (High Automation Self-Driving) comes into play: it is a rainy night on a country road, and the self-driving system suddenly gives up and hands control back. The driver must work out within seconds what on earth is going on and what the correct actions are.

Here we see the somewhat paradoxical danger in “nearly” foolproof automation, as with Air France: when the automation does quit, it quits at the moment of peak stress, with multiple dramas unfolding simultaneously and only a few ways out of the situation safely. In this admittedly rare situation, no automation can be better than partial automation. In the Air France case, for example, the pilots could no longer fly the aircraft manually (ie haptically).²

The six levels of autonomous driving – what happens to the haptics?

- Level 0 – no automation (historic motor vehicles)
- Level 1 – basic driver assistance (~2014)
- Level 2 – partial automation (~2017): cruise control, lane departure warning
- Level 3 – conditional automation (~2020): adaptive cruise control, Tesla self-driving on the highway
- Level 4 – high automation (~2023): FSD Beta
- Level 5 – full automation (~2025): no steering wheel, etc.
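The crucial handover question can be put in a few lines of code. Here is a minimal sketch (our own, in Python; the names are invented for illustration) of the point where responsibility flips: up to Level 3 the human remains the fallback, while from Level 4 onward the system itself must reach a safe state when it fails.

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """The six driving-automation levels listed above."""
    NO_AUTOMATION = 0
    DRIVER_ASSISTANCE = 1
    PARTIAL_AUTOMATION = 2
    CONDITIONAL_AUTOMATION = 3
    HIGH_AUTOMATION = 4
    FULL_AUTOMATION = 5

def human_is_fallback(level: SAELevel) -> bool:
    """True when the human must stay ready to take over at any moment.

    At Levels 0-3 the driver remains the fallback; at Levels 4-5 the
    system itself must bring the vehicle to a safe stop if it fails.
    """
    return level <= SAELevel.CONDITIONAL_AUTOMATION

for level in SAELevel:
    print(level.name, "-> human fallback:", human_is_fallback(level))
```

Everything this article worries about – haptics, stress, seconds to react – lives on the boundary where `human_is_fallback` flips from True to False.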

The issue of civil liability in Level 3 and 4 accidents is a huge one, and potentially one that is delaying the adoption of self-driving vehicles. Intriguingly, as of March 2022, Mercedes-Benz has accepted legal responsibility for its autonomous “Drive Pilot” system, which can automatically brake, accelerate, and steer within its lane on approved stretches of highway.

Now we go back to Air France and the design of autonomous flying/driving, especially for L3 & L4. Airbus places a sidestick at each side of the cockpit, and because the two sticks are not mechanically connected, one pilot cannot tell haptically (ie by feel) what the other pilot is doing.

This is in contrast to Boeing’s cockpit design, where the control yokes in front of each pilot move in unison because they are mechanically linked. So yes, the pilot is still in the loop – but at the beginning (DeLorean) or at the end (Tesla³)? The difference is subtle, and very important.

Where is the driver in the loop? If Tesla’s philosophy of automation is to take the driver out of the loop, then at best the driver becomes an extension of the car, as with Airbus. In the DeLorean/Boeing approach, the car/aircraft is not designed to become a “shield” between the driver and the outside world/road conditions. (See minutes 16–18 of this 30-minute podcast.) DMC CEO Joost de Vries explains the difference between being driven by the car and driving the car – be it a Tesla, Mercedes, Volkswagen, or DMC. However, will DeLorean get stuck at L3?

While Level 5 would mean virtually no accidents involving human error, the next 3–5 years of getting there will probably be a difficult period, with accidents at the human | machine interface as control is transferred from one to the other and people are progressively removed from the loop. We eagerly await DeLorean’s take on automation.


¹ DNNs are a form of AI, loosely modeled on the human brain. They can also undertake ‘machine learning’, and this is where ‘machine ethics’ enters, stage left – the ‘hive mind’ responsible for ‘moral machines’. They are qualitatively different from the more familiar everyday control systems. A control system manages, commands, directs, or regulates the behavior of other devices or systems using basic positive or negative feedback loops (eg a thermostat or motor controller). In this article, we are talking about the former (DNNs).
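To make that contrast concrete, here is a minimal sketch (our own, in Python; the function name and numbers are invented for illustration) of the kind of simple on/off feedback loop meant here. Every decision is one inspectable rule – the opposite of a DNN’s millions of opaque learned weights.

```python
def thermostat_step(current_temp: float, setpoint: float,
                    heater_on: bool, hysteresis: float = 0.5) -> bool:
    """One tick of a simple on/off (bang-bang) feedback control loop.

    Below (setpoint - hysteresis) the heater switches on; above
    (setpoint + hysteresis) it switches off; in between it keeps
    its previous state to avoid rapid cycling.
    """
    if current_temp < setpoint - hysteresis:
        return True
    if current_temp > setpoint + hysteresis:
        return False
    return heater_on

# Example: room at 19.2 degrees C, target 21 -> the heater turns on.
print(thermostat_step(19.2, 21.0, heater_on=False))
```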

² Incidentally, a major reason for the early success of the Luftwaffe in World War II was that, because the Treaty of Versailles restricted Germany militarily, its pilots had to learn to fly in gliders – the essence of hands-on, haptic, manual flight. This would prove invaluable in the coming dogfights.

³ Elon Musk said in a recent interview that, in effect, if the driver has to make a decision, the car should make it for them.


Dr. Paul Wildman is a retired craftsman and academic. He was director of the Queensland Apprenticeship System for several years in the early 1990s and is keen to demonstrate the importance of craft, peer-to-peer manufacturing, collaboration, and ‘our commons’ in social, economic, and technological innovations such as EVs. Paul has been driving a Tesla for a long time, trying to prove that an old Fox Terrier can still learn new tricks. See Paul’s crafter podcasts: The paulx4u’s Podcast.


 

