Self-driving Cars – Why Are They a Coder's Nightmare?

We already know that self-driving cars are becoming a thing in this modern, exponentially growing technological world. We would not be surprised if they become indispensable to the everyday environment of futuristic cities, as these cities will be looking for countless ways to make daily life more efficient.

We have already seen big names take a shot at self-driving cars: Google; Intel, which reportedly created its first self-driving chip technology; Domino’s, which is already positioning itself as the flagship brand by bringing pizzas to your home in a self-driven car; and Tesla, just to name a few.

Some states have even unveiled their own sets of rules specifically for self-driven cars. There is absolutely no doubt that self-driving cars are coming, even though some big car manufacturers like BMW and Porsche are strongly against them, allegedly because they create cars for the human experience and the pleasure of driving cannot be taken away by machines.

We believe self-driving cars are a good thing, but as with everything in this world, with great power comes great responsibility. The arrival of self-driving cars has raised questions that remain unanswered and that have even become an ethical issue. The problem is that once self-driving cars become the standard way of transport, situations will inevitably arise that are catastrophic either for the passenger or for the people outside the car.

Take the classic ethics problem: which is the correct thing to do, sacrifice the passenger to save three pedestrians (or three people in another car), or the other way around? Will the AI these cars possess be able to identify the most moral way to act, or in this case to steer, in a situation where death is imminent? Will the AI ever learn to mimic human behavior, or even human ethics, accurately?
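To make the coder's dilemma concrete, here is a minimal, purely hypothetical sketch of what a "minimize harm" rule might look like in code. The Maneuver fields, the scenario and the equal weighting of lives are all invented for illustration; no real manufacturer's software is being described.

```python
# Hypothetical sketch only: a naive "minimize harm" rule a coder might be
# asked to encode. Names, fields and the scenario are invented for
# illustration, not taken from any real vehicle's software.
from dataclasses import dataclass


@dataclass
class Maneuver:
    name: str
    passengers_at_risk: int   # people inside the car likely to be harmed
    outsiders_at_risk: int    # pedestrians or occupants of other cars


def least_harm(options: list[Maneuver]) -> Maneuver:
    """Pick the maneuver that puts the fewest people at risk.

    Even this trivial rule already embeds an ethical stance: it treats a
    passenger's life and a pedestrian's life as exactly interchangeable,
    which is precisely the kind of judgment a coder would be forced to make.
    """
    return min(options, key=lambda m: m.passengers_at_risk + m.outsiders_at_risk)


if __name__ == "__main__":
    options = [
        Maneuver("stay in lane", passengers_at_risk=0, outsiders_at_risk=3),
        Maneuver("swerve into barrier", passengers_at_risk=1, outsiders_at_risk=0),
    ]
    print(least_harm(options).name)  # prints "swerve into barrier"
```

Change the weighting in that one line and a different person dies; that is the kind of decision that cannot simply be left to whoever happens to be writing the function.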

All these questions will, if they are not already, be directly linked to the coders in charge of programming these self-driving cars to make that sort of decision. And that is not the only weight on their shoulders.

How will manufacturers guarantee the security of self-driving cars, ensuring that the code written by coders at the direction of higher-ups (be it the government or the manufacturers) is not hacked or compromised into doing something completely different from what it was programmed to do?

Coders will have the responsibility both to program these cars and to bulletproof their work.
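As a purely illustrative sketch of where that bulletproofing might start, the snippet below refuses to install a driving-software image unless it matches a digest published by the manufacturer. The file names, the placeholder digest and the idea of a signed manifest are all assumptions for the sake of the example; real vehicles rely on far more elaborate signed-update schemes.

```python
# Hypothetical sketch only: a basic integrity check so the car refuses to run
# code that has been tampered with. File names and the expected digest are
# placeholders, not a real vehicle's update mechanism.
import hashlib
import os
import sys

# In practice this value would come from a signed manifest published by the
# manufacturer; here it is just a placeholder.
EXPECTED_SHA256 = "0" * 64


def firmware_is_untampered(path: str, expected_sha256: str) -> bool:
    """Return True only if the firmware image matches the published digest."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest() == expected_sha256


if __name__ == "__main__":
    path = sys.argv[1] if len(sys.argv) > 1 else "driving_policy.bin"
    if not os.path.exists(path):
        sys.exit(f"No firmware image found at {path}.")
    if not firmware_is_untampered(path, EXPECTED_SHA256):
        sys.exit("Refusing to install: image does not match the expected digest.")
    print("Firmware verified; safe to install.")
```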

How will the government guard against malicious coders? They could use their programming skills and knowledge to write code that overrides the reactions described above, or even program a car to seek out a group of people and head directly towards them.

We have already seen people using Ubuntu to create self-driving cars, and Uber is already testing self-driving cars to make pickups.

In the end, regardless of any problems that might arise, the government, or whoever ends up in charge of those coding decisions, will need to take ethics, logic and what best serves the public order into consideration before making a decision.

We believe that self-driving cars are imminent and that the issues we encounter now will be addressed accordingly, making self-driving cars the most popular form of public transportation and changing the idea that owning a private car is a luxury.
