I'm not a fan of the self-driving thing. I couldn't see trusting my family in or near one.
> Self-driving cars have already killed

AI is very dangerous, and I think there will be an incident in the future in which people get killed by AI, which will prompt regulators to watch it more closely. But I think full human-level conscious AI is still a few hundred years away, and it is debatable whether it is even possible.
> I don't think killer bots or drones are what we have to fear. If AI ever truly became self-aware, it would absolutely view us as a threat at some point. It could wipe us out with (digital) viruses, and there would be no way to stop it. Almost all of our municipal systems are in some way connected to the internet, and even shutting down the net would be impossible. The software could just wait out our demise and never have to fire one bullet.

They could shut off water and electricity, disrupt communications, and shut off any equipment using a battery or electrical cord (vehicles, defibrillators, house power, refrigerators, etc.).
> Self-driving cars have already killed

To be fair, people driving cars have killed people.
> To be fair, people driving cars have killed people.

That is true; right now we need to focus on correcting these problems.
> Self-driving cars have already killed

Yes, but not intentionally. A self-driving car can't make a conscious decision to kill someone. And the rate of passenger deaths in self-driving cars is overall much lower than in human-driven cars. LOL, people have also gotten sucked into factory machinery and died because of a lack of safety measures, and I would argue it is fundamentally the same: the machine didn't make a conscious decision to kill the human.
> Yes, but not intentionally. A self-driving car can't make a conscious decision to kill someone.

Agreed here; many so-called accidents are actually just unintentional injuries due to operator error.
We can't even begin to build a conscious, general AI until we have a strong understanding of what consciousness, self-awareness, and cognition are, and I think that is still a very long way off.