
JankhJankh

10 karmaJoined May 2022

Comments
13

It’s in the best interests of self-driving car manufacturers to choose the lives of their customers over others. It’s a matter of business, not ethics. But what will a self-driving car do if the pedestrian has more invested in the company than the driver? The correct business choice may be to kill the driver to keep the pedestrian safe. Once this decision is made, cars have assigned a value to each life. This is no longer a moral dilemma, but a simple calculation of scores, where the value of each life is based exclusively on its worth to the manufacturer. When self-driving cars have this data, they will put a bounty on the head of every person, place, and thing, inversely proportional to how much money it spends with the company. If enough accidents start happening, people may start investing in manufacturers just to stay safe from these four-wheeled bounty hunters.
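The "simple calculation of scores" could be sketched as follows. This is purely a hypothetical illustration of the dystopia described above; the function name, the inputs, and the decision rule are all my own assumptions, not any manufacturer's actual logic.

```python
# Hypothetical sketch of the "value of each life" comparison described
# above: a purely profit-driven controller protects whichever party is
# worth more to the manufacturer. All names and figures are illustrative.

def collision_priority(driver_investment: float,
                       pedestrian_investment: float) -> str:
    """Return whom a profit-maximizing controller would protect,
    based only on each party's financial value to the manufacturer."""
    if pedestrian_investment > driver_investment:
        return "protect pedestrian"
    return "protect driver"

# A pedestrian holding more stock than the driver wins the comparison.
print(collision_priority(driver_investment=500.0,
                         pedestrian_investment=25_000.0))
```

The point of the sketch is that once the comparison exists at all, "ethics" has already been reduced to a lookup of customer-lifetime-value figures.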

Workers at autonomous vehicle companies may find their self-driving cars less reliable on their drive home from union meetings.

Crowdsourced data for AI is safe only as long as the majority of the data is non-malicious. The fact that so many chatbots have been turned into Nazis is troubling.
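The majority condition can be made concrete with a minimal majority-vote sketch (my own illustrative example, not a reference to any particular system): the aggregate is trustworthy exactly as long as honest contributions outnumber malicious ones.

```python
from collections import Counter

def aggregate(labels):
    """Majority vote over crowdsourced labels. The result is only as
    trustworthy as the majority of contributors."""
    return Counter(labels).most_common(1)[0][0]

# Three honest contributors outvote two malicious ones...
print(aggregate(["safe", "safe", "safe", "toxic", "toxic"]))
# ...but the guarantee flips the moment the malicious share passes 50%.
print(aggregate(["safe", "safe", "toxic", "toxic", "toxic"]))
```

This is why coordinated brigading is the failure mode: the scheme has no defense once attackers stop being a minority.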

Much like the Internet of Things, Artificial Intelligence often finds itself in the homes of consumers before its first safety assessment.

An AI created to raise corporate profits will likely find a significant discount in replacing wages with threats of violence.

"Any AI smart enough to pass a Turing test is smart enough to know to fail it." – Ian McDonald


If AI is used by a military force, it will lead to a peace so peaceful that it will make people look back fondly on war.

Left unchecked, AI will optimize away everything that makes us human.

If an AI only sees the world as atoms, things within its grasp are bound to become disassembled.
