
If a self-driving car equipped with an AI-driven conscience gets in an accident, who's liable?

Discussion in 'Just Entertainment' started by Guts, Jun 14, 2016.

  1. Guts Black Swordsman


    Self-driving cars, equipped with artificial intelligence, can perhaps make ethical decisions in the face of an accident, but the outcome could be strongly influenced by the opinions of the engineers who wrote the software, Wired reports.

    “Even if a machine makes the exact same decision as a human being, I think we’ll see a legal challenge,” says Patrick Lin, director of the Ethics + Emerging Sciences Group at California Polytechnic State University, San Luis Obispo.

    Companies that make self-driving cars, including Google, are taking a significant risk, Lin says. “They’re replacing the human and all the human mistakes a human driver can make, and they’re absorbing this huge range of responsibility.”

    Also, Lin asks: Who would buy a car that opts to kill you in the event of an accident, even if it’s the most ethical decision?

    “No one wants a car that looks after the greater good,” he says. “They want a car that looks after them.”

    The U.S. National Highway Traffic Safety Administration is expected to release regulations for self-driving cars in July, the Detroit Free Press reports. Mark Rosekind, the agency’s administrator, said that self-driving car technology doesn’t have to be perfect to be acceptable, and that regulations can’t be so rigid that they fail to keep up with evolving technology.

    The article notes that in Nevada and Michigan, lawmakers are considering whether vehicles that navigate and assess safety without human input could change standards regarding who can have drivers’ licenses.

    “We need new safety metrics. We also are going to have to broaden our view on the data sources for what those metrics might be. We have laboratory work. We have simulations and real-world data,” said Rosekind, speaking at an annual conference sponsored by the vehicle technology group TU-Automotive.
     
  2. Granger


    I'd thought about that. Good points.
     
  3. Ddos_Dragon


    Probably the developer of the car's AI.

    That, or the driver, who can always take the wheel.
     
  4. Doomguy


    A.I.s have rights! Blame the A.I.!

    Seriously though, I have no idea. Maybe this should go into the Hall of Elders for debate.
     
  5. sayWut Head Market Research Analyst


    At the end of the day, the small print in the documentation that comes with the car would state who is to blame. It also depends on how, where and why the crash happens. I'm assuming that since self-driving cars with AI will eventually be fully commercial in XXX years, they may implement what planes have: a "black box" hard-wired into the car's interior wiring, recording the speed, navigation and all the other main aspects of the drive. Then, when an accident happens, the box can determine where the error occurred and ultimately who's responsible.
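
    Just to make the black-box idea above concrete, here is a rough sketch of what such a recorder might look like. Everything in it is made up for illustration (the class names, the fields, the 60-second / 10 Hz window), not any manufacturer's actual system: a fixed-size buffer keeps only the most recent driving state so investigators could replay the moments before a crash.

        from collections import deque
        from dataclasses import dataclass
        import time

        @dataclass
        class DriveSnapshot:
            timestamp: float      # seconds since the epoch
            speed_mps: float      # vehicle speed in metres per second
            steering_deg: float   # steering wheel angle in degrees
            autopilot_on: bool    # True if the AI, not the human, was in control

        class EventDataRecorder:
            """Keeps only the most recent snapshots, like an aircraft black box."""

            def __init__(self, max_snapshots=600):   # e.g. 60 seconds at 10 Hz
                self._buffer = deque(maxlen=max_snapshots)

            def record(self, speed_mps, steering_deg, autopilot_on):
                # Oldest entries fall off automatically once the buffer is full.
                self._buffer.append(
                    DriveSnapshot(time.time(), speed_mps, steering_deg, autopilot_on)
                )

            def dump(self):
                """Return the retained history for post-crash analysis."""
                return list(self._buffer)

        # Example: log one snapshot and read the history back.
        recorder = EventDataRecorder()
        recorder.record(speed_mps=13.4, steering_deg=-2.0, autopilot_on=True)
        print(recorder.dump())

    Whether investigators ever get to read that data, and who owns it, would of course be part of the same liability question the thread is debating.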
     
