{"id":18083,"date":"2022-10-11T00:00:00","date_gmt":"2022-10-10T22:00:00","guid":{"rendered":"https:\/\/c01.purpledshub.com\/bbcsciencefocus\/?post_type=purple_issue&#038;p=18083"},"modified":"2022-11-02T12:49:16","modified_gmt":"2022-11-02T11:49:16","slug":"dr-kate-darling-baby-you-can-drive-my-self-driving-car","status":"publish","type":"post","link":"https:\/\/c01.purpledshub.com\/bbcsciencefocus\/2022\/10\/11\/dr-kate-darling-baby-you-can-drive-my-self-driving-car\/","title":{"rendered":"Dr Kate Darling: Baby, you can drive my self-driving car"},"content":{"rendered":"\n<h5 class=\"has-text-align-center article-standfirst\"><span class=\"has-inline-color has-ccp-brown-color\">COMMENT<\/span><\/h5>\n\n<h3 class=\"has-text-align-center has-text-color\" style=\"color:#c30028\"><span style=\"color:#c30028\" class=\"has-inline-color\"><strong>DR KATE DARLING<\/strong>:<\/span><\/h3>\n\n<h3 class=\"has-text-align-center\">BABY, YOU CAN DRIVE MY SELF-DRIVING CAR<\/h3>\n\n<p class=\"has-text-align-center intro\">Human drivers should not be held responsible for accidents caused by autonomous vehicles <\/p>\n\n<figure class=\"no-tts wp-block-image article-in-image photo\"><img loading=\"lazy\" width=\"1680\" height=\"2048\" src=\"https:\/\/dj9jqhxgw9833.cloudfront.net\/uploads\/sites\/42\/2022\/10\/12eba385-8472-4a51-8854-3bae0bf5756a.jpg\" alt=\"\" class=\"no-tts wp-image-18082\" srcset=\"https:\/\/c01.purpledshub.com\/uploads\/sites\/42\/2022\/10\/12eba385-8472-4a51-8854-3bae0bf5756a.jpg 1680w, https:\/\/c01.purpledshub.com\/uploads\/sites\/42\/2022\/10\/12eba385-8472-4a51-8854-3bae0bf5756a-246x300.jpg 246w, https:\/\/c01.purpledshub.com\/uploads\/sites\/42\/2022\/10\/12eba385-8472-4a51-8854-3bae0bf5756a-840x1024.jpg 840w, https:\/\/c01.purpledshub.com\/uploads\/sites\/42\/2022\/10\/12eba385-8472-4a51-8854-3bae0bf5756a-768x936.jpg 768w, https:\/\/c01.purpledshub.com\/uploads\/sites\/42\/2022\/10\/12eba385-8472-4a51-8854-3bae0bf5756a-1260x1536.jpg 1260w\" 
sizes=\"(max-width: 1680px) 100vw, 1680px\" \/><\/figure>\n\n<p class=\"has-drop-cap article-full-body sans-serif dropcap\"><span style=\"color:#c30028\" class=\"has-inline-color\">I<\/span>n August 2022, the UK government announced a \u00a3100m plan to speed up the development and deployment of self-driving vehicles. The plan also calls for new safety regulation, including a bold objective to hold car manufacturers accountable. This would mean that when a vehicle is self-driving, the person behind the wheel will not be responsible for any driving errors. This rule stands in contrast to the US, where courts have faulted human \u2018backup drivers\u2019 for robot-caused accidents. The UK has the right idea \u2013 as long as companies don\u2019t weasel their way out. <\/p>\n\n<p class=\"article-full-body sans-serif\">Fully self-driving cars have been on the horizon for quite some time, but are taking much longer than promised to be fully realised. Despite pouring massive resources into research and development, car companies have struggled to account for the sheer number of unexpected occurrences on roads. Freakish weather is one thing for the vehicles to contend with, but there was also a news story of a self-driving car mistaking the sunset for a traffic light, and another that drove straight into a parked $2m aircraft. So far, the large rollout of automated vehicles the UK is hoping for has remained elusive.<\/p>\n\n<p class=\"article-full-body sans-serif\">Cars are being outfitted with increasingly advanced driver assistance features (like automated steering, accelerating, and braking). These assisted driving systems mean that, until we have reliable full automation, we\u2019re going to be dealing with human-robot teams behind the wheel. It also means that when mistakes happen, we need to be particularly careful about who to hold responsible and why. 
<\/p>\n\n<p class=\"article-full-body sans-serif\">Robots and humans have different, often complementary, skillsets. When it comes to driving, robots excel at predictable tasks and can react faster and more precisely than a human. People, on the other hand, are great at dealing with unexpected situations, like an erratic traffic cop or a horse-drawn carriage on the highway. The ideal \u2013 at least in theory \u2013 would be to combine the skillsets of humans and robots to design a safer driving experience. But in practice, creating an effective human-robot team in the driver\u2019s seat is challenging. <\/p>\n\n<blockquote class=\"wp-block-quote has-text-align-center is-style-large\"><p><span style=\"color:#c30028\" class=\"has-inline-color\"><em><strong>\u201cThe research is clear: when a car is doing most of the driving, it\u2019s too much to ask of the person in the driver\u2019s seat to be vigilant\u201d <\/strong><\/em><\/span><\/p><\/blockquote>\n\n<p class=\"article-full-body sans-serif\">One of the cases I teach in class is a 2018 accident in Arizona, where a self-driving Uber struck a woman who was wheeling a bicycle across the road. The car\u2019s automated system couldn\u2019t decide whether she was a pedestrian, a bicycle, or a vehicle, and failed to correctly predict her path. The backup driver, who didn\u2019t react in time to stop the car, was charged with negligent homicide. An investigation by the National Transportation Safety Board identified a number of reasons the hand-off of control from vehicle to driver didn\u2019t work, but Uber was not held responsible. <\/p>\n\n<p class=\"article-full-body sans-serif\">A contributing factor may be what anthropologist Dr Madeleine Clare Elish calls the \u201cmoral crumple zone\u201d. In class, I present the Uber case as hypothetical. I include hints about human attention spans, and I don\u2019t reveal what the driver was doing (watching Netflix on her phone). 
Even with the case skewed in the driver\u2019s favour, about half of the students choose to fault her instead of the car company. According to Elish, this is because people tend to misattribute blame to the human in human-robot teams. <\/p>\n\n<p class=\"article-full-body sans-serif\">We need to resist this bias, because the research on automation complacency is clear: when a car is doing most of the driving, it\u2019s too much to ask of the person in the driver\u2019s seat to be vigilant. For this reason, the UK has the right idea. Letting the driver off the hook will create strong incentives for companies to figure out safety in advance, instead of offsetting some of the cost to the public. Companies may, however, try to weasel out of this accountability with careful wording. For example, Tesla UK explicitly states that the Tesla Autopilot features \u201cdo not make the vehicle autonomous\u201d and that \u201cfull self-driving capability [is] intended for use with a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment.\u201d If a disclaimer doesn\u2019t shield them, another way car companies might cover themselves is by using systems that don\u2019t meet the definition of \u2018self-driving\u2019, which would mean going back to more hand-offs between car and driver \u2013 and more drivers blamed when something goes wrong. <\/p>\n\n<p class=\"article-full-body sans-serif\">With the UK investing so much capital in self-driving, we may ultimately see some new technology, and a rollout of robot vehicles on predictable routes. Despite the fairly slow pace of development and deployment, it\u2019s an exciting prospect. With traditional cars, more than 90 per cent of road accidents are due to human error, so one thing is clear: in the future, streets filled with autonomous drivers will be safer. 
The only question is how we handle the long and winding road to get there.<\/p>\n\n<div class=\"no-tts wp-block-image article-in-image photo\"><figure class=\"no-tts alignleft size-large is-resized\"><img loading=\"lazy\" src=\"https:\/\/dj9jqhxgw9833.cloudfront.net\/uploads\/sites\/42\/2022\/05\/14b7bf06-b69e-45e9-b4bf-2b2d8cce9b4e.jpg\" alt=\"\" class=\"no-tts wp-image-13195\" width=\"80\" height=\"99\" srcset=\"https:\/\/c01.purpledshub.com\/uploads\/sites\/42\/2022\/05\/14b7bf06-b69e-45e9-b4bf-2b2d8cce9b4e.jpg 280w, https:\/\/c01.purpledshub.com\/uploads\/sites\/42\/2022\/05\/14b7bf06-b69e-45e9-b4bf-2b2d8cce9b4e-242x300.jpg 242w\" sizes=\"(max-width: 80px) 100vw, 80px\" \/><\/figure><\/div>\n\n<h5 class=\"article-subhead has-text-color\" style=\"color:#c30028\"><span style=\"color:#c30028\" class=\"has-inline-color\">DR KATE DARLING<\/span><\/h5>\n\n<p class=\"article-full-body sans-serif\">(<em><a href=\"https:\/\/twitter.com\/grok_\" data-type=\"URL\" data-id=\"https:\/\/twitter.com\/grok_\">@grok_<\/a><\/em>) Kate is a research scientist at the MIT Media Lab, studying human-robot interaction. Her book is <em>The New Breed<\/em> (\u00a320, Penguin). 
<\/p>\n\n<p class=\"article-full-body sans-serif\"><\/p>\n\n<p class=\"footer\">ILLUSTRATION: VALENTIN TKACH<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Human drivers should not be held responsible for accidents caused by autonomous vehicles <\/p>\n","protected":false},"author":24,"featured_media":18082,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"ub_ctt_via":"","purple_page_number":"30","purple_custom_meta_purple_page_number":"30","purple_seq_number":"1","purple_custom_meta_purple_seq_number":"1","purple_source_article":"article_30-1.xml","purple_custom_meta_purple_source_article":"article_30-1.xml","purple_source_issue":"October-2022","purple_custom_meta_purple_source_issue":"October-2022","purple_external_id":"October-2022-30-1","purple_custom_meta_purple_external_id":"October-2022-30-1","purple_issue_code":"|0000089659||","purple_custom_meta_purple_issue_code":"|0000089659||","purple_android_product":"com.focus.magazine.issue383","purple_custom_meta_purple_android_product":"com.focus.magazine.issue383.2","purple_ios_product":"com.focus.magazine.issue383","purple_custom_meta_purple_ios_product":"com.focus.magazine.issue383.2","purple_web_product":"","purple_custom_meta_purple_web_product":"","purple_publication_id":"0f422ad1-c939-476d-9f82-a410052ad4c3","purple_migrated":"","kt_blocks_editor_width":"","apple_news_api_created_at":"2022-10-11T10:36:09Z","apple_news_article-theme":"","apple_news_api_id":"13f542e7-0f77-46d9-93a3-ad053e82a92e","apple_news_api_modified_at":"2022-10-12T08:54:40Z","apple_news_api_revision":"AAAAAAAAAAAAAAAAAAAABQ==","apple_news_api_share_url":"https:\/\/apple.news\/AE_VC5w93RtmTo60FPoKpLg","apple_news_coverimage":0,"apple_news_coverimage_caption":"","apple_news_is_hidden":false,"apple_news_is_paid":true,"apple_news_is_preview":true,"apple_news_is_sponsored":false,"apple_news_maturity_rating":"","apple_news_pullquote":"","apple_news_pullquote_position":"","apple_news_a
rticle_theme":"","apple_news_sections":"[]"},"categories":[25],"tags":[15],"apple_news_notices":[],"featured_image_src":"https:\/\/c01.purpledshub.com\/uploads\/sites\/42\/2022\/10\/12eba385-8472-4a51-8854-3bae0bf5756a.jpg","author_info":{"display_name":"importmanagerhub@sprylab.com","author_link":"https:\/\/c01.purpledshub.com\/bbcsciencefocus\/author\/importmanagerhubsprylab-com\/"},"acf":{"readingTimeMinutes":"5","apple_news_title":""},"uagb_featured_image_src":{"full":["https:\/\/c01.purpledshub.com\/uploads\/sites\/42\/2022\/10\/12eba385-8472-4a51-8854-3bae0bf5756a.jpg",1680,2048,false],"thumbnail":["https:\/\/c01.purpledshub.com\/uploads\/sites\/42\/2022\/10\/12eba385-8472-4a51-8854-3bae0bf5756a-150x150.jpg",150,150,true],"medium":["https:\/\/c01.purpledshub.com\/uploads\/sites\/42\/2022\/10\/12eba385-8472-4a51-8854-3bae0bf5756a-246x300.jpg",246,300,true],"medium_large":["https:\/\/c01.purpledshub.com\/uploads\/sites\/42\/2022\/10\/12eba385-8472-4a51-8854-3bae0bf5756a-768x936.jpg",768,936,true],"large":["https:\/\/c01.purpledshub.com\/uploads\/sites\/42\/2022\/10\/12eba385-8472-4a51-8854-3bae0bf5756a-840x1024.jpg",800,975,true],"1536x1536":["https:\/\/c01.purpledshub.com\/uploads\/sites\/42\/2022\/10\/12eba385-8472-4a51-8854-3bae0bf5756a-1260x1536.jpg",1260,1536,true],"2048x2048":["https:\/\/c01.purpledshub.com\/uploads\/sites\/42\/2022\/10\/12eba385-8472-4a51-8854-3bae0bf5756a.jpg",1680,2048,false]},"uagb_author_info":{"display_name":"importmanagerhub@sprylab.com","author_link":"https:\/\/c01.purpledshub.com\/bbcsciencefocus\/author\/importmanagerhubsprylab-com\/"},"uagb_comment_info":0,"uagb_excerpt":"Human drivers should not be held responsible for accidents caused by autonomous 
vehicles","_links":{"self":[{"href":"https:\/\/c01.purpledshub.com\/bbcsciencefocus\/wp-json\/wp\/v2\/posts\/18083"}],"collection":[{"href":"https:\/\/c01.purpledshub.com\/bbcsciencefocus\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/c01.purpledshub.com\/bbcsciencefocus\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/c01.purpledshub.com\/bbcsciencefocus\/wp-json\/wp\/v2\/users\/24"}],"replies":[{"embeddable":true,"href":"https:\/\/c01.purpledshub.com\/bbcsciencefocus\/wp-json\/wp\/v2\/comments?post=18083"}],"version-history":[{"count":5,"href":"https:\/\/c01.purpledshub.com\/bbcsciencefocus\/wp-json\/wp\/v2\/posts\/18083\/revisions"}],"predecessor-version":[{"id":18789,"href":"https:\/\/c01.purpledshub.com\/bbcsciencefocus\/wp-json\/wp\/v2\/posts\/18083\/revisions\/18789"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/c01.purpledshub.com\/bbcsciencefocus\/wp-json\/wp\/v2\/media\/18082"}],"wp:attachment":[{"href":"https:\/\/c01.purpledshub.com\/bbcsciencefocus\/wp-json\/wp\/v2\/media?parent=18083"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/c01.purpledshub.com\/bbcsciencefocus\/wp-json\/wp\/v2\/categories?post=18083"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/c01.purpledshub.com\/bbcsciencefocus\/wp-json\/wp\/v2\/tags?post=18083"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}