{"id":32731,"date":"2023-08-26T10:00:00","date_gmt":"2023-08-26T08:00:00","guid":{"rendered":"http:\/\/6a819c47-ed46-4119-9d7f-e45c1460a3e5"},"modified":"2023-08-26T10:46:44","modified_gmt":"2023-08-26T08:46:44","slug":"ai-why-the-next-call-from-your-family-could-be-a-deepfake-scammer","status":"publish","type":"rss_feed","link":"https:\/\/c01.purpledshub.com\/bbcsciencefocus\/rss_feed\/ai-why-the-next-call-from-your-family-could-be-a-deepfake-scammer\/","title":{"rendered":"AI: Why the next call from your family could be a deepfake scammer"},"content":{"rendered":"<p class=\"rssexcerpt\">Scammers are now using artificial intelligence to replicate voices and faces, making scams even more realistic \u2013 here&#8217;s what to look out for. <\/p><p class=\"rssauthor\">By Alex Hughes\n      <\/p><p class=\"rssbyline\">Published: Saturday, 26 August 2023 at 08:00 AM<\/p><hr class=\"no-tts wp-block-separator\"\/><?xml version=\"1.0\" encoding=\"UTF-8\" standalone=\"yes\"?>\n<!DOCTYPE html PUBLIC \"-\/\/W3C\/\/DTD HTML 4.0 Transitional\/\/EN\" \"http:\/\/www.w3.org\/TR\/REC-html40\/loose.dtd\">\n<html><body><p>Gone are the good old days of princes offering up their wealth via email and dodgy online prizes that only require all of your passwords and details. Scams are getting both more complicated and a whole lot more convincing.<\/p> <p>Thanks to the ongoing boom of <a href=\"navto:\/\/16a74e7b-0ac3-41eb-999f-f52681f2c7c9\">artificial intelligence<\/a> (AI), scammers are now able to replicate the voice of someone you know, and in some cases, even their faces. 
Nor is this technology reserved for the most tech-obsessed: it is available to anyone with a half-decent computer and an internet connection.<\/p> <p>By replicating a family member in need of cash, a friend stuck in a bad place, or a colleague asking for a money transfer, AI phone scams play on the psychology of trust and fear, persuading people to hand over money in the belief that they know the person on the other end of the line.<\/p> <p>So how does this technology work, and is there any way to better prepare yourself to deal with the scams of the future? We spoke to <a href=\"https:\/\/research-portal.uea.ac.uk\/en\/persons\/oli-buckley\" target=\"_blank\" rel=\"noreferrer noopener\">Oli Buckley<\/a>, a professor of cyber security at the University of East Anglia, to find out more about these new scams.<\/p> <h2 id=\"h-what-is-a-deepfake\">What is a deepfake?<\/h2> <p>While scams continue to come in a variety of forms, these latest ventures tend to rely on technology known as <a href=\"navto:\/\/7fb6f564-1453-4aee-924b-5a0ff9ea8320\">deepfakes<\/a>.<\/p> <p>\u201cUsing an artificial intelligence algorithm, they create content that looks or sounds realistic. That could be video or even just audio,&#8221; explains Buckley.<\/p> <p>\u201cThey need very little training data and can create something quite convincing with a standard laptop anyone can buy.\u201d<\/p> <p>In essence, deepfakes take examples of footage or audio of someone and learn how to accurately recreate their movements or voice. This can then be used to plant their face on someone else\u2019s body, have their voice read out a script, or carry out a host of other malicious activities.<\/p> <p>While the technology sounds complicated, it is actually surprisingly easy for anyone to make a deepfake on their own. All they need is a publicly available video or recording of your voice, and some reasonably cheap software.<\/p> <p>\u201cThe software can be easily downloaded, and anyone can make a convincing deepfake. 
It takes seconds rather than minutes or hours, and anyone with a bit of time and access to YouTube could figure out how to do it,\u201d explains Buckley.<\/p> <p>\u201cIt\u2019s one of the benefits and curses of the AI boom we\u2019re seeing right now. There is amazing technology that would have been science-fiction not that long ago. That\u2019s great for innovation, but there\u2019s also the flip side with this technology in the wrong hands.&#8221;<\/p> <figure class=\"wp-block-embed is-type-rich is-provider-spotify wp-block-embed-spotify wp-embed-aspect-21-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<iframe title=\"Spotify Embed: How AI is changing the world of scams\" style=\"border-radius: 12px\" width=\"100%\" height=\"152\" frameborder=\"0\" allowfullscreen=\"\" allow=\"autoplay; clipboard-write; encrypted-media; fullscreen; picture-in-picture\" loading=\"lazy\" src=\"https:\/\/open.spotify.com\/embed\/episode\/47jhtwYkg6tHHA8RU0NqoU?utm_source=oembed\"\/>\n<\/div><\/figure> <h2 id=\"h-the-world-of-deepfake-scams\">The world of deepfake scams<\/h2> <p>Since they first appeared, deepfakes have been put to malicious use, ranging from faked political speeches to pornographic material. But recently they have been on the rise in the world of scams.<\/p> <p>\u201cBeing able to make someone do or say whatever you want is quite a powerful ability for scammers. There has been a rise in AI voice scams, where someone will receive a phone call or even a video call of a loved one saying they are in trouble and need money,\u201d says Buckley.<\/p> <p>\u201cThese are pulled from data available from the internet. They don\u2019t need to be 100 per cent accurate, relying instead on fear and a desperate situation where you panic and overlook inconsistencies.&#8221;<\/p> <p>While these scams can come in many different forms, the usual format is a call from a random number. 
The person on the other side uses a deepfake to pretend to be a family member or someone who would normally rely on you for money.<\/p> <p>This could also take the form of a voicemail, where the caller can have a pre-made script ready. In a full-length call from the scammer, there are often long pauses as they get the voice generation to create responses to the questions being asked.<\/p> <p>With basic technology, these deepfakes are unlikely to be perfect, instead offering a version of someone\u2019s voice that might be slightly off. However, by relying on the stress of the moment, these scammers hope that people won\u2019t notice, or will put it down to the caller being stressed.<\/p> <h2 id=\"h-how-to-fight-back-against-deepfakes\">How to fight back against deepfakes<\/h2> <p>As these scams become more common, the question arises of how best to deal with them, and whether the public can do anything to make themselves less of a target.<\/p> <p>\u201cIt\u2019s easy to be critical when it isn\u2019t happening to you, but it is hard in the moment. Question whether it sounds like them, if it is something they might say themselves, or if it seems like an unlikely situation that they are describing,\u201d says Buckley.<\/p> <p>\u201cThere are pieces of software that can be used to identify a fake, but the average person is unlikely to have this on hand. If you receive a call from a loved one that you\u2019re not expecting and you are suspicious, call them back or text them to check where they are. Consider the reality and go from there.\u201d<\/p> <p>To create a realistic deepfake, a surprisingly small amount of audio or footage is actually needed. In the past, this might not have been such a problem, but now, for most people, there is plenty of footage and audio of them online.<\/p> <p>While it is possible to try and remove all of your online content, this is a big ask, requiring a heavy scrub not only of your own social media but of your friends\u2019 and family\u2019s as well. 
Equally, there might be more content from your work or social groups that has usable footage and audio.<\/p> <p>\u201cWe all live quite publicly now, particularly because COVID created this sense of online community growing as we were physically separated from everyone,\u201d says Buckley.<\/p> <p>\u201cA shift towards living our lives online to a degree and maintaining digital relationships through online personas means there are loads of photos, videos, and audio of us out there. The best option is to simply be objective, consider how likely deepfake content is, and be wary of calls or videos that don\u2019t feel believable.\u201d<\/p> <h2 id=\"h-a-change-in-mindset\">A change in mindset<\/h2> <p>Artificial intelligence has grown drastically in ability over the last year. While this has resulted in a lot of good, it has been balanced by an equal amount of bad.<\/p> <p>While there are ways to spot these fakes, including those listed above, scammers are quick to adjust their technology once a giveaway is noticed. At one time, a deepfake could be identified by an unnatural pattern of blinking, but that flaw was soon fixed.<\/p> <p>Instead of trying to look for errors or quirks, Buckley and other experts in the field opt for a change in mindset.<\/p> <p>\u201cThe technology is outpacing the way we think about it and the way we try and legislate for it. We\u2019re kind of just playing catch-up at this point. It is going to get to the point where we\u2019re no longer sure what is real and what is not.<\/p> <p>\u201cYou can\u2019t just believe your eyes these days; think a bit more widely about the videos you see, or the calls you get. 
Critical thinking is the most important factor when dealing with deepfakes, or any scam like this.\u201d<\/p> <p>Buckley argues that it all comes down to taking a step back and weighing up the reality of the situation.<\/p> <hr class=\"wp-block-separator has-alpha-channel-opacity\"\/> <h4>About our expert, Oli Buckley<\/h4> <p>Oli is a professor of cyber security at the University of East Anglia. His research focuses on the human aspects of cyber security, including privacy and trust, social justice and the way technology can be used against us. His research has been published in journals including <em>Communications in Computer and Information Science<\/em>, the <em>Journal of Information Security and Applications<\/em> and <em>Entertainment Computing<\/em>.<\/p> <hr class=\"wp-block-separator has-alpha-channel-opacity\"\/> <p><strong>Read more:<\/strong><\/p> <ul>\n<li><a href=\"navto:\/\/71cd1fda-27fd-4b01-ad45-55136197fe15\">ChatGPT: Everything you need to know about OpenAI\u2019s GPT-4 tool<\/a><\/li> <li><a href=\"navto:\/\/320f61c6-3ae6-4cc9-8dfa-57c8d48de3db\">Google Bard: Everything you need to know about ChatGPT\u2019s AI rival<\/a><\/li> <li><a href=\"navto:\/\/db490581-455a-4fc3-971b-c1515920d1c7\">AI: 5 of the best must-read artificial intelligence books<\/a><\/li>\n<\/ul> <\/body><\/html>\n<hr class=\"no-tts wp-block-separator\"\/>","protected":false},"excerpt":{"rendered":"<p>Scammers are now using artificial intelligence to replicate voices and faces, making scams even more realistic \u2013 here&#8217;s what to look out for. 
<\/p>\n","protected":false},"author":24,"featured_media":32732,"template":"","categories":[1],"acf":{"readingTimeMinutes":"7"},"uagb_featured_image_src":{"full":["https:\/\/c01.purpledshub.com\/uploads\/sites\/42\/2023\/08\/ai-why-the-next-call-from-your-family-could-be-a-deepfake-scammer.jpg",1200,793,false],"thumbnail":["https:\/\/c01.purpledshub.com\/uploads\/sites\/42\/2023\/08\/ai-why-the-next-call-from-your-family-could-be-a-deepfake-scammer-150x150.jpg",150,150,true],"medium":["https:\/\/c01.purpledshub.com\/uploads\/sites\/42\/2023\/08\/ai-why-the-next-call-from-your-family-could-be-a-deepfake-scammer-300x198.jpg",300,198,true],"medium_large":["https:\/\/c01.purpledshub.com\/uploads\/sites\/42\/2023\/08\/ai-why-the-next-call-from-your-family-could-be-a-deepfake-scammer-768x508.jpg",768,508,true],"large":["https:\/\/c01.purpledshub.com\/uploads\/sites\/42\/2023\/08\/ai-why-the-next-call-from-your-family-could-be-a-deepfake-scammer-1024x677.jpg",800,529,true],"1536x1536":["https:\/\/c01.purpledshub.com\/uploads\/sites\/42\/2023\/08\/ai-why-the-next-call-from-your-family-could-be-a-deepfake-scammer.jpg",1200,793,false],"2048x2048":["https:\/\/c01.purpledshub.com\/uploads\/sites\/42\/2023\/08\/ai-why-the-next-call-from-your-family-could-be-a-deepfake-scammer.jpg",1200,793,false]},"uagb_author_info":{"display_name":"importmanagerhub@sprylab.com","author_link":"https:\/\/c01.purpledshub.com\/bbcsciencefocus\/author\/importmanagerhubsprylab-com\/"},"uagb_comment_info":0,"uagb_excerpt":"Scammers are now using artificial intelligence to replicate voices and faces, making scams even more realistic \u2013 here's what to look out 
for.","_links":{"self":[{"href":"https:\/\/c01.purpledshub.com\/bbcsciencefocus\/wp-json\/wp\/v2\/rss_feed\/32731"}],"collection":[{"href":"https:\/\/c01.purpledshub.com\/bbcsciencefocus\/wp-json\/wp\/v2\/rss_feed"}],"about":[{"href":"https:\/\/c01.purpledshub.com\/bbcsciencefocus\/wp-json\/wp\/v2\/types\/rss_feed"}],"author":[{"embeddable":true,"href":"https:\/\/c01.purpledshub.com\/bbcsciencefocus\/wp-json\/wp\/v2\/users\/24"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/c01.purpledshub.com\/bbcsciencefocus\/wp-json\/wp\/v2\/media\/32732"}],"wp:attachment":[{"href":"https:\/\/c01.purpledshub.com\/bbcsciencefocus\/wp-json\/wp\/v2\/media?parent=32731"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/c01.purpledshub.com\/bbcsciencefocus\/wp-json\/wp\/v2\/categories?post=32731"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}