{"id":442,"date":"2024-12-10T09:24:03","date_gmt":"2024-12-10T09:24:03","guid":{"rendered":"https:\/\/blog.lboro.ac.uk\/vice-chancellor\/?p=442"},"modified":"2024-12-10T09:24:04","modified_gmt":"2024-12-10T09:24:04","slug":"universities-must-lend-their-weight-to-combating-ai-disinformation","status":"publish","type":"post","link":"https:\/\/blog.lboro.ac.uk\/vice-chancellor\/2024\/12\/10\/universities-must-lend-their-weight-to-combating-ai-disinformation\/","title":{"rendered":"Universities must lend their weight to combating AI disinformation"},"content":{"rendered":"\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"576\" src=\"https:\/\/blog.lboro.ac.uk\/vice-chancellor\/wp-content\/uploads\/sites\/59\/2024\/12\/getty-deep-fake-1024x576.jpg\" alt=\"A person typing on a laptop with an electronic face hovering with computer security icons surrounding it including a padlock, a fingerprint and a login.\" class=\"wp-image-443\" srcset=\"https:\/\/blog.lboro.ac.uk\/vice-chancellor\/wp-content\/uploads\/sites\/59\/2024\/12\/getty-deep-fake-1024x576.jpg 1024w, https:\/\/blog.lboro.ac.uk\/vice-chancellor\/wp-content\/uploads\/sites\/59\/2024\/12\/getty-deep-fake-300x169.jpg 300w, https:\/\/blog.lboro.ac.uk\/vice-chancellor\/wp-content\/uploads\/sites\/59\/2024\/12\/getty-deep-fake-768x432.jpg 768w, https:\/\/blog.lboro.ac.uk\/vice-chancellor\/wp-content\/uploads\/sites\/59\/2024\/12\/getty-deep-fake-1536x864.jpg 1536w, https:\/\/blog.lboro.ac.uk\/vice-chancellor\/wp-content\/uploads\/sites\/59\/2024\/12\/getty-deep-fake.jpg 1920w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n\n\n\n<p><em>Image courtesy of Getty Images<\/em><\/p>\n\n\n\n<p><a href=\"https:\/\/www.timeshighereducation.com\/opinion\/universities-must-lend-their-weight-combating-ai-disinformation\" data-type=\"link\" 
data-id=\"https:\/\/www.timeshighereducation.com\/opinion\/universities-must-lend-their-weight-combating-ai-disinformation\">This piece written by Professor Nick Jennings and Andrew Chadwick was originally published on Times Higher Education.<\/a><\/p>\n\n\n\n<p>By the end of this year, about four billion citizens across more than 40 countries will have voted in elections.&nbsp;&nbsp;<\/p>\n\n\n\n<p>Accordingly, the early months of 2024 saw a global outpouring of <a href=\"https:\/\/demos.co.uk\/research\/generating-democracy-ai-and-the-coming-revolution-in-political-communications\/\" target=\"_blank\" rel=\"noreferrer noopener\">speculation<\/a> about the democratic collapse that might be caused by AI-enabled online disinformation. Most of the commentary focused on the potential for highly realistic deepfake video to deceive the public. Some predicted the \u201c<a href=\"https:\/\/cps.org.uk\/media\/post\/2024\/cps-warns-of-uks-first-deepfake-election\/\" target=\"_blank\" rel=\"noreferrer noopener\">first deepfake elections<\/a>\u201d.&nbsp;&nbsp;<\/p>\n\n\n\n<p>This was part of the \u201chype cycle\u201d that history tells us all new technologies go through. Inflated early expectations of social and political impact \u2013 rose-tinted or, as here, doom-laden \u2013 are displaced over time by the realities of evidence and adaptation.&nbsp;&nbsp;<\/p>\n\n\n\n<p>The important thing is to quickly get beyond the hype \u2013 and the fatalism and sense of powerlessness it can promote \u2013 and focus on the technology\u2019s real and lasting effects. 
These are often substantial but subtle, complex and more gradually felt than forecast by early optimists and pessimists.&nbsp;&nbsp;<\/p>\n\n\n\n<p>The challenge for researchers across all disciplines, then, is to learn rapidly from events and help citizens and regulators pinpoint when, where and how AI makes a difference \u2013 positive or negative \u2013 to civic life.&nbsp;&nbsp;<\/p>\n\n\n\n<p>In the event, there was no apparent <a href=\"https:\/\/www.bbc.co.uk\/news\/articles\/cj50qjy9g7ro\" target=\"_blank\" rel=\"noreferrer noopener\">deepfake crisis<\/a> in the UK election, but this produced a narrative just as unhelpful as the doom-mongering. \u201c<a href=\"https:\/\/www.theguardian.com\/commentisfree\/article\/2024\/jun\/11\/deepfakes-ignore-alarmists-elections\" target=\"_blank\" rel=\"noreferrer noopener\">Nothing to see here<\/a>\u201d quickly became <a href=\"https:\/\/uk-podcasts.co.uk\/podcast\/the-times-red-box-podcast\/the-deepfake-election-that-wasn-t\" target=\"_blank\" rel=\"noreferrer noopener\">the new vogue<\/a> \u2013 just as revelations were emerging of some serious cases of AI-driven disinformation.&nbsp;<\/p>\n\n\n\n<p>During the campaign\u2019s final weekend, investigative journalists at Australia\u2019s ABC News uncovered <a href=\"https:\/\/www.abc.net.au\/news\/2024-06-29\/uk-election-pro-russian-facebook-pages-coordinating\/104038246\" target=\"_blank\" rel=\"noreferrer noopener\">a coordinated foreign disinformation campaign targeting UK citizens on Facebook<\/a> with divisive, often racist, material (some of it illegal, unlabelled paid advertisements). 
Fake, AI-generated images were common \u2013 showing, for example, groups of asylum seekers massing at the UK coast.&nbsp;&nbsp;<\/p>\n\n\n\n<p>Facebook\u2019s parent company, Meta, <a href=\"https:\/\/www.abc.net.au\/news\/2024-07-01\/meta-shuts-down-pro-russian-facebook-pages-in-uk-elections\/104045286\" target=\"_blank\" rel=\"noreferrer noopener\">took it all down<\/a>\u202fas Rishi Sunak issued a formal statement of concern. A government investigation was reportedly set up, but, by then, polling day had arrived.&nbsp;<\/p>\n\n\n\n<p>Meanwhile, <a href=\"https:\/\/www.tagesschau.de\/faktenfinder\/grossbritannien-wahl-desinformation-migration-100.html\" target=\"_blank\" rel=\"noreferrer noopener\">Germany\u2019s main public service news organization, <em>ARD-aktuell<\/em><\/a>, reported that similarly racist, anti-immigrant accounts on X were targeting the UK elections. Environmental campaign group <a href=\"https:\/\/www.globalwitness.org\/en\/campaigns\/digital-threats\/investigation-reveals-content-posted-bot-accounts-x-has-been-seen-150-million-times-ahead-uk-elections\/\" target=\"_blank\" rel=\"noreferrer noopener\">Global Witness confirmed<\/a> that automated X accounts were spreading divisive disinformation on climate change and migration, in posts viewed 150 million times. 
And two days after the UK vote, The Bureau of Investigative Journalism <a href=\"https:\/\/www.thebureauinvestigates.com\/stories\/2024-07-06\/russian-disinformation-networks-ramp-up-attacks-on-european-elections\/\" target=\"_blank\" rel=\"noreferrer noopener\">revealed that a Kremlin-backed network of fake news sites<\/a> had targeted the UK, French and US campaigns.&nbsp;&nbsp;<\/p>\n\n\n\n<p>Significantly, though, the much-feared deepfake videos \u2013 which, for now at least, remain difficult to produce \u2013 were largely absent from these influence operations, illustrating that AI-generated prose, still images and audio could actually prove <a href=\"https:\/\/www.tandfonline.com\/doi\/full\/10.1080\/09546553.2024.2380089?src=#d1e239\" target=\"_blank\" rel=\"noreferrer noopener\">more consequential<\/a>.&nbsp;<\/p>\n\n\n\n<p>The network included sites that intelligence consultancy Recorded Future <a href=\"https:\/\/go.recordedfuture.com\/hubfs\/reports\/cta-2024-0509.pdf\" target=\"_blank\" rel=\"noreferrer noopener\">revealed<\/a> in May as having used AI to \u201cplagiarize, translate, and edit content from mainstream media outlets, using prompt engineering to tailor content to specific audiences and introduce political bias\u201d.&nbsp;&nbsp;<\/p>\n\n\n\n<p>Meanwhile, at the start of the year, a canvassing call that used a synthetic version of Joe Biden\u2019s voice <a href=\"https:\/\/www.reuters.com\/world\/us\/fcc-finalizes-6-million-fine-over-ai-generated-biden-robocalls-2024-09-26\/\" target=\"_blank\" rel=\"noreferrer noopener\">disrupted the New Hampshire primary<\/a>. A convincing fabricated audio clip of Sadiq Khan impacted this spring\u2019s <a href=\"https:\/\/www.youtube.com\/watch?v=dLIwmaHTiCQ\" target=\"_blank\" rel=\"noreferrer noopener\">London mayoral campaign<\/a>. 
Equally convincing fake audio depicting <a href=\"https:\/\/fullfact.org\/election-2024\/wes-streeting-audio-clip-palestine\/\" target=\"_blank\" rel=\"noreferrer noopener\">health secretary Wes Streeting emerged during the UK election<\/a>.&nbsp;<\/p>\n\n\n\n<p>Much of that campaign\u2019s AI-generated visual fakery, such as the material ABC uncovered, consisted of still images. But, as we have also seen over recent weeks in the US campaign, most of these are not even photo-realistic. Evidently, they can still elicit strong emotions, but their instantly recognisable digital-paint aesthetic is the result of leading generative AI platforms\u2019 efforts \u2013 initiated under pressure from fact checkers, citizens and emerging regulators \u2013 to restrict how they respond to user prompts.&nbsp;&nbsp;<\/p>\n\n\n\n<p>These moves gathered momentum following February\u2019s signing by major global tech companies of an <a href=\"https:\/\/www.aielectionsaccord.com\/\" target=\"_blank\" rel=\"noreferrer noopener\">AI Elections Accord<\/a>. And though <a href=\"https:\/\/counterhate.com\/blog\/midjourney-ai-produces-misleading-images-of-biden-and-trump-in-50-of-test-cases-despite-promising-to-ban-fake-photos-of-presidential-candidates\/\" target=\"_blank\" rel=\"noreferrer noopener\">still highly imperfect<\/a> and unevenly applied (for example on X\u2019s Grok platform), they show how public pressure for regulatory guardrails can shape design choices that safeguard democracy.\u202f&nbsp;<\/p>\n\n\n\n<p>In other words, the social contexts of new technologies change as organisations and people adapt to them. 
Agile, well-informed regulation is achievable and starting to emerge, and vigilance among public bodies, media organisations and policy wonks about electoral threats is increasing.&nbsp;&nbsp;<\/p>\n\n\n\n<p>The UK Cabinet Office <a href=\"https:\/\/www.gov.uk\/government\/publications\/security-guidance-for-may-2021-elections\/online-disinformation-and-ai-threat-guidance-for-electoral-candidates-and-officials\" target=\"_blank\" rel=\"noreferrer noopener\">issued guidance on generative AI to electoral candidates and local officials<\/a>. The government established a <a href=\"https:\/\/news.sky.com\/story\/warning-to-uk-politicians-over-risk-of-audio-deepfakes-that-could-derail-the-general-election-13146573\" target=\"_blank\" rel=\"noreferrer noopener\">Joint Election Security Preparations Unit<\/a> in early 2024. And during the campaign itself, a simple but effective <a href=\"https:\/\/www.channel4.com\/programmes\/can-ai-steal-your-vote-dispatches\/on-demand\/75935-001\" target=\"_blank\" rel=\"noreferrer noopener\">Channel Four <em>Dispatches<\/em> documentary<\/a> highlighted deepfakes, further raising awareness. We\u2019re not as susceptible as we once were.&nbsp;<\/p>\n\n\n\n<p>Moreover, AI is starting to be used to promote accountability and fight fakery. While AI-driven online microtargeting has not yet taken off in election campaigns, the Labour Party experimented with Campaign Lab\u2019s <a href=\"https:\/\/doorknocking-bot-peymanity.replit.app\/\" target=\"_blank\" rel=\"noreferrer noopener\">chatbot scripts<\/a> to help canvassers communicate effectively with voters, using research by anti-polarisation think tank More in Common. 
And an <a href=\"https:\/\/www.electoralcommissionhelp.co.uk\/\" target=\"_blank\" rel=\"noreferrer noopener\">Electoral Commission guidance bot<\/a> helped candidates stay within the increasingly complex law regulating privacy and spending.&nbsp;<\/p>\n\n\n\n<p>Similar tools are now used to help human fact checkers \u2013 at the UK\u2019s <a href=\"https:\/\/fullfact.org\/blog\/2024\/jun\/the-ai-election-how-full-fact-is-leveraging-new-technology-for-uk-general-election-fact-checking\/\" target=\"_blank\" rel=\"noreferrer noopener\">Full Fact<\/a>, for example. Meanwhile, evidence from the US suggests prose AI generators can help journalists provide sophisticated rapid responses to live <a href=\"https:\/\/futurepolis.substack.com\/p\/ai-should-live-fact-check-presidential\" target=\"_blank\" rel=\"noreferrer noopener\">televised debates<\/a>.&nbsp;<\/p>\n\n\n\n<p>Universities across the world must lend their weight to such efforts. They must sidestep the hype cycle to help regulators and communicators respond quickly and effectively to the threat of online disinformation in time for the next big year of elections.&nbsp;<\/p>\n\n\n\n<p><strong>Andrew Chadwick is professor of political communication and director of the Online Civic Culture Centre at Loughborough. Professor Nick Jennings is vice-chancellor of Loughborough University and was the UK\u2019s chief scientific adviser for national security from 2010 to 2015.<\/strong>&nbsp;<\/p>\n\n\n\n<p><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Image courtesy of Getty Images This piece written by Professor Nick Jennings and Andrew Chadwick was originally published on Times Higher Education. 
By the end of this year, about four billion citizens across more than 40 countries will have voted in elections.&nbsp;&nbsp; Accordingly, the early months of 2024 saw a global outpouring of speculation about [&hellip;]<\/p>\n","protected":false},"author":727,"featured_media":443,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"lboro_blog_alternative_thumbnail_image":"","footnotes":"","_links_to":"","_links_to_target":""},"categories":[10],"tags":[],"class_list":["post-442","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-articles"],"_links":{"self":[{"href":"https:\/\/blog.lboro.ac.uk\/vice-chancellor\/wp-json\/wp\/v2\/posts\/442","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/blog.lboro.ac.uk\/vice-chancellor\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/blog.lboro.ac.uk\/vice-chancellor\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/blog.lboro.ac.uk\/vice-chancellor\/wp-json\/wp\/v2\/users\/727"}],"replies":[{"embeddable":true,"href":"https:\/\/blog.lboro.ac.uk\/vice-chancellor\/wp-json\/wp\/v2\/comments?post=442"}],"version-history":[{"count":1,"href":"https:\/\/blog.lboro.ac.uk\/vice-chancellor\/wp-json\/wp\/v2\/posts\/442\/revisions"}],"predecessor-version":[{"id":444,"href":"https:\/\/blog.lboro.ac.uk\/vice-chancellor\/wp-json\/wp\/v2\/posts\/442\/revisions\/444"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/blog.lboro.ac.uk\/vice-chancellor\/wp-json\/wp\/v2\/media\/443"}],"wp:attachment":[{"href":"https:\/\/blog.lboro.ac.uk\/vice-chancellor\/wp-json\/wp\/v2\/media?parent=442"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/blog.lboro.ac.uk\/vice-chancellor\/wp-json\/wp\/v2\/categories?post=442"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/blog.lboro.ac.uk\/vice-chancellor\/wp-json\/wp\/v2\/tags?post=442"}],"curies":[{"name":"wp","href":"https:\/\
/api.w.org\/{rel}","templated":true}]}}