
India’s 2024 election is being shaped by AI, particularly deepfake technology, with serious implications for the democratic process.


Deepfake democracy: Behind the AI trickery shaping India’s 2024 election

Want an opponent to campaign for you? Or to leave voters unable to tell a real video from a fake one? As India prepares for the world’s largest elections, parties are turning to AI for novel – and dangerous – strategies.

Qvive Editor: Deepfake technology, a form of artificial intelligence that can create highly realistic fake videos and audio recordings, is becoming increasingly prevalent in the political landscape of India. With the 2024 election on the horizon, the use of deepfakes is posing a serious threat to the integrity of the democratic process.

The spread of deepfakes has raised concerns about the authenticity of information in the digital age. Because convincing fake content can now be produced cheaply and at scale, it is increasingly difficult for the public to discern what is real and what is fake. This has the potential to undermine trust in the electoral process and erode the foundation of democracy.

The use of deepfake technology in the upcoming election is a stark reminder of the power of AI to manipulate and deceive. It is imperative that steps are taken to safeguard the democratic process and ensure that the voices of the people are not drowned out by the deceptive tactics of technology.

In a first, BJP leader used AI-generated videos during campaigning

Fox News: In March 2023, Gates published a blog post titled “The Age of AI has begun,” in which he laid out his thoughts on AI.

Gates described how AI would affect the workplace and discussed various cases where it would have a direct effect on the jobs humans do – in essence, how AI could come to replace human work.

Modi’s BJP has been both a pioneer in the use of AI in campaigning and a victim of deepfakes

Indian political parties are using deepfakes for the 2024 Lok Sabha Election campaigns, sparking worries about digital deception.

Al Jazeera: As voters queued up early morning on November 30 last year to vote in legislative elections to choose the next government of the southern Indian state of Telangana, a seven-second clip started going viral on social media.

Posted on X by the Congress party, which is in opposition nationally and was in opposition in the state at the time, it showed KT Rama Rao, a leader of the Bharat Rashtra Samithi that was then ruling the state, calling on people to vote in favour of the Congress.

The Congress shared it widely on a range of WhatsApp groups “operated unofficially” by the party, according to a senior leader who requested anonymity. It eventually ended up on the official X account of the party, viewed more than 500,000 times.

It was fake.

“Of course, it was AI-generated though it looks completely real,” the Congress party leader told Al Jazeera. “But a normal voter would not be able to distinguish; voting had started [when the video was posted] and there was no time for [the opposition campaign] to control the damage.”

The astutely timed deepfake was a marker of the flood of AI-generated, or manipulated, media that marred a series of elections in India’s states in recent months, and that’s now threatening to fundamentally shape the country’s coming general elections.

Between March and May, India’s nearly one billion voters will pick their next national government in the world’s, and history’s, biggest elections. The threats posed by deceptive AI-generated media caught the world’s attention when faked sexually explicit images of the artist Taylor Swift appeared on social media platforms in January. In November, Ashwini Vaishnaw, India’s information technology minister, called deepfakes a “threat to democracy” and Prime Minister Narendra Modi has echoed those concerns.

But with the increased availability of handy artificial intelligence tools, teams across India’s political parties, including Modi’s Bharatiya Janata Party and the Congress, are deploying deepfakes to influence voters, managers of nearly 40 recent campaigns told Al Jazeera. While several AI tools used to generate deepfakes are free, others are available on subscription for as little as 10 cents per video.

‘Creating perception’

The BJP, arguably India’s most technologically sophisticated party, has been at the forefront of using illusions for campaigning. As far back as 2012, the party used 3D hologram projections of Modi so that he could “campaign” in dozens of places at the same time. The strategy was deployed widely during the 2014 general elections that brought Modi to power.

Watch: Narendra Modi’s first 3D holographic projection speech in Ahmedabad, Gujarat

There was little deception involved there, but in February 2020, Manoj Tiwari, a BJP member of parliament, became one of the world’s first politicians to use deepfakes for campaigning. In three videos, Tiwari addressed voters in Delhi ahead of the capital’s legislative assembly elections in Hindi, Haryanvi and English – reaching three distinct audiences in the multicultural city. Only the Hindi video was authentic: the other two were deepfakes, in which AI was used to generate his voice and words and to alter his expressions and lip movements, making it almost impossible to tell, just by viewing them, that they were not genuine.

In recent months, the Dravida Munnetra Kazhagam (DMK), which rules the southern state of Tamil Nadu, has used AI to resurrect its iconic late leader M Karunanidhi, deploying lifelike videos of the former screenwriter and veteran politician at campaign events.

Now, consultants and campaign managers say the 2024 elections could turbocharge the use of deepfakes even further.

“Politics is about creating perception; with AI tools [of voice and video modulation] and a click, you can turn the perception on its head in a minute,” said Arun Reddy, the national coordinator for social media at the Congress, who oversaw the party’s tech-savvy Telangana election. He added that the team was bursting with ideas to incorporate AI in campaigning, but that they didn’t have enough “trained people” to execute them all.

Reddy is strengthening his team – as are other parties.

“AI will have a resounding effect in creating the narrative,” Reddy told Al Jazeera. “The political AI-manipulated content will increase multifold, much more than what it ever was.”

‘Campaigns are getting weirder’

From the desert town of Pushkar in western India, 30-year-old Divyendra Singh Jadoun runs an AI startup, The Indian Deepfaker. Launched in October 2020, his company cloned the voice of Rajasthan state’s Congress chief ministerial candidate Ashok Gehlot so that his team could send personalised WhatsApp messages, addressing each voter by name, during the November assembly elections. The Indian Deepfaker is currently working with the team of Sikkim’s Chief Minister Prem Singh Tamang on holograms for upcoming campaigns. Sikkim, one of India’s smallest states, sits in the northeast, perched in the Himalayas between Nepal, Bhutan and China.
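Mechanically, this kind of personalised outreach is a mail-merge over a voter list: a message template is filled in with each voter’s name, rendered in the candidate’s consented, cloned voice, and queued for WhatsApp delivery. The sketch below shows only that orchestration pattern; `synthesize_speech` is a hypothetical stand-in for whatever voice-cloning backend the firm used, which the article does not name.

```python
# Mail-merge pattern behind personalised voice messages, as described above.
# synthesize_speech() is a hypothetical placeholder for a consented
# voice-cloning TTS backend; the article does not name the actual tool.
import csv
from pathlib import Path

TEMPLATE = "Namaste {name}, I am asking for your support on polling day."

def synthesize_speech(text: str, out_path: Path) -> None:
    """Hypothetical stub: render `text` in the candidate's cloned voice.
    Here it only writes the script to disk so the sketch stays runnable."""
    out_path.with_suffix(".txt").write_text(text, encoding="utf-8")

def render_messages(voter_csv: str, out_dir: str) -> None:
    """Generate one personalised audio file per row of the voter list."""
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    with open(voter_csv, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):  # expects columns: voter_id, name
            text = TEMPLATE.format(name=row["name"])
            synthesize_speech(text, out / f"{row['voter_id']}.wav")

if __name__ == "__main__":
    render_messages("voters.csv", "messages")
```

The synthesis step is the only technically hard part; the orchestration around it is this simple, which helps explain why, as Jadoun notes below, such services are now readily available at very low prices.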

That’s the clean, official work, he said. But in recent months, he has been swamped by what he describes as “unethical requests” from political campaigns. “The political parties reach out indirectly via international numbers on WhatsApp, burner handles on Instagram, or connect on Telegram,” Jadoun told Al Jazeera in a phone interview.

In the November election, his company denied more than 50 such requests, he said, where potential clients wanted videos and audio altered to target political opponents, including with pornography. As a startup, Jadoun said his company is particularly careful to avoid any legal trouble. “And it is a very unethical use of AI,” he added. “But I know many people who are doing it for very low prices and are readily available now.”

During the election campaigns for the state legislatures of Madhya Pradesh in central India and Rajasthan in the west last November, police registered multiple cases over deepfake videos targeting senior politicians including Modi, Shivraj Singh Chouhan and Kailash Vijayvargiya (all BJP) and Kamal Nath (Congress). Deepfake production is often outsourced to private consulting firms, which rely on social media networks – spearheaded by WhatsApp – for distribution.

A political consultant who requested anonymity told Al Jazeera that the phone numbers of ordinary citizens with no public profile are registered on WhatsApp and used for the campaigns, making it harder for anyone to trace the content back to parties, candidates, consultants and AI firms.

This consultant ran six campaigns in assembly elections last year for both the BJP and Congress. “In Rajasthan, we were using phone numbers of construction labourers to run our network on WhatsApp,” they said, “where deepfakes were primarily circulated.”

Meanwhile, AI-manipulated audio recordings are particularly valuable tools in smaller constituencies, “targeting candidates with forged call recordings about arranging ‘black money’ for elections or threatening someone to buy votes,” said the consultant, whose own candidate was targeted with one such recording. The recordings are generally masked with candidates’ voices to cast them as evidence of corruption.

“Manipulating voters by AI is not being considered a sin by any party,” they added. “It is just a part of the campaign strategy.”

India has 760 million internet users – more than 50 percent of the population – behind only China.

Among all the requests, one from a constituency in southern Rajasthan stood out to Jadoun. Ahead of the state election in November, the caller asked Jadoun to alter a problematic but authentic video of their candidate – whose party he did not disclose – into a realistic deepfake. The aim: to claim that the original was a deepfake, and the deepfake the original.

“The opposition had a troubling video of their candidate and they wanted to spread it quicker on social media to claim it is a deepfake,” he said, bursting into awkward laughter. “Political campaigns are getting weirder.”

Threats to election integrity

Indian laws currently do not define “deepfakes” clearly, said Anushka Jain, a policy researcher at Goa-based Digital Futures Lab. The police have been using laws against defamation, fake news or violation of a person’s modesty, combined with the Information Technology Act, to try and tackle individual cases. But often, they’re playing whack-a-mole.

“The police are prosecuting on the effect of the deepfake and not because it is a deepfake itself,” she said.

Analysts say that the Election Commission of India (ECI), an autonomous body that conducts polling, needs to catch up with the shifting nature of political campaigns.

In the days leading up to the voting in Telangana state elections last year, ruling Bharat Rashtra Samithi party leaders repeatedly warned their followers on social media to stay alert against deepfakes deployed by the Congress party. They also appealed to the ECI against the deepfake clip that the Congress shared on the morning of the vote.

But the video remains online, and the Congress never received a notice from the ECI, two Congress leaders aware of the issue told Al Jazeera.

Al Jazeera has sought comments from the ECI but is yet to receive a response.

“Even if one person is misled into believing something and that changes his mind, it vitiates the purity of the election process,” said SY Quraishi, former chief election commissioner of India. “Deepfakes have made the problem of rumour-mongering during the polls graver by a thousand times.”

Quraishi said that deepfakes need to be moderated in real time to minimise the damage they can cause to Indian democracy.

“The ECI needs to take action before the damage is done,” he said. “They need to be a lot more prompt.”

‘Truth is out of reach’

The Indian government has been pressing major tech companies, including Google and Meta, to actively moderate deepfakes on their platforms. Minister of State for IT Rajeev Chandrasekhar has met officials from these firms as part of deliberations over the threats posed by deepfakes.

By asking the tech sector to take the lead, the government escapes any criticism that it is trying to selectively censor deepfakes, or that it is trying to crack down on emerging AI technologies more broadly.

But by passing the buck to private companies, the government is raising questions about the sincerity of its intent to regulate manipulative content, said Prateek Waghre, the executive director of India’s Internet Freedom Foundation, a leading New Delhi-based tech policy think-tank. “It is almost wishful thinking,” he said.

Arguing that the tech companies have not been able to deal with the existing problems with content moderation, Waghre said that “the rise of AI now” has compounded challenges. And the current approach to content moderation ignores what’s really at the heart of the problem, he said.

“You are not solving the problem,” he said. “The design [of algorithms] is just flawed.”

On February 16, major tech companies signed an accord at the Munich Security Conference to voluntarily adopt “reasonable precautions” to prevent artificial intelligence tools from being used to disrupt democratic elections around the world. But the vaguely worded pact left many advocates and critics disappointed.

YouTube has announced that it will enable people to request the removal of AI-generated or altered content that simulates an identifiable person, including their face or voice, using its privacy request process.

“I’m not very optimistic about the platforms’ capabilities to detect deepfakes,” said Ravi Iyer, managing director of the Neely Center for Ethical Leadership and Decision-Making at the University of Southern California’s Marshall School of Business. “With low digital literacy and rising consumption of videos, this poses a grave risk to India’s election integrity.”

Identifying every piece of AI-manipulated media is not a realistic task, Iyer said, so companies need to redesign their algorithms so that they don’t promote polarising content. “Companies are the ones with the money and resources, they need to take reasonable steps to tackle the rise of deepfakes,” he said.
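To make Iyer’s scale argument concrete, consider the skeleton of even the simplest screening pipeline: decode each uploaded video, sample frames, score each frame with a classifier and aggregate. The sketch below shows only that skeleton; `score_frame` is a hypothetical placeholder for a trained detector, since the article names no production model.

```python
# Skeleton of a frame-sampling deepfake screen, to illustrate the per-upload
# cost Iyer describes. score_frame() is a hypothetical placeholder for a
# trained detector; no real model is named in the article.
import cv2  # pip install opencv-python
import numpy as np

def score_frame(frame_bgr: np.ndarray) -> float:
    """Hypothetical stub: return a fake-probability in [0, 1].
    A production system would run a trained classifier here."""
    return 0.0

def screen_video(path: str, every_n: int = 30) -> float:
    """Score roughly one frame per second (at 30 fps) and return the mean."""
    cap = cv2.VideoCapture(path)
    scores, idx = [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if idx % every_n == 0:
            scores.append(score_frame(frame))
        idx += 1
    cap.release()
    return float(np.mean(scores)) if scores else 0.0

if __name__ == "__main__":
    print(screen_video("upload.mp4"))
```

Even this stub must decode every frame of every upload before a single model inference runs; multiplied across a platform’s daily uploads, that cost is part of why Iyer argues for redesigning what recommendation algorithms amplify rather than relying on detection alone.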

The Internet Freedom Foundation has published an open letter urging electoral candidates and parties to voluntarily refrain from using deepfake technology ahead of the national elections. Waghre isn’t confident that many will bite, but he said it’s worth a try.

Meanwhile, political campaigns are bolstering their AI armouries – and some, like Reddy, the national coordinator for social media at Congress, concede that the future looks dark.

“Most people using AI are out there to distort the facts. They want to create a perception that’s not based on truth,” said Reddy. “Combine the penetration of social media in India with the rise of AI, the truth will be out of reach of people in the elections now.”

How AI-Generated Images Took Centre Stage In Telangana Elections

During the Telangana elections, AI-generated images were widely circulated. What will this trend mean for the 2024 Lok Sabha elections?

Ref: https://www.boomlive.in/decode/how-ai-generated-images-took-centre-stage-in-telangana-elections-24076

India is committed to responsible and ethical use of AI: PM Modi

PM Modi addressed the Global Partnership on Artificial Intelligence (GPAI) Summit at Bharat Mandapam, New Delhi. “The development mantra of India is ‘Sabka Saath, Sabka Vikas’,” the Prime Minister said, underlining that the government has drafted its policies and programmes in the spirit of AI for All. He said the government strives to take maximum advantage of AI’s capabilities for social development and inclusive growth, while also committing to its responsible and ethical use.

Source: Al Jazeera. Images: Pune Pulse, Creativeblock, Businesstoday, Decode.
