Sunday, May 17, 2026

Urgent Update: Former Google Executive Breaks Silence on AI Risks, Calling Them ‘Beyond an Emergency’ and ‘Bigger Than Climate Change’


Qvive Editor: According to a former Google exec, AI is more likely to disrupt the world in the next few years than climate change.

In recent years, the rise of Artificial Intelligence (AI) has been nothing short of remarkable. From self-driving cars to virtual assistants like Siri and Alexa, AI has become an integral part of our daily lives. However, with this rapid advancement comes the looming question: is AI a threat to humanity?

Despite its numerous advantages, the rise of AI has raised ethical concerns among experts and the general public. One of the primary concerns is the potential impact of AI on the job market. As AI technology becomes more advanced, there is a fear that it could replace human workers, leading to widespread unemployment.

Goldman Sachs Predicts 300 Million Jobs Will Be Lost or Degraded by Artificial Intelligence

In a 2019 Wells Fargo [1] study, the bank concluded that robots would eliminate 200,000 jobs in the banking industry [2] within the next 10 years.

Ref: https://www.forbes.com/sites/jackkelly/2023/03/31/goldman-sachs-predicts-300-million-jobs-will-be-lost-or-degraded-by-artificial-intelligence/?sh=52e2caf5782b

The World Economic Forum’s Future of Jobs report likewise lists the roles expected to decline fastest as AI adoption spreads:

https://www.weforum.org/agenda/2023/05/jobs-lost-created-ai-gpt/
[1] https://www.forbes.com/companies/wells-fargo

[2] https://www.forbes.com/sites/jackkelly/2019/10/08/wells-fargo-predicts-that-robots-will-steal-200000-banking-jobs-within-the-next-10-years/

The idea of AI posing a threat to humanity has been a popular topic in science fiction for decades. From movies like “The Terminator” to “Ex Machina,” the concept of AI turning against its creators has been a recurring theme. While this may seem like a far-fetched scenario, some experts warn that the rapid development of AI could have unintended consequences.

One of the main concerns is the possibility of AI surpassing human intelligence and becoming uncontrollable. This so-called “singularity” scenario, in which AI becomes self-aware and self-improving, has sparked debate among scientists and technologists. If AI were to reach that level of intelligence, it could pose a threat to humanity as we know it.

Leading the charge in funding AI research and development are the big tech companies such as Google, Amazon, Microsoft, and Facebook. These companies have deep pockets and the resources to invest heavily in AI technologies. They have dedicated research teams working on cutting-edge AI applications, from voice assistants and autonomous vehicles to deep learning algorithms.

Tech giants aren’t the only ones fueling the growth of AI; venture capitalists are also making a big impact by providing funding. They are investing in startups that are pushing the boundaries of AI innovation. These startups are focused on developing AI-powered solutions for a wide range of industries, including healthcare, finance, and cybersecurity. Venture capitalists see the potential for high returns on investment in the rapidly growing AI market. As AI technologies become more prevalent in our daily lives, there is a growing need to ensure that AI is developed and used ethically. Philanthropic organizations are funding initiatives to promote ethical AI research and development. 

Governments around the world are also recognizing the importance of AI and are investing in research and development initiatives. Countries such as the United States and China, along with the European Union, have set up programs to fund AI projects and support AI education and training. Given the potential risks associated with AI, many argue that governments must also regulate its development and deployment. Without proper oversight, there is a real danger that the technology will be misused or exploited; governments have a responsibility to protect their citizens and ensure that AI is used ethically and responsibly.

A former Google exec warned about the dangers of AI, saying it is ‘beyond an emergency’ and ‘bigger than climate change’

A former Google executive has weighed in on the debate around AI and warned that it is a bigger emergency than climate change, in an episode of The Diary of a CEO [a] podcast released Thursday.

[a] https://www.youtube.com/watch?v=bk-nQ7HF6k4

Mo Gawdat, previously chief business officer at Google X — the company’s division for ambitious projects known as “moonshots” — spoke with podcast host Steven Bartlett about whether AI is sentient, its impact on jobs, and how he believes the government needs to regulate the industry.

‘When machines are specifically built to discriminate, rank and categorize, how do we expect to teach them to value equality?’ 
Mo Gawdat outlines the terrifying future of artificial intelligence and the ethical code we all must teach to machines to avoid it.

“It is beyond an emergency,” Gawdat told Bartlett in the podcast. “It’s the biggest thing we need to do today. It’s bigger than climate change, believe it or not.”

He added: “The likelihood of something incredibly disruptive happening within the next two years that can affect the entire planet is definitely larger with AI than it is with climate change.” 

Gawdat further argued that he believes the rapid development of AI will result in “mass job losses” and that governments need to step in to regulate the technology.

He said: “I have a very clear call for action for governments. I’m saying tax AI-powered businesses at 98% so suddenly you do what the open letter was trying to do, slow them down a little bit, and at the same time get enough money to pay for all of those people that will be disrupted by the technology.”

Gawdat was referring to an open letter in March [b] that called for a six month pause on the development of AI more powerful than OpenAI’s GPT-4, signed by AI experts and leading figures in the industry including Elon Musk, Apple co-founder Steve Wozniak, and Stability AI CEO Emad Mostaque. 

[b] https://www.businessinsider.com/ai-letter-elon-musk-out-of-control-new-tech-gpt4-2023-3?r=US&IR=T

The letter said tech firms are locked in an “out-of-control race to develop and deploy” new AI technologies, which risks a loss of control of civilization.

Insider reached out to Gawdat for further comment via LinkedIn, but did not immediately hear back. 

After OpenAI’s chatbot ChatGPT launched in November and became the fastest growing consumer app in internet history [c] , Google launched a competing product called Bard [d] in March.

[c] https://www.businessinsider.com/chatgpt-may-be-fastest-growing-app-in-history-ubs-study-2023-2?r=US&IR=T

[d] https://www.businessinsider.com/google-bard-ai-chatgpt-alternative-2023-3?r=US&IR=T


This AI Is Dangerous – OpenAI Sora

Sources: Business Insider, Pan Macmillan, YouTube
