Thousands of Tech Leaders Have Signed a Pledge to Curb the Development of Deadly AIs

As the 21st century welcomes an AI boom, science and technology leaders sign a historic pledge to protect humanity from the development of lethal autonomous weapons.

Autonomous AI and weapons may have a future. Just not together.

As of August 2018, more than 2,400 high-profile figures in science and technology — from SpaceX CEO Elon Musk to the late astrophysicist Stephen Hawking — have signed a Lethal Autonomous Weapons Pledge declaring their intention to halt an autonomous AI arms race before it begins. The historic motion urges governments to institute regulations that preemptively ban and deter militarized nations from amassing Lethal Autonomous Weapon Systems (LAWS): a growing class of automated weaponry, including unmanned drones, fighter jets, and any lethal AI endowed with decisive power over human life.


Authored by the Future of Life Institute (FLI), the pledge warns against the cataclysmic potential of developing LAWS that can autonomously identify and kill a human target. Independently of the pledge, 26 United Nations member states have explicitly endorsed a ban on LAWS, including Argentina, China, Iraq, Mexico, and Pakistan.

Last year, at the International Joint Conference on Artificial Intelligence (IJCAI), one of the world’s leading AI conferences, FLI’s AI and robotics researchers released an open letter calling for a preemptive ban on LAWS to avert a “third revolution in warfare.” The letter urged the United Nations to prevent autonomous AI technologies from becoming “sanctioned instruments of violent oppression.”

However, as AI becomes integrated into everyday global defense initiatives, spending on national machine-learning programs has only increased. The Pentagon’s Project Maven — a program designed to categorize objects in drone imagery — received a 580% funding increase under the $717 billion National Defense Authorization Act (NDAA), passed by the US Congress in recent weeks and signed by President Trump.

Image: Cockpit of a Typhoon fighter jet; its pilots may soon get the ejector seat.

Described as an outreach organization protecting humanity against destructive technologies, FLI pursues a utopian mission: to “catalyze and support research initiatives for safeguarding life, developing optimistic visions of the future...including positive ways for humanity to steer its own course considering new technologies and challenges.” FLI’s founders commit to “neither participate in nor support the development, manufacture, trade, or use of lethal autonomous weapons.” In a world without LAWS, the pledge suggests, autonomous AI could instead be directed at societal challenges such as resource management, renewable energy, environmental conservation, and the lingering instability left by the global financial crisis.

One of the pledge’s first signatories, Stuart Russell, a leading AI scientist at the University of California, Berkeley, believes manufacturing LAWS undermines basic human security and freedom: “It is not science fiction. In fact, it is easier to achieve than self-driving cars, which require far higher standards of performance. Our AI systems must do what we want them to do…keeping artificial intelligence beneficial.”

Elon Musk, alongside Facebook’s Mark Zuckerberg, is one of many entrepreneurs funding “beneficial” AI through joint machine-learning ventures like Vicarious FPC, a company building neural networks modeled on the brain’s neocortex, which governs vision, body movement, and language. Google’s DeepMind, with over $400 million invested to date, currently leads the private sector in AI development aimed at making a positive impact.

However, while not intended for LAWS, the neuro-inspired learning algorithms behind Facebook’s face-recognition software, Apple’s and Samsung’s smartphone personal assistants, and Google’s self-driving cars lend themselves readily to lethal AI applications. In India and the UK, researchers have already flown drones whose image-recognition algorithms scan video footage to engage “violent” targets.

Image: How software analyzes individual poses and matches them to “violent” postures.

Non-governmental coalitions such as Human Rights Watch and the Campaign to Stop Killer Robots describe LAWS as new “weapons of mass destruction,” reflecting fears that warring nations and rogue terrorists alike could automate genocide at a horrific pace. The decision to take a human life, they argue, should never be delegated to a machine intelligence. As autonomous AI reshapes military, industrial, and economic landscapes, policymakers must weigh the long-term moral and geopolitical consequences of weapons that engage human targets without human oversight.

The FLI’s pledge concludes that building and deploying LAWS invites catastrophe on an unprecedented scale: “Once developed, lethal autonomous weapons will permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans can comprehend. These can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways.”

The monumental steps taken toward proactively regulating LAWS are not only altruistic but also reflect a universal, existential responsibility to avoid “summoning a demon” before it’s too late. Leading experts are unanimous: once opened, this Pandora’s box cannot be closed.
