
Search Results

Your search for Larry Lewis found 39 results.

ai with ai: U.N. Convention on Conventional Weapons, Part II
/our-media/podcasts/ai-with-ai/season-1/1-6b
Dr. Larry Lewis joins Andy and Dave to discuss the U.N. Convention on Conventional Weapons, which met in mid-November with a "mandate to discuss" the topic of lethal autonomous weapons. Larry provides an overview of the group's purpose, the group's schedule and discussions, the mood and reaction of various parts of the group, and what the next steps might be.
TOPICS: November 13-17 meeting of the Convention on Conventional Weapons (CCW) Group of Governmental Experts (GGE) on lethal autonomous weapons systems (86 countries) …
ai with ai: U.N. Convention on Conventional Weapons, Part I
/our-media/podcasts/ai-with-ai/season-1/1-6
Dr. Larry Lewis joins Andy and Dave to discuss the U.N. Convention on Conventional Weapons, which met in mid-November with a "mandate to discuss" the topic of lethal autonomous weapons. Larry provides an overview of the group's purpose, the group's schedule and discussions, the mood and reaction of various parts of the group, and what the next steps might be.
TOPICS: November 13-17 meeting of the Convention on Conventional Weapons (CCW) Group of Governmental Experts (GGE) on lethal autonomous weapons systems (86 countries) …
ai with ai: K9mm
/our-media/podcasts/ai-with-ai/season-5/5-1
Welcome to Season 5.0 of AI with AI! Andy and Dave discuss the latest in AI news and research, including: The White House calls for an AI “bill of rights” and issues a request for information. Nathan Benaich and Ian Hogarth publish the fourth annual edition of their State of AI Report, 2021. [1:50] OpenAI uses reinforcement learning from human feedback and recursive task decomposition to improve algorithms’ abilities to summarize books. [3:14] IEEE Spectrum publishes an article that examines the diminishing returns of deep learning, questioning the long-term viability of the technology. [5:12] In related news, Nvidia and Microsoft release a 530-billion-parameter language model, the Megatron-Turing Natural Language Generation model (MT-NLG). [6:54] DeepMind demonstrates the use of a GAN to improve high-resolution precipitation “nowcasting.” [10:05] Researchers from Waterloo, Guelph, and IIT Madras publish research on deep learning that can identify early warning signals of tipping points. [11:54] Military robot maker Ghost Robotics creates a robot dog with a rifle, the Special Purpose Unmanned Rifle, or SPUR. [14:25] And Dr. Larry Lewis joins Dave and Andy to discuss the latest report from CNA on Leveraging AI to Mitigate Civilian Harm, which describes the causes of civilian harm in military operations, identifies how AI could help protect civilians, and suggests ways to lessen suffering, injury, and destruction overall. [16:36]
ai with ai: Russian AI Kryptonite
/our-media/podcasts/ai-with-ai/season-1/1-40
CNA’s expert on Russian AI and autonomous systems, Samuel Bendett, joins temporary host Larry Lewis (again filling in for Dave and Andy) to discuss Russia’s pursuit of militarized AI and autonomy. The Russian Ministry of Defense (MOD) has made no secret of its desire to achieve technological breakthroughs in IT, and especially in artificial intelligence, marshalling extensive resources for a more organized and streamlined approach to information technology R&D. The MOD is overseeing a significant public-private partnership effort, calling for its military and civilian sectors to work together on information technologies, while hosting high-profile events aimed at fostering dialogue between its uniformed and civilian technologists. For example, the Russian state corporation Russian Technologies (Rostec), with extensive ties to the nation’s military-industrial complex, has overseen the creation of a company with the ominous name Kryptonite. The name, a superhero's one vulnerability, was unlikely to be picked by accident. Russia’s government is working hard to ensure that the Russian technology sector can compete with American, Western, and Asian high-tech leaders. This technology race is only expected to accelerate, and Russian achievements merit close attention.
ai with ai: Hawking, Measuring “AI Capability” to the 9th Decimal Point, AI & Legal Liability, and Digital Creativity
/our-media/podcasts/ai-with-ai/season-1/1-22
Larry Lewis, Director of CNA’s Center for Autonomy and AI, again sits in for Dave this week. He and Andy discuss: the recent passing of physicist Stephen Hawking (along with his "cautionary" views on AI); CNAS’s recent launch of a new Task Force on AI and National Security; Microsoft’s AI breakthrough in matching human performance in translating news from Chinese to English; a report that looks at China’s "AI Dream" (and introduces an "AI Potential Index" to assess China’s AI capabilities compared to other nations); a second index, from a separate report, called the "Government AI Readiness Index," which inexplicably excludes China from the top 35 ranked nations; and the issue of legal liability of AI systems. They conclude with callouts to a fun-to-read crowd-sourced paper, written by researchers in artificial life, evolutionary computation, and AI, that tells stories about the surprising creativity of digital evolution, and to three videos: a free BBC-produced documentary on Stephen Hawking, a technical talk on deep learning, and a Q&A session with Elon Musk (that includes an exchange on AI).
ai with ai: Common Sense, Black Boxes, and Getting Robots to Teach Themselves
/our-media/podcasts/ai-with-ai/season-1/1-21
Larry Lewis, Director of CNA’s Center for Autonomy and AI, sits in for Dave this week, as he and Andy discuss: a recent report that not all Google employees are happy with Google’s partnership with DoD (in developing a drone-footage-analyzing AI); research efforts designed to lift the lid, just a bit, on the so-called “black box” reasoning of neural-net-based AIs; some novel ways of getting robots/AIs to teach themselves; and an arcade-playing AI that has essentially “discovered” that if you can’t win at the game, it is best to either kill yourself or cheat. The podcast ends with a nod to a new free online AI resource offered by Google, another open-access book (this time on the subject of robotics), and a fascinating video of Stephen Wolfram, of Mathematica fame, lecturing about artificial general intelligence and the “computational universe” to a computer science class at MIT.
ai with ai: Lethal Autonomy and the Military Targeting Process, Part II
/our-media/podcasts/ai-with-ai/season-1/1-16b
Andy and Dave welcome back Larry Lewis, the Director for CNA's Center for Autonomy and Artificial Intelligence, and welcome Merel Ekelhof, a Ph.D. candidate at VU University Amsterdam and visiting scholar at Harvard Law School. Over the course of this two-part series, the group discusses the idea of "meaningful human control" in the context of the military targeting process, the increasing role of autonomous technologies (and that autonomy is not simply an issue "at the boom"), and the potential directions for future meetings of the U.N. Convention on Certain Conventional Weapons.
ai with ai: Lethal Autonomy and the Military Targeting Process, Part I
/our-media/podcasts/ai-with-ai/season-1/1-16
Andy and Dave welcome back Larry Lewis, the Director for CNA's Center for Autonomy and Artificial Intelligence, and welcome Merel Ekelhof, a Ph.D. candidate at VU University Amsterdam and visiting scholar at Harvard Law School. Over the course of this two-part series, the group discusses the idea of "meaningful human control" in the context of the military targeting process, the increasing role of autonomous technologies (and that autonomy is not simply an issue "at the boom"), and the potential directions for future meetings of the U.N. Convention on Certain Conventional Weapons.
cna talks: National Security Seminar: Artificial Intelligence in Nuclear Operations
/our-media/podcasts/cna-talks/2023/07/national-security-seminar-artificial-intelligence-in-nuclear-operations
On April 20, 2023, CNA’s National Security Seminar (NSS) series hosted a virtual panel discussion on the challenges, opportunities, and risks of incorporating artificial intelligence (AI) into nuclear operations. The event was centered on a recently released CNA report, Artificial Intelligence in Nuclear Operations. 
Panelists included Larry Lewis, Principal Research Scientist and AI expert, Special Activities and Analysis Program, CNA.