Why we need to regulate non-state use of arms
It's time for states to agree on anti-proliferation measures so that non-state actors cannot create lethal autonomous weapons by repurposing civilian products.
1982, BA in Physics, Oxford; 1986, PhD in Computer Science, Stanford. Since 1986, Professor, University of California, Berkeley. Fellow, present or former: AAAS; ACM; AAAI; Chaires Blaise Pascal. Research interests: all areas of artificial intelligence; nuclear treaty verification; intensive care medicine; long-term future of human and machine intelligence.