It became very clear to me, probably about a decade ago, that we’re just on a trend line where on a long enough time scale I can 3D print a nuclear weapon in my house. On a long enough time scale, I can take a synthetic biology lab in my room and create a weaponized virus that combines the most virulent features of smallpox and the longevity of AIDS and spreads like the flu.
Existential Risks: Weaponized Viruses, The End of Commercial Air Travel, 3D Printing Nuclear Weapons
Drones are essentially guided bullets. A drone is a bullet with a tracking system... I think they could easily end commercial aviation as we know it.
Drones Will End Commercial Aviation as We Know It
At any given time on the planet, there are a few people – and one is too many – who, if you gave them a button that would end the planet, would press that button.
With Suicidal Mass Shootings & Murders, Technology is the Force Multiplier
On a long enough time scale, I think commercial air travel will come to an end because weaponized drones will be too prevalent.
Existential Risks: Weaponized Viruses, The End of Commercial Air Travel, 3D Printing Nuclear Weapons
There’s no free lunch here. The act of creating the technology required to save us will first create the technology that will destroy us.
Creating the Technology Required to Save Humanity Will First Create the Technology That Destroys Humanity
Just look at how we treat every other creature on Earth to get an idea of how AI will treat us.
How Will AI Treat Humans?
What if it turns out that all the things we do to make day-to-day life a little more pleasurable, ranging from taking SSRIs to using Facebook, make our average baseline pleasure slightly higher but actually increase the blow-up risk?
With Suicidal Mass Shootings & Murders, Technology is the Force Multiplier
I don’t think general AI is high up on the list of issues, not because it’s impossible, but because it’s going to take a very long time. It's improbable in the next 50-100 years.
Naval Thinks General AI is Not Happening Anytime Soon
If you look at how much of this AI stuff is being peddled today by people who did not have "AI" as a title on their business card five or ten years ago, it tells you that there's no such thing as a true AI expert who we should all be following and listening to.
There Are No General Artificial Intelligence Experts
It’s been getting easier and easier to destroy things thanks to technology, and it’s getting harder and harder to defend against that destruction.
Existential Risks: Weaponized Viruses, The End of Commercial Air Travel, 3D Printing Nuclear Weapons
If you were a Bond villain trying to take over the world, the three technologies that you would look at are synthetic biology, hidden nuclear weapons, and hunter-killer suicide drones that are miniaturized…like nano drones that use pheromone tracking to find their targets.
Drones Will End Commercial Aviation as We Know It
The basic idea is that if we cross certain technological red lines, there’s some non-zero chance that a super artificial intelligence could arise, seize control, and imperil humanity.
With AI, We're Dealing With Scientists Making Dangerous Bets For Privatized Gains and Socialized Losses
Saying you’re an “AI expert” just means your identity is tied up in the technology, making it hard to tolerate any criticism of it.
There Are No General Artificial Intelligence Experts
Physical privacy is dead.
It's Only a Matter of Time Before AI Can Design Better Software Than Humans
When suicidal mass murderers really go all in, technology is the force multiplier... When someone goes nuts with a knife versus a machine gun, the person with the more powerful technology is going to kill more people.
With Suicidal Mass Shootings & Murders, Technology is the Force Multiplier
It’s just the nature of the Faustian bargain with technology: we get so much more power over our natural environment. That power includes the ability to destroy things, and the destructive powers arrive long before the protective powers do.
Existential Risks: Weaponized Viruses, The End of Commercial Air Travel, 3D Printing Nuclear Weapons
It's going to cost a hundred billion dollars plus to develop an AI.
With AI, We're Dealing With Scientists Making Dangerous Bets For Privatized Gains and Socialized Losses
At some level, it’s just hard to imagine how we tame nature without the ability to end nature.
Creating the Technology Required to Save Humanity Will First Create the Technology That Destroys Humanity
Creating destructive power is actually a lot easier than creating protective power: “From the moment we first split the atom, the trip to the first nuclear bomb was much shorter than the trip to the first nuclear power plant.”
Amara's Law: We Overestimate Technology in the Short Run and Underestimate it in the Long Run
Even though they signed the Geneva Convention, I will bet you there are multiple working bioweapons labs in the world today.
The Prevalence of Bioweapons Labs (even though they're against the law)