Whole Brain Emulation, as a platform for creating safe AGI Anna Salamon and Carl Shulman
Feb 23, 2016
Outline
  Background
    AGI volatility
    Why risky
  WBEs
    Whole Brain Emulation (WBE)
    WBE not stable
    But may help to build safe AI
    Can we make WBE come first?
AGI: More change, faster
AGIs can be copied
Serial speedup
Solving coordination problems
Recursive self-improvement
Intelligence + Goals = Change
More intelligence → more change
Most ways of rearranging our world would kill us
Values vary starkly
Arbitrary, rewirable receptors
Threading the needle
Copying specific individuals
WBEs share our values
WBEs can also make AGIs. Faster.
[Diagram: eventually, WBEs lead either to AGI or to a stable cessation of research]
WBEs can't rapidly self-improve
But can try variations
Copy what works
Save states can improve work skills
[Diagram: Save → Restore]
... and stabilize goals
[Diagram: Save → Restore]
Single “upload clans” could take over
... or invent powerful AI
Values also lost through competition
See Nick Bostrom's "The Future of Human Evolution," http://www.nickbostrom.com/fut/evolution.html
Fun
Happiness
Human genes
Love
Play
Job skills
[Diagram: possible outcomes (business as usual, human extinction, stable totalitarianism, controlled intelligence explosion, uncontrolled intelligence explosion), with WBEs positioned among them]
So: is our shot better or worse with WBE?
WBE can ease coordination
Races incentivize risk-taking
Safety infrastructure could be designed ahead of time
Arms races and time for precautions
[Chart: year of first nuclear test. US 1945, USSR 1949, UK 1952, France 1960, PRC 1964]
Forever in a day
[Chart: the same timeline (USSR 1949, UK 1952, France 1960, PRC 1964), with successive days, July 16, 1945 through July 19, 1945, in place of the US entry]
WBE as platform for further tech development
The possibility of a highly-controlled structure
Carefully chosen, tested components
Safety infrastructure could be designed ahead of time
... and stabilize goals
[Diagram: Save → Restore]
Moral objections?
Even without “WBE systems”, advantages to WBEs
Reduced coordination problems
Taking the long view
Existence proof
WBE can ease the technical challenge
Subjectively slower computers
Moore's Law of Mad Science: "Every eighteen months, the IQ required to destroy the world drops by one point." —Eliezer Yudkowsky
WBE safety monitors: as fast and numerous as AIs
Cognitive enhancement for WBEs
But modified WBE can become inhuman fast
Can we make WBE come first?
Inputs to WBE
Hardware
Brain imaging
Computational neuroscience
Hard to budge Moore's law. And if we succeed, we accelerate AGI, too.
Could automate brain imaging
Neuroscience models most likely bottleneck
But same inputs also speed up brain-like AI
How much boost each way?
WBE from partially controlled AI
So, what shall we do?
[Diagram: possible outcomes (business as usual, human extinction, stable totalitarianism, controlled intelligence explosion, uncontrolled intelligence explosion), with WBEs positioned among them]
Thank you
[email protected]@singinst.org