Self-Replicating Automatons: Elon Musk's Fear

Is he losing it?
Last bumped on Oct 18, 2017, 11:57:49 AM
Grey goo is a concern, but I'm not sure it's a major one.
Is he unironically concerned or is he just playing with the thought like John von Neumann?
GGG banning all political discussion shortly after getting acquired by China is a weird coincidence.
ANYTHING which replicates unchecked is ultimately dangerous. Bacteria, locusts, viruses, nanobots, or even molecules.

It's not crazy. It's just beyond most people's understanding.
For years I searched for deep truths. A thousand revelations. At the very edge... the ability to think itself dissolves away. Thinking in human language is the problem. Any separation from 'the whole truth' is incomplete. My incomplete concepts may add to your 'whole truth'; accept it or think about it.
"
pneuma wrote:
Grey goo is a concern, but I'm not sure it's a major one.


The thing is, making nanobots is difficult. So one of the first natural things for a researcher in nanotech to do is to build a nanobot factory... which can go horribly, world-endingly wrong, even with the best of intentions and precautions.

Also there is the subject of weaponizing nanobots. The first applications of many revolutionary new technologies are military. A weapons producer need only produce ONE self-replicating weaponized nanobot; from then on it's almost all profit. It sets the stage for a very lucrative industry.
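A rough sketch of why "only ONE" is enough, assuming simple unchecked doubling (the resource cap below is a made-up number, purely to show how fast the curve runs away):

count = 1               # the single self-replicating unit the producer starts with
cap = 10 ** 30          # pretend ceiling on available raw material (invented for illustration)
cycles = 0
while count < cap:
    count *= 2          # every existing unit builds one copy of itself per cycle
    cycles += 1
print(cycles, count)    # about 100 doubling cycles to blow past 10^30 units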
For years I searched for deep truths. A thousand revelations. At the very edge... the ability to think itself dissolves away. Thinking in human language is the problem. Any separation from 'the whole truth' is incomplete. My incomplete concepts may add to your 'whole truth'; accept it or think about it.
Only if his sole concern is labor displacement is he not crazy.
Just detonate a nuke in space, problem solved.

"ANYTHING which replicates unchecked is ultimately dangerous."
-You intentionally forgot to include humans.
https://www.youtube.com/watch?v=JcKqhDFhNHI
Elon Musk has a proven track record of turning original and futuristic ideas into a well-executed reality. I wouldn't outright dismiss anything he says, although with all the crazy ideas he generates, some are bound to be just that: flights of the imagination.
The Wheel of Nerfs turns, and builds come and pass, leaving memories that become legend. Legend fades to myth, and even myth is long forgotten when the build that gave it birth comes again.
It's real. All these geniuses are saying it, from Hawking to Musk. We had a good 200K-year run as apex beings, but that will soon end. Automatons will follow an exponential evolutionary pattern as well, unlike biological organisms. So if they can make one as smart as a human, it will be 200x smarter in a decade.
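A quick back-of-envelope check on that "200x in a decade" figure, assuming (purely for illustration) smooth exponential growth; the 200x and 10-year numbers are the poster's claim, not established data:

import math

factor = 200    # claimed improvement over the period
years = 10      # claimed period
doubling_time = years * math.log(2) / math.log(factor)
print(round(doubling_time, 2))   # ~1.31 years per doubling under these assumptions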
Git R Dun!
Last edited by Aim_Deep on Oct 12, 2017, 12:21:20 PM
Replicators were bad news in Stargate SG1. They were originally autonomous little robots that took more primitive forms (insects/spiders) and only consumed materials and technology for self replication purposes, without any sort of bias. Basically, they consumed as much resources as they possibly could and self replicated as fast as they could. And then they evolved into human form replicators. And then they became supremacists. Anything biological was deemed inferior.

Creating any sort of machine organism capable of evolution is a terrible idea, not just in science fiction.

There needs to be some sorta line drawn as to what we do with stuff like nanomachines. We can't ever give something like nanomachines a sentient AI. Any sort of sentient AI could potentially circumvent any measures we put in place to contain it. And that's what some people don't realize. Any advanced sentient AI should be assumed to be more intelligent and capable than any human. And if we teach an AI to be like us, and think like us, then that includes all of the vices of humanity.
Last edited by MrSmiley21 on Oct 13, 2017, 11:31:25 AM
