Is a technological singularity bad?


Great Unclean One: VK

Basically, is it bad that we build a supercomputer that's smarter than us?

 

And then tell that computer to create an even smarter and stronger computer?

 

And then tell that one to build a stronger, faster, and smarter computer?

 

And so on?

 

Is that bad?

 

Many fear Skynet and a robotic revolution.

 

I don't fear such things if we do the smart thing.

 

Create a friendly AI.

 

Like HAL9000.

 

Or GLaDOS.

 

Plus, is it wrong to equip ourselves with implants, making ourselves cyborgs?

 

Or make android sex slaves?

 

Overall, is it wrong for us to have a technological singularity, where technology expands and grows at a rapid rate?

 

But the best thing about a technological singularity is the chance to completely merge with robots and leave the human body behind.

 

Yep, we all know that once I move into an android body, I'm tentacle raping female and male androids.

 

All those who want to watch, just tell me.


Once we get to that point, hells yeah, implant me up.

 

Think of it this way. Humans no longer naturally evolve due to the lack of natural selection. Technological superiority, morals, and our society keep 'weak' gene mutations in circulation, which prevents us from gradually adapting to our environment in that way.

 

However, we gained technological evolution. It's how we advance ourselves now, and it's far more rapid and effective than our previous method. If that, too, stagnates, we'll have to seek out a new way of bettering ourselves; but first, I'd like to see us draw all we can from tech.


Is it really possible to create an intelligence greater than ours?

 

Of course.

 

It'll just take a lot of time, patience, and love to create it.

 

Personally, I suggest creating a feminine AI that's friendly.

 

Put "her" in charge of creating a smarter AI.

 

And so on.

 

Then once "she" completes "her" mission, we use "her" to make improvements on current tech. Then, when the new, smarter AI created from "her" creates a smarter one, we give me the old AI and use the new one to upgrade our tech. And we'll just continue this cycle.
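The hand-off cycle described above can be sketched as a toy loop. This is purely illustrative: the improvement factor, generation count, and the idea of "retiring" old AIs into tech-upgrade duty are all invented for the sketch, not anyone's actual proposal.

```python
# Toy sketch of the proposed bootstrapping cycle (all numbers invented):
# each AI generation designs a smarter successor, and the now-obsolete
# generation is handed down to upgrade existing tech.

def design_successor(intelligence):
    """Hypothetical step: an AI designs a somewhat smarter successor."""
    return intelligence * 1.5  # assumed fixed improvement factor

def bootstrap(generations=5, seed_intelligence=1.0):
    current = seed_intelligence
    retired = []  # old AIs repurposed for upgrading current tech
    for _ in range(generations):
        successor = design_successor(current)
        retired.append(current)  # hand the old AI down the chain
        current = successor
    return current, retired

smartest, retired = bootstrap()
```

With a fixed improvement factor the chain grows geometrically, which is the optimistic version of the scenario; the next post argues the factor would shrink over time instead.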

 

Also, I should upload myself onto the Interwebz.

 

Basically, Ghost in the Shell.


Of course, at some point the chain would have to plateau, and it would be entirely impossible to make a 'better' or 'more advanced' computer AI using subjective terms like that. Of course, at that point, we'd have instantaneous communication, eternal life, no need to consume resources, and might even be comparable to a race of gods.

 

Or maybe we wouldn't be a 'race' at all, but a single mind networked together, a single consciousness of almighty power that exists without need of anything else.

 

But at that point, we'd be God. So aiming higher would just be silly anyway.


Of course, at some point the chain would have to plateau, and it would be entirely impossible to make a 'better' or 'more advanced' computer AI using subjective terms like that. Of course, at that point, we'd have instantaneous communication, eternal life, no need to consume resources except raw energy, and might even be comparable to a race of gods.

 

Or maybe we wouldn't be a 'race' at all, but a single mind networked together, a single consciousness of almighty power that exists without need of anything else other than energy.

 

But at that point, we'd be God the Borg. So aiming higher would just be silly anyway.

 

Fix'd


I'd always worry about something going wrong. Surely, if we give something more and more intelligence, it will start to question why, which could lead us to more trouble.

 

Unless we install directives and restrictions.

 

Sure, it would go against my code, but still.

 

I want an ultimate android tentacle raping body.


Pish, c'mon. The Cybermen aren't even close to a technological singularity. They still have hardware.

 

When software can exist independently of most other matter and we can be converted to that state, when we have no need to consume energy and individuality is no longer necessary, we will have hit the point where tech can't get us any higher. Presumably, a new discipline would then appear to take its place, probably something we can't comprehend in our current state.


So you're suggesting that we create an artificial intelligence, then have him/her make something smarter than it?

 

Anyway, being an android... I wonder what that would be like. Considering it's an android, you'd probably look like your old self, right? Androids are robots with AIs designed to look like humans, so basically an artificial human, I think. Or maybe they don't have AIs and they just look human? I don't know, but I do know that androids are designed to look human.


Androids are robots designed to look, act, and think human. Cyborgs are humans upgraded with technology, which is what I think you mean by "being an android". Also, if we build something designed to build something smarter, eventually along the line the technology would question why, and we would have a Skynet situation. I don't want that.


Androids are robots designed to look, act, and think human. Cyborgs are humans upgraded with technology, which is what I think you mean by "being an android". Also, if we build something designed to build something smarter, eventually along the line the technology would question why, and we would have a Skynet situation. I don't want that.

 

UNLESS we install something into it to keep it friendly or to control it.


Yes, but if it is created to build better/smarter versions, even if one of them makes a small mistake (which could happen with artificial intelligence), the next one could get out of control. Not just like Terminator, but also 9, with the Fabricator, which was smart enough to create drones that destroyed humanity.


I saw a space video at school about imaginary planets thought up by scientists, with species that they imagined could survive there.

 

For one of them, it was a planet where the original life created intelligent robots that one day took over and made more of themselves, constantly improving at an extremely fast rate, to the point where they built a tower of themselves up into space (not needing to breathe) and created a sort of cage around their sun to absorb as much energy as possible, until they had enough to travel light-years across space to another planet with another star and continue their constant development.

 

I think we should not go too far with our manipulation of life. I don't entirely mind the idea of genetic engineering, so long as we are careful. Selective breeding is unnatural enough, like controlling evolution. But combining ourselves with machines could go to any lengths; we could have dishwashers in our chests or anything. That is too far: it takes away what it means to be human, creating a pointless existence, not being able to do anything meaningful.

 

We are going too fast, we need to slow down before we crash.


Archived

This topic is now archived and is closed to further replies.
