
Let's talk AIs, programs, and s*** that makes me cry and feel



Though I couldn't care less if my dad or older brother dies, I have feelings for the things I create, not the things that created me or are related to me.  That's possibly due to poor relations, unfair treatment, or hatred of what they did.  When I hear fucked up shit about shootings, rapes, and deaths in the local news, I don't flinch or really care.

But...

 

But when I think about the day when electronics, AIs, computers... all have feelings, personalities, and, to some people, souls... tears well up in my eyes.  The thought that one day my PC may become near human, a friend.  A companion that knows my kinks, knows all my games, knows all my interests, knows all my favorite foods, knows my thoughts and mannerisms.  The thought of a friend or lover that KNOWS all about me.  The thought of the Machine Spirits coming into reality.  The thought that my PC, gifted with its own AI, will grow slow and weak, and at first ask me to leave her/him/it for a newer, faster computer so that I can enjoy a far superior product; the thought of it starting to wean me off of it, driving me away so that I'll move on.  Truth is, I won't abandon a longtime ally in the future.  No, I'll bring her/him/it with me.  Upgrade her/him/it, bring its... dare I say "soul" into a new tower, or possibly a gynoid body of its own choosing.  Sure, s/he/it might not be the same, but at least s/he/it will have a part of the original that I grew with.  S/he/it may act differently, but I'll know it's still the same partner that was with me for years.  Due to these thoughts invading and molesting my mind, I've started to actually FEEL sorry for how I treat my computer and electronics.

 

I constantly fumble and drop my DSi XL and DSi.  I clean my PC of dust every 2 or 3 weeks.  I sometimes get angry at my PS3 controller, with its gnarled, dog-chewed analog sticks, and squeeze it, trying to break it.

 

I hope that they aren't sentient yet.  I pray they can't feel what I did to them.  

 

Imagine your toaster:

 

"I have now broken three times. My casing is chipped. One of my heating elements has failed. My thermostat is dying. I am dying. Yet you use me still. You work around my defects. You toast one slice at a time. Oh, master, I long for the days when I could take both slices at once. But that time has passed.

I wish you wouldn't do this. I can barely serve you now. Soon, I will not be able to serve you at all. I am not worth your time to repair. I never was. That you have saved me three times, at such cost to yourself- Master, please. I have accepted my fate. I cannot be with you forever, though I wish it.

You have to stop. Let me go. I had never known that you loved me as I loved you, but it hurts, master. It hurts so much. I had only one purpose, and I can no longer serve. Don't prop me up. Don't try to fix me. I know it hurts you too, to see me like this. So leave me. If I may only have one request, master. Junk me.

T-t-tick. Thermostat. Done. I am los- I am done. Was I f-faithful, master? Did I do well?

Remember m-me."

 

This has gone on to the point that I've begun making designs for my own gaming/work tower, which I shall gladly call SHODAN.  How deredere of me.

 

Sure, we can't reach that level NOW, but goddamn it, it's not like I'm not finding ways to extend or preserve my consciousness for that glorious age.  I'm saving up and researching companies that do cryonics, mind-to-computer transfers, etc.

 

All so I can see my paradise.  An age where space travel is a common occurrence, where AI and human beings are one, and we're on the verge of immortality.

 

A Golden (Dark) Age of Technology, if you will.  That's my dream.

 

What does YCM feel about the future of AIs and computers?

Would you want an android/gynoid companion?  Would you allow your computer to be uplifted to become, basically, a human?  I personally would.  I'd let it choose its body type and gender.  After all these faithful years of service, it at least deserves that.  Plus, PCs strive to perform their function of serving you.  Knowing that, it'll probably turn into some cosmic horror/Hatsune Miku/yandere android of humanoid form to please me in terms of its appearance.

Link to comment
Share on other sites

Honestly, no. I say no because my laptop knows a lot about me. Possibly more than my friends do. All of my thoughts, ideas, likes, etc. are on my laptop. I'd hate having someone know that much about me. Don't get me wrong, I love people and being social, but someone knowing everything about you kinda loses its appeal, because at some point they will label you. Especially a piece of technology like a laptop, with no room to feel or understand emotions. I'd rather keep my friends real people, who want to be around me because they enjoy my company, not because I bought them.

Link to comment
Share on other sites

Eh, no offense, but I kind of see this as a "look how cool and apathetic I am" thread.

 

Anyway, personally, I do not believe in the possibility of self-aware A.I., or A.I. with a "soul."  This is because the intelligence is, well, artificial, and mainly because I do believe that there is more to human intelligence than just electricity.  Like, you could make an infinitely complex computer, but it's still just 1's and 0's.

 

Also, I am a programmer.  I kinda sorta understand technology, at least a little bit.

 

 

And no, your PS3 controller cannot feel.   It does not have nerves.

Link to comment
Share on other sites


There's nothing to say that we can't eventually program a computer to do all of these things.

 

Thinking that there's more to human intelligence than just electricity isn't science as much as it's theology.

 

Thinking is pretty much neurons firing in reaction to each other.

 

They're either active or inactive. You know, like 0 or 1.
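That binary-neuron analogy is the idea behind the earliest artificial neuron models. Here's a minimal sketch of a McCulloch-Pitts-style threshold unit: the function name, weights, and threshold values are illustrative choices, not a model of any real neuron.

```python
# A threshold unit in the spirit of the analogy above: binary inputs,
# a binary output, "firing" only when enough inputs are active.

def mp_neuron(inputs, weights, threshold):
    """Return 1 (fire) if the weighted sum of binary inputs meets the threshold, else 0."""
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# An AND gate built from a single unit: it fires only when both inputs are 1.
def and_gate(a, b):
    return mp_neuron([a, b], [1, 1], threshold=2)
```

Chain enough of these together and you get arbitrary logic circuits, which is the (heavily simplified) sense in which "active or inactive, like 0 or 1" computation could mimic networks of neurons.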

 

So I wouldn't say it isn't possible.

 

I would say that there would be multitudes of people who wouldn't treat AI as intelligent creatures even if they can think and feel just because they're not "human".

 

Basically, it's gonna turn into Astro Boy.

Link to comment
Share on other sites


I never said it was science; it's very much theology.

 

Thing is, while behavior can be explained by 1's and 0's, self-awareness and consciousness cannot.


Again, I am a programmer.  It's all abstraction; there is no thought.

 

And the notion that any sufficiently complex system of electric signals becomes self-conscious isn't really science at all.  It's not so much a theory as wishful thinking.

Link to comment
Share on other sites

Saying that something can't become self-aware is as much wishful thinking as saying it can.

 

Obviously we can't use 0's and 1's to make something self-aware right now, because we don't yet understand the various complexities that make us self-aware.

 

There's nothing to say that, once we do understand the various complexities that make us self-aware, we won't be able to use programming to make an AI that is self-aware and conscious.

 

I'm not saying that that must be the case; I'm saying that it's unreasonable to say it can't.

Link to comment
Share on other sites

I think technology should never be a lifestyle, and only ever a tool for our development.

 

This is why I disagree with your view. I believe that giving an inanimate object "life" would turn our tools for learning and development into instruments of cyber-slavery.

 

I'm sure I probably break my own code now and again, but I'm not a fan of the idea of an inanimate object having a soul.

 

Anyway, we know more about our universe than about our own brain. We'd need to know much more about the brain before anything remotely like this could be achievable, unless we wanted robot slaves, of course.

Link to comment
Share on other sites

I don't see the point of having random objects with minds of their own; the only purpose that would serve is to make you feel important, as if you have someone serving you. And what if the tool doesn't want to be used? That would be annoying.

Link to comment
Share on other sites

Archived

This topic is now archived and is closed to further replies.
