flat assembler
Message board for the users of flat assembler.

flat assembler > Heap > Skynet versus The Red Queen -- Discussions on AI

Goto page Previous  1, 2, 3, 4, 5, 6 ... 10, 11, 12  Next
Author
Thread Post new topic Reply to topic
YONG



Joined: 16 Mar 2005
Posts: 8000
Location: 22° 15' N | 114° 10' E
Furs wrote:
I firmly believe that if we treat AIs right, since they're supposedly smarter than us, they'll treat us well in return.
We have been through this argument before. So, I guess that you know my standpoint.

Hope that you are right on this. Otherwise, the human race will definitely be wiped out by AIs.

Wink
Post 25 Aug 2017, 02:12
View user's profile Send private message Visit poster's website Reply with quote
Furs



Joined: 04 Mar 2016
Posts: 1421
If we get wiped out even though we don't treat them wrong -- then the AI isn't (to me) as intelligent as we make it out to be. In this case, we can probably turn the situation around.

If we get wiped out because they treat us the same way we treated them (or make slaves out of us like in Matrix, if they even have a need for that), then I don't see the issue: a battle between two self-centered species, whoever wins doesn't matter since both suck.

But yes, I know your stance; I was just explaining it to him Razz
Post 25 Aug 2017, 11:57
sleepsleep



Joined: 05 Oct 2006
Posts: 8273
free will is another huge topic, Embarassed

how do we separate consciousness from signal processing? Wink
Post 25 Aug 2017, 12:15
ProphetOfDoom



Joined: 08 Aug 2008
Posts: 120
Location: UK
Hi Furs,
I must admit I had to have a hard think about your feedback argument. Self-modifying code could result in very dynamic AIs which their creators couldn't even imagine or predict. However, they'd still be modifying themselves deterministically ("same input, same output" just becomes "same input and internal state, same self-modification, same output"). That is to say, programs cannot act willfully, they can only react in the manner of a slave. So to me, even as a vegan it would be okay to treat an AI as a slave and far below an animal such as a cat. Now a human brain augmented by AI is a different situation altogether and IMO would deserve all the rights and respect (and suspicion) we give to humans. Probably more suspicion actually. I think AI-augmented humans is where the interest and danger lies.
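The determinism argument above can be sketched in a few lines of hypothetical Python (names and rule structure are mine, purely for illustration): even a program that rewrites its own rules maps the same (input, internal state) pair to the same output every time, so "same input, same output" just gains a state parameter.

```python
# A toy "self-modifying" agent: its rule table is part of its internal state,
# and it rewrites that table as it runs -- yet each step is a pure function
# of (state, input), so identical histories give identical behaviour.

def step(state, x):
    """Apply the current rule, then 'self-modify' by updating the rule table."""
    rules = state["rules"]
    y = rules.get(x, 0)          # react according to the current rules
    new_rules = dict(rules)
    new_rules[x] = y + 1         # self-modification: the rules change with use
    return {"rules": new_rules}, y

def run(inputs):
    state = {"rules": {}}
    outputs = []
    for x in inputs:
        state, y = step(state, x)
        outputs.append(y)
    return outputs

# Two separate runs over the same input stream are indistinguishable:
print(run("abab"))                  # -> [0, 0, 1, 1]
print(run("abab") == run("abab"))   # -> True
```

The behaviour is "dynamic" (the rules the creator wrote are not the rules in force at step N), yet it is still fully determined by the input history.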
As for the free will thing, well, I can't prove it; it's just a strong belief I have. People are responsible for their actions; otherwise, why would we want revenge when someone hurts us? If they were just automata, we would have no right or desire to punish them.

Quote:
Lastly, I firmly believe that if we treat AIs right, since they're supposedly smarter than us, they'll treat us well in return


Intelligence doesn't beget morality. Josef Mengele was intelligent enough to get degrees in medicine and anthropology but it didn't stop him dissecting living human beings. Oops I mentioned a Nazi...

I didn't seriously think you wanted an AI apocalypse BTW. You just seemed kind of frustrated with people as they are now. Maybe AI will make us better, I guess we'll find out - it'll be a while yet considering how stupid Siri and "Okay Google..." currently are.

Sleepsleep, you should read about qualia.
Post 25 Aug 2017, 17:49
revolution
When all else fails, read the source


Joined: 24 Aug 2004
Posts: 16651
Location: In your JS exploiting you and your system
You don't have to have deterministic AIs that follow the same-input-same-output paradigm. It is conceivable to have a random element within the neural network (or similar structure) that changes weights and values, perhaps slightly or perhaps more, producing unpredictable outputs from two identical AIs that experience the exact same inputs.
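A minimal sketch of that idea, in hypothetical stdlib-only Python (the single-weight "network" and update rule are invented for illustration): two copies of the same agent start with identical weights and see identical inputs, but each weight update carries a small random jitter, so their behaviour diverges.

```python
import random

# Two identical tiny "neurons" (one weight each), fed the same input stream.
# A small random perturbation is injected into every weight update, so the
# copies drift apart even though code, starting weight and inputs match.

def run_agent(inputs, rng):
    w = 0.5                                       # identical starting weight
    outputs = []
    for x in inputs:
        y = w * x                                 # the agent's response
        w += 0.1 * x + rng.uniform(-0.05, 0.05)   # update + random element
        outputs.append(round(y, 3))
    return outputs

inputs = [1.0, -0.5, 2.0, 0.25]
a = run_agent(inputs, random.Random(1))   # two "identical" AIs, each with
b = run_agent(inputs, random.Random(2))   # its own source of noise
print(a)
print(b)
print(a == b)   # False: same inputs, different behaviour
```

The first output is the same for both (the noise has not yet acted), but every subsequent response depends on the accumulated random perturbations.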
Post 25 Aug 2017, 20:00
ProphetOfDoom



Joined: 08 Aug 2008
Posts: 120
Location: UK
Hmm. I had considered that in the past... but it would seem a pretty impoverished form of free will if it just depended on a random number generator. I'm sure there is some phenomenon in nature that imparts true freedom (for example to choose between good and evil).
I was discussing this with a friend and he suggested that our every action is determined by our genetics, environment etc. I think some people are just scared of freedom because "If I am free, I am responsible" or perhaps more to the point: "If I am free, I am guilty."
Post 25 Aug 2017, 20:20
revolution
When all else fails, read the source


Joined: 24 Aug 2004
Posts: 16651
Location: In your JS exploiting you and your system
At a low level, randomness may be what free will is in humans, just a simple quantum fluctuation in the neurons. Concepts like good and evil are much higher-level things, and I doubt anyone could pinpoint a neuron, or neurons, within the brain that determines such things as good or evil. It would only need subtle changes at the physical level, and the cumulative effects can produce vastly different outcomes.
Post 25 Aug 2017, 20:27
YONG



Joined: 16 Mar 2005
Posts: 8000
Location: 22° 15' N | 114° 10' E
sleepsleep wrote:
free will is another huge topic,
Indeed. Refer to:

Free will could all be an illusion, scientists suggest after study shows choice may just be brain tricking itself
http://www.independent.co.uk/news/science/free-will-could-all-be-an-illusion-scientists-suggest-after-study-that-shows-choice-could-just-be-a7008181.html

Wink
Post 26 Aug 2017, 02:19
ProphetOfDoom



Joined: 08 Aug 2008
Posts: 120
Location: UK
That study only shows that people are somewhat inclined to automatically deceive themselves. Maybe it shows that we are somewhat free, rather than having freedom without boundaries. Which is obvious. The people in the study overestimated the degree of their freedom.
But, as I said before, some people have an urgent need to believe they are not free, because freedom brings with it all sorts of uncomfortable stuff like responsibility and guilt that they are too afraid to deal with.
Anyway to get back on topic, I was more concerned with proving that classical computers are not free and thus don't deserve rights, compassion, etc..
Post 26 Aug 2017, 03:01
sleepsleep



Joined: 05 Oct 2006
Posts: 8273
ProphetOfDoom wrote:

Sleepsleep, you should read about qualia.

thanks, Smile

is free will something more than how consciousness feels, reasons, or is aware? there are surely too many layers we could investigate: whether we are forced, whether we can choose, or whether we can choose as we like,

the layer we fixate on is the main factor deciding whether we conclude we have free will or not,

eg, i couldn't choose not to breathe if i want to continue living,

if a person gets obsessed with that layer, then obviously he/she has no free will,

is being obsessed with something/somebody a feature beyond what you can control? probably yes, because it sounds like falling in love with, or hating, another person,

you have flee will? Laughing
Post 26 Aug 2017, 04:05
ProphetOfDoom



Joined: 08 Aug 2008
Posts: 120
Location: UK
I think love and hate are largely out of our control. Although we can choose to cultivate our love and hatred or choose not to.
Quote:
eg, i couldn't choose not to breathe if i want to continue living,

That's why I'd say human beings are "free within constraints" rather than "unconditionally free" like God. I can choose what I want to eat but I can't teleport myself to the restaurant. I have to walk there, which is a constraint.
Post 26 Aug 2017, 04:31
YONG



Joined: 16 Mar 2005
Posts: 8000
Location: 22° 15' N | 114° 10' E
sleepsleep wrote:
i couldn't choose not to breathe if i want to continue living
If, one day, you choose not to breathe, please convert all your assets to Bitcoins and send those digital dollars to one of the forum members first. Thanks!

BTW, do I need a digital wallet (or something like that) to store Bitcoins? Rolling Eyes

Wink
Post 26 Aug 2017, 04:37
YONG



Joined: 16 Mar 2005
Posts: 8000
Location: 22° 15' N | 114° 10' E
ProphetOfDoom wrote:
I was more concerned with proving that classical computers are not free and thus don't deserve rights, compassion, etc..
According to one of the forum members, self-learning machines do have rights. And we -- the creators of such machines -- should respect their rights. If such "supreme" beings choose to eliminate us, it is just the next logical step in evolution and so be it!

Don't look at me. That is not my idea!

Wink
Post 26 Aug 2017, 04:44
ProphetOfDoom



Joined: 08 Aug 2008
Posts: 120
Location: UK
Very Happy
Post 26 Aug 2017, 05:00
Furs



Joined: 04 Mar 2016
Posts: 1421
ProphetOfDoom wrote:
Hi Furs,
I must admit I had to have a hard think about your feedback argument. Self-modifying code could result in very dynamic AIs which their creators couldn't even imagine or predict. However, they'd still be modifying themselves deterministically ("same input, same output" just becomes "same input and internal state, same self-modification, same output"). That is to say, programs cannot act willfully, they can only react in the manner of a slave. So to me, even as a vegan it would be okay to treat an AI as a slave and far below an animal such as a cat.
FWIW that's what the Lisp hype in the 60s (or was it 70s?) was all about: self-programming via Lisp macros. That's why Lisp was/is considered the "language of artificial intelligence" (even though I disagree).
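The "code is data" idea behind Lisp macros can be approximated in a hypothetical Python sketch (this is not Lisp, and the mini-interpreter below is invented for illustration): programs are plain nested lists, so one piece of code can rewrite another before it is evaluated, which is the essence of what a macro does.

```python
# Lisp-style "code as data" in miniature: expressions are nested lists,
# eval_expr interprets them, and expand() rewrites code before evaluation.

def eval_expr(e):
    """Evaluate a prefix expression given as a nested list."""
    if isinstance(e, (int, float)):
        return e
    op, *args = e
    vals = [eval_expr(a) for a in args]
    if op == "+":
        return sum(vals)
    if op == "*":
        r = 1
        for v in vals:
            r *= v
        return r
    raise ValueError(f"unknown op {op!r}")

def expand(e):
    """A 'macro': rewrite (double x) into (* 2 x) before evaluation."""
    if not isinstance(e, list):
        return e
    if e[0] == "double":
        return ["*", 2, expand(e[1])]
    return [e[0]] + [expand(a) for a in e[1:]]

program = ["+", ["double", 5], 3]    # source code, held as a data structure
print(eval_expr(expand(program)))    # -> 13
```

Because `program` is an ordinary list, the expansion step is just list manipulation; real Lisp macros do the same thing with the language's own syntax trees.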

About same input, same output: that can very well be true, but the question is, isn't it the same for humans? We don't know, and YONG, even though he is against my views, definitely thinks humans are not "free" in thinking and are, in fact, nothing special.

I can understand your position, since you believe in God and probably that humans have souls. I have no problem with it, just a different viewpoint on the world.

I do have a problem understanding YONG's reasoning though. He doesn't think humans are anything innately special, and he holds that the Universe is just something arising from random fluctuations. I don't have a problem with this position, but he wants to treat humans as special despite not believing they are special. That makes no sense to me whatsoever. I guess this is a point where actions are definitely not the same as words.

What you said about constraints makes perfect sense though, and applies to AI just as well. Free will is concerned with free thinking, not anything else. An AI with no physical body definitely shows this as a possibility, or a "human brain uploaded to the cloud" as Ray Kurzweil puts it.
Post 26 Aug 2017, 10:59
YONG



Joined: 16 Mar 2005
Posts: 8000
Location: 22° 15' N | 114° 10' E
Furs wrote:
I do have a problem understanding YONG's reasoning though. He doesn't think humans are anything innately special and that the Universe is just something arising from random fluctuations. I don't have a problem with this position, but the fact he wants to treat humans special despite the fact he doesn't believe they are special. Makes no sense to me whatsoever. I guess it is the point where actions are definitely not the same as words.
I, being a member of the human race, do not want to see the demise of the ruling species of the lonely planet. That's why I am doing my part to warn forum members of the potential danger of unconstrained AI, as some of them may -- or will -- be involved in the development of such "supreme" beings.

My standpoint has nothing to do with whether the human race is special or not.

Wink
Post 26 Aug 2017, 12:17
YONG



Joined: 16 Mar 2005
Posts: 8000
Location: 22° 15' N | 114° 10' E
Furs wrote:
YONG even though he is against my views, definitely thinks humans are not "free" in thinking ...
To me, "free will" is primarily about making choices and decisions.

For spontaneous decisions, I would say that we are not 100% free -- there is a certain degree of randomness involved in the decision-making process.

For well-thought-out decisions, I would say that we are almost 100% free -- the degree of randomness involved in the decision-making process is insignificant.

Wink
Post 26 Aug 2017, 12:29
Furs



Joined: 04 Mar 2016
Posts: 1421
YONG wrote:
I, being a member of the human race, do not want to see the demise of the ruling species of the lonely planet. That's why I am doing my part to warn forum members of the potential danger of unconstrained AI, as some of them may -- or will -- be involved in the development of such "supreme" beings.

My standpoint has nothing to do with whether the human race is special or not.
Nah, we were talking about AIs getting "human rights" and stuff like that. You implied they can't have emotions and other "human qualities", which doesn't make sense to me if humans aren't "special" (I mean, special in the sense of such unique abilities). Wasn't talking about the human race as a whole -- but about a human vs AI (as individuals).
Post 26 Aug 2017, 13:00
YONG



Joined: 16 Mar 2005
Posts: 8000
Location: 22° 15' N | 114° 10' E
Furs wrote:
Nah, we were talking about AIs getting "human rights" and stuff like that. You implied they can't have emotions and other "human qualities", which doesn't make sense to me if humans aren't "special" (I mean, special in the sense of such unique abilities). Wasn't talking about the human race as a whole -- but about a human vs AI (as individuals).
On the "special" argument, I was referring to whether the human race -- or any intelligent life-form, for that matter -- held any special place in the Milky Way, the observable universe, or the entire universe.

Of course, AIs can mimic human emotions and other human qualities. However, whether AIs -- or any self-learning machines, for that matter -- deserve human rights is another story -- a highly controversial one, indeed.

Wink
Post 26 Aug 2017, 13:27
sleepsleep



Joined: 05 Oct 2006
Posts: 8273
YONG wrote:
sleepsleep wrote:
i couldn't choose not to breathe if i want to continue living
If, one day, you choose not to breathe, please convert all your assets to Bitcoins and send those digital dollars to one of the forum members first. Thanks!

BTW, do I need a digital wallet (or something like that) to store Bitcoins? Rolling Eyes

Wink

i got a tip lately: there are giant diamonds available in uranus and neptune, please consult your forwarder on transportation and import tax, Laughing

for a digital wallet, you could try downloading Electrum,
you'd better prepare such a wallet now, because you might receive some crypto coins from the us, europe and probably anywhere, Wink
Post 26 Aug 2017, 23:12

Copyright © 1999-2019, Tomasz Grysztar.

Powered by rwasa.