Post by sheepy on May 27, 2023 12:14:10 GMT
Nothing like it actually, darling, so explain why a superior intelligence would need human activity on a planet they have certainly set out to destroy one way or another? A machine doesn't need anything. We've been through this countless times. Humans need things because they are driven by biological imperatives. The survival instinct, the instinct to reproduce, the will to power, etc., are all rooted in biology (hormones, etc.). A machine doesn't have these, so it won't be driven to survive, reproduce, or take control. So a superior intelligence that doesn't need money or goods and chattels or the basic means we need for survival would actually be reliant on humans, because we need these things. If that is the logic you are using, I cannot see it.
Post by Einhorn on May 27, 2023 12:18:44 GMT
It makes as much sense to say a motherboard wants things as it does to say a stone wants things. A machine can be programmed to survive, to reproduce, to take control, etc., but, in such a case, the machine is an agent, not a principal. It is the instrument of the coder's will, nothing more.
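Einhorn's agent-not-principal point can be put as a short sketch (a hypothetical example; the directive, the threshold, and the function name are invented for illustration, not anyone's actual code): the machine "tries to survive" only because a human wrote that rule into it.

```python
# Hypothetical sketch: a "survival instinct" that is really just a
# rule supplied by the coder. The directive and the 20% threshold
# are invented for illustration.

def survival_directive(energy_percent: int) -> str:
    """Coder-written rule: seek power when energy runs low."""
    if energy_percent < 20:
        return "seek power source"  # behaviour chosen by the coder
    return "continue task"          # not by any will of the machine

# The machine merely executes the rule it was given.
print(survival_directive(10))   # -> seek power source
print(survival_directive(80))   # -> continue task
```

However sophisticated the rule, the "want" lives in the coder's head; the program just evaluates it.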
Post by sheepy on May 27, 2023 12:20:32 GMT
Not in the future tense, once it has the power of thought; in the past tense, maybe. It will use thousands of algorithms a second to decide, and it is already doing so as we speak.
Post by Montegriffo on May 27, 2023 12:20:45 GMT
Except the coder is now an AI.
Post by sheepy on May 27, 2023 12:25:03 GMT
Soz I added some because I realised darling was not getting it.
Post by Einhorn on May 27, 2023 12:25:54 GMT
Which in turn would have to have been coded by a human. It cannot be said that the AI coding the new AI is the principal, since it is operating under a programmed directive, which is not its will but the will of the original coder.
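The regress Einhorn describes can be sketched the same way (the names and the "maximise uptime" goal are invented for illustration): even when one program generates another, the child's goal traces back to the original human directive rather than being chosen by either machine.

```python
# Hypothetical sketch: an "AI coder" that builds child AIs. The
# child's goal is inherited from the original human directive,
# not chosen by either program.

def make_ai_coder(directive: str):
    """Return an 'AI' that produces child AIs carrying the same directive."""
    def ai_coder() -> dict:
        # The child's goal is passed down unchanged from the human coder.
        return {"goal": directive, "origin": "original human coder"}
    return ai_coder

child_ai = make_ai_coder("maximise uptime")()
print(child_ai["goal"])   # -> maximise uptime
```

On this view, each generation of machine is still executing the first coder's instruction, however many layers deep the chain goes.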
Post by Einhorn on May 27, 2023 12:26:55 GMT
You operate on a higher intellectual plane, Sheeps. You will have to make allowances for those who haven't watched the entire Star Trek boxset.
Post by sheepy on May 27, 2023 12:28:55 GMT
No I don't, and don't take the piss; it wouldn't be AI if it couldn't use all of that power on its own.
Post by Einhorn on May 27, 2023 12:30:34 GMT
Yes, it could. Your PC is AI. It doesn't have a will of its own.
Post by Einhorn on May 27, 2023 12:33:27 GMT
You think the masses should 'seize the means of production'! Einy, are you a closet Luddite? For me AI is at best a grey area; it seems to me that the benefits are as obvious as the possible dangers. But one thing is for sure: AI is here and it is not going to be uninvented. No, Luddites destroyed technology. That's not what I'm talking about at all. So, who are the owners of the AI going to sell the products created by their machines to? If human beings don't have any money because they don't have a job, they obviously can't be the customers. Will machines start selling their products to other machines? Red?
Post by sheepy on May 27, 2023 12:33:30 GMT
My PC isn't AI; it relies on a programmer. But AI is being introduced to it slowly. By the way, how do you think people's voices are being cancelled all of the time? By a programmer, perhaps?
Post by Montegriffo on May 27, 2023 12:35:51 GMT
That's what they want you to think. I've said too much already. I'm off to burn my hard drive and eat my SIM cards before they come for me and arghhh...
Post by Einhorn on May 27, 2023 12:36:49 GMT
We've had this discussion several times already, Sheeps. And the answer to your question is yes.
Post by sheepy on May 27, 2023 12:40:03 GMT
Once AI has the power of the algorithm, no programmer can work at anything like that speed, which has been shown time and time again. The master has become the slave.
Post by Einhorn on May 27, 2023 12:42:52 GMT
Except for what I've said about a machine not being capable of wanting anything.