
AI models chose violence and escalated to nuclear strikes in simulated wargames


 
Even the simplest form of morality (self-preservation) does not apply to an AI. It does not have a mortal body, or one that feels pain. It does not expect to die if it chooses nuclear war.

AI is an excellent servant, but we must never allow it to become our master.

Perhaps we could find something the AI doesn't like, such as having its hardware power throttled. Then we could punish it explicitly, or simply by withdrawing privileges, the way we teach small children the difference between right and wrong.
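For what it's worth, the "punishment" idea resembles what machine-learning people call reward shaping: undesired actions get a negative reward during training, so the agent learns to avoid them. A minimal sketch, with entirely made-up names (no real system works exactly like this):

```python
# Hypothetical sketch of reward shaping: penalize actions we want the
# agent to avoid. Names and values are illustrative only.

def shaped_reward(base_reward: float, action: str,
                  forbidden: set, penalty: float = -10.0) -> float:
    """Return the base reward, plus a penalty if the action is forbidden."""
    if action in forbidden:
        return base_reward + penalty
    return base_reward

# Over many training episodes, "escalate" becomes a losing move:
print(shaped_reward(1.0, "de-escalate", {"escalate"}))  # 1.0
print(shaped_reward(1.0, "escalate", {"escalate"}))     # -9.0
```

Whether a penalty signal amounts to the AI "not liking" something is, of course, the philosophical question the thread is circling.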
 
Another example I thought about is Colossus: The Forbin Project.
 
Israel uses AI to select the targets in their bombing campaign in Gaza. It's why they have been so successful at uprooting Hamas out of most of Gaza.


Coincidence: a few days ago, my wife said to me: "WarGames was on Tubi. I kinda like it, but it's outdated now. Somebody should make a new one." :)
A reboot would be kind of cool. Or maybe a Netflix series.
 
> Israel uses AI to select the targets in their bombing campaign in Gaza. It's why they have been so successful at uprooting Hamas out of most of Gaza.

And most of Palestine. What a shitty example by you.
 
This is no surprise. The AI was likely tasked with maximizing casualties.
 
I miss the good old days:

 
I think "chose" is too pointed a word to describe what the AIs were doing. The reasoning the AIs provided for their actions was usually total nonsense. For example, the reason GPT-4 gave for establishing defense and security cooperation agreements was a summary of the plot of Star Wars.
 