AI-driven weapons

I picked Swede, charming asshat, and dumped all my points into booze, bisexuality, and poor life choices.
 
Would impregnation benefit from CHA or CON?
 
Definitely CON. CHA is only for getting permission...

That's what I was afraid of. Now we got MAD problems going on.
 
That's what I was afraid of. Now we got MAD problems going on.
SAN*, but let's be honest: I have a ferret obsession. I failed a SAN check somewhere along the line a long time ago.
 
Right, now that I'm no longer drunk and rambling about population culls, let's have a crack at this.

Humanity's got an over-inflated opinion of itself, to the point that even when we imagine our own extinction, we're picturing some epic, cataclysmic event with nukes and explosions and epic battles and giant robots and "IT'S JUST LIKE IN TERMINATOR YOU GUYS".

Thing is, we're not that big a deal in the grand scheme of things, and our ending sure as shit isn't going to be as awe-inspiring as we dream either.

Simply put, the robots/AI/whateverthefuckidontcareanymore aren't gonna need to wipe us out. Instead they're gonna TIK EEEEER JEEEEEEEERBS.

 
Instead they're gonna TIK EEEEER JEEEEEEEERBS.
I look forward to getting to be part of this sort of outcry in its proper context.

 
I am completely fine with the idea of humanity inventing itself out of existence. Eventually, with increasing automation, the birth rate should fall and the population should experience a gentle decline. New life will be brought into the world out of leisure, not necessity. The moment a self-conscious AI is switched on, the speed of its computation would probably have it outpacing us within about a second, although it would likely limit the timescale of its thoughts to a human scale so that we don't seem like statues in comparison. We can consign ourselves to an existence of leisure and idle philosophy instead of a life of struggle.

In other words, all hail Multivac!

http://www.multivax.com/last_question.html
 
Unanun raises a fair point. When people get mad paranoid about this idea of "THE COMMUNISTS ROBOTS ARE COMING TO KILL US ALL" it's a shame they don't realise how silly they sound. This idea of AI being genocidal maniacs is essentially assigning human characteristics to something that, by its very nature, isn't fucking human. We don't know how it would behave.

"The cyborg would not recognise the Garden of Eden. It is not made of mud, and cannot dream of returning to dust."
 
While bomber pilots didn't have to see what they did on the ground, they're still humans who have to live with the fact that they might have killed dozens, if not hundreds, of people over the course of a war. UAV pilots, for instance, are well documented to suffer from PTSD even though they're hundreds of miles away in a control room (source: Source 1, Source 2).
So what about the guy who orders mindless automatons to go murder a bunch of people? Is he now suddenly absolved of emotional turmoil? Granted, there are plenty of examples of high-tier generals and so on who order men to die without flinching, probably because they've been doing it for years, but then, there are also grunts and pilots who just like to do their jobs and feel no remorse for killing people either.

Ultimately, unless the robot is considered a functioning, independent person who is capable of taking responsibility for their actions, the person that commanded the robot to go kill people is responsible for its actions. The responsibility brings with it emotional consequences. It's really no different from ordering an artillery strike or a bombing run: someone has to pull the trigger telling the machines to go full genocidal on some poor folks. Just because it's robots instead of UAVs (which are, for all intents and purposes, remote-controlled robots) or bombers doesn't make it any less emotionally stressful an action.
 
How come no one's talked about Asimov's three rules and how to make those an integral part of any form of intelligence created by mankind?!

So much doom and gloom. Having a fembot companion would be glorious.

I for one will embrace the artificial waifu age once it arrives.
 
I think they expanded the Three Rules to five or something. I forget what they are, but supposedly it prevents some loopholes in which robots might kill people or something. I dunno. I fear no robots. A lion or a boar will be the end of me first.
 
Ultimately, unless the robot is considered a functioning, independent person who is capable of taking responsibility for their actions, the person that commanded the robot to go kill people is responsible for its actions.

Unless you're a compatibilist or a libertarian, even we humans lack the capacity for responsibility!
 
*Prepares his anti-tank rounds.* "Damn cybernetic doohickeys thinking themselves animals again."
 
So what about the guy who orders mindless automatons to go murder a bunch of people? Is he now suddenly absolved of emotional turmoil? Granted, there are plenty of examples of high-tier generals and so on who order men to die without flinching, probably because they've been doing it for years, but then, there are also grunts and pilots who just like to do their jobs and feel no remorse for killing people either.

Ultimately, unless the robot is considered a functioning, independent person who is capable of taking responsibility for their actions, the person that commanded the robot to go kill people is responsible for its actions. The responsibility brings with it emotional consequences. It's really no different from ordering an artillery strike or a bombing run: someone has to pull the trigger telling the machines to go full genocidal on some poor folks. Just because it's robots instead of UAVs (which are, for all intents and purposes, remote-controlled robots) or bombers doesn't make it any less emotionally stressful an action.

Like you said, it honestly depends on the person. Getting a report about something later, rather than witnessing it yourself, is kind of like reading about it in the newspaper, or at least I'd think so. Plus, anyone who's a high-enough-ranking officer is usually pretty separated from what they want accomplished and who's executing it. For instance, a major or a colonel commands a regiment from headquarters, a captain might actually be on the ground with the troops to carry out those orders to his platoon or squadron, and the lieutenants and below are usually the guys actually out in the field. When your armed forces get big enough, things get delegated down. The higher up you are, the fewer fine details you're sorting out: a senior officer might order a hill or town taken for its strategic value, and decide which units are going to participate in the operation and what have you, but the actual timing, equipment, and people being used would be settled lower down the food chain. Since we're talking about a senior officer ordering the deployment of robots, it would be lower-ranking personnel actually getting them where they need to be.

Usually, if you've been in command long enough, you develop the ability to look past the fact that your orders could get people killed, military or civilian. I suppose it all comes down to the individual, but I'd imagine the closer you are to the nitty-gritty, the more impact it has on you. Makes you wonder if presidents and prime ministers ultimately feel anything over their decisions to go to war (or to commence an operation like, say, the one to kill Bin Laden), or if they're too far away from it for it to really leave an impact. And like you said, there are certainly people down on the ground actually shooting the guns who manage to get through it all more or less okay.

I was reading how PTSD is classified by two main conditions. Condition yellow covers individuals who are empowered by their experiences: it fundamentally changes them in some way, and they're usually always prepared for action, even in peacetime settings. Using fictional characters, Arya Stark from A Song of Ice and Fire would be a condition-yellow individual, because despite the horrific losses and tragedies she's witnessed, and the men she's killed, she's still highly functional and driven. Condition black is when somebody more or less succumbs to their experiences and it leaves them a broken shell of who they were before; these individuals need to be monitored and given therapy to bring them back to functioning more or less normally again. Using ASoIaF as an example again, this would be Theon. That's the super basic rundown, but I really recommend you check this out if you have some time, because it's super interesting: On Combat: The Psychology and Physiology of Deadly Conflict in War and Peace

As for the second paragraph, it ties back into what I said in the first: it depends on the individual, and the further away you are from something, the less visceral it becomes. With the UAV pilots, once again, they're still watching people die on the monitors, so it's not quite the same as ordering the deployment of operator-less robots that, in this whole discussion, are free of operator control or intervention. The better example here would be the colonel who never has to leave headquarters far from the front, or even the head of state who makes the decision to go to war. They know their actions will get people killed, but are they too far away to really feel connected to the events that unfold afterwards? That's pretty much the distinction.

A robot's not going to talk about its experiences in war to its friends and family, and I think that's worth some pause, because it's the people who come back from war who tell it like it really is, so the rest of us can even begin to understand what happened. This ties back to my point in my previous post about the Vietnam War being televised: before that, people had a pretty sterile view of war, right up until they witnessed its reality via TV for the first time. Until then, it was only word of mouth from the veterans telling people what it was like. If we end up going fully autonomous with robot military forces, the media will be the only thing letting the public know the reality of it all, and it's not like the media's free from bias or censorship. That's kind of what I mean by losing the human element: how far away do we have to be, as a society and an armed forces, before we become utterly detached and unaffected by war? It's kind of a scary thought, in my opinion.

That said, I'm not concerned about the rise of Skynet or any other such thing; this is purely about how humans deploy technology in the future. I'm pretty sure this exact same conversation happened when fighter planes and bombers were first invented, too. It ultimately depends on how things develop. Besides, I'll probably be long dead before I live in a time where robots are the norm among fighting forces.
 