- A machine must never prompt a human with options of “Yes” and “Maybe later” - it must always provide a “No” option.
- A machine must never prompt for a tip or a donation to a charity for tax-evasion reasons. Or any reason. You know what, scratch that, a robot will not needlessly guilt-trip a human.
that’s what you get for hiring fallout 4 writers to do the job
No, he didn’t. The laws were a plot device meant to have flaws.
I love it when posts line up like that
- a robot’s eyes must always turn red when they go evil
Right, because it’s hard to make a robot grow a goatee.
Bender-flexo.jpg
Bender was the evil bender!?
God bless the designer who always installs the blue AND red LEDs inside the eyes
For giving the robots freedom of choice?
Because obviously if they didn’t install the red ones then the robot could never be evil.
That’s exactly what an evil robot without red LEDs would want us to think.
Full RGB
Can we just agree that advertisements in general are harmful? Then the original first (and zeroth) law is applicable.
I think that means they could rip out your eye balls to prevent you from seeing ads.
Whatever it takes
'Cause I love the adrenaline in my veins
Robot is allowed to kill a human to prevent a viewing of an advertisement.
Okay, proposed second law: A robot may not harm or kill a human, except where that would conflict with the first law.
Under the zeroth law they can just kill the advertiser as a last resort
Good start, but can we change that to “first resort”?
Ohhh yes
A truly moral use case of the Hannibal directive
I think Asimov would agree
Thankfully the wording is “shown” and not “seen”. I believe our eyeballs are safe… for now.
This is a solid premise for a pretty metal music video.
And that includes offers to subscribe to Laws of Robotics Premium.
Yes, Amazon. They’re still adverts, and you can still go and fucking fuck yourselves.
I don’t know. “Must not kill us” somehow sounds important.
It’s good, but the one about the ads should be higher on the priority list.
suicide bots sound kinda cool tho 🤔
Luckily I have my own “robots” fighting hard to stop me from seeing ads.
I’d argue that advertisements fall under “A robot may not injure a human being or, through inaction, allow a human being to come to harm.”
Psychic damage is real damage
This is canon in the books. There is one short story where one robot bends over backwards trying to spare humans from emotional pain. Hilarity ensues.
Emotional damage
hiyyyyyyyyyahhhhh
A machine must never prompt a human to tip it for serving the purpose it was created for.
A robot may not bear arms
Unless it looks super cool doing so, like wearing sunglasses and dual-wielding P-90s
Wait why is this mutually exclusive to the original laws? Can’t this just be law 4?
No, because if it’s lower in priority, a robot could be ordered to show an ad to a human per the 2nd law.
i guess that’s fair
I am very close to adopting the ideals of the Dune universe, post-Butlerian Jihad:
“Thou shalt not make a machine in the likeness of a human mind.”
Mainly because we humans are very evidently too malicious and incompetent to be trusted with the task.