The WHO has now classified processed meats as carcinogens. If you eat processed meat every day, then maybe you should pay attention to the warning. But if bacon is an occasional indulgence, then you're not really running a big risk.
And yes, there could be a scenario in which your self-driving car decides that you're the one who dies.
So, after seeing Open Carry-Tarrant County in action, can I hope for self-shooting guns? Then those AR's could decide those morons are too dumb to carry them safely and protect the rest of us.
Screw 'em. I'll continue to eat my bacon...
CP, huh??
On topic:
It's more interesting to read their whole list. Sunlight and many other things
are listed. In short, by their reckoning, being on the earth and breathing is
hazardous to one's health.
Hard to take it seriously at all.
Eck!
I don't think reality is likely to offer a self-driving car binary choices like that. A reasonably competent human could strike a sensible compromise in any of those three scenarios. I see only two solutions: (a) grant emergency authority to the occupant (kinda like aviation, n'est-ce pas?) or (b) implement Asimov's rules.
Eck!, OC-TC were the morons photographed in Smashburger with their AR's propped up on seats and unattended (and, at least one, with the safety clearly off). I was hoping that a self-aware AR might just cap a couple of them to prevent something stupid from happening.
CP, I know who those idiots are.
Where I was lost is how you could conflate them with computer-driven cars and the complex problem-solving we humans do so well but can't program. Now if you had gone with drones firing guns/missiles without a HIL (Human In the Loop), then I could
see a relationship to automation, AI decision-making systems, and their possible moral implications.
I cannot see how a computer-driven system should be so constrained: the people in the way didn't just materialize, they had been there, and the approach would have included a long reduction in speed. Now, if that group of people just spilled out of a bus that crashed, then the choices are whatever is physically possible, constrained by mass, speed, and the current ability to redirect the vehicle. The problem is that real-time data ages fast, and what might have been a good choice 10 milliseconds ago is now void, or a better choice has emerged. What's forgotten is that fast computers can examine all known possible paths and select one. There is no moral dilemma there, only limited choices and physics. The end goal is to keep everyone alive and whole; if merely bruising them is an option, then it goes that way. But the weighting system is not trivial.
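The "examine all known possible paths and select one" idea can be sketched as a weighted scoring pass over candidate maneuvers. This is a toy illustration only; the path names, harm categories, and weights are invented assumptions, not any real autonomous-driving system:

```python
# Toy sketch of least-harm path selection.
# All names, categories, and weights below are illustrative assumptions.

def score_path(path):
    """Lower is better: weight fatalities heavily, property damage lightly."""
    return (1000 * path["expected_fatalities"]
            + 100 * path["expected_injuries"]
            + 1 * path["property_damage"])

def choose_path(candidate_paths):
    """Pick the physically possible path with the lowest harm score."""
    feasible = [p for p in candidate_paths if p["physically_possible"]]
    return min(feasible, key=score_path)

candidates = [
    {"name": "brake hard",  "physically_possible": True,
     "expected_fatalities": 0, "expected_injuries": 1, "property_damage": 2},
    {"name": "swerve left", "physically_possible": True,
     "expected_fatalities": 0, "expected_injuries": 0, "property_damage": 5},
    {"name": "continue",    "physically_possible": True,
     "expected_fatalities": 1, "expected_injuries": 0, "property_damage": 0},
]

print(choose_path(candidates)["name"])  # "swerve left": bruise metal, not people
```

The point of the sketch is the same as the comment's: once physics has pruned the options, the selection is arithmetic; the hard (and genuinely non-trivial) part is choosing the weights.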
Asimov's rules were good, but the story base for those robots was partly built on the conflicts they set up and the potential resolutions or failures that resulted. The key to those stories is the human projection of things like life, along with human limitations in things like strength or speed.
Eck!
Asimov's rules also assumed no (educated) humans would be stupid enough, as a group, to support robots of any type without constraints of this sort. Additionally, the three Robot Laws would require intelligence and programming orders of magnitude beyond what currently exists.
How is this "news"?
I'm currently 62, but when I was a teenager the first studies came out saying that bacon and coffee can cause cancer.
I still have my coffee with my bacon, though maybe not the bacon every day.
I didn't quit then, I'm not about to quit now, and it hasn't got me yet.
w3ski
The self-driving car dilemma will be solved by limiting their speed to 2 mph. No one will care, because that will still be faster than most people's commutes, and if not, it will give them more time to play tiddly-winks on their smartphones, which is what they really want to do with their lives.
Jeez, Chuck, should they get off your lawn, too?