A few days ago I read about the trolley problem; it was mentioned in a comic on smbc-comics.com. I hadn't heard of it before, but I had considered similar situations when thinking about driverless cars.
When I thought about it, the idea of social class came to mind. Would you be less likely to throw the switch if the five people on the current track appeared to be homeless, or possibly drug users, and the single person on the other track looked like a railroad worker? It could also matter how long you have to make your decision.
My earlier thoughts involved driverless cars attempting to save their occupants. Given a choice, should the car run over pedestrians to save the people in the vehicle? For example, if the car were unable to stop in time, should it go off a cliff, killing the occupants, or run over some number of pedestrians and stop safely?
I think current driverless cars would go off a cliff to avoid colliding with the people, but probably only because they have never encountered a cliff before. I don't know whether they have any preset knowledge about which situations could destroy the car, or whether it all comes from learning while driving.
I'm not sure where I heard this, and it wasn't worded this way, but it helps when thinking about coding. Basically, any progra...
Some Android applications use space on your Google Drive to store data. You can't see this data by browsing Drive the normal way. You ...
I recently migrated a site from Magento 1 to Magento 2. The product image thumbnails work differently in Magento 2. In Magento 2 they use ...