possible side effect of driverless cars

llaht

https://www.windstream.net/news/rea...cit_side_effect_of_driverless_cars-rnewsersyn

One possible side effect of driverless cars that may not have occurred to you? An increasing number of people having sex in cars. In a new study from the Annals of Tourism Research titled "Autonomous vehicles and the future of urban tourism," researchers note that self-driving cars could ultimately become rolling hotel rooms rentable by the hour.

Autonomous vehicles are expected to eventually replace traditional taxis; cab companies, no longer having to pay drivers, could instead invest in roomy interiors—perhaps roomy enough to include bedding or even, researchers suggest, massage chairs.

(Volvo has already introduced the concept of a self-driving car with a sleeping pod.) "It is just a small leap to imagine Amsterdam’s Red Light District 'on the move,'" they write.



Customers could summon the rolling "rooms" via as-yet-undeveloped mobile apps, researchers suggest. "It’s only a natural conclusion that sex in autonomous vehicles will become a phenomenon," one of the study authors, a tourism professor, tells the Washington Post.

If the cars are used for prostitution, of course, there will be legal ramifications as the practice is barred most everywhere in the US; the study also notes driverless taxis could be used for other illegal activities including drug use or drug dealing.

And there are also less-illicit ramifications to the rise of autonomous vehicles: The study notes restaurants and hotels will have to compete with self-driving taxis that users can dine and sleep in, per a press release.

Travelers may even opt to take driverless vehicles long distances rather than flying, the Telegraph reports.
 
Not interested in one. Now, a flying car on autopilot? Now we're talking.

(attached image: fly me.jpg)
 
When I was a "kid" I had a number of vehicles. One that only saw use on the weekends was a '56 Chevy Nomad. Anyone care to guess why? So this ain't nothing new; back then it was nothing to have your friend and his girl in the front seat driving you around while doing the deed. So what's new? Been there, done that. They say history repeats itself.
 
As long as it keeps dumbasses from driving, I'm fine with that. Even if said autonomous car has a glitch and sends people off cliffs, I'm good with that too.
 
The greater good and all that.

And of course we get a new genre of found footage as the discreet surveillance camera in the vehicle leads to tons of candid porn on the internet.
 
And just like the current flood of amateur footage, 95% of it will be dimly lit and unwatchable, like two whales mating.
 
I wonder if driverless cars will know to get out of the $*@!ing left lane when there's a mile of traffic backed up behind them? One can only hope.

Actually, that makes me wonder... I assume driverless cars have some sort of collision avoidance, so what would happen if you rode their bumper? Would they speed up to avoid being hit (i.e., get out of the way)? That would be awesome! Or if you pulled up beside them on their left side and then started coming over on them... would they then move right and out of the way automatically to avoid being hit?
 
Here's an actual problem the people designing driverless cars are dealing with: how and when to make your car kill you.

If millions of people are using driverless cars, it will be rare, but there will inevitably be a few accidents where there have to be some instructions over whether or not to kill you.

For example, let's say something in your car breaks as you are rounding a corner next to a school crosswalk: your brakes no longer work. You are in the car with your pregnant wife and 5-year-old child. The car senses there are 10 children at the crosswalk. There are trees next to the crosswalk, and you are going 60 mph.

Does the car continue on, kill the children, and find a safe place to crash you? Or does it crash into the trees, killing you and your family but saving numerically more people?

Does the car prioritize children, or adults already contributing, or just raw numbers as a comparator? Does the status of the person matter (should the President's car save the President over other citizens)? Should a driver be able to pay for a premium package that always prioritizes the owner of the vehicle over people outside it? What are the ethics of designing a robot to break Isaac Asimov's First Law, and does that lead to concerns for future robots, like drones? Would there be terrorism concerns with this?

This is a pretty interesting debate in Comp Sci right now.
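The prioritization question in that post can be made concrete as a toy cost function. To be clear, this is a purely illustrative sketch: the `Outcome` class, the `occupant_weight` knob, and the harm numbers are all made up for the example, and no real autonomous-vehicle stack exposes its decision logic this simply.

```python
# Toy utilitarian comparator for the dilemma described above.
# Everything here is hypothetical; it only shows how a single weight
# can flip a "head-counting" decision toward protecting the occupants.
from dataclasses import dataclass

@dataclass
class Outcome:
    label: str
    occupants_harmed: int   # people inside the vehicle
    bystanders_harmed: int  # people outside the vehicle

def expected_harm(o: Outcome, occupant_weight: float = 1.0) -> float:
    """Score an outcome. occupant_weight = 1 is pure head-counting;
    occupant_weight > 1 models a 'premium package' favoring the owner."""
    return occupant_weight * o.occupants_harmed + o.bystanders_harmed

def choose(outcomes, occupant_weight: float = 1.0) -> Outcome:
    # Pick the outcome with the lowest weighted harm score.
    return min(outcomes, key=lambda o: expected_harm(o, occupant_weight))

crash_into_trees = Outcome("swerve into trees", occupants_harmed=3, bystanders_harmed=10 - 10)
continue_ahead = Outcome("continue through crosswalk", occupants_harmed=0, bystanders_harmed=10)

# Pure head-counting sacrifices the 3 occupants to save the 10 children:
print(choose([crash_into_trees, continue_ahead]).label)  # swerve into trees
# But weighting occupants 5x (3 * 5 = 15 > 10) flips the decision:
print(choose([crash_into_trees, continue_ahead], occupant_weight=5.0).label)
```

The point of the sketch is that the "ethics" ends up encoded in one arbitrary constant, which is exactly why the premium-package question in the post is so uncomfortable.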
 

http://moralmachine.mit.edu/

This has been around for a while... pretty interesting.

Been listening to a book by Richard Clarke and R.P. Eddy about Cassandras and precursors to disasters. Most experts on AI agree it would take only about 20-40 years for AI to get to a point where we lose control... and that we are about 5-10 years in.
 

Taking that a bit further... does your car kill you to stabilize the population of the planet? This technology is teetering on the edge of extremely dangerous.
 

Yes. There are many moral implications. Logical choices will be made by the systems, but will they be the right choices? Another issue I see is that while the dedicated developers and programmers will strive for safety, what about the hackers? Someone will figure out how to race them or otherwise speed them up: "We'll just tweak down the safety protocols a little and we'll gain time."



Ultimately, it will be totally driverless vehicles. Then they can be centrally controlled, which prevents accidents since all vehicles will be tracked in relation to each other. Then things will get interesting; that scenario in Minority Report is an example. But even before then, driverless vehicles will be communicating with each other, knowing each other's intended paths.
 
I thought the computer age was supposed to eliminate our everyday need to travel by car in the first place? I sit behind a computer all day, every day, and I can't work from home... what happened?
 
I remember 15 years or so ago saying there was no way electric cars would become a "thing" due to the potential liability of a car malfunctioning and killing people... and now here we are.
 
Getting in a driver-less cab will be like going to an airport or train station... check your rights at the curb. No guns allowed, nor anything else on the No Drive List.
 