Self-driving cars- the future or just another shiny toy?


    Folks,
    What do you think about autonomous vehicles? Are they going to become the only allowed form of driving in a few years, and who should decide that? It looks like all the large automotive manufacturers, BMW included, are competing to design the best technology "for the sake of the public good," while topics like affordability and security risks get neglected in the public conversation and the marketing articles.
    I am doing research on autonomous driving and its social impact, and I'd love to learn what you think. If you've got a minute to answer 8 questions about your preferences as a BMW driver and your views on autonomous driving technology, I'd be grateful :)

    The purpose of this survey is to explore BMW drivers' attitudes towards self-driving vehicles, the social importance of this technology, and how it might be regulated.


    Cheers!

    #2
    Biggest waste of R&D money in automotive history. I have a suggestion: when all of the roads, bridges, and other infrastructure are complete and maintained, then put money into driver education.

    If no one wants to drive, let's build trains!



      #3
      I agree, there are so many things that could be addressed before jumping to the next shiny toy that a non-automotive company started developing. But because it's "innovative" (which it isn't at all), everyone follows along like an obedient soldier without asking why. Thanks for the help, mate ;)



        #4
        I'm fine with all vehicles being autonomous, except mine. Nearly every day I get the shit scared out of me by some idiot staring at their phone and crossing the double yellow, headed straight for me.
        My son has the 1987 325e, 2-door, 5-speed
        I daily the 1989 325i, 4-door, 5-speed



          #5
          Yeah, I'm afraid the idiots chatting while driving, or doing God knows what else, are everywhere nowadays. Unfortunately, if we go autonomous, we need to go autonomous all the way; otherwise, it will become complete chaos. For me it would be ideal if the laws got tougher instead, with severe penalties for drivers who don't follow the rules. I wouldn't like to give up driving just because most people don't care about their own lives or the lives of others. It's like pouring water into a bucket with a hole in it: idiots won't get healed by introducing autonomous vehicles. It's all about competition and money; nobody gives a nickel about people's well-being or optimization of the roads.



            #6
            I'm surprised that no one has written an encyclopedia of examples of why this is a bad turn in the road - PUN INTENDED! Let's come up with some scenarios:

            1) If an owner loans their self-driving car to friends and there is an accident, who is liable? In other words, owner is not present - multiple passengers (who is the driver)?

            2) If someone has a party and invites 50 people, how do self-driving cars park themselves at a residence that has four spaces? Will they respond to humans giving hand directions to park in the yard?



              #7
              This downward spiral we've allowed technological advancements to achieve will be the end of free will/independent thinking as we know it. I certainly hope I'm not alive to see the day when all cars are fully autonomous (and it is coming).
              If it's got tits or tires, it's gonna cost ya!



                #8
                Originally posted by packratbimmer View Post
                I'm surprised that no one has written an encyclopedia of examples of why this is a bad turn in the road - PUN INTENDED! Let's come up with some scenarios:

                1) If an owner loans their self-driving car to friends and there is an accident, who is liable? In other words, owner is not present - multiple passengers (who is the driver)?

                2) If someone has a party and invites 50 people, how do self-driving cars park themselves at a residence that has four spaces? Will they respond to humans giving hand directions to park in the yard?

                1. Completely situational. Hell, it may be a cyclist's fault for the accident. It may be grandma stepping off the sidewalk. That's a pretty generic complaint with no way to answer it. Even if the owner is there, in the car, he's not exactly responsible if the computer decides to take a hard right into a mailbox...

                2. What's the matter with relinquishing control to a driver at some point? Who says it has to be autonomous at all times? Hell, computers could stack cars in super tight and then run an 'exodus' mode or something to move all the cars blocking you in, if need be, without having to run around the whole party looking for car owners. Sounds better to me.

                I, for one, am looking forward to self-driving cars. It'd be so much nicer and less stressful to drive if you took out random human actions. People do some dumb shit on the roadway, and if most every car is autonomous then it'd be much easier to 'drive around' them.

                I would never own one, but I can't wait for their arrival.
                84 325e - 91 325i - 92 318 touring - 91 Trans Am - 01 S4 avant - 03 S-type R - 96 F350
                Manual swap all the things!



                  #9
                  Google says there are 225 million drivers in the US. The only way for autonomous vehicles to be truly feasible is if everyone has one, because you need everyone following the same rules and not treating the shoulder as an extra passing lane. Next time you see a car with a duct-tape bumper, a flashlight for a headlight, and tires that resemble racing slicks, remember that our tax dollars will get to buy and maintain a brand-new, fully updated autonomous vehicle for that driver.



                    #10
                    Yes. But not as soon as some people would like to think.
                    2006 GMC Sierra 2500HD 4WD LBZ/Allison
                    2002 BMW M3 Alpinweiß/Black
                    1999 323i GTS2 Alpinweiß
                    1995 M3 Dakargelb/Black
                    - S50B32/S6S420G/3.91
                    1990 325is Brilliantrot/Tan
                    1989 M3 Alpinweiß/Black

                    Hers: 1996 Porsche 911 Turbo Black/Black
                    Hers: 1988 325iX Coupe Diamantschwartz/Black 5spd




                      #11
                      Agreed - I think Ford has been the most honest about this. The others are tech companies trying to push up their stock price before disappearing into the night (Uber, anyone - will they be around in 5 years' time?).

                      Real world conditions are too chaotic for robots to handle at this stage. What happens in heavy rain, snow or dust storms where the sensor signals get too noisy for the computers to handle? Not only do the cars need to be autonomous themselves, but widespread sensors also need to be installed in the roads to mark lane and highway boundaries in all conditions. Add in changes to legislation required for some of the curlier ethical situations that arise, and widespread adoption is a long way off.

                      Just for laughs, let's see some autonomous vehicles attempting the infamous elk test in Germany. Our state government here has set aside certain highways for autonomous vehicle testing, and they did a test with a stuffed kangaroo; even the relatively simple EBS systems on a lot of cars didn't work for that, and the cars hit the roo :)
                      My e30: OEM+ with M30B35



                        #12
                        I think the big issue is the idea of programming morality into these cars, often framed in terms of the trolley problem thought experiment.



                        If a person needs to make a decision between hitting a granny with a puppy or two kids on bikes, assuming there were no other choices that a reasonable person could have made, then we don't generally hold the person to a decision that was made under less-than-ideal circumstances.

                        But with self-driving cars, this decision needs to be programmed in. So, assuming the hazards are identified correctly (another issue: false positives...), who gets to decide who dies? The other scenario that comes up in discussions is whether the vehicle protects the occupants or the people outside the car, i.e. if the decision is between hitting a wall and killing the occupants, or hitting a pedestrian and killing the pedestrian while the occupants survive. The computer in the car has (or will have) the ability to identify these risks, and it needs to make a preprogrammed decision. Will some companies prioritise the occupants and other companies prioritise the pedestrian? Will one company advertise "we prioritise the occupants in our software"?
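                        Just to make that concrete, here's a toy sketch of what a preprogrammed priority rule could look like. Every name and number here is invented purely for illustration; no manufacturer has published logic like this, and real planners are far more complicated.

```python
# Hypothetical sketch of the "who does the software protect?" question.
# All option names, harm estimates, and weights are made up.

def choose_maneuver(options, occupant_weight=1.0, outsider_weight=1.0):
    """Pick the option with the lowest weighted expected harm."""
    def cost(opt):
        return (occupant_weight * opt["occupant_harm"]
                + outsider_weight * opt["outsider_harm"])
    return min(options, key=cost)

options = [
    {"name": "hit wall", "occupant_harm": 0.9, "outsider_harm": 0.0},
    {"name": "hit pedestrian", "occupant_harm": 0.1, "outsider_harm": 0.9},
]

# An "occupants first" vendor might weight occupant harm more heavily,
# a "pedestrians first" vendor the opposite:
print(choose_maneuver(options, occupant_weight=2.0)["name"])   # hit pedestrian
print(choose_maneuver(options, outsider_weight=2.0)["name"])   # hit wall
```

                        The point isn't the arithmetic: it's that someone has to pick those weights in advance, and two vendors can ship opposite answers to the same scenario.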

                        I've no doubt the technology will be here and affordable before we know it, but technology is the easy bit.
                        Last edited by e30davie; 07-12-2019, 04:30 AM.



                          #13
                          Originally posted by e30davie View Post
                          I think the big issue is the idea of programming morality into these cars.
                          All valid points, and thank you for bringing it up in a way people might have to stop and think about. I would guess this will all get settled in the courts in the not-too-distant future. While that is happening (2-4 years?), the tech will march on.

                          The stats of self-driving car miles vs. human-driven miles will accumulate. It will be clear as day that you're more likely to be killed by a human driver than by a computer driver.

                          But news stories of someone being killed by an autonomous Uber/truck/Tesla/GM/Volvo will dominate the news cycle. ("If it bleeds, it leads" is an old adage in news reporting.)

                          The only question is how it will be sold to the public. It's a hard sell. People still think flying is dangerous, when it's the safest form of travel in mankind's history by a factor of 10,000 or so.

                          But you HAVE TO FLY - it's not a choice in modern life for most. But you don't have to vote (literally or with your wallet) for autonomous cars. So that's going to be the thing.

                          It's not about what's safer, it's about what people will accept. And people don't like to be spooked. So, if Uber can "wow" you with a driverless car before one of them kills someone in some tragic crash (Princess Diana would be an example of something that tragic - sorry to bring that up, just as a point of reference), it will get a public nod of approval.
                          Originally posted by Matt-B
                          hey does anyone know anyone who gets upset and makes electronics?



                            #14
                            I can't wait for the day when I'm on some old twisty back road and I hit the spirited driving button and lay back while my car starts hugging the turns. I wonder if they'll sell retrofit kits so our e30s can stay on the road. Maybe one day I'll have to pretend I'm texting when I pass a cop so he doesn't realize I'm actually controlling the car.



                              #15
                              Originally posted by e30davie View Post
                              I think the big issue is the idea of programming morality into these cars, often framed in terms of the trolley problem thought experiment.



                              If a person needs to make a decision between hitting a granny with a puppy or two kids on bikes, assuming there were no other choices that a reasonable person could have made, then we don't generally hold the person to a decision that was made under less-than-ideal circumstances.
                              The trolley problem is brought up about autonomous cars a lot. The big difference is that a trolley is on rails, and a car is not. There will probably be a maximum amount the car will be programmed to swerve, driven by some probability of remaining in control. If that limit is exceeded, it will likely be programmed to brake in a straight line to maximize braking performance and control of the vehicle.

                              Maximizing system performance has the potential to somewhat avoid tricky catch-22 ethical problems for these companies.
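                              A rough sketch of that "bounded swerve, otherwise brake straight" idea, with a completely made-up control model and thresholds (none of this reflects any real vehicle's software):

```python
# Hypothetical evasion planner: swerve only if the car can probably
# stay in control, otherwise brake in a straight line.
# The control-probability model below is invented for illustration.

def plan_evasion(required_swerve_deg, speed_mps, control_threshold=0.8):
    # Crude stand-in: assume control probability falls off with
    # steering angle and speed.
    p_control = max(0.0, 1.0 - (required_swerve_deg / 45.0) * (speed_mps / 30.0))
    if p_control >= control_threshold:
        return "swerve"
    return "brake straight"

print(plan_evasion(required_swerve_deg=5, speed_mps=15))    # swerve
print(plan_evasion(required_swerve_deg=30, speed_mps=25))   # brake straight
```

                              The appeal of a rule like this is exactly what's described above: the car never has to "choose who dies," it just does the physically most controllable thing.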


                              I'm honestly looking forward to them because a lot of people can't figure out how to drive. Less road rage, fewer accidents, and traffic flow improvements are all huge things I look forward to.

                              On the other hand, too many people drive already, and self-driving cars would enable more people to drive, and to drive longer distances. I know I would be willing to have a much longer work commute if I could sleep through it, so I assume that's the same for a lot of people. Pretty much the opposite of the direction we should be heading as far as traffic/pollution goes, although maybe electric cars will offset some of that.
                              Originally posted by priapism
                              My girl don't know shit, but she bakes a mean cupcake.
                              Originally posted by shameson
                              Usually it's best not to know how much money you have into your e30

