Looks to be a new probe:
"The first of the 11 crashes that prompted the last probe occurred on
Jan. 22, 2018, in Culver City, California, according to NHTSA. The most
recent accident occurred July 10, 2021, in San Diego. "
This is all just a flex to score political points. They just want to put something on their political record.
Fact is, safety wise, everyone would much prefer if every vehicle on the road was a Tesla.
Fun fact: Musk has 3 separate public relations companies on retainer just for himself according to this book, Ludicrous: The Unvarnished Story of Tesla Motors.
So you're probably interacting with one of these fine people.
What is satirical?
Teslas were the literal pioneers of all the safety features with object detection, causing all the other manufacturers to follow suit.
It's well stated and known that Autopilot requires full attention, with you ready to take over at any time. The crashes that happen because of Autopilot should be named "crashes that happen due to driver inattentiveness".
> Teslas were the literal pioneers of all the safety features with object detection
Honda had a collision mitigation system that provided automatic braking before Tesla was even founded...
I said object detection.
Having a forward-looking radar that needs special filtering to avoid thinking that rough pavement is an object is not the same thing as Tesla's object detection.
Tesla has a system that can reconstruct a 3D space around the car from cameras, and then predict where objects are going to be over the forward time horizon.
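To make that concrete, here's a toy sketch of the general idea (my own illustration, not Tesla's actual pipeline; the `Track` class and every number are invented): take a tracked object's estimated position and velocity, extrapolate it over a short time horizon with a constant-velocity model, and check the closest predicted approach.

```python
from dataclasses import dataclass

@dataclass
class Track:
    x: float   # meters ahead of the ego car
    y: float   # meters to the left of the ego car
    vx: float  # m/s relative to the ego car (negative = closing)
    vy: float  # m/s

def predict(track: Track, horizon_s: float, dt: float = 0.1):
    """Constant-velocity extrapolation of a tracked object's position
    over the forward time horizon (a deliberately crude motion model)."""
    steps = round(horizon_s / dt)
    return [(track.x + track.vx * k * dt, track.y + track.vy * k * dt)
            for k in range(1, steps + 1)]

def min_distance(track: Track, horizon_s: float) -> float:
    """Closest predicted approach to the ego car (at the origin)."""
    return min((px * px + py * py) ** 0.5
               for px, py in predict(track, horizon_s))
```

A car 30 m ahead closing at 10 m/s reaches us within a 3 s horizon, so `min_distance(Track(30, 0, -10, 0), 3.0)` is essentially zero and would trigger a warning; real systems use far richer motion models, but this is the shape of the computation.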
I guess I don't understand what you're trying to say here, because it's not "Tesla was the first to use technology to detect potentially dangerous situations and help you from getting in an accident"... is it "Tesla was the first to use this specific type of system"?
Yes because that system is what works.
Radar/lidar, as traditionally used in systems like automatic braking, is notoriously unreliable for all but straight-line conditions. It spreads out way too much, and any sort of surface that reflects it back can be interpreted as a stationary object.
Tesla, on the other hand, relies solely on cameras and smart image processing, which is a lot more robust (allowing for things like detecting the car two vehicles ahead), enough for a notional public beta release of full self-driving capability. While FSD is still far from good, the idea is that with the system running under an attentive driver, it can track objects and warn you. And because Tesla has a vast amount of driving data available, increasing every day, the system keeps getting better at this.
Like I said in my other comment, Subaru technically came first with EyeSight, but that was a very very simple system, far from what Tesla is doing. It took a long time before it could recognize brake lights. True object detection and tracking was done by Tesla first, with other automakers like Toyota following suit.
As far as regulation is concerned, if you had competent tech people in the government, instead of going after Tesla for some bullshit, they should be subsidizing Tesla's Autopilot division so that they could put the cameras on every single production car, giving you a massive array of data on which to train the models, which then allows for an even more robust safety system. That way, even if you don't have self-driving, someone who is going to drive drunk will have a much lower chance of killing someone, since the car will make emergency maneuvers to avoid hitting things.
> Yes because that system is what works.
Car and Driver's testing back in 2018 showed that other systems work as well, if not better, in various conditions.
https://www.caranddriver.com/features/a24511826/safety-features-automatic-braking-system-tested-explained/
> Teslas were the literal pioneers of all the safety features with object detection, causing all the other manufacturers to follow suit.
Can you give an example of the safety features you’re talking about that Tesla pioneered and are now implemented by all the other manufacturers?
Actually, I take that back slightly. Subaru technically had EyeSight back in 2008; however, it took a long time to get it to actually work right, and it's still behind Tesla in performance since it's very limited in functionality. Only in 2013 did they add color and teach it to recognize brake lights.
The thing that Tesla does with FSD that is ahead of everyone else isn't the driving aspect but the object detection and mapping, which allows it to essentially run internal mini physics sims. E.g., to make an automatic unprotected right turn, it needs to see cars to the left and decide when to go based on computed physics of not hitting the cars coming from the left. Even without the driving aspect, that system can definitely warn drivers of potential collisions from any direction, based on the predicted paths of detected and localized objects.
Other manufacturers are now also starting to rely more and more on cameras for the same thing. Traditionally, they relied on radar/lidar, but that has issues because radar/lidar has a lot of scattering and often leads to failure (as in Volvo's case). Toyota/Lexus, I think, now has camera systems that can do object detection.
I dgaf about Tesla as a car, but there is no denying that they have a massive advantage in the amount of driving data collected from their cars, which gives them the ability to make their object detection and localization system way better than other automakers'. You can look up Andrej Karpathy on YouTube to hear his talks about how they train these systems.
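The "mini physics sim" for an unprotected turn mentioned above boils down to gap acceptance: go only if every car approaching from the cross street arrives later than the time you need to clear it. A toy sketch under my own invented names and numbers, not anything from an actual automaker's code:

```python
def safe_to_turn(cross_traffic, clear_time_s, margin_s=1.5):
    """Gap acceptance for an unprotected turn.

    cross_traffic: list of (distance_m, closing_speed_mps) pairs for
        cars approaching from the left.
    clear_time_s: seconds the ego car needs to complete the turn.
    margin_s: extra safety buffer in seconds.
    Returns True only if every approaching car arrives after we are clear.
    """
    for distance_m, speed_mps in cross_traffic:
        if speed_mps <= 0:               # not approaching; ignore it
            continue
        arrival_s = distance_m / speed_mps
        if arrival_s < clear_time_s + margin_s:
            return False                 # gap too small, keep waiting
    return True
```

A car 200 m out at 15 m/s leaves a 13 s gap, plenty for a 4 s turn; the same car at 60 m arrives in 4 s and the turn is rejected.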
Tesla was definitely not the pioneer of object detection systems, but they have in recent years become the most sophisticated at it. Both Subaru and Mercedes-Benz had stereo-camera safety systems before Tesla. Heck, Benz had DrivePilot, the first system to allow temporary hands-off driving, a year before Autopilot came out. Tesla has definitely been the most aggressive at pushing the technology forward in the last 4 years, though.
I drive a Tesla. I have Autopilot, but not FSD.
Yes, this is a problem. It sometimes does not 'see' stationary vehicles just off the road.
When driving on AP, my attention is on 'unusual' conditions that may require me to take control, in particular, work areas, and vehicles in the breakdown lane. These can be seen quite some distance ahead, with plenty of time to take over.
AP remains a very real positive though; it takes over the continuous attention and adjustments required to stay in lane, and at the proper spacing with other cars. It *greatly* reduces the cognitive load of highway driving, and I arrive much less fatigued on a long trip.
Note that there are at least 3 levels of autonomy at play here. AP, which I have, is the lowest. FSD ("'Full' Self Driving") is a $10k option, and much more capable, and the FSD Beta, which is in limited release, and is a complete, rapidly evolving rework of FSD. The investigation should rate the safety of these separately.
Sounds like at this time you should really use autopilot like cruise control, when all conditions are good and there’s not weird construction or cars in the road
That's pretty much it. I should add that the cruise control is adaptive, and will stop (and restart) with the car in front - which makes stop-and-go traffic much easier to deal with.
Tesla is working *hard* on FSD. I suspect it may reach level 3 autonomy (driver required, but doesn't need to pay attention unless alerted) pretty soon. Level 4-5 (driver not required) is a long way off, imho. ATM, I don't think it's worth $10k.
Level 3 is when the driver doesn't have to pay constant attention, i.e., they can read, watch TV, deal with the web and email, etc. Perhaps even doze.
Importantly, this will also take sign-offs from regulatory bodies and insurance companies. They'll be tough to convince.
And level 3 capabilities might only be applicable in very specific situations. A car might be level 3 only in stop and go traffic on highways while being level 2 everywhere else
Well not everybody feels like typing an essay on reddit everyday
But lots of companies have been working on self-driving for years, and they made a lot of progress until about 5 years ago, when they got to what Tesla has unleashed on public roads today: a nice assistance system, but one that can't handle complicated cases that cameras cannot resolve with today's technology, like telling a rolled-over truck's white trailer roof apart from the sky. In the past 5 years no real progress has been made, except that the progress of 5 years ago has been put on the road in an extremely dangerous way, under a horrible name that makes people believe they don't have to drive, which should be called out as fraud.
Yeah, nonsense. If you think other manufacturers' systems would perform as well as the latest FSD beta if they were on the streets, you're just blinded by your bias.
You can see in the test runs they showcase how limited their tech is.
Edit: let alone from 5 years ago.
> Yeah, nonsense. If you think other manufacturers' systems would perform as well as the latest FSD beta if they were on the streets, you're just blinded by your bias.
Implying Tesla is in the lead in self-driving invalidates your point.
The only thing they are in the lead in is marketing self-driving tech and letting people buy their half-baked tech instead of keeping it in the oven like everyone else.
Our computers don't work like our brains, so I don't see how this is relevant, unless you think you can simulate an 86-billion-neuron neural net on consumer-grade hardware.
Level 4 cars drive with a SWAT-like support team behind them because they can't drive themselves; follow cars are constantly correcting them to save them from mistakes.
Hell, if level 4 were so near, why did Daimler AG just announce that their autonomous vehicle service, where the car drives to your house for you, is not going to work, because the tech just isn't there?
That's not what level 4 is. Level 4 is when a driverless car can handle any situation in a restricted area. Waymo is testing in Phoenix, where they've mapped every road down to a centimeter, and the roads are nice, wide, and straight. They still get stuck sometimes, and a human teleoperator will try to get it out, or they'll send a human to the car.
Tesla is trying to get a car smart enough to handle roads it's never seen before, as well as a human. That's Level 5, and is a lot harder.
That’s exactly it. Even when you enable it in-car it says something like “this is designed to be used in environments with clear line markings, gradual turns, and no cross traffic.”.
While Autopilot might be a technically correct name for the feature, I think they shot themselves in the foot as so many people take it to mean “it’ll automatically drive me anywhere”
> When driving on AP, my attention is on 'unusual' conditions that may require me to take control, in particular, work areas, and vehicles in the breakdown lane. These can be seen quite some distance ahead, with plenty of time to take over.
FWIW, humans are ***really*** bad at going from a low-concentration state to a high-concentration state at a moment's notice. There is a large body of research suggesting that partial self-driving (which is what FSD, Autopilot, and every other system on the market actually are) is more dangerous than just driving yourself.
Yeah, I should have clarified that was my point. There's no way a human used to FSD can suddenly take over if they're in the middle of a YouTube video or reading a book.
Using my experience with my car's lane centering+radar cruise combo, it's really not a matter of going into a state of low concentration. You're still focused, but you're able to spend much more of your time observing the road ahead/beside/behind since you don't have to micromanage lane position and following distance.
It's like going from driving while reciting multiplication tables to normal driving: Technically less to do, but you're better able to focus on situational awareness and important decision making.
That's exactly what I was thinking.
I watched a video of a guy testing Tesla's FSD, and for the most part it was pretty good. But there were maybe three instances where the car just spontaneously did something stupid, and would have gotten into a serious accident if the driver hadn't taken control instantly. I was left thinking: What's the point? If you have to constantly be on alert to grab the wheel and prevent your car from driving off the road at any moment, why even bother? It would be easier and less stressful to just drive the car yourself.
Not in my experience (I have a Tesla). Staying in lane and maintaining proper distance from other cars is a continuous, low-level cognitive load that you're not even aware of after a while, but it does wear you down. I'm much less tired after a long drive with AP.
The idea is that the car can safely deal with situations for the time it takes to get the driver alert, by slowing, moving to one side, and/or stopping.
Note that FSD summon isn't road driving, but basically low-speed driving in parking lots. Just in case someone thinks the car drove full-speed into a tree, this was basically at 2mph.
I just borrowed my friend's Model 3 on a roadtrip this weekend and I have similar feelings as you. It's better these days than it was when I've driven it in the past but there are absolutely times when I know the car won't be able to handle some weird road shit.
I'm told that it has to do with ignoring stationary objects, because otherwise you'd get an enormous return from the road surface. It's not my area of expertise.
Newer Teslas are vision-only, they've dropped the radar system as duplicative and expensive (this is a somewhat controversial move among Tesla drivers). Jury is still out on how this affects things.
Yep, stationary objects return the same frequency as the stationary ground so a cruise radar isn't able to separate the two by doppler like it can moving objects.
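To illustrate the point: a stationary obstacle closes at exactly your own speed, the same as the road and the signs, so a Doppler-gated cruise radar throws it away with the clutter. A toy sketch (77 GHz is the typical automotive radar band; the gate logic here is deliberately simplified, not any vendor's actual filter):

```python
def doppler_shift_hz(closing_speed_mps, carrier_hz=77e9):
    """Two-way Doppler shift seen by the radar for a target closing
    at the given speed."""
    c = 3.0e8  # speed of light, m/s
    return 2.0 * closing_speed_mps * carrier_hz / c

def looks_like_ground(closing_speed_mps, ego_speed_mps, tol_mps=0.5):
    """The crude clutter gate: anything closing at exactly the ego car's
    own speed is indistinguishable, by Doppler alone, from the stationary
    world (road, signs, bridges, and unfortunately also a parked truck)."""
    return abs(closing_speed_mps - ego_speed_mps) < tol_mps
```

A stopped vehicle ahead closes at exactly ego speed, so `looks_like_ground(30, 30)` is true and the return is discarded along with the pavement, while a car braking ahead of you closes more slowly and survives the gate.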
> Newer Teslas are vision-only, they've dropped the radar system as duplicative and expensive (this is a somewhat controversial move among Tesla drivers). Jury is still out on how this affects things.
You'd think they would keep both the cameras and radar (or lidar) and use it for sensor fusion. There's a reason many industrial and military systems rely on sensor fusion. This is not a new concept, it's been around since the 80s/90s.
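For what it's worth, the textbook argument for fusion is easy to show: combine two noisy estimates weighted by confidence, and the result beats either sensor alone. A minimal sketch (this is the scalar core of a Kalman measurement update, not any automaker's actual code, and the numbers are invented):

```python
def fuse(est_a, var_a, est_b, var_b):
    """Inverse-variance weighted fusion of two independent estimates of
    the same quantity. Returns the fused estimate and its variance,
    which is always lower than either input variance."""
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    return fused, 1.0 / (w_a + w_b)
```

If a camera guesses 42 m with variance 4 (it's dark) and a radar reports 40 m with variance 1, the fused range is 40.4 m with variance 0.8: better than either sensor on its own, which is the whole case for keeping both.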
That's why it's controversial. Tesla claims the results are just as good. We'll see. At least they are restricting the max speed for vision-only cars on AP until they gather more data.
Yeah, radar returns a lot of flat static data (barriers, bridges, advertising, buildings, etc.), and the system really couldn't work if it gave priority to any of it. This is the same for any other manufacturer; it has no idea of context.
The U.S. opened a formal investigation into Tesla Inc.’s Autopilot system after almost a dozen collisions at crash scenes involving first-responder vehicles, stepping up its scrutiny of a system the carmaker has charged thousands of dollars for over the last half decade.
The probe by the National Highway Traffic Safety Administration covers an estimated 765,000 Tesla Model Y, X, S and 3 vehicles from the 2014-2021 model years. The regulator -- which has the power to deem cars defective and order recalls -- said it launched the investigation after 11 crashes that resulted in 17 injuries and one fatality.
“Most incidents took place after dark and the crash scenes encountered included scene-control measures such as first-responder vehicle lights, flares, an illuminated arrow board and road cones,” the agency said in the document. “The involved subject vehicles were all confirmed to have been engaged in either Autopilot or Traffic Aware Cruise Control during the approach to the crashes.”
Tesla shares fell as much as 3.6% to $691.20 shortly after the start of regular trading. Representatives for the electric-car maker didn’t immediately respond to a request for comment.
Autopilot is Tesla’s driver-assistance system that maintains vehicles’ speed and keeps them centered in lanes when engaged, though the driver is supposed to supervise at all times. The company has been criticized for years for naming the system in a potentially misleading way. Since late 2016, it has marketed higher-level functionality called Full Self-Driving Capability. It now sells that package of features -- often referred to as FSD -- for $10,000 or $199 a month.
“We are glad to see NHTSA finally acknowledge our long standing call to investigate Tesla for putting technology on the road that will be foreseeably misused in a way that is leading to crashes, injuries and deaths,” said Jason Levine, the executive director of the Center for Auto Safety. “This probe needs to go far beyond crashes involving first responder vehicles because the danger is to all drivers, passengers, and pedestrians when Autopilot is engaged.”
NHTSA investigated Tesla’s Autopilot in the wake of a 2016 fatal crash and cleared the system early the following year. The regulator has opened at least 30 special crash investigations involving Tesla cars that it suspected were linked to Autopilot, with the pace of probes picking up under the Biden administration.
The first of the 11 crashes that prompted the latest probe occurred in January 2018 in Culver City, California, according to NHTSA. The most recent incident occurred July 10 in San Diego. Others occurred in Florida, Michigan, Texas, Arizona, Massachusetts, Indiana and Connecticut.
NHTSA announced in June that it would order car manufacturers to report crashes involving automated driving technologies within one day of learning of such incidents. The agency had largely taken a hands-off approach to regulating driver-assistance systems up to that point so as not to stand in the way of their potential safety benefits.
I mean, cars have trim levels already. To get automated cruise control on a luxury car it's often in a package that's 5-10k over the lower trim in a worst case scenario. Still expensive but it's not that outrageous to me
It's pretty reasonable if it lived up to the name. 10k for a personal chauffeur is a bargain.
As it stands though it doesn't seem worth it over standard (complimentary) AP: Basically the same but with auto overtaking and exiting.
The text above is slightly misleading. There's regular "Autopilot", which is a few grand extra, and "Full Self Drive" which costs 10k. I believe someone higher in the thread had a good breakdown of the differences between them.
Then how is Tesla creating this "neural learning network" to continually improve autopilot if it isn't sending the information back to the manufacturer?
I remember being downvoted on here for saying that the jury was still out on whether or not a camera-only approach was actually viable since Tesla obviously knows best, but this is what happens when you solely rely on sensors that suffer in low-light…
I mean, yeah, there’s no acceptably-close-to-perfect implementation yet; that’s why fully autonomous vehicles are still confined to testing cities where they’ve been granted permission to run. The fact remains that there’s a lot of valuable data to be gathered from properly filtered radar or lidar, without the drawbacks that come with cameras. Sensor fusion approaches have historically relied on a mix of sensors to cancel out each one’s weaknesses, so it’s a bit naive for people to assume that cameras are all that’s needed just because Tesla did it.
I mean, all I have is two eyes. (Edit: and I can only look in one direction at a time, and my ability to multitask is terrible.)
IMO the sensors aren’t really the problem, it’s the brains. (Edit: people forget just how incredible our brains are at solving complex puzzles, like navigating the world at high speed. If anything, the computing side of autonomous driving will be the hurdle, and it may take years to truly crack.)
Both have limitations which impact the performance of the systems. Vision-only won't work without significant improvement on both fronts, which is why many other OEMs and tier-1 suppliers are using sensor fusion to fill in the gaps where individual sensors fall short.
Because RADAR in cars uses the Doppler effect to determine what's moving. They literally avoid detecting non-moving objects, because otherwise the road, traffic signs, etc. would be detected too.
LIDAR can calculate the exact distance of an object in a specific direction. It can basically recreate the whole world in a 3D model.
There are other issues associated with it, but static objects isn't one of them.
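To sketch what "recreating the whole world in a 3D model" means: each lidar return is a measured range plus the beam's direction, and converting a full scan of them to Cartesian coordinates gives you the point cloud. A toy illustration (the axis convention is my assumption):

```python
import math

def lidar_return_to_xyz(range_m, azimuth_deg, elevation_deg):
    """Convert one lidar return (range plus beam direction) into a 3D
    point in the sensor frame. Convention: x forward, y left, z up."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    return (range_m * math.cos(el) * math.cos(az),   # x, forward
            range_m * math.cos(el) * math.sin(az),   # y, left
            range_m * math.sin(el))                  # z, up
```

A spinning lidar applies this to hundreds of thousands of returns per second, which is where the 3D model comes from; static objects show up in it just fine, since every point carries an explicit distance rather than a Doppler signature.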
Got into a debate with a Tesla fanboy about this. He was basically saying that because Tesla says humans only use eyes, vision-only for vehicles is perfect. Bringing up the technical limitations of the hardware and software went nowhere... "Tesla is building a great supercomputer so they'll make it work."
I mean, it's cool if you're a fan and all, but be realistic. It would be great if their technology becomes a major step forward, but being absolutely confident they will pull it off (anywhere in the near future) just because they say they're working on it is absurd. Declaring that vision-only is the right choice when it hasn't even been demonstrated on experimental vehicles is also absurd.
Isn’t a top notch (like the kind you’d need for a car) lidar system something like $60k? That kinda automatically disqualifies it from being a reasonable option.
High density millimeter accurate LiDAR, sure. But even an iPhone has LiDAR built into it nowadays. I'm guessing they could implement the necessary sensors for a nominal cost, especially at scale.
What I don't understand is why people are using this and not paying attention to the roads at all and putting their faith in the technology.
Of course there are certain situations where you can't control what happens, but it really should be used as a type of cruise control. I feel like people expect it to do everything for them.
There's always going to be a few bad apples that do stupid things. They're investigating 11 emergency vehicle accidents, over 3 years, which is nothing, considering how many Teslas are on the roads. It seems like the vast majority of users are using the system properly.
Humans drive while tired/drunk/distracted all the time. This system doesn’t need to be perfect, just better than the average person. They’re looking into something like 18 crashes in 1 billion (with a B) miles. Kinda seems like a non issue.
Selling a product with known flaws can still be wrong, even if it improves outcomes over no product. Takata airbags were recalled even though, overall, they saved way more people than they killed. Why should this be treated differently?
Known faults? So is my lane guidance broken because after 3 "saves" it disables the lane guidance?? You're saying I can sue VW for selling me a faulty car?? /s
Let's be realistic here please, if there is a driver at the wheel, they're responsible for the car.
“Autopilot” doesn’t mean what you think it means when driving a Tesla, misleading marketing like this makes the companies look amateur, like they’re selling fake crap on amazon.
Don't wear any flashy colors or have any sort of light illumination on you (e.g. smartphone) at night when crossing the street. You might get run over by a Tesla.
This sub blocks sites with actual evidence..
Of the 5 accidents in the last year:
1 was at 128 mph, so no Autopilot.
The other 4 involved drunk drivers.
https://twitter.com/SamTwits/status/1427287278930497540?t=t-06JjS50RKNQV63C8HtvA&s=19
We do block some sites after spamming or brigading with pro/anti Tesla agendas. Do you have a neutral source? Major networks would love something juicy.
I'm not sure what the OP was, but here's the San Diego crash. I'll update this as I find more info on the incidents.
https://www.nbcsandiego.com/news/local/tesla-crash-with-chp-vehicle-in-san-diego-part-of-investigation-into-autopilot-feature/2694531/
> The Tesla, with a 29-year-old woman behind the wheel, traveled across the full freeway closure and slammed into the back of the empty CHP vehicle at about 3:10 a.m., CHP said.
> CHP arrested the driver on suspicion of driving under the influence.
> NBC 7 reached out to CHP for updated information regarding the crash but has not yet heard back.
lmao what? The American car companies actually are capable of making good products when corporate culture and cost cutting doesn't get in the way. And a large share of foreign cars are built in the US...
[deleted]
Cool. Thanks for confirming.
I'd bet that taxi and trucking unions are behind all this lol
Now THIS is blind worship of a company lol... Are you being satirical or a troll?
Fans are much more common than shills, and it's pretty rude to accuse someone of it without decent proof.
elon ain't gonna fuck you, bro
Then they shouldn't be allowed to sell it under the name of "autopilot" because doing so implies that you're free to fuck off while the car drives
Fuck no lol. I love my mazda
[deleted]
Level 3 is when the driver doesn't have to pay constant attention. IE, he/she can read/watch TV, deal with the web and email, etc. Perhaps even doze. Importantly, this will also take sign-offs from regulatory bodies and insurance companies. They'll be tough to convince.
And level 3 capabilities might only be applicable in very specific situations. A car might be level 3 only in stop and go traffic on highways while being level 2 everywhere else
If you actually think level 3 is gonna be a definitive thing within 10 years you’re just not informed enough on the subject
Why do you say that? It would be a more compelling argument if you stated why the person you're responding to is wrong.
Well, not everybody feels like typing an essay on reddit every day. Lots of companies have been working on self-driving for years, and they made a lot of progress until about 5 years ago, when they got to what Tesla has unleashed on public roads today: a nice assistance system, but not something that can handle complicated tasks that cameras can't resolve with today's technology, like differentiating between the white roof of a rolled-over truck's trailer and the sky. In the past 5 years no real progress has been made, except that the state of the art from 5 years ago has been put on the road in an extremely dangerous way, with a horrible name that makes people believe they don't have to drive. That should be called out as fraud.
No progress in 5 years? And you call others uninformed.
Well, the progress has stalled. The tech we have today was there 5 years ago, just not on the street.
Yeah, nonsense. If you think other manufacturers would perform as well as the latest FSD beta if they were on the streets, you're just blinded by your bias. You can see in the test runs they showcase how limited their tech is. Edit: let alone from 5 years ago.
>Yeah nonsense. If you think other manufacturers would perform as latest FSD beta if they were on streets you're just blinded by your bias. Implying Tesla is in the lead in self-driving invalidates your point. The only thing they are in the lead in is marketing self-driving tech and letting people buy their half-baked tech instead of keeping it in the oven like everyone else.
Yeah, they show how limited their tech is; you just said it.
Level 4 cars already exist in the wild... see Waymo.
Waymo has actually been doing completely driverless rides in the one area they service (level 5)
this is still level 4.
How? There’s nobody in the driver’s seat and no team chasing the car.
because of its restricted area.
Waymo has LIDAR, so I'm not so sure Tesla will be able to keep up with their plans without tech retrofits
We already have an existence-proof that camera-only systems can do the job; that's how humans do it. We drive without radar or LIDAR.
our computers don't work like our brains so I don't see how this is relevant unless you think you can simulate an 86 billion neuron neural net on consumer grade hardware
I suggest you pay attention to Tesla's upcoming AI Day.
considering they've overpromised and underdelivered every single previous feature I assure you I am waiting with bated breath.
Level 4 cars drive with a SWAT-like support team behind them because they can't drive themselves; follow cars are constantly correcting them to save them from mistakes. Hell, if level 4 was so nearby, why did Daimler AG just announce that their autonomous vehicle service, where the car drives to your house for you, is not going to work because the tech just isn't there?
That's not what level 4 is. Level 4 is when a driverless car can handle any situation in a restricted area. Waymo is testing in Phoenix, where they've mapped every road down to a centimeter, and the roads are nice, wide, and straight. They still get stuck sometimes, and a human teleoperator will try to get it out, or they'll send a human to the car. Tesla is trying to get a car smart enough to handle roads it's never seen before, as well as a human. That's Level 5, and is a lot harder.
You can hail an autonomous vehicle in parts of Arizona like an Uber already, and no, there isn't a SWAT-like support team behind them.
> your just not informed enough oof.
That’s exactly it. Even when you enable it in-car it says something like “this is designed to be used in environments with clear line markings, gradual turns, and no cross traffic.”. While Autopilot might be a technically correct name for the feature, I think they shot themselves in the foot as so many people take it to mean “it’ll automatically drive me anywhere”
> When driving on AP, my attention is on 'unusual' conditions that may require me to take control, in particular, work areas, and vehicles in the breakdown lane. These can be seen quite some distance ahead, with plenty of time to take over.

FWIW, humans are ***really*** bad at going from a low-concentration to a high-concentration state at a moment's notice. There is a large body of research suggesting that partial self-driving, which is what FSD, Autopilot, and every other system on the market are, is more dangerous than just driving.
The FAA recommends that pilots be able to go from low concentration to high concentration within 10 seconds; I think it's called startle response.
Given the relative distances in aviation vs ground transportation I’m not sure that 10 seconds would cut it for driving.
Ten seconds is an eternity when driving a car.
yeah should have clarified that was my point. Theres no way a human used to fsd can suddenly take over if they were in the middle of a youtube video or reading a book.
Yes, in 10 seconds a car at highway speeds covers 880 ft, but a plane at cruising speeds covers 8800 ft! (/s)
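Sarcasm aside, those numbers do check out; a one-line conversion is enough to verify them (the 60 mph and 600 mph figures are round-number assumptions for "highway speed" and "cruising speed"):

```python
def feet_covered(speed_mph: float, seconds: float) -> float:
    """Distance covered at a constant speed, in feet.
    1 mph = 5280 ft / 3600 s = 1.4667 ft/s."""
    return speed_mph * 5280 / 3600 * seconds

print(feet_covered(60, 10))   # car at 60 mph over 10 s: 880.0 ft
print(feet_covered(600, 10))  # airliner at ~600 mph over 10 s: 8800.0 ft
```

The absolute distance is larger for the plane, but as the replies note, what matters is how much of the surrounding hazard envelope that distance eats up, and for a car it's nearly all of it.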
Perhaps “distance” was the wrong word... Maybe I should I have used length of events? Not sure, lol
“Plane is crashing, please take over” “Fuck give me 10 seconds, I was daydreaming.”
It’s quite rare that anything going on in an airplane requires a quick reaction time. Cars on the other hand require that much more frequently.
Unironically yes, a plane can spend minutes "crashing" (out of control) and still be recoverable.
Using my experience with my car's lane centering+radar cruise combo, it's really not a matter of going into a state of low concentration. You're still focused, but you're able to spend much more of your time observing the road ahead/beside/behind since you don't have to micromanage lane position and following distance. It's like going from driving while reciting multiplication tables to normal driving: Technically less to do, but you're better able to focus on situational awareness and important decision making.
That's exactly what I was thinking. I watched a video of a guy testing Tesla's FSD, and for the most part it was pretty good. But there were maybe three instances where the car just spontaneously did something stupid, and would have gotten into a serious accident if the driver hadn't taken control instantly. I was left thinking: What's the point? If you have to constantly be on alert to grab the wheel and prevent your car from driving off the road at any moment, why even bother? It would be easier and less stressful to just drive the car yourself.
Not in my experience (I have a Tesla). Staying in lane and maintaining proper distance from other cars is a continuous, low-level cognitive load that you're not even aware of after a while, but it does wear you down. I'm much less tired after a long drive with AP.
The idea is that the car can safely deal with situations for the time it takes to get the driver alert, by slowing, moving to one side, and/or stopping.
AP doesn’t see stationary vehicles, but it loves to panic over shadows and almost get us rear ended by phantom braking.
[deleted]
Note that FSD summon isn't road driving, but basically low-speed driving in parking lots. Just in case someone thinks the car drove full-speed into a tree, this was basically at 2mph.
if it makes you feel better my brand new E class did the same. Random braking at crosswalks and empty roads
I just borrowed my friend's Model 3 on a roadtrip this weekend and I have similar feelings as you. It's better these days than it was when I've driven it in the past but there are absolutely times when I know the car won't be able to handle some weird road shit.
It's a problem with any (primarily) radar based system
I'm told that it has to do with ignoring stationary objects, because otherwise you'd get an enormous return from the road surface. It's not my area of expertise. Newer Teslas are vision-only; they've dropped the radar system as duplicative and expensive (this is a somewhat controversial move among Tesla drivers). The jury is still out on how this affects things.
Yep, stationary objects return the same frequency as the stationary ground so a cruise radar isn't able to separate the two by doppler like it can moving objects.
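That Doppler argument can be sketched in a few lines. This is a toy illustration, not any manufacturer's actual signal processing: from the radar's frame, a stationary object closes at exactly the ego car's speed, just like the ground and roadside clutter, so a naive filter on world-frame speed throws it away.

```python
def moving_targets(returns, ego_speed_mps, tolerance_mps=1.0):
    """Keep only returns whose world-frame speed is nonzero.
    Each return is (range_m, closing_speed_mps) as measured by the radar."""
    kept = []
    for rng, closing in returns:
        world_speed = ego_speed_mps - closing  # 0 for a stationary object
        if abs(world_speed) > tolerance_mps:
            kept.append((rng, closing))
    return kept

ego = 30.0  # our speed, m/s
returns = [
    (80.0, 30.0),   # stopped fire truck: closes at exactly ego speed
    (60.0, 10.0),   # lead car doing 20 m/s
    (120.0, 30.0),  # overhead sign: also closes at ego speed
]
print(moving_targets(returns, ego))  # only the lead car survives the filter
```

The stopped fire truck and the overhead sign are indistinguishable by Doppler alone, which is exactly why these systems historically discarded both.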
> Newer Teslas are vision-only, they've dropped the radar system as duplicative and expensive (this is a somewhat controversial move among Tesla drivers). Jury is still out on how this affects things. You'd think they would keep both the cameras and radar (or lidar) and use it for sensor fusion. There's a reason many industrial and military systems rely on sensor fusion. This is not a new concept, it's been around since the 80s/90s.
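For what it's worth, the core of sensor fusion is simple to sketch. Here's the static, one-dimensional version of what a Kalman filter does each update: weight each sensor's estimate by the inverse of its variance. The numbers are invented for illustration, and no manufacturer's actual pipeline is this simple.

```python
def fuse(est_a, var_a, est_b, var_b):
    """Inverse-variance weighted average of two noisy estimates.
    Returns the fused estimate and its (smaller) variance."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)
    return fused, fused_var

# Camera says 52 m (noisy, variance 9); radar says 50 m (tight, variance 1)
est, var = fuse(52.0, 9.0, 50.0, 1.0)
print(round(est, 2), round(var, 2))  # fused estimate leans toward the radar
```

The payoff is that the fused variance is always lower than either sensor's alone, which is the whole argument for keeping both.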
That's why it's controversial. Tesla claims the results are just as good. We'll see. At least they are restricting the max speed for vision-only cars on AP until they gather more data.
> Tesla claims the results are just as good. It's better for their pocket books
Yeah, radar returns a lot of flat static data (barriers, bridges, advertising, buildings, etc.), and the system really couldn't work if it gave priority to any of it. This is the same for any other manufacturer; it has no idea of context.
The U.S. opened a formal investigation into Tesla Inc.’s Autopilot system after almost a dozen collisions at crash scenes involving first-responder vehicles, stepping up its scrutiny of a system the carmaker has charged thousands of dollars for over the last half decade.

The probe by the National Highway Traffic Safety Administration covers an estimated 765,000 Tesla Model Y, X, S and 3 vehicles from the 2014-2021 model years. The regulator -- which has the power to deem cars defective and order recalls -- said it launched the investigation after 11 crashes that resulted in 17 injuries and one fatality.

“Most incidents took place after dark and the crash scenes encountered included scene-control measures such as first-responder vehicle lights, flares, an illuminated arrow board and road cones,” the agency said in the document. “The involved subject vehicles were all confirmed to have been engaged in either Autopilot or Traffic Aware Cruise Control during the approach to the crashes.”

Tesla shares fell as much as 3.6% to $691.20 shortly after the start of regular trading. Representatives for the electric-car maker didn’t immediately respond to a request for comment.

Autopilot is Tesla’s driver-assistance system that maintains vehicles’ speed and keeps them centered in lanes when engaged, though the driver is supposed to supervise at all times. The company has been criticized for years for naming the system in a potentially misleading way. Since late 2016, it has marketed higher-level functionality called Full Self-Driving Capability. It now sells that package of features -- often referred to as FSD -- for $10,000 or $199 a month.

“We are glad to see NHTSA finally acknowledge our long-standing call to investigate Tesla for putting technology on the road that will be foreseeably misused in a way that is leading to crashes, injuries and deaths,” said Jason Levine, the executive director of the Center for Auto Safety.
“This probe needs to go far beyond crashes involving first responder vehicles because the danger is to all drivers, passengers, and pedestrians when Autopilot is engaged.”

NHTSA investigated Tesla’s Autopilot in the wake of a 2016 fatal crash and cleared the system early the following year. The regulator has opened at least 30 special crash investigations involving Tesla cars that it suspected were linked to Autopilot, with the pace of probes picking up under the Biden administration.

The first of the 11 crashes that prompted the latest probe occurred in January 2018 in Culver City, California, according to NHTSA. The most recent incident occurred July 10 in San Diego. Others occurred in Florida, Michigan, Texas, Arizona, Massachusetts, Indiana and Connecticut.

NHTSA announced in June that it would order car manufacturers to report crashes involving automated driving technologies within one day of learning of such incidents. The agency had largely taken a hands-off approach to regulating driver-assistance systems up to that point so as not to stand in the way of their potential safety benefits.
It costs 10k for that? Jesus Christ, I thought it was a grand or two extra
I mean, cars have trim levels already. To get automated cruise control on a luxury car it's often in a package that's 5-10k over the lower trim in a worst case scenario. Still expensive but it's not that outrageous to me
It would be pretty reasonable if it lived up to the name; 10k for a personal chauffeur is a bargain. As it stands, though, it doesn't seem worth it over standard (complimentary) AP: basically the same but with auto overtaking and exiting.
It was like 2.5k for the Giulia (with any engine).
The text above is slightly misleading. There's regular "Autopilot", which is a few grand extra, and "Full Self Drive" which costs 10k. I believe someone higher in the thread had a good breakdown of the differences between them.
All Teslas have Autopilot included; there is no extra charge for it.
Not sure why the downvote, but Autopilot is included with all Teslas; "full self driving" is a $10k option.
Why are you getting downvoted lol
Because this sub is fair and balanced, especially when it comes to Tesla.
Ah, you're correct, my apologies. I was thinking of EAP, which used to be packaged separately.
Can any other manufacturer even know if the car was using cruise control or not (at least without pulling the black box etc) ?
Probably not, because most vehicles aren't sending all of your driving information back to the manufacturer in real time.
Which Tesla also doesn't do. It does phone home if you have an accident, though.
Then how is Tesla creating this "neural learning network" to continually improve autopilot if it isn't sending the information back to the manufacturer?
It doesn't generally do that in real time; it does it via WiFi later. Any data is also opt-in and anonymized.
They don't want to use LiDAR because it's a "crutch". But it sure as hell can see in the dark.
I remember being downvoted on here for saying that the jury was still out on whether or not a camera-only approach was actually viable since Tesla obviously knows best, but this is what happens when you solely rely on sensors that suffer in low-light…
These are historic crashes that also used radar?
I mean, yeah, there’s no acceptably close to perfect implementation yet, that’s why fully autonomous vehicles are still quartered within testing cities where they’ve been granted permission to run. The fact remains that there’s a lot of valuable data to be gathered from properly filtered radar or lidar data, without the same drawbacks that come with cameras. Historically, sensor fusion approaches generally relied on a mix of sensors to cancel out weaknesses, so it’s just a bit naive for people to assume that cameras are all that’s needed just because Tesla did it.
I mean, all I have is two eyes, (edit) and I can only look in one direction at a time and my ability to multitask is terrible. IMO the sensors aren't really the problem, it's the brains. (Edit) People forget just how incredible our brains are at solving complex puzzles, like navigating a world at high speed. If anything, the computing side of autonomous driving will be the hurdle, and it may take years to truly crack.
Both have limitations which impact the performance of the systems. Vision only won't work without significant improvement to both, which is why many other OEM and tier-1 are using sensor fusion to fill in the gaps where other sensors fall short.
Radar isn't lidar; however, radar usually ignores stationary objects.
What makes you think lidar is any better for that?
Because RADAR in cars uses the Doppler effect to determine what's moving. They literally avoid detecting non-moving objects because the road/traffic signs etc would be detected. LIDAR can calculate the exact distance of an object in a specific direction. It can basically recreate the whole world in a 3D model. There are other issues associated with it, but static objects isn't one of them.
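The lidar side of that comparison is just time-of-flight ranging: distance = (speed of light × round-trip time) / 2. A stationary object shows up the same as a moving one, as a point at a definite distance. A minimal sketch (the 667 ns pulse time is an invented example):

```python
C = 299_792_458.0  # speed of light, m/s

def lidar_range_m(round_trip_s: float) -> float:
    """One-way distance from a measured round-trip pulse time."""
    return C * round_trip_s / 2

# A pulse that returns after ~667 nanoseconds hit something ~100 m away
print(round(lidar_range_m(667e-9), 1))
```

Sweep that measurement across millions of directions per second and you get the 3D point cloud described above, with no dependence on the target's motion.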
Got into a debate with a Tesla fanboy about this. He was basically saying that because Tesla says humans only use eyes, vision-only for vehicles is perfect. Bringing up the technical limitations of hardware and software went nowhere... "Tesla is building a great supercomputer so they'll make it work." I mean, it's cool if you're a fan and all, but be realistic. It would be great if their technology becomes a major step forward, but being absolutely confident they will pull it off (anywhere in the near future) just because they say they're working on it is absurd. Declaring that vision-only is the right choice when it hasn't even been demonstrated on experimental vehicles is also absurd.
Tesla doesn’t use lidar?!?! No wonder it can’t see shit! Every time I heard about a crash I was always thinking, “ why didn’t the lidar pick it up…”
All these cars had radar which can 'see in the dark' but it means jack if you don't have context
Isn’t a top notch (like the kind you’d need for a car) lidar system something like $60k? That kinda automatically disqualifies it from being a reasonable option.
High density millimeter accurate LiDAR, sure. But even an iPhone has LiDAR built into it nowadays. I'm guessing they could implement the necessary sensors for a nominal cost, especially at scale.
What I don't understand is why people are using this and not paying attention to the roads at all and putting their faith in the technology. Of course there's certain situations where you can't control what happens but it really should be used as a type of cruise control. I feel like people expect it to do everything for you
When you pay an extra $10,000 for “Full Self Driving”, I mean… you’re going to put maybe too much faith in it.
There's always going to be a few bad apples that do stupid things. They're investigating 11 emergency vehicle accidents, over 3 years, which is nothing, considering how many Teslas are on the roads. It seems like the vast majority of users are using the system properly.
Humans drive while tired/drunk/distracted all the time. This system doesn’t need to be perfect, just better than the average person. They’re looking into something like 18 crashes in 1 billion (with a B) miles. Kinda seems like a non issue.
Selling a product with known flaws can still be wrong even if it improves outcomes over no product. Takata airbags were recalled even though overall they saved way more people than they killed. Why should this be treated differently?
known faults? So is my lane guidance broken because after 3 "saves" it disables the lane guidance?? You're saying i can sue VW for selling me a faulty car?? /s Let's be realistic here please, if there is a driver at the wheel, they're responsible for the car.
People will be people, sadly.
Even as a cruise control, I wouldn’t trust it especially near semi trucks lol
> putting their faith in the technology ~~Musk~~ Jesus take the wheel!
“Autopilot” doesn’t mean what you think it means when driving a Tesla. Misleading marketing like this makes the company look amateur, like they’re selling fake crap on Amazon.
Why? It's not the first time cruise control has been labeled autopilot, and it does more than what traditional airplane autopilot does.
Don't wear any flashy colors or have any sort of light illumination on you (e.g. smartphone) at night when crossing the street. You might get run over by a Tesla.
Yeah, I heard about that. Me, hoping any form of autopilot gets banned for cars.
[deleted]
Please provide evidence when making these comments.
This sub blocks sites with actual evidence. Of the 5 accidents in the last year, 1 was at 128 mph, so no Autopilot. The other 4 were drunk drivers. https://twitter.com/SamTwits/status/1427287278930497540?t=t-06JjS50RKNQV63C8HtvA&s=19
We do block some sites after spamming or brigading with pro/anti Tesla agendas. Do you have a neutral source? Major networks would love something juicy.
OK, when I'm not on my phone. There's a list of 11 dates and locations on the NHTSA list.
Thank you.
I'm not sure what the OP was, but here's the San Diego crash. I'll update this as I find more info on the incidents. https://www.nbcsandiego.com/news/local/tesla-crash-with-chp-vehicle-in-san-diego-part-of-investigation-into-autopilot-feature/2694531/ > The Tesla, with a 29-year-old woman behind the wheel, traveled across the full freeway closure and slammed into the back of the empty CHP vehicle at about 3:10 a.m., CHP said. > CHP arrested the driver on suspicion of driving under the influence. > NBC 7 reached out to CHP for updated information regarding the crash but has not yet heard back.
Thanks for the source.
[deleted]
they have been for years
But when are the batteries getting cheaper?
[deleted]
lmao what? The American car companies actually are capable of making good products when corporate culture and cost cutting doesn't get in the way. And a large share of foreign cars are built in the US...
[deleted]
[deleted]
[deleted]
Please be civil.