TSLA showing cracks?

04-04-2018 , 06:40 PM
Here's a recent post (among MANY) from a Tesla owner about his experience with "Autopilot"

Quote:
This evening I ran into two Autopilot failure cases in quick succession. I was curious whether Autopilot could handle the 270-degree offramp from 101W to 405S in moderate traffic. The whole turn is confined to a single lane, so I expected it to be a straightforward "lane-keeping" task, thus falling under the domain of Autopilot and not FSD. Here's how it looked on dashcam (KDLinks DX2, powered from console 12V):

You'll see that Autopilot first tried to drive me into the offramp guardrail (it either didn't see the curvature of the road, lost track of the car in front of it, or didn't recognize the lower speed limit), causing me to yank the wheel and slam the brakes. Then seconds later, getting onto the 405, it had trouble recognizing the lane lines and almost ran me into a series of guard posts. (The buzzing camera is due to its mount coming slightly loose, possibly as a result of these hijinks.)
The responses to that post show that this is considered normal.

Tesla can't even safely take highway off-ramps or notice barriers it's about to slam into when re-merging onto a highway. Meanwhile, GM and Waymo have full autonomy on any road in the city in complex conditions - obeying traffic lights, handling four-way stop signs with traffic, avoiding pedestrians and double-parked cars, coping with missing lane lines, construction zones, bikes, etc. - with one human intervention per 3,000 miles. An example from over a year ago:



That you think the research report is "paid for" and that Tesla might be ahead is ****ing ridiculous, Spurious. Do you not realize you're ridiculous here? What planet do you live on?
04-04-2018 , 06:46 PM
Here's another recent video of Tesla completely ****ing up when lanes disappear. It lacks any competent mapping of its environment and just blindly follows lanes and the car in front (and will sometimes swerve into oncoming traffic, confusing it for a car it's supposed to follow). It doesn't even warn when lanes are lost. Tesla are years behind the majors and have no viable path for going beyond level 2.



Tell me when you concede that Tesla are far behind and I'll stop. There are dozens of these videos. TMC forums are full of Tesla owners complaining about how dangerous and unsophisticated autopilot is anywhere outside of very well marked freeways.
04-04-2018 , 09:45 PM
Quote:
Originally Posted by ToothSayer
Here's another recent video of Tesla completely ****ing up when lanes disappear. It lacks any competent mapping of its environment and just blindly follows lanes and the car in front (and will sometimes swerve into oncoming traffic, confusing it for a car it's supposed to follow). It doesn't even warn when lanes are lost. Tesla are years behind the majors and have no viable path for going beyond level 2.



Tell me when you concede that Tesla are far behind and I'll stop. There are dozens of these videos. TMC forums are full of Tesla owners complaining about how dangerous and unsophisticated autopilot is anywhere outside of very well marked freeways.
Waymo and Cruise Automation also have trouble with disappearing lanes. It's probably the #1 reason for disengagements.

Also, it doesn't seem like the driver in that Twitter video was on a highway.
04-04-2018 , 10:17 PM
Quote:
Originally Posted by donfairplay
Waymo and Cruise Automation also have trouble with disappearing lanes. It's probably the #1 reason for disengagements.

Also, it doesn't seem like the driver in that Twitter video was on a highway.
Please. Tesla would disengage at least every 5 miles on the city roads that Waymo and GM drive on, and every 0.5 miles in the inner city, where Tesla has nothing at all. Tesla can't even handle the simplicity of highway off-ramps. Meanwhile, Waymo is making fabulous progress:



In November they had it down to 1 disengagement in 30K miles navigating complex and busy city traffic, often without lane markings. The idea that Tesla are anything but years behind Waymo and GM is ****ing absurd, and anyone who says otherwise should simply be mocked and ignored, because they have no clue what they're talking about.

GM Cruise has similarly declining disengagement numbers: 3 and 4 for October and November 2017. And three of those four were "other road user behaving poorly", e.g. an imminent crash that would have been the other user's fault. Disengagements caused by a genuine autonomous-driving failure are down to zero or one per 30K miles in the later months.
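Quick back-of-the-envelope in Python for anyone who wants to check the math. The ~30K miles/month figure is my assumption from the numbers above, and the own-fault split is illustrative, not official DMV data:

Code:
# Rough miles-per-disengagement from the figures quoted in this thread.
# ASSUMPTION: roughly 30,000 autonomous miles driven per month (not official data).
monthly_miles = 30_000

# (month, total disengagements, disengagements that were the AV's own fault)
# Own-fault counts are illustrative, based on the "zero or one" per month above.
reports = [
    ("Oct 2017", 3, 1),
    ("Nov 2017", 4, 1),
]

for month, total, own_fault in reports:
    print(f"{month}: one disengagement per {monthly_miles / total:,.0f} miles, "
          f"one own-fault disengagement per {monthly_miles / max(own_fault, 1):,.0f} miles")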

Last edited by ToothSayer; 04-04-2018 at 10:31 PM.
04-05-2018 , 12:12 AM
I never said Tesla is even close to the level of sophistication of Waymo and Cruise Automation.

I merely said disappearing lanes are a problem for everyone. Comma ai ran into this problem on Nevada highways. With disappearing lanes you're essentially forcing the autonomous software to guess, and putting your car into the hands of machine learning.
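To make the "guessing" concrete, here's a toy sketch of the kind of fallback decision a lane-keeping system is forced into when the lane lines vanish. Purely illustrative - this isn't any vendor's actual logic, and the thresholds are made up:

Code:
from dataclasses import dataclass
from enum import Enum, auto

class Fallback(Enum):
    HOLD_LAST_LANE = auto()     # extrapolate the last confident lane estimate
    FOLLOW_LEAD_CAR = auto()    # steer toward the vehicle ahead
    HANDOFF_TO_DRIVER = auto()  # alert the driver and disengage

@dataclass
class Perception:
    lane_confidence: float          # 0..1 confidence in detected lane lines
    lead_car_visible: bool
    seconds_since_lane_lost: float

def choose_fallback(p: Perception) -> Fallback:
    # Every branch below is a guess once the lane markings are gone.
    if p.lane_confidence >= 0.6:
        return Fallback.HOLD_LAST_LANE    # markings still mostly trusted
    if p.lead_car_visible and p.seconds_since_lane_lost < 3.0:
        return Fallback.FOLLOW_LEAD_CAR   # risky: the lead car may leave the lane
    return Fallback.HANDOFF_TO_DRIVER     # safest, but only if the alert is timely

# Lanes just vanished, lead car in view -> the system blindly follows it.
print(choose_fallback(Perception(lane_confidence=0.2,
                                 lead_car_visible=True,
                                 seconds_since_lane_lost=1.5)))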

Tesla would disengage way more often than Waymo and Cruise, complete agreement. Tesla's system isn't even meant to be used off the highway. You agree the Twitter video you posted probably wasn't on a highway, right?
04-05-2018 , 03:31 AM
TS,

I am not making up a conspiracy theory. I am stating the fact that Navigant primarily makes money from selling services. You don't understand this simple fact, and you rely so heavily on their report that you'll end up being wrong.

I am not defending Tesla's AP; I am attacking your low-quality posting.

Also, GM and Waymo most likely have problems as well. Waymo has been involved in a number of accidents where they allegedly weren't responsible, including quite a few rear-end collisions. GM has one video and you take it as the absolute truth. This is so embarrassing. Tesla's AP needs to navigate in a variety of situations, whereas GM and Waymo restrict the area in which they operate. This vastly reduces complexity.

Here are a few examples:
https://www.reuters.com/article/auto...-idUSL2N1MF1RO


Quote:
My trip was far from smooth, the vehicle so careful that it jolted, disconcertingly, to a stop at even the whisper of a collision.
https://www.wired.com/story/ride-gen...f-driving-car/
04-05-2018 , 06:59 AM
The big difference is that GM is underselling Super Cruise while Tesla is overhyping Autopilot to the point where they endanger lives.
04-05-2018 , 07:44 AM
Quote:
Originally Posted by Spurious
TS,

I am not making up a conspiracy theory. I am stating the fact that Navigant primarily makes money from selling services. You don't understand this simple fact, and you rely so heavily on their report that you'll end up being wrong.
This is total bull****, man. I post credible research from a billion-dollar research company on the state of autonomous driving, showing Tesla dead last - which every data point backs up - and you come back with this horse ****:
Quote:
Originally Posted by Spurious
TS is claiming BS and referencing studies from consultants paid by automotive companies. GM, who have a history of actually killing people, would bring out something that worked if they had something that was best in class. They don't, and as such TS's claims that they are the best and whatnot are silly.
Here, you imply the study showing Tesla dead last and GM and Waymo well ahead is invalid because they are paid by "automotive companies" - a false claim, by the way, pure conspiracy. Navigant is a research company that publishes research across a large variety of industries, which is then sold to interested parties - media outlets, researchers, hedge funds, industry participants (like parts suppliers).

Now you backtrack from that absurd conspiracy theory claim (the report is suspect/wrong because it's paid for by automotive companies), lying about what you said. How much more dishonest can you be? You should have stopped while you were behind.

Quote:
I am not defending Tesla's AP
You are, though. You bend any criticism of anything to do with Tesla into idiotic conspiracy theories for why Tesla aren't behind, in spite of the clear facts. Any evidence of the horrendous, far-behind state of Tesla's Autopilot - which is everywhere - you try to deflect. I mean, right here, you're trying to deflect again:

Quote:
Also, GM and Waymo have most likely problems as well. Waymo has been involved in a number of accidents where they allegedly weren't responsible but they had quite a few rear end accidents.
Dude, we have hard data on disengagements and incidents. The numbers are incredibly low. I posted the table above.

Quote:
GM has one video and you take it as the absolute truth. This is so embarrassing.
No, dude. This is you being a dishonest, bad-faith piece of **** again. I posted data on disengagements. I pointed out that they have a city-wide, fully autonomous ride-hailing service actually running right now. I posted a study showing them far ahead of Tesla, along with most of the other majors. You rubbish or try to deflect every single data point, because you are a deranged cultist. Incredibly, you seem not to realize this.

This is like talking to someone on the Yahoo! stock forums. I've rarely encountered such mouth-breathing stupidity on 2p2. Congratulations on bringing down the quality of discussion here.

Quote:
Tesla's AP needs to navigate in a variety of situations, whereas GM and Waymo restrict the area in which they operate. This vastly reduces complexity.
Tesla's AP only works on well-marked freeways with on- and off-ramps. Waymo is navigating inner-city traffic, pedestrians, bikes, cross traffic, traffic lights, four-way stop signs, roads with no lane markings, etc.

Did you even read the article you linked? These accidents were the fault of other drivers, entirely.

Quote:
Most of the crashes involved drivers of other vehicles striking the GM cars that were slowing for stop signs, pedestrians or other issues. In one crash, a driver of a Ford Ranger was on his cell phone when he rear-ended a Chevrolet Bolt that was stopped at a red light.

In another instance, the driver of a Chevrolet Bolt noticed an intoxicated cyclist in San Francisco going the wrong direction toward the Bolt. The human driver stopped the Bolt and the cyclist hit the bumper and fell over. The bicyclist pulled on a sensor attached to the vehicle causing minor damage.

In another incident on Sept. 15 in San Francisco, a Dodge Charger in the left-turn lane attempted to illegally pass a Bolt in driverless mode. The GM Cruise employee took control of the vehicle and the Dodge scraped the front sensor and fled the scene without stopping.

“All our incidents this year were caused by the other vehicle,” said Rebecca Mark, spokeswoman for GM Cruise.
How much of a disingenuous cult inductee are you? Here are the facts:

- Tesla Autopilot is terrible and in a poor state of development. They are dead last of all major car companies in their level of development, a fact shown by multiple data points. They are struggling to create a safe level 2 and have massively oversold the capabilities of their "autopilot", as well as lied about the time frame (Musk promised Fully Self Driving coming in "maybe 3 months, definitely 6" in January 2017). This irresponsibility has killed people, including recently.

- GM and Waymo are far ahead and have a highly competent level 4 that, on the latest data, has one disengagement every 10K miles - most of those the fault of other drivers - and at most 1 per 30K miles that is their own fault over the last few months.

That you can't accept these facts shows how unhinged you are. You're here to defend your cult leader with whom you are in love, and provide nothing of worth to anyone.
04-05-2018 , 08:56 AM
TS,

of course you never have accidents that are your fault if you drive ultra-slowly and brake abruptly. This is not a good argument. I read the article, and that's why I linked it. I would have explicitly said so if it had been their fault.

The more often you repeat "multi-billion-dollar research company", the less I believe that you yourself believe their research.

chytry,

I absolutely agree. I am only raising the point because TS is actually doing the opposite of what GM is doing.
04-05-2018 , 10:40 AM
I find it amusing there is "debate" on whether Autopilot (or other AEB systems) prevents crashes.

https://static.nhtsa.gov/odi/inv/201...16007-7876.PDF

Guess there is a loose grasp on reality here.
04-05-2018 , 11:17 AM
exactly

Quote:
NHTSA’s examination did not identify any defects in the design or performance of the AEB or Autopilot systems of the subject vehicles nor any incidents in which the systems did not perform as designed. AEB systems used in the automotive industry through MY 2016 are rear-end collision avoidance technologies that are not designed to reliably perform in all crash modes, including crossing path collisions. The Autopilot system is an Advanced Driver Assistance System (ADAS) that requires the continual and full attention of the driver to monitor the traffic environment and be prepared to take action to avoid crashes. Tesla's design included a hands-on the steering wheel system for monitoring driver engagement. That system has been updated to further reinforce the need for driver engagement through a "strike out" strategy. Drivers that do not respond to visual cues in the driver monitoring system alerts may "strike out" and lose Autopilot function for the remainder of the drive cycle.
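For what it's worth, the "strike out" behaviour described there boils down to something like this toy model. The threshold of three ignored alerts is my assumption - the report doesn't give an exact number:

Code:
class AutopilotSession:
    """Toy model of the "strike out" strategy in the NHTSA quote above."""
    MAX_IGNORED_ALERTS = 3  # ASSUMPTION: real threshold not stated in the quote

    def __init__(self):
        self.ignored_alerts = 0
        self.locked_out = False  # True = Autopilot lost for this drive cycle

    def hands_on_alert(self, driver_responded: bool) -> None:
        if self.locked_out:
            return
        if driver_responded:
            self.ignored_alerts = 0          # engagement resets the count
        else:
            self.ignored_alerts += 1
            if self.ignored_alerts >= self.MAX_IGNORED_ALERTS:
                self.locked_out = True       # "struck out" until the next drive cycle

session = AutopilotSession()
for responded in (False, False, False):      # driver ignores three alerts in a row
    session.hands_on_alert(responded)
print(session.locked_out)                    # True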
04-05-2018 , 11:40 AM
Not for nothing, but if the system needs continuous user input, maybe calling it "autopilot" was a bit of an oversight.
04-05-2018 , 11:52 AM
There are three separate things here:

1. What Tesla classifies its system as in regulatory documents (level 2 - nothing but lane keeping, car following and lane changing)

2. How Tesla represents and hypes the state of its "autonomous" driving to customers, investors and the general public.

3. What Tesla claims about the state of development and what is coming (for example, selling "Fully Self Driving" hardware with the features coming in "3 to 6 months" as of Jan 2017).

The first is reality. The second and third are pure bull**** that Tesla is selling and people like Spurious are buying. Tesla has level 2 and nothing beyond that and little hope of going beyond it, since they can't even do level 2 + highway obstacle avoidance properly.

The tragedy comes when people don't use "autopilot" properly because Tesla has downloaded pure bull**** into their minds about the state of development and how safe and capable it is.
04-05-2018 , 12:28 PM
I don't know man, a brief perusal of the Model S owner's manual makes Autopilot's limitations very clear. Just look at page 65 and on. There are many very clear warnings of Autopilot's capabilities and limitations. I understand you have problems reading, but even you can probably understand.


https://www.tesla.com/sites/default/...rica_en_ca.pdf
04-05-2018 , 12:52 PM
So apparently the fact that it's called "Autopilot" means everyone is a total idiot now and Musk is to blame. I guess we need larger warnings on toasters explaining not to use them in the bathtub for these same people.

Then again, from a guy who's hung his hat on a fake trading record for years (has he even made a single trade in his prop bet?), this sort of dishonesty isn't much of a surprise. Apparently every expert is only an expert when they say Tesla is bad, never when they acknowledge Tesla has the premier SDC technology deployed. The same goes for the actual regulatory bodies.
04-05-2018 , 12:57 PM
Quote:
Originally Posted by Mihkel05
So apparently the fact that it's called "Autopilot" means everyone is a total idiot now and Musk is to blame. I guess we need larger warnings on toasters explaining not to use them in the bathtub for these same people.
Using something called "Autopilot" as an autopilot is a wee bit different than using a toaster as a bath warmer.
04-05-2018 , 01:05 PM
Just sayin', "requires constant user input" is literally the exact opposite of "autopilot." Seems like after the first guy died because he mistakenly thought the autopilot was an autopilot, the company should have considered renaming the feature. I get that the manual clearly explains that the autopilot is not to ever be used as an autopilot and will drive right into the nearest truck if it's on an unapproved road, but still you can appreciate why people get confused.
04-05-2018 , 01:16 PM
Quote:
Originally Posted by SenorKeeed
I don't know man, a brief perusal of the Model S owner's manual makes Autopilot's limitations very clear. Just look at page 65 and on. There are many very clear warnings of Autopilot's capabilities and limitations. I understand you have problems reading, but even you can probably understand.


https://www.tesla.com/sites/default/...rica_en_ca.pdf
Of course the manual is clear. They're covering their ass like crazy. They do the same in their SEC filings, which use much more muted language than Musk does on Twitter or in PR releases when, say, talking about production targets or the date when full self-driving is coming.

How many people read a car's manual in full? Any? The fact is that the minds of most people, including drivers, are filled with the hype that Tesla puts out.

That's why Tesla drivers are watching Harry Potter rather than the road and getting decapitated. They've been sold bull**** rather than being told the system is incredibly dangerous and non-functional outside of following the car in front and keeping in a lane in a well-marked area.

I understand you have trouble following human nature and arguments, but this isn't hard, man.

Last edited by ToothSayer; 04-05-2018 at 01:26 PM.
04-05-2018 , 01:23 PM
Show me an example of this Autopilot hype you're claiming.
04-05-2018 , 02:08 PM
Quote:
Originally Posted by Trolly McTrollson
Not for nothing, but if the system needs continuous user input, maybe calling it "autopilot" was a bit of an oversight.
Airliner autopilots don't do collision avoidance either. Keeping airways safe requires constant monitoring by air traffic controllers. Just sayin'.
04-05-2018 , 03:04 PM
Naming the thing Autopilot was probably more of a marketing move, and they didn't anticipate people needing to put their hands on the steering wheel all that often. But people who can afford a $100k+ car should probably be trusted to understand that Autopilot does not mean machine chauffeur.
04-05-2018 , 03:18 PM
Regardless of the name, autopilot turned on for highway driving should perform well enough to not slam into lane dividers.

https://arstechnica.com/cars/2018/04...viders-poorly/
04-05-2018 , 03:59 PM
Quote:
Originally Posted by SenorKeeed
Show me an example of this Autopilot hype you're claiming.
I can show you dozens. The examples below are all from Tesla's autopilot promotion page.

How could anyone possibly get the wrong idea that it's safer to just let Tesla do the driving (my yellow highlight)?



How could anyone possibly get the wrong idea that Tesla is in fact capable of detecting fire trucks or concrete barriers and either giving ample warning or stopping itself?
Quote:
Advanced Sensor Coverage

Eight surround cameras provide 360 degrees of visibility around the car at up to 250 meters of range. Twelve updated ultrasonic sensors complement this vision, allowing for detection of both hard and soft objects at nearly twice the distance of the prior system. A forward-facing radar with enhanced processing provides additional data about the world on a redundant wavelength that is able to see through heavy rain, fog, dust and even the car ahead.
How could anyone possibly get the wrong idea that Tesla can handle all aspects of highway driving without slamming into anything?
Quote:
On-ramp to Off-ramp

Once on the freeway, your Tesla will determine which lane you need to be in and when. In addition to ensuring you reach your intended exit, Autopilot will watch for opportunities to move to a faster lane when you're caught behind slower traffic. When you reach your exit, your Tesla will depart the freeway, slow down and transition control back to you.
How could anyone possibly get the wrong idea that Tesla isn't merely incompetent level 2 software with unreliable safety features?

Quote:
Full Self-Driving Capability

Build upon Enhanced Autopilot and order Full Self-Driving Capability on your Tesla. This doubles the number of active cameras from four to eight, enabling full self-driving in almost all circumstances, at what we believe will be a probability of safety at least twice as good as the average human driver. The system is designed to be able to conduct short and long distance trips with no action required by the person in the driver’s seat.
The only disclaimer on their entire promotional page:

Quote:
Every driver is responsible for remaining alert and active when using Autopilot, and must be prepared to take action at any time.
How could anyone possibly get the wrong idea and think that Tesla has their back in multiple ways on avoiding fire trucks in your lane, trucks blocking the entire highway, and lethal concrete divider crashes?

Quote:
Standard safety features

Automatic Emergency Braking

Designed to detect objects that the car may impact and applies the brakes accordingly

Front Collision Warning

Helps warn of impending collisions with slower moving or stationary cars
I think that after reading that, your average buyer is going to think "Autopilot" can reliably detect and warn about obstacles, follow lanes properly (and not into concrete dividers or fire trucks), and be safer than they are on highways. Tesla classifies it as "level 2" for legal purposes and then hypes the **** out of it, claiming far more capability and far more reliability than is justified for mere lane changing and leader following.

That's without getting into Musk's Twitter hype, public statements, or interviews/articles from friendly news outlets hyping their "autopilot".

Amazing that you guys are arguing with me on this. You're dead wrong and really stupid besides.

Last edited by ToothSayer; 04-05-2018 at 04:04 PM.
04-05-2018 , 04:00 PM
Quote:
Originally Posted by donfairplay
Regardless of the name, autopilot turned on for highway driving should perform well enough to not slam into lane dividers.

https://arstechnica.com/cars/2018/04...viders-poorly/
No ****, right? Not to deranged Musk fanboys though. It says it all when you have Spurious and SenorKeeed in one corner.
04-05-2018 , 04:14 PM
In other news, Tesla at $305 now after being at $260 a few days ago.

Quote:
Originally Posted by ToothSayer
Keybanc analysts said it well:
Quote:
This will drive a relief rally
This is how the street will view this result.
Quote:
Originally Posted by ToothSayer
That is the carrot now and Musk has spun this well enough that the whole situation is bullish to enough of his investors (and scares enough of the shorts) that the bias is net bullish.
Quote:
Originally Posted by ToothSayer
Quarter was a total disaster looked at objectively, but spun beautifully both in the statement and the prior PR/leaks. I sold puts yesterday and am holding them. The carrot is firmly back in front of the donkey. Only real risk is the market imo, or some (unlikely in the time frame) bad news.
Nice short squeeze and could even have a little more to run if there isn't bad news, $320 isn't out of play. Market helped a little but this 16% run is mostly bullish bias + short covering.

When the biggest bear is selling puts on a terrible quarter, you don't stay short. I would look to reshort/buy puts up here if any bad news comes out or the market turns, but mainly for a scalp. The carrot is in front of the donkey again, and the market's run off the lows came at the perfect time to worsen the short squeeze.