TSLA showing cracks?

06-30-2016 , 07:01 AM
Up 11% since Monday's low... lol.
06-30-2016 , 04:56 PM
http://www.autonews.com/article/2016...crash-in-tesla

Surprised this isn't getting more attention.
06-30-2016 , 05:26 PM
Tesla's response feels kind of cold
06-30-2016 , 06:10 PM
Brexiters' advisers should remind Tesla that facts don't work.
06-30-2016 , 06:17 PM
Quote:
Originally Posted by Riverman
http://www.autonews.com/article/2016...crash-in-tesla

Surprised this isn't getting more attention.
It is getting a lot of attention, but mostly to improve trailer safety:
06-30-2016 , 08:34 PM
Quote:
Originally Posted by Riverman
http://www.autonews.com/article/2016...crash-in-tesla

Surprised this isn't getting more attention.
Tesla's safety systems/"autonomous" driving are a sad joke. We've seen multiple times that they fail the most basic safety tests (such as hitting the brakes when there's a huge object in the way). Both the software and the hardware are clownishly inadequate at even the basic stuff they should do - following the highway instead of taking off-ramps, not swerving toward cars, reliably avoiding obstacles on autopilot. Good to see some scrutiny on this, though it's sad that someone had to die for Musk's foolishness.

The response is terrible and makes the company look bad/Musk look completely delusional/self-interested (which he is). The apparent lack of disclosure is even more troubling - the crash happened a month ago.

Should be good for more downside tomorrow. "At least braking when a massive obstacle enters your field of view" is the most basic solved safety problem that Tesla can't even do (this isn't the first instance of the clownish inability of its software/hardware to perform its basic functions, merely the first known fatal one).

If Tesla's systems worked properly or were even basically correct, this wouldn't matter so much; the fact that they don't is going to bring a lot of bad scrutiny on the company/Musk's cowboy "beta" software (he's going to regret that word when the lawsuit hits).

Last edited by ToothSayer; 06-30-2016 at 08:40 PM.
06-30-2016 , 09:05 PM
Tesla does have a real problem here - I've documented it earlier in this thread - but here's another example:
Quote:
Major Blind Spot with Tesla’s Autopilot

...I then noticed my car accelerating as it started to keep pace with the motorcycle [this is Tesla's automatic following of another car].

I looked up and saw the car in front of me nearly stopped while I was still accelerating. I had clearly looked at the display for a second too long and the autopilot software didn’t at all account for the stopped vehicle in front of me.

I slammed on the brakes, recognized the imminent impact and swerved into the right lane only clipping the car in front. Thankfully the other lane was clear. There was minimal damage and no injuries which is just incredible for highway speeds and the amount of traffic on the road.
There are lots of examples like this of Tesla becoming blind when following another car and missing OBVIOUS obstacles, like other cars, while in follow mode. Clearly, as I've stated earlier, the software/hardware is deeply flawed, and they haven't fixed the most basic bugs - bugs other carmakers solved years ago. Now that someone has died and this is being properly investigated, it will get very ugly for Tesla.

The important thing here is how clownishly inadequate this software is, and how incompetent it shows the team to be. This problem, and others like following off-ramps on the highway - sometimes at too high a speed - rather than going straight, are examples of horrifically bad software that was simply indefensible to put out, and they show cowboyism/a lack of the most basic competence in the autopilot department.

Last edited by ToothSayer; 06-30-2016 at 09:12 PM.
06-30-2016 , 11:02 PM
How many miles do you think Autopilot has put in around the world? What is its fatality/accident rate per 100k miles? How do you think that fares vs human drivers?
07-01-2016 , 05:34 AM
Quote:
Originally Posted by NxtWrldChamp
How many miles do you think Autopilot has put in around the world? What is its fatality/accident rate per 100k miles? How do you think that fares vs human drivers?
Apples to oranges. It probably is worse than humans who aren't impaired.
07-01-2016 , 05:52 AM
Quote:
Originally Posted by NxtWrldChamp
What is its fatality/accident rate per 100k miles?
We don't have enough data for this, or even to create a reasonable bound.

Quote:
How do you think that fares vs human drivers?
Probably worse than someone without autopilot engaged, given how "autopilot" takes drivers' eyes/attention off the road. Probably better than an impaired or regularly distracted driver sans autopilot. Probably far worse than a system without "autopilot" - especially autopilot marketed as irresponsibly as Musk has marketed it - but with an object-detection braking feature that works properly, such as on some Mercedes models.

Anyway, you're missing the picture. A Tesla car, with autopilot engaged, ran into a giant truck at full speed, with plenty of warning, without even attempting to brake (nice automatic emergency braking system?). It decapitated the driver and continued on for another couple of hundred yards.

Tesla itself claims the truck was missed because it was white against a brightly lit sky (a problem with camera quality/resolution - hi Mihkel - notice the edge case that requires higher-resolution/better camera electronics?), essentially admitting that this was an autopilot flaw.

It's the first autonomous driving fatality in history, which will drive headlines and discussion of Tesla's obvious gaping flaws/cowboy attitude to safety.

That's assuming some of Tesla's many other fatalities weren't due to autopilot - I'd say we don't know, given that it took a month and the start of a formal NHTSA investigation before Tesla even disclosed its first autopilot-caused decapitation.
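The "not enough data to even create a reasonable bound" point above can be made concrete with a quick Poisson interval: with a single observed fatality, the exact 95% confidence interval spans more than a 200x range of underlying rates. A stdlib-only sketch; the 200mn km exposure is the figure cited elsewhere in the thread, and `poisson_cdf`/`upper_bound` are just helper names for the standard exact (Garwood) interval:

```python
# Sketch of why one fatality gives almost no statistical bound: the
# exact 95% Poisson interval for a single observed event spans more
# than a 200x range of underlying rates. Stdlib only.
import math

def poisson_cdf(k: int, lam: float) -> float:
    """P(X <= k) for X ~ Poisson(lam)."""
    return sum(math.exp(-lam) * lam**i / math.factorial(i)
               for i in range(k + 1))

def upper_bound(k: int = 1, alpha: float = 0.025) -> float:
    """Largest mean lam with P(X <= k) >= alpha, found by bisection."""
    lo, hi = 0.0, 50.0
    for _ in range(200):
        mid = (lo + hi) / 2
        if poisson_cdf(k, mid) > alpha:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

exposure_km = 200e6            # claimed Autopilot distance (see below)
lower = -math.log(1 - 0.025)   # exact lower bound for k = 1 (~0.025 events)
upper = upper_bound()          # ~5.57 events

def per_100mn_km(events: float) -> float:
    return events * 1e8 / exposure_km

print(f"95% CI: {per_100mn_km(lower):.3f} to {per_100mn_km(upper):.2f} "
      f"fatalities per 100mn km")
```

In other words, one event is compatible with anything from roughly 0.01 to nearly 3 fatalities per 100mn km, so no comparison against the human average is meaningful yet.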
07-01-2016 , 06:09 AM
Quote:
Originally Posted by Subfallen
But until this is empirically established, it's just an assumption.

Here, for example, is Tesla co-founder Marc Tarpenning answering a question about Tesla's perceived competitive advantage. He says he consulted for big car companies after leaving Tesla and was shocked at the extent to which they had outsourced almost all engineering (except, crucially, the engine).
To me this shows just how behind the curve/clueless Tesla is. Car manufacturing is a cutthroat business with cutthroat margins that require large economies of scale to achieve breakeven, let alone profitability. Most engineering is outsourced because it's cheaper and better. It's the way of the world today and the only way to survive. And it's not just a car thing. Boeing, for example, imports hundreds of parts from all over the world in an incredibly complex supplier network. This comes with obvious potential problems - reliability, transport issues - yet it's the way the world's complex manufacturers work. The world is that way for a reason. No one engineers and builds in-house.

There are lots of reasons for this (scale, efficiency, labor savings, localized know-how/supplier networks, competition), but one that's easy to grasp is testing. It requires enormous resources and synergies - truly mind-boggling - to design and test every part to a high level of reliability, as well as the factory components to build them. No one company can do all that. Tesla has so far managed to sneak in the back door thanks to very low total volume, the young age of its cars, and an enthusiastic, tolerant fan base. That's going to change as the 1-in-50,000 or "after three years" errors - the kind that require massive resources and know-how to test for and eliminate - start to pop up.
07-01-2016 , 06:17 AM
By the way, this is how incompetent Tesla's autopilot system is at avoiding even a very low-speed crash:



This is just horrible software. NHTSA will no doubt be looking at all of this kind of stuff now that there's been a death. And the clowns in this thread claim that Tesla is ahead of everyone on autonomous driving. Just insanity.
07-01-2016 , 06:18 AM
Supposedly more than 200mn km were driven on Autopilot before the first fatality, compared to a US average of one fatality per 145mn km.
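For what it's worth, here's a naive sketch turning those two figures into comparable rates. As posters point out below, it's an apples-to-oranges comparison (Autopilot miles are mostly highway, the US average mixes all driving), so treat the ratio as illustrative only:

```python
# Naive rate comparison from the two figures quoted above.
# No adjustment for road type; illustrative only.
autopilot_km_per_fatality = 200e6  # claimed Autopilot figure
us_km_per_fatality = 145e6         # quoted US average

autopilot_rate = 1e8 / autopilot_km_per_fatality  # per 100mn km
us_rate = 1e8 / us_km_per_fatality

print(f"Autopilot: {autopilot_rate:.2f} fatalities per 100mn km")
print(f"US avg:    {us_rate:.2f} fatalities per 100mn km")
print(f"Ratio:     {us_rate / autopilot_rate:.2f}x")
```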
07-01-2016 , 07:54 AM
Good to see TS is still totally wrong about the underlying technology in a company he purportedly trades. Radar is the obvious technology for this sort of detection.

Comparing average miles driven (combined highway/city) versus Autopilot's miles is nuts. They have virtually nothing in common and it's an obvious statistical sleight of hand. Let's call a spade a spade. Even if Tesla is #1 in the SDC game, they can still be run by a dishonest, borderline-delusional maniac.
07-01-2016 , 08:07 AM
Quote:
Originally Posted by Mihkel05
Comparing average miles driven (combined highway/city) versus Autopilot's miles is nuts. They have virtually nothing in common and it's an obvious statistical sleight of hand. Let's call a spade a spade. Even if Tesla is #1 in the SDC game, they can still be run by a dishonest, borderline-delusional maniac.
I agree, it's a single statistical data point, so make of it what you want.

There had to be a first SDC death in history, and this was it. Some had predicted it would happen in 2015; it happened now.

At the end of the day, SDCs are the future, and while the systems aren't there yet, you can't get there without testing.

TS is hilarious with his arguments. He has no background in business and it shows time and time again. What he claims is ridiculous. He doesn't understand the auto industry. I linked GM's disaster previously; it claimed far more lives per year than Autopilot ever will.
07-01-2016 , 08:22 AM
Quote:
Originally Posted by Spurious
I agree, it's a single statistical data point, so make of it what you want.

There had to be a first SDC death in history, and this was it. Some had predicted it would happen in 2015; it happened now.

At the end of the day, SDCs are the future, and while the systems aren't there yet, you can't get there without testing.

TS is hilarious with his arguments. He has no background in business and it shows time and time again. What he claims is ridiculous. He doesn't understand the auto industry. I linked GM's disaster previously; it claimed far more lives per year than Autopilot ever will.
He's def the most vocal, but there have been other equally insane things posted by many people.

Weirdly, lots of people in this forum don't understand the underpinnings of many businesses (Google's search algo being Google's moat was my favorite among those).

Weirdly, his record on guessing market moves seems positive (no idea if they are profitable trades, since I've never seen any sort of documentation). I guess trading doesn't really require any knowledge of how a business operates; it could be just psychological insight.

I honestly don't know why his unsubstantiated trolling is tolerated. He's repeated this insanity as self-evident over and over, and it's disruptive and incorrect.
07-01-2016 , 08:43 AM
Quote:
Originally Posted by Mihkel05
Good to see TS still is totally wrong about the underlying technology in a company he purportedly trades. Radar is the obvious technology for this sort of detection.
The problem with radar is that either you get a lot of false positives, like VWs that brake when you drive over metal plates, or you get false negatives with high-ground-clearance vehicles, as in this case. It should be noted that Lidar (like Google is using) has less of this issue. While Lidar is not practical today, a lot of development is happening, and soon Lidars might be cheap, small and robust. There is also a lot of progress with camera systems, such as trifocal cameras and deep learning.
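The trade-off described above can be illustrated with toy numbers (everything below is invented for illustration; real radar processing is far more involved): a single return-strength threshold can't both ignore a metal road plate and catch a high-clearance trailer.

```python
# Toy numbers for the radar trade-off: no single return-strength
# threshold both ignores a harmless metal road plate (strong return)
# and catches a high-ground-clearance trailer (weak return).
# All values are invented for illustration.
targets = {
    "metal road plate":       {"return_db": 30, "hazard": False},
    "passenger car":          {"return_db": 22, "hazard": True},
    "high-clearance trailer": {"return_db": 8,  "hazard": True},
}

def brake_decisions(threshold_db: float) -> dict:
    """Brake for anything whose radar return meets the threshold."""
    return {name: t["return_db"] >= threshold_db
            for name, t in targets.items()}

high = brake_decisions(threshold_db=15)  # misses the trailer: false negative
low = brake_decisions(threshold_db=5)    # brakes for the plate: false positive

assert high["high-clearance trailer"] is False  # the fatal-crash failure mode
assert low["metal road plate"] is True          # the VW phantom-braking mode
```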
07-01-2016 , 11:15 AM
Looks like it lost about 2.5% after hours when the news broke, and it's already recovered. SolarCity acquisition confirmed worse than death.
07-01-2016 , 03:09 PM
So it appears I was infracted for my post stating that higher-resolution cameras are not the solution to this particular problem. Have I gone totally crazy, or hasn't this been covered repeatedly in both this thread and others?

Specifically, the use of other technologies and the processing issues that come with them have been detailed. (Coupled with the lack of high-end cameras even being used. Hi, 8k.)

Anyway, I thought this was like stating "the sky is blue". Does anyone need a further explanation?
07-01-2016 , 03:25 PM
Quote:
Originally Posted by heltok
The problem with radar is that either you get a lot of false positives, like VWs that brake when you drive over metal plates, or you get false negatives with high-ground-clearance vehicles, as in this case. It should be noted that Lidar (like Google is using) has less of this issue. While Lidar is not practical today, a lot of development is happening, and soon Lidars might be cheap, small and robust. There is also a lot of progress with camera systems, such as trifocal cameras and deep learning.
From my understanding, Lidar is currently being used for lane detection and radar for object detection. You're more aware of the current milieu than I am. Is this being shifted to pure Lidar (or moving in that direction)?

As with everything, getting the sensors out and gathering a rich dataset are the most rudimentary steps toward getting a functional algorithm out. It's impossible to train a supervised algorithm without data.
07-01-2016 , 04:13 PM
Quote:
Originally Posted by ToothSayer
We don't have enough data for this, or even to create a reasonable bound.


Probably worse than someone without autopilot engaged, given how "autopilot" takes drivers' eyes/attention off the road. Probably better than an impaired or regularly distracted driver sans autopilot. Probably far worse than a system without "autopilot" - especially autopilot marketed as irresponsibly as Musk has marketed it - but with an object-detection braking feature that works properly, such as on some Mercedes models.

Anyway, you're missing the picture. A Tesla car, with autopilot engaged, ran into a giant truck at full speed, with plenty of warning, without even attempting to brake (nice automatic emergency braking system?). It decapitated the driver and continued on for another couple of hundred yards.

Tesla itself claims the truck was missed because it was white against a brightly lit sky (a problem with camera quality/resolution - hi Mihkel - notice the edge case that requires higher-resolution/better camera electronics?), essentially admitting that this was an autopilot flaw.

It's the first autonomous driving fatality in history, which will drive headlines and discussion of Tesla's obvious gaping flaws/cowboy attitude to safety.

That's assuming some of Tesla's many other fatalities weren't due to autopilot - I'd say we don't know, given that it took a month and the start of a formal NHTSA investigation before Tesla even disclosed its first autopilot-caused decapitation.
How big of an impact do you think their autopilot issues could potentially have on their stock price? I would think it could be pretty big, especially if this issue continues to metastasize. Market seems to be more or less shrugging off the news so far though.
07-01-2016 , 05:45 PM
If anything, it seems the issue with the guy who died is that Tesla's autopilot is good enough to make people complacent; reports suggest he may have been watching a DVD while driving (i.e., eyes completely off the road), and his YouTube channel shows him being a big fan of the autopilot system. Had he been watching the road, he should have been able to take over and hit the brakes. I have a new Jeep with adaptive cruise control; it's great and I use it all the time, but I'm very much aware that it isn't good enough for me to take my eyes off the road. I had one close call the other week where it didn't register the car in front due to a low sun on the horizon, but since I was watching I was able to hit the brakes and avoid an accident.
07-01-2016 , 06:02 PM
Tesla slappies coming off terribly. The car ran straight into a ****ing semi, without braking at all. Further, it seems other companies are far, far ahead with this technology, and that releasing it was incredibly irresponsible. The legal risks here are astronomical.
07-02-2016 , 01:59 AM
With regard to this crash, it should be noted that it was not the Autopilot that failed; it was the AEB (Automatic Emergency Braking).

Quote:
Originally Posted by Mihkel05
From my understanding, Lidar is currently being used for lane detection and radar for object detection. You're more aware of the current milieu than I am. Is this being shifted to pure Lidar (or moving in that direction)?

As with everything, getting the sensors out and gathering a rich dataset are the most rudimentary steps toward getting a functional algorithm out. It's impossible to train a supervised algorithm without data.
Lidar is mostly used for object detection in front of the vehicle.
Radar is mostly used for object detection around the car (including the front).

Lane detection is mostly done by cameras.

The shift seems to be to use multiple sources for everything, so that if the camera fails the Lidar/Radar/etc should still work.
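That redundancy idea can be sketched in a few lines. Everything here is a simplifying assumption (hypothetical sensor names, boolean readings, a deliberately crude fail-safe vote); real fusion stacks combine confidences, not booleans:

```python
# Minimal sketch of sensor redundancy: fuse camera, radar and Lidar
# so that one failed or washed-out sensor doesn't blind the system.
from typing import Optional

def fuse_obstacle(camera: Optional[bool],
                  radar: Optional[bool],
                  lidar: Optional[bool]) -> bool:
    """Return True (brake) if any working sensor reports an obstacle.

    None means the sensor has failed or is washed out (e.g. a camera
    blinded by glare); a failed sensor simply drops out of the vote.
    """
    readings = [r for r in (camera, radar, lidar) if r is not None]
    if not readings:
        return True  # fail safe: with no working sensors, assume danger
    return any(readings)

# Camera washed out by a bright sky, but Lidar still sees the trailer:
assert fuse_obstacle(camera=None, radar=False, lidar=True) is True
# All sensors healthy and clear: keep cruising.
assert fuse_obstacle(camera=False, radar=False, lidar=False) is False
```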

I really want this car!
07-02-2016 , 02:20 AM
Quote:
Originally Posted by Malachii
How big of an impact do you think their autopilot issues could potentially have on their stock price? I would think it could be pretty big, especially if this issue continues to metastasize. Market seems to be more or less shrugging off the news so far though.
I think today was low-volume, pre-holiday short covering - IB had zero borrows available and a 33% borrow rate, and the big fear selling seems to have abated last night, so the shorts were maxed out in a positive market, which basically sets up a squeeze. This is actually what's driven a lot of Tesla's stock price.

I was on the wrong side and lost quite a bit of money today - the first time in a long time I've had a losing trade on Tesla.
Quote:
Originally Posted by Riverman
Tesla slappies coming off terribly. The car ran straight into a ****ing semi, without braking at all. Further, it seems other companies are far, far ahead with this technology, and that releasing it was incredibly irresponsible. The legal risks here are astronomical.
It's obviously a very big issue, more so because Tesla explicitly claimed - I think while trying to minimize the issue - that their sensors didn't even see the truck because of the light and its color. In fact, the car kept driving even after the top was knocked off, hundreds of yards down the road, before finally being stopped by a pole. There's clearly a large flaw here, and their emergency braking failed. Autopilot is simply unsafe to use and shouldn't be marketed as such. It's amazing there haven't been more fatal accidents, or a pedestrian killed (Tesla has a lot of trouble spotting pedestrians, which is pretty obvious once you realize it can't spot vans or massive trucks - the brakes were never applied).

What it does to the stock price depends, I think, on what the NHTSA says, on whether a lawsuit happens, and on how long it is until the next person is killed (and whether there are third-party victims beyond the driver next time).
Quote:
Originally Posted by heltok
With regard to this crash, it should be noted that it was not the Autopilot that failed; it was the AEB (Automatic Emergency Braking).
Autopilot should be detecting distant obstacles and slowing down/warning the driver. That's a very basic function of an autopilot, and other systems have long-range sensors that do just that. It's absolutely irresponsible to have this technology and market it as they do when it's guaranteed to slam into stationary objects or pedestrians, even with plenty of warning, whenever something is a little out of the norm.

Legally, it was completely foreseeable that this tragedy would happen, particularly after the various videos of people monkeying around with the "autopilot" and various dangerous misses. To market autopilot functions as they do, without even long-range radar or proper debugging, opens them up to a world of liability and bad press. The other automakers have been far more cautious in what they release and market.
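The "detect distant obstacles and slow down/warn" behaviour described above is often framed as time-to-collision (TTC) logic. A minimal sketch; the 2 s and 6 s thresholds are illustrative assumptions, not any manufacturer's actual values:

```python
# Time-to-collision logic: warn early, brake hard late.
# Thresholds are illustrative only.
def ttc_seconds(range_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact; infinite if the gap isn't closing."""
    if closing_speed_mps <= 0:
        return float("inf")
    return range_m / closing_speed_mps

def respond(range_m: float, closing_speed_mps: float) -> str:
    ttc = ttc_seconds(range_m, closing_speed_mps)
    if ttc < 2.0:
        return "brake"   # imminent: emergency braking
    if ttc < 6.0:
        return "warn"    # alert the driver and begin slowing
    return "cruise"

# A stationary truck 100 m ahead at ~30 m/s closing speed leaves
# about 3.3 s: enough to warn and start slowing well before impact.
assert respond(100, 30) == "warn"
assert respond(50, 30) == "brake"
```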