This is nonsense. The human eye has a much, much higher dynamic range than any camera sensor.
It's not that much higher, but sure. The relevant metric is performance on tasks such as finding edges and objects, doing it fast, and doing it reliably in varied conditions. Do you have an example of a scenario where the human eye performs better (than recent FSD, say the last year) thanks to its dynamic range?
I tried to find an example. This one is old and not great, but it shows that the cameras at least did not fail in low/high light:
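For what it's worth, the two sides of this comparison are usually quoted in different units: sensor dynamic range in dB, the eye's in photographic stops. A quick conversion, with rough illustrative figures (my assumptions, not measurements):

Code:
import math

def db_to_stops(db):
    # One stop is a doubling of light, i.e. 20*log10(2) ~= 6.02 dB.
    return db / (20 * math.log10(2))

print(db_to_stops(70))   # ~11.6 stops: ballpark for a standard CMOS sensor
print(db_to_stops(120))  # ~19.9 stops: ballpark for an HDR automotive sensor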
Remember, this discussion started with this statement:
Quote:
I mentioned one example: sunlight. I don't think it's possible to implement self driving with cameras only.
Maybe the human eye is better at some specific task, but humans have terrible attention. If the car can see reasonably well and has good attention, it might still be many times safer than the average human even with slightly worse cameras.
The debate over whether camera-only is possible was a fun one before there was an actual product you could buy, with hundreds of thousands of people trying it and recording videos of it. Sure, the software might not be perfect, but if there were some hardware limitation, it seems it should have been found by now, and speculating about X being the failure case seems a bit silly without a single example to show.
I'm not hyper-focused on sunlight. You asked for an example of a problem for a camera-only system, and I gave one. Subaru EyeSight has problems with sunlight, fog on the glass, and heavy precipitation. All these cases are solved with lidar.
Regarding the human eye talk, I'm sure software makes cameras superior in most ways.
Ok, but do you think camera-only is impossible?
Imo, for the cases you listed it is better to use a frequency that penetrates water better, for example radar. Lidar typically uses near-infrared light, which doesn't penetrate water. There are lidars that use water-penetrating frequencies, but they are usually limited to military applications. See this video: https://youtu.be/O6zhJglz_Eg?t=46
When I drive a Tesla with Autopilot in heavy rain, I feel it drives more safely than I can. In heavy fog I am not sure anyone can drive safely; cameras should perform nearly as badly as humans. Radar helps a lot with object detection, but not really with reading signs etc. I think the robotaxi solution is to not drive in heavy fog.
Lidar, or some sensor other than cameras. I'm sorry, but I haven't watched any videos.
Not heavy fog. Condensation on the lens glass, like dew. Hundreds of taxis operate around the clock in SF. None use a camera-only solution.
I don't think TSLA has any option for consumers other than cameras, as they wouldn't be able to profit. They already require hardware upgrades for this feature.
Tesla used to have sonar and radar and could easily add a lidar if they wanted. Some newer Teslas have a new radar; it's a bit unclear what the long-term strategy is.
The cost of sensors is pretty low compared to the value of a robotaxi, except perhaps a Velodyne or RTK GPS, which you generally only use for reference or for development. The problem with adding more sensors is that they add complexity (permutations, supply chain, datasets), add failure modes (mechanical sensors tend to have issues), and you have to find a way to do sensor fusion, which can complicate things and add false positives and/or false negatives. I did my master's in sensor fusion and lidars and used to be pretty sold on solid-state lidars. I have recently changed my mind seeing how much information neural networks can get from images if you have the right data, and I now believe cameras could end up winning thanks to their data advantage. We will see which technology wins.
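To make the false positive/false negative trade-off concrete, here is a toy sketch (purely illustrative, nothing like a production stack) of the two naive ways to fuse two detectors:

Code:
def fuse_or(camera_hit, lidar_hit):
    # Trust either sensor: a real object seen by only one sensor is kept
    # (fewer false negatives), but spurious detections from either sensor
    # also leak through (more false positives).
    return camera_hit or lidar_hit

def fuse_and(camera_hit, lidar_hit):
    # Require agreement: spurious single-sensor detections get filtered
    # out (fewer false positives), but a real object seen by only one
    # sensor is dropped (more false negatives).
    return camera_hit and lidar_hit

Anything smarter has to weigh the sensors against each other case by case, which is exactly where the complexity and the new failure modes come from.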
Tesla made a deal with Samsung for its new optics contracts.
Makes sense with a Gigafactory and Samsung production plant in Austin.
Unfortunately, a camera-only option will never be viable.
A sticker, the kind used on a children's incentive chart, could easily defeat the whole system.
Rock chips.
Haze forming on lenses from environmental pollutants similar to what happens to headlight lenses.
Tar from fresh asphalt.
Paint spilling from the bed of a truck in front of you.
Insect viscera.
Hail damage.
I could continue, but Elon's probably scraping the web for troubleshooting tips because that's peak Elon. I mean, aren't subscribers paying to beta test his "FSD" product?
These issues would impact other methods of automated driving, sure. But short of every vehicle on the road having sensors that communicate with one another in a hive-mind system, automated driving on a macro scale just isn't going to happen anytime soon. It's going to take full integration of sensors on vehicles, roadways, etc.
The way to deal with these issues is to have the neural network detect failures like these and then require manual intervention and service to fix them. It should also be noted that lidar-based systems suffer from the same issues, as illustrated here:
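As a sketch of what such failure detection could look like, here is a toy heuristic (the thresholds are invented, and a real system would presumably use a learned detector as described above, not hand-tuned rules):

Code:
import cv2
import numpy as np

def lens_looks_blocked(frame_bgr, sharpness_floor=50.0, dark_ceiling=0.6):
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    # Haze or condensation kills fine detail; the variance of the
    # Laplacian is a classic blur score.
    sharpness = cv2.Laplacian(gray, cv2.CV_64F).var()
    # A sticker or a tar blob shows up as a large near-black region.
    dark_fraction = float(np.mean(gray < 20))
    return sharpness < sharpness_floor or dark_fraction > dark_ceiling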
It's unlikely the lidar system is being disabled by the traffic cone. Much more likely they have extremely stringent safety protocols that halt the system when a large enough object is on the car.
As an aside, the *******s in that video are hilarious. "We weren't personally consulted about our opinion on this subject". People take a stand on the dumbest ****.
Quote:
These issues would impact other methods of automated driving, sure. But short of every vehicle on the road having sensors that communicate with one another in a hive-mind system, automated driving on a macro scale just isn't going to happen anytime soon. It's going to take full integration of sensors on vehicles, roadways, etc.
You skipped over that part?
I was already aware of the coning going on in San Fran.
market paying a premium for EV biz growth, energy, and chance of FSD/Teslabot success. i think we are fairly valued right here. 20%+ IRR for $3-4T 2030 outcome
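The arithmetic roughly checks out (the starting market cap below is my assumption, in the neighborhood of Tesla's ~$800B around the time of the tweet):

Code:
cap_2023 = 800e9                    # assumed market cap, USD
irr = 0.20
years = 2030 - 2023
cap_2030 = cap_2023 * (1 + irr) ** years
print(f"${cap_2030 / 1e12:.1f}T")   # ~$2.9T, in line with a $3-4T outcome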
When can investors expect a quarterly dividend payment?
Their cash balance is starting to grow pretty large; however, interest rates are high, so it makes a bit less sense. They will need to build a few rather big factories in the near term: Nevada, the Austin expansion for the 25k/robotaxi/bot, Mexico, the Shanghai expansion, Berlin phase 2, etc. My guess is that it will happen in the second half of next year, but there is a lot of uncertainty in that guess.
Lol Morgan Stanley. They used to have a $0.66 price target; now they have a $400 price target, and it's their top pick after they discovered Dojo. And the crazy thing is that this will probably move the market, even though they should have lost all credibility a long time ago...
they should probably have a $0.66 target right now. would make them look insanely smart in a few years. but looking smart doesn't pay for manhattan offices and year-end bonuses.
that upgrade means they are either going to raise capital very soon, elon is going to sell a huge amount of stock or maybe even both.
+10% seems like a bit of an overreaction. However, imo the market has been sleeping on Dojo, V12, etc. The main benefit of Dojo will not be Dojo-as-a-service, as some believe, but that the speed of development will increase. This will be harder to observe and will take some time to play out, and we will not have anything to compare it with, but that's why I am bullish on Dojo.
Quote:
Originally Posted by BooLoo
that upgrade means they are either going to raise capital very soon, elon is going to sell a huge amount of stock or maybe even both.
They don't need to raise capital, and Elon will "definitely" not sell any stock this year, probably not next year either.
Great article tying Morgan Stanley's rating to its interest in seeing Elon's companies do well. Morgan Stanley owns 28 million shares of TSLA and gave Elon $3.5 billion to purchase Twitter.