Wouldn’t it have been your job to find out before you fill the forum with FUD?
FSD drives about 15 million miles every day, so one would imagine that a clear, recurring problem with motorcycles would have made headlines with more than just five cases?
I’m not saying Tesla’s doors couldn’t be improved, but a collision doesn’t have to be particularly severe for a door to be so crumpled that it can only be opened with firefighters’ shears, regardless of the make of car.
Interesting that in November Tesla sold about as many cars in Norway as in France, Sweden, Denmark, Spain, and the Netherlands combined.
Monthly registrations, a proxy for sales, slumped by 58% in France to 1,593 vehicles sold, by 59% to 1,466 cars in Sweden, by 49% to 534 cars in Denmark, by 44% in the Netherlands to 1,627, and by 9% in Spain to 1,523, official data showed.
But in Norway, they almost trebled to 6,215 cars, beating the country’s annual sales record with one month to spare.
Its overall market share on the continent fell to 1.6% in January-October, from 2.4% in the same period last year.
The “sources” you shared were so out of touch that they didn’t even understand the difference between Autopilot and FSD.
We have data that the latest version of FSD works well
We DO NOT have data that the latest version of FSD doesn’t recognize motorcycles
In fact, even the news article you linked doesn’t claim that the latest FSD fails to recognize motorcycles. Did you make that up yourself? If you bother to check the sources of the linked article, it becomes clear that the accidents in question involved Tesla’s Autopilot, usually in cases that are years old.
We have data on the latest version, both that it works well and that it makes stupid mistakes. You can find these by simply using Google; I’m not going to link videos all over the wall.
Neither of us knows whether it recognizes a motorcycle or not, because I couldn’t find a single video where a motorcycle influences the car’s decisions. As I see it, there’s a 50% chance either way: it recognizes them, or it doesn’t, at least in some conditions. And it needs to recognize them in all situations.
Fortunately, it’s now less than 27 days until the safety driver is removed in Austin and the cars drive around there on their own. Of course, there’s also the possibility that the can gets kicked down the road a bit, or that things start to get messy. Or that a breakthrough has been made and soon cars will be driving around on their own here too.
So I have a strong feeling that I know more people working at Tesla than you do. And the fact that you share a Tesla-sourced X post or some random X post here doesn’t make things true. The real stories don’t all end up on the internet.
Perhaps we will both get smarter about this soon, once permits start coming in across different countries and the systems actually work. There is also the possibility of a complete disaster here.
But you can always sacrifice a few good intentions at the altar
The stock has been quite stable now; people still have faith and hope. Car sales seem to be a bit difficult.
1. Dig up old Autopilot data and claim, based on it, that current FSD cannot work.
2. Add erroneous claims of your own that are not even mentioned in the linked article.
3. When the claims prove to be erroneous, change the subject and resort to emotional rhetoric.
At this point, the argumentation has gotten out of hand.
Lack of evidence is not evidence of a problem.
If FSD genuinely had problems with motorcycles, the internet would be full of videos about them – just like all other FSD bugs.
When nothing is found, it doesn’t mean a 50% chance of a problem. It means there is no evidence of a problem.
Everyone can see the tremendous development in the software, especially within the last couple of years. Many who have followed FSD’s development believe that it is approaching unsupervised capability.
I would like to ask the bears on this forum: what is the technical challenge preventing Tesla from reaching this level? Is it still the lack of lidar, or something else? Preferably with justifications and example videos of situations the car cannot handle and, in your opinion, will not be able to handle in the future on the current hardware.
I do believe that Tesla will eventually get rid of safety drivers and be able to run “unsupervised” robotaxis. But I remain skeptical about whether it will become a profitably scalable business for anyone for a long time to come.
Furthermore, I believe that the journey from an unsupervised robotaxi to a consumer-available unsupervised FSD is still long. That is, to the point where I could buy a Tesla and drive it without having to constantly monitor traffic myself and be ready to take over. If we look at the table (created by a Tesla bull) in the quoted message, green might be enough for a robotaxi, but for the consumer version almost every item needs to be blue (perfect). The journey from nine to ten can be long.
I base this on the fact that a robotaxi operates in a predetermined area where the system has been tested and is known to function with sufficient certainty. If roadworks or other special situations arise that prove challenging for the robot, those road sections can be blocked and rerouted via an easier path. Additionally, in the event of accidents, responsibility naturally lies with the service provider, so as long as the accident risk is lower than that of a human, errors leading to accidents occasionally are acceptable.
FSD intended for consumer use must work everywhere and in almost all exceptional situations. Liability issues are problematic, as Tesla is unlikely to want to take responsibility for all accidents caused by FSD (even if there were fewer of them than accidents caused by humans).
The journey from 99% to 99.9999% can be surprisingly long.
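To put those nines in perspective, here is a back-of-the-envelope sketch. The assumption of one failure-capable "decision" per mile is purely illustrative, not Tesla data:

```python
# Illustrative only: mean distance driven between critical failures
# at each reliability level, assuming one failure-capable decision
# per mile (a made-up simplification, not Tesla data).
for nines in range(2, 7):  # 99% ... 99.9999%
    failure_rate = 10 ** -nines       # assumed per-mile failure probability
    miles_between = 1 / failure_rate  # mean miles between failures
    print(f"{1 - failure_rate:.4%} reliable -> ~{miles_between:,.0f} miles per failure")
```

Each extra nine multiplies the required failure-free distance by ten, which is why the last stretch of the journey feels so slow.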
I guess it will start working with HW5. Just like it will for others in a few years. What will be the liability for all HW4 owners if it is decided that new cameras and a more powerful computer are needed?
Financial liabilities are easy to handle with insurance. The insurance premium doesn’t even need to increase if FSD definitely causes fewer “oopsie” fender benders than a human. However, if an FSD car runs over a pedestrian on a crosswalk, the question of liability becomes problematic.
I really liked the way my friend tracked and visualized the progress. Clearly, progress has been made. But how much further is there to go?
I added an important observation related to this to one image:
Perhaps that progress shouldn’t be drawn linearly. I added an arc in Paint following the points, and what was revealed? Something that has been brought up many times here as well, namely that the difficulty of progression increases from 99.999% → 99.9999% etc.
The actual percentages here are speculative, but exponential progress is possible. This is helped by the fact that FSD data is constantly increasing, and Tesla can also train edge cases with simulation – simulation alone is not a Tesla moat; Xiaomi also has its own SimScale program for autonomy. Every FSD diss on YouTube, robotaxi crash, and other reporting is certainly reviewed in simulation.
FSD data and sales also benefit from the fact that US/Canada HW4 cars received a 40-day free trial as a Christmas gift. When 1.5 million more cars, at a snap of the fingers, start feeding their FSD miles and interventions into Tesla’s machine learning, what other car manufacturer has a similar opportunity to add one more nine after the decimal point?
For now, it seems that if Tesla removes safety drivers before the end of the year, it will happen on a very small scale. I would imagine that if the data looked good, Tesla would have expanded more significantly in Austin. It’s difficult to draw conclusions about how close self-driving is based on the data available so far. FSD has certainly developed, but there is still no evidence whether lidar-less self-driving (without casualties when scaled up widely) is a month or ten years away, or if it’s even possible.
Exponential progression is a very rare phenomenon in the real world. It is mainly found in explosives or rabbit populations on a deserted island, and the result is not a stable situation. Instead, the logarithmic curve presented by @Construct-Destruct is much more common. Reaching the limit (perfection) in it requires infinite time. On the other hand, FSD does not need to be perfect; the key question is when it is good enough. We can only speculate about that for now.
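A toy model of that saturating curve, with entirely made-up parameters (not Tesla data): suppose each doubling of training data fixes a fixed fraction of the remaining failure modes. Reliability then climbs fast at first and crawls toward, but never reaches, 100%:

```python
# Toy saturating-progress model (illustrative assumptions only):
# each doubling of data eliminates half of the remaining failure modes.
def reliability(doublings, fraction_fixed=0.5):
    remaining = (1 - fraction_fixed) ** doublings  # failures still unfixed
    return 1 - remaining

for d in (1, 5, 10, 20):
    print(f"{d:2d} doublings of data -> {reliability(d):.6%} reliable")
```

Under this assumption, each additional nine of reliability costs roughly 3.3 more doublings, i.e. about 10x more data, and perfection is never reached with finite data.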
I don’t know which version was running, but since we discussed whether it notices motorcycles: at least it doesn’t notice everything that’s on the road.
It happened back in September, so it was not running v14 but 13.9.x.
Cameras always have their limitations, of course, although here the conditions were ideal, and camera technology in Teslas hasn’t changed for a couple of years. The upcoming HW5 hardware is claimed to get 20 MP cameras in place of the current HW4’s 5 MP ones, which will probably help.
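As a rough sanity check on why more pixels help, here is the basic angular-size arithmetic. The 50-degree vertical field of view and 1.2 m motorcycle height are assumptions for illustration, not Tesla specs:

```python
import math

# How many pixel rows a target covers, given sensor resolution and
# vertical field of view (all figures here are illustrative assumptions).
def pixels_on_target(target_height_m, distance_m, sensor_rows, vfov_deg):
    angle_deg = math.degrees(2 * math.atan(target_height_m / (2 * distance_m)))
    return sensor_rows * angle_deg / vfov_deg

# A ~1.2 m tall motorcycle at 150 m, assumed 50-degree vertical FOV:
for rows in (1080, 2160):  # roughly 5 MP-class vs 20 MP-class sensors
    px = pixels_on_target(1.2, 150, rows, 50)
    print(f"{rows} rows -> ~{px:.1f} px on the motorcycle")
```

Quadrupling the pixel count doubles the linear resolution, so the same object covers the same number of pixels at roughly twice the distance.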
Meanwhile, elsewhere: Tesla’s largest shareholder, Elon Musk, has clearly become active in politics again. Most recently this past weekend, he incited European nations to revolt against the EU on his platform X.
Russia’s former prime minister, Dmitry Medvedev, known for his aggressive comments towards the West, expressed support for Musk’s message.
A large number of Europeans are bewildered by these political overtures from Musk.
How long will Europeans continue to buy anything that benefits Elon Musk – such as Teslas?