Hacker News | new | past | comments | ask | show | jobs | submit | gibolt's comments

Having been through SOC2, it doesn't mean a company is rock solid, but it definitely makes the company button up loose ends, if taken seriously.

Everyone has access to the same models. Even the best internal builds are only a month away from public access.

The ones a year from now from all companies will likely be better than the best today.


Hyperloop is the only thing you listed that is accurate, although it was only a whitepaper + competition. It was open for others to pursue.

Tesla easily has the best vehicle software + OTA and has since the S in 2012. It still feels better than most new vehicles.

You can buy a Tesla (including Cybertruck) today that will do 95+% of drives with 0 intervention. It may not be 100% autonomous yet, but there isn't anything obvious limiting the last step.

The robots exist but are still being developed. Within 5 years, it is hard to imagine them not becoming super valuable within factory settings.



If you think Tesla is bad, you should look into GM or Ford.

There have been many accusations of sudden acceleration, but except for the Cybertruck's pedal-cover slide, there has never been a proven case of a Tesla autonomously accelerating into a crash. These accusations keep coming because people want to shift the blame away from themselves, and the automaker seems like an easy target.


If you think Starship is behind, look at the 'competition'.

Learnings per flight may not be maximal, but the risk on each flight is measured enough that regulators will approve it (rather than restrict future launches) and other countries won't be impacted by a failure.


It isn't monocular though. A Tesla has 2 front-facing cameras, narrow and wide-angle. Beyond that, it is only neural nets at this point, so depth estimation isn't directly used; it is likely part of the neural net, but only the useful distilled elements.
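For contrast with the end-to-end neural approach described above, the classical way to get depth from two cameras is triangulation over disparity (Z = f * B / d). This is only an illustration of that textbook method, not how Tesla's stack works, and the numbers are made up:

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Classical stereo depth: a point shifted by `disparity_px` pixels between
    two cameras separated by `baseline_m` meters lies at Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# Example with invented values: 1000 px focal length, 0.2 m baseline,
# 10 px disparity -> the point is 20 m away.
print(stereo_depth(1000.0, 0.2, 10.0))  # 20.0
```

A learned network can instead produce depth-like cues directly from image features, which is why an explicit disparity computation like this need not appear anywhere in the pipeline.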


I never said it was. I was using it as a lower bound for what was possible.


Lidar fails worse than cameras in nearly all those conditions. There are plenty of videos of Tesla's vision-only approach seeing obstacles far before a human possibly could in all those conditions on real customer cars. Many are on the old hardware with far worse cameras.


Interesting, got any links? Sounds completely unbelievable, eyes are far superior to the shitty cameras Tesla has on their cars.


There's a misconception that what people see and what the camera sees is similar. Not true at all. One day when it's raining or foggy, have someone record the driving through the windshield. You'll be very surprised. Even what the camera displays on the screen isn't what it's actually "seeing".


Yeah... not holding my breath for links to superman Tesla cameras performing better than eyes.


This is the key. The known instances I've seen are very minor taps / fender benders. Not great, but not fatal accidents.


Cost to manufacture is likely around $25k


The project looks super cool, but the idea of wearing a sharp screen that close to my eye on a bike could be one reason it didn't sell.


I wouldn't really have any reservations with the glasses insert in; I think it would protect well enough.


I found this yesterday: https://minimis.life/. It also has cycling in mind, based on their description.


It is, for the purpose of this test. Don't want it coming back down on land somewhere unexpected :)

