If a car going 30 km/h runs someone over because it relies on cameras and a bug splattered on the camera sent it haywire, that is not acceptable, even if those cars crash “less than humans”.
Self-driving needs to be heavily regulated by law, with a mandated bare minimum of sensors: radar, lidar, etc. Camera-only self-driving is beyond stupid. Cameras can’t see in snow or darkness. Anyone who has a phone knows how fucky a camera can get under certain lighting conditions.
No one but Tesla is doing camera-only “self-driving”, and they are only doing it to cut costs. Their older cars had more sensors than their newer ones. But Musk is living in his Bioshock uber-capitalist dream. Who cares if a few people die in the process of developing vision-based self-driving?
https://www.youtube.com/watch?v=Gm2x6CVIXiE
What are you, some kind of lidar shill? Camera-only should obviously be the endgame goal for all robots. Also, this article is not even about camera-only.
I’ve heard Elon Musk (or was it Karpathy?) talking about how cameras should be sufficient for all scenarios because humans drive on vision alone, but that’s poor reasoning IMO. Cars are not humans, so there’s no reason to confine them to the same limitations. If we want them to be safer and more capable than human drivers, one way to do that is to give them more information.
We built things like lidar and ultrasound precisely because we wanted something better than our eyes at depth and visibility.
Obviously?
Why wouldn’t we want other types of sensors…?
Because that’s expensive and can be done with a camera. And once you figure the camera stuff out, you’re gucci: now you can do all kinds of shit without needing a lidar on every single robot.
My eyes are decent, but if I had a sixth sense that gave me accurate 360° 3D spatial awareness regardless of visibility, I would probably not turn it off just to use my eyes. I’d use both.
Idiot
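To make the “use both” point above concrete, here is a toy sketch of inverse-variance sensor fusion. Everything in it (the `RangeEstimate` type, the `fuse` helper, and the example numbers) is my own illustrative assumption, not any vendor’s actual pipeline; it just shows that when one sensor’s estimate gets noisy (camera at night), a fused estimate naturally leans on the other sensor instead of going haywire.

```python
from dataclasses import dataclass


@dataclass
class RangeEstimate:
    distance_m: float   # estimated distance to the obstacle, in meters
    variance: float     # how uncertain the sensor is about that distance


def fuse(camera: RangeEstimate, lidar: RangeEstimate) -> RangeEstimate:
    """Combine two independent estimates, trusting the less noisy one more."""
    w_cam = 1.0 / camera.variance
    w_lid = 1.0 / lidar.variance
    fused_distance = (w_cam * camera.distance_m + w_lid * lidar.distance_m) / (w_cam + w_lid)
    fused_variance = 1.0 / (w_cam + w_lid)
    return RangeEstimate(fused_distance, fused_variance)


if __name__ == "__main__":
    # At night the camera's depth estimate degrades (large variance),
    # so the fused result ends up close to the lidar reading.
    camera_at_night = RangeEstimate(distance_m=22.0, variance=25.0)
    lidar_at_night = RangeEstimate(distance_m=14.8, variance=0.04)
    print(fuse(camera_at_night, lidar_at_night))  # ~14.8 m, dominated by lidar
```

Real stacks fuse far more than two scalar ranges, but the principle is the same: extra independent sensors don’t replace the camera, they bound how badly a single blinded or confused sensor can steer the estimate.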