As I was saying [0], you need at least a fail-safe mechanism to override an autonomous system in case it fails, especially if your life is at risk.
The same is true for coding agents. It doesn't mean you no longer need to look at the code, any more than autopilot means you no longer need to watch the road just because the AI will do it all for you.
So let this glitch be a lesson for what will come later.
[0] https://news.ycombinator.com/item?id=48110144