The only person in my company using AI to code writes stuff with tons of memory leaks that require two experienced programmers to fix. (To be fair, I don’t think he included “don’t have memory leaks” in the prompt.)
No, they don’t.
Last I saw, Exploding Heads gave up on it because it was too difficult to use and had too many crypto bros on it. They’re trying to move to Gab now.
Also me when I’m forced to write documentation for a Python function
def delete_first_of_list(the_list: list):
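In fairness, the name really is the whole documentation. A minimal runnable sketch (the body and usage are my own guess, since the meme only shows the signature):

```python
def delete_first_of_list(the_list: list) -> None:
    """Delete the first element of the_list in place."""
    del the_list[0]


items = ["a", "b", "c"]
delete_first_of_list(items)
print(items)  # ['b', 'c']
```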
Capitalism doesn’t require rational behavior; all it requires is a system that allows some people to make passive income by exploiting other people, based on the idea of private property. CEOs, for example, don’t care about what’s good for the business; they care about extracting as much value from the business and the workers as they can get away with before moving on to the next victim.
Are you able to share what kinds of applications you work on and what languages you write in? I’m still trying to grasp why LLM programming assistants seem popular despite the flaws I see in them, so I want to understand the cases where they do work.
For example, my colleague was writing CUDA code to simulate optical physics, so it’s possible that the LLM’s failure was due in part to the niche application and a language that is unforgiving of deviations from the one correct way of writing things.