@mathieui @rasterweb True, but as LLMs get better at finding bugs, they also get better at identifying bugs in their own code, since more of the current harnesses automate security reviews of the code they write.
wojtekpow@mastodon.social
-
@rasterweb I don’t mean apps that add AI features in the UX; I mean apps coded with the help of AI. Even the Linux kernel accepts AI code. I just don’t think you’ll be able to say that you don’t use any AI-written code in your life. To me, AI in code writing is simply another abstraction shift. We moved from machine language to Fortran and COBOL, then to C and C++, then to Objective-C, Swift, and scripting languages like Python. AI in my daily coding life feels like another step in this chain.
-
@rasterweb Fair point, I’ll make a correction.
That’s separate from the article, though, which talks about small bespoke apps. Everyone ends up at least using AI-assisted software in their daily life. Pretty much all corporate software at this point has some AI-written code in it, and that percentage will only increase with time. -
@mergesort LLMs are the future of software engineering.
Personally, I really enjoy how LLMs allow me to bring my ideas to life, using capable models like Opus 4.6. I am able to build solutions for real-life problems I encounter. As much as I understand the naysayers, I truly don’t think there’s any way back. In some ways, it’s like moving from writing all software in assembly to suddenly having access to high-level object-oriented languages.