I'm a professional developer and have tested AI tools extensively over the last few years as they've developed. The economic implications of the advancements made over the last few months are simply impossible to ignore. The tools aren't perfect, and you certainly need to structure their use around their strengths and weaknesses, but assigned to the right tasks they can be 10% or less of the cost with better results. I've yet to have a project where I've used them and they didn't need an experienced engineer to jump in and research an obscure or complex bug, reject a dumb architectural choice, or verify that stuff actually works (they like reporting success when they shouldn't), but again, the economics: the dev can be doing other stuff 90% of the time.
Don't get me wrong, on the current trajectory this tech would probably lead to deeply terrible socioeconomic outcomes, probably techno-neofeudalism, but for an individual developer putting food on the table I don't see it as much of a choice. It's like the industrial revolution again, but for cognitive work.
I keep hearing stuff like this, but I haven't found a good use or workflow for AI (other than occasional chatbot sessions). Regular autocomplete is more accurate (no hallucinations) and faster than AI suggestions (especially once you account for constantly having to review the suggestions for correctness). I guess stuff like Cursor is OK at making one-off tools on very small codebases, but it hits a brick wall when the codebase gets too big. Then you're left with a bunch of unmaintainable code you're not very familiar with and that you'd have to spend a lot of time fixing yourself. Dunno if I'm doing something wrong or what.
I guess what I'm saying is that using AI can speed you up to a point, but the project accumulates massive amounts of technical debt, and once you take all the refactoring and debugging time into account, it ends up taking longer to produce a buggier project. At least, in my experience.
That's perfect for the higher-ups. They don't care if what you release has bugs as long as you work on them when they pop up; they consider that part of your job. They want a result quickly and will accept 85% if it moves the needle forward.
These people don't care about technical debt, and they don't care about exploits until one happens to them; then it's just a question of how bad it is and how long it takes to fix. No one cares about doxxes anymore, it's just the cost of doing business. Like recalls.
This is perfect for CEOs and billionaires because, from a 35,000-foot view, they don't care how something is done, they just want it now. AI is a nightmare of exploits that haven't even begun to be discovered yet. Things that will be easily exploitable, especially by other algorithms.
Coders are just as affected by supply and demand, and the demand is for AI products.
Hmm, a lot of my career was spent doing embedded programming, where mistakes in production are very costly and software/hardware has to be released with basically zero bugs, so that may be where the disconnect is. I still think bugs and technical debt are costly elsewhere too if the product is going to have a long lifecycle, but executives are just dumb.
There's been an unbalancing of top-down power, especially in venture capital, and we will pay for these decisions down the line.