My honest take on AI (and how I actually use it)
If you have been on the internet lately, you have definitely seen the hype. Everyone and their dog seems to be screaming about how AI is going to replace programmers or how you can build a massive SaaS business in five minutes without writing a single line of code.
It is absolute rubbish.
Don't get me wrong, the tech is impressive, but people are using it in the most dangerous way possible. I have spent a lot of time messing around with these tools, and I have a pretty specific way I handle them now.
The "Last Resort" rule
I do use AI, but I have a strict system. I am not opening ChatGPT the second I start a new file.
I only use it when I am completely stuck on something and physically cannot get it to work. I am talking about those moments where I have been staring at VS Code for three hours, the error logs are speaking a different language, and I am about to lose my mind.
That is when I use it. I use it to troubleshoot specific issues. I paste the error in and ask it what is going wrong.
But here is the massive catch. If I ask it for a fix, I never just copy and paste the code. I ask it to explain exactly what that code does and why it fixes the issue. If I can't understand the solution, I won't use it. The goal is to learn from it so I don't have to ask the computer next time. It has to be a learning tool, not a crutch.
Why building entire projects with it is a bad idea
This is where I see people messing up big time.
If you try to get AI to build an entire project for you, you are asking for trouble. It is fine for snippets, but for a whole architecture? Expect bugs. Expect weird logical issues that are nearly impossible to trace.
The problem is that if you let it build the whole thing, you don't know how the system works. When it inevitably breaks (and it will), you won't be able to fix it because you didn't write it. You become completely dependent on the AI. You end up in this loop where you are asking the AI to fix a bug that the AI created, and it just hallucinates more bad code.
The security nightmare
There is also the technical side that people ignore. These models have a training cutoff, so they don't always have the latest info.
If you ask it to set up a Node.js backend, it might tell you to use a package that hasn't been updated since 2019. You could be installing libraries with critical security vulnerabilities without even realising it because the AI is confident that it is the right tool for the job.
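You don't have to take the AI's word for it, though. If it suggests a dependency, npm has built-in commands to sanity-check it yourself. A minimal sketch, assuming a Node.js project with a `package.json` in the current directory (`left-pad` here is just a placeholder package name, not a recommendation):

```shell
# Scan installed dependencies for known security vulnerabilities
npm audit

# List installed packages that are behind their latest published versions
npm outdated

# Check when a specific package was last published before installing it
npm view left-pad time.modified
```

Thirty seconds of this tells you whether that "perfect" library the AI is so confident about has actually been touched since 2019.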
Plus, you have to know what you are doing to even get a good result. If you don't know the technical terms or how to direct the AI, it isn't going to give you what you want. You need to know the subject matter to spot when it is talking nonsense.
Use it to get unstuck, but don't let it do the driving.
– Blake
February 14, 2026