The lesson is that if you’re too big to fail, no laws apply to you and there is unlimited money to be made.
It has been learned very well.
The brazen violation of intellectual property was a precondition of making this technology useful. Taking the risk of breaking the law at this unprecedented scale was an informed decision made based on this very lesson.
There is also the replacement of craftsmanship with mass-produced, lower-quality output that workers with less training and only partial understanding can produce.
Processes are more efficient, machines are faster, workers are easily replaceable. The quality and complexity of the product is limited by these requirements.
> If we generate so much code using AI that no one is really looking or reading the code anymore, just verifying end functionality, we can really just skip all that and go straight to assembler, no?
We could also just autogenerate the content of our websites, emails, contracts.
And we do, resulting in mountains of slop, varying from soulless to wildly incorrect.
Code is a precise way to describe intent. Using LLMs to make up some of that intent means the author no longer knows the precise functionality of the resulting code.
The companies selling LLM services present this as magic: software that will somehow do what the author wants, without even the author themselves knowing or defining it.
In reality it is simply ignorance and lies.
Sorry, we can’t wishfully think good working software into existence.