A vector for malware, enabled by generative AI (LLMs):
If you ask an LLM to write code for you, the resulting code may include the names of software packages that don’t exist.
In theory, that might not be a big deal. If a human tries to run that code, they’ll find that the packages don’t exist.
But a security researcher has now found that LLMs tend to hallucinate the same nonexistent package names over and over. To demonstrate the risk, he created and published a real software package under one of those recurring names.
And that package has now been downloaded over 15,000 times.
The real package didn’t contain malware, but the researcher’s point is that it could have.
So if you’re a software developer using code written by an LLM, check that all of the dependencies it tells you to rely on actually exist and are legitimate.
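As a rough sanity check, a script along these lines can flag hallucinated names by asking PyPI's public JSON API whether each package exists. This is a minimal sketch, not a vetted tool: the check_package helper and the one-release heuristic are illustrative assumptions, and the script only covers Python packages on PyPI.

```python
import json
import sys
import urllib.error
import urllib.request

# PyPI's public JSON API; returns 404 for packages that don't exist.
PYPI_URL = "https://pypi.org/pypi/{name}/json"

def check_package(name: str) -> str:
    """Return a short verdict for a single package name."""
    try:
        with urllib.request.urlopen(PYPI_URL.format(name=name), timeout=10) as resp:
            data = json.load(resp)
    except urllib.error.HTTPError as e:
        if e.code == 404:
            return "NOT FOUND on PyPI -- possibly hallucinated"
        return f"HTTP error {e.code}"
    # Count versions that actually have uploaded files. A brand-new
    # package with a single release deserves extra scrutiny, since
    # that's exactly what a squatted hallucinated name would look like.
    releases = [v for v, files in data.get("releases", {}).items() if files]
    if len(releases) <= 1:
        return "exists, but only one release -- inspect before trusting"
    return f"exists ({len(releases)} releases)"

if __name__ == "__main__":
    # Usage: python check_deps.py requests numpy some-suspect-package
    for pkg in sys.argv[1:]:
        print(f"{pkg}: {check_package(pkg)}")
```

Note that mere existence proves nothing: the researcher's bait package existed and was downloaded thousands of times. A name that resolves but is very new or rarely used still warrants a look at its source before you install it.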