Marketers are promoting AI-assisted developer tools as an essential workhole for today’s software engineers. For example, developer platform GitLab claims that the duo’s chatbots can “generate a to-do list immediately” that eliminates the burden of “walking through the water through weeks of commitment.” What these companies don’t say is that these tools are easily fooled by temperament, if not by default, by malicious actors to carry out hostile actions towards their users.
On Thursday, researchers at security firm Regain demonstrated an attack that led the duo to insert malicious code into scripts they were instructed to write. Attacks can also leak private code and sensitive issue data, including details about zero-day vulnerabilities. All you need to do is instruct the chatbot to interact with merge requests from external sources or similar content.
AI Assistant Double-edged Blade
Of course, the mechanism that triggers an attack is a rapid injection. Among the most common forms of chatbot exploits, rapid injection is embedded in content. The chatbot will be asked to interact with emails to answer, calendars to consult with, and web pages to summarise. Large language model-based assistants are keen to follow instructions to receive orders from almost anywhere, including sources that malicious actors can control.
The attacks targeting the duo came from a variety of resources commonly used by developers. Examples include merge requests, commits, bug descriptions and comments, and source code. Researchers have demonstrated how instructions embedded in these sources can lead to misleading duoes.
“This vulnerability highlights the double-edged nature of AI assistants like the GitLab duo. When deeply integrated into the development workflow, it inherits risk, not just the context, but also the risk.” “By incorporating hidden instructions in seemingly harmless project content, we were able to manipulate the duo’s behavior, remove private source code, and demonstrate how AI responses can be exploited for unintended, harmful outcomes.”