Injecting instructions into LLMs
Add some nicely formatted instructions that aren’t visible to humans to your webpage
Wait for LLMs to stumble on it
Profit
What if the page was a legit looking blog post about some technical topic and the LLM a coding agent running on a developer’s machine with access to the local environment?
What if this was a product’s website, and the user was asking to compare it to some competitors?
Given that this is something inherent to how LLMs work, I’m really curious to see how, if ever, we’ll solve this issue in the coming years.
Further reading