Unfortunately, as nearly everyone knows, every LLM is susceptible to prompt injection.
Some people predict that prompt injection will always be a problem for LLMs. And if I can tell your LLM to do what I want it to do, suddenly your exposed 'search' API endpoint is incredibly valuable to me.
Which is why I propose that the mere existence of a public facing LLM on your site is incredibly dangerous [to you and your site].