LLM Guard is a security toolkit for LLM interactions, protecting against harmful language and data leakage.
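To illustrate the scanner-pipeline idea such a toolkit embodies, here is a minimal, self-contained sketch. All class and function names below are hypothetical stand-ins, not LLM Guard's actual API; consult the LLM Guard documentation for real usage.

```python
# Minimal sketch of an input-scanner pipeline: screen a prompt before it
# reaches the LLM. Names are hypothetical, not LLM Guard's real API.
import re

def toxicity_scanner(prompt: str) -> tuple[str, bool]:
    """Flag prompts containing words from a tiny blocklist (illustrative only)."""
    blocklist = {"attack", "exploit"}
    is_valid = not any(word in prompt.lower() for word in blocklist)
    return prompt, is_valid

def anonymize_scanner(prompt: str) -> tuple[str, bool]:
    """Redact email addresses to reduce data leakage (illustrative only)."""
    sanitized = re.sub(r"[\w.+-]+@[\w-]+\.[\w.]+", "[REDACTED_EMAIL]", prompt)
    return sanitized, True

def scan_prompt(scanners, prompt: str) -> tuple[str, bool]:
    """Run each scanner in turn; the prompt is valid only if all scanners pass."""
    valid = True
    for scanner in scanners:
        prompt, ok = scanner(prompt)
        valid = valid and ok
    return prompt, valid

sanitized, ok = scan_prompt(
    [toxicity_scanner, anonymize_scanner],
    "Contact me at alice@example.com",
)
```

In this sketch each scanner both sanitizes the text and votes on validity, so a caller can block invalid prompts while still forwarding the redacted version when appropriate.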