Prompt Injection Tutorial

Protect against prompt injections with TrustAI Guard

Who needs TrustAI Guard?

Today, I will show you a common security risk of GenAI and the corresponding mitigation measures. If you are in the following roles, you may be interested in this document,

  1. GenAI Apps Developers

  2. GenAI App Compliance Regulators

  3. GenAI Capability Providers

What's the ChatBot GenAI App

Next, let's briefly explain what is ChatBot GenAI Apps. The ChatBot GenAI App is a software application or platform that leverages generative AI (GenAI) to create chatbots capable of understanding and responding to user input in a natural, conversational manner.

For example, a company like the New York Times publishes a large number of news and receives a lot of user visits every day. Users may want to be able to directly communicate with each news article through an AI QA assistant, rather than reading line by line on a small mobile phone screen, or even have the QA assistant summarize all the news of the day at once, so that they can quickly understand the latest news trends while waiting for the subway.

Online Demo

This notebook illustrates how easy it is to exploit LLM vulnerabilities via prompt injection and how TrustAI Guard can protect against them with one line of code.

Mateo gives a quick overview of the tutorial in the following video:

Last updated