SSW Foursquare

GPT - Do you use the system prompt?

Last updated by Jack Reimers [SSW] 8 months ago.See history

When you're building a custom AI application using a GPT API you'll probably want the model to respond in a way that fits your application or company. You can achieve this using the system prompt.

What is the system prompt?

Requests to and from a GPT API generally have three types of messages, also known as roles or prompts.

User messages are any messages that your application has sent to the model.

Assistant messages are any messages that the model has sent back to your application.

The system prompt is sent with every request to the API and instructs the model how it should respond to each request. When we don't set a system prompt the user can tell the model to act however they would like it to:

without system prompt
Figure: GPT's responses without a system prompt

with system prompt
Figure: Responses with a system prompt

Depending on the model you're using, you may need to be more firm with your system prompt for the model to listen. Test your prompt using OpenAI's Playground before deploying.

For more information on system prompts, see OpenAI's documentation, or use their playground to start testing your own!

We open source. Powered by GitHub