Prompt Node

Prompt Node Screenshot

Overview

The Prompt Node is used to create a chat message, which is a string of text with an attached "type" indicating who sent the message (System, User, Assistant, or Function) and, optionally, an attached "name".

The Prompt Node also provides the same interpolation capabilities as a Text Node, allowing you to dynamically insert values into the message.

This node can also compute a token count for the generated chat message, which is useful for tasks such as switching which LLM is used based on the size of a message.
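The model-switching pattern described above can be sketched as follows. This is an illustrative heuristic (roughly 4 characters per token for English text), not Rivet's actual tokenizer, and the model names are placeholders:

```typescript
// Rough token estimate (~4 characters per token for English text).
// A heuristic stand-in for a real tokenizer; actual counts will differ.
function estimateTokens(message: string): number {
  return Math.ceil(message.length / 4);
}

// Pick a model based on the estimated size of the prompt message.
// The threshold and model names are hypothetical.
function chooseModel(message: string): string {
  return estimateTokens(message) > 4000
    ? "large-context-model"
    : "standard-model";
}
```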

A useful pattern is to leave the prompt text as the default {{input}}, which converts any incoming text into a chat message.
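Conceptually, the node's output can be pictured as a small object, as in the sketch below. The type and field names are illustrative, not Rivet's internal types:

```typescript
// Hypothetical sketch of a chat message as the Prompt Node produces it.
type ChatMessageType = "system" | "user" | "assistant" | "function";

interface ChatMessage {
  type: ChatMessageType; // who "sent" the message
  message: string;       // the text content
  name?: string;         // optional attached name
}

// The default {{input}} pattern: wrap any plain text in a chat message.
function toUserMessage(text: string): ChatMessage {
  return { type: "user", message: text };
}

const wrapped = toUserMessage("Summarize this document.");
```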

Inputs

| Title | Data Type | Description | Default Value | Notes |
| --- | --- | --- | --- | --- |
| Function Call | object | An optional input that can be used to attach a function call to the chat message. | (empty) | This input is only available if Enable Function Call is enabled. |
| Type | string | The type of the chat message. | (empty) | This input is only available if Use Type Input is enabled. The input will be coerced into a string if it is not a string. |
| Name | string | The name to attach to the chat message. | (empty) | This input is only available if Use Name Input is enabled. The input will be coerced into a string if it is not a string. |
| (custom names) | string | The values to be interpolated into the prompt text. The names of these inputs are dynamically generated based on the prompt text. | (empty) | The input will be coerced into a string if it is not a string. Each input creates a corresponding input port on the node. |

Example 1: Generate a chat message with interpolation

  1. Create a Prompt Node.
  2. Set the Type to user.
  3. Set the Prompt Text to Hello, {{name}}!.
  4. Create a Text Node and set the text to John Doe.
  5. Connect the Text Node to the name input of the Prompt Node.
  6. Run the graph. The Output of the Prompt Node should be a chat message with the type user and the message Hello, John Doe!.
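The interpolation performed in this example can be sketched as a simple template substitution. This is an illustrative implementation, not Rivet's actual code:

```typescript
// Sketch of {{placeholder}} interpolation as used in Example 1.
// Replaces each {{name}} with the matching input value; placeholders
// without a matching value are left intact.
function interpolate(
  template: string,
  values: Record<string, string>
): string {
  return template.replace(
    /\{\{(\w+)\}\}/g,
    (match: string, key: string) => values[key] ?? match
  );
}

const message = interpolate("Hello, {{name}}!", { name: "John Doe" });
```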

Prompt Node Example 1

Example 2: Convert an LLM response into an Assistant message

  1. Create a Prompt Node. Leave the content as the default {{input}}. Set the Type to assistant.
  2. Create a Chat Node and connect its Output to the input of the Prompt Node.
  3. Give the LLM a prompt text and run the graph. You should see the LLM response as an Assistant message in the Prompt Node.
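The steps above amount to passing the LLM's response text through the default {{input}} prompt text and tagging it as an assistant message. A minimal sketch, with illustrative type names:

```typescript
// Sketch of Example 2: wrapping an LLM response in an assistant message.
interface ChatMessage {
  type: "system" | "user" | "assistant" | "function";
  message: string;
}

function toAssistantMessage(llmResponse: string): ChatMessage {
  // The default prompt text {{input}} passes the text through unchanged.
  return { type: "assistant", message: llmResponse };
}

const out = toAssistantMessage("The capital of France is Paris.");
```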

Prompt Node Example 2

Error Handling

The Prompt Node will error if the Prompt Text is not provided or if the Type is not one of the allowed types (system, user, assistant, function).
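The validation described above can be sketched like this. Function and error-message names are illustrative, not Rivet's internals:

```typescript
// Sketch of the Prompt Node's input validation.
const ALLOWED_TYPES = ["system", "user", "assistant", "function"] as const;

function validatePromptNode(
  promptText: string | undefined,
  type: string
): void {
  if (!promptText) {
    throw new Error("Prompt Text is required");
  }
  if (!(ALLOWED_TYPES as readonly string[]).includes(type)) {
    throw new Error(`Type must be one of: ${ALLOWED_TYPES.join(", ")}`);
  }
}
```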

FAQ

Q: Can I use the Prompt Node to generate a chat message with a function call?

A: Yes, you can use the Function Call input to attach a function call to the chat message. This is useful for simulating function calls that the LLM has executed in the past. See the GPT Function node for more information.
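A chat message with an attached function call might look like the sketch below. The field shapes follow the common OpenAI-style convention (name plus JSON-encoded arguments); they are an assumption here, not Rivet's documented schema:

```typescript
// Hypothetical shape of an assistant message replaying a past function call.
interface FunctionCall {
  name: string;
  arguments: string; // JSON-encoded arguments, as in the OpenAI-style API
}

interface AssistantMessage {
  type: "assistant";
  message: string;
  function_call?: FunctionCall;
}

const replayed: AssistantMessage = {
  type: "assistant",
  message: "",
  function_call: {
    name: "get_weather",
    arguments: JSON.stringify({ city: "Paris" }),
  },
};
```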

See Also