May 1, 2023 · 5 min read
Integrating OpenAI's ChatGPT with Laravel can be an exciting venture, but it often requires the API to return structured data that fits seamlessly into your application's flow. This tutorial walks you through crafting Prompt Templates and Output Parsers so that you receive responses in the precise format you require. We'll also cover parsing the response and demonstrate how to test your implementation with Pest PHP.
Before we start: a huge thanks to the folks over at Langchain; this tutorial took heavy inspiration from their JS library.
1. Initial Setup
2. Creating Prompt Templates
3. Creating Output Parsers
4. Parsing the ChatGPT Response
5. Testing with Pest
Before we begin, you need to install the OpenAI Client and Guzzle HTTP by running the following commands:
composer require openai-php/client guzzlehttp/guzzle
In your config/services.php, add the OpenAI API key:
<?php

return [

    /*
    |--------------------------------------------------------------------------
    | Third Party Services
    |--------------------------------------------------------------------------
    |
    | This file is for storing the credentials for third party services such
    | as Mailgun, Postmark, AWS and more. This file provides the de facto
    | location for this type of information, allowing packages to have
    | a conventional file to locate the various service credentials.
    |
    */

    'mailgun' => [
        'domain' => env('MAILGUN_DOMAIN'),
        'secret' => env('MAILGUN_SECRET'),
        'endpoint' => env('MAILGUN_ENDPOINT', 'api.mailgun.net'),
        'scheme' => 'https',
    ],

    'postmark' => [
        'token' => env('POSTMARK_TOKEN'),
    ],

    'ses' => [
        'key' => env('AWS_ACCESS_KEY_ID'),
        'secret' => env('AWS_SECRET_ACCESS_KEY'),
        'region' => env('AWS_DEFAULT_REGION', 'us-east-1'),
    ],

    // OpenAI Key here:
    'open_ai' => env('OPEN_AI_KEY'),

];
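The config above reads the key from your environment, so don't forget to add it to your .env file as well. The variable name matches the env() call above; the value below is just a placeholder:

# .env — placeholder value, use your own OpenAI API key
OPEN_AI_KEY=sk-your-openai-api-key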
Prompt Templates help encapsulate the logic for interacting with the OpenAI Client. To create Prompt Template classes, set up the following directory structure: app/Prompts/, and create a PHP class called PromptTemplate.php inside it. Include the necessary methods as shown below.
<?php

namespace App\Prompts;

use Illuminate\Support\Str;
use Stringable;

class PromptTemplate
{
    public function __construct(
        public string $template
    ) {
    }

    public static function create(string $template): PromptTemplate
    {
        return new self($template);
    }

    public function format(array $inputVariables): PromptTemplate
    {
        $this->template = Str::swap($inputVariables, $this->template);

        return $this;
    }

    public function outputParser(string|Stringable $outputParser): PromptTemplate
    {
        // Append the format instructions with a separating space, then squish.
        $this->template = Str::squish($this->template . ' ' . $outputParser);

        return $this;
    }

    public function parse($response): object
    {
        // Extract the JSON blob between the ```json fences.
        $formatted = str($response)
            ->after('```json')
            ->before('```')
            ->trim();

        return (object) json_decode((string) $formatted, true);
    }

    public function toString(): string
    {
        $this->template = Str::squish($this->template);

        return $this->template;
    }
}
Here is a breakdown of the class and what it does:

__construct()
The constructor takes a string parameter $template and assigns it to the public $template property.

create()
The create() method is a static factory that creates a new instance of the PromptTemplate class and returns it. It takes a string parameter $template and passes it to the constructor.

format()
The format() method takes an array parameter $inputVariables and replaces every occurrence of its keys in the $template property with the corresponding values. It uses the swap() method of the Illuminate\Support\Str class to perform the replacement and returns the current PromptTemplate instance.

outputParser()
The outputParser() method takes a string (or Stringable) parameter $outputParser and appends it to the $template property. It uses the squish() method of the Illuminate\Support\Str class to remove any unnecessary white space between the template and the appended instructions, and returns the current PromptTemplate instance.

parse()
The parse() method takes a $response parameter and extracts a JSON object from it. It uses the after(), before(), and trim() methods of the Illuminate\Support\Str class to isolate the JSON block from the response, then decodes it with json_decode() into a PHP object and returns it.

toString()
The toString() method returns the $template property after removing any unnecessary white space using the squish() method of the Illuminate\Support\Str class.
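Before wiring this into the OpenAI client, here is a minimal sketch of how the fluent chain reads in practice. The output-parser string below is just a stand-in; we'll build a proper parser class shortly.

<?php

use App\Prompts\PromptTemplate;

// Minimal sketch: build a template, swap in a variable, append format instructions.
$prompt = PromptTemplate::create('Generate 10 keywords for {topic}')
    ->format(['{topic}' => 'Laravel'])
    ->outputParser('Respond with a comma-separated list only.');

// "Generate 10 keywords for Laravel Respond with a comma-separated list only."
echo $prompt->toString();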
Next, you'll need to create the SystemMessagePromptTemplate and UserMessagePromptTemplate classes to interact with the Chat model in the OpenAI client. Place both classes in a new app/Prompts/Chat folder.

Here are those two classes:
<?php

namespace App\Prompts\Chat;

class SystemMessagePromptTemplate
{
    public static function fromString(string $message): array
    {
        return [
            'role' => 'system',
            'content' => $message,
        ];
    }
}
<?php

namespace App\Prompts\Chat;

class UserMessagePromptTemplate
{
    public static function fromString(string $message): array
    {
        return [
            'role' => 'user',
            'content' => $message,
        ];
    }
}
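Each helper simply returns the role/content array that the Chat endpoint expects. For example (illustrative values only):

<?php

use App\Prompts\Chat\SystemMessagePromptTemplate;
use App\Prompts\Chat\UserMessagePromptTemplate;

// Both helpers return plain arrays ready for the chat `messages` payload.
$messages = [
    SystemMessagePromptTemplate::fromString('You are an SEO Expert.'),
    // => ['role' => 'system', 'content' => 'You are an SEO Expert.']
    UserMessagePromptTemplate::fromString('Generate 10 keywords for Laravel'),
    // => ['role' => 'user', 'content' => 'Generate 10 keywords for Laravel']
];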
Now that we have our PromptTemplate class, we can create a new prompt. On its own, the ChatGPT model will reply in a conversational style; if you are building non-conversational applications, however, you'll want a structured response without any filler text.

To achieve this, we introduce the JsonListParser class, which lives under App\OutputParser. This class instructs the ChatGPT model on how to structure its response, so you get output that suits your application's needs.
<?php

namespace App\OutputParser;

use Stringable;

class JsonListParser implements Stringable
{
    public function __toString(): string
    {
        return <<<TEXT
        RESPONSE FORMAT INSTRUCTIONS
        ----------------------------
        When responding to me, please output the response in the following format:

        ```json
        {
            "data": array \\ An array of strings.
        }
        ```

        However, above all else, all responses must adhere to the format of RESPONSE FORMAT INSTRUCTIONS.
        Remember to respond with a json blob with a single key, and NOTHING else.
        TEXT;
    }
}
To ensure our classes work as intended, we will use the routes/console.php file to interact with them. Let's create a new command that generates ten keywords for a given topic.
<?php

use App\Prompts\Chat\SystemMessagePromptTemplate;
use App\Prompts\Chat\UserMessagePromptTemplate;
use App\Prompts\PromptTemplate;
use Illuminate\Support\Facades\Artisan;

/*
|--------------------------------------------------------------------------
| Console Routes
|--------------------------------------------------------------------------
|
| This file is where you may define all of your Closure based console
| commands. Each Closure is bound to a command instance allowing a
| simple approach to interacting with each command's IO methods.
|
*/

Artisan::command('keywords', function () {
    $prompt = PromptTemplate::create(template: 'Generate 10 keywords for {topic}')
        ->format([
            '{topic}' => 'Laravel',
        ])
        ->outputParser(new App\OutputParser\JsonListParser());

    $client = OpenAI::client(config('services.open_ai'));

    $response = $client->chat()->create([
        'model' => 'gpt-3.5-turbo',
        'messages' => [
            SystemMessagePromptTemplate::fromString("
                You are an SEO Expert. You are helping a client to generate keywords for their website.
                Please follow the RESPONSE FORMAT INSTRUCTIONS when responding to the User.
            "),
            UserMessagePromptTemplate::fromString($prompt->toString()),
        ],
    ]);

    $this->comment($response->choices[0]->message->content);
});
To execute this prompt, run php artisan keywords in the console. If everything is working as expected, you should receive the following response:
{
    "data": [
        "Laravel",
        "PHP framework",
        "Laravel development",
        "Laravel web development",
        "Laravel applications",
        "Laravel projects",
        "Laravel web applications",
        "Laravel PHP",
        "Laravel programming",
        "Laravel framework"
    ]
}
Great! Our ChatGPT API call is returning a structured response, as intended. But how do we convert this response into a PHP object?

Fortunately, our PromptTemplate class includes a parse() function that handles this conversion. Below is the complete code, including the parsing step:
<?php

use App\Prompts\Chat\SystemMessagePromptTemplate;
use App\Prompts\Chat\UserMessagePromptTemplate;
use App\Prompts\PromptTemplate;
use Illuminate\Support\Facades\Artisan;

/*
|--------------------------------------------------------------------------
| Console Routes
|--------------------------------------------------------------------------
|
| This file is where you may define all of your Closure based console
| commands. Each Closure is bound to a command instance allowing a
| simple approach to interacting with each command's IO methods.
|
*/

Artisan::command('keywords', function () {
    $prompt = PromptTemplate::create(template: 'Generate 10 keywords for {topic}')
        ->format([
            '{topic}' => 'Laravel',
        ])
        ->outputParser(new App\OutputParser\JsonListParser());

    $client = OpenAI::client(config('services.open_ai'));

    $response = $client->chat()->create([
        'model' => 'gpt-3.5-turbo',
        'messages' => [
            SystemMessagePromptTemplate::fromString("
                You are an SEO Expert. You are helping a client to generate keywords for their website.
                Please follow the RESPONSE FORMAT INSTRUCTIONS when responding to the User.
            "),
            UserMessagePromptTemplate::fromString($prompt->toString()),
        ],
    ]);

    $output = $prompt->parse($response->choices[0]->message->content);

    collect($output->data)->each(function ($keyword) {
        $this->comment($keyword);
    });
});
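One caveat: language models occasionally ignore format instructions, so in a real application you may want to guard against a response that isn't valid JSON. Here is a minimal sketch of such a guard, assuming the JsonListParser format; the exact handling (retry, log, fail) is up to you.

// Illustrative guard around the parse step inside the command closure.
$output = $prompt->parse($response->choices[0]->message->content);

if (! isset($output->data) || ! is_array($output->data)) {
    // The model did not follow the RESPONSE FORMAT INSTRUCTIONS;
    // retry the request, log the raw content, or fail gracefully.
    $this->error('Unexpected response format from the ChatGPT API.');

    return;
}

collect($output->data)->each(fn ($keyword) => $this->comment($keyword));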
After verifying that our PromptTemplate class is functional and our command produces the desired output, the next step is to write test cases to ensure the code's reliability. We will use Pest PHP for our testing. The test case lives in tests/Unit/Prompts/PromptTemplateTest.php.
<?php

use App\Prompts\PromptTemplate;

it('can create a new prompt template', function () {
    $template = 'This is a {variable} template.';

    $promptTemplate = PromptTemplate::create($template);

    expect($promptTemplate)->toBeInstanceOf(PromptTemplate::class);
});

it('can format a template with input variables', function () {
    $template = 'This is a {variable} template.';

    $prompt = PromptTemplate::create($template)
        ->format(['{variable}' => 'test']);

    expect($prompt->toString())
        ->toBe('This is a test template.');
});

it('can append an output parser to a template', function () {
    $outputParser = '---RESPONSE FORMAT INSTRUCTIONS---';

    $prompt = PromptTemplate::create('This is a test template.')
        ->outputParser($outputParser);

    expect($prompt->toString())
        ->toBe('This is a test template. ---RESPONSE FORMAT INSTRUCTIONS---');
});

it('can parse a response', function () {
    $response = '```json { "data": "value" } ```';

    $prompt = PromptTemplate::create('Test prompt')
        ->parse($response);

    expect($prompt)
        ->toBeObject()
        ->and($prompt)
        ->toHaveProperty('data', 'value');
});

it('can convert a template to a string', function () {
    $prompt = PromptTemplate::create('This is a {variable} template. ')
        ->format(['{variable}' => 'test'])
        ->toString();

    expect($prompt)->toBe('This is a test template.');
});
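For completeness, you might also cover the chat message helpers with a couple of quick assertions, for example in a tests/Unit/Prompts/Chat/MessagePromptTemplateTest.php file (a hypothetical name; the assertions simply mirror the arrays the classes return):

<?php

use App\Prompts\Chat\SystemMessagePromptTemplate;
use App\Prompts\Chat\UserMessagePromptTemplate;

it('builds a system message array', function () {
    expect(SystemMessagePromptTemplate::fromString('You are an SEO Expert.'))
        ->toBe(['role' => 'system', 'content' => 'You are an SEO Expert.']);
});

it('builds a user message array', function () {
    expect(UserMessagePromptTemplate::fromString('Generate 10 keywords for Laravel'))
        ->toBe(['role' => 'user', 'content' => 'Generate 10 keywords for Laravel']);
});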
In conclusion, this tutorial offers a friendly, easy-to-follow guide to incorporating OpenAI's ChatGPT into your Laravel application. By implementing Prompt Templates and Output Parsers and parsing the response, you get structured API responses that cater to your non-conversational application needs. The step-by-step instructions let you interact with the OpenAI client effortlessly while keeping your code clean and easy to read. And don't forget to test your implementation with Pest PHP to ensure dependable performance. Happy coding!
If you need help with a Laravel project, let's get in touch.