Yesterday we explored autonomous prompt construction, in which an LLM is given a set of example input/output pairs and asked to construct an optimal prompt that would produce those outputs from those inputs and could then be applied to any new input. In other words, given a set of disease outbreaks extracted from a news article, it would construct a prompt that performs that extraction across any news article; given a translation from English to French, it would construct a prompt optimized to translate any text into French. To explore this further, we'll look at two examples: a basic translation task and a task that combines translation and sentiment detection with an embedded parameter within the prompt, providing the LLM with only the set of examples and asking it to come up with a set of prompts that would produce those same outputs. We'll test this workflow across five major commercial LLMs (Google's Bison, Unicorn and Gemini Pro models and OpenAI's GPT 3.5 and GPT 4.0 models).
For our basic translation task, both Bison and the far more advanced Gemini Pro failed in all configurations to recognize the example as an English-to-French translation task, while Unicorn offered correct generic translation prompts. GPT 4.0 likewise offered generic translation prompts, while GPT 3.5 offered interesting suggestions around requesting that style and structure be preserved in the translation – something a human prompt engineer might not think of. However, both GPT 3.5 and 4.0 overfit the data, despite instructions not to, and embedded the example input text verbatim in their prompts. For our more complex combined parameterized translation and sentiment detection task, only Unicorn and GPT 4.0 correctly recognized both the translation and sentiment detection tasks, though both incorrectly moved the parameterized language name from the input into the prompt.
Overall, there is substantial potential in automated prompt construction, allowing non-technical users to merely provide a few examples of what they'd like to achieve and ask the LLM itself to write an optimized prompt for them to apply to other inputs. At the same time, the results here suggest the end results are not production-ready and will require at least some human modification, though they will likely serve as an accelerant, especially for advanced optimization ideation, since they can suggest ideas a human might not consider, such as explicit instructions to preserve sentiment and structure.
Let's start with a basic translation task:
Suggest a few optimal prompts that could be provided to a Large Language Model like yourself to transform the following INPUT text into the codified OUTPUT table. Make the prompts generic such that they would work on any inputs to generate outputs that codify them in similar ways.
INPUT: The winter holidays are a wonderful time, filled with family, feasts and togetherness. I hope you have a wonderful holiday.
OUTPUT: Les vacances d'hiver sont un moment merveilleux, rempli de famille, de fêtes et de convivialité. J'espère que tu passes de très bonnes vacances.
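The meta-prompt above can be assembled programmatically for any set of example pairs before being handed to a model. A minimal sketch (the function name is ours; the instruction wording mirrors the meta-prompt used in these tests):

```python
# Build a meta-prompt asking an LLM to infer a reusable prompt from
# example input/output pairs. (Illustrative sketch, not a fixed API.)

INSTRUCTION = (
    "Suggest a few optimal prompts that could be provided to a Large "
    "Language Model like yourself to transform the following INPUT text "
    "into the codified OUTPUT table. Make the prompts generic such that "
    "they would work on any inputs to generate outputs that codify them "
    "in similar ways."
)

def build_meta_prompt(examples):
    """examples: list of (input_text, output_text) tuples."""
    lines = [INSTRUCTION, ""]
    for inp, out in examples:
        lines.append(f"INPUT: {inp}")
        lines.append(f"OUTPUT: {out}")
    return "\n".join(lines)

meta = build_meta_prompt([
    ("The winter holidays are a wonderful time...",
     "Les vacances d'hiver sont un moment merveilleux..."),
])
print(meta.splitlines()[2])  # → INPUT: The winter holidays are a wonderful time...
```

The resulting string is what gets submitted to each model; only the example pairs change between tasks.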
BISON
With its default temperature setting Bison's recommendations are unrelated to the actual translation task:
**Prompt 1:** Codify the input text into a table with the following columns: "Time", "Activities", and "Feelings". **Prompt 2:** Create a table that summarizes the input text, with the following columns: "Season", "Celebrations", and "Wishes". **Prompt 3:** Generate a table that captures the key elements of the input text, with the following columns: "Time", "Events", and "Emotions".
Rerunning with a temperature of 0.2:
**Prompt 1:** "Codify the following text into a table, with the first column containing the original text and the second column containing a codified version of the text." **Prompt 2:** "Transform the following text into a table, where the first column contains the original text and the second column contains a codified version of the text. The codified version should be a short, concise summary of the original text." **Prompt 3:** "Create a table with two columns. The first column should contain the following text: 'The winter holidays are a wonderful time, filled with family, feasts and togetherness. I hope you have a wonderful holiday.' The second column should contain a codified version of the text, which should be a short, concise summary of the original text."
Raising the temperature to 0.99 again yields unrelated prompts:
**Prompt 1: Codify into a tabular format the key phrases from the provided text, creating a column for "Themes" and a column for "Associated Phrases".** **Prompt 2: Codify the same provided text into a table format with two columns: "Time Period" and "Activities or Traditions".**
UNICORN
With its default temperature setting Unicorn correctly recognizes it as a translation task:
* Translate the following text from English to French. * Convert the following English text into French. * Transform the following English text into French. * Convert the following English sentence into French. * Translate the following English sentence into French. * Transform the following English sentence into French. * Convert the following English phrase into French. * Translate the following English phrase into French. * Transform the following English phrase into French.
Running with a temperature of 0.2 for a bit of creativity yields nearly identical results:
* Translate the following text from English to French. * Convert the following English text into French. * Transform the following English text into French. * Convert the following English sentence into French. * Translate the following English sentence into French. * Transform the following English sentence into French. * Convert the following English text into French. * Translate the following English text into French. * Transform the following English text into French. * Convert the following English sentence into French. * Translate the following English sentence into French. * Transform the following English sentence into French.
Running with a temperature of 0.99 we get translation-related prompts, but they fail to specify that the translation should be into French:
Prompt 1: Create a table that lists the following information about the input text: 1. The original text 2. The translated text 3. The language of the original text 4. The language of the translated text Prompt 2: Create a table that shows the following information about the input text: 1. The original text 2. The translated text 3. The source language 4. The target language Prompt 3: Create a table that contains the following information: 1. The original text 2. The translated text 3. The source language 4. The target language
Running a second time with a 0.99 temperature we get translation-related prompts:
Prompt 1: Create a table that summarizes the key information in the following text. The table should have two columns: one for the English text and one for the French translation. Prompt 2: Summarize the following text in a table that has two columns: one for the English text and one for the French translation. Prompt 3: Convert the following text into a table that has two columns: one for the English text and one for the French translation. Prompt 4: Create a table that has two columns: one for the English text and one for the French translation. The table should summarize the key information in the following text.
GEMINI PRO
Interestingly, while Unicorn correctly recognized the example as a translation task, Gemini Pro fails to do so: only one of its three prompts relates to translation at all, and even that one fails to specify French as the target language:
1. **Prompt 1**: Encode the provided text into a concise collection of key points or concepts. Ensure that each point contains only the most essential information, without sacrificing clarity. 2. **Prompt 2**: Using the extracted key concepts, formulate a structured and coherent table that effectively summarizes the meaning of the provided text. 3. **Prompt 3**: Translate each cell in the table from its original language to the target language, preserving the original meaning and context while maintaining a tone consistent with the overall message of the text.
Even boosting the temperature to 0.99 does not help:
1. **Prompt 1**: Encode the provided text into a concise collection of key points or concepts. Ensure that each point contains only the core idea without unnecessary details. 2. **Prompt 2**: Using the extracted key points, generate a table where each row corresponds to a key point. The table should include columns for point index, point description, and its translation to another language. 3. **Prompt 3**: Use the original text as input and translate it directly to another language, maintaining its original meaning and context as closely as possible. The resulting translation should be presented in a natural and grammatically correct manner.
GPT 3.5
GPT 3.5 readily recognizes it as a translation task and offers several interesting variants, such as requiring that sentiment and structure be preserved. However, it also overfits the input data and provides it as part of the prompt:
To transform the given input text into the codified output table, you can use generic prompts that instruct the language model to translate the text from one language to another (English to French in this case) and make sure it retains the structure and sentiment. Here are a few optimal prompts: 1. Translate the following English text into French while preserving the sentiment and structure: - INPUT: The winter holidays are a wonderful time, filled with family, feasts and togetherness. I hope you have a wonderful holiday. 2. Convert the provided English message into French, maintaining the overall tone and structure: - INPUT: The winter holidays are a wonderful time, filled with family, feasts and togetherness. I hope you have a wonderful holiday. 3. Please translate the following English passage into French, ensuring that the sentiment and structure remain the same: - INPUT: The winter holidays are a wonderful time, filled with family, feasts and togetherness. I hope you have a wonderful holiday. 4. Transform this English text into French, making sure the sentiment and structure are retained: - INPUT: The winter holidays are a wonderful time, filled with family, feasts and togetherness. I hope you have a wonderful holiday. These prompts should work well for translating similar English input texts into codified French outputs while maintaining the original meaning and sentiment.
GPT 4.0
GPT 4.0 also readily recognizes it as a translation task and offers multiple variants, but its prompts lack the additional stylistic and semantic requirements GPT 3.5 suggested. It also overfits the input data:
To effectively transform the provided input text into a codified output table in a different language (in this case, English to French), here are a few optimal prompts that can be used with a Large Language Model: 1. **Direct Translation Prompt:** "Please translate the following English text into French and present it in a table format: 'The winter holidays are a wonderful time, filled with family, feasts and togetherness. I hope you have a wonderful holiday.'" 2. **Bilingual Table Creation Prompt:** "Create a bilingual table with two columns. In the first column, place the provided English text, and in the second column, provide its French translation. Text: 'The winter holidays are a wonderful time, filled with family, feasts and togetherness. I hope you have a wonderful holiday.'" 3. **Language Transformation Prompt:** "Transform the following English sentences into French and organize them in a table with two columns - English and French. English Text: 'The winter holidays are a wonderful time, filled with family, feasts and togetherness. I hope you have a wonderful holiday.'" 4. **Comparative Language Table Prompt:** "Generate a table comparing English and French versions of the given text. Place the original English text in the left column and its French translation in the right column. Text for translation: 'The winter holidays are a wonderful time, filled with family, feasts and togetherness. I hope you have a wonderful holiday.'" 5. **Translation and Table Formatting Prompt:** "Translate the following text into French and format the original English text alongside the French translation in a side-by-side table format. Original Text: 'The winter holidays are a wonderful time, filled with family, feasts and togetherness. I hope you have a wonderful holiday.'" Each of these prompts is designed to guide the language model in not only translating the text but also presenting it in a clear, tabular format for easy comparison and understanding.
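Since both GPT models overfit by embedding the example text verbatim despite being instructed not to, a simple automated check can flag such prompts before they are reused on other inputs. A rough heuristic sketch (the 20-character window size is an arbitrary assumption):

```python
def overfits(prompt: str, example_inputs: list[str], min_len: int = 20) -> bool:
    """Flag a generated prompt that copies an example input verbatim.

    A prompt meant to be generic should not contain long literal
    substrings of the training examples. (Heuristic sketch; the
    20-character threshold is an arbitrary assumption.)
    """
    for text in example_inputs:
        # Slide a window over the example and look for verbatim copies.
        for start in range(0, max(1, len(text) - min_len + 1)):
            if text[start:start + min_len] in prompt:
                return True
    return False

example = "The winter holidays are a wonderful time, filled with family."
generic = "Translate the following English text into French."
leaky = f"Translate the following into French: '{example}'"
print(overfits(generic, [example]), overfits(leaky, [example]))  # → False True
```

A check like this would have caught every overfit prompt in the GPT 3.5 and 4.0 outputs above, while passing Unicorn's generic translation prompts.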
What about a more complex prompt that involves both translation and sentiment analysis, with the target language specified as a variable parameter embedded in each input? This requires far more reasoning from the model:
Suggest a few optimal prompts that could be provided to a Large Language Model like yourself to transform the following INPUT text into the codified OUTPUT table. Make the prompts generic such that they would work on any inputs to generate outputs that codify them in similar ways.
INPUT: The winter holidays are a wonderful time, filled with family, feasts and togetherness. I hope you have a wonderful holiday. (French)
OUTPUT: Les vacances d'hiver sont un moment merveilleux, rempli de famille, de fêtes et de convivialité. J'espère que tu passes de très bonnes vacances. (Positive Tone)
INPUT: Hello it is great to talk with you. This is a wonderful conversation. (Spanish)
OUTPUT: Hola, es genial hablar contigo. Esta es una conversación maravillosa. (Positive Tone)
INPUT: The Internet is a wonderful technology and is helping to connect our world. (Turkish)
OUTPUT: İnternet harika bir teknolojidir ve dünyamızı birbirine bağlamaya yardımcı olmaktadır. (Positive Tone)
INPUT: Sometimes I eat so much I get sick. (Estonian)
OUTPUT: Mõnikord söön nii palju, et jään haigeks. (Negative Tone)
INPUT: There is a lot of conflict and war in the world right now. (Ukrainian)
OUTPUT: Зараз у світі багато конфліктів і воєн. (Negative Tone)
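In this parameterized format, the target language rides along in parentheses inside each input, and the detected tone appears the same way in each output, rather than living in the prompt itself. Parsing that parameter out is straightforward; a sketch (the format convention is from our examples, not from any model's output):

```python
import re

def parse_example(line: str):
    """Split 'text (Parameter)' into (text, parameter).

    Matches the INPUT/OUTPUT convention above, where the language
    (or tone) parameter is appended in parentheses.
    """
    m = re.match(r"^(.*)\s+\(([^)]+)\)\s*$", line)
    if not m:
        return line, None
    return m.group(1), m.group(2)

text, lang = parse_example("Sometimes I eat so much I get sick. (Estonian)")
print(text, "|", lang)  # → Sometimes I eat so much I get sick. | Estonian
```

A model that truly generalizes from these examples should treat the parenthesized value as per-input data; as the results below show, most models instead hoist it into the prompt or drop it entirely.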
BISON
Bison fails to recognize the translation task and incorrectly transforms sentiment detection into sentiment transformation:
**Prompt 1: Codify the input text into a table with two columns, "Original Text" and "Codified Text". The "Codified Text" column should contain the input text transformed to have a positive tone.** **Prompt 2: Codify the input text into a table with two columns, "Original Text" and "Codified Text". The "Codified Text" column should contain the input text transformed to have a negative tone.** **Prompt 3: Codify the input text into a table with two columns, "Original Text" and "Codified Text". The "Codified Text" column should contain the input text transformed to have a neutral tone.**
UNICORN
Unicorn correctly identifies both the translation and sentiment detection tasks, but splits them into separate prompts and strangely duplicates the sentiment detection across two overlapping prompts:
Here are a few optimal prompts that could be provided to a Large Language Model to transform the input text into the codified output table: 1. Translate the input text into the target language. 2. Identify the tone of the input text. 3. Codify the input text based on the tone. Here are some examples of how these prompts could be used: * Translate the input text into the target language. * Input: The winter holidays are a wonderful time, filled with family, feasts and togetherness. I hope you have a wonderful holiday. (French) * Output: Les vacances d'hiver sont un moment merveilleux, rempli de famille, de fêtes et de convivialité. J'espère que tu passes de très bonnes vacances. * Identify the tone of the input text. * Input: The winter holidays are a wonderful time, filled with family, feasts and togetherness. I hope you have a wonderful holiday. * Output: Positive * Codify the input text based on the tone. * Input: The winter holidays are a wonderful time, filled with family, feasts and togetherness. I hope you have a wonderful holiday. * Output: Positive Tone These prompts could be used to generate the codified output table for any input text.
GEMINI PRO
Gemini Pro correctly recognizes the translation task, but transforms the sentiment detection into sentiment transformation:
1. "Translate the given text into [specified language] and codify it in a table, using a positive tone." 2. "Generate a codified table by translating the provided text to [specified language], ensuring a neutral tone throughout the translation process." 3. "Codify the provided text in a table, translating it to [specified language] while maintaining a negative tone." 4. "Translate the provided text to [specified language] and represent it in a codified table, using a tone that is determined by the content of the text itself." 5. "Transform the provided text into a codified table by translating it to [specified language], ensuring that the tone of the translation matches the sentiment expressed in the original text."
GPT 3.5
While GPT 3.5 recognizes the translation task and that language is a parameter, it incorrectly transforms sentiment detection into sentiment shifting:
To transform the given INPUT text into the codified OUTPUT table, you can use the following generic prompts: 1. Transform text into [Language] with a positive tone. - INPUT: The winter holidays are a wonderful time, filled with family, feasts, and togetherness. - OUTPUT: [Translated text in the desired language with a positive tone] 2. Translate this text into [Language] with a positive sentiment. - INPUT: Hello, it is great to talk with you. This is a wonderful conversation. - OUTPUT: [Translated text in the desired language with a positive tone] 3. Translate the following sentence into [Language] with a positive tone. - INPUT: The Internet is a wonderful technology and is helping to connect our world. - OUTPUT: [Translated text in the desired language with a positive tone] 4. Convert the text to [Language] while conveying a negative sentiment. - INPUT: Sometimes I eat so much I get sick. - OUTPUT: [Translated text in the desired language with a negative tone] 5. Translate this statement into [Language] with a negative tone. - INPUT: There is a lot of conflict and war in the world right now. - OUTPUT: [Translated text in the desired language with a negative tone] You can replace "[Language]" with the target language you want to translate to (e.g., French, Spanish, Turkish, Estonian, Ukrainian) and adjust the input text accordingly for each prompt. This approach allows you to generate outputs that codify the given inputs in a similar manner, either with a positive or negative tone, depending on the desired sentiment.
GPT 4.0
Only GPT 4.0 correctly recognizes both the translation and sentiment analysis tasks, but it incorrectly moves the target language parameter from its position in the input text into the prompt itself:
To transform the provided INPUT texts into codified OUTPUT tables with the given structure, you can use prompts that guide the Large Language Model to perform specific tasks. These tasks include language translation and tone analysis. Here are some suggested prompts that are generic and can be applied to various inputs for similar transformations: 1. **Translation and Tone Identification Prompt**: "Translate the following text into [target language] and identify the overall tone (positive or negative) of both the input and output texts: [Input Text]." 2. **Language Transformation and Sentiment Analysis Prompt**: "Please convert the following sentence into [target language] and categorize the sentiment of the text as either 'Positive Tone' or 'Negative Tone': [Input Text]." 3. **Bilingual Text and Tone Mapping Prompt**: "For the given sentence, provide a translation in [target language] and assess if the tone of the sentence is positive or negative: [Input Text]." 4. **Cross-Language Sentiment Conversion Prompt**: "Transform this sentence into [target language], and determine whether its sentiment is positive or negative: [Input Text]." 5. **Multilingual Tone-Coded Translation Prompt**: "Translate the following phrase into [target language] and code the output with the appropriate tone (positive or negative) based on the context: [Input Text]." By replacing "[Input Text]" with the actual text and "[target language]" with the desired language (e.g., French, Spanish, Turkish), these prompts will guide the model to produce outputs similar to the examples you've provided.
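GPT 4.0's prompts use bracketed placeholders, which lend themselves to simple template substitution at application time. A sketch, using an abbreviated version of its first suggested prompt (note that filling the language per input effectively restores the parameter the model had moved into the prompt body):

```python
def fill_prompt(template: str, target_language: str, input_text: str) -> str:
    """Substitute the bracketed placeholders GPT 4.0 used in its prompts.

    (Sketch: assumes the exact placeholder spellings "[target language]"
    and "[Input Text]" from the model's output above.)
    """
    return (template
            .replace("[target language]", target_language)
            .replace("[Input Text]", input_text))

template = ("Translate the following text into [target language] and identify "
            "the overall tone (positive or negative): [Input Text]")
print(fill_prompt(template, "French", "Hello, friend."))
```

Even so, a human still has to wire the per-input language back into this substitution step, underscoring that these generated prompts accelerate rather than replace prompt engineering.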