
LARGE LANGUAGE MODELS - Prompt Engineering

LLM Chain Of Thought Prompting

Chain-of-thought prompting is a technique for guiding a model through a task in a conversational manner. It breaks a complex query or task into a series of interconnected prompts: instead of relying on a single input, the model is led through a sequence of prompts that refine and build upon one another. Chain-of-thought prompting can be used to draw more engaging, interactive, and accurate outputs from LLMs.

For example, consider the following prompt:

I want to bake a cake. What ingredients do I need?
This prompt could be followed by a series of prompts that guide the model through the process of baking a cake. For example:
  1. What type of cake do you want to bake?
  2. What ingredients do you need for that type of cake?
  3. Do you have all the necessary ingredients?
  4. What are the steps for baking the cake?
  5. How long does it take to bake the cake?
  6. How should I decorate the cake?

By using chain-of-thought prompting, prompt engineers can build more engaging and interactive chatbots that generate higher-quality, more relevant outputs.
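
In code, the pattern is just a loop that carries the running conversation forward, so each new prompt builds on the answers before it. Here is a minimal Python sketch; `ask_llm` is a hypothetical placeholder, not a real library call, standing in for whichever LLM client you actually use:

```python
# Chain-of-thought prompting as a sequence of linked prompts.
# ask_llm() is a hypothetical stub; a real version would send the full
# conversation history to your LLM client and return the reply text.

def ask_llm(history: list[dict]) -> str:
    # Stub so the sketch runs; replace with a real API call.
    return f"[model reply to: {history[-1]['content']}]"

chain = [
    "I want to bake a cake. What ingredients do I need?",
    "What are the steps for baking the cake?",
    "How long does it take to bake the cake?",
    "How should I decorate the cake?",
]

history: list[dict] = []
for prompt in chain:
    history.append({"role": "user", "content": prompt})
    reply = ask_llm(history)  # the model sees every earlier turn
    history.append({"role": "assistant", "content": reply})
    print(f"> {prompt}\n{reply}\n")
```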

If you want to learn more about chain-of-thought prompting and its applications, I recommend the Prompt Engineering Guide, which provides a comprehensive overview of the topic with examples and use cases.

Let's consider a more complex scenario. Returning to our mathematical reasoning problems, we have this example from the literature, which tripped up earlier LLMs until "Let's think step by step." was added to the end of the problem. On current LLMs that addition appears unnecessary.

I went to the market and bought 10 apples. I gave 2 apples to the neighbor and 2 to the repairman. I then went and bought 5 more apples and ate 1. How many apples did I remain with?

BingAI returns the following correct response:

"You started with 10 apples. You gave 2 apples to the neighbor and 2 apples to the repairman, so you had 6 apples left. You then bought 5 more apples, so you had 11 apples. Finally, you ate 1 apple, so you remained with 10 apples."

The model's response above is correct. Changing the numbers, so that it is not merely reproducing a documented example, still returns the correct result. But let us introduce an additional item and change the problem slightly, so that the story about the apples becomes effectively irrelevant:

I went to the market and bought 10 apples. I gave 2 apples to the neighbor and 2 to the repairman. I found 2 bananas. I then went and bought 5 more apples and ate 1. How many bananas did I remain with?

Now the LLM goes off the rails:

You started with 10 apples. You gave 2 apples to the neighbor and 2 apples to the repairman, so you had 6 apples left. You found 2 bananas. You then bought 5 more apples, so you had 11 fruits in total. Finally, you ate 1 apple, so you remained with 10 fruits. Unfortunately, I cannot determine how many bananas you have left based on the information provided. If you have any other information or details, please let me know and I’ll do my best to help!

It did detect that both apples and bananas are fruits, but it got the total fruits wrong (after buying the extra apples there were 13 fruits, and 12 after eating one; the model's counts of 11 and 10 include only the apples), and then it couldn't decide what happened to the bananas. So let's try to get it to think step by step!

I went to the market and bought 10 apples. I gave 2 apples to the neighbor and 2 to the repairman. I found 2 bananas. I then went and bought 5 more apples and ate 1. How many bananas did I remain with? Let's think step by step.

Nope - it still can't work it out, so the problem is not that it is failing to break the problem into parts (which, given its explanation, was to be expected) - although it did add a nice picture of a bunch of bananas!

You started with 10 apples. You gave 2 apples to the neighbor and 2 apples to the repairman, so you had 6 apples left. You found 2 bananas. You then bought 5 more apples, so you had 11 fruits in total. Finally, you ate 1 apple, so you remained with 10 fruits. Unfortunately, I cannot determine how many bananas you have left based on the information provided. If you have any other information or details, please let me know and I’ll do my best to help!

So now we start teaching it by drawing its attention to the key details it is missing:

Did I dispose of the bananas in any way?

It responded:

I’m sorry, but I don’t have any information about what happened to the bananas in your story. If you have any other information or details, please let me know and I’ll do my best to help!

To which I replied:

So if you have no information on what happened to the bananas, is it reasonable to assume I still have them?

And the LLM thought this reasonable and replied:

Based on the information provided, it is reasonable to assume that you still have the bananas. However, I would like to clarify that I don’t have any information about what happened to the bananas in your story. If you have any other information or details, please let me know and I’ll do my best to help!

Now we test to see if it can extrapolate that new knowledge to the problem:

That is correct, so how many bananas do I have at the end of the story based only on the information provided?

Still not getting it:

Based on the information provided, you started with 10 apples and found 2 bananas. You then gave 2 apples to the neighbor and 2 apples to the repairman, so you had 6 apples left. You then bought 5 more apples, so you had 11 fruits in total. Finally, you ate 1 apple, so you remained with 10 fruits.
Unfortunately, there is no information provided about what happened to the bananas in your story.

The calculation of the number of fruits had been wrong each time, so I decided to teach it something about acquisition:

If I buy 1 apple and find 1 banana, I now have 2 fruits. How many bananas do I have?

Now something interesting happens:

I apologize for the confusion in my previous response. Based on the information provided, you started with 1 apple and found 1 banana. Therefore, you have 2 fruits in total.

Now, I did not ask it how many fruits I had; I asked about bananas. But this gives a clue to where it is going wrong: we need to help it extract the bananas from the fruit group it has created. Its statement was correct, just not an answer to the question asked. So:

Correct. Now how many of the fruits are bananas?

Now we have success:

Based on the information provided, you started with 10 apples. You gave 2 apples to the neighbor and 2 apples to the repairman, so you had 6 apples left. You found 2 bananas. You then bought 5 more apples, so you had 11 fruits in total. Finally, you ate 1 apple, so you remained with 10 fruits.
Since you started with 2 bananas and did not dispose of them, you have 2 bananas.
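
What the exchange finally surfaced is the ordinary distinction between an aggregate and one of its categories; in code it is two lines:

```python
apples, bananas = 1, 1
fruits = apples + bananas  # the model kept answering this: 2 fruits
print(bananas)             # the question actually asked this: 1 banana
```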

How the LLM progressed through the chain of thought to understanding here is interesting. So let us change the numbers and some of the characters, incorporate all we have learned into a single prompt, and see if the LLM can now reason its way through the problem:

I went to the market and bought 14 apples. I gave 3 apples to Jack and 4 to Sam. I found and kept 3 bananas. I then went and bought 4 more apples and ate 3. Based only on this information, did I dispose of any bananas, and thus how many of my total remaining fruits were apples and how many were bananas? Let's think step by step.

And it looks like we have worked out how to prompt the LLM to help it reason its way through the problem:

Sure, let's break it down step by step.
You started with 14 apples. You gave 3 apples to Jack and 4 apples to Sam, so you had 7 apples left. You found and kept 3 bananas, so you had 10 fruits in total. You then bought 4 more apples, so you had 14 fruits in total. Finally, you ate 3 apples, so you remained with 11 fruits.
Since you found and kept 3 bananas, you did not dispose of any bananas.
Therefore, out of the 11 fruits remaining, 3 are bananas and the rest are apples.
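
For reference, here is the bookkeeping for that final prompt written out explicitly, with each fruit tracked per category rather than folded into a single total:

```python
# Per-category tally of the final version of the problem.
from collections import Counter

fruit = Counter()
fruit["apples"] += 14   # bought 14 apples
fruit["apples"] -= 3    # gave 3 to Jack
fruit["apples"] -= 4    # gave 4 to Sam
fruit["bananas"] += 3   # found and kept 3 bananas
fruit["apples"] += 4    # bought 4 more apples
fruit["apples"] -= 3    # ate 3 apples

print(fruit)                # Counter({'apples': 8, 'bananas': 3})
print(sum(fruit.values()))  # 11 fruits, matching the model's answer
```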

Prompt Engineering - Self-Consistency