
LARGE LANGUAGE MODELS - Prompt Engineering

LLM Knowledge Generation Prompting

Knowledge generation prompting is a technique in which an LLM is supplied with a set of relevant facts or knowledge before it is asked to produce its output. It can be used to elicit more informative and accurate text from LLMs.

The idea behind knowledge generation prompting is to provide the LLM with relevant knowledge or information before generating the output. This can help the model make more accurate predictions and generate more coherent and fluent text.
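In practice, the "generation" step can even be delegated to the model itself: a first call asks the LLM to produce relevant facts, and a second call feeds those facts back alongside the question. The following minimal Python sketch illustrates that two-step flow; the ask_llm helper is a hypothetical stand-in for whatever completion API you are using, not a real library call.

def ask_llm(prompt: str) -> str:
    """Hypothetical helper: send a prompt to an LLM and return its reply."""
    raise NotImplementedError("Wire this up to your LLM client of choice.")

def answer_with_generated_knowledge(question: str, n_facts: int = 3) -> str:
    """Two-step knowledge generation: generate facts, then answer with them."""
    # Step 1: ask the model to generate background knowledge for the question.
    knowledge = ask_llm(
        f"Generate {n_facts} factual statements relevant to this question:\n"
        f"{question}"
    )
    # Step 2: feed the generated knowledge back in alongside the question.
    return ask_llm(f"Knowledge: {knowledge}\nQ: {question}")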

For example, consider the following prompt:

What is the capital of France?

This question can be preceded by a series of knowledge statements that provide relevant background about France, such as its location, population, and history. By using knowledge generation prompting in this way, prompt engineers can build more informative and accurate chatbots that generate high-quality, relevant outputs. For example:

Knowledge: France is a country of northwestern Europe that covers an area of 248,573 square miles (643,801 square kilometers). It is bordered by the Bay of Biscay and the English Channel to the west, Belgium, Luxembourg, Germany, Switzerland, Italy, and Spain to the east and south, and the Mediterranean Sea to the southeast.
Knowledge: France was one of the first countries to develop an organized nation-state and a centralized government in the Middle Ages.
Knowledge: France was the birthplace of the French Revolution (1789-1799), which overthrew the monarchy and established a republic based on the principles of liberty, equality, and fraternity. The revolution also inspired other movements for democracy and human rights around the world, such as the American Revolution and the Haitian Revolution.
Q: What is the capital of France?
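As a concrete illustration, a knowledge-prefixed prompt like the one above can be assembled programmatically. This Python sketch simply joins the "Knowledge:" statements with the final question; the commented-out ask_llm call stands in for a real LLM client, as in the earlier sketch.

# Assemble a knowledge-augmented prompt from a list of facts.
KNOWLEDGE = [
    "France is a country of northwestern Europe that covers an area of "
    "248,573 square miles (643,801 square kilometers).",
    "France was one of the first countries to develop an organized "
    "nation-state and a centralized government in the Middle Ages.",
    "France was the birthplace of the French Revolution (1789-1799), which "
    "overthrew the monarchy and established a republic.",
]

def build_knowledge_prompt(question: str, knowledge: list[str]) -> str:
    """Prefix each fact with 'Knowledge:' and append the target question."""
    lines = [f"Knowledge: {fact}" for fact in knowledge]
    lines.append(f"Q: {question}")
    return "\n".join(lines)

prompt = build_knowledge_prompt("What is the capital of France?", KNOWLEDGE)
print(prompt)
# answer = ask_llm(prompt)  # hypothetical call, as in the earlier sketch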

Perhaps more surprisingly, you can achieve the correct outcome by giving the LLM a series of "knowledge statements" about totally unrelated topics, in the form:

Input: Lithium is found in rock, brine and even sea water.
Knowledge: Lithium is one of the most abundant minerals on earth, but appears in such minute quantities that it can be uneconomic to extract. In higher, minable concentrations it is found in rock often around 1-2% concentration from which it is concentrated into spodumene at around 5 to 6% concentration. The spodumene is then roasted in an energy intensive process to produce high purity lithium carbonate or lithium hydroxide.
Input: Parrots are not all tropical birds.
Knowledge: Some species, like the kea and the kakapo, are native to New Zealand and live in cold and mountainous habitats. The kea is known for its curiosity and intelligence, while the kakapo is the world's only flightless and nocturnal parrot.
Q: What is the capital of France?

These facts can be delivered as a seemingly random collection of statements on unrelated topics, after which the target question is posed and the answer may now be correct.
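The same assembly works for the Input/Knowledge pair format shown above, where the priming statements are deliberately unrelated to the target question. Again, this is only a sketch, and ask_llm remains a hypothetical helper rather than a real API.

# Build a prompt from Input/Knowledge pairs that are unrelated to the question.
PRIMING_PAIRS = [
    ("Lithium is found in rock, brine and even sea water.",
     "Lithium is one of the most abundant minerals on earth, but often "
     "appears in quantities too small to extract economically."),
    ("Parrots are not all tropical birds.",
     "Some species, like the kea and the kakapo, are native to New Zealand "
     "and live in cold and mountainous habitats."),
]

def build_primed_prompt(question: str, pairs: list[tuple[str, str]]) -> str:
    """Interleave Input/Knowledge pairs, then pose the target question."""
    lines = []
    for statement, elaboration in pairs:
        lines.append(f"Input: {statement}")
        lines.append(f"Knowledge: {elaboration}")
    lines.append(f"Q: {question}")
    return "\n".join(lines)

print(build_primed_prompt("What is the capital of France?", PRIMING_PAIRS))
# answer = ask_llm(...)  # hypothetical call, as before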

The question is: why does this work? One theory that has been advanced for why knowledge generation using unrelated facts works is the analogical reasoning theory. It suggests that unrelated facts from a different domain help the AI model draw analogies and similarities between the two domains, and thus generate a more accurate and relevant response in the target domain. For example, by using facts from biology to prompt an AI model to generate text about computer science, the prompt engineer can encourage the model to explain computer science concepts through biological metaphors. This can result in a more creative and engaging text that also demonstrates the model's understanding of both domains.

Another theory that has been proposed is the serendipity theory. It suggests that unrelated facts from a different domain create a sense of surprise and novelty, stimulating the model's creativity and originality. For example, by using facts from history to prompt an AI model to generate text about music, the prompt engineer can create a novel and unexpected connection between the two domains, inspiring the model to generate a unique and original text that also reflects its musical knowledge and style.

If you want to learn more about knowledge generation prompting and its applications, I recommend the Prompt Engineering Guide, which provides a comprehensive overview of the topic with examples and use cases.

Next: Prompt Engineering - ReAct Prompting