Micron technology glossary

Prompt engineering

Generative AI is growing in prominence as a subset of artificial intelligence, with generative AI chatbots becoming everyday technology for many. Prompt engineering is the process by which humans interact with generative AI, crafting and refining queries so that the models understand and respond effectively. This technique helps in obtaining accurate and relevant information from AI systems.

What is prompt engineering?

Prompt engineering definition: Prompt engineering is the practice of crafting and refining inputs to enable artificial intelligence (AI) tools to produce optimal results by accurately interpreting and processing information. 

Prompt engineering can significantly enhance the effectiveness of any given AI tool. Providing additional context helps these artificial intelligence tools return more effective and accurate outputs.

When users input a simple query or prompt into a generative AI model, they should expect a simple output. Conversely, when users input more detailed, complex queries, models typically output more detailed, valuable responses.

Without prompt engineering, such detailed responses would not be possible, since the model could neither fully comprehend the query nor respond to it satisfactorily.

Like other aspects of fine-tuning generative AI tools, a combination of creativity and testing is key to ensuring that the tools can learn and adapt. Prompt engineering identifies the phrases, symbols, and words that work best for these AI tools.

How does prompt engineering work?

Generative AI models are trained and built on deep learning architectures that enable them to grasp basic language, including tone and context. Prompt engineering helps mold the output of these AI tools and ensures that they answer a query satisfactorily.

Prompt engineering elevates the usefulness of generative AI tools when a user provides a prompt lacking context or clarity. For example, when a user gives a prompt such as “Where can I buy a laptop,” the prompt engineering layer of the tool enriches this prompt with more context.

After going through prompt engineering, the prompt will look something like “This user needs to purchase a laptop. As if you are a sales assistant for a tech store, provide a concise list of which laptops are available in x location.”

Without the additional insight of prompt engineering for generative AI tools, the user experience and overall performance of the tool would suffer.
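
A minimal sketch of this enrichment step is below. Many chat-style generative AI interfaces accept a list of role-tagged messages, though the exact request format varies by provider, so the structure and the enrich helper shown here are purely illustrative rather than any specific tool's API.

```python
# Illustrative sketch: a tool's prompt engineering layer wrapping a bare user
# query with persona and context before it reaches the model. The message
# structure mimics common chat-style APIs but is not tied to any provider.

def enrich(user_query: str, location: str) -> list[dict]:
    """Wrap a bare query with persona and context supplied by the tool."""
    system_context = (
        "As if you are a sales assistant for a tech store, "
        "provide a concise list of which laptops are available "
        f"in {location}."
    )
    return [
        {"role": "system", "content": system_context},  # context added by the tool
        {"role": "user", "content": user_query},        # the user's original prompt
    ]

messages = enrich("Where can I buy a laptop?", location="x location")
print(messages)  # this message list would then be sent to the generative AI model
```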

To create effective prompts, users should consider the following elements:

  • Identify the task: Clearly state what they need the AI to do, such as “Provide a list of laptops available for purchase.”
  • Provide context: Give background information that helps the AI understand the situation, such as “The laptop is for work purposes.”
  • Specify persona: Define the role the AI should assume, such as “As if you are a sales assistant for a tech store.”
  • Set a tone: Indicate the desired tone of the response, such as “Provide a friendly and professional recommendation.”
  • Give examples: Offer examples to guide the AI, such as “Suggest laptops like the Dell XPS 13 or MacBook Air.”
  • Specify format: Outline how the response should be structured, such as “Provide the information in a bulleted list.”

By incorporating these elements, prompt engineering ensures that generative AI tools provide more accurate, relevant, and user-friendly responses.
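
As a rough illustration of how these elements come together, the sketch below assembles them into a single prompt string. The build_prompt helper and the element values are hypothetical examples rather than part of any particular tool.

```python
# Illustrative sketch: combining the six prompt elements listed above into one
# well-structured prompt string.

def build_prompt(task, context, persona, tone, examples, response_format):
    """Join the six prompt elements into a single prompt."""
    return "\n".join([
        persona,                        # persona: role the AI should assume
        f"Task: {task}",                # task: what the AI should do
        f"Context: {context}",          # context: background information
        f"Tone: {tone}",                # tone: desired style of the response
        f"Examples: {examples}",        # examples: guidance for the output
        f"Format: {response_format}",   # format: structure of the response
    ])

prompt = build_prompt(
    task="Provide a list of laptops available for purchase.",
    context="The laptop is for work purposes.",
    persona="As if you are a sales assistant for a tech store.",
    tone="Provide a friendly and professional recommendation.",
    examples="Suggest laptops like the Dell XPS 13 or MacBook Air.",
    response_format="Provide the information in a bulleted list.",
)
print(prompt)
```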

What is the history of prompt engineering?

Prompt engineering is a developing aspect of generative AI, which is itself a relatively new technology. Although its history may be shorter than that of some other aspects of artificial intelligence, the impact of prompt engineering in this brief period has been significant.

  • 2017, introduction of transformers: The transformer architecture for natural language processing (NLP) reshaped the field of AI, opening new possibilities. These models could be pretrained on large datasets, which increased the demand for AI tools that could produce competent work from such datasets.
  • 2018, prompt engineering put to the test: Researchers began to propose that earlier NLP tasks could be framed as question-answering exercises with additional context. This idea led to asking AI tools basic questions, such as historical inquiries or simple translations.
  • 2021, biggest prompt engineering test to date: A group of researchers conducted a test in which a pretrained model ran 12 NLP tasks. To solve these tasks, the model had to answer prompts correctly.
  • 2023, height of prompt engineering: By 2023, numerous text-to-text and text-to-image prompt databases had been made publicly available. These tools have become increasingly popular; ChatGPT is one example.

What are key types of prompt engineering?

Several kinds of prompt engineering offer unique ways to guide generative AI models and instill in them the right skills for particular use cases; a few of these are sketched in the example that follows this list.

  • Zero-shot learning gives the model a query to handle with no examples or prior context, gauging how the AI responds on its own.
  • One-shot learning provides the model with a single example or piece of context to help it produce the right output.
  • Few-shot learning gives the AI model a handful of examples and context clues so it can recognize the pattern it should follow.
  • Chain-of-thought prompting asks the AI model to work through a query step by step, producing a more comprehensive, humanlike treatment of the query.
  • Iterative prompting gives a prompt and then refines it based on the output from the initial prompt.
  • Negative prompting guides the AI model on what not to produce rather than what to produce.
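
To make the differences concrete, the sketch below shows how prompts for a few of these techniques might be written. The question and worked examples are made up for illustration; in practice these strings would be sent to a generative AI model through whatever interface the tool provides.

```python
# Illustrative prompt templates for several of the techniques listed above.

question = "A store sells 3 laptops per day. How many does it sell in a week?"

# Zero-shot: the bare query, with no examples or added context.
zero_shot = question

# Few-shot: a handful of worked examples precede the query so the model can
# pick up the expected pattern.
few_shot = (
    "Q: A bakery sells 5 loaves per day. How many in a week?\n"
    "A: 5 x 7 = 35 loaves.\n"
    "Q: A kiosk sells 2 phones per day. How many in a week?\n"
    "A: 2 x 7 = 14 phones.\n"
    f"Q: {question}\n"
    "A:"
)

# Chain-of-thought: the prompt asks the model to reason step by step before
# stating its final answer.
chain_of_thought = (
    f"{question}\n"
    "Work through the problem step by step, then state the final answer."
)

# Negative prompting: the prompt states what the model should avoid producing.
negative = (
    f"{question}\n"
    "Do not mention any products other than laptops, and do not report units "
    "other than laptops per week."
)

for name, prompt in [("zero-shot", zero_shot), ("few-shot", few_shot),
                     ("chain-of-thought", chain_of_thought), ("negative", negative)]:
    print(f"--- {name} ---\n{prompt}\n")
```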

How is prompt engineering used?

Prompt engineering is a specific method for aiding artificial intelligence tools, with use cases designed to enhance and shape how those tools perform. It involves designing and refining inputs (prompts) to help AI tools produce optimal results. This process includes providing context and crafting queries in a way that the AI can accurately interpret and respond to, ensuring that the output is relevant and useful.

Prompt engineers are people who specialize in this practice. They possess a deep understanding of both the subject matter and the AI's capabilities, allowing them to create effective prompts that guide the AI to generate accurate and meaningful responses. Prompt engineers play a crucial role in maximizing the efficiency and reliability of AI systems, especially in complex or sensitive fields like medicine and law. In these fields, prompt engineering can provide context and instruct AI systems to summarize data and assist with medical treatments or legal recommendations. 

Prompt engineering can also support cybersecurity by making AI tools better at simulating a threat in a controlled environment. This use gives software developers a more thorough testing scenario so that they can add extra layers of security during the development stage.

One other key area in which prompt engineering is invaluable is encouraging critical thinking. For generative AI tools to grow and become more comprehensive in their responses, prompt engineering helps establish patterns of critical reasoning, such as working through a query step by step. Without this structure, AI tools would provide less credible answers, since they would not analyze the prompt as thoroughly.

Frequently asked questions

What is the focus of prompt engineering?

The focus of prompt engineering is to provide further context and guidance so that AI tools can consistently improve and deliver well-articulated, well-sourced answers to a user's query.

How does prompt engineering make AI models more reliable?

Prompt engineering provides additional context and helps AI models decipher language in terms they can work with. It contributes to AI model reliability by allowing the tools to provide strong answers to prompts.

Does prompt engineering help with the input or the output of AI tools?

Prompt engineering helps with both the input and output of AI tools. It makes the prompt more understandable for the tool and the underlying large language model. It also helps refine the tool's output to ensure it is concise and useful for the user.