Mastering the Machine: Ten LLM Prompting Tips

Chris Weidemann

Large Language Models (LLMs) are still a mysterious topic for many. When they initially became popular, the general internet user saw them as an advancement on search engines. Now many are beginning to understand that traditional search engines sift through indexed content, while LLMs generate responses based on how they are prompted. What many have yet to learn is that the quality of your interaction with an LLM depends largely on how well you phrase your questions (prompts). In this article, we delve into the art of “prompt engineering,” exploring ten tips that will refine your querying skills and ensure you get the most accurate and relevant information from LLMs.

Understanding how to effectively prompt an LLM is crucial; it’s a skill that distinguishes successful interactions from unproductive ones. Prompting is more than just asking a question; it’s about crafting a context, setting the stage for the AI, and being specific about the expertise you require. If you give these advanced techniques a chance, you will learn to shape your prompts to elicit the best responses, helping you leverage the full capabilities of the AI. Whether you’re a developer, researcher, or just an AI enthusiast, mastering these strategies will enhance your efficiency and expand the horizons of what you can achieve with LLMs.

Personally, I would keep this list handy. The next time you find yourself in your favorite LLM, try your normal prompt, then rewrite it using one of these advanced techniques and compare the outputs. You’ll quickly start to notice the tacit patterns that make the difference.

1. Specify your context clearly and ask the LLM to act as an expert in that context

Tip: Start your interaction by clearly defining the context or background of your query.

Explanation: Providing a clear context helps the LLM understand the scope and specifics of your inquiry, which improves the accuracy and relevance of the responses. For instance, if you are asking for advice on software development, asking the LLM to act as a subject matter expert in that programming language and explaining the context of your project can significantly enhance the guidance you receive.

Example: “Act as a Senior Software Engineer who specializes in Python and Django. Analyze the following and provide your feedback as inline comments. The software’s requirements document states that it should perform X, Y, and Z tasks while conforming to our team’s guidelines, which can be found here: https://intranet.company.com/bestpractices”
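If you work with a model through an API rather than a chat window, the same idea maps onto the system message. Here is a minimal sketch, assuming the OpenAI Python SDK, an API key in the environment, and an illustrative model name; the guideline wording and code snippet are placeholders:

```python
# A minimal sketch of tip 1 when calling a model through an API instead of a
# chat window. Assumes the OpenAI Python SDK, OPENAI_API_KEY in the
# environment, and an illustrative model name.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o",  # illustrative; substitute whatever model you use
    messages=[
        # The system message carries the expert context for the whole exchange.
        {"role": "system",
         "content": "Act as a Senior Software Engineer who specializes in "
                    "Python and Django. Review code against our team's "
                    "guidelines and reply with your feedback as inline comments."},
        # The user message carries the actual material to analyze.
        {"role": "user",
         "content": "def get_users(request): ...  # code to review goes here"},
    ],
)
print(response.choices[0].message.content)
```

The system message plays the same role as the “Act as…” opening line in a chat: it fixes the expertise and scope before the model ever sees the task.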

2. Set the desired detail level

Tip: Tell the LLM how much detail you want, whether that is a quick overview, a bulleted list, or an in-depth walkthrough.

Explanation: Left to its own devices, an LLM picks the length and depth of its answer for you. Stating the desired level of detail up front keeps the response proportionate to your needs and saves a round of trimming or expanding afterward. For example, “Give me a three-bullet summary of Django’s ORM” and “Explain Django’s ORM in depth, with code examples” will produce very different answers to essentially the same question.

3. Use examples

Tip: Provide examples to clarify the type and format of the information you need.

Explanation: Examples are powerful clarifiers; they dramatically reduce the ambiguity of your requests. By showing the LLM what you’re expecting in return, you can often leapfrog common misunderstandings and get to the heart of the matter more quickly.

Example: “Can you explain how inheritance works in Python with a simple class example?”
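For reference, the kind of answer that prompt is asking for looks something like this minimal sketch (the class names are just illustrative):

```python
# A simple inheritance example of the sort the prompt above should elicit.
class Animal:
    """Base class: behavior shared by all animals."""
    def __init__(self, name):
        self.name = name

    def speak(self):
        return f"{self.name} makes a sound."


class Dog(Animal):
    """Subclass: inherits __init__ from Animal and overrides speak()."""
    def speak(self):
        return f"{self.name} says woof!"


print(Animal("Generic").speak())  # Generic makes a sound.
print(Dog("Rex").speak())         # Rex says woof!
```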

4. Encourage creativity

Tip: Instruct the LLM to think creatively or “outside the box” for brainstorming sessions.

Explanation: Encouraging the LLM to generate creative or unconventional ideas can be incredibly beneficial, especially when tackling problems that benefit from fresh perspectives. This approach can yield innovative solutions and ideas that conventional thinking might miss.

5. Use precise and detailed questions

Tip: Frame your questions with precision and include necessary details.

Explanation: Specific questions lead to specific answers. When you use precise language and include all relevant details, you help the LLM zero in on exactly what you need. This minimizes misunderstandings and irrelevant information, making the interaction more efficient and productive.

Example: Instead of asking “How do I write better code?”, ask “What are some Python coding practices to enhance readability and performance for beginners?”

6. Iterate through a series of questions or leverage follow-up questions

Tip: If the first answer isn’t satisfactory, refine your question and ask again. Use follow-up questions to dive deeper or clarify previous answers.

Explanation: Iteration is a powerful tool. It allows you to refine or redirect the model’s focus based on initial responses, which can be particularly helpful when tackling complex or nuanced issues. This iterative loop can significantly enhance the accuracy and relevance of the information you receive. Follow-ups are essential for unpacking complex answers and exploring nuances not covered in the initial response. They help you navigate the conversation towards the information that matters most to you, enhancing the overall utility of the interaction.
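In a chat interface the history is kept for you; if you are scripting the interaction, iterating means sending the earlier answer back along with your follow-up. A minimal sketch, again assuming the OpenAI Python SDK and an illustrative model name:

```python
# A minimal sketch of iterating with a follow-up question over an API.
# Assumes the OpenAI Python SDK, OPENAI_API_KEY in the environment, and an
# illustrative model name.
from openai import OpenAI

client = OpenAI()
history = [{"role": "user",
            "content": "What are some Python coding practices to enhance "
                       "readability and performance for beginners?"}]

first = client.chat.completions.create(model="gpt-4o", messages=history)
answer = first.choices[0].message.content

# Keep the first answer in the history so the follow-up can build on it.
history.append({"role": "assistant", "content": answer})
history.append({"role": "user",
                "content": "Focus on readability only, and show a short "
                           "before-and-after snippet for each practice."})

refined = client.chat.completions.create(model="gpt-4o", messages=history)
print(refined.choices[0].message.content)
```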

7. Outline the procedure for a complex problem

Tip: When you don’t want the LLM to decide how to approach the generation, give it specific instructions to follow. By organizing an intricate query into a sequential, step-by-step procedure for the LLM to follow, you dictate how the LLM steps through the generation.

Explanation: By breaking down these challenges into clear, actionable steps, you guide the LLM through your thought process, making it easier for it to generate useful and accurate responses. This approach not only ensures that the AI addresses each part of the problem thoroughly but also helps maintain clarity and focus throughout the interaction. Structuring your query as a series of ordered tasks lets the LLM work methodically, ensuring that no detail is overlooked and that each part of the solution builds on the previous one. This is particularly effective in business contexts where decisions need to weigh various factors and their implications.

Example: “First, create a list of the best practices for cash flow management in small businesses. Second, provide the pros and cons for each of these practices when applied to a seasonal business model. Third, using the pros and cons, rate each practice with a score of one to ten, where ten is the best practice for a seasonal business model. Finally, for the top three scores, generate a summary of each one that can be provided to my business partners in an e-mail.”
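If you are scripting this kind of request, one way to keep the procedure explicit is to build the numbered steps from a plain list and send them as a single prompt. A minimal sketch, with the same assumptions about the SDK and model name as the earlier examples:

```python
# A minimal sketch: turning an ordered list of steps into one structured prompt.
# Assumes the OpenAI Python SDK and an illustrative model name.
from openai import OpenAI

steps = [
    "Create a list of the best practices for cash flow management in small businesses.",
    "Provide the pros and cons for each practice when applied to a seasonal business model.",
    "Using the pros and cons, rate each practice from one to ten, where ten is best for a seasonal business.",
    "For the top three scores, write a summary of each practice that I can e-mail to my business partners.",
]

# Number the steps so the model works through them in order.
procedure = "Complete the following steps in order:\n" + "\n".join(
    f"{i}. {step}" for i, step in enumerate(steps, start=1)
)

client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4o",  # illustrative
    messages=[{"role": "user", "content": procedure}],
)
print(response.choices[0].message.content)
```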

8. Employ keywords strategically

Tip: Use keywords effectively to guide the LLM’s focus.

Explanation: Keywords act like beacons that signal to the LLM what you consider most important about your query. Selecting the right keywords can dramatically improve the precision of the answers you receive, especially in fields crowded with jargon and specialized terminology.

Scenario: You are an entrepreneur looking for information on funding options available for startups in the technology sector.

Without strategic keywords: “What are some ways to get funding for a new business?”

With strategic keywords: “What are the best venture capital funding options for early-stage tech startups specializing in artificial intelligence?”

9. Request summaries

Tip: Ask for summaries of long texts to distill essential information.

Explanation: Summaries can help you quickly grasp the main points of a lengthy document or discussion, saving time and cognitive effort. This is particularly useful when dealing with complex topics that require quick assimilation of key facts and conclusions.

10. Create a new context window

Tip: Start a new chat session to reset the context window when the conversation becomes cluttered or irrelevant.

Explanation: As conversations with an LLM progress, they can accumulate a lot of contextual information, some of which may no longer be relevant or might even lead to confusion in responses. For many LLMs, starting a new chat session effectively resets the context window, allowing you to redefine the focus without interference from past interactions. This is particularly beneficial when shifting between distinctly different topics, dealing with sensitive information that needs clear segregation, or correcting the course of a conversation that has gone off track. Resetting the context helps maintain clarity, improve response relevance, and safeguard the structured flow of information.
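If you work through an API rather than a chat UI, “starting a new chat” simply means not carrying the old message list forward. A minimal sketch of the difference, with the same SDK and model-name assumptions as the earlier examples:

```python
# A minimal sketch: resetting the context window by starting a fresh history.
# Assumes the OpenAI Python SDK and an illustrative model name.
from openai import OpenAI

client = OpenAI()

def ask(history, question, model="gpt-4o"):
    """Send a question with the given history and record the answer in it."""
    history.append({"role": "user", "content": question})
    response = client.chat.completions.create(model=model, messages=history)
    answer = response.choices[0].message.content
    history.append({"role": "assistant", "content": answer})
    return answer

# One conversation accumulates context about a single topic...
django_chat = []
ask(django_chat, "Act as a Django expert. Review my view-layer design.")

# ...so when the topic changes, start a brand-new history instead of reusing it.
budgeting_chat = []
ask(budgeting_chat, "Act as a small-business CFO. Outline a quarterly budget template.")
```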

As you begin to apply these top tips in your daily use of LLMs, you’ll notice a significant improvement in the quality of interactions and outcomes. But don’t stop here! If you’re eager to explore more advanced techniques or seek deeper insights into leveraging AI technologies, check out my other articles. They cover a range of topics from advanced prompt crafting to integrating AI into your business.

About the Author

Chris Weidemann

Chris has been interested in what we all now refer to as AI for over ten years. In 2013, he published his first research journal article on the topic. He now helps companies implement these progressive systems. Chris' posts try to explain these topics in a way that any business decision maker (technical or nontechnical) can leverage.
