Why is "prompt engineering" not as easy as it should seem to be? Writing prompts for generative AI tools seems like it should be super simple - especially because we can communicate with LLM's in our natural language. So writing prompts should come naturally to everyone, right? Wrong. To really be effective at writing prompts and get the best output, it takes knowledge and several skills. For example: 1️⃣ understanding GAI & how LLM's work - what they can and cannot do and their limitations (at least fundamentals). 2️⃣ domain expertise - deep familiarity with the subject matter relevant to the task can be important, otherwise, you may struggle in framing effective prompts and have no way to judge the quality of the output. 3️⃣ problem formulation and decomposition - the ability to clearly define complex problems and break them down into smaller, manageable parts. 4️⃣ iterative thinking to refine prompts based on AI responses. 5️⃣ critical thinking to structure prompts in a way that guides the AI's thought process, to analyze and evaluate AI outputs for accuracy and relevance, and adapt follow-up prompts based on LLM output. 6️⃣ communication - the ability to articulate ideas and instructions precisely in prompts and provide appropriate context for the AI. 7️⃣ creativity in framing questions or tasks in novel ways to elicit high-quality/valuable outputs. 8️⃣ curiosity - the drive to explore, experiment, and ask interesting questions, leading to more insightful prompts, more helpful output, and unexpected discoveries in your GAI conversations. I'm sure I'm missing something in this list - please let me know your thoughts. 🙏 Hopefully this gives you some sense of why using LLM's, especially for work, isn't as easy as it might seem on the surface. When I train people on how to effectively use GAI, I always stress that if you're not happy or impressed with the output, you need to look in the mirror because you're at least 50% of the problem (and maybe more!). #AI #GAI #PromptEngineering #Skills
Essential Skills for AI Prompting
Explore top LinkedIn content from expert professionals.
Summary
Mastering AI prompting is about developing essential skills that allow you to effectively communicate with generative AI tools, ensuring precise, meaningful, and high-value outputs. This process combines creativity, problem-solving, and strategic thinking, turning AI into a true collaborator for innovative solutions.
- Learn the basics: Understand how large language models function, their strengths, and their limitations to craft targeted and realistic prompts.
- Define problems clearly: Break down complex tasks into smaller, manageable questions and provide specific contexts for the AI to follow.
- Iterate and refine: Treat AI as a collaborative tool by continuously refining your prompts and analyzing its output to achieve the best results.
-
In a world where access to powerful AI is increasingly democratized, the differentiator won't be who has AI, but who knows how to direct it. The ability to ask the right question, frame the contextual scenario, or steer the AI in a nuanced direction is a critical skill that's strategic, creative, and ironically human.
My engineering education taught me to optimize systems with known variables and predictable theorems. But working with AI requires a fundamentally different cognitive skill: optimizing for unknown possibilities. We're not just giving instructions anymore; we're co-creating with an intelligence that can unlock potential.
What separates AI power users from everyone else is that they've learned to think in questions they've never asked before. Most people use AI like a better search engine or a faster typist. They ask for what they already know they want. But the real leverage comes from using AI to challenge your assumptions, synthesize across domains you'd never connect, and surface insights that weren't on your original agenda.
Consider the difference between these approaches (the second is sketched in code below):
- "Write a marketing plan for our product" (optimization for known variables)
- "I'm seeing unexpected churn in our enterprise segment. Act as a customer success strategist, behavioral economist, and product analyst. What are three non-obvious reasons this might be happening that our internal team would miss?" (optimization for unknown possibilities)
The second approach doesn't just get you better output; it gets you output that can shift your entire strategic direction. AI needs inputs that are specific rather than vague, that provide context, that guide the output format, and that expand your thinking.
This isn't just about prompt engineering; it's about developing collaborative intelligence - the ability to use AI not as a tool, but as a thinking partner that expands your cognitive range. The companies and people who master this won't just have AI working for them. They'll have AI thinking with them in ways that make them fundamentally more capable than their competition.
What are your pro-tips for effective AI prompts? #AppliedAI #CollaborativeIntelligence #FutureofWork
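For readers who want to see the "unknown possibilities" prompt in a programmatic workflow, here is a minimal sketch in Python. The OpenAI SDK and the model name are assumptions for illustration; any capable chat model and client would do.

```python
# Minimal sketch: sending the richer, multi-persona prompt to a chat model.
# Assumes the OpenAI Python SDK and an OPENAI_API_KEY environment variable;
# the model name is a placeholder, not prescribed by the post.
from openai import OpenAI

client = OpenAI()

known_variables_prompt = "Write a marketing plan for our product."

unknown_possibilities_prompt = (
    "I'm seeing unexpected churn in our enterprise segment. "
    "Act as a customer success strategist, behavioral economist, and product analyst. "
    "What are three non-obvious reasons this might be happening "
    "that our internal team would miss?"
)

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name
    messages=[{"role": "user", "content": unknown_possibilities_prompt}],
)
print(response.choices[0].message.content)
```

The point of the sketch is the contrast between the two prompt strings, not the API call itself: the second prompt carries context, personas, and a question the asker doesn't already know the answer to.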
-
The PAIR framework for developing generative AI skills, at Harvard Business Publishing Education.
Here are five pivotal skills I've identified - based on AI research, firsthand observations of student interactions with AI, and my own hands-on experiences with AI - that students need to develop to successfully use these tools.
1. #ProblemFormulation, which is the ability to identify, analyze, and define problems. Students need to successfully translate what they hope to get from a generative AI tool into a well-defined problem that the large language models (LLMs) can understand. Problem formulation is the thinking you do before you attempt to prompt the AI; it's outlining the focus, scope, and boundaries of a problem. Simply put, without a deep understanding of the problem to be solved, your prompts won't be effective - no matter how well they're phrased for AI. (To learn more about problem formulation, read my HBR article, "AI Prompt Engineering Isn't the Future.")
2. #Exploration. With so many new AI products emerging every week, it is increasingly important and difficult to identify the most suitable tool for the task at hand. To be able to do this, students must be familiar with major generative AI tools such as ChatGPT and Stable Diffusion, excel in using generative AI-enhanced search engines such as Microsoft Bing and Google Bard, and remain motivated and curious to keep up with whatever generative AI tools and enhancements are coming next.
3. #Experimentation. Given the ever-evolving nature of these tools, one effective way to keep up is to just continue experimenting with them. Experimentation involves a hands-on interaction with the AI, a process of trial and error, and an assessment of the outcomes.
4. #CriticalThinking. Generative AI tools sometimes produce inaccurate or biased content - arguably their greatest limitation. Critical thinking helps identify and mitigate this limitation. It's about applying a disciplined, objective lens to evaluate the information or arguments generated, which also deepens students' learning.
5. #Willingness to reflect. Engaging with generative AI systems can sometimes stir emotions, particularly when the tools are used for tasks closely tied to one's identity or self-worth. For example, if a student identifies as a great writer or creative designer, they may perceive assistance from AI on related tasks as a threat to their identity or worth. Adopting a reflective practice can help students understand these emotional reactions. Although it shares certain elements with critical thinking, reflection focuses on examining one's personal thoughts, feelings, beliefs, and actions, as opposed to the AI's output.
https://lnkd.in/e2Y3c9g5
-
The ability to effectively communicate with generative AI tools has become a critical skill.
A. Here are some tips for getting the best results:
1) Be crystal clear - Replace "Tell me about oceans" with "Provide an overview of the major oceans and their unique characteristics."
2) Provide context - Include relevant background information and constraints.
3) Structure logically - Organize instructions, examples, and questions in a coherent flow.
4) Stay concise - Include only the necessary details.
B. Try the "Four Pillars":
1) Task - Use specific action words (create, analyze, summarize).
2) Format - Specify the desired output structure (list, essay, table).
3) Voice - Indicate tone and style (formal, persuasive, educational).
4) Context - Supply relevant background and criteria.
C. Advanced techniques (a combined sketch follows below):
1) Chain-of-Thought Prompting - Guide the AI through step-by-step reasoning.
2) Assign a Persona - "Act as an expert historian" to tailor the expertise level.
3) Few-Shot Prompting - Provide examples of desired outputs.
4) Self-Refine Prompting - Ask the AI to critique and improve its own responses.
D. Avoid:
1) Vague instructions that lead to generic responses.
2) Overloading the prompt with too much information at once.
What prompting techniques have yielded the best results in your experience? #legaltech #innovation #law #business #learning
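To make the Four Pillars and the advanced techniques concrete, here is a rough sketch of a single request that combines a persona, one few-shot example, and an explicit task/format/context. The SDK, model name, and the sample clause text are illustrative assumptions, not part of the original tips.

```python
# Sketch: persona + few-shot example + Four Pillars (task, format, voice, context)
# combined in one chat request. Assumes the OpenAI Python SDK; model name is a placeholder.
from openai import OpenAI

client = OpenAI()

messages = [
    # Persona + voice: set role and tone up front.
    {"role": "system",
     "content": "Act as an experienced legal-tech analyst. Use a formal, concise tone."},
    # Few-shot: one example of the desired input and output format.
    {"role": "user",
     "content": "Summarize: 'The parties agree to binding arbitration in Delaware.'"},
    {"role": "assistant",
     "content": "- Dispute resolution: binding arbitration\n- Venue: Delaware"},
    # Task + format + context: specific action word, output structure, and audience.
    {"role": "user",
     "content": ("Summarize the indemnification clause below as a bulleted list "
                 "for a non-lawyer audience:\n<clause text here>")},
]

response = client.chat.completions.create(model="gpt-4o", messages=messages)
print(response.choices[0].message.content)
```

A self-refine step would simply follow up with a second request asking the model to critique and improve its own summary.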
-
𝐁𝐞𝐭𝐭𝐞𝐫 𝐀𝐈 𝐒𝐭𝐚𝐫𝐭𝐬 𝐰𝐢𝐭𝐡 𝐁𝐞𝐭𝐭𝐞𝐫 𝐏𝐫𝐨𝐦𝐩𝐭𝐬
Most people think prompt engineering is about clever wording. In reality? It's strategic thinking in disguise. I use this simple framework, "PROMPT," to write questions that actually work with AI:
𝗣𝘂𝗿𝗽𝗼𝘀𝗲𝗳𝘂𝗹 | 𝗥𝗲𝗹𝗲𝘃𝗮𝗻𝘁 | 𝗢𝗽𝗲𝗻-𝗲𝗻𝗱𝗲𝗱 | 𝗠𝗲𝗮𝗻𝗶𝗻𝗴𝗳𝘂𝗹 | 𝗣𝗿𝗲𝗰𝗶𝘀𝗲 | 𝗧𝗲𝘀𝘁 & 𝗲𝘃𝗮𝗹𝘂𝗮𝘁𝗲
Here's the breakdown (a template sketch follows below):
➤ 𝗣𝘂𝗿𝗽𝗼𝘀𝗲𝗳𝘂𝗹 - Start with why. Are you trying to brainstorm, summarize, or analyze? Set a clear intention before you type a single word.
➤ 𝗥𝗲𝗹𝗲𝘃𝗮𝗻𝘁 - Give the AI context, and keep it tightly focused. One misplaced detail can send it spinning into the wrong zone.
➤ 𝗢𝗽𝗲𝗻-𝗲𝗻𝗱𝗲𝗱 - Ask questions that begin with how, why, or "tell me about." Avoid yes/no dead ends.
➤ 𝗠𝗲𝗮𝗻𝗶𝗻𝗴𝗳𝘂𝗹 - Make it count. Don't ask generic filler questions; craft prompts that actually move your work forward.
➤ 𝗣𝗿𝗲𝗰𝗶𝘀𝗲 - Refine your ask. Mention your audience, desired format (bullets, list, outline), or even tone. Think briefing, not brainstorming.
➤ 𝗧𝗲𝘀𝘁 & 𝗲𝘃𝗮𝗹𝘂𝗮𝘁𝗲 - Great prompts don't always happen on the first try. Test variations. See what works best.
AI is only as good as the human behind the keyboard. If you're building in AI, leading a team, or just trying to get sharper thinking from your tools, PROMPT is a cheat code.
📌 Save this. Share it with your team. And let me know your go-to prompt move.
📩 DM me to learn about my AI Training on Leadership.
#JeffEyet #TheBerkeleyInnovationGroup #AI #PromptEngineering #Strategy #Entrepreneurship #Productivity #Growth
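One way to operationalize the checklist is a small prompt-building helper. This is only a sketch; the field names and example values are assumptions for illustration and are not part of the PROMPT framework itself.

```python
# Sketch: assembling a prompt that is purposeful, relevant, open-ended,
# meaningful, and precise. Field names and example values are illustrative.
def build_prompt(purpose: str, context: str, question: str, audience: str, fmt: str) -> str:
    """Return a single prompt string covering the first five PROMPT elements."""
    return (
        f"Purpose: {purpose}\n"
        f"Context: {context}\n"
        f"Audience: {audience}\n"
        f"Desired format: {fmt}\n"
        f"Question: {question}"
    )

draft = build_prompt(
    purpose="Brainstorm retention ideas",
    context="B2B SaaS company, 50 people, churn rising in the SMB tier",
    question="How might we reduce SMB churn next quarter?",  # open-ended, not yes/no
    audience="Head of Customer Success",
    fmt="Outline with 3-5 bullets per idea",
)
# Test & evaluate: run several variants of `draft` and compare the outputs
# before settling on the version you reuse.
print(draft)
```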
-
"You don't need to be a data scientist or a machine learning engineer - everyone can write a prompt."
Google recently released a comprehensive guide on prompt engineering for Large Language Models (LLMs), specifically Gemini via Vertex AI. Key takeaways from the guide:
What is prompt engineering really about? It's the art (and science) of designing prompts that guide LLMs to produce the most accurate, useful outputs. It involves iterating, testing, and refining - not just throwing in a question and hoping for the best.
Things you should know:
1. Prompt design matters. Not just what you say, but how you say it: wording, structure, examples, tone, and clarity all affect results.
2. LLM settings are critical (see the sketch after this list):
• Temperature = randomness. Lower means more focused, higher means more creative (but riskier).
• Top-K / Top-P = how much the model "thinks outside the box."
• For balanced results, Temperature 0.2 / Top-P 0.95 / Top-K 30 is a solid start.
3. Prompting strategies that actually work:
• Zero-shot, one-shot, few-shot
• System / Context / Role prompting
• Chain of Thought (reasoning step by step)
• Tree of Thoughts (explore multiple paths)
• ReAct (reasoning + external tools = power moves)
4. Use prompts for code too - writing, translating, debugging. Just test your output.
5. Best-practices checklist:
• Use relevant examples
• Prefer instructions over restrictions
• Be specific
• Control token length
• Use variables
• Test different formats (Q&A, statements, structured outputs like JSON)
• Document everything (settings, model version, results)
Bottom line: Prompting is a strategic skill. If you're building anything with AI, this is a must-read 👇
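Here is a minimal sketch of those starting settings applied in code. It uses the google-generativeai Python SDK as an assumption (the guide itself targets Gemini via Vertex AI), and the model name is a placeholder.

```python
# Sketch: applying the suggested starting settings (temperature 0.2, top-p 0.95, top-k 30)
# when calling Gemini. Assumes the google-generativeai SDK and a GOOGLE_API_KEY env var;
# the model name is a placeholder.
import os
import google.generativeai as genai

genai.configure(api_key=os.environ["GOOGLE_API_KEY"])

model = genai.GenerativeModel("gemini-1.5-flash")  # placeholder model name

response = model.generate_content(
    "Summarize the trade-offs between temperature, top-k, and top-p sampling.",
    generation_config={
        "temperature": 0.2,  # lower = more focused, less random
        "top_p": 0.95,       # nucleus-sampling cutoff
        "top_k": 30,         # candidate-token pool size
    },
)
print(response.text)
```

Per the guide's checklist, it's worth documenting the settings and model version alongside each prompt you keep, since results change as either one changes.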
-
🧠 The most valuable skill in the AI era? 𝗔𝗱𝘃𝗶𝗰𝗲 𝗳𝗹𝘂𝗲𝗻𝗰𝘆.
Personalized advice from humans is expensive - if not in dollars, then in time - so most people ration their asks. We hesitate to bother someone unless we're sure it's important. But now? The economics have changed. Thanks to #LLMs, personalized advice has never been cheaper or more accessible.
But there's a catch. The limiting factor isn't the model. It's you. If you haven't built the skill of taking advice well, all that power just turns into noise. And #advice - whether from humans or machines - is only as good as your ability to wield it. That's why I say: 𝘨𝘦𝘵𝘵𝘪𝘯𝘨 𝘢𝘥𝘷𝘪𝘤𝘦 𝘪𝘴 𝘢 𝘴𝘬𝘪𝘭𝘭. Three skills, actually. And they're your top AI skills too:
① 𝗞𝗻𝗼𝘄𝗶𝗻𝗴 𝘄𝗵𝗮𝘁 𝘁𝗼 𝗮𝘀𝗸 𝗮𝗯𝗼𝘂𝘁
This is your leadership superpower: knowing what matters enough to be worth doing better. If nothing feels worth improving, that's not wisdom - that's stagnation. It means you're not in touch with your organization's priorities (or your own). AI won't fix that for you. No wonder equal access to AI tools doesn't yield equal ROI.
② 𝗞𝗻𝗼𝘄𝗶𝗻𝗴 𝗵𝗼𝘄 𝘁𝗼 𝗮𝘀𝗸
Clarity. Context. Precision. The best advisor in the world can't help you if your question is vague or ambiguous, or if you haven't properly described your situation. The same applies to LLMs. Prompting isn't new - it's the ancient art of articulating your needs well enough to get the right result. Worried about privacy? You don't have to disclose specifics to get useful guidance. Learn to abstract, anonymize, and still get clarity.
③ 𝗞𝗻𝗼𝘄𝗶𝗻𝗴 𝗵𝗼𝘄 𝗻𝗼𝘁 𝘁𝗼 𝘁𝗮𝗸𝗲 𝗯𝗮𝗱 𝗮𝗱𝘃𝗶𝗰𝗲
This is where discernment lives. Even the most eloquent response can be useless or harmful to you. So: how important is this decision? Have you fact-checked? Have you thought about what it would take for the advice to serve you rather than harm you? Compared second and third opinions? If not, why are you acting on it?
🎯 𝐇𝐞𝐫𝐞'𝐬 𝐭𝐡𝐞 𝐥𝐞𝐚𝐝𝐞𝐫𝐬𝐡𝐢𝐩 𝐮𝐩𝐠𝐫𝐚𝐝𝐞: treat advice as your personal AI use case. Practice. Ask better questions. Aim for better context. Hone your filters. Because while AI can give you answers instantly, it's still your job to make sure those answers are useful, safe, and aligned with what matters.
And when you're leading at scale - especially with agentic systems - your questions ripple outward. 𝗬𝗼𝘂𝗿 𝗷𝘂𝗱𝗴𝗺𝗲𝗻𝘁 𝗯𝗲𝗰𝗼𝗺𝗲𝘀 𝗮𝗿𝗰𝗵𝗶𝘁𝗲𝗰𝘁𝘂𝗿𝗲. These systems don't just respond. They act on your intent. So sharpen yourself and sharpen these three advice skills. Because the better you are at taking advice, the better you'll be at leading humans, machines, and hybrid teams.
Now that AI has changed the cost structure of advice, the smartest move is to adapt your skills and habits to match. If you're not already thinking of LLMs in this way, the time to start is now!
This post is #sponsored by monday.com to help you lead better in the AI era.
-
Yesterday I had the pleasure of working with leaders and teachers from L'Anse Creuse School District outside of Detroit for one of our Train-the-Trainer Institutes. We had a great time digging into all things GenAI!
Our 1-day institute focuses on two key PD sessions: Introduction to Generative AI for Educators and Prompting 101. We work to upskill the new trainers on foundational concepts of GenAI before equipping them with strategies to turnkey this work in their schools.
In our Prompting 101 session we focus on strategies for getting the best out of popular and powerful free GenAI tools like ChatGPT, Claude, and Gemini. There are many different prompt frameworks out there for educators to use - including our 5S Framework: Set the scene (priming), be Specific, Simplify language, Structure output, and Share feedback. We also break down good prompting into the following four steps:
1. Clarity is Key - Explicitly state what you would like the model to do. The more specific your prompt, the more accurate and tailored the AI's response will be. General prompts will result in general responses.
2. Pick the Right Prompting Technique - You may be able to get what you need from one well-structured prompt (one-shot prompting), but there are other techniques too. You can provide examples in your prompt to guide the AI's responses (few-shot prompting), or break your request into steps (chain-of-thought prompting).
3. Provide Context - The model's input is called a "context window" for a reason! Give the AI as much necessary background information as possible. This will help it prepare a response that fits your needs.
4. Format Matters - A well-structured prompt guides the AI in understanding the exact nature of your request. Use clear and direct language, and structure your prompt logically.
So what does that look like in practice for a one-shot prompt? An OK prompt for educators might look like this: "Create a lesson plan about multiplying fractions for 5th graders."
A better prompt would look like: "Act as an expert mathematician and a teacher skilled in designing engaging learning experiences for upper elementary students. Design a lesson plan about multiplying fractions for 5th grade students."
And an even more effective prompt would be: "You are an expert mathematician and teacher skilled in Universal Design for Learning. Design an accessible lesson plan about multiplying fractions for 5th grade students interested in soccer. The lesson should include a hands-on activity and frequent opportunities for collaboration. Format your response in a table." (This progression is sketched in code below.)
We take this approach every time we create one of the more than 100 customizable prompts in our Prompt Library. You can check out our complete prompt library here: https://lnkd.in/evExAZSt. AI for Education #teachingwithAI #promptengineering #GenAI #aieducation #aiforeducation
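For anyone scripting their prompts rather than typing them into a chat window, the same progression can be laid out as plain strings with the 5S elements annotated. This is only an illustrative sketch; the variable names are not part of the 5S Framework, and the prompt text comes straight from the examples above.

```python
# Sketch: the OK / better / most effective prompt progression, annotated with 5S elements.
ok_prompt = "Create a lesson plan about multiplying fractions for 5th graders."

better_prompt = (
    # Set the scene (priming): give the model a role.
    "Act as an expert mathematician and a teacher skilled in designing engaging "
    "learning experiences for upper elementary students. "
    "Design a lesson plan about multiplying fractions for 5th grade students."
)

most_effective_prompt = (
    # Set the scene + be Specific: role, framework, audience, and student interests.
    "You are an expert mathematician and teacher skilled in Universal Design for Learning. "
    "Design an accessible lesson plan about multiplying fractions for 5th grade students "
    "interested in soccer. "
    # Simplify language + Structure output: concrete requirements and a named format.
    "The lesson should include a hands-on activity and frequent opportunities for "
    "collaboration. Format your response in a table."
)
# Share feedback: after reviewing the output, follow up with specific revision requests.
```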
-
Everyone tells you to use AI as a PM, but nobody tells you how to use it the RIGHT way. Here are 3 rules that will help you make AI your secret weapon to save time, solve problems faster, and deliver better outcomes:
𝗥𝗨𝗟𝗘 𝗢𝗡𝗘 → 𝗣𝗿𝗼𝗺𝗽𝘁 𝗦𝗸𝗶𝗹𝗹 𝗜𝘀 𝗘𝘃𝗲𝗿𝘆𝘁𝗵𝗶𝗻𝗴
When it comes to using AI, the most underrated skill is prompting. It makes all the difference between:
✅ Getting exactly what you want.
❌ Getting surface-level answers.
Most people blame AI for bad output, but they're wrong. Your output is only as good as your input, and the quality of your input comes down to how you prompt. Here's how to give the right prompt (shared by Dan Mac and retweeted by OpenAI's President Greg Brockman; a template sketch follows below):
Goal → Define your objective clearly.
Return Format → Specify how the output should look.
Warnings → Add caveats or constraints.
Content Dump → Provide the relevant context to guide the AI.
This structure empowers AI to give the right outputs.
𝗥𝗨𝗟𝗘 𝗧𝗪𝗢 → 𝟮𝟬-𝟲𝟬-𝟮𝟬 𝗙𝗿𝗮𝗺𝗲𝘄𝗼𝗿𝗸
Most PMs assume AI can "figure out" what they want. It can't. AI doesn't know your organization, goals, or context. That's your job to provide. Remember, AI fills in the middle of the puzzle, but you frame the edges. For that, follow the 20-60-20 Framework:
First 20% (Your Input) → Brain-dump all relevant context.
Middle 60% (AI's Role) → Let AI generate drafts and solutions.
Final 20% (Your Refinement) → Edit out AI artifacts and add your unique human insights.
AI is indeed a powerful partner, but it's not a replacement for your judgment.
𝗥𝗨𝗟𝗘 𝗧𝗛𝗥𝗘𝗘 → 𝗥𝗲𝘃𝗶𝘀𝗲 𝗧𝗶𝗹𝗹 𝗬𝗼𝘂 𝗚𝗲𝘁 𝗪𝗵𝗮𝘁 𝗬𝗼𝘂 𝗪𝗮𝗻𝘁
AI rarely delivers the perfect result on the first try. That's where you come in. You need to collaborate with AI like a teammate:
→ Highlight gaps in the output.
→ Point out areas that need improvement.
→ Provide clear, actionable feedback for the next round.
Here's the workflow of the best PM I know. She uses AI like this:
→ Create a basic prompt and get initial ideas.
→ Add examples and refine scope.
→ Consider stakeholder constraints.
→ Fine-tune tone, details, and polish.
In a nutshell: using AI right is a skill of its own. If you want to shorten your learning curve and master exactly how to use AI - with a detailed guide on the most common use cases, long prompts, and more - go here: https://lnkd.in/er5E5Buf
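Here is a rough sketch of the Goal / Return Format / Warnings / Content Dump structure as a reusable template. The helper function and the example values are assumptions for illustration, not the exact wording shared in the original thread.

```python
# Sketch: a reusable template for the Goal / Return Format / Warnings / Content Dump structure.
def structured_prompt(goal: str, return_format: str, warnings: str, content_dump: str) -> str:
    """Assemble the four sections into one prompt string."""
    return (
        f"Goal: {goal}\n\n"
        f"Return format: {return_format}\n\n"
        f"Warnings: {warnings}\n\n"
        f"Context: {content_dump}"
    )

prompt = structured_prompt(
    goal="Draft a PRD section on onboarding friction for our mobile app.",
    return_format="Markdown with headings: Problem, Evidence, Proposed changes, Open questions.",
    warnings="Do not invent metrics; flag every assumption explicitly.",
    content_dump="<paste relevant user research, metrics, and stakeholder constraints here>",
)
print(prompt)
```

The 20-60-20 framework maps onto this directly: the content dump is your first 20%, the model's draft is the middle 60%, and editing its output is your final 20%.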
-
Using AI in recruiting (e.g., for AI Sourcing) is like working with a junior recruiter. It takes things way too literally and might not get everything right the first time around, but it's an eager partner and very good at taking feedback.
Hundreds of recruiters have leveraged Gem AI at this point, and my biggest takeaway is: prompt quality matters so much - the quality of your written qualification determines the quality of your output (the profiles you see in AI Sourcing). You need to be explicit and literal to avoid confusing it.
Before leaning on AI, get crystal clear on what you're looking for:
- How would you describe your ideal match?
- Are there examples of what set apart the best talent in your mind?
- What specifically should the AI Sourcer be looking for on a resume?
Once you've thought through how you'd answer those questions, here are a few additional tips to help you with prompting (summarized in the sketch below):
1. Be as specific as possible. Avoid language that's too broad, like "software development experience." Instead, use specific language about length and type: "Experience using React and Python on the job (not just on school projects)."
2. Provide necessary information, but be concise. The longer and more complex your prompt is, the more likely it is to confuse your AI tool and lead to less relevant results. Instead of "We need the prospect to have between 3 and 5 years of experience working as a product manager, as this is a mid to senior role," condense your prompt to "3-5 years' experience with product management in B2B SaaS."
3. Avoid subjective qualifications. Your AI tool won't know what you mean if you use terms like "good" or "top." Instead of "went to a good school," say "attended a top 20 college for computer science based on US News 2024 rankings."
4. Avoid vague qualifications about soft skills. AI tools struggle with terms like "entrepreneurial skills." Rewrite these as measurable experiences like "experience contributing to or leading new product initiatives."
5. Bundle similar qualifications together. Don't list skills individually. Create one qualification that covers related requirements: "proficiency with Python, R, and C++."
💡 Getting the right results from AI comes down to this: the clearer your qualifications, the better your matches.
Bonus: Iterate on your qualifications as you get calibrated. In the same way you'd calibrate with a human sourcer, you'll continue to get better and better results as you adjust your criteria.
For a more detailed view of best practices for optimizing AI Search, check out our blog: https://lnkd.in/g5APPgMr
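To pull the five tips together, here is an illustrative before/after list of qualifications. The Python lists are just a convenient way to show the rewrites side by side; this is not the Gem AI input format, and the example qualifications come from the tips above.

```python
# Sketch: vague qualifications vs. rewrites that follow the five tips.
vague_qualifications = [
    "software development experience",   # too broad
    "went to a good school",             # subjective
    "entrepreneurial skills",            # vague soft skill
]

calibrated_qualifications = [
    # Tip 1 - specific about length and type of experience:
    "Experience using React and Python on the job (not just on school projects)",
    # Tip 2 - necessary information, stated concisely:
    "3-5 years' experience with product management in B2B SaaS",
    # Tip 3 - objective instead of subjective:
    "Attended a top 20 college for computer science based on US News 2024 rankings",
    # Tip 4 - soft skill rewritten as a measurable experience:
    "Experience contributing to or leading new product initiatives",
    # Tip 5 - related requirements bundled into one qualification:
    "Proficiency with Python, R, and C++",
]
```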