Last week, I presented some of our thoughts on AI search to a venture portfolio. Given the interest in the topic, I'm going to be sharing some insights over the next few days as part of an AI Optimization Series, where I'll break down how LLMs work and strategies for adapting to AI-driven search. In this first post, I'm going to talk about how big AI search really is and where it fits into a marketing strategy.

Here are some insights I go over in the video about what I've observed across top AI search platforms:

1/ AI product growth and the shift to Answer Engines

> From the data I've tracked, AI search is a small but rapidly expanding share of total search traffic. ChatGPT is the dominant player right now, with around 400M active users, but we're seeing a lot of fragmentation; new entrants are growing very quickly (e.g., DeepSeek and Grok hitting #1 in the App Store).
> We are seeing a convergence toward the Answer Engine pattern: LLMs are adding search, and search engines are adding content generation.
> Google just launched AI Mode, following the lead of Bing and other smaller engines. AI Overviews are being rolled out aggressively, replacing traditional featured snippets, and more users rely on AI-generated summaries instead of clicking through multiple links.

2/ Referral traffic and sign-ups from Answer Engines are growing rapidly

> Many companies have started to track referral traffic from AI platforms, but that only tells part of the story. The bigger shift is that users complete their entire search journey within AI search engines, asking multiple follow-up questions and making decisions, so they may never click through to your site.
> We've built a simple dashboard that plugs into Google Analytics to measure traffic and conversions from these Answer Engines. Data that Mercury and Vercel shared shows ~5% of conversions coming from AI traffic, which doesn't sound huge but is growing quickly.

3/ We might need new ways to measure impact

> Traditional SEO tends to rely on top-of-funnel traffic volumes, but in AI search, much of that happens behind the scenes. We might see fewer "research" clicks and more direct or bottom-of-funnel sessions.
> We'll need to focus on conversions and user journeys instead of just raw traffic. Users might discover a brand entirely within an LLM conversation, skip the usual research phase, and show up directly when they're ready to buy.

In the next post, I'll go into more detail on how AI is changing the way people find information and ways to optimize for it. If that's something you're interested in, you can follow along for updates.
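As a rough illustration of the dashboard logic described above, the sketch below classifies converting sessions by referrer hostname. The hostname list and the session data shape are assumptions for illustration only, not the actual Mercury/Vercel setup or the Google Analytics API:

```python
# Toy sketch: classify referral traffic as coming from AI "Answer Engines".
# The hostname list is illustrative and would need to be kept up to date.
AI_REFERRERS = {
    "chat.openai.com",
    "chatgpt.com",
    "perplexity.ai",
    "www.perplexity.ai",
    "gemini.google.com",
    "copilot.microsoft.com",
}

def is_ai_referral(referrer_host: str) -> bool:
    """Return True when a session's referrer is a known AI platform."""
    return referrer_host.lower() in AI_REFERRERS

def ai_conversion_share(sessions: list[dict]) -> float:
    """Fraction of converting sessions whose referrer is an AI platform."""
    converted = [s for s in sessions if s["converted"]]
    if not converted:
        return 0.0
    ai = [s for s in converted if is_ai_referral(s["referrer_host"])]
    return len(ai) / len(converted)

# Hypothetical sessions, e.g. exported from an analytics tool.
sessions = [
    {"referrer_host": "chatgpt.com", "converted": True},
    {"referrer_host": "www.google.com", "converted": True},
    {"referrer_host": "perplexity.ai", "converted": False},
    {"referrer_host": "www.bing.com", "converted": True},
    {"referrer_host": "gemini.google.com", "converted": True},
]
print(f"AI share of conversions: {ai_conversion_share(sessions):.0%}")
```

A real pipeline would pull the referrer and conversion fields from an analytics export rather than a hard-coded list, but the classification step is essentially this lookup.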
Innovations in AI-Powered Search Technology
Explore top LinkedIn content from expert professionals.
Summary
Innovations in AI-powered search technology are transforming how we access and interact with information, moving from traditional methods to conversational, context-aware, and multimodal systems. These advancements integrate AI models to provide instant answers, combining text, images, and personalized insights for a seamless user experience.
- Focus on user questions: Redesign content strategies to address user needs directly, as AI search engines prioritize delivering specific answers instead of redirecting users to multiple sources.
- Adapt to AI-driven journeys: Shift your measurement and outreach strategies to consider how users engage directly with AI platforms, often completing their decision-making process entirely within these systems.
- Explore multimodal capabilities: Incorporate both textual and visual elements into your data and content to support advanced AI systems like Multimodal RAG, which provide richer and more accurate search results.
Brain Boost Drop #21: Multimodal RAG Explained Visually!

Retrieval-Augmented Generation (RAG) is revolutionizing AI-powered search and retrieval systems, but it's no longer limited to just text. With the integration of multimodal capabilities, we can now combine both text and images to enhance the retrieval process, making AI systems more context-aware and capable of providing richer, more accurate responses.

How does Multimodal RAG work?
1️⃣ A custom knowledge base is built using both text and images.
2️⃣ Images are converted into embeddings using specialized image embedding models and stored in a vector database.
3️⃣ Similarly, text is processed using text embedding models and indexed for retrieval.
4️⃣ When a query is made, it is converted into embeddings using text embedding models.
5️⃣ A similarity search is performed in the vector database to fetch the most relevant images and text.
6️⃣ The retrieved content is combined and used as context to prompt a multimodal large language model (LLM).
7️⃣ The LLM generates a response, leveraging both textual and visual data to provide a more accurate and contextualized answer.

Why does this matter? Multimodal RAG enables AI to go beyond traditional text-based retrieval and integrate visual understanding, making it ideal for applications such as:
✅ AI-powered search engines
✅ Advanced chatbots with better context awareness
✅ Medical and scientific research assistance
✅ E-commerce and recommendation systems
✅ Legal and financial document analysis

The future of knowledge retrieval is multimodal! If you're building AI applications that rely on enhanced retrieval mechanisms, Multimodal RAG is something you should explore. What are your thoughts on the future of AI-powered retrieval? Let's discuss!

Follow Nikhil Kassetty for more Brain Boost Drops. #AI #MachineLearning #MultimodalRAG #LLM #KnowledgeRetrieval #AIInnovation #DeepLearning
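The retrieval steps above (roughly steps 2 through 5) can be sketched in a few lines. The `embed` function below is a deterministic toy stand-in for real image and text embedding models (e.g. CLIP-style encoders), the in-memory list stands in for a vector database, and the final multimodal LLM call (steps 6-7) is left out:

```python
import math

DIM = 16  # toy embedding dimension

def embed(text: str) -> list[float]:
    """Toy embedding: hash each token into a fixed-size vector.
    A real system would use trained image/text embedding models."""
    vec = [0.0] * DIM
    for token in text.lower().split():
        vec[sum(ord(c) for c in token) % DIM] += 1.0
    return vec

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

# Steps 1-3: build the "vector database" from text chunks and images
# (image captions stand in for actual image embeddings here).
knowledge_base = [
    {"kind": "text",  "content": "refund policy allows returns within 30 days"},
    {"kind": "image", "content": "diagram of the returns and refund workflow"},
    {"kind": "text",  "content": "shipping times vary by region"},
]
for item in knowledge_base:
    item["embedding"] = embed(item["content"])

# Steps 4-5: embed the query and run a similarity search.
def retrieve(query: str, k: int = 2) -> list[dict]:
    q = embed(query)
    ranked = sorted(knowledge_base,
                    key=lambda it: cosine(q, it["embedding"]),
                    reverse=True)
    return ranked[:k]

# Step 6: combine retrieved text and images into one prompt for a
# multimodal LLM (the LLM call itself is out of scope here).
hits = retrieve("how do refund returns work")
prompt = "Context:\n" + "\n".join(f"[{h['kind']}] {h['content']}" for h in hits)
print(prompt)
```

Production systems swap the toy pieces for real embedding models and a vector store, but the query-embed, similarity-search, assemble-context flow is the same.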
-
In 2024, AI started to change the way we search for information on the internet. In 2025, it'll change the way we search for information at work. Here are 5 trends we're seeing among Guru customers that will accelerate this year:

1. Search queries will look a lot more like questions
Think about how you search today: if you want to know your team's priorities for the quarter, you might type 'Q1 planning 2025' into your company wiki and hope to find a document that contains the answer. But in 2025, you should be able to directly ask what you want to know and expect an accurate answer. There's a fundamental shift underway from hunting for documents to getting instant answers to your questions.

2. We will get frustrated by lists of links in our search results
We're already seeing this shift in Google search results, where AI summaries are now common. At work, more employees will lose patience when they conduct a search and just get a list of files to sort through. In 2025, you should expect your company's tools to provide immediate answers, not just point you to where you might find them.

3. Unstructured data won't be out of reach any more
With AI's ability to search across things like meeting transcripts and chat conversations, it's now possible to access much more information than before. In 2025, you should expect to find information that's buried in a Slack thread as easily as you find a file in a carefully structured wiki.

4. Search will start to feel more like chat
Today, each search query happens in isolation: if you need to refine your results or ask a follow-up question, you start over with a new search. In 2025, search will become more conversational. Your follow-up questions should build on previous context. For example, if you ask 'What's our PTO policy?', you should then be able to ask questions like 'When did this change?' and 'How do I request time off?'

5. Search will emerge as the #1 job-to-be-done by AI
Most CEOs I speak to have the same priority for 2025: to figure out exactly how best to leverage AI at their company. Ever since we launched our first AI search product in early 2023, we've seen 'search' emerge as one of the most practical, valuable ways to use AI at work. In 2025, I think it will become a cornerstone job-to-be-done of AI in the enterprise. After all, it helps every employee do something that's fundamental to every role at every company: find the information they need, faster than ever before.

Are you using AI search at work yet? What has your experience been like?
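The "search will feel more like chat" trend above can be sketched as a thin wrapper that folds earlier turns into each new query, so a vague follow-up like "When was this changed?" still lands on the right document. The documents and the keyword-overlap scorer below are illustrative stand-ins, not Guru's actual implementation:

```python
import re

# Hypothetical company knowledge snippets.
DOCS = [
    "PTO policy: employees accrue 20 days of paid time off per year",
    "PTO policy changed in January 2024 to add parental leave",
    "Request time off through the HR portal under Benefits",
    "Q1 planning doc lists team priorities for the quarter",
]

def tokens(s: str) -> set[str]:
    """Lowercase alphanumeric tokens, ignoring punctuation."""
    return set(re.findall(r"[a-z0-9]+", s.lower()))

class ConversationalSearch:
    """Search where each follow-up inherits context from prior turns."""

    def __init__(self) -> None:
        self.history: list[str] = []

    def ask(self, query: str) -> str:
        # Fold earlier questions into the query so follow-ups stay on topic.
        expanded = " ".join(self.history + [query])
        self.history.append(query)
        terms = tokens(expanded)
        # Rank documents by keyword overlap with the expanded query.
        return max(DOCS, key=lambda d: len(terms & tokens(d)))

chat = ConversationalSearch()
print(chat.ask("What's our PTO policy?"))
print(chat.ask("When was this changed?"))  # follow-up inherits the PTO context
```

A real system would use an LLM with retrieval rather than keyword overlap, but the design point is the same: the second query alone is ambiguous, and carrying the conversation history is what disambiguates it.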
-
Google Doesn't Search Anymore — It Answers.

The AI overhaul of Search has begun, and it's shaking the internet's foundation. Google has officially entered its next phase of AI-powered search with the launch of "AI Mode," a conversational interface now available across the U.S. This feature, powered by the Gemini 2.5 model, transforms Google Search into an expert-like assistant capable of providing nuanced, context-rich answers. It's a bold move that redefines how we interact with information online.

However, this shift isn't without its challenges. While AI Mode enhances the user experience, it has led to a nearly 30% drop in clickthrough rates, raising concerns among publishers and content creators about reduced web traffic. Despite these concerns, Google's dominance remains unshaken, with 136 billion monthly visits compared to ChatGPT's 4 billion.

In addition to AI Mode, Google is exploring features like automated ticket purchases, live video search, and personalized AI assistance across its suite of apps. The company also previewed Android XR smart glasses powered by AI, signaling a return to the smart glasses market in partnership with Gentle Monster and Warby Parker. These developments underscore Google's commitment to integrating AI across its platforms, even as it navigates legal, competitive, and technological shifts in the information access landscape.

#GoogleAI #SearchInnovation #AITransformation #TechNews https://lnkd.in/eedFm_5X (one of many areas they are introducing, including the F12 console)