🤖 𝐔𝐬𝐢𝐧𝐠 𝐀𝐈 𝐭𝐨 𝐡𝐚𝐧𝐝𝐥𝐞 𝐂𝐕𝐄𝐬 𝐚𝐭 𝐬𝐜𝐚𝐥𝐞

NVIDIA: Agent Morpheus: CVE Analysis at Enterprise Scale
Databricks: VulnWatch: AI-Enhanced Prioritization of Vulnerabilities

1️⃣ Applying Generative AI for CVE Analysis at an Enterprise Scale

This NVIDIA post describes an AI-powered workflow called "Agent Morpheus" that automates CVE analysis and exploitability assessment. The system uses RAG (over multiple vulnerability databases and threat-intelligence sources, the project's source code, SBOM, docs, and Internet search) with four fine-tuned Llama 3 LLMs, AI agents, and tools to autonomously investigate CVEs, determine exploitability, and generate VEX documents. Agent Morpheus integrates with container registries and security tools to automate the process from container upload to VEX document creation. By Bartley Richardson, Nicola Sessions, Michael Demoret, Rachel Kay Allen, and Hsin Chen.

📎 https://lnkd.in/gq5S_pqR

----

2️⃣ VulnWatch: AI-Enhanced Prioritization of Vulnerabilities

Anirudh Kondaveeti describes Databricks' AI-driven system for detecting, classifying, and prioritizing vulnerabilities, achieving 85% accuracy in identifying business-critical issues and "no false negatives in back-tested data." 🤯

The system ingests CVE data from multiple sources, extracts relevant features (CVSS, EPSS, availability of an exploit or patch, …), and uses an ensemble of scores (severity, component, topic) to prioritize vulnerabilities. It leverages LLMs and vector similarity to match each identified library against existing Databricks libraries, and employs automated instruction optimization to improve accuracy. This approach has reduced manual workload by 95%, allowing the security team to focus on the most critical 5% of vulnerabilities.

📎 https://lnkd.in/gUrSk8-z

#cybersecurity #ai
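To make the prioritization idea concrete: the VulnWatch post describes an ensemble of severity, component, and topic scores, but it does not publish formulas or weights. The sketch below is only an illustration of that kind of ensemble; the feature names, weights, and adjustments are invented for this example and are not taken from Databricks.

```python
from dataclasses import dataclass

@dataclass
class CveFeatures:
    """Hypothetical per-CVE features (names and ranges are illustrative only)."""
    cvss: float              # 0-10 base severity
    epss: float              # 0-1 exploit-prediction score
    exploit_available: bool  # is a public exploit known?
    patch_available: bool    # is a fix already shipped?
    component_score: float   # 0-1: how business-critical the affected library is
    topic_score: float       # 0-1: relevance of the CVE description to high-risk topics

def severity_score(f: CveFeatures) -> float:
    """Blend CVSS/EPSS with exploit/patch availability into a 0-1 severity signal."""
    s = 0.6 * (f.cvss / 10.0) + 0.4 * f.epss
    if f.exploit_available:
        s = min(1.0, s + 0.15)   # a public exploit raises urgency
    if f.patch_available:
        s = max(0.0, s - 0.05)   # an available fix slightly lowers it
    return s

def priority(f: CveFeatures, weights=(0.5, 0.3, 0.2)) -> float:
    """Ensemble of severity, component, and topic scores (weights are made up)."""
    w_sev, w_comp, w_topic = weights
    return w_sev * severity_score(f) + w_comp * f.component_score + w_topic * f.topic_score

# Rank a backlog so the security team only reviews the top slice.
backlog = {
    "CVE-2024-0001": CveFeatures(9.8, 0.92, True, False, 0.9, 0.8),
    "CVE-2024-0002": CveFeatures(5.3, 0.02, False, True, 0.2, 0.1),
}
for cve_id, feats in sorted(backlog.items(), key=lambda kv: priority(kv[1]), reverse=True):
    print(cve_id, round(priority(feats), 3))
```

In the actual system the component and topic signals come from LLMs and vector similarity against the internal library inventory; here they are just hand-set numbers to show how the ensemble combines them.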
How Databricks is Transforming AI
Summary
Databricks is revolutionizing artificial intelligence by combining data analytics and machine learning on a unified platform, enhancing the ability of businesses to harness and scale AI. From simplifying data management with a data lakehouse to advancing AI capabilities with integrations like MosaicML, Databricks is empowering organizations to make data-driven decisions more efficiently and effectively.
- Build a strong data foundation: Invest in a data lakehouse to centralize and organize your data, ensuring consistency and quality for AI-driven insights.
- Streamline AI implementation: Use tools like Databricks' MLOps and custom AI model training to scale AI solutions from experimental to production-level performance.
- Enable real-time insights: Leverage Databricks’ platform for integrating analytics and AI to support faster, more informed decision-making in your organization.
Yesterday at a Databricks event, I heard a COO tell everyone that building a data lakehouse is a commodity. You don't need special expertise for your industry. And they're already using AI.

It was a great story, partly because the COO was telling it. He was the sponsor who drove the initiative, and it was clearly critical to their business. Here were his lessons.

1. A lakehouse is a commodity.
This may not mean what you think it means. What he meant was that building the lakehouse is not differentiating. You can rely on outside resources to get it right. He initially tried to build a team internally and failed after months. Luckily he found a great outside agency that came in and helped him get a lakehouse up and running fast, implemented the right way.

2. It's all in the model.
The COO's goal was to bring together his data and be so data-driven that employees could make decisions about customers, including whether to keep them. This is a supply chain and transportation company. The industry part of the model involved measuring touches and their costs: the initial request for shipping, the quote, finding the actual trucks, and all the other steps involved. They got to a point where they could give each customer a touch score as an indicator of profitability and what to charge. Their analytics had to allow employees to make decisions, based on these costs, on how to quote and which customers to keep. It's a very low-margin business that makes it up in volume, so reducing costs through better decisions is critical. They want to be the one driving costs in the industry down, and this model is critical for that.

3. Good AI needs good data.
He was really skeptical about various AI vendors coming to him with AI-enabled parts of his current systems, because his data wasn't good data. He had data quality and inconsistency issues. The data lakehouse was the first phase, to get to good data. The second phase was to use AI for self-service operational analytics.

4. Use AI to make analytics easier than Excel.
He had all kinds of systems with data, from logistics to human resources, about trucks, times, costs, and customers. His biggest database? Google Sheets. It's used by many of his customers and internally for one-off analytics. He decided to fix one-off analytics first, and in the process reduce the reliance on Sheets. They built a "report metabot" using dbrx to make one-off analytics as simple as asking questions (prompts). They demoed it live. It works. The next phase, after it's been used by employees for a while, is to use it for customer self-service analytics. They started with OpenAI but moved to dbrx, which improved performance and cost about 30% less.

What's your first AI project? Do you have an executive sponsor? Is the project business-critical?

Databricks Estuary #generativeAI #dbrx
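The post shares no implementation details for the "report metabot," so the following is only a minimal sketch of the general pattern (plain-English question in, SQL out). It assumes DBRX is served behind an OpenAI-compatible chat endpoint; the endpoint URL, model name, and table schema are placeholders, not details from the talk.

```python
# Hypothetical sketch of a "report metabot": translate a one-off analytics
# question into SQL over the lakehouse. Endpoint, token, model name, and
# schema below are illustrative placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="https://<your-workspace>/serving-endpoints",  # assumed OpenAI-compatible endpoint
    api_key="<databricks-token>",
)

SCHEMA_HINT = """
Table shipments(customer_id STRING, quote_usd DOUBLE, touches INT, cost_usd DOUBLE, shipped_at DATE)
"""

def question_to_sql(question: str) -> str:
    """Ask the model for a single SQL query that answers the user's question."""
    resp = client.chat.completions.create(
        model="databricks-dbrx-instruct",  # assumed serving name for DBRX
        messages=[
            {"role": "system",
             "content": "You write one ANSI SQL query answering the user's question. "
                        "Use only this schema:\n" + SCHEMA_HINT},
            {"role": "user", "content": question},
        ],
        temperature=0,
    )
    return resp.choices[0].message.content.strip()

sql = question_to_sql("Which 10 customers had the highest cost per touch last quarter?")
print(sql)
# The generated SQL would then be reviewed and executed against the lakehouse,
# e.g. via spark.sql(sql) in a Databricks notebook.
```

In practice a system like this also needs guardrails (query review, read-only access, result validation) before anyone lets employees or customers run it self-service.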