EY is proud to be a launch partner of Snowflake Intelligence. This innovative capability bridges the gap between technical and business users, making it easier for organizations to convert their data into meaningful insights. With Snowflake Intelligence, users can ask questions in plain language, without needing SQL experience, and receive answers in moments rather than days, weeks or even months. By embedding Snowflake Intelligence into daily business operations, EY teams help clients unlock their data’s full potential, delivering value at speed on their investments. Learn more about the EY-Snowflake Alliance: https://ow.ly/7x6c50XmEje #EYSnowflake #AllIn
-
Snowflake just rolled out a major upgrade to its Snowpipe Streaming architecture, and it’s a big deal for any business that depends on near real-time data to drive decisions. The new system makes it faster and more efficient to stream data into Snowflake, with less engineering overhead and more predictable costs. For organizations that rely on fresh insights, such as financial services, retail, or logistics, this means shorter time-to-decision and better operational agility.
One standout example from a global market data provider: they’re using the new architecture to stream over 100 TB of data and nearly 200 billion rows per day while keeping query times under 30 seconds. That level of performance used to require massive infrastructure investment. Now it’s built into Snowflake.
If your business runs on up-to-the-minute insights, this upgrade could translate into:
– Lower infrastructure and dev costs
– Faster decision-making
– More scalable data operations
– Better ROI on data investments
We’re excited about what this means for our clients who want to move faster and smarter with near real-time data. Curious how your organization could benefit? Let’s talk.
#dataanalytics #realTimeData #businessintelligence #Snowflake #DataSolutions #datainfrastructure #decisionmaking #DSC
-
You can’t fix what you can’t see. Most data teams discover Snowflake overspend after the bill hits: the warehouses have already been running, the budgets already blown. But what if you could see a cost breakdown in real time, while queries are still running?
That’s the game-changer our latest piece dives into: how real-time Snowflake cost visibility helps data teams stay in control, not just react.
🧠 Insight by Amit Yahalom, worth the read if cost predictability keeps you up at night. https://lnkd.in/dzV4-JSb
#Snowflake #FinOps #DataEngineering #CloudCost #DataPlatform #CostOptimization
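For a first pass at that visibility, here is a minimal sketch using Snowflake’s INFORMATION_SCHEMA.QUERY_HISTORY table function to list queries executing right now; the RESULT_LIMIT value is an arbitrary example, not a recommendation:

```sql
-- In-flight queries on the account, oldest first.
-- INFORMATION_SCHEMA.QUERY_HISTORY is a table function that returns
-- recent and currently running queries for the account.
SELECT
    query_id,
    user_name,
    warehouse_name,
    execution_status,
    start_time,
    total_elapsed_time / 1000 AS elapsed_seconds,
    LEFT(query_text, 120)     AS query_preview
FROM TABLE(INFORMATION_SCHEMA.QUERY_HISTORY(RESULT_LIMIT => 1000))
WHERE execution_status = 'RUNNING'
ORDER BY start_time;
```

Because this reads INFORMATION_SCHEMA rather than the ACCOUNT_USAGE share, there is no multi-hour latency, which is exactly what makes it useful while queries are still in flight.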
-
3 Ways to Connect Multiple Data Sources Without Slowing Down Snowflake
Tech Tip from Agilityx: Insight doesn’t come from data alone. Snowflake delivers incredible scale and speed, but combining multiple data sources can quickly lead to slow queries and bottlenecks. The difference between frustration and actionable insight often comes down to how you connect your data.
Here are three strategies to make Snowflake work smarter, not harder:
1. Start with questions, not queries: What insights do you truly need before writing a single query? Let curiosity drive your data, not the other way around.
2. Integrate with intention: Are you moving data efficiently, or just moving it? Use staging tables, smart clustering, and optimized joins to reduce unnecessary data movement.
3. Automate where it counts: Let Snowflake handle repetitive transformations while your team focuses on interpretation and decisions.
Data is powerful, but only when it’s connected thoughtfully. Snowflake helps you get answers faster, but a strategic approach ensures those answers actually move the needle.
#Snowflake #techtip #dataanalytics
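To make point 2 concrete, here is a minimal sketch of a clustered staging table feeding an optimized join; the table and column names (raw_orders, dim_customers, and so on) are illustrative placeholders, not from the post:

```sql
-- Stage raw data once, clustered on the filter/join keys so Snowflake
-- can prune micro-partitions instead of scanning everything.
CREATE OR REPLACE TABLE stg_orders
CLUSTER BY (order_date, customer_id)
AS
SELECT order_id, customer_id, order_date, amount
FROM raw_orders;

-- Join against the staged table with an explicit date filter,
-- so partition pruning kicks in before the join runs.
SELECT c.customer_name, SUM(o.amount) AS total_spent
FROM stg_orders o
JOIN dim_customers c ON c.customer_id = o.customer_id
WHERE o.order_date >= DATEADD(month, -3, CURRENT_DATE())
GROUP BY c.customer_name;
```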
-
How to Reduce Snowflake Costs by 40% in Just Two Weeks
Most Snowflake environments waste money, not because of bad tech, but because no one’s watching the right things. At Data Prophits, we’ve helped teams cut spend by 30–40% without touching performance. Here’s what actually works:
* Identify silent spenders. Unused warehouses, over-provisioned compute, and forgotten clones are the usual suspects.
* Automate idle suspension. Use Snowflake tasks and auto-suspend policies to pause warehouses when query volume drops.
* Spot query bloat. Dashboards showing the longest-running or highest-credit queries instantly expose inefficiencies.
* Right-size storage tiers. Historical data can live cheaper; automate cold-data archiving to S3 or Iceberg.
* Monitor everything. Our “Snowflake Cost Control Pack” alerts you before the bill spikes.
Data should power growth, not burn budget. Want the 2-week playbook? Drop a 💬 or DM me and I’ll send the checklist.
#Snowflake #DataEngineering #CostOptimization #DataProphits #Domo #CloudEfficiency datprophits.com
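The idle-suspension and alerting bullets map to built-in Snowflake features, AUTO_SUSPEND and resource monitors. A minimal sketch, with placeholder names and an example quota rather than a recommendation:

```sql
-- Suspend the warehouse after 60 seconds of inactivity and only
-- resume it when a query actually arrives.
ALTER WAREHOUSE reporting_wh SET
    AUTO_SUSPEND = 60
    AUTO_RESUME  = TRUE;

-- A resource monitor that notifies at 80% of a monthly credit
-- quota and suspends attached warehouses at 100%.
CREATE OR REPLACE RESOURCE MONITOR monthly_cap
WITH
    CREDIT_QUOTA = 500
    FREQUENCY = MONTHLY
    START_TIMESTAMP = IMMEDIATELY
TRIGGERS
    ON 80  PERCENT DO NOTIFY
    ON 100 PERCENT DO SUSPEND;

ALTER WAREHOUSE reporting_wh SET RESOURCE_MONITOR = monthly_cap;
```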
-
One of our Snowflake warehouses quietly doubled its monthly credits with no change in workload and no extra users. The culprit? Hidden inefficiencies we couldn’t see until we built proper observability.
This post walks through how I created a simple, reliable monitoring framework inside Snowflake using only native system views, no external tools required. Track what actually matters:
1) Slowest and most expensive queries
2) Idle or overused warehouses
3) User-level compute consumption
4) Real-time cost dashboards & alerts
All powered by ACCOUNT_USAGE and INFORMATION_SCHEMA views, with working SQL and Streamlit examples you can use today.
#Snowflake #DataEngineering #CostOptimization #DataObservability #DataOps #SQL #Analytics #SnowflakeTips
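As a taste of the warehouse-level tracking (item 2), a minimal sketch against the ACCOUNT_USAGE share; note these views lag real time by up to a few hours, and the 30-day window is just an example:

```sql
-- Credits burned per warehouse over the last 30 days, so a warehouse
-- that is idling but still metering shows up immediately.
SELECT
    warehouse_name,
    SUM(credits_used) AS total_credits,
    COUNT(*)          AS metered_hourly_slices
FROM SNOWFLAKE.ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY
WHERE start_time >= DATEADD(day, -30, CURRENT_TIMESTAMP())
GROUP BY warehouse_name
ORDER BY total_credits DESC;
```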
-
Snowflake cost problems don’t start with the invoice; they start with a lack of observability. Without full visibility into what’s running, who’s using what, and where costs come from, optimization becomes guesswork.
Every data leader knows the pain: Snowflake bills that grow faster than usage, “focus weeks” spent chasing query logs, and endless back-and-forth with finance to explain why spend doubled… again.
The truth? It’s not a cost issue. It’s an observability issue. When you can’t see how your data assets connect across ingestion, transformation, and BI, you can’t control how they consume resources.
Observability changes that. It lets you trace every dollar of compute to a warehouse, model, or user, expose inefficiencies in real time, and finally link data spend to business value.
Modern FinOps isn’t about cutting spend; it’s about making every query accountable. That starts with stack-wide observability.
👉 Read more about it: https://lnkd.in/d6rGuVbs
#DataOps #Snowflake #DataEngineering #CDO #DataOptimization #ModernDataStack #CloudData #DataEfficiency #FinOps #DataInfrastructure #observability
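One hedged way to start tracing spend to users: aggregate the query history and use execution time as a crude proxy for relative cost. Snowflake bills per warehouse-second rather than per query, so treat the numbers as directional, not as dollars:

```sql
-- Share of total execution time by user and warehouse over 7 days.
-- Not an exact dollar figure, but it shows who drives the load.
SELECT
    user_name,
    warehouse_name,
    COUNT(*)                          AS queries,
    SUM(execution_time) / 1000 / 3600 AS execution_hours
FROM SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY
WHERE start_time >= DATEADD(day, -7, CURRENT_TIMESTAMP())
GROUP BY user_name, warehouse_name
ORDER BY execution_hours DESC;
```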
-
The Unspoken Lifecycle of a Snowflake Implementation Project
Implementing a Snowflake environment often looks straightforward on paper, until it begins.
- The planning phase feels smooth: pipelines defined, warehouses configured.
- Then comes data reality: unexpected formats, query costs, and schema drift.
- Optimization becomes both art and science: tuning clusters, managing credits, and balancing performance with cost.
- Finally, stability arrives: dashboards align, monitoring flows, and automation takes over.
Each stage reinforces one truth: “Snowflake success isn’t just about architecture, it’s about discipline, governance, and continuous learning.”
Here’s to every engineer ensuring that dashboards stay fast, queries stay lean, and data stays reliable.
#Snowflake #DataEngineering #DataGovernance #Analytics #SQL #DataArchitecture #DataProfessional
-
A materialized view in Snowflake has one key limitation: it can reference only a single table. To address this, Snowflake introduced 𝗗𝘆𝗻𝗮𝗺𝗶𝗰 𝗧𝗮𝗯𝗹𝗲𝘀, which allow more flexibility. However, they can’t be refreshed instantly; you must define a target refresh lag (for example, every 10 minutes or once an hour).
Think of 𝗗𝘆𝗻𝗮𝗺𝗶𝗰 𝗧𝗮𝗯𝗹𝗲𝘀 as a combination of a 𝗠𝗮𝘁𝗲𝗿𝗶𝗮𝗹𝗶𝘇𝗲𝗱 𝗩𝗶𝗲𝘄 + 𝗗𝗮𝘁𝗮 𝗣𝗶𝗽𝗲𝗹𝗶𝗻𝗲 + 𝗦𝗰𝗵𝗲𝗱𝘂𝗹𝗲𝗿, all in one. The real advantage? They can be used to 𝗯𝘂𝗶𝗹𝗱 𝗲𝗻𝗱-𝘁𝗼-𝗲𝗻𝗱 𝗱𝗮𝘁𝗮 𝗽𝗶𝗽𝗲𝗹𝗶𝗻𝗲𝘀 directly within Snowflake.
I’m curious: what are some 𝘂𝘀𝗲 𝗰𝗮𝘀𝗲𝘀 𝗼𝗳 𝗗𝘆𝗻𝗮𝗺𝗶𝗰 𝗧𝗮𝗯𝗹𝗲𝘀 you’ve implemented in your projects?
#Snowflake #DynamicTables #DataEngineering #ETL #DataPipelines #CloudData #DataTransformation #DataAnalytics #SnowflakeCortex #ModernDataStack #SQL #DataAutomation #RealTimeData #IncrementalRefresh #DataOps
Image credit: Snowflake Documentation
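A minimal sketch of such a pipeline step; the table, warehouse, and lag values here are illustrative placeholders:

```sql
-- Unlike a materialized view, a dynamic table can join multiple
-- tables; Snowflake keeps it refreshed to meet the TARGET_LAG.
CREATE OR REPLACE DYNAMIC TABLE daily_revenue
    TARGET_LAG = '10 minutes'
    WAREHOUSE  = transform_wh
AS
SELECT
    o.order_date,
    c.region,
    SUM(o.amount) AS revenue
FROM orders o
JOIN customers c ON c.customer_id = o.customer_id
GROUP BY o.order_date, c.region;
```

Depending on the query shape, Snowflake applies each refresh cycle incrementally where it can and falls back to a full refresh where it can’t.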
-
Big news: Snowflake’s latest press release highlights Merkle’s role in their AI-ready data platform rollout. Read how this collaboration is enabling secure, governed data at scale and setting the standard for the enterprise lakehouse: https://bit.ly/4p6J7Th