The FUTURE of Reporting?

Mastering Snowflake
5 min read · Mar 21, 2024

Thank you for reading my latest article, ‘Is this the FUTURE of reporting?’.

Here at Medium, I regularly write about modern data platforms and technology trends. To read my future articles, simply join my network here or click ‘Follow’. Also, feel free to connect with me via YouTube.

— — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — —

How do you set up your reporting to not just survive but thrive? We all want high quality data and consistency in usage across our data assets in our organizations, but is it just a case of wishful thinking?

Perhaps the recent emergence of tools within the semantic layer is what we’ve been waiting for? Do these tools become our new best friend and an essential part of the cloud data tech stack? And crucially, how do we navigate the trade-offs between reporting flexibility and consistency?

To help break down these challenges, I turned to David Krakov, a seasoned veteran in building data products. With two decades of experience and a successful venture under his belt (acquired by Starburst), David is now making waves with Honeydew — a Y Combinator-backed endeavor that emerged as a finalist in the Snowflake Startup Challenge. Honeydew isn’t just another tool; it’s a semantic layer tailored for Snowflake that promises to redefine how we approach metrics, easing the analytics engineering burden and cutting through the clutter of duplicate logic.

In this newsletter, we’ll look with David* at the options for where to build a semantic layer, along with the pros and cons of each. You’ll also find a link to the video where David and I talk about the semantic layer challenge in more depth, including a demo of Honeydew and details on how to take it for a test drive yourself!

* Note: this was based on a more comprehensive interview David did with the Data Analysis Journal. You can find the full post here.

Where to Build It?

When it comes to building a metric store for common definitions to use across your business, you have a few options at your disposal: during ingestion, in ELT, in the consumption tool, or in a semantic layer. Let’s take a look at each in turn, along with its pros and cons:

During Ingestion

Building metrics during ingestion means processing data on its way into your data platform. This approach suits scenarios with vast, fast-moving data, such as IoT streams, where waiting for a batch process to run is a no-go. It’s cost-effective and ensures metrics are always up to date. However, it comes with caveats: complex coordination between teams, difficulty handling late-arriving data, reliance on specialized technology such as in-memory databases, and a potential disconnect from the raw data.
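To make this more concrete, here is a minimal sketch of an at-ingestion metric in Snowflake using a Dynamic Table, which incrementally refreshes an aggregate as new events land. The table and column names (raw_iot_events, device_id, reading) and the one-minute target lag are illustrative assumptions, not a recommendation:

```sql
-- Illustrative sketch: keep a per-device metric continuously fresh
-- as events stream into a (hypothetical) raw landing table.
CREATE OR REPLACE DYNAMIC TABLE device_readings_per_minute
  TARGET_LAG = '1 minute'     -- refresh within roughly a minute of new data
  WAREHOUSE  = transform_wh   -- compute used for the incremental refresh
AS
SELECT
  device_id,
  DATE_TRUNC('minute', event_ts) AS minute_bucket,
  AVG(reading)                   AS avg_reading,
  COUNT(*)                       AS event_count
FROM raw_iot_events
GROUP BY device_id, DATE_TRUNC('minute', event_ts);
```

Note that even in this setup, late-arriving events only show up on a subsequent refresh, which is exactly the late-data caveat mentioned above.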

In ELT

Computing metrics after ingestion in the data warehouse, with data landed by tools like Airbyte or Fivetran and transformed with dbt, offers consistency, as the data is typically served via a view or a table. The source code is usually a SQL query, which makes traceability easier when it comes to debugging. It also simplifies governance, since the logic is centralized in database objects. On the downside, it struggles with non-aggregative metrics and unpredictable downstream impacts, and it requires tight collaboration between domain experts and analytics engineers, which can lead to bottlenecks and inefficiencies.
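As a sketch of what this looks like in practice, the metric below is materialized as a warehouse view so every downstream consumer reads the same definition (in dbt, this SELECT would live in a model file). The analytics schema and column names are assumptions for illustration:

```sql
-- Illustrative sketch: one shared definition of daily revenue,
-- centralized in the warehouse rather than in each reporting tool.
CREATE OR REPLACE VIEW analytics.daily_revenue AS
SELECT
  DATE_TRUNC('day', order_ts)  AS order_date,
  SUM(amount)                  AS gross_revenue,
  SUM(amount - discount)       AS net_revenue
FROM analytics.orders
WHERE status = 'COMPLETED'
GROUP BY DATE_TRUNC('day', order_ts);
```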

In the Consumption Tool

Defining metrics right in the BI tool (e.g., Looker, Power BI, Tableau) maximizes flexibility and user-friendliness. Users tend to be familiar with the interface of these tools, and the metrics appear as if they are native to the tool. It’s great for bespoke, non-additive metrics, but it falters when you have multiple BI tools: logic inevitably needs to be repeated, which increases maintenance overhead and risks metrics diverging between tools over time. The governance and deployment process around BI tools also lags behind data engineering tools, and live querying brings the risk of poor performance.
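The duplication risk is easiest to see side by side. Below is a contrived example (using the same hypothetical analytics.orders table as above) of the “same” net revenue metric embedded as custom SQL in two different BI tools; a small difference in how refunds are handled quietly makes the numbers diverge:

```sql
-- Tool A: custom SQL behind a "Net Revenue" dashboard tile.
SELECT SUM(amount - discount) AS net_revenue
FROM analytics.orders
WHERE status = 'COMPLETED';

-- Tool B: the "same" metric, rebuilt later by another team,
-- except this version also subtracts refunds.
SELECT SUM(amount - discount - refund_amount) AS net_revenue
FROM analytics.orders
WHERE status = 'COMPLETED';
```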

In a Semantic Layer

Enter the semantic layer, a middle ground that combines the best of both worlds — centralized governance from ELT and the flexibility of BI tools. David describes this as a translator between the data in your data platform and the tool or application requesting the data.

This is where Honeydew sits, offering a streamlined, standardized way to manage metrics across tools. Its aim is to reduce complexity and distribute ownership, but it does introduce another component into the data stack and, depending on the use case, queries may run slower (or faster!) than traditional ELT or BI logic implementations.
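To picture the translator role (a generic sketch, not Honeydew’s actual syntax), imagine a BI tool asking the semantic layer for net revenue by region over the last 30 days. The semantic layer compiles that logical request into warehouse SQL from its single shared metric definition, so every tool gets the same logic:

```sql
-- Illustrative sketch of SQL a semantic layer might generate for the
-- logical request "net_revenue by region, last 30 days".
SELECT
  c.region,
  SUM(o.amount - o.discount) AS net_revenue  -- one definition, every tool
FROM analytics.orders o
JOIN analytics.customers c
  ON o.customer_id = c.customer_id
WHERE o.status = 'COMPLETED'
  AND o.order_ts >= DATEADD('day', -30, CURRENT_DATE)
GROUP BY c.region;
```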

Conclusion

Choosing where to build your metrics isn’t just a technical decision; it’s a strategic one. As David Krakov elucidates in our enlightening discussion, each method has its place, depending on your specific needs, challenges, and the complexity of your data environment. Whether you lean towards ingestion, ELT, BI tools, or a semantic layer like Honeydew, the goal remains the same: to create a robust, flexible, and efficient reporting framework that can adapt to the ever-changing data landscape.

For a deeper dive into these insights and to see Honeydew in action, don’t miss our accompanying video where David Krakov shares his wealth of knowledge in detail. It’s an invaluable resource for anyone looking to stay afloat and thrive in the tidal wave of data and analytics tooling.

To stay up to date with the latest business and tech trends in data and analytics, make sure to subscribe to my newsletter, follow me on LinkedIn and YouTube, and, if you’re interested in taking a deeper dive into Snowflake, check out my books, ‘Mastering Snowflake Solutions’ and ‘SnowPro Core Certification Study Guide’.

— — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — —

About Adam Morton

Adam Morton is an experienced data leader and author in the field of data and analytics with a passion for delivering tangible business value. Over the past two decades, Adam has accumulated a wealth of valuable, real-world experience designing and implementing enterprise-wide data strategies and advanced data and analytics solutions, as well as building high-performing data teams across the UK, Europe, and Australia.

Adam’s continued commitment to the data and analytics community has seen him formally recognised as an international leader in his field when he was awarded a Global Talent Visa by the Australian Government in 2019.

Today, Adam is dedicated to helping his clients to overcome challenges with data while extracting the most value from their data and analytics implementations. You can find out more information by visiting his website here.

He has also developed a signature training program that includes an intensive online curriculum, weekly live consulting Q&A calls with Adam, and an exclusive mastermind of supportive data and analytics professionals helping you to become an expert in Snowflake. If you’re interested in finding out more, check out the latest Mastering Snowflake details.

