How to Build a Semantic Layer for Self-Service Analytics Using Looker: Step-by-Step Guide

This article has been contributed by Sreyashi Das, Senior Data Engineer, Netflix.

Have you ever witnessed failed attempts at democratizing analytics? A near-universal pattern is that data visualization experts and data analysts must manually transform data, apply filter clauses, and connect data across business domains before it is ready for charts and sophisticated visualizations. Data stewardship is entirely manual, from data lineage and governance to documentation and ongoing operations. Organizations have invested in making data sources timely and accurate, and on the consumption end there are capable frameworks, visualization tools like Tableau, and web applications that connect to the data layer. Yet the step in between, accessing data in a clean and consistent manner, still slows time to insight and leads to inconsistencies in metric definitions.

Self-service analytics frees the IT team from tailoring data for individual business users and allows data engineers to focus on their strength: building foundational datasets. Left unguided, however, it forces business users to become SQL-savvy database and modeling experts, which erodes trust in data and creates duplicate work. A semantic layer can be the key technology enabler here: it delivers a shared understanding of domain-specific entities, improves trust and data security, and accelerates time to insight, empowering business-friendly self-service analytics with governance and control.

One of the trends driving the analytics market is a profound shift in enterprise data management. Cloud infrastructure is now the default for deployment. The data warehouse has become more powerful through solutions like Snowflake, but also more challenging for data consumers to understand and manage. Data is becoming more decentralized, and organizations are moving towards the Data Mesh paradigm to shift from centralized control to empowered domain engineering teams. In a Data Mesh or hub-and-spoke style architecture, a robust semantic layer maintains coherence and quality, providing a single source of truth and keeping data aligned with organizational standards despite departmental and geographical distribution.

Role of Data Engineering

It is not about making data available only to data experts; it is about making data available to everyone. Data engineers build the foundations of the data pipelines, and the semantics, in the form of metadata, can be embedded in the raw tables for data scientists to use in later stages of the pipelines. The semantic layer is a shared responsibility of data engineers and analytics engineers, and it is achieved through flexibility, collaboration across partner teams, and shared best practices that enable cross-domain analytics.



Connected Data

Looker empowers the data community with an enterprise-ready, off-the-shelf solution you can use today, featuring SOX compliance, Google Sheets integration, and, most importantly, version control. For a semantic layer to be truly ubiquitous, data must be accessed from the same underlying data structures. Using LookML, Looker's modeling language, semantic layer developers can participate in a shared analytics environment that brings context and clarity to the logical data model, its relationships, and its metric definitions.
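
To make that concrete, here is a minimal sketch of how a single governed metric might be defined once in LookML. The view, table, and column names are assumptions for illustration, not part of any specific project.

```lookml
# subscriptions.view.lkml -- a hypothetical view holding one governed metric
view: subscriptions {
  sql_table_name: analytics.subscriptions ;;

  # The single, shared definition of monthly recurring revenue.
  # Every dashboard and ad hoc query that references this measure
  # inherits the same logic instead of re-deriving it in SQL.
  measure: monthly_recurring_revenue {
    type: sum
    sql: ${TABLE}.subscription_amount ;;
    value_format_name: usd
  }
}
```

Because the definition lives in version-controlled LookML rather than inside individual dashboards, a change to the metric is reviewed once and then propagates to every query that references it.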

Step-by-Step Guide

Here are some best practices to organize and build the data model using LookML:

  1. Core dimensions, such as the product hierarchy, must adhere to organizational standards and be a single source of truth. A dimension managed this way is known as a conformed dimension.
  2. Connect Looker to a data source, for example, the Snowflake data warehouse.
  3. Create a LookML project.
  4. Using LookML, create views. Views are typically named after the physical tables they model (a minimal view file sketch follows this list).
  5. Then create explores. Explore files include view files and determine how data in views can be queried and explored. Explores are built for user discovery.
  6. Model files include explore files and act as entry points in a LookML project, defining which explores are available for querying; including explores in a model makes them available for reporting and analysis (see the model and explore sketch after this list).
  7. Use a hub-and-spoke model to ensure cross-domain connectivity, accelerate the generation of insights from data, and promote a shared understanding of key business metrics and definitions.
  8. Test and validate your data model. Looker generates the SQL queries, so validate the results against trusted numbers (a data test sketch also follows this list).
  9. Iterate and refine your model with evolving business definitions.
  10. Documentation using LookML comments is key for the long-term impact of your data model.
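
As a reference for step 4, here is a minimal view file sketch. It assumes an orders table in an analytics schema on the warehouse; the table and column names are illustrative rather than prescriptive.

```lookml
# orders.view.lkml -- a hypothetical view modeling a physical orders table
view: orders {
  sql_table_name: analytics.orders ;;

  dimension: order_id {
    primary_key: yes
    type: number
    sql: ${TABLE}.order_id ;;
  }

  dimension: customer_id {
    type: number
    sql: ${TABLE}.customer_id ;;
    hidden: yes   # used for joins, not exposed to business users
  }

  dimension_group: created {
    type: time
    timeframes: [date, week, month, year]
    sql: ${TABLE}.created_at ;;
  }

  measure: order_count {
    type: count
  }

  measure: total_revenue {
    type: sum
    sql: ${TABLE}.order_amount ;;
    value_format_name: usd
  }
}
```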
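
For steps 2, 5, and 6, a model file ties the connection, the included views, and the explores together. The connection name, file paths, and join below are assumptions for illustration; the joined customers view is assumed to be defined alongside orders.

```lookml
# ecommerce.model.lkml -- a hypothetical model file acting as the entry point
connection: "snowflake_dwh"        # the Looker connection created in step 2
include: "/views/*.view.lkml"      # pull the view files into this model

# An explore built for user discovery: business users pick fields from
# orders and the joined customers view without writing SQL themselves.
explore: orders {
  description: "Order facts joined to the conformed customer dimension."

  join: customers {
    type: left_outer
    sql_on: ${orders.customer_id} = ${customers.customer_id} ;;
    relationship: many_to_one
  }
}
```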
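
For step 8, LookML data tests can back up manual validation of the generated SQL. The test below is a sketch against the hypothetical orders explore above; the field names and the assertion are placeholders.

```lookml
# A hypothetical data test asserting a basic invariant on the explore.
test: total_revenue_is_non_negative {
  explore_source: orders {
    column: total_revenue {}
  }
  assert: revenue_is_non_negative {
    expression: ${orders.total_revenue} >= 0 ;;
  }
}
```

Data tests can be run from the Looker IDE, and many teams also wire them into their deployment workflow so definition changes are checked before they reach production dashboards.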

Looker’s version-controlled semantic layer centralizes your business logic, ensuring everyone uses the same definition of key metrics regardless of technical background. It separates business logic from physical data, allowing you to reliably apply consistent definitions across KPIs. Metrics defined in a Looker model can also be consumed in other places like Google Sheets, Tableau, and other business intelligence tools. A universal semantic layer not only helps to democratize data and empower teams to make better decisions but also serves as the foundation for GenAI initiatives both now and in the future.

High-quality physical data models are the foundation for more accurate and trustworthy predictive models. If a question-answering system is powered by a large language model, giving it a semantic, knowledge-graph-style representation of the SQL database, such as Snowflake, will lead to more accurate answers. If there is one thing organizations should invest in, it is semantics. In an era where GenAI adoption continues to grow year over year, context, metadata, and documentation together elevate the semantics of datasets and enable data-informed decision-making.

