Blending Trust into Your Data Pipelines for a More Reliable Operation

dbt Labs, the pioneer in analytics engineering, has officially announced a set of enhancements designed to help businesses access trustworthy, high-quality data. The enhancements begin with dbt Assist, an AI-powered copilot that automatically generates documentation and tests so data developers can get more done in less time. Next is Advanced CI, which lets teams verify that changes to a codebase meet quality expectations before they are merged into production. The update also brings a Unit Testing facility, which improves test coverage without driving up data platform spend by validating modeling logic earlier in the development cycle. Then there are automatic exposures, which make dbt Cloud automatically aware of Tableau dashboards downstream of dbt models; this awareness helps users trace and automate end-to-end data lineage to unlock efficiencies, optimize compute costs, and improve data freshness and trust. Rounding out the list is the dbt Cloud CLI, which allows developers to contribute to dbt Cloud projects from the terminal or IDE of their choice.
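As a rough illustration of what the Unit Testing facility looks like in practice, dbt unit tests are declared in YAML alongside a model, with mocked upstream rows and the rows the model is expected to produce. The model and column names below are hypothetical, used only to sketch the shape of such a test:

```yaml
# models/_unit_tests.yml -- hypothetical project file
unit_tests:
  - name: test_order_totals_sums_line_items
    description: "Validate aggregation logic before spending warehouse compute"
    model: fct_order_totals            # hypothetical model under test
    given:
      - input: ref('stg_order_lines')  # mocked upstream input, no warehouse query
        rows:
          - {order_id: 1, amount: 40}
          - {order_id: 1, amount: 60}
    expect:
      rows:
        - {order_id: 1, total_amount: 100}
```

Because the inputs are mocked in the YAML itself, the modeling logic can be checked in development and CI without scanning production tables, which is how earlier validation avoids driving up platform spend.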

“Accurate and timely data is crucial, which is why we’ve delivered a standardized way to quickly build reliable, holistic, and high-quality data pipelines at scale,” said Luis Maldonado, VP of Product at dbt Labs. “These new features take this even further, significantly improving data workflows and AI workloads, all while empowering more users with powerful business insights.”

Another detail worth mentioning is the new low-code development experience. Through a simple drag-and-drop visual editor, dbt Cloud generates SQL on the user's behalf, lowering the barrier to entry for more contributors to collaborate on the analytics engineering workflow. There are also upgrades to dbt Explorer, the intuitive, interactive catalog that helps data teams understand, improve, and troubleshoot their dbt assets across teams and projects. With this update, users can bring enriched lineage, auto-exposures, and model consumption telemetry into Explorer to align development work with business impact; combined with embedded data health signals in analytics tools, this supports trusted data delivery at scale. Beyond the improvements to dbt Explorer, dbt Labs also introduced a significant step-up for the dbt Semantic Layer. This step-up packs together enterprise-critical features such as granular access controls and permissions, Tableau and Google Sheets integrations, declarative caching, and improvements to MetricFlow that let teams build and consume complex metrics with more velocity and accuracy. Finally, as part of the wider rollout, users can expect multi-project support, which will allow companies to manage complexity through multiple interconnected dbt projects corresponding to individual business domains.
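To give a sense of how metrics reach the Semantic Layer, MetricFlow reads declarative YAML metric specs like the sketch below. The measure names (`revenue_amount`, `order_count`) are assumptions for illustration and would need to exist in a semantic model in the project:

```yaml
# models/semantic/metrics.yml -- hypothetical file
metrics:
  - name: total_revenue
    label: Total Revenue
    type: simple                  # a simple metric wraps a single measure
    type_params:
      measure: revenue_amount     # assumed measure from a semantic model
  - name: revenue_per_order
    label: Revenue per Order
    type: ratio                   # ratio metrics divide one measure by another
    type_params:
      numerator: revenue_amount
      denominator: order_count    # assumed count measure
```

Defining metrics once in this declarative form is what lets downstream tools such as the Tableau and Google Sheets integrations query consistent values, with declarative caching serving repeated requests without recomputation.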

“dbt Cloud allows us to take all the data we’ve collected and actually make it useful to the business,” said Evan Cover, Director BI Engineering & Governance at Klaviyo. “With dbt Cloud at the center of our transformation workflows, we can build data products that represent the reality of our business objectives and model how we go about selling, attracting, marketing, and retaining customers. By enabling more people across the business to collaborate on building trusted data products, dbt Cloud allows us to work faster, more efficiently, and take more advantage of our data.”
