Organizations with many data sources often struggle to integrate them into a single high-performing, auditable, and well-understood repository. Even when they do, the complexity and cost of legacy data warehousing solutions tend to grow over time, failing to scale as the business demands more and more out of the enterprise data platform.
In today's data landscape, data warehousing teams are expected to deliver more frequently while also having to "defend" or "prove" that the information in their warehouse is accurate. Additionally, many analytics use cases require the ability to track changes over time, which can be challenging even for seasoned data professionals. And, of course, all this information needs to be presented within performance thresholds across millions or even billions of rows.
So how does data vault help solve these challenges? Dan Linstedt, creator of Data Vault, defines Data Vault 2.0 as a "System of Business Intelligence containing the necessary components needed to accomplish enterprise vision in Data Warehousing and Information Delivery". It delivers on this definition through three foundational pillars: Methodology, Architecture, and Model.
Methodology essentially means adopting a consistent, repeatable, and pattern-based approach to building your data warehouse. Pulling from The Matrix movie trilogy (yes, I purposely made that green), what if I told you the pattern-based approach of data vault can automatically generate 98% of your ETL code? What if I told you that you could have full auditability of that data even when source systems do not track changes? By combining data vault with tools such as WhereScape, organizations can leverage data vault patterns to achieve the following benefits.
- Automated Models
- Automated ETL Code
- Automated Docs
- Performance Patterns
- Standard Query Pattern
- Fully Historized
Architecture refers to the multi-tiered, scalable, and technology-agnostic design you gain when adopting data vault. Classic data warehouse methodologies typically apply hard and soft business rules directly in the dimension and/or fact ETL pipeline. This makes those rules difficult to change and equally challenging to share with other information marts. Data Vault separates these two rule types.
Hard rules do not change the meaning of the data or the grain of that data from the source system. By only applying hard rules during data acquisition, your Vault can provide an auditable view of source systems at any point in time all integrated by business key.
Soft rules represent transformations or business logic that changes the meaning or grain of your source data. Instead of applying soft rules when generating dimensional models, data vault persists the output of business rules in the Business Vault. The benefit of this approach is the ability to have full auditability of business rules along with the flexibility to share those rules across multiple information marts.
So while the separation of hard rules allows you to recreate the source data at any point in time, the separation of soft business rules allows you to recreate information marts at any point in time while also leveraging the same automated ETL loading patterns. This means 50% of your information mart code can be auto-generated.
- Source Auditability
- Mart Auditability
- Enables Virtualization
- Automated ETL Code
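To make the hard/soft distinction concrete, here is a minimal Python sketch; the customer feed, column names, and margin calculation are all hypothetical, and the hash-key derivation is only one common convention:

```python
import hashlib

def hash_key(*business_keys):
    """Hard rule: derive a deterministic hash key from the business key.
    Trimming and upper-casing align formats without changing meaning."""
    normalized = "||".join(str(k).strip().upper() for k in business_keys)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

# Hard rules only: the staged row still means exactly what the source said.
source_row = {"customer_id": " c-1001 ", "revenue": "250.00", "cost": "90.00"}
staged = {
    "hub_customer_hk": hash_key(source_row["customer_id"]),
    "customer_id": source_row["customer_id"].strip().upper(),
    "revenue": float(source_row["revenue"]),
    "cost": float(source_row["cost"]),
}

# Soft rule: a derived margin changes the meaning of the data, so it is
# persisted separately in a Business Vault satellite rather than being
# buried inside mart ETL where it cannot be audited or shared.
business_vault_row = {
    "hub_customer_hk": staged["hub_customer_hk"],
    "margin": staged["revenue"] - staged["cost"],
}
```

Because the staging layer applies only hard rules, the raw vault remains a faithful, auditable record of the source; the margin rule lives in exactly one place and can feed any number of marts.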
Model refers to the flexible and scalable hub-and-spoke design of data vault. At its core, data vault models are based on three entity types: hubs, which hold a distinct list of business keys; links, which store a distinct list of relationships between business keys; and satellites, which store descriptive data about a business key or relationship over time.
Using these entities to model your data provides a cross-source system model integrated by business key within a single repository that is organized under an enterprise-wide ontology.
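As a rough illustration, the three entity types can be sketched as SQLite tables from Python; the customer/order entities and column names are hypothetical, and a production model would follow the full Data Vault 2.0 standard:

```python
import sqlite3

# Illustrative only: real implementations standardize hash keys, load
# dates, and record sources across every table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Hub: distinct list of business keys
    CREATE TABLE hub_customer (
        customer_hk   TEXT PRIMARY KEY,  -- hash of the business key
        customer_id   TEXT NOT NULL,     -- the business key itself
        load_date     TEXT NOT NULL,
        record_source TEXT NOT NULL
    );
    -- Link: distinct list of relationships between hubs
    CREATE TABLE link_customer_order (
        customer_order_hk TEXT PRIMARY KEY,
        customer_hk       TEXT NOT NULL REFERENCES hub_customer,
        order_hk          TEXT NOT NULL,
        load_date         TEXT NOT NULL,
        record_source     TEXT NOT NULL
    );
    -- Satellite: descriptive attributes of the business key over time
    CREATE TABLE sat_customer_details (
        customer_hk TEXT NOT NULL REFERENCES hub_customer,
        load_date   TEXT NOT NULL,
        name        TEXT,
        city        TEXT,
        hash_diff   TEXT,                -- change-detection hash
        PRIMARY KEY (customer_hk, load_date)
    );
""")
```

Because every hub, link, and satellite follows the same shape, the loading code for each is a variation on one pattern, which is exactly what makes automation possible.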
Another major benefit of the model is that it can be built incrementally over time. You don't have to model an entire system before building your information marts. With data vault, you can adopt an agile approach that quickly delivers value to the business while minimizing the impact of change in your warehouse to achieve the following benefits.
- Business Key Integration
- Performance Optimized
- Parallel Loading Pattern
- Historization of Data
- Master Data Synergy
- Supports Big Data
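The historization benefit above comes from the satellite loading pattern: a new row is inserted only when the descriptive attributes actually change. Here is a minimal in-memory Python sketch, with hypothetical keys and column names:

```python
import hashlib

def hash_diff(row):
    """Hash of the descriptive attributes, used for change detection."""
    payload = "||".join(str(row[c]).strip().upper() for c in ("name", "city"))
    return hashlib.md5(payload.encode("utf-8")).hexdigest()

def load_satellite(satellite, incoming, load_date):
    """Append a new satellite row only when the attributes changed.
    `satellite` is a list of dicts ordered oldest-to-newest per key."""
    latest = {r["customer_hk"]: r for r in satellite}  # last write wins
    for row in incoming:
        new_diff = hash_diff(row)
        current = latest.get(row["customer_hk"])
        if current is None or current["hash_diff"] != new_diff:
            satellite.append({**row, "hash_diff": new_diff, "load_date": load_date})

sat = []
load_satellite(sat, [{"customer_hk": "a1", "name": "Ada", "city": "Austin"}], "2024-01-01")
load_satellite(sat, [{"customer_hk": "a1", "name": "Ada", "city": "Austin"}], "2024-01-02")  # unchanged: no new row
load_satellite(sat, [{"customer_hk": "a1", "name": "Ada", "city": "Dallas"}], "2024-01-03")  # changed: new row
```

This is how a vault tracks changes over time even when the source system does not: every load compares against the latest satellite row and only deltas are stored.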
Let me help
Data Vault Mentoring
Whether you're already certified in Data Vault 2.0 or plan to become certified, your business can benefit from having an experienced data vault practitioner help apply the concepts learned in Data Vault training to your business data.
Data Vault Modeling
At its core, data vault models are based on three entity types: hubs, links, and satellites. Understanding how to use these entity types to model all the various data conditions under a single integrated and auditable ontology can be challenging. I can guide you through early modeling decisions to avoid known pitfalls or I can create the model for you.
Business Vault Design
The business vault holds the output of business rules modeled as standard hubs, links, and satellites. Many businesses struggle to understand how this critical extension of data vault enables virtualization and auditability of their information marts. Let me help you unlock the power of business vault to ensure you realize the promised benefits of data vault in your information marts.
Data Vault Automation
Data vault is a pattern-based approach to building a data warehouse. If you don't have tools that understand this pattern, you are missing out on one of the biggest benefits: model and ETL code automation. While there are several data vault automation toolsets, I specialize in using the WhereScape product suite to realize the automation benefits of data vault. Let a certified WhereScape integration partner help you get the most out of your data vault.
Already have a grasp on data vault methodology and just need an experienced practitioner to help build out your data vault? I'm happy to sit on the team in a heads-down role while also providing any guidance and mentoring along the way.
More companies are transitioning their solutions to the cloud than ever before. The cost savings and scalability benefits of cloud migration are well documented. Cloud platforms, like Microsoft Azure, offer many services over a wide variety of areas.
Knowing how to leverage those services based on industry best practices can be challenging. I have years of experience implementing highly maintainable and secure data solutions using the latest CI/CD tools within Azure DevOps.
If you are looking to build data integration solutions in Azure based on any of the technologies listed below, don't hesitate to reach out to learn how they can be leveraged to successfully deliver your enterprise data needs.
- Azure Data Factory
- Azure Synapse
- Azure Active Directory
- Azure Data Lake
- Azure Functions
- Azure Logic Apps
- Azure DevOps
- Azure Virtual Network
- Azure Web Apps
- Azure Resource Manager
- Azure Key Vault
- Azure Virtual Desktop
- Azure SQL Database
Cloud solutions often require many of the resource types listed above, plus others not listed here. Knowing how they all fit together, while ensuring the highest levels of security are maintained between your on-premises and cloud environments, can be challenging.
I can help ensure your solutions are secured by leveraging your existing integration with Azure Active Directory along with other security resources such as Azure Managed Identity and Azure Key Vault so you don't have to worry about the wrong people seeing your data.
Let me help
Building cloud solutions can sometimes be like building a house. There are hundreds of different resources to choose from and knowing how and when to use them can be challenging. Let a certified Azure Solution Architect Expert help you build a scalable and cost-efficient blueprint for your cloud solution.
Azure has several flexible methods for migrating your data solutions from on-premises to the cloud. While some businesses want to migrate all their workloads to the cloud, others choose a hybrid approach that spans both on-premises and cloud resources. I can help you understand how to migrate all or part of your workloads to Azure.
Azure DevOps (CI/CD)
Already have a solution and are struggling to efficiently promote changes to QA or production? Leverage proven experience with Azure DevOps to implement automated CI/CD pipelines to streamline your promotion and approval processes. I can even show you how to integrate with your existing demand management software if needed.
Already have an architecture and development team, but need additional resources to speed up delivery? I'm happy to sit on the team in a heads-down role while also providing any guidance and mentoring if needed along the way.
I'm a certified consultant with over 20 years of experience building web applications, web APIs, and custom software solutions using the latest cross-platform .NET frameworks. Leverage proven object-oriented design patterns to ensure the highest levels of maintainability and scalability in any solution.
Almost every application in today's world needs data. Not only can I help you build high-quality software, I can also model and implement your data access layer using any of the latest data access platforms. I have extensive experience building and tuning solutions on the Microsoft SQL Server platform, including Azure cloud editions such as Azure SQL Database, SQL Managed Instance, and Azure Synapse.
So whether you need REST APIs, ASP.NET websites using MVC/Razor, or middleware class libraries, I can help architect and build the entire solution or simply sit on your team to accelerate delivery time. Below you can find many of the skill sets that I can bring to your team to help accelerate your software project.
- C# Development
- SQL Development
- Entity Framework Core
- Object Oriented Design
- Web API Architecture
- Website Development
- VB.NET Development
When architecting your solution, you should always consider how it will scale and react to various levels of failure. I can help you meet your disaster recovery and performance objectives using services available in the Azure cloud. Be sure your solutions are fault tolerant, meet scalability requirements, and provide a geographically redundant hosting environment.
Let me help
Whether you're building microservices, Azure cloud apps, or ASP.NET apps, good architecture directly correlates to the long-term success and maintainability of your solutions. I can help you build an architectural blueprint and show you how those architecture decisions can be enforced within Visual Studio.
I'm a geek and love to code. Leverage 20 years of experience architecting and building data-driven solutions using any .NET language such as C# or VB.NET. I specialize in building all types of applications on top of Microsoft SQL Server platforms and can ensure you get all the performance that SQL has to offer, whether it's SQL Server, Azure SQL, SQL Managed Instance, or Azure Synapse.
Azure DevOps (CI/CD)
Already have a solution and are struggling to efficiently promote changes to production? I can help you leverage Azure DevOps to implement CI/CD pipelines to streamline your promotion process.
Already have an architecture and development team, but need additional resources to speed up delivery? I'm happy to sit on the team in a heads-down role while also providing any guidance if needed.