Our client is a residential property developer that has built homes for thousands of people across Australia. Careful cost management of the construction process is essential to ensure profitable and timely delivery of each of their clients’ dream homes.
Gaining visibility over costs was challenging in an environment which was not traditionally data-centric, with multiple source systems that presented the same information in different ways. This made it difficult to understand the full cost of a job, the final margin on each sale and the impact on profitability due to changes made during the build process. Working with the data was very manual and significant effort was required to produce reporting, often with inconsistent results.
The client was using a number of reporting technologies, such as SQL Server Reporting Services (SSRS), QlikView and Power BI, backed by SQL Server Integration Services (SSIS) and the Database Engine, to try to attain the needed visibility. Over the years the data sources and reports had been developed on an ad-hoc basis without an overarching structure and strategy. This led to a data platform that was struggling to meet its intended objectives. We engaged with our client to review, recommend and then implement best practices for the entire process.
We began by reviewing the existing solution architecture and provided a set of clear and practical recommendations to improve the Data & AI process, each of which delivered identifiable value in terms of process, performance and outcome. These recommendations covered infrastructure, data management, ETL processes, and reporting and visualisation best practices.
The client decided that the key element to focus on was the heart of the solution: the data. The aim was to create a fit-for-purpose data source, updated in a stable and reliable manner and presented for easy consumption by non-technical users. A team was formed of the client's subject-matter experts and our expert Data Warehouse and Cube developers to work together to understand the data, structure it for analysis and present it for consumption.
Our team leveraged our proprietary tools, patterns, and methods to build out a stable data platform that consolidated data from multiple sources into a coherent picture in a Data Warehouse. We then built a SQL Server Analysis Services (SSAS) cube on top of it to provide a consistent interpretation of the data that allowed business users to self-serve from a trusted and agreed source. To ensure that the process was well understood and knowledge accumulated during the build process was not lost, the solution was documented in detail.
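The consolidation step described above can be sketched in miniature. The snippet below is purely illustrative: the table and column names (`estimating_costs`, `construction_costs`, `fact_job_cost`) and the metrics are hypothetical stand-ins, not the client's actual schema. It uses SQLite via Python to show the pattern of merging two source systems that record the same job costs in different shapes into a single warehouse fact table with centrally defined metrics.

```python
import sqlite3

# Illustrative sketch only: all table and column names below are
# hypothetical, not the client's schema.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
-- Two source systems presenting the same information in different ways.
CREATE TABLE estimating_costs (job_no TEXT, budget REAL);
CREATE TABLE construction_costs (JobNumber TEXT, ActualCost REAL, IsVariation INTEGER);

INSERT INTO estimating_costs VALUES ('J-1001', 250000.0);
INSERT INTO construction_costs VALUES ('J-1001', 240000.0, 0),
                                      ('J-1001', 18000.0, 1);

-- Consolidated warehouse fact table: one agreed row per job,
-- with centrally defined metrics (budget, actual cost,
-- variation cost from changes during the build, margin vs budget).
CREATE TABLE fact_job_cost AS
SELECT e.job_no                                  AS job_no,
       e.budget                                  AS budget,
       SUM(c.ActualCost)                         AS actual_cost,
       SUM(CASE WHEN c.IsVariation = 1
                THEN c.ActualCost ELSE 0 END)    AS variation_cost,
       e.budget - SUM(c.ActualCost)              AS margin_vs_budget
FROM estimating_costs e
JOIN construction_costs c ON c.JobNumber = e.job_no
GROUP BY e.job_no, e.budget;
""")

row = cur.execute("SELECT * FROM fact_job_cost").fetchone()
print(row)  # -> ('J-1001', 250000.0, 258000.0, 18000.0, -8000.0)
```

Defining metrics like `margin_vs_budget` once, in the warehouse layer rather than in each report, is what lets downstream tools (the SSAS cube, Power BI) present a consistent interpretation to every user.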
As a result, our client now has a single source of truth for the cost of each job, with an agreed structure and centrally defined metrics. Significant time savings have been achieved through an automatically populated source of information that can be trusted and used on demand, without recourse to users with specialist skills.
The process has also exposed issues with data in the source systems, which can now be addressed as part of a coherent process rather than patched over in each ad-hoc analysis.
Managers can now more easily answer critical questions such as: what are the non-value or unbudgeted activities that add to the cost of building a house, and how do those changes impact margin?
SQL Server 2016
- Database Engine
- Integration Services (SSIS)
- Analysis Services (SSAS)
FTS Data & AI proprietary tools:
- Data & AI Review Framework
- ETL Control Framework
- Data Warehousing Methodology
- Tabular Cube Methodology