Building a data pipeline or warehouse can be overwhelming. You swim in an ocean of bits, codes, timestamps, and characters classified in various columns and data tables, which usually don’t make any sense in isolation. You have to structure your data warehouse, develop a model with relationships between the data tables, and ensure the quality and integrity of the data you collect, while maintaining your data ecosystem with a continuous improvement approach.
To create value from data, enable intelligent buildings and tackle technology debt, some groundwork is required to lay the foundations of a pipeline that will enable data-led decisions in property management.
A key fundamental to this is connecting data sets between platforms. As an example, data sets from energy management systems and building analytics platforms can be interesting in isolation, but connecting these systems with customer-focused applications that monitor customer trends, occupancy and movement delivers real insight, creating a pathway to optimisation based on property needs rather than out-of-date occupancy theories.
The quality of your data directly affects the value of your portfolio, and helps to create a culture that believes in following the data, not the paper.
There are several steps to consider to create fit-for-purpose data:
- Assess the business needs: define and focus the problems and solutions, then work backwards to identify the data points you require to deliver them.
- Establish the baseline: use your business rules as guidelines; they are an important starting point for benchmarking measurements and standards.
- Build insights: bring together the areas of importance, build data sets and integrations, and develop analytics and key performance indicators (KPIs), made available via role-specific dashboards with clearly defined purposes.
- Develop your data ecosystem: it is extremely challenging to achieve a 100 per cent accurate, high-quality and fully optimised data ecosystem, so continuously review your challenges and improvement plans.
- Capture and create: processes should be created to verify, monitor and maintain the quality of captured data.
- Identify the necessary information: this will determine the frequency of the data that you need to generate and avoid the collection of ‘dark data’ (data that organisations collect, process and store during regular business activities but generally fail to use).
- Create uniform data rules: these ensure data is processed consistently, avoiding confusion and mistakes. Data format is also important to consider, to ensure data can be manipulated downstream and turned into insightful KPIs and analytics.
- Evaluate the benefits: the beauty of data is that we can monitor what we measure and then quantify the added value. Define the metrics and maintain them long enough to measure the benefits over time against the baseline. For example, map the outputs and track improvements in optimisation against a defined decarbonisation or net zero plan to demonstrate the value of harnessing real-time building data effectively.
- Security is paramount: any leak of data could be catastrophic for a company, notably through brand damage or financial loss, and in some instances could result in a subsequent physical breach of an asset. Accessibility needs to be managed in accordance with approved cyber security and data protection policies.
- Avoid duplication: have processes in place to combine, consolidate or delete duplicated records. This will help to optimise your data warehouse, ensure good housekeeping and avoid any build-up of 'bad' data.
- Create a culture of trust: data needs to be trusted to inform asset management, maintenance, strategy, comfort and ultimately enhance asset performance.
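Several of the steps above, such as verifying captured data, enforcing uniform data rules and avoiding duplication, can be sketched as a simple record-cleaning routine. This is a minimal illustration only: the field names (meter_id, timestamp, kwh), the accepted timestamp formats and the validation rule are all assumptions, not taken from any particular platform.

```python
from datetime import datetime

# Hypothetical raw meter readings; field names and formats are illustrative.
raw_records = [
    {"meter_id": "M-101", "timestamp": "2024-03-01 08:00", "kwh": "12.4"},
    {"meter_id": "M-101", "timestamp": "01/03/2024 08:00", "kwh": "12.4"},  # same reading, different format
    {"meter_id": "M-102", "timestamp": "2024-03-01 08:00", "kwh": "-5"},    # fails validation
]

# Uniform data rule: every timestamp is converted to one canonical format.
TIMESTAMP_FORMATS = ["%Y-%m-%d %H:%M", "%d/%m/%Y %H:%M"]

def normalise_timestamp(value: str) -> str:
    """Parse any accepted input format and return an ISO 8601 string."""
    for fmt in TIMESTAMP_FORMATS:
        try:
            return datetime.strptime(value, fmt).isoformat()
        except ValueError:
            continue
    raise ValueError(f"Unrecognised timestamp: {value!r}")

def is_valid(record: dict) -> bool:
    """Verify captured data: reject physically impossible readings."""
    return float(record["kwh"]) >= 0

def clean(records: list[dict]) -> list[dict]:
    """Normalise formats, validate readings, and drop duplicate records."""
    seen = set()
    cleaned = []
    for rec in records:
        rec = dict(rec, timestamp=normalise_timestamp(rec["timestamp"]))
        if not is_valid(rec):
            continue  # in a real pipeline, quarantine and log instead
        key = (rec["meter_id"], rec["timestamp"])
        if key in seen:
            continue  # avoid duplication
        seen.add(key)
        cleaned.append(rec)
    return cleaned

print(len(clean(raw_records)))  # prints 1: duplicate and invalid rows removed
```

In practice these checks would sit inside the warehouse's ingestion layer, with rejected and duplicated records logged so that improvement plans can target the sources producing them.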
Further information
Contact Sylvain Thouzeau or Andrew Jackson