24 Apr 2023
In a post-pandemic world, we have seen an acceleration in digital transformation where data is being harvested from multiple sources as new PropTech and other analytical platforms have become available.
The users of data in real estate include brokers, financial institutions, property management firms, insurance companies and many others, all interrelated within the real estate cycle. The outcomes resulting from the information flowing through these firms and practices will ultimately impact a wide range of stakeholders and beneficiaries.
Efforts have been made to address the non-uniformity of real estate data, because the variances across regions and countries were vast, leading to significant operational challenges.
In 2020, the EU Commission created a task force on Real Estate Data Standards (REDS) with the intent of promoting transparency and facilitating data sharing across the real estate sector. Earlier attempts at standardization included the Real Estate Standards Organization (RESO) in 2011 and the European Property Market Standards in 2013, and standards for valuation were also developed. These organizations were established to promote consistency and comparability in the reporting of property and other data, including listings certification.
Today, even with standards in place, the proliferation of PropTech firms means we have more accessible data than ever, but that data, and the systems it resides in, remain siloed and do not communicate with one another. Internal systems, too, often lack the means to integrate information. As it stands, real estate professionals in many cases remain at a loss as to how to capture the value these new PropTech systems can bring to the industry and to crucial decision-making processes.
Overall, in spite of previous standardization attempts, we have to acknowledge that these problems have not been effectively resolved. There are opposing views on whether there should even be a common data standard across the industry, and it is not yet clear whether each company should instead simply take responsibility for creating a standard across its own business.
Among the outstanding issues that data standardization could address are the differences in how data is integrated. A lack of standardization often causes inconsistency, which invariably leads to a multitude of interpretations and conclusions. Muddy waters, in other words. More consistent reporting could also improve the way data is delivered, bringing more confidence in what is actually being uploaded for use.
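To make the integration problem concrete, here is a minimal, purely illustrative sketch of what happens in the absence of a common standard: two data feeds describe the same property with different field names and units, so every consumer must build its own mapping before the records can even be compared. All field names, feeds, and conversions below are invented for illustration and are not drawn from any real standard.

```python
def normalize_listing(record: dict, source: str) -> dict:
    """Map a source-specific listing record onto one common schema.

    The feeds ('feed_a', 'feed_b'), field names, and unit conversions
    are hypothetical, purely to show why a shared standard removes
    guesswork from data integration.
    """
    if source == "feed_a":  # reports area in square metres
        return {
            "address": record["addr"],
            "area_sqm": record["size_m2"],
            "price_eur": record["price"],
        }
    if source == "feed_b":  # reports area in square feet
        return {
            "address": record["street_address"],
            "area_sqm": round(record["sqft"] * 0.0929, 1),  # sq ft -> sq m
            "price_eur": record["asking_price_eur"],
        }
    raise ValueError(f"unknown source: {source}")


# The same property, described two different ways:
a = normalize_listing({"addr": "1 Main St", "size_m2": 80, "price": 250_000}, "feed_a")
b = normalize_listing({"street_address": "1 Main St", "sqft": 861, "asking_price_eur": 250_000}, "feed_b")
print(a["area_sqm"], b["area_sqm"])  # comparable only after normalization
```

With an agreed industry schema, the mapping step above would be written once, at the source, instead of being reinvented (and interpreted differently) by every firm downstream.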
A lack of alignment between the executives responsible for day-to-day operations and innovation teams can impede progress. It's often the case that the C-suite and their respective business units work in silos and therefore miss out on the enhanced communication that produces the best outcomes for handling and understanding data.
Technology leadership needs to lead, and to communicate in plain language, if an understanding of the real issues, and of the potential benefits, is to be achieved. Of course, data standards would help!
The use of AI still needs human intervention – is the real estate industry ready for this level of due diligence and detail? AI can be powerful, but adoption requires thoughtful inputs to ensure accuracy, outputs that have useful application, and results that are free of bias.
Altogether, it's clear that a better understanding of the technology on offer is needed to engage with PropTech – otherwise the industry cannot get the full value of the technologies being created for its benefit. Standardization could well speed the process by eliminating some of the guesswork in getting everyone on the same page.
Until standardization is widely adopted, whether broadly across the industry or handled in-house, real estate professionals will have to work out best practices for dealing with the proliferation of data now available. In fast-moving markets, making the best decisions is a continual daily quest. Nothing is static. Nor are markets uniform, which requires new methodologies to harness the available information and provide the level of contextualization needed to make sense of it all.