8 Advancements Coming to CRE Technology in 2019
As recently as two years ago, companies in the commercial real estate industry were unanimously considered laggards in terms of technology adoption. In a very short period, that narrative has completely changed and some of the most cutting-edge technologies are being deployed in CRE portfolios.
This is especially true when it comes to Internet of Things (IoT) technologies because of the physical nature of real estate. Now that owners and operators have recognized that nearly any physical asset or process can be tracked and optimized with data, the pace of adoption and advancement has accelerated rapidly.
In some cases, technology that has been around for years is finally being deployed at scale. In other cases, disparate technologies are being integrated to serve the specific needs of commercial real estate. In a few cases, pilots are being run that represent the first time certain technologies have been commercially deployed in any industry.
The question is no longer whether to adopt technology, it is how to prioritize the plethora of options to best optimize for the goals of ownership and the specificities of the portfolio. In order to make the right investments, it’s important to know both what has been scaled as well as what the early adopters are beginning to deploy.
Here are eight important advancements coming to commercial real estate technology in 2019.
1. Physical visualizations
Data about specific equipment has long been available through building management systems (BMS). Unfortunately, the high cost of deploying and maintaining a BMS meant that only 10% of buildings could afford these systems. On top of that, because these systems are used to control equipment, access was confined to the building itself for security reasons. Finally, even if another stakeholder, such as an asset manager or owner, was on site and interested in the data, these systems were only accessible to engineers with intimate knowledge of how the equipment functions and training on the technology.
When data became available in the cloud, both technology vendors and real estate executives were excited to democratize real-time data so that it could be accessed by anyone from anywhere. But the reality on the ground was that those who could benefit from this data could not translate graphs and charts into specific actions.
In response, digital twin technology is being deployed, which embeds data from the cloud directly into a visualization of the physical assets themselves. This added visual context enables experienced operators to increase their productivity by being able to access any mechanical room, in any of the buildings they manage. Digital twins can take different forms, as outlined in a recent video by Enertiv’s Lead Software Engineer, Felix Lipov, but one thing is for certain: this technology will become much more widespread in 2019.
2. Meta data
As valuable as performance data side-by-side with a physical visualization is, there's a complementary aspect that is equally important. The "meta data" about the system, such as the make and model, age, maintenance schedule and history, specification sheets, and nameplate values, provides important context to operators.
There is a certain expertise required to deploy sensors to capture data about physical assets. However, in some ways, it is more challenging to capture meta data because it is not a single measurement but a range of values from different sources. Once it is collected, tying largely qualitative information to a specific piece of equipment can quickly become an enormous data management challenge.
Nevertheless, this meta data is being collected, in an efficient and scalable way, in large commercial real estate portfolios. Armed with this meta data, algorithms can provide much better recommendations on how to optimize preventative maintenance schedules, how to run equipment efficiently, and which adjustments to make to prevent impending faults.
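As a rough illustration, meta data can be modeled as a structured record tied to a specific asset. The field names, intervals, and the example pump below are hypothetical, not any particular platform's schema:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class EquipmentMetadata:
    """Hypothetical meta data record tied to one piece of equipment."""
    equipment_id: str
    make: str
    model: str
    install_date: date
    nameplate_kw: float                  # rated power from the nameplate
    maintenance_interval_days: int
    maintenance_history: list = field(default_factory=list)  # (date, notes) tuples
    spec_sheet_url: str = ""

    def maintenance_due(self, today: date) -> bool:
        """True if the last recorded maintenance is older than the interval."""
        if not self.maintenance_history:
            return True
        last, _ = max(self.maintenance_history)
        return (today - last).days > self.maintenance_interval_days

pump = EquipmentMetadata(
    equipment_id="chw-pump-01", make="Acme", model="P-100",
    install_date=date(2012, 5, 1), nameplate_kw=15.0,
    maintenance_interval_days=90,
    maintenance_history=[(date(2018, 10, 1), "bearing lubrication")],
)
print(pump.maintenance_due(date(2019, 1, 15)))  # 106 days since last service -> True
```

Once records like this exist for every asset, the qualitative context (age, service history, nameplate values) can be joined with live sensor data in recommendation algorithms.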
When evaluating platforms to optimize building performance, keep in mind how important meta data is, and inquire about how the company captures and organizes this information.
3. Data combinations
Visualizations and meta data provide valuable context for real-time measurements. However, sometimes the best way to optimize a system or diagnose a recurring issue is to deploy multiple sensor types on the same piece of equipment.
This is especially valuable for large systems such as central boiler or chiller plants. These systems are made up of many pieces of equipment, such as condensers, water pumps, and tower fans, that must operate properly for the plant to function. Small deviations or issues can affect performance downstream, so it's important to track data for each component. But whatever the data type (electrical demand, vibration, temperature, flow, etc.), any single measurement will have blind spots.
More and more, platforms are combining a variety of complementary data types to get a more holistic picture of complex systems. If it's important to the portfolio to get the best data about its equipment assets, ensure that the platform is capable of digesting and combining multiple data types for a single piece of equipment.
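A minimal sketch of the idea: align several sensor streams for one piece of equipment on a shared timestamp, so a rule can look across data types at once. The stream names, readings, and thresholds are illustrative, not a vendor's actual logic:

```python
from collections import defaultdict

# Three hypothetical 60-second streams for one chilled water pump.
power_kw  = {0: 12.1, 60: 12.4, 120: 18.9}   # seconds -> kW
vibration = {0: 0.02, 60: 0.02, 120: 0.11}   # seconds -> g RMS
supply_f  = {0: 44.0, 60: 44.2, 120: 47.5}   # seconds -> degrees F

def combine(*streams):
    """Merge named timestamp->value streams into one row per timestamp."""
    merged = defaultdict(dict)
    for name, stream in streams:
        for ts, value in stream.items():
            merged[ts][name] = value
    return dict(sorted(merged.items()))

readings = combine(("power_kw", power_kw),
                   ("vibration_g", vibration),
                   ("supply_temp_f", supply_f))

# A rule on any single stream could be ambiguous; together, the simultaneous
# jump in power, vibration, and supply temperature at t=120s stands out.
for ts, row in readings.items():
    if row["vibration_g"] > 0.1 and row["power_kw"] > 15:
        print(f"t={ts}s: possible mechanical fault, {row}")
```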
4. BMS integrations
In addition to deploying IoT sensors to digitize physical assets, there are some buildings that have a deluge of data coming from their BMS. This is enough of a problem that multiple companies have been started to simply take this data and make it more user-friendly and cloud-accessible.
There should be a natural alliance between BMS solutions that remotely control equipment, and analytics platforms that provide data-driven insights. Being able to control equipment based on very specific parameters is a powerful tool, but control does not guarantee optimization (this has been covered in a previous article, Why Building Automation is Not as Smart as You Think).
As more owners and operators recognize this, the pressure for BMS providers to open their data to third parties is increasing. In the past, BMS providers would put up barriers, even charging integration fees nearly equal to the cost of installing the sensors in the first place. In 2019, it’s likely that a combination of lower or no integration fees, more standardized data protocols, and pressure from owners will result in BMS vendors playing nicer with analytics platforms.
5. Less networking
To the surprise of many commercial real estate owners and operators, when deploying sensor technology to capture granular data about their buildings, it's usually not the sensors themselves that are the limiting factor but the networking involved in connecting those sensors to the cloud.
There have been rumblings for a few years about long-range networking protocols that will enable hundreds of sensors throughout a building to transmit their data through just one gateway (as opposed to the 8-12 that are usually necessary).
These protocols are now ready for prime time and will translate to more options for owners and operators in terms of the data they can collect, as well as quicker, less expensive deployments. Integrating with these protocols does take some expertise, so make sure to inquire about vendors' ability to utilize "long range / low power, wide area" networking protocols.
6. Edge computing
On the flip side of better networking is edge computing. Edge computing, simply put, is the ability for IoT devices to analyze data at the source of collection, instead of transmitting that data to the cloud. IoT sensors collect an enormous amount of data, and the process of transmitting, storing, and visualizing that data in a useful format is expensive, even if networking is improved as mentioned above.
An IoT device that takes readings of equipment performance every second will have generated 86,400 data points in a single day. Multiply that by hundreds of devices per building and then by the number of buildings in a portfolio. Only the likes of Amazon and Google have the cloud infrastructure necessary to digest and analyze that much data in real-time.
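The arithmetic above is easy to check. The device and building counts below are illustrative portfolio assumptions, not figures from the article:

```python
# Back-of-envelope data volume for second-by-second sensor readings.
seconds_per_day = 24 * 60 * 60
points_per_device_per_day = seconds_per_day              # one reading per second
assert points_per_device_per_day == 86_400

devices_per_building = 200   # illustrative assumption
buildings = 50               # illustrative assumption
points_per_day = points_per_device_per_day * devices_per_building * buildings
print(f"{points_per_day:,} data points per day")         # 864,000,000
```

Nearly a billion data points a day for a mid-sized portfolio, before a single insight has been computed, which is what makes shipping every raw reading to the cloud so expensive.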
Edge computing disperses the most granular analyses to the devices themselves, which can uncover insights that would be invisible in minute-by-minute data, for example. This also means that instead of trying to create rules in a centralized manner that apply to a wide swath of equipment, edge computing enables providers to "train" devices and allow them to learn the specifics of their own data set.
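A toy sketch of that pattern: the device keeps a short rolling window of raw 1 Hz readings, learns its own baseline, and transmits only anomalies instead of every point. The window size, z-score threshold, and readings are all illustrative:

```python
from collections import deque
from statistics import mean, pstdev

class EdgeDevice:
    """Hypothetical on-device analysis: learn a local baseline, send only outliers."""
    def __init__(self, window=300, z_threshold=3.0):
        self.window = deque(maxlen=window)   # last 5 minutes of 1 Hz readings
        self.z_threshold = z_threshold
        self.outbox = []                     # what actually goes to the cloud

    def ingest(self, reading):
        if len(self.window) >= 30:           # wait for a baseline first
            mu, sigma = mean(self.window), pstdev(self.window)
            if sigma > 0 and abs(reading - mu) / sigma > self.z_threshold:
                self.outbox.append(("anomaly", reading))
        self.window.append(reading)

device = EdgeDevice()
for r in [10.0, 10.1, 9.9] * 20:             # 60 normal readings
    device.ingest(r)
device.ingest(25.0)                          # a spike the device flags locally
print(device.outbox)                         # [('anomaly', 25.0)]
```

Sixty-one raw readings in, one message out: the cloud only ever sees the exception, not the stream.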
Owners and operators evaluating solutions in 2019 should consider the relationship between hardware and software, as vendors that deliver both proprietary hardware and software will be better positioned to take advantage of edge computing.
7. Verification and accountability
Most of the advances talked about so far have been in collecting better data, collecting data faster, or collecting data more affordably. But data itself has no value, it’s the insights that can be derived that matter to owners and operators. More accurately, it’s the implementation of those insights that deliver the value and the return on investment.
With better data comes better insights, but how are managers supposed to know if the insights they've invested in are being implemented? As data-driven solutions move from early adopters to the mainstream, more owners and operators are demanding that an ROI be demonstrated early, even in areas that are historically difficult to track, such as maintenance labor hour productivity.
To meet this demand, solution providers are building out a variety of strategies for on-site operators to verify that they have performed tasks associated with the insights that have been generated from analytics platforms.
For example, when maintenance is performed, equipment will often operate at peak efficiency and slowly degrade over time until the next maintenance is performed. If there is a task scheduled and this spike to peak efficiency does not occur, a manager can be made aware so they can investigate why it was deferred.
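That verification check can be sketched in a few lines. The efficiency figures and the 5% jump threshold are made up for illustration, not a vendor's actual algorithm:

```python
def maintenance_verified(efficiency_by_day, task_day, jump_threshold=0.05):
    """True if efficiency jumped by at least `jump_threshold` right after the
    scheduled task day, which is what a completed service looks like in the data."""
    before = efficiency_by_day[task_day - 1]
    after = efficiency_by_day[task_day + 1]
    return (after - before) >= jump_threshold

# Efficiency slowly degrades, then recovers after service on day 5...
serviced = [0.90, 0.89, 0.88, 0.87, 0.86, 0.86, 0.93, 0.93]
# ...versus a deferred task, where the degradation simply continues.
deferred = [0.90, 0.89, 0.88, 0.87, 0.86, 0.86, 0.85, 0.85]

print(maintenance_verified(serviced, task_day=5))   # True
print(maintenance_verified(deferred, task_day=5))   # False -> flag for review
```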
Many data solutions focus on utility savings exclusively, but maintenance savings opportunities can be just as high. The only way to quantify maintenance savings is to track labor productivity, so it’s important that a platform has a strategy for tracking activities digitally and at scale.
8. Comparative ratings
A comprehensive building operations platform will serve every stakeholder involved in building operations. One group that has historically been overlooked is asset managers, who are responsible for approving the capital investments for equipment replacements.
Like many large financial decisions, equipment replacements involve a lot of assumptions. Unfortunately, instead of empirical data, most asset managers must rely on manufacturers’ estimates for the equipment useful life (EUL), and other rules of thumb.
Now, thanks to the sheer quantity of data that has been collected, empirical measures of relative equipment health and performance are enabling asset managers to make much more informed decisions. Perhaps one system has a higher upfront cost, but the lifetime operational costs are much lower. Perhaps a high-efficiency system tends to have many equipment malfunctions and require expensive third-party repairs that negate the superior performance.
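The trade-off can be made concrete with a simple lifetime-cost comparison. All figures below are invented to illustrate the point, not empirical benchmarks:

```python
def lifetime_cost(upfront, annual_energy, annual_maintenance, years):
    """Total cost of ownership over the equipment's useful life (no discounting)."""
    return upfront + years * (annual_energy + annual_maintenance)

# Option A: cheaper to buy, more expensive to run.
a = lifetime_cost(upfront=40_000, annual_energy=9_000,
                  annual_maintenance=2_000, years=15)
# Option B: higher upfront cost, lower operating and repair costs.
b = lifetime_cost(upfront=60_000, annual_energy=6_000,
                  annual_maintenance=1_500, years=15)

print(f"A: ${a:,}  B: ${b:,}")   # A: $205,000  B: $172,500
```

Even in this crude, undiscounted form, the more expensive system wins over its life; empirical health and performance data lets asset managers plug real numbers into exactly this kind of comparison.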
Just as importantly, comparative data and a more accurate picture of equipment lifetimes help asset managers smooth out their investments year-over-year, so they don't get stuck with one year that blows out the budget and another that barely requires any investment. This is a very exciting prospect for asset managers; expect these analytics to get a lot of attention in 2019.