Smart buildings are one of the most common categories of IoT implementations. Seeking to guide smart building stakeholders such as owners, contractors, and installers, an IoT Security Foundation working group published a set of guidelines in a 2019 whitepaper titled “Can You Trust Your Smart Building?” (IoTSF, 2019).
The authors begin by defining what smart buildings are: systems designed to fully manage and control all aspects of a building, covering sub-systems such as HVAC, UPS, elevators, lighting, fire detection, and security. Optimal asset management and resource consumption lead to energy and water savings, reduced costs, less waste, improved safety and security, and overall better maintenance and occupant satisfaction. It is thus an expanding multi-billion-dollar global market.
Smart buildings potentially impact the wellbeing of all citizens of the modern world. If there is a trust issue, as the title suggests, and they cannot be trusted, then we should seek to identify the challenges and the ways to address them. The relevance of the whitepaper is clear.
Inside a smart building, sensors gather relevant data about the controlled environment, and data analysis facilitates both automation and human decision making. Management systems are increasingly offered as a service in the cloud. Smart buildings are therefore IoT systems: they share all the characteristics of IoT, utilizing sensors and Internet-connected smart objects that generate large amounts of time-series data, automatically analyzed through AI in support of decision making.
Threats and Risks
The authors next turn to the risks. Threats to a smart building system can come from multiple parties, including insiders, rivals, criminals, and activists. As the authors show, it is all too easy today to browse a special search website and find Internet-exposed building management systems (BMS) accessible by essentially anyone. Those buildings could belong to businesses, health organizations, education establishments, and various other sensitive sites. Security companies have shown the ease with which some Building Automation Systems (BAS) could be hacked (Forescout, 2019).
Furthermore, the authors mention the devastating effects of the Mirai botnet attack and of the WannaCry and NotPetya global ransomware attacks. In this reviewer’s opinion, the whitepaper authors could have distinguished between the former and the latter two. While Mirai primarily targeted IoT devices (such as CCTV cameras) with weak passwords, turning them into a botnet army later used for the actual onslaught, the two ransomware attacks targeted computers running vulnerable Windows operating systems regardless of their function, causing them direct damage. But since many IoT systems run old, unpatched operating system versions, this made them especially susceptible.
What we can observe from this comparison is that IoT devices are both easily compromised en masse, leading to collateral damage through botnets and DDoS attacks, and easily targetable individually, where their special role can make the direct damage severe.
Following the introduction of risk, the authors discuss how it should be managed. The approach taken is that not all systems within a building are equally important and that not all data is equally sensitive. A series of questions is thus provided for stakeholders to weigh the risks. This section is too short and unstructured to serve as a guide for effective threat modeling or risk assessment. On the other hand, Aufner (2020) surveyed several threat modeling techniques that could be used, such as STRIDE or CORAS for security, LINDDUN for privacy, and DREAD for risk. These should be used as a starting point, while recognizing that research has shown gaps between common threat models and IoT due to a lack of consideration for hardware and physical interactions (Aufner, 2020). When it comes to smart buildings, physical security is obviously essential. The whitepaper only briefly mentions physical security without going into any detail.
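To make the risk-weighing idea concrete, the following sketch applies DREAD-style scoring to a few building sub-systems. The sub-system names and ratings are illustrative assumptions of mine, not figures from the whitepaper or from Aufner (2020).

```python
# Illustrative DREAD-style risk scoring for smart building sub-systems.
# Sub-system names and ratings below are hypothetical examples.

def dread_score(damage, reproducibility, exploitability,
                affected_users, discoverability):
    """Average of the five DREAD ratings, each on a 0-10 scale."""
    return (damage + reproducibility + exploitability
            + affected_users + discoverability) / 5

subsystems = {
    # name: (D, R, E, A, D) -- hypothetical ratings
    "HVAC controller": (6, 8, 7, 9, 8),
    "Fire detection": (10, 4, 3, 10, 2),
    "Lobby lighting": (2, 9, 8, 3, 9),
}

# Rank sub-systems so mitigation effort goes to the highest risk first.
ranked = sorted(subsystems.items(),
                key=lambda kv: dread_score(*kv[1]), reverse=True)
for name, ratings in ranked:
    print(f"{name}: {dread_score(*ratings):.1f}")
```

Even a crude ranking like this would give stakeholders more structure than the whitepaper’s open-ended list of questions, though real assessments would calibrate the ratings per threat scenario.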
A relevant term in this respect is Cyber-Physical Systems (CPS). A review paper focusing on smart buildings as cyber-physical systems recommends increasing analytics and visualization, making smart building systems even smarter for increased resilience and security, and, lastly, consistently including security throughout the system lifecycle (Osisiogu, 2019). Here again, identifying the interactions between the physical and the cyber security aspects is imperative for effective risk management.
As for best practices, the whitepaper does not attempt to define its own framework and instead prefers to reference that of NIST (NIST, 2018). I fully support the reliance on a respected standards body’s publication for security implementation.
So, can we trust our smart building? The question from the title remains unanswered. Readers gain a better appreciation for real-world cases, vulnerabilities, and risks. Surely, one’s trust in smart buildings is diminished by that account. On the other hand, direct stakeholders involved in designing, constructing, maintaining, and owning smart buildings can gain insight into what it would take to increase trust.
Overall, the whitepaper does a good job of moving from domain introduction, through problem definition, and into the solution space with sound recommendations for how to proceed with security implementation. The whitepaper does not dive into the details of any single topic, but instead paints the entire landscape with a broad brush. By doing so it primarily raises awareness of a pervasive aspect of our lives, one dealing with our day-to-day surroundings, with global implications for sustainability and personal implications for safety, privacy, and wellbeing.
Aufner, P., 2020. The IoT security gap: a look down into the valley between threat models and their implementation. International Journal of Information Security, 19(1), pp. 3-14.
Forrester surveyed 300 IT and OT decision makers from diverse companies to reveal the extent to which edge analytics is utilized in IoT deployments, and found that around half have either already implemented edge analytics or plan to implement it within a year. This post reports on their findings as presented in a January 2019 whitepaper and identifies issues requiring further examination for practical adoption.
Edge analytics in IoT is a relevant topic as more Internet-connected edge devices are added every day and as data quantities keep increasing. The topic is at the forefront of IoT implementations these days, as evidenced by the fact that major cloud providers offer solutions for it. Amazon Web Services Lambda, Microsoft Azure Functions, and Google Cloud Functions are prominent examples. It is also an established academic research topic, with some work advocating for a move from cloud backends to edge and fog computing (Schooler, et al., 2017).
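The appeal of analyzing data at the edge can be illustrated with a minimal sketch: aggregate raw readings locally and ship only a compact summary to the cloud. The function and field names below are illustrative assumptions of mine, not any vendor’s API.

```python
# Minimal sketch of edge analytics: aggregate raw sensor readings locally
# and upload only a compact summary, reducing data sent to the cloud.
from statistics import mean

def summarize_window(readings, threshold):
    """Reduce a window of raw readings to a small summary record."""
    return {
        "count": len(readings),
        "mean": mean(readings),
        "max": max(readings),
        "alerts": sum(1 for r in readings if r > threshold),
    }

raw = [21.0, 21.2, 20.9, 35.5, 21.1]  # e.g. one minute of temperature samples
summary = summarize_window(raw, threshold=30.0)
# Only `summary` (four fields) would cross the network instead of every sample.
```

The same reduction logic could run as a cloud-managed function deployed to the device, which is the deployment model the serverless offerings above gesture toward.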
In terms of methodology and ethical disclosure, Forrester’s paper does a decent job of presenting the facts. The paper was commissioned by Dell Technologies and VMware to report on a survey conducted during October-November 2018. Methodology highlights are provided across three appendices, at a level of detail one could expect from a business whitepaper.
Edge analytics has been a focus area for Forrester. These days, for instance, they are looking into the increase in edge analytics implementations for content delivery networks (CDNs) in light of the COVID-19-induced rise in content consumption (Staten & Stutzman, 2020). Edge analytics certainly could support operational IoT initiatives, but this example suggests it can have a direct user-experience impact as well.
The paper starts by listing the IoT-enabled use-cases reported by survey participants: security and surveillance, tracking and tracing, energy management, automatic operations, predictive maintenance, and various other use-cases, some industry-specific. The appendix indicates that the use-case categorization was provided to participants upfront, which could miss some finer details. Since not all methodology details are shared, it is hard to judge fully. In any case, these results are fairly consistent with other findings surveyed here (Horev, 2020).
The whitepaper builds the case for edge analytics by reporting that 40 to 49 percent of participants cite security, high costs, and accessibility as potentially limiting factors for data analytics in the cloud. Furthermore, it seems half are either expanding, implementing, or planning to implement edge analytics within 12 months. Driving factors were found to be the growth of edge-generated data, security, cost efficiency in data transportation, reduced latency, and regulatory reasons. Overall, the report is informative, well written, and logically structured.
The best part of the paper is kept for last. In it, the authors move from reporting results to making one primary recommendation: organizations are encouraged to use specific criteria for selecting the IoT use-cases best suited for edge analytics. For example, the authors suggest that by looking for IoT use-cases characterized by large volumes of data as well as low latency requirements, the benefit potential of edge analytics could be maximized. The use-case identification theme is kept throughout the remainder of the paper, with a rather generic set of recommendations. The question of where it is best to put our edge analytics efforts is a great practical question, and the idea of criteria combinations is a strong one. The paper only scratches the surface here, and further research should be well received.
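The criteria-combination idea could be reduced to a simple scoring exercise. The use-cases, scales, and equal weighting below are hypothetical assumptions for illustration, not the paper’s method.

```python
# Hypothetical sketch of criteria-based use-case selection: score each
# IoT use-case on the factors the paper highlights (data volume, latency
# sensitivity) plus an illustrative extra (connectivity cost), and prefer
# the highest scorers for edge analytics.
use_cases = [
    # (name, data_volume, latency_sensitivity, connectivity_cost) on 1-5 scales
    ("Predictive maintenance", 5, 4, 3),
    ("Security & surveillance", 5, 5, 4),
    ("Monthly energy reporting", 2, 1, 1),
]

def edge_fit(volume, latency, cost):
    # Equal weights are an assumption; a real assessment would tune them.
    return volume + latency + cost

best = max(use_cases, key=lambda u: edge_fit(*u[1:]))
```

Even this toy version makes the point: a use-case like monthly reporting, with low volume and no latency pressure, scores poorly and can stay in the cloud.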
And thus, the whitepaper describes a reality of edge analytics being increasingly adopted. If so, one could have hoped for the report to go beyond incentivizing adoption to offering implementation recommendations and alerting readers to expected challenges. But this is not the case. The savvy reader might not be surprised by that: industry whitepapers tend to stop at the point at which the prospective customer would want to seek further guidance. Avnet’s whitepaper on the exact same topic serves as a second excellent example of that (Avnet, 2018).
Turning to academia, I could not find academic surveys over multiple industry cases for identification of successful patterns of edge analytics implementation. Most academic papers are either domain-specific (for example, Ferdowsi et al. (2019) on intelligent transportation systems) or theoretically driven (for example, Harth et al. (2018) on predictive intelligence algorithmic efficiency). The power of surveys such as Forrester’s could be in the finer real-world insights but those are lacking.
The paper is in fact not concerned at all with what it takes to implement edge analytics. Consider, for instance, the edge device itself. Research has shown that the choice of machine learning algorithm run on a Raspberry Pi platform affects efficiency and accuracy across multiple datasets (Mahmut, et al., 2018). Not cautioning the reader about at least some of the main concerns could be perceived as detracting from the whitepaper’s credibility.
Edge analytics does not necessarily mean running computations directly on the sensing device but rather across multiple devices at the proximity network. The industry term for that is Fog Computing and discussions on its role in IoT can be traced back to 2012 (Bonomi, et al., 2012). Unfortunately, the whitepaper does not mention the term let alone report its role in surveyed companies’ implementation.
Moreover, academic researchers have suggested an approach that uses publish/subscribe systems to organize edge data analytics in fog computing scenarios (Florian & Neagu, 2018). Publish/subscribe technology is indeed heavily utilized in industry. When surveying companies, a lot can be gained by going into some implementation detail; it could make the whitepaper much more practically insightful.
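To show why publish/subscribe decouples edge components so well, here is a toy in-process broker. A real deployment would use a broker protocol such as MQTT; the topic name and alert threshold below are illustrative.

```python
# Toy in-process publish/subscribe broker illustrating how edge analytics
# components can be decoupled by topic. Real systems would use a networked
# broker (e.g. MQTT) rather than this in-memory sketch.
from collections import defaultdict

class Broker:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        for callback in self._subscribers[topic]:
            callback(message)

broker = Broker()
alerts = []
# An analytics component subscribes to a sensor topic...
broker.subscribe("building/floor1/temp",
                 lambda reading: alerts.append(reading) if reading > 30 else None)
# ...and sensors publish without knowing who consumes the data.
broker.publish("building/floor1/temp", 22.5)
broker.publish("building/floor1/temp", 31.0)
```

The decoupling is the point: new analytics consumers can be attached to a topic without touching the publishing sensors, which is what makes the pattern attractive in fog scenarios.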
In summary, Forrester’s white paper rightfully identifies edge analytics as a major industry trend. It provides insight into its drivers and benefits for business, and general guidelines are provided for selecting the right IoT use-cases for adoption. However, the paper falls short of illuminating any practical aspects of edge data analytics implementation. Being able to ask hundreds of industry decision makers about a hot topic, one could have hoped Forrester would opt to extract practical insights on implementation trends, but that will have to wait for another time.
Mahmut, T., Basurra, S. & Mohamed, M., 2018. Edge machine learning: Enabling smart internet of things applications. Big Data and Cognitive Computing, 2(3).
Schooler, E. et al., 2017. An Architectural Vision for a Data-Centric IoT: Rethinking Things, Trust and Clouds. IEEE 37th International Conference on Distributed Computing Systems (ICDCS), pp. 1717-1728.
Internet of Things applications purport to deliver great value and comfort in the hands of consumers through Internet-connected smart devices. But let us take a look back to the underlying vision and see what remains unrealized.
Ubiquitous Computing (ubicomp)
Ubiquitous computing refers to the phenomenon of computers quietly permeating our lives, in abundance and in many forms. The concept of smart homes is a potential manifestation of ubiquitous computing in that the home environment can be filled with many computing devices in all shapes and sizes, performing various tasks for the benefit of the people living in it. Smart buildings and smart cities could further extend this notion.
Wearable technology is a tactile example of the ubicomp vision coming to life. Smart watches, bracelets, ties, and glasses have all been developed and applied, some with wide commercial success. Main applications are health, sports, and entertainment.
The term was coined by Mark Weiser around 1988. Weiser and his colleagues at Xerox PARC imagined a world in which computers are unobtrusive, quiet servants seamlessly aiding us with everything to improve our quality of life. They advocated for calm technology, which unfortunately stands in stark contrast to some of this day and age’s anxiety-inducing mobile and social technology.
“calm technology will move easily from the periphery of our attention, to the center, and back”
Furthermore, designs which enable “locatedness” allow a person to use a technology while staying attuned to peripheral cues. Contrast that with the way mobile phone apps’ push notifications are designed to do the exact opposite.
Ambient Intelligence (AmI)
A term coined in the 1990s by Eli Zelkha and Simon Birrell, AmI puts more emphasis on technology’s ability to react to our presence and on the user experience and interaction in system design. The simplest example would be an automatic door. A defining characteristic of AmI is described as
“The fact that AmI systems must be sensitive, responsive, and adaptive highlights the dependence that AmI research has on context-aware computing”
(Cook et al., 2009)
Cook, D., Augusto, J. & Jakkula, V., 2009. Ambient intelligence: Technologies, applications, and opportunities. Pervasive and Mobile Computing, 5(4), pp. 277-298.
The Disappearing Computer (DS)
Computers increasingly become invisible to people as they cease to be separate physical entities with which we directly interact. Computers become unnoticeable, receding to the background, allowing us to consume information and socially interact in natural ways (described, for example, in N. Streitz and P. Nixon, Special issue on ‘the disappearing computer’, Communications of the ACM, 48(3), pp. 32-35, March 2005). Or as Weiser famously put it:
“The most profound technologies are those that disappear. They weave themselves into the fabric of everyday life until they are indistinguishable from it.”
M. Weiser. The computer for the twenty-first century. Scientific American, 265(3), pp. 94-104, 1991.
Pervasive computing can be seen as a business incarnation of ubicomp, realized through supportive technologies such as smart devices, sensor technology, wireless and mobile networking, human-computer interaction, and context-aware systems.
An extensive technological survey is offered in “Ubiquitous Computing: Smart Devices, Environments and Interactions” by Stefan Poslad (2009 John Wiley & Sons).
But whether Pervasive computing business initiatives do in fact fulfil the ubicomp calm technology vision is a different matter.
All four paradigms were actively researched through the first decade of the 21st century. In the second decade, focus declined somewhat in favor of the Internet of Things.
Internet of Things (IoT)
The term “Internet of Things”, as commonly told, was coined by Kevin Ashton around 1999 while working at Procter & Gamble after having the idea to attach RFID tags to inventory items (such as lipstick) for stock management.
As can be seen from the fact that IoT’s first application was to innovate in supply-chain management, IoT should be considered to have evolved primarily from pervasive computing. It is a technological solution to a set of business problems. And while ubicomp, AmI, and DS all share a human-centric vision at their core, IoT technology is often adopted for internal business reasons, for the sake of digital transformation, not necessarily with added customer value.
IoT and pervasive computing both share the focus on Internet connected devices, whereas ubicomp, AmI and DS do not necessitate it by their definition.
IoT systems commonly follow a 3-tier architecture: edge devices (1st tier), connected to the Internet via an optional gateway (2nd tier), and cloud-based services (3rd tier). Commercial IoT architectures are abundant (random examples: Microsoft, WSO2). And if one compares this to IBM’s 2003 pervasive computing technology stack, the architecture is essentially the same, of course implemented end-to-end with IBM’s suite of products.
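The tiered flow can be sketched schematically. All the function and field names here are illustrative placeholders of mine, not any specific vendor’s architecture.

```python
# Schematic sketch of the 3-tier IoT flow: device -> gateway -> cloud.
# Names and logic are illustrative placeholders only.

def edge_device(raw_sample):
    """Tier 1: sense and lightly preprocess at the device."""
    return round(raw_sample, 1)

def gateway(sample):
    """Tier 2 (optional): tag the reading and forward it toward the cloud."""
    return {"source": "gateway-01", "value": sample}

def cloud_service(message):
    """Tier 3: store/analyze; here, just derive a simple status."""
    return "alert" if message["value"] > 30 else "ok"

status = cloud_service(gateway(edge_device(31.04)))
```

Swap the tier labels for IBM’s 2003 pervasive computing stack and the shape of the pipeline is, as noted above, essentially unchanged.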
Everyone is excited about IoT these days and its implications for business. Businesses are forewarned not to pass on this opportunity for digital transformation. Gartner says IoT is over the hype and there are real benefits for businesses, but since there are also risks adoption should be highly focused on business value. One cannot argue with the significance of this global trend.
To what extent can IoT technologies realize the vision of ubicomp, AmI, and DS?
I think the answer is that they are only an implementation medium.
First, as described in HBR’s Analytics 3.0, historically there was a shift from data for business intelligence toward customer value in the form of information derived from big data. But the future lies in insight derived from information. In this sense, IoT platforms are only a medium.
Second, IoT platforms are not necessarily innovating in user experience. In most cases, the user facing application is developed in very standard ways, as a mobile app or a website.
Proponents of ubicomp emphasize interoperability. Interoperability is what enables cooperating computers to provide a seamless experience. Again, IoT systems are not necessarily developed with this vision in mind. In fact, the opposite is often true, as a bewildering proliferation of edge technologies and proximity networks hinders interoperability.
I highly recommend Bill Buxton’s lecture titled “Designing for Ubiquitous Computing” in which he discusses these issues.
Buxton asks us to consider how the smart phone existed well before Apple’s iPhone. Still, the iPhone brought flow and a user interface never seen before. The move from function to flow is very important, but on its own it is no longer enough to make a new product excellent. The next challenge is much more important: achieving flow at the “society of devices” level.
To illustrate, Buxton describes the use-case of conducting a mobile phone call while going in and out of the car where the phone and car exchange roles, user interfaces switch, and it all happens seamlessly without requiring too much of our attention. This level of interoperability is what we should be seeking a lot more of to realize the ubiquitous computing vision.
Interoperability, user experience and context awareness are unrealized challenges for many of the Internet of Things implementations today.