Very pleased with the reaction to my last column, Building Emotion. As I’ve mentioned, each column becomes the resource and the groundwork for the next. It was great to get feedback on the excitement around the potential mash-up of a physical fixed asset and its emotional contents, creating the new Building Emotion identity: an interactive, learning, feeling building environment.
I was also pleased with the timing of this interview with Siemens CEO Matthias Rebellius by CNN Money anchorwoman Amanda Kayne, How can a building evoke emotions?
In it, Rebellius does an eloquent job of explaining why we need emotion in our buildings, why Siemens was buying all these early “evokers” of Building Emotion, and what they hope to do with them.
These early evokers of Building Emotion have been part of AutomatedBuildings.com's ongoing conversations for years. Those of us in the building automation industry are extremely impressed and pleased that their efforts have been recognized and rewarded. Kudos to Siemens for a full embrace of these dramatic changes.
That interview provides history and helps our readers understand the reasons behind Siemens’ recent purchase of Comfy, a company we have long been fans of because of its early entry into Building Emotion. This interview from 2014, Control by the People For the People, provides further insight:
Comfy is a piece of cloud software that plugs into existing Building Management Systems, which we do via BACnet. We’re very focused on making these connections clean and simple, which has been a chronic problem for the BMS world up to this point. In fact, in terms of what we’ve spent our development time on, Comfy is almost the icing on the cake: most of our work has gone into the underlying architecture to tie into these software systems, making everything perform cleanly and reliably.
This Forbes press release provides more details on the acquisition: Siemens Doubles Down On Smart Building Investment, Acquiring Oakland Startup Comfy.
And this article from controltrends.org, Why Siemens bought Comfy and how they will make “Perfect Places” by Eric Stromquist, discusses what the two companies hope to create together.
Another Siemens purchase, J2 Innovations, has been an early innovator in the area of Building Emotion (plus one of the backbones of the Project-Haystack.org community).
In this interview I did with J2 Innovations’ B. Scott Muench, the company’s Vice President, Marketing and Business Development, we talk about J2’s unified toolset for creating the User Experience across multiple client platforms ranging from desktop browsers to mobile and handheld devices (a year later, in 2012, Project Haystack became a thing).
Sinclair: How does J2 Innovations’ FIN Community fit with the Connection Community?
Muench: We created a technology called FIN (Fluid Integration between devices and humans) that is gaining popularity. You could say our users are becoming part of our FIN community, while also leveraging membership in other communities like the Niagara Community. We are also actively involved with emerging communities like Project Haystack. Project Haystack is an open source initiative that exemplifies a connected community with the common goal of solving the “big data” problem through tagging and standardized data models. We are also contributing code to the project to help make Haystack a great protocol for getting data out of any server. So I see J2 Innovations as one of the many pieces coming together in the future of Connection Communities, all working together for a greater good and helping to move an industry forward.
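To make Muench’s point about tagging and standardized data models a little more concrete, here is a rough sketch in Python. The tag names (point, sensor, temp, zone, equipRef) follow common Project-Haystack conventions, but the point records and the helper function are purely illustrative, not any vendor’s API:

# Haystack "marker" tags carry no value; their presence on a record is the meaning.
MARKER = "marker"

points = [
    {"dis": "RTU-1 ZoneTemp", "point": MARKER, "sensor": MARKER, "temp": MARKER,
     "zone": MARKER, "unit": "degF", "equipRef": "RTU-1", "siteRef": "HQ"},
    {"dis": "RTU-1 Fan Status", "point": MARKER, "sensor": MARKER, "fan": MARKER,
     "run": MARKER, "equipRef": "RTU-1", "siteRef": "HQ"},
]

def find(points, *tags):
    """Return every point record that carries all of the requested marker tags."""
    return [p for p in points if all(t in p for t in tags)]

# Any tool that understands the tags can ask a portable question,
# with no site-specific point-name mapping:
for p in find(points, "zone", "temp", "sensor"):
    print(p["dis"], p.get("unit", ""))

The value of the standardized model is that the same three-tag query works against any site that has been tagged the same way.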
For more history on the Project-Haystack community, read Seeking stackable semantics and How to Build a Haystack, the history of Haystack as chronologically documented by AutomatedBuildings.com.
So why am I bringing up all this history? To show that change can take a long time. The plain fact is that a lot of what I’m saying right now may take years to (hopefully!) make sense.
In keeping with the cumulative nature of these regular columns, I want to add to last month’s column the concept of the Evolving Building Edge-Bots.
Our cloud consultant and contributing editor Toby Considine clears the fog and takes us to the edge with this quote:
The term Cloud in an architectural diagram, as originally used, meant “it doesn’t matter where the computing is”; i.e., the term Cloud meant vague and undefined. As happens so often, a few big data center operators (you know their names) re-defined it to mean “in our far-away high-up location.” This definition supports their marketing but restricts the original purpose of the term.
Fog is taking back the cloud by pointing out that clouds can be low to the ground and widely dispersed. Edge-based analytics in the IoT, for example, are near the Things rather than far away.
Fog is still just as vague, still a cloud. Is the intelligent processing in each sensor? In each collection of similar sensors? In a single integrated system?
The answer is, it depends.
More and more IoT applications are choosing when to transmit data to the cloud, usually near an event or trend. In 2015, IoT systems collected nearly 8 Zettabytes of data (a Zettabyte is a billion Terabytes). Most of this data is never reviewed or analyzed. Local storage and local event processing can reduce the ever-growing data collection, as well as the network bandwidth it requires.
Local event processing and local storage can reduce the data that needs to be stored in the [high] Cloud, and allow the data that is transmitted to be sent in more efficient batch transfers. Some simple systems are already transmitting to the cloud only the data antecedent and proximate to an event.
In a trivial and easy-to-understand example, consider the web-enabled doorbell, recording video continuously. It may have the capacity to keep a few hours of video locally. When the doorbell rings, it can send the 30 seconds before and 30 seconds after to the cloud (transmitting the Antecedent and Proximate data). Before this edge processing, users would see the hat of a delivery person walking away. With this intelligent edge processing, the user sees the face of the person coming onto the porch and ringing the bell.
Now extend this thought to whatever data collection you do. Perform simple analysis locally, and quickly. I say quickly because one principle for good IoT is to “analyze quickly, while it still matters”. This approach can preserve privacy while lessening the need to transfer [mostly] unused zettabytes to the remote data center.
So, the Fog is the Cloud, just one near the action, on the edge…
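Here is a minimal sketch of that antecedent/proximate pattern in Python. The sensor read, the event test, and the upload function are hypothetical stand-ins for whatever your edge device actually measures and decides:

from collections import deque
import time

PRE_EVENT_SECONDS = 30    # antecedent history to keep locally (per the doorbell example)
POST_EVENT_SECONDS = 30   # proximate data to capture after the trigger

history = deque(maxlen=PRE_EVENT_SECONDS)   # rolling local buffer, one sample per second

def read_sensor():
    # Stand-in for whatever the device samples: a video frame, kWh, a temperature...
    return {"t": time.time(), "value": 0.0}

def event_detected(sample):
    # Local, fast analysis: "analyze quickly, while it still matters".
    return sample["value"] > 1.0   # hypothetical threshold

def send_to_cloud(batch):
    # Stand-in for one efficient batch transfer instead of a continuous stream.
    print("uploading", len(batch), "samples")

while True:
    sample = read_sensor()
    history.append(sample)
    if event_detected(sample):
        batch = list(history)                  # the antecedent data
        for _ in range(POST_EVENT_SECONDS):    # plus the proximate data
            time.sleep(1)
            batch.append(read_sensor())
        send_to_cloud(batch)
    time.sleep(1)

Everything that never surrounds an event simply ages out of the local buffer and is never sent anywhere.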
So what is a bot and why is it better on the edge? And why is there an evolution to Edge-Bots?
From this resource, botpublication.com, 10 Bot Building platforms and why you need to build a bot for your business (Part 1) by Masha Kubyshina:
Bots are the new way businesses talk to their customers. First, every business needed to have a website, then a mobile-friendly design, then an app. Businesses are now building bots as a customer communication channel.
Bot adoption is growing fast! Since Facebook opened its Messenger platform in April this year, there have been 33,000+ bots created on the platform. The great thing about bots is that the communication goes both ways. The “machine” attempts to understand the questions asked and replies based on the user’s intent. The goal is to be fast, helpful and efficient. Currently, the majority of bots are automated with an assistant in the loop to help train the bot on new questions.
While bot ecosystems are still in their infancy, bot adoption by brands and users is growing exponentially. If you have a website, the chances are that you will need a bot in the future. Starting early puts you ahead of the game.
Bot platforms can exist on small-form-factor computers, microcomputers (like the Raspberry Pi), BeagleBoards, and even modified cellphone bits on stand-alone bot boards.
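How little code does a bot on one of these small boards need to get started? The toy sketch below, in Python, matches a question to an intent and replies, handing anything it cannot classify to the human assistant in the loop; the intents and answers are invented purely for illustration, and a production bot would use a real natural-language service instead:

# A toy intent matcher: the essence of the question -> intent -> reply loop
# described above. Everything here is illustrative only.
INTENTS = {
    "temperature": "The zone is currently 72 degrees.",
    "lights": "The lights in this zone are on.",
    "hours": "The building is open from 7am to 6pm.",
}

def reply(question):
    q = question.lower()
    for keyword, answer in INTENTS.items():
        if keyword in q:
            return answer
    # Unknown intent: defer to the human assistant and learn from the answer.
    return "I'm not sure yet; let me check with a human and learn from the answer."

if __name__ == "__main__":
    print(reply("What is the temperature in here?"))
    print(reply("What are your hours on Saturday?"))
    print(reply("Can you book me a meeting room?"))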
A long-standing joke in the automation industry is that we are always moving towards or away from centralization. Our recent journey to put everything in the far-away cloud is over; the security risks are too great. So now fog is the cloud, but closer to the action, out on the edge. These edge-bots, capable of learning and building emotion, are being developed rapidly by creators and makers with low-cost microcomputers.
I see these devices not as new hardware in the field but as evolved existing hardware that performs multiple functions such as comfort, lighting, occupancy acknowledgment, and yes, the self-learned building emotion, all of it near the action, on the edge. Close-range communication would be via BACnet, Bluetooth, Zigbee, etc. The evolved hardware, the edge-bot, would live near the close edge, either in a lighting fixture or an air conditioning terminal close to a power source, and would be commissioned and interacted with by various wireless devices. The interaction of those devices would start to define the building emotion for that portion of the building.
The evolving emotions would largely be created by a “deviceless” mentality, the idea that users wouldn’t have to use devices, apps or interfaces to access smart services. Instead, the method of access could be anything from a mobile phone to facial recognition. The underlying idea is that the intelligence is hidden away in the engine room, always there and always on, but never visible to the user. This idea seems to arise from a widespread frustration with the countless apps and interfaces we constantly need to open, learn, master and update.
An example of resources under development is Cognitive Services from Microsoft Azure. From its homepage:
Infuse your apps, websites, and bots with intelligent algorithms to see, hear, speak, understand and interpret your user needs through natural methods of communication. Transform your business with AI today.
More evolution is underway at the Sandstar Project, a working group of the Project Haystack community. From its abstract:
The Sandstar project will change how we think of DDC. Major improvements such as hardware-independent Sedona code, historical-data-based control logic, and driver abstraction via Haystack can be achieved now. With the improvements to Haystack ops, where Sedona components can be created, changed, deleted and linked, artificial intelligence can be utilized to generate and improve upon human-generated DDC code. We call this feature meta-morphing programming. On the roadmap, having a Haystack client in Sedona will give us P2P device communication along with historical-data- and analytics-based control.
I can’t close without mentioning that Contemporary Controls has launched a blog for building automation professionals and enthusiasts, subtitled “Building on Open Control” (https://www.ccontrols.com/blog/category/general/). The blog is written by Zach Netsov, a Product Specialist for Contemporary Controls’ BASautomation line of products. From one of his recent posts:
We recently released a 12-point I/O board for the Raspberry Pi along with a BACnet server and Sedona virtual machine that runs on the Pi and asked people to tell us how they used them. Within weeks they had them installed on jobs for applications we would never have assumed. In addition, we got questions and suggestions on how to make it better for them. There is interest, and many are willing to share stories.
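If you want to experiment along the same lines, here is a rough sketch of talking BACnet from a Raspberry Pi using the open-source BAC0 Python library; the IP addresses, object instances, and setpoint below are placeholders you would replace with your own devices:

import BAC0   # open-source Python BACnet stack: pip install BAC0

# Start a lightweight BACnet/IP client bound to the Pi's network interface.
bacnet = BAC0.lite(ip="192.168.1.50/24")          # hypothetical Pi address

# Read the present value of an analog input on a BACnet server device.
zone_temp = bacnet.read("192.168.1.60 analogInput 1 presentValue")
print("Zone temperature:", zone_temp)

# Write a new setpoint back at priority 8 (a typical operator priority).
bacnet.write("192.168.1.60 analogValue 2 presentValue 72 - 8")

bacnet.disconnect()

Paired with the Sedona virtual machine mentioned above, the same Pi can also run control logic locally, right on the edge.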
If you’re wondering how these hardware platforms will be humanized, here is an interview I did with Trisala Chandaria, Co-Founder and CEO of Temboo, that may provide some insight.
Yes, it is early days, but mindful change is in the air. A quote from Toby Ruckert of Unified Inbox says it about as well as it can be said: "To succeed at digital transformation, instead of making humans more technical, we need to make technology more human."
Ken Sinclair | Editor/Owner/Founder
Ken Sinclair has been called an oracle of the digital age. He sees himself more as a storyteller and hopes the stories he tells will be a catalyst for the IoT future we are all (eventually) going to live in. The more than 50 chapters in that ongoing story of digital transformation below are peppered with HTML links to articles containing an amazing and diverse amount of information.
Ken believes that systems will be smarter, self-learning, edgy, innovative, and sophisticated, and that to create, manage, and re-invent those systems, the industry needs to grow its most important resource, our younger people, by reaching out to them with messages about how vibrant, vital and rewarding working in this industry can be.