In the joint demo of the 4APIs project, we combined a Building Information Model (BIM) with API data. Here we have a BIM:

We make use of the Vertex Showroom product, which can show any BIM in a web browser, even on mobile. Please take a look at the interactive version of the BIM here!
The BIM mirrors reality rather well. The following images are from the Ypsilon Community Centre / School in Yli-Maaria, Turku, Finland:

The photo was taken in January when we had our demo project kick-off in the Ypsilon smart building. The pilot is carried out by the space management center (Tilapalvelukeskus) and the education division of the City of Turku. In the pilot, they are gathering experience with using BIM and sensor data.
The BIM was created during the planning and construction phase of the building a few years ago. According to Projektiuutiset (2019) and http://www.turku.fi/Ypsilon, it was found very useful during the construction phase.
Sensor data is being collected from a few rooms inside the building. Let’s take a look at the Ypsilon community center from a bird’s-eye perspective:

Seen from above, the building has the shape of the letter Y, which matches its name, Ypsilon. In the Y-shaped floor map, you can see sensors in three rooms providing real-time condition data including, for instance, the following variables (a sketch of such a reading follows the list):
- Temperature
- Number of people in the room
- CO2 level
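As a minimal sketch, a single condition reading from one of these rooms could be represented as follows. The `RoomConditions` class and its field names are our own illustration, not the actual data model used in the pilot:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class RoomConditions:
    """One timestamped condition reading from a single room (illustrative model)."""
    room_id: str            # e.g. "A2105"
    timestamp: datetime     # time of the observation
    temperature_c: float    # indoor temperature in degrees Celsius
    person_count: int       # number of people detected in the room
    co2_ppm: float          # CO2 concentration in parts per million

# Example reading (values are made up for illustration)
reading = RoomConditions("A2105", datetime(2021, 1, 12, 10, 0), 21.4, 18, 650.0)
```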
With this data set, we are able to construct a live view of the current conditions in the smart building, for instance the temperature. On that note, Ypsilon has both thermal energy and solar panels installed:

Therefore, the infrastructure of the building is very energy efficient. The data scientists of our joint demo project found interesting phenomena in the sensor data related to indoor temperature:

According to the sensor data, the temperature seems to vary between 19 and 23.5 degrees. There are peaks in the data set for each weekday in January. In spot #1, the week has only four peaks, whereas another week in January, in spot #2, has five peaks. A peak in the temperature occurs when there are people in the room. During the first week of January there were only four school days due to holidays, which shows up as four peaks. Week #2 in January had five school days.
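As a rough sketch of how such weekday peaks could be counted from the sensor data, assuming the readings are loaded into a pandas DataFrame with a timestamp index and a `temperature` column (the 22-degree threshold is our own illustrative choice, not a value from the project):

```python
import pandas as pd

def count_daily_peaks(df: pd.DataFrame, threshold: float = 22.0) -> pd.Series:
    """Count, per ISO week, how many days the daily maximum temperature
    exceeds the threshold (roughly one peak per school day)."""
    daily_max = df["temperature"].resample("D").max()
    peak_days = daily_max[daily_max > threshold]
    return peak_days.groupby(peak_days.index.isocalendar().week).size()

# df = pd.read_csv("room_A2105.csv", parse_dates=["timestamp"], index_col="timestamp")
# print(count_daily_peaks(df))  # e.g. week 1 -> 4 peaks, week 2 -> 5 peaks
```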
But what could explain the differences in temperature between the second floor (room A2105) and the third floor (rooms C3032 and C3060)? At the moment, we don’t know. We have two hypotheses. It is warmer on the third floor than on the second floor because:
- warm air rises up
- the windows on the third floor face east, whereas the window on the second floor faces north
At the moment, we have no evidence for confirming or rejecting either of the hypotheses. Whatever the reason, the temperature could sometimes be a bit lower. A common target value for comfortable indoor air temperature is 21 degrees according to, for instance, a guide by the City of Helsinki. Perhaps some energy could be saved with a more balanced solution?
In the demo, we combined weather API data available from the Finnish Meteorological Institute with the sensor data set:

In this sample high-level architectural diagram, several API data sources are shown on the left: weather data (FMI), traffic data (Föli) and sensor data. The data is loaded into the Azure IoT Hub. The data is then aggregated into the hot path, i.e., the real-time outbound API for data to be shown in the user interface. To implement the hot path in the demo setting, we used hard-coded Excel files stored inside the M-Files document repository. However, the Azure components presented in the architectural diagram (e.g., the Event Hub and Blob Storage) could be used to create a real-world API solution. We made use of an Azure Developer portal at https://businessfinland-4apis.portal.azure-api.net/:

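As a hedged sketch of how one sensor observation could be pushed into the Azure IoT Hub in a real-world version of this pipeline (the connection string and the payload fields below are placeholders; the demo itself used hard-coded Excel files instead):

```python
import json
from azure.iot.device import IoTHubDeviceClient, Message

# Placeholder connection string; a real one comes from the IoT Hub device registry.
CONNECTION_STRING = "HostName=<hub>.azure-devices.net;DeviceId=<device>;SharedAccessKey=<key>"

def send_observation(temperature_c: float, co2_ppm: float, person_count: int) -> None:
    """Send a single sensor observation to the IoT Hub as a JSON message."""
    client = IoTHubDeviceClient.create_from_connection_string(CONNECTION_STRING)
    payload = {
        "siteId": 2105,                # illustrative room identifier
        "temperature": temperature_c,
        "co2": co2_ppm,
        "personCount": person_count,
    }
    client.send_message(Message(json.dumps(payload)))
    client.shutdown()

# send_observation(21.4, 650.0, 18)
```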
The user interface then calls the API endpoint and gets a response:

The response contains timestamped observations from a single room (with siteId 2105), with sensor category “Tuloilma” (row 24) and sensor attribute “Ilmankosteus” (row 25), which together mean “incoming air humidity”, with value 32.9 and unit “%”. In other words, in the middle of the night, the humidity was at a good level.
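As a minimal sketch of how a client could call such an endpoint and pick out the humidity observation (the resource path, the subscription-key usage, and the JSON field names are our own assumptions based on the demo description, not the published API contract):

```python
import requests

# Hypothetical endpoint and key; the real ones sit behind the 4APIs developer portal.
BASE_URL = "https://businessfinland-4apis.portal.azure-api.net"
API_KEY = "<subscription-key>"

def latest_humidity(site_id: int = 2105) -> float:
    """Fetch observations for one room and return the latest incoming-air humidity (%)."""
    response = requests.get(
        f"{BASE_URL}/observations",                      # assumed resource path
        params={"siteId": site_id},
        headers={"Ocp-Apim-Subscription-Key": API_KEY},  # typical Azure API Management header
        timeout=10,
    )
    response.raise_for_status()
    observations = response.json()
    # Keep only incoming-air ("Tuloilma") humidity ("Ilmankosteus") readings.
    humidity = [
        o for o in observations
        if o.get("category") == "Tuloilma" and o.get("attribute") == "Ilmankosteus"
    ]
    return humidity[-1]["value"]  # e.g. 32.9 (%)
```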
The user interface then shows the JSON data and the BIM with embedded visual data values inside the room:

With the aggregated weather data we formed the following hypothesis: when it is cloudy, the indoor temperature is lower. We made a very quick graph of temperature and cloudiness:

A very quick interpretation based on the graph is that when it is cloudy, the temperature is actually higher by 0.5 degrees. However, shouldn’t it be the other way round, i.e., warmer when it is sunny? Further investigation would be needed to draw statistically significant inferences about the phenomenon. A quick conclusion based on this interesting initial finding is that sensor data and BIM could provide a basis for phenomenon-based learning. Teachers and pupils could examine a data set together in a very familiar and tangible environment: the classroom itself.
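The quick graph behind this interpretation could be reproduced roughly as follows. This is a sketch assuming the aggregated data sits in a CSV with `timestamp`, `indoor_temperature`, and `cloud_cover` columns; the column names, the file name, and the cloudiness threshold are our own assumptions:

```python
import pandas as pd
import matplotlib.pyplot as plt

# Assumed combined data set of indoor temperature and FMI cloud cover (0..8 oktas).
df = pd.read_csv("ypsilon_weather_joined.csv", parse_dates=["timestamp"], index_col="timestamp")

# Label each observation as cloudy or clear (threshold chosen for illustration).
df["cloudy"] = df["cloud_cover"] >= 6

# Compare mean indoor temperature under cloudy vs clear conditions.
print(df.groupby("cloudy")["indoor_temperature"].mean())

# Plot both series for a visual check.
fig, ax1 = plt.subplots()
ax1.plot(df.index, df["indoor_temperature"], label="Indoor temperature (°C)")
ax2 = ax1.twinx()
ax2.plot(df.index, df["cloud_cover"], color="grey", alpha=0.5, label="Cloud cover (oktas)")
ax1.set_xlabel("Time")
ax1.set_ylabel("Indoor temperature (°C)")
ax2.set_ylabel("Cloud cover (oktas)")
plt.show()
```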
To illustrate the data collection possibilities of the demo, let’s go back to the kick-off meeting, where we had the opportunity to take a guided tour around the building. We saw novel classrooms with the latest advances in teaching technology, for instance interactive screens and classrooms without corridors:

When we entered the classroom on the third floor, we didn’t know that there is a sensor next to the door counting people going in and out! During our ten-minute visit, these three data rows were recorded by the Smart Building:

The size of our group at the kick-off was approximately 10 people, so the measured value (9) could be right. However, we didn’t see anyone else there, so the other measured value (16) could contain some error. Perhaps the counter is counting people passing through one classroom on their way to the other classrooms, for instance. Still, the data set is very useful even with some possible errors. We have to remember that the project at hand is a pilot in a very early phase of moving towards a Smart Building. Nevertheless, it could be beneficial to install sensors in every room in the building to make the data set more coherent.
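A back-of-the-envelope way to sanity-check such in/out counter rows is to keep a running occupancy estimate; if it ever goes negative, some entries or exits were miscounted. This is a sketch with made-up rows, not the actual data recorded during the visit:

```python
# Made-up counter rows: (timestamp, people_in, people_out)
rows = [
    ("10:05", 9, 0),    # group enters the classroom
    ("10:15", 0, 16),   # counter registers more exits than entries so far
]

occupancy = 0
for timestamp, people_in, people_out in rows:
    occupancy += people_in - people_out
    flag = "  <- suspicious: occupancy cannot be negative" if occupancy < 0 else ""
    print(f"{timestamp}: estimated occupancy = {occupancy}{flag}")
```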
Or perhaps, in the future, it could be possible to install sensors across the whole city of Turku:

In order to create an accurate digital twin of the whole built environment, sensor data and models would be needed at large scale. The sensor data of the whole city could be collected into a cloud data warehouse, for instance. We already have a lot of BIMs and data available regarding the surrounding infrastructure:

In their article GeoDesign: Concept for Integration of BIM and GIS in Landscape Planning, Schaller et al. demonstrate possibilities for wider usage of BIMs and geographic information systems (GIS). Moreover, the BIM research group at Helsinki University has recently organized a workshop titled When Social Science meets Lean and BIM, which seems to be a great multidisciplinary direction for research.
But before we advance any further into broader use cases in the context of the digitalization of society with BIMs and data, we’ll have to evaluate the value creation possibilities of combining BIM and data in Ypsilon. We showed the demo with BIM and data to a few potential users as part of our qualitative user interviews. Is there potential user value here, for whom, and which needs would it answer? More on that topic in upcoming posts!