Using GraphQL Faker with portcall and pilotage order data

Author: Janne Nissinen, a developer at Solita

Solita’s API roadmap contained a task to create a proof of concept for aggregating two data sources together. Both of these APIs contained valuable information about ship traffic, and an aggregated data endpoint could provide additional value to their users.

GraphQL and its faker plugin were used to create several prototypes of the data structure with a minimal amount of actual data merging. GraphQL Faker allows users to host local endpoints for development purposes. Furthermore, it allows its users to extend pre-existing GraphQL services and modify their data structure to suit their needs.

Figure 1 – GraphQL Faker editor in browser

For the given task, each individual API’s data structure was studied, and similar data nodes were given a specific type, which represents a data structure in GraphQL. For example, Port is a type containing two string fields, code and name, which resemble the actual port values. It also contains a nested type, Location, which provides location data for the parent type. By using the faker, no aggregation is needed to produce actual values, since the faker allows users to pass example data in, which is then returned randomly when a port is queried.
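As a sketch of the idea, the Port type described above can be mimicked in a few lines of Python: example values are registered per field, and a random one is returned on each query, just as the faker does. The port codes, names and coordinate ranges below are hypothetical illustration values, not data from the real APIs.

```python
import random

# Structure being mimicked (field names from the text; examples are made up):
#
# type Port {
#   code: String      # e.g. @examples(values: ["FIHEL", "FITKU"])
#   name: String
#   location: Location
# }
# type Location { latitude: Float, longitude: Float }

PORT_EXAMPLES = {
    "code": ["FIHEL", "FITKU", "FIRAU"],
    "name": ["Helsinki", "Turku", "Rauma"],
}

def fake_port():
    """Mimic graphql-faker: return a Port with randomly chosen example values."""
    return {
        "code": random.choice(PORT_EXAMPLES["code"]),
        "name": random.choice(PORT_EXAMPLES["name"]),
        "location": {
            # Rough bounding box around Finland, purely for illustration.
            "latitude": round(random.uniform(59.0, 66.0), 5),
            "longitude": round(random.uniform(20.0, 30.0), 5),
        },
    }
```

Each call returns a plausible-looking Port, which is exactly what makes structure-first prototyping possible before any real aggregation exists.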

Figure 2 – Port type queried, faked data returned

As a proof of concept, data aggregation and data-structure-based prototyping is feasible and fast using GraphQL and its faker plugin. For further iterations, both existing APIs need to be studied, and the type structure needs to be honed further to provide actual value to users.


Utilisation of 3D BIM and API for moisture risk management during construction

Indoor air problems are unfortunately common even in new buildings. These problems may be a result of poor structural designs or misuse, but more commonly they occur due to mistakes during construction.

A wall panel and a couple of hollow slabs waiting for assembly on site.

Structural engineers are familiar with the problematic designs. Consequently, with active review processes and the utilisation of BIM (building information model), these can be handled. But what about the mistakes made during construction?

New modular methods are reducing the problems with construction, as more and more construction phases are carried out indoors, protected from environmental elements. However, not all buildings can be built modularly, and even with modules some of the work needs to be done on site, for example the final assembly. Transporting the modules and panels from the factory to the site is always a risk. Delays in the assembly are another significant risk, since they mean that the panels are stored temporarily on site to wait for their assembly. Consequently, the panels may be exposed to harsh and difficult conditions for extended periods. Modules are typically protected at the factory with a plastic film. However, there is always a risk that the protective plastic film is damaged during transportation or removed too early, as happened in the Wood City case.

Unprotected wall panel waiting for assembly.

Incompletely protected panels, and panels and modules with damaged protection, are at risk of getting damp before assembly, which can later lead to indoor air problems.

How new technologies may help in moisture risk management

Utilising IoT sensors and 3D BIM models makes it possible to measure temperatures and relative humidity in real time. Could construction-related problems be avoided, or at least noticed early enough to fix them without excessive dismantling of the building, if sensors were used? Adding humidity and temperature sensors to modules and panels in the factory and monitoring the values in real time during construction would give a warning in case a structure became damp. This way a drying process could be started before actual microbe damage formed. Adding the sensor data into the 3D BIM model via an API (application programming interface) makes monitoring easy and possible without visiting the site on a daily basis.
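The warning logic described above could be sketched as a simple threshold check over per-panel sensor readings. The 80 % relative humidity alert level and the reading format below are assumed example values, not actual building physics limits.

```python
# Assumed alert threshold for illustration: relative humidity (%) above
# which a panel may be getting damp and a drying process should start.
ALERT_RH = 80.0

def damp_panels(readings):
    """Return ids of panels whose latest relative humidity exceeds the threshold.

    `readings` maps a panel id to a time-ordered list of
    (timestamp, rh_percent) tuples from that panel's sensor.
    """
    alerts = []
    for panel_id, series in readings.items():
        # Only the most recent reading matters for the real-time warning.
        if series and series[-1][1] > ALERT_RH:
            alerts.append(panel_id)
    return alerts
```

In a real deployment this check would run against the sensor feed in the cloud and push the flagged panels into the 3D BIM view for the site manager.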

Humidity values in the 3D BIM model make it noticeable that one panel might be damp.

In the case of concrete structures, it is crucial that the relative humidity of a concrete slab is low enough before progressing to the finalising phase. In addition to the humidity sensors, temperature sensors would be beneficial for monitoring the conditions and optimising the temperature and ventilation for the drying of the concrete. This method might not replace the current measurements based on test pieces, but at least the sensor measurements would give a possibility to react to unwanted changes in drying conditions and to prevent wasting several weeks of drying time due to poor drying conditions. Consequently, an optimal drying process, better quality, and fewer delays in construction would be achieved.
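The slab-drying decision could be sketched as a check that the relative humidity has stayed below a limit for several consecutive readings. Both the 85 % limit and the three-readings window below are assumed illustration values, not actual requirements for any finishing material.

```python
def ready_to_surface(rh_readings, limit=85.0, window=3):
    """True if the last `window` relative-humidity readings are all below `limit`.

    `rh_readings` is a time-ordered list of RH percentages from a slab sensor.
    Requiring several consecutive readings below the limit guards against a
    single low reading caused by a momentary change in drying conditions.
    """
    if len(rh_readings) < window:
        return False
    return all(rh < limit for rh in rh_readings[-window:])
```

A check like this, running continuously on the sensor feed, is what would let the site react to a stalled drying process in days rather than discovering it weeks later from a test piece.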

Could the floor be surfaced earlier if drying conditions were measured in real time?

In the 4APIs project demo we have integrated a 3D BIM model and IoT sensor data via APIs. Our case building was the Ypsilon Community Centre / School in Yli-Maaria, located in Turku, Finland. In this demo case, the sensors were installed in a finished building and measured temperatures, relative humidity, and incoming and outgoing air flow in real time.

Interestingly, the demo case raised questions and some discussion about whether real-time is actually real-time. As Teemu Mikkonen wrote in his blog post “Real-time data with cloud platform”, there were noticeable delays in the data from the sensor to the actual services. However, in the case of moisture risk management during construction, a delay of several minutes still does not cause problems, as the time scale, for example for concrete slab drying, is weeks. Even with delays the benefits are remarkable, since unsuitable changes in drying conditions or moisture in structures can be noticed almost in real time. This gives constructors and supervising authorities the possibility to react to humidity-related problems earlier.

In new buildings, sensors may be added in the early phases of construction. In such cases, some of the construction-time sensors would also be usable for monitoring later during the building’s use. As a result, a complete measurement history of the building could be collected and utilised in indoor air quality investigations.


Monetizing an API has little to do with technology

As a human-centered business designer, I was privileged to facilitate a business modelling workshop for the 4APIs project in May. The project team spent the morning creating hypotheses on how to scale and monetize a case study called Ypsilon. The case study is the Ypsilon Community Centre, a multi-purpose building located in Yli-Maaria, Turku. It has facilities for education, a library, youth services, school health care and child welfare clinics.

Our main questions to answer were typical of any business model design.

  1. Who do we create value for?
  2. What is the value proposition (the problem we offer to solve)?
  3. What is the revenue model?
  4. How do we deliver and capture value?

Who? What? Value? How?

In most small groups, the discussion on target segments revolved around building owners (B2B) and house owners (B2C). Building manufacturers and insurance companies were also seen as potential customers.

As a human-centric business designer, I always emphasise the importance of being aware of human behaviour and the driving motivations around the problem the concept is meant to solve. Is your API solving a business-critical problem or not? In my own small group we clarified three value propositions for house owners.

  1. Economical value: preventing the risk of serious water damage or moisture.
  2. Emotional value: carefree living for house owners.
  3. Functional value: a straightforward service model with third parties.

Even a great value proposition isn’t enough if the pricing or purchasing model is in conflict with the customer’s way of purchasing these kinds of services. In B2B, the typical failure would be failing to adapt to customers’ tendering models and siloed responsibilities. The model should be somehow familiar to the customer, as every business is ultimately a combination of the 55 known business model patterns.

Coming up with a potential concept is rather easy. Success is more about hard sales work and the right timing than the latest technology. At the heart of monetization is finding the right super moments when a customer is most likely to put in the effort needed for a purchase. Or even better, finding a way to serve the customer without him/her having to put in any effort at all to purchase.

To me as a house owner, the winning offer would be a combination of all three value propositions, with the emotional side definitely at the highest priority. If it required more effort than opening a door once, I’d remain in the potential customer category. This leads to the issue of trust. Who do I trust enough to grant access to my water consumption data and let them know when I’m on vacation?


Real-time data with cloud platform

The previous blog post, “Using BIM and API data to demonstrate opportunities in Ypsilon”, presented a look at what is being researched as a real-time data architecture in the 4APIs project. We can see that there are multiple sensor data sources that provide real-time data about the conditions in the rooms. In order to effectively utilize the opportunities of this continuous IoT data, we must build a sufficient architecture to support it.

As our approach, we have chosen to research and develop a cloud-platform-based solution to capture, store, and provide an output for the data provided by the Ypsilon sensors. More specifically, we have chosen the Microsoft Azure cloud platform for the initial proof of concept (PoC). The components provided by public cloud providers such as Azure are well suited for various data-related tasks. Furthermore, cloud platforms provide scalability and reliability, which are both increasingly important in the future, with a growing number of data sources and applications that are dependent on said data.

We base our PoC architecture on the Azure IoT reference architecture as well as our tests with dummy data simulated by a Python function. So far, we have made experiments that are meant to build the first draft of our PoC. The PoC is developed with the idea of making small incremental changes and monitoring the data flow in each component that is added. The data flow in the PoC architecture begins by capturing Ypsilon sensor data with Azure IoT Hub. The data is stored in Azure Blob Storage, and the data stream is divided into a hot path and a cold path.
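A dummy-data simulator of the kind mentioned above could look like the following sketch. The message fields (deviceId, temperature, humidity) and their value ranges are assumptions for illustration; a real sender would pass the JSON to the Azure IoT device SDK rather than just returning it.

```python
import json
import random
from datetime import datetime, timezone

def simulate_message(device_id="ypsilon-sim-01"):
    """Build one dummy telemetry message resembling a Ypsilon sensor reading.

    Returns the message as a JSON string, ready to be sent to IoT Hub.
    """
    payload = {
        "deviceId": device_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        # Value ranges are made up to roughly match indoor conditions.
        "temperature": round(random.uniform(19.0, 23.5), 1),
        "humidity": round(random.uniform(25.0, 45.0), 1),
    }
    return json.dumps(payload)
```

Generating messages like this makes it possible to verify the flow through IoT Hub and Blob Storage one component at a time, before any real sensors are connected.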

PoC-test architecture

The hot path is used to stream the data to an outbound API in order to provide access to the Ypsilon data in real time. The first draft of the hot path will use an Azure Function that reads the data from Blob Storage. The cold path is for storing the IoT data in a database so that it may be used for aggregation and analysis. We have chosen Snowflake as our PoC database. Storing the data in the cold path enables us to apply the data to reports, dashboards and even machine learning purposes.

We still have many questions that need to be answered. One of the most important ones is “What do we mean by real-time?” Our initial tests showed that simulated sensor data took roughly 90 seconds to reach Blob Storage, on top of which some additional latency is to be expected from the outbound API. The overall latency can be brought down by utilizing different Azure components such as Stream Analytics, but as the service gets faster, the costs also increase. We continue our tests with Azure components in order to find a reliable and fast solution that enables real-time data as well as analytics with historical data.
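The roughly 90-second figure above can be measured by comparing timestamps, assuming each record carries both the device-side timestamp and the time it was observed in Blob Storage. A minimal sketch:

```python
from datetime import datetime

def latency_seconds(device_ts, arrival_ts):
    """End-to-end latency between two ISO 8601 timestamps, in seconds.

    `device_ts` is when the device produced the reading; `arrival_ts` is
    when the record was observed downstream (e.g. in Blob Storage).
    """
    sent = datetime.fromisoformat(device_ts)
    arrived = datetime.fromisoformat(arrival_ts)
    return (arrived - sent).total_seconds()
```

Logging this value per message gives a latency distribution rather than a single number, which is what a "what do we mean by real-time" decision should be based on.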


BIM and data as such are not valuable

The world is constantly filled with new innovations that embody little to no understanding of the very people the product or service is built for. They fail fast. It is common knowledge that most new products don’t make it through their first year on the market. The “users” of the product or service have likely been given some thought, but too often only in the form of hypotheses from thin air, or by reflecting mainly the needs and assumptions of the team building the product and then projecting them onto the market. Or user insight is built only by looking at people from afar, through spreadsheets and quantitative abstractions lacking tangible understanding of what really drives us and our perception of value.

The first question of any innovation work should be: how does this thing of ours relate to and produce anything of value for people and societies? What kind of needs does it answer to, what kind of human and cultural practices, functions and meaning should it be part of, which actual problems might it solve for actual people and how? How does our product make life better?

Suffice it to say, this is something we need to think hard about in the 4APIs project as well. As such, our BIM and data have no value. They become valuable only through performing functions and holding meaning perceived as valuable by people and institutions. Can they make life better? For whom? How?

With these questions in mind, we began meeting potential target groups last month. We have now engaged with different stakeholders and potential users of our BIM and data, covering roles from energy operations to school management and real estate services. The work has only just started, but here are a few things we have learned so far.

Could it save what’s scarce? Could it make us more safe?

The general perception of possibilities among our interviewees was mainly very positive. We found this perception of potential value to be based especially on the promises of efficacy, safety and wellbeing.

Illustration by Maria Niemi (Solita)

By ‘efficacy’ I refer here to the precise allocation of scarce resources, ranging from money, energy and the environment to space, time and attention. Data combined with BIM holds a lot of promise as a tool for better understanding very context-specific conditions, their variation over time and also anomalies we might otherwise have difficulties perceiving. The possibilities especially with real-time data, predictive modelling and machine learning enhance the excitement and the feeling of novel possibilities. One key aspect of efficacy would also be the integration of data from several currently fragmented information systems into one real-time API.

The themes of ‘safety’ and ‘wellbeing’ were raised especially in the context of a very special worry, even a public trauma of sorts, regarding some of our public spaces. Throughout recent decades there has been growing concern over the quality of indoor air, especially in schools. If we could collect more precise and rich data about the environments our kids spend their days in, this could have a reassuring function. And when the data indicates problems in conditions, especially problems difficult for human senses to perceive directly (e.g. related to correct humidity levels), proper actions could be taken in time. Another concrete example given relates to acute crisis situations, e.g. fires, when real-time data could be used to monitor people flow, helping get everyone safely out of the building.

Two takes on your world: data and BIM as aids of perception

Other potentially valuable functions included the use of BIM and data as pedagogical tools and as examples of smart tech relating to the very meaningful and tangible environment of everyday life for pupils and teachers alike. What holds promise here is the possibility of combining two types of information: first, the subjective and sensory information we humans directly generate by observing and sensing our environment, and then the objective and otherwise unobservable conditions provided by data and BIM. The subjective experience gives meaning to the environment and its changing conditions. It is then cast in a complementary light by data on objective conditions like temperature or humidity level. Together they show how the subjective and the objective relate and might differ. And by helping perceive the effects of even minor adjustments and optimizations in conditions and behavior alike, data and BIM could also be used to promote, for example, environmental awareness and positive behavior change.

This combination of the physical, tangible and sensory information provided directly by our bodies and the objective information provided by data is nicely mediated by the BIM. The 3D model makes data-measured conditions easy to perceive, understand and interpret in relation to the actual environment.

The building information model and reality. A virtual reality illustration of the Ypsilon demo.

This was also brought up in the context of building maintenance, where BIM and data would provide actual help when one is not physically present to perceive the environment (enabling one to remotely and holistically grasp the conditions of the whole building quickly), or when one is physically present but now also able to perceive and locate the more objective, data-informed conditions and their temporal variation (history, future predictions).

From data to information, from information to action

Data as such is not information; it doesn’t inherently lead to understanding, let alone the required actions. Actually, it rarely does. This is something we painfully know from encountering organizations that equate more data with better decisions. Interpretation can be tricky. Easy-to-grasp communication, intuitive visualization and contextualization of data are usually important factors. “No one is interested in my excels! … But when I have nice visualizations to show…”, as one interviewee remarked.

This need for easy interpretation holds especially true for multipurpose premises with modifiable spaces like the Ypsilon building. Ypsilon’s (and similar sites’) data has a broad spectrum of potential user groups ranging from building maintenance and its partners to energy authorities and more site-specific user groups like service coordinators, teachers, pupils and citizens in the area. Therefore, we need to help different users read, contextualize and understand the data as “with new data, there’s always the possibility of misinterpretation.”

But even this is not enough. After data turns into understanding, understanding still needs to turn into action. For this we need to consider organizational drivers of action and work to incentivize and structurally motivate people’s behavior. Otherwise, the possibilities for efficacy, safety and wellbeing will remain only latent.


Antti is a sociologist and ethnographer at Solita Design & Strategy. For the past 10+ years he has worked with applied social science in service design, organizations, brands and marketing.

Antti’s Twitter and LinkedIn

4APIs’ Twitter


Using BIM and API data to demonstrate opportunities in Ypsilon

In the joint demo of the 4APIs project, we combined a Building Information Model (BIM) with API data. Here we have a BIM:

A Building Information Model (BIM) in action. Click to browse it yourself!

We make use of the Vertex Showroom product, which is able to show any BIM in a web browser, even on mobile. Please take a look at the interactive version of the BIM here!

The BIM mimics the reality rather well. The following images are from the Ypsilon Community Centre / School in Yli-Maaria located in Turku, Finland:

A classroom for domestic science at the Yli-Maaria school.

The photo was taken in January, when we had our demo project kick-off in the Ypsilon smart building. The pilot is executed by the space management center (Tilapalvelukeskus) and the education sector of the city of Turku. In the pilot, they are collecting experiences of using BIM and sensor data.

The BIM was created already in the planning and construction phase of the building a few years ago. It was found very useful during the building phase, according to Projektiuutiset in 2019.

Sensor data is being collected from a few rooms inside the building. Let’s take a look at the Ypsilon community center from a bird’s-eye perspective:

A view of the Yli-Maaria community center building from an altitude of 500 meters, observed from Google Earth. Floor map on the right.

When looked at from above, the building has the shape of the letter Y, which matches the name of the building, Ypsilon. In the Y-shaped floor map, you can see sensors in three rooms providing real-time condition data, including, for instance, the following variables:

  • Temperature
  • Number of people in the room
  • CO2 level

With the given data set, we are able to construct a live view of the current conditions of the smart building, for instance the temperature. Related to temperature, Ypsilon has both thermal energy and solar panels installed:

The thermal energy and solar panels illustrated on top of the BIM.

Therefore, the infrastructure of the building is very energy efficient. The data scientists of our joint demo project found an interesting phenomenon in the sensor data related to indoor temperature:

Infograph of the temperature in the three rooms with sensors.

According to the sensor data, the temperature seems to vary between 19 and 23.5 degrees. There are peaks in the data set for each weekday in January. In spot #1, the week has only four peaks, whereas another week in January, in spot #2, has five peaks. A peak in the temperature occurs when there are people in the room. During the first week of January there were only four school days due to holidays, which can be seen as four peaks. Week #2 in January had five school days.
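The peak counting behind this observation can be sketched simply: a "peak" is a contiguous run of readings above a baseline temperature. Both the 21-degree baseline and the sample series below are assumed illustration values, not the actual Ypsilon data.

```python
def count_peaks(temps, baseline=21.0):
    """Count contiguous runs of temperatures above `baseline`.

    `temps` is a time-ordered list of readings; each run of consecutive
    values above the baseline counts as one peak (one occupied period).
    """
    peaks = 0
    in_peak = False
    for t in temps:
        if t > baseline and not in_peak:
            peaks += 1       # a new peak starts
            in_peak = True
        elif t <= baseline:
            in_peak = False  # the peak has ended
    return peaks
```

Applied per week, a count of four versus five peaks would then map directly onto the number of school days, as described above.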

But what could explain the differences in temperature between the second floor (room A2105) and the third floor (rooms C3032 and C3060)? At the moment, we don’t know. We have two hypotheses. It is warmer on the third floor than on the second floor because:

  • warm air rises
  • the windows on the third floor point towards the east, whereas the window on the second floor points towards the north

At the moment, we have no evidence to accept or reject either of the hypotheses. Regardless of the reason, the temperature could sometimes be a bit lower. A common target value for a comfortable indoor air temperature is 21 degrees according to, for instance, a guide by the city of Helsinki. Perhaps some energy could be saved with a more balanced solution?

In the demo, we combined weather API data available from the Finnish Meteorological Institute with the sensor data set:

Overview of the cloud architecture (Azure), with three data sources on the left and data flowing to the hot path (live conditions) and the cold path (long-term data storage for, e.g., machine learning purposes).

In this sample high-level architectural diagram, several API data sources are shown on the left: weather data (FMI), traffic data (Föli) and sensor data. The data is loaded into the Azure IoT Hub. The data is then aggregated into the hot path, i.e., the real-time outbound API for the data to be shown in the user interface. To implement the hot path in the demo setting, we used hard-coded Excel files stored inside the M-Files document repository. However, the Azure components presented in the architectural diagram (e.g., the Event Hub and Blob Storage) could be used to create a real-world API solution. We also made use of an Azure developer portal.

The Azure developer portal enables subscribing to APIs and trying them out.

The user interface then calls the API endpoint and gets a response:

The JSON response of /api/measurements/2105

The response contains timestamped observations from a single room (with siteId 2105), with sensor category “Tuloilma” (row 24) and sensor attribute “Ilmankosteus” (row 25), which together mean “incoming air humidity”, with a value of 32.9 and the unit “%”. In other words, in the middle of the night, the humidity was at a good level.
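Since the full JSON schema of /api/measurements/2105 is not shown here, the shape below is an assumption for illustration, built only from the fields mentioned in the text (siteId, timestamp, category, attribute, value, unit). A consumer of the API could then filter the observations like this:

```python
# Assumed response shape; field names and the sample record are illustrative.
SAMPLE_RESPONSE = [
    {
        "siteId": 2105,
        "timestamp": "2021-02-15T02:00:00Z",
        "category": "Tuloilma",       # supply air
        "attribute": "Ilmankosteus",  # air humidity
        "value": 32.9,
        "unit": "%",
    }
]

def humidity_values(observations):
    """Pick out the incoming-air humidity values from a measurement response."""
    return [
        obs["value"]
        for obs in observations
        if obs["category"] == "Tuloilma" and obs["attribute"] == "Ilmankosteus"
    ]
```

This is essentially what the demo UI does before embedding the values into the BIM view.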

The user interface then shows the JSON data and the BIM with embedded visual data values inside the room:

The demo UI sketch with sensor data values illustrated inside the BIM. The raw data in table.

With the aggregated weather data, we formed the following hypothesis: when it is cloudy, the indoor temperature is lower. We made a very quick graph of temperature and cloudiness:

Infograph of indoor temperature and cloudiness (on a scale of 0–8)

A very quick interpretation based on the graph is that when it is cloudy, the temperature is actually higher by 0.5 degrees. However, shouldn’t it be the other way around, i.e., warmer when it is sunny? Further investigation would be needed to make statistically significant inferences about the phenomenon. A quick conclusion based on this interesting initial finding is that sensor data and BIM could provide a basis for phenomenon-based learning. Teachers and pupils could observe a data set together concerning a very familiar and tangible environment: the classroom itself.
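A further investigation could start with something as simple as the Pearson correlation between the indoor temperature and the cloudiness (on the 0–8 scale). The sketch below implements the standard formula; any series fed into it here would be made-up illustration values, not the actual Ypsilon measurements.

```python
def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # Covariance term and the two standard-deviation terms (unnormalized).
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)
```

A positive coefficient between temperature and cloudiness would support the surprising reading of the graph; establishing significance would additionally require a proper test on enough data.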

To illustrate the data collection possibilities of the demo, let’s go back to the kick-off meeting, where we had the opportunity to take a guided tour around the building. We saw novel classrooms with the latest advances in teaching technology, for instance interactive screens and classrooms without corridors:

A classroom on the third floor of Ypsilon. In a school without corridors, the classroom at the back is entered through the classroom at the front of the image.

When we entered the classroom on the third floor, we didn’t know that a sensor counting people going in and out was located at the sides of the door! During our ten-minute visit, these three data rows were recorded by the smart building:

Sample sensor data from the classroom on the third floor.

The size of our group at the kick-off was approximately 10 people, so the measured value (9) could be right. However, we didn’t see anyone else there, so the other measured value (16) could have some error in it. Perhaps the counter is counting people passing through one classroom on their way to the other classrooms, for instance. However, the data set is very useful even with some possible errors. We have to remember that the project at hand is a pilot project in a very early phase of moving towards a smart building. Still, it could be beneficial to install sensors in every room of the building to make the data set more coherent.

Or perhaps, in the future, it could be possible to install sensors throughout the whole city of Turku:

Turku city in 3D. Source: Google Earth.

In order to create an accurate digital twin of the whole built environment, sensor data and models would be needed on a large scale. The sensor data of the whole city could be collected into a cloud data warehouse, for instance. We already have a lot of BIMs and data available regarding the surrounding infrastructure:

A combination of BIMs and data can create a digital twin. Source of the image: Schaller, Joerg, et al. “GeoDesign: Concept for Integration of BIM and GIS in Landscape Planning.” J. Digit. Landsc. Archit 2 (2017): 102–112.

In their article “GeoDesign: Concept for Integration of BIM and GIS in Landscape Planning”, Schaller et al. demonstrate possibilities for the wider usage of BIMs and geographic information systems (GIS). Moreover, the BIM research group at the University of Helsinki has recently been organizing a workshop with the title “When Social Science meets Lean and BIM”, which seems to be a great multidisciplinary direction for research.

But before we advance any further into broader use cases in the context of the digitalization of society with BIMs and data, we’ll have to evaluate the value creation possibilities of the combination of BIM and data in Ypsilon. We showed the demo with BIM and data to a few potential users as part of our qualitative user interviews. Is there potential user value here, for whom, and answering which needs? More on that topic in the upcoming posts!