APIs have become a crucial asset for many businesses

Written by Saeid Heshmatisafa, Tampere University

For decades, many companies have developed APIs as a means for their partners to share information and facilitate integration. In the past, integrating software products with business partners might have taken 12 to 18 months, while APIs not only accelerate the process but also allow many business partners to collaborate and interact within an ecosystem. Web APIs form modern middleware that provides access to any type of content, data, and other digital assets to build creative desktop, mobile, and web applications.

In today’s digital economy, the value of your assets will remain limited if they are isolated within an individual business ecosystem. The outside-in practice of open innovation has led many companies to look outside their organizational boundaries for the next novel ideas in order to expand their products and services and maximize the value of their technologies. Open APIs are one of many ways that enable companies to gain positive network effects while spotting emerging trends, developing new products and services, and participating in the digital economy. It can be argued that APIs are the components that enable various apps, systems, and platforms to connect and share data with partners and third parties to develop new API-consuming solutions. Such a strategy can be seen in the case of Visma’s e-sign: with its traditional UI product, the company had gained 180 customers. As it embraced risk and moved from using APIs primarily to support integrations towards accepting third-party users, the number of customers increased to 180,000, a thousandfold increase. Currently, the digital signature service has approximately 36,000 users. One of the most popular API directories lists more than 23,539 public APIs in its database. The average application today integrates 18 APIs, and 50% of B2B collaborations are powered through APIs. Most significantly, according to Akamai’s Tony Lauro, APIs are responsible for 83% of all web traffic. Google Maps, one of the earliest open APIs, can be found in many applications that include a geospatial aspect. For instance, Uber is a mobile application connecting taxi drivers (providers) to customers. Behind the scenes, however, Uber is an ecosystem of providers: Google Maps serves as the front-facing map interface, Stripe handles payments, Twilio handles communications, and many more services aid in building value chains of APIs.

It is evident that API technology has surpassed its original purpose as a “technical asset” and become a major force in the economy. A significant revenue stream of many incumbent companies is generated by their APIs, such as eBay (60 percent), Salesforce (50 percent), Expedia (90 percent), and Amazon Web Services (a full 100 percent). APIs not only open a new monetization stream from “shelved intellectual properties,” they also incentivize the emergence of API-first companies and the building of lucrative platform empires. For instance, Salesforce acquired MuleSoft, an API management vendor, for $6.5 billion. Another example is Visa’s $5.3 billion acquisition of Plaid, a startup specialized in developing APIs that connect payment apps such as Venmo to banking and other financial information.

APIs are not just another high-tech product; they are the next generation of the internet. Enterprises that take technological risks and embrace digital strategy are more likely to become the next digital transformation leaders. However, API strategies vary depending on the type of role a firm aims to play. In this regard, companies can experiment by exposing non-core assets through open APIs. Nonetheless, firms need to take several critical capabilities into account when designing an open API. First, the value exposed by the API must be unique and useful. Second, the API must be planned to address a well-defined need; exposing existing capabilities may require redesign and implementation from the perspective of potential users. Third, the API must be simple and flexible enough to meet user needs and styles of consumption. Fourth, it must ensure consistent accessibility and operation. Fifth, it must be supported and extended throughout its lifetime. API providers need to invest a great deal of effort in creating the ecosystem, supporting existing users, attracting new users, and evangelizing the API. Thus, APIs have become a crucial asset for many businesses.


Notes on blockchains

Written by Elina Kettunen (UH)

The best-known applications of blockchain technology are cryptocurrencies, but there is considerable interest in applying blockchains as a data storage method in various fields. Blockchains can be used to record transactions in a reliable, secure and immutable manner. Transactions are saved to linked blocks that form a digital, encrypted ledger. Each party, or node, in a blockchain maintains a copy of this immutable ledger. Consensus among the parties is achieved by using, for example, proof-of-work.
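The linked-block structure can be illustrated with a minimal sketch in Python (a toy model for illustration only; names such as `mine_block` are invented here). Each block stores the hash of its predecessor, so altering any historical block breaks every later link, and the toy proof-of-work searches for a nonce that makes the block hash start with a fixed number of zeros:

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Hash the block's contents deterministically."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def mine_block(prev_hash: str, transactions: list, difficulty: int = 2) -> dict:
    """Toy proof-of-work: find a nonce so the hash starts with `difficulty` zeros."""
    nonce = 0
    while True:
        block = {"prev_hash": prev_hash, "transactions": transactions, "nonce": nonce}
        if block_hash(block).startswith("0" * difficulty):
            return block
        nonce += 1

def chain_is_valid(chain: list) -> bool:
    """Each block must reference the hash of the block before it."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

# Build a three-block chain, then tamper with the middle block.
chain = [mine_block("0" * 64, ["genesis"])]
chain.append(mine_block(block_hash(chain[-1]), ["Alice pays Bob 5"]))
chain.append(mine_block(block_hash(chain[-1]), ["Bob pays Carol 2"]))
assert chain_is_valid(chain)

chain[1]["transactions"] = ["Alice pays Mallory 500"]  # tampering
assert not chain_is_valid(chain)
```

Rewriting history would require redoing the proof-of-work for the tampered block and every block after it, which is what makes large-scale tampering so expensive.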

Blockchains can be permissionless or permissioned. Permissionless, public blockchain systems allow anyone to join the blockchain, whereas permissioned, private blockchain systems use membership control to allow only identified parties to join. In public blockchains, anyone can read or write data, but while reading is free, writing to a blockchain requires paying a fee in cryptocurrency. The fee will be paid to a miner who first completes the proof-of-work to secure a new block containing the transaction data. In private blockchains, the owner of the blockchain can decide on the transaction fees. 

Blockchains can be used to eliminate the need for trust among the parties sending transactions to each other. All transactions are visible in the distributed ledger, and tampering with the transaction history would require a malicious party to take control of the majority (51%) of the blockchain network’s mining hash rate.

Ethereum [1] is the most popular permissionless blockchain that allows smart contracts to be written on the blockchain. Smart contracts consist of contract terms or business logic installed on the blockchain system. Parties in the blockchain can execute smart contracts to create transactions that are then validated by other parties and saved to the blockchain.
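As a rough illustration of the idea, the sketch below models a smart contract as business logic whose state transitions are appended to a shared ledger (pure Python, no real blockchain; the escrow scenario and all names are invented for this example):

```python
class EscrowContract:
    """Toy model of a smart contract: business logic whose state
    transitions are recorded as transactions on a shared ledger."""

    def __init__(self, ledger, buyer, seller, amount):
        self.ledger = ledger
        self.buyer, self.seller, self.amount = buyer, seller, amount
        self.state = "AWAITING_PAYMENT"

    def _record(self, action):
        # On a real chain, every node validates and stores this transaction.
        self.ledger.append({"action": action, "state": self.state})

    def deposit(self, party):
        if party == self.buyer and self.state == "AWAITING_PAYMENT":
            self.state = "AWAITING_DELIVERY"
            self._record("deposit")

    def confirm_delivery(self, party):
        if party == self.buyer and self.state == "AWAITING_DELIVERY":
            self.state = "COMPLETE"
            self._record("release_to_" + self.seller)

ledger = []
contract = EscrowContract(ledger, buyer="alice", seller="bob", amount=10)
contract.deposit("alice")
contract.confirm_delivery("alice")
print(contract.state, len(ledger))   # COMPLETE 2
```

On a real platform such as Ethereum, the contract code is deployed once and every node executes and validates each state transition independently before it is saved to the chain.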

There are several different platforms for building permissioned, private blockchains. Some of the most widely used are Hyperledger [2] and R3’s Corda [3]. Private blockchains are meant to allow saving sensitive data to a blockchain so that only selected parties are able to view it. However, it is also possible to save encrypted, private data to a public blockchain.

Since public blockchains use computationally more expensive consensus protocols and have more nodes, private blockchains can potentially offer better scalability and faster transactions. However, private blockchains are not truly decentralized and, for example, Hyperledger Fabric was found at least in 2019 to have issues with network delays causing desynchronization in the blockchain [4]. 

There has been a lot of interest in the possibilities of blockchain technology, and hopes of revolutions in many different areas such as finance and the Internet of Things (IoT). Blockchains can provide secure ways to manage confidential data and identity information, and thus have potential use cases in health care as well.

However, so far there are only a few fully operational blockchains besides systems related to cryptocurrencies, and Bitcoin [5] remains the most successful real-world application of blockchain technology.

Ethereum was the first blockchain to support the implementation of smart contracts, which enable building decentralized applications (dapps) on the Ethereum blockchain. There are various potential use cases for dapps and plenty of tutorials on dapp development available online. Despite this, most dapps have practically no users or transactions. On 9 June 2020, DappRadar [6] listed over 1,880 Ethereum dapps, but only 330 had had at least one user during the previous seven days. All of the top ten dapps (based on user count during the previous seven days), save one, were related to money exchange, high-risk investments and decentralized finance. There are some gaming applications on Ethereum (e.g. CryptoKitties, My Crypto Heroes), but most dapps appear to be related to finances and gambling.

In Deloitte’s Global Blockchain Survey 2019, 53% of organizations saw blockchains as critical and among their top five priorities [7]. In the same survey, one of the top five “organizational barriers to greater investment in blockchain technology” was the lack of in-house capabilities. As the need for blockchain professionals is likely to grow in the future, a project has been launched in Finland to provide education on blockchain technology in universities [8].

Supply chains are one area where the use of blockchain technology can potentially streamline the process and reduce paperwork besides creating a transparent, immutable record of the product history. However, Gartner’s report (2019) estimates that “contrary to initial market hype and for the time being, blockchain is not enabling a major digital business revolution, and may not enable one until at least 2028” [9]. This is due to several factors that currently make it challenging for organisations to adopt blockchain technology. 

In a blockchain system, there is overhead from replicating the data. For example, in Ethereum, users who host a full node need approximately 180 GB of disk space [10]. However, not all users need to download the full node, as there are also light nodes that store only the block headers and are able to request other information from full nodes. Light nodes can be used, for example, in mobile phones or embedded devices.
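The division of labour between full and light nodes can be sketched with a Merkle inclusion proof, the mechanism block headers commonly use to commit to a set of transactions (a simplified illustration; the function names are ours):

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Compute the Merkle root a block header would commit to."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:                 # duplicate last node on odd levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def merkle_proof(leaves, index):
    """A full node builds the sibling path for one transaction."""
    level = [h(leaf) for leaf in leaves]
    proof = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        sibling = index ^ 1
        proof.append((level[sibling], sibling < index))
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return proof

def verify(leaf, proof, root):
    """A light node checks the proof against the root in the header."""
    node = h(leaf)
    for sibling, is_left in proof:
        node = h(sibling + node) if is_left else h(node + sibling)
    return node == root

txs = [b"tx-a", b"tx-b", b"tx-c", b"tx-d"]
root = merkle_root(txs)
proof = merkle_proof(txs, 2)
assert verify(b"tx-c", proof, root)
assert not verify(b"tx-x", proof, root)
```

The light node stores only the 32-byte root from each header and checks a logarithmic-size proof supplied by a full node, instead of storing every transaction itself.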

For instance, Peker et al. (2020) studied the cost of saving IoT sensor data to the Ethereum blockchain. In their experiment, the cost of storing 6,000 data points (256-bit integers) was approximately 335–467 US dollars, depending on the method used [11]. According to other informal estimates, the cost of storing 1 kB of data on the Ethereum blockchain would have been approximately 1.6 US dollars in 2018, and storing 1 GB would have cost over 1.6 million US dollars. According to Kumar et al. (2020), “the cost of storage on a public blockchain platform can be staggering, a few thousand times higher than on a distributed database system or in the cloud. On a permissioned blockchain system, the cost is likely to be less but still one or two orders of magnitude higher.” [12]
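These orders of magnitude can be checked with a quick back-of-the-envelope calculation in Python (using the informal 2018 estimate of 1.6 USD/kB cited above, so the results are rough):

```python
# Back-of-the-envelope check of the storage cost figures cited above.
USD_PER_KB = 1.6                        # informal 2018 estimate for Ethereum storage

# 6,000 data points of 256-bit integers, as in Peker et al. (2020):
payload_kb = 6000 * 256 / 8 / 1000      # 256 bits = 32 bytes per point
cost_experiment = payload_kb * USD_PER_KB

# Scaling the same rate up to one gigabyte:
cost_per_gb = 1_000_000 * USD_PER_KB    # 1 GB = 1,000,000 kB

print(round(payload_kb), "kB ->", round(cost_experiment), "USD")   # 192 kB -> 307 USD
print(cost_per_gb, "USD per GB")
```

The raw-data cost of roughly 300 USD is of the same order as the 335–467 USD measured in the experiment (the remainder being transaction and smart-contract overhead), and scaling the per-kilobyte rate gives the cited 1.6 million USD per gigabyte.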

Due to mining being computationally expensive, public blockchain systems consume more energy than regular distributed databases. Bitcoin mining is notoriously energy-consuming, and sustainability issues are one area where more research is needed. The popularity of Bitcoin and other cryptocurrencies has also led to various scams and, for example, infected websites harnessing visitors’ computers to mine cryptocurrency.

In their article, Kumar et al. (2020) suggest that “blockchain technology should be deployed selectively, mainly for interorganizational transactions among untrusted parties, and in applications that need high levels of provenance and visibility.” For example, tracking the origin and shipping of precious gemstones or other expensive or critical commodities is one area where blockchain systems have been trialled. Regarding supply chains, major challenges for using blockchains include creating legislation and standards all parties can agree on, and getting everyone to use blockchain technology despite the additional costs. Joining a consortium is usually necessary to properly utilize blockchains.

To conclude, as blockchains are today a high-cost and high-overhead storage method, careful consideration is needed to determine the proper use cases. A decision should also be made on what data to store on the blockchain, as it might be feasible to store only the most critical parts of the whole data to save resources. Blockchain technology is being piloted in various fields, and in the future, blockchains are likely to be utilized on a much wider scale.

References and recommended reading




[4] Nguyen, T.S.L., Jourjon, G., Potop-Butucaru, M. & Thai, K.L. 2019, “Impact of network delays on Hyperledger Fabric”, INFOCOM 2019 – IEEE Conference on Computer Communications Workshops, INFOCOM WORKSHOPS 2019, pp. 222-227.



[7] Deloitte’s Global Blockchain survey 2019


[9] Gartner 2019: Blockchain Unraveled: Determining Its Suitability for Your Organization


[11] Peker, Y.K., Rodriguez, X., Ericsson, J., Lee, S.J. & Perez, A.J. 2020, “A cost analysis of internet of things sensor data storage on blockchain via smart contracts”, Electronics (Switzerland), vol. 9, no. 2.

[12] Kumar, A., Liu, R., Shan, Z. 2020, “Is Blockchain a Silver Bullet for Supply Chain Management? Technical Challenges and Research Opportunities”, Decision Sciences 51 (1), pp. 8-37.


Customer value in software solution planning – research findings

The title of the doctoral dissertation is “Solution Planning from the Perspective of Customer Value”

Marko Komssi’s recent doctoral dissertation analysed the problems of software solution planning from the perspective of customer value. A key finding was that solution planning in software companies suffers from narrow thinking focused on product features and from a “firefighting syndrome”. These customer-value problems appear to be cultural and, as such, difficult to fix.

Komssi’s research shows that a holistic, early analysis of the customer’s tasks increases customer value in software solution planning. The study presents concrete practices for describing and prioritizing customers’ tasks in roadmap-based solution planning. Because customers’ tasks typically do not change very often, they enable a longer-term view of solution planning than product features do. The analysis would have a stronger positive effect on solution planning if companies’ strategic processes and culture emphasized customer value.

Opponent: Professor Pekka Abrahamsson, University of Jyväskylä

Custos: Professor Marjo Kauppinen, Aalto University School of Science, Department of Computer Science

Contact details: Marko Komssi, F-Secure, tel. 040 5323753,

Electronic dissertation: The public defence will be held remotely via Teams:
Link to the Teams meeting


Utilisation of 3D BIM and APIs for moisture risk management during construction

Indoor air problems are unfortunately common even in new buildings. These problems may be a result of poor structural designs or misuse, but more commonly they occur due to mistakes during construction.

Wall panel and a couple of hollow slabs waiting for assembly on site.

Structural engineers are familiar with problematic designs. Consequently, with active review processes and utilisation of BIM (building information model), these can be handled. But what about mistakes during construction?

New modular methods are reducing problems in construction, as more and more construction phases are carried out indoors, protected from environmental elements. However, not all buildings can be built from modules, and even with modules, some of the work needs to be done on site, for example the final assembly. Transporting the modules and panels from the factory to the site is always a risk. Delays in assembly are another significant risk, since they mean that the panels are stored temporarily on site to wait for their assembly. Consequently, the panels may be exposed to harsh and difficult conditions for extended periods. Modules are typically protected at the factory with a plastic film. However, there is always a risk that the protective plastic film is damaged during transportation or removed too early, as happened in the Wood City case.

Unprotected wall panel waiting for assembly.

Incompletely protected panels, and panels and modules with damaged protection, are at risk of getting damp before assembly, which can later lead to indoor air problems.

How new technologies may help in moisture risk management

Utilising IoT sensors and 3D BIM models makes it possible to measure temperatures and relative humidity in real time. Could construction-related problems be avoided, or at least noticed early enough to fix them without excessive dismantling of the building, if sensors were used? Adding humidity and temperature sensors to modules and panels in the factory and monitoring the values in real time during construction would give a warning in case a structure became damp. This way, a drying process could be started before actual microbe damage formed. Adding the sensor data into the 3D BIM model via an API (application programming interface) makes monitoring easy and possible without visiting the site on a daily basis.
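As a minimal sketch of this monitoring idea (the alert threshold, element ids and data layout below are invented for illustration, not taken from the demo system), sensor readings tagged with BIM element ids can be screened for damp structures:

```python
from dataclasses import dataclass

# Hypothetical alert threshold; real limits depend on material and regulations.
HUMIDITY_ALERT_RH = 85.0

@dataclass
class SensorReading:
    panel_id: str      # links the sensor to an element in the BIM model
    rh_percent: float  # relative humidity
    temp_c: float

def check_readings(readings):
    """Return the BIM element ids whose humidity exceeds the alert limit,
    so they can be highlighted in the 3D model and drying can be started."""
    return [r.panel_id for r in readings if r.rh_percent > HUMIDITY_ALERT_RH]

readings = [
    SensorReading("wall-panel-01", 62.0, 18.5),
    SensorReading("wall-panel-02", 91.5, 17.0),   # protection damaged in transit?
    SensorReading("hollow-slab-07", 55.3, 19.1),
]
print(check_readings(readings))   # ['wall-panel-02']
```

In practice the readings would arrive through the sensor vendor’s API and the flagged element ids would drive the colour-coding of panels in the 3D BIM view.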

Humidity values in the 3D BIM model make it noticeable that one panel might be damp.

In the case of concrete structures, it is crucial that the relative humidity of a concrete slab is low enough before progressing to the finalising phase. In addition to the humidity sensors, temperature sensors would be beneficial for monitoring the conditions and optimising the temperature and ventilation for the drying of the concrete. This method might not replace the current measurements based on test pieces, but at least the sensor measurements would give the possibility to react to unwanted changes in drying conditions and to prevent wasting several weeks of drying time due to poor conditions. Consequently, an optimal drying process, better quality, and fewer delays in construction would be achieved.

Could the floor be surfaced earlier if drying conditions were measured in real time?

In the 4APIs project demo, we have integrated a 3D BIM model and IoT sensor data via APIs. Our case building was the Ypsilon Community Centre / School in Yli-Maaria, Turku, Finland. In this demo case, the sensors were installed in a finished building and measured temperatures, relative humidity, and incoming and outgoing air flow in real time.

Interestingly, the demo case raised questions and some discussion about whether real-time is actually real-time. As Teemu Mikkonen wrote in his blog post “Real-time data with cloud platform”, there were noticeable delays in the data on its way from the sensor to the actual services. However, in the case of moisture risk management during construction, a delay of several minutes does not cause problems, as the time scale, for example for concrete slab drying, is weeks. Even with delays the benefits are remarkable, since unsuitable changes in drying conditions or moisture in structures can be noticed almost in real time. This gives constructors and supervising authorities the possibility to react to humidity-related problems earlier.

In new buildings, sensors may be added in the early phases of construction. In such cases, some of the construction-time sensors would also be usable later for in-use monitoring. As a result, a complete measurement history of the building could be collected and utilised in indoor air quality investigations.


Monetizing an API has little to do with technology

As a human-centred business designer, I was privileged to facilitate a business modelling workshop for the 4APIs project in May. The project team spent the morning creating hypotheses on how to scale and monetize a case study called Ypsilon. The case study is the Ypsilon Community Centre, a multi-purpose building located in Yli-Maaria, Turku. It has facilities for education, a library, youth services, school health care and child welfare clinics.

Our main questions to answer were typical to any business model design. 

  1. Who do we create value for?
  2. What is the value proposition (the problem we offer to solve)?
  3. What is the revenue model?
  4. How do we deliver and capture value?
Who? What? Value? How?

In most small groups, the discussion on target segments revolved around building owners (B2B) and house owners (B2C). Building manufacturers and insurance companies were also seen as potential customers.

As a human-centric business designer, I always emphasise the importance of being aware of human behaviour and the driving motivations around the problem the concept is meant to solve. Is your API solving a business-critical problem or not? In my own small group, we clarified three value propositions for house owners.

  1. Economic value: preventing the risk of serious water or moisture damage.
  2. Emotional value: carefree living for house owners.
  3. Functional value: a straightforward service model with third parties.

Even a great value proposition isn’t enough if the pricing or purchasing model is in conflict with the customer’s way of purchasing these kinds of services. In B2B, the typical failure would be not adapting to customers’ tendering models and siloed responsibilities. The model should be somehow familiar to the customer, as every business is always a combination of the 55 known business model patterns.

Coming up with a potential concept is rather easy. Success is more about hard sales work and the right timing than the latest technology. At the heart of monetization is finding the right super moments when a customer is most likely to put in the effort needed for a purchase. Or even better, finding a way to serve the customer without them having to put in any effort at all.

To me as a house owner, the winning offer would be a combination of all three value propositions, with the emotional side definitely the highest priority. If it required more effort than opening a door once, I’d remain in the potential-customer category. This leads to the issue of trust. Who do I trust enough to grant access to my water consumption data and to let them know when I’m on vacation?


Digital Ecosystem Development: The future of integration environment

In general, to drive digital transformation, organizations must tap into an ever-growing set of applications, processes and information sources across multiple clouds and on-premises, all of which significantly expand the enterprise’s need for modern integration capabilities. The integration and API environment is evolving rapidly and will create new business opportunities. On-premises integration platforms are moving to the public cloud, and at the same time microservices-oriented development demands new ways to create communication between microservices and data reserves. These new ways utilize APIs and event-driven approaches, and they require new technologies such as API management, async APIs, event stream technology and service mesh services (like Istio). All of this needs to work in a synchronized manner, which means that governance demands will have to be resolved by new, efficient solutions. At the same time, the cost of old-school integration legacy has to decrease to maintain competitiveness. The keys to success are replacing old integration legacy with iPaaS-based cloud integration services, using APIs on the cloud platform, and handling event-driven data communication with event stream technology such as Kafka.
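The event-driven approach can be sketched with a minimal in-memory broker in Python (a toy stand-in for an event streaming platform such as Kafka; the topic and service names are invented):

```python
from collections import defaultdict

class EventBroker:
    """Minimal in-memory stand-in for an event streaming platform:
    producers publish to topics, subscribed services react independently."""

    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)

    def publish(self, topic, event):
        # A real broker would also persist the event to a durable log
        # so new consumers can replay the stream from any offset.
        for handler in self.subscribers[topic]:
            handler(event)

broker = EventBroker()
audit_log = []

# Two independent microservices react to the same order event
# without the producer knowing anything about them.
broker.subscribe("orders", lambda e: audit_log.append(("billing", e["id"])))
broker.subscribe("orders", lambda e: audit_log.append(("shipping", e["id"])))

broker.publish("orders", {"id": 42, "item": "sensor-kit"})
print(audit_log)   # [('billing', 42), ('shipping', 42)]
```

The point of the pattern is the decoupling: the producer emits one event, and any number of services, including ones added later, can consume it without point-to-point integrations.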

Gartner and Forrester have analyzed these demands and report the following observations in their studies:
Gartner Hype Cycle for Application and Integration Infrastructure, August 2019

“The dynamics of digitization require innovative application and integration infrastructure that rapidly evolves into productive maturity. Organizations are under considerable pressure to modernize their application infrastructure and middleware as part of a broad, cohesive strategy to deal with the disruptive changes associated with digital business transformation. Tool and skills adaptations to develop, deploy, integrate and manage applications have to meet businesses’ demand for fast time to value and greater intimacy with customers, employees, partners and citizens.
The newest and most hyped application and integration infrastructure technologies and practices are increasingly expected to fulfil more advanced and demanding requirements. They include event broker platform as a service (PaaS), self-integrating applications, digital integrator technologies, container management, and practices associated with hybrid integration platform and service mesh.”

The Hype Cycle
“This Hype Cycle includes a diverse range of emerging and maturing innovations intended to equip application leaders to deliver agile and scalable applications that will integrate with the rest of their digital business technology platform. Application leaders’ infrastructure, integration and platform strategies must continue to evolve, along with a sharper focus on API-enablement and event-driven capabilities, and a readiness to integrate at scale across their organizations and ecosystems. Work to meet these requirements will lay the foundations for user organizations’ application portfolios to evolve into API- and event-enabled granular business services, which will be composed and recomposed via integration technologies to support the changing nature of business.
This progress will require modern techniques for application and integration infrastructure. These will include implementing application functionality as a service mesh tied to emerging elements of application architecture such as microservices and containers, along with a hybrid integration platform strategy and defined competencies for an integration strategy enablement team.
Greater adoption of event-driven architectures that support “business moments,” including use of event-driven APIs and information from Internet of Things (IoT) devices, is renewing interest in event broker PaaS. Also emerging are self-integrating applications, and there is growing use of artificial intelligence (AI) in integration platforms in the form of digital integrator technologies, which are intended to rebalance the work of humans and AI. Additionally, the requirement to provide largescale, high-throughput and low-latency API platforms is driving the emergence of digital integration hubs.”

The Forrester Wave™: Strategic iPaaS And Hybrid Integration Platforms, Q1 2019
“The integration technologies strategic iPaaS and HIP are a cornerstone of the evolution required to support digital transformation. Data is the new oil, and not only due to its value as a resource. The fluidity of data is becoming a key factor in data management and managing data in motion well is a differentiator for firms. Making data usable for analysis in 12 to 24 hours was just fine in 2010 — but for 2018’s real-time businesses, data needs to be available in less than 60 seconds. A retailer that wants to compete on time-to deliver must consolidate stock in all warehouses and in all regions to manage real-time delivery to its customers, and the data must be available for processing by all relevant applications — and must be visible to customers.
In the first wave of investments in EAI and SOA middleware, it was difficult to build the business case because these foundational investments benefited the vague goal of reuse or automated little understood IT processing. But the new pricing models with OSS, freemium, and cloud solutions avoid upfront license and hardware investments and are helping even midsize enterprises adopt integration technologies. As a result, the market is growing, with solutions like iPaaS in the cloud not requiring architect or developer training. Today’s iPaaS solutions are also faster to implement than previous technologies, supporting the pace of Agile development cycles and governance.
At the same time, large enterprises that first heavily invested in message-oriented middleware (MOM), integration architectures, and proprietary solutions like EAI and ESB are showing interest in integration technology to lower their licensing costs. They are also increasingly challenged by lines of business (LOBs) that desire more autonomy for their systems of engagement and are finding integration solutions that fit their requirements without involving central IT.”
In our approach, the forerunner integration technologies are APIs, API management and event streaming solutions. We have created a new-era model, Digital Enabler Services, which provides the cornerstone for moving integration and API needs into the business environment that digital transformation demands. Solution suppliers have estimated that costs will be reduced by 30–50% when old on-premises legacy integration is replaced with these services. At the same time, microservice architecture, cloud-based data handling and AI can only be utilized through rapid iPaaS- and API-based technologies and an event-driven architecture. Transferring integrations to the cloud, bringing API and API management services close to the integration environment, and including event streaming in the same package are mandatory to handle the demands of the next era of digital business.

Source: Gartner

The next phase of digital transformation is edge computing, which will become mainstream when 5G is in use. The ecosystem business model also needs a new technology motor for the future. Monitoring and logging capabilities are needed throughout the whole digital stack, and business and technology situational awareness has to be available as an up-to-date snapshot at all times, because business processes are speeding up and becoming automated.
Business process automation is also coming back with RPA (Robotic Process Automation) and AI-based decision making. Visual planning tools and low-code platforms create more efficiency and cost reduction for system development and data handling. Now is the moment to enable these capabilities with a modern integration solution, as digital transformation demands integration modernization.
Automation means organizations using technology to automate tasks that once required human judgment or action. Gartner has named this phenomenon hyperautomation: a state in which organizations use a combination of AI and ML to rapidly identify and automate all possible business processes. Hyperautomation extends across a range of tools that can be automated, but also refers to the sophistication of the automation (i.e., discover, analyze, design, automate, measure, monitor, reassess).

According to Gartner, by 2024 organizations will lower operational costs by 30% by combining hyperautomation technologies with redesigned operational processes.

Related, for instance, to the 4APIs Ypsilon case, hyperautomation could offer more rapid utilization of sensor data and proactive optimization of operational costs and the well-being of pupils and Ypsilon personnel.


When a Business specialist meets an Academic researcher

Business people and academic researchers are often seen as belonging to different worlds. To exaggerate, one could say that a business specialist solves the subjective problems of clients, whereas an academic researcher or scientist is hunting for objective truth.

In our Solita Research community, we have seen that the mindset is more crucial than the name of the “hat” a person is officially wearing. Business units alone cannot find all the answers to extremely complex problems. Neither can scientific research lock itself in a cell separated from real-life needs. An academic researcher often thinks about concrete value creation, and a business specialist wants to deep-dive into theoretical models. Although an old Finnish saying states that “the cobbler should stick to his last”, we all, as individuals and communities, are forced to expose ourselves to new fields of learning and co-operation.

Here are some findings along my own learning path from following the 4APIs project and various other co-operation projects where the business and academic worlds meet. Perhaps some of these are familiar from your work, too?

Hunger for results, humility for learning
Although we are supposed to present ourselves as proud professionals, our knowledge becomes outdated very quickly nowadays. The metamorphosis of data into operational or strategic wisdom is not a purely automated, technological process. It happens by combining knowledge from different sources of information with insights from specialists. Learning from each other as individuals, as teams, and as co-operation partners is a must. If you see only a jungle of competitors around you instead of co-learning companions, there is no corner for you on any platform or on any ecosystem map.

Talent of finding the right questions before answers
When starting a research project, it is easy to formulate a hypothesis but not so simple to find the right questions. In many cases the imperative of being fast has slipped from ICT development projects into other circles, too. Instead of "failing fast", a dull-sounding process mindset and a bit of patience can bring results more effectively, with less hassle.

Share your expertise, be open to new ideas and listen to others
Quite often it is surprisingly hard to form an insight and then share and communicate it within internal and external organizations (a project team, a business unit, a faculty). Creating a common understanding among co-operation partners is usually not written into a project plan as a separate action point. It does not mean that everyone should think identically, but it serves as a basis and a tool for sharing knowledge, creating trust, and verifying various ideas.

Networks are seldom mentioned nowadays, but ecosystems and platforms pop up in the media almost daily. In the future, data experts may become backstage players as a new generation of wisdom suppliers marches into the spotlight. Those who co-operate today on no man's land, open-mindedly combining ideas and people, are the ones shaping tomorrow. Together.


Real-time data with cloud platform

The previous blog post, "Using BIM and API data to demonstrate opportunities in Ypsilon", presented a look at the real-time data architecture being researched in the 4APIs project. We can see that there are multiple sensor data sources providing real-time data about the conditions in the rooms. In order to effectively utilize the opportunities of this continuous IoT data, we must build a sufficient architecture to support it.

As our approach, we have chosen to research and develop a cloud-platform-based solution to capture, store, and provide an output for the data produced by the Ypsilon sensors. More specifically, we have chosen the Microsoft Azure cloud platform for the initial Proof of Concept (PoC). The components provided by public cloud providers such as Azure are well suited for various data-related tasks. Furthermore, cloud platforms provide scalability and reliability, both of which become increasingly important as the number of data sources and dependent applications grows.

We base our PoC architecture on the Azure IoT reference architecture as well as on our tests with dummy data simulated by a Python function. So far, we have run experiments meant to build the first draft of our PoC. The PoC is developed with the idea of small incremental changes, monitoring the data flow through each component as it is added. The data flow in the PoC architecture begins by capturing Ypsilon sensor data with Azure IoT Hub. The data is stored in Azure Blob Storage, from which the data stream divides into a hot path and a cold path.
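The dummy-data simulator can be sketched roughly as follows. Note that the field names and value ranges here are illustrative assumptions for the sketch, not the actual Ypsilon sensor schema:

```python
import json
import random
from datetime import datetime, timezone

def simulate_sensor_reading(room_id: str) -> str:
    """Generate one dummy sensor payload as a JSON string,
    mimicking the real-time room-condition data."""
    reading = {
        "roomId": room_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "temperature": round(random.uniform(19.0, 23.5), 1),  # range observed in Ypsilon
        "personCount": random.randint(0, 30),
        "co2": random.randint(400, 1200),  # ppm
    }
    return json.dumps(reading)
```

In the PoC, payloads like this would be sent onwards to Azure IoT Hub as device-to-cloud messages.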

PoC-test architecture

The hot path is used to stream the data to an outbound API in order to provide real-time access to the Ypsilon data. The first draft of the hot path will use an Azure Function that reads the data from Blob Storage. The cold path stores the IoT data in a database so that it can be used for aggregation and analysis. We have chosen Snowflake as our PoC database. Storing the data in the cold path enables us to use it for reports, dashboards, and even machine learning purposes.
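Independent of the eventual Azure Function wiring, the core of the hot-path logic is to reduce the stored stream to the most recent reading per room; a minimal sketch (with the same assumed field names as in our simulator):

```python
from typing import Iterable

def latest_per_room(readings: Iterable[dict]) -> dict[str, dict]:
    """Reduce a stream of timestamped readings to the most recent
    one per room -- the payload a real-time outbound API would serve.
    Assumes ISO-8601 UTC timestamps, which compare correctly as strings."""
    latest: dict[str, dict] = {}
    for r in readings:
        room = r["roomId"]
        if room not in latest or r["timestamp"] > latest[room]["timestamp"]:
            latest[room] = r
    return latest
```

In the real PoC the input would come from Blob Storage and the output would be serialized as the API response.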

We still have many questions to answer. One of the most important is: "What do we mean by real-time?" Our initial tests showed that simulated sensor data took roughly 90 seconds to reach Blob Storage, and some additional latency is to be expected on the outbound API side. The end-to-end latency can be brought down by utilizing different Azure components such as Stream Analytics, but as the service gets faster, the costs also increase. We continue our tests with Azure components in order to find a reliable and fast solution that enables real-time data as well as analytics on historical data.
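One way to quantify "real-time" is to embed a send timestamp in each simulated payload and compare it with the arrival time at the storage end. A minimal sketch, assuming ISO-8601 timestamps with explicit UTC offsets:

```python
from datetime import datetime

def ingest_latency_seconds(payload_timestamp: str, arrival: datetime) -> float:
    """Seconds between the moment a reading was produced (timestamp
    embedded in the payload) and the moment it landed in storage."""
    sent = datetime.fromisoformat(payload_timestamp)
    return (arrival - sent).total_seconds()
```

Measured against our first tests, this function would report the roughly 90-second ingest latency we observed.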


BIM and data as such are not valuable

The world is constantly filled with new innovations that embody little to no understanding of the very people the product or service is built for. They fail fast. It is common knowledge that most new products don't make it through their first year on the market. The "users" of the product or service have likely been given some thought, but too often only in the form of hypotheses pulled from thin air, or by reflecting mainly the needs and assumptions of the team building the product and then projecting them onto the market. Or user insight is built only by looking at people from afar, through spreadsheets and quantitative abstractions that lack a tangible understanding of what really drives us and our perception of value.

The first question of any innovation work should be: how does this thing of ours relate to and produce anything of value for people and societies? What kinds of needs does it answer, what kinds of human and cultural practices, functions, and meanings should it be part of, which actual problems might it solve for actual people, and how? How does our product make life better?

Suffice it to say, this is something we need to think hard about in the 4APIs project as well. As such, our BIM and data have no value. They become valuable only by performing functions and holding meaning perceived as valuable by people and institutions. Can they make life better? For whom? How?

With these questions in mind, we began meeting potential target groups last month. We have now engaged with different stakeholders and potential users of our BIM and data, covering roles from energy operations to school management and real estate services. The work has only just started, but here are a few things we have learned so far.

Could it save what's scarce? Could it make us safer?

The general perception of possibilities among our interviewees was mainly very positive. We found this perception of potential value to be based especially on the promises of efficacy, safety, and wellbeing.

Illustration by Maria Niemi (Solita)

By 'efficacy' I refer here to the precise allocation of scarce resources, ranging from money, energy, and the environment to space, time, and attention. Data combined with BIM holds a lot of promise as a tool for better understanding very context-specific conditions, their variation over time, and anomalies we might otherwise have difficulty perceiving. The possibilities opened up especially by real-time data, predictive modelling, and machine learning enhance excitement and the feeling of novel possibilities. One key aspect of efficacy would also be the integration of data from several currently fragmented information systems into one real-time API.

The themes of 'safety' and 'wellbeing' were raised especially in the context of a very particular worry, even a public trauma of sorts, regarding some of our public spaces. Throughout recent decades there has been growing concern over the quality of indoor air, especially in schools. If we could collect more precise and rich data about the environments our kids spend their days in, this could have a reassuring function. And when the data indicates problems in conditions, especially problems difficult for human senses to perceive directly (e.g. incorrect humidity levels), proper actions could be taken in time. Another concrete example given relates to acute crisis situations, e.g. fires, when real-time data could be used to monitor people flow and get everyone safely out of the building.

Two takes on your world: data and BIM as aids of perception

Other potentially valuable functions included the use of BIM and data as pedagogical tools and as examples of smart tech relating to the very meaningful and tangible everyday environment of pupils and teachers alike. What holds promise here is the possibility of combining two types of information: first, the subjective, sensory information we humans generate directly by observing and sensing our environment; and second, the objective, otherwise unobservable conditions provided by data and BIM. The subjective experience gives meaning to the environment and its changing conditions; it is then given complementary lighting by data on objective conditions such as temperature or humidity level. Together they show how the subjective and the objective relate and might differ. And by helping people perceive the effects of even minor adjustments and optimizations in conditions and behavior alike, data and BIM could also be used to foster, for example, environmental awareness and positive behavior change.

This combination of the physical, tangible and sensory information provided directly by our bodies and the objective information provided by data is nicely mediated by the BIM. The 3D model makes data-measured conditions easy to perceive, understand and interpret in relation to the actual environment.

The building information model and reality. A virtual reality illustration of the Ypsilon demo.

This was also brought up in the context of building maintenance, where the combination would provide real help when one is not physically present to perceive the environment (remotely and holistically grasping the conditions of the whole building quickly), or when physically present but now also able to perceive and locate more objective, data-informed conditions and their temporal variation (history, future predictions).

From data to information, from information to action

Data as such is not information; it doesn't inherently lead to understanding, let alone to the required actions. Actually, it rarely does. This is something we know painfully well from encountering organizations that equate more data with better decisions. Interpretation can be tricky. Easy-to-grasp communication, intuitive visualization, and contextualization of data are usually important factors. "No one is interested in my Excels! ... But when I have nice visualizations to show...", as one interviewee remarked.

This need for easy interpretation holds especially true for multipurpose premises with modifiable spaces like the Ypsilon building. Ypsilon’s (and similar sites’) data has a broad spectrum of potential user groups ranging from building maintenance and its partners to energy authorities and more site-specific user groups like service coordinators, teachers, pupils and citizens in the area. Therefore, we need to help different users read, contextualize and understand the data as “with new data, there’s always the possibility of misinterpretation.”

But even this is not enough. After data turns into understanding, understanding still needs to turn into action. For this we need to consider organizational drivers of action and work to incentivize and structurally motivate people’s behavior. Otherwise, the possibilities for efficacy, safety and wellbeing will remain only latent.


Antti is a sociologist and ethnographer at Solita Design & Strategy. For the past 10+ years he has worked with applied social science in service design, organizations, brands and marketing.

Antti’s Twitter and LinkedIn

4APIs’ Twitter


Using BIM and API data to demonstrate opportunities in Ypsilon

In the joint demo of the 4APIs project, we combined a Building Information Model (BIM) with API data. Here we have a BIM:

A Building Information Model (BIM) in action. Click to browse it yourself!

We make use of the Vertex Showroom product, which can show any BIM in a web browser, even on mobile. Please take a look at the interactive version of the BIM here!

The BIM mimics reality rather well. The following images are from the Ypsilon Community Centre / School in Yli-Maaria, located in Turku, Finland:

A classroom for domestic science at the Yli-Maaria school.

The photo was taken in January, when we had our demo project kick-off at the Ypsilon smart building. The pilot is executed by the space management center (Tilapalvelukeskus) and the education division of the City of Turku. In the pilot, they are collecting experiences of using BIM and sensor data.

The BIM was created already in the planning and construction phase of the building a few years ago. According to Projektiuutiset in 2019, it was found very useful during the construction phase.

Sensor data is being collected from a few rooms inside the building. Let's take a look at the Ypsilon community center from a bird's-eye perspective:

A view of the Yli-Maaria community center building from an altitude of 500 meters, observed from Google Earth. Floor map on the right.

When viewed from above, the building has the shape of the letter Y, which matches the name of the building, Ypsilon. In the Y-shaped floor map, you can see sensors in three rooms providing real-time condition data, including, for instance, the following variables:

  • Temperature
  • Number of persons in the room
  • CO2 level
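In code, one such condition reading could be modelled along the following lines (a sketch with illustrative field names, not the actual sensor schema):

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class RoomConditions:
    """One real-time condition reading from a sensored room."""
    room_id: str          # e.g. "A2105"
    observed_at: datetime
    temperature_c: float  # indoor temperature in degrees Celsius
    person_count: int     # people currently in the room
    co2_ppm: int          # CO2 level in parts per million
```

Typed records like this make it easier to validate and aggregate the stream downstream.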

With the given data set, we are able to construct a live view of the smart building's current conditions, for instance temperature. Related to temperature, Ypsilon has both thermal energy and solar panels installed:

The thermal energy and solar panels illustrated on top of the BIM.

Therefore, the infrastructure of the building is very energy-efficient. The data scientists of our joint demo project found an interesting phenomenon in the sensor data related to indoor temperature:

Infograph of the temperature in the three rooms with sensors.

According to the sensor data, the temperature seems to vary between 19 and 23.5 degrees. There are peaks in the data set for each weekday in January. In spot #1, the week has only four peaks, whereas another week in January, in spot #2, has five peaks. A peak in the temperature occurs when there are people in the room. During the first week of January there were only four school days due to holidays, which can be seen as four peaks. Week #2 in January had five school days.
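The peak-counting observation can also be reproduced programmatically. A rough sketch, counting contiguous runs above a baseline in a temperature series (synthetic data; the 21-degree threshold is chosen for illustration):

```python
def count_peaks(temps: list[float], threshold: float = 21.0) -> int:
    """Count contiguous runs where the temperature exceeds a
    threshold -- each run corresponds to one occupied school day
    in the weekly data."""
    peaks = 0
    above = False
    for t in temps:
        if t > threshold and not above:
            peaks += 1
            above = True
        elif t <= threshold:
            above = False
    return peaks
```

Run over a holiday week with four school days, the function would report four peaks.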

But what could explain the temperature differences between the second floor (room A2105) and the third floor (rooms C3032 and C3060)? At the moment, we don't know. We have two hypotheses. The third floor is warmer than the second floor because:

  • warm air rises
  • the windows on the third floor face east, whereas the window on the second floor faces north

At the moment, we have no evidence for accepting or rejecting either hypothesis. Regardless of the reason, the temperature could sometimes be a bit lower. A common target value for comfortable indoor air temperature is 21 degrees according to, for instance, a guide from the City of Helsinki. Perhaps some energy could be saved with a more balanced solution?

In the demo, we combined weather API data available from the Finnish Meteorological Institute with the sensor data set:

Overview of the cloud architecture (Azure) with three data sources on the left and data flowing into the hot path (live conditions) and the cold path (long-term data storage for, e.g., machine learning purposes).

In this sample high-level architecture diagram, several API data sources are shown on the left: weather data (FMI), traffic data (Föli), and sensor data. The data is loaded into Azure IoT Hub and then aggregated into the hot path, i.e., the real-time outbound API serving data to the user interface. To implement the hot path in the demo setting, we used hard-coded Excel files stored inside an M-Files document repository. However, the Azure components presented in the architecture diagram (e.g., Event Hub and Blob Storage) could be used to create a real-world API solution. We made use of the Azure Developer portal:

Azure developer portal enables subscription to APIs and trying them out.

The user interface then calls the API endpoint and gets a response:

The JSON response of /api/measurements/2105

The response contains timestamped observations from a single room (siteId 2105), with sensor category "Tuloilma" (row 24) and sensor attribute "Ilmankosteus" (row 25), meaning "incoming air humidity", with value 32.9 and unit "%". In other words, in the middle of the night, the humidity was at a good level.
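A client consuming this endpoint might pick out the humidity observations roughly as follows. This is a sketch assuming the structure visible in the screenshot; the key names ("observations", "attribute", "value") are assumptions, not the real API contract:

```python
import json

def humidity_values(response_text: str) -> list[float]:
    """Extract incoming-air humidity ("Ilmankosteus") values from
    a /api/measurements response body (assumed structure)."""
    data = json.loads(response_text)
    return [
        obs["value"]
        for obs in data.get("observations", [])
        if obs.get("attribute") == "Ilmankosteus"
    ]
```

The user interface would then feed values like these into the BIM visualization.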

The user interface then shows the JSON data and the BIM with embedded visual data values inside the room:

The demo UI sketch with sensor data values illustrated inside the BIM. The raw data in table.

With the aggregated weather data, we formed the following hypothesis: when it is cloudy, the indoor temperature is lower. We made a very quick graph of temperature and cloudiness:

Infograph of indoor temperature and cloudiness (on a scale of 0-8)

A very quick interpretation based on the graph is that when it is cloudy, the temperature is actually higher, by about 0.5 degrees. However, shouldn't it be the other way round, i.e., warmer when it is sunny? Further investigation would be needed to make a statistically significant inference about the phenomenon. A quick conclusion based on this interesting initial finding is that sensor data and BIM could provide a basis for phenomenon-based learning: teachers and pupils could examine a data set together concerning a very familiar and tangible environment, the classroom itself.

To illustrate the data collection possibilities of the demo, let's go back to the kick-off meeting, where we had the opportunity to take a guided tour around the building. We saw novel classrooms with the latest advances in teaching technology, for instance interactive screens and classrooms without corridors:

A classroom on the third floor of Ypsilon. In a school without corridors, the classroom at the back is entered through the classroom at the front of the image.

When we entered the classroom on the third floor, we didn't know that there was a sensor counting people going in and out, located at the sides of the door! During our ten-minute visit, the smart building recorded these three data rows:

Sample sensor data from the classroom on the third floor.

The size of our group at the kick-off was approximately 10 people, so the measured value (9) could be right. However, we didn't see anyone else there, so the other measured value (16) could contain some error. Perhaps the counter also counts people passing through one classroom into the others, for instance. Even so, the data set is very useful despite some possible errors. We have to remember that the project at hand is a pilot in a very early phase of moving towards a smart building. It could, however, be beneficial to install sensors in every room of the building to make the data set more coherent.

Or perhaps, in the future, it could be possible to install sensors across the whole city of Turku:

Turku city in 3D. Source: Google Earth.

In order to create an accurate digital twin of the whole built environment, sensor data and models would be needed on a large scale. The sensor data of the whole city could be collected into a cloud data warehouse, for instance. We already have a lot of BIMs and data available regarding the surrounding infrastructure:

A combination of BIMs and data can create a digital twin. Source of the image: Schaller, Joerg, et al. "GeoDesign: Concept for Integration of BIM and GIS in Landscape Planning." J. Digit. Landsc. Archit. 2 (2017): 102-112.

In their article "GeoDesign: Concept for Integration of BIM and GIS in Landscape Planning", Schaller et al. demonstrate possibilities for the wider usage of BIMs and geographic information systems (GIS). Moreover, the BIM research group at the University of Helsinki has recently organized a workshop titled "When Social Science meets Lean and BIM", which seems like a great multidisciplinary direction for research.

But before we advance any further into broader use cases in the digitalization of society with BIMs and data, we will have to evaluate the value creation possibilities of combining BIM and data in Ypsilon. We showed the demo with BIM and data to a few potential users as part of our qualitative user interviews. Is there potential user value here, for whom, and answering which needs? More on that topic in the upcoming posts!