What is the cloud and why does it matter? We’re bombarded by advertisements telling us the cloud is the current ‘big thing’. Somehow this vision of white fluffiness will solve all of humanity’s problems. Maybe it will, but the adoption cycle is just beginning.
It’s no secret that the ‘cloud’ is just a bunch of servers sitting in a dark data center. If you’ve ever visited one of these facilities, you’ll quickly realize the cloud is very hot and noisy – perhaps not the nirvana of problem solving we envisioned.
But there is something very powerful about the concept of cloud computing, which can be more accurately defined as distributed computing. For many years we’ve heard the term ‘big data’ – a catch-all phrase suggesting we can now capture it all and make some sense of it. But the reality of understanding big data requires both technology and domain expertise; it’s not just having access to the data that is compelling – it’s creating meaning from it, from vast sums of unstructured information drawn from many different sources.
So how can this be done in a meaningful way that not only creates data interoperability but also delivers the right data to the right constituents within an ecosystem? In other words, different people need different data to do their jobs.
Let’s first tackle the very practical yet complex task of creating data interoperability. Until recently, software business models were focused on client-based perpetual licensing. Today virtually every major software publisher is moving to cloud-based models. Largely, that means software is delivered as a service (SaaS) and users pay as they go. The benefit is that customers have the option of using the most current software at any given time. All of that sounds wonderful, but it still leaves the problem of interoperability. In architecture, engineering and construction (AEC), interoperability has largely been addressed by finding a common, dumbed-down file format – workable, but problematic for users who want to pass along valuable metadata.
As adoption evolves, cloud interoperability and application programming interfaces (APIs) let users think about data outside of the file structure, allowing small bites of data to be analyzed and those datasets to be further tailored for the user. APIs also help unify data interoperability so that an application in Oakland can speak to applications in Mumbai without intervention – a big problem solved, and one that creates its own significant ROI. Simple, right?
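To make the contrast with file exchange concrete, here is a minimal sketch – not any real product’s API; the element records, field names, and `query_elements` function are all invented for illustration. Instead of shipping an entire model file, a client asks a service for just the elements and fields it needs, and the answer travels as JSON that any application can parse.

```python
import json

# Hypothetical in-memory "service" standing in for a cloud API backend.
# In a real system this would sit behind an HTTP endpoint.
MODEL_ELEMENTS = [
    {"id": "W-101", "type": "wall", "material": "CMU", "status": "installed"},
    {"id": "B-207", "type": "beam", "material": "steel", "status": "fabricated"},
]

def query_elements(element_type=None, fields=None):
    """Return only the requested elements and fields -- a 'small bite'
    of the model rather than the whole file."""
    results = [e for e in MODEL_ELEMENTS
               if element_type is None or e["type"] == element_type]
    if fields:
        results = [{k: e[k] for k in fields} for e in results]
    return json.dumps(results)  # JSON travels between Oakland and Mumbai

# A field app asks only for install status; it never parses a model file.
print(query_elements(element_type="wall", fields=["id", "status"]))
```

The point of the sketch is the shape of the exchange: the consumer names what it wants, and the structured response is the same whether the caller is across the hall or across an ocean.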
All of that makes sense – let’s pass data around. But there’s a second part that’s even more compelling and valuable: the ability to compute complex datasets in the cloud. Distributed computing offers virtually unlimited computational cycles. At startups like SKUR, that power makes it possible to compare billions of data points to design models and create advanced analytics – something that wasn’t possible just five years ago. But the cloud also offers the opportunity to develop data that is tailored to the user – let’s examine that and why it matters.
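A toy version of that kind of comparison can be sketched in a few lines – the tolerance, point coordinates, and function names here are invented for illustration, and a production system would distribute this work over billions of points rather than a handful. Each measured as-built point is checked against its design coordinate, and deviations beyond tolerance are flagged.

```python
import math

TOLERANCE_M = 0.05  # hypothetical 5 cm construction tolerance

def deviation(design_pt, measured_pt):
    """Euclidean distance between a design coordinate and its as-built measurement."""
    return math.sqrt(sum((d - m) ** 2 for d, m in zip(design_pt, measured_pt)))

def flag_out_of_tolerance(pairs):
    """Return indices of (design, measured) pairs whose deviation exceeds tolerance."""
    return [i for i, (d, m) in enumerate(pairs) if deviation(d, m) > TOLERANCE_M]

pairs = [
    ((0.0, 0.0, 3.0), (0.01, 0.00, 3.02)),    # ~2 cm off: within tolerance
    ((12.0, 0.0, 3.0), (12.00, 0.09, 3.00)),  # 9 cm off: flagged
]
print(flag_out_of_tolerance(pairs))  # → [1]
```

The per-point check is trivial; the value comes from running it at a scale only distributed computing makes practical.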
Data to a user is analogous to an individual’s DNA: we all share it, but it’s just slightly different for each of us. The commonality may be, for instance, geometry. An architect will use that differently from an engineer, who will use it differently from field workers, and differently again from executives in the back office – the same data can have many faces. When data is expressed differently it also becomes more dynamic: as updates are made by one user, everyone benefits from seeing those updates in real time. Imagine the impact that has when contractors are coordinated with their subs and suppliers. Suddenly the cloud begins to create efficiencies we didn’t even think were possible; with a new ability to coordinate and collaborate, we can begin to see real change. Still simple, right?
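The “many faces” idea can be sketched as role-specific views over one shared record – the roles, fields, and values below are hypothetical. Each constituent reads a different projection of the same underlying data, so a single update made in the field is immediately visible in every view built from it.

```python
# One shared element record; everything below is an illustrative projection of it.
element = {
    "id": "W-101",
    "material": "CMU",
    "install_status": "in progress",
    "cost_to_date": 18400,
}

# Each role sees a different face of the same data.
VIEWS = {
    "architect": ["id", "material"],
    "field":     ["id", "install_status"],
    "executive": ["id", "cost_to_date"],
}

def view_for(role, record):
    """Project the shared record onto the fields that matter to this role."""
    return {k: record[k] for k in VIEWS[role]}

# A field update propagates to every view built from the shared record.
element["install_status"] = "installed"
print(view_for("field", element))  # → {'id': 'W-101', 'install_status': 'installed'}
```

Because the views are projections rather than copies, there is nothing to synchronize: one record, many faces.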
So that’s what can be – but let’s be real: this vision takes a willingness to evolve. Startups must be funded. Contractors must be willing to examine their work practices and make significant organizational changes to accommodate technology. Owners and developers must understand the value of data – not only during the construction process, but also as a valuable resource that can be carried into facilities management through the asset’s lifecycle. We must also understand that construction will continue to be challenging and complex, and initially expect marginal ROIs. But over time, given the full implementation of a technology stack, AEC will have the opportunity to drive innovation and efficiencies that parallel or exceed those of other complex industries. Simply.
Adam Cohen is the Founder and CEO of SKUR, a SaaS analytics platform for construction and facilities management based in Oakland, California.