The Ubiquity Test: Accessibility, Cost, Performance, Compatibility
I had a meeting with some of the Republic of Things team this week. They are bright people and as ever, they got me thinking. This time about the nature of Ubiquity.
Ubiquity is one of the five Vectors of Change that we use to forecast the future. These vectors form one part of our Intersections methodology.
What we mean by Ubiquity is that wherever technology can be deployed to deliver real advantage, someone will deploy it.
But in order for that to happen — in order for a technology to be truly ubiquitous and find all the niches in which it can deliver advantage — a number of things have to happen.
Cost

Firstly, the technology has to be cheap enough. Niche applications are found through experimentation. Experimentation by its very nature means failure: not all of your investment will deliver the hoped-for returns, even if it delivers valuable learning. If experimenting is hugely capital intensive, there will be fewer experiments and hence fewer niche applications found.
Even more simply, if a technology is cheap as well as useful, it is likely to deliver returns in many more applications, including ones where it just wouldn’t make sense if it were expensive.
Rule of Thumb: If you can get started with pocket money (or in corporate language, less than the discretionary limit for expenses), then a technology is on the way to being ubiquitous.
Examples: SaaS CRM is a good corporate case, which in my experience made its way into many workplaces as a credit card purchase by the sales director. Per-seat licensing meant they could get their teams up and running on Salesforce or similar without going through a big procurement exercise.
The IoT explosion is another one. At Republic of Things we reckon you can connect just about anything to the internet for £3 using off-the-shelf parts. Much less if you’re producing at scale.
Accessibility

If a technology is wildly complex, or even dangerous, to deploy, then its applications are going to be limited, for reasons similar to cost: if fewer people can play with the technology, fewer experiments are going to happen and fewer niches are going to be found.
Rule of Thumb: If you need a PhD (or at the least a CompSci/Engineering degree) to get to grips with a piece of tech, it isn’t ubiquitous. If a passionate amateur, or even a school kid, can learn the fundamentals in a few hours, or copy and paste together something that works, it’s on its way.
Examples: The web is the classic case: anyone can build and deploy a functioning web page in a matter of minutes.
From a hardware perspective, the Arduino team has done an incredible job of making microcontrollers easily accessible to the masses.
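To make that bar concrete, here is a sketch of just how little is involved on the web side: a complete, valid web page written out with nothing but stock Python, ready to serve locally with the standard library’s built-in server. The filename is illustrative; the point is that a passionate amateur can copy, paste, and have something working in minutes.

```python
# A minimal, self-contained illustration of the accessibility bar:
# generate a complete web page and note the one-line command to serve it.
from pathlib import Path

page = """<!DOCTYPE html>
<html>
  <head><title>Hello, ubiquity</title></head>
  <body><h1>It works!</h1></body>
</html>
"""

# Write the page to disk (illustrative filename).
Path("index.html").write_text(page)

# Serve it on http://localhost:8000 with:  python -m http.server 8000
print(Path("index.html").read_text().splitlines()[0])
```

That’s the whole deployment story for a first experiment: a text file and a built-in server, no degree required.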
Performance

A technology is never going to be ubiquitous, however cheap and accessible it may be, if it just doesn’t deliver reliably. In other words, price and accessibility aren’t absolute; they form part of a value ratio against performance.
Rule of Thumb: A technology has to be surprisingly capable for the cost and deliver with sufficient reliability to be trusted in a real application.
Examples: Consumer-grade 3D printing fails the Performance test right now. Though cost and accessibility have improved dramatically in recent years, the results just aren’t reliable enough for the mass market.
Compatibility

We live in an age of networked technologies. Nothing works in isolation. For a technology to be truly ubiquitous it has to interact easily with our existing systems and services.
Rule of Thumb: Standard and open? Good. Proprietary and closed? Bad. Unless that proprietary ecosystem is so large that it can create an impression of wider openness inside its boundaries (hello Apple).
Examples: I’m currently watching closed, proprietary systems inside client organisations being swept away by open, standards-based and web-based alternatives.
Right now it is closed-ness that is preventing home automation technologies from becoming truly ubiquitous.
So, pick a technology and run it through the test. Is it ubiquitous or does it fall at one of these four hurdles?