For a lot of my futurist career, blogging has been a major outlet. My posts are less frequent these days but occasionally I still use a blog post to organise my thoughts.

The archive of posts on this site has been somewhat condensed and edited, not always deliberately. This blog started all the way back in 2006, when working full time as a futurist was still a distant dream, and at one point numbered nearly 700 posts. There have been attempts to reduce replication, trim out some weaker posts, and tell more complete stories, but also some losses through multiple site moves: it has been hosted on Blogger, WordPress, Medium, and now Squarespace. The result is that the dates and metadata on posts may not be accurate, and many may be missing their original images.

You can search all of my posts through the search box, or click through some of the relevant categories. Purists can search my more complete archive here.

Future of Humanity

All the world's a stadium and all the men and women merely gamers

What does it mean to be a winner in the game of life today? And how much control can we exert over the rules?

The rules of the game of life are changing constantly. It takes strength to define your own rules. But as cultures fracture, we might need to find that strength.

"All the world is a stadium and all the men and women merely gamers."

Sorry, Will.

With access to the real world restricted, I have spent a lot of time in lockdown playing in virtual worlds. I'm not alone. Spending on gaming in the UK leapt £1.6bn between 2019 and 2020, to £7bn. Game-related internet traffic in the US jumped 75% in just one week in March 2020. Oculus's Quest 2 - a device I have spent a lot of time with - has been a breakout VR success.

When we talk about people spending even more time gaming and on screens, it doesn't conjure up the healthiest picture. And yet there is growing evidence that games - yes, even ones on screens - do us good. Games are the new playground for the young, the place where they don't just test themselves but interact with others. Games appear to have helped some people deal with lockdown better than their non-gaming counterparts.

These, though, are the games that we play consciously: some of the many draws on our time, of which we now have a surfeit of choice. They are a big focus for me right now - a client has me thinking about the future of gaming - but I'm also very interested in the games we don't choose to play.

The game of life

"Be a winner at the game of life,get married, have a baby!"https://www.youtube.com/watch?v=hAvwGtRY7RAYou have to be of a particular vintage to remember that little ditty. In MB's defence, at least they showed the young girl getting the job and making the big bucks, and the boy holding the baby. That was seriously progressive for children's toys in the 1980s. Some would argue we've gone backwards since then.The reason this particular advert came to mind was because of some work I'm planning for an FMCG company. They're interested in different life stages and I wheeled out my idea about extended adolescence. If being a winner at the game of life means getting married and having a baby, then it is taking us all longer and longer to be winners, as average ages at marriage and first child continue to rise. What about all the people who don't do those things? Are they losers? And what are people doing throughout their twenties? Just playing?I'm being somewhat facetious: the rules of a 40-year old game aren't really the rules of life. But, it does illustrate how much the metrics by which we judge ourselves and others have shifted in that period. And how much they haven't. While some might be open minded about the time shift that has taken place in the key milestones of adult life, those milestones are still ranked above much else in the eyes of popular culture. What are you unless you have a good job, a home and a partner? And of course, kids. It's not easy to challenge expectations around any one of those, let alone two, three or four of them.Rather than removing these expectations, it seems we have just deferred them. And added in new ones to fill in the gaps. Social media has become the venue where we demonstrate our worth before the big milestones of life. Our holidays, our cocktails (guilty), our thigh gaps (not guilty). What we show and how many people see us do it are now the points we seek to acquire in the game of life.

Measuring impact

I have become a little too obsessed with online perceptions of me in recent years. There is a professional lens on this, as I have been chasing profile to win work. But it's hard for me to separate my professional and personal lives online, since I am, to a large extent, the product that I sell. My search ranking, social media profile, and newsletter subscribers have been monthly metrics that I've tracked for the last three years. Only recently have I begun to question these metrics.

The first note of caution came in January 2021, following my first truly viral tweet:

https://twitter.com/bookofthefuture/status/1346916905500639232

It wasn't really relevant to my work, but it took off more than anything I had tweeted about the future. It was thrilling watching the likes and retweets rack up. I thought about trying more tweets like it to attract that level of attention. But then I looked at the impact: had it driven more followers? A big uptick in web traffic? No. I looked at the profiles of other people with similarly viral tweets. Had they gained thousands of followers? Mostly they were still small accounts.

Was the effort of trying to write more such tweets worth it? Probably not for the business.

My scepticism increased over the course of the year. This, I'm pleased to say, has been an incredible year from a new business perspective. Both the scale of the projects and the nature of the brands I'm working with have been remarkable. As has the total number of enquiries. What's driving it? Surely those metrics I've been tracking must correlate?

Nope. All those metrics were pretty steady. I don't have all the data yet, but my sense is that this additional work is not coming from social media but from much more old-fashioned routes: primarily, word of mouth.

Stepping back

My kids have been complaining for some time that I spend too much time on Twitter. My excuse is always that it's important for work. Suddenly, I couldn't make that argument with any honesty. So last week, I took it off my phone. And what a release that has been.

I've regained probably twenty to thirty minutes each day that I had been spending doomscrolling. I've dodged so many comments and opinions that would have made me furious.

Downsides? I'm less well informed about day-to-day events: Twitter had become my primary news source. But I am less distracted from deep research on the things that matter.

Social credit

I'm not the first to do this by any stretch of the imagination. If anything, I am very late to the party of people stepping back from being 'very online'. And I'm not saying I won't go back to Twitter on my phone (I still log in periodically on my laptop, but there, for some reason, the doomscroll temptation is less). I'm not immune to concerns about my social profile declining. I have an ego and it needs feeding.

I am struck, though, by our willingness to submit ourselves to the evaluation of others through social media. Especially when, in other parts of the world, we already have examples of such profiles feeding into a more formal system of social credit.

China's social credit system is not, yet, the terrifying digital panopticon of some headlines. But it is headed in that direction: an aggregation of data across social, payment, and governmental systems to benchmark the behaviour of citizens. It's easy to see it expanding out into the physical world, with algorithms processing camera data for small infractions: bad driving, inconsiderate parking, or jaywalking. Where here we enforce behaviours, and to an extent participation, on each other through social pressure and the hunger of our own egos, there both the pressures and the consequences look set to be much worse.

Making up the rules

Coming back to where this post started: games. There are the games we play consciously, and the games of life. The latter are becoming more game-like all the time. As we, as a group, get richer and focus less on survival and more on self-actualisation, so the rewards we seek are less physical (food, shelter) and more ephemeral (success and social approbation). The achievement of these things is increasingly 'gamified' through the application of psychological understanding to 'nudge' us into different behaviours - often using very game-like rewards. This is done by governments and companies alike. But we also do it to ourselves, subscribing to an ever-changing set of rules, created and managed by the shared consciousness of culture and society.

Often these rules will change more slowly than our reality. It takes time for change to permeate society. So there will be a lot of tension between the rules and our desire to follow them, and the reality of what is best for us. Making your own rules, and defending them to the world, takes a lot of strength. As does challenging the rules imposed by others, whether it's society or the state.

I think this is a strength that many of us are going to need, particularly as our cultures fracture and become more diverse.


When the flame goes out

Kids are bemused by old technologies that were critical to us but which have now disappeared, leaving only echoes. What technologies are likely to disappear in the near future? Here's one suggestion, explored in fiction.

A bit of a different approach this week. I thought I'd try my hand at a little future fiction. A short story. It's a bit 'on the nose', but I think you'll get the idea.

This story was inspired by those questions we all get from our kids about old technology: "What's a record?", "Why is the save button that funny square?" and so on. It got me thinking: what abandoned technology might kids be curious about in the near future?

##

The pop of the cork, then the wail of the smoke alarm. The two sounds followed one another so closely, I felt it must be cause and effect. It took me a few seconds to realise that my wine hadn't triggered the alarm. Nor was dinner burning. I hadn't got past chopping vegetables.

Instead, the noise was coming from upstairs. That meant the kids.

I dropped the bottle on the surface, and started to turn. Then I instinctively turned back when I saw it wobbling in the corner of my eye. Having saved the wine, I raced up the stairs seeking the source of the smoke.

What I found was just wisps. Barely visible. And my eldest daughter, standing on the landing, hands clamped over her ears, looking shocked and sheepish. Her younger sister hadn't even opened her door.

Satisfied we weren't in imminent danger, I turned my attention to the alarm. "Move out of the way," I shouted over the noise. I grabbed a child-sized chair from the eldest's bedroom, giving me just enough extra height to reach the reset button.

The silence brought relief. The adrenaline quickly started to subside. As quiet returned, the 8-year-old removed her hands from her ears.

"Do you know where the smoke came from?" I asked.

She instantly dissolved into tears. "I'm sorry!" she pleaded.

I pulled her into a hug and settled her. "It's OK. I just need to know what happened. You're not in trouble." She continued to sob, albeit less energetically.

"Did one of your farts set your room on fire?"

"DAAAD!!" she shouted. But she couldn't help herself. She laughed. And the tears stopped.

She wiped her face on her sleeve. "Come and see," she said.

I followed her into her room.

There, on the floor, was a log. More of a small branch, really. Two feet long and about the same diameter as her arm, but twisted and gnarled. It was old, bark-less and so dry it was almost white. She had brought it back from a walk a few weeks earlier and insisted on keeping it, lugging it up the stairs and nearly taking a number of pictures down along the way.

One of the hollows in the log was dark and sooty. Next to it lay a straight stick, around which was wrapped the string of a plastic kids' bow. The sort that fires darts with large suckers on the end. The straight stick was also blackened at one end.

I was stunned. I felt anger rising. But I was also curious. And impressed. I knew what she had been doing. I had learned about this trick on a survivalist YouTube channel as a kid. The bow allows you to spin the stick to create friction against the larger piece, starting a fire. I had tried this trick many times and barely got the stick warm before I gave up, arm aching. Without the interruption of the alarm, she would have succeeded where I failed.

"Were you trying to start a fire?"

"I just wanted to see what it looked like!"

This stunned me again. I was speechless as I tried to turn back through eight years of memories. Surely she had seen fire?

Slowly I ticked off the many ways she might have experienced it.

No gas hob. That had gone in the early thirties when we refurbished the kitchen.
All domestic gas had been phased out a few years earlier, and it didn't make much sense to cling to a dead technology.

We had a wood burner in the living room. But those had been all but outlawed in the twenties. Actually, the laws banning them didn't come in until the thirties, but you risked social censure from the local clean air campaigners if they saw smoke rising from your rooftop. And the stats weren't good on what they did to people inside the house either. So the only light in ours now came from a big string of fairy lights stuffed inside the cast iron casing.

But she must have seen fire somewhere, surely?

Bonfires? Maybe not. Bonfire night had increasingly been 'fireworks night', until those were banned after a series of accidents, including the high-profile injuries to a popular influencer. With drone displays, laser shows and holograms taking over from the fireworks as the main attraction, people just stopped building bonfires, either at home or for big displays. The risk - and the insurance - just wasn't worth it. I'm sure we went and saw a real bonfire when she was young. But maybe she was only two? She wouldn't remember now.

Candles then. Surely she must have seen birthday candles? But no, I realised. We'd had the same set of LED ones for probably ten years now, with their archaic little USB charger.

No-one smoked actual cigarettes any more. At least, no-one we knew. And the clean air rules meant no-one burned garden waste. Not here in the city.

So no. At eight years old, our daughter had probably never seen a flame. Not that she could recall.

"Dad?"

I snapped out of my reverie.

"Do you still want to see fire?"

"Won't the alarm go off again?"

"Not in here!! Please tell me you will never, ever try to start a fire in here again. There's a reason we have alarms for that. It's incredibly dangerous."

"Okay, okay!"

She looked like she would cry again.

"But I think you should see fire. Come outside?"

"But it's bedtime!"

"That didn't stop you trying to burn the house down."

At this point the younger child poked her head around the door. A smoke alarm couldn't pull her out of her cosy bed, but the promise of being up after bedtime was clearly too good to miss.

The three of us headed downstairs and put on shoes and coats. I went down into the cellar and found my father's old blowtorch - still, miraculously, with a little propane in the tank. I wasn't messing around with rubbing sticks together.

We headed out to the back yard and I placed the gnarled log on an old paving slab. The girls squealed and recoiled as the blowtorch ignited. As they looked on, I pointed the torch at the log. Under this assault it burst rapidly into flames.

The three of us stood there, transfixed and silent, as it was consumed.

Future of Humanity

A digital life after death

Microsoft has filed a patent to turn someone's digital personality into a chatbot. Is this our first attempt at resurrection?

It is perhaps not surprising that during a pandemic we find ourselves thinking more than usual about death and what lies beyond. So the news that Microsoft has filed to patent the process of building chatbots from dead people's social media histories seems somewhat timely.

Microsoft's patent covers chatbots built from anyone's digital history, not just the dead. But it is there that your mind immediately goes. Especially if you have been reading Neal Stephenson's latest novel, Fall, as I have. Fall is about the creation of a virtual world into which human bodies can be scanned at the point of death. The newly created 'souls' retain some aspects of their personality, albeit not their full memories.

We can rebuild him

Microsoft's proposal is to delve through the digital archives of an individual and recreate their personality in digital form. The system would apparently draw on “images, voice data, social media posts and electronic messages” to build a profile. It might even use a ‘voice font’ assembled from recordings to make it sound like them, or recreate their image in 2D or 3D. Of course, with current levels of technology, we can't actually replicate human thought processes or capabilities. But call centre systems can already assemble original conversations from stores of data. A conversation with a chatbot such as Microsoft proposes may seem fairly true to the original. It may even be able to say completely original things, if it can process news media through the lens of what it understands about a person's views.
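The patent doesn't publish an algorithm, but one crude way to imagine the mechanics is retrieval from an archive: index what someone actually said, then answer a new prompt with the archived reply whose original context looks most similar. The sketch below is a toy illustration only - the sample messages are invented, and a real system would layer generative models and voice synthesis on top of something like this.

```python
# Minimal sketch of a retrieval-style "memorial chatbot".
# The ARCHIVE data is entirely invented for illustration.
from collections import Counter
import math

# (prompt_context, reply) pairs mined from someone's digital history
ARCHIVE = [
    ("how was your holiday", "Wonderful, we spent the whole week by the sea."),
    ("what do you think about the weather", "Grey again! Classic British summer."),
    ("any plans for the weekend", "Long walk, roast dinner, maybe a film."),
]

def vectorise(text: str) -> Counter:
    # Bag-of-words representation of a message
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two word-count vectors
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def reply(prompt: str) -> str:
    # Answer with the archived reply whose original context best matches
    best = max(ARCHIVE, key=lambda pair: cosine(vectorise(prompt), vectorise(pair[0])))
    return best[1]

print(reply("what are your plans this weekend"))
# -> "Long walk, roast dinner, maybe a film."
```

Even this toy shows why such a bot might feel "fairly true to the original": it only ever speaks in the person's own recorded words. Saying genuinely new things, as the patent envisages, requires a generative layer trained on the same archive.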

Do not resuscitate

For me, the main problem with this is consent. Do you want a digital puppet based on your personality existing in the world after you are gone? Could you stop someone creating one if you wanted to? After all, many of us already have sufficient digital footprints to support the creation of basic deepfakes, so it's not a massive leap to think someone could create a virtual clone of us today without our consent. The only reason I have a blue tick on Twitter is that someone - maliciously - created a digital identity pretending to be me, so some people are clearly motivated to do this. Imagine if that digital clone had been autonomous rather than human-controlled. Imagine if hundreds of virtual Toms could have been spawned, a new one each time one was shut down.

These are extremes. But we are already having to face issues of consent around digital resurrection. From holographic performances by Tupac or Elvis, to Kanye's holographic gift to Kim Kardashian: a speaking representation of her father with a message from beyond the grave. Who has the right to resurrect us?

The big conversations

These are questions to which we don't have answers today. Like so many technologies, this possibility creates questions for society about ethics, etiquette and law. And as is so often the case, we feel ill-equipped to address the range of questions at the speed required. Facebook may have abandoned its mantra of 'move fast and break things', but our approach to dealing with new technologies remains to break things first, then work out how to fix them.

The prospect of a digital afterlife in one form or another is already moving from science fiction to reality. Whether you want to live forever, or want to ensure that your end is a true end, it may be time to update your will.

Future of Humanity

The brain in the jar

The brain in the jar is a staple of science fiction: minds that can be dropped into new bodies or digital environments. But it's wrong.

I took a course this year which included large elements of philosophy. It is not something I have ever studied formally, unless you count reading Sophie's World as a teenager, or dozing off to In Our Time. As a result, I found it pretty challenging at times.

Many of the ideas we discussed were sort of familiar, like songs you've only heard playing in a shop, or in the back of a cab, but never really listened to. Once you sit down and listen, you discover the depth of the orchestration, or that the lyrics weren't at all what you thought they were. It challenged some of my deeply held (though little-considered) ideas about reason. And particularly about transhumanism.

A science fiction education

The idea that we can supersede human biology, our minds escaping the limitations of the flesh, is an old and important one in science fiction. And I have devoured science fiction for as long as I can remember, from Terrahawks and Star Wars, to Iain M Banks and Charles Stross. So many times in these stories we see the separation of mind and body: humans continuing their lives with their brains transplanted into alternate bodies, or their minds ported into the digital realm. It's an appealing idea in many ways. Digital superpowers and immortality, rolled into one.

The brain in a jar is an idea with very old roots. It is a modern expression of Cartesian dualism, where the consciousness and its container are two separate and entirely divisible things - a foundational myth of religions and ghost stories long before the first science fiction.

It is also a very important idea to many in Silicon Valley, where transhumanist ideals have very much taken root.

It's an idea I had always bought into. Until recently.

Software/Hardware

It turns out that the human mind is not like a piece of software that can be run on a different piece of hardware. Instead, the software and the hardware are deeply entwined. Over the last few decades, a variety of studies((This blog post is worth a read to start with: https://blogs.scientificamerican.com/guest-blog/a-brief-guide-to-embodied-cognition-why-you-are-not-your-brain/)) have shown that our whole mental model of the world is shaped by our physicality. The exact relationships are still to be fully understood, but the conception of the mind as driver and body as vehicle seems to be fundamentally incorrect. We think with our bodies, not just our brains: from the sensory signals in our skin, to the chemicals washing through us from our glands, to the spatial model created by our senses. The picture is complex, but my reading of the evidence so far is that you cannot have a mind distinct from the body.

Emulating the body

Could we recreate all of the biological elements that make us up, as we do when we build emulators to run old software? Maybe. But the point remains that it is much more complicated than we have imagined: you can't simply transplant or digitise the brain. You have to digitise the whole embodied experience, and then put it into a virtual environment that can present information in a way that the embodied human can interpret. And you have to get it right: if you make fundamental changes to the body, you are changing the person.

This would rather mess with transhumanist ideas of immortality: something might live on, but will it be you?

A political dimension

There is a clear political aspect to belief in this division of mind and body. It devalues the physical, turning our bodies - and indeed, everyone else's - into resources. This is why feminists frequently((Frequently, but not by any means always. Feminism is an incredibly broad field with a huge range of different perspectives, as you might expect from a philosophical discipline and political campaign representing half the population. Feminism also has a strong transhumanist tradition - for example, Donna Haraway's Cyborg Manifesto: https://warwick.ac.uk/fac/arts/english/currentstudents/undergraduate/modules/fictionnownarrativemediaandtheoryinthe21stcentury/manifestly_haraway_----_a_cyborg_manifesto_science_technology_and_socialist-feminism_in_the_....pdf)) have an issue with this mode of thought, since ending the treatment of women as reproductive resources is at the very core of the feminist movement.

By divorcing us from our bodies, the transhumanist ideology also questions the link between us and our planet. If our consciousness can be immortal, we can be more cavalier with our environment.((https://www.greeneuropeanjournal.eu/an-eco-social-perspective-on-transhumanism/))

I don't mean to say that all transhumanists are patriarchal hyper-capitalists (see the footnote on feminism and transhumanism, for a start). But there is undoubtedly that streak. Indeed, one of the things I like so much about some science fiction is that it doesn't ignore this. Whether it is Battle Angel Alita or Altered Carbon, these stories show that even if we could transplant our brains or minds into new forms, the society that this technology creates wouldn't necessarily be a kind or egalitarian one.

We're still human

Throughout the pandemic I have been briefing clients and audiences on its likely ramifications. And I keep getting the same piece of feedback: people really like hearing about the human traits that persist, and even dominate, in the face of change and disruption. In many ways, the pandemic has reminded us just how human and fragile we still are. How bound to our bodies, and to our environment. Despite all the digital luxuries of our age, we have been forcibly reminded that there is no substitute for pubs and hugs, colleagues and kisses. And for the most part, we have shown real humanity. One Public Health England study showed that nearly two-thirds of us checked on our neighbours, and over a third of us shopped for neighbours in need.

We are still human. Ours is still an embodied experience. We still need our planet and the people around us. However powerful the potential of technologies to take us beyond our limitations, we need to remember that.

Future of Humanity

We're still human

Lockdown is excluding us from tactile experiences. Any answer to this and future pandemics cannot divorce our digital consciousness from physical interaction.

There is a trope in science fiction that sees the human consciousness divorced from the physical form. Our essence is extracted and allowed to roam free across the net, or re-embodied in a robot form. It's a fascinating but flawed idea.

So much about who we are is connected to our physical form. Our identities are not just software that can run on any hardware. Our brains, isolated from the chemical and electrical processing and information streams of the rest of our bodies, would no longer hold the same person.

The separation of body and mind is such a cliche of science fiction - and fantasy, and spirituality before that - that I think we often forget this. We think we are the voice in our own heads, and that that voice can be freed from the shackles of humanity and exist without the experiences and infrastructure of life around it.

The bandwidth of being there

I think this is part of the reason behind the current over-confidence in the possibilities for remote working. And the recognition of it is behind some of the backlash.

There is something different about being there, in person, with all of your senses engaged. It's what I called, a few years ago, 'the unbeatable bandwidth of being there'. What gets transmitted and received through the screen and headset, mediated by a million miles of fibre optic cable, is not the full experience of meeting.

Nor does it allow for all the things that happen around those meetings. I've talked at length about the need for peer support, the subtler parts of staff training, and the mutual inspiration that happens when you're sharing a physical space. But what about all the other stuff?

Human behaviour

In light of (ongoing) harassment at work, office romance is a complex topic, and one that many HR departments would probably like to ignore. Nonetheless, nearly a fifth of couples in the UK met their partners through work. Romance (almost) inevitably leads to sex, and this is something that also isn't going away.

Sex is just one of the many human experiences for which there is not, and will not be, a digital replacement in the foreseeable future. Yes, there is all sorts of sextech. But that's no more a replacement for human contact than a postcard is for a walk on the beach.

When we are thinking about the future, particularly in light of the ongoing lockdown, we need to remember this.

Christmas cheer

For me, the starkest loss on the horizon is that of Christmas. I realise this puts me in the deeply privileged camp, compared to those who have lost friends and relatives, or who are facing the next few months without work. Or compared to those who have been isolated alone for much of the last six months.

Christmas is a big thing for me. I don't see my parents and sibling that frequently. There's a couple of hours' travel between us and we all lead busy lives. But at Christmas we all get together. Like most families, it's a time for feasting and drinking, lots of chat and plenty of hugs.

The prospect that we might not be able to do that this year has really brought home just how vital that period is to me. And how pale a digital chat is in comparison.

Live culture

Christmas, eating out, live music, dating, competitive sport, going to the pub, school sports day. These things are important. They are core to the human experience. Whatever we do to tackle this virus, and the next virus, we need to remember that. We can give them up for a while, but eventually we need them back in our lives. Without them, we are like the brain removed from the body. Conscious, perhaps. But not human.

Future of Humanity

Hypertribalism

Low-friction global communication has enabled our fracture into tiny tribes, each with strong views, tight borders, and fierce opposition.

After another week of debate about 'culture wars' and 'cancel culture', I decided to write something about it this morning. Then I realised I already had.

I wrote most of the post below back in January, but never quite finished it. It's very much a provocation, without too much evidence behind it. But reading through it today, it certainly felt like an accurate representation of what I have continued to witness this year in politics and culture.

By the way, I note that the term 'hypertribalism' has been used by a few different people, from conspiracy-theorist forum posts to Catholic ministers. But I use it here in a slightly different sense.

##

High-frequency change has disrupted the foundations of our identity. These foundations include familial political affiliations, sporting allegiances, religious affiliations, and shared economic and cultural experiences. These things persisted between generations and across age cohorts in the same communities for decades. But now they are being disrupted by the explosion of choice, the more global communities created by always-on digital communication, and the globalised supply chain for media, products and politics.

These shared foundations of identity were what connected us to a sense of place, and what bonded us into coherent movements. Into tribes. In the digital age, our sense of place is undermined, and old tribes are fractured. We share less with our geographic and historical peers than we did in the past, whether those peers are family members, school friends, or colleagues.

What has replaced these old tribes is a new hypertribalism. I would characterise hypertribalism as having three traits:

  1. As tribes get smaller, the adherence to core tenets gets stronger. Any diversion from those tenets sees rapid expulsion
  2. 'Opposing' tribes and their members are demonised for even slight divergence from the tribes' core tenets
  3. Leaders, structures, and sometimes even principles, are transient

You could argue that these are all traits of tribes throughout history. But I use the term hypertribalism to highlight the fact that each one of these core traits is amped up in the current age, with the extreme reach and accessibility of global communications platforms the primary catalyst for this change.

Adherence to core tenets

"One Trot faction, sitting in a hall,One Trot faction, sitting in a hall,And if one Trot faction, should have a nasty squall,There'll be two Trot factions, sitting in a hall."This rhyme is related in Christopher Brookmyre's (excellent) Country of the Blind, but I remember it first from my time in student politics in the late 90s. Trot, for the uninitiated, is short for Trotskyist/Trotskyite, an adherent of Leon Trotsky's branch of Marxism. The story the rhyme tells is that of the endless rifts in the political left, particularly in the emotion-drenched realms of student politics, over issues of principle.Today those rifts are more evident than ever, on both ends of the political spectrum. On the left, it is not just Corbynites vs Blairites, but fractional groups aligning around different priorities, whether it is the achievement of power, the rolling back of austerity, or the rejection (or pursuit) of one of the many possible forms of Brexit. On the right, the rise and subsequent implosion of UKIP and then the Brexit Party has torn apart the broad church of conservatism, leaving a loose and deeply unhappy coalition of europhile moderates, disenfranchised working classes, hedge fund managers, and frankly, racists.Though Boris Johnson successfully attracted enough of this coalition to his cause (namely, his own power) in the recent general election, this base feels highly unstable and very open to disruption, either by a resurgent and more appealing Labour or just as likely by a new force on the right. As Moises Naim said a few years ago, power is now harder to win, harder to use, and easier to lose.While each faction defines itself by hard adherence to some key tenets, and rejects anyone who does not share that adherence, the fracturing of each group into smaller groups will continue. This is a phenomenon particularly amplified by social media, where it is hard to express nuanced views and anything except full-blooded commitment to the cause is often met with opprobrium. Given the driver for the existence of tribes - as much about our need to belong as any real connection to a cause - people are incentivised to keep their views blunt in order to secure social approval.

Demonisation of others

Tribes have always defined themselves in opposition to others. They are as much about what they are not as about what they are. As tribes fracture and become smaller, and their differences appear - to the outsider, at least - smaller too, so each tribe has to express its differences more strongly. Particularly by highlighting the apparent failings of the other. Five minutes on Twitter and you will see incredible levels of hate directed between groups who might be expected to be natural allies, were there not a single issue of principle separating them.

This phenomenon can be incredibly damaging to those who become the target of a particular group. Particularly when they were part of that group but have been ousted from it for some apparent breach of its rules. But it reaches its most disturbing peak in the use of language like 'traitors' and 'enemies of the people' by national media.

Unstable leadership and structures

What the various interests coalesced around the Brexit agenda have shown over the last few years is that a complete lack of structure and stability is no longer an obstacle to achieving your ends. Whether it is the 'strong and stable' party of power, or the endlessly reforming and leader-shifting UKIP/Brexit Party, if your narrative connects with the public it can continue in spite of the machinery failing.

This power of a story to exist beyond its teller feels like it too has been augmented in this age of low-friction communication. Stories - and conspiracy theories, for that matter - can rapidly take on a life of their own.

Of course, this instability is apparent in organisations with a less successful story as well. Anyone remember TIG?

###

This is as far as I got with the post (in fact the last sentence is a new addition, just to round out that last point). But I think it gets the key points across. And it raises some important questions. Is this a new or growing phenomenon? Or is this just another case of a digital age observer seeing age-old patterns of behaviour through a new medium? I am more convinced now than I was in January that this is not necessarily something new, but something that has been incredibly amplified by the communications capabilities unique to this digital age.

I have written before - here in 2017 and here in 2018 - about my concerns about the way that technology can fragment society. This post is really just documenting that effect and its emergent consequences in a little more detail. But these facets, or symptoms, of the phenomenon seem important if we believe it is something that needs tackling.

I certainly think there is an argument for tackling the bullying elements of this phenomenon. And the way that it allows, or even encourages, the spread and use of - even deep belief in - false information. These things do serious harm at both the individual and the societal scale.

But I also see this commons as a positive and powerful thing. A place where ideas can be shared, debated, demolished or enhanced. This is a society working out its differences primarily through language, not violence.

How we keep these positives while addressing the negatives has been a big debate for the last few years. Where does responsibility lie? With the social networks? Police? Or society? Personally, I don't think we can or should ask corporations to police our speech. But they can give society the tools to do so, whether that's fact checking, abuse blocking, or providing information to law enforcement - subject to appropriate judicial oversight.

The rest, I think, remains up to us. The improvement of this commons is only likely to come from the commons itself.

Future of Humanity

A race between the four horsemen

Four horsemen of disaster are vying to define our next three decades. Which one lands its blows first will determine our future.

In a recent post for Locus Magazine, Cory Doctorow laid out his scepticism about general AI in a piece entitled 'Full Employment'. He argued that there is no sign that a general AI - one that can replicate human adaptability in tasks - is on the horizon. And that the work required to address climate change is so great that we are much more likely to see full employment than the AI-driven unemployment that many have predicted.

I disagree with Doctorow's analysis of AI. Right now, I don't believe that we are close to a general AI. I am more open-minded than Doctorow about the idea that current AI systems have the capability to 'evolve' into something more generally capable, but the gap remains large.

My criticism is that I just don't think AI has to be very sophisticated in order to replace humans in the workplace. It's an argument that I have made many times on this blog, so I won't repeat it in too much detail here. Suffice to say that if you break any job down into its component tasks, today's machines are eminently capable of handling many of them. If you accept that machines take work - tasks, rather than jobs - then you can see that the remaining work can be redistributed among a smaller number of humans (there's a toy illustration of this arithmetic below).

Where I don't disagree with Doctorow is on the scale of the challenge presented by climate change. I have little doubt that large portions of humanity will be involved with the mitigation response. But the idea that this will offset any job losses due to automation brings me back to one of the most difficult parts of futurism: seeing not what, but when.
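Here is that tasks-not-jobs arithmetic as a toy model. The ten-person team and 30% automation share are invented numbers, purely for illustration; the point is the mechanism, not a forecast.

```python
# Toy model (invented numbers): machines absorb tasks, not whole jobs,
# and the human work that remains is redistributed among fewer people.
team_size = 10           # people currently doing a given job
automated_share = 0.3    # fraction of the job's task-hours machines take on

remaining_roles = team_size * (1 - automated_share)
print(f"Remaining task-hours support ~{remaining_roles:.0f} full-time roles")
# -> ~7 roles. No single job was 'automated away', yet three people are surplus.
```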

Four horsemen

Even before the pandemic, I was concerned about our prospects for the next 20-30 years. While it's not quite the apocalypse, there are four modern horsemen of disaster racing to cause us problems.

  • Climate: In this period, directly or indirectly, climate change will start to affect the more temperate regions: changes in weather patterns, disruption to agriculture, sea level rises. Until now, climate change has been something most people could ignore, should they so choose. That choice is going away in the next few decades.
  • Technology: The prospect of technological disruption to employment and the economy is another major issue. Whether you want to generously call it AI, or prefer the perhaps more accurate 'machine learning and robotics', there is the potential for swathes of workers to be displaced by machines in the next three decades, from administrative and customer service roles to logistics and manual work.
  • Politics: We are in a rancorous period of global relations. Violence so far has been primarily inside borders rather than between them. But our international trading relationships are collapsing and our diplomatic ties are being strained. And domestic leaders in many countries seem to be incompetent, mad, corrupt, vicious, or some combination of these.
  • Disease: The latest addition to the line-up is the global pandemic, spreading effortlessly through our international connections, strained as they are. It's unlikely to end quickly, and we are likely to see more of its type.

The horsemen analogy falls down when it comes to timing. This isn't about which of these potential challenges will win a race to reach us. All four are here already. The question is the speed and scale at which their effects will be felt.

A race to the finish

Doctorow might be right. Our climate mitigation efforts might start well before we adopt robotics and ML technologies to a level that severely disrupts the labour market. Or he might not. The scale of job losses in the retail sector right now is pretty dramatic. We could attribute these to the pandemic, but really this is just the acceleration of a trend towards automation and self-service that has been rolling for years. The pandemic may accelerate the adoption of automation technologies in the retail supply chain and logistics. It might also accelerate their adoption in other fields - administration, customer service, finance, law... Once people are out of the office, perhaps we will be less squeamish about replacing them with machines?

Even if you ignore the technological effects, the pandemic has clearly had a terrible effect on our economy. Many are bracing themselves for job losses in the coming months. During lockdown, almost 150,000 people have been made redundant and over 9m have been furloughed. This doesn't even include the many self-employed who sit outside the support schemes, or who may not be counted as having lost their jobs despite their income having collapsed. Full employment feels like a long way from here.

This is especially true in the current turbulent political environment, where it is hard to see coordinated efforts to restore global prosperity. Or, for that matter, a coherent effort to address climate change. If we were to start that process now, I can see it creating an enormous number of jobs that might redress the losses currently being experienced. But it feels more likely to me that these efforts won't start until the effects really begin to bite. That is the nature of our politics right now: always focused on today, not tomorrow.

In the meantime, it is going to be a difficult few years, whichever of the horsemen is leading the race.

Future of Humanity

The population implosion

A new report warns of a population implosion by the middle of this century. What does this mean for humanity and how should we respond?

The human race is facing a population implosion, faster and sooner than we previously understood. What does this mean for us?

Every year, the UN updates its forecasts for the global population. In 2019, the median prediction saw us hitting a peak of around 11 billion humans at the end of this century, before our numbers start to decline.

11 billion is a lot of humans. The planet could easily support that many, if we all adopted certain lifestyles and policies. But not if everyone wants to live like people do in Britain or America. And there's a good argument that says, "Why shouldn't they?" After all, we have spent decades squandering the planet's resources to feed ourselves and our economies. So 11 billion people on the planet was going to be tough: not only would it accelerate climate change, but feeding that many people would be made harder by climate change's effects. At least we could see the population peaking, though. And we could begin to plan for its decline.

Shrink to save the planet

The problem with a declining population is that, global disasters aside, it generally means a fall in the birth rate. That means that the population is ageing as it shrinks. Which in turn means fewer young and working-age people are available to support the older members of the population, either through taxes or through direct support.

Nonetheless, given the pressures of climate change, and the reasons *why* the population was peaking - rising global wealth and the emancipation and education of women - it was clearly a good thing. And while some countries where the birth rate is already low were starting to struggle with an ageing population, at a global scale it was something we had decades to learn how to deal with.

Then came a new report.

Population implosion

The report, from the University of Washington's Institute for Health Metrics and Evaluation, shows the global population peaking lower and sooner. Instead of 2100, it will peak in 2064. And instead of 11 billion, it will peak at 9.7 billion. The global fertility rate will be down to 1.7 by the end of the century.

What does this mean? Let's start with the good news. It means women around the world have more power and control over reproduction. And while this is probably too little, too late to have any real impact on catastrophic climate change, it will reduce the scale of the mitigation challenge - a rather euphemistic way of talking about the feeding and rehousing of millions of people.

Bad news? This population implosion is happening much faster than we thought at a global scale. Here in the UK the effects aren't predicted to be that dramatic in terms of total population, but some countries, like Italy, Spain and Portugal, are predicted to see their populations halve by 2100. This will see a collapse in the tax base and workforce, while the cost of caring for an ageing population rises and rises. And it starts now.

Policy response

So what do we do? The first response to this population implosion must be to shore up programmes that support women's education, work, and reproductive rights. As the economic consequences of this population decline become clear, there are bound to be those whose solution is to drive women back to a role as mother and homemaker. Setting the policy tone now, by addressing the remaining imbalances, will make the coming battles much easier.

Then we need to look at relatively short-term measures to address our ageing population. Immigration is the most obvious solution, politically unpopular as it is in many places right now. Populations in India and Nigeria are going to continue to grow through the end of the century. Rather than closing our doors, we should be opening them and inviting people in. Based on this report, it seems many countries are likely to be incentivising immigration within a few decades.

Technology will play a huge role. There is a lot of squeamishness about robots and automation in care and health contexts, as I have written about before. But technology can lift the burden of some routine and unskilled tasks from care workers, giving them more time to offer personal contact and companionship.

Public health campaigns will also be critical. If we can help people to look after their own health, and extend people's healthy, productive years by one, two, or even five on average, then we can drastically reduce costs to the state.

This is likely to be married to an extension of the working age. Don't expect your pension before 70 or even 75, so staying healthy as long as possible will be critical.

The turnaround

The longer-term question becomes one of species survival. With a fertility rate of 1.7, the population will continue to shrink. Will we see it return to more sustainable levels, around the 2.1 replacement rate? I think we will.

There is an element of techno-optimism in this view, but I do believe that, perhaps in the next century, we will reach a level of global health and wealth where most of us are living much longer, healthier lives, with average lifespans rising over 100. If you think that is overly optimistic, just look at the changes in the last century.

Investments in women's medicine should see the trauma and risk of childbirth reduced over this period. Greater political equalisation at home and in the workplace should make it easier for women to have children without damaging their careers. And with better medicine and extended lifespans, having children later in life will be more common.

The population is likely to decline a long way before any of this happens. We may find we settle in the 5-6 billion range before it does. And post-climate change, the world will look very different then.
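To put a rough number on that decline: ignoring age structure, mortality and migration entirely, each generation is very roughly fertility/2.1 times the size of the last. A deliberately crude back-of-envelope sketch:

```python
# Crude generational model: with fertility below the ~2.1 replacement
# level, each generation shrinks by a factor of fertility / replacement.
# Ignores age structure, mortality shifts and migration entirely.
fertility = 1.7
replacement = 2.1
factor = fertility / replacement       # ~0.81 per generation

population = 9.7e9                     # the report's 2064 peak
for generation in range(1, 5):         # roughly 30 years per generation
    population *= factor
    print(f"After generation {generation}: {population / 1e9:.1f}bn")
# After ~3 generations (~90 years) the model passes through the
# 5-6bn range suggested above, reaching ~4bn a generation later.
```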

Future of Humanity

Will Strictly go on forever? #AskAFuturist

How long will our love affair with the glitzy, record-breaking show Strictly Come Dancing last? When will it be replaced in our affections - and by what?

We all need a little light in the current times of pandemic and political betrayal. So I thought I'd tackle this tongue-in-cheek question from Fiona on my #AskAFuturist thread on Twitter: "Will Strictly go on forever?"

Fiona is referring to Strictly Come Dancing, the somewhat oddly named BBC TV series, now at 17 seasons and named the most successful reality TV format by Guinness World Records. In a world of on-demand entertainment it remains appointment viewing, attracting an average of over ten million viewers per episode in the last few seasons. Can it continue this run of success?

The answer to that question brings in the subject that is most obsessing me at the moment: choice. And it raises the most challenging issue for futurists in making predictions: the unpredictability of human taste and behaviour.

The choice explosion

Today we have more choice about what we watch, and when we watch it, than ever before. Not only that, we have more choice about how we spend our leisure time than ever, albeit arguably with more leisure time to spend as well. The natural assumption from this is that each form of entertainment might take a smaller share of our total time. And this largely appears to be true. As the internet rose, so linear (broadcast) TV consumption declined around the world.

But what's fascinating, as someone who watches and listens to very little real-time/broadcast content*, is just how dominant broadcast media remains in the UK. According to the 2019 Ofcom report, 89.4% of us listen to the radio, spending an average of almost three hours a day listening. 88.5% of us watch television, and spend an average of over three hours a day in front of the gogglebox. 71% of that viewing is still accounted for by the five main broadcast channels and their subsidiaries.

Compare these figures to those for Netflix consumption and you see just how dominant the old forms of media remain. The average Netflix subscriber (about 40% of UK households) consumes about 7 hours of content per week - against more than 21 hours a week of broadcast television for the average viewer.

What can we take from this? Strictly has achieved an incredible feat by growing its audience over the seasons to the level it is at today. But it has done so in the context of a choice explosion that is only just beginning. We are, as the saying goes, at the b of the bang. While there are many more options out there, most people have yet to migrate their tastes away from the dominant broadcasters, if they ever will. The trend is most pronounced amongst the youngest viewers, as ever. If they maintain their behaviour as they age, consuming more short-form and on-demand content, then Strictly could suffer.

Tomorrow's celebrities

That said, it's hard to discount the idea of continued success for Strictly. After all, part of its appeal comes from the stars it attracts to appear on it. The careful curation of these celebrities ensures that the show attracts a broad demographic. If that successful curation were to continue, and new generations of stars continue to value an appearance there, then it's possible that the show could sustain its success. But I think this is where the show may struggle.

The media through which the new generation of celebrities are emerging are very detached from the traditional world of carefully curated linear programming, or even reality TV. The style is very different, and so is the audience. Creators are often auteurs with complete control of their output and image. Their audience comes from all across the world. Appearing on a show like Strictly might be a big leap for them, and one that doesn't necessarily hold that much appeal. If you have ever watched YouTube stars appear on 'normal' television then you will know what I mean. Even the most famous and polished frequently look awkward and out of place. It's just a different discipline. Learning it may not hold sufficient reward unless the financial prize is very large and the crossover with your own audience is significant.

Format shift

Strictly and other popular linear programming are likely to face another challenge in the next decade, as we go through another format shift and mixed reality becomes more accessible. Exactly how and when this happens is unclear, but as I have written in the past, the physical and digital worlds have been coming closer together for decades. Blending them in a form of augmented reality interface seems like a very obvious next step.

This creates enormous possibilities for programming that sits somewhere between television and computer gaming. People have demonstrated systems where you can be the director of your own show, for example, using the huge amounts of raw footage that ever-cheaper, higher-resolution cameras can capture. That's a very different experience from the passive consumption of television, but it might appeal to some.

Whatever happens, there will be a new dimension of choice and competition for established formats in the future.

Re-use and recycle

If I were forced to bet on what will happen with Strictly, I would guess it follows a fairly traditional TV arc. Ratings start to decline, and after a couple of years of falling numbers the BBC decides to stick the idea back in the vault, to resurrect at a later date when the time seems right - much as it already did with Come Dancing when it added the 'Strictly'. When will that happen? Well, the numbers are still strong. I would guess we have at least another three to five seasons of Strictly yet before they decide to pull the plug.

##

*Apart from all the shows I appear on. Clearly I listen to/watch you all, every week, without fail...

Future of Humanity

#AskAFuturist: The future of privacy

In response to my #AskAFuturist request, Franck Nijhof asked about the future of privacy. What will be left of our privacy in 50 years?

“But for real, the future of privacy, what will be left of it in 50 years...”

This question came from Franck Nijhof, also known as @frenck, one of the most prolific contributors to my favourite open source project, Home Assistant.

To answer it, we must start with some questions. What do we mean by privacy? What are we keeping private? And from whom?

All or nothing

There are two opposing schools of thought here:

  • “None of your business” - One school of thought says that everything should be private unless you explicitly choose to share it. That no-one should have the right to compel you to share it, without at the very least, compelling evidence of wrongdoing.
  • “What have you got to hide?” - The other school of thought suggests everything should be out in the open, and that only terrorists and deviants keep things concealed.

Most of us inhabit the grey area in the middle. We’re happy to share some things; we share others grudgingly. Sometimes we make informed choices about trading some privacy for services, such as social networks. Most of the time we make such choices based on very poor information and understanding.

The short answer to “what is the future of privacy?” is more of the same: we will continue to muddle through, making mostly OK choices based on limited information about what we should share. Our privacy will be alternately infringed and reinforced by governments, regulators and private corporations alike.

The bigger answer requires addressing some of the specific technologies, business models, and pressure points that will affect the future of privacy. The answers it reveals might show up the cracks between different jurisdictions.

Future of privacy: the long answer

Let’s divide the challenge between public and private, asking two questions:

  1. Will corporations continue to gather data about us in a bid to target us more accurately with advertising and change our buying (or voting) behaviour?
  2. Will governments make increasing use of technologies like facial recognition, infringing our privacy under the cover of making us safer?

Corporate data gathering

A few years ago, I had a conversation with the chief data scientist of what was then Demandware, now Salesforce Commerce Cloud, Rama Ramakrishnan. Rama is now Professor of the Practice, Data Science and Applied Machine Learning at MIT Sloan. What he told me rather exploded my understanding of the drivers for social networks and search engines gathering big data about us.

“When it comes to understanding shoppers, the key lesson is that you are what you buy. That I am of Indian origin and live in a particular suburb of Boston is not particularly valuable. The fact that I like a certain brand of boots, that is interesting.”

What he means, and what he went on to explain in the paper that I wrote for Demandware on data-driven retail (sadly no longer available), is that much of the information we are worried about sharing, and that the social networks appear to have prized collecting, is actually of very little value when it comes to selling advertising or targeting us. Put another way, we give so many explicit signals about what we want that there is little return on investment in spending billions trying to infer what we might want from other data about us.

The value of data

I see lots of companies starting to get to grips with this fact. They are starting to understand that, even before any regulation or consideration of the security risks it presents, the cost of gathering and processing lots of personal data about us is often not outweighed by its value. Better just to have the 10% of – often at least semi-anonymous – data that gives 90% of the value.

Now, there is still lots of deeply personal and perhaps compromising information in our clickstreams. We are right to be cautious about what happens to that information. Here, regulators have a role to play, looking at what is collected, how it is stored, and how it is used. But the diminution of the business case for large-scale data hoarding gives me hope that this sector can be regulated. If the value of our most deeply personal data proved to be much higher, I would worry that business would lobby and wrangle to minimise oversight.

This is the same reason why I’m not *that* concerned about the data gathered by voice assistants like Alexa. Yes, they are picking up more than our explicit commands. But storing and processing that morass of data probably has very little value. I’ve still chosen to keep voice assistants out of shared areas in the home until my kids are old enough to make a conscious decision about sharing their own voice data. And the assistants can always be hacked, but that’s a different story…

Data ownership

Of course, our data does still have value. “So why don’t we see that value?”, many ask. Many people have proposed putting personal data into the hands of individuals rather than corporate behemoths. There are even proposals to ascribe it some sort of property rights, making it a tradeable commodity. It’s an idea that I have discussed on this blog before.

I still believe that in the future we will store most of our personal data inside a firewalled cloud account and release it only on a case-by-case basis, when we see value in return. Sometimes that value might be financial – such as when it is a signal of a willingness to buy. In this case there are already mechanisms to monetise that signal, through cashback services like Quidco. But sometimes that sharing might be more community minded – such as the sharing of health data for government information or large-scale studies.

Is it worth it?

It’s unlikely that we will make these sharing decisions ourselves though, for reasons of practicality and of value. Practicality, because the number of requests per day will likely be overwhelming. We will give our personal AI assistant a set of broad rules and it will take decisions based on those rules, feeding us the exceptions and learning from each one. Value, because unless it is a very explicit buying intention, our data just isn’t worth very much!

As Chloe Grutchfield of adtech specialist RedBud Partners pointed out on a panel I was part of in late 2019, most people would stand to make a maximum of 50p per day from their data. That would be based on them signing up for multiple data-driven reward schemes and sharing their data without much thought for their privacy. Now, that might be enough to subsidise the cost of the personal AI making the data-sharing choices for you, but it won’t do much more than that. You can’t make a living from your personal data.

So, will corporations continue to gather data about us in a bid to target us more accurately with advertising and change our buying (or voting) behaviour? Yes. But I think they will be increasingly selective about what they gather, and more conscious about how they store and use it. In the long run, we will likely have more granular control over what is collected and shared. And we will reap some of the (small) reward from it. Here, the future of privacy is surprisingly positive.
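
To make that concrete, here is a minimal sketch of what those broad rules might look like: a hypothetical assistant screening incoming data-sharing requests and surfacing only the exceptions. The categories, requesters and values are all invented for illustration.

```python
# Hypothetical sketch: a personal AI assistant screening data-sharing
# requests against a few owner-defined rules, escalating the exceptions.

RULES = {
    # category: who may receive it, and the minimum value (pence) we accept
    "purchase_intent": {"share_with": {"cashback_service"}, "min_pence": 10},
    "health":          {"share_with": {"nhs_study"},        "min_pence": 0},
}

def decide(category: str, requester: str, offered_pence: int) -> str:
    rule = RULES.get(category)
    if rule is None:
        return "escalate"   # no rule yet: ask the owner, then learn from it
    if requester in rule["share_with"] and offered_pence >= rule["min_pence"]:
        return "share"
    return "decline"

print(decide("purchase_intent", "cashback_service", 25))  # share
print(decide("health", "advertiser", 100))                # decline
print(decide("location", "mapping_app", 5))               # escalate
```

The interesting part is the escalation branch: every exception the owner resolves by hand becomes another data point the assistant can learn from.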

Government grabbing data

I am more concerned about the overreach of government when it comes to the future of privacy, particularly in places like the UK and US. Here, there is limited experience of authoritarian states and the speed at which the tide can turn against different groups. The “what have you got to hide?” lobby is very strong, particularly in the UK, where there is less of a counterculture of those completely opposed to central government. That doesn’t mean that I respect those groups – they rather terrify me. But they are nonetheless something of a balance to governmental overreach.

The ruling in the UK on the use of Live Facial Recognition by the South Wales Police sets a precedent, albeit one based on a limited version of the technology. These are not the networked cameras scanning the streets and monitoring everyone’s movements that you might see in dystopian sci-fi. Rather, these are mobile units running against a bespoke set of target individuals at each location where they are deployed. That said, the fining of a man who covered his face to avoid the system on trial in Romford sets a worrying precedent of its own.

It only takes an authoritarian home secretary (ahem) and a terror incident to see the scope of such technology expanded. And with us outside the European Union, and challenges to judicial oversight, I am concerned about how fast this might happen.

Beyond cameras

That concern extends beyond cameras. Our legislators in the UK have been using technology as a scapegoat for all sorts of societal ills for a few years now. They are very happy to discuss draconian measures to lock down access and monitor people’s use. The “what have you got to hide?” lobby might do well to read a history book or two and see just how many groups fall into the government’s sights when things start to slide.

Will governments make increasing use of technologies like facial recognition, infringing our privacy under the cover of making us safer? Yes, in the UK, US and probably places like Australia, they almost certainly will. In the EU, with different attitudes and stronger regulators, the risk feels much more distant.

Future of privacy: What will be left in 50 years?

The answer to Franck’s question, then, probably comes down to where you live. We won’t see global harmonisation of laws for a lot longer than 50 years. In the meantime, I think you will see the boundaries of your privacy expand in both the public and private domains if you live in Europe. Good news for you, Franck. Elsewhere, the future of privacy is perhaps not so rosy.

Future of Humanity

Extended adolescence

Are kids growing up too fast? Or do we now get to live an extended adolescence, aware of the adult world early but not hitting its markers until later?

I first published this post in February 2020, but I have updated it to accompany the first episode of Season 5 of my podcast, Talk About Tomorrow. In this season we are focusing on some of the big ideas that keep recurring in my work, starting with this one.

“Children grow up too fast these days.”

You hear this said a lot, but is it true? I don’t think it is. I think some aspects of childhood have been compressed and others extended. Extended so far, in fact, that a lot of the traditional markers of adulthood are now things we don’t consider until well into our thirties. In the future, they might even stretch into our forties. In the meantime, we experience an extended adolescence.

Childhood compressed

What aspects of childhood have been compressed? Well, perhaps inevitably with the advent of digital mass media, our children are exposed to more of the world, earlier. It is hard to completely shield them from some aspects of life without isolating your family from the world in some form of religious retro commune. This is true of ideas about sex, politics, religion, celebrity, beauty, violence, crime, and more.

The good news is that it is mostly just ideas they are exposed to, rather than the real thing. Rates of violence and sexual crime are down a long way from their peak in the 1990s. Crime against children aged 10-15 has fallen 30% over the last decade. Teen pregnancy rates are down around 60% since the late 90s. Kids are drinking less and, anecdotally, are much less likely to go to nightclubs when they are underage than my peers and I were in the 90s.

Markers of adulthood

On the other side of childhood, lots of things happen much later. Learning to drive (the average age is now 26 (2016), up from 22.8 in 2004). House buying (average age 33, rising to 37 in London). Getting married (skewed by second marriages, but nonetheless 37.9 for men and 35.5 for women in heterosexual couples, rising to 40.8 for men and 37.4 for women in same-sex couples). Having kids (30.6 for women and 33.6 for men).

The net result is a kind of extended adolescence, where you are aware of adult things early, but don’t reach many of the traditional markers of adulthood until later. Do we need to re-write the social rules for what marks out adulthood? Or do we accept that our lengthening lives mean that we need to think differently about different periods in our lives now?

Extended life, extended adolescence

I lean towards the latter. ‘Adulthood’ now covers an average period of 62.96 years, from being allowed to vote to being buried. There is a lot of stuff that happens in between. While we still need the general term of ‘adult’ for people legally permitted to do certain things, there is no harm in starting to think differently about different periods of our lives. Not least because it might alleviate some of the pressure I hear about from younger adults.

Society’s pressures change faster than society’s expectations. I speak to people in their teens who are concerned about their lack of a life plan. People in their twenties who are worried about the incoherence of their career path so far, or about having not yet found a partner. I speak to people who feel like they’re doing something wrong because they don’t have a house as they approach thirty.

What they need to know is that this is just normal now. We should allow people an extended period of experimentation and adaptation throughout their twenties. And arguably, into their thirties. There is simply more time now than there has been in the past. Time to do, in all but the most tragic cases, many of the things we traditionally saw as markers of early adulthood.

There’s no rush to grow up.


The future of sport: making predictions

When Grosvenor Casinos asked me to look at the future of sport, I started with four basic principles that will define tomorrow

When a new client asked me to make some predictions about the future of sport, I started from four basic principles.

Sometimes my work is very serious. I work with big companies who have had major shocks. They come to me because they want to see the next shock coming. And because they want to be ready for it and able to adapt.

Sometimes my work is just great fun. Such as when Grosvenor Casinos approached me to think about the future of five different sports: football, tennis, running, e-sports, and Formula 1. The results can be found at this brilliant page they put together: https://www.grosvenorcasinos.com/future-of-sports.

I started this project on the future of sport from four basic principles about sport, and sportspeople.

Continuous Improvement

We just keep getting better. And when I say 'we', I mean human beings. As the health of the general population improves, we keep pushing the boundaries of what a human body can do. And as we understand the human body and mind better, our ability to extract the best possible performance from each athlete grows. There's also the simple fact that there are more of us now. Though the route to elite sport is still easier for some than others, we are now selecting from a greater pool than at any point in history.

In short, it's fair to assume that the limits of human performance will keep being pushed outwards. The future human will perform even better than today's elite athletes.

Technological Progress

Another given is that the technology of sport will continue to improve. That might mean the introduction of new materials, such as graphene, into running shoes or racing bikes. It might mean that we have access to better training technologies. And it might mean that there are new ways to extend our performance beyond natural human limits. Sometimes this might mean cheating. But sometimes, in the future, we may decide to make this part of the rules.

We have already seen blurred lines on some performance-enhancing technologies, such as the introduction of drag-reducing swimsuits. In the future of sport, regulators will have a hard time deciding what is, and what isn't, unfair enhancement.

Regulatory Tension

Regulation is critical to keeping sport both fair and exciting. As technology and human performance continue to improve, the role of the regulator is only going to get harder. This is particularly true in technology-reliant sports like F1, where the next few years are likely to see incredible increases in capability. The regulators will face a challenge balancing what is possible with what is safe, and with what produces a good spectacle...

The Spectacle Imperative

Sport is big business. And it will only remain so if it continues to present a spectacle worth watching. The future of sport will see growing challenges to sport's role in popular culture, with more and more entertainment choices available. The imperative to keep the spectacle compelling will drive the decisions of teams and regulators, and force them to push the boundaries of technology, performance and sometimes even safety.

The future of sport? Bigger, better, faster, more

You can find out more about what I think about the future of sport on the dedicated Grosvenor Casinos page. But the summary is this: bigger, better, faster, more. Sport remains one of the biggest cultural connectors in the world, drawing together global audiences around teams and competitions. Interest is not waning, even if it is perhaps being distributed across a greater diversity of forms of sporting entertainment. This will drive the continued improvement in the performance of athletes and vehicles, which only regulatory tension focused on safety and fair competition will restrain.

Future of Humanity

Will we ever put microchips in our brains?

Will we ever put microchips in our brains? Sure. But not yet. Here are three reasons why a neural interface is still a few years away.

"Will we ever put microchips in our brains?"People ask me this question frequently, often inspired by the latest science fiction showing transhuman characters with implanted digital technology. So sure are they that this will happen, the question is often not ‘if’ but 'when'.I have no doubt that at some point our technology will allow us to supersede our biology. At that point, inserting technology into our bodies will be entirely normal. But I think we have a way to go before this is everyday reality. Here are a few reasons why.

The rate of change in technology is too high

How often do you change your phone? Every two years? Every three? How often do you want to have major surgery?

For the last fifty years at least, the performance we get from our digital devices has doubled every two years. We are reaching the limits of what our current technology can do, leading many to see the end of Moore's Law, the description of this incredible expansion in the bang for buck we get from machines. But with Neven's Law, quantum computing is showing early promise of even faster advances in computing power.

The result is that even if you just look at raw computing capability, the technology we might choose to install in our brains is advancing incredibly fast. And who wants to be stuck with an old model, especially when it is in their brain?

The situation gets even more absurd when you consider the rate of change in other fields of science and technology that will influence how a human/machine interface might be constructed. Everything from the materials from which we might make a neural interface, to our understanding of the brain to which it might connect, is expanding incredibly fast.

This won't stop some people from doing it. There will obviously be medical cases where an intervention makes sense. And there will be those who want to be at the cutting edge. But for most people, the idea of carrying around technology that is instantly out of date inside their skull will probably put them off.
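
As a back-of-the-envelope illustration of the obsolescence problem, here is a toy calculation assuming the historical two-year doubling continues in some form:

```python
# Toy calculation: how far behind the curve does implanted hardware fall,
# assuming performance per pound doubles every two years?

def relative_performance(years: float, doubling_period: float = 2.0) -> float:
    return 2 ** (years / doubling_period)

for years in (2, 4, 10, 20):
    print(f"After {years} years, new hardware is ~{relative_performance(years):,.0f}x ahead")
```

On those assumptions, an implant fitted at 30 would be roughly a thousand times behind the state of the art by the time you were 50. Nobody schedules surgery on a phone-upgrade cycle.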

The security risks are too great to put microchips in our brains

Everything is hackable. There is no such thing as complete security.

To give you an example: a few years ago I brought the hacker Samy Kamkar over to the UK to speak at a conference I was hosting. He spoke about a situation where hackers found a way to get data off an air-gapped computer - i.e. one not connected to any network - inside a locked room. As long as they could get code onto that machine, they could turn its memory chips into rudimentary antennae to broadcast information over radio waves.

Since then, people have found lots of ways to extract information from inaccessible machines. This article describes the same approach Samy described, as well as a way to use status LEDs and surveillance cameras as a communications channel.

If you are going to have a neural interface, then it is going to be connected to a network. Otherwise, what's the point? That makes it much more susceptible to hacking and malware than the carefully protected computers in the examples above. And someone will try to hack it. That is guaranteed.

Are you ready to have your brain hacked? I thought not.

Direct to brain is the wrong interface

Perhaps the strongest reason not to have a neural interface, though, is that you don't need one.

The last sixty years of computing history may show an incredible increase in speed, storage and bandwidth. But what is much more interesting is the use to which all that power has been put. We have used a huge proportion of the available bits and bytes to make machines easier to use. No longer do we have to punch instructions out of bits of card, or script complex instructions on the command line. We can shout across the room and the machine does what we want. At least, some of the time.

The need for a neural interface presumes that this trend does not continue, when it is very clear that it will. Because the next natural step, and one that we are working towards quite clearly, is to have no interface at all and let the machines take decisions on our behalf.

Feed an AI data from your life. From your social graph, your conversations, from the video camera you will soon be wearing on your face 10 hours a day to drive your mixed reality glasses, and from the physiological sensors it has on your body. Give it all this information and it will know you well enough to choose - and even buy - things for you. You don't need an interface to your lights, heating, or door locks because they all respond to you automatically. Most of your shopping is increasingly automated. You will take pleasure in doing some things manually, like picking some outfits, or putting on a record, or browsing a book shop. But 80% of things, the ones you want a neural interface for, will be automated anyway.

The same will be true at work. Why have a neural interface when the machine can craft most of your messages for you, and interact with you in the most human domains: movement and language?

Will we ever put microchips in our brains? Yes. Just not soon.

I'm not ruling out the mass adoption of some form of neural interface. But it won't happen soon, because the technology isn't ready and it moves too fast. Because even if it were stable and complete, we couldn't secure it. And most of all, because we just don't need to put microchips in our brains.

Future of Humanity

When recycled material is better than new

Adding single layer materials like graphene to recycled plastics can create a range of new materials with properties perfectly suited to their application

Can we really make recycled material that is better than its virgin equivalent?

When you describe someone – perhaps a government minister right now – as a ‘chocolate teapot’, everyone knows what you mean. We know that chocolate melts at roughly the temperature of the human body. It would be a ridiculous material from which to make a teapot carrying boiling liquids. Fortunately, we have many other choices of material and we can select one that is appropriate to the task at hand.

The material we choose for a particular task depends on its physical properties, as well as its cost, both financial and environmental. We can never select for just one property: it’s always a compromise between a variety of characteristics, and our budget.

Recycled material for specific tasks

Imagine if we could design a material for each task. One that had the minimum amount of compromise because it was engineered for the task at hand. More than that, imagine we could reform recycled materials with new properties that make them better than their virgin equivalents.

I’ve shared my excitement about the new wave of materials science a few times on this blog. What thrills me is that new single layer and composite materials will change the way our world looks, just as the digital revolution changed the way it works. In fact, based on a conversation I had at the National Graphene Institute recently, it might just do both.

We all know plastics are bad, right? Bags, straws, packaging, all have to go because they consume fossil fuels in their production and take decades or more to biodegrade, choking the seas as they do. Some plastics are recyclable, but the resulting product is typically inferior. This doesn’t have to be the case.

Pipeline plastics

Speaking to Dr Oana Istrate, a Graphene Applications Specialist at the new Graphene Engineering Innovation Centre, I learned that they are working on adding single layer materials to recycled plastics to create a variety of properties suited to different applications. For example, stopping the leakage of hydrogen sulphide gas from oil pipelines.

H2S is a colourless gas with a distinctive smell of rotten eggs. It is highly poisonous, flammable, and corrosive, presenting a real challenge for the oil and gas industry. H2S corrodes pipelines, shortening their lifespan and increasing the risk of leaks, creating a serious safety issue as well as one of maintenance costs.

Combining graphene with plastic can create a material that is impenetrable to H2S. Line a pipeline with this and you can extend its lifespan and improve safety.

Other applications abound: imagine a contact lens that better retains moisture, or a wetsuit that better retains heat. This is before we get into improved mechanical properties. Researchers have already used graphene to improve the wear properties of trainer soles, and to make racing bikes and cars stiffer, improving the transfer of power to the track.

If we combine single layer materials with recycled plastics, we see the promise of a new range of materials. Materials with which we can construct tomorrow’s world. Materials that are greener but also more particularly suited to the applications at hand.


Sensory Overload: Why we struggle with too many communications channels

We have far too many channels of communication today to be productive. Something has to change to give us back control of our conversations.

It’s fashionable to knock email at the moment. Plenty of articles have been written about how it wastes more time than it saves, and many companies are now enforcing strict email management rules in a bid to reclaim productivity. But I don’t believe email is the problem.

We now have a wealth of communication tools and information resources at our fingertips. Every one of them is competing for a bit of our attention, distracting us with sounds, images, flashing lights and vibrations. Every one of the channels and tools available to us is generally well designed as a product in its own right. Few people struggle to use Outlook, or Skype, or a mobile phone, or Firefox. But the problem is that it is never an either/or choice in modern life — we are constantly multi-tasking in a bid to keep on top of all the information coming to us.

Just looking at my desktops, both real and virtual, right now, I have: a landline; a mobile; Skype and headset (for two SkypeIn numbers and my Skypename); Thunderbird (handling four email accounts); Outlook (handling a fifth email account, plus calendar and task list with pop-up reminders); VNC (for controlling my server and jukebox); and a timesheet application (again with pop-up reminders).

Any one of these I can handle quite ably; even two or three at a time are fine. But there are days when everything seems to go off at once, or even worse, in a constant stream that prevents any work except talking for an entire day.

In the short term this means developing strategies to handle all the different media: ignoring some calls, putting Skype on DND, turning off pop-up alerts, and ignoring email for large parts of the day. But in the long term I think the technology has to change. While I am sure our brains will eventually evolve to deal with all the various inputs, why should we wait a few thousand years for that to happen?

Instead there needs to be a standard for communications tools to collaborate and share information about our availability — and willingness — to accept inbound information and communications requests. This extends right across the different media: if my Skype is set to DND, I also don’t want calls on my mobile or landline (unless I have specified otherwise — perhaps calls from a certain number, friend or family group). If I am in the middle of writing a long blog entry, I don’t want my anti-virus to pop up while I am typing, or for Windows to ask me to restart because it has completed an update. In fact, I want an interface that actively helps me to concentrate by blocking out other distractions while I am working, perhaps only offering me contextual information, or messages that are relevant to what I am doing.

This ties in very much with the media filtering technology that is the ultimate goal of most search companies: they want to understand you well enough to suggest TV, books and articles that you might like and save you trawling the enormous oceans of data on the internet. That’s great for home, but if we’re going to stop the white collar classes becoming a nation of digital fidgets, some of that effort really needs to be directed at the workplace.
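
To show the shape of the standard I have in mind, here is a toy sketch: a single availability state that every channel consults before interrupting, with the breakthrough exceptions described above. All of the channel and contact names are invented.

```python
# Toy sketch of a shared availability standard: one status object that
# every channel (landline, mobile, Skype, OS pop-ups) checks before interrupting.

from dataclasses import dataclass, field

@dataclass
class Availability:
    status: str = "available"                       # or "dnd", "writing"
    breakthrough: set = field(default_factory=set)  # contacts that always get through

    def accepts(self, sender: str) -> bool:
        if self.status == "available":
            return True
        return sender in self.breakthrough          # otherwise, breakthrough list only

me = Availability(status="writing", breakthrough={"family"})
print(me.accepts("client_call"))  # False - held until I surface
print(me.accepts("family"))       # True  - family always rings through
```

The hard part, of course, is not the logic but getting every tool vendor to honour the same signal.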


Is tomorrow's society diverse or fractured?

We can rejoice in the diversity of modern society while also being concerned about the loss of a shared set of civic values and institutions

Who are you? How is that identity defined? What groups do you associate with? And which ones do you define yourself against?

These are the issues increasingly at the heart of modern politics, according to a recent article in Foreign Affairs magazine by Francis Fukuyama. No longer is the debate defined by who has what, but by who we are. Traditional class lines have been disrupted by signifiers that have taken on greater importance. Low-friction global communications have allowed us to build tribes that are no longer defined by geography, as I have written about before.

This last point is, in many ways, a good thing. As one Twitter friend put it yesterday: “Why would I want to associate with my neighbour? I’d much rather join a global group of people I actually like.” The freeing of communication has allowed us to find perhaps a truer sense of our own identities, by meeting like-minded people around the world who share our hobbies, interests, or deeper definitions of who we are - people who challenge norms and the status quo, and who want to explore what it might mean to be human beyond historical limitations.

But however digital our lives may be now, there are still issues to grapple with that are defined by space and place. From the simplest issue of bin collections, to more thorny issues of rights, benefits, and education. How do we address these issues that are shared and contested among increasingly fractured communities sharing the same spaces?

Internet principles

Fukuyama suggests that common creeds form part of the answer. Shared sets of ideals around which countries are built.

For me there are parallels here in how shared systems like the internet are created: millions of components of both hardware and software, created by thousands of different companies, operating to a huge variety of different ends. And yet through a set of shared standards, somehow co-operating to achieve a sufficient level of coherence that it all works — most of the time.

The problem with Fukuyama’s solution, for me, is that it operates at a state level, and I am no longer convinced that we can maintain a shared state identity even in a country as small as the United Kingdom. Or rather, there may not be sufficient shared identity across the country to maintain coherence in that national community. Rather, we have to acknowledge that there is an increasingly devolved identity, just as we are — slowly — acknowledging the need for more devolved power.

Shared spaces

I think we can create a sense of shared purpose across diverse communities in a shared space. But that sense of purpose can only be defined in part at a state level. What will be much more important is a sense of local identity that binds us to our neighbours around the things that matter and are inevitably defined by space. These people may not be our friends; they may form part of groups against which we choose to define ourselves. But we will have to accept a measure of compromise over the issues in which we have a shared interest.

That compromise is unlikely to be forced upon us. Communities of shared interest are rarely built from the top down. They have to be constructed from the bottom up. Doing this will require renewed efforts to overcome identity-based boundaries.

I’ve never liked the term ‘tolerance’ in this context. Surely we should be striving for more than that? Acceptance, understanding, or resolution. But these things take time, and in that time we will have communities with a proportion of shared interests that need to take action. They will need to get past their potential areas of conflict to work for their common good.

This sounds a little lightweight: “all we need is peace, love and harmony”? Hardly a radical conclusion. But I come back to my position on the future: short term pessimist, long term optimist. The direction of travel for the human race is a positive one when it comes to resolving differences. More and more is handled by communication, less and less by violence. I think we can and will reach a situation where we can celebrate the rich diversity of our race while reliably building ad-hoc coalitions to achieve shared goals, even between groups with wildly different, and sometimes conflicting, ideals.

But it’s going to take time. The next few years will continue to be challenging.

Future of Humanity

The manipulation of nature

Technology is not a narrow term. It is not phones and laptops. Technology is the tools with which we change our world, for better or worse.

One of the primary objectives of the proto-science of alchemy was to turn lead into gold. It seems a rather base goal (forgive the pun), and more in the realm of magic than technology. Nonetheless, alchemists around the world laid down some of the foundations of modern science.

The alchemists never succeeded, but as it turns out, you can turn lead into gold. Since every element is merely a collection of protons, neutrons and electrons, if you can manipulate the content of a nucleus you can change lead into gold. People have done so. Unfortunately, the process isn’t exactly practical, requiring huge amounts of energy from a particle accelerator, or depositing the lead in a nuclear reactor.

Selling that might be even harder than selling Ratner’s jewellery.

Coding DNA

Early in 2017 a team of scientists took the next step in creating truly programmable organisms. We may look back on this as synthetic biology’s 'Turing moment’: the point at which an expensive specialist machine starts to become an affordable generalist platform.

Imagine being able to program a bacterium to produce materials, biofuel, cotton or spider silk. Imagine being able to program it to make medicines. Program one, feed it, and watch it divide, exponentially increasing your production capacity.

The potential is endless, as are the pitfalls. Such power needs careful constraint. And yet, it is following the same path as all technologies: it is becoming cheaper and more accessible all the time.

Basic genetic engineering is already at the point of being a toy, in terms of its cost and ease. How long before I can buy a genetic programming platform as readily as a 3D printer?
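
'Exponentially' is doing a lot of work in that sentence, so here is a toy calculation of what it implies, assuming an idealised bacterium that divides every 20 minutes with nothing to limit its growth:

```python
# Toy calculation: exponential growth of a single programmed bacterium,
# assuming an idealised 20-minute division time and no resource limits.

doublings_per_hour = 3   # one division every 20 minutes
hours = 24

population = 2 ** (doublings_per_hour * hours)
print(f"{population:.2e} cells from one bacterium in {hours} hours")
# ~4.7e21 cells - in practice, food and space run out long before this
```

That is the scale of manufacturing capacity the technology hints at, and also exactly why it needs careful constraint.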

Technology is the tools by which we manipulate nature

I have rather pigeonholed myself as a ‘tech expert’ over the years. Occasionally I struggle against this self-applied categorisation, worried that it limits my scope and people’s faith in my advice.

But then I follow little rabbit holes of research into alchemy (inspired by a throwaway comment on a recent episode of The Infinite Monkey Cage) and synthetic biology, and realise that technology — properly defined — is barely a pigeonhole. It represents the grand scope of our ability to affect our environment, an endeavour that I believe defines us as a species.

This is why I start with technology — in the broadest sense — when looking to the future. Technology is the means by which we make change, whether intended or unintended.

Future of Humanity

In the future (carrying) less is more

There are four things I hate carrying around: wallet, house keys, cash, laptop. I'm working to do away with all of them.

There are four things I hate carrying around: wallet, house keys, cash, laptop. They just add friction to your day, discomfort to your pockets, and weight to your backpack. I'm working to do away with all of them.

Wap your wad

The wallet and the cash are increasingly easy to do away with, as long as you don’t care about loyalty card points. I'm willing to ditch those in favour of a nice empty pocket, even without the privacy concerns. For most of this week I have been relying on my phone for payments and have found few occasions when it has not been accepted.

I have tried tucking a credit card and a single cash note into my jeans for those occasions but found the credit card gets easily bent. A little engineering along the lines of the Ridge wallet may be required.

Unlock your pockets

The keys are more complicated. Yes, there are digital locks, like the Yale lock I tested recently, which I will be fitting to the door of my workshop (when I get around to fitting a door to my workshop). But this would annoy the crap out of the rest of my family as a front door control.

Instead I need a system where I can use my phone (or RFID) while they continue to use a key. I haven’t yet found one that fits my front door, and I don’t really fancy replacing the door just yet.

Leave the laptop

The laptop is perhaps the biggest challenge. I'm lucky to have a pretty dinky laptop but it’s still the biggest and heaviest item I have to carry each day.

Until now I've always believed that mobile devices lack the horsepower for a lot of my work, but I now think it is only the interface that stops me getting everything done with a pocket-sized device. And I mean that in every sense: even if I can type fast enough on screen — as I'm doing now — I don’t have the screen size or mouse-driven precision for video or audio editing, or presentation prep.

Lots of attempts have been made to overcome these challenges with hybrid devices and accessories. But, of course, the more additional hardware a solution involves, the more you may as well just carry a laptop. This will require experimentation…

In the future…

So far this has just been a post about my pet peeves. But there is a point to it: this stuff all goes away, and soon.

The first step will be further consolidation into the smartphone as it increasingly integrates all of the major wallet functions — not just payments but smart cards, loyalty schemes, and ID.

Then it will start to absorb the key chain. Right now, digital locks are a pain: power is a problem, people are concerned about security, and there’s no straight electronic replacement for barrel locks and security doors — without changing the door. But all of these problems will be solved in time.

Where it gets really interesting is when these functions start to explode out of the phone and either become device-less, or integrated elsewhere.

The first place people think of for this integration is the body, but given the fast pace of technology change, I remain sceptical about anything embedded under the skin. Rather, I think we’ll see schemes that replace the device altogether: biometric sensors for identification could go a long way to replacing keys and cards. Maybe a single, small, ID device could provide a second authentication factor.
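
For a sense of how little hardware that second factor needs, today's key-fob authenticators already manage it with nothing but a clock and a shared secret, using the TOTP scheme (RFC 6238). A minimal sketch, with a placeholder secret:

```python
# Minimal time-based one-time password (TOTP, RFC 6238) - the scheme behind
# key-fob second factors. Device and verifier share only a secret and a clock.

import hashlib
import hmac
import struct
import time

def totp(secret: bytes, period: int = 30, digits: int = 6) -> str:
    counter = int(time.time()) // period            # current 30-second window
    msg = struct.pack(">Q", counter)                # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

print(totp(b"placeholder-shared-secret"))  # six digits, valid for ~30 seconds
```

Nothing in that calculation needs a screen, a battery-hungry radio, or a pocketable slab of glass; pair it with a biometric and the keyring starts to look redundant.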

Likewise with computing power: why take it with us when the power can be hosted in the cloud and projected to us when we need it? Future devices might only need a minuscule physical presence if they can capture voice input, or project three-dimensional interfaces through augmented reality. Even the AR device may only be the size of a contact lens.

A minimalist future

We may be on the path towards a minimalist future already: most media items are disappearing — discs, newspapers, magazines. We may be buying more books for now, but how many people do you see out and about listening to, watching or reading from their digital devices rather than a dedicated physical medium?

The convergence of devices into the smartphone is already a cliché, but it won't be finished until the phone has swallowed the wallet and the key chain. I still believe we will see an explosion of functions out of the smartphone as more ergonomic options become economically viable. But long term, those devices may be so small that we don’t even remember we’re carrying them. Or there may be no device at all.

Future of Humanity

Giant ping pong balls at the speed of light

In the last twenty years, technology has changed the way the world works but in the next twenty, it will start to change how the world looks.

I’m fond of telling people that in the last twenty years, technology has changed the way the world works, but in the next twenty, it will start to change how the world looks. Materials science is perhaps one of the most exciting areas of research right now, with money flooding into research on two-dimensional and metamaterials with incredible properties.

Think about the difference between the world before plastics and the world after. Think about the shapes, weights and textures of so many objects that would have been previously unfamiliar. Now imagine a transformation of the same magnitude in the materials from which we make cars, buildings, and clothes. Think about a world where the previously impossible, becomes possible, because we have materials that are stronger, lighter, more insulating or more conductive.

Of course, not all of this is going to happen in the next twenty years. There’s still a lot of fundamental science and manufacturing development to be done on these new materials. But we’ll see early applications that will shift our expectations for what certain objects look like.

Rockets & shuttles

Take the Breakthrough Starshot programme, an ambitious plan announced in April 2016 to send a spacecraft to a planet orbiting our nearest star.

We all have ideas in our heads about what spacecraft look like. We’ve spent years — decades — absorbing news of rockets and shuttles, and having our imaginations stretched by depictions of craft in science fiction. But the ‘nanocraft’ planned for this project look nothing like that.

The latest research suggests that they might be giant ping pong balls, a few metres across but weighing just a couple of grams, including all of the electronics. To put that into context, the cereal bar I just ate was 30 grams: I just ate the equivalent of fifteen spaceships.

As you can probably guess, there will be no passengers on this craft, which will be accelerated up to a fraction of the speed of light in just a few minutes by being pounded with photons from a giant laser array here on earth.

Making these giant ping pong balls will test the limits of our understanding of materials. You may never see one. But the money that goes into their development will probably drive changes in objects you see every day.


Future language: precision matters

Life is (sadly) not like an Aaron Sorkin script. Whatever we may like to think about our own linguistic abilities, not many can spar with the wit and speed of his characters. Maybe Stephen Fry. But not most of us. We always think of the perfect retort three hours later.

Perhaps in the future we will be more Sorkin-esque. We certainly might wish we were. Because two things are happening that will raise the value of efficient, effective verbal communication.

Speaking machines

Firstly, our interface with machines is increasingly going to be based on natural language. We will talk and the machines will listen. And vice versa. The greater the speed, accuracy and range of our verbal communication, the higher the bandwidth of our interface to the machine.

This could take us in a number of directions. Witness the rise of txtspeak, a rich and highly efficient form of communication, even if it offends the eyes of the preceding generations. Or look at the syntax of really powerful web search terms, a mixture of human language and computer code. Constructing them well requires great skill.

I like to think that the depth of our long-evolved languages will prove superior to these hybrids, but future language will doubtless evolve in response to the new needs, as it always has.

The end of low-value interactions

The second thing that’s happening is that our low-value interactions are disappearing. For people like me who hate, and I mean HATE, administration, this is a huge bonus. Less and less will we need to fill out forms, interact with call centres, deal with post, or scan receipts. Because we will either allow institutions sufficient access to our personal data to let them find the answer, or we will have an AI assistant who handles these things for us.

There are serious issues with both these steps, around privacy, security and employment. How much do we want institutions to know about us — particularly states? How much are we willing to trade or risk in order to eliminate many of life’s major irritations? How many jobs will be lost as a result of the falling friction in our interactions — friction previously smoothed by human intervention?

Personally? I am concerned about just how much I might let go in order to never have to fill out a form again. For all my principles, I would give a lot for that.

Future language

As low-value interactions diminish, so the importance of being skilled in high-value interactions will grow, whether they are with machines or people. The better we can express ourselves, the higher the bandwidth of those interactions. I’m not saying every conversation is going to be like a Sorkin script. But we might all start to place more emphasis on the quality of our repartee.
