IBM’s Spin Off: A Window on Key Architectural Factors for the Coming Decade

It’s being called a ‘spin off.’  But this is not like a glacier calving.  IBM is splitting in two. One part will continue to be ‘IBM’, focused on hybrid cloud and AI. The other part, called ‘NewCo’ for now, will focus on managing client-owned infrastructure.

This is an incisive and courageous move by IBM. Understanding why will require us to survey the future of enterprise computing over the coming decade. We will be looking at key architectural factors – the terrain where we architects do battle, and as vital for us to understand as the Sunken Road at Fredericksburg was for Burnside, who infamously failed to factor it in.

What will the back-office world of 2030 look like?

Obviously, the application of AI will be commonplace.

When we talk about the advent of AI over the last decade, we are largely speaking about advances in the particular subset of machine learning called ‘deep learning’. That’s the magic behind driverless cars and Alexa.

The recent boom in deep learning was propelled by two things. The first was the discovery, around 2009, that the powerful graphics chips developed so Grand Theft Auto and Call of Duty would run faster : ) were, with their matrix math mojo, also perfect for accelerating deep learning. The second is that, unlike most other machine learning techniques, which ‘top out’ in performance at some point, the more labelled data you throw at a deep learning model the better it gets – and the vast resources of the cloud let us marshal a lot of data.
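To see why, consider that the core of every layer in a deep network is a big matrix multiplication – the same operation, at a different scale, that GPUs perform to transform game geometry. A minimal sketch in plain NumPy (the shapes are arbitrary, for illustration only):

```python
import numpy as np

# A single "dense" layer of a deep network is just a matrix multiply
# plus a nonlinearity. Shapes here are illustrative.
batch = np.random.rand(64, 1024)     # 64 inputs, 1024 features each
weights = np.random.rand(1024, 512)  # learned parameters of the layer
bias = np.random.rand(512)

# This matmul is the workhorse -- the same kind of parallel arithmetic
# GPUs were built to do for game graphics.
activations = np.maximum(0, batch @ weights + bias)  # ReLU(x @ W + b)

print(activations.shape)  # (64, 512)
```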

Until now we have all been cherry-picking where AI gets applied. By 2030 we are all going to be beating the bushes to systematically apply it to every suitable business decision – which will mean most of them.

The emergence of comprehensive semantic vocabularies – of ‘knowledge graphs’ – is vital not only to AI (there has to be a consistent, shared understanding of the concepts we are reasoning about) but to the increasingly distributed and interconnected shape of our business processes. It seems obvious that we must agree not only on what particular words mean, but on what they mean in relation to each other.

But to do that when we ‘talk’ to software, and when we talk to each other using software, we have to formalize and capture those concept definitions and the rules which define their relationships. So we are seeing not only individual companies using knowledge graphs in a given domain, like Google’s for search, but those graphs aggregating within industries – and across industries, in efforts such as schema.org. We will see the emergence of a ubiquitous health industry knowledge graph in the 2030s if not sooner. The adoption of the US Core Data for Interoperability is a baby step in that direction.
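At bottom, a knowledge graph is a set of subject-predicate-object statements plus rules over them. Here is a toy sketch; the terms below are illustrative, in the spirit of schema.org, not an actual published vocabulary:

```python
# A toy knowledge graph as subject-predicate-object triples.
# All terms are hypothetical, for illustration only.
triples = {
    ("Physician", "subClassOf", "Person"),
    ("Dr. Chen", "type", "Physician"),
    ("Dr. Chen", "worksFor", "General Hospital"),
    ("General Hospital", "type", "Hospital"),
}

def related(subject):
    """Everything the graph asserts about a subject."""
    return sorted((p, o) for s, p, o in triples if s == subject)

print(related("Dr. Chen"))
# [('type', 'Physician'), ('worksFor', 'General Hospital')]
```

The point of the shared vocabulary is that a partner’s software can traverse the same predicates – `type`, `worksFor` – and draw the same conclusions we do.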

The massive impact of the Cloud has tended to overshadow some other significant evolutions – some of which were instrumental in the success of the Cloud.

Virtualization is like The Matrix.   A software procedure believes it is running in an application when it is _really_ running in a ‘Function-as-a-Service’ framework that virtualizes the application.  A software application believes it is running on an operating system, when it is _really_ running in a software ‘container’ that virtualizes the operating system.   The operating system believes it is running against the BIOS on a server when it is _really_ running on a ‘virtual’ machine hosted on a physical server.
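A minimal Function-as-a-Service sketch makes the point (the `event`/`context` signature follows the AWS Lambda Python convention; everything else here is illustrative):

```python
# A function written for a Function-as-a-Service framework.
# It sees only an event and a context object; the application,
# operating system, and hardware beneath are all virtualized away.
def handler(event, context):
    name = event.get("name", "world")
    return {"statusCode": 200, "body": f"Hello, {name}!"}

# Locally we can invoke it directly -- the "platform" is interchangeable.
print(handler({"name": "architect"}, context=None))
```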

Virtualization is growing.  More and more physical – and other virtual – devices, from disk to connectivity, are being abstracted by virtualization frameworks.   We are calling this phenomenon ‘hyperconvergence’.

The end result, circa 2030, is that we will increasingly be working abstracted from all of the hardware. We will no longer need to provision specialized hardware to manage specialized computing loads. Instead we will simply dial up what we need, and the super-duper-uber-hyper-visor managing the Matrix will serve it up.

The first generation of The Cloud conflated the ‘as-a-Service’ layers – Infrastructure-as-a-Service, Platform-as-a-Service, Data-as-a-Service.  We were all seduced by the big boxes of tinker toys in the public clouds, and the ease with which solutions could be assembled from them, and started to get locked into particular clouds.

Then we all started to realize that all public clouds are not created equal, and that we wanted to use more than one. It became clearer that near-term economics demanded we keep our data center doors open. And it became clear that we wanted to use consistent technologies in and across the whole shebang.

So we have all started moving toward so-called ‘hybrid cloud’ solutions.

As all businesses become technology businesses, and as consumers increasingly engage with technology, there is a growing focus on locking down, securing, and hardening those technologies.

Finance led the way on this; now, as we know, healthcare is moving strongly to improve security and privacy management, adopting strong identity proofing and multi-factor authentication. National and state privacy regulations are driving rapid change.

That’s our quick sketch of the world of computing in 2030.  So how does IBM’s strategy align?

Computing hardware is becoming increasingly commoditized. High performance individual boxes are being replaced by oceans of small, massively interconnected machines. (Supercomputers are made up of zillions of massively interconnected small processors. The faster the connections among discrete machines get, the more they become a virtual supercomputer.)

So, getting out of the building and servicing computers business makes sense.

Quantum computers are an exception. They will not be commoditized for many decades. Their initial uses will be specialized. There are only a handful of quantum algorithms at this point – algorithms that require a quantum computer to run on – the main two being Shor’s, for factoring large integers, and Grover’s, for searching unordered lists. So there is only a narrow set of problems where quantum computers can both be used, and where we can afford to apply them.
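A back-of-the-envelope sketch shows what Grover’s payoff looks like: searching an unordered list of N items classically takes on the order of N lookups, while Grover’s algorithm needs on the order of √N quantum queries (N below is an arbitrary illustrative size):

```python
import math

# Classical vs. Grover query counts to find one marked entry
# in an unordered list of N items.
N = 2**30  # about a billion items

classical = N // 2                                 # expected lookups on average
grover = math.ceil((math.pi / 4) * math.sqrt(N))   # optimal Grover iterations

print(f"classical ~ {classical:,} queries")  # ~536,870,912
print(f"grover    ~ {grover:,} queries")     # ~25,736
```

A quadratic speedup is real money on the right problem, but it is not a general-purpose replacement for commodity compute.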

Quantum computing is a long game, but it is one it makes sense for IBM to stay in.

IBM is an acknowledged leader in AI (although Deep Blue, which beat Kasparov at chess and is the basis of much of that mindshare, is last-generation AI technology at this point). They are doing furious research – over 1,800 AI-related patents in 2019. They have effectively ‘personalized’ AI with their Watson branding.

More interesting to me right now is the work they are doing in what the industry calls ‘explainable AI’. In a rules engine, you can look at an audit trail of the individual decisions that led to the final recommendation or result. With deep learning – with much of statistically based AI – there is no such audit trail. This is obviously problematic with respect to trusting AI, and with respect to using AI in situations where we might have to, legally or otherwise, defend our use of it.
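To make the contrast concrete, here is a minimal sketch of a rules engine that records its own audit trail; the rule names and thresholds are hypothetical, for illustration only. A trained deep network offers no analogue of the `trail` below.

```python
# A toy rules engine that records which rules were evaluated and why.
# Rules and thresholds are hypothetical, for illustration only.
rules = [
    ("low credit score",   lambda a: a["credit_score"] < 600, "deny"),
    ("high debt ratio",    lambda a: a["debt_ratio"] > 0.45,  "deny"),
    ("strong application", lambda a: True,                    "approve"),
]

def decide(applicant):
    trail = []
    for name, test, outcome in rules:
        fired = test(applicant)
        trail.append((name, fired))
        if fired:
            return outcome, trail  # the trail IS the explanation

result, trail = decide({"credit_score": 710, "debt_ratio": 0.52})
print(result)  # 'deny'
print(trail)   # [('low credit score', False), ('high debt ratio', True)]
```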

IBM is way out ahead on developing technology that will let us know why a given intelligent agent made the recommendation it did.

IBM is also very active in the semantic technology space, a necessary complement to both AI and broad interoperability. One of their interesting research directions is so-called ‘neuro-symbolic’ AI. This is AI which imbues intelligent agents with ‘common sense’ about the world – a big gap right now, and one that Doug Lenat famously tried, and failed, to fill with his Cyc project.

IBM was an early pioneer of virtualization technology but failed to capitalize on it. With the acquisition of Red Hat, though, they have moved quickly to the fore of the modern world of hybrid cloud and broad virtualization.

They are also investing a lot in the hybrid cloud security space.

A few more thoughts to round this out…

There is no long-term competitive advantage from adopting the cloud – it is a rising tide that lifts all boats. The cloud provider wars will continue, but will settle down into the Coke Cloud and the Pepsi Cloud, with room left for the Dr. Pepper Cloud, the Mountain Dew Cloud, the Jones Pure Cane Soda Cloud…

There will be increasing cloud provider specialization. IBM is concentrating on building a finance cloud with locked-down security and auditing – which makes sense given their portfolio of financial institutions; over 80% of the world’s credit card transactions run on IBM systems today. The key is going to be – the same as it ever was – how businesses use all of that compute and storage. And most of that answer lives in the space between.

Google sees the same future in their crystal ball that IBM sees – Google is also leaning very hard into multi-cloud, not only with Anthos, but with BigQuery Omni and other tools they have in the pipeline. Google understands what IBM understands: we are moving toward hybrid multi-cloud on top of a hyperconverged infrastructure – the uber-virtualization approach I discussed above, where not only processing but storage is virtualized.

At the same time, connectivity continues to grow apace. There are already almost three times as many devices connected to the Internet as there are people on the planet – and that number is growing. The speed with which those devices can communicate is soaring: 5G is bringing half-a-gigabit download speeds to mobile, and 80% of Americans can access a 1 gigabit connection. In a decade it will be 10 gigabits.

What does that mean, besides that your online game will no longer hiccup, and annoying ‘buffering’ messages will be a thing of the past? To answer that, we have to factor in the growth in AI. We are all increasingly using shared, specialized standards on top of TCP/IP, which make our time-to-market faster, and our cost lower, when establishing new connections with our business partners. But what is it we are connecting?

Interoperability is not about data. Interoperability is about extending our business processes across organizational boundaries. What do those processes consist of? If we subtract out all of the ‘human computing’ – which increasing automation is doing – what are we left with? Business processes are decisions, strung together by communication, that lead to actions. (See my thread – hmm, maybe only one post so far : ) – on workflow in the age of knowledge work.)

AI is increasing the number of those decisions that don’t have to be made by humans.  AI can now automate not only rote decisions, but complex ones with lots of factors – and do many of those better than humans.  By 2030 we are going to be left with our human experts making the most important decisions, frequently in novel circumstances, with the least complete information.

With greater process distribution, the redundancies in our businesses will get factored out. Each specialized type of decision will be made by its experts – and we only need one pool of those experts in a distributed process. In the same way, we will only need one pool of shared AI experts. Distributed AI is a bandwidth hog, but as bandwidth grows it will become increasingly viable. It has been estimated that it will take broad 15 gigabit mobile connectivity to fully realize autonomous cars. We will be there, or nearly, by 2030.

IBM has to tell two strategy stories. They have to have a quarter-over-quarter story for the shareholders. That is the near-term cloud story. It will play out for a few more years – but it will be heavy weather for IBM until the cloud provider wars settle down. IBM got a spike in their share price from the news, but that seems a ‘glad they are doing _something_’ bump. And since their long-term strategy will be slow to fully realize value – that will take years – they are going to get beat up.

But the strength of that long-term strategy eventually will out.   If they are out ahead on hybrid multi-cloud on hyperconverged infrastructure and AI-driven hyper-connected business efficiencies, the market will come to them.

That is why the ‘spin off’ is incisive and courageous. And that is why it serves as a lens to bring into focus the key architectural factors of the coming decade.
