State of the Nation

State of the Nation – a review of the state of play of initiatives and architectures in the interoperability domain.

Comments and suggestions welcome.
Updated without notice.

The Interoperability Domain is ubiquitous. The latest buzz around BigData, (I)IoT and BlockChain means pretty much every use of anything by any individual, device or system in any domestic, consumer or industrial business is part of this super-domain. We’re talking life, the universe and everything.

There is a little girl sitting on the ground outside her mud-brick home in Balochistan live streaming images from Cassini’s last moments on her Android phone and tweeting questions back to NASA as I type. Actually, I made that up. Last time I was at that small village not far from Quetta almost four decades ago, the old guys were squatting in the dust tuning a TV to an episode of Dallas via a makeshift satellite receiver and capturing it on their VCR.

But you know it’s true. The sky’s the limit, but some things never change. Plus ça change, plus c’est la même chose – the more things change, the more they stay the same.

Whichever industrial scope we see as our domain of interest, information interoperability is part of the problem. Some people have been around the loop of introducing standardisation into the technology cycle at least five or six times since the 1980s in the attempt to maximise the efficiency and opportunity of interoperability.

Even back then the problem was described in the same terms the Buddhist monks described their elephant: a very large thing with many different aspects, depending on how you approached it. We always knew we couldn’t eat a whole one. Since then of course, despite lots of individual successes, that elephant of a problem (opportunity) has grown ever larger with ever more possible angles.

As I said at the top, it’s become a ubiquitous topic. Every aspect of every business supply chain of every industrial or consumer domain is tangled-up in the solutions to any one part of the problem. The opportunities are similarly endless.

[STEP > ISO15926 > AIM > BIM > SemanticWeb > IoT > Cloud > IIoT > BlockChain > Edge > Industry4.0]
[Model > RDL > IHOG > Templates > Interfaces > CFIHOS]
[Digital Built Environment – Digital Twin.]

Examples are discussed below, after the preamble:

Architecture and Philosophy

Humility never stopped anyone claiming they had the best solution, or solution architecture.

No analyst was ever short of a philosophical basis to justify a given architecture.

Anyone could present their solution at the centre (or on the top, or in the foundation) of their chosen solution architecture.

Faced with philosophy and pragmatism we have of course always chosen practice over theory whenever there have been funded and collaborative opportunities to create and deliver some part of the solution. How dumb would we have to be? That is after all the point of the elephant metaphor, to recognise there is no single best view of the problem and therefore no best overall solution – no silver bullet.

The fact is, even information science theories have always recognised many dimensions to the problem. Faced with business processes, user interfaces, content, technology, functionality, performance & availability, security & trust – you name it – it is always possible to draw multiple system architectures, even for the same ultimate physical implementation.

True but useless is as old as philosophy too.

How hard can it be?

Focus and priorities have changed

An Abstraction Too Far – some wag once reduced everything to this:

You can’t argue with that. Go on, try.

True but useless?

It’s always possible to reduce a complex situation to a simpler abstract view, but as Einstein once said (the apocryphal Einstein, that is, not the real one) – make everything as simple as possible, but not more so. There are no silver bullets that represent the whole situation in a way that solves the whole problem. There is no end of problems … and opportunities. The trick is always a matter of what’s appropriate. What are the immediate needs and priorities, here and now, and what’s the appropriate level of flexibility to allow for or pre-invest in the (predictably unknown) future?

And there is always a Pareto effect in the answer, an 80/20 question of diminishing returns. That prioritisation is recursive: having made the initial assessment you can still ask what to do with the 80 and how to handle the remaining 20 “for now”. You can re-apply that 80 and/or 20 to the 20 and/or 80 ad infinitum, from 68/32 “mostly harmless” to 96/4, 99/1 to 4-SD, Six-Sigma and beyond. If it’s “only money” or entertainment value, acceptable losses are one thing. If it’s reputation, existential threat or the HS&E consequences of some hazardous process, it’s quite a different matter.
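
That recursion is easy to see with a little arithmetic. A toy sketch (plain Python, illustrative numbers only): each pass handles 80% of what remains, and the residual 20% is carried forward “for now”.

```python
# Re-applying an 80/20 split recursively: cumulative coverage after each pass.
def recursive_pareto(passes, handled=0.8):
    covered, remaining = 0.0, 1.0
    history = []
    for _ in range(passes):
        covered += remaining * handled   # deal with 80% of what's left
        remaining *= (1 - handled)       # carry the rest forward "for now"
        history.append(round(covered * 100, 2))
    return history

print(recursive_pareto(3))  # [80.0, 96.0, 99.2]
```

Which is exactly the 80/20 > 96/4 > 99/1 progression: each pass buys less, at (usually) greater cost.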

Depending on the nature of your priorities, it is always possible to draw up an ideal architecture to address your situation. All the components are always there in principle, just that some are given more prominence than others, according to these priorities. Security and availability. Latency and performance. Content and function. Componentisation and integration. You name it.

ANY taxonomy or ontology, including that of your architecture, is “deemed to be”. It’s modelled that way for a reason: the priorities of your identifiable set of reasons and needs.

Pattern discovery & recognition

Formal schemas vs multiple examples / pro-forma examples

Programmatic automation vs True-AI

Big data – (enough sample data that syntactical patterns are discoverable, beyond any formal schema, and their significance assumed without any semantic understanding).
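
By way of illustration only – a minimal sketch of purely syntactic pattern discovery, using some entirely hypothetical plant tag numbers as the sample data. No schema, no semantics: just shapes recurring in the samples.

```python
import re
from collections import Counter

# Classify raw field values by syntactic shape alone - no schema,
# and no semantic understanding of what the field actually means.
def shape(value):
    s = re.sub(r"[A-Z]+", "A", value)  # runs of capitals -> A
    s = re.sub(r"[a-z]+", "a", s)      # runs of lowercase -> a
    s = re.sub(r"[0-9]+", "9", s)      # runs of digits -> 9
    return s

samples = ["P-101", "P-102", "V-201", "PSV-3001", "TI-4722"]
print(Counter(shape(v) for v in samples).most_common())
# -> [('A-9', 5)]  i.e. a tag-number-like pattern dominates
```

The pattern is discoverable; its significance (that these are tag numbers) is still only an assumption until a human, or reference data, says so.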

Big data vs model – part of the Just-in-Time / Just-where-needed philosophy: if explicit form content and semantics inferred from pattern recognition are mostly “good enough”, then only semantically map to explicit RD structures where critical / necessary – DON’T DO EVERYTHING.

Just-in-Time mapping.

Mapping – a map definition – a meta-schema in a mapping language & also the process & result of applying a mapping.
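
A minimal sketch of that Just-in-Time idea, with entirely hypothetical reference data entries and field names: the map definition is itself data (a little “meta-schema”), and it is applied lazily – only the fields flagged critical get resolved against the reference data library, everything else passes through as-is.

```python
# Hypothetical reference data library entries (illustrative URIs only).
REF_DATA = {"TEMP": "http://example.org/rdl/Temperature"}

# The map definition as data: source field -> (RDL class key, unit).
MAP_DEF = {"temp_c": ("TEMP", "degC")}

def jit_map(record, critical_fields):
    out = dict(record)  # pass-through by default - don't do everything
    for field in critical_fields:
        rdl_key, unit = MAP_DEF[field]
        out[field] = {"value": record[field],
                      "class": REF_DATA[rdl_key],  # explicit semantic anchor
                      "unit": unit}
    return out

print(jit_map({"temp_c": 85, "note": "ok"}, critical_fields=["temp_c"]))
```

The same map definition serves as the meta-schema, the process (applying it) and the result – the three senses of “mapping” above.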

68/32 – 80/20 – 64/4 – 96/4 – 99/1 Variations on Pareto. Six Sigma.

Discovered syntax+ – refine discovery and add human semantic.

In purely business “arithmetic” cases, discovery may be “good enough”, where exceptions cannot invisibly cause critical (HS&E, integrity and continuity) failures. Such exceptions are simply matters of efficiency and quality.

In certain business cases, the exceptional (potentially abnormal situations) cases are the ones that govern key operating systems. In order to add-value and semi-automate these, human semantics must be validated and captured in systems and schemas.

In a previous paper, “What is Reference Data-based Interoperability (RDBI)” (Nov 2011), I described the architecture implicit in using reference data, the value of exploiting reference data being a given.

The value remains a given. The RDBI architecture remains generic, but the technology architecture evolves daily.

In the early STEP AP221 / ISO 15926 days, the assumption was that a given “industry” would manage its own library, and that the software tools used in the industry would demonstrate content compliance in technology mappings at tool and database exchange interfaces – integration being achieved by common use of warehousing databases.

Internet / Web – semantic web. Older, Triples, older.

Semantic precision. Life doesn’t run on first-order logic. Russell / Gödel.

RDL > XML > Dublin Core & Extensions > OWL

Process and Oil&Gas Assets > BIM > Capital Facilities

Consumer iOS / Android Web apps

Cloud / IoT / Fog / BlockChain

Gartner Tech Hype Cycle 2017 – Derek / LinkedIn

Where is the data, the ref data, the functional behaviour? Everything “as-a-service” in the cloud.

Every one of these has a different emphasis on content and functionality; there are many different axes or dimensions in which their architectures can be described.

BUT Discovery + Semantic Mapping using Ref Data runs through them all.


Kondratiev Waves > “Industry4.0” – Knowledge / Power / Communication Distance / Trust

Trust is the focus of block-chain – the true currency of commerce (even money is a promise, after all)

A continuous chain of encrypted provenance (trust) that can detect if it is broken, in order to revalidate or abort. But it comes at a processing cost: Bitcoin servers (already) consume more energy than 20 EU states!

AND for a laugh!

Unless of course, your “currency” is TRUST!

Trust was of course the top level of the W3C Sem-Web Arch WAY BACK IN 2001!

BlockChain is essentially an architecture for this trust – the security and “key-chained” integrity of verified trust in the network of comms and content, visible in the blocks where it is broken.
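
The “detect if broken” idea can be sketched in a few lines. This is only an illustration of hash-chained provenance in general – not any particular blockchain implementation, and deliberately ignoring consensus, signing and proof-of-work (the processing cost mentioned above).

```python
import hashlib

# Each block carries the hash of its predecessor, so tampering anywhere
# breaks every later link - and the break is visible at the first bad block.
def make_block(prev_hash, payload):
    digest = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
    return {"prev": prev_hash, "payload": payload, "hash": digest}

def first_break(chain):
    prev = "genesis"
    for i, block in enumerate(chain):
        expected = hashlib.sha256((prev + block["payload"]).encode()).hexdigest()
        if block["prev"] != prev or block["hash"] != expected:
            return i  # revalidate or abort from here
        prev = block["hash"]
    return None  # chain intact

chain, prev = [], "genesis"
for payload in ["ordered", "shipped", "received"]:
    block = make_block(prev, payload)
    chain.append(block)
    prev = block["hash"]

print(first_break(chain))        # None - provenance intact
chain[1]["payload"] = "stolen"   # tamper mid-chain
print(first_break(chain))        # 1 - broken at block 1, visible and localised
```

The trust doesn’t come from any one block but from the continuity of the whole chain – exactly the “key-chained” integrity point.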

It’s obvious why trust is a bigger gap, with greater needs, than ever: fake news, trolls, spoofs, scams …


Mapping App Mapping Ref Data
Visualisation & Silos (Figay)