THE INFORMATION STANDARDS BOARD, SLIGHT RETURN

A few years ago we blogged about the Department for Education’s Information Standards Board. We thought we’d revisit it. Having done so, one extraordinary fact stands out:

NOTHING HAS CHANGED.

I suppose we’d better qualify that. What we did notice is that the numbers behind our previous observation that “there are 269 “Recommended” standards, and zero “Adopted” standards” have moved on: there are now 384 recommended standards but still zero adopted standards. This is in spite of the ISB’s mission statement that:

“The core business of the ISB, supported by the Technical Support Service (TSS), is to successfully embed standards within the Education, Skills and Children’s Services (ESCS) system in England.”

Apart from that unfortunate detail, all of the salient points that we made back in February 2015 remain true today. We gathered together the elements of our critique under the following headings, drawn from the goals of the Internet Engineering Task Force (IETF) Standards Process:

  • technical excellence;
  • prior implementation and testing;
  • clear, concise, and easily understood documentation;
  • openness and fairness; and
  • timeliness

It is in this context that we have seen no movement. None. Nada.

We foraged around the (clearly moribund) ISB website and found the following document entitled Championing Open Data Standards, published in September 2016. It’s so… well, let’s be direct – it’s so pitifully ex post facto that we decided to reprint it in full. You should note that the Information Standards Board has not met since, so it’s our guess that nothing here has changed subsequently. Here it is, verbatim:


Championing open data standards

Why have common and open data standards?

In order to achieve efficient data movement and matching, there is a need for common data standards. These are necessary to allow people and systems to exchange and re-use information.

This was recognised within the Education, Skills and Children’s Services (ESCS) sector, and the Information Standards Board (ISB) was created in 2007 to deliver the data standards and data model to realise savings in the costs incurred with data moving and processing.

Why use the ESCS Information Standards?

DfE and BIS have jointly sponsored a Board and Technical Support Service since 2007 to develop the ESCS standards and support their implementation. Their aim was to provide a definitive set of standards through stakeholder consensus and support their implementation by ensuring developments were shared effectively and lessons learnt.

The standards and data model developed aims to provide a sector-wide solution which is future-proofed, so that it meets both the ability to cover the wide-ranging, but often similar, data items in more general terms, as well as allowing the model the flexibility to develop as a result of new policies which may impact on the data items in the model.

Where are we now?

Whilst much work has been done to develop the model and standards, the potential benefits of that investment are yet to be fully realised. There has been a strong focus on supporting two key areas of model usage so far: the Joint Council for Qualifications’ (JCQ’s) A2C project; and the Data Exchange Project being led by DESAG within DfE. However, the ESCS Information Standards have not yet been widely adopted.

Standards and Data Models by their nature require constant maintenance even once developed. If we are to have common data standards, there will be a need for ongoing support to further develop the model.

Questions for discussion

  • How can we build wider interest and engagement with ESCS Information Standards?
  • Who are the potential users? Government, LAs, MIS, Schools, Colleges, etc
  • What could be done to engage more with users?
  • How can DfE facilitate this?

 

We’ll say it again: this was the penultimate published document from the ISB, succeeded only by a technical note on 19 December 2016 regarding the latest (very small and, as far as we can establish, unconsulted) changes to the ISB document set.

Given the lack of process by which a standard is judged; given the lack of engagement, input and challenge that a standard requires to warrant the title; and given its complete lack of adoption – it has to be asked:

Can the Information Standards Board’s Business Data Architecture ever be considered an open standard?

To conclude: take a look at when this juggernaut was founded.

Yes, that’s right – 2007. It’s now 2019.

A dozen years have gone. It’s now in its third (or is that fourth?) government, for goodness’ sake. Yet despite this repetitive failure in process, despite the singular lack of progress in engagement and adoption, and because of what appears to be a failure in the proper oversight of this monumental white elephant… money continues to be spent.

It’s probably time to ask why, and how much.


Delivering the National Resilience Capability Survey for Cabinet Office

An inauspicious beginning

For more than twenty-five years I was a civil servant, usually in charge of delivering something substantial – so I thought I knew how this would go. Not so.

At the time Cabinet Office awarded the contract to Software for Data Analysis Limited (SDA), the directing policy team were undergoing a seismic change of personnel, the legacy system we were to replace had already been decommissioned, and it was two years on from the previous data collection, so many in the field had also moved on. It was our job to introduce our SCROLL survey software into this extreme state of flux.

The job

Cabinet Office’s Civil Contingencies Secretariat (CCS) are responsible for the National Resilience Programme, which aims to increase the UK’s capability to respond to and recover from civil emergencies and provides advice on preparing for a crisis. They needed a solution to enable them to collect, process and analyse resilience and readiness data from national and local responder organisations as well as utilities and Local Authorities. The data is at least “official sensitive” so the solution had to be locked down tight. SDA’s comprehensive ISO27001:2013 certification was only the starting point!

Engaging the client

After we’d negotiated the security capsules in the lobby of 70 Whitehall – and been relieved of all our electronic accoutrements – we were taken to COBRA 2 for our initial briefing and workshop. Here we began to flush out the details of the assignment. The shopping list was extensive. These were a few of the must haves:

  • Guided development of hundreds of separate survey questions;
  • 25 different questionnaires delivered simultaneously to 800 diverse respondent groups during a six-week window;
  • Multiple users in an organisation involved in completing each instrument;
  • Different access levels for different users across questionnaire development, data entry and outputs;
  • Local administrative functions for each participating organisation;
  • Multiple views of survey data, from pre-set reports to complete flexibility in describing variables and granularity;
  • Facilities to compare individual elements of demographic groups with the aggregate performance of the respective demographic group;
  • Benchmark reports across different levels of aggregation;
  • A full range of export facilities, e.g. CSV, Excel, Quantum, SPSS and Triple-S.

The list went on. And on.

Making progress

It became clear that though the responder community had huge subject matter expertise, much of it on the frontline, it needed to be complemented by an equal measure of survey savvy: in other words, they knew what they needed to find out, but they didn’t altogether know how to ask. And while the ‘must haves’ were given, the ‘shoulds’, ‘coulds’ and ‘woulds’ were still in the eyes of the beholders. SDA gently suggested that we might be best used to arbitrate between conflicting priorities, based on our deep knowledge of our discipline and our product.

At the end of the workshop we proposed two important variations:

  • That CCS delegate the survey design and implementation to us;
  • That we manage the entire technical service, as the more efficient arrangement.

Our public sector colleagues agreed, very quickly, and it seemed – to me at least – that our offer had evoked a metaphorical sigh of relief. We began work in earnest.

A question of security

Cabinet Office is the home of the Government Digital Service (GDS) and so it was unsurprising that the project was run according to agile principles. This was fine by us: SDA have always subscribed to requirements/prototype/iterate in all their many guises. One big deal, though, was the GDS Service Assessment. Ours majored on security. We took them through our Information Security Management System, we showed them our audits, our testing regimes, our certificates and our third party certificates. Still they appeared uncertain.

Finally we suggested that we could install SCROLL on a virtual machine in their own datacentre. Absolutely, they said. Another deep sigh of relief.

Going live

Our guiding imperative was to launch on time: this was an absolute. And honestly, we were always on the front foot. Where our client had questions, we provided answers. Where they had problems, we provided solutions. Where they had deadlines, we met them. Come the day, though, a final flurry of amendments was presented to us, and they were non-negotiable. Of course, we went the extra mile. That’s what we do.

Who cared about the weekend anyway?

Does that all sound difficult? It was. Was it unpleasant? No. It was exhilarating. When you throw yourself into something, the barriers just come down. We all did a great job.

Lessons learned

  • Technology is easy. Organisations aren’t.
  • Never underestimate complexity.
  • Be prepared to lead, sometimes from behind.
  • Focus on delivery.

SCROLL – OFF ON A ROLL!

SCROLL. Did I ever mention it? Maybe, maybe not. Assuming not, well that’s probably because we’ve been so busy building it. And now it’s here. But first things first: what is SCROLL?


SCROLL is SDA’s latest data collection, processing and analysis tool. It embodies 20 years’ focus on turning data into decision-grade information. Used by some of the largest social and market research organisations – and most recently by Cabinet Office – our system has already delivered thousands of data collections. Indeed, it’s taking off all over the place, including France, Germany and… Iran. Using SCROLL’s multi-lingual capabilities, our new associate research consultant has adapted the system to accommodate Farsi and, right now, it’s being put to work on the ground in Tehran and other major Iranian cities. A unique feature of the project is that we’ve modified the system for offline operation to allow for specific elements of the local business environment.

Indeed – STOP PRESS (like, this morning) – we have just been confirmed as the principal supplier of computer assisted mobile interviewing (CAPMI) by RahbarBazaar (www.RahbarBazaar.com), the leading market research agency in the region, and exclusive regional affiliate of Kantar Insight.

The weird thing is that we sort of slid into operational mode. SCROLL has been in development – a sometimes vexed process – for a while. There were several components that worked perfectly as independent modules but their integration was… problematic. So we stepped up the effort, polishing them up, streamlining their throughput and then, almost unexpectedly, they clicked. A good job they did because – how shall I put this – we’d taken a few liberties with our marketing endeavours. I’ll put my hands up; we had to. You pour resource into development but there comes a point when you’ve got to get your product out of the door.

So, lessons learned? We all know the impulse to make something perfect; to improve, to refine, to embellish. To fiddle.

The simple fact is that this is all displacement activity. Version 1.0 is staring you in the face. Get it out. Put it to work.

The world will direct where it goes next.

Twitter: @SCROLLcontrol

What goes around, comes around: Interoperability, Data Exchange and A4L

An open letter to the A4L Systems Interoperability Framework community, in which we campaign for a seat on its UK Management Board:

Dear Colleague,

My name is Mark Phillips from Software for Data Analysis Limited. Among other things, we have published the DfE’s School Performance tables since 1993.

 
After a long period of absence, I’m back in the SIF saddle. Unexpectedly, I’ve been urged to apply for a position on the UK Management Board. So I’m going to.

I do have some form in this respect. Rather than rehash what I’ve already said, I’ll simply quote it here:

“I was the first National Chair of the SIFA UK Management Board, presiding from 2006 to 2009. During that time, along with our committed and vibrant Board, I took SIF on the road – introducing the concept and the detail to Central and Local Government, schools and vendors around the country.

It was the first inflationary period, as it were, where we moved from zero to max in a short but exciting burst of incredible energy, re-tooling the predominantly US standard to fit the requirements of the UK education system. The rationale, both economic and technical, was forged then and has remained compelling since: improved information available more quickly, more efficiently and more effectively – for less cost.

With that manifesto I got the movement onto the map.

Since those times I have moved from Government to the private sector, developing SDA as a force in Open Data, building on the tenets of interoperability across a wider stage and working alongside the Open Data Institute, TechUK and Government Digital Service to develop systems and processes fit for the 21st Century.

Now I’m delighted to see that my original intentions are on the cusp of realisation with the new movement towards Data Exchange at the DfE.

If I were to be elected to the A4L UK Management Board I would bring the longevity and consistency of my vision, now enhanced and improved by new perspectives and wider experience, to bear on the ultimate realisation of this long journey towards open standards and interoperability in UK education.

The final push, if you like.

I would, of course, fulfil my obligations to the Board in terms of time, commitment and energy – as was ever the case!”

If you were to distil my ambition into a single point it would be this: to ensure that we deliver a SIF model that leverages the global Specification, wedded to a data model that embodies the principles of an open standard – specifically Due Process, Consensus, Transparency, Balance and, of course, Openness.
 
If you want to know more about me, don’t hesitate to get in touch.
 
All the best,
 
Mark

Diversification!

Onwards and upwards…

Software for Data Analysis Limited have been working closely with the National Casino Forum to design and engineer SENSE, the backbone system that crystallises a new approach to Corporate Social Responsibility in the world of UK land-based casino gambling. Liaising closely with representatives of all the major players in the industry, SDA have integrated a huge range of administrative and technical systems to develop and deliver a single, seamless product. Having added some of their technical fairy-dust on the way, SENSE can now be found on the shop floor of every UK casino, enabling people to self-exclude from gambling if things go wrong. The Gambling Commission have already recognised it as a “significant achievement”.

Here’s the press release:

“UK gambling trade association the National Casino Forum (NCF) aims to raise public awareness for responsible gambling by announcing the launch of SENSE – the Self-Enrolment National Self-Exclusion tool which it will roll out through its land based casino partners

This national programme is being introduced by casinos in advance of the Gambling Commission’s licensing condition which is due to come into force on 6 April 2016.  This condition will require operators to participate in multi-operator self-exclusion schemes so that customers are able to self-exclude from gambling facilities.

The SENSE scheme enables customers to voluntarily self-exclude from all participating land-based casino premises and is mandatory for all NCF member operators. Enrolling in SENSE means that customers will, for the first time, be sharing their request to self-exclude from all participating land-based casinos for a minimum period of six months.

The system is designed to be simple and straightforward to use; a casino operator will read out the terms and conditions of SENSE and then ask the customer to electronically sign the enrolment form. The casino operator will also take a photograph of the customer and upload this along with their enrolment form onto the secure SENSE system.  Customers can also download the SENSE information and self-exclusion form at www.playingsafe.org.uk.

Once a customer has enrolled in SENSE, operators at participating casinos will be alerted and any marketing material for that customer and memberships will be switched off. If a customer tries to access a casino when they are self-excluded, an operator will note this on the system and it will alert surrounding casinos that they are trying to gain access.  Customers will only be eligible to be removed from SENSE after the six month period and only upon request.

The Gambling Commission commented: “We see the development of sector specific self-exclusion schemes as an important step in providing greater protection to players who require help managing their gambling.

“The casino sector was well placed to lead the way in this but implementing SENSE now, well ahead of the deadline we set, is a significant achievement. We recognise that it required the considerable efforts and full commitment of all NCF’s members.”

Tracy Damestani, CEO of the National Casino Forum, said: “The NCF and its members took the decision to pioneer the first national self-exclusion programme. Self-exclusion is an important step for people who have recognised that they have a problem with gambling and have made a commitment to deal with it.

“We applaud the Gambling Commission’s decision to introduce a new provision which mandates that all gambling operators will need to implement more effective self-exclusion systems. NCF members recognised the need to have an easy to use system in place that can be used nationally across UK casino venues and we have been working on this initiative for the last two years. SENSE was created with the intention of encouraging responsible gambling throughout the industry and to help those people who may be at risk.”

“NCF represents over 98% of the UK’s land-based casinos. The members’ commitment to implementing the SENSE system has been admirable; they have taken all the steps necessary to ensure their casino staff support customers who wish to self-exclude. While SENSE was developed as a casino initiative to be flexible and progressive, we hope to see other gambling sectors also take similar steps for their customer base.”

The SENSE web-based application system was developed with SDA (Software for Data Analysis).

Michael Hart, Managing Director of Software for Data Analysis Limited (SDA) added: “Developing and delivering SENSE has been an exhilarating time for SDA.  Investing our substantial expertise with data processing and systems development into such a cutting-edge and socially responsible project has been both demanding and rewarding.  I’m delighted with the result, and that we’ve been able to contribute to this groundbreaking product.””

Burn Before Reading

I’m sorry I just can’t help myself.

Every time I return to the fray I seem forced to grapple with a fresh example of inefficient, inconsistent or just plain inscrutable public sector blarney. My only catharsis is to write it down, here. Not to do so would create all manner of new, unwanted and probably damaging internal tensions that I would prefer to do without.

Here’s the deal:

“The Redacted is implementing a new multi-buyer e-procurement platform… that will allow us to manage our supplier base more effectively. To continue to receive notification of potential tender opportunities with the Redacted and its executive agencies… click on the link below to register…”

So. Registration. Should be straightforward, right? I’ve got the link and I’ve been “pre-invited” to register, so what can go wrong? I fling myself into the task with optimism and enthusiasm. Things start well: the first page has recorded some basic details and provided a system ID. Only ten more pages to go.

The second page requires some further information, which I don’t have to hand. Off I go to collect it. I’m already thinking “wouldn’t it have been good if I’d been provided with a list of things I’ll need”. Having collected the information I sit back down to key it in.

I’ve been logged out. Never mind, I’ll pick it up from where I left off.

Wrong. The form has forgotten me (already). So I re-key my previous entries and add the new details. On to the next page.

Rinse and repeat.

Too late I notice (by complete accident) that a date I had entered in free text had defaulted to the American format. I only discovered this by subsequently looking at the drop-down calendar and finding that my input – 6/11/2015 – had been recorded by the system as 11th June 2015, which, if it had gone unnoticed, would probably have invalidated the registration. No guidance, no warning.
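
For the technically curious, the trap is easy to demonstrate. A quick sketch in Python (my illustration; I have no sight of the platform’s actual code):

    from datetime import datetime

    s = "6/11/2015"  # the date I typed into the free-text field
    # The same string parses to two different dates depending on which
    # locale convention the form silently assumes:
    print(datetime.strptime(s, "%d/%m/%Y").date())  # 2015-11-06 (UK reading)
    print(datetime.strptime(s, "%m/%d/%Y").date())  # 2015-06-11 (US reading)
    # An unambiguous input format (ISO 8601: 2015-11-06) or a properly
    # locale-aware date picker would have removed the guesswork entirely.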

Rinse and repeat.

Suffice it to say, a really, really simple job took me nigh on two hours to complete. That’s two hours of quite expensive, sort-of-executive time. No doubt multiplied across the supplier roster. And we wonder at our productivity failings.

Finally, I finished. I figured that, as the invitation to apply had been extended by a specific Department, to SDA as one of their specific suppliers, and as the invitation had majored on providing us with information about “…a number of forthcoming tenders from across the Redacted and its executive agencies that we don’t want you to miss out on”, I could now look forward to examining what was on offer. Of the 63 opportunities noted, JUST ONE was with said organisation; the rest were from all manner of public bodies for whom we have never worked, and for which we were mostly unsuited.

I know the Government mantra: don’t build, buy; COTS is best! But really? At least put in some effort to MAKE IT WORK.

Disappointed by this episode I wrote to the “executive agency” that appeared to have been charged with the development of the system, explaining some of my reservations. (SDA are, after all, expert in form creation.) For the next couple of days I got the following message:

“This is an automatically generated Delivery Status Notification
THIS IS A WARNING MESSAGE ONLY. YOU DO NOT NEED TO RESEND YOUR MESSAGE.
Delivery to the following recipient has been delayed: commercialadmin@redacted.org.uk
Message will be retried for x more day(s)”

Culminating with:

“Delivery to the following recipient failed permanently:
commercialadmin@redacted.org.uk
Technical details of permanent failure:
The recipient server did not accept our requests to connect.”

Doh!

Burn before reading.

THE INFORMATION STANDARDS BOARD

The Department for Education in collaboration with other authorities in the Education, Skills and Children’s Services sector has, since at least 2009, been developing its own set of data-related standards under the management of the Information Standards Board (ISB).

According to its own website (http://data.gov.uk/education-standards/standards-adoption), “The core business of the ISB, supported by the Technical Support Service (TSS), is to successfully embed standards within the Education, Skills and Children’s Services (ESCS) system in England.”

The ISB has two “approved” statuses for its published standards – “Approved: Recommended” and “Approved: Adopted”. On the ISB website today, there are 269 “Recommended” standards, and zero “Adopted” standards.

So why, in five years, has the ISB failed in its core aim of issuing standards which are actually being used?

We believe the answer lies in the approach taken by the ISB in producing and publishing these standards.  Let’s take as our benchmark the work of the Internet Engineering Task Force (IETF), which is responsible for producing a large number of widely-deployed technical standards.  To quote from “The IETF Standards Process” (RFC 2026):

The goals of the Internet Standards Process are:

  • technical excellence;
  • prior implementation and testing;
  • clear, concise, and easily understood documentation;
  • openness and fairness; and
  • timeliness

Assuming we agree that these are all good goals to have when producing a standard, let’s assess the performance of the ISB against those goals.

1) Technical excellence

It could be argued that technical excellence should be relatively low on the list of priorities for the ISB (its aim is simply to produce usable standards), but there is a minimum requirement that the work be “fit for purpose”. Unfortunately there have been numerous examples of documents published as “Approved” by the ISB which are simply not fit for purpose.

One of the foundations of the ISB data standards is the Business Data Architecture Data Types document.  This was first published on 20th September 2013 as version 4.0 and was so riddled with errors that that version is not even archived on the ISB website!

It contains statements such as the recursive definition of a date as “A string providing date information and hence containing following values – year, month and date”. Those three components of a date are each defined to consist of a single digit (0 to 9). Then there is a type called “Simple_Integer”, defined as “A simple unsigned string of numeric values.”, and another type called “Integer” with the simpler definition of “Signed numeric value”.

It was simply not possible to implement any of the definitions provided in this document without ignoring large parts of it and trying to guess what the document author intended.
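
By way of contrast, here is a minimal sketch – ours, not the ISB’s – of what fit-for-purpose definitions look like: rather than redefining dates and integers from scratch, reuse primitives that are already rigorously specified, such as the XML Schema built-in types:

    from datetime import date

    def parse_date(value: str) -> date:
        """ISO 8601 calendar date (YYYY-MM-DD), the model behind xs:date."""
        return date.fromisoformat(value)  # raises ValueError on bad input

    def parse_integer(value: str) -> int:
        """Signed decimal integer, the model behind xs:integer."""
        return int(value, 10)

    print(parse_date("2013-09-20"))  # 2013-09-20, no guesswork required
    print(parse_integer("-42"))      # -42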

When defining controlled lists of values for common concepts such as language, country and currency, the ISB have quite rightly decided to adopt existing ISO standards in those areas.

However, the ISB have fundamentally misunderstood the purposes of the ISO standards, which are titled “Language Codes – ISO 639”, “Country Codes – ISO 3166” and “Currency Codes – ISO 4217”.  i.e. these are standard lists of *codes* for languages, countries and currencies.

In trying to fit in with other controlled lists, the ISB have broken perfectly good existing standards by using the textual descriptions (e.g. Spanish, Spain and Euro) instead of the codes defined by the standards. The problem arises when ISO codes remain the same but the textual descriptions change.
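
A hypothetical sketch of why this matters: data keyed by ISO code survives a revision to the display name; data keyed by the name does not.

    # Hypothetical records, keyed two different ways.
    iso_4217 = {"EUR": "Euro", "GBP": "Pound Sterling"}

    record_by_code = {"currency": "EUR"}   # stable: codes are fixed identifiers
    record_by_name = {"currency": "Euro"}  # fragile: names are descriptions

    # Suppose the English description is later revised (the code never changes):
    iso_4217["EUR"] = "European euro"      # hypothetical revision

    print(iso_4217[record_by_code["currency"]])             # still resolves
    print(record_by_name["currency"] in iso_4217.values())  # False: data orphaned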

This was raised as an issue with ISB in May 2014 but no visible action has been taken.

Regarding the many other controlled lists and data documents, there is no way to reliably assess their fitness for purpose without attempting to implement them in the real world.  Which brings us to…

2) Prior implementation and testing

The ISB have stated that it is not their role to perform testing – the ISB propose standards and rely on stakeholders to implement them. However, given that no ISB standards have been through an implementation process and moved to “Adopted” status, this approach is obviously not working.

The ISB standards are also heavily dependent on each other, adding layer upon layer upon layer. When mistakes are found in documents that many other documents depend on (such as the BDA Data Types document), or improvements to those documents are suggested, the suggested changes are rejected as being impossible to implement because of the many other standards that would be affected.

If no stakeholders have implemented any of these standards then, quite apart from leaving the standards unproven as fit for purpose, the obvious question arises of whether those standards were needed in the first place.

The motivation for a lot of the standards developed as part of the work of the ISB has been the DfE Data Transformation Programme, which is described as consisting of the “Data Exchange” and “School Performance Data Programme” projects, neither of which is currently being actively pursued.

However, there is still value in the work undertaken by the ISB, especially in the production of controlled lists which can be used regardless of the over-arching data model.  Instead of investing in the production of further standards, the ISB should stop and invest in its own testing and implementation of at least the core standards and their XML representation.

To quote Albert Einstein, “In theory, theory and practice are the same. In practice, they are not.”

3) Clear, concise, and easily understood documentation

The documents published by the ISB are far from being clear, concise and easily understood.

The ISB staff developing the ISB data model need to use the ERwin software to view, develop and maintain the data model behind all the published ISB documents.

However, the “other side” of the process (organisations wishing to develop systems conforming to ISB) do not have any of that functionality, and are just provided with a set of static PDF documents, often containing 90% boilerplate text and 10% actual content.

There is also an XML Schema (xsd) file which partially specifies the XML representation of ISB data.  In order to obtain the full specification of this XML representation, an implementer is required to refer to the PDF documents.  No examples of XML files have been published.

To quote from the W3C, “XML Schemas express shared vocabularies and allow machines to carry out rules made by people. They provide a means for defining the structure, content and semantics of XML documents.”

The XML Schema provided by ISB defines the structure of an ISB XML file, but as a matter of policy declines to use the language of XML Schema to define the content and semantics of such files.

The main example of this is the failure to incorporate controlled lists into the Schema, but it also extends to the lack of any documentation on the semantics of the structures being defined. Instead, a user of the schema is referred to the many PDF documents as the definitive source of documentation.

Given that XML will be the main method of transferring ISB data between systems, it is vital to ISB’s success that the documentation on the XML format is both accessible and unambiguous.  The XML Schema language should be fully used to aid in this task, not ignored.
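
To illustrate what we mean, here is a minimal sketch – the element name and values are our invention, not the ISB’s – of a controlled list baked into a schema as an enumeration, so that a machine can enforce it without reference to any PDF:

    from lxml import etree

    XSD = b"""<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
      <xs:element name="LanguageCode">
        <xs:simpleType>
          <xs:restriction base="xs:string">
            <xs:enumeration value="en"/>  <!-- English -->
            <xs:enumeration value="cy"/>  <!-- Welsh -->
          </xs:restriction>
        </xs:simpleType>
      </xs:element>
    </xs:schema>"""

    schema = etree.XMLSchema(etree.XML(XSD))
    print(schema.validate(etree.XML(b"<LanguageCode>en</LanguageCode>")))     # True
    print(schema.validate(etree.XML(b"<LanguageCode>Welsh</LanguageCode>")))  # False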

It seems clear that the information published by the ISB needs to be stored in a structured way (e.g. in a database), and that the content should be accessible via an interactive website. In addition, the same database of information could be used to automatically generate PDF documents similar to those now created manually, or to create an XML Schema which, as the W3C intended, would “allow machines to carry out rules made by people”.
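
A sketch of the idea, with hypothetical names throughout: hold each controlled list as structured data exactly once, then generate the schema fragment – and, by the same means, the human-readable documents – from it.

    # One structured source of truth for a controlled list...
    LANGUAGE_CODES = [("en", "English"), ("cy", "Welsh")]

    # ...from which the schema fragment is generated automatically:
    enums = "\n".join(
        f'  <xs:enumeration value="{code}"/><!-- {name} -->'
        for code, name in LANGUAGE_CODES
    )
    print(f'<xs:restriction base="xs:string">\n{enums}\n</xs:restriction>')
    # A PDF or web renderer would read the same rows, so the documents,
    # the website and the schema could never drift out of step.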

4) Openness and fairness

In 2014, an independent review of ISB was undertaken, which included contributions from stakeholders outside the ISB and its constituent organisations.  However, the findings of this review have not been made public, not even to those who contributed time to the review. This is particularly shocking given this Administration’s emphasis on “transparency”.

5) Timeliness

It is difficult to judge the timeliness of the work of the ISB given that there is no apparent desire to implement any of the published standards.