I’ve spent two months calling Local Authorities. I offered their education Leaders an initial proposition:

  • Increase Free School Meals take up in their catchment area
  • Help local schools increase their Pupil Premium
  • Empower parents to apply directly
  • Save money in the process.

We have a product, called Online Free School Meals (OFSM), which does all of this out of the box. So this was my pitch:


The National Food Strategy appears set to double the reach of Free School Meals (FSM). At the same time, the fallout from the coronavirus pandemic threatens to drive a huge increase in demand for them.

Yet, according to published statistics, some 10% of children eligible to claim FSM fail to do so. This in turn means that schools in your area are missing out on millions of pounds in pupil premium.


Online Free School Meals (OFSM) is an instant eligibility checking system linked directly to the DfE’s ECS service. It is currently used by thousands of schools in over a hundred local authorities. It is easy to use and is known to improve FSM uptake. OFSM’s principal features are as follows:

  • Online application form provides parents, carers and schools with instant results while removing the burden of re-keying paper forms
  • Failed applications are regularly re-checked to detect changes in status
  • Web-based system: ready to go, with no technical overheads or associated costs
  • Comprehensive back-office dashboard provides administrators with all current data
  • Safe and secure: ISO27001 certified and GDPR compliant.

I duly called said Leaders. Some wouldn’t take my call, some were simply not interested, some (a few) referred me to the relevant manager. I called them. Some wouldn’t take my call, some were not interested, some said they’d look into it and get back to me. They didn’t, so I got back to them. They hadn’t looked into it. When challenged (ever so politely), they brought down the shutters.

My point is that I was making a proposition that flat-out benefits everyone: more kids get food; their schools get more money; we make some money; LAs save some money (re-keying thousands of forms for bulk upload and then reporting back is a full-time job, and really – in this day and age?).

I even did the calculations for them based on real world figures (numbers eligible minus numbers eligible but not claiming) to show, for example, that in the North East over 9000 kids aren’t getting their due, to the tune of £11,000,000 in lost pupil premium. A lose/lose situation by any measure.
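
As a sanity check on that figure, here is a minimal sketch of the arithmetic in Python; the per-pupil rate is my illustrative assumption, since pupil premium rates vary by phase and year:

```python
# Back-of-envelope check of the lost pupil premium figure quoted above.
# The per-pupil rate is an assumption for illustration only; actual
# pupil premium rates vary by school phase and year.

eligible_not_claiming = 9_000    # North East: eligible but not claiming (figure quoted above)
premium_per_pupil_gbp = 1_222    # assumed average rate, for illustration

lost_premium = eligible_not_claiming * premium_per_pupil_gbp
print(f"Lost pupil premium: £{lost_premium:,}")   # £10,998,000, i.e. roughly £11,000,000
```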

But inertia and lack of curiosity prevail.

Shame, really.

Data Dashboards


This paper is intended to consider the purpose and use of data dashboards in the context of (mainly) public sector business insight and intelligence. It will do this by working through the following four sections:

  1. Firstly, it will try to consolidate what is commonly meant by a dashboard;
  2. Secondly, it will examine what the UK public sector wants from a dashboard;
  3. Thirdly, it will look at a number of examples which draw together and highlight these requirements, each with a short commentary;
  4. Finally, it will try to consider how SDA/GIDE might be able to benefit from further work on embedding dashboards into our work and products.


Part 1: Background

What do we mean by a dashboard?

The word dashboard originally referred to a piece of wood or leather, used to protect the driver of a horse-drawn carriage from the mud and pebbles flung up by their horses’ hooves. When the original motor vehicles were built, the same principle – and name – was applied to the panel keeping the fumes and heat of the engine at bay. As cars evolved, so did the dashboard, to enable the driver to monitor their vehicle’s major functions at a glance via a centralised instrument cluster.

That lineage maps precisely onto its use today: the ‘data dashboard’. Essentially it is a graphical user interface providing snapshot views of key performance indicators relevant to particular objectives or processes. It may, for example, show the number and types of crimes being solved in a policing jurisdiction according to their priority. Or it may show the progress towards a particular strategic outcome by reference to a number of component outputs. The key point is that it draws directly on the most recent data available within an organisation and that it can be compared with earlier snapshots to provide a sense of trajectory.

Who uses dashboards?

Not data people! They can run whatever sophisticated queries they need.

I would suggest that data dashboards are a reporting console rather than a data tool. I would further suggest that they are – and should be – designed for and aimed at managers to understand the business status of their organisation, or the part of it they control, by monitoring the specified performance of those units that report to them.

Why would they use them?

To gauge how well an organisation is performing overall, corporate data dashboards should enable each manager to easily view a periodic snapshot of strategically or operationally relevant data from each of the units they control, showing how those units are performing against their stated contribution to the ‘corporate mission’.

To achieve this the data needs to be:

  1. suitably tiered, to enable a common dashboard to be deployed at all relevant levels, and
  2. properly categorised according to the strategic or operational component it is intended to describe. This allows the corporate dashboard to contain modules that can be swapped in or out depending on their relevance to the manager using it (see the sketch below).
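
To make the tiering and categorisation idea concrete, here is a minimal sketch in Python; all of the names and structures are hypothetical, not drawn from any real dashboard product:

```python
from dataclasses import dataclass, field

# Hypothetical names throughout: a module is tagged with the strategic or
# operational component it describes (categorisation) and the management
# levels at which it is relevant (tiering), so a single dashboard
# definition can be filtered for any manager.

@dataclass
class Module:
    name: str
    category: str        # e.g. "strategic" or "operational"
    tiers: set           # management levels where this module applies

@dataclass
class Dashboard:
    modules: list = field(default_factory=list)

    def for_manager(self, tier, categories):
        """Swap modules in or out according to the viewer's tier and remit."""
        return [m for m in self.modules
                if tier in m.tiers and m.category in categories]

board = Dashboard([
    Module("Progress against corporate mission", "strategic", {"director"}),
    Module("Unit workload this week", "operational", {"team_leader", "director"}),
])

for m in board.for_manager("director", {"strategic"}):
    print(m.name)   # only the strategic module is swapped in
```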

What are the benefits of using data dashboards?

  • Visual presentation of performance measures
  • Early identification and correction of negative trends
  • Measurement of efficiencies and inefficiencies
  • Generation of detailed reports showing new trends
  • More informed decisions based on collected business intelligence
  • Alignment of strategies and organisational goals by focusing on responsibilities
  • Total, instant visibility of all relevant components
  • Time saved compared with running bespoke reports
  • Quick identification of data outliers and correlations.


Part 2: How does the UK Public Sector use dashboards?

The Government Digital Service (GDS) has rightly focused on the use of timely and accurate data to understand gaps in policy, to underpin new or corrective policy and to monitor its implementation. It is, as they say, ‘on a journey’. That said, GDS has come a long way in articulating the problems and, along with the Office for National Statistics and the Government Statistical Service, is beginning to produce solutions.

Here are some extracts from an ONS blog discussing the strides being made in using data dashboards to corral and present complex data from across the public sector (it’s worth reading in its entirety):

“Each year the ONS and government departments publish data which provide insight into the efficiency of public services. When these datasets are viewed separately, however, it can be hard to understand the bigger picture. Now, interactive dashboards aim to bring together a range of data in one place.” – Sophie Davis, headline paragraph from the blog.

“With more data available than ever before, a dashboard can be a great way to analyse and present data quickly, in an interactive and accessible format.” – from the blog.

 “There is real value in this work; bringing data together to cut through the noise and make sense of the world around us.” – John Pullinger, National Statistician.

These, and other, observations from around the public sector demonstrate that there is a growing appetite for data dashboards to focus on and illuminate the key data that drives progress.

Demos: Governance by Dashboard

Of course, there are other commentators who present a more cautionary view of their use. In their Demos paper “Governance by Dashboard”, Jamie Bartlett and Nathaniel Tkacz summarise their emergence under the Coalition Government in 2012:

“In late 2012, the Government Digital Service created a new way for the Prime Minister to keep on top of events. Around the Cabinet Office, it was known as the ‘Number 10 Dashboard’. This bespoke iPad app provided performance indicators on a number of government services, real-time information on aspects of the economy, trends from social media and expert commentary, all integrated into a single screen and with the capacity to ‘drill down’ as needed.”

They go on to describe five contingent features of data dashboards that have emerged as they have developed. These features are contingent because, they argue, “…how they present this data, and how it is acted upon, in turn creates new modes of behaviour, attitudes and norms within the organisations that use them.” All or any of which may be beneficial, or not. These features are:

  • ‘Increased focus on data and measurement’, which they broadly support
  • Introduction of ‘new biases’, which can be summarised as the story being shaped by the choice and presentation of the data
  • ‘Design’, of which they remark that it “has profound consequences in shaping the priorities of its user, and dashboard designers should be acutely sensitive to the agency of a dashboard’s architecture in shaping the meaning that the dashboard’s user gleans from its interface.”
  • ‘New types of knowledge’, on which they comment that “Dashboards perpetuate (the) tendency to show things as they are (and are related) but without any attempt to explain why.”
  • ‘New forms of expertise’, on which they comment that “dashboards are the tool that communicates the work of data scientists, designers, developers, information architects and other technical experts to people whose expertise lies elsewhere.” They caution that this potentially skews the relative values of the producers and the users (who are usually the ‘doers’) to the disadvantage of the latter – and therefore the organisation!

The paper ends with four recommendations:

  • Clarity of purpose: City leaders said they needed a ‘city dashboard’. “There may have been good reasons for requesting a dashboard, but these had not been effectively communicated to the team assembled to build it. The team had access to any number of data flows, from real time car parking, transport and social media ‘sentiment’ data, to public utilities data, but without a clear sense of what the dashboard was for it was very difficult to make concrete decisions about what to include.” QED!
  • Users: Here the authors converge with GDS’s view that the dashboard MUST be guided by what the operational and/or policymaking users require of it, whilst acknowledging that that is sometimes easier said than done.
  • Danger of ‘off the shelf’: In which the authors look at the pros and cons of bespoke vs off the shelf. On the whole they conclude that the choice is down to ‘horses for courses’, however they do end on a note which may be interesting for SDA: “It is worth mentioning, however, that increasingly even bespoke dashboards incorporate pre-existing software. There are a number of open source software and free visualisation tools that are commonly used for lowscale dashboard design.” Perhaps we should investigate.
  • Understanding limitations: Here the authors offer a clear ‘caveat emptor’: “Since many dashboards are explicitly made to manage organisational performance, some commercial dashboards can reproduce a ‘performance bias’. That is, the dashboard can pull your data as well as the actions of a team in a specific direction. Finally, once integrated into an organisation, dashboard metrics and KPIs can be gamed like any other rule or metric. For all these reasons, users should foster a critical disposition toward their dashboards, and constantly remind themselves of the dashboard’s limits.”

You can find the full paper here.


Part 3: Examples of data dashboards in use in the UK public sector

Here we will look at three examples of UK data dashboards, including a group of beta presentations from the Government Statistical Service, and (rather counter-intuitively) one educational dashboard from California, to look at what can be done and how they are used. All of them are public facing and therefore easy to explore. By and large the narrative will be delivered through images of the dashboards themselves, though I will explain transitions and what can be gleaned from the displays.

1) Ofsted staff survey.

This is a very basic dashboard, summarising sentiment in a selection of ten categories, presumably regarded as key by Ofsted leaders. These categories abstract the results from their component questions, which number around 100 in all, some with several sub-sections. They also show the change in sentiment from the previous survey and, to provide a broad-brush comparison with the wider context, a comparison with the same categories in the civil service as a whole as well as a comparison with so-called ‘Civil Service high performers’.

Leaders would quickly glean from this a) that the trajectory is broadly positive, b) that they outpace the civil service as a whole, sometimes by a considerable margin, and c) that they also often outpace the civil service at the high end of their comparison groups.

The presentation of this information therefore obeys the basic tenet of a dashboard: to meaningfully present key information to decision-makers whilst retaining the necessary data for analysts and policy makers to examine as necessary.

Clearly this form of dashboard is essentially static and, though very useful in its own terms, does not connect to the data it presents and is therefore inflexible.

Ofsted people survey dashboard

2) Mayor’s Office for Policing and Crime (MOPAC) Weapon-enabled crime dashboard

This is one of a range of similar dashboards which together compare and chart 12 month rolling crime data across London. It has been specifically designed and compiled to reflect local priorities, and therefore seems to conform to the Demos recommendations 1 (Clarity of Purpose) and 2 (User oriented).  

The weapons-enabled crime dashboard is configurable, giving the user the ability to ‘chase down’ specific enquiries through a) filters to interrogate specific forms of weapons-enabled crime and b) the ability to drill down to borough and ward level for specific data, which can then be selected to view or download using point and click. It also provides a longitudinal view, by quarter, of public sentiment about the type of crime being explored.
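
For readers who think in code, the filter-and-drill-down mechanic amounts to something like the following pandas sketch; the columns and figures are invented and bear no relation to MOPAC’s actual data model:

```python
import pandas as pd

# Invented data and column names, purely to illustrate the mechanic.
df = pd.DataFrame({
    "offence": ["knife crime", "knife crime", "gun crime", "knife crime"],
    "borough": ["Hackney", "Camden", "Hackney", "Hackney"],
    "ward":    ["Hoxton East", "Holborn", "Dalston", "Dalston"],
    "count":   [42, 31, 7, 18],
})

# a) filter to a specific form of weapons-enabled crime...
knife = df[df["offence"] == "knife crime"]

# b) ...then drill down to borough and ward level.
print(knife.groupby(["borough", "ward"])["count"].sum())
```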

The following images illustrate these and other facilities available with this dashboard.

A) Dashboard description


B) Dashboard overview

Here, the user can gather at-a-glance information, across the Metropolitan Police jurisdiction, about any given weapons-enabled offence (filtered in this case for knife crime), including colour-coded gradations demonstrating hotspots against the mean, and longitudinal measures over different periods.


C) Data and trends by borough and ward

Here you can drill down to specific boroughs and their wards. The longitudinal measures are adjusted accordingly.


D) Public perceptions by borough, over time (Hackney selected)

Finally, in this brief glance, a look at borough comparisons for public perceptions across a range of reference questions:



 3) Government Statistical Service Public Service Data Dashboards (Beta)

This set of four comprehensive data dashboards, covering Criminal Justice, Education, Health Care, and Work and Pensions, perhaps goes furthest in demonstrating the public sector’s intention to integrate their respective data sets. The dashboards draw on a broad range of existing National and Official Statistics as well as published management information, bringing them together into an interactive tool under the efficiency headings. By doing so they intend to illuminate, and thereby build a wider understanding of, public sector efficiency.

We will take a quick look at what they describe in Education.

N.B.: These are efficiency measures and deal exclusively with overall input/output data. School-level performance and financial data remain available through the usual channels.

A) All education sectors. We will further examine the ‘Secondary’ views.


B) Secondary Overview: as implied, drilling down to this panel provides a broad overview of the following four categories, each of which can be interrogated down to its annual summary over time. After briefly considering this overview, we will take a further look at the overall ‘input’ (spend) and the outcomes derived from it.


C) Secondary Spend: this shows headline expenditure per annum (gross input) and an in-year breakdown and comparison of secondary spend per pupil in LA-maintained schools against academies.


D) Secondary Outcomes: finally, we can examine headline outcomes (according to HMG priorities), noting that, while the spend panel distinguishes between LA-maintained schools and academies, the outcomes panel does not, focussing instead on the outcomes for the English Baccalaureate. This could, presumably, be replaced by a more meaningful comparison, assuming the data is available.



4) A view from the United States: The California School Dashboard

This is an eye-pleasing, customer-focused operational dashboard, clearly showing key performance indicators which can be broken down according to a number of demographic and performance categories by state, school district and school/institution.

(Note also that a PDF report can be downloaded.)

So, for example, let’s explore the ‘Mathematics’ performance indicator, which shows that statewide performance in maths is in the ‘orange zone’, generically labelled ‘low’ performing, but more precisely defined (in the detail) as being “33.5 points below standard based on student performance either on the Smarter Balanced Summative Assessment or the California Alternate Assessment, which is taken annually by students in grades 3–8 and grade 11.”

This is a good example of providing a broad-brush visual assessment without losing the underlying meaning for more refined analyses.

The mathematics performance indicator can be further examined to identify how ethnic and socio-economic groups perform:


Finally, more precise information can be found about the respective performance of ethnic and socio-economic groups, in this example the ‘orange’ category, it having the largest representation:



Part 4: Takeaways for SDA/GIDE.

Dashboards: there are a lot of them about. It’s hard to tell whether they’re a fad or whether they’re here to stay. To some extent that may come down to their utility: there are clearly occasions when they, or their representations in reports and presentations, do absolutely nothing to illuminate the data they try to convey. Just look up “bad data dashboards” on Google for a truly scary array of how to do it wrong.

On the other hand, a meaningful set of data – a data dashboard in other words – that has been developed to represent the important metrics valued and used by an organisation, running off good data that has been well described, can cut through the noise to deliver clarity and insight.

The examples we have examined conform more or less (according to taste) to this requirement:

  • A tool for presenting progress towards strategic objectives (Ofsted)
  • Tracking operational performance against tactical goals (MOPAC)
  • Gathering together and making sense of large and disparate data-sets (GSS Beta)
  • Providing simple and intuitive information for a range of consumers to help guide or focus their choices (California).

Having considered this research – especially the very useful Demos report – I think that Robert Baldy (SDA CEO) is right: each dashboard that we (or anyone) might create needs to be a bespoke project, tailored to the requirements of the given client. I might go further: we should test what it is that the client wants to do with the dashboard to ensure that their objectives are deliverable, and that the product will provide them with the results they (think they) want. On their behalf, we should be clear that the data they are collecting, or have collected, is sufficient – and sufficiently good – to achieve their ends. To some extent this would enable us to leverage what we claim to be our core talent: organising and processing data.

We should therefore take to heart the Demos recommendations, that the dashboard project or development should:

  • Identify purpose and use: in and of themselves, dashboards are not necessarily the best way to understand all problems, and must be carefully considered to ensure they meet organisational needs. Once the purpose has been identified, it must be communicated clearly to the dashboard’s designers as well as its intended users.
  • Understand limitations: dashboards will often prioritise operational issues over longer-term strategic issues, and may marginalise more reflexive approaches to a problem. Like all metrics they can be “gamed”, so users must be encouraged to keep a critical eye on the dashboard’s limits.
  • Select the right staff and skills: even though dashboards are designed to be user-friendly, it is dangerous to assume users will intuitively understand how to use them. To maximise take-up among staff, users will need training to understand the dashboard’s purpose, where the data is drawn from and the way that it is framed.





A few years ago we blogged about the Department for Education’s Information Standards Board. We thought we’d revisit it. Having done so, one extraordinary fact stands out:


I suppose we’d better qualify that. What we did notice is that our previous observation that “there are 269 “Recommended” standards, and zero “Adopted” standards” has moved on: there are now 384 recommended standards but still zero adopted standards. This is in spite of the ISB’s mission statement that:

“The core business of the ISB, supported by the Technical Support Service (TSS), is to successfully embed standards within the Education, Skills and Children’s Services (ESCS) system in England.”

Apart from that unfortunate detail, all of the salient points that we made back in February 2015 remain true today. We gathered together the elements of our critique under the following headings, drawn from the goals of the Internet Engineering Task Force (IETF) Standards Process:

  • technical excellence;
  • prior implementation and testing;
  • clear, concise, and easily understood documentation;
  • openness and fairness; and
  • timeliness

It is in this context that we have seen no movement. None. Nada.

We foraged around the (clearly moribund) ISB website and found the following document, entitled Championing Open Data Standards, published in September 2016. It’s so… well, let’s be direct – it’s so pitifully ex post facto that we decided to reprint it in full. You should note that the Information Standards Board has not met since, so it’s our guess that nothing here has changed subsequently. Here it is, verbatim:

Championing open data standards

Why have common and open data standards?

In order to achieve efficient data movement and matching, there is a need for common data standards. These are necessary to allow people and systems to exchange and re-use information.

This was recognised within the Education, Skills and Children’s Services (ESCS) sector, and the Information Standards Board (ISB) was created in 2007 to deliver the data standards and data model to realise savings in the costs incurred with data moving and processing.

Why use the ESCS Information Standards?

DfE and BIS have jointly sponsored a Board and Technical Support Service since 2007 to develop the ESCS standards and support their implementation. Their aim was to provide a definitive set of standards through stakeholder consensus and support their implementation by ensuring developments were shared effectively and lessons learnt.

The standards and data model developed aims to provide a sector-wide solution which is future-proofed, so that it meets both the ability to cover the wide-ranging, but often similar, data items in more general terms, as well as allowing the model the flexibility to develop as a result of new policies which may impact on the data items in the model.

Where are we now?

Whilst much work has been done to develop the model and standards, the potential benefits of that investment are yet to be fully realised. There has been a strong focus on supporting two key areas of model usage so far: the Joint Council for Qualifications’ (JCQ’s) A2C project; and the Data Exchange Project being led by DESAG within DfE. However, the ESCS Information Standards have not yet been widely adopted.

Standards and Data Models by their nature require constant maintenance even once developed. If we are to have common data standards, there will be a need for ongoing support to further develop the model.

Questions for discussion

  • How can we build wider interest and engagement with ESCS Information Standards?
  • Who are the potential users? Government, LAs, MIS, Schools, Colleges, etc
  • What could be done to engage more with users?
  • How can DfE facilitate this?


We’ll say it again: this was the penultimate published document from the ISB, succeeded only by a technical note on 19 December 2016 regarding the latest (very small and, as far as we can establish, unconsulted) changes to the ISB document set.

Given the lack of process by which a standard is judged; given the lack of engagement, input and challenge that a standard requires to warrant the title; and given its complete lack of adoption – it has to be asked:

Can the Information Standards Board’s Business Data Architecture ever be considered an open standard?

To conclude: take a look at when this juggernaut was founded.

Yes, that’s right – 2007. It’s now 2019.

A dozen years have gone. It’s now in its third (or is that fourth?) government, for goodness’ sake. Yet despite this repeated failure in process, despite the singular lack of progress in engagement and adoption, and because of what appears to be a failure in the proper oversight of this monumental white elephant… money continues to be spent.

It’s probably time to ask why, and how much.

Delivering the National Resilience Capability Survey for Cabinet Office

An inauspicious beginning

For more than twenty-five years I was a civil servant, usually in charge of delivering something substantial – so I thought I knew how this would go. Not so.

At the time Cabinet Office awarded the contract to Software for Data Analysis Limited (SDA) the directing policy team were undergoing a seismic change of personnel, the legacy system we were to replace had already been decommissioned, and it was two years on from the previous data collection so many in the field had also moved on. It was our job to introduce our SCROLL survey software into this extreme state of flux.

The job

Cabinet Office’s Civil Contingencies Secretariat (CCS) are responsible for the National Resilience Programme, which aims to increase the UK’s capability to respond to and recover from civil emergencies and provides advice on preparing for a crisis. They needed a solution to enable them to collect, process and analyse resilience and readiness data from national and local responder organisations as well as utilities and Local Authorities. The data is at least “official sensitive” so the solution had to be locked down tight. SDA’s comprehensive ISO27001:2013 certification was only the starting point!

Engaging the client

After we’d negotiated the security capsules in the lobby of 70 Whitehall – and been relieved of all our electronic accoutrements – we were taken to COBRA 2 for our initial briefing and workshop. Here we began to flesh out the details of the assignment. The shopping list was extensive. These were a few of the must-haves:

  • Guided development of hundreds of separate survey questions;
  • 25 different questionnaires delivered simultaneously to 800 diverse respondent groups during a six-week window;
  • Multiple users in an organisation involved in completing each instrument;
  • Different access levels for different users to questionnaire development, data entry and outputs;
  • Local administrative functions for each participating organisation;
  • Multiple views of survey data, from pre-set reports to complete flexibility in describing variables and granularity;
  • Facilities to compare individual elements of demographic groups with the aggregate performance of the respective demographic group;
  • Benchmark reports across different levels of aggregation;
  • A full range of export facilities, e.g. CSV, Excel, Quantum, SPSS and Triple-S.

The list went on. And on.

Making progress

It became clear that though the responder community had huge subject matter expertise, much of it on the frontline, it needed to be complemented by an equal proportion of survey savvy: in other words they knew what they needed to find out, but they didn’t altogether know how to ask. And while the ‘must haves’ were given, the ‘shoulds’, ‘coulds’ and ‘woulds’ were still in the eyes of the beholders. SDA gently suggested that we might be best used to arbitrate between conflicting priorities based on our deep knowledge of our discipline and our product.

At the end of the workshop we proposed two important variations:

  • That CCS delegate the survey design and implementation to us
  • That we manage the entire technical service, as this would be more efficient.

Our public sector colleagues agreed, very quickly, and it seemed – to me at least – that our offer had evoked a metaphorical sigh of relief. We began work in earnest.

A question of security

Cabinet Office is the home of the Government Digital Service (GDS) and so it was unsurprising that the project was run according to agile principles. This was fine by us: SDA have always subscribed to requirements/prototype/iterate in all their many guises. One big deal, though, was the GDS Service Assessment. Ours majored on security. We took them through our Information Security Management System, we showed them our audits, our testing regimes, our certificates and our third party certificates. Still they appeared uncertain.

Finally we suggested that we could install SCROLL on a virtual machine in their own datacentre. Absolutely, they said. Another deep sigh of relief.

Going live

Our guiding imperative was to launch on time: this was an absolute. And honestly, we were always on the front foot. Where our client had questions, we provided answers. Where they had problems, we provided solutions. Where they had deadlines, we met them. Come the day, though, a final flurry of amendments was presented to us – and they were non-negotiable. Of course, we went the extra mile. That’s what we do.

Who cared about the weekend anyway?

Does that all sound difficult? It was. Was it unpleasant? No. It was exhilarating. When you throw yourself into something, the barriers just come down. We all did a great job.

Lessons learned

  • Technology is easy. Organisations aren’t.
  • Never underestimate complexity.
  • Be prepared to lead, sometimes from behind.
  • Focus on delivery.


SCROLL. Did I ever mention it? Maybe, maybe not. Assuming not, well that’s probably because we’ve been so busy building it. And now it’s here. But first things first: what is SCROLL?

It’s this:

SCROLL is SDA’s latest data collection, processing and analysis tool. It embodies 20 years’ focus on turning data into decision-grade information. Used by some of the largest social and market research organisations – and most recently by Cabinet Office – our system has already delivered thousands of data collections. Indeed, it’s taking off all over the place, including France, Germany and… Iran. Using SCROLL’s multi-lingual capabilities our new associate research consultant has adapted the system to accommodate Farsi and, right now, it’s being put to work on the ground in Tehran and other major Iranian cities. A unique feature of the project is that we’ve modified the system for offline operation to allow for specific elements of the local business environment.

Indeed – STOP PRESS (like, this morning) – we have just been confirmed as the principal supplier of computer-assisted mobile interviewing (CAPMI) by RahbarBazaar, the leading market research agency in the region and exclusive regional affiliate of Kantar Insight.

The weird thing is that we sort of slid into operational mode. SCROLL has been in development – a sometimes vexed process – for a while. There were several components that worked perfectly as independent modules but their integration was… problematic. So we stepped up the effort, polishing them up, streamlining their throughput and then, almost unexpectedly, they clicked. A good job they did because – how shall I put this – we’d taken a few liberties with our marketing endeavours. I’ll put my hands up; we had to. You pour resource into development but there comes a point when you’ve got to get your product out of the door.

So, lessons learned? We all know the impulse to make something perfect; to improve, to refine, to embellish. To fiddle.

The simple fact is that this is all displacement activity. Version 1.0 is staring you in the face. Get it out. Put it to work.

The world will direct where it goes next.

Twitter: @SCROLLcontrol

What goes around, comes around: Interoperability, Data Exchange and A4L

Open letter to the A4L Systems Interoperability Framework community, campaigning for a seat on their UK Management Board:

Dear Colleague,

My name is Mark Phillips from Software for Data Analysis Limited. Among other things we have published the DfE’s School Performance tables since 1993.

After a long period of absence, I’m back in the SIF saddle. Unexpectedly I’ve been urged to apply for a position on the UK Management Board. So I’m going to.
I do have some form in this respect. Rather than rehash what I’ve already said I’ll simply quote it here:
“I was the first National Chair of the SIFA UK Management Board, presiding from 2006 to 2009. During that time, along with our committed and vibrant Board, I took SIF on the road – introducing the concept and the detail to Central and Local Government, schools and vendors around the country.

It was the first inflationary period, as it were, where we moved from zero to max in a short but exciting burst of incredible energy, re-tooling the predominantly US standard to fit the requirements of the UK education system. The rationale, both economic and technical, was forged then and has remained compelling since: improved information available more quickly, more efficiently and more effectively – for less cost.

With that manifesto I got the movement onto the map.

Since those times I have moved from Government to the private sector, developing SDA as a force in Open Data, building on the tenets of interoperability across a wider stage and working alongside the Open Data Institute, TechUK and Government Digital Service to develop systems and processes fit for the 21st Century.

Now I’m delighted to see that my original intentions are on the cusp of realisation with the new movement towards Data Exchange at the DfE.

If I was to be elected to the A4L UK Management Board I would bring the longevity and consistency of my vision, now enhanced and improved by new perspectives and wider experience, to bear on the ultimate realisation of this long journey towards open standards and interoperability in UK education.

The final push, if you like.

 I would, of course, fulfil my obligations to the Board in terms of time, commitment and energy – as was ever the case!”

 If you were to distil my ambition into a single point it would be this: to ensure that we deliver a SIF model that leverages the global Specification, wedded to a data model that embodies the principles of an open standard, specifically Due Process, Consensus, Transparency, Balance and, of course, Openness.
If you want to know more about me, don’t hesitate to get in touch.
All the best,


Onwards and upwards…

Software for Data Analysis Limited have been working closely with the National Casino Forum to design and engineer SENSE, the backbone system that crystallises a new approach to Corporate Social Responsibility in the world of UK land-based casino gambling. Liaising closely with representatives of all the major players in the industry, SDA have integrated a huge range of administrative and technical systems to develop and deliver a single, seamless product. Having added some of their technical fairy-dust along the way, SENSE can now be found on the shop floor of every UK casino, enabling people to self-exclude from gambling if things go wrong. The Gambling Commission have already recognised it as a “significant achievement”.

Here’s the press release:

“UK gambling trade association the National Casino Forum (NCF) aims to raise public awareness of responsible gambling by announcing the launch of SENSE – the Self-Enrolment National Self-Exclusion tool, which it will roll out through its land-based casino partners.

This national programme is being introduced by casinos in advance of the Gambling Commission’s licensing condition which is due to come into force on 6 April 2016.  This condition will require operators to participate in multi-operator self-exclusion schemes so that customers are able to self-exclude from gambling facilities.

The SENSE scheme enables customers to voluntarily self-exclude from all participating land-based casino premises and is mandatory for all NCF member operators. Enrolling in SENSE means that customers will, for the first time, be sharing their request to self-exclude from all participating land-based casinos for a minimum period of six months.

The system is designed to be simple and straightforward to use; a casino operator will read out the terms and conditions of SENSE and then ask the customer to electronically sign the enrolment form. The casino operator will also take a photograph of the customer and upload this along with their enrolment form onto the secure SENSE system.  Customers can also download the SENSE information and self-exclusion form at

Once a customer has enrolled in SENSE, operators at participating casinos will be alerted and any marketing material for that customer and memberships will be switched off. If a customer tries to access a casino when they are self-excluded, an operator will note this on the system and it will alert surrounding casinos that they are trying to gain access.  Customers will only be eligible to be removed from SENSE after the six month period and only upon request.

The Gambling Commission commented: “We see the development of sector specific self-exclusion schemes as an important step in providing greater protection to players who require help managing their gambling.

“The casino sector was well placed to lead the way in this but implementing SENSE now, well ahead of the deadline we set, is a significant achievement. We recognise that it required the considerable efforts and full commitment of all NCF’s members.”

Tracy Damestani, CEO of the National Casino Forum, said: “The NCF and its members took the decision to pioneer the first national self-exclusion programme.  Self-exclusion is an important step for people who have recognised that they have a problem with gambling and have made a commitment to deal with it.

“We applaud the Gambling Commission’s decision to introduce a new provision which mandates that all gambling operators will need to implement more effective self-exclusion systems. NCF members recognised the need to have an easy-to-use system in place that can be used nationally across UK casino venues and we have been working on this initiative for the last two years. SENSE was created with the intention of encouraging responsible gambling throughout the industry and to help those people who may be at risk.”

“NCF represents over 98% of the UK’s land-based casinos.  The members’ commitment to implementing the SENSE system has been admirable; they have taken all the steps necessary to ensure their casino staff support customers who wish to self-exclude. While SENSE was developed as a casino initiative to be flexible and progressive, we hope to see other gambling sectors also take similar steps for their customer base.”

The SENSE web-based application system was developed with SDA (Software for Data Analysis).

Michael Hart, Managing Director of Software for Data Analysis Limited (SDA) added: “Developing and delivering SENSE has been an exhilarating time for SDA.  Investing our substantial expertise with data processing and systems development into such a cutting-edge and socially responsible project has been both demanding and rewarding.  I’m delighted with the result, and that we’ve been able to contribute to this groundbreaking product.””

Burn Before Reading

I’m sorry I just can’t help myself.

Every time I return to the fray I seem forced to grapple with a fresh example of inefficient, inconsistent or just plain inscrutable public sector blarney. My only catharsis is to write it down, here. Not to do so would create all manner of new, unwanted and probably damaging internal tensions that I would prefer to do without.

Here’s the deal:

“The Redacted is implementing a new multi-buyer e-procurement platform… that will allow us to manage our supplier base more effectively. To continue to receive notification of potential tender opportunities with the Redacted and its executive agencies… click on the link below to register…”

So. Registration. Should be straightforward, right? I’ve got the link and I’ve been “pre-invited” to register, so what can go wrong? I fling myself into the task with optimism and enthusiasm. Things start well: the first page has recorded some basic details and provided a system ID. Only ten more pages to go.

The second page requires some further information, which I don’t have to hand. Off I go to collect it. I’m already thinking “wouldn’t it have been good if I’d been provided with a list of things I’ll need”. Having collected the information I sit back down to key it in.

I’ve been logged out. Never mind, I’ll pick it up from where I left off.

Wrong. The form has forgotten me (already). So I re-key my previous entries and add the new details. On to the next page.

Rinse and repeat.

Too late, I notice (by complete accident) that a date I had entered in free text had defaulted to the American format. I only discovered this by subsequently looking at the drop-down calendar and finding that my input – 6/11/2015 – had been recorded by the system as 11th June 2015 – which, if it had gone unnoticed, would probably have invalidated the registration. No guidance, no warning.
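
The trap is easy to reproduce. A minimal Python illustration (the form in question was not, of course, written in Python):

```python
from datetime import datetime

raw = "6/11/2015"   # intended as 6 November 2015

# A form that silently assumes the American convention records 11 June:
as_us = datetime.strptime(raw, "%m/%d/%Y")
as_uk = datetime.strptime(raw, "%d/%m/%Y")
print(as_us.strftime("%d %B %Y"))   # 11 June 2015     - what the system stored
print(as_uk.strftime("%d %B %Y"))   # 06 November 2015 - what was meant

# Stating the expected format, validating on entry, or using an
# unambiguous format such as ISO 8601 avoids the trap entirely:
print(as_uk.date().isoformat())     # 2015-11-06
```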

Rinse and repeat.

Suffice it to say, a really, really simple job took me nigh on two hours to complete. That’s two hours of quite expensive, sort-of-executive time. No doubt multiplied across the supplier roster. And we wonder at our productivity failings.

Finally, I finished. I figured that, as the invitation to apply had been extended by a specific Department, to SDA as one of their specific suppliers, and as the invitation had majored on providing us with information about “…a number of forthcoming tenders from across the Redacted and its executive agencies that we don’t want you to miss out on” I could now look forward to examining what was on offer. Of the 63 opportunities noted JUST ONE was with said organisation; the rest were from all manner of public bodies for whom we have never worked, and for which we were mostly unsuited.

I know the Government mantra: don’t build, buy; COTS is best! But really? At least put in some effort to MAKE IT WORK.

Disappointed by this episode I wrote to the “executive agency” that appeared to have been charged with the development of the system, explaining some of my reservations. (SDA are, after all, expert in form creation.) For the next couple of days I got the following message:

“This is an automatically generated Delivery Status Notification
Delivery to the following recipient has been delayed:
Message will be retried for x more day(s)”

Culminating with:

“Delivery to the following recipient failed permanently:
Technical details of permanent failure:
The recipient server did not accept our requests to connect.”


Burn before reading.


The Department for Education, in collaboration with other authorities in the Education, Skills and Children’s Services sector, has, since at least 2009, been developing its own set of data-related standards under the management of the Information Standards Board (ISB).

According to its own website, “The core business of the ISB, supported by the Technical Support Service (TSS), is to successfully embed standards within the Education, Skills and Children’s Services (ESCS) system in England.”

The ISB has two “approved” statuses for its published standards – “Approved: Recommended” and “Approved: Adopted”.  On the ISB website today, there are 269 “Recommended” standards, and zero “Adopted” standards.

So why, in five years, has the ISB failed in its core aim of issuing standards which are actually being used?

We believe the answer lies in the approach taken by the ISB in producing and publishing these standards.  Let’s take as our benchmark the work of the Internet Engineering Task Force (IETF), which is responsible for producing a large number of widely-deployed technical standards.  To quote from “The IETF Standards Process” (RFC 2026):

The goals of the Internet Standards Process are:

  • technical excellence;
  • prior implementation and testing;
  • clear, concise, and easily understood documentation;
  • openness and fairness; and
  • timeliness

Assuming we agree that these are all good goals to have when producing a standard, let’s assess the performance of the ISB against those goals.

1) Technical excellence

It could be argued that technical excellence should be relatively low on the list of priorities for the ISB (its aim is simply to produce usable standards), but there is a minimum requirement that the work be “fit for purpose”.  Unfortunately there have been numerous examples of documents published as “Approved” by the ISB which are simply not fit for purpose.

One of the foundations of the ISB data standards is the Business Data Architecture Data Types document.  This was first published on 20th September 2013 as version 4.0 and was so riddled with errors that that version is not even archived on the ISB website!

It contains statements such as the recursive definition of a date being “A string providing date information and hence containing following values – year, month and date”.  Those three components of a date are each defined to consist of a single digit (0 to 9).  Or defining a type called “Simple_Integer” as being “A simple unsigned string of numeric values.”, and another type called “Integer” with the simpler definition of “Signed numeric value”.

It was simply not possible to implement any of the definitions provided in this document without ignoring large parts of it and trying to guess what the document author intended.

When defining controlled lists of values for common concepts such as language, country and currency, the ISB have quite rightly decided to adopt existing ISO standards in those areas.

However, the ISB have fundamentally misunderstood the purpose of the ISO standards, which are titled “Language Codes – ISO 639”, “Country Codes – ISO 3166” and “Currency Codes – ISO 4217”.  That is, these are standard lists of *codes* for languages, countries and currencies.

In trying to fit in with other controlled lists, the ISB have broken perfectly good existing standards by using the textual descriptions (e.g. Spanish, Spain and Euro) instead of the code defined by the standard.  The problem is when ISO codes remain the same but the textual description changes.
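
A toy illustration of why the code, not the description, should be the key; the data here is hand-rolled for the example, not drawn from the ISO lists themselves:

```python
# Records keyed by ISO code survive a change of display text; records
# keyed by the text itself silently break.

iso_4217 = {"EUR": "Euro"}                 # code -> textual description

payment_by_code = {"currency": "EUR"}      # what ISO intends
payment_by_name = {"currency": "Euro"}     # the approach criticised above

iso_4217["EUR"] = "euro"                   # description revised; code unchanged

print(payment_by_code["currency"] in iso_4217)           # True  - still valid
print(payment_by_name["currency"] in iso_4217.values())  # False - quietly broken
```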

This was raised as an issue with ISB in May 2014 but no visible action has been taken.

Regarding the many other controlled lists and data documents, there is no way to reliably assess their fitness for purpose without attempting to implement them in the real world.  Which brings us to…

2) Prior implementation and testing

The ISB have stated that it is not their role to perform testing – the ISB propose standards and rely on stakeholders to implement them.  However, given that no ISB standards have been through an implementation process and moved to “Adopted” status, this approach is obviously not working.

The ISB standards are also heavily dependent on each other, adding layer upon layer upon layer.  When mistakes are found in documents that many other documents depend on (such as the BDA Data Types document) or improvements are suggested to those documents, then these suggested improvements are rejected as being impossible to implement due to the many other standards that would be affected by such a change.

If no stakeholders have implemented any of these standards then in addition to not proving the standards are fit for purpose, it also raises the obvious questions around the need for those standards in the first place.

The motivation for a lot of the standards developed as part of the work of the ISB has been the DfE Data Transformation Programme, which is described as consisting of the “Data Exchange” and “School Performance Data Programme” projects, neither of which is currently being actively pursued.

However, there is still value in the work undertaken by the ISB, especially in the production of controlled lists which can be used regardless of the over-arching data model.  Instead of investing in the production of further standards, the ISB should stop and invest in its own testing and implementation of at least the core standards and their XML representation.

To quote Albert Einstein, “In theory, theory and practice are the same. In practice, they are not.”

3) Clear, concise, and easily understood documentation

The documents published by the ISB are far from being clear, concise and easily understood.

The ISB staff developing the ISB data model need to use the ERwin software to view, develop and maintain the data model behind all the published ISB documents.

However, the “other side” of the process (organisations wishing to develop systems conforming to ISB) do not have any of that functionality, and are just provided with a set of static PDF documents, often containing 90% boilerplate text and 10% actual content.

There is also an XML Schema (XSD) file which partially specifies the XML representation of ISB data.  In order to obtain the full specification of this XML representation, an implementer is required to refer to the PDF documents.  No examples of XML files have been published.

To quote from the W3C, “XML Schemas express shared vocabularies and allow machines to carry out rules made by people. They provide a means for defining the structure, content and semantics of XML documents.”

The XML Schema provided by ISB defines the structure of an ISB XML file, but as a matter of policy declines to use the language of XML Schema to define the content and semantics of such files.

The main example of this is the lack of incorporation of controlled lists into the Schema, but also includes the lack of any documentation on the semantics of the structures being defined.  Instead, a user of the schema is referred to the many PDF documents as being the definitive source of documentation.

Given that XML will be the main method of transferring ISB data between systems, it is vital to ISB’s success that the documentation on the XML format is both accessible and unambiguous.  The XML Schema language should be fully used to aid in this task, not ignored.

It seems clear that the information published by the ISB needs to be stored in a structured way (e.g. in a database), and the content should be accessible via an interactive website.  In addition, the same database of information could be used to automatically generate PDF documents similar to those now created manually, or to create an XML Schema which as the W3C intended, would “allow machines to carry out rules made by people”.
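
To illustrate the suggestion, here is a minimal sketch of generating an XML Schema enumeration from a controlled list held as structured data; the list contents and type name are invented:

```python
# Sketch: hold each controlled list as structured data and generate the
# machine-readable artefacts from it - the kind of machine-enforceable
# rule the W3C quote above describes.

controlled_lists = {
    "LanguageCode": ["en", "cy", "es"],
}

def to_xsd_enum(name, values):
    """Render a controlled list as an xs:simpleType enumeration."""
    rows = "\n".join(f'    <xs:enumeration value="{v}"/>' for v in values)
    return (f'<xs:simpleType name="{name}">\n'
            f'  <xs:restriction base="xs:string">\n{rows}\n'
            f'  </xs:restriction>\n'
            f'</xs:simpleType>')

print(to_xsd_enum("LanguageCode", controlled_lists["LanguageCode"]))
```

The same database of controlled lists could equally feed the PDF documents and the interactive website, so all three artefacts would stay in step by construction.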

4) Openness and fairness

In 2014, an independent review of ISB was undertaken, which included contributions from stakeholders outside the ISB and its constituent organisations.  However, the findings of this review have not been made public, not even to those who contributed time to the review. This is particularly shocking given this Administration’s emphasis on “transparency”.

5) Timeliness

It is difficult to judge the timeliness of the work of the ISB given that there is no apparent desire to implement any of the published standards.

And Now For Something Completely Different… Scholarly Articles from SDA!

SDA are delighted to announce two new peer-reviewed articles discussing their work to improve access to 1) School Performance data and 2) Free School Meals. Both papers have been spearheaded by Dr Alan Strickley, the former with contributions from SDA colleagues, and are available from

First up is:

A National Single Indicator for Schools in England: Helping Parents Make Informed Decisions

Alan Strickley, John Bertram, Dave Chapman, Michael Hart, Roy Hicks, Derek Kennedy, Mark Phillips.


With an ever-increasing measurement of pupil and school performance and presence of resultant statistical tables and indicators, parents are faced with a sometimes overwhelming plethora of data and information when monitoring the performance of their children’s present or prospective school. The authors are part of a company that has, using open data, developed a parent/carer-accessible site to attempt to address issues and needs for parents/carers. Anecdotal evidence indicates that a single portal where parent/carers can find all the relevant data about schools in England would be an invaluable tool for monitoring and choosing a school. It was decided that such a site would be built around a National Single Indicator (NSI). The indicator is formed from an amalgam of expected progress measures: the main threshold level; pupils’ average points score; and the value added measure. By changing the weight attributed to each of these measures, the website allows parents to modify their relative importance according to the value they place on them. This dynamically alters the overall result to give users their own “personal indicator”, which means they can compare schools in a list tailored to their own specification.
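
The abstract does not give the exact formula, but the mechanism it describes is a user-weighted composite. A minimal sketch of the idea, with invented weights and normalised scores (this is not the published NSI formula):

```python
# Sketch of a user-weighted "personal indicator": a weighted average of
# the three measures named in the abstract. Weights and scores are
# illustrative only.

def personal_indicator(measures, weights):
    """Combine normalised measures (0-1) using parent-chosen weights."""
    total = sum(weights.values())
    return sum(measures[k] * w for k, w in weights.items()) / total

school = {"threshold": 0.62, "avg_points": 0.71, "value_added": 0.55}

# A parent who values progress (value added) three times as highly:
weights = {"threshold": 1, "avg_points": 1, "value_added": 3}
print(round(personal_indicator(school, weights), 3))   # 0.596
```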

The website is available at

The second article is:

Online Free School Meals as a Cloud-Based Solution: Three Case Studies of Its Use in England

Alan Strickley


Online Free School Meals (OFSM) was a transformational programme supported by the Department for Education (DfE) in England. The full process is documented by Strickley[1]. Whilst the use of the system can be judged an overwhelming success, most Local Authorities (LAs) have stopped short of the full web-based system in which parents can apply directly via an online form as a result of the perception of negligible cost benefits created by a lack of technical expertise, scarce resources and server and development costs. The paper describes how these issues were overcome by developing a generic cloud-based solution. The paper looks at the general structure of the solution and examines the experiences of three types of user: an academy consortium, a single school and a large LA to illustrate adoption, implementation, usage and benefits. It concludes that a cloud-based system is cost effective by removing much administration and as a result of lowering the stigma of applying can result in an increase in applications. This has resulted in financial advantages for schools and LAs.

More information is available at