In a conference room in Brussels, a group of data fanatics has spent two days discussing how governments could be encouraged to make more data available to their citizens, what technologies might be appropriate for the purpose, and what uses citizens might find for the data once they get it.
These are important questions to answer because it is widely accepted that making good quality public data available can reduce corruption and contribute to greater accountability and transparency in policy making.
Amongst the sixty or so attendees at the event, organised by Phil Archer of the W3C, were Franco Accordino (Head of the European Commission’s Task Force Digital Futures), Andrew Stott (Public Sector Transparency Board, former UK Government Director of Transparency & Digital Engagement) and numerous academics and experts in the field.
The vision, which largely united rather than divided those at the event, is one of a world where people have better, quicker, more efficient access to government information, allowing them to hold their representatives to account and generally improve the way they connect with and consume public services.
This might mean better ways to track a policy decision that affects your child’s school, or data that enables you to find out which allotments in your area have the shortest waiting lists. The availability of data is also seen as a major potential contributor to new ways of designing policy, which is something that fits very well with the EU’s Digital Futures project.
It’s worth mentioning that there was an almost unanimous recognition that the best way to make data really valuable is to use the linked data model. There was a particularly good session from Tim Davies on open data and engagement, which described the five stars of working with linked data.
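To make the linked data idea concrete, here is a small sketch in Turtle of how a council might publish the allotment waiting-list data mentioned above as five-star linked data. The URIs and the `ex:` vocabulary are entirely hypothetical, invented for illustration; only `dcterms:` is a real, widely used vocabulary:

```turtle
@prefix dcterms: <http://purl.org/dc/terms/> .
@prefix ex:      <http://example.gov/def/allotments#> .  # hypothetical vocabulary

# One allotment site, identified by a dereferenceable URI (four-star)
<http://example.gov/id/allotment-site/elm-road>
    a ex:AllotmentSite ;
    dcterms:title "Elm Road Allotments" ;
    ex:waitingListLength 14 ;
    # Linking out to an independently published dataset is what
    # earns the fifth star
    ex:locatedIn <http://dbpedia.org/resource/Brussels> .
```

Because each site has its own URI, anyone (the council or a third party) can attach further statements to it later without coordinating with the original publisher.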
While there was widespread agreement on the technologies and the general idea that great things could come from making data available, a number of people agreed with Vagner Diniz, who expressed concern that governments with scarce resources were incurring expense to release data without knowing whether it would ever be of use to their citizens.
The predominant business case from a financial point of view, both for the public bodies investing in the infrastructures needed to make the data available and for non-governmental operators looking for commercial opportunities, is that considerable efficiencies can be found when people, both within organisations and outside them, no longer struggle to get the information they need.
That might not have the brave new world ring of transparent policy-making for nations of “smart” cities and towns whose citizens’ lives will be made easier by good quality, timely data at their fingertips, but it might be enough to get the ball rolling.
Anti-corruption and transparency campaigners are putting governments’ willingness to release data high on their checklist when judging a country’s openness. Many government institutions see enough benefits in terms of efficiency to invest in this area, and there is a growing number of web developers having fun with the available data, producing apps at hackathons. For take-up of the services to become widespread among the population at large, however, there is a need for more products that are directly focussed on solving real-world problems and meeting user needs.
Overall the two-day event was an excellent opportunity to discuss the current state of play regarding open government data. This is an area that is getting a lot of attention, and I’d recommend that anyone who’s interested look through the agenda and read some of the papers (all of which are around two pages long).
And to round off, here’s a picture of the wonderful Atomium, an appropriately shaped building to appear in a post about linked data. I vote that the next meet-up on the subject takes place in one of the balls!