Once upon a time, I managed a team who were struggling to crack a tough problem. They needed to improve the dreadful user experience of a major online transaction, but the ‘owners’ of the transaction – a ‘partner’ organisation, in Government-speak – refused to let them have access. The ‘owners’ talked about the extensive user testing they had already done. My guys asked to see it. The Comms people said the IT people had it. The IT people were too busy building the transaction and frankly, digging out that email would put the project in jeopardy.
My guys would ask nicely if they could test the application. They offered to bring in the team nominally responsible at the heart of government. In a rare moment of weakness, the transaction owners offered a 60-minute demo in their far-flung office, but no hands-on access. The deadlines crept nearer and nearer. The emails became more colourful. Worried submissions started to make their way to ministers (who frankly had shorter-term horizons).
At this point in the story, I’d like to point to a dramatic denouement where the resistance crumbled and the heroes triumphed, but the truth is duller and more common. There was a lot of bad feeling, something a bit shit went live, and an unknown number of customers were annoyed or confused.
Where I think Martha’s wrong is in centralising content management and the user experience full stop. We’ve lived with web convergence and a single-domain (well, supersite domains) for a while now, and that works to a point. At best, it’s a seamless integration and quality check; at worst, it’s a pleasant fiction that does no harm.
But she’s gone further than Varney did (retyped from the published, scanned PDF):
Recommendation 3: The model of government online publishing should change radically, with a new central team in Cabinet Office in absolute control of the overall user experience across all digital channels, commissioning all government online information from other departments
The proposed model would see ‘Departmental experts’, presumably policy owners, producing content on commission from the central team, who would manage a ‘shared, agile, cost-effective suite of web services’ to publish it via a single domain, perhaps using departmental subdirectories for navigation.
Now, I’ve been proliferating government websites for some time, it’s currently part of the way I make my living, and the recommendation above is likely to render many former colleagues and friends redundant in due course. None of which are valid reasons to reject a good idea, so I’m trying hard to manage the inevitable conflicts in my reactions.
It’s true that there’s plenty of bad practice across the government web estate, and plenty of opportunity to join up and adopt common infrastructures. Though government speaks to many audiences, it doesn’t do so consistently well. And there are no meaningful incentives or sanctions to lead those who do it well or badly to improve.
It’s quite possible I’ve misinterpreted elements of the admirably concise report, but I’m struggling to see how this model will work in practice:
- Government online content, done right, is simply too big for a single site. I’m not sure what commercial sector examples might be relevant here, but perhaps the BBC comes closest, and it seems to have come to the view that the closest you can come to harmonisation is in standards for content, common search and a basic unifying navigation bar. Directgov and BusinessLink pare down government content, which is essential for citizens and businesses, but useless for intermediaries, researchers and stakeholders. I hope that cleverer minds than mine will be put to the task, but I don’t see how a useful volume of government content for these audiences can be made navigable in one place, except through search.
- Centralising digital channels poses problems for integrating digital into other aspects of government’s work. Whether and how to devolve web publishing is a challenge every large organisation faces: a centralised model is generally more consistent and probably lower measurable cost, but less responsive, creative and integrated with the organisation’s work – and those are bigger challenges in large departments than somewhere more news-oriented like Number 10. Arguably, a central Cabinet Office team co-ordinating digital should really co-ordinate other communications channels too.
- By separating content commissioning, transactions and publishing from digital engagement, an opportunity is lost. Broadening engagement with policymaking really needs people to be involved in context, not in isolation. And I’m not clear what a digital engagement team in a department would do without a say over platforms from which they can form partnerships, except create sneaky blogs and microsites around the margins (and no, you can’t do everything on third party sites or within a government brand).
- It’s somewhat old-fashioned to view websites as a professionally-managed library: the truth is, stakeholders’ interactions with government often happen at an individual or team level, and professional audiences for policy content want human contact, latest news and full data rather than the high quality ‘guides’ developed by supersites like Business Link. Improving the responsiveness of government and its transparency needs to involve putting a window on the wormery, not laying neater turf over it.
There are definitely some opportunities, and it’s encouraging to see ideas from outside as bold as this, with an early indication of support from Ministers. What might well make sense would include:
- Pooling capability: more sharing of the expertise and resources around government for multimedia production, user research, search optimisation, email marketing and so on
- Shared infrastructures: a small but plural set of platforms for different organisations and different purposes, including low-cost open source social CMSes, heavyweight publishing-oriented platforms, community-oriented platforms and data/document repositories along with a menu of sensible hosting options
- A common look and feel: as mandated in Canada, underpinned with clear, practical and concise guidance
- Pan-government search and shared services for commoditised applications such as vacancies, news, speeches and formal consultation, learning from the best in-house work on API development within government, such as for civil service jobs
- Co-ordinated news planning: integrating government platforms better to promote the big launches, combining the people, tools and partnerships to give big things a proper push online
- A unifying vision: a digital strategy which brings together the various functions of digital and articulates its role, and the key direction for government online in supporting transparency, increasing participation, improving customer service and strengthening public sector collaboration
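To make the shared-services idea in that list a little more concrete, here is a minimal sketch of what a commoditised, pooled feed (vacancies, say) might look like to a departmental developer: one common dataset, filtered locally by facet, rather than the same application rebuilt in every department. Everything here is invented for illustration – the records, field names and function are hypothetical, not any real government API.

```python
# Hypothetical sketch only: the feed, field names and records below are
# invented to illustrate the idea of a shared, commoditised service
# filtered per-department, rather than rebuilt by each department.

def filter_feed(items, **facets):
    """Return feed items matching every supplied facet value."""
    return [item for item in items
            if all(item.get(key) == value for key, value in facets.items())]

# Invented sample records standing in for a pooled cross-government feed.
vacancies = [
    {"title": "Policy Adviser", "department": "BIS", "location": "London"},
    {"title": "Web Editor", "department": "DWP", "location": "Leeds"},
    {"title": "Statistician", "department": "DWP", "location": "London"},
]

# A department reuses the common feed, narrowed to its own audience.
dwp_london = filter_feed(vacancies, department="DWP", location="London")
print([v["title"] for v in dwp_london])  # → ['Statistician']
```

The point of the sketch is the division of labour: the centre maintains one feed and one data model; departments keep control of presentation and filtering for their own audiences.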
It’s easy to be critical from the sidelines, and I don’t want to fall into that trap. There’s much to like in Martha’s report, and a real opportunity to make things better. If it’s done right.
For ease of reference, I’ve uploaded OCRed versions of the two documents here to improve accessibility and quotability:
If I remember from the review, activities like running digital campaigns and digital engagement type stuff will still happen through teams within individual departments. I guess the worry is that centralisation could – as you say – lead to redundancies and good people leaving (as their responsibilities shrink), so you have fewer good digital people in departments with possibly more on their plate, and also fewer ‘channels’ or flexibility to be creative, i.e. their own WP install. Although that could just lead to more creativity all round as people do interesting stuff in the margins.
Also – fewer digital people championing what is possible from within departments could lead to a fairly by-the-numbers approach to digital marketing (my area of interest) led by the centre.
In my experience (working as a UX consultant for a government “partner” on a DirectGov / govt dept project) there are a few relatively simple to observe, but difficult to fix, problems at the heart of DirectGov.
Firstly, the individual projects are run by civil servants, often with no knowledge of the web. They are led by their outsourced IT partners to solutions that generate the most profit for those partners with the least work.
Then, as you touched on yourself, there’s the fact that these projects are often driven by Ministerial promises to parliament or the press. In these cases the driver is to get “something” done, by an arbitrary deadline. Quality is very much secondary.
Then there’s the fact that there’s no effective monitoring of the success of these projects after delivery. Press releases heap praise, civil servants get promoted, suppliers get paid, and everyone moves on. Even when the delivered project is of scandalously poor quality.
I know this might sound harsh, but it’s based entirely on my experience of trying to turn the tide of such projects in favour of something that’s going to actually benefit the public.
I agree with the issues you’ve raised too, but I do wonder if the problems I’ve mentioned are going to undermine any approach to govt websites, whatever Martha Lane-Fox says. If the management of these sites effectively moves higher up the food chain to the Cabinet Office, will this solve the problems or just further remove the decision-making from people who understand the issues and problems involved?
I remember getting excited about direct 8 years ago when I worked at Ofsted. The e-GIF and e-GMS projects promised a bright future where local gov, central gov and public services could share data seamlessly and provide front-end interfaces for the public.
Shame to hear that the same blockages are getting in the way now. We could have got this sorted years ago and been in a much stronger position to cut costs and increase efficiency. I think it needs a non-government agency (a bit like the Bank of England) to be in charge of data services before we see any significant change. These decisions (and their funding) are made by short-termist politicians with little or no understanding of the technical challenges or benefits.
Agree with everything above, especially Lee’s comment. It’s a strong vision but very, very difficult to do in practice.
The reality is that any front-end transactional service needs to be integrated with a matching back end, and some of the big govt departments have back end systems dating back to the early 90’s. Improving the functionality of Directgov is only half the battle – and probably the easier half.
Also – from a citizen’s point of view, they never just interact with a service via the web – it will be a combination of web, phone, letter, face to face – just like with a bank. So unless all those join up within a Department, the citizen will be left with a disjointed experience. It is hard to amalgamate several different channels with associated back-office processes in a large department in any situation – but doing so when you don’t control the development of your own web transactions is doubly difficult.
Finally – do citizens want a one-stop shop?
I for one could easily handle having to go to the Jobcentre Plus website for my jobsearch and JSA queries, to the DVLA to sort out my driver’s licence, and to the council website to find out my bin collection days. Directgov doesn’t make it any easier – just different, so is the cost justified?
And the fact that Jobcentre Plus has a massive high-street presence but no website is odd in this day and age…
Attempts to engage Student Finance England behind the @directgov website have failed for 2 years plus, with the same problems unfixed despite letting the CEO & Finance Officer walk away with their pensions. Attempts to highlight this through MPs, BIS, the PM’s Office and the Info ombudsman get nowhere. Very fast.
Same is true with HMRC, DVLC, Passport Office.
Individuals protecting their own department, budget or job, without any comprehension of the bad light in which Govt IT is shown – incompetent to every online citizen.
pass it on to Martha.
Two comments really stick out for me here: “something a bit shit went live” (from the post), and “the driver is to get ‘something’ done” from Lee’s comment.
I can’t help but think this is the depressingly modern British way of thinking – it’s more important to have an X than it is to have a good X. In a way, the British govt’s approach to technology is akin to supermarket food – having a “website” is like having meat; get it as cheaply and conveniently as you can, and at the end of the day, at least no-one can say you *didn’t* have a website/some meat.
I suspect that things will be fairly dire so long as nobody challenges *why* these things should be built. Having a website because “everyone else has one” leads to crap projects. Having a website because it fills a functional gap leads to good projects. It’s not difficult, but changing people’s agendas is.
Great breakdown of the issues and opportunities, thanks for sharing. Centralisation is also likely to hamper continual improvement. I note that the Canadian sites you reference are still 780px wide.
Another great summary, Steph, from a man who’s worked at the coal face!
Incidentally, on the subject of government content being too big for a single site, clever minds have been looking into this. Fresh Networks proposed a workable solution for a ‘Resource Library’ of all cross-government content earlier this year. Based on the principle that many people come to government websites just looking for a single document, but may not necessarily know which department produced it (with the added confusion that it often moves around following Machinery of Government changes), it involved a faceted search model which could easily be extended to incorporate all public policy documents.
Although the project was shelved following the general election, the work had already been done and perhaps it could still see the light of day.
At the risk of being scythed down by a Directgov SWAT team I thought I’d throw in my 2p worth.
I’ve posted here before about SOME centralisation being a good thing. Put all the key transactions in one place, make them easy to use and market the hell out of it – job done for a large percentage of user journeys.
Plus a bit more centralisation would stop departments reinventing the wheel when there’s no need, with money being spent over and over on the same things (e.g. how many times do you need to rebuild a news page?)
But I worry that this solution goes too far. Maybe I’ve misunderstood what’s being proposed, but how can one team deal with the hundreds of requests they’re going to get? How can one platform deal with all eventualities? If you want to do anything you need to file a CR. If you want to publish something, see this flowchart. If you don’t agree with what we say, well, tough…
I and others have posted here about a looser structure comprising shared infrastructure, common templates that anyone can build on and release, developers sitting in depts with digital comms people so they can break new ground every day.
Surely we need a bit of both here – one big destination for the key stuff and the ability to innovate at the margins.
As for there being a lot of learn from the commercial sector (under point 1 in the MLF review), well yes there is but you need to look in the right places.
The ideal solution is not big, corporate and centralising like the one being proposed (the publishing industry is littered with such failures) but rather flexible, able to iterate, open to ideas from all comers. Probably a lot like lastminute.com when you started it, Martha.
This is a very interesting topic, and too much to take on all at once. I feel a need to make one thing explicit, so we can have a debate about it – and that topic is: what kinds of things should government standardise, and what should we allow to proliferate?
I used to work at the Planning Portal, and there was a recurring debate. Local planning authorities wanted planning applications to be on their websites. The Whitehall department wanted a single, national portal for information about planning in England and Wales. The architects and surveyors wanted a consistent standard so that they could work the same with all the local authorities. Some people felt it should all be on Direct.gov, because it was for the general public. Others felt that since 90% of planning applications were done through a professional agent, there should be a site aimed at agents.
The lesson I learned from this, which might be helpful today, is that we need to be very careful when we argue about standardising and centralising. IMHO, it is good to support several different ways to find the same information and access the same services, in order to meet the needs of people who have different expectations and different levels of expertise. But it’s also good to have standard structures and reusable components that you can build once and reuse all over the place. The difficult decisions are about where to draw the lines.
I very much agree with you, Steph. To use a road metaphor, I don’t see any problem with a central agency having responsibility for the motorways and encouraging people to use the motorways for big trips, but that’s a long way from saying there shall be no road travel outside the motorway network 🙂
Excellent post, Steph. Really constructive. Thank you.
Full disclosure: I helped Martha with the review. First, bit of a mea culpa: From the reaction today, I can see a need to clarify recommendation 3.
The *last* thing that needs to happen is for all online publishing to be centralised into one humungous, inflexible, inefficient central team doing everything from nuts to bolts from a bunker somewhere deep in Cabinet Office.
The review doesn’t recommend that. Trust me! It does, as you spotted, point towards a model which is closer to the BBC – a federated commissioning approach, where ‘commissioning’ is more akin to the hands-off commissioning of a TV series, rather than micro-commissioning as per a newspaper editor. Equally, it recommends consistent, high-quality shared UI/design/functionality/serving. Crucially, it recommends universal user metrics driving improvement (or removal) when content can be seen to be underperforming.
BBC.co.uk has now centralised its UI/design/core-func/backend a lot more than you suggest – there used to be 20+ media players for example. The visual design, and core nav/functional components are now shared, and very consistent, albeit necessarily flexible within bounds to cater for a wide range of audiences. There’s now no mistaking that you’re on a BBC site. And if you’ve learned to use one BBC site, you’ll probably know how stuff kinda works on all other BBC sites. BBC is also moving swiftly to a single platform, and the number of CMSs has reduced by an order of magnitude in the past 5 years. This is a pretty major achievement (all done after I left, natch).
So BBC.co.uk is a single website, with content created and managed day to day by departmental teams on a federated basis, with consistent high-quality UI/design. Cue a major increase in user appreciation, as well as traffic. The same core four user metrics are measured quarterly across all BBC.co.uk sites, and underperforming sites are focused on pretty ruthlessly (see the example of BBC Weather in this excellent report on web product management)
Guardian online works in a very similar way. Common, shared UI/design/functional components/CMS/serving, with local editors having day to day editorial control, but their sections being assessed regularly on overall performance.
Central government website portfolio by comparison? From a user need perspective? With a poor user coming in from Google, or a deep link, as do vast majority of users? Please. Don’t. From a *quality of experience* perspective??? Please. Ouch.
For a start, how often do departmental *names*, and hence urls change? I really can’t see many good user-centric arguments for an architecture defined by dozens of silo’d, hermetically-sealed, vertically-integrated departmental sites. And that’s well before you get to the inefficiencies.
Wrt your argument about different stakeholders requiring different flavours/depth of content. Good point. I’m not sure it follows that this content needs to exist on completely different, silo’d sites, though. Yes, nav will be a challenge, but that’s what dynamic aggregation was invented for! (see Guardian topic pages). And, again, 80% of traffic is from search or deep links already…
What I really do like in your analysis is the need for a unifying digital vision. A vision which has as its core message a passion for meeting user need, which recognises that, done right, ‘digital by default’ *should* change and challenge core policy, and which brings the design and management of government online services in from over-outsourced ‘IT’ wildernesses.
One final point: The apparent absence of a strong cadre of mutually supportive, mutually respectful Internet practitioners *across* departments is a major weakness. That’s probably a function of wholesale outsourcing, and Whitehall culture, but it’s a major barrier nonetheless. Great products come from great culture. Hard problem.
PS Apologies for the scanned pdf. No excuses. Being fixed now the speeches are over.
So which huge system integrator will get the £100m contract to implement the £100k CMS then? Bets on for Serco, Steria, Capita or BT.
Either way, someone will get very rich and deliver very little. OR the Government may decide that civil servants, rather than protecting their jobs and choosing a commercial SI to carry the risk – will be brave and take a commercial decision to invest in a British Company to do a good job of it. There are plenty such companies in the UK market.
[…] reaction from Steph Gray, Simon […]
Things have moved on the ‘cadre’ front (cf Tom’s penultimate para) over the last year or two in my opinion. It feels as though there’s been a genuine shift amongst gov internet practitioners to share and improve things, working more collaboratively despite very different circumstances in parent Departments. But it ain’t a *strong* cadre at the moment. It remains to be seen whether the review broom will sweep through, starting afresh, or provide recognition and support for the good stuff that is happening, and proper, sharp and scary teeth to bite the legs of the activities we all know don’t work as well…
Tom, thanks for the clarification.
But please, please, please – read John Culkin’s comment above.
In fact, seek him out and get him to help you with the next review.
(Disclaimer, I have no idea who he is – but I do know he’s a wise old, or young, sage).
The BBC site, crucially, does not offer much if anything in the way of transactions.
Most govt sites (the worthwhile ones) do just that.
Always linked to your records. And fairly complex ones at that.
Technologically speaking it’s like comparing apples and oranges.
This is not, repeat not, a ‘new CMS’ exercise as 99% of the universe appears to think.
Matt: Thank you. Fear not, my comment was only intended to clear up confusion wrt the publishing side of the review. As Martha has made clear, the transaction shift is far more important, and as you say, a whole different kettle of fish.
Departments need to be accountable for the delivery of their transactions, but the same logic holds wrt shared designs, UI & components – the citizen should not have to re-learn a different visual, design and functional language for every single gov transaction. And that transaction design capability needs to be developed and nurtured.
My worry with the transactional bit is that moving it all centrally would be a rather complex undertaking. SPINE showed very well that implementations on such a massive scale are troublesome to say the least.
Although I’m all for a bit of variety, a consistent look and feel would if nothing else save a small fortune on graphic design spend. Not sure any other savings would be realised.
Something I’ve never understood about govt. websites is… there is supposed to be ‘one’. There are actually 3000. Some departments have 10s or 100s, depending how you count (often with different designs). If we’re aiming for a degree of consistency, wouldn’t it be better to start at a departmental level?
Firstly, thank you Steph for your ever insightful thoughts and for hosting this discussion. I am really happy to see similarities between what is being suggested by Martha Lane-Fox and being discussed here and the stuff we have been talking about on the smaller scale of a city web presence. However, I think that, to badly misquote Phil Anderson, smaller is different and I’d really like to talk about some of those differences, even if it is slightly off topic.
The idea of centralising standards, guidance, expertise and controls rather than production is one that was core to the web strategy we recently worked on with Sheffield City Council. However, when talking about an entire city’s web presence (including local businesses and citizens) the control often becomes irrelevant and has to be replaced with persuasion.
The methods then become about providing the builders of web systems with facilities that add value such as a common search mechanism, joined up branding, taxonomies, geo-coding tools, single-sign-on systems. These facilities can then be used by Council departments, partners, businesses and citizens to build web systems from the most effective technologies available, including the open source and low cost platforms that Steph referred to.
To my mind, the case for building a centralised CMS to manage all content is simply an extension of this. If the features provided by the content management tools are useful and/or there are savings to be made in training/expertise, then it should be used; otherwise, use a tool that fits better. This is predicated on the fact that most of the features that are commonly integrated into a CMS are offered as separate services (i.e. search, commenting, login, profile, semantic/geo-analysis, etc).
I feel that the major challenge of adopting a federated service approach in local government is the persistence of existing and enshrined off-the-shelf applications. It is quite easy to justify the cost of creating a transaction with a user experience that meets the required standards when there are tens of millions of potential users and massive savings to be made. But in the local government market, the aggregation of requirements is done (if at all) by software vendors, who seem to have a shallow view of user experience. Each council gets the choice of taking a standard package off the shelf and disappointing and alienating their users, or spending more to provide a user experience that will achieve the channel shifts they need. How many will choose to believe the sales material of the software vendors and dismiss the importance of good user experience to save money?
[…] of language throughout the report was at times vague and confusing. Given the fact that it took a comment from Tom Loosemore on Steph Grays blogpost to radically change most peoples reading of a pretty important section of […]
[…] and detail of the recommendations, admirably and openly led in government circles on the blogs of Steph, Simon and Neil. The detail will determine the success, and we don’t have that yet. But it […]
[…] prepared by Martha Lane Fox for Francis Maude, which needs to be read in conjunction with the essential gloss provided by Tom Loosemore (scroll down to the comments, pausing to read Steph’s post as you go, alas no way of linking […]
Tom – thanks for the extra detail and clarifications. I’d like to ask a question about the review – at the start, were you considering all options, or was the direction you took influenced by the pre-existence of Directgov?
I recently heard a story about India’s telephone network, which I will recount (badly) here. In today’s India, the mobile phone network is far superior to the landline network. For years, landline telephone services were terrible and the infrastructure was really under-developed, and most Indians had poor or non-existent access to phones. Then mobiles came along, so they just concentrated on that, because it rendered the landline network obsolete. In a way, their comparative failure to implement landlines has left them with a better system – it’s all mobile now anyway, no roads to dig up, no reliance on copper wires, and people only have to pay one bill.
There can be a downside to early adoption.
Maybe the Directgov idea was of its time, and web convergence certainly made a lot of sense in the early 2000s, but I often wonder where we’d be now if it had never existed. Would today’s brilliant web strategists, armed with an understanding of how government transactions work offline, still recommend a supersite approach? Stupid hypothetical question I know, but there it is…
[…] to be narrowed. (The scale of the challenge involved in Lane Fox’s proposals can be found in this Helpful Technology blog post and its interesting […]
A view from the fringes. Our exec NDPB provides UK-wide high-level biodiversity advice to the governments of the UK and detailed supporting/reporting information to the conservation community. Very little is transactional. Very little is targeted at the lay citizen.
As a tax-payer funded organisation we are included in TGWR and will also be impacted by this review. But the proposed model worries me from a number of angles:
It looks like an England scale initiative so I wonder how we continue to deliver for Scotland, Wales, Northern Ireland, the UK Overseas Territories and Crown Dependencies?
The audience for biodiversity content is very broad (even as a microcosm of the whole of government content). Bringing together the higher-level public-facing information from across the environmental and conservation organisations into a central, more standard location makes great sense, but it’s more difficult to see how this would work for the technical, detailed information aimed at a more informed audience.
Could the content commissioning model ever deliver technical/scientific content unless it was devolved to a level that understood the demand for it? Which probably isn’t too far from where we are at the moment.
Could the common infrastructure accommodate third-party applications built to deliver dynamic content from other datasources that would sit outside the CMS? Obviously it would be technically possible – but would the integration of such applications be devolved?
Metrics are essential but need to be understood in context. If your content is specialist, your total audience is restricted, and even a massively successful piece of content may return raw metrics that don’t compare well to a piece of generalist content that actually ‘works’ less well.
The focus on consistent, high quality, secure transactions between the citizen and their government is excellent. Sweeping up the whole of *.gov.uk web content into the one solution for transactions is a much more serious challenge.
[…] reaction from Steph Gray, Simon Dickson, Neil Williams, Michele Ide-Smith, Matt Jukes, Public Strategist, Andrew Lewin, Tom […]
I’m one of the owners of the transaction which this blog post is based on. I read this post when it was first published and asked my manager if I could respond, but they weren’t comfortable about this at the time – as you can appreciate, relationships between different government bodies can be strained. Recently they relented and said I can respond to posts like this anonymously, so I’m putting in this comment for posterity.
Our version of these events is very different from Steph’s. The new portal was developed throughout 2009; we weren’t able to offer access to the development environments outside our offices but there was an open invitation to Steph’s former organisation (and Steph himself) to visit our far-flung office (four hours on the train, incidentally) and test the portal at any time; they chose not to do so until December 2009 but made several more visits after this, up until the launch in April 2010. They reviewed all the content; signed off the design (which was agreed in advance and built to meet their toolkit requirements); and reviewed the results of the £60K we spent on customer engagement and usability testing. They came back with only minor modifications to page instructions before the launch.
As for it being ‘a bit shit’, a million people have successfully used the portal since the launch; we launched on time and under budget despite our organisation going through a swathe of redundancies in April 2010. Nonetheless, I accept we could do far more to increase the usability of our online services; we’ve worked with Steph’s former organisation since December last year to run a new usability project.
The suggestion of Steph’s post is of an organisation struggling to comprehend the notion of web usability and stubbornly refusing any offers of help. The reality is, we’ve spent a large amount of time and money building new services and testing them with customers, and we’ve been open to input from partner organisations. I would welcome adding a ‘window on the wormery’ for the public to see the advancements we’ve made, despite being under severe financial and bureaucratic constraints.
I rather like the image of people working to deliver public web services as ‘worms’ though – we spend a lot of time fumbling blindly in the dark; but as any gardener will tell you, at least worms help create something at the end of the day. Plus, the idea of sending in Martha’s ‘SWAT team’ to help us worms produces rather a wonderful mixed metaphor.
[…] Blog which argues for leaving things as they are https://postbureaucrat.com/2010/11/a-window-on-the-wormery/ […]
[…] be a flexible platform for digital engagement and make government more efficient and transparent: a window on the wormery, not a neat layer of turf on top. Let Alphagov make digital public services work for citizens, but don’t break it by importing the […]
[…] – policy is another. Despite the awards, I think the jury is still out on whether tidying up the lawn to make policy accessible to new audiences was worth alienating the existing ones, currently […]