Every day I love you less and less

Communication used to be fun for me. Digital communication especially so. In the mid 90s it was a blank sheet of paper, or one only scribbled all over in pencil. Common sense was all it really took to say what you wanted to say, online, to the audience you wanted to reach.

Since the digital revolution of the last decade (at least) – and as ‘organisations’ make their online presence a strategic priority – it has become increasingly hard to keep that clear line of sight.

Take website rationalisation in the UK government. It is a perfectly simple and absolutely right policy. The information was often badly managed, not maintained and completely impossible to find, notwithstanding the cash that was being poured into a plethora of websites.

Put in its simplest form, website rationalisation means that by 2011 all public sector information for citizens can be found on Directgov, and for business on businesslink.gov.uk (corporate information stays with the departmental websites). This requires convergence of the content on the two main sites and throws up the inevitable cry of: what about the old stuff? Clearly, content that was written yonks ago needs to be re-written and there are new style guides to consider &c &c. But we can’t just switch off the old sites: it is wrong to have broken links in recorded answers to PQs/PMQs, as that information must remain in perpetuity; and once you go down that path you end up in all sorts of mind-boggling complications. The National Archives provides the obvious solution (though that is not as simple as it sounds – because I am nice I will not drive you down through that particular ‘detail devil’). Nor can you switch off URLs, as to do so risks cyber-squatting (on non-.gov domains) by questionable folk.

*sigh* you see… by the time you have wound yourself up in knots about this, the simple pleasure of getting the right information to the right audience is swept up in such a maelstrom that you wish you had never started! But you can’t do that…

Then along comes a new lovely clean simple way of communicating online: one that is not simply a push of content…

WEB TWO (twenty if you’re cool)

Oh how attractive this is to the frankly ragged people like me; and to be fair the bemused policy units, communication and marketing teams, press officers and the rest: aching to be relieved from the too complicated discussions around getting the ‘old, flat’ content to the spangly new macro-sites (and keeping the… yes you get where I am going).

And so we have seen the remarkable rise in supremely fantastic new work across the public sector digital arena, using social media tools: monitoring, influencing and engaging in the *hopefully* appropriate digital communities… so much so that I cannot keep up (unless I give up the day job and simply watch).

In the last 18 months the most desired digital skill set has not been the ability to craft and manage online content, rather the canny knowledge of the community manager: someone who understands how everything works NOW, and can steer a department/organisation into utilising crowdsourcing, cloud computing and Open Source software.

This is all well and good; it honestly is the Good Life of the internet: community based communication.

But it’s not that simple.

Now we have embraced social technologies we come to the problem of data. In order to continue with this trend of ‘going to the people where they are communing’ we must listen to what they need – and increasingly those who enable us to utilise these social tools demand that the raw data be free. I don’t mean personal data about you and me, I mean the data feeds. Give it to us, they say, and we will make our own stuff in a way that we understand.

The answer to the eternal cry of ‘How can we engage the young people?’ Give them the data and let those who know what they are doing create something that their peers will understand.

And so we find ourselves in a quandary. Not because anyone is precious about the data, rather it is not ready; often it has not been held in any format that is easily shared; sometimes data sets have been held in different formats and updated by a variety of people; borders and boundaries differ &c &c.

In order to free this data, a cross-government (central and local) audit needs to take place; and as with the rationalisation of content onto Directgov and businesslink.gov.uk, a redrafting and ordering of the raw data needs to occur, APIs created, ratification of the accuracy, maintenance contracts drawn up, SLAs…

*sigh*

It’s just never as simple as it seems, but we need to do this work. All of it.

I just wanted you to understand how complicated this all is 🙂

Oh and by the way, go and sign up to this: http://www.mashthestate.org.uk/index

#babysteps

PS Apologies to the Kaiser Chiefs… er not sure what I am legally up for when using a song as a blog title.

This is the last word on customer retention

As a champion of social media I am struggling with the moral dilemma of writing a new post based on the one-to-one conversations I have been having in light of my recent musings. How can I credit you all when you do not want to post your opinions in the comments? Only solution I see is to wrap up my own work and include highlights of what you have all taught me. Everything that you have sent to me has been really useful, thank you. Special thanks to Adam Burr from Logica who has been hugely educational and whom I shall quote extensively.

At the end of my post yesterday, I said that I would tie it all up for you: what I have done so far, benefits measurement and Press Office.

To start with the latter: you need to let them know what is happening, give them lines to take on what you have done with the website.

Benefits measurement: Ben ‘just wikipedia me’ Hammersley (yes I name-dropped, and?…) says that we should be fine with 404 stats. I agree up to a point; however, even if the technology is OK and we work it well, that does not necessarily mean that we have reassured our stakeholders and readers whilst we play online pick-up-sticks. (More on this later).

My suggestion is that you put an error 404 capture on, just to see how you are doing and remedy all 404s as they come in. In addition, you run a customer review on the site, every three to six months. I would use a specialist company to do this. This would enable the e-media team to feel fully confident that the people they are delivering a service for – and the people they are delivering the service to… are happy. Not only feeling confident, they will be able to back it up with real user insight.
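As a sketch of what that 404 capture might look like – assuming your server writes a standard combined-format access log (the paths and log format here are illustrative, not the FCO’s actual setup):

```python
import re
from collections import Counter

# Matches the request path and status code in a common/combined-format
# access log line (illustrative pattern; adjust to your server's format).
LOG_LINE = re.compile(r'"(?:GET|HEAD|POST) (\S+) HTTP/[\d.]+" (\d{3})')

def count_404s(log_lines):
    """Return a Counter of request paths that produced a 404."""
    missing = Counter()
    for line in log_lines:
        match = LOG_LINE.search(line)
        if match and match.group(2) == "404":
            missing[match.group(1)] += 1
    return missing

sample = [
    '1.2.3.4 - - [01/Jan/2009] "GET /servlet/Front?pagename=Old HTTP/1.1" 404 512',
    '1.2.3.4 - - [01/Jan/2009] "GET /en/about-the-fco HTTP/1.1" 200 9000',
    '5.6.7.8 - - [01/Jan/2009] "GET /servlet/Front?pagename=Old HTTP/1.1" 404 512',
]
print(count_404s(sample).most_common(1))
```

Run something like this over each day’s log and you have a worklist of broken links to remedy, in order of how much pain they are causing.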

How did we get here…

The original post explained the following:

Exec sum – or similar

In researching this subject I spent many hours on the internet looking for the words ‘keeping your customers when changing URL’ and ‘retention of online customers’. Surprisingly, I found nothing that gave any practical advice. Through contacts in Google, in the dark arts of Search Engine Optimisation (SEO) and in The National Archives, I have found that the answer is:

1. not simple or singular and

2. relies heavily on user input.

The two things that strike fear into the heart of all professional communicators.

In this blog, I have tried to simplify those principles that are imperative to the retention of customers. I have also attempted to provide an action point(ed) list of things to do to complete a diamond, bronze and tin version of customer retention – but would welcome all thoughts… please!

The rest is down to you, your comms team and marketeers.

This blog should also help you avoid time-consuming pitfalls.

Since then I have learned much and have changed my view. The 301 and 410 redirects will do it all for you, but as a business you should reassure your customers through old-fashioned comms. For this you need to:

  • use some kind of reverse-linking tool to see who is driving traffic to you
  • analyse that and ensure that the top 150 referrers are well looked after
  • put a page on the site that you can point people to
  • ensure that you are retaining your customers through measures

Now, Ben started an interesting (although offline) argument about the value of being this anal. Adam Burr put it most succinctly:

Why the URLs are changing

The existing URLs are, for the most part, not good at all. They are being changed because they are being improved! Anyone who has been on the FCO website and clicked around for even 1 minute will have seen the ugly URLs I am referring to.

Why there should be comms

Because authoritative 3rd party sites such as blahblah.gov.uk are trusted by search engines. If these sites continue to have the old links then they are effectively asserting (with implied authority) that we do indeed have content at that URL. Ben is right that things will work, but an outdated link is still an outdated link. The nuances he is missing are…

  • Some of the outdated links will lead to the “page gone” holding page. This will refer people to site search and The National Archives (TNA), but this is not ideal. It would be better for 3rd party webmasters to provide a link to the next-most-helpful page on either our website or someone else’s.
  • The 301 redirects shouldn’t be necessary in perpetuity. For one thing, they lead to a slower user experience, particularly if the user in question is in a part of the world that is distant from our servers and not well equipped from a bandwidth point of view. After all this is the FCO site! They also add to the load on the server as the mapping list will be large. It should be possible to monitor the number of 301s being issued and remove the redirects when the frequency of their use reduces to below a set level. The 410s can take up the slack.
  • And finally, it is inelegant and not “joined-up gov” or “partnership” to not inform the third party sites of the redesign and consequently leave the Internet littered with harder to type, harder to remember, non-up-to-date, soon to be obsolete and slower-performing links.

Aside

I accept that partly it depends on one’s philosophy to web URLs. When an organisation publishes a piece of content at a URL, are they really accepting responsibility to respond to requests for that URL in perpetuity? (I accept that PQs are a separate case as Law seems to mandate the answer – but what is the principle that dictates this for every other URL?) I think change is OK as long as there is a good reason and as long as it is properly managed, which is what we are planning to do!

He was not happy with that and continued:

“Gone off on one” a bit trying to understand whether URLs are some kind of perpetual obligation under accepted webmaster etiquette and best practice

I found this essay which is interesting. It contains an interesting quote:

“Any URL that has ever been exposed to the Internet should live forever: Never let any URL die since doing so means that other sites that link to you will experience linkrot.”

But it also quotes the World Wide Web Consortium’s standard: a server may return a 410, in which case the requester should remove their link, or a 301, in which case they should amend it. So clearly, there is some room for debate as to where the burden of responsibility actually lies, or perhaps how it should be shared.

I think that the practical reality is that no-one wants to maintain forever all the 301 redirects for any page that they have ever had on their domain. I feel that there needs to be a statute of limitations whereby once honour has been satisfied they can be withdrawn.

I am afraid that I cannot be any more explicit than this – it is an interesting discussion (for anal people like me). Would be interested in hearing your thoughts.
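For fellow anal people, Adam’s ‘set level’ test for withdrawing 301s can be sketched as a simple threshold check – assuming you count how often each redirect fires in a given period (the URLs, counts and threshold below are illustrative):

```python
def redirects_to_retire(hit_counts, threshold):
    """Given {old_url: number_of_301s_issued_this_period}, return the
    old URLs whose redirects fired fewer times than the threshold and
    can therefore be withdrawn, letting the 410 take up the slack."""
    return sorted(url for url, hits in hit_counts.items() if hits < threshold)

period_hits = {
    "/servlet/Front?pagename=travel": 4820,  # still heavily used: keep the 301
    "/text_only/old_news_1998": 3,           # barely used: candidate for a 410
    "/directory/embassies_old": 11,
}
print(redirects_to_retire(period_hits, threshold=50))
```

Run it once a quarter against the redirect logs and the mapping list shrinks by itself, once honour has been satisfied.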

Customer retention: update 3

For this to make sense you do really need to read my last two posts on customer retention… or you can just not and read on here but it might be a bit muddling… quick update:

We know about the 301 and 410 redirects and switching these on permanently is sensible. Hansard and The National Archives (TNA) are being superstars in a) accepting our proposed solution for parliamentary questions that contain answers which link to the current FCO site and b) harvesting the whole of our site before we cut over – to sit forever behind the TNA wall and remain available in perpetuity should any queries arise.

Next question was about the blogs. Obviously we need to keep the stuff that we have and we need to ensure that RSS feed readers will obey 301 and 410 redirects. I am assured that this is the case – relief all round!

So far so simple.

Now it is my turn to tackle the comms. I got hold of a Nedstats report of our top 1000 referrers – but anyone can get this info from Google reverse linking – and:

1. Deleted everything after the first 150 referrers (this was because when I looked, there were so many dupes after the first 150 it was silly)

2. Deleted all search engine referrals (we are handling them with redirects and xml sitemap)

3. De-duped the remaining sites

4. Categorised them into: (a) those that we owned, (b) those that we are associated with (in this case .gov.uk domains) and finally (c) private sector sites such as Expedia etc

5. Stopped to admire beautiful and small list of key referrers

6. Created key messages for each group

Bingo
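Steps 1 to 4 can be sketched as a small script over the raw referrer list – the domain names, the search-engine list and the ‘owned’ set below are illustrative stand-ins, not the real Nedstats output:

```python
OWNED = {"fco.gov.uk", "ukvisas.gov.uk"}  # illustrative: sites we run ourselves

def categorise_referrers(referrers, keep=150):
    """Trim, de-dupe and bucket a raw referrer list, mirroring steps 1-4."""
    trimmed = referrers[:keep]                      # step 1: first 150 only
    search_engines = ("google.", "yahoo.", "msn.")  # step 2: drop search referrals
    trimmed = [r for r in trimmed
               if not any(s in r for s in search_engines)]
    unique = sorted(set(trimmed))                   # step 3: de-dupe
    buckets = {"owned": [], "associated": [], "private": []}  # step 4: categorise
    for domain in unique:
        if domain in OWNED:
            buckets["owned"].append(domain)
        elif domain.endswith(".gov.uk"):
            buckets["associated"].append(domain)
        else:
            buckets["private"].append(domain)
    return buckets

raw = ["www.google.com", "fco.gov.uk", "expedia.co.uk",
       "direct.gov.uk", "expedia.co.uk"]
print(categorise_referrers(raw))
```

The output is exactly the beautiful and small list of step 5, ready for a key message per group.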

Now all I have to do is:

  • create a standard by which the success of customer retention can be measured
  • talk to the Press Office about all offline comms/printed matter and assess the size of the task there – do we need to reprint anything?
  • wrap up all that I have done these last two weeks and precis it, with detail of whom I have spoken to and what we need to do next in each case

I feel strangely inadequate. What started as a dramatic and scary task has broken down into something beautifully simple. I shall update you all at the end of my fortnight.

Update on customer retention

After writing about how to retain customers, I was duly summoned by the FCO (my employers at the time) to put my theory into practice and go ahead to make it happen. I am half way through my fortnight of doing so, but thought it might be useful to update you all with how this translates in reality – with some incredibly brilliant help from Adam Burr from Logica.

Disclaimer: my boss, the wonderful Tracy Green (head of the e-media team in FCO) knows that I share this information, it is within the bounds of public sector knowledge share, and full accreditation is given where it is due.

The FCO web project is PRINCE2 certified (I am assured by the fabulous Dave Briggs that there is a PRINCE 1, but I cannot remember what it supposedly did). Anyone who is either PRINCE2 certified, or has worked on a PRINCE2 project, will realise that there is a requirement for minuting your work – which in this case is rather handy for blogging! My project manager, Darren Roberts from PA Consulting, insisted that I turn my musings into a set of deliverables over two weeks. This helped focus the mind…

In between writing the original blog post and the FCO asking me to make this real, I spoke to a contact in Logica, who could answer the technology questions for me – as it had become obvious that there was a relationship between technology and comms, and the money needed to be spent in one or other area. I learned so much and clarified the problems we were facing as follows:

1. Not annoying those who regularly use the site

2. Retaining the support and authority of key ‘linkers’

3. Checking that all Parliamentary Questions held at The House – containing answers referring to FCO web pages, would continue to point to the relevant information (pretty key)

4. Doing the decent thing with the thousands of people/companies reliant on regular updates from the FCO and associated sites

Adam read through my suggestions and proceeded to talk in a succession of numbers. After two meetings, several emails and a document, I think that I can explain what he suggests we do (from a technology/automated point of view). It is beautifully simple – so simple that I am sure it is the perfect answer.

Bear with me whilst I explain.

Glossary first: before I go on, you need to know these two things:

Error 301 (strictly a ‘moved permanently’ status rather than an error): this code means that an old URL whose content has moved will be seamlessly redirected to the new URL – and it sends a silent message to search engines that will ‘accelerate the correction on the search engine indices’ (quote Adam Burr). I suggest that you explore this further off your own bat if you need a full explanation.

Error 410 (strictly a ‘gone’ status): this code works better than a usual 404, because it tells readers and search engines that a page has been removed permanently rather than being temporarily out of action. It also enables you to serve a tailored holding page pointing people to where they should look instead. Once again, this explanation could do with more research, do go off and look it up of your own accord.

Right, now you are ready for the beautifully simple automated solution.

We have identified two problems, both about finding:

1. Content that has migrated to a new home on the beautiful new FCO website

2. Content that has not been migrated to a new home on the beautiful new FCO website

There are differing sets of reasons for why we need to ensure that all content is re-findable, but who cares? If we can solve the two redirects – we are winning.

So, the decisions that we are musing over most seriously are these:

Problem 1: Content that has migrated – we put a 301 ‘moved permanently’ redirect on. This will help our readers, and the search engines.

Problem 2: Content that has been archived – we put a 410 on, giving the reader a splash page with the opportunity to go to the new page and find updated information, go to the new website search page to hunt down what they need, or go to the old page that has been stored in The National Archives (this is a whole other story I am not so sure you readers will want to hear about, but if you do… yell).

How beautifully simple is that?
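As a toy sketch of those mechanics – a stand-in server built on Python’s standard library, with illustrative URLs that are not the FCO’s real ones – the whole scheme fits in a few lines:

```python
import threading
import urllib.error
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Illustrative mappings: migrated pages get a 301 to their new home,
# archived pages get a 410 with a pointer to the holding page.
MOVED = {"/old/travel-advice": "/en/travel-advice"}
GONE = {"/old/press-release-1997"}
NEW_PAGES = {"/en/travel-advice": b"The new travel advice page."}

class RetentionHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in MOVED:
            self.send_response(301)  # moved permanently: point at the new home
            self.send_header("Location", MOVED[self.path])
            self.end_headers()
        elif self.path in GONE:
            self.send_response(410)  # gone for good: serve the tailored page
            self.end_headers()
            self.wfile.write(b"Archived: try site search or The National Archives.")
        elif self.path in NEW_PAGES:
            self.send_response(200)
            self.end_headers()
            self.wfile.write(NEW_PAGES[self.path])
        else:
            self.send_response(404)  # plain not-found
            self.end_headers()

    def log_message(self, *args):    # keep the demo quiet
        pass

server = HTTPServer(("localhost", 0), RetentionHandler)  # port 0 = any free port
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://localhost:{server.server_port}"

# Readers following an old link are moved on without noticing:
reply = urllib.request.urlopen(base + "/old/travel-advice")
print(reply.status, reply.url)  # the 301 was followed silently to the new URL

# An archived page surfaces as a definite 410, not a woolly 404:
try:
    urllib.request.urlopen(base + "/old/press-release-1997")
except urllib.error.HTTPError as err:
    print(err.code)
```

Note how the client (and any RSS reader or search crawler that behaves the same way) follows the 301 without the reader lifting a finger, while the 410 gives you the hook on which to hang the splash page.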

Now all that is required is a segmentation of the stakeholders, linkers and subscribers, and a tailored message for each of the three groups.

I will update you next week on how we handle the comms around this, and the other brilliant stuff that Adam Burr has unearthed that might be even MORE useful!