The annual survey of local government websites, Better Connected, has a section about website usage (aka 'take-up').
This is based on information from three sources: Hitwise, GovMetric and the Website take-up service from Socitm Insight.
Hitwise and the Website take-up service both measure website usage, but the results on unique visitor numbers to websites are based on Hitwise's 'local government market percentage share' figures. Subscribers get access to the raw numbers.
Better Connected does contextualise their comments, saying that:
“Each local authority will seek to understand the patterns of traffic to its website; this is no easy task, because it is fraught with technical difficulties about the definition of usage. This is not a task that we can analyse in great detail, because local website statistics are not available in a consistent format that would enable us to make a comparison across all the councils in the way that we can for useful content and usability.”
However, this is exactly what the section does: it compares site usage against local authority population to come up with 'take-up' numbers for regions, and singles out particular councils, as well as a top twenty, as having high 'take-up'.
This is not a good use of - effectively - one set of stats from Hitwise. And I can prove it.
For some time now my council has been sharing access to Google Analytics stats with ten other councils. Here's the comparison of those numbers with those from Hitwise.
[Comparison table not shown]
* September 2008 - to .gov.uk website - internal use excluded (where possible)
** rounded to give some attempt at anonymity
There are a number of caveats. The n/a is actually Hitwise showing 0.00%. Some of these sites have services sitting on other websites with different URLs, not sub-domains. One of them has extremely high internal usage. Obviously, what stats they choose to share may not include some website areas, hence the district with the high population but low website use.
But there are some logical patterns. Cross-referencing with the deprivation index and unemployment rates shows that per capita usage follows them in simple ways. There's also some common sense here: a very rural area might have low broadband penetration and be poorer. Our city, for example, is fairly affluent and has very high broadband penetration.
The most striking discrepancy that Google Analytics shows is a much lower per capita website use than Hitwise reports. Does this mean something, or am I seeing things? If it's true then it's a serious finding in terms of how we are perceiving our success.
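The kind of comparison behind this can be sketched in a few lines. All the figures below are invented for illustration - they are not the numbers from the shared spreadsheet.

```python
# Hypothetical figures: unique visitors for one month from Google Analytics,
# an estimate derived from Hitwise market share, and council population.
# None of these numbers are real; they only illustrate the calculation.
councils = {
    # name: (ga_unique_visitors, hitwise_estimate, population)
    "council_a": (45_000, 90_000, 250_000),
    "council_b": (12_000, 30_000, 110_000),
    "council_c": (8_000, 7_500, 95_000),
}

for name, (ga, hitwise, pop) in councils.items():
    ga_per_capita = ga / pop
    hw_per_capita = hitwise / pop
    ratio = hitwise / ga
    print(f"{name}: GA {ga_per_capita:.2f}/head, "
          f"Hitwise {hw_per_capita:.2f}/head, Hitwise/GA ratio {ratio:.1f}x")
```

If the Hitwise/GA ratio sits consistently well above 1, that is the per-capita discrepancy described above.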
Email paulcanning1ATgmail.com if you want the spreadsheet.
Needless to say, what sparked my interest was the great difference between what Hitwise said and what Google Analytics said about our city. It didn't tally with what GA was telling us, or with what logically made sense.
Both stats packages have their issues, and neither should be used in isolation or to draw very specific comparisons, as Better Connected has done.
Webstats people, such as those working for big commercial transactional sites, say that there are two things to remember:
- use more than one methodology
- remember that what you really want is trends, not hard numbers to present out-of-context
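The second point can be illustrated with a small sketch: two tools will rarely agree on absolute visitor numbers, but if both are measuring the same site, their trends should broadly track each other. All the figures here are invented.

```python
# Compare month-on-month trends rather than raw counts.
# Hypothetical monthly unique-visitor series from two different tools.
ga_monthly = [40_000, 42_000, 46_200, 44_000]      # e.g. Google Analytics
other_monthly = [78_000, 81_500, 90_000, 85_000]   # e.g. a panel-based tool

def mom_change(series):
    """Month-on-month percentage change for each consecutive pair."""
    return [(b - a) / a * 100 for a, b in zip(series, series[1:])]

print([f"{c:+.1f}%" for c in mom_change(ga_monthly)])
print([f"{c:+.1f}%" for c in mom_change(other_monthly)])
# If the signs and rough magnitudes agree, the trend is meaningful
# even though the absolute levels differ.
```

The absolute numbers differ by roughly 2x here, yet both series rise and then dip together - which is the information worth presenting.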
My recommendations:

- Better Connected should get some very specific expert guidance on extrapolating from the data it has available on website usage. Drawing real analysis out of webstats data is not an easy skill, although it is relatively easy to match trends to goals. Better Connected simply doesn't provide the contextualisation that I believe such a professional would provide.
- Better Connected should seek access to stats from those councils using Google Analytics - I'm sure the number is far more than eleven. My experience is that such access is freely given, but you could easily draw up a usage agreement, using one of the free licences, covering how Better Connected could use the data. Obviously, the more data the merrier.
Better Connected says that:
In April 2008, the Public Accounts Committee report about central government websites highlighted weaknesses in the knowledge of website costs and usage, leading to the point that:
“The Government does not know how much it is saving through internet services, nor whether any savings are being re-deployed to improve services for those who do not or cannot use the internet”.
Ab-sol-utely.
In response the Central Office of Information (COI) has embarked on a programme of guidance about website statistics, to be published by the end of March 2009, in order to mandate central government websites to collect new information in the financial year 2009/10. The guidance comprises a set of three documents:
- Measuring website costs
- Measuring website usage
- Measuring website quality [stats are very useful when attached to user testing, especially around new designs]
The criticisms have not been levelled at local government, but many local authorities might be equally vulnerable. The guidance will not, initially at least, mandate councils to collect this information.
Nevertheless, the guidance is likely to be helpful for local authorities and others to follow as representing good practice and reminding decision-makers at all levels that investments in websites should be supported.