After I posted in March about Issues with Better Connected's webstats use, Socitm, the publisher of Better Connected (BC), kindly invited me to present at an event in London in May (a presentation partly repeated at #googlelocalgov). Following a conversation with BC's Martin Greenwood, I've been invited to contribute more to the development of BC's web stats section and to how councils are judged on the subject in the annual report.
(Better Connected is the annual review and report on UK local government websites).
The Central Office of Information (COI) published guidance on web stats earlier this year.
The following are some initial notes, so I would welcome comment and input on them. There are issues with how the BC review team can practically judge councils on their stats use, but first I think we need some idea of best practice: how councils should be using them.
The use of web stats for web development is best tied to goals: defining key goals and associated metrics at every level of a council (business, web team, other teams such as comms, and then individual services).
Unfortunately, internal politics often means that reporting (upwards) takes priority over using stats for development. So helping councils allocate their stats resources (mainly time) better is entirely consistent with BC's prime task of improving council websites.
- Customer-driven, which usually means tied to online service delivery through processes, but could be information delivery as well as processes
- Linked to Key Performance Indicators (KPIs); unclear (to me) whether there is standardised (apples-vs-apples) measurement
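To illustrate the apples-vs-apples point, one simple way to standardise a KPI across councils of different sizes is to normalise raw visit counts by population. A minimal sketch, with all figures and council names hypothetical:

```python
# Normalising monthly visits by population gives a comparable KPI
# (visits per 1,000 residents). All figures below are hypothetical.

def visits_per_1000_residents(monthly_visits, population):
    """Return monthly visits scaled per 1,000 residents."""
    return monthly_visits / population * 1000

# Hypothetical monthly visit counts and populations for two councils
councils = {
    "Council A": (120_000, 250_000),
    "Council B": (90_000, 110_000),
}

for name, (visits, pop) in councils.items():
    rate = visits_per_1000_residents(visits, pop)
    print(f"{name}: {rate:.0f} visits per 1,000 residents")
```

The raw counts would suggest Council A's site is the busier; the normalised figure tells the opposite story, which is the kind of distortion unstandardised benchmarking invites.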
- CTR (click-through rate) — actions, exits, abandonment rates; sources being email, direct/search, shortcut, ads, referrer, drill-down, onsite cross-promotion
- Keywords — search referral, SEO, content relevance
- Content strength — time on page, pageviews
- Benchmarking — against call volume, against other councils (NB: apples-vs-apples issues; note city profiles), goal valuation (ROI)
- Is view access shared?
- Trends — pageview, keywords
- Finding problem pages — drop-out rates, low/high time on page, click-backs to indexes, broken funnels (CTR, with follow-ups e.g. surveys, customer services data, focus groups, usability testing)
- Other council comparisons — e.g. sharing Google Analytics view access to compare page drop-out rates, low/high time on page, click-backs to indexes, broken funnels
- Establishing goal use practice/policy to measure funnels
- Testing different designs
- Goal measurement tied to marketing — landing pages, funnels
- Keywords — search referral, SEO, content relevance (primarily the relationship to Google search, but also audience segments)
- Marketing — external link tagging for CTR (click through) measurement
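The funnel and link-tagging points above are both mechanical enough to sketch. First, funnel drop-out: given ordered page-view counts for the steps of an online process, the drop-out rate at each step is the share of the previous step's visitors who did not continue. A minimal sketch, assuming hypothetical counts exported from a stats package such as Google Analytics:

```python
# Per-step funnel drop-out: the fraction of the previous step's visitors
# who did not reach this step. Page paths and counts are hypothetical.

def funnel_dropout(steps):
    """Given ordered (page, visits) pairs for a process funnel,
    return (page, drop-out rate) pairs for each step after the first."""
    rates = []
    for (prev_page, prev_visits), (page, visits) in zip(steps, steps[1:]):
        dropped = prev_visits - visits
        rates.append((page, dropped / prev_visits))
    return rates

# Hypothetical figures for a council-tax payment process
payment_funnel = [
    ("/counciltax/pay", 1000),
    ("/counciltax/pay/details", 600),
    ("/counciltax/pay/confirm", 450),
    ("/counciltax/pay/done", 420),
]

for page, rate in funnel_dropout(payment_funnel):
    print(f"{page}: {rate:.0%} drop-out")
```

A sudden spike at one step (here, 40% abandoning at the details page) is exactly the "problem page" signal that warrants follow-up with surveys, customer services data or usability testing.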
- Linked to business needs
- Reporting needs
- Work/resource allocation — cost/time benefit measurement
- Is the stats package correctly set up? — are secure sites (forms, payments), other sites, email links and downloads tagged?
- Is there integration with CRM?
- Is internal/other use segmented?
- Planning timelines for reporting — also automating reports
- Training — is this resourced, is there buy-in/comprehension?
- Benchmarking — how to set, what to compare to (should be apples vs. apples)
- Segmenting — sourcing demographic audience data
- Use of more than one stats provider
- When to hire expert guidance and for what purposes