
Thursday, 11 July 2013

57. Four Ways to do Results Based Scrutiny




My council has been making use of Results Based Accountability. We use it to support corporate performance work and community planning. While it's the Friedman version we use, other outcomes-based approaches are of course available. I'm not saying this is necessarily the best way of doing things, but I do think there are a number of nice ideas in Results Based Accountability that can be adapted to local government scrutiny.
 
Scrutiny struggles more than a little to demonstrate its impact (perhaps similar to academia in this way).  The role of scrutiny committees is to make recommendations to others, mainly local government cabinet members, on service and policy issues.  It can be tricky, therefore, to demonstrate the actual difference being made to communities as a result of individual pieces of scrutiny work.  Demonstrating the impact of the scrutiny process as a whole is even more challenging.

So here are four suggestions, inspired by my experience of Results Based Accountability, as to how scrutiny might take more of an outcomes approach.

1.  Focus on What's Measurable


Identifying population outcomes that describe the conditions we want to see for our communities is a key aspect of Results Based Accountability. So is identifying population indicators that capture something important about those outcomes. Scrutiny work should, wherever possible, identify the population outcomes and the indicator (yes, just one if possible) that the work is seeking to affect. For in-depth pieces of work in particular, having a high-level outcome measure provides not only focus but also a potential means of evaluating the impact of the work in future.

As an example, I recently supported an inquiry into affordable housing which identified the Welsh Government indicator of ‘all additional affordable housing provision by local authority area’ as a focus.

This applies to individual services as well as to population issues. Any in-depth work should start by identifying the measure that scrutiny, working with the service, wants to see improve.

2.  Ask Results Based Questions


Results Based Accountability sets out a small number of simple questions in order to help people achieve results both at population (e.g. partnership) level and at performance (e.g. organisation) level.

These questions can be adapted for scrutiny as the basis for a consistent, results-based approach to questioning. Specifically, these questions would be used as part of the ‘holding to account’ of those responsible for the policy or service that scrutiny is interested in. Here is my take on what those questions might be:

Population Level Questions

  • What quality of life conditions do you want for the relevant population group(s)?
  • How do you describe what achieving these conditions looks like?
  • How do you measure these conditions?
  • How are you doing on the most important of these measures?
  • Are you working with the right partners?
  • What are you doing to identify what works?
  • What do you propose to do in future?

Performance Level Questions

  • How much did you do?
  • How well did you do it?
  • Is anyone better off as a result?

3.  Evaluate the Impact


Most in-depth pieces of scrutiny work make recommendations to cabinet and then have a process for checking that the agreed recommendations have been implemented. For us, this means signing off the action plans that cabinet members produce in response to in-depth inquiries. In other words: did they do what they said they would do?

A potentially better alternative that we have been talking about is to evaluate the impact instead of the action plan.  This could mean holding an event for all the people who gave evidence to the original inquiry as well as the cabinet member(s) and anyone else who might usefully be involved.  The purpose would be to see what has changed (easier if you have identified an outcome and indicator) and what influence the scrutiny proposals had.

It may well be that this is already being done in other areas, and I would love to hear about it if it is.

4.  Scorecard Your Scrutiny


As well as looking at other services, scrutineers should, of course, be thinking about their own practice. Most local councils produce annual reports for their scrutiny function which seek to show how a difference has been made. In my council we have taken a results-based scorecard approach to reporting what has been achieved, with indicators reported and analysed against the following questions:

  • How much scrutiny did we do? (e.g. number of meetings, number of reports published)
  • How well did we do it? (e.g. awareness of scrutiny, involvement of councillors, councillor rating of scrutiny support service)
  • How much did scrutiny affect the business of the Council? (e.g. number of reports to Cabinet, action plans agreed, follow ups)
  • What were the outcomes of scrutiny? (e.g. recommendations accepted, actions completed, perceptions of councillors and officers that scrutiny had an impact)

The main thing missing, I think, is a genuinely good measure of what the public think.  Let’s just say it’s a work in progress.
