Barcelona Principles 2.0 – my first thoughts


The Barcelona Principles 2.0 launched this morning. Unfortunately, I couldn’t get to the launch event, but thanks to Richard Bagnall I was able to watch most of it via his live Periscope stream.

I haven’t had an opportunity to study them in depth or speak to everyone involved in the new Barcelona Principles 2.0, but I do have some first thoughts:

1) Goal setting and measurement are fundamental to communication and public relations

Old: Importance of goal setting and measurement

I like this as it makes it crystal clear that setting objectives isn’t just important, it’s essential. However, I’m not sure I understand, or agree with, the explanation for the phrase “communication and public relations”. AMEC’s explanation is that:

“While the Barcelona Principles were intended to provide a foundation for PR programs, the updated Principles recognize that they can also be applied to the larger communication function of any organization, government, company or brand globally. In fact, measurement, evaluation and goal-setting should be holistic across media and paid, earned, owned and shared channels.”

This doesn’t really make sense, as it never defines what is meant by ‘larger communication function’. The clarification that measurement should be holistic across media and paid, earned, owned and shared channels only confuses matters further, as these are all part of public relations. What does this definition of communications include that public relations doesn’t cover? I think it’s fantastic that AMEC is saying the Barcelona Principles should cover all aspects of communications, but it is potentially misleading people by saying ‘larger communication function’.

2) Measuring communication outcomes is recommended versus only measuring outputs

Old: Measuring the effect on outcomes is preferred to measuring outputs

This is a lot better. The old principle implied that it was wrong to measure outputs. In fact, there is lots of value in measuring and understanding outputs, just never on their own. The updated principle also specifically calls out advocacy as an outcome that can (and should) be measured.

3) The effect on organizational performance can and should be measured where possible

Old: The effect on business results can and should be measured where possible

This fixes one of the fundamental flaws in the original, namely that it now applies to every organisation and not just ‘businesses’. However, that’s rather semantic. The more important change is that it recognises communications affects every aspect of an organisation’s performance, which in turn changes the overall performance of the organisation, for good or bad. I’m less clear about the second part of AMEC’s explanation, which appears to narrow the principle again by bringing it back to ‘integrated marketing and communication’.

It also doesn’t tackle the ‘elephant in the room’ of measuring the effect of communications on organisational performance: how do you isolate the impact of different communications channels from the impact of external environmental factors such as the economy and wider society? For example, what role did a communications campaign play in improving staff retention at a call centre, compared with the fact that unemployment in the town increased by 7% over the same period?

4) Measurement and evaluation require both qualitative and quantitative methods

Old: Media measurement requires quantity and quality

To be fair to ‘old school’ PR measurement, this is the area where it most often did better than it did on measuring outcomes rather than outputs. AMEC’s explanation of this principle is great, but I don’t think the updated principle communicates it clearly enough on its own:

“The updated principle recognizes that qualitative measures are often needed in order to explain “the why” behind the quantitative outcomes. In addition, the updated principle reminds practitioners that to be truly objective, we need to focus on measuring performance (be it positive, negative or neutral), and avoid making assumptions that results will always be positive or “successful.””

5) AVEs are not the value of communications

Old: AVEs are not the value of public relations

It’s a tragic indictment of public relations that we even need this one. When I came into PR more than 25 years ago we already knew that AVEs were an idiotic measurement. Quite why we should still have to reiterate this in 2015 beggars belief. I’ve written extensively about this before, but the fact that this principle is needed means I should probably do an updated blog post on it.

6) Social media can and should be measured consistently with other media channels

Old: Social media can and should be measured

This is a welcome improvement as the old one was a bit ‘doh, that’s obvious’. The explanation of the new one highlights the importance of consistency and the need to measure engagement and quality, rather than just ‘vanity’ quantity measurements. On this principle I always emphasise that just because you can measure it doesn’t mean you should. There are so many ways of measuring social media that you can get side-tracked by hundreds of potential metrics, rather than identifying and focusing on the ones that matter to you.

7) Measurement and evaluation should be transparent, consistent and valid

Old: Transparency and replicability are paramount to sound measurement

When the original Barcelona Principles came out, this was the one that concerned me, as I considered that the way I measured and evaluated was better than many others’. That made it my USP, a unique selling point that demonstrated why clients should pick me. The idea of opening it up to the world was frightening.

I was wrong. I was so wrong. Opening it up and talking about it means it improves considerably, as every conversation I have about PR measurement and evaluation gives me new ways of thinking about it. So, over time, being transparent has made me better too.

The other important point is that if you don’t know how the secret black box works, then you can’t trust anything that comes out of it. When I’m running modernised PR training courses or helping clients to review their PR and digital agencies, I always say that if the agency won’t or can’t explain in detail how it measures and evaluates, then you need to stop them doing it and get someone you can trust to do it instead. Frequently this means not only getting someone else to do the measurement and evaluation, but getting a new PR agency, as the inability to do something so important competently calls into question its ability to do other things.

The new principle is a lot clearer than the old one and thankfully drops the awful word ‘replicability’ which is so hard to get your tongue around!


Barcelona Principles 2.0 just measure communications not PR

Excellent as the Barcelona Principles 2.0 are, it’s important to remember they only measure communications, not PR. Public relations is about reputation, which is a result of what you do, not just how you communicate it.

If you’re inspired by the new Barcelona Principles 2.0 to improve how you measure and evaluate public relations and communications, then please get in touch. I run custom in-house training programmes for PR agencies and in-house PR teams, as well as providing consultancy and advice on modernising your PR strategy. In the UK I also run two-day training courses for the Chartered Institute of Public Relations. Dates for the ‘Measuring and Evaluating PR’ course for the rest of 2015 are:

  • 16-17 September | London (taught by Andrew Smith)
  • 24-25 November | Scotland (taught by Stuart Bruce)
  • 2-3 December | London (taught by Stuart Bruce)
  • 9-10 December | Newcastle (taught by Stuart Bruce)

I also run similar courses in a variety of countries in Europe, the Middle East, India and Asia both directly and through a variety of third party training companies.

Contact me about any of the scheduled UK courses, running one in your country or organising an in-house session.