
Wednesday, 5 March 2014

PR Evaluation Survey 2 - 45% say there's a "lack of information"

This is the second of two posts on small-scale research into current PR measurement and evaluation practices. See the methodology note (#) at the end of this post.

The first post found that while 80% claimed to evaluate PR activity formally or informally, 43% continued to use AVE as a prominent measurement metric.

This post reports on who is doing evaluation, the percentage of PR activity evaluated, some attitudes, knowledge of the Barcelona Principles and students’ views on whether evaluation was being undertaken ethically.

Who did the evaluation of PR activities?
Account team/PR staff – 45.6%
In-house research and outside media analysis company – 26.5%
In-house research section – 16.3%
·         It looks as if self-certification of PR activity is common

What percentage of PR activity was evaluated?
100% - 35.7%
90% - 19.0%
80% - 11.9%
·         Good news – two-thirds (66.6%) of the organisations that evaluate measure and monitor 80% to 100% of their PR activity (53.3% of all placement organisations)

Questions about attitudes to PR measurement and evaluation
1)    PR budget is difficult to obtain: 37.7% agree or strongly agree; 28.2% disagree or strongly disagree
2)    There is a lack of information on PR evaluation: 45.3% agree or strongly agree; 34.0% disagree or strongly disagree
3)    There is a lack of time for PR measurement: 55.6% agree or strongly agree; 24.1% disagree or strongly disagree
4)    PR is difficult to measure: 35.8% agree or strongly agree; 35.8% disagree or strongly disagree
5)    Practitioners fear evaluation: 25.0% agree or strongly agree; 42.3% disagree or strongly disagree
6)    Without measurement, PR’s future is threatened: 64.1% agree or strongly agree; 20.8% disagree or strongly disagree

·         Mixed messages – nearly 10 percentage points more respondents say budget is difficult to obtain than disagree; there is an 11-point gap between those who agree that there is a lack of information on PR measurement (a large 45.3%) and those who don’t; a clear majority don’t have time to evaluate (‘too busy doing PR’); opinion is evenly split on whether PR activity is difficult to measure, which is an improvement; and while many disagree that practitioners fear evaluation, nearly two-thirds (64.1%) agree that PR’s future is threatened without the consistent use of measurement and evaluation.

·         The most concerning attitudinal outcome is that 45.3% of organisations say there is a lack of information on PR measurement and evaluation methods. This is a negative comment on the professionalism of many practitioners who can’t be bothered to look at the abundant resources available: online materials (often free), books and training courses. Measurement and evaluation has been a major education and training topic since the mid-1990s and appears to have been ignored.

Students were asked about the percentage of PR budgets that were applied to PR measurement and evaluation. Most, not surprisingly because of their junior positions, ‘Didn’t know’ (53.8%) but the next largest valid percentage was for 1-3% of total budget (17.3%), which aligns with other research in the UK and Australia.

Barcelona Principles
Students were asked whether the Barcelona Declaration of Measurement Principles (AMEC 2010) was referred to or mentioned at their main placement. Their answers were wholly negative, with 55/55 ticking “NO”. Bearing in mind the support that AMEC, CIPR, PRCA, PRSA, IPR, the Global Alliance and others have given to the Barcelona Principles in the past three years, this is a very disappointing result, but it is similar to US research (Ragan and others) that found low awareness.

Was PR evaluation undertaken ethically?
YES – 74.0%; NO – 26.0%
No comment!


# Methodology: PR students at Bournemouth University were surveyed recently about their experiences of evaluation practices during their 2012/13 sandwich-year placement. 55 students (85%) took part voluntarily in the self-completion survey. As all but one (98.2%) had been on placement for nine months or more in a single organisation, they can be considered valid observers of the practices taking place around them or in which they participated. The data were analysed using SPSS, which provided descriptive statistics, mainly frequencies. The data used in these posts are based on ‘Valid Percent’, which omits missing answers unless they are a large part of the sample.
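For readers unfamiliar with the SPSS convention, ‘Valid Percent’ simply computes each category’s percentage over the non-missing responses rather than the whole sample. A minimal sketch (the counts here are hypothetical, not from the survey):

```python
def valid_percent(count: int, total: int, missing: int) -> float:
    """Percentage of one response category over valid (non-missing) cases,
    as reported in an SPSS frequency table's 'Valid Percent' column."""
    valid_n = total - missing
    return 100.0 * count / valid_n

# Hypothetical example: 20 of 55 students tick 'Yes', 5 leave the question blank.
# 20 / (55 - 5) = 40.0% valid percent (vs 36.4% of the whole sample).
print(round(valid_percent(20, 55, 5), 1))  # 40.0
```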

Tuesday, 4 March 2014

PR Evaluation Survey 1 – 80% do measurement; 43% use AVE

PR students at Bournemouth University were surveyed recently about their experiences of evaluation practices during their 2012/13 sandwich year placement. 55 students (85%) took part in the self-completion survey. As all but one of them (98.2%) had been on placement for nine months or more in a single organisation, they can be considered valid observers of practices taking place around them or in which they participated.

Headline news, using valid percentages, was that they reported

  • 80% of placement organisations undertook formal or informal evaluation of PR activity
  • 79.2% of activity was measured in those organisations that undertook evaluation
  • The main measurement was of media coverage, at least 10 times more frequent than measurement of KPIs, social media or organisational objectives (in that order)
  • AVE was used in 43.2% of placement organisations and not used in the other 56.8%
Calculation of AVE

Students who reported that AVE was used were asked “How was AVE calculated at your main placement organisation?”

Their verbatim replies indicate that (1) AVE continues to be widely used and (2) some AMEC members are actively offering products and services to calculate the metric.

Got it from Precise; PR Value = AVE x 3 divided by 100
Based on rates from Gorkana
By media evaluation agency as percentage of editorial value
[Media coverage multiplied] x 3
Via Metrica and Cision's algorithms
We rang companies for the figures or used Precise or Gorkana
We used Mymarket Monitor to indicate AVE which they used page space to decide value
A piece of coverage that took up 1/4 of the page, for example, was divided by 4 and x (multiplied) by 3 or 5 depending on whether it was online or print coverage.
PR value x 3
PR Value = Advertising by 3
For press coverage a value was calculated at the end of the month
Depending on the size of coverage times x 3
Through Precise Media and AVE reports for each campaign
We would send coverage to an outside agency (Kantor) they would reply with AVE
By measuring coverage, working out the costs of this space by advertising rates and times value by 3
Calculated the published article against (an) advertising rate card. Also against viewers
Not sure: PR Value x 3 (divided by) 100%. Same strategy applied for all.
From Precise media cuttings package
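The most common pattern in these replies — coverage area measured against an advertising rate card, then multiplied by 3 — can be sketched as follows. All function names, rates and the example figures are hypothetical; the agencies the students name (Precise, Gorkana, Cision, Metrica) apply their own proprietary rates and multipliers.

```python
# A minimal sketch of the AVE pattern the students describe:
#   AVE = (fraction of page occupied) x (full-page advertising rate card)
#   "PR value" = AVE x 3  (the uplift several replies mention)
# Hypothetical figures throughout; not any agency's actual method.

def ave(page_fraction: float, full_page_rate: float) -> float:
    """Advertising Value Equivalence for one piece of coverage."""
    return page_fraction * full_page_rate

def pr_value(ave_figure: float, multiplier: float = 3.0) -> float:
    """The 'x 3' (sometimes x 5 for online) uplift reported by students."""
    return ave_figure * multiplier

# A quarter-page article measured against a £10,000 full-page rate card:
coverage_ave = ave(0.25, 10_000)   # 2500.0
print(pr_value(coverage_ave))      # 7500.0
```

The sketch also makes the metric’s weakness obvious: the output depends entirely on advertising rates and an arbitrary multiplier, not on any communication outcome.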

Despite the Barcelona Principles, which were announced with very visible support from CIPR and PRCA in 2010, AVE is as widely used as it ever was. And AMEC members, who wrote and adopted the Barcelona Principles which barred use of AVE, are leading the way in its continued usage.

More research outcomes follow soon in PR Evaluation Survey 2.

Thursday, 2 May 2013

PR associations - an uncertain future

On April 21, I posted Are PR associations past their “sell-by” date? It was a rhetorical question that brought a lot of traffic to this blog, along with some comments. It also inspired CIPR presidential candidate Jon White to start a LinkedIn discussion about the questions posed.

(BTW, I am not campaigning for either CIPR presidential candidate. I know both Jon White and Stephen Waddington and wish them well. It’s a benefit that in 2013 there is a civilised debate taking place.)

Despite groans from one contributor that a “Professor of Public Relations” might actually be involved in discussion and debate, I have analysed the posts from more than 20 practitioners on this site, the LinkedIn debate and some other blogs (e.g. Stephen Waddington’s “Wadds” and Heather Yaxley’s “Green Bananas”).

These are the headlines:
1)      There’s no concept of what ‘professional PR’ or professionalism is in UK public relations practice. It’s a vague sort of aspiration that has no dimensions;

2)     About half the respondents consider CIPR should enforce a CPD policy as a requirement of continuing membership.  It should make entry more (rather than less) demanding.

3)     Some consider that CPD is too loosely applied at present; others think enforcing it would be a step too far and “would pull up the ladder” on good members who are less committed or able to spend time on training and continuing development.

4)     About half believe that there should be a PR body of some sort, preferably only one. It should be less costly, less London-centric, offer cheaper training and more benefits. It should be more engaged with stakeholders, but less with internal issues. Others were much less supportive and considered CIPR to be past its expiry date like many club-type organisations. “I think the CIPR should hear the clock ticking”, wrote one contributor.

5)     The majority consider that CIPR does not campaign for PR practitioners and their businesses. (PRCA, however, should be congratulated on its battle with NLA, which has been successful in the Supreme Court.)

6)     CIPR's stance on ethics is soft and relativist. Johanna Fawkes’ comment that “weak engagement with ethics undermines a lot of claims (that PR has) a social benefit, and that most Codes, including CIPR’s, are general statements of intent rather than moral guidelines” captured this.

Overall, there was an undefined feeling that a CIPR-type body should exist, but there were no convincing arguments about its purpose or objectives.

Finally, a personal observation on the comment that CIPR should be “a provider of hard evidence of PR’s value”. Surely it is the practitioner’s role to develop campaigns that create value recognised by clients and employers. Even if CIPR bestrode the whole communication landscape, it could not deliver what practitioners should be doing through the application of research, planning, best practice and applied theory.

For at least two decades UK practitioners have had readily accessible information on research, planning and evaluation, but they have mainly chosen to ignore it in favour of quick fixes like AVEs and other junk data.