Introduction

At PSOW, we are keen to learn how we can improve the experience of our service users. We have five Service Standards committing us to excellent customer service. Below, we discuss the telephone survey of our complainants that we undertook during 2019/20 to find out how people rate our service and how we performed against our Service Standards. We thank all the respondents for their time and insights.

Research, background and design

In the past, we gathered feedback from our service users mainly through a survey located on our website. Service users would be invited to complete the survey by following a link in all case-related correspondence. However, we found that the response rate to this survey was generally low – for example, during 2019/20 it was completed 75 times. The pool of respondents was also entirely self-selecting. This meant that we had no way of ensuring that the survey results were representative of the experience of the broader pool of our complainants.

For these reasons, we decided to undertake additional customer satisfaction research through a targeted telephone survey of a representative sample of our complainants. The external supplier we commissioned for this work (Beaufort Research) conducted 204 interviews with our complainants whose cases were closed between April and December 2019.

The sample was constructed to enable us to identify trends in our performance based on:

  • case type (public body complaints / Code of Conduct complaints)
  • case stage (closed at assessment stage / closed at investigation stage)
  • case outcome (intervention / no intervention)
  • case subject (health / other subject)

The sample of respondents was constructed to be broadly proportionate. However, we weighted the sample to give slightly higher than proportionate representation to complainants whose cases were closed at the investigation stage. We did this to reflect the fact that the majority of our office's resources are directed towards investigations.

Themes

The survey results pointed to several themes in the experience of our customers.

First, we noted a very strong correlation between the level of satisfaction with our customer service and the level of satisfaction with the case outcome. People satisfied with the outcome of their case assessed our performance much more favourably than those who were not satisfied. The differences in scores between these two categories were so significant that, in the interest of fairness, we report both scores side by side throughout this post.

Second, we found that our service users were generally more satisfied with our performance in handling cases about public bodies than cases about alleged breaches of the Code of Conduct. The aspects of our service that scored noticeably lower for the latter group of cases included how well we explained our role and process; how well we understood a complaint; whether we considered the evidence thoroughly; and whether we were impartial.

Third, in general, people assessed our service more positively in relation to cases we closed at the investigation stage than those closed at the assessment stage. There was, however, one exception: our score on timeliness was higher for the assessment cases.

Lastly, our performance on cases related to health scored higher than other cases in relation to staff politeness and quality of communication – but lower in relation to how well we explained our role and process; how well we understood a complaint and whether we considered evidence thoroughly and impartially.

Our findings by Service Standard

Overall, 57% of all respondents said that they were very or fairly satisfied with the level of customer service they had received from PSOW. This score was much higher for those who were satisfied with the outcome (98%).

In designing the survey, we aimed to ensure that the questions would allow us to rate our service against our Service Standards. The questions covered four of our five Standards.

We did not ask questions in respect of our fifth Standard – “We will operate in a transparent way”. This was because that Standard is a little different – it focuses less on the journey of our complainants and more on their general perceptions of our transparency as an organisation. We also wanted to avoid discouraging potential participants by making the survey too long. However, next year, we will consider expanding the questionnaire to also capture perceptions of our transparency among our service users.

We will ensure that our service is accessible to all

  • when asked how they found out about PSOW, the largest group of respondents (30%) stated that they found out about us from an internet search. 19% ‘always knew’ about the Ombudsman. 17% found out about us through a public body (a council or a health body)
  • in a very positive finding, 91% of respondents found it easy to contact PSOW. This score was higher for those who were satisfied with the outcome (98%)
  • only 37% of respondents agreed that they were asked whether they had additional requirements for communicating with us (i.e. required reasonable adjustments). This score was only slightly higher (42%) for those satisfied with the outcome than for those who weren’t
  • 68% of respondents agreed that they were given clear information about the Ombudsman’s process for handling complaints – 91% among those satisfied with the outcome.

Overall, the aggregate score on Service Standard 1 was 65% for all respondents, and 77% for those satisfied with the outcome.
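Although the survey report does not set out the calculation, the aggregate figures appear to be a simple average of the individual question scores under each Standard – for example, for Service Standard 1 above, (91% + 37% + 68%) / 3 ≈ 65% for all respondents, and (98% + 42% + 91%) / 3 = 77% for those satisfied with the outcome.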

We will communicate effectively with you

  • 58% of all respondents agreed that they were asked to indicate how they would like us to communicate with them (although 22% didn’t remember whether they were asked or not). This score was higher for those who were satisfied with the outcome (74%)
  • 70% agreed that we communicated with them using the format requested. This score was slightly higher for those who were satisfied with the outcome (79%)
  • 68% of respondents agreed that they were given clear information about the Ombudsman’s role and what we can and cannot do – but 91% among those who were also satisfied with the outcome
  • 73% agreed that the staff they dealt with treated them with courtesy and respect; 62% said that our staff were easy to get hold of and 53% said that our staff kept them updated throughout the process, as agreed. These scores were on average around 20 percentage points higher among those also satisfied with the outcome.

Overall, the aggregate score on Service Standard 2 was 64% for all respondents, and 82% for those satisfied with the outcome.

We will ensure that you receive a professional service from us

  • 50% agreed that we had a good understanding of their complaint – 88% of those satisfied with the outcome
  • 62% agreed that we handled their complaint according to the process as explained – 94% of those satisfied with the outcome
  • 58% agreed that their complaint was considered in a timely manner – 86% of those satisfied with the outcome.

Overall, the aggregate score on Service Standard 3 was 57% for all respondents, and 89% for those satisfied with the outcome.

We will be fair in our dealings with you

  • 39% agreed that their complaint was considered thoroughly, taking account of all relevant evidence – 87% among those satisfied with the outcome
  • 49% agreed that their complaint was considered impartially – 86% among those satisfied with the outcome
  • 60% agreed that PSOW explained the decision clearly or fairly clearly – 85% among those satisfied with the outcome.

Suggestions for improvement

When asked about ways for us to improve our service, the improvements most commonly mentioned by respondents were:

  • more personal contact / phone calls (9%)
  • more communication, updates / replies to messages (8%)
  • more consideration / understanding of a case (6%)
  • favourable outcome / taking on case (5%)
  • better clarification of role / process / decisions (5%)
  • more impartiality / less bias (5%)
  • more timely service (5%)

What happens next?

Although it was good to see that we are performing well in many respects (for example, in making our service accessible), customer satisfaction research is most useful as a foundation for learning and improvement.

Clearly, some of the less favourable opinions about our service were strongly linked to the outcome of the case. This is perhaps inevitable, given that many of our service users feel very strongly about their cases. Also, many have had adverse experiences in accessing complaints processes before reaching us and have little trust in the fairness of the process.

There is not much we can do to improve our performance in this respect, as the outcome of each case will always depend ultimately on the evidence presented to us. However, the results of the survey pointed to several areas on which we can focus, moving forward.

Overall, although a large majority of people felt that our staff were courteous and respectful, it appeared that our service users would appreciate more consistent contact and updates on their cases.

We were surprised by the low score in response to the question about availability of reasonable adjustments. Questions about reasonable adjustments are included in hard copy and online complaint forms and in letters acknowledging the receipt of new complaints. It may be that respondents not needing reasonable adjustments were less likely to remember being asked about this. However, we realise that we need to do more to raise awareness of this option. During 2020/21, we will remind our staff to ask our complainants about this more proactively throughout the process of handling their complaint. We will also seek to collect better data about the number of reasonable adjustment requests made to us and the proportion that we are able to meet.

Clearly, we also need to do more to improve the experience of our service users who complain about alleged breaches of the Code of Conduct. We realise that Code cases tend to be complex. Also, very few result in an investigation and/or a further referral of the case – therefore, people are less likely to be satisfied with the outcome, which also affects their perception of our service overall. However, this year we will be reviewing our publications about our powers and process in relation to these cases, to ensure that we do all we can to improve the experience of our service users in this respect.

Finally, although we took great care designing the survey, several questions did not yield sufficiently meaningful responses to be included in the overall analysis of the results. For example, only 30% of respondents dissatisfied with the outcome stated that they felt able to ask for a review. However, given the brevity of the survey, we lack more detailed information as to why that was the case. In another example, only 22% of the respondents were aware that we have a policy in respect of unacceptable behaviour. This may be because individuals would only be informed of this policy at the point when it is needed. Again, however, we don’t have enough information to state that with certainty. We will make sure that we incorporate these lessons into the design of the survey in 2020/21.