Care at home: reflections on quality

Annie Gunner Logan

The Care Inspectorate’s first themed report on care at home shows that the voluntary sector is still at the top of the tree when it comes to service quality. It’s worth highlighting the figures, I think, just to see how far ahead of our public and private sector counterparts we seem to be:


Percentage of services with all grades at 5 and 6, by sector:

Voluntary sector: 50.1% (194 services)
Private sector: 29% (69 services)
Local authority: 23.2% (26 services)
NHS: 0% (0 services) (N.B. there are only three in total)

At the other end of the quality spectrum, 5% of private sector services didn’t get any grades higher than a 2; the equivalent figure for the voluntary sector is 1% (and, to give credit where it’s due, there are none at all in this category in the public sector).

A similar picture of voluntary sector quality emerges from the figures relating to the proportion of services about which complaints have been upheld, where we lead the field again at 7.2% whilst local authorities and the private sector stand at 12.6% and 22.1% respectively.

But just what is it about the voluntary sector that produces these results, time and again? This is a question over which there has been a great deal of speculation, yet curiously little systematic investigation.

According to the report, there are differences between the sectors beyond their gradings record: the private sector, for example, provides mostly ‘stand-alone’ care at home services, meaning that these services do not combine care at home with housing support.

Meanwhile we learn that local authority care at home services tend to be larger than others: 40% employ more than 100 WTE staff, compared with only 13% across all providers.

This information is interesting, but it leaves me wondering whether it has any significance in relation to the document’s focus on quality. Is there a correlation, for example, between the size of a service and its ability to achieve the highest grades, or is that just a coincidence? Are ‘stand-alone’ services more susceptible to complaints being upheld, as the figures would seem to suggest, or are we looking at a random statistical accident?

The report goes on to highlight the various characteristics that influence the awarding of grades. It’s reasonable to assume that the positive attributes outlined in the narrative are more common among voluntary organisations than in other sectors, although again, the report doesn’t offer any suggestions as to why this might be the case.

What’s even more curious is the absence of analysis in relation to the characteristics of poorly-performing services.

Take this quote from an inspector of a service with low grades: “Several of the staff we spoke with described the problems they faced sometimes having to cut visits short to make up for the lack of travel time between visits. One member of staff commented that they frequently found themselves running over an hour late by the time they reached the end of their morning rota.”

What is needed here, says the report, is clearer guidance for staff travelling between visits, and the introduction of a ‘contract’ between the person using the service and the organisation providing it. Well, maybe: but in the experience of providers, a different kind of contract lies at the heart of many such problems, and it’s the one issued by local authorities saying that they’ll only pay for ‘contact time’.

Here’s another example: “Some services with low grades had a high turnover of staff or recruitment issues which had not been addressed by management.” The reasons why care at home services might experience high staff turnover or recruitment difficulties are not explored, and correspondingly there is no suggestion that anyone other than service managers has a responsibility to put things right.

This is important, because the principal reason why CCPS (and others) supported the merging of functions held by the Care Commission and the Social Work Inspection Agency was precisely because of the potential for a ‘whole-systems’ approach to the scrutiny of care and support, and specifically the connections that could then be made between the quality of services on the one hand, and the quality of all the other relevant processes on the other, including care management, assessment and commissioning.

If such an approach had been taken in relation to this report, then it might have been able to identify – for example – which authorities continue to purchase poor quality services, and why; whether and how commissioning and contracting practice influence a provider’s ability to achieve high grades; and what role there might be for outcomes-focused contracts to improve care at home. As it is, these issues remain unexplored.

Sure, the report says that commissioners should learn from its findings, and that commissioning practice needs to improve, but it doesn’t explain why, or how. Providers at a recent dialogue event held by CCPS with the Care Inspectorate, in the context of its review of inspection methodology, emphasised the importance of ‘joining the dots’ a bit more clearly in this respect.

Our own 2012 report on care at home suggested that one of the key determining factors for quality is the extent to which commissioners are able (and indeed willing) to satisfy themselves that hourly rates are sufficient for providers to attract, retain, develop and reward staff capable of offering a service with all the positive characteristics highlighted by the Care Inspectorate.

Our report didn’t go so far as to suggest that all you need to achieve higher grades is higher rates (if that were the case, many more local authority services would surely top the tables). But it still seems reasonably clear that forcing rates ever further downwards isn’t going to get you there either.

Many of us in the voluntary sector believe that a big part of being a values-based organisation – which the Care Inspectorate recognises as crucial to good quality – involves valuing staff, including through appropriate remuneration. Some of the quotes in the new report drop a very heavy hint in this direction: “the management team seem to run on a very tight shoestring…staff support is laughable”. Once again however, all the recommendations are for providers, not purchasers.

Still, the report provides a clear indication that if commissioners want high quality services, they know where to come. In which case, we all ought to be concerned that the proportion of care at home services provided by the voluntary sector is decreasing, not increasing.

Time for some serious scrutiny of commissioning, I think.