This post is authored by Dr Karen Nokes (UCL) and Prof Richard Moorhead (Exeter)
The SRA published its Thematic Review into In-House Lawyers last month and you can read it here. The findings have attracted, shall we say, some exacting commentary challenging the somewhat upbeat tone used by the CEO of the SRA; examples include a response from Ben White of Crafty Counsel and one led by Jenifer Swallow on behalf of a group of General Counsel.
Those responses have also emphasised the importance of the work the SRA have done and that, whilst the way the findings have been reported seems to be lacking, we should overall be encouraging the SRA’s work in this field. It is to their credit. In-house lawyers, and lawyers’ ethics – a much broader priority area – have both been neglected fields until recently.
Rather than focus on the results and their interpretation we take a slightly different tack. We look at the methods. This produces a new set of questions about the findings. We are grateful to the SRA who upon request shared their survey together with some additional information about the way the study was conducted.
First of all, let’s talk about the elephant in the room – bias, response bias. The SRA sent out their own survey and conducted their own interviews rather than use an independent intermediary to do so. There are many reasons why they would choose to do this. Expediency could be one; putting research out to tender inevitably adds to the timeline and can be resource intensive to procure and manage. But the result is the regulator is directly asking questions of the regulated: spot the issue? Respondents, even when responding anonymously, are more likely to respond according to what they think their regulator wants to hear or respond in a way that they consider reflects positively on themselves. Response bias often leads to falsely positive satisfaction rates and the painting of a rosier picture in the findings than is actually the case. The SRA don’t acknowledge this; they should have done.
Secondly, the SRA didn’t publish the survey, or the guide used for the interviews. The days of holding back research instruments have well and truly gone. Transparency is a critical part of being able to evaluate a study like this. We need to know if the survey was shonky or not. The SRA has published research instruments before. We urge them to revert to that practice. Only then is it possible to assess the usefulness of the findings. We need to know what respondents were actually responding to.
Specificity regarding the sample is our next concern. The SRA tell us that they had 1,200+ responses from in-house solicitors, with the survey link being distributed via direct email to 33,515 in-house solicitors and via the SRA Update newsletter and social media channels. 1,200+ could mean 1,201 or could conjure up a figure of 1,400 or even larger. The SRA confirm that the figure is 1,201 – 915 fully completed responses and 286 partial responses. Even counting the direct emails alone, that is a response rate of under 4%. We don’t know if the partial responses were analysed and made their way into the findings or not. The SRA inform us that they recognise the survey is only representative of those responding to it. This should also have been publicly stated – the issue of representativeness is important – it affects whether researchers can make any claims that their findings are reflective of the population at large.
The SRA has shared the survey with us. Surveys are one of those research methods that appear easy. We have all been accosted at some point by someone asking us a few questions about our shopping habits or being asked to complete post event surveys. Surveys are everywhere. But good, robust surveys take time to construct and it’s harder than it looks. There are decisions to make about whether to make responses mandatory, whether to include free text boxes and what type of scales to use for responses. Here is not the place for us to comment in detail about the contents of the survey but we would highlight three things.
First, we understand that the majority of questions were mandatory, but we don’t know which ones. How does this affect the findings? We don’t know, because this isn’t reported.
Secondly, it is important to have response scales that are clear and unambiguous but that also allow for variance within the sample. Response options such as “Yes, to a degree” are ambiguous – does this mean a large extent (but not 100%) or a small extent? It could mean one thing to you and something different to me. Binary response options can be useful, but they deny us a finer-grained exploration of how people actually behave. The complex, dynamic, and finely balanced challenges and dilemmas facing in-house counsel can rarely be distilled to a simple yes or no answer.

Finally, the survey provided space for respondents to provide text-based answers. We don’t know how these responses were analysed. The report doesn’t tell us. This is a shame; such responses often add depth to what can seem stark, percentage-based findings.
Our final concern is an overwhelming lack of clarity about where reported findings in the thematic review report come from: survey question response, survey text-based response or interview? It is sometimes hard to tell. Some questions referred to in the report (and represented by pie charts) are not contained in the survey instrument. If some of the responses reported are based on interview data, then this amounts to a sample of 20 and not 1,201 (or 915). This clarity is especially important when reporting that 80% of respondents said ‘this and that’. 80% of 915 completed responses (roughly 732 people) paints a different picture to 80% of 20 interview participants (16 people).
Some final points. Empirical research is vital to knowing how the world works, and it is much needed in this area. Empirical research is also hard – it takes time and resources to plan and execute, to analyse and report. BUT if you do it, it should be done with rigour and transparency; otherwise, we end up with more questions than answers.