It’s the prestigious Law Firm Services (LFS) Conveyancing Awards later this week and we have been working in the background to help the organisers with the “scores on the doors”.
The evaluation process has developed over the years and includes a number of different angles to sort the wheat from the chaff.
One aspect of the awards is to look at the experience of the conveyancing team at the different firms, and the team at LFS was looking for a new approach to testing this. In previous years, firms had to submit various documentation to be checked and scored to gain a compliance score. This was time-consuming and cumbersome for both the law firms and the judging team. A chance meeting at the Legal North event last year opened up a conversation between our founder Steve and Richard at LFS about how our tools could be used to support the awards.
Roll forward to today, and leading conveyancing firms across the UK have taken the quiz designed to test the knowledge of the practitioners in the firms vying for the firm of the year award. The E3 team worked with leading conveyancing experts to identify a suite of questions to test the brightest brains around. Some questions were (fairly) easy and some were downright tricky, but with around 70 questions, the quiz has proven to be a useful input to the awards programme.
So with the awards about to be handed out, we thought some of the insights from the data would be interesting to share. Perhaps the first thing to look at is the overall distribution.
As you can see from figure 1, there is a broad distribution in scores, so the quiz we designed has worked well to differentiate the firms – a great input for the overall judging process for the awards.
We were interested in what made the difference between the high and low scoring firms, so we looked at how the scores varied by question. We won’t share the data down to individual questions, as we don’t want to provide any coaching for subsequent awards, but there are some interesting differences to observe.
First, when we look at the overall correct answer rate, there are big differences between the firms at the top and bottom of the list, as you can see in figure 2. The average answer rate increases for the better firms – not unexpected. But the distribution of answers is interesting. In the lower scoring firms, none of the staff taking the quiz could answer some of the questions correctly. Some of these were the scenario-based questions designed to be taxing, but others were relatively simple and based on questions we use in our standard courses – such as a basic sanctions question which we would hope firms would have a good grasp of.
There were a number of questions that all firms did well on, which was great to see and demonstrated a good understanding of the basic anti-money laundering rules and approaches firms should be adopting.
What made the top firms stand out was their consistency – on most questions, the top firms saw at least 60% correct answers across their staff, and often above 90%. It was this consistency across both the hard and the easy questions, spanning the wide range of topics conveyancing firms need to master, that made the difference.