Monday, July 2nd, 2012


By DogFace and Walter Broeckx

Untold Arsenal has a team of qualified referees who have reviewed more than 40% of the EPL games from last season. The reviews were based on full match video footage, with the advantage of slow motion and pause.

By reviewing those 155 games we have built a database of more than 7,000 decisions, each judged by our panel of dedicated and qualified referees.

The numbers you will see are based on those decisions and those reviewed games.

Welcome to the start of our investigation. In these first two articles [of increasing granularity] we will introduce the data analysis tools we will be using for this series. Initially we will take a holistic approach, i.e. we will crunch the Ref Review numbers for the league as a whole, with all teams and all referees included.

With each subsequent series of articles we will attempt to drill down into more specific questions, such as referee performance and inconsistency from a regional and team perspective.

For the purposes of analysis the 155 games we have reviewed have been boiled down and split into the perspectives of the two respective teams. This is why there are two rows of ‘concentrated’ match data per game: the referee’s performance against the home team and the referee’s performance against the away team. These rows are essentially reflections of each other and create a zero-sum calculation when all teams and referees are included – but as you will see later, when we get into the specifics of home/away, teams or regions, this method will yield some interesting results.

This is worth bearing in mind, to avoid confusion, when studying the graphs and the stated numbers of matches analysed.
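To make the two-row split concrete, here is a minimal Python sketch of the principle. The field names (wrong_vs_home and so on) and the numbers are our own illustration, not the actual review spreadsheet:

```python
def split_match(review):
    """Turn one match review into two team-perspective rows."""
    home_row = {
        "team": review["home_team"],
        "venue": "home",
        "referee": review["referee"],
        "wrong_calls_against": review["wrong_vs_home"],
        "wrong_calls_for": review["wrong_vs_away"],
    }
    away_row = {
        "team": review["away_team"],
        "venue": "away",
        "referee": review["referee"],
        "wrong_calls_against": review["wrong_vs_away"],
        "wrong_calls_for": review["wrong_vs_home"],
    }
    return [home_row, away_row]

# Example (purely illustrative numbers): one reviewed game gives two mirrored rows.
rows = split_match({
    "home_team": "Arsenal", "away_team": "Chelsea", "referee": "Mike Dean",
    "wrong_vs_home": 5, "wrong_vs_away": 3,
})

# Summed over both perspectives the "against minus for" figure is zero,
# which is why the league-wide view is a zero-sum calculation.
net = sum(r["wrong_calls_against"] - r["wrong_calls_for"] for r in rows)
assert net == 0
```

Every reviewed game contributes one ‘for’ and one ‘against’ perspective, which is why the league-wide totals cancel out while the team-level and home/away splits do not.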

So here we start with the numbers part.

The first set of numbers shows how many games we reviewed for each team and what share of their total PL games last season this represents.

Overview of games reviewed

Let me start by saying sorry to the supporters of teams further down the league table. To do our reviews we had to work with the games that were broadcast by the different sports channels on TV (on different continents, even). Our ref reviewers don’t decide which games are shown, so we just have to live with what the channels give us. And they usually show the top teams like Manchester United, Manchester City, Arsenal, Chelsea and other teams close to the top. The good thing for the smaller teams is that they enter the reviewing system when they play the big teams – this should therefore be enough to test the theory that the big teams get all the calls, a common complaint from the supporters of the teams lower down the league.

I know that in a way this isn’t totally fair for the small teams but we just have to work with the things we have at our disposal.

As this is primarily written for an Arsenal supporters’ blog it is no surprise that we have covered all 38 of Arsenal’s games. The team we covered next most is Chelsea, followed by Manchester City and Manchester United: we have done 32 of their games, which is 84.21% of their total PL games. Next come Tottenham and Liverpool with around 20 games reviewed, just over 50% of their games. Then we have teams like Newcastle, Sunderland, Everton, Norwich, Blackburn, Bolton and Fulham, for which we have full match reviews of between 26% and 40% of their games.

And finally we have the teams that didn’t get much live TV coverage. But even for those teams the lowest number of reviews is 6, and that is more than 15% of their games. I would have loved to offer more, but that is the way things went.

So here is a cry for help: if anyone has access to a video archive of games, please open it up to us. Just imagine if we could pick out any game and start reviewing it. But that by itself wouldn’t be enough.

So here is another shout for help. If you are a qualified ref and you want to help us in the future, please let us know! This project shouldn’t stop right here; in fact it should go further from here. If we can get more referees in to review the games, our numbers will become even better. So if you are a ref yourself and want to be involved in this: give us a shout.

If those two things were to happen, in the future we could set up a kind of shadow PGMOL: an organisation able to judge the refs in the open and not behind closed doors as the PGMOL does.

So why don’t you just join us?

As I said in the introduction article, we can only use what we have, so we will treat those games as the data for each team. Of course, reviewing more than 30 of a team’s games gives a better overall view than reviewing just 6. If you are very critical about that, fine: then why not look for a ref in your circle of friends and ask him to join our team, so we can do more games and, above all, more games of the smaller teams (with all respect).

The next set of numbers shows how many of each referee’s games we have reviewed and what share of their EPL season this represents.

Overview of referees reviewed

Referee / Games reviewed / % of their EPL games
Andre Marriner 10 43.48
Anthony Taylor 5 20.00
Chris Foy 9 32.14
Howard Webb 17 54.84
Jonathan Moss 1 5.26
Kevin Friend 7 29.17
Lee Mason 6 22.22
Lee Probert 9 33.33
Mark Clattenburg 9 36.00
Mark Halsey 9 33.33
Martin Atkinson 10 37.04
Michael Jones 12 42.86
Michael Oliver 8 30.77
Mike Dean 18 52.94
Neil Swarbrick 2 10.53
Peter Walton 7 38.89
Phil Dowd 12 38.71
Stuart Attwell 3 18.75

So in raw numbers the most reviewed ref last season was Mike Dean with 18 games, closely followed by Howard Webb with 17 games. But if you compare this with the total games they did in the EPL last season, you can see that Webb’s 17 represent a higher percentage of his total EPL games than Dean’s 18 do of his.

The same remark we made about the teams also applies to the refs: the more games of a ref we have reviewed, the more accurate our data will be. So the numbers for Jonathan Moss, Neil Swarbrick and Stuart Attwell have to be looked at with lots of question marks. As has been said before, refs too have their ups and downs, so their figures are more vulnerable to distortion when we have only a few of their games in the reviewing system. We will give their numbers, but keep the bag of salt ready in case you see some strange things in them.

But I think it is fair to say that for every ref we have reviewed at least 10 times we have a good insight into how they have performed: the more games reviewed, the more reliable their strong points and the more visible any flaws become.

So it will be most interesting to see the final numbers for Marriner, Webb, Atkinson, Jones, Dean and Dowd, as we have covered more than a third of their total games. There are only 3 refs for whom we have covered less than 20% of their games: Jonathan Moss, Neil Swarbrick and Stuart Attwell. As with the teams, we will still analyse the few games we have, but you can understand that their numbers are not as reliable as the others’.

How competent have the refs been in the games we reviewed?

In this graphic we have distinguished between the different types of decisions. The “low” decisions are the fouls on the field. The “medium” decisions are the decisions about yellow cards. And the “high” decisions are the decisions about red cards, penalties and goals. For each of these types we record the correct and incorrect decisions.

As you can see, across all the decisions the total of correct decisions was 72.49% and of incorrect decisions 27.51%. This means that roughly 1 in every 4 decisions a ref makes is wrong. And if we put weight on the decisions, the proportion of correct calls goes down even further.
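To show how the weighting can pull the overall figure down, here is a small sketch; the decision counts and the weights below are invented for the example and are not the panel’s actual numbers or weighting scheme:

```python
decisions = {
    # type: (correct, incorrect) - invented counts, for illustration only
    "low (fouls)": (4000, 1500),
    "medium (yellow cards)": (600, 420),
    "high (reds, penalties, goals)": (475, 180),
}

# Illustrative weights only: bigger calls count for more.
weights = {"low (fouls)": 1, "medium (yellow cards)": 2,
           "high (reds, penalties, goals)": 3}

total = sum(c + i for c, i in decisions.values())
correct = sum(c for c, _ in decisions.values())
print(f"Unweighted: {100 * correct / total:.2f}% correct")

w_total = sum(weights[k] * (c + i) for k, (c, i) in decisions.items())
w_correct = sum(weights[k] * c for k, (c, _) in decisions.items())
print(f"Weighted:   {100 * w_correct / w_total:.2f}% correct")
```

Because the big decisions are got wrong more often than the routine fouls, giving them a heavier weight drags the overall percentage down, which is exactly the effect described above.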

These are the top referees in the country: professional referees, with the best possible preparation, rather well paid, and still they only just manage to hit the 70% mark of correct decisions. This is low. Too low in my opinion.

And certainly if you keep in mind that in cases of doubt we scored the decision as correct for the ref. Had we been able to choose our own camera angle on some of the decisions, the total score of correct decisions could well have been even lower.

I said in my introduction articles that 70% should be the minimum acceptable score for a ref. The overall figure for all the refs together clearly indicates that while some refs score a higher percentage of correct decisions, others score below 70%. This is something I find unacceptable for what should be the best league in the world, refereed by the best referees in the world – but at least now we have an average set of figures for basic match competency in the EPL that we can reference in later articles, and this is an important first step.

Competency per type of decision

There are two types of decisions that at first sight look fine. If you allow me, I will start with the offside decisions. Mike Riley gave some numbers this season and said that 99% of offside decisions were correct. We only come up to something like 90.199%. And I know that a lot of those decisions were marked correct simply because we couldn’t check them. So there is a big difference between our verified number and the unverifiable number from Mike Riley.

Let us move on to the goal decisions. 91.753% of the goal decisions are correct. But that is only just over 9 out of 10 goals! It means that almost every match day there are 2 or 3 goals that are wrongly given or wrongly disallowed. And we must keep in mind that goals are what win games, so these are the most important decisions to get right. This number should be around 99% correct, but we are still some way from such a score.

The other decisions are the fouls in general. The score there is 71.960% correct, which one could say is in line with the overall competency of the refs. Not great, just enough.

Another important issue is the penalties. I know it is difficult with some players diving around a bit, so the refs getting only 62.241% of the penalty decisions correct is partly down to the bad behaviour of some players. But still, the importance of these decisions justifies doing something about it.

If we look at the red cards we see… a complete disaster, in my opinion. I have seen players kick each other off the ball, I’ve seen players punch each other off the ball, I’ve seen elbows against heads, I’ve seen horror tackles… some even unpunished – not even a foul given, let alone a red card. It really is time to clean up football. The things I have mentioned do not belong on a football field and should be banned. And if the ref is not brave enough, or did not see it, the FA should take action. But last season they let me, and the refereeing world, down by not taking serious action against some of these incidents.

The final table shows that not even 60% of yellow cards are given correctly. One of the big issues here is consistency. It is fine to see a yellow card given when a player runs up to the ref waving his arms and protesting a decision. But when a player from another team does the same and the ref gives no card, you really do wonder what is happening. Consistency is what we ask for, and it seems very hard to get.

But we surely will come back to this later in this series.

Who is not fit to referee?

Let us go one step further and move on to the next graphic. In this graphic we can show you who has been top of the table for making wrong calls.

As this is not filtered on home and away games, each ref’s score will be as far above the zero line as below it. But of course, the higher the lines are in total, the more wrong calls the ref has produced. So Peter Walton is very much at the front of this table. The further to the right, the better the ref, one could say… but please bear in mind that we have more referee/match reviews for some refs than for others, and this is shown on the graph by the number just below the referee’s name. The best referee looks to be Jonathan Moss, but we only have one match review for him (covering 2 teams in our data), so we must take this with a pinch of salt.

Let us take it yet another step further and apply the weighting of the calls.

It is interesting to note that with the weighting applied (big calls worth more than small calls) the order of competency changes.
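A tiny, invented example shows why this reordering can happen; the weights and error counts below are ours, purely for illustration, not figures from our review data:

```python
refs = {
    # ref: wrong calls at (low, medium, high) importance - invented numbers
    "Ref A": (7, 1, 0),   # many small errors, hardly any big ones
    "Ref B": (3, 2, 2),   # fewer errors overall, but more big ones
}
weights = (1, 2, 3)  # illustrative weights for low / medium / high calls

for name, errors in refs.items():
    unweighted = sum(errors)
    weighted = sum(w * e for w, e in zip(weights, errors))
    print(f"{name}: {unweighted} wrong calls, weighted score {weighted}")

# Ref A makes more wrong calls (8 v 7), but Ref B's mistakes are on the big
# decisions, so Ref B comes out worse (13 v 9) once the weighting is applied.
```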

Now it is not Peter Walton on top (though he is still a good second) but Mark Halsey. If you ever want to look at which refs had bad games last season, just read those tables from left to right and you will find the answer.

But we will dig into those numbers in more detail later on.

Next article in this series:  something about bias. Was Mike Riley correct when he said there is no bias in the refs in the PL?
