Pass accuracy levels, shot-to-goal ratios, possession percentages. Statistics are seen and heard everywhere in the modern football media, thrust into type like punctuation marks by Twitter fanatics and lazy journalists alike, summing up entire player and team performances in simple numeric tables.
For sure, Team A may enjoy statistically superior possession of the ball in a given game, but that is absolutely irrelevant to the outcome of the match. Games are won by scoring goals, not by the number of times defensive midfielders have passed to centre-backs with no intention of creating a goalscoring opportunity, or even of progressing up the pitch.
Cast your mind back to Celtic’s famous 2-1 home victory over Barcelona in November’s Champions League group stage fixture. Neil Lennon’s Bhoys enjoyed a meagre 16.4% of possession, apparently dominated by Barcelona’s control of the play.
However, Celtic were not dominated at all. Certainly, Barcelona had more attempts on goal, but Celtic defended resolutely with men behind the ball, forcing Barcelona’s vast wealth of attacking talent into spurning opportunities by blazing wide or hitting daisy-cutters at goalkeeper Fraser Forster. Lennon’s side capitalised on two individual lapses in concentration from Barcelona players, and the night will forever go down in Celtic’s proud history.
So, what did that initially shocking possession statistic tell everyone? Absolutely nothing. Lennon sent his team out with a game-plan, and they executed it perfectly. Now, I’m not trying to have a go at Opta, ProZone, or any of the bodies that compile these statistics, as their findings can sometimes make interesting reading. Managers such as Sam Allardyce certainly seem to think so: Allardyce pioneered the use of ProZone statistics to the extent that almost all top clubs now use information from previous games to devise tactical plans and training regimes.
Clearly, statistical analysis has an important role in the game, but when those statistics are used by amateur journalists as some kind of alternative to forming legitimate opinions by watching live football, it becomes ludicrous. Take this article, in which the author suggests that Gary Cahill would be a bad transfer decision for Chelsea last year because he statistically didn’t make many passes (he is paid to defend) and statistically didn’t win many tackles (in the most lacklustre half-season I’ve seen in thirteen years of following Bolton Wanderers). The author could perhaps have made a decent point, but his reliance on numbers alone suggested he hadn’t seen a single game Cahill played in a Wanderers shirt, which, for me, instantly took away any legitimacy his argument might have had.
This increasing trend of taking a snapshot of one particular aspect of a player’s game and putting it in numeric form is easily one of the most infuriating things about people who comment on football in any capacity. In an age where rapid reportage and quick summary are demanded of all forms of football coverage, there is a real danger that journalists will make snap judgements on players and teams based simply on figures, figures that can be interpreted in almost any way to fit almost any argument.
If football writers start building articles from statistics alone, and readers of those articles are happy to take in match previews and reviews that emphasise ‘average passing success rates’ and the ‘average number of free-kicks conceded by Team B away from home in the middle third of the pitch’ (oh, YAWN), there is a danger of the football media becoming dependent on stats to convey a point to a new breed of football fan that demands them. The only sport where heavy journalistic number-crunching belongs is cricket, a game as numbingly dour as listening to pundit Steve Claridge analyse a 0-0 orgy of tedium between Stoke City and Aston Villa.
Let’s not let our game get to that level, eh?