Top shots
The top performers, at least in terms of goalmouth action (taking shots and scoring goals), always get the most attention, whatever the sport, and hockey is no different.
So, keeping with this stereotypical view, let’s look at the players who took the most shots in the Olympics and see how their predicted performance compares to their actual performance.
In any assessment like this we need to define the space in which we are operating. To provide a minimally adequate data set we focus here on players who took at least six shots during the Olympic tournament, and we want to compare the number of goals these players scored with the number they were expected to score based on model estimates.
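The filtering and comparison described here can be sketched in a few lines of Python. The player records and numbers below are illustrative stand-ins, not the actual tournament data.

```python
# Sketch of the comparison: keep players with at least six shots,
# then set goals per game against xG per game.
# All records here are made up for illustration.
players = [
    {"name": "Player A", "games": 6, "shots": 8, "xg_total": 2.6, "goals": 0},
    {"name": "Player B", "games": 8, "shots": 12, "xg_total": 4.1, "goals": 6},
    {"name": "Player C", "games": 5, "shots": 4, "xg_total": 0.9, "goals": 1},
]

# Minimum-shots filter (six, as in the text).
qualified = [p for p in players if p["shots"] >= 6]

for p in qualified:
    xg_per_game = p["xg_total"] / p["games"]
    goals_per_game = p["goals"] / p["games"]
    diff = p["goals"] - p["xg_total"]  # positive = over-performed the model
    print(f'{p["name"]}: xG/game {xg_per_game:.2f}, '
          f'goals/game {goals_per_game:.2f}, diff {diff:+.1f}')
```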
First, a summary. Twenty-three players, from eight different teams, managed six or more shots at goal. Players are identified by the (approximate) colour of their national team.
There are some things to note already. Teams that did not reach the knockout stage have no players represented. In addition, Spain and Great Britain, arguably the two weakest of the quarter-finalists, have only one player each. Only the Netherlands have more than three players who took at least six shots (they actually had twice as many as any other team).
Most players took between one and two shots per game and would have been expected to score roughly 0.1 to 0.3 goals per game, a range that covers about 61% of the players here.
The German players Charlotte Stapenhorst and Jette Fleschütz, along with their captain Nike Lorenz, all did slightly better than the other players (Fleschütz only in terms of the number of shots rather than the xG value of those shots).
Ambre Ballenghien seemed to have a very good tournament, converting a fairly average number of shots into the second-highest xG estimate.
How does this pan out if we compare the xG values for these players with the actual number of goals they scored per game? Did the players meet, over-perform, or under-perform model expectations?
In Figure 2, points falling on the dotted grey line indicate players performing in line with expected values, i.e. the xG estimate is the same as the actual number of goals scored per game. The greener the player’s name, the more they over-performed expectations (points above the line); the redder the player’s name, the more they under-performed their xG.
The Dutch players Frédérique Matla, Freeke Moes, Luna Fokke and Joosje Burg as well as the Australians Grace Stewart and Stephanie Kershaw lie near the dotted line. The value of the shots they took, and so the xG predicted from those shots, is very close to the actual number of goals they scored per game.
Two players stand out as having notably outperformed expectations. Marijn Veen took twelve shots at an xG of 0.20 goals per game (1.6 goals in total) but actually scored three goals. Ambre Ballenghien really did have a good tournament, with an xG of 0.51 per game (4.1 goals in total) from twelve shots and an actual tally of six field goals.
At the other end we might pick out Brooke Peris, whose eight shots were of a consistently high enough value to produce an xG per game of 0.43 (adding up to 2.6 goals across her six games), yet she did not find the net in any game. Charlotte Stapenhorst, meanwhile, had the highest shot count of any player in the tournament and a very high xG (0.76 per game, 4.6 goals in total from six games) but scored three.
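As a quick sanity check, the per-game figures quoted above multiply out to the totals given in the text. The game counts for Veen and Ballenghien are inferred here from the quoted totals rather than stated in the text; the signed difference is goals minus total xG, positive meaning over-performance.

```python
# Arithmetic check on the figures quoted in the text:
# (player, games played, xG per game, goals scored)
# Games for Veen and Ballenghien inferred from total xG / xG per game.
rows = [
    ("Veen", 8, 0.20, 3),
    ("Ballenghien", 8, 0.51, 6),
    ("Peris", 6, 0.43, 0),
    ("Stapenhorst", 6, 0.76, 3),
]

for name, games, xg_pg, goals in rows:
    xg_total = xg_pg * games   # e.g. 0.43 * 6 ≈ 2.6 for Peris
    diff = goals - xg_total    # positive = over-performed the model
    print(f"{name}: xG total {xg_total:.1f}, goals {goals}, diff {diff:+.1f}")
```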
The details for each player can be found in the table below.
Brooke Peris may have had a poor scoring tournament, and overall the three Australians underperformed slightly (1.6 goals fewer than expected), but Australia’s other players did contribute goals (not shown here).
Germany’s three players, on the other hand, all largely under-performed. They may have had a very good shot count (Figure 1) and, at least partly as a consequence, a high overall xG (8.3 goals across the three of them), but they did not meet those expectations (five goals from Stapenhorst and Lorenz combined; Fleschütz didn’t score), leaving them 3.3 goals short of expectation. And unlike Australia, Germany’s other players didn’t add any more field goals to this tally.
Examining a tournament like the Olympics is interesting, of course: it adds to our understanding of the game and of the performance of teams and individual players. But this kind of assessment, especially the use of statistical models, has as much value, if not more, when applied and regularly updated during the build-up to a tournament. With the 2024 Olympics over and the 2024/25 Pro League under way, the long, four-year process of developing teams for the 2028 Olympics in Los Angeles now begins. This, I think, is where probabilistic modelling can be an invaluable tool for insights into player and team performance.
And I see I have misspelt Ambre Ballenghien’s name in both figures - for which my apologies.