Analysis BBL - A Statistical Analysis

Rowsus · Statistician · #1
With the help of @Damion23, I thought I'd try to have a look at a few statistical bits and pieces leading into the season's start.

Due to time constraints, it won't be as in depth as I'd possibly like, but it might help in one or two starting selections.

One of the references used in this analysis is Expected Scoring Pattern (ESP). How you arrive at this figure can be subjective. With some players being fairly consistent it is easy, but when they are up and down it is very much a matter of opinion. When people set ESPs there are usually two different traps they fall into:
Trap 1: Recency Bias - the most recent performance is the new norm! This is a very common trap people fall for in SC. That can be on a weekly basis, i.e. chasing last week's scores, or on a season basis, i.e. he scored at 120 (AFL SC) last season, so we can expect that is his new level.
Trap 2: Hero Bias - pretty much the opposite of the first one. "He's always been a Prem". Yeah, he scored at 112 two and a half seasons ago, but hasn't looked like it since, yet you can't let go of the memory of him doing it back then!

As with AFL SC, you are better off slightly downgrading your ESPs rather than setting them at upper elite levels. How often have you thought "This guy will be a 21-22/120+ player this season", only for him to go 18/115???
Realistic expectations are the key to good planning when it comes to SC!
 

Rowsus · Statistician · #2
Reversion

Let's look at how a player fares in his following season, after he has scored at 10 or more higher than his previous season.
We'll also look at how they fare when they score at 10 or more higher than their ESP.
All figures taken from the last 5 seasons of BBL SC only.

[Table image: SCS 2024 BBL Rev1a.jpg]



Let's look at the first part of the table.
The values across the top are how far they outscored their previous season of BBL SC.
By 10 to 15, 15 to 20, or more than 20. The last one, 10+, is just a total of the first 3 columns.
The values down the left side are how they fared the season after recording that increase.
The blue section up the top shows they recorded another increase. The green section shows they scored within +/- 5 of that season, and the bottom section shows how much they dropped off, after recording that 10+ increase.
As we can see, after scoring an increase of 10+ on their average, from one season to the next: 12% of them recorded another increase of at least 5 points. 19% recorded a very similar season. 69% recorded a drop of at least 5 points the following season. More alarmingly, 34% recorded a drop of at least 25 points, and 50% recorded a drop of at least 15 points!!!!

The 2nd part of the table looks at how players went in the following season, after scoring at 10 or more higher than their ESP. The figures are frightening! Only 3 out of 82 managed to record another increase. 9 maintained the status quo, while 70 out of 82 recorded a loss of 5 or higher. 63% recorded a loss of 15 or higher!!!!!
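For anyone who wants to run this sort of bucket analysis over their own numbers, here's a minimal pandas sketch. The DataFrame, players and averages are all made up for illustration; it just pairs each season with the player's previous and next seasons, keeps the 10+ jumps, and buckets the following season's change into the same bands as the table above.

```python
import pandas as pd

# Hypothetical input: one row per player per BBL season, with that
# season's SC average. All names and numbers are made up.
df = pd.DataFrame({
    "player": ["A", "A", "A", "B", "B", "B"],
    "season": [11, 12, 13, 11, 12, 13],
    "avg":    [50, 64, 49, 70, 82, 85],
})

df = df.sort_values(["player", "season"])
df["prev_avg"] = df.groupby("player")["avg"].shift(1)   # previous season
df["next_avg"] = df.groupby("player")["avg"].shift(-1)  # following season

# Seasons where the player out-scored their previous season by 10+.
jumps = df[df["avg"] - df["prev_avg"] >= 10].copy()

# Bucket what happened the season after the jump.
jumps["next_change"] = jumps["next_avg"] - jumps["avg"]
buckets = pd.cut(
    jumps["next_change"],
    bins=[-float("inf"), -25, -15, -5, 5, float("inf")],
    labels=["drop 25+", "drop 15-25", "drop 5-15", "within +/-5", "increase 5+"],
)
print(buckets.value_counts(normalize=True))
```

Swapping the prev_avg comparison for an ESP column gives the second half of the table.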
 

Rowsus · Statistician · #3
Which players recorded an increase of 10 or higher from BBL 12 to BBL 13?
Which players recorded a season that was at least 10 higher than ESP in BBL 13?

[Table image: SCS 2024 Rev2.jpg]

Some people will look at Matthew Short's ESP of 57.1 and say it is too low. Fair enough; as stated earlier, ESPs are subjective, and you can set your own. Short's previous averages coming into BBL 13 were: 24, 22, 18, 71, 79. Even if you were inclined to set his ESP at, say, 77 for BBL 13, he still outscored that by 16, and would appear on this list anyway!

Is it suggested you avoid these players? Not necessarily. You just need to be wary of what you expect from them, given that 69% and 85% have historically failed from this position, and 50% and 63% have failed pretty dramatically!!!
 

Rowsus · Statistician · #4
Bounce

The obvious next question is, how do players fare when they drop by 10 or more, from one season to the next? Also, when they score 10 or more under their ESP.

[Table image: SCS 2024 BBL Bnc1.jpg]

Interestingly, the 31% that managed to bounce back, and improve their next season's average by 10+, are pretty evenly spread across all the ranges, from 10-15 to 30+.
43% scored within +/- 10 of their previous season, coinciding with 43% that managed some sort of increase of 5 or more.
Terrifyingly, if you were looking for a fallen Prem, or bargain priced player, 25% recorded another drop of 10 or more, and 16% recorded another drop of 20 or more!!!

The ESP numbers look a bit more useful. 71% recorded some sort of bounce back, 63% bounced back by 10 or more, and 44% by 20 or more. Identifying the right players from this group might prove useful!!!
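The same sketch with the filter flipped to seasons scored 10 or more under ESP pulls out the bounce-back candidates. Again, the esp column and every number here are hypothetical.

```python
import pandas as pd

# Hypothetical data: one row per player per season, with that season's
# SC average and the ESP that had been set for it. All numbers made up.
df = pd.DataFrame({
    "player": ["A", "A", "A", "B", "B", "B"],
    "season": [11, 12, 13, 11, 12, 13],
    "avg":    [50, 64, 49, 70, 82, 85],
    "esp":    [62, 75, 62, 85, 75, 80],
})

df = df.sort_values(["player", "season"])
df["next_avg"] = df.groupby("player")["avg"].shift(-1)

# Seasons scored 10+ under their ESP, and the change the following
# season (positive = some sort of bounce back).
under = df[df["esp"] - df["avg"] >= 10].copy()
under["next_change"] = under["next_avg"] - under["avg"]
print(under[["player", "season", "avg", "esp", "next_change"]])
```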
 
WoodyTea · #7
Hey Rowsus, interesting analysis! I have a few questions.

Firstly, what is your basis for constructing ESPs? Is it just a flat average of previous years, or a weighted average based on recency/games played in a season, and are any judgement calls or adjustments made? I'm sure this would be extremely time consuming, but even adjusting averages to remove the effects of reduced-overs games would remove some of the 'noise' with these averages.

How have you dealt with survivorship bias in the data, either with players performing really well one season and not playing the next, or more likely players performing poorly and then getting dropped the following season?

Finally, have you considered performing any sort of fundamental analysis on expected averages, by stripping down their stats into batting, bowling and fielding, then extrapolating to get expected figures for the next season? If so, do you have any sort of spreadsheet/csv file with the stats from previous seasons that you would be willing to share?

Ultimately, I really enjoyed reading through your breakdown. I think it highlights that the risk in a selection isn't based on the quality of the player but on the price that the player is purchased at!
 

Rowsus · Statistician · #8
(quoting WoodyTea's questions from post #7 above)
Hey WoodyTea,
The whole analysis is just a bucket analysis. There's no background or reasoning along the lines of "this player was injured" or "this player was robbed of scoring in three rain-affected matches". It is just a raw figure analysis. I meant to explain that earlier, as I fully expect people will point out things like "Your analysis says Bradman dropped by more than 10 points, but he played with a broken leg that season." No doubt there are excuses for some of the players included here, whether it be in the Reversion or the Bounce section.
As to the ESP calculation, this is the first time I have done it in BBL, but I have done it in AFL over many years. Due to time constraints, and my hope to add a few more analyses, I used a pretty basic system of looking at the last 5 seasons, throwing out the high and low score, and taking an average of the rest. While this is very "unscientific", I have found it is actually a pretty good approximation for 70-75% of the relevant AFL players, so I just used it here. As I mentioned earlier, Short being ESP'ed at 57.1 might furrow a few brows. If I was doing it more manually, and player by player, I'd have him higher than that, probably around the 70-ish mark. That mark would still have him well and truly in the table he sits atop now anyway.
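In code terms, that quick-and-dirty ESP is just a trimmed mean. A minimal sketch below; the averages in the example are made up, and any manual player-by-player adjustments would sit on top of this.

```python
def rough_esp(last_five_averages):
    """Trimmed-mean ESP: throw out the highest and lowest of the last
    five seasonal averages and average the rest."""
    if len(last_five_averages) < 3:
        # Not enough history to trim, so fall back to a plain average.
        return sum(last_five_averages) / len(last_five_averages)
    trimmed = sorted(last_five_averages)[1:-1]  # drop the high and the low
    return sum(trimmed) / len(trimmed)

# Hypothetical player with seasonal averages of 41, 55, 38, 72 and 60:
print(round(rough_esp([41, 55, 38, 72, 60]), 1))  # -> 52.0
```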

As to making any detailed projection-type analysis, time constraints mean I won't be able to do that. If I had started this 5 or 6 weeks ago, there might have been some chance of something like that. This is more a meat and potatoes analysis, which might help people narrow down players they want to pull apart further themselves, or might help them with some toss-of-the-coin type decisions in their selections.

Thanks for your question, and good luck for the upcoming season.
 

Diabolical · Leadership Group · #9
Great analysis @Rowsus

I was wondering how much role change affects how a player improves or otherwise. You mentioned Short, and his big jump in improvement came the year he moved to opening the batting and bowling regular overs. Prior to that season he was batting in the middle order and rarely bowling.

I decided to go back and look at his scores against expected score. I have my own loose scoring system where I allocate points based on batting position plus the role the player takes in the field. I went through previous seasons and calculated Short's expected score for each match based on his role.

Season: SC Average (Expected Average) % Av/Exp Av
BBL|07: 9 (39) 22%
BBL|08: 26 (39) 66%
BBL|09: 23 (28) 82%
BBL|10: 18 (20) 91%
BBL|11: 68 (65) 105%
BBL|12: 79 (67) 117%
BBL|13: 103 (64) 160%

As you can see, he has improved every season. I would expect the average player to score at 100% of expected score, so we can see how Short has progressed from beginner to elite over the years. However, it is actually his role that has the biggest effect on his score, with his natural progression just expanding on that. Given Short has just about the best possible role for expected score, I can't see much opportunity for improvement for him. His expected score can only drop if he bowls a bit less, and even if he doesn't, it will be hard to back up a 160% season.

By comparison, his team mate Overton averaged 70 last year with an expected average of 58, which put him at 120%. Whilst Overton is maxed out in his expected fielding potential, he does bat around number 6, so he could still see a slight rise in expected score if he goes up a batting position or two in some games. 120% would also be a much easier level to maintain than Short's 160%, so I think he is much better value.

It is for this reason that I am happy to start without Short as I can see better value in other players. However, I will still be targeting him for his doubles given his perfect role for scoring potential.
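To make the idea concrete, here is a rough sketch of a role-based expected score. The role categories and point values are invented purely for illustration (they are not the weightings used above); only the actual-versus-expected ratio is the same shape.

```python
# Rough sketch: give each match an expected score from the player's
# batting position and bowling role, then compare the season's actual
# SC average to the expected average. Point values are made up.
BATTING_POINTS = {"opener": 45, "middle order": 25, "lower order": 15}
BOWLING_POINTS = {"full quota": 35, "part-time": 15, "none": 0}

def expected_score(batting_role, bowling_role):
    return BATTING_POINTS[batting_role] + BOWLING_POINTS[bowling_role]

# Hypothetical season: (actual SC score, batting role, bowling role) per match.
season = [
    (88, "opener", "full quota"),
    (120, "opener", "full quota"),
    (55, "opener", "part-time"),
]
actual_avg = sum(score for score, _, _ in season) / len(season)
expected_avg = sum(expected_score(bat, bowl) for _, bat, bowl in season) / len(season)
print(f"Averaged {actual_avg:.0f} against an expected {expected_avg:.0f} "
      f"({actual_avg / expected_avg:.0%} of expected)")
```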
 

Rowsus · Statistician · #10
(quoting Diabolical's analysis from post #9 above)
I'm slinging stew, meanwhile you're serving filet mignon! 🤣🤣🤣
 

Rowsus · Statistician · #11
I always had a sense that Bowlers were better served in the early games, but then as the pitches flattened out, Batsmen would start to dominate.
Splitting last season up into December matches and January matches doesn't back this up.
Once again, this is a bucket analysis. Just looking at raw figures without factoring in weather. It only looks at matches where both teams got a chance to bat.

December matches - 18 matches - 1 no result.
Average innings
Team batting first: 166.44 for 7.44 wkts @ 8.36 runs/over - 7 wins
Team batting second: 147.72 for 5.22 wkts @ 8.95 runs/over - 10 wins
January matches - 18 matches - finals not included.
Team batting first: 156.78 for 6.61 wkts @ 7.99 runs/over - 7 wins
Team batting second: 147.89 for 5.06 wkts @ 8.30 runs/over - 11 wins

While there is a slight drop in the team batting first's figures across the board, the team batting second's figures are close enough to the same. The bowlers of the team bowling first take nearly an extra wicket per game earlier in the season.
Not a big enough difference (nor sample) to call it a trend!

As most of you realise, when doing an analysis like this over a decent number of games, the team batting second will average fewer runs scored, and fewer wickets lost, than the team batting first. This is because the team batting first's innings always meets its natural conclusion, i.e. all out, or 20 overs bowled. The team batting second will, in just over 50% of their matches, have their innings cut short, as they have successfully chased the target. (Based on the figures above, the team batting second won 60% of matches.)
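For anyone wanting to run the same split themselves, a minimal sketch of how those innings averages are put together. The three match records are made up, and overs are written as decimal overs just to keep the arithmetic simple.

```python
# Made-up match records: (runs, wickets lost, overs faced) for the team
# batting first and the team batting second, plus the month and winner.
# Overs are decimal overs (e.g. 17.5 here = 17.3 in scorecard notation).
matches = [
    {"month": "Dec", "first": (171, 8, 20.0), "second": (148, 4, 17.5), "winner": "second"},
    {"month": "Dec", "first": (160, 6, 20.0), "second": (131, 9, 20.0), "winner": "first"},
    {"month": "Jan", "first": (152, 7, 20.0), "second": (155, 5, 18.0), "winner": "second"},
]

def summarise(month, side):
    innings = [m[side] for m in matches if m["month"] == month]
    avg_runs = sum(i[0] for i in innings) / len(innings)
    avg_wkts = sum(i[1] for i in innings) / len(innings)
    run_rate = sum(i[0] for i in innings) / sum(i[2] for i in innings)
    wins = sum(1 for m in matches if m["month"] == month and m["winner"] == side)
    return f"{avg_runs:.2f} for {avg_wkts:.2f} wkts @ {run_rate:.2f} runs/over - {wins} wins"

for month in ("Dec", "Jan"):
    for side in ("first", "second"):
        print(f"{month}, team batting {side}: {summarise(month, side)}")
```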
 
#13
(quoting Rowsus's post #11 above)
There also seems to be a trend where the team that wins the toss bowls first, so teams are clearly happy to chase and win.

In 25 out of the first 32* games where there was a toss, the team that won it chose to bowl first (16 wins, 8 losses, 1 N/R).

(*The 2 washed-out games where there wasn't a coin toss are removed, otherwise it would be 34 games.)
 