Everything posted by WheeloRatings

  1. WheeloRatings replied to Demonland's post in a topic in Melbourne Demons
    Notwithstanding @old dee's comment that we will win them all, here are each team's chances of winning each remaining match based on my model.
  2. WheeloRatings replied to Demonland's post in a topic in Melbourne Demons
    Here's mine:
  3. @Demon Dynasty @deanox I can confirm that the score involvements do include the player's own goals and behinds. The giveaway is that a player always has at least as many score involvements as own scores. I guess the 'score involvements' metric is meant to stand on its own, as opposed to supplementing the goals/behinds.
  4. WheeloRatings replied to Demonland's post in a topic in Melbourne Demons
    @rpfc @Lucifers Hero Yeah, the FMI has a momentum factor in its model, which is why Melbourne is expected to drop and Gold Coast is the 4th favourite to make the grand final!
  5. WheeloRatings replied to Demonland's post in a topic in Melbourne Demons
    FWIW, my latest simulations show a ~71% chance of making the top 4 with 15 wins and a ~75% chance of making the top 8 with 13 wins, and either scenario would quite possibly be reliant on other teams' results (and percentage). Hopefully we can get to 16+ wins to be safer.
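    For anyone curious how conditional figures like these can be produced, here's a minimal sketch of the general approach: simulate the remaining fixture many times from per-match win probabilities, then condition on a given win total. Everything below (the mini-league, ratings, fixture) is invented for illustration; a real simulation covers 18 teams and breaks ties on percentage.

    ```python
    import itertools
    import random

    # Invented mini-league: ratings and current wins are illustrative only.
    ratings = {"MEL": 25.0, "GEE": 20.0, "BRI": 18.0, "SYD": 12.0, "FRE": 10.0, "CAR": 8.0}
    current_wins = {"MEL": 9, "GEE": 8, "BRI": 8, "SYD": 7, "FRE": 7, "CAR": 6}

    def win_prob(team, opp):
        """Logistic win probability from the rating gap (invented scale)."""
        return 1 / (1 + 10 ** (-(ratings[team] - ratings[opp]) / 40))

    # Toy remaining fixture: everyone plays everyone once more.
    fixture = list(itertools.combinations(ratings, 2))

    TARGET_WINS, TOP_N, N_SIMS = 12, 2, 50_000
    conditioned = made_top = 0
    for _ in range(N_SIMS):
        wins = dict(current_wins)
        for a, b in fixture:
            wins[a if random.random() < win_prob(a, b) else b] += 1
        if wins["MEL"] == TARGET_WINS:
            conditioned += 1
            ladder = sorted(wins, key=wins.get, reverse=True)  # ignores percentage
            made_top += ladder.index("MEL") < TOP_N
    print(f"P(top {TOP_N} | MEL finishes on {TARGET_WINS} wins) = {made_top / max(conditioned, 1):.0%}")
    ```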
  6. Elite in this context is the top 10% of players in each position (broad position categories) based on the average AFL Player Ratings in 2022. Champion Data shoot themselves in the foot (or the media does it for them) by releasing the players categorised as elite without detail or context.

    The following document provides an overview of how the AFL Player Rating system works, but it's effectively a measure of how much a player improves their team's scoring chances through their involvement. https://s.afl.com.au/staticfile/AFL Tenant/AFL/PlayerRatings/PlayerRatings_HOW.pdf

    I haven't listened to it yet, but this week's "ESPN Footy Podcast" explains the elite ratings. For a lot more technical detail, read from chapter 5 in the following thesis: https://researchbank.swinburne.edu.au/file/248ec147-72d7-448c-a19d-49f01d90b12f/1/Karl Jackson Thesis.pdf

    The club leaders, including their positions, can be seen here (noting a minimum of 9 games):

    I have all the Player Ratings on my site too, but the player positions are a bit different: https://www.wheeloratings.com/afl_stats.html
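    As a rough illustration of how an "elite" flag like this can be derived (my reading of the definition, not Champion Data's actual process): take each player's season-average rating, group by broad position, and flag the top 10% within each group, subject to a minimum-games cutoff. The column names and sample data below are invented.

    ```python
    import pandas as pd

    # Invented sample data; real inputs would be season-average AFL Player
    # Ratings with a broad position label per player.
    df = pd.DataFrame({
        "player":     ["A", "B", "C", "D", "E", "F"],
        "position":   ["KeyFwd", "KeyFwd", "Mid", "Mid", "GenDef", "GenDef"],
        "games":      [20, 9, 22, 15, 8, 18],
        "avg_rating": [14.2, 10.1, 17.5, 12.3, 9.8, 11.0],
    })

    # Apply the minimum-games qualification first (9 games, per the post).
    eligible = df[df["games"] >= 9].copy()

    # "Elite" = top 10% of average rating within each broad position group.
    cutoff = eligible.groupby("position")["avg_rating"].transform(lambda s: s.quantile(0.9))
    eligible["elite"] = eligible["avg_rating"] >= cutoff
    print(eligible[["player", "position", "avg_rating", "elite"]])
    ```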
  7. WheeloRatings replied to tiers's post in a topic in Melbourne Demons
  7. The final ladder in both 2012 and 2018 had 12 teams finish with a percentage of at least 100. Also:
    2013 had 12 teams with a percentage of at least 100 after 21 rounds.
    2015 had 12 teams with a percentage of at least 100 after 20 rounds.
    2020 had 12 teams with a percentage of at least 100 after 8 rounds.
    The final ladder in 1914 had 8 of 10 teams with a percentage of at least 100. https://en.wikipedia.org/wiki/1914_VFL_season#Ladder
  8. May had 3 possessions but 6 disposals, including kick-ins.
  9. The AFL's API is available again so here is Richard Little's analysis of this week's game/round.
  10. I'm not sure I'd go as far as saying there's a real problem. Geelong has had more scoring shots than Carlton and conceded fewer scoring shots over their 10 games, and Carlton has only won 50% of quarters. If Carlton's games against Hawthorn and Port Adelaide had gone much longer, they quite possibly lose both and sit 8th. They've done very well, but their ladder position probably flatters them a little at this stage. I provided some additional explanation in my other post:
  11. A lot of the models take into account margin of victory (as well as opposition and venue), not just win/loss. Carlton's percentage is the worst of the top 8 teams. They beat Hawthorn by 1 point, Port Adelaide by 3 points (with fewer scoring shots) and the Western Bulldogs by 12, also with fewer scoring shots. Those wins are all worth 4 points on the ladder, but most of the models won't see a 1-point win as being much better than a 1-point loss. Luck is a factor in close games, and if they had lost to Port and Hawthorn, they'd currently be 8th on the ladder.

    The primary purpose of the ratings in these models is to predict future matches, not simply to rank how teams have performed this year, and putting too much weight on a 1-point win or on recent matches doesn't help with predicting future matches. As such, the models are somewhat conservative, but if Carlton is genuinely a top 4 team this year, that will most likely be reflected in the ratings later in the year.

    Another factor is that Carlton started the season from a lower base, given the ratings are generally dependent on the previous season. This may still be dragging on their rating in some models, although I haven't quantified the effect in mine. Their rating in my model has gone from -4.3 after the loss to Fremantle in round 6 to +7.0 currently, so at that rate they could be up to 4th in my model within a week or two (they're currently 7th).
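    To make the margin point concrete, here's a toy sketch of the kind of margin-based update most of these models use (not any specific Squiggle model; the k step size and home advantage are invented constants). Because the rating moves on predicted-versus-actual margin, a 1-point win over a side you were expected to beat by 10 actually nudges the rating down.

    ```python
    def update_rating(rating, opp_rating, actual_margin, k=0.1, home_advantage=8.0):
        """Margin-aware rating update (invented constants, illustrative only).

        The rating moves in proportion to how much the actual margin beat or
        missed the predicted margin, so a 1-point win and a 1-point loss are
        treated almost identically, unlike the 4 ladder points they're worth.
        """
        predicted_margin = rating - opp_rating + home_advantage
        return rating + k * (actual_margin - predicted_margin)

    # Predicted to win by 10, won by 1: rating drifts from 5.0 down to 4.1.
    print(round(update_rating(rating=5.0, opp_rating=3.0, actual_margin=1.0), 2))
    ```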
  12. Yeah, the "Aggregate" Power Rankings is basically the average of the ratings for all the individual Squiggle models. Given the models use different methods (and rating scales), the ratings are standardised and then averaged. Here is a graphic I created from the Squiggle API which shows the variability in some models' ratings of certain teams.
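    In case the standardisation step is of interest, a minimal sketch (invented ratings, hypothetical model names): each model's ratings are converted to z-scores so the different scales become comparable, then the z-scores are averaged per team.

    ```python
    import statistics

    # Invented ratings from three hypothetical models on different scales.
    model_ratings = {
        "model_a": {"MEL": 31.0, "GEE": 24.0, "CAR": 6.0},
        "model_b": {"MEL": 1.71, "GEE": 1.40, "CAR": 0.95},
        "model_c": {"MEL": 62.0, "GEE": 55.0, "CAR": 38.0},
    }

    def standardise(ratings):
        """Convert one model's ratings to z-scores so scales are comparable."""
        mean = statistics.mean(ratings.values())
        sd = statistics.pstdev(ratings.values())
        return {team: (r - mean) / sd for team, r in ratings.items()}

    z_scores = [standardise(r) for r in model_ratings.values()]
    aggregate = {team: statistics.mean(m[team] for m in z_scores) for team in z_scores[0]}
    print(aggregate)
    ```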
  13. Sparrow's accuracy looks a lot worse when you include the six shots at goal which haven't registered a score. Here are all Melbourne's players to have taken at least five shots this year, noting of course that raw accuracy doesn't take into account where players take their shots from, nor the pressure they are under when they take the kick. This is where Champion Data's expected score can be useful, but they don't make the data available for individual players. Here are the least accurate players in the AFL this year who have taken at least ten shots.

    Notes:
    For the purpose of comparison to xScore, Score excludes rushed behinds.
    xScore sourced from the Herald Sun / Champion Data via https://twitter.com/OliverGigacz/status/1528338101042806785?s=20&t=LvFmdS4tCVKdBsutoZc2pg
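    To spell out the two metrics being compared (with invented shot data, since the real per-player xScore inputs aren't public):

    ```python
    # Invented shot outcomes for one player: points per shot at goal
    # (6 = goal, 1 = behind, 0 = no score, e.g. out on the full or short).
    shots = [6, 1, 0, 6, 0, 1, 0, 6]

    goals = sum(s == 6 for s in shots)
    score = sum(shots)

    # Raw accuracy here counts goals over all shots, scoreless ones included.
    accuracy = goals / len(shots)

    # xScore (Champion Data) estimates the points an average player would
    # kick from the same shot locations and pressure; value invented here.
    x_score = 24.5
    print(f"accuracy {accuracy:.0%}, score {score}, score - xScore {score - x_score:+.1f}")
    ```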
  14. I agree with your points, although that West Coast v Carlton game was only played at the MCG due to the agreement that a match had to be played at the MCG each week of finals. West Coast was the designated home team in that match.
  15. Yeah, Melbourne certainly had their fate in their own hands in 2004 and should never have finished 5th. The paper goes into the home ground advantage, as you would've seen, but I guess the main thing it illustrates is the drop-off from 4th to 5th. This is evident when looking at Preliminary Final appearances, with 3rd and 4th making the PF 18 times, compared to three and four times respectively for 5th and 6th. This is of course what you want - the best four teams playing in the Prelims.

    As you have noted though, there has been a clear difference in finals results between 3rd and 4th. I have had a look at how far teams finishing in each ladder position have made it in the finals since 2000, and (in addition to not winning a Premiership) 4th has only made the Grand Final three times, compared to 15, 12 and 11 times for 1st, 2nd and 3rd respectively. Interestingly, 5th has a 12-10 record against 8th in Elimination Finals, but 6th has a 14-8 record against 7th.

    The results for the top 4 based on the outcome of the Qualifying Finals are interesting. Teams in 3rd and 4th have each lost the QF 16 times, and both ladder positions have made it to the Prelim 12 times. Teams in 4th after losing the QF have a 1-11 record in the PF, whereas teams in 3rd after losing the QF have a 5-7 record in the PF and a 4-1 record in Grand Finals. Your hypothesis of 4th having a tougher finals draw does make sense and is certainly backed by the numbers.
  16. The main thing I liked was that the probability of winning the premiership was more in line with ladder position. Currently, the only advantage to finishing 1st over 4th or 5th over 8th is the home advantage and there's a big drop off from 4th to 5th. I think my feelings probably stem from 2004 when Melbourne lost to Essendon in the elimination final! 😂 I am not advocating going back to that system though. Source: https://researchbank.swinburne.edu.au/file/ca205430-48c0-4721-8dd7-b05b36f4df8b/1/PDF (Published version).pdf
  17. Sorry, I did think that might be the case 😊 Median and mode would probably work for individual models, but it's harder with Squiggle given each model uses a different number of simulations. I'm always partial to mean 😉 Yeah I'm not sure I'm totally over Adelaide winning that premiership either, but last year certainly helped! I must admit, whilst the McIntyre final 8 system wasn't perfect, I did prefer certain aspects of it.
  18. No, it went to a top 6 in 1991 and a top 8 in 1994. The AFL moved to the current final eight system in 2000 (from the McIntyre final eight system used from 1994-1999).
  19. Well yes, I have them most likely to finish with 19 wins, followed by 18, then 20. On average, 18.4 wins.
  20. Here are a couple of records for consecutive wins to start a season.
  21. Yes and yes.

    Melbourne's biggest wins v West Coast:
    74 - 2022, Optus Stadium
    70 - 2000, Subiaco
    61 - 1987, MCG
    61 - 2000, Colonial Stadium (Marvel Stadium)

    West Coast's lowest scores in Western Australia:
    5.8.38 v Melbourne - 2022, Optus Stadium
    6.5.41 v Geelong - 2013, Subiaco
    6.6.42 v Richmond - 2014, Subiaco
    5.13.43 v Adelaide - 2013, Subiaco

    West Coast's lowest scores v Melbourne:
    5.8.38 - 2022, Optus Stadium
    5.15.45 - 2008, MCG
    9.7.61 - 1990, MCG
    9.9.63 - 2021, Optus Stadium
  22. Here are Melbourne's expected margin and win percentage for each game based on my model. Melbourne are favourites in all but the last game against Brisbane at this stage. The 3.6 losses figure is based on the win probability for each match - so a 50-50 game is counted as 0.5 wins and 0.5 losses. By the way, Melbourne's probability of winning all 22 games is 3.8%! 😁
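    The arithmetic behind both numbers is straightforward: expected losses are the sum of each game's loss probability, and the chance of winning all 22 is the product of the win probabilities. The probabilities below are invented, so the outputs won't match my model's exact figures.

    ```python
    import math

    # Invented per-game win probabilities for a 22-game season.
    probs = [0.90] * 14 + [0.80] * 4 + [0.70] * 3 + [0.55]

    expected_losses = sum(1 - p for p in probs)  # a 50-50 game adds 0.5 losses
    p_win_all = math.prod(probs)                 # every single game must be won

    print(f"expected losses: {expected_losses:.1f}")
    print(f"P(win all 22): {p_win_all:.1%}")
    ```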
  23. Yeah I have them at 12.4 and 11.0. https://www.wheeloratings.com/afl_simulations.html There's definitely a gap opening up, but having said that, this time last year I had Richmond at 12.8 wins and a 68% chance of top 8 and GWS at 10.9 wins and a 38% chance of top 8.
  24. Based on my simulations, 12 wins (& 0 draws) gives you about a 54% chance of making the top 8, and it would likely then come down to percentage. 48 points and a percentage of 106 is roughly 50-50 to make the top 8.
  25. In simple terms, the attack metric is based on whether teams score above or below average, and the defence metric is based on whether teams concede above or below average. Instead of using a team's actual score, their score is calculated as a weighted average of their actual score and the score they would have kicked had they kicked at an expected accuracy. So a team that scores 9.22.76 (like Melbourne did against Richmond) will be credited with a much higher score than 76. I don't have Melbourne's adjusted score at hand, but the adjusted margin was 60 points instead of 22, which better reflects their dominance.

    The attack and defence metrics are updated following each match based on these "adjusted" scores compared to the expected scores. If a team's adjusted score is higher than expected, their attack rating increases (and vice versa). If their opponent's adjusted score is higher than expected, their defensive rating decreases (and vice versa). I have an unfinished page on my site which provides some more detail: https://www.wheeloratings.com/afl_methodology.html

    In addition, teams carry over ~65% of their rating from the previous year, so Melbourne's overall rating dropped from 32.1 at the end of 2021 to 20.5 at the beginning of 2022. FYI, this is Melbourne's rating progression since round 22 last year, which shows Melbourne's rating is almost back to the pre-Grand Final level. Let me know if you need any additional information!
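    Here's a rough sketch of those mechanics in code, in case it helps. The 0.53 expected accuracy, the 50/50 blend weight and the k step size are invented placeholders (my actual parameters differ); the ~65% carry-over is the figure mentioned above.

    ```python
    def adjusted_score(goals, behinds, expected_accuracy=0.53, weight=0.5):
        """Blend the actual score with the score implied by expected accuracy.

        A team kicking 9.22 gets credit for its 31 scoring shots, not just
        the 76 points on the board. The 0.53 accuracy and 0.5 weight are
        invented for illustration.
        """
        actual = 6 * goals + behinds
        shots = goals + behinds
        accuracy_neutral = shots * (expected_accuracy * 6 + (1 - expected_accuracy) * 1)
        return weight * actual + (1 - weight) * accuracy_neutral

    def update(attack, defence, adj_for, adj_against, exp_for, exp_against, k=0.1):
        """Nudge the attack/defence ratings toward what the adjusted scores imply."""
        attack += k * (adj_for - exp_for)        # scored above expectation: attack up
        defence -= k * (adj_against - exp_against)  # conceded below expectation: defence up
        return attack, defence

    def new_season(rating, carry=0.65):
        """Teams carry over ~65% of their rating between seasons (per the post)."""
        return carry * rating

    print(round(adjusted_score(9, 22), 1))  # 9.22.76 is credited well above 76
    print(update(attack=10.0, defence=8.0, adj_for=95.0, adj_against=60.0,
                 exp_for=85.0, exp_against=70.0))
    print(new_season(32.1))  # 32.1 -> ~20.9, close to the 20.5 quoted above
    ```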