Introduction
After testing the test leagues available for download and checking mainly the randomness between seasons with the same tactic, I decided to create my own and see if I could reduce those problems. I downloaded Will's first Tactic Premier League database file and tweaked it: removed every player in the world except those from the 16 teams in the league, and fixed all teams to the same reputation (except one or two I forgot to set, which are 500 rep away from the rest).
All players are the same as in the TFF Tactic Testing League, which means there are many identical GKs, many identical DCs, and so on. All players have max Consistency and similar hidden attributes, equal reputation, and are either-footed.
The save starts after many friendly matches, with max match preparation and 100% fitness/morale for all teams. Note that if you use the save you must manually select all players from the 16 teams and freeze them. It doesn't take long and you only need to do it once, saving in FMRTE and then in the game again. After that, to test a tactic just plug it in, set max match preparation in FMRTE, save in FMRTE, and save your game under another name (if you want to keep it at the end of the season, as I do).
Teams face each other 6 times, for a total of 90 matches per team per season. I set up a macro that runs until the season ends, simulating the matches. A known bug when using the macro is that after 80 games you must choose the bonus for the next season; I tick High, manually simulate the next match, and run the macro again.
I tested the same tactic twice in this scenario and the difference was 1.6% in points and 3.4% in goal difference (191 vs 194 points and 119 vs 123 GD).
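Assuming those percentages are measured relative to the lower of the two runs (the exact formula is my assumption, not stated above), the arithmetic checks out in a few lines of Python:

```python
def pct_diff(a, b):
    """Relative difference between two runs, as a % of the smaller value."""
    return abs(a - b) / min(a, b) * 100

# Two runs of the same tactic: 191 vs 194 points, 119 vs 123 goal difference
print(round(pct_diff(191, 194), 1))  # 1.6
print(round(pct_diff(119, 123), 1))  # 3.4
```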
Collecting results
After simulating the season I save and collect home/away points, goals for, goals against, CCCs for/against (from the last 50 matches only), possession, headers/90 played, headers won (%), yellow cards/90P, red cards/90P, crosses completed (%), crosses/90P, passes (%), passes/90P, shots on target (%), shots on target/90P, dribbles/game, tackles won/90P, and tackles won (%).
To try to extract meaning from those stats I collected two seasons of data for all teams, plotted them, fitted an equation relating each stat to points, and then computed the correlation of that equation with points. With 16 teams x 2 seasons x 90 matches, the correlations (%) came out as follows:
After the 2-save database / after the 8-save database:
CCC for: 60.24%
CCC ag: -47.53%
CCC ratio: 74.36%
Goals for: 87.94% / 88.75%
Goals ag: -38.87% / -63.07%
Goal difference: 91.90% / 96.65%
Possession: 36.85% / 39.62%
Headers/90P: 42.37% / 27.89%
Headers(%): 13.30% / 18.15%
Yellows/90P: 65.99% / 65.88%
Reds/90P: 13.62% / 35.88%
Crosses(%): 68.15% / 60.57%
Crosses/90P: 86.44% / 74.39%
Passes(%): 18.68% / 8.54%
Passes/90P: 13.74% / 2.52%
Shots on target(%): 45.01% / 57.84%
Shots on target/90P: 79.57% / 80.10%
Dribbles/game: 21.43% / 13.33%
Tackles/90P: 7.54% / 43.08%
Tackles(%): 7.00% / -7.99%
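If you want to reproduce these figures for your own league, a plain Pearson correlation between a stat column and the points column is all that is needed. A minimal sketch (the sample numbers below are invented for illustration, not taken from my data):

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented example: goals-for totals and season points for five teams
goals_for = [110, 95, 80, 70, 60]
points = [190, 170, 150, 120, 100]
print(round(pearson(goals_for, points) * 100, 2))  # prints the correlation in %
```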
What does this mean? Some people care too much about having very high possession or many passes, but you can have, say, the best possession in the league and still only end up with above-average points. The graph looks like this:
View attachment 159893
Fitting a 2nd-degree equation wouldn't make sense for this data, as it would suggest better results for, say, 40% possession than for ~50%. That happens mostly because of outliers (the tactic-testing team) and should be addressed by gathering much more data, both from the AI teams and from the user-controlled team, which usually differs somewhat from the rest of the data. Unfortunately, I'm adding that data very slowly right now because a recently dislocated shoulder limits my typing. Ideally we would have as much data as possible to understand how well each stat correlates with points. Even then, correlation does not mean causation; all of this is conjecture, but the goal is to find better ways to judge whether a tactic is good, not only by points and goal difference, and to isolate randomness even further.
Based on the correlations from this small dataset, it's clearly good to have better CCC ratios, shots on target, crosses completed, goals for, and goal difference. For example, the formula predicting points/game from goal difference in this specific league is roughly ((GD * 0.5957 / 90) + 1.3813), and I usually weight each stat's prediction by its correlation (%) when adding it into the final result based on all stats.
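The goal-difference fit translates directly into code. A sketch of just the single-stat prediction (the scheme for weighting and combining the other stats is only roughly described above, so I don't attempt it here):

```python
def predicted_ppg(goal_diff, games=90):
    """Points per game predicted from season goal difference,
    using the league-specific linear fit: ppg = GD * 0.5957 / 90 + 1.3813."""
    return goal_diff * 0.5957 / games + 1.3813

print(predicted_ppg(0))    # a team with GD 0 gets the intercept, 1.3813 ppg
print(predicted_ppg(121))  # predicted ppg for a +121 goal difference
```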
Results of the simulations
There are two types of test so far: the first with even reputations, the other with the test team as an underdog (3000 less rep than the rest). Results are in the spreadsheet:
https://docs.google.com/spreadsheets/d/1xDip_fB8bqyVPqAe5iQ-aVmIwJcaO0_OUlXfbkV2Atw/edit?usp=sharing
Comments:
Many things can still be improved, of course, as this way of testing and assessing results is just an idea I had and should be refined by the community, especially the stat ratings, which need far more data (seasons) as input.
I uploaded the save game for those who want to test (with FMRTE), and also the macro file created in Pulover's Macro Creator. Note that Windows flags the installer as malware; I had no problems at all, but use it at your own risk. The code is open to read and can easily be adapted to any testing league that uses the original skin + instant result.
Save game:
http://www.mediafire.com/download/5znb5l8ozdxjiuy/TPL16teamsFreezectr.fm
Macro:
http://www.mediafire.com/download/5834lqhc66h3ezb/TPLjairo.pmc
To adapt the macro to other leagues you must change line 378, replacing the pixel colour checked at the bottom left (for the England team it is a red pixel that fills almost the entire menu) with the colour of your team.
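The macro itself is an AutoHotkey-based Pulover script, but the idea behind the check on line 378 is plain colour matching. An illustrative Python sketch (the tolerance and RGB values here are my own assumptions, not taken from the macro):

```python
def pixel_matches(pixel, target, tolerance=10):
    """True when an (R, G, B) pixel is within `tolerance` of the target
    colour on every channel -- the kind of test the macro runs on the
    bottom-left menu pixel to recognise the selected team."""
    return all(abs(p - t) <= tolerance for p, t in zip(pixel, target))

# A red-ish menu pixel matches a red target; a blue pixel does not
print(pixel_matches((200, 12, 10), (205, 10, 8)))  # True
print(pixel_matches((10, 80, 200), (205, 10, 8)))  # False
```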
One test takes around 1h20m with the macro on my laptop; it could be faster or slower for you. Any ideas for assessing tactic results, or tactics to test, are welcome.