I'm trying to incorporate some methodology for adjusting/standardizing NCAAF box score variables in order to project those variables forward in time. You'll commonly come across a team that has run up its box score outputs against big underdogs, which distorts box-score matchup analysis against a team that has recently played tougher opponents.
My initial idea is a basic efficiency +/- system: regress dog/favorite performances for any box score metric against what has historically been achieved at that spread.
EG:
A regression equation shows the average +5 dog has a rushing-yards differential of -50. Team X, as a +5 dog in its previous game, had a rushing differential of -40. Therefore, team X has a prior-week 'ATS efficiency adjusted' metric of +10.
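A minimal sketch of that residual calculation, assuming you have historical (spread, rushing differential) pairs. The data here is made up and stylized so the fitted line matches the example above (roughly -10 yards of differential per point of spread); the function name is just an illustration:

```python
import numpy as np

# Hypothetical historical data: closing spread (positive = underdog)
# and that team's rushing-yards differential in the game.
spreads = np.array([-10, -7, -3, 0, 3, 5, 7, 10, 14], dtype=float)
rush_diffs = np.array([100, 70, 30, 0, -30, -50, -70, -100, -140], dtype=float)

# Fit the expected differential as a linear function of the spread.
slope, intercept = np.polyfit(spreads, rush_diffs, 1)

def ats_adjusted(spread, observed_diff):
    """Residual vs. the spread-implied expectation: positive means the
    team outperformed what a typical team at that spread achieves."""
    expected = slope * spread + intercept
    return observed_diff - expected

# Team X as a +5 dog posted -40 vs. an expected -50:
print(round(ats_adjusted(5.0, -40.0), 1))  # → 10.0
```

In practice you'd fit one regression per metric on your full historical sample, and probably allow for nonlinearity (the relationship between spread and box-score differentials isn't necessarily linear at extreme spreads).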
What are the pros/cons of this approach? What are some better ways of looking at this problem?