And the Vegas line is generally the best indication of who the winner will be.
Come on, we've been here before.
Models that amount to "line plus noise" have consistently been inferior to both the opening line and the updated line. Such a model may correlate with the line, and the line may correlate with the result, but that doesn't mean the model will correlate with the result.
Example: Jar of beans.
1. Have a crowd guess the number of beans in a jar. The average of the guesses will correlate with the number of beans in the jar.
2. Develop a model that estimates the average crowd guess based on the jar and the type of bean. Ignore the actual number of beans in the jar. Error is the difference between the model's estimate and the crowd guess.
3. Model is "improved" by reducing the error between the model's guess and the crowd's guess.
4. Error between model and crowd exists. There is now error from jar to crowd and from crowd to model. The expected error from the model to the jar is necessarily worse than the expected error from the crowd to the jar: the model's error is independent of the crowd's, so the two error sources stack (uncorrelated errors add in quadrature), and adding a second error source can only increase the total.
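The accumulation in step 4 is easy to simulate. The sketch below assumes hypothetical numbers: a jar of 1000 beans, a crowd guess with its own bias and spread, and a model of the crowd guess with an additional, independent error. The model-to-jar RMSE always comes out worse than the crowd-to-jar RMSE:

```python
import math
import random

random.seed(0)

TRUE_BEANS = 1000   # hypothetical number of beans in the jar
N_TRIALS = 10_000

crowd_sq_err = 0.0  # squared error of crowd guess vs. jar
model_sq_err = 0.0  # squared error of model vs. jar

for _ in range(N_TRIALS):
    # Crowd guess: true count plus an assumed systematic bias and spread.
    crowd = TRUE_BEANS + random.gauss(-20, 80)
    # Model of the crowd guess: the crowd value plus the model's own
    # independent error (assumed bias and spread).
    model = crowd + random.gauss(5, 40)
    crowd_sq_err += (crowd - TRUE_BEANS) ** 2
    model_sq_err += (model - TRUE_BEANS) ** 2

crowd_rmse = math.sqrt(crowd_sq_err / N_TRIALS)
model_rmse = math.sqrt(model_sq_err / N_TRIALS)
print(f"crowd-to-jar RMSE: {crowd_rmse:.1f}")
print(f"model-to-jar RMSE: {model_rmse:.1f}")
```

The particular bias and spread values are made up; the point is only that the model-to-jar error is the crowd-to-jar error plus an extra independent term, so it is larger regardless of the numbers chosen.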
In our example above, any systematic error present in the crowd guess will also be systematic in the model of the crowd guess. Per http://www.atomicfootball.com/ the mean squared error of the line is about 235 (root mean square ≈ 15.3) and the mean absolute error is about 12. Note that the RMSE being ~25% greater than the MAE does not by itself prove bias: zero-mean normally distributed errors already give RMSE/MAE = sqrt(π/2) ≈ 1.25, so the ratio mostly reflects the spread of error magnitudes. Whatever bias the line does carry, its direction is indeterminate from these numbers. If you could determine the bias, you could become very rich.
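A quick sanity check on that ratio: perfectly unbiased, zero-mean Gaussian errors with the line's observed spread already produce RMSE/MAE ≈ sqrt(π/2), so the ratio alone tells you about the shape of the error distribution, not its direction:

```python
import math
import random

random.seed(1)

N = 100_000
# Zero-bias Gaussian errors scaled to the line's observed RMSE of ~15.3.
errors = [random.gauss(0, 15.3) for _ in range(N)]

rmse = math.sqrt(sum(e * e for e in errors) / N)
mae = sum(abs(e) for e in errors) / N
print(f"RMSE/MAE = {rmse / mae:.3f}")  # close to sqrt(pi/2) ≈ 1.25
```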
Total error decomposes as: MSE = Variance + Bias²
Any bias in the line will be transferred to the model. Additionally, modeling the line adds its own variance and its own bias.
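The decomposition is an exact identity, which a short simulation with an assumed bias of 3 and standard deviation of 10 makes concrete:

```python
import random

random.seed(2)

N = 100_000
BIAS = 3.0   # assumed systematic offset
SD = 10.0    # assumed random spread

errors = [BIAS + random.gauss(0, SD) for _ in range(N)]

mse = sum(e * e for e in errors) / N
mean_err = sum(errors) / N  # sample estimate of the bias
variance = sum((e - mean_err) ** 2 for e in errors) / N

print(f"MSE               = {mse:.2f}")
print(f"bias^2 + variance = {mean_err ** 2 + variance:.2f}")  # identical, up to rounding
```

Because the identity MSE = bias² + variance holds term by term, the two printed values agree to floating-point precision; stacking a model on top of the line can only add to both terms.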
If you make a model of a model of a cat, you don't have a cat; you have a worse model of a cat.