In the field of machine learning, building models and measuring their performance are two equally important tasks. Currently, measures of the precision of regression models' predictions are usually based on the notion of mean error, where by error we mean the deviation of a prediction from an observation. However, these mean-based measures of model performance have two drawbacks. Firstly, they ignore the length of the prediction, which is crucial when dealing with chaotic systems, where a small deviation at the beginning grows exponentially with time. Secondly, these measures are not suitable in situations where a prediction is made for a specific point in time (e.g. a date), since they average all errors from the start of the prediction to its end. Therefore, the aim of this paper is to propose a new measure of model prediction precision, a divergence exponent, based on the notion of the Lyapunov exponent, which overcomes the aforementioned drawbacks. The proposed approach enables the measurement and comparison of models' prediction precision for time series of unequal length and with a given target date in the framework of chaotic phenomena. The application of the divergence exponent to the evaluation of model accuracy is demonstrated by two examples, and a set of selected predictions of COVID-19 spread from other studies is then evaluated to show its potential.
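The abstract only names the measure and does not state its formula, so the following is a minimal illustrative sketch, not the paper's definition. It assumes the deviation of a prediction P from an observation O grows roughly exponentially over the n steps of the prediction, so that a Lyapunov-like exponent can be read off from the relative deviation at the target date. The function name, the normalization by the final observation, and the +1 offset inside the logarithm are all assumptions made here for illustration.

```python
import numpy as np

def divergence_exponent(predicted, observed):
    """Hypothetical sketch of a Lyapunov-like divergence exponent.

    Assumes the deviation |P(t) - O(t)| grows roughly like exp(lambda * t)
    over a prediction of length n, and estimates lambda from the relative
    deviation at the target date (the last point of the prediction).
    """
    predicted = np.asarray(predicted, dtype=float)
    observed = np.asarray(observed, dtype=float)
    n = len(predicted)  # length of the prediction horizon, in time steps
    # Relative deviation at the target date; the +1 keeps the logarithm
    # defined (and the exponent zero) when prediction and observation agree.
    final_deviation = abs(predicted[-1] - observed[-1]) / abs(observed[-1])
    return np.log(final_deviation + 1.0) / n

# Under these assumptions, predictions of unequal length become directly
# comparable, since the final deviation is scaled by the horizon length:
lam_short = divergence_exponent([105, 118], [100, 110])
lam_long = divergence_exponent([105, 118, 130, 150], [100, 110, 125, 140])
```

A lower exponent would indicate a more precise prediction: dividing by n rewards a model that stays close to the observations over a longer horizon, which mean-based errors, as the abstract notes, do not capture.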