Adding endogenous features can improve model accuracy

Endogenous features are additional features derived entirely from the existing data, by reworking it rather than bringing in external sources. A Fourier transform or a wavelet decomposition of the raw signal are good examples.

To observe the effect of endogenous features, I implement an MNIST classifier with XGBoost, following the book Practical Gradient Boosting. The example notebook in the GitHub repo below shows that adding an endogenous feature improves the accuracy of the XGBoost model from 97.78% to 98.15%.
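As a minimal sketch of the idea, the snippet below builds one possible endogenous feature for MNIST: the 2-D Fourier magnitude spectrum of each image, concatenated with the raw pixels. The function name and the choice of the magnitude spectrum are illustrative assumptions, not necessarily the exact feature used in the notebook.

```python
import numpy as np

def add_fourier_features(X):
    """Append 2-D FFT magnitude coefficients to flattened 28x28 images.

    X: array of shape (n_samples, 784) of pixel intensities.
    Returns an array of shape (n_samples, 1568): the original pixels
    followed by the magnitude spectrum of each image. This is one
    example of an endogenous feature, computed from the data alone.
    """
    images = X.reshape(-1, 28, 28)
    # fft2 transforms over the last two axes; keep only magnitudes
    spectra = np.abs(np.fft.fft2(images))
    return np.hstack([X, spectra.reshape(len(X), -1)])

# Random data standing in for MNIST pixels, just to show the shapes
X = np.random.rand(5, 784)
X_aug = add_fourier_features(X)
print(X_aug.shape)  # (5, 1568)
```

The augmented matrix `X_aug` can then be fed to `xgboost.XGBClassifier` exactly like the raw pixel matrix, so the comparison between the two feature sets requires no other change to the training code.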

<Github repo>