I am currently rebuilding Bear to properly implement leave-one-out cross-validation. ETA is late 2025. Apologies.
Bear is a new free and open source machine learning engine that I have developed as a personal hobby since 2008.
Bear finds statistically significant dependencies between features and labels to build forests of piecewise constant models.
Leave-one-out cross-validation is baked in, automatic, and free for arbitrarily large sample sizes.
Bear is thus a Monte Carlo engine similar to random forests, but with a different core philosophy and algorithm.
Its fundamentally frequentist approach automatically avoids overfitting. It learns efficiently without needing neural networks.
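To make the two ingredients above concrete, here is a minimal, self-contained sketch in ANSI C (the language of the reference implementation). It is not Bear's algorithm: the toy dataset, the single fixed split point, and the squared-error metric are hypothetical choices made purely for illustration. It simply fits a two-piece piecewise-constant model while holding out each sample in turn, which is what leave-one-out cross-validation means.

/*
 * Illustrative sketch only: leave-one-out cross-validation of a
 * two-piece (single-threshold) piecewise-constant model.
 * NOT Bear's algorithm; the threshold, dataset, and error metric
 * are hypothetical choices made for this example.
 */
#include <stdio.h>

#define N 8
#define THRESHOLD 0.5   /* hypothetical fixed split point */

/* toy dataset: one feature x and one label y per sample */
static const double x[N] = { 0.1, 0.2, 0.3, 0.4, 0.6, 0.7, 0.8, 0.9 };
static const double y[N] = { 1.0, 1.2, 0.9, 1.1, 3.0, 2.8, 3.2, 3.1 };

/*
 * Fit the piecewise-constant model on all samples except "skip",
 * then predict the label at feature value xq: the prediction is the
 * mean label of the training samples on the same side of THRESHOLD.
 */
static double predict_loo(int skip, double xq)
{
    double sum = 0.0;
    int count = 0;
    int i;

    for (i = 0; i < N; i++) {
        if (i == skip)
            continue;
        if ((x[i] < THRESHOLD) == (xq < THRESHOLD)) {
            sum += y[i];
            count++;
        }
    }
    /* empty piece: fall back to the global mean of the training set */
    if (count == 0) {
        for (i = 0; i < N; i++) {
            if (i == skip)
                continue;
            sum += y[i];
            count++;
        }
    }
    return sum / count;
}

int main(void)
{
    double sse = 0.0;
    int i;

    /* leave-one-out: hold out each sample in turn, fit on the rest */
    for (i = 0; i < N; i++) {
        double err = y[i] - predict_loo(i, x[i]);
        sse += err * err;
    }
    printf("leave-one-out mean squared error: %g\n", sse / N);
    return 0;
}

Bear itself goes well beyond this toy: it builds forests of such piecewise-constant pieces and chooses its splits by statistical significance, as described above.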
Bear won’t “replace” existing ML or AI systems. It’s a machine learning engine that might in future be dropped into such systems.
A quick guide to building and running my free and open source ANSI C implementation of Bear is here.
These Bear pages describe personal hobby research that I have undertaken since 2008.
All opinions are mine alone. All code is from my personal codebase, supplied under the MIT-0 License.
© 2008–2025 John Costella