-
Online and Batch-Incremental Estimation of Covariance Matrices and Means in Python
Learn how to estimate the mean, covariance, and inverse covariance matrices in an online or batch-incremental fashion. This post explains the theory behind forgetting factors and effective memory, provides Python implementations for both online and batch estimators, and investigates their accuracy and efficiency through experiments and visualizations.
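The forgetting-factor idea the post covers can be sketched as an exponentially weighted online update; the class below is a minimal illustration (names `OnlineMeanCov` and `lam` are assumptions for this sketch, not the post's API):

```python
import numpy as np

class OnlineMeanCov:
    """Exponentially weighted online mean/covariance estimator (sketch).

    lam is the forgetting factor in (0, 1]; values near 1 give a long
    effective memory of roughly 1 / (1 - lam) samples.
    """

    def __init__(self, dim, lam=0.99):
        self.lam = lam
        self.mean = np.zeros(dim)
        self.cov = np.eye(dim)  # assumed prior, decays as updates arrive

    def update(self, x):
        x = np.asarray(x, dtype=float)
        delta = x - self.mean  # deviation from the current mean estimate
        self.mean = self.lam * self.mean + (1 - self.lam) * x
        self.cov = self.lam * self.cov + (1 - self.lam) * np.outer(delta, delta)
        return self.mean, self.cov
```

With `lam=0.99`, the estimator effectively averages over the most recent ~100 samples, which is the "effective memory" trade-off the post investigates.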
-
The Relationship between the Mahalanobis Distance and the Chi-Squared Distribution
This post explores why the squared Mahalanobis distance of Gaussian data follows a Chi-square distribution. We cover the theory step by step, show empirical evidence, and explain how this relationship provides a principled way to set anomaly detection thresholds using quantiles. A companion Jupyter Notebook with code, benchmarks, and visualizations is provided to put the theory into practice.
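The threshold-from-quantiles idea can be sketched in a few lines: for d-dimensional standard Gaussian data the squared Mahalanobis distance reduces to a sum of d squared standard normals, i.e. a Chi-square variable with d degrees of freedom (a simplified setting with identity covariance, not the post's full derivation):

```python
import numpy as np
from scipy.stats import chi2

d = 3
rng = np.random.default_rng(42)
X = rng.standard_normal((100_000, d))   # mean 0, identity covariance
d2 = np.sum(X**2, axis=1)               # squared Mahalanobis distance

# Flag roughly the top 1% as anomalies via the chi-square quantile.
threshold = chi2.ppf(0.99, df=d)
frac_flagged = np.mean(d2 > threshold)
```

The empirical fraction of flagged points should sit very close to 1%, which is the principled-threshold property the post demonstrates.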
-
Notes on the Runtime Complexity of Latin Hypercube Sampling
This post investigates how the computation time of the Latin Hypercube Sampling (LHS) algorithm scales with the number of design points. By measuring runtimes, applying log-log transformations, and fitting a weighted linear regression, we estimate the polynomial order of growth and provide practical insights into the expected performance of LHS for larger datasets.
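The log-log technique rests on a simple identity: if runtime behaves like t ≈ c·nᵖ, then log t = log c + p·log n, so the slope of a linear fit on log-log data recovers the polynomial order p. A minimal sketch with synthetic timings (the timings and the ordinary least-squares fit are assumptions here; the post measures real runtimes and uses a weighted regression):

```python
import numpy as np

# Synthetic runtimes with an assumed quadratic scaling, t = c * n**2.
n = np.array([100, 200, 400, 800, 1600], dtype=float)
t = 3e-7 * n**2

# Linear fit in log-log space: slope estimates the polynomial order p.
p, log_c = np.polyfit(np.log(n), np.log(t), deg=1)
```

On these noise-free timings the fitted slope recovers p = 2 almost exactly; with real measurements, the weighting the post applies matters because timing noise is not uniform across n.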
-
Implementing the Mahalanobis Distance in Python
A hands-on Jupyter Notebook implementation of the Mahalanobis distance in Python. Covers theory, multiple implementations (NumPy, JAX, TensorFlow, SciPy), benchmarking on low- and high-dimensional data, visualizations, and its connection to the Chi-square distribution for anomaly detection.
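A minimal NumPy sketch of the distance itself (one of several implementations the notebook compares; the function name and the solve-based formulation are choices made here, not necessarily the notebook's):

```python
import numpy as np

def mahalanobis(x, mu, cov):
    """Mahalanobis distance of point x from a distribution (mu, cov)."""
    delta = np.asarray(x, dtype=float) - np.asarray(mu, dtype=float)
    # Solve cov @ y = delta rather than forming the explicit inverse,
    # which is cheaper and numerically more stable.
    return float(np.sqrt(delta @ np.linalg.solve(cov, delta)))
```

With an identity covariance this reduces to the ordinary Euclidean distance, a useful sanity check before benchmarking on correlated, high-dimensional data.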
-
Building Intelligent Agents for Connect-4: Tree Search Algorithms
Learn how the Alpha-Beta search algorithm optimizes Minimax for Connect-4 by pruning unnecessary branches, improving efficiency, and enabling stronger gameplay strategies.
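The pruning idea can be sketched on an abstract game tree where leaves are static evaluations (the nested-list tree representation is an assumption for this sketch; the post applies the same recursion to Connect-4 positions):

```python
def alphabeta(node, depth, alpha, beta, maximizing):
    """Minimax with alpha-beta pruning on a nested-list game tree."""
    if depth == 0 or not isinstance(node, list):
        return node  # leaf: return its static evaluation
    if maximizing:
        value = float("-inf")
        for child in node:
            value = max(value, alphabeta(child, depth - 1, alpha, beta, False))
            alpha = max(alpha, value)
            if alpha >= beta:   # beta cut-off: the minimizer avoids this branch
                break
        return value
    value = float("inf")
    for child in node:
        value = min(value, alphabeta(child, depth - 1, alpha, beta, True))
        beta = min(beta, value)
        if alpha >= beta:       # alpha cut-off: the maximizer avoids this branch
            break
    return value
```

On the tree `[[3, 5], [6, 9], [1, 2]]` the algorithm returns the plain Minimax value 6 while skipping leaves that cannot affect the result, which is exactly the efficiency gain the post quantifies for Connect-4.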