Supervised Clustering With Kan Extensions
Posted on July 25, 2021
Clustering algorithms allow us to group points in a dataset together based on some notion of similarity between them. Formally, we can consider a clustering algorithm as mapping a metric space \((X, d_X)\) (representing data) to a partitioning of \(X\). In most applications of clustering the points in the metric...
[Read More]
Tags:
Clustering, Machine Learning, Extrapolation, Kan Extension, Category Theory, Functorial
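The excerpt above frames a clustering algorithm as a map from a metric space \((X, d_X)\) to a partition of \(X\). A minimal concrete instance of such a map is single-linkage clustering at a fixed scale \(\delta\): two points land in the same block exactly when they are joined by a chain of hops each shorter than \(\delta\). The sketch below is illustrative only (the function name and the union-find helper are my own, not from the post):

```python
import numpy as np

def single_linkage_partition(points, delta):
    """Map a finite metric space to a partition: i and j share a block
    iff they are connected by a chain of hops each shorter than delta."""
    n = len(points)
    parent = list(range(n))  # union-find forest

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    # union every pair closer than delta
    for i in range(n):
        for j in range(i + 1, n):
            if np.linalg.norm(points[i] - points[j]) < delta:
                parent[find(i)] = find(j)

    # collect the blocks of the resulting partition
    blocks = {}
    for i in range(n):
        blocks.setdefault(find(i), []).append(i)
    return list(blocks.values())

pts = np.array([[0.0], [0.1], [5.0], [5.2]])
print(single_linkage_partition(pts, 1.0))  # → [[0, 1], [2, 3]]
```

Varying \(\delta\) yields the nested family of partitions that functorial treatments of clustering (and the post's Kan-extension view) build on.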
Transformation Invariant Continuous Optimization Algorithms
Posted on April 15, 2021
Continuous Optimization Algorithms Suppose we have a function \(l: \mathbb{R}^n \rightarrow \mathbb{R}\) that we want to minimize. A popular algorithm for accomplishing this is gradient descent, which is an iterative algorithm in which we pick a step size \(\alpha\) and a starting point \(x_0 \in \mathbb{R}^n\) and repeatedly iterate \(x_{t+1}...
[Read More]
Tags:
Gradient Descent, Differential Equations, Euler's Method
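The excerpt above describes the gradient-descent iteration: pick a step size \(\alpha\) and a start \(x_0\), then repeatedly step against the gradient, \(x_{t+1} = x_t - \alpha \nabla l(x_t)\). A minimal sketch of that loop (the function name and the quadratic test objective are my own assumptions, not from the post):

```python
import numpy as np

def gradient_descent(grad_l, x0, alpha=0.1, steps=100):
    """Iterate x_{t+1} = x_t - alpha * grad_l(x_t) for a fixed number of steps."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x - alpha * grad_l(x)
    return x

# minimize l(x) = ||x||^2, whose gradient is 2x; the minimum is the origin
x_min = gradient_descent(lambda x: 2 * x, [3.0, -4.0], alpha=0.1, steps=200)
print(x_min)  # → a point very close to [0, 0]
```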
Gradient Descent Is Euler's Method
Posted on November 28, 2020
Gradient Descent Gradient descent is a technique for iteratively minimizing a convex function \(f: \mathbb{R}^n \rightarrow \mathbb{R}\) by repeatedly taking steps along its gradient. We define the gradient of \(f\) to be the unique function \(\nabla f\) that satisfies: \[\lim_{p \rightarrow 0} \frac{f(x+p) - f(x) - \nabla f(x)^{T}p}{\|p\|} = 0\]...
[Read More]
Tags:
Gradient Descent, Differential Equations, Euler's Method
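The title's claim can be checked in a few lines: a forward-Euler step of size \(h = \alpha\) applied to the gradient-flow ODE \(\dot{x} = -\nabla f(x)\) is exactly one gradient-descent step \(x - \alpha \nabla f(x)\). A sketch (helper names are my own, not from the post):

```python
import numpy as np

def euler_step(vector_field, x, h):
    """One forward-Euler step for dx/dt = vector_field(x)."""
    return x + h * vector_field(x)

def gd_step(grad_f, x, alpha):
    """One gradient-descent step: x - alpha * grad_f(x)."""
    return x - alpha * grad_f(x)

grad_f = lambda x: 2 * x      # gradient of f(x) = ||x||^2
flow = lambda x: -grad_f(x)   # the gradient-flow vector field

x = np.array([1.0, 2.0])
print(np.allclose(euler_step(flow, x, 0.1), gd_step(grad_f, x, 0.1)))  # → True
```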
Stability of Mapper Graph Invariants
Posted on November 2, 2020
Introduction The Mapper algorithm is a useful tool for identifying patterns in a large dataset by generating a graph summary. We can describe the Mapper algorithm as constructing a discrete approximation of the Reeb graph: Suppose we have a manifold \(\mathbf{X}\) equipped with a distance metric \(d_{\mathbf{X}}\) (such as a...
[Read More]
Tags:
Mapper, TDA, Topological Data Analysis, Machine Learning
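The excerpt above describes Mapper as a discrete approximation of the Reeb graph. The usual recipe: cover the range of a lens function with overlapping intervals, cluster each interval's preimage, take the clusters as nodes, and connect two nodes when their clusters share points. A toy 1-D sketch under those assumptions (all names and the naive single-linkage clusterer are my own, not from the post):

```python
import numpy as np
from itertools import combinations

def mapper_graph(points, lens, intervals, delta):
    """Toy Mapper: cover the lens range with overlapping intervals,
    single-linkage-cluster each preimage at scale delta, and connect
    clusters that share points. Illustrative sketch only."""
    nodes = []  # each node is a frozenset of point indices
    for lo, hi in intervals:
        idx = [i for i, p in enumerate(points) if lo <= lens(p) <= hi]
        # naive single-linkage clustering within this preimage
        clusters = []
        for i in idx:
            merged = [c for c in clusters
                      if any(np.linalg.norm(points[i] - points[j]) < delta
                             for j in c)]
            new = {i}.union(*merged) if merged else {i}
            clusters = [c for c in clusters if c not in merged] + [new]
        nodes += [frozenset(c) for c in clusters]
    # edge between clusters from overlapping intervals that share points
    edges = [(a, b) for a, b in combinations(range(len(nodes)), 2)
             if nodes[a] & nodes[b]]
    return nodes, edges

# ten evenly spaced points on a line, lens = the coordinate itself:
# each interval's preimage forms one cluster, and the two clusters
# share the points in the overlap, giving a single edge
pts = np.array([[float(i)] for i in range(10)])
nodes, edges = mapper_graph(pts, lambda p: p[0], [(0, 5), (4, 9)], 1.5)
print(len(nodes), edges)  # → 2 [(0, 1)]
```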
Compositionality and Functoriality in Machine Learning
Posted on October 2, 2020
Introduction At the heart of Machine Learning is data. In all Machine Learning problems, we use data generated by some process in order to make inferences about that process. In the most general case, we know little to nothing about the data-generating process, and the data itself is just a...
[Read More]
Tags:
Compositionality, Functor, Machine Learning, Category Theory