The curvature of curves and its computation

How much does a curve bend? That looks like an important question to ask. Indeed, it is THE question to ask, because curvature is everything we need to know about a curve (modulo some annoying groups we will talk about in the future). If you are too shy to ask, you can compute it, and that is what this post is about. To compute the curvature you need a bunch of ingredients, and for each one there are several ways of getting it, so let's talk about some of them.


Consider the curve \gamma(t) = (x(t), y(t)) (yeah, we are going to work in two dimensions, where everything is nice). It has coordinates x(t) and y(t) that change according to a parameter t. The derivative of the curve is the rate of change of its coordinates, so let's just focus on one of them.

The derivative is the rate of change: how much x changes in one minute, for example, if our parameter represents time in minutes. There are three standard ways of approximating it numerically:

Forward: \frac{x(t + h) - x(t)}{h}

Backwards: \frac{x(t) - x(t-h)}{h}

Central: \frac{x(t + h) - x(t - h)}{2h}

These are only approximations because, if you remember your calculus lectures, the true derivative occurs in the limit h \rightarrow 0. The approximation is then as good as your h is small. If you ask a numerical analyst, she will say: “use the central one!” (the reason: its error is O(h^2), while the one-sided schemes are only O(h)). Awesome! The problem is that at the fringes of your data (yes, you cannot get infinite data), the central scheme isn't defined, for obvious reasons: it needs a neighbor on each side.

What to do? Well, the general recipe is like this: use the forward difference for your first point, the central one for the interior points and the backward one for the last point. That is precisely what the function “gradient” from Matlab® does, so, if you are using Matlab, use that function instead of “diff” to compute derivatives. This is not perfect, of course, because as you can see in the next figure, where I tried to compute the derivative of x(t) = \cos(t), the first and last points will be much less precise than the rest (by as much as h and h^2 differ in orders of magnitude).
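Outside of Matlab, NumPy's gradient function implements exactly this recipe; here is a small sketch (the variable names are mine), using the same test function x(t) = \cos(t):

```python
import numpy as np

# np.gradient uses the recipe above: forward difference at the first
# point, central differences in the interior, backward at the last point.
t = np.linspace(0, 2 * np.pi, 200)
x = np.cos(t)
h = t[1] - t[0]

dx = np.gradient(x, h)

# interior points are O(h^2) accurate; the two endpoints only O(h),
# so the error is largest at the fringes
err = np.abs(dx - (-np.sin(t)))
```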


It seems that we are condemned to lose one data point every time we compute a derivative. If you are not willing to pay that price and feel a bit adventurous, you can try the following scheme.

Fit a polynomial to each of your coordinate functions. For example, if you are using Matlab, you can use the function “polyfit”, or you can just do it manually. Then you can differentiate that polynomial “analytically” (because derivatives of polynomials are easy) and evaluate the result at the points you want. Here is a small snippet that does exactly that:

degree = 10;
% polyfit works on vectors, so fit each coordinate of gamma separately
r_poly = polyfit( t, gamma(:, 1), degree );   % fit a degree-10 polynomial
rp_poly = polyder( r_poly );                  % differentiate it analytically
rp_poly_vals = polyval( rp_poly, t );         % evaluate the derivative at t
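If you are not in Matlab, the same recipe translates almost verbatim to NumPy; a sketch of mine, using a unit circle as the test curve:

```python
import numpy as np

# NumPy version of the polyfit/polyder/polyval recipe, fitting each
# coordinate of the curve separately (unit circle as test data).
t = np.linspace(0, 2 * np.pi, 200)
gamma = np.column_stack([np.cos(t), np.sin(t)])

degree = 10
dgamma = np.empty_like(gamma)
for i in range(gamma.shape[1]):
    coeffs = np.polyfit(t, gamma[:, i], degree)   # fit one coordinate
    dcoeffs = np.polyder(coeffs)                  # differentiate analytically
    dgamma[:, i] = np.polyval(dcoeffs, t)         # evaluate at the samples
```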

The result depends a lot on the degree of your polynomial, and it is dangerous when your function has sharp changes, so use it wisely! (I mean, you are assuming a lot of things about that curve.) Let's move on to the next thing we need; remember, the aim is to compute the curvature!


You can travel along the curve at different speeds. In general, for any continuous, strictly monotonic function \psi, \gamma(\psi(t)) is the same curve (as a set of points), just traversed at a different speed! \psi(t) is called a reparametrization of the curve. A valid reparametrization would be, for example, t \rightarrow t^2. As an illustration, the next figure shows an astroid traversed at three different speeds.


The bottom line is: there are infinitely many ways of specifying a curve. There is of course one special reparametrization, called the arc-length parametrization, which everyone likes because it corresponds to traversing the curve with unit speed. In our case that's helpful because then \|\dot{\gamma}(t)\| \equiv 1 for all t, and that eases the computation of curvature.

Here is how you arc-length reparametrize a curve. First you need to compute the arc length. The arc length at t is the length of the curve (like, literally) up to that time. Funny thing, it is obtained by measuring the length of the tangent vector (the derivative) at each point and adding those lengths together. In math:

s(t) = \int_0^t \|\dot{\gamma}(\tau)\| d\tau

Assume t \in [0, 1]; then the total length of the curve is L = s(1). Set up a uniform partition P of [0, L]. For each \hat{s}_i \in P, find a \tau_i such that \hat{s}_i - s(\tau_i) = 0. The ordered pairs (\hat{s}_i, \tau_i) are a discretization of the sought-after parametrization \hat{\phi}. Therefore, \gamma \circ \hat{\phi} : [0,L] \rightarrow \mathbb{R}^2 is arc-length parametrized (check the domains and ranges of each function to see that they correspond).
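The procedure above can be sketched in NumPy (names and test curve are mine): integrate the speed to get s(t), then invert s by interpolation to sample the curve at equally spaced arc lengths. As test data I use a unit circle traversed at non-uniform speed via the reparametrization t \rightarrow t^2:

```python
import numpy as np

# Arc-length reparametrization sketch. Test curve: a unit circle
# traversed with non-uniform speed, gamma(t) = (cos t^2, sin t^2).
t = np.linspace(0, np.sqrt(2 * np.pi), 2000)
gamma = np.column_stack([np.cos(t**2), np.sin(t**2)])

speed = np.linalg.norm(np.gradient(gamma, t, axis=0), axis=1)

# s(t) = cumulative integral of the speed (trapezoidal rule)
s = np.concatenate([[0.0],
                    np.cumsum(0.5 * (speed[1:] + speed[:-1]) * np.diff(t))])
L = s[-1]                         # total length; should be 2*pi here

# uniform partition of [0, L]; invert s(t) by linear interpolation
s_hat = np.linspace(0, L, 1000)
tau = np.interp(s_hat, s, t)
gamma_al = np.column_stack([np.interp(tau, t, gamma[:, 0]),
                            np.interp(tau, t, gamma[:, 1])])

# the reparametrized curve should have (approximately) unit speed
speed_al = np.linalg.norm(np.gradient(gamma_al, s_hat, axis=0), axis=1)
```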

The arc-length parametrized astroid from the previous example is shown in the following figure (notice how, on the right, the tangent vector has constant length).



Now we can compute the curvature. I will mention three ways of doing it. First, we can just use the known formula for the curvature in two dimensions:

\kappa(t) = \frac{\det(\dot{\gamma}(t), \ddot{\gamma}(t))}{\|\dot{\gamma}(t)\|^3}
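Plugging finite-difference derivatives into this formula gives the direct scheme; a NumPy sketch of mine, with a circle of radius R as a sanity check (its curvature should be 1/R everywhere):

```python
import numpy as np

# Direct evaluation of the 2D curvature formula with finite-difference
# derivatives; test curve: a circle of radius R, whose curvature is 1/R.
R = 2.0
t = np.linspace(0, 2 * np.pi, 500)
gamma = np.column_stack([R * np.cos(t), R * np.sin(t)])

d1 = np.gradient(gamma, t, axis=0)    # first derivative
d2 = np.gradient(d1, t, axis=0)       # second derivative

# det(d1, d2) = x' y'' - y' x''
num = d1[:, 0] * d2[:, 1] - d1[:, 1] * d2[:, 0]
kappa = num / np.linalg.norm(d1, axis=1) ** 3
```

Notice that the points near the ends inherit the lower accuracy of the one-sided differences, twice over.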

I already showed how to compute the derivatives numerically, so we can compute the whole thing directly. This approach is quite unstable, in the sense that even small, fast changes will blow up the curvature, so it is not very appropriate for noisy data. Alternatively, as proposed in [1], we can get the required first and second derivatives from a fitting procedure, as follows.

It is known that, in the vicinity of  t, the curve can be approximated by

\gamma(t + h) \approx \ddot{\gamma}(t)\frac{h^2}{2} + \dot{\gamma}(t)h + \gamma(t)

Now, consider the following linear regression problem:

\gamma(t + h) \approx a(t)\frac{h^2}{2} + b(t)h + c(t),

if we solve it, we will get our derivatives for free: a(t) would be \ddot{\gamma}(t), and b(t), \dot{\gamma}(t) (ah! this is another method for computing the derivatives). Let's solve it by least squares. For each sample k (now we are working with the discrete curve, the one you actually have in your computer), choose a discrete neighborhood -\Delta \le h_i \le \Delta and construct the matrices and vectors:

\mathbf{X} = \begin{pmatrix}h^2_1/2 & h^2_2/2 & \cdots & h^2_n/2 \\ h_1 & h_2 & \cdots & h_n \\ 1 & 1 & \cdots & 1\end{pmatrix}^T, \quad \mathbf{\beta} = \begin{pmatrix}a(t) \\ b(t) \\ c(t)\end{pmatrix} \quad and \quad \mathbf{y} = \begin{pmatrix}\gamma[k + h_1] \\ \gamma[k + h_2] \\ \vdots \\ \gamma[k + h_n]\end{pmatrix}

The regression problem can be written as \mathbf{X\beta} = \mathbf{y}, and its least-squares solution is:

\mathbf{\beta} = (\mathbf{X}^T\mathbf{X})^{-1}\mathbf{X}^T\mathbf{y},

which can be computed easily with Matlab or any other scientific computing software (you actually solve the problem for each coordinate separately). Cool, isn't it?
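A NumPy sketch of this local quadratic fit (the window size \Delta is an arbitrary choice of mine, and the test curve is a unit circle):

```python
import numpy as np

# Local least-squares quadratic fit: at each sample k, fit
# gamma(t+h) ~ a h^2/2 + b h + c over a window of half-width Delta,
# reading off b ~ gamma' and a ~ gamma''.
t = np.linspace(0, 2 * np.pi, 400)
dt = t[1] - t[0]
gamma = np.column_stack([np.cos(t), np.sin(t)])

Delta = 5                       # half-width of the neighborhood, in samples
n = len(t)
d1 = np.zeros_like(gamma)       # will hold b(t) ~ gamma'
d2 = np.zeros_like(gamma)       # will hold a(t) ~ gamma''

for k in range(Delta, n - Delta):
    h = np.arange(-Delta, Delta + 1) * dt
    X = np.column_stack([h**2 / 2, h, np.ones_like(h)])
    y = gamma[k - Delta : k + Delta + 1]          # both coordinates at once
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    d2[k], d1[k] = beta[0], beta[1]
```

Since np.linalg.lstsq accepts a matrix right-hand side, both coordinates are solved in one call.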

The last way of computing the curvature is to first find the arc-length parametrization and then chill out because, given that \|\dot{\gamma}\| \equiv 1, the formula for \kappa reduces to:

\kappa(t) = \|\ddot{\gamma}(t)\|,

so, for an arc-length parametrized curve, the curvature at each point is just the length of the acceleration vector along the path!
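A quick sanity check of the reduced formula (my own toy example): a circle of radius R parametrized by arc length should give \kappa = 1/R at every point.

```python
import numpy as np

# Circle of radius R parametrized by arc length: gamma(s) = R (cos s/R, sin s/R).
R = 3.0
s = np.linspace(0, 2 * np.pi * R, 1000)
gamma = np.column_stack([R * np.cos(s / R), R * np.sin(s / R)])

d1 = np.gradient(gamma, s, axis=0)
d2 = np.gradient(d1, s, axis=0)

speed = np.linalg.norm(d1, axis=1)    # should be ~1: arc-length parametrized
kappa = np.linalg.norm(d2, axis=1)    # should be ~1/R: norm of the acceleration
```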

Let’s see how all these formulas work in real life. In the following figure I show the curvature of a logarithmic spiral. The theoretical value is shown in red and the computed value in black. The fact that you can barely see the black trace means that all three schemes work quite well, except at the fringes (where the small inaccuracies we talked about before are greatly amplified).


In the case of the ellipse, all three schemes do well again. You can see the four curvature extrema that the four-vertex theorem guarantees (at least four of them) for every simple closed curve. Everything’s smooth, everything is easy. Too easy, maybe.


To complicate things a bit, let’s add some noise \epsilon \propto 10^{-4} (imperceptible to the eye). As we see in the next figure, things blow up out of control for all but the polyfit scheme, which does just fine.


Finally, in our initial example, that of the astroid, the schemes have some trouble with the singularities which, after inspecting the analytical expression for the curvature,

\kappa(t) = -\frac{1}{3 a  \sin t \cos t},

we discover to be located at the zeros of \sin t \cos t, i.e., t \in \{0, \pi/2, \pi, 3\pi/2, 2\pi\}. (The arc-length scheme itself requires some kind of fitting to interpolate the required values of time; this can be adjusted to try to get better results.)
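Away from those singularities, the determinant formula with finite differences does reproduce this expression; a quick NumPy check of mine on a window inside (0, \pi/2), with a = 1:

```python
import numpy as np

# Astroid gamma(t) = (a cos^3 t, a sin^3 t), curvature via the determinant
# formula, checked on a window away from the singularities at multiples of pi/2.
a = 1.0
t = np.linspace(0.3, 1.2, 400)
gamma = np.column_stack([a * np.cos(t) ** 3, a * np.sin(t) ** 3])

d1 = np.gradient(gamma, t, axis=0)
d2 = np.gradient(d1, t, axis=0)
kappa = (d1[:, 0] * d2[:, 1] - d1[:, 1] * d2[:, 0]) \
        / np.linalg.norm(d1, axis=1) ** 3
# on this window, kappa should match -1/(3 a sin t cos t)
```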


Noise doesn’t help in this case either. The polyfit scheme starts having trouble with the singularities as we increase the size of the neighborhood over which we perform the fitting.


A final word about the sign of the curvature. In two dimensions the curvature has a sign, which is fixed by the choice of the normal vector. I will talk about the Frenet frame in a future post; for now, I chose the sign arbitrarily. Also in a future post I will explore n-dimensional curvatures.


[1] Younes, L. (2010). Shapes and diffeomorphisms (Vol. 171). Springer Science & Business Media.
