## The Kuramoto–Sivashinsky Equation (Part 2)

I love the Kuramoto–Sivashinsky equation, beautifully depicted here by Thien An, because it’s one of the simplest partial differential equations that displays both chaos and a visible ‘arrow of time’. Last time I made some conjectures about it. Most notably, I conjecture that the ‘stripes’ you see above can appear or merge as time passes, but never disappear or split.

But I was quite confused for a while, because the Kuramoto–Sivashinsky equation comes in two forms. The integral form, which I spent most of my time discussing, is this:

$h_t + h_{xx} + h_{xxxx} + \frac{1}{2} (h_x)^2 = 0$

where $h(t,x)$ is a real function of two real variables, ‘time’ $t$ and ‘space’ $x.$ The derivative form is this:

$u_t + u_{xx} + u_{xxxx} + u u_x = 0$

where $u(t,x)$ is again a real function of two real variables. If $h$ is a solution of the integral form,

$u = h_x$

is a solution of the derivative form. This is easy to see: just take the $x$ derivative of the equation with $h$ in it and see what you get.
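If you'd rather not do the calculus by hand, here's a quick symbolic check of this claim (sympy is my tool of choice here; it's not part of the original argument):

```python
import sympy as sp

t, x = sp.symbols('t x')
h = sp.Function('h')(t, x)

# left-hand side of the integral form
integral_form = (sp.diff(h, t) + sp.diff(h, x, 2) + sp.diff(h, x, 4)
                 + sp.Rational(1, 2)*sp.diff(h, x)**2)

# left-hand side of the derivative form, with u = h_x
u = sp.diff(h, x)
derivative_form = sp.diff(u, t) + sp.diff(u, x, 2) + sp.diff(u, x, 4) + u*sp.diff(u, x)

# taking d/dx of the integral form gives exactly the derivative form
assert sp.simplify(sp.diff(integral_form, x) - derivative_form) == 0
```

The difference vanishes identically, so $u = h_x$ solves the derivative form whenever $h$ solves the integral form.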

By the way, beware of this: Thien An’s wonderful movie above shows the integral form of the equation, but she calls the function $u.$

Note that if $h$ has a local maximum as a function of $x,$ then $u$ will go positive and then negative. This is important because it affects the definition of ‘stripe’. Here’s what stripes look like in the derivative form of the Kuramoto–Sivashinsky equation, as computed by Steve Huntsman:

As you can see, the function $u$ goes negative and then positive as we increase $x$ moving through a stripe. This means that $h$ would have a local minimum.

Another thing puzzling me was that Steve Huntsman found solutions of the derivative form where the stripes in $u$ mostly move in the same direction as time passes:

This suggests that there’s some way to take a solution where the stripes aren’t moving much, and switch to a moving frame of reference to get a solution with moving stripes. And indeed I’ll show that’s true! But I’ll also show it’s not true for the integral form of the Kuramoto–Sivashinsky equation—at least not with periodic boundary conditions, which is what we’re using here.

### Galilean transformations

Galileo was the guy who first emphasized that the laws of physics look the same in a moving frame of reference, but later Maxwell explained the idea very poetically:

Our whole progress up to this point may be described as a gradual development of the doctrine of relativity of all physical phenomena. Position we must evidently acknowledge to be relative, for we cannot describe the position of a body in any terms which do not express relation… There are no landmarks in space; one portion of space is exactly like every other portion, so that we cannot tell where we are. We are, as it were, on an unruffled sea, without stars, compass, sounding, wind or tide, and we cannot tell in which direction we are going. We have no log which we can cast out to take a dead reckoning by; we may compute our rate of motion with respect to the neighboring bodies, but we do not know how these bodies may be moving in space.

Before Einstein came along, a transformation into a moving frame of reference worked like this:

$t \mapsto t$
$x \mapsto x + vt$

where $v$ is a constant, the velocity.

These are called ‘Galilean transformations’. And here’s something cool: the derivative form of the Kuramoto–Sivashinsky equation has Galilean transformations as symmetries! If $u(t,x)$ is a solution, so is the boosted function

$u'(t,x) = u(t,x+vt) - v$

The prime here does not mean derivative: $u'$ is the name of our new boosted solution. To get this boosted solution, we do the obvious coordinate transformation into a moving frame of reference, but then a bit more: we must subtract the constant $v.$

Let’s see how this works! Suppose $u$ is a solution of the derivative form of the Kuramoto–Sivashinsky equation:

$u_t + u_{xx} + u_{xxxx} + u u_x = 0$

Then defining

$u'(t,x) = u(t, x+vt) - v$

we see the $t$ derivative of $u'$ has an extra $vu_x$ term:

$u'_t(t,x) = u_t(t,x+vt) + vu_x(t,x+vt)$

while its $x$ derivatives are simple:

$u'_x(t,x) = u_x(t,x+vt)$

and so on for the higher $x$ derivatives.

This lets us check that the boosted function $u'$ is also a solution:

$\begin{array}{ccl} u'_t + u'_{xx} + u'_{xxxx} + u' u'_x &=& u_t(t,x+vt) + vu_x(t,x+vt) \\ \\ && + u_{xx}(t,x+vt) + u_{xxxx}(t,x+vt) \\ \\ && + (u(t,x+vt) - v) u_x(t,x+vt) \\ \\ &=& 0 \end{array}$

Note how subtracting the constant $v$ when we transform $u$ exactly cancels the extra $vu_x$ term we get when we transform the time derivative of $u.$
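Here's a machine check of this boost symmetry, as a sanity test rather than a proof: sympy verifies that applying the equation's left-hand side to the boosted function just shifts the argument, on an arbitrary smooth test expression (my choice of function; it need not solve the equation):

```python
import sympy as sp

t, x, v = sp.symbols('t x v')

def ks(f):
    """Left-hand side of the derivative-form equation, applied to an expression f(t, x)."""
    return sp.diff(f, t) + sp.diff(f, x, 2) + sp.diff(f, x, 4) + f*sp.diff(f, x)

# any smooth test expression works here: the claim is the identity
#   ks(u')(t, x) = ks(u)(t, x + v t)
u = sp.exp(-t)*sp.sin(3*x) + sp.cos(t + x)
boosted = u.subs(x, x + v*t) - v          # u'(t, x) = u(t, x + v t) - v

assert sp.simplify(ks(boosted) - ks(u).subs(x, x + v*t)) == 0
```

Since this identity holds for any smooth $u$, the boosted function solves the equation exactly when $u$ does.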

I got this idea from here:

• Uriel Frisch, Zhensu She and Olivier Thual, Viscoelastic behaviour of cellular solutions to the Kuramoto–Sivashinsky model, Journal of Fluid Mechanics 168 (1986), 221–240.

Does the integral form of the Kuramoto–Sivashinsky equation also have Galilean transformations as symmetries? Yes, but there’s a certain problem with them. Let’s see how it goes.

If $h(t,x)$ is a solution, then so is this boosted function:

$h'(t,x) = h(t,x+vt) - vx - \frac{1}{2}t v^2$

But now the ‘fudge factors’ in the boosted function are more sneaky! We don’t just subtract a constant, as we did with the derivative form. We subtract a function that depends on both the space and time coordinates.

Let’s check. The first time derivative of $h'$ works like this:

$h'_t(t,x) = h_t(t,x+vt) + vh_x(t,x+vt) - \frac{1}{2} v^2$

The first space derivative works like this:

$h'_x(t,x) = h_x(t,x+vt) - v$

The second space derivative is simpler:

$h'_{xx}(t,x) = h_{xx}(t,x+vt)$

and the higher space derivatives work the same way. Now suppose $h$ is a solution of the integral form of the Kuramoto–Sivashinsky equation:

$h_t + h_{xx} + h_{xxxx} + \frac{1}{2} (h_x)^2 = 0$

We can use our formulas to check that $h'$ is a solution:

$\begin{array}{ccl} h'_t \! +\! h'_{xx}\! +\! h'_{xxxx} \!+ \! \frac{1}{2} (h'_x)^2 \! &=& h_t(t,x+vt) + vh_x(t,x+vt) \! - \!\frac{1}{2} v^2 \\ \\ && + h_{xx}(t,x+vt) \\ \\ && + h_{xxxx}(t,x+vt) \\ \\ && + \frac{1}{2}(h_x(t,x+vt) - v)^2 \\ \\ &=& h_t(t,x+vt) + h_{xx}(t,x+vt) \\ \\ && + h_{xxxx}(t,x+vt) + \frac{1}{2} h_x(t,x+vt)^2 \\ \\ &=& 0 \end{array}$

The cancellations in the second step rely crucially on those sneaky fudge factors in the definition of the boosted solution $h'.$ I chose those fudge factors precisely to make these cancellations happen.
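The same kind of sympy sanity check works here, again on an arbitrary smooth test expression of my choosing: applying the integral-form operator to the boosted function just shifts the argument.

```python
import sympy as sp

t, x, v = sp.symbols('t x v')

def ks_int(f):
    """Left-hand side of the integral-form equation, applied to an expression f(t, x)."""
    return (sp.diff(f, t) + sp.diff(f, x, 2) + sp.diff(f, x, 4)
            + sp.Rational(1, 2)*sp.diff(f, x)**2)

# any smooth test expression; it need not solve the equation
h = sp.exp(-t)*sp.sin(2*x) + sp.cos(t + x)

# h'(t, x) = h(t, x + v t) - v x - (1/2) t v^2, with the sneaky fudge factors
boosted = h.subs(x, x + v*t) - v*x - sp.Rational(1, 2)*t*v**2

# the identity ks_int(h')(t, x) = ks_int(h)(t, x + v t) holds exactly
assert sp.simplify(ks_int(boosted) - ks_int(h).subs(x, x + v*t)) == 0
```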

### Periodic boundary conditions

A lot of really interesting results about the Kuramoto–Sivashinsky equation involve looking at solutions that are periodic in space. For the derivative form this means

$u(t,x+L) = u(t,x)$

It’s easy to see that if $u$ obeys this equation, so does its boosted version:

$u'(t,x) = u(t,x+vt) - v$

After all, a quick calculation shows

$\begin{array}{ccl} u'(t,x+L) &=& u(t,x+L+vt) - v \\ \\ &=& u(t,x+vt) - v \\ \\ &=& u'(t,x) \end{array}$

So for any spatially periodic solution with some stripes that are basically standing still, there’s a spatially periodic boosted version where they’re moving at velocity $v.$ You can think of a spatially periodic solution as a function on the cylinder. In the boosted version, the stripes spiral around this cylinder like the stripes on a barber pole!

But for the integral form this doesn’t work! Suppose $h$ is a solution of the integral form that is periodic in space:

$h(t,x+L) = h(t,x)$

Now its boosted version is defined by

$h'(t,x) = h(t,x+vt) - vx - \frac{1}{2}t v^2$

and this is not periodic in space:

$\begin{array}{ccl} h'(t,x+L) &=& h(t,x+L+vt) - v(x+L) - \frac{1}{2}t v^2 \\ \\ &=& h(t,x+vt) - v(x+L) - \frac{1}{2}t v^2 \\ \\ &\ne& h(t,x+vt) - vx - \frac{1}{2}t v^2 = h'(t,x) \end{array}$
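Here's a sympy check of this failure, using a concrete spatially periodic $h$ (my choice of test function): the boosted version misses periodicity by exactly the constant $vL.$

```python
import sympy as sp

t, x, v, L = sp.symbols('t x v L', positive=True)

# a concrete spatially periodic test expression: h(t, x + L) = h(t, x)
h = sp.exp(-t)*sp.sin(2*sp.pi*x/L)

# boosted version with the space- and time-dependent fudge factors
boosted = h.subs(x, x + v*t) - v*x - sp.Rational(1, 2)*t*v**2

# h'(t, x + L) - h'(t, x) should equal -v L, not zero
mismatch = sp.expand(boosted.subs(x, x + L) - boosted)
assert sp.simplify(mismatch + v*L) == 0
```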

### A conserved quantity

The derivative form of the Kuramoto–Sivashinsky equation has an unexpected symmetry under Galilean transformations, or boosts, even for periodic solutions. It also has an interesting conserved quantity, namely

$\displaystyle{ \int_0^L u(t,x) \, dx }$

To see this, note that

$\begin{array}{ccl} \displaystyle{ \frac{d}{dt} \int_0^L u(t,x) \, dx } &=& \displaystyle{ \int_0^L u_t(t,x) \, dx } \\ \\ &=& \displaystyle{ - \int_0^L \left( u_{xx} + u_{xxxx} + u u_x \right) \, dx } \\ \\ &=& \displaystyle{ \left. - \left( u_x + u_{xxx} + \frac{1}{2} u^2 \right) \right|_0^L } \\ \\ &=& 0 \end{array}$
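The key step here, that the integrand is a total $x$ derivative, can be checked symbolically:

```python
import sympy as sp

t, x = sp.symbols('t x')
u = sp.Function('u')(t, x)

# -(u_xx + u_xxxx + u u_x) is the x-derivative of -(u_x + u_xxx + u^2/2),
# so its integral over a spatial period vanishes by periodicity
flux = -(sp.diff(u, x) + sp.diff(u, x, 3) + sp.Rational(1, 2)*u**2)
integrand = -(sp.diff(u, x, 2) + sp.diff(u, x, 4) + u*sp.diff(u, x))

assert sp.simplify(sp.diff(flux, x) - integrand) == 0
```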

Note that if we boost a solution, replacing $u(t,x)$ by $u(t,x+vt) - v,$ we also subtract a constant from this conserved quantity, namely $v L.$ So, it seems that the conserved quantity

$\displaystyle{ \int_0^L u(t,x) \, dx }$

is a way of measuring ‘how much the stripes are moving, on average’.

I’ll venture a guess that when

$\displaystyle{ \int_0^L u(t,x) \, dx = 0 }$

the stripes are stationary, on average. One reason I’m willing to guess this is that this equation has another meaning, too.

I mentioned that we can get solutions $u$ of the derivative form from solutions $h$ of the integral form by differentiating them, like this:

$u = h_x$

Does every solution of the derivative form arise this way from a solution of the integral form? Given $u$ we can define $h$ by

$\displaystyle{ h(t,x) = \int_0^x u(t,y) \, dy }$

and we can check it obeys the integral form of the Kuramoto–Sivashinsky equation. I’ve been showing you lots of calculations and we’re probably both getting tired of them, so I won’t actually go through this check. But here’s the interesting part. Suppose we restrict attention to spatially periodic solutions! If $u$ is spatially periodic, $h$ will be so iff

$\displaystyle{ \int_0^L u(t,x) \, dx = 0}$

since we have

$\begin{array}{ccl} h(t,x+L) &=& \displaystyle{ \int_0^{x+L} u(t,y) \, dy } \\ \\ &=& \displaystyle{ \int_0^x u(t,y) \, dy + \int_x^{x+L} u(t,y) \, dy } \\ \\ &=& h(t,x) \end{array}$

where in the last step, the first integral is $h(t,x)$ by definition and the second is zero because

$\displaystyle{ \int_0^L u(t,x) \, dx = 0 }$

and $u$ is periodic with period $L.$
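Here's a small numerical illustration of this fact (numpy, with a zero-mean periodic test function of my choosing): integrating a zero-mean periodic $u$ gives a periodic $h.$

```python
import numpy as np

L = 2*np.pi
x = np.linspace(0.0, L, 1001)          # grid covering one full spatial period
u = np.sin(3*x) + 0.5*np.cos(x)        # periodic, with zero mean over [0, L]

# trapezoid-rule antiderivative h(x) = integral of u from 0 to x
h = np.concatenate([[0.0], np.cumsum(0.5*(u[1:] + u[:-1])*np.diff(x))])

# since the mean of u vanishes, h closes up: h(L) = h(0)
assert abs(h[-1] - h[0]) < 1e-10
```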

### Summary

Limiting ourselves to spatially periodic solutions, we see:

• The derivative form of the Kuramoto–Sivashinsky equation has symmetry under boosts.

• The integral form does not.

• Solutions of the integral form correspond precisely to solutions of the derivative form with

$\displaystyle{ \int_0^L u(t,x) \, dx = 0 }$

• Boosting a solution of the derivative form with

$\displaystyle{ \int_0^L u(t,x) \, dx = 0 }$

gives a solution where this quantity is not zero.

Given all this, I think the good theorems about spatially periodic solutions of the Kuramoto–Sivashinsky equation will focus on solutions of the integral form, or equivalently solutions of the derivative form with

$\displaystyle{ \int_0^L u(t,x) \, dx = 0 }$

### Epilogue: symmetries of the heat equation

At first I was shocked that the Kuramoto–Sivashinsky equation had Galilean transformations as symmetries, because it’s a relative of the heat equation, and the heat equation obviously doesn’t have Galilean transformations as symmetries: you can’t get a solution of the heat equation where a wave of heat moves along at constant velocity as it spreads out.

But then I remembered that the heat equation

$u_t = u_{xx}$

is very similar to Schrödinger’s equation:

$u_t = i u_{xx}$

And Schrödinger’s equation does have Galilean transformations as symmetries, since it describes a free particle! Physicists know how they work. You don’t just replace $u(t,x)$ with $u(t,x+vt),$ you also multiply it by a suitable complex phase that depends on $t$ and $x$. That is, you multiply $u(t,x+vt)$ by the exponential of some imaginary-valued function of $t$ and $x.$

So then I realized that the heat equation does have Galilean symmetries! They work just as for Schrödinger’s equation, but now you have to multiply $u(t,x+vt)$ by the exponential of some real-valued function of $t$ and $x.$

This grows exponentially as you move in one direction in space, so people thinking about heat often don’t think about such solutions. Yes, you can get a ‘wave of heat that moves along at constant velocity as it spreads out’, but it’s like a huge tsunami wave rolling in from infinity!
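Concretely, one real exponential factor that works, found by matching terms (the normalization is my choice, consistent with the boost $x \mapsto x+vt$ used above), is $e^{vx/2 + v^2 t/4}.$ Here's a check on a concrete solution:

```python
import sympy as sp

t, x, v = sp.symbols('t x v')

u = sp.exp(-t)*sp.sin(x)               # solves the heat equation u_t = u_xx
assert sp.simplify(sp.diff(u, t) - sp.diff(u, x, 2)) == 0

# boost: shift in space AND multiply by a real exponential of t and x
boosted = sp.exp(v*x/2 + v**2*t/4) * u.subs(x, x + v*t)

# the boosted function solves the heat equation too
assert sp.simplify(sp.diff(boosted, t) - sp.diff(boosted, x, 2)) == 0
```

The $e^{vx/2}$ factor is exactly the exponential growth in one spatial direction just mentioned.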

I was very proud of myself for having discovered this weird Galilean invariance of the heat equation, and I posed it as a puzzle on Twitter. But then Tom Price pointed out a paper about it:

• U. Niederer, Schrödinger invariant generalized heat equations, Helvetica Physica Acta 51 (1978), 220–239.

It turns out the Galilean invariance of the heat equation, and much more, was discovered by Sophus Lie in 1882!

The so-called ‘Schrödinger group’ also includes certain fractional linear transformations of the plane. So this raises the question of whether the Kuramoto–Sivashinsky equation has even more symmetries than those in the Galilei group (translations, spatial reflections and Galilean transformations).

But it seems this too has been studied:

• Mehdi Nadjafikhah and Fatemeh Ahangari, Lie symmetry analysis of the two-dimensional generalized Kuramoto–Sivashinsky equation, Mathematical Sciences 6 (2012).

This discusses the analogue of the Kuramoto–Sivashinsky equation in two dimensions of space and one of time. It uses standard Lie symmetry techniques to compute the equation’s symmetry group, but I don’t see any sign of invariance under a big group like the ‘Schrödinger group’.

### 7 Responses to The Kuramoto–Sivashinsky Equation (Part 2)

1. Steve Huntsman says:

I think solutions of the derivative form with spatial integral zero are the way to go for numerics. It’s trivial to enforce this constraint on the initial condition (which as you point out is conserved thereafter) and it removes a free parameter from the definition of stripe: just consider zero crossings appropriately. All of the KS solvers I’ve seen, including Nick Trefethen’s, use this form, and my experiences trying to either code up a solver for the integral form or spatially integrate the output of a solver for the differential form have not been encouraging.

• Graham Jones says:

I made a crude solver for the integral form in R, which seems OK. See below, hope it doesn’t get mangled.

Some comments on the code: xl, xll, xr, xrr are x rotated, i.e. shifted and wrapped around. E.g. if x is (1 2 3 4 5 6 7 8), then xl is (2 3 4 5 6 7 8 1). d2d4 is the 2nd and 4th derivatives combined, and gg is the squared 1st derivative. I subtracted the mean from gg, or else the values of u became more and more negative.

```r
iterate <- function(x) {
  xl = c(x[-1], x[1])
  xll = c(x[-2:-1], x[1:2])
  lenx = length(x)
  xr = c(x[lenx], x[1:(lenx-1)])
  xrr = c(x[(lenx-1):lenx], x[1:(lenx-2)])
  d2d4 = xll - 3*xl + 4*x - 3*xr + xrr  # 2nd + 4th derivative stencil: (0 1 -2 1 0) + (1 -4 6 -4 1)
  gg = (xr - xl)^2 / 8                  # half the squared centered 1st derivative
  gg = gg - mean(gg)
  x - 0.1 * (d2d4 + gg)                 # explicit Euler step
}

xlen = 100
N = 200
z = matrix(0, nrow=N, ncol=xlen)
x = runif(xlen)  # or: sin((1:xlen)/2) + rnorm(xlen, sd=0.01)
plot(x, type='l')
for (i in 1:N) {
  for (j in 1:10) {
    x = iterate(x)
  }
  z[i,] = x
}
image(z)
```

• Steve Huntsman says:

So I don’t know R (and was put off when I briefly considered learning it) but it looks to me like this does explicit (and vanilla) finite differencing (e.g., (1 -4 6 -4 1) + (0 1 -2 1 0)). I had been tinkering briefly with doing stuff in Fourier space to maintain periodicity, without success or patience.

2. Bob says:

You mention Galilean boosts: I guess there is a symmetry under the full d=1+1 Galilean group that takes solutions to solutions? In this case, I guess that means boosts, t translations and x translations. An interesting thing about the Galilean group is that it has a central extension, there is a central term in the [boost, translation] commutator. Is the central extension related to any interesting physics in the K-S equation?

• John Baez says:

Great questions! Since the Galilean group acts with a central extension on the Schrödinger equation and the heat equation (i.e. we have a projective representation of this group on the space of solutions, not an ‘honest’ representation), I would not at all be surprised if this is true of the K-S equation. But I have not done the calculation needed to check this. It should be pretty easy for the derivative form of the K-S equation!

3. allenknutson says:

I have a question about this “Schrödinger group”, which Wikipedia reports as the semidirect product of the symplectic group acting on the Heisenberg group. Is it really that? Or is it really the metaplectic group acting on the Heisenberg group? Obviously, older physicists aren’t to be trusted on this question, since they are only really thinking about the Lie algebra not the group.

My understanding is this. Let G be a group with a set Irr(G) of isomorphism classes of irreps. Assume that [V] in Irr(G) is a fixed point for the natural action of Aut(G) on Irr(G), or maybe for just Aut(G)_0, the identity component. (Which is automatic if Irr(G) is discrete.) Then we can try to extend the action of G on V to include Aut(G)_0 as well, but we fail up to some Schur’s lemma factors, which effectively mean we need to take a central extension of Aut(G)_0.

In the case G = an affine Lie group, Aut(G)_0 = Diff(S^1), then the positive-level representations form a discrete set and the action extends to that of the Virasoro centrally extending Diff(S^1).

In the case at hand G = Heisenberg, the Stone-von Neumann theorem says that our irrep is essentially unique, and so some extension of the symplectic group Aut(G)_0 should act… which I thought was the metaplectic group.

(This central extension issue may not be relevant for this Kuramoto-Sivashinsky stuff. But I still would like to know which group the Schrödinger group “should” be.)

• John Baez says:

Interesting! I think the Schrödinger group must be formed from the metaplectic group acting on the Heisenberg group. After all, we want the Schrödinger group to act on the Hilbert space of solutions of the 1d Schrödinger equation. This can be thought of as $L^2(\mathbb{R}),$ or equivalently a Fock space… and it’s the metaplectic group that acts on this nontrivially in a unitary way, not the symplectic group.

It’d be really cool if the whole Schrödinger group action on the space of solutions of Schrödinger equation deformed to give a nonlinear action on the space of solutions of the Kuramoto–Sivashinsky equation in this form

$u_t = -u_{xx} - u_{xxxx} + \lambda u_x u$

I don’t have any special reason to think it does, but I don’t think the papers I’ve read have ruled out the possibility.
