In his famous 1981 paper, Mourre gave a sufficient condition ensuring that a self-adjoint operator has no singular continuous spectrum. More precisely, consider a self-adjoint operator $H$ on a Hilbert space $\mathcal{H}$ (e.g., $L^2(\mathbb{R}^n)$ with the usual norm), and assume that there is a self-adjoint operator $A$, called a conjugate operator of $H$ on an interval $I \subset \mathbb{R}$, so that

$$E_I(H)\,[H, iA]\,E_I(H) \ge \theta E_I(H) + K$$

for some positive constant $\theta$ and some compact operator $K$ on $\mathcal{H}$, where $E_I(H)$ denotes the spectral projection of $H$ onto $I$, $[H, iA] = i(HA - AH)$ denotes the commutator, and the inequality is understood in the sense of self-adjoint operators. Then, Mourre’s main results are

- the point spectrum of $H$ in the interior of $I$ is finite.
- for any closed interval $J$ contained in the interior of $I$ and containing no eigenvalues of $H$, and any $s > 1/2$, the operator $\langle A \rangle^{-s} (H - z)^{-1} \langle A \rangle^{-s}$ is bounded on $\mathcal{H}$ uniformly as $\operatorname{Im} z \to 0$ with $\operatorname{Re} z \in J$, where $\langle A \rangle = (1 + A^2)^{1/2}$.
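To make the hypothesis concrete, here is the standard model case (included for illustration; it is not spelled out in the text above): the free Laplacian with the generator of dilations as conjugate operator, for which the Mourre estimate holds with no compact remainder.

```latex
% Model case: H = -\Delta on L^2(\mathbb{R}^n), with conjugate operator the
% generator of dilations A = \tfrac{1}{2}(x \cdot D + D \cdot x), D = -i\nabla.
% A direct computation using [D_j, x_j] = -i gives the exact commutator
[H, iA] = -2\Delta = 2H.
% Hence, for any interval I = [a, b] with a > 0, the functional calculus yields
E_I(H)\,[H, iA]\,E_I(H) = 2\,E_I(H)\,H\,E_I(H) \ge 2a\,E_I(H),
% i.e., the Mourre estimate holds on I with \theta = 2a and K = 0.
```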

Mourre’s theory has proven to be very useful in the study of spectral and scattering theory for Schrödinger operators and other dispersive PDEs. For instance, it yields the *limiting absorption principle*, which in turn gives *Kato’s local smoothing estimate* and the scattering *RAGE theorem*; see, for instance, this blog post of T. Tao.
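To fix notation, here is a schematic formulation of these two estimates for the free Schrödinger operator $H = -\Delta$ (a standard statement added for orientation, with spatial weights $\langle x \rangle^{-s}$ in place of the abstract weights $\langle A \rangle^{-s}$ of the Mourre setting).

```latex
% Limiting absorption principle: for s > 1/2 and a compact J \subset (0, \infty),
\sup_{\operatorname{Re} z \in J,\ \operatorname{Im} z \neq 0}
  \big\| \langle x \rangle^{-s} (H - z)^{-1} \langle x \rangle^{-s} \big\|_{L^2 \to L^2}
  < \infty,
% which, via a Plancherel argument in the time variable, implies Kato's
% local smoothing estimate for the flow localized to the spectral window J:
\int_{\mathbb{R}}
  \big\| \langle x \rangle^{-s} e^{-itH} E_J(H) \psi \big\|_{L^2}^2 \, dt
  \lesssim \| \psi \|_{L^2}^2.
```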

Below, I shall give a sketch of the proof of Mourre’s beautiful theorem, then derive some local decay estimates on solutions to Schrödinger equations, and discuss some quick applications to linear damping in fluids.