Operator Calculus
Operators can be seen as mathematical devices that operate on functions;
in effect, operators are functions of functions. As for the kinds of
operations involved, differentiation and integration come to mind first.
Let $\psi$ be a function. Then the operation $(d/dx)$ in $ d\psi/dx$ is an
example of a (differential) operator. But multiplication by another
function, say $f$, is an operator as well: the $(f)$ in $f\psi$.
Now we have
the following obvious definitions for equality, sums and products of operators
$ \alpha, \beta $ and (soon also) $\gamma$, when applied to arbitrary functions
$\psi$ and $\phi$ :
$$ [\, \alpha = \beta \,] \; \equiv \; [\, \alpha\psi = \beta\psi \,] \quad ;
\quad ( \alpha + \beta ) \psi \; \equiv \; \alpha\psi + \beta\psi \quad ;
\quad ( \alpha \beta ) \psi \; \equiv \; \alpha ( \beta \psi ) $$
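These definitions can be sketched in code, with operators modeled as plain
functions that map functions to functions. The snippet below is illustrative
only; all names (`D`, `mult`, `op_sum`, `op_prod`) are mine, and the
derivative is approximated by a central difference rather than computed exactly:

```python
def D(psi, h=1e-5):
    """Differential operator d/dx, approximated by a central difference."""
    return lambda x: (psi(x + h) - psi(x - h)) / (2 * h)

def mult(f):
    """Multiplication operator: (f) psi = f * psi."""
    return lambda psi: (lambda x: f(x) * psi(x))

def op_sum(a, b):
    """(a + b) psi  =  a psi + b psi"""
    return lambda psi: (lambda x: a(psi)(x) + b(psi)(x))

def op_prod(a, b):
    """(a b) psi  =  a (b psi)"""
    return lambda psi: a(b(psi))

# Apply (d/dx + x) to psi(x) = x^2 at x = 1: expect 2x + x^3 = 3.
psi = lambda x: x * x
alpha = op_sum(D, mult(lambda x: x))
print(alpha(psi)(1.0))   # ~ 3.0

# Apply (x d/dx) to the same psi at x = 1: expect x * 2x = 2.
beta = op_prod(mult(lambda x: x), D)
print(beta(psi)(1.0))    # ~ 2.0
```

Note that an operator is applied first to a function and only then evaluated
at a point, mirroring the convention $(\alpha\beta)\psi = \alpha(\beta\psi)$.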
An operator is called linear if the following two requirements are
fulfilled. Here $\lambda$ is a scalar.
$$
\alpha ( \psi + \phi ) = \alpha \psi + \alpha \phi \quad ; \quad
\alpha ( \lambda \psi ) = \lambda ( \alpha \psi )
$$
The second requirement can be derived from the first, provided that
$\lambda$ is a rational number:
$$
\alpha(\left[\psi-\phi\right]+\phi)=\alpha(\psi-\phi)+\alpha\phi
\quad \Longrightarrow \quad \alpha(\psi-\phi)=\alpha\psi-\alpha\phi \\
\alpha(\psi-\psi)=\alpha\psi-\alpha\psi \quad \Longrightarrow \quad \alpha\,0 = 0 \\
\alpha(n\cdot\psi)=\alpha(\psi+\psi+\cdots+\psi)=\alpha\psi+\alpha\psi+\cdots+\alpha\psi
\quad \Longrightarrow \quad \alpha(n\cdot\psi)=n\cdot\alpha\psi \\
\alpha(n\cdot\psi/n) = n\cdot\alpha(\psi/n) \quad \Longrightarrow \quad
\alpha(\frac{1}{n}\psi) = \frac{1}{n}\alpha\psi \\
\alpha(\frac{m}{n}\psi)=m\cdot\alpha(\frac{1}{n}\psi)=m\cdot\frac{1}{n}\cdot\alpha\psi
\quad \Longrightarrow \quad
\alpha(\frac{m}{n}\psi) = \frac{m}{n}\alpha\psi \\
\alpha(0-\frac{m}{n}\psi) = \alpha\,0-\alpha(\frac{m}{n}\psi) = 0-\frac{m}{n}\alpha\psi
\quad \Longrightarrow \quad \alpha(-\frac{m}{n}\psi) = -\frac{m}{n}\alpha\psi
$$
However, in the realm of physics, rational numbers cannot be distinguished
from the irrationals, so for all practical purposes $\lambda \in \mathbb{Q}$
may be treated as $\lambda \in \mathbb{R}$.
Part of our motivation is in the thread "Additive functions and measure theory"
at the Mathematics Stack Exchange forum.
Hence the above may be assumed to be valid for all real-valued scalars $\lambda$.
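As a concrete sanity check (not a proof), one can verify numerically that the
central-difference approximation of $d/dx$ satisfies both linearity
requirements. The functions and numbers below are chosen for illustration only:

```python
import math

def D(psi, h=1e-6):
    """Central-difference approximation of d/dx."""
    return lambda x: (psi(x + h) - psi(x - h)) / (2 * h)

psi, phi, lam, x0 = math.sin, math.exp, 2.5, 0.7

# alpha(psi + phi) = alpha psi + alpha phi
additive_ok = abs(D(lambda t: psi(t) + phi(t))(x0)
                  - (D(psi)(x0) + D(phi)(x0))) < 1e-6

# alpha(lam psi) = lam (alpha psi)
homogeneous_ok = abs(D(lambda t: lam * psi(t))(x0)
                     - lam * D(psi)(x0)) < 1e-6

print(additive_ok, homogeneous_ok)   # True True
```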
In the sequel, all of our statements will be restricted to linear operators.
It is a simple exercise to prove the following Rules of Arithmetic:
$$ \alpha + \beta = \beta + \alpha \quad ; \quad
( \alpha + \beta ) + \gamma = \alpha + ( \beta + \gamma ) $$
$$ ( \alpha \beta ) \gamma = \alpha ( \beta \gamma ) \quad ; \quad
( \alpha + \beta ) \gamma = \alpha \gamma + \beta \gamma $$
$$ \gamma ( \alpha + \beta ) = \gamma \alpha + \gamma \beta \quad ; \quad
\alpha \lambda = \lambda \alpha $$
The rules for manipulating linear operators closely resemble the arithmetic
rules for ordinary numbers, with one single exception. And this exception is
actually the only thing one should keep in mind, in practice, when
performing arithmetic with (linear) operators:
the commutative law for products is, in general, not valid:
$$ \alpha\beta \neq \beta\alpha $$
For this reason, the commutator of two operators is defined as:
$$ \left[ \alpha , \beta \right] = \alpha \beta - \beta \alpha $$
The commutator of two operators, in general, will not be zero.
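A classic instance is the pair $d/dx$ and multiplication by $x$. The sketch
below (names and test function are mine) computes their commutator applied to
a test $\psi$, using a central-difference derivative; the result equals
$\psi$ itself, so $[\,d/dx , x\,] = 1$:

```python
def D(psi, h=1e-5):
    """Central-difference approximation of d/dx."""
    return lambda x: (psi(x + h) - psi(x - h)) / (2 * h)

def X(psi):
    """Multiplication operator by the function x."""
    return lambda x: x * psi(x)

def commutator(a, b):
    """[a, b] psi = a(b psi) - b(a psi)"""
    return lambda psi: (lambda x: a(b(psi))(x) - b(a(psi))(x))

psi = lambda x: x ** 3
value = commutator(D, X)(psi)(2.0)
print(value)   # ~ psi(2.0) = 8.0
```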
Furthermore we define an inverse and an $n$-fold composite operator:
$$ [\, \beta = \alpha^{-1} \,] \; \equiv \; [\, \alpha \beta = 1 \,] \quad ;
\quad \alpha^n \; \equiv \; \alpha \cdots \alpha \quad \mbox{($n$ factors)} $$
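The composite operator can be sketched as repeated application. In this
illustrative snippet (names are mine), $(d/dx)^2$ applied to $\sin$ yields
$-\sin$; the step $h$ is chosen coarser than before, to balance truncation
against rounding when the difference operator is nested:

```python
import math

def D(psi, h=1e-4):
    """Central-difference approximation of d/dx."""
    return lambda x: (psi(x + h) - psi(x - h)) / (2 * h)

def op_power(a, n):
    """alpha^n psi = alpha(alpha(... alpha(psi)))   (n factors)"""
    def apply(psi):
        for _ in range(n):
            psi = a(psi)
        return psi
    return apply

x0 = 1.1
second = op_power(D, 2)(math.sin)(x0)
print(abs(second - (-math.sin(x0))) < 1e-4)   # True
```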
Let's be somewhat more specific. The product rule for differentiation reads
$ (f\,\psi)' = f'\,\psi + f\,\psi' $. Or, in operator form:
$$ \left(\frac{d}{dx}\, f\right) \psi = \left(\frac{df}{dx}\right) \psi
+ \left(f\,\frac{d}{dx}\right) \psi $$
The function $\psi$, being entirely arbitrary, provides not a shred of essential
information; therefore it is desirable to leave it out. The operator definitions
above create exactly the working conditions that enable us to do so: whenever
we arrive at an expression of the form $ \alpha \psi = \beta \psi $ for arbitrary
$\psi$, we may leave out $\psi$ and write $\alpha = \beta$. Therefore:
$$ \frac{d}{dx}\,f = \frac{df}{dx} + f\,\frac{d}{dx} $$
From this, the (non-commutative) law for composing a differential operator
and a multiplication operator is derived:
$$ \left[ \frac{d}{dx} , f \right]
= \frac{d}{dx} f - f \frac{d}{dx} = \frac{df}{dx} $$
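This law can be checked numerically. In the illustrative sketch below
(with $f$ and $\psi$ chosen by me), $[\,d/dx , f\,]\psi$ is compared against
$(df/dx)\,\psi$ for $f(x) = \sin(x)$:

```python
import math

def D(psi, h=1e-5):
    """Central-difference approximation of d/dx."""
    return lambda x: (psi(x + h) - psi(x - h)) / (2 * h)

f, psi, x0 = math.sin, math.exp, 0.3

# [d/dx, f] psi  =  (d/dx)(f psi) - f ((d/dx) psi)
lhs = D(lambda x: f(x) * psi(x))(x0) - f(x0) * D(psi)(x0)
rhs = math.cos(x0) * psi(x0)    # (df/dx) psi, since f' = cos
print(abs(lhs - rhs) < 1e-6)    # True
```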
After multiplying from the left by $f^{-1}$, the earlier formula can also be written as follows:
$$ f^{-1} \frac{d}{dx} f = \frac{d}{dx} + \frac{f'}{f} $$
The fraction $ f'/f $ is recognized as the derivative of $\log(f)$. If we put
$ f'/f = g $, then $\log(f) = \int g \ dx $, hence $ f = \exp(\int g\,dx) $.
Now rename $g$ back to $f$, and finally exchange the left- and right-hand
sides. The end result is:
$$ \large \overline{\underline{ \left| \; \frac{d}{dx} + f = e^{-\int f \, dx}\,
\frac{d}{dx}\, e^{+\int f \, dx } \; \right| }} $$
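A numerical spot check of the boxed identity, with an assumed $f(x) = x$
(so that an antiderivative is $F(x) = x^2/2$), applied to a test function
$\psi = \sin$; all choices here are mine, for illustration only:

```python
import math

def D(psi, h=1e-5):
    """Central-difference approximation of d/dx."""
    return lambda x: (psi(x + h) - psi(x - h)) / (2 * h)

f = lambda x: x
F = lambda x: x * x / 2           # an antiderivative of f, i.e. "integral f dx"
psi, x0 = math.sin, 0.8

# Left-hand side:  (d/dx + f) psi
lhs = D(psi)(x0) + f(x0) * psi(x0)

# Right-hand side:  e^{-F} (d/dx) (e^{+F} psi)
rhs = math.exp(-F(x0)) * D(lambda x: math.exp(F(x)) * psi(x))(x0)

print(abs(lhs - rhs) < 1e-6)      # True
```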
The reader is expected to transfer this formula to his or her non-volatile
memory, so to speak. It is a very useful result, as we will demonstrate with
quite a few examples.