Is this serious Mathematics?
Let there be given a real-valued function $f(n)$, with $n\in\mathbb{N}$ and $a,b\in\mathbb{R}$:
$$
f(n+1) = a f(n) + b
$$
What then is the value of $f(\infty)$ ?
As a physicist by education, brought up with limits and calculus,
I would first derive the following, for $m\in\mathbb{N}$ and $a \neq 1$:
$$
f(n+m) = a^m f(n) + \frac{1-a^m}{1-a} b
$$
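For completeness, this closed form follows by unrolling the recurrence $m$ times and summing the resulting geometric series:
$$
f(n+m) = a\,f(n+m-1) + b = a^2 f(n+m-2) + (a+1)\,b = \cdots = a^m f(n) + b \sum_{k=0}^{m-1} a^k ,
$$
where $\sum_{k=0}^{m-1} a^k = \frac{1-a^m}{1-a}$ since $a \neq 1$.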
Consequently, but only for $|a| < 1$:
$$
f(\infty) = \lim_{m\to\infty} f(n+m) = b/(1-a)
$$
And if $|a|\ge 1$, then this limit, and hence $f(\infty)$, does not exist (apart from degenerate cases, e.g. a sequence that is constant from the start).
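For instance (the parameter values here are my own illustrative choices), with $a = \tfrac12$, $b = 1$, $f(0) = 0$ the closed form gives
$$
f(m) = 2 \left( 1 - 2^{-m} \right) \;\longrightarrow\; 2 = \frac{b}{1-a} ,
$$
whereas with $a = 2$ the same initial data give $f(m) = 2^m - 1$, which diverges.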
However, once upon a time on the internet, on sci.math
to be precise, I saw an argument like this (the original was in ASCII):
$$
f(n+1) = a f(n) + b
\quad \Longrightarrow \quad
f(\infty) = f(\infty+1) = a f(\infty) + b
\quad \Longrightarrow \quad
f(\infty) = b/(1-a)
$$
Thus, $f(\infty)$ is defined even if the sequence $f(n)$ is divergent.
I have reproduced the argument as faithfully as I can.
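To make the tension concrete (again an illustrative instance of my own): with $a = 2$, $b = 1$, $f(0) = 0$ we have $f(n) = 2^n - 1$, which diverges, yet the formal manipulation assigns $f(\infty) = b/(1-a) = -1$.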
Now the question is: is the latter a valid argument in mathematics, and is $f(\infty)$ indeed
defined, even if the sequence $f(n)$ is divergent?