Let’s talk about \(\Gamma\), the gamma function. One easy-to-understand definition of \(\Gamma\) is this one, in terms of the factorial:

\(\Gamma(n) = (n - 1)!\)

where \(!\) is the factorial function, which we can define like this:

\(n! = n \times (n-1) \times (n-2) \times \cdots \times 3 \times 2 \times 1\)

or equivalently, like this:

\[ n! = \begin{cases} 1 & n = 0 \\ n \times (n - 1)! & \mathrm{otherwise} \end{cases} \]
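For the code-minded, the base-case-and-recursive-case version translates almost word for word into a program. Here’s a minimal sketch (the function name is mine):

```python
def factorial(n: int) -> int:
    """n!, written to mirror the base-case/recursive-case definition."""
    if n == 0:                       # base case: 0! = 1
        return 1
    return n * factorial(n - 1)      # recursive case

print(factorial(5))  # → 120
```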

Both of these notations give us a view into the recursive nature of this function, which we’ll get to in a second. Clearly, \(\Gamma\) and \(!\) are closely related. In fact, for positive integers \(n\), we can define \(\Gamma\) in very similar ways to the ways we’ve defined factorial here. Here’s the multiplication-elided-with-dots version:

\(\Gamma(n) = (n-1) \times (n-2) \times (n-3) \times \cdots \times 3 \times 2 \times 1\)

And here’s the base-case-and-recursive-case version:

\[ \Gamma(n) = \begin{cases} 1 & n = 1 \\ (n-1) \times \Gamma(n-1) & \mathrm{otherwise} \end{cases} \]
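For positive integers, this recursive definition drops straight into code; a quick sketch (the name `gamma_int` is mine):

```python
def gamma_int(n: int) -> int:
    """Γ(n) for a positive integer n, via Γ(1) = 1 and Γ(n) = (n-1)·Γ(n-1)."""
    if n == 1:                           # base case: Γ(1) = 1
        return 1
    return (n - 1) * gamma_int(n - 1)    # recursive case

print(gamma_int(5))  # → 24, which is indeed 4!
```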

These notations make it pretty obvious that \(\Gamma\) (and its close cousin \(!\)) can be thought of recursively. There’s a base case—the function bottoms out in \(1\)—and there’s a pattern of reducing the problem from its original size down to the size of the base case. Because this recursiveness is inherent to \(\Gamma\), we should always be able to find it if we look hard enough…so let’s take a look at another definition for \(\Gamma\):

\[ \Gamma(n) = \int_0^\infty x^{n-1} e^{-x} \mathrm{d}x \]

At first glance, it’s harder to see where the recursiveness is now. Let’s try to evaluate the integral, though, using integration by parts with \(u = x^{n-1}\) and \(\mathrm{d}v = e^{-x}\,\mathrm{d}x\).

\[ \Gamma(n) = \int_0^\infty x^{n-1} e^{-x} \mathrm{d}x \]

\[ \Gamma(n) = \left[ -x^{n-1} e^{-x} \right]_0^\infty + \int_0^\infty (n-1)x^{n-2} e^{-x} \mathrm{d}x \]

\[ \Gamma(n) = 0 + \int_0^\infty (n-1)x^{n-2} e^{-x} \mathrm{d}x \]

\[ \Gamma(n) = (n-1)\int_0^\infty x^{n-2} e^{-x} \mathrm{d}x \]

\[ \Gamma(n) = (n-1)\Gamma(n-1) \]

The boundary term vanishes, since \(e^{-x}\) crushes \(x^{n-1}\) as \(x \to \infty\) and \(x^{n-1}\) is \(0\) at \(x = 0\) (for \(n > 1\)). And in the intermediate step, we get back an integral suspiciously similar to the one we’re currently working with! This is, in fact, just \(\Gamma(n-1)\), which brings us right back to our way of thinking about \(\Gamma\) as recursive.
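We can poke at this numerically too. Here’s a rough sketch that approximates the integral with a plain trapezoidal rule (the truncation point and step count are my own arbitrary choices) and checks the recurrence:

```python
import math

def gamma_numeric(n: float, upper: float = 60.0, steps: int = 200_000) -> float:
    """Approximate Γ(n) = ∫₀^∞ x^(n-1) e^(-x) dx with the trapezoidal rule.
    The e^(-x) factor decays so fast that truncating at `upper` barely matters."""
    h = upper / steps
    total = 0.0
    for i in range(steps + 1):
        x = i * h
        fx = x ** (n - 1) * math.exp(-x)
        total += fx / 2 if i in (0, steps) else fx  # half-weight the endpoints
    return total * h

# Γ(n) = (n-1)·Γ(n-1): Γ(5) should be ≈ 4·Γ(4), and both sides ≈ 24
print(gamma_numeric(5), 4 * gamma_numeric(4))
```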

Let’s try one more, this time a tricky one. Here’s a beast of a definition for \(\Gamma\), based on the Weierstrass Product Formula. We’ll try to reduce it to a form that feels more…recursivey.

\[ \Gamma(n) = \frac{1}{ ne^{ \lim_{m \to \infty} (\sum_{j=1}^{m} \frac{1}{j} - \ln m) n } \prod_{k=1}^{\infty} \left( 1+\frac{n}{k} \right) e^{-\frac{n}{k}} } \]

We can first simplify by noting that \[ \lim_{m \to \infty} \left( \sum_{j=1}^{m} \frac{1}{j} - \ln m \right) \] is actually just the Euler-Mascheroni constant, a number around \(0.5772\). Confusingly, it’s denoted by lowercase gamma (\(\gamma\)).

\[ \Gamma(n) = \frac{1}{ ne^{ \gamma n } \prod_{k=1}^{\infty} \left( 1+\frac{n}{k} \right) e^{-\frac{n}{k}} } \]
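As a sanity check before we go on, we can truncate the infinite product and watch it converge toward the familiar values (the term count is arbitrary, and \(\gamma\) is hard-coded to double precision):

```python
import math

EULER_MASCHERONI = 0.5772156649015329  # γ

def gamma_weierstrass(n: float, terms: int = 100_000) -> float:
    """Approximate Γ(n) by truncating the Weierstrass product at `terms` factors."""
    recip = n * math.exp(EULER_MASCHERONI * n)       # the n·e^(γn) prefactor of 1/Γ(n)
    for k in range(1, terms + 1):
        recip *= (1 + n / k) * math.exp(-n / k)
    return 1.0 / recip

print(gamma_weierstrass(5.0))  # ≈ 24, i.e. Γ(5) = 4!
```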

Taking the logarithm of both sides does all kinds of fancy (and helpful!) stuff, like removing exponential functions and turning the product into a sum.

\[ - \ln \Gamma(n) = \ln n + \gamma n + \sum_{k=1}^{\infty} \left[ \ln \left( 1+\frac{n}{k} \right) - \frac{n}{k} \right] \]

Differentiating with respect to \(n\) also gets us closer to where we want to be:

\[ - \frac{\Gamma^\prime(n)}{\Gamma(n)} = \frac{1}{n} + \gamma + \sum_{k=1}^{\infty} \left[ \frac{1}{n + k} - \frac{1}{k} \right] \]
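As an aside, the quantity \(\frac{\Gamma^\prime(n)}{\Gamma(n)}\) is the digamma function, usually written \(\psi(n)\). Here’s a quick sketch of the truncated series (truncation count is arbitrary); at \(n = 1\) the sum telescopes to \(-1\), so the whole right-hand side should collapse to \(\gamma\) itself:

```python
EULER_MASCHERONI = 0.5772156649015329  # γ

def neg_digamma(n: float, terms: int = 1_000_000) -> float:
    """-Γ'(n)/Γ(n), from the truncated series 1/n + γ + Σ [1/(n+k) - 1/k]."""
    return 1 / n + EULER_MASCHERONI + sum(1 / (n + k) - 1 / k
                                          for k in range(1, terms + 1))

# At n = 1 the sum telescopes to -1, so this should print ≈ γ ≈ 0.5772
print(neg_digamma(1.0))
```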

From the definition of \(\Gamma\), we can come up with a convenient second definition for \(\frac{\Gamma^\prime(n)}{\Gamma(n)}\).

\(\Gamma(n) = (n-1)\Gamma(n-1)\)

\(\Gamma^\prime(n) = (n-1)\Gamma^\prime(n-1) + \Gamma(n-1)\)

Dividing through by \(\Gamma(n) = (n-1)\Gamma(n-1)\),

\(\frac{\Gamma^\prime(n)}{\Gamma(n)} = \frac{\Gamma^\prime(n-1)}{\Gamma(n-1)} + \frac{1}{n-1}\)

Now let’s substitute this result back into our previous equation.

\[ - \frac{\Gamma^\prime(n-1)}{\Gamma(n-1)} - \frac{1}{n-1} = - \frac{\Gamma^\prime(n)}{\Gamma(n)} = \frac{1}{n} + \gamma + \sum_{k=1}^{\infty} \left[ \frac{1}{n + k} - \frac{1}{k} \right] \]

Hmm. So we have a recursivey-looking thing on the left, but not on the right:

\[ \frac{1}{n} + \gamma + \sum_{k=1}^{\infty} \left[ \frac{1}{n + k} - \frac{1}{k} \right] \]

These aren’t equations anymore, but let’s try to eliminate some stuff if we can. Adding the constant \(\gamma\) probably doesn’t add recursive flavor, and neither does the lone \(\frac{1}{n}\) term. Let’s examine just the sum:

\[ \sum_{k=1}^{\infty} \left[ \frac{1}{n + k} - \frac{1}{k} \right] \]

Maybe reorganizing terms will make something appear?

\[ \sum_{k=1}^{\infty} \left( - \frac{n}{k (n + k)} \right) \]

Writing out the first few terms?

\[ - \frac{n}{n + 1}, - \frac{n}{2n + 4}, - \frac{n}{3n + 9}, - \frac{n}{4n + 16} \cdots \]
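As a quick sanity check on that rearrangement (truncating both sums at the same arbitrary point), the two forms agree numerically:

```python
def original_sum(n: float, terms: int = 100_000) -> float:
    """Σ [1/(n+k) - 1/k], truncated."""
    return sum(1 / (n + k) - 1 / k for k in range(1, terms + 1))

def rearranged_sum(n: float, terms: int = 100_000) -> float:
    """Σ -n/(k(n+k)), truncated — the same sum after combining fractions."""
    return sum(-n / (k * (n + k)) for k in range(1, terms + 1))

print(original_sum(3.0), rearranged_sum(3.0))  # the two agree
```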

Well, sometimes an investigation doesn’t go the way you wanted. This doesn’t have the self-referential recursive flavor I was looking for, at least not in a straightforward way. It reminds me of the closed-form formula for the Fibonacci numbers: the usual recursive way of thinking has vanished, yielding an easier-to-compute but, in my opinion, harder-to-understand definition.

\[ \frac{\varphi^n - \psi^n}{\sqrt{5}} \]

\(\varphi\) and \(\psi\) are just numerical constants, meaning the function is parameterized only by n. There’s no recursive flavor here. It’s interesting to me that we can make recursion “vanish” in this way for at least some set of recursively defined functions, though I’m not really sure what it even means to make that “flavor” go away. I wonder if there are any recovery strategies to get recursive definitions back out of closed-form solutions, but I haven’t run into anything too profound in my brief searching.