Determinant of a generalized Pascal matrix

Let $M$ denote the infinite matrix defined recursively by

$$ M_{ij} = \begin{cases} 1, & \text{if } i=1 \text{ and } j=1; \\ aM_{i-1,j}+bM_{i,j-1}+cM_{i-1,j-1}, & \text{otherwise}, \end{cases} $$ where $M_{i,0}$ and $M_{0,j}$ are both defined to be $0$.

(Added: I just discovered that the numbers in the $M$ matrix are called weighted Delannoy numbers.)

Let $M_n$ denote the $n \times n$ upper-left submatrix of $M$.

For example, with $a = b = c = 1$,

$M_1 = \begin{bmatrix} 1 \end{bmatrix}$, $M_2= \begin{bmatrix} 1 & 1 \\ 1 & 3 \end{bmatrix}$, $M_3 = \begin{bmatrix} 1 & 1 & 1 \\ 1 & 3 & 5 \\ 1 & 5 & 13 \end{bmatrix}$, and $M_4 = \begin{bmatrix} 1 & 1 & 1 & 1 \\ 1 & 3 & 5 & 7 \\ 1 & 5 & 13 & 25 \\ 1 & 7 & 25 & 63 \end{bmatrix}.$

A few years ago one of my students proved, by induction, that $$\det M_n = (ab+c)^{n(n-1)/2}.$$
My question is

Is there a noninductive proof that $\det M_n = (ab+c)^{n(n-1)/2}$ that gives more insight into why the determinant works out so nicely?

For example, when $a = b = 1$, $c = 0$, $M$ is the symmetric Pascal matrix. I've seen more than one way to prove that $\det M_n = 1$ in this case. For example, Edelman and Strang give four proofs of an LU-decomposition that does it. I also once saw, at a conference, a combinatorial proof using the interpretation of the determinant in terms of nonintersecting paths in a directed graph. (I think the talk was given by Art Benjamin, but it was several years ago, and I may be misremembering.) So I know that there are some nice proofs in the special case of the Pascal matrix. But what about the general case?
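For what it's worth, the formula is easy to spot-check numerically before hunting for proofs. Below is a short Python sketch (the weights $a=2$, $b=3$, $c=5$ are arbitrary test values, not part of the problem) that builds $M_n$ from the recursion and compares its exact determinant, computed with rational arithmetic, against $(ab+c)^{n(n-1)/2}$:

```python
from fractions import Fraction

def build_M(n, a, b, c):
    """n x n upper-left block of M (0-based; out-of-grid entries count as 0)."""
    M = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == 0 and j == 0:
                M[i][j] = 1
            else:
                M[i][j] = (a * (M[i-1][j] if i else 0)
                           + b * (M[i][j-1] if j else 0)
                           + c * (M[i-1][j-1] if i and j else 0))
    return M

def det(M):
    """Exact determinant via Gaussian elimination over the rationals."""
    A = [[Fraction(x) for x in row] for row in M]
    n = len(A)
    d = Fraction(1)
    for k in range(n):
        # find a nonzero pivot in column k
        piv = next((r for r in range(k, n) if A[r][k] != 0), None)
        if piv is None:
            return Fraction(0)
        if piv != k:
            A[k], A[piv] = A[piv], A[k]
            d = -d
        d *= A[k][k]
        for r in range(k + 1, n):
            f = A[r][k] / A[k][k]
            for cc in range(k, n):
                A[r][cc] -= f * A[k][cc]
    return d

a, b, c = 2, 3, 5
for n in range(1, 7):
    assert det(build_M(n, a, b, c)) == (a * b + c) ** (n * (n - 1) // 2)
print("det M_n == (ab+c)^(n(n-1)/2) for n = 1..6")
```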


Solution 1:

EDIT:

So, after all the effort I put in a couple of days back to prove this, I found out today that this result was already published earlier this year: "LDU decomposition of an extension matrix of the Pascal matrix," by Ik-Pyo Kim, Linear Algebra and Its Applications 434(10), 2187–2196, 2011.


The $LU$ factorization of $M_{n+1}$ is given below. Let $x$ be such that $x^2 = 1 + \frac{c}{ab}$. Then we have that $$M_{n+1} = L_{n+1} U_{n+1}$$ where $$\small L_{n+1} = \begin{bmatrix}1 & 0 & 0 & 0 & 0 & \cdots & 0 & 0\\a & ax & 0 & 0 & 0 & \cdots & 0 & 0\\ a^2 & 2a^2x & a^2x^2 & 0 & 0 & \cdots & 0 & 0\\ a^3 & 3a^3x & 3a^3x^2 & a^3x^3 & 0 & \cdots & 0 & 0\\ a^4 & 4a^4x & 6a^4x^2 & 4a^4x^3 & a^4x^4 & \cdots & 0 & 0\\ \vdots & \vdots & \vdots & \vdots & \vdots & \ddots & \vdots & \vdots\\ \vdots & \vdots & \vdots & \vdots & \vdots & \cdots & \ddots & \vdots \\ a^n & \binom{n}{1}a^nx^1 & \binom{n}{2}a^nx^2 & \binom{n}{3}a^nx^3 & \binom{n}{4}a^nx^4 & \cdots & \binom{n}{n-1}a^nx^{n-1} & a^nx^n \end{bmatrix}$$

$$\small U_{n+1}^T = \begin{bmatrix}1 & 0 & 0 & 0 & 0 & \cdots & 0 & 0\\b & bx & 0 & 0 & 0 & \cdots & 0 & 0\\ b^2 & 2b^2x & b^2x^2 & 0 & 0 & \cdots & 0 & 0\\ b^3 & 3b^3x & 3b^3x^2 & b^3x^3 & 0 & \cdots & 0 & 0\\ b^4 & 4b^4x & 6b^4x^2 & 4b^4x^3 & b^4x^4 & \cdots & 0 & 0\\ \vdots & \vdots & \vdots & \vdots & \vdots & \ddots & \vdots & \vdots\\ \vdots & \vdots & \vdots & \vdots & \vdots & \cdots & \ddots & \vdots \\ b^n & \binom{n}{1}b^nx^1 & \binom{n}{2}b^nx^2 & \binom{n}{3}b^nx^3 & \binom{n}{4}b^nx^4 & \cdots & \binom{n}{n-1}b^nx^{n-1} & b^nx^n \end{bmatrix}$$

This gives us $\det(L_n) = (ax)^{\frac{n(n-1)}{2}}$ and $\det(U_n) = (bx)^{\frac{n(n-1)}{2}}$.

Hence, $\det(M_n) = (ax)^{\frac{n(n-1)}{2}}(bx)^{\frac{n(n-1)}{2}} = (abx^2)^{\frac{n(n-1)}{2}} = (ab+c)^{\frac{n(n-1)}{2}}$.

Also note that $$L_n = A_n X_n$$ and $$U_n^T = B_n X_n$$ where $$A_{n+1} = \begin{bmatrix} 1 & 0 & 0 & 0 & 0 & \cdots & 0 & 0\\ 0 & a & 0 & 0 & 0 & \cdots & 0 & 0\\ 0 & 0 & a^2 & 0 & 0 & \cdots & 0 & 0\\ 0 & 0 & 0 & a^3 & 0 & \cdots & 0 & 0\\ 0 & 0 & 0 & 0 & a^4 & \cdots & 0 & 0\\ \vdots & \vdots & \vdots & \vdots & \vdots & \ddots & \vdots & \vdots\\ \vdots & \vdots & \vdots & \vdots & \vdots & \cdots & \ddots & \vdots\\ 0 & 0 & 0 & 0 & 0& \cdots & 0 & a^n\\ \end{bmatrix}$$ $$B_{n+1} = \begin{bmatrix} 1 & 0 & 0 & 0 & 0 & \cdots & 0 & 0\\ 0 & b & 0 & 0 & 0 & \cdots & 0 & 0\\ 0 & 0 & b^2 & 0 & 0 & \cdots & 0 & 0\\ 0 & 0 & 0 & b^3 & 0 & \cdots & 0 & 0\\ 0 & 0 & 0 & 0 & b^4 & \cdots & 0 & 0\\ \vdots & \vdots & \vdots & \vdots & \vdots & \ddots & \vdots & \vdots\\ \vdots & \vdots & \vdots & \vdots & \vdots & \cdots & \ddots & \vdots\\ 0 & 0 & 0 & 0 & 0& \cdots & 0 & b^n\\ \end{bmatrix}$$ $$X_{n+1} = \begin{bmatrix}1 & 0 & 0 & 0 & 0 & \cdots & 0 & 0\\1 & x & 0 & 0 & 0 & \cdots & 0 & 0\\ 1 & 2x & x^2 & 0 & 0 & \cdots & 0 & 0\\ 1 & 3x & 3x^2 & x^3 & 0 & \cdots & 0 & 0\\ 1 & 4x & 6x^2 & 4x^3 & x^4 & \cdots & 0 & 0\\ \vdots & \vdots & \vdots & \vdots & \vdots & \ddots & \vdots & \vdots\\ \vdots & \vdots & \vdots & \vdots & \vdots & \cdots & \ddots & \vdots \\ 1 & \binom{n}{1}x^1 & \binom{n}{2}x^2 & \binom{n}{3}x^3 & \binom{n}{4}x^4 & \cdots & \binom{n}{n-1}x^{n-1} & x^n \end{bmatrix}$$ Further, $X_n$ can be written as $C_n \tilde{X}_n$ where $$C_{n+1} = \begin{bmatrix}1 & 0 & 0 & 0 & 0 & \cdots & 0 & 0\\1 & 1 & 0 & 0 & 0 & \cdots & 0 & 0\\ 1 & 2 & 1 & 0 & 0 & \cdots & 0 & 0\\ 1 & 3 & 3 & 1 & 0 & \cdots & 0 & 0\\ 1 & 4 & 6 & 4 & 1 & \cdots & 0 & 0\\ \vdots & \vdots & \vdots & \vdots & \vdots & \ddots & \vdots & \vdots\\ \vdots & \vdots & \vdots & \vdots & \vdots & \cdots & \ddots & \vdots \\ 1 & \binom{n}{1} & \binom{n}{2} & \binom{n}{3} & \binom{n}{4} & \cdots & \binom{n}{n-1} & 1 \end{bmatrix}$$ and $$\tilde{X}_{n+1} = \begin{bmatrix} 1 & 0 & 0 & 0 & 0 & \cdots & 0 & 0\\ 0 & x & 0 & 0 & 0 & \cdots & 0 & 0\\ 0 & 0 & x^2 & 0 & 0 & \cdots & 0 & 0\\ 0 & 0 & 0 & x^3 & 0 & \cdots & 0 & 0\\ 0 & 0 & 0 & 0 & x^4 & \cdots & 0 & 0\\ \vdots & \vdots & \vdots & \vdots & \vdots & \ddots & \vdots & \vdots\\ \vdots & \vdots & \vdots & \vdots & \vdots & \cdots & \ddots & \vdots\\ 0 & 0 & 0 & 0 & 0& \cdots & 0 & x^n\\ \end{bmatrix}$$ Hence $$M_n = A_n C_n \tilde{X}_n^2 C_n^T B_n^T.$$ Setting $D_n = \tilde{X}_n^2$ and noting that $B_n^T = B_n$ (it is diagonal), we get a nice decomposition for $M_n$: $$M_n = A_n C_n D_n C_n^T B_n.$$ Note that $A_n,B_n,D_n$ are diagonal matrices with $A_n(i,i) = a^{i-1}$, $B_n(i,i) = b^{i-1}$ and $D_n(i,i) = \left( 1+\frac{c}{ab} \right)^{i-1}$.

$C_n$ is a triangular matrix containing the binomial coefficients where $C_n(i,j) = \left \{ \begin{array}{lr} \binom{i-1}{j-1} & j \leq i\\ 0 & j > i \end{array}\right.$.
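As a sanity check, the decomposition $M_n = A_n C_n D_n C_n^T B_n$ can be verified with exact rational arithmetic. The sketch below uses $0$-based indices, so the diagonal entries read $A(i,i)=a^i$, $B(i,i)=b^i$, $D(i,i)=(1+c/ab)^i$ for $i=0,\dots,n-1$; the test weights are arbitrary:

```python
from fractions import Fraction
from math import comb

def build_M(n, a, b, c):
    # weighted Delannoy recursion (0-based; out-of-grid entries count as 0)
    M = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == 0 and j == 0:
                M[i][j] = 1
            else:
                M[i][j] = (a * (M[i-1][j] if i else 0)
                           + b * (M[i][j-1] if j else 0)
                           + c * (M[i-1][j-1] if i and j else 0))
    return M

def matmul(P, Q):
    n = len(P)
    return [[sum(P[i][k] * Q[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

n, a, b, c = 5, 2, 3, 5
A = [[Fraction(a) ** i if i == j else Fraction(0) for j in range(n)] for i in range(n)]
B = [[Fraction(b) ** i if i == j else Fraction(0) for j in range(n)] for i in range(n)]
D = [[(1 + Fraction(c, a * b)) ** i if i == j else Fraction(0) for j in range(n)]
     for i in range(n)]
C = [[Fraction(comb(i, j)) for j in range(n)] for i in range(n)]  # comb(i,j)=0 for j>i
Ct = [[C[j][i] for j in range(n)] for i in range(n)]

prod = matmul(matmul(matmul(matmul(A, C), D), Ct), B)
M = build_M(n, a, b, c)
assert all(prod[i][j] == M[i][j] for i in range(n) for j in range(n))
print("A C D C^T B reproduces M_n exactly")
```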


EDIT

I computed this decomposition by the usual $LU$ algorithm. But now that we have it, it should be possible to reverse-engineer other proofs, similar to the known proofs for the Pascal matrix.


Proof

Let me first write out the proof based on the algorithm I used to construct these $L$ and $U$ factors.

Consider the elimination matrix $$E_{n+1}^y = \begin{pmatrix} 1 & 0 & 0 & 0 & \cdots\\ -y & 1 & 0 & 0 & \cdots\\ 0 & -y & 1 & 0 & \cdots\\ \vdots & \vdots & \vdots & \ddots & \cdots\\ \vdots & \vdots & \vdots & -y & 1 \end{pmatrix}$$

Note that $E_{n+1}^a L_{n+1} = \begin{pmatrix}1 & 0\\ 0 & ax L_n \end{pmatrix}$ and $E_{n+1}^b U_{n+1}^T = \begin{pmatrix}1 & 0\\ 0 & bx U_n^T \end{pmatrix}$

Hence, $E_{n+1}^a L_{n+1} U_{n+1} (E_{n+1}^b )^T = \begin{pmatrix}1 & 0\\ 0 & abx^2 M_n \end{pmatrix}$.

We now want to show that this matrix coincides with $E_{n+1}^a M_{n+1} (E_{n+1}^b)^T$.

Consider the $(i,j)^{\text{th}}$ entry of $E_{n+1}^a M_{n+1}$ for $i \geq 2$: it equals $M_{n+1}(i,j)-a M_{n+1}(i-1,j)$.

Now the $(i,j)^{\text{th}}$ entry of $E_{n+1}^a M_{n+1} (E_{n+1}^b)^T$, for $i,j \geq 2$, is $$\begin{aligned} &M_{n+1}(i,j)-a M_{n+1}(i-1,j) -b \left( M_{n+1}(i,j-1)-a M_{n+1}(i-1,j-1) \right)\\ &\quad= M_{n+1}(i,j) - a M_{n+1}(i-1,j) -b M_{n+1}(i,j-1) + ab M_{n+1}(i-1,j-1) \\ &\quad= \left( M_{n+1}(i,j) - a M_{n+1}(i-1,j) -b M_{n+1}(i,j-1) \right) + ab M_{n+1}(i-1,j-1) \\ &\quad= c M_{n+1}(i-1,j-1) + ab M_{n+1}(i-1,j-1) = abx^2 M_{n+1}(i-1,j-1). \end{aligned}$$

But $M_{n+1}(i-1,j-1) = M_n(i-1,j-1)$. Hence, we get that $$M_{n+1}(i,j)-a M_{n+1}(i-1,j) -b \left( M_{n+1}(i,j-1)-a M_{n+1}(i-1,j-1) \right) = abx^2 M_{n}(i-1,j-1).$$

Coupling this with induction gives the complete proof. (Note that the matrix $E_n^y$ is always invertible, which lets us conclude what we want.)
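The block identity $E_{n+1}^a M_{n+1} (E_{n+1}^b)^T = \begin{pmatrix}1&0\\ 0&(ab+c)M_n\end{pmatrix}$ (recall $abx^2 = ab+c$) is also easy to confirm numerically. A minimal Python sketch, with arbitrary integer test weights, so everything stays in exact integer arithmetic:

```python
def build_M(n, a, b, c):
    # weighted Delannoy recursion (0-based; out-of-grid entries count as 0)
    M = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == 0 and j == 0:
                M[i][j] = 1
            else:
                M[i][j] = (a * (M[i-1][j] if i else 0)
                           + b * (M[i][j-1] if j else 0)
                           + c * (M[i-1][j-1] if i and j else 0))
    return M

def matmul(P, Q):
    m = len(P)
    return [[sum(P[i][k] * Q[k][j] for k in range(m)) for j in range(m)]
            for i in range(m)]

def E(y, m):
    # elimination matrix: 1 on the diagonal, -y on the subdiagonal
    return [[1 if i == j else (-y if i == j + 1 else 0) for j in range(m)]
            for i in range(m)]

a, b, c, n = 2, 3, 5, 4
M1 = build_M(n + 1, a, b, c)                 # M_{n+1}
Eb = E(b, n + 1)
EbT = [[Eb[j][i] for j in range(n + 1)] for i in range(n + 1)]
lhs = matmul(matmul(E(a, n + 1), M1), EbT)

# expected block matrix diag(1, (ab+c) * M_n)
Mn = build_M(n, a, b, c)
block = [[0] * (n + 1) for _ in range(n + 1)]
block[0][0] = 1
for i in range(n):
    for j in range(n):
        block[i + 1][j + 1] = (a * b + c) * Mn[i][j]

assert lhs == block
print("E^a M_{n+1} (E^b)^T == diag(1, (ab+c) M_n)")
```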


Proof of the decomposition $M_n = A_n C_n D_n C_n^T B_n$ based on a combinatorial argument.

Let us first fix what we are counting.

Consider the positive integer lattice. From a point $(m,n)$, the allowed moves are: horizontally right to $(m+1,n)$, vertically up to $(m,n+1)$, or diagonally up-right to $(m+1,n+1)$.

The cost of a horizontal move is $a$, of a vertical move $b$, and of a diagonal move $c$. The total cost of a path is the product of the costs of its individual moves, i.e. if a path takes $x$ horizontal, $y$ vertical and $z$ diagonal moves, its total cost is $a^xb^yc^z$. In particular, the empty path (staying where you are) has cost $1$.

Claim: $M(i,j)$ gives the sum of the costs over all paths to move from $(1,1)$ to $(i,j)$.

Proof: This follows immediately from the recursive definition of $M(i,j)$. The last point before reaching $(i,j)$ is $(i-1,j)$, $(i-1,j-1)$, or $(i,j-1)$. Hence the total cost over all paths from $(1,1)$ to $(i,j)$ is $aM(i-1,j) + bM(i,j-1) + cM(i-1,j-1)$.
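The claim can be checked by brute force: enumerate every lattice path from $(1,1)$ to $(i,j)$, multiply the move costs, and compare against the recursion. A small Python sketch (the weights are again arbitrary test values):

```python
def paths(start, end):
    # yield every move sequence from start to end using right/up/diagonal steps
    if start == end:
        yield []
        return
    m, n = start
    for move, (dm, dn) in (('h', (1, 0)), ('v', (0, 1)), ('d', (1, 1))):
        nxt = (m + dm, n + dn)
        if nxt[0] <= end[0] and nxt[1] <= end[1]:
            for rest in paths(nxt, end):
                yield [move] + rest

def path_sum(i, j, a, b, c):
    # sum of a^x b^y c^z over all paths from (1,1) to (i,j)
    cost = {'h': a, 'v': b, 'd': c}
    total = 0
    for p in paths((1, 1), (i, j)):
        w = 1
        for mv in p:
            w *= cost[mv]
        total += w
    return total

def M(i, j, a, b, c):
    # recursive definition, with M(i,0) = M(0,j) = 0
    if i < 1 or j < 1:
        return 0
    if i == 1 and j == 1:
        return 1
    return (a * M(i-1, j, a, b, c) + b * M(i, j-1, a, b, c)
            + c * M(i-1, j-1, a, b, c))

a, b, c = 2, 3, 5
for i in range(1, 5):
    for j in range(1, 5):
        assert path_sum(i, j, a, b, c) == M(i, j, a, b, c)
print("M(i,j) equals the weighted path sum for 1 <= i,j <= 4")
```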

So all we need to prove is that $(A_n C_n D_n C_n^T B_n)(i,j)$ also gives the total cost of moving from $(1,1)$ to $(i,j)$. Since $A_n,B_n,D_n$ are diagonal matrices, the four summations in the product collapse to just one:
$$\begin{aligned} (A_n C_n D_n C_n^T B_n)(i,j) &= \sum_{k_1} \sum_{k_2} \sum_{k_3} \sum_{k_4} A_n(i,k_1) C_n(k_1,k_2) D_n(k_2,k_3) C_n(k_4,k_3) B_n(k_4,j)\\ &= \sum_{k_1} \sum_{k} \sum_{k_4} A_n(i,k_1) C_n(k_1,k) D_n(k,k) C_n(k_4,k) B_n(k_4,j)\\ &= \sum_{k} A_n(i,i) C_n(i,k) D_n(k,k) C_n(j,k) B_n(j,j)\\ &= \sum_{k=1}^{\min(i,j)} a^{i-1} \binom{i-1}{k-1} (x^2)^{k-1} \binom{j-1}{k-1} b^{j-1}\\ &= \sum_{k=1}^{\min(i,j)} a^{i-1} \binom{i-1}{k-1} \left( 1 + \frac{c}{ab} \right)^{k-1} \binom{j-1}{k-1} b^{j-1}\\ &= \sum_{k=1}^{\min(i,j)} a^{i-k} \binom{i-1}{k-1} \left( ab + c \right)^{k-1} \binom{j-1}{k-1} b^{j-k}\\ &= \sum_{k=1}^{\min(i,j)} \sum_{r=0}^{k-1} a^{i-k} \binom{i-1}{k-1} \binom{k-1}{r} (ab)^{k-1-r} c^r \binom{j-1}{k-1} b^{j-k}\\ &= \sum_{r=0}^{\min(i,j)-1} a^{i-1-r} b^{j-1-r} c^r \left( \sum_{k=r+1}^{\min(i,j)} \binom{i-1}{k-1} \binom{k-1}{r} \binom{j-1}{k-1} \right). \end{aligned}$$
All we need to prove now is that $\displaystyle \sum_{k=r+1}^{\min(i,j)} \binom{i-1}{k-1} \binom{k-1}{r} \binom{j-1}{k-1}$ counts the number of paths with $r$ diagonal moves.

Equivalently (substituting $t = k-r-1$), we need to prove that $\displaystyle \sum_{t=0}^{\min(i,j)-r-1} \binom{i-1}{t+r} \binom{t+r}{r} \binom{j-1}{t+r}$ counts the number of paths with $r$ diagonal moves.

We now count the number of paths from $(1,1)$ to $(i,j)$ with $r$ diagonal moves. If we denote a horizontal move by $h$, a vertical move by $v$, and a diagonal move by $d$, then such a path contains $(i-1-r)$ $h$'s, $(j-1-r)$ $v$'s and $r$ $d$'s, and any arrangement of these letters gives a valid path. Hence the total number of paths from $(1,1)$ to $(i,j)$ with $r$ diagonal moves is $$\frac{(i+j-r-2)!}{(i-1-r)!(j-1-r)!r!}.$$

Hence all we need to prove is that $$\displaystyle \sum_{t=0}^{\min(i,j)-r-1} \binom{i-1}{t+r} \binom{t+r}{r} \binom{j-1}{t+r} = \frac{(i+j-r-2)!}{(i-1-r)!(j-1-r)!r!}$$
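Before proving this identity, a quick numerical sanity check over a range of $(i,j,r)$, using only the standard library:

```python
from math import comb, factorial

def lhs(i, j, r):
    # left-hand side of the claimed binomial identity
    return sum(comb(i - 1, t + r) * comb(t + r, r) * comb(j - 1, t + r)
               for t in range(min(i, j) - r))

def rhs(i, j, r):
    # multinomial coefficient (i+j-r-2)! / ((i-1-r)! (j-1-r)! r!)
    return factorial(i + j - r - 2) // (
        factorial(i - 1 - r) * factorial(j - 1 - r) * factorial(r))

for i in range(1, 9):
    for j in range(1, 9):
        for r in range(min(i, j)):
            assert lhs(i, j, r) == rhs(i, j, r)
print("identity verified for all 1 <= i,j <= 8 and 0 <= r < min(i,j)")
```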

Claim: $\displaystyle \sum_{t=0}^{\min(i,j)-r-1} \binom{i-1}{t+r} \binom{t+r}{r} \binom{j-1}{t+r} = \frac{(i+j-r-2)!}{(i-1-r)!(j-1-r)!r!}$

First note that $\displaystyle \binom{i-1}{t+r} \binom{t+r}{r} = \binom{i-1}{r} \binom{i-1-r}{t}$.

(Can be proved algebraically or through a combinatorial argument of choosing $3$ sets of things in different orders.)

Hence, we get that $$\displaystyle \sum_{t=0}^{\min(i,j)-r-1} \binom{i-1}{t+r} \binom{t+r}{r} \binom{j-1}{t+r} = \binom{i-1}{r} \sum_{t=0}^{\min(i,j)-r-1} \binom{i-1-r}{t} \binom{j-1}{t+r}$$

Now $\displaystyle \sum_{t=0}^{\min(i,j)-r-1} \binom{i-1-r}{t} \binom{j-1}{t+r} = \sum_{t=0}^{\min(i,j)-r-1} \binom{i-1-r}{t} \binom{j-1}{j-1-t-r} = \binom{i+j-r-2}{j-r-1}$

The first equality above is nothing but $\binom{n}{r} = \binom{n}{n-r}$, while the second is Vandermonde's identity: split the $i+j-r-2$ objects into two sets of sizes $i-1-r$ and $j-1$ and count the selections of $j-r-1$ objects.

Hence, we get that $$\displaystyle \sum_{t=0}^{\min(i,j)-r-1} \binom{i-1}{t+r} \binom{t+r}{r} \binom{j-1}{t+r} = \binom{i-1}{r} \binom{i+j-r-2}{j-r-1}$$

But $\displaystyle \binom{i-1}{r} \binom{i+j-r-2}{j-r-1} = \frac{(i+j-r-2)!}{(i-1-r)!(j-1-r)!r!}$ which can be seen through direct algebra or again a combinatorial argument which relies on picking things in different order.

Hence, we get that $\displaystyle \sum_{t=0}^{\min(i,j)-r-1} \binom{i-1}{t+r} \binom{t+r}{r} \binom{j-1}{t+r} = \frac{(i+j-r-2)!}{(i-1-r)!(j-1-r)!r!}$.

This completes the combinatorial proof.

Solution 2:

Inspired by the solution of Sivaram Ambikasaran, I would just like to furnish some concision. Given $$ M_{i,j} = \begin{cases} a^{i-1}b^{j-1} & \text{if } i=1 \text{ or } j=1, \\ aM_{i-1,j}+bM_{i,j-1}+cM_{i-1,j-1} & \mbox{if }i>1 \text{ and } j>1,\\ \end{cases} $$ one may (inspired by what is said in the Edelman-Strang paper) start subtracting from each row (except the first) $a$ times the previous row. This is to be done simultaneously, or working from the bottom up, so that it is the original value of the row that is subtracted. This amounts to left-multiplying by the elimination matrix $E_{-a}$ with diagonal entries $1$ and subdiagonal entries $-a$. One obtains a matrix $M'$ given by $$ M'_{i,j} = \begin{cases} b^{j-1} & \text{if } i=1, \\ M_{i,j}-aM_{i-1,j} & \mbox{if }i>1.\\ \end{cases} $$ The latter expression is $0$ if $j=1$, and otherwise $$ M'_{i,j} = bM_{i,j-1}+cM_{i-1,j-1}\qquad\text{if }i,j>1 $$ using the recursion defining $M$. Now proceed similarly by columns, right-multiplying by the transpose $E_{-b}^\top$ of $E_{-b}$, giving a new matrix $M''$ given by $$ M''_{i,j} = \begin{cases} \delta_{i,j} & \text{if } j=1, \\ M'_{i,j}-bM'_{i,j-1} & \mbox{if }j>1.\\ \end{cases} $$ The latter expression is $0$ if $i=1$, and otherwise $$\begin{align} M''_{i,j} &= bM_{i,j-1}+cM_{i-1,j-1}-b(M_{i,j-1}-aM_{i-1,j-1})\\ & =(ab+c)M_{i-1,j-1} \end{align}\qquad\text{if }i,j>1, $$ using the two expressions for $M'_{i,j}$ given above. In other words one has in block form $$ M''=\begin{pmatrix}1&0\\ 0&(ab+c)M_{(n-1)}\end{pmatrix} $$ where $M_{(n-1)}$ is the $(n-1)\times(n-1)$ top-left submatrix of $M$, which is its counterpart of size one less. Therefore $$ \det M_{(n)} = \det M = \det M''= (ab+c)^{n-1} \det M_{(n-1)} $$ from which it follows by induction that $\det M=(ab+c)^{\tbinom{n}2}$.

The scalar multiplication of $M_{(n-1)}$ by $ab+c$ can of course be realised as multiplication by a multiple of the identity matrix, which commutes with the matrices obtained in a recursive decomposition of $M_{(n-1)}$. It is then easy to conclude that one has a decomposition $$ M_{(n)} = L_{(n)} D_{(n)} U_{(n)} $$ where $D_{(n)}$ is diagonal with entries $(D_{(n)})_{i,i}=(ab+c)^{i-1}$ and $L_{(n)}$ and $U_{(n)}$ are unitriangular and therefore have determinant $1$. Their explicit form can be obtained from the recurrences $$ L_{(n)} = (E_{-a})^{-1}\cdot \begin{pmatrix}1&0\\ 0&L_{(n-1)}\end{pmatrix}, \qquad U_{(n)} = \begin{pmatrix}1&0\\ 0&U_{(n-1)}\end{pmatrix}\cdot(E_{-b}^\top)^{-1}, $$ which are easily found to be solved by $(L_{(n)})_{i,j}=\binom{i-1}{j-1}a^{i-j}$ and $(U_{(n)})_{i,j}=\binom{j-1}{i-1}b^{j-i}$, either using $((E_{-a})^{-1})_{i,j}=a^{i-j}$ (for $i \geq j$) or by first solving $((L_{(n)})^{-1})_{i,j}=\binom{i-1}{j-1}(-a)^{i-j}$.
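These explicit factors can be confirmed directly. With $0$-based indices the claim reads $L(i,j)=\binom{i}{j}a^{i-j}$, $D(i,i)=(ab+c)^i$, $U(i,j)=\binom{j}{i}b^{j-i}$, and everything stays in exact integer arithmetic; here is a sketch with arbitrary test weights:

```python
from math import comb

def build_M(n, a, b, c):
    # weighted Delannoy recursion (0-based; out-of-grid entries count as 0)
    M = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == 0 and j == 0:
                M[i][j] = 1
            else:
                M[i][j] = (a * (M[i-1][j] if i else 0)
                           + b * (M[i][j-1] if j else 0)
                           + c * (M[i-1][j-1] if i and j else 0))
    return M

def matmul(P, Q):
    n = len(P)
    return [[sum(P[i][k] * Q[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

n, a, b, c = 6, 2, 3, 5
L = [[comb(i, j) * a ** (i - j) if j <= i else 0 for j in range(n)] for i in range(n)]
U = [[comb(j, i) * b ** (j - i) if i <= j else 0 for j in range(n)] for i in range(n)]
D = [[(a * b + c) ** i if i == j else 0 for j in range(n)] for i in range(n)]

assert matmul(matmul(L, D), U) == build_M(n, a, b, c)
print("M_n == L D U with unitriangular L, U and D(i,i) = (ab+c)^i")
```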

Solution 3:

This is not an answer but rather a comment on Sivaram's answer.
[update] I had to correct the numerical examples because I used a slightly distorted version of $M$. But this does not affect the basic idea of using the LDU-decomposition instead of the LU-decomposition. Finally, this is now not much more than a slight optimization of Sivaram's ansatz.

If I decompose into $\small L \cdot D \cdot U $ instead, where $D$ is diagonal, then the pattern of the factors $L$ and $U$ is simpler (they contain only $a$ resp. $b$), and their diagonals are all $1$ and so can be neglected for the determinant. So we need only the determinant of $D$. Then we find that $D$ can be described by
$\qquad \small D=[1,c+ab, (c+ab)^2, (c+ab)^3,\ldots ,(c+ab)^{n-1} ] = dV(c+ab,n) $

and the determinant for matrix size $n \times n$ is then the product of these terms up to $n-1$, an obviously simple expression. Again, the pattern found in the LDU-decomposition must be proven, but the approach might allow a shorter path...

[added] : The actual LDU-decomposition can be written as

$\qquad \small M = P^{a} \cdot dV(c+ab) \cdot (P^{b})^{\sim} $

where $P$ is the lower triangular Pascal matrix, "$\sim$" means the transpose, and $dV(x)$ is the diagonal matrix containing the consecutive powers of its argument $x$, beginning at $\small x^0 $.

(Remark: I often prefer this LDU-decomposition because it avoids introducing squares and/or square roots, as for instance the Cholesky decomposition does in the symmetric case.)


In case your software does not have an LDU-decomposition available, here is some code usable in Pari/GP:
 LDU(Y) = {                \\ Y must be square; no error checks in this demo
   local(dim=#Y, D, MR, ML, dx);
   D = matrix(dim,dim); MR = matid(dim); ML = matid(dim);
   for(p=1,dim,
     D[p,p] = (dx = Y[p,p]);
     for(c=p+1,dim,
          MR[p,c] = if(dx==0, 0, Y[p,c]/dx)
         );
     for(r=p+1,dim,
          ML[r,p] = if(dx==0, 0, Y[r,p]/dx)
        );
     for(r=p+1,dim,
        for(c=p+1,dim,
              Y[r,c] -= ML[r,p]*dx*MR[p,c]
        ));
    );
   return([ML,D,MR]);
 }

The matrix L has the form [now corrected!] $\small L = P^a $

$\small \begin{array} {lllll} 1 & . & . & . & . & . \\ a & 1 & . & . & . & . \\ a^2 & 2a & 1 & . & . & . \\ a^3 & 3a^2 & 3a & 1 & . & . \\ a^4 & 4a^3 & 6a^2 & 4a & 1 & . \\ a^5 & 5a^4 & 10a^3 & 10a^2 & 5a & 1 \end{array} $

while $U$ is simply the transpose, with $a$ replaced by $b$.
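For readers without Pari/GP, the same elimination loop can be sketched in Python with exact rationals (the helper names here are my own, not from the GP code); the final check confirms that the extracted diagonal is indeed $1,\; c+ab,\; (c+ab)^2, \ldots$ for arbitrary numeric test weights:

```python
from fractions import Fraction

def build_M(n, a, b, c):
    # weighted Delannoy recursion (0-based; out-of-grid entries count as 0)
    M = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == 0 and j == 0:
                M[i][j] = 1
            else:
                M[i][j] = (a * (M[i-1][j] if i else 0)
                           + b * (M[i][j-1] if j else 0)
                           + c * (M[i-1][j-1] if i and j else 0))
    return M

def ldu(Y):
    # naive LDU by Gaussian elimination; Y must be square with nonzero pivots
    n = len(Y)
    Y = [[Fraction(x) for x in row] for row in Y]
    L = [[Fraction(int(i == j)) for j in range(n)] for i in range(n)]
    U = [[Fraction(int(i == j)) for j in range(n)] for i in range(n)]
    D = [[Fraction(0)] * n for _ in range(n)]
    for p in range(n):
        dx = Y[p][p]
        D[p][p] = dx
        for cc in range(p + 1, n):
            U[p][cc] = Y[p][cc] / dx
        for r in range(p + 1, n):
            L[r][p] = Y[r][p] / dx
        for r in range(p + 1, n):
            for cc in range(p + 1, n):
                Y[r][cc] -= L[r][p] * dx * U[p][cc]
    return L, D, U

a, b, c, n = 2, 3, 5, 6
L, D, U = ldu(build_M(n, a, b, c))
assert all(D[i][i] == (c + a * b) ** i for i in range(n))
print("diagonal of D is (c+ab)^0, ..., (c+ab)^5")
```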


[update] Here is the Pari/GP code to generate the matrix M (corrected) and the call for the LDU-decomposition:
M = matrix(6,6,r,c); for(r=1,6, M[r,1]='a^(r-1)); for(c=1,6, M[1,c]='b^(c-1));
    for(r=2,6, for(c=2,6, M[r,c]='a*M[r-1,c]+'b*M[r,c-1]+'c*M[r-1,c-1]))
M_ldu = LDU(M); L = M_ldu[1]; D = M_ldu[2]; U = M_ldu[3];
print(vector(6, i, D[i,i])~)  \\ check the diagonal component
%1164 = [1,
         c + a*b ,
         c^2 + 2*b*a*c + b^2*a^2,
         ... (snipped) ] ~