Rank of square matrix $A$ with $a_{ij}=\lambda_j^{p_i}$, where $p_i$ is an increasing sequence
Let $$ A = \begin{bmatrix} \lambda_1^{p_1} & \lambda_2^{p_1} & \cdots & \lambda_n^{p_1} \\ \lambda_1^{p_2} & \lambda_2^{p_2} & \cdots & \lambda_n^{p_2} \\ \lambda_1^{p_3} & \lambda_2^{p_3} & \cdots & \lambda_n^{p_3} \\ \vdots & \vdots & \ddots & \vdots \\ \lambda_1^{p_n} & \lambda_2^{p_n} & \cdots & \lambda_n^{p_n} \end{bmatrix}$$ where $p_1=0$ and $p_1 < p_2 < \cdots < p_n$. Each $p_i$ is an integer (the sequence is increasing but bounded) and each $\lambda_j$ is a positive real number.
I believe that $A$ is full rank whenever the $\lambda_j$ are pairwise distinct, i.e. $\lambda_k \neq \lambda_j$ for all $k\neq j$. That is the question I would like to answer: when is $\operatorname{rank}(A)=n$? So far I can prove it for $n=2$ and $n=3$ when the $\lambda_j$ are distinct.
In case it helps, $$ A = \begin{bmatrix} C\Lambda^{p_1} \\ C\Lambda^{p_2} \\ \vdots \\ C\Lambda^{p_n} \end{bmatrix}$$ where $C = [1\;1\; 1\; \cdots\; 1]$ is a $1\times n$ row vector of ones and $\Lambda = \operatorname{diag}\{\lambda_1, \lambda_2, \dots, \lambda_n\}$ is a diagonal matrix.
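For concreteness, here is a minimal sympy sketch of this block form (with my own illustrative choice $n=3$ and $p=(0,1,3)$), checking that stacking the rows $C\Lambda^{p_i}$ reproduces the entrywise definition of $A$:

```python
from sympy import Matrix, diag, ones, symbols

lam = symbols('lambda1:4')   # lambda1, lambda2, lambda3
p = (0, 1, 3)                # illustrative exponents with p_1 = 0

C = ones(1, 3)               # 1 x n row vector of ones
Lam = diag(*lam)             # diagonal matrix of the lambda_j

# Stack the rows C * Lambda^{p_i} and compare with the entrywise definition
A_block = Matrix.vstack(*[C * Lam**pi for pi in p])
A_entry = Matrix(3, 3, lambda i, j: lam[j]**p[i])
print(A_block == A_entry)    # True
```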
Also, if $p_i = i-1$, my hypothesis can be proven as long as the entries of the diagonal matrix are distinct; I know the latter from observability in control system theory. However, the case I am trying to evaluate is more general, and $p_i$ might not equal $i-1$.
It is not true that $\text{rank}(A)=n$ whenever the $\lambda_i$ are pairwise distinct, even for $n=3$. Consider
\begin{equation} A=\left(\begin{array}{ccc} 1&1&1\\ {\lambda_1}&{\lambda_2}&{\lambda_3}\\ {\lambda_1}^3&{\lambda_2}^3&{\lambda_3}^3 \end{array} \right) \end{equation}
Then $\det(A)=-(\lambda_1-\lambda_2)(\lambda_1-\lambda_3)(\lambda_2-\lambda_3)(\lambda_1+\lambda_2+\lambda_3)$, so if $\lambda_1+\lambda_2+\lambda_3=0$ we get $\det(A)=0$ even though the $\lambda_i$ are pairwise distinct, and $A$ is not full rank.
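This counterexample is easy to verify with sympy; the substitution $(\lambda_1,\lambda_2,\lambda_3)=(1,2,-3)$ below is my own illustrative choice of distinct values summing to zero:

```python
from sympy import Matrix, factor, symbols

l1, l2, l3 = symbols('lambda1 lambda2 lambda3')

A = Matrix([[1,     1,     1    ],
            [l1,    l2,    l3   ],
            [l1**3, l2**3, l3**3]])

# Factors (up to ordering of the factors) as
# -(l1 - l2)*(l1 - l3)*(l2 - l3)*(l1 + l2 + l3)
print(factor(A.det()))

# Distinct values summing to zero make the determinant vanish
print(A.det().subs({l1: 1, l2: 2, l3: -3}))   # 0
```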
EDIT: Expanding the answer as per the comments, and taking into consideration now that all $\lambda_i$ are positive. When $p_i=i-1$ we have a Vandermonde matrix, whose determinant is simply
$$\prod_{1\leq i<j \leq n} (\lambda_j-\lambda_i)$$
This confirms that the determinant is $0$ if and only if $\lambda_i=\lambda_j$ for some $i\neq j$; that is, when the $\lambda_i$ are pairwise distinct, $A$ is indeed full rank.
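As a sanity check, here is a small sympy verification of the Vandermonde determinant formula (for the illustrative size $n=4$):

```python
from sympy import Matrix, expand, prod, symbols

n = 4
lam = symbols(f'lambda1:{n + 1}')          # lambda1, ..., lambda4

# Vandermonde matrix: row i holds the powers p_i = i (i.e. p_i = i - 1
# under 1-based indexing)
V = Matrix(n, n, lambda i, j: lam[j]**i)
vandermonde = prod(lam[j] - lam[i] for i in range(n) for j in range(i + 1, n))

print(expand(V.det() - vandermonde))       # 0
```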
In the more general case where the $p_i$ can skip powers, as in the question, $\det(A)$ is an alternating polynomial (notice that swapping two columns is equivalent to swapping two variables and causes a change of sign). As such, it is a product $v\cdot s$ of the Vandermonde determinant $v$ and a symmetric polynomial $s$ (in $\lambda_1,\lambda_2, \dots, \lambda_n$).
We can conclude more with Schur polynomials. Indeed, we have that $\det(A)$ is, up to sign, given by
$$a_{(p_1,p_2,\dots,p_n)}(\lambda_1,\lambda_2,\dots,\lambda_n)$$
so the symmetric polynomial $s$ that's multiplying the Vandermonde determinant in $\det(A)$ is a Schur polynomial.
Finally, use the fact that Schur polynomials can be written as non-negative integer linear combinations of monomial symmetric polynomials:
$$s=\sum_{\mu}K_{\mu}m_{\mu}$$
where the $K_{\mu}$ are non-negative integers (Kostka numbers), not all zero (the coefficient of the leading monomial is $1$), and each $m_{\mu}$ is a monomial symmetric polynomial. Since monomial symmetric polynomials are positive whenever their arguments are all positive, so too is $s$.
Hence, when the $\lambda_i$ are all positive, $s$ does not vanish, so in that range $\det(A)=0$ if and only if $\lambda_i=\lambda_j$ for some $i\neq j$. In other words, when the $\lambda_i$ are positive, $A$ is full rank if and only if the $\lambda_i$ are pairwise distinct.
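The whole argument can be sanity-checked with sympy; here is a sketch for my own illustrative skipped-power choice $p=(0,2,5)$, verifying that $\det(A)$ is the Vandermonde determinant times a polynomial with only non-negative coefficients:

```python
from sympy import Matrix, Poly, cancel, expand, prod, symbols

lam = symbols('lambda1:4')
p = (0, 2, 5)   # illustrative increasing exponents with p_1 = 0

A = Matrix(3, 3, lambda i, j: lam[j]**p[i])
vandermonde = prod(lam[j] - lam[i] for i in range(3) for j in range(i + 1, 3))

# The division is exact, so s is a (symmetric) polynomial
s = cancel(A.det() / vandermonde)
print(expand(s))

# All (nonzero) coefficients are positive, so s > 0 for positive lambda_i
print(all(c > 0 for c in Poly(s, *lam).coeffs()))   # True
```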
EDIT²: To spell out the factorization above, the determinant is of the form
$$\left(\prod_{1\leq i < j \leq n} (\lambda_j - \lambda_i)\right) \cdot s(\lambda_1, \dots, \lambda_n)$$
where $s$ is a Schur polynomial. As a Schur polynomial, $s$ is a symmetric polynomial and can be written as a (finite) linear combination
$$s(\lambda_1,\dots, \lambda_n) = \sum_{\mu}K_{\mu}\cdot m_{\mu}$$
where the $K_{\mu}$ are all non-negative integers and the $m_{\mu}$ are monomial symmetric polynomials.
Monomial symmetric polynomials can be indexed by a partition $i=i_1+i_2+\cdots + i_j$ (where we assume without loss of generality $i_1 \geq i_2 \geq \dots \geq i_j \geq 0$). Such a partition is denoted simply by $(i_1,i_2,\dots,i_j)$ and gives rise to the following polynomial
$$m_{(i_1,i_2,\dots,i_j)}(X_1,X_2,\dots , X_j)=\sum\limits_{(a_1,a_2,\dots,a_j) \in P(i_1,i_2,\dots,i_j)} {X_1}^{a_1}\cdot {X_2}^{a_2} \cdots {X_j}^{a_j}$$
where $P(i_1,i_2,\dots,i_j)$ is the set of all distinct permutations of the tuple $(i_1,i_2,\dots,i_j)$.
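A minimal sympy sketch of this definition (the helper name `m_mu` is my own), summing over the distinct permutations of the exponent tuple:

```python
from sympy import prod, symbols
from sympy.utilities.iterables import multiset_permutations

def m_mu(exponents, variables):
    """Monomial symmetric polynomial: sum of X_1^{a_1} * ... * X_j^{a_j}
    over all distinct permutations (a_1, ..., a_j) of the exponents."""
    return sum(prod(x**a for x, a in zip(variables, perm))
               for perm in multiset_permutations(exponents))

lam = symbols('lambda1:4')
print(m_mu((1, 0, 0), lam))   # lambda1 + lambda2 + lambda3
```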
For concrete examples of these polynomials, see the Wikipedia article on symmetric polynomials. Also, the Schur polynomial in my first example matrix $A$ up above is $$s(\lambda_1,\lambda_2,\lambda_3)= 1 \cdot m_{(1,0,0)}(\lambda_1,\lambda_2,\lambda_3)=\lambda_1+\lambda_2+\lambda_3$$
In particular, when $X_1,X_2,\dots, X_j >0$ any monomial symmetric polynomial is a sum of positive terms and thus itself positive. Hence, $s$ too is positive when the $\lambda_i$'s are positive.