Uniqueness of barycentric coordinates of a simplex

Let $k,d\in\mathbb N$, $p_0,\ldots,p_k\in\mathbb R^d$ be affinely independent and $$\Delta:=\left\{\sum_{i=0}^k\lambda_ip_i:\lambda_0,\ldots,\lambda_k\ge0\text{ and }\sum_{i=0}^k\lambda_i=1\right\}.$$

How can we show that for all $p\in\Delta$, there is a unique $(\lambda_0,\ldots,\lambda_k)$ with $\lambda_0,\ldots,\lambda_k\ge0$, $\sum_{i=0}^k\lambda_i=1$ and $p=\sum_{i=0}^k\lambda_ip_i$?

Obviously, any such $(\lambda_0,\ldots,\lambda_k)$ is a solution of $$\underbrace{\begin{pmatrix}p_0&\cdots&p_k\\1&\cdots&1\end{pmatrix}}_{=:\:A}\begin{pmatrix}\lambda_0\\\vdots\\\lambda_k\end{pmatrix}=\begin{pmatrix}p\\1\end{pmatrix}\tag1.$$ Now, since $p_0,\ldots,p_k$ are affinely independent, $\begin{pmatrix}p_0\\1\end{pmatrix},\ldots,\begin{pmatrix}p_k\\1\end{pmatrix}$ are linearly independent. In particular, $$\operatorname{rank}A=k+1\le d+1\tag2.$$

But how do we conclude? Even when $k=d$, in which case $A$ is an invertible square matrix, I still have trouble seeing why the unique solution of $(1)$, which exists in $\mathbb R^{k+1}$ in that case, also satisfies the nonnegativity condition $\lambda_0,\ldots,\lambda_k\ge0$.

For the general case, I'm not sure how to argue at all.

The most relevant example for me is $k=2$ and $d=3$, in which case $\Delta$ is a flat closed triangle in three-dimensional space.
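To make this concrete, here is a worked instance of $(1)$ (the vertices are chosen purely for illustration): take $p_0=e_1$, $p_1=e_2$, $p_2=e_3$, the standard basis vectors of $\mathbb R^3$. Then $(1)$ reads $$\begin{pmatrix}1&0&0\\0&1&0\\0&0&1\\1&1&1\end{pmatrix}\begin{pmatrix}\lambda_0\\\lambda_1\\\lambda_2\end{pmatrix}=\begin{pmatrix}p\\1\end{pmatrix},$$ and the first three rows force $(\lambda_0,\lambda_1,\lambda_2)$ to equal the coordinates of $p$, which are nonnegative and sum to $1$ exactly because $p\in\Delta$. So in this instance everything works out, but I don't see how to argue it in general.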


Your simplex is defined as the set of all convex combinations of $p_0,\ldots,p_k\in\mathbb R^d$:
$\Delta:=\left\{\sum_{i=0}^k\lambda_ip_i:\lambda_0,\ldots,\lambda_k\ge0\text{ and }\sum_{i=0}^k\lambda_i=1\right\}.$

Then any $p\in\Delta$ can, by definition, be written as a convex combination of the $p_i$. This gives existence of a (nonnegative) solution to
$A\mathbf x = \begin{pmatrix}p_0&\cdots&p_k\\1&\cdots&1\end{pmatrix}\begin{pmatrix}\lambda_0\\\vdots\\\lambda_k\end{pmatrix}=\begin{pmatrix}p\\1\end{pmatrix}$

and uniqueness follows because, as you say, the columns of $A$ are linearly independent. The usual linear algebra argument is:
suppose $A\mathbf x = \begin{pmatrix}p\\1\end{pmatrix} = A\mathbf x'$. Then $A(\mathbf x-\mathbf x') = A\mathbf v = \mathbf 0$, where $\mathbf v := \mathbf x - \mathbf x'$.
Thus $\sum_{i=0}^k v_i \begin{pmatrix}p_i\\1\end{pmatrix}=\mathbf 0$, which forces $v_0=v_1=\cdots=v_k=0$ by linear independence; hence $\mathbf x = \mathbf x'$ and the solution is unique. In particular, this settles your worry about nonnegativity: a nonnegative solution exists because $p\in\Delta$, and since the solution is unique, it must be that nonnegative one.
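If you want to sanity-check this numerically, here is a minimal sketch (assuming numpy is available; the triangle and the test coordinates below are arbitrary choices for illustration, not part of the argument):

```python
import numpy as np

# Vertices of a triangle (k = 2) in R^3; columns are p_0, p_1, p_2.
P = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [0.0, 0.0, 0.0]])

# The matrix A from equation (1): vertices stacked over a row of ones.
A = np.vstack([P, np.ones(3)])        # shape (4, 3), full column rank

# A point of the simplex built from known barycentric coordinates.
lam_true = np.array([0.2, 0.3, 0.5])
b = A @ lam_true                      # the right-hand side (p, 1)

# Least squares recovers the coordinates; full column rank makes
# the solution unique.
lam, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(lam, lam_true))                          # True
print(lam.min() >= -1e-12, np.isclose(lam.sum(), 1.0))     # True True
```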