Linear Model and Posterior Distribution
Consider the following linear model:
$$ \mathbf{y} = A \mathbf{x} + \sigma\mathbf{z}, $$ where $\mathbf{x} \in \{0,1\}^n$ is an unknown signal, to be recovered; $A \in \mathbb{R}^{m \times n}$ is a (known) linear measurement matrix; $\sigma > 0$; and $\mathbf{z} \in \mathbb{R}^{m}$ is i.i.d. Gaussian noise: $z_1,\dots,z_m \overset{\text{i.i.d.}}{\sim} \mathcal{N}(0,1)$.
Assume a sparse binary prior for $\mathbf{x}$. Specifically, let $k$ be the expected sparsity and denote $\rho = \frac{k}{n}$. The coordinates of $\mathbf{x}$ are assumed i.i.d. Bernoulli random variables: $$ x_1,\dots,x_n \overset{\text{i.i.d.}}{\sim} \text{Bernoulli}(\rho). $$
I want to determine the posterior distribution
$$ \mathbb{P}(\mathbf{x} \mid \mathbf{y}) . $$ My attempt was to use Bayes theorem, i.e., $$ \mathbb{P}(\mathbf{x} \mid \mathbf{y}) = \frac{\mathbb{P}(\mathbf{y} \mid \mathbf{x}) \cdot \mathbb{P}(\mathbf{x})}{\mathbb{P}(\mathbf{y})}. $$ Now $\mathbb{P}(\mathbf{x})$ is given by
$$ \prod_{i=1}^n \rho^{x_i}(1 - \rho)^{1-x_i}. $$ I'm having trouble determining $\mathbb{P}(\mathbf{y} \mid \mathbf{x})$ and $\mathbb{P}(\mathbf{y})$.
I would be very grateful for any help.
If $\bf x$ and $\bf z$ are independent, then ${\bf z}\big\vert({\bf x}=x)\ \overset{d}{=}\ {\bf z}\sim N(0,\mathbb{I}_m)$, so $\sigma{\bf z}\big\vert({\bf x}=x)\sim N(0,\sigma^2\mathbb{I}_m)$ and $${\bf y}\big\vert({\bf x}=x)\ \sim\ N(Ax,\sigma^2\mathbb{I}_m).$$
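Writing out the multivariate Gaussian density, this gives the likelihood in closed form:
$$ \mathbb{P}(\mathbf{y}\mid\mathbf{x}) = f_{{\bf y}|{\bf x}}(y\mid x) = (2\pi\sigma^2)^{-m/2}\exp\!\left(-\frac{\lVert y - Ax\rVert_2^2}{2\sigma^2}\right). $$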
To obtain the distribution of ${\bf y}$, note that its density is the mixture
$$f_{\bf y}(y)=\sum_{x\in \{0,1\}^n} f_{{\bf y}|{\bf x}}(y|x)P_{\bf x}(x).$$
Typically you don't need this last density in Bayesian analysis, since it is just a normalizing constant (it does not depend on $\bf x$), but I am not sure what named family of distributions the posterior ${\bf x}\mid{\bf y}$ belongs to.
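For small $n$ you can sidestep the question of the posterior's family entirely and compute $\mathbb{P}(\mathbf{x}\mid\mathbf{y})$ exactly by enumerating all $2^n$ binary vectors. A minimal sketch in Python (the sizes, seed, and noise level below are arbitrary placeholders):

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

# Toy problem sizes (hypothetical): n small enough to enumerate {0,1}^n.
n, m, rho, sigma = 4, 6, 0.25, 0.5
A = rng.normal(size=(m, n))
x_true = (rng.random(n) < rho).astype(float)   # Bernoulli(rho) signal
y = A @ x_true + sigma * rng.normal(size=m)    # y = Ax + sigma*z

def log_likelihood(y, A, x, sigma):
    # log f(y | x) for y | x ~ N(Ax, sigma^2 I_m)
    r = y - A @ x
    return -0.5 * m * np.log(2 * np.pi * sigma**2) - r @ r / (2 * sigma**2)

def log_prior(x, rho):
    # log P(x) for i.i.d. Bernoulli(rho) coordinates
    return np.sum(x * np.log(rho) + (1 - x) * np.log(1 - rho))

# Unnormalized log posterior over every x in {0,1}^n, then normalize.
supp = [np.array(x, dtype=float) for x in itertools.product([0, 1], repeat=n)]
log_post = np.array([log_likelihood(y, A, x, sigma) + log_prior(x, rho)
                     for x in supp])
log_post -= log_post.max()   # stabilize before exponentiating
post = np.exp(log_post)
post /= post.sum()           # the divisor here is P(y), up to the exp(max) factor above
```

Dividing by the total sum plays the role of $\mathbb{P}(\mathbf{y})$: the mixture sum over all $x$ is exactly the evidence, which is why it never needs a closed form. For large $n$ this enumeration is infeasible, which is where MCMC or variational approximations come in.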