Probability distributions and their related distributions
Solution 1:
The most complete reference I know is this paper by Leemis and McQueston ("Univariate Distribution Relationships," The American Statistician 62(1) 2008, pp. 45-53). In particular, see page 47 for a nice diagram of many of the relationships.
Added: The diagram is given below.
Solution 2:
"a nice summary or something like that about all the distributions and their related distributions. "
is a very broad request and Mike Spivey has already pointed out a very nice summary to you and to the readership of math.SE. But, following the spirit of your example, consider independent Gaussian random variables $X$ and $Y$. Then, not only is the sum $X+Y$ a Gaussian random variable (so far very similar to your example of Poisson random variables) but the conditional distribution of $X$ (or of $Y$ for that matter) given $X + Y = \alpha$ is also a Gaussian distribution. Since you are taking what might be your first undergraduate course in probability, working out the details of this assertion might give you helpful drill in working with probability distributions. You will need the following facts.
$X$ and $X+Y$ have a bivariate Gaussian density (alternatively, are jointly Gaussian random variables) whose $5$ parameters $\mu_X$, $\mu_{X+Y}$, $\text{var}(X)$, $\text{var}(X+Y)$, and $\rho_{X,X+Y}$ can all be found without writing down the joint density $f_{X,X+Y}$ and integrating.
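(If you want to check your work after attempting this yourself: writing $X \sim N(\mu_X, \sigma_X^2)$ and $Y \sim N(\mu_Y, \sigma_Y^2)$ with $X$ and $Y$ independent, the five parameters follow from linearity of expectation and bilinearity of covariance alone, with no integration:)

$$\begin{align}
\mu_{X+Y} &= \mu_X + \mu_Y, \\
\operatorname{var}(X+Y) &= \operatorname{var}(X) + \operatorname{var}(Y) \quad \text{(by independence)}, \\
\operatorname{cov}(X, X+Y) &= \operatorname{cov}(X,X) + \operatorname{cov}(X,Y) = \operatorname{var}(X), \\
\rho_{X,X+Y} &= \frac{\operatorname{cov}(X, X+Y)}{\sigma_X \, \sigma_{X+Y}} = \frac{\sigma_X}{\sqrt{\sigma_X^2 + \sigma_Y^2}}.
\end{align}$$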
If $W$ and $Z$ are jointly Gaussian random variables, then the conditional density of $W$ given $Z = \alpha$ is a Gaussian density whose mean and variance can be expressed in terms of $\alpha$ and the $5$ parameters of $f_{W,Z}$.
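If you want a sanity check on the final answer, here is a small simulation sketch (the specific parameter values $\mu_X = 1$, $\sigma_X = 2$, $\mu_Y = -1$, $\sigma_Y = 3$, $\alpha = 2.5$ are just illustrative choices, not from the problem): it computes the conditional mean and variance from the standard bivariate-normal formula with $W = X$ and $Z = X+Y$, then compares against samples of $X$ kept only when $X+Y$ lands near $\alpha$.

```python
import numpy as np

# Illustrative parameters: X ~ N(1, 2^2), Y ~ N(-1, 3^2), independent.
mu_x, sigma_x = 1.0, 2.0
mu_y, sigma_y = -1.0, 3.0

# Parameters of Z = X + Y and the correlation of (X, Z).
mu_z = mu_x + mu_y
var_z = sigma_x**2 + sigma_y**2          # var(X+Y), by independence
rho = sigma_x / np.sqrt(var_z)           # corr(X, X+Y) = cov(X,Z)/(sigma_x*sigma_z)

# Standard bivariate-normal conditioning formula for X given Z = alpha:
#   E[X | Z=a]   = mu_x + rho*(sigma_x/sigma_z)*(a - mu_z)
#   var(X | Z=a) = sigma_x^2 * (1 - rho^2)
alpha = 2.5
cond_mean = mu_x + rho * (sigma_x / np.sqrt(var_z)) * (alpha - mu_z)
cond_var = sigma_x**2 * (1.0 - rho**2)

# Monte Carlo check: keep samples whose sum falls in a thin slab around alpha.
rng = np.random.default_rng(0)
x = rng.normal(mu_x, sigma_x, 2_000_000)
y = rng.normal(mu_y, sigma_y, 2_000_000)
near = np.abs((x + y) - alpha) < 0.05

print("theory:", cond_mean, cond_var)
print("sample:", x[near].mean(), x[near].var())
```

The conditioned samples should match the formula to within Monte Carlo error, which is the whole point of the assertion: slicing a bivariate Gaussian along $Z = \alpha$ leaves another Gaussian.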