Sxx in linear regression
Solution 1:
$S_{xx}$ is the sum of the squares of the differences between each $x$ value and the mean of the $x$ values.
$S_{xy}$ is the sum, over all data pairs, of the product of the difference between each $x$ and its mean and the difference between the corresponding $y$ and its mean.
So $S_{xx}=\Sigma(x-\overline{x})(x-\overline{x})$ and $S_{xy}=\Sigma(x-\overline{x})(y-\overline{y})$. Both of these are often rearranged into equivalent forms when shown in textbooks.
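These deviation-form definitions translate directly into code. A minimal sketch in Python, using a small made-up data set for illustration:

```python
# Toy data (made up for illustration only).
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.0, 4.0, 5.0, 4.0, 5.0]

n = len(xs)
x_bar = sum(xs) / n
y_bar = sum(ys) / n

# S_xx: sum of squared deviations of x from its mean.
S_xx = sum((x - x_bar) ** 2 for x in xs)
# S_xy: sum of products of the paired deviations.
S_xy = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))

print(S_xx, S_xy)  # 10.0 6.0 for this data set
```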
Solution 2:
To add to the answer from Ian Miller,
$$S_{xx}=\sum x^2 -\frac{(\sum x)^2}{n}=\sum x^2 -n\bar{x}^2$$
Intuitively, $S_{xy}$ is what you get when you replace one of the $x$'s in each term with a $y$.
$$S_{xy}=\sum xy -\frac{\sum x \sum y}{n}=\sum xy -n\bar{x}\bar{y}$$
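A quick numerical check that the shortcut forms above agree with the deviation-form definitions, on a small made-up data set:

```python
# Toy data (made up for illustration only).
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.0, 4.0, 5.0, 4.0, 5.0]

n = len(xs)
x_bar = sum(xs) / n
y_bar = sum(ys) / n

# Deviation (definition) forms.
S_xx_dev = sum((x - x_bar) ** 2 for x in xs)
S_xy_dev = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))

# Shortcut forms: sum(x^2) - n*xbar^2 and sum(xy) - n*xbar*ybar.
S_xx_short = sum(x * x for x in xs) - n * x_bar ** 2
S_xy_short = sum(x * y for x, y in zip(xs, ys)) - n * x_bar * y_bar

# The two forms are algebraically identical.
assert abs(S_xx_dev - S_xx_short) < 1e-9
assert abs(S_xy_dev - S_xy_short) < 1e-9
```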
Also, for your information, the good thing about this notation is that it simplifies other parts of linear regression.
For example, the product-moment correlation coefficient:
$$r=\frac{\sum xy -n\bar{x}\bar{y}}{\sqrt{(\sum x^2 -n\bar{x}^2)(\sum y^2 - n\bar{y}^2)}} = \frac{S_{xy}}{\sqrt{S_{xx}S_{yy}}}$$
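The $S$-notation form of $r$ is a one-liner once the three sums are computed. A sketch on the same kind of made-up toy data:

```python
import math

# Toy data (made up for illustration only).
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.0, 4.0, 5.0, 4.0, 5.0]

n = len(xs)
x_bar, y_bar = sum(xs) / n, sum(ys) / n

S_xx = sum((x - x_bar) ** 2 for x in xs)
S_yy = sum((y - y_bar) ** 2 for y in ys)
S_xy = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))

# Product-moment correlation coefficient: r = S_xy / sqrt(S_xx * S_yy).
r = S_xy / math.sqrt(S_xx * S_yy)
print(r)
```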
or to find the gradient of the best-fit line $y=a+bx$:
$$y-\bar{y}=b(x-\bar{x}), \text{ where } b=\frac{S_{xy}}{S_{xx}}$$
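Since the least-squares line passes through $(\bar{x},\bar{y})$, the intercept follows from the gradient as $a=\bar{y}-b\bar{x}$. A sketch, again on made-up data:

```python
# Toy data (made up for illustration only).
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.0, 4.0, 5.0, 4.0, 5.0]

n = len(xs)
x_bar, y_bar = sum(xs) / n, sum(ys) / n

S_xx = sum((x - x_bar) ** 2 for x in xs)
S_xy = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))

# Gradient of the best-fit line: b = S_xy / S_xx.
b = S_xy / S_xx
# The line passes through (x_bar, y_bar), so a = y_bar - b * x_bar.
a = y_bar - b * x_bar
print(f"y = {a} + {b}x")
```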
The pragmatic importance of this is that if you are doing a long question about linear regression, calculating $S_{xx}$, $S_{yy}$ and $S_{xy}$ at the beginning can save you a lot of work.