Some trigonometric identities
I accidentally found the following: $$\sin\frac{2\pi}{7}+\sin\frac{4\pi}{7}-\sin\frac{6\pi}{7}=\frac{\sqrt{7}}{2}$$ $$\sin\frac{2\pi}{11}-\sin\frac{4\pi}{11}+\sin\frac{6\pi}{11}+\sin\frac{8\pi}{11}+\sin\frac{10\pi}{11}=\frac{\sqrt{11}}{2}$$ $$\sin\frac{2\pi}{19}-\sin\frac{4\pi}{19}-\sin\frac{6\pi}{19}+\sin\frac{8\pi}{19}+\sin\frac{10\pi}{19}+\sin\frac{12\pi}{19}+\sin\frac{14\pi}{19}-\sin\frac{16\pi}{19}+\sin\frac{18\pi}{19}=\frac{\sqrt{19}}{2}$$
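Here is a quick numerical check (a small Python sketch; the sign lists simply transcribe the three sums above), and all three agree with $\frac{\sqrt p}{2}$ to machine precision:

```python
import math

# Numerical check of the three sums above against sqrt(p)/2.
checks = {
    7:  [1, 1, -1],
    11: [1, -1, 1, 1, 1],
    19: [1, -1, -1, 1, 1, 1, 1, -1, 1],
}
for p, signs in checks.items():
    lhs = sum(s * math.sin(2 * math.pi * k / p)
              for k, s in enumerate(signs, start=1))
    print(p, lhs, math.sqrt(p) / 2)  # the two values agree to ~1e-15
```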
I wonder if there is a general rule: for a prime $p \equiv 3 \pmod 4$ and a suitable choice of $f(k)\in\{1,2\}$, does the following identity hold? $$\sum_{k=1}^{\frac{p-1}{2}}(-1)^{f(k)}\sin\frac{2k\pi}{p}=\frac{\sqrt{p}}{2}$$
Any idea?
Very good observation. The proof is somewhat long, so here I present only a sketch. For full details, see Chapter 6 of Ireland and Rosen, "A classical introduction to modern number theory".
Let $p$ be an odd prime, and set $\zeta=e^{2\pi i/p}$, so the powers of $\zeta$ are the $p$-th roots of unity. The relevance here lies in the fact that $\sin(2 \pi k/p)$ is the imaginary part of $\zeta^k$.
The Gauss sum $g_a$ is defined by $$ g_a =\sum_{k=1}^{p-1}\left(\frac{k}{p}\right)\zeta^{ak}, $$ where $\displaystyle \left(\frac{k}p\right)$ is the Legendre symbol: $0$ if $p\mid k$, $1$ if $k$ is a square modulo $p$, and $-1$ otherwise.
You can check that $g_a=0$ if $p\mid a$, and $g_a=\displaystyle\left(\frac{a}p\right)g_1$ otherwise. Anyway, you are interested in $g_1$ (or rather, its imaginary part).
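If you want to experiment, here is a small numerical sketch checking both facts about $g_a$ (the helper names `legendre` and `gauss_sum` are ad hoc choices of mine, and the same helpers are reused in the snippets below):

```python
import cmath

def legendre(k, p):
    # Legendre symbol via Euler's criterion: k^((p-1)/2) mod p is 0, 1, or p-1.
    r = pow(k % p, (p - 1) // 2, p)
    return -1 if r == p - 1 else r

def gauss_sum(a, p):
    # g_a = sum_{k=1}^{p-1} (k/p) * zeta^{a k}, with zeta = exp(2*pi*i/p).
    zeta = cmath.exp(2j * cmath.pi / p)
    return sum(legendre(k, p) * zeta ** (a * k) for k in range(1, p))

p = 11
g1 = gauss_sum(1, p)
for a in range(2 * p):
    # g_a = 0 when p | a, and g_a = (a/p) * g_1 otherwise.
    assert abs(gauss_sum(a, p) - legendre(a, p) * g1) < 1e-9
print("checked for p =", p)
```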
One starts by verifying that $g_1^2=(-1)^{(p-1)/2}p$. This is not too bad. The standard way of checking this is to evaluate $\sum_{k=1}^{p-1}g_kg_{-k}$ in two different ways.
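This, too, is easy to test numerically (reusing `gauss_sum` from the sketch above; a sanity check, of course, not a proof):

```python
# Check g_1^2 = (-1)^((p-1)/2) * p for a few odd primes.
for p in [3, 5, 7, 11, 13, 19, 23, 31]:
    g1 = gauss_sum(1, p)
    assert abs(g1 ** 2 - (-1) ** ((p - 1) // 2) * p) < 1e-6
    print(p, g1 ** 2)  # prints approximately +p or -p
```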
If $p\equiv 3\pmod 4$, this means that $g_1=\pm i\sqrt p$. Ignoring the sign, the result follows: since $-1$ is not a square modulo $p$, we have $\left(\frac{p-k}p\right)=-\left(\frac kp\right)$, while $\sin\left(\frac{2\pi (p-k)}p\right)=-\sin\left(\frac{2\pi k}p\right)$, so $$ \left(\frac kp\right)\sin\left(\frac{2\pi k}p\right)=\left(\frac {p-k}p\right)\sin\left(\frac{2\pi (p-k)}p\right) $$ for all $k$, and the first $(p-1)/2$ values of $k$ contribute to the imaginary part of $g_1$ exactly as much as the other half.
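In other words, the signs in your identities are exactly the Legendre symbols: one may take $(-1)^{f(k)}=\left(\frac kp\right)$. Numerically (reusing the ad hoc `legendre` helper above):

```python
import math

# Half-sum from the question: sum_{k=1}^{(p-1)/2} (k/p) * sin(2*pi*k/p) vs sqrt(p)/2.
for p in [7, 11, 19, 23, 31, 43]:  # primes congruent to 3 mod 4
    half = sum(legendre(k, p) * math.sin(2 * math.pi * k / p)
               for k in range(1, (p - 1) // 2 + 1))
    print(p, half, math.sqrt(p) / 2)  # the last two columns agree
```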
The difficulty lies in verifying that the sign is $+1$ rather than $-1$. This actually gave Gauss a lot of work (he found $g_1^2$ in 1801, and $g_1$ in 1805). I do not know of any quick argument, but the proof in Section 6.4 of the Ireland-Rosen book is rather elegant. Another book that presents a nice proof is Nathanson, "Elementary methods in number theory". For what it's worth, I think that the notation in Ireland-Rosen is somewhat easier to follow if you have no prior experience with Gauss sums. (If you have experience with contour integration, yet another proof is described in the exercises of Berenstein-Gay, "Complex variables: An introduction".)
Using the value of $g_1^2$, Gauss found a very clever (Fourier-theoretic, we could say) proof of quadratic reciprocity. For completeness, let me add that for $p\equiv 1\pmod4$, we have $g_1=\sqrt p$.
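A quick numerical spot check of both the value and the sign (again with the `gauss_sum` helper from the earlier sketch) is consistent with this:

```python
import math

# g_1 = sqrt(p) for p = 1 (mod 4), and g_1 = i*sqrt(p) for p = 3 (mod 4), with a plus sign.
for p in [5, 13, 17, 29, 37, 7, 11, 19, 23]:
    target = math.sqrt(p) if p % 4 == 1 else 1j * math.sqrt(p)
    assert abs(gauss_sum(1, p) - target) < 1e-9
print("sign is + in every case tested")
```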
Noam Elkies has posted on MO a very clever argument (which goes back to Schur) for evaluating $g_1$ using linear algebra.