Diagonalization of a symmetric matrix over an algebraically closed field

Yes. First, any quadratic form $q$ on a finite-dimensional vector space $V$ over a field $k$ of characteristic $\neq 2$ has an orthogonal basis. More precisely, you can find a basis $(e_1,\ldots,e_d)$ of $V$ such that $\varphi(e_i,e_j) = 0$ whenever $i\neq j$, where $\varphi$ is the symmetric bilinear form associated to $q$. (This association makes sense because the characteristic is $\neq 2$.)

You can prove this by induction on the dimension $d$ of $V$. If $V$ is zero-dimensional the statement is trivial. For the induction step, argue as follows: if $q = 0$, any basis ($V$ has one!) is orthogonal; if $q \neq 0$, take $e_1 \in V$ with $q(e_1) \neq 0$ (note that then $e_1 \neq 0$), and let $V'$ be the orthogonal complement, for the bilinear form $\varphi$, of the line spanned by $e_1$. Since $q(e_1) \neq 0$ we have $V = k e_1 \oplus V'$, and applying the induction hypothesis to $(V', q|_{V'})$ gives the remaining vectors $e_2,\ldots,e_d$. Note that in an orthogonal basis, the matrix of $q$ (or of $\varphi$) is already diagonal. See J.-P. Serre's A Course in Arithmetic, Chapter IV, §1.4, Definition 5 and Theorem 1 for more details.
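If it helps to see the induction as an algorithm, here is a minimal sketch in Python with sympy over $\mathbf{Q}$ (a stand-in for a general field of characteristic $\neq 2$). The function name `orthogonal_basis` and the elementary-matrix bookkeeping are mine, not Serre's; the steps mirror the proof: pick a vector with $q \neq 0$, then pass to the orthogonal complement.

```python
from sympy import Matrix, eye

def orthogonal_basis(A):
    """Given a symmetric matrix A over a field of characteristic != 2,
    return an invertible P whose columns form an orthogonal basis for
    phi(x, y) = x^T A y, i.e. P.T * A * P is diagonal."""
    d = A.rows
    P = eye(d)        # change of basis accumulated so far (columns = basis vectors)
    G = Matrix(A)     # Gram matrix of phi in the current basis

    def change_basis(E):          # replace the basis by its image under E
        nonlocal P, G
        P = P * E
        G = E.T * G * E

    for i in range(d):
        if G[i, i] == 0:
            # look for a later basis vector with q != 0 and swap it into place
            j = next((j for j in range(i + 1, d) if G[j, j] != 0), None)
            if j is not None:
                E = eye(d); E[i, i] = E[j, j] = 0; E[i, j] = E[j, i] = 1
                change_basis(E)
            else:
                # all remaining q-values vanish; if phi is nonzero on the
                # remaining vectors, e_r + e_c has q = 2*phi(e_r, e_c) != 0
                pair = next(((r, c) for r in range(i, d)
                             for c in range(r + 1, d) if G[r, c] != 0), None)
                if pair is None:
                    break         # phi vanishes on the rest: already orthogonal
                r, c = pair
                E = eye(d); E[c, r] = 1                    # e_r := e_r + e_c
                change_basis(E)
                if r != i:
                    E = eye(d); E[i, i] = E[r, r] = 0; E[i, r] = E[r, i] = 1
                    change_basis(E)
        # pass to the orthogonal complement of e_i: replace each later e_j
        # by e_j - (phi(e_i, e_j)/q(e_i)) * e_i  (this is V' in the proof)
        E = eye(d)
        for j in range(i + 1, d):
            E[i, j] = -G[i, j] / G[i, i]
        change_basis(E)
    return P
```

For instance, on $A = \begin{pmatrix}0 & 1\\1 & 0\end{pmatrix}$ over $\mathbf{Q}$ it returns $P = \begin{pmatrix}1 & -1/2\\ 1 & 1/2\end{pmatrix}$, with ${}^{t}P A P = \operatorname{diag}(2, -1/2)$.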

Now, if your field $k$ is algebraically closed, you can solve the equation $\mu^2 = q(e_i)$ in $\mu$ whenever $q(e_i) \neq 0$, and so you can turn your orthogonal basis into an orthonormal (for $q$) basis by replacing each such $e_i$ with $e_i/\mu$. If $q$ is non-degenerate (in the matrix formulation below: if $A$ is invertible), every $q(e_i)$ is nonzero and the matrix of $q$ (or of $\varphi$) in this orthonormal basis is the identity matrix; in general it is a diagonal matrix of $1$'s and $0$'s.
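Continuing the sketch above (so again with my hypothetical `orthogonal_basis`, and with $\mathbf{C}$ playing the role of the algebraically closed field, since sympy can take exact square roots there), the normalization step looks like this:

```python
from sympy import Matrix, sqrt

def normalize(A, P):
    """Assuming the columns of P are already phi-orthogonal, rescale each
    column e_i with q(e_i) != 0 by a square root of q(e_i); over an
    algebraically closed field such a root always exists.  If A is
    invertible, the result Q satisfies Q.T * A * Q = identity."""
    cols = []
    for i in range(P.cols):
        e = P[:, i]
        qi = (e.T * A * e)[0, 0]          # q(e_i) = e_i^T A e_i
        cols.append(e / sqrt(qi) if qi != 0 else e)
    return Matrix.hstack(*cols)
```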

Finally, apply this to $V = k^n$, where $k$ is your algebraically closed field and $q$ is the quadratic form defined by $q(X) = {}^{t} X A X$ for $X\in V$; its associated bilinear form is $\varphi(X,Y) = {}^{t} X A Y$ for $X,Y \in V$. If $P$ is the (invertible) matrix whose columns are the vectors of the basis produced above, then ${}^{t} P A P$ is the matrix of $\varphi$ in that basis, so it is diagonal, and even equal to the identity when $A$ is invertible.
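Putting the two sketches above together for $V = k^n$, with $k = \mathbf{C}$ standing in for an algebraically closed field of characteristic $\neq 2$:

```python
from sympy import Matrix, eye, simplify

A = Matrix([[0, 1], [1, 0]])      # symmetric and invertible over C
P = orthogonal_basis(A)           # columns: a phi-orthogonal basis of C^2
Q = normalize(A, P)               # columns: a q-orthonormal basis of C^2

assert (P.T * A * P).is_diagonal()
assert (Q.T * A * Q).applyfunc(simplify) == eye(2)
```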

What happens in characteristic $2$? In characteristic $\neq 2$, because you can invert $2$, you have a bijection between quadratic forms and symmetric bilinear forms. This is why I preferred to speak about quadratic forms while you had a symmetric matrix $A$. In characteristic $2$ you no longer have this correspondence: you can have a nonzero bilinear form $\varphi$ whose associated quadratic form $q$ is zero, so that you will never be able to find a nonzero $e_1$ with $q(e_1) = \varphi(e_1,e_1) \neq 0$ to start the induction with. In fact the existence of orthogonal bases in characteristic $2$ is completely dealt with in Bourbaki, Algèbre, Chapitre IX, § 6, no. 1, Théorème 1. In your case they do exist.
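Before the example below, here is a quick sanity check of this phenomenon (plain Python, my own throwaway code) for the very matrix the next paragraph examines, computing over $\mathbf{F}_2$ itself; the underlying identity $q(X) = 2x_1x_2 = 0$ holds over any field of characteristic $2$.

```python
from itertools import product

A = [[0, 1], [1, 0]]               # Gram matrix of phi over F_2

def phi(x, y):                     # phi(x, y) = x^T A y, computed mod 2
    return sum(x[i] * A[i][j] * y[j] for i in range(2) for j in range(2)) % 2

assert all(phi(x, x) == 0 for x in product([0, 1], repeat=2))  # q vanishes identically
assert phi((1, 0), (0, 1)) == 1                                # but phi does not
```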

Let me just look at $A = \begin{pmatrix}0 & 1\\1 & 0\end{pmatrix}$ over $k$, an algebraic closure of $\mathbf{F}_2$. Here the quadratic form is $q(X) = {}^{t} X A X = 2x_1x_2 = 0$, so, as explained above, there is no vector $e_1$ with $q(e_1) \neq 0$; worse, any basis $(e_1,e_2)$ orthogonal for $\varphi$ would have Gram matrix $\operatorname{diag}(q(e_1), q(e_2)) = 0$, contradicting $\varphi \neq 0$. So the form attached to $A$ has no orthogonal basis, i.e. $A$ is not congruent to a diagonal matrix. (It is not even similar to one: its characteristic polynomial is $T^2 - 1 = (T-1)^2$, with the double root $1$, so if $A$ were diagonalizable it would have to be the identity matrix, which it is not.)
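If you want to see this with your own eyes, here are two finite sanity checks in Python with sympy, done by brute force over $\mathbf{F}_2$ only (so they illustrate, rather than prove, the statement over the algebraic closure); the helper `mod2` is mine.

```python
from itertools import product
from sympy import Matrix, Poly, eye, symbols, zeros

A = Matrix([[0, 1], [1, 0]])

def mod2(M):
    return M.applyfunc(lambda v: v % 2)

# 1. No invertible P over F_2 makes P^T A P diagonal.
for entries in product([0, 1], repeat=4):
    P = Matrix(2, 2, list(entries))
    if P.det() % 2 == 0:
        continue                              # P is not invertible over F_2
    D = mod2(P.T * A * P)
    assert D[0, 1] != 0 or D[1, 0] != 0       # never diagonal

# 2. The characteristic polynomial of A mod 2 is (T + 1)^2 = (T - 1)^2,
#    yet A is not the identity mod 2, so A is not similar to a diagonal matrix.
T = symbols('T')
print(Poly(A.charpoly(T).as_expr(), T, modulus=2).factor_list())
assert mod2(A - eye(2)) != zeros(2, 2)
```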