Adjoint of a linear transformation on an infinite-dimensional inner product space
This is true for (edit: bounded operators on) Hilbert spaces thanks to the Riesz representation theorem. It is false in general: let $e_1, e_2, ...$ be a sequence of orthogonal unit vectors in some infinite-dimensional Hilbert space and let $V$ be their span. Then the linear transformation
$$T(e_i) = e_1 + e_2 + ... + e_i$$
does not have an adjoint: $\langle T(e_i), e_j \rangle = 1$ whenever $j \le i$, but for any linear operator $T^{\ast} : V \to V$ and fixed $j$, the vector $T^{\ast}(e_j)$ is a finite linear combination of the $e_i$, so $\langle e_i, T^{\ast}(e_j) \rangle$ must be $0$ for all sufficiently large $i$. This example can be modified so that $T$ is bounded.
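If it helps to see the bookkeeping, here is a minimal Python sketch of a finite truncation of this example (the truncation size `N` and all the names are just illustrative, not part of the argument):

```python
import numpy as np

# Truncate to span(e_1, ..., e_N): the matrix of T (columns are the images of
# the basis vectors) is upper triangular with 1s, since T(e_i) = e_1 + ... + e_i.
N = 6
T = np.triu(np.ones((N, N)))

# <T(e_i), e_j> is the (j, i) entry of this matrix: 1 whenever j <= i.
i, j = 4, 2                      # 0-based indices with j <= i
assert T[j, i] == 1.0

# A candidate for T*(e_j) would need <e_i, T*(e_j)> = 1 for every i >= j,
# i.e. column j of the transpose, which has N - j nonzero entries here.
# That count grows with N, so in the full span V the candidate would need
# infinitely many nonzero coordinates -- not a finite linear combination.
candidate = T.T[:, j]
print(int(candidate.sum()), "nonzero entries; this count grows with N")
```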
If you are not familiar with Hilbert space theory, beware that the definition of "orthonormal basis" is different: it does not refer to a Hamel basis (which is what the word "basis" ordinarily means) but to a collection of orthogonal unit vectors such that only the zero vector is orthogonal to all of them. Equivalently, it is a collection of orthogonal unit vectors whose span is dense (not necessarily the whole space).
Let's look at an example. Take the vector space $\mathbb{R}[x]$ of all real polynomials in one variable and define an inner product as
$$\left( \sum_{j=0}^n a_jx^j, \sum_{k=0}^m b_kx^k \right)=\sum_{h=0}^{\min(n,m)}a_hb_h.$$
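Concretely, on coefficient lists this inner product is just a termwise product summed up to the shorter length; here is a minimal sketch (the function and variable names are only illustrative):

```python
def inner(p, q):
    """Inner product of two polynomials given as coefficient lists [a_0, a_1, ...]:
    sum of products of matching coefficients (zip stops at the shorter list)."""
    return sum(a * b for a, b in zip(p, q))

p = [1, 2, 3]        # 1 + 2x + 3x^2
q = [4, 5]           # 4 + 5x
print(inner(p, q))   # 1*4 + 2*5 = 14
```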
Now let $T$ be the linear operator such that
$$T\left(\sum_{j=0}^n a_jx^j\right)=\left(\sum_{j=0}^n a_j\right)+\left(\sum_{j=1}^na_j\right)x+\ldots + (a_{n-1}+a_n)x^{n-1}+a_nx^n.$$
Think of $T$ as the operator represented by the infinite matrix below:
$$\begin{bmatrix} 1 & 1 & 1 & \ldots \\ 0 & 1 & 1 & \ldots \\ 0 & 0 & 1 & \ldots \\ \vdots & \vdots & \vdots & \ddots \\ \end{bmatrix}$$
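On coefficient lists, $T$ simply replaces each coefficient by the suffix sum of the coefficients from that degree up. Here is a small sketch (again, the names are illustrative) that also checks this against a truncation of the matrix above:

```python
import numpy as np

def T(coeffs):
    """Apply T to a coefficient list: the new k-th coefficient is a_k + a_{k+1} + ... + a_n."""
    return [sum(coeffs[k:]) for k in range(len(coeffs))]

a = [1, 2, 3, 4]               # 1 + 2x + 3x^2 + 4x^3
print(T(a))                    # [10, 9, 7, 4]

# Same computation via the truncated upper-triangular all-ones matrix.
n = len(a)
M = np.triu(np.ones((n, n)))
print(M @ np.array(a))         # [10. 9. 7. 4.]
```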
If $T$ had an adjoint with respect to the inner product $(\cdot,\cdot)$, it would have to be associated with the transposed matrix:
$$\begin{bmatrix} 1 & 0 & 0 & \ldots \\ 1 & 1 & 0 & \ldots \\ 1 & 1 & 1 & \ldots \\ \vdots & \vdots & \vdots & \ddots \\ \end{bmatrix}$$
but this makes no sense in $\mathbb{R}[x]$. Formally, suppose such an adjoint operator $T^\star$ exists. Fix $k \in \mathbb{N}$: what is $T^\star x^k$? For all $n=0, 1,\ldots$, we should have
$$(x^n, T^\star x^k)=(T x^n, x^k)=(1+x+\ldots+ x^n, x^k)=\begin{cases}1 & n \ge k \\ 0 & n <k \end{cases};$$
this means that the coefficient of $x^n$ in $T^\star x^k$ would equal $1$ for every $n \ge k$, so $T^\star x^k$ would have to have degree $\ge n$ for every $n\in \mathbb{N}$. No polynomial does that, so $T$ does not have an adjoint.
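Here is a quick self-contained check of the displayed identity (the helper names are hypothetical):

```python
def inner(p, q):
    return sum(a * b for a, b in zip(p, q))

def T(coeffs):
    return [sum(coeffs[k:]) for k in range(len(coeffs))]

def x_power(n, length):
    """Coefficient list of x^n, zero-padded to the given length."""
    return [1 if j == n else 0 for j in range(length)]

k = 2
for n in range(6):
    length = max(n, k) + 1
    print(n, inner(T(x_power(n, length)), x_power(k, length)))
    # prints 0 for n < k and 1 for every n >= k -- so T* x^k would need
    # a nonzero coefficient in every degree n >= k.
```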
The reason for this is that the mapping
$$P \in \mathbb{R}[x] \mapsto (P, \cdot) \in \mathbb{R}[x]^\star$$
(here $\mathbb{R}[x]^\star$ is the algebraic dual space of $\mathbb{R}[x]$) is not an isomorphism, because it is not surjective. In fact it can be shown that $\mathbb{R}[x]^\star$ can be represented as $\mathbb{R}[[x]]$, the space of formal power series with real coefficients.
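To see the non-surjectivity concretely: a linear functional on $\mathbb{R}[x]$ is determined by its values on $1, x, x^2, \ldots$, and that sequence of values is arbitrary, i.e. a formal power series. Below is a minimal sketch (illustrative names) of a functional whose "series" is $1 + x + x^2 + \cdots$ and which therefore is not of the form $(P, \cdot)$ for any polynomial $P$:

```python
def phi_sum(coeffs):
    """The functional sending a polynomial (coefficient list) to the sum of its coefficients."""
    return sum(coeffs)

def monomial(n):
    """Coefficient list of x^n."""
    return [0] * n + [1]

# The formal power series representing phi_sum is given by its values on 1, x, x^2, ...
print([phi_sum(monomial(n)) for n in range(8)])   # [1, 1, 1, 1, 1, 1, 1, 1] -- never terminates
```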
It is possible to define adjoints on infinite-dimensional inner product spaces, but things get more complicated. Bounded linear operators on Hilbert spaces always have well-behaved adjoints, but for unbounded operators the domain of the adjoint may differ from the domain of the operator, and may in fact be just the zero subspace. For the nice case, see this. For the uglier cases, see this, especially the examples beginning on p. 15.