Solution 1:

You're on the right track. Let's say $\beta = \{ v_1, \dots , v_n \}$ is a basis for $V$. Define $f_j: V \rightarrow \mathbb{F}$ by $f_j(v_i) = \delta_{ij}$, extended linearly, and set $\beta^* = \{ f_1, \dots , f_n \}$. Now, show that $\beta^*$ forms a basis for $V^*$ and you're done, since that shows $V$ and $V^*$ have the same dimension, which shows $V$ and $V^*$ are isomorphic. So, what do you need to show $\beta^*$ is a basis?

  1. linear independence (LI)
  2. spanning

I'll show you both. For LI, consider scalars $c_1, \dots , c_n \in \mathbb{F}$ for which $$ c_1f_1+ \cdots + c_nf_n = 0.$$ This is an equality of $\mathbb{F}$-valued functions on $V$, so in particular we can evaluate both sides at $v_i$ to find $$ c_1f_1(v_i)+ \cdots +c_jf_j(v_i)+ \cdots + c_nf_n(v_i) = 0(v_i) = 0.$$ Since $f_j(v_i) = \delta_{ij}$, only the $j=i$ term survives, which yields $c_i=0$. As $i$ was arbitrary, $c_1 = 0, \dots , c_n=0$, hence $\beta^*$ is LI.

To prove spanning, consider a linear map $\alpha: V \rightarrow \mathbb{F}$ (that is, $\alpha \in V^*$) and $v \in V$ with $v = x_1v_1+ \cdots + x_nv_n$. By linearity, $$ \alpha(v) = \alpha(x_1v_1+ \cdots + x_nv_n) = x_1 \alpha(v_1) + \cdots + x_n \alpha(v_n).$$ Now, here's the neat thing: $x_j = f_j(v)$ (prove it for yourself if you doubt me), thus $$ \alpha(v) = \alpha(v_1)f_1(v) + \cdots + \alpha(v_n)f_n(v) = \left( \alpha(v_1)f_1 + \cdots + \alpha(v_n)f_n \right)(v).$$ Since this holds for every $v \in V$, we have $\alpha = \alpha(v_1)f_1 + \cdots + \alpha(v_n)f_n$, so $\alpha \in \text{span}(\beta^*)$. This completes the proof that $\beta^*$ is a basis for $V^*$.
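If the abstract construction feels slippery, here is a tiny concrete check, just as an illustration, using $V = \mathbb{R}^2$ with the standard basis $v_1 = (1,0)$, $v_2 = (0,1)$. The dual basis consists of the coordinate functions: $$ f_1(x,y) = x, \qquad f_2(x,y) = y.$$ Any linear $\alpha: \mathbb{R}^2 \rightarrow \mathbb{R}$ has the form $\alpha(x,y) = ax+by$ with $a = \alpha(v_1)$ and $b = \alpha(v_2)$, so indeed $\alpha = \alpha(v_1)f_1 + \alpha(v_2)f_2$, exactly as the spanning argument above predicts.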

Let me just add a bit more on this construction. Notice that our construction depends on the choice of $\beta$. Fortunately, we can show the number of vectors in a basis is independent of that choice (that is, dimension is well-defined), so the choice is harmless. That said, if we could construct an isomorphism which did not rely on such a choice, then that isomorphism would be a canonical isomorphism.

In finite dimensions, $V$ and $V^{**}$ are canonically isomorphic: define $\tilde{v} \in V^{**}$ by the rule $\tilde{v}(\alpha) = \alpha(v)$ for all $\alpha \in V^*$; the isomorphism is $v \mapsto \tilde{v}$, and it requires no choice of basis. In contrast, for $V$ and $V^*$ the map $G$ given by $G(v_i)=f_i$, extended linearly, is an isomorphism, but this construction depends on our choice of $\beta$. However, if we have an inner product $g: V \times V \rightarrow \mathbb{F}$, then $g$ allows the construction of an isomorphism of $V$ and $V^*$ that is independent of any basis choice: $\Psi(v)(w) = g(v,w)$ for all $v,w \in V$ defines $\Psi(v) \in V^*$, and you can check $\Psi: V \rightarrow V^*$ is an isomorphism.
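As a quick illustration, take $V = \mathbb{R}^n$ with the usual dot product $g(v,w) = v \cdot w$. Then $\Psi(v)$ is simply the functional "dot with $v$": $$ \Psi(v)(w) = v \cdot w.$$ If $v = a_1v_1 + \cdots + a_nv_n$ in the standard basis, you can check that $\Psi(v) = a_1 f_1 + \cdots + a_n f_n$, yet nowhere in the definition of $\Psi$ did we have to pick a basis; the inner product itself supplies the identification of $V$ with $V^*$.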