You're definitely on the right track. Here's a formal way to handle this.

First, recall the definition of "interpretation of a term". Given a term $t$, we define $g(t) \in M$ recursively: if $x$ is a variable, then $g(x)$ is simply the value the assignment $g$ gives to $x$ [so we overload $g$, extending the assignment on variables to all terms], and $g(f(t_1, \ldots, t_n)) = f^M(g(t_1), \ldots, g(t_n))$ for $f$ an $n$-ary function symbol and $t_1, \ldots, t_n$ terms.

Lemma 0: Suppose that for all variables $x$ occurring in a term $t$, $g(x) = g'(x)$. Then $g(t) = g'(t)$.

Proof: proceed by structural induction on $t$: the base case (a variable) is exactly the hypothesis, and the inductive step is immediate since $f^M$ is applied to equal arguments.
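If it helps to see Lemma 0 concretely, here is a small sketch with terms encoded as nested tuples; `eval_term`, `variables`, and the encoding are my own, not notation from the proof. A term is a variable name (a string) or a tuple $(f, t_1, \ldots, t_n)$ for an $n$-ary function symbol $f$.

```python
def eval_term(t, funcs, g):
    if isinstance(t, str):          # base case: t is a variable x
        return g[t]
    f, *args = t                    # inductive case: t = f(t_1, ..., t_n)
    return funcs[f](*(eval_term(s, funcs, g) for s in args))

def variables(t):
    """The set of variables occurring in t."""
    if isinstance(t, str):
        return {t}
    return set().union(*(variables(s) for s in t[1:]))

# g and g2 agree on every variable occurring in t but differ at z;
# as Lemma 0 predicts, they interpret t identically.
funcs = {"+": lambda a, b: a + b, "s": lambda a: a + 1}
t = ("+", "x", ("s", "y"))          # the term x + s(y)
g  = {"x": 3, "y": 4, "z": 99}
g2 = {"x": 3, "y": 4, "z": -1}
assert variables(t) == {"x", "y"}
assert eval_term(t, funcs, g) == eval_term(t, funcs, g2) == 8
```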

We then prove

Lemma 1: Suppose $g(t_i) = g'(t_i)$ for $i = 1, 2$. Then $M, g \models t_1 = t_2$ if and only if $M, g' \models t_1 = t_2$.

and

Lemma 2: Suppose that $R$ is an $n$-ary relation symbol and that $t_1, \ldots, t_n$ are terms. Suppose for all $i$, $g(t_i) = g'(t_i)$. Then $M, g \models R(t_1, \ldots, t_n)$ iff $M, g' \models R(t_1, \ldots, t_n)$.

We then prove

Lemma 3: For all $\star \in \{\top, \bot\}$, $M, g \models \star$ iff $M, g' \models \star$.

Lemma 4: Suppose that for $i = 1, 2$, we have $M, g \models \phi_i$ iff $M, g' \models \phi_i$. Then for all $\star \in \{\land, \lor, \to\}$, we have $M, g \models \phi_1 \star \phi_2$ iff $M, g' \models \phi_1 \star \phi_2$.

using the definition of $\models$.

Noting that $\neg \phi$ is just an abbreviation for $\phi \to \bot$, we get that

Lemma 5: Suppose $M, g \models \phi$ iff $M, g' \models \phi$. Then $M, g \models \neg \phi$ iff $M, g' \models \neg \phi$.

Now, we come to the subtler bit. Here, $g[x \mapsto y]$ is the function which sends $x$ to $y$ but otherwise agrees with $g$.
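For finite assignments stored as dicts, $g[x \mapsto y]$ can be realized as a non-destructive update (the helper name `update` is my own):

```python
def update(g, x, y):
    h = dict(g)        # copy, so the original assignment g is unchanged
    h[x] = y           # send x to y; everywhere else h agrees with g
    return h

g = {"x": 1, "y": 2}
h = update(g, "x", 7)
assert h == {"x": 7, "y": 2}
assert g == {"x": 1, "y": 2}   # g itself is untouched
```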

Lemma 6: Suppose that for all $y \in M$, we have $M, g[x \mapsto y] \models \phi$ iff $M, g'[x \mapsto y] \models \phi$. Then for $\star \in \{\exists, \forall\}$, we have $M, g \models \star x \phi$ iff $M, g' \models \star x \phi$.

You've already completed these pieces. To put them all together, we now state the following theorem:

Theorem: Let $\phi$ be a formula, and let $g, g'$ be partial variable assignments. Suppose that for all $x$ occurring free in $\phi$, $g(x) = g'(x)$. Then $M, g \models \phi$ iff $M, g' \models \phi$.

Proof: we proceed by structural induction on $\phi$.

  1. $\phi$ is of the form $t_1 = t_2$. In this case, for $i = 1, 2$, every variable $x$ occurring in $t_i$ occurs free in $\phi$, so $g(x) = g'(x)$. Then $g(t_i) = g'(t_i)$ by Lemma 0. By Lemma 1, we have $M, g \models \phi$ iff $M, g' \models \phi$.
  2. $\phi$ is of the form $R(t_1, \ldots, t_n)$. For all $i$, we see that for all variables $x$ occurring in $t_i$, $g(x) = g'(x)$. Therefore, $g(t_i) = g'(t_i)$ by Lemma 0 for all $i$. Therefore, $M, g \models \phi$ iff $M, g' \models \phi$ by Lemma 2.
  3. $\phi$ is of the form $\top$ or $\bot$. Apply Lemma 3.
  4. $\phi$ is of the form $\psi_1 \star \psi_2$ for some $\star \in \{\land, \lor, \to\}$. Since $FreeVariables(\phi) = FreeVariables(\psi_1) \cup FreeVariables(\psi_2)$, we see that for $i = 1, 2$, for all $x$ occurring free in $\psi_i$, $g(x) = g'(x)$. Therefore, we see that $M, g \models \psi_i$ iff $M, g' \models \psi_i$ by the inductive hypothesis. Apply Lemma 4.
  5. $\phi$ is of the form $\neg \psi$. Then $FreeVariables(\phi) = FreeVariables(\psi)$, so we can apply the inductive hypothesis and Lemma 5.
  6. $\phi$ is of the form $\star x \psi$ for some $\star \in \{\exists, \forall\}$. Then $FreeVariables(\phi) = FreeVariables(\psi) \setminus \{x\}$, so $FreeVariables(\psi) \subseteq FreeVariables(\phi) \cup \{x\}$. Consider an arbitrary $y \in M$. Then $g[x \mapsto y]$ and $g'[x \mapsto y]$ agree on all elements of $FreeVariables(\phi) \cup \{x\}$, so by the inductive hypothesis, $M, g[x \mapsto y] \models \psi$ iff $M, g'[x \mapsto y] \models \psi$. Therefore, by Lemma 6, we have $M, g \models \star x \psi$ iff $M, g' \models \star x \psi$.

The proof is now complete. $\square$
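To see the theorem in action, here is a toy satisfaction checker over a finite structure; the encoding, the names `sat` and `eval_term`, and the connective tags are all my own sketch, not part of the proof. Formulas are `"top"`, `"bot"`, `("=", t1, t2)`, `(R, t1, ..., tn)` for a relation symbol `R`, `("not"/"and"/"or"/"imp", ...)`, or `("forall"/"exists", x, body)`; terms are as before.

```python
def eval_term(t, funcs, g):
    if isinstance(t, str):
        return g[t]
    f, *args = t
    return funcs[f](*(eval_term(s, funcs, g) for s in args))

def sat(dom, funcs, rels, g, phi):
    """Does M, g |= phi, for M with domain dom, functions funcs, relations rels?"""
    if phi == "top": return True
    if phi == "bot": return False
    op = phi[0]
    if op == "=":
        return eval_term(phi[1], funcs, g) == eval_term(phi[2], funcs, g)
    if op == "not":
        return not sat(dom, funcs, rels, g, phi[1])
    if op == "and":
        return sat(dom, funcs, rels, g, phi[1]) and sat(dom, funcs, rels, g, phi[2])
    if op == "or":
        return sat(dom, funcs, rels, g, phi[1]) or sat(dom, funcs, rels, g, phi[2])
    if op == "imp":
        return (not sat(dom, funcs, rels, g, phi[1])) or sat(dom, funcs, rels, g, phi[2])
    if op in ("forall", "exists"):
        x, body = phi[1], phi[2]
        results = (sat(dom, funcs, rels, {**g, x: y}, body) for y in dom)
        return all(results) if op == "forall" else any(results)
    # otherwise op is a relation symbol R, interpreted as a set of tuples
    return tuple(eval_term(t, funcs, g) for t in phi[1:]) in rels[op]

# Only x is free in phi, so changing the assignments at z is invisible.
dom = {0, 1, 2}
funcs = {}
rels = {"LE": {(a, b) for a in dom for b in dom if a <= b}}
phi = ("exists", "y", ("LE", "x", "y"))     # free variables: {x}
g  = {"x": 1, "z": 0}
g2 = {"x": 1, "z": 2}                       # agrees with g on x
assert sat(dom, funcs, rels, g, phi) == sat(dom, funcs, rels, g2, phi)
```

Of course, this only checks finitely many instances; the theorem itself is what licenses the general claim for arbitrary structures.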