Proof that extremum estimators are consistent
Since $\Theta$ is compact, you know the following: every sequence of points in $\Theta$ has a subsequence that converges to a point of $\Theta$. This is essentially the Bolzano–Weierstrass theorem, and you can take it as given whenever somebody talks about a compact subset of a metric space such as $\mathbb{R}^q$. Therefore, given the estimators $\widehat{\vartheta}_n$, there is a subsequence $\widehat{\vartheta}_{k_n}$ (where $k_1<k_2<\dots$ are indices from $\mathbb{N}=\{1,2,\dots\}$) such that $\lim_{n \to \infty} \widehat{\vartheta}_{k_n}=\theta_1$ for some $\theta_1 \in \Theta$.

But if $\theta_1$ is different from $\theta_0$, then, based on what you assumed, you should reach a contradiction. For example, suppose $Q$ is continuous and $\theta_0$ is its unique maximizer on $\Theta$. Choose an open neighborhood $N$ of $\theta_0$ whose closure does not contain $\theta_1$ (say, a small enough open ball around $\theta_0$); then $\theta_1 \in N^c\cap \Theta$, and since $N^c\cap \Theta$ is compact and does not contain $\theta_0$, the maximum of $Q$ over $N^c\cap \Theta$ is less than $Q(\theta_0)-\delta$ for some $\delta>0$. Now choose $\epsilon<\delta/2$. On one hand, $\frac{1}{k_n} Q_{k_n}(\theta_0)>Q(\theta_0)-\epsilon$ with probability converging to $1$. On the other hand, since $\widehat{\vartheta}_{k_n}\to\theta_1$, the subsequence eventually lies in $N^c\cap\Theta$, so by the (assumed) uniform convergence of $\frac{1}{n}Q_n$ to $Q$ you also have $\frac{1}{k_n} Q_{k_n}(\widehat{\vartheta}_{k_n})<Q(\theta_0)-\delta+\epsilon$ with probability converging to $1$. Because $\widehat{\vartheta}_{k_n}$ maximizes $Q_{k_n}$, its objective value cannot fall below the value at $\theta_0$, so these two statements are incompatible once $\epsilon<\delta/2$; it is this contradiction that should give you the convergence in probability.
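Written out along the subsequence, the chain of inequalities in the previous paragraph can be displayed as follows. This is only a sketch: it takes as given the standing assumptions that $\sup_{\theta\in\Theta}\bigl|\tfrac{1}{n}Q_n(\theta)-Q(\theta)\bigr|\stackrel{p}{\longrightarrow}0$ and that $\widehat{\vartheta}_n$ maximizes $Q_n$ over $\Theta$, neither of which is restated explicitly above.

$$
\frac{1}{k_n} Q_{k_n}\bigl(\widehat{\vartheta}_{k_n}\bigr)
\;\le\; \sup_{\theta \in N^c \cap \Theta} \frac{1}{k_n} Q_{k_n}(\theta)
\;<\; \max_{\theta \in N^c \cap \Theta} Q(\theta) + \epsilon
\;\le\; Q(\theta_0) - \delta + \epsilon
\;<\; Q(\theta_0) - \epsilon
\;<\; \frac{1}{k_n} Q_{k_n}(\theta_0),
$$

with probability converging to $1$: the second inequality uses uniform convergence, the second-to-last uses $\epsilon<\delta/2$, and the last uses convergence at the single point $\theta_0$. The two ends of the chain contradict $\frac{1}{k_n}Q_{k_n}(\widehat{\vartheta}_{k_n}) \ge \frac{1}{k_n}Q_{k_n}(\theta_0)$, which holds by the definition of the estimator.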
Note that, if you can conclude that $\theta_1=\theta_0$, then you obtain convergence of the whole sequence. The reason is this: if the $\widehat{\vartheta}_n$'s do not converge to $\theta_0$, then there is a subsequence of them staying at least a distance $\eta>0$ away from $\theta_0$ (such a subsequence exists precisely because the $\widehat{\vartheta}_n$'s do not converge to $\theta_0$). Re-do the argument with a convergent sub-subsequence of this subsequence, which must converge to some $\theta_1 \in \Theta$ by compactness. If your argument shows $\theta_1=\theta_0$, this contradicts the assumption that every term of the subsequence stays at least a distance $\eta$ away from $\theta_0$.
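In symbols, the real-analysis fact being used is the following; this is just a restatement of the argument above, treating the sequence as deterministic exactly as the text does and assuming only compactness of $\Theta \subset \mathbb{R}^q$ (with $\|\cdot\|$ the Euclidean norm).

$$
\widehat{\vartheta}_n \not\to \theta_0
\;\Longrightarrow\;
\exists\,\eta>0 \text{ and indices } m_1<m_2<\cdots \text{ such that } \bigl\|\widehat{\vartheta}_{m_j}-\theta_0\bigr\|\ge\eta \text{ for all } j,
$$

and by compactness $(\widehat{\vartheta}_{m_j})_j$ has a further subsequence converging to some $\theta_1\in\Theta$, which necessarily satisfies $\|\theta_1-\theta_0\|\ge\eta>0$. So if the argument applied to that sub-subsequence forces $\theta_1=\theta_0$, the assumption $\widehat{\vartheta}_n \not\to \theta_0$ is untenable, and the whole sequence must converge to $\theta_0$.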