A star-shaped open set in $\mathbb{R}^n$ is diffeomorphic to $\mathbb{R}^n$
Solution 1:
I gave an answer here as well.
Theorem. Every open star-shaped set $\Omega$ in $\mathbb{R}^n$ is $C^\infty$-diffeomorphic to $\mathbb{R}^n.$
Proof. For convenience, assume that $\Omega$ is star-shaped with respect to $0.$
Let $F=\mathbb{R}^n\setminus\Omega$ and let $\phi:\mathbb{R}^n\rightarrow\mathbb{R}_+$ (here $\mathbb{R}_+=[0,\infty)$) be a $C^\infty$-function such that $F=\phi^{-1}(\{0\})$; such a $\phi$ exists by the Whitney extension theorem. For example, if $\Omega$ is the open unit ball, one may take $\phi(x)=e^{-1/(1-||x||^2)}$ for $||x||<1$ and $\phi(x)=0$ otherwise.
Now define $f:\Omega\rightarrow\mathbb{R}^n$ by $$f(x)=\overbrace{\left[1+\left(\int_0^1\frac{dv}{\phi(vx)}\right)^2||x||^2\right]}^{\lambda(x)}\cdot x=\left[1+\left(\int_0^{||x||}\frac{dt}{\phi(t\frac{x}{||x||})}\right)^2\right]\cdot x.$$ The first expression shows that $f$ is smooth on $\Omega$; the second, obtained by the substitution $t=v||x||,$ is convenient along rays.
For $x\neq 0$ set $A(x)=\sup\{t>0:t\frac{x}{||x||}\in\Omega\}.$ Then $f$ maps the segment (or ray) $[0,A(x))\frac{x}{||x||}$ injectively into the ray $\mathbb{R}_+\frac{x}{||x||},$ since $r\mapsto\left[1+\left(\int_0^{r}\frac{dt}{\phi(t\frac{x}{||x||})}\right)^2\right]\cdot r$ is strictly increasing. Moreover $f(0)=0$ and $$\lim_{r\rightarrow A(x)}\left|\left|f\left(r\frac{x}{||x||}\right)\right|\right|=\lim_{r\rightarrow A(x)}\left[1+\left(\int_0^{r}\frac{dt}{\phi(t\frac{x}{||x||})}\right)^2\right]\cdot r=+\infty.$$ Indeed, if $A(x)=+\infty,$ this is obvious. If $A(x)<+\infty,$ then by the definitions of $\phi$ and $A(x)$ we get $\phi(A(x)\frac{x}{||x||})=0.$ Hence, since $\phi$ is $C^1,$ the mean value theorem gives $$\phi\left(r\frac{x}{||x||}\right)\leqslant M(A(x)-r)$$ for some constant $M$ and all $r$ near $A(x).$ The integrand below therefore dominates $\frac{1}{M(A(x)-t)}$ near $A(x),$ so $$\int_0^{A(x)}\frac{dt}{\phi\left(t\frac{x}{||x||}\right)}$$ diverges. By continuity we infer that $f([0,A(x))\frac{x}{||x||})=\mathbb{R}_+\frac{x}{||x||},$ and consequently $f(\Omega)=\mathbb{R}^n.$
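As a quick numeric sanity check of this blow-up (not part of the proof), here is a minimal Python sketch for $\Omega$ the open unit ball in $\mathbb{R}^2,$ using the bump function $\phi$ mentioned above; the function names and the midpoint quadrature are my own choices.

```python
import numpy as np

# Omega = open unit ball in R^2; phi(x) = exp(-1/(1-|x|^2)) for |x| < 1, else 0:
# smooth on R^2 and vanishing exactly on F = R^2 \ Omega.
def phi(x):
    r2 = np.dot(x, x)
    return np.exp(-1.0 / (1.0 - r2)) if r2 < 1.0 else 0.0

def f(x, n=20000):
    """f(x) = [1 + (int_0^{|x|} dt / phi(t x/|x|))^2] x, by midpoint quadrature."""
    r = np.linalg.norm(x)
    if r == 0.0:
        return x
    xi = x / r
    t = (np.arange(n) + 0.5) * (r / n)            # midpoints of [0, r]
    integral = sum(1.0 / phi(ti * xi) for ti in t) * (r / n)
    return (1.0 + integral ** 2) * x

xi = np.array([1.0, 0.0])
for r in [0.3, 0.6, 0.8, 0.9, 0.95]:
    print(r, np.linalg.norm(f(r * xi)))           # grows without bound as r -> A(x) = 1
```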
To end the proof we must show that $f$ has a $C^\infty$-inverse. Since $f$ is a bijection, by the inverse function theorem it suffices to show that $d_xf$ is nonsingular at every point of $\Omega.$
Suppose that $d_xf(h)=0$ for some $x\in\Omega$ and $h\neq 0.$ From the definition of $f$ we get $$d_xf(h)=\lambda(x)h+d_x\lambda(h)\,x.$$ Since $\lambda(x)\geqslant 1>0,$ this forces $h=\mu x$ for some $\mu\neq 0,$ and in particular $x\neq 0.$ Substituting $h=\mu x$ gives $\lambda(x)+d_x\lambda(x)=0.$ But $\lambda(x)\geqslant 1,$ and the function $g(t):=\lambda(tx)$ is strictly increasing along the ray (its defining integral grows with $t$), so $d_x\lambda(x)=g'(1)>0,$ which is a contradiction.$\square$
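One can also probe this last step numerically. Reusing `f` from the sketch above, a finite-difference Jacobian (an ad hoc check of mine, not part of the proof) stays strictly positive at sample points:

```python
# Finite-difference Jacobian of f at x; its determinant should stay away from
# zero, consistent with d_x f being nonsingular on Omega.
def jacobian_det(x, h=1e-5):
    J = np.empty((2, 2))
    for i in range(2):
        e = np.zeros(2)
        e[i] = h
        J[:, i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return np.linalg.det(J)

for r in [0.2, 0.5, 0.8]:
    print(r, jacobian_det(np.array([r, 0.0])))    # strictly positive
```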
Solution 2:
Let $U$ be an open subset of $\mathbb R^n$ that is star-shaped with respect to the origin $0\in U$. Let $f:U\to (0,\infty)$ be a regularized distance function, i.e., a $C^\infty$ function such that $f(x)/\operatorname{dist}(x,\partial U)$ is pinched between two positive constants.
Aside: how to construct such an $f$? Take all maximal dyadic cubes $Q_k\subset U$ such that $\operatorname{dist} (Q_k, \partial U)\ge \operatorname{diam} Q_k$ (aka the Whitney decomposition of $U$). Let $\{\varphi_k\}$ be a smooth partition of unity subordinate to the cover of $U$ by the enlarged cubes $\frac32Q_k$. Then $f(x)=\sum_k \operatorname{dist} (Q_k, \partial U)\,\varphi_k(x)$.
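For concreteness, here is a hedged 1D toy version of this construction for $U=(-1,1)$; the bump profile and the truncation depth are my choices, but the printed ratio $f(x)/\operatorname{dist}(x,\partial U)$ comes out pinched between two positive constants, as claimed.

```python
import numpy as np

def g(u):
    u = np.asarray(u, dtype=float)
    out = np.zeros_like(u)
    pos = u > 0
    out[pos] = np.exp(-1.0 / u[pos])
    return out

def step(t):
    """C^infinity step: 0 for t <= 0, 1 for t >= 1."""
    t = np.asarray(t, dtype=float)
    return g(t) / (g(t) + g(1.0 - t))

def bump(s):
    # equals 1 on |s| <= 1/2 and 0 on |s| >= 3/4, i.e. after rescaling it is
    # supported in the (3/2)-dilate of the interval
    return 1.0 - step((np.abs(s) - 0.5) / 0.25)

# Whitney decomposition of U = (-1, 1): maximal dyadic intervals Q with
# dist(Q, {-1, 1}) >= diam(Q), truncated at a finite depth.
Qs = [(-0.5, 0.0), (0.0, 0.5)]
a = 0.5
for _ in range(25):
    b = (1.0 + a) / 2.0                       # endpoints 1/2, 3/4, 7/8, ...
    Qs += [(a, b), (-b, -a)]
    a = b

x = np.linspace(-0.999, 0.999, 4001)
psis = np.array([bump((x - (lo + hi) / 2.0) / (hi - lo)) for lo, hi in Qs])
dists = np.array([min(lo + 1.0, 1.0 - hi) for lo, hi in Qs])
phis = psis / psis.sum(axis=0)                # partition of unity on the cover
f = (dists[:, None] * phis).sum(axis=0)
ratio = f / (1.0 - np.abs(x))                 # dist(x, dU) = 1 - |x| on (-1, 1)
print(ratio.min(), ratio.max())               # pinched between positive constants
```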
With the construction above we can make $f$ constant in a neighborhood of $0$, say $f=K$ there. Pick $r>0$ small enough that the sphere $r\,\mathbb S^{n-1}$ is contained in this neighborhood. For each unit vector $\xi\in \mathbb S^{n-1}$, let $\gamma_\xi:\mathbb R\to U$ be the solution of the ODE $\dot \gamma_\xi=f(\gamma_\xi)\,\gamma_\xi$ with initial value $\gamma_\xi(0)=r\xi$. Observe that $\gamma_\xi(t)=e^{Kt}r \xi$ for $t\le 0$. For $t>0$ the integral curve still moves in the direction $\xi$, but it slows down as it approaches $\partial U$ and never leaves $U$. It should be clear that the integral curves sweep out $U$.
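Here is a minimal numerical sketch of these integral curves, assuming $U$ is the open unit disk and choosing (my choice, not forced by the argument) a smooth $f$ that equals $K=1$ on $||x||\le 1/2$ and equals $1-||x||^2$ near $\partial U$; the latter is pinched between $\operatorname{dist}(x,\partial U)$ and $2\operatorname{dist}(x,\partial U)$ there.

```python
import numpy as np
from scipy.integrate import solve_ivp

def step(t):
    """C^infinity step: 0 for t <= 0, 1 for t >= 1 (scalar version)."""
    g = lambda u: np.exp(-1.0 / u) if u > 0 else 0.0
    return g(t) / (g(t) + g(1.0 - t))

def f(y):
    # smooth: equals 1 on |y|^2 <= 1/4 (so K = 1), equals 1 - |y|^2 on |y|^2 >= 1/2
    u = float(np.dot(y, y))
    return 1.0 - u * step((u - 0.25) / 0.25)

def rhs(t, y):
    return f(y) * y                           # the ODE: gamma' = f(gamma) gamma

r, xi = 0.1, np.array([1.0, 0.0])             # r-sphere inside the region where f = K
sol = solve_ivp(rhs, (0.0, 60.0), r * xi, dense_output=True,
                rtol=1e-10, atol=1e-12)
for t in [0.0, 2.0, 10.0, 30.0, 60.0]:
    print(t, np.linalg.norm(sol.sol(t)))      # creeps toward 1 but never leaves U
```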
Define a map from $\mathbb R^n$ onto $U$ by $$F(\rho\, \xi)= \gamma_\xi( K^{-1}\log \rho ),\quad \rho> 0, \ \xi\in \mathbb S^{n-1}, \tag1$$ with $F(0)=0$. On the unit ball $F$ is linear: $F(\rho\xi)=\rho\, r\,\xi$ for $\rho\le 1$. It is $C^\infty$ smooth everywhere, and it is a bijection onto $U$. The invertibility of its derivative follows from the ODE theorems on smooth dependence of solutions on initial values; see Hartman's Ordinary Differential Equations.
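Continuing the previous sketch (same `rhs`, `xi`, with $K=1$ and the hypothetical radius $r=0.1$), one can evaluate the map $(1)$ by integrating the ODE forwards or backwards in time; for $\rho\le 1$ the output is exactly $\rho\, r\,\xi$, matching the claim that $F$ is linear on the unit ball, while large $\rho$ lands ever closer to $\partial U$ without leaving it.

```python
def F(rho, xi, r=0.1):
    """Evaluate F(rho * xi) = gamma_xi(log(rho)), taking K = 1."""
    if rho <= 0.0:
        return np.zeros_like(xi)              # F(0) = 0 by continuity
    t_end = np.log(rho)
    if t_end == 0.0:
        return r * xi
    s = solve_ivp(rhs, (0.0, t_end), r * xi, rtol=1e-10, atol=1e-12)
    return s.y[:, -1]

for rho in [0.25, 0.5, 1.0, 100.0]:
    v = F(rho, xi)
    print(rho, v, np.linalg.norm(v))          # = rho * 0.1 * xi for rho <= 1;
                                              # norm stays below 1 for large rho
```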