Some confusion on the distribution of a random variable (measure-theoretic view)

[image: excerpt from the lecture notes]

Attached are lecture notes from several years ago, and I just found something in them confusing.

Q1: Consider the two probability spaces $(\Omega,\mathscr{F},\mathbb{P})$ and $((0,1),\mathcal{B}_{(0,1)},\mathbb{P}_X)$. The pushforward measure $\mathbb{P}_X$ is already defined on the Borel $\sigma$-algebra, $\mathbb{P}_X:\mathcal{B}(\mathbb{R})\rightarrow [0,1]$, so why do we bother to introduce another notation $F_X:\mathbb{R}\rightarrow [0,1]$? (Highlighted in red.)

Q2: Is the law of a random variable = the distribution of a random variable = the probability measure $\mathbb{P}_X$?

Q3: In Proposition 2.1 (highlighted in yellow), I am completely lost on the meaning of $X(u)=\inf\{c\in\mathbb{R}\mid F(c)\geq u\}$, where $u$ is an element of the sample space. How should I interpret $F(c)\geq u$?

Appreciate any comments.


Q1

The cumulative distribution function (CDF) $F_X$ is indeed entirely determined by the probability measure $\mathbb{P}_X$ via $F_X(c) = \mathbb{P}_X((-\infty,c]) = \mathbb{P}(X \le c)$, but they are different objects: $\mathbb{P}_X$ is a set function defined on $\mathcal{B}(\mathbb{R})$, while $F_X$ is an ordinary function on $\mathbb{R}$. What may be confusing you is that $\mathbb{P}_X$ is in turn entirely determined by the values $\mathbb{P}_X((-\infty,c])$, so the two carry exactly the same information. Whenever you encounter a CDF you could instead work with the underlying probability measure, but the CDF is easier to manipulate in many contexts.
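For a concrete illustration (the uniform distribution is chosen purely as an example): if $X$ is uniform on $(0,1)$, the pushforward measure assigns to every Borel set $B\subseteq\mathbb{R}$ the number $\mathbb{P}_X(B)=\lambda\big(B\cap(0,1)\big)$, whereas the CDF is
$$F_X(c)=\mathbb{P}_X\big((-\infty,c]\big)=\begin{cases}0, & c\le 0,\\ c, & 0< c< 1,\\ 1, & c\ge 1.\end{cases}$$
Knowing $F_X$ at every real number recovers $\mathbb{P}_X$ on every Borel set (by the uniqueness part of the extension theorem), and vice versa.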

Q2

The law of a random variable is the rule that describes how the outcomes of the random process behave. Anything that uniquely determines this behaviour can be taken as the law of the random variable: the probability measure $\mathbb{P}_X$ is one such object, and the cumulative distribution function is another. So yes, law = distribution = $\mathbb{P}_X$, with the CDF as an equivalent description.
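As a simple illustration (the random variables below are invented for this example): let $Y=\mathbf{1}_{\{H\}}$ on the fair-coin space $\{H,T\}$ and $Z(u)=\mathbf{1}_{(0,1/2]}(u)$ on $\big((0,1),\mathcal{B}_{(0,1)},\lambda\big)$. They live on different probability spaces, yet $\mathbb{P}_Y(\{1\})=\mathbb{P}_Z(\{1\})=\tfrac12$ and $\mathbb{P}_Y(\{0\})=\mathbb{P}_Z(\{0\})=\tfrac12$, so they have the same law (Bernoulli$(1/2)$) and the same CDF: the law only records how probability mass is placed on $\mathbb{R}$, not where the randomness comes from.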

Q3

This proposition essentially describes how to generate any law of a real random variable from a uniform random variable on $(0,1)$. Here $u$ is simply a number in $(0,1)$; $F(c)\geq u$ asks whether the CDF has reached level $u$ by the point $c$, and $X(u)$ is the smallest such $c$. In other words, $X$ is the generalized inverse (quantile function) of $F$, and the construction is known as inverse transform sampling.
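To make this concrete, here is a minimal Python sketch of the inverse transform idea; everything in it is an illustrative assumption (the Exponential(1) target CDF, the grid approximation of the infimum, and the names `F` and `X`), not something taken from the notes.

```python
import numpy as np

# Target CDF F; Exponential(1) is chosen purely as an illustration.
def F(c):
    c = np.asarray(c, dtype=float)
    return np.where(c < 0, 0.0, 1.0 - np.exp(-c))

# Generalized inverse X(u) = inf{c : F(c) >= u}, approximated on a grid.
# (For the exponential, the closed form -log(1 - u) would also work.)
GRID = np.linspace(0.0, 30.0, 300_001)
CDF_ON_GRID = F(GRID)

def X(u):
    idx = np.searchsorted(CDF_ON_GRID, u)      # first grid index with F(c) >= u
    return GRID[np.minimum(idx, len(GRID) - 1)]

rng = np.random.default_rng(0)
u = rng.uniform(size=100_000)   # u plays the role of a point of the sample space (0,1)
samples = X(u)                  # these samples follow the law with CDF F

# Sanity check: the empirical CDF of the samples should be close to F.
for c in (0.5, 1.0, 2.0):
    print(f"c={c}: empirical={np.mean(samples <= c):.4f}, F(c)={float(F(c)):.4f}")
```

Running this prints the empirical CDF next to $F$ at a few points; the two should agree to a couple of decimal places, which is exactly the claim of the proposition: if $u$ is uniform on $(0,1)$, then $X(u)$ has CDF $F$.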