Let $(\Omega, \mathcal{A}, \mathbb{P})$ be a probability space, let $X$ be a random variable on $\Omega$, and let $f:\mathbb{R}\to\mathbb{R}$ be a Borel function. Then:
$X$ and $f(X)$ are independent $\Longleftrightarrow$ there exists some $t\in\mathbb{R}$ such that $\mathbb{P}[f(X)=t]=1$, that is, $f(X)$ is a degenerate random variable.
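(For what it's worth, the direction $\Longleftarrow$ seems straightforward; a minimal sketch, assuming $\mathbb{P}[f(X)=t]=1$: for all Borel sets $A,B\subseteq\mathbb{R}$,
$$\mathbb{P}[X\in A,\ f(X)\in B]=\mathbb{P}[X\in A]\,\mathbf{1}_B(t)=\mathbb{P}[X\in A]\,\mathbb{P}[f(X)\in B],$$
since the event $\{f(X)\in B\}$ is almost sure when $t\in B$ and null when $t\notin B$. So the real difficulty is the direction $\Longrightarrow$.)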
The only thing I could make out is that if $X$ and $f(X)$ are independent, then
$\mathbb{P}[f(X)\in B]=0$ or $1$ for every Borel subset $B$ of $\mathbb{R}$: since $\sigma(f(X))\subseteq \sigma(X)$, the variable $f(X)$ is independent of itself, so $\mathbb{P}[f(X)\in B]=\mathbb{P}[f(X)\in B]^2$, which forces $\mathbb{P}[f(X)\in B]\in\{0,1\}$. Suppose now that $\mathbb{P}[f(X)\leq x]=0$ for all $x\in\mathbb{R}$. Then:
$\mathbb{P}[f(X)\in\mathbb{R}]=\mathbb{P}\left[\bigcup_{n=0}^{\infty}\{f(X)\leq n\}\right]\leq\sum_{n=0}^{\infty}\mathbb{P}[f(X)\leq n]=0$, which is a contradiction, since $\mathbb{P}[f(X)\in\mathbb{R}]=1$. Hence there exists some $x\in\mathbb{R}$ with $\mathbb{P}[f(X)\leq x]=1$.
However, I don't know how to prove the full statement, and my attempt doesn't seem likely to become a complete solution on its own.
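My best guess at how the argument might be completed (a sketch only; I am not certain every step is justified): by the above, $F(x)=\mathbb{P}[f(X)\leq x]$ takes only the values $0$ and $1$, is non-decreasing and right-continuous, and by the contradiction just derived it is not identically $0$; since also $F(x)\to 0$ as $x\to-\infty$, the quantity
$$t=\inf\{x\in\mathbb{R}:\ \mathbb{P}[f(X)\leq x]=1\}$$
is finite. Monotonicity gives $F(x)=1$ for $x>t$ and $F(x)=0$ for $x<t$, so right-continuity yields $F(t)=1$, while $\mathbb{P}[f(X)<t]=\lim_{n\to\infty}F\big(t-\tfrac{1}{n}\big)=0$; hence $\mathbb{P}[f(X)=t]=1$, i.e. $f(X)$ is degenerate. Is this correct, and can the gaps be filled?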
Any help would be appreciated.
Thanks in advance!