
An inequality about a linear transformation

Discussion in 'Mathematics' started by Na'omi, Oct 8, 2018.

  1. Na'omi

    Na'omi Guest

    I am working through the OBM (Brazilian Mathematical Olympiad) exam, university level, 2017, phase 2.

    As I've mentioned in other threads (questions 1, 2, 3, 4 and 6, the last of which is still open), I hope someone can help me discuss this exam.

    Question 5 says:


    Let $d\leq n$ be positive integers and let $A$ be a real $d\times n$ matrix, which induces a linear transformation from $\mathbb{R}^n$ to $\mathbb{R}^d$ by $v\mapsto A\cdot v$. Let $\sigma(A)$ be the supremum of $\inf _{v\in W,|v|=1} |A\cdot v|$ over all subspaces $W$ of $\mathbb{R}^n$ of dimension $d$.
    For each $j\leq d$, let $r(j)\in\mathbb{R}^n$ be the $j$-th row vector of $A$, that is, $r(j)=A^t\cdot e_j$, where $e_j$ is the $j$-th element of the canonical basis of $\mathbb{R}^d$. Prove that: $\sigma(A)\leq \min_{i\leq d} d(r(i),\langle r(j),j\neq i\rangle )\leq \sqrt{n}\cdot \sigma(A)$.
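
    In case it helps frame the discussion, here is a possible reformulation (it uses the Courant-Fischer min-max characterization of singular values, which would itself have to be justified or avoided in a solution): $\sigma(A)$ is the smallest singular value of $A$,

    $$\sigma(A)=\max_{\substack{W\subseteq\mathbb{R}^n \\ \dim W = d}}\ \min_{v\in W,\ |v|=1}|A\cdot v|=\sigma_d(A),$$

    and in particular the supremum in the definition is attained, for example on the span of the $d$ right singular vectors of $A$. I haven't managed to turn this into the bounds involving the row distances, though.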

    I know that the distance between a row vector $r(i)$ and the subspace spanned by the other rows is at most $|r(i)|$, and I tried some computations, but nothing very substantial came out of them.
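
    As a sanity check (not a proof), a small numerical experiment along the following lines seems to confirm the chain of inequalities on random matrices; it takes $\sigma(A)$ to be the smallest singular value, as in the reformulation above, and computes each distance $d(r(i),\langle r(j), j\neq i\rangle)$ as a least-squares residual against the other rows.

        import numpy as np

        rng = np.random.default_rng(0)
        d, n = 3, 5
        A = rng.standard_normal((d, n))  # random real d x n matrix

        # sigma(A), taken here as the smallest singular value of A
        # (this identification relies on the min-max characterization above).
        sigma = np.linalg.svd(A, compute_uv=False)[-1]

        # d(r(i), <r(j), j != i>): distance from row i to the span of the
        # other rows, computed as the residual of a least-squares projection.
        dists = []
        for i in range(d):
            r_i = A[i]
            others = np.delete(A, i, axis=0)  # the remaining d-1 rows
            coeffs, *_ = np.linalg.lstsq(others.T, r_i, rcond=None)
            dists.append(np.linalg.norm(r_i - others.T @ coeffs))

        m = min(dists)
        # Expect both to print True: sigma <= min distance <= sqrt(n) * sigma
        print(sigma <= m + 1e-12, m <= np.sqrt(n) * sigma + 1e-12)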

    Thanks very much.
