
mathematics-r's Introduction

Note - check if need revising

Norms

def

\begin{enumerate}
\item Positivity: $\|x\| \geq 0$
\item Positive definiteness: $\|x\| = 0 \Longleftrightarrow x = 0$
\item Homogeneity: $\|\alpha x\| = |\alpha| \, \|x\|$ for an arbitrary scalar $\alpha$
\item Triangle inequality: $\|x + y\| \leq \|x\| + \|y\|$
\end{enumerate}
Note: these four properties are the norm axioms, so they hold for every norm by definition.

The different matrix norms (norms on $A \in \mathbb{R}^{m \times n}$)

$$ \|A\|_1 = \max_{1 \leq j \leq n} \sum_{i=1}^{m} \left|a_{ij}\right| $$

$$ \|A\|_2 = \sqrt{\lambda_{\max}\left(A^T A\right)} $$

Frobenius norm, sometimes also called the Euclidean norm

$$ \|A\|_F \equiv \sqrt{\sum_{i=1}^{m} \sum_{j=1}^{n} \left|a_{ij}\right|^2} $$

$$ \|A\|_\infty = \max_{1 \leq i \leq m} \sum_{j=1}^{n} \left|a_{ij}\right| $$

Vector norms (norms on $\mathbb{R}^n$)

$$ \|\mathbf{x}\|_p \equiv \left(\sum_i \left|x_i\right|^p\right)^{1/p} $$

special case:

$$ \|\mathbf{x}\|_\infty \equiv \max_i \left|x_i\right| $$

Tip: $\|\cdot\|$ without a subscript usually refers to the 2-norm.
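The definitions above can be checked numerically; a minimal sketch using NumPy (an assumed environment, not part of these notes):

```python
import numpy as np

x = np.array([3.0, -4.0])
# vector p-norms: (sum |x_i|^p)^(1/p)
assert np.isclose(np.linalg.norm(x, 1), 7.0)       # |3| + |-4|
assert np.isclose(np.linalg.norm(x, 2), 5.0)       # sqrt(9 + 16)
assert np.isclose(np.linalg.norm(x, np.inf), 4.0)  # max |x_i|

A = np.array([[1.0, -2.0], [3.0, 4.0]])
# matrix 1-norm: max column sum; inf-norm: max row sum
assert np.isclose(np.linalg.norm(A, 1), 6.0)
assert np.isclose(np.linalg.norm(A, np.inf), 7.0)
# 2-norm: sqrt of the largest eigenvalue of A^T A
assert np.isclose(np.linalg.norm(A, 2),
                  np.sqrt(np.max(np.linalg.eigvalsh(A.T @ A))))
# Frobenius norm: sqrt of the sum of squared entries
assert np.isclose(np.linalg.norm(A, 'fro'), np.sqrt(1 + 4 + 9 + 16))
```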

Function norms (norms on $C[a, b]$)

\begin{equation}
\left.\begin{array}{l}
\|f\|_p = \left(\displaystyle\int_a^b |f(\tau)|^p \, d\tau\right)^{\frac{1}{p}}, \quad p \in [1, \infty) \\
\|f\|_\infty = \displaystyle\sup_{a \leq t \leq b} |f(t)|
\end{array}\right\} \quad \mathscr{L}_p\text{-norms}
\end{equation}

On $[0, \infty)$, $\mathscr{L}_p$ is a Banach space.

$f \in \mathscr{L}_p \Leftrightarrow \|f\|_p$ is bounded, i.e. $\exists\, c : \|f\|_p \leq c$.

Tensor rank

\begin{table}[h]
\begin{tabular}{cl}
rank & object \\
\hline
0 & scalar \\
1 & vector \\
2 & matrix (dyad) \\
$\geq 3$ & tensor
\end{tabular}
\end{table}
Triad and tetrad are also sometimes used to refer to tensors of rank 3 and 4 respectively. Some refer to the rank of a tensor as its order or its degree.
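In NumPy terms (an illustrative aside, not from the notes), the rank/order in this table corresponds to an array's `ndim`, which should not be confused with the matrix rank of linear algebra:

```python
import numpy as np

# ndim reports the tensor rank (order) of an array
scalar = np.array(3.0)
vector = np.zeros(4)
matrix = np.zeros((4, 4))
tensor = np.zeros((2, 3, 4))

assert scalar.ndim == 0
assert vector.ndim == 1
assert matrix.ndim == 2
assert tensor.ndim == 3
```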

SVD

Example: $A=\left[\begin{array}{lll}{0} & {1} & {1} \\ {\sqrt{2}} & {2} & {0} \\ {0} & {1} & {1}\end{array}\right]$

The SVD is defined as $$ A = P \Sigma Q^T $$

Method

Computing: $$ A A^T=\left[\begin{array}{lll}{2} & {2} & {2} \\ {2} & {6} & {2} \\ {2} & {2} & {2}\end{array}\right] $$

The characteristic polynomial of $A A^T$ factors as
\begin{equation}
\begin{aligned}
-\lambda^3 + 10\lambda^2 - 16\lambda &= -\lambda\left(\lambda^2 - 10\lambda + 16\right) \\
&= -\lambda(\lambda - 8)(\lambda - 2)
\end{aligned}
\end{equation}

The eigenvalues of $A A^T$ are $\lambda = 8$, $\lambda = 2$, $\lambda = 0$; thus the singular values are $\sigma_1 = 2\sqrt{2}$, $\sigma_2 = \sqrt{2}$ (and $\sigma_3 = 0$).

Giving the matrix $$ \Sigma = \left[\begin{array}{ccc}{2\sqrt{2}} & {0} & {0} \\ {0} & {\sqrt{2}} & {0} \\ {0} & {0} & {0}\end{array}\right] $$

Solving $(A A^T - \lambda I)\mathbf{p} = \mathbf{0}$ for each eigenvalue in turn gives the (normalised) eigenvectors $p_1=\left(\frac{1}{\sqrt{6}}, \frac{2}{\sqrt{6}}, \frac{1}{\sqrt{6}}\right)$, $p_2=\left(-\frac{1}{\sqrt{3}}, \frac{1}{\sqrt{3}}, -\frac{1}{\sqrt{3}}\right)$ and $p_3=\left(\frac{1}{\sqrt{2}}, 0, -\frac{1}{\sqrt{2}}\right)$.

Yielding

$$ P = \left[\begin{array}{ccc} p_1^T & p_2^T & p_3^T \end{array}\right] = \left[\begin{array}{ccc}{\frac{1}{\sqrt{6}}} & {-\frac{1}{\sqrt{3}}} & {\frac{1}{\sqrt{2}}} \\ {\frac{2}{\sqrt{6}}} & {\frac{1}{\sqrt{3}}} & {0} \\ {\frac{1}{\sqrt{6}}} & {-\frac{1}{\sqrt{3}}} & {-\frac{1}{\sqrt{2}}}\end{array}\right] $$

Similarly, $$ A^T A=\left[\begin{array}{ccc}{2} & {2\sqrt{2}} & {0} \\ {2\sqrt{2}} & {6} & {2} \\ {0} & {2} & {2}\end{array}\right] $$ has the same eigenvalues $\lambda = 8$, $\lambda = 2$, $\lambda = 0$, with eigenvectors $q_1=\left(\frac{1}{\sqrt{6}}, \frac{3}{\sqrt{12}}, \frac{1}{\sqrt{12}}\right)$, $q_2=\left(\frac{1}{\sqrt{3}}, 0, -\frac{2}{\sqrt{6}}\right)$ and $q_3=\left(\frac{1}{\sqrt{2}}, -\frac{1}{2}, \frac{1}{2}\right)$. (Alternatively, the formula $q_i = \frac{1}{\sigma_i} A^T p_i$ gives the various $q_i$ directly from the $p_i$ when $\sigma_i \neq 0$.)

$$ Q = \left[\begin{array}{ccc} q_1^T & q_2^T & q_3^T \end{array}\right] = \left[\begin{array}{ccc}{\frac{1}{\sqrt{6}}} & {\frac{1}{\sqrt{3}}} & {\frac{1}{\sqrt{2}}} \\ {\frac{3}{\sqrt{12}}} & {0} & {-\frac{1}{2}} \\ {\frac{1}{\sqrt{12}}} & {-\frac{2}{\sqrt{6}}} & {\frac{1}{2}}\end{array}\right] $$

This gives the SVD $$ A = P \Sigma Q^T $$
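The worked example can be verified numerically; a minimal sketch using NumPy (assumed, not part of the notes):

```python
import numpy as np

A = np.array([[0.0, 1.0, 1.0],
              [np.sqrt(2), 2.0, 0.0],
              [0.0, 1.0, 1.0]])

s2 = np.sqrt(2)
P = np.array([[1/np.sqrt(6), -1/np.sqrt(3),  1/s2],
              [2/np.sqrt(6),  1/np.sqrt(3),  0.0],
              [1/np.sqrt(6), -1/np.sqrt(3), -1/s2]])
Sigma = np.diag([2*s2, s2, 0.0])
Q = np.array([[1/np.sqrt(6),   1/np.sqrt(3),  1/s2],
              [3/np.sqrt(12),  0.0,          -0.5],
              [1/np.sqrt(12), -2/np.sqrt(6),  0.5]])

# P and Q are orthogonal, and A = P Sigma Q^T as claimed
assert np.allclose(P @ P.T, np.eye(3))
assert np.allclose(Q @ Q.T, np.eye(3))
assert np.allclose(P @ Sigma @ Q.T, A)
```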

Covar

\newpage

change of basis

eigenvalue decomposition

$$ A=\left[\begin{array}{lll}{4} & {0} & {1} \\ {2} & {3} & {2} \\ {1} & {0} & {4}\end{array}\right] $$
\begin{align*}
\lambda &= 5 \quad \operatorname{NUL}(A - 5I) = \operatorname{SPAN}\left\{\left[\begin{array}{l}{1} \\ {2} \\ {1}\end{array}\right]\right\} \\
\lambda &= 3 \quad \operatorname{NUL}(A - 3I) = \operatorname{SPAN}\left\{\left[\begin{array}{l}{0} \\ {1} \\ {0}\end{array}\right], \left[\begin{array}{c}{-1} \\ {0} \\ {1}\end{array}\right]\right\}
\end{align*}
$$ D=\left[\begin{array}{lll}{5} & {0} & {0} \\ {0} & {3} & {0} \\ {0} & {0} & {3}\end{array}\right] \quad P=\left[\begin{array}{llc}{1} & {0} & {-1} \\ {2} & {1} & {0} \\ {1} & {0} & {1}\end{array}\right] $$
With three independent eigenvectors, $A$ is diagonalizable: $$ A = P D P^{-1} $$
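The decomposition above can be checked numerically; a sketch using NumPy (an assumption, not stated in the notes):

```python
import numpy as np

A = np.array([[4.0, 0.0, 1.0],
              [2.0, 3.0, 2.0],
              [1.0, 0.0, 4.0]])
P = np.array([[1.0, 0.0, -1.0],
              [2.0, 1.0,  0.0],
              [1.0, 0.0,  1.0]])
D = np.diag([5.0, 3.0, 3.0])

# each column of P is an eigenvector: A P = P D
assert np.allclose(A @ P, P @ D)
# hence A = P D P^{-1}
assert np.allclose(P @ D @ np.linalg.inv(P), A)
```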

Jordan form

$$ A=\left[\begin{array}{lll}{1} & {1} & {1} \\ {0} & {1} & {0} \\ {0} & {0} & {1}\end{array}\right] $$

\begin{align*}
\lambda &= 1 \quad \operatorname{NUL}(A - 1I) = \operatorname{SPAN}\left\{\left[\begin{array}{l}{1} \\ {0} \\ {0}\end{array}\right], \left[\begin{array}{c}{0} \\ {1} \\ {-1}\end{array}\right]\right\}
\end{align*}
The eigenvalue $\lambda = 1$ has algebraic multiplicity 3 but only 2 independent eigenvectors, so $A$ is not diagonalizable; these two eigenvectors become the first and third columns of $P$, and the second column must be a generalized eigenvector.

The Jordan form therefore has one $2 \times 2$ block and one $1 \times 1$ block:
$$ J = \left[\begin{array}{ccc} {\lambda} & {1} & {0} \\ {0} & {\lambda} & {0} \\ {0} & {0} & {\lambda} \end{array}\right] $$

\begin{align*}
A V_1 &= \lambda V_1 \\
A V_2 &= V_1 + \lambda V_2
\end{align*}

\begin{equation*}
\begin{array}{l}
{A V_2 - \lambda V_2 = V_1} \\
{(A - \lambda I) V_2 = V_1}
\end{array}
\end{equation*}
$V_1$ is given; what is $V_2$? Let $V_1 = [1, 0, 0]^T$:
\begin{align*}
(A - 1I) V_2 &= V_1 \\
\left[\begin{array}{lll}{0} & {1} & {1} \\ {0} & {0} & {0} \\ {0} & {0} & {0}\end{array}\right] V_2 &= \left[\begin{array}{l}{1} \\ {0} \\ {0}\end{array}\right]
\end{align*}

\begin{equation}
V_2=\left[\begin{array}{l}{0} \\ {0} \\ {1}\end{array}\right]+\bcancel{x\left[\begin{array}{l}{1} \\ {0} \\ {0}\end{array}\right]+y\left[\begin{array}{c}{0} \\ {1} \\ {-1}\end{array}\right]}
\end{equation}
Let
\begin{equation}
V_2=\left[\begin{array}{l}{0} \\ {0} \\ {1}\end{array}\right]
\end{equation}

\begin{equation}
V_3=\left[\begin{array}{c}{0} \\ {1} \\ {-1}\end{array}\right]
\end{equation}

\begin{equation}
A \quad \text{is similar to} \quad \left[\begin{array}{lll}1 & 1 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1\end{array}\right]
\end{equation}

\begin{equation}
P=\left[\begin{array}{lll}{V_1} & {V_2} & {V_3}\end{array}\right]= \left[\begin{array}{ccc}{1} & {0} & {0} \\ {0} & {0} & {1} \\ {0} & {1} & {-1}\end{array}\right]
\end{equation}
then
\begin{equation}
A = P J P^{-1}, \quad J = \left[\begin{array}{lll}{1} & {1} & {0} \\ {0} & {1} & {0} \\ {0} & {0} & {1}\end{array}\right]
\end{equation}
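The Jordan decomposition can be verified numerically; a NumPy sketch (assumed environment) that also confirms the eigenvector deficiency:

```python
import numpy as np

A = np.array([[1.0, 1.0, 1.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])
P = np.array([[1.0, 0.0,  0.0],
              [0.0, 0.0,  1.0],
              [0.0, 1.0, -1.0]])
J = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])

# A = P J P^{-1} with the columns V1, V2, V3 found above
assert np.allclose(P @ J @ np.linalg.inv(P), A)

# A is defective: the eigenvalue 1 has only 2 independent eigenvectors
geometric_multiplicity = 3 - np.linalg.matrix_rank(A - np.eye(3))
assert geometric_multiplicity == 2
```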

linear def

positive definite

A square matrix $A$ is positive definite if there is a positive scalar $\alpha$ such that
\begin{equation}
x^T A x \geq \alpha x^T x, \quad \text{for all } x \in \mathbb{R}^n
\end{equation}
It is positive semidefinite if
\begin{equation}
x^T A x \geq 0, \quad \text{for all } x \in \mathbb{R}^n
\end{equation}
We can recognize that a symmetric matrix is positive definite by computing its eigenvalues and verifying that they are all positive, or by performing a Cholesky factorization.
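Both tests mentioned above can be sketched in NumPy (the helper `is_positive_definite` is a hypothetical name, not from the notes):

```python
import numpy as np

def is_positive_definite(A):
    """Check positive definiteness of a symmetric matrix via Cholesky."""
    try:
        np.linalg.cholesky(A)  # succeeds iff A is symmetric positive definite
        return True
    except np.linalg.LinAlgError:
        return False

A = np.array([[2.0, -1.0], [-1.0, 2.0]])  # eigenvalues 1 and 3
B = np.array([[1.0, 2.0], [2.0, 1.0]])    # eigenvalues 3 and -1

assert is_positive_definite(A)
assert not is_positive_definite(B)
# equivalent test: all eigenvalues positive
assert np.all(np.linalg.eigvalsh(A) > 0)
assert not np.all(np.linalg.eigvalsh(B) > 0)
```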

rotation things

3D point as 3-vector $$ \mathbf{X}=\left[\begin{array}{l}{X} \\ {Y} \\ {Z}\end{array}\right] $$

3D point using affine homogeneous coordinates $$ \left[\begin{array}{l}{\mathbf{X}} \\ {1}\end{array}\right]=\left[\begin{array}{l}{X} \\ {Y} \\ {Z} \\ {1}\end{array}\right] $$
Inverse of a rotation-plus-translation:
\begin{equation}
\begin{aligned}
\mathbf{X}' &= R\mathbf{X} + \mathbf{t} \\
\mathbf{X}' - \mathbf{t} &= R\mathbf{X} \\
R^\top\left(\mathbf{X}' - \mathbf{t}\right) &= \mathbf{X} \\
R^\top \mathbf{X}' - R^\top \mathbf{t} &= \mathbf{X}
\end{aligned}
\end{equation}
In homogeneous coordinates this gives

$$ \left[\begin{array}{cc}{R^\top} & {-R^\top \mathbf{t}} \\ {\mathbf{0}^\top} & {1}\end{array}\right]\left[\begin{array}{l}{\mathbf{X}'} \\ {1}\end{array}\right]=\left[\begin{array}{l}{\mathbf{X}} \\ {1}\end{array}\right] $$
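The homogeneous inverse formula can be sanity-checked numerically; a NumPy sketch with an arbitrary sample rotation and translation (illustrative values, not from the notes):

```python
import numpy as np

# sample rigid transform: rotation about z by 30 degrees plus a translation
th = np.pi / 6
R = np.array([[np.cos(th), -np.sin(th), 0.0],
              [np.sin(th),  np.cos(th), 0.0],
              [0.0,         0.0,        1.0]])
t = np.array([1.0, 2.0, 3.0])

T = np.eye(4)
T[:3, :3] = R
T[:3, 3] = t

# inverse from the derivation: [R^T, -R^T t; 0^T, 1]
T_inv = np.eye(4)
T_inv[:3, :3] = R.T
T_inv[:3, 3] = -R.T @ t

assert np.allclose(T_inv @ T, np.eye(4))
```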

Rodrigues rotation

rotation matrix from a rotation vector
\begin{equation}
\begin{array}{l}
\theta \leftarrow \operatorname{norm}(r) \\
r \leftarrow r / \theta \\
R = \cos\theta \, I + (1 - \cos\theta) \, r r^T + \sin\theta \left[\begin{array}{ccc}{0} & {-r_z} & {r_y} \\ {r_z} & {0} & {-r_x} \\ {-r_y} & {r_x} & {0}\end{array}\right]
\end{array}
\end{equation}

Inverse transformation can be also done easily, since

\begin{equation}
\sin\theta \left[\begin{array}{ccc}{0} & {-r_z} & {r_y} \\ {r_z} & {0} & {-r_x} \\ {-r_y} & {r_x} & {0}\end{array}\right] = \frac{R - R^T}{2}
\end{equation}

A rotation vector is a convenient and maximally compact representation of a rotation matrix, since any rotation has just 3 degrees of freedom.
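The Rodrigues formula above can be sketched directly in NumPy (the function name `rodrigues` is my own; the test rotation is a 90-degree turn about the z-axis):

```python
import numpy as np

def rodrigues(r):
    """Rotation matrix from a rotation vector r (axis times angle)."""
    theta = np.linalg.norm(r)
    if theta == 0:
        return np.eye(3)
    r = r / theta
    # skew-symmetric cross-product matrix [r]_x
    K = np.array([[0.0,  -r[2],  r[1]],
                  [r[2],   0.0, -r[0]],
                  [-r[1], r[0],   0.0]])
    return (np.cos(theta) * np.eye(3)
            + (1 - np.cos(theta)) * np.outer(r, r)
            + np.sin(theta) * K)

# 90 degrees about z maps the x-axis to the y-axis
R = rodrigues(np.array([0.0, 0.0, np.pi / 2]))
assert np.allclose(R @ np.array([1.0, 0.0, 0.0]), [0.0, 1.0, 0.0])

# the inverse relation: sin(theta) [r]_x = (R - R^T) / 2
K_z = np.array([[0.0, -1.0, 0.0],
                [1.0,  0.0, 0.0],
                [0.0,  0.0, 0.0]])
assert np.allclose((R - R.T) / 2, np.sin(np.pi / 2) * K_z)
```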

\newpage

Logic

Necessity and sufficiency

\begin{equation*}
\begin{array}{|c|c|c|c|c|}
\hline S & N & S \Rightarrow N & S \Leftarrow N & S \Leftrightarrow N \\
\hline T & T & T & T & T \\
\hline T & F & F & T & F \\
\hline F & T & T & F & F \\
\hline F & F & T & T & T \\
\hline
\end{array}
\end{equation*}
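The table can be regenerated mechanically; a short Python sketch enumerating the four cases:

```python
# enumerate the truth table for implication, converse, and equivalence
rows = []
for S in (True, False):
    for N in (True, False):
        implies = (not S) or N      # S => N
        converse = (not N) or S     # S <= N
        iff = implies and converse  # S <=> N
        rows.append((S, N, implies, converse, iff))

assert rows == [
    (True,  True,  True,  True,  True),
    (True,  False, False, True,  False),
    (False, True,  True,  False, False),
    (False, False, True,  True,  True),
]
```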

\begin{center} \begin{tikzpicture} \def\radius{2cm} \def\mycolorbox#1{\textcolor{#1}{\rule{2ex}{2ex}}} \colorlet{colori}{blue!70} \colorlet{colorii}{red!70}

\coordinate (ceni); \coordinate[xshift=\radius] (cenii);

\draw[fill=colori,fill opacity=0.5] (ceni) circle (\radius); \draw[fill=colorii,fill opacity=0.5] (cenii) circle (\radius);

\draw ([xshift=-20pt,yshift=20pt]current bounding box.north west) rectangle ([xshift=20pt,yshift=-20pt]current bounding box.south east);

\node[yshift=10pt] at (current bounding box.north) {Venn diagram}; \node[xshift=-.5\radius] at (ceni) {$\mathbf{S}$}; \node[xshift=.5\radius] at (cenii) {$\mathbf{N}$}; \node[xshift=.9\radius] at (ceni) {$\mathbf{S} \cap \mathbf{N}$}; \node[xshift=10pt,yshift=10pt] at (current bounding box.south west) {$\emptyset$}; \end{tikzpicture} \end{center}

Classification: True vs. False and Positive vs. Negative

Confusion matrix

Example based on one of Aesop's Fables:
\begin{table}[h!]
\begin{tabular}{|l|l|}
\hline
\makecell[l]{\textbf{True Positive (TP):} \\ Reality: A wolf threatened. \\ Shepherd said: ``Wolf.'' \\ Outcome: Shepherd is a hero.} &
\makecell[l]{\textbf{False Positive (FP):} \\ Reality: No wolf threatened. \\ Shepherd said: ``Wolf.'' \\ Outcome: Villagers are angry \\ at shepherd for waking them up.} \\
\hline
\makecell[l]{\textbf{False Negative (FN):} \\ Reality: A wolf threatened. \\ Shepherd said: ``No wolf.'' \\ Outcome: The wolf ate all the sheep.} &
\makecell[l]{\textbf{True Negative (TN):} \\ Reality: No wolf threatened. \\ Shepherd said: ``No wolf.'' \\ Outcome: Everyone is fine.} \\
\hline
\end{tabular}
\end{table}

## random

  1. f(t) = c^t - exponential
  2. f(t) = t^n (n a natural number) - polynomial
  3. f(t) = t^c (c a real constant) - power function

https://en.wikipedia.org/wiki/Wheat_and_chessboard_problem
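The chessboard problem linked above illustrates how fast the exponential case grows; a quick Python check of the classic total:

```python
# wheat and chessboard: grains double on each of the 64 squares
total = sum(2**k for k in range(64))

# geometric series: 2^0 + 2^1 + ... + 2^63 = 2^64 - 1
assert total == 2**64 - 1
assert total == 18446744073709551615
```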
