03_005_Linear_transform.html
======================================================================
Linear means performing multiplication and addition with vectors and matrices.
When a linear operation is used for the purpose of mapping, it is called a linear transform.
======================================================================
A linear transform maps a vector space (N-D space) $$$X^{N}$$$
to a vector space (M-D space) $$$Y^{M}$$$.
A linear transform is, concretely, just a matrix multiplication.
If x is a vector that is an element of the N-D space ($$$x\in X^{N}$$$)
(x is one point in N-D space),
you can map x onto $$$Y^{M}$$$,
and then obtain the corresponding mapped vector y.
======================================================================
The mapping is computed like this:
(M,1) vector y = ((M,N) matrix) * ((N,1) vector x)
That (M,N) matrix is called the linear transform matrix.
For example, if you want to map an 8-D vector onto a 5-D vector,
you use a (5,8) linear transform matrix.
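A minimal NumPy sketch of this 8-D to 5-D mapping (the matrix entries are arbitrary illustrative values, not from the note):

import numpy as np

A = np.random.randn(5, 8)   # (5,8) linear transform matrix (arbitrary values)
x = np.random.randn(8)      # one point in 8-D space
y = A @ x                   # matrix multiplication: mapped point in 5-D space
print(y.shape)              # (5,)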
======================================================================
In pattern recognition,
linear transforms are widely used as a dimensionality reduction technique.
======================================================================
If the linear transform matrix A is a square matrix,
and A satisfies $$$AA^{T}=A^{T}A=I$$$,
then A and $$$A^{T}$$$ are called orthonormal.
If A and $$$A^{T}$$$ are orthonormal, then $$$A^{T}=A^{-1}$$$.
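A quick numerical check of $$$AA^{T}=I$$$ and $$$A^{T}=A^{-1}$$$, using a 2-D rotation matrix as the orthonormal example (the angle 0.7 is arbitrary):

import numpy as np

b = 0.7  # rotation angle in radians (arbitrary)
A = np.array([[np.cos(b), -np.sin(b)],
              [np.sin(b),  np.cos(b)]])  # 2-D rotation: an orthonormal matrix

print(np.allclose(A @ A.T, np.eye(2)))     # True: A A^T = I
print(np.allclose(A.T, np.linalg.inv(A)))  # True: A^T = A^{-1}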
======================================================================
If you perform an orthonormal transform, the size (length) of the vector is preserved.
y = Ax
y : n-D vector
A : (n,n) matrix
x : n-D vector
In general, y and x are different vectors.
But if A is orthonormal, the lengths of y and x are identical.
Proof:
$$$|y|=\sqrt{y^{T}y}$$$
Since y = Ax,
$$$|y|=\sqrt{(Ax)^{T}(Ax)}$$$
Since $$$(AB)^{T}=B^{T}A^{T}$$$,
$$$|y|=\sqrt{x^{T}A^{T}Ax}$$$
Since A is orthonormal, $$$A^{T}A=I$$$, so
$$$|y|=\sqrt{x^{T}x}$$$
$$$|y|=|x|$$$
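A numerical check of |y| = |x| with the same kind of rotation matrix (the angle and the vector are arbitrary illustrative values):

import numpy as np

b = 1.2  # arbitrary angle
A = np.array([[np.cos(b), -np.sin(b)],
              [np.sin(b),  np.cos(b)]])  # orthonormal matrix
x = np.array([3.0, -4.0])
y = A @ x
print(np.linalg.norm(x), np.linalg.norm(y))  # both 5.0: length preserved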
======================================================================
The row vectors ($$$a_{1}, a_{2}, \dots, a_{N}$$$) of an orthonormal transform matrix form
a set of basis vectors which are all orthonormal.
If $$$a_{1}, a_{2}, \dots, a_{N}$$$ are orthonormal,
$$$a_{i}^{T}a_{j}=0$$$ if $$$i\neq j$$$
$$$a_{i}^{T}a_{j}=1$$$ if $$$i = j$$$
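Checking these pairwise dot products on the rows of a 2-D rotation matrix (the angle 0.5 is arbitrary):

import numpy as np

b = 0.5
A = np.array([[np.cos(b), -np.sin(b)],
              [np.sin(b),  np.cos(b)]])
a1, a2 = A[0], A[1]    # row vectors of A
print(np.dot(a1, a1))  # 1.0  (i = j)
print(np.dot(a1, a2))  # 0.0  (i != j)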
======================================================================
Suppose "one matrix A" which will be used as linear transform
y=Ax
Eigenvector of A represents "invariant direction" in vector space
======================================================================
When you perform y = Ax, consider an eigenvector v of A.
All points lying along the direction of eigenvector v of A remain
on that same direction after you perform y = Ax.
Their length is multiplied by the corresponding eigenvalue $$$\lambda$$$.
That is, the direction does not change from x to y;
only the length changes, by the factor $$$\lambda$$$.
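A sketch of this with numpy.linalg.eig, using an arbitrary symmetric matrix as the example (the matrix values are illustrative, not from the note):

import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])      # arbitrary symmetric matrix
vals, vecs = np.linalg.eig(A)
v = vecs[:, 0]                  # an eigenvector of A (column of vecs)
lam = vals[0]                   # its eigenvalue
y = A @ v                       # transform a point on the eigenvector direction
print(np.allclose(y, lam * v))  # True: same direction, length scaled by lambda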
======================================================================
Suppose a matrix A:
$$$A = \begin{bmatrix} \cos{\beta}&-\sin{\beta}&0 \\ \sin{\beta}&\cos{\beta}&0 \\ 0&0&1 \end{bmatrix}$$$
A is actually a rotation transform matrix:
column 1 : x
column 2 : y
column 3 : z
Rotation axis : z
Rotation angle : $$$\beta$$$
The x and y components change,
but z is fixed; z is the direction of an eigenvector.
Its eigenvalue is 1 because the length is also fixed.
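Verifying that the z-axis is an eigenvector of this rotation matrix with eigenvalue 1 (the angle beta = pi/6 is arbitrary):

import numpy as np

beta = np.pi / 6                   # arbitrary rotation angle
A = np.array([[np.cos(beta), -np.sin(beta), 0],
              [np.sin(beta),  np.cos(beta), 0],
              [0,             0,            1]])  # rotation about z-axis
z = np.array([0.0, 0.0, 1.0])      # z-axis direction
print(np.allclose(A @ z, 1.0 * z))  # True: eigenvector with eigenvalue 1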
======================================================================