In linear algebra we learn/visualise a concept in 2D or 3D and then apply it to n-D. We cannot easily visualise 4D, 5D, ..., n-D; linear algebra simplifies and generalises these ideas to higher dimensions (# of features = # of dimensions).
VECTORS
There are 2 types of vectors :
1. Row Vector :
A = [a1, a2, a3, ..., an]   1*n (1 row and n columns).
2. Column Vector (default used in machine learning, for simplicity) :
B = [b1
     b2
     ...
     bn]   n*1 (n rows and 1 column).
Matrices are 2-D arrays, i.e. arrays of arrays.
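A minimal sketch of these ideas in code (assuming NumPy, which is not part of the original notes):

```python
import numpy as np

# Row vector: 1 row, n columns (shape (1, n)).
A = np.array([[1, 2, 3, 4]])
# Column vector: n rows, 1 column (shape (n, 1)) -- the ML default.
B = np.array([[1], [2], [3], [4]])
# The same column vector obtained by reshaping the row vector.
B_alt = A.reshape(-1, 1)
# A matrix is an array of arrays (here 2 rows, 3 columns).
M = np.array([[1, 2, 3], [4, 5, 6]])

print(A.shape, B.shape, B_alt.shape, M.shape)  # (1, 4) (4, 1) (4, 1) (2, 3)
```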
A point is represented as a vector.
eg :
(i) 2D
P (point) = [2, 3] -- here 2 and 3 are called components, i.e. the x1 component of P is 2 and the x2 component of P is 3. P = [2, 3] is a point / 2-D vector, i.e. it has 2 components.
(ii) 3D
It has 3 components ie x1,x2 and x3 .
(iii) ND
For an n-dimensional point we have n components, i.e. a vector X:
X = [x1, x2, x3, x4, x5, ..., xn] {n components, i.e. n dimensions}.
DISTANCE OF A POINT FROM ORIGIN
(i) In 2D
Point is represented as a vector in 2D
p = [2, 3]   ## 2, 3 are the components of the 2-D vector p.
d = √(a² + b²) {square root of the sum of the squared components}; here d = √(2² + 3²) = √13.
(ii) In 3D
q (vector of size 3) = [2, 3, 5].   ## d = √(2² + 3² + 5²); each component tells how far the point is along that axis from the origin.
(iii) ND
For an n-D point x = [x1, x2, ..., xn] the distance from the origin is d = √(x1² + x2² + ... + xn²), i.e. the norm ||x||.
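A quick sketch of the distance-from-origin computation (assuming NumPy):

```python
import numpy as np

p = np.array([2, 3])        # 2-D point
q = np.array([2, 3, 5])     # 3-D point

# Distance from the origin = square root of the sum of squared components.
print(np.sqrt(np.sum(p ** 2)))   # sqrt(2^2 + 3^2) ~= 3.606
# In n-D the same quantity is the L2 norm ||x||.
print(np.linalg.norm(q))         # sqrt(2^2 + 3^2 + 5^2) ~= 6.164
```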
DISTANCE B/W 2-POINTS
As always in linear algebra, we learn the concept in 2D/3D and then extend it to n-D: for two points a and b, the distance is d(a, b) = √((a1-b1)² + (a2-b2)² + ... + (an-bn)²) = ||a - b||.
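A small sketch of the distance between two points (assuming NumPy; the example points are hypothetical):

```python
import numpy as np

a = np.array([1, 2, 3])
b = np.array([4, 6, 3])

# Component-wise differences, squared, summed, then square-rooted.
d = np.sqrt(np.sum((a - b) ** 2))
print(d)                      # sqrt(3^2 + 4^2 + 0^2) = 5.0
print(np.linalg.norm(a - b))  # same result via the built-in norm
```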
DOT PRODUCT & ANGLE B/W 2 VECTORS
Let's say we have 2 vectors a & b :
a=[a1,a2,a3,....an]
b=[b1,b2,b3,....bn]
Addition of vectors is :
a + b = [a1+b1, a2+b2, a3+b3, ..., an+bn] {component-wise addition is done}
Multiplication of vectors is of 2 types :
1 : Dot-Product :- most widely used and most important.
2 : Cross-Product :- not used much in ML.
DOT PRODUCT :
Trigonometrically : a·b = ||a|| ||b|| cos Θ, where Θ is the angle between a and b.
Geometrically : a·b equals ||b|| times the length of the projection of a on b; in terms of components, a·b = a1b1 + a2b2 + ... + anbn.
DETERMINE ANGLE BETWEEN 2 VECTORS
We use the dot-product formula to get the angle b/w 2 vectors a & b.
If the dot product of 2 vectors is 0, then the 2 vectors are perpendicular (since cos 90° = 0; this can be proved using the above formula itself).
Geometrically, the dot product encodes the angle between the 2 vectors: we compute a·b from the components and then rearrange the formula to get cos Θ = (a·b) / (||a|| ||b||), which gives Θ.
For N-D :
a·b = a1b1 + a2b2 + ... + anbn = aᵀb
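A short sketch of the dot product and the angle it gives (assuming NumPy; the vectors are made-up values):

```python
import numpy as np

a = np.array([1.0, 0.0, 2.0])
b = np.array([2.0, 3.0, 1.0])

dot = np.dot(a, b)  # a1*b1 + a2*b2 + ... + an*bn = 1*2 + 0*3 + 2*1 = 4
cos_theta = dot / (np.linalg.norm(a) * np.linalg.norm(b))
theta = np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))  # clip guards against rounding
print(dot, theta)

# Perpendicular vectors have a zero dot product (cos 90 = 0).
print(np.dot(np.array([1, 0]), np.array([0, 5])))  # 0
```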
PROJECTION & UNIT-VECTOR
Projection :
Using only the component values of a and b we can easily get the projection of a on b: its length is (a·b) / ||b||.
Unit Vector :
A unit vector is a vector in the same direction as the original vector but with length 1: â = a / ||a||, so ||â|| = 1.
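A quick sketch of projection and unit vector from components (assuming NumPy; the vectors are made-up):

```python
import numpy as np

a = np.array([3.0, 4.0])
b = np.array([1.0, 0.0])

# Length of the projection of a on b, using only components.
proj_length = np.dot(a, b) / np.linalg.norm(b)      # 3.0
# The projection as a vector pointing along b.
proj_vector = (np.dot(a, b) / np.dot(b, b)) * b     # [3. 0.]
# Unit vector: same direction as a, length 1.
a_hat = a / np.linalg.norm(a)

print(proj_length, proj_vector, np.linalg.norm(a_hat))  # 3.0 [3. 0.] 1.0
```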
Equation of Line(2D),Plane(3D) & Hyper-plane (nD).
Here as well we learn a concept in 2-D and 3-D and then apply in higher dimension ie n-D .
Equation of a Line (2D) :
ax + by + c = 0
The general equation uses x1 and x2 as the point's coordinates (so that in 100-D we don't have any issue):
ax1 + bx2 + c = 0
Let a = w1 and b = w2; then the equation of a line in 2-D is:
w1x1 + w2x2 + c = 0
Equation of a Plane (3D) :
Similarly to the above, we get the general equation:
w1x1 + w2x2 + w3x3 + c = 0
A line in 2D corresponds to a plane in 3D and a hyper-plane in n-D.
So the equation of a hyper-plane in n-D is:
w0 + w1x1 + w2x2 + ...... + wnxn = 0
i.e. :
wᵀx + w0 = 0, where w = [w1, w2, ..., wn] and x = [x1, x2, ..., xn].
What is w0 (the constant value)? It is the intercept (bias) term that shifts the plane away from the origin.
Equation of a plane passing through the origin: w0 = 0, so it reduces to wᵀx = 0, i.e. w1x1 + w2x2 + ...... + wnxn = 0.
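A tiny sketch of evaluating the hyper-plane equation for a point (assuming NumPy; w, w0 and x are hypothetical values):

```python
import numpy as np

w = np.array([1.0, -2.0, 0.5])   # hypothetical normal vector of the plane
w0 = 1.0                         # intercept term
x = np.array([2.0, 1.0, 4.0])    # a point in 3-D

value = np.dot(w, x) + w0        # w1*x1 + w2*x2 + w3*x3 + w0
print(value)                     # 0 -> on the plane; the sign picks a half-space
```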
DISTANCE OF A POINT FROM A PLANE/HYPER-PLANE
The shortest distance is the perpendicular distance.
A plane in 3D space separates the whole region into 2 parts; each part is called a half-space.
If w is a unit vector then ||w|| = 1, so for a plane through the origin the distance of a point p from the plane is just
w·p = ||w|| ||p|| cos Θ ,
which is positive when w and p lie on the same side of the plane. If the value comes out negative, it is not a distance as such, so we take the positive (absolute) value; the negative sign just tells us in which half-space the point lies. (For a general plane w·x + w0 = 0, the signed distance is (w·p + w0) / ||w||.)
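A sketch of the signed-distance computation (assuming NumPy; w, w0 and p are hypothetical values):

```python
import numpy as np

w = np.array([1.0, 1.0])   # hypothetical normal vector
w0 = -2.0                  # intercept term
p = np.array([3.0, 3.0])   # query point

# Signed distance: sign tells the half-space, absolute value is the actual distance.
signed_d = (np.dot(w, p) + w0) / np.linalg.norm(w)
print(signed_d, abs(signed_d))   # ~2.828 here (point is on the positive side)
```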
Equation of Circle(2D),Sphere(3D) & Hyper-Sphere (nD).
(i) Circle :
x1² + x2² = r² for a circle of radius r centred at the origin; centred at (c1, c2) it is (x1-c1)² + (x2-c2)² = r².
(ii) Sphere & hyper-sphere :
In 3-D: (x1-c1)² + (x2-c2)² + (x3-c3)² = r². In n-D the hyper-sphere is Σ (xi-ci)² = r²; a point is inside if the left-hand side is < r² and outside if it is > r².
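A sketch of the inside/outside check for a hyper-sphere (assuming NumPy; centre, radius and point are hypothetical):

```python
import numpy as np

center = np.array([0.0, 0.0, 0.0])   # hypothetical centre c
r = 2.0                              # radius
x = np.array([1.0, 1.0, 1.0])        # query point

lhs = np.sum((x - center) ** 2)      # (x1-c1)^2 + ... + (xn-cn)^2
if lhs < r ** 2:
    print("inside the hyper-sphere")
elif lhs == r ** 2:
    print("on the surface")
else:
    print("outside the hyper-sphere")
```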
Equation of Ellipse(2D),Ellipsoid(3D) & Hyper-Ellipsoid (nD).
(i) 2-D Ellipse (egg-shape) :
x1²/a² + x2²/b² = 1 (centred at the origin, with semi-axes a and b).
(ii) 3-D ellipsoid (a soccer ball stretched by a different amount along each axis) & hyper-ellipsoid :
x1²/a1² + x2²/a2² + x3²/a3² = 1 in 3-D; in n-D the hyper-ellipsoid is Σ xi²/ai² = 1.
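A sketch of the same check for a hyper-ellipsoid centred at the origin (assuming NumPy; the semi-axes and point are hypothetical):

```python
import numpy as np

axes = np.array([3.0, 2.0, 1.0])   # hypothetical semi-axis lengths a1, a2, a3
x = np.array([1.0, 1.0, 0.5])      # query point

lhs = np.sum((x / axes) ** 2)      # x1^2/a1^2 + x2^2/a2^2 + ... + xn^2/an^2
print("inside" if lhs < 1 else "on the surface" if lhs == 1 else "outside")
```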
Equation of Rectangle/square(2D),Cuboid(3D) & Hyper-Cuboid (nD).
(i) 2D :
Show/learn the idea in 2D and extend it to higher dimensions.
An axis-aligned rectangle/square is described by its corner coordinates, e.g. a1 ≤ x1 ≤ b1 and a2 ≤ x2 ≤ b2; we simply check via if-else whether a point is inside or outside (see the sketch below).
(ii) 3D and nD
For a cuboid/hyper-cuboid it is the same idea with one condition per dimension: ai ≤ xi ≤ bi for every i.
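The if-else check mentioned above, as a sketch (assuming NumPy; the corners and point are hypothetical):

```python
import numpy as np

low = np.array([0.0, 0.0, 0.0])    # hypothetical "lower" corner (a1, a2, a3)
high = np.array([4.0, 2.0, 3.0])   # hypothetical "upper" corner (b1, b2, b3)
x = np.array([1.0, 1.5, 2.0])      # query point

# Inside only if ai <= xi <= bi holds for every dimension i.
if np.all((low <= x) & (x <= high)):
    print("inside the hyper-cuboid")
else:
    print("outside the hyper-cuboid")
```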
This was basic introduction to linear algebra for ML/AI .