Showing posts with label linear algebra.

Sunday, July 17, 2022

rotation matrix



https://en.wikipedia.org/wiki/Rotation_matrix 
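A minimal NumPy sketch (my own toy example, not taken from the Wikipedia page) of the standard 2D counterclockwise rotation matrix R(θ) = [[cos θ, -sin θ], [sin θ, cos θ]]:

import numpy as np

def rotation_matrix_2d(theta):
    # counterclockwise rotation by angle theta (in radians)
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s],
                     [s,  c]])

R = rotation_matrix_2d(np.pi / 2)      # 90-degree rotation
print(R @ np.array([1.0, 0.0]))        # ~[0, 1]: the x-axis is rotated onto the y-axis
print(R @ R.T)                         # ~identity: rotation matrices are orthogonal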

Saturday, June 26, 2021

quantum speedup for alignment or alignment-free sequence analysis

 

Alignment is built on a score matrix, so a matrix method might speed up finding the best alignment?

https://www.youtube.com/watch?v=p2hZL38tqAs


A quantum solution for the maximal unique match might be a good starting point.

https://en.wikipedia.org/wiki/Sequence_alignment#Maximal_unique_match


alignment-free sequence analysis

https://bioinformaticsreview.com/20170704/role-of-information-theory-chaos-theory-and-linear-algebra-and-statistics-in-the-development-of-alignment-free-sequence-analysis/

The square of an adjacency matrix gives an estimate of paths: entry (i, j) of A² is the number of two-step walks from node i to node j.
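A quick NumPy check of this on a toy graph (my own example); more generally A^k counts k-step walks.

import numpy as np

# adjacency matrix of a small undirected graph with edges 0-1, 0-2, 1-2, 2-3
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]])

A2 = A @ A
print(A2)          # A2[i, j] = number of 2-step walks from i to j
print(A2[0, 3])    # 1: the single walk 0 -> 2 -> 3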

Quantum Walk and Graph search. 


Qin: sequence alignment can be converted to an all-to-all adjacency matrix. Sequence alignment might then become a quantum walk and graph search problem.
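A rough sketch of that conversion, assuming we already have some pairwise alignment/similarity score (score_pair below is a hypothetical stand-in, not any particular aligner): compute the all-to-all score matrix and threshold it into an adjacency matrix on which a (quantum or classical) walk or graph search could then run.

import numpy as np
from itertools import combinations

def score_pair(a, b):
    # hypothetical stand-in score: fraction of matching positions (not a real aligner)
    n = min(len(a), len(b))
    return sum(x == y for x, y in zip(a, b)) / n

seqs = ["ACGTAC", "ACGTTC", "TTGACA", "ACGAAC"]
m = len(seqs)

S = np.zeros((m, m))                  # all-to-all score matrix
for i, j in combinations(range(m), 2):
    S[i, j] = S[j, i] = score_pair(seqs[i], seqs[j])

A = (S >= 0.7).astype(int)            # threshold into an adjacency matrix
print(A)                              # the graph a walk / graph-search method would operate on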


Friday, April 10, 2020

Matrix eigenvalues


spectral theorem

https://youtu.be/KCANLl8z6PI

visual explanation
https://youtu.be/PFDu9oVAE-g
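A tiny NumPy sketch (my own example, not from the videos) of what the spectral theorem guarantees for a real symmetric matrix: real eigenvalues and an orthonormal basis of eigenvectors, so A = Q Λ Qᵀ.

import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])            # real symmetric matrix

vals, Q = np.linalg.eigh(A)           # eigh is the routine for symmetric/Hermitian matrices
print(vals)                           # real eigenvalues: [1. 3.]
print(Q.T @ Q)                        # ~identity: the eigenvectors are orthonormal
print(Q @ np.diag(vals) @ Q.T)        # reconstructs A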

Monday, August 26, 2019

algebraic multiplicity and geometric multiplicity

https://people.math.carleton.ca/~kcheung/math/notes/MATH1107/wk10/10_algebraic_and_geometric_multiplicities.html

The algebraic multiplicity of λ is the number of times λ is repeated as a root of the characteristic polynomial.

 Let A be an n × n matrix with eigenvalue λ. The geometric multiplicity of λ is the dimension of the eigenspace of λ.

In general, the algebraic multiplicity and geometric multiplicity of an eigenvalue can differ. However, the geometric multiplicity can never exceed the algebraic multiplicity.
It is a fact that summing up the algebraic multiplicities of all the eigenvalues of an n × n matrix A gives exactly n. If for every eigenvalue of A the geometric multiplicity equals the algebraic multiplicity, then A is said to be diagonalizable. As we will see, it is relatively easy to compute powers of a diagonalizable matrix.
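A small NumPy illustration (my own example) of the two multiplicities differing: for [[2, 1], [0, 2]], the eigenvalue λ = 2 has algebraic multiplicity 2 but geometric multiplicity 1, so the matrix is not diagonalizable.

import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 2.0]])
lam = 2.0

# algebraic multiplicity: how many times lam appears among the eigenvalues
alg_mult = int(np.sum(np.isclose(np.linalg.eigvals(A), lam)))

# geometric multiplicity: dimension of the eigenspace = n - rank(A - lam*I)
geo_mult = A.shape[0] - np.linalg.matrix_rank(A - lam * np.eye(2))

print(alg_mult, geo_mult)    # 2 1 -> geometric < algebraic, so A is not diagonalizable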



Friday, January 19, 2018

SVD

good and clear explanations
https://youtu.be/EfZsEFhHcNM
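A minimal NumPy sketch (my own example, not from the video) of the decomposition A = U Σ Vᵀ and a rank-1 reconstruction:

import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [0.0, 2.0]])

U, s, Vt = np.linalg.svd(A, full_matrices=False)
print(s)                                 # singular values, largest first
print(U @ np.diag(s) @ Vt)               # reconstructs A

A1 = s[0] * np.outer(U[:, 0], Vt[0, :])  # best rank-1 approximation of A
print(A1)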


Monday, December 29, 2014

orthogonal projection, orthogonality

orthogonal projection of vector $y$ onto $u$:
$\hat{y} = \frac{y \cdot u}{u \cdot u} u$


$U$ has orthonormal columns if and only if $U^T U = I$.

inner product of vectors, dot products, orthogonality

$u \cdot v = u^T v$

$u \cdot v = v \cdot u$

The length (or norm) of a vector $v$ is the square root of its inner product with itself: $\|v\| = \sqrt{v \cdot v}$. This can be seen from $v = [a, b]$, whose length (norm) is $\sqrt{a^2 + b^2}$.

$u \cdot v = \|u\| \, \|v\| \cos\theta$
$\|u - v\|^2 = \|u\|^2 + \|v\|^2 - 2\|u\| \, \|v\| \cos\theta$

Two vectors $u$ and $v$ are orthogonal if and only if $u \cdot v = 0$.




The orthogonal projection of a point $y$ onto a subspace $W$ with an orthogonal basis $\{u_1, u_2, \ldots, u_p\}$ can be found by summing the orthogonal projections of $y$ onto each basis vector $u_1, u_2, \ldots, u_p$.
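A short NumPy sketch (my own example) of both formulas: projecting onto a single vector $u$, and projecting onto a subspace $W$ by summing the projections onto an orthogonal basis $\{u_1, u_2\}$.

import numpy as np

def project_onto(y, u):
    # orthogonal projection of y onto the line spanned by u: (y.u / u.u) u
    return (y @ u) / (u @ u) * u

y  = np.array([1.0, 2.0, 3.0])
u1 = np.array([1.0, 0.0, 0.0])    # orthogonal basis of the x-y plane W
u2 = np.array([0.0, 1.0, 0.0])

y_hat = project_onto(y, u1) + project_onto(y, u2)
print(y_hat)                                 # [1. 2. 0.]: projection of y onto W
print((y - y_hat) @ u1, (y - y_hat) @ u2)    # both 0: the residual is orthogonal to W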