Linear algebra is the branch of mathematics concerning finite- or countably infinite-dimensional vector spaces, as well as linear mappings between such spaces. The subject is initially motivated by systems of linear equations in several unknowns, which are naturally represented using the formalism of matrices and vectors.
If T is a function from a vector space V to a vector space W, then T is called a linear transformation from V to W if the following two properties hold for all vectors u and v in V and for all scalars k:
(i) T(ku) = kT(u) [Homogeneity property]
(ii) T(u + v) = T(u) + T(v) [Additivity property]
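The two properties can be checked numerically for a concrete map. As a sketch (the 2×2 matrix below is an arbitrary illustrative choice, not taken from the text), consider T(v) = Av:

```python
# Sketch: verify homogeneity and additivity for the linear map
# T(v) = A v, where A = [[2, 1], [0, 3]] is an arbitrary 2x2 matrix.

def T(v):
    # Matrix-vector product with A = [[2, 1], [0, 3]].
    return (2 * v[0] + 1 * v[1], 0 * v[0] + 3 * v[1])

u = (1.0, 2.0)
v = (-3.0, 5.0)
k = 4.0

# (i) Homogeneity: T(ku) = k T(u)
assert T((k * u[0], k * u[1])) == (k * T(u)[0], k * T(u)[1])

# (ii) Additivity: T(u + v) = T(u) + T(v)
lhs = T((u[0] + v[0], u[1] + v[1]))
rhs = (T(u)[0] + T(v)[0], T(u)[1] + T(v)[1])
assert lhs == rhs
```

Any map given by a matrix satisfies both properties; a map such as T(v) = v + c with c ≠ 0 fails them.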
A basis for a vector space is a sequence of vectors that has two properties at once:
1. The vectors are linearly independent.
2. The vectors span the space.
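For n vectors in an n-dimensional space, both properties hold exactly when the matrix having those vectors as its columns has a nonzero determinant. A minimal sketch for R^3 (the helper names `det3` and `is_basis` are hypothetical, chosen for this illustration):

```python
# Sketch: three vectors in R^3 form a basis iff the 3x3 matrix
# with the vectors as columns has a nonzero determinant.

def det3(m):
    # Cofactor expansion along the first row.
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
            - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
            + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def is_basis(v1, v2, v3):
    # Place the vectors as columns of a 3x3 matrix.
    cols = [[v1[i], v2[i], v3[i]] for i in range(3)]
    return det3(cols) != 0

# The standard basis qualifies; replacing the third vector with
# v1 + v2 produces a linearly dependent set, which does not.
assert is_basis((1, 0, 0), (0, 1, 0), (0, 0, 1))
assert not is_basis((1, 0, 0), (0, 1, 0), (1, 1, 0))
```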
LU decomposition of matrices (and, within the routine, systematic use of elementary matrices) was introduced in Alan Turing's 1948 paper, which was motivated, in Turing's own words, by “the advent of electronic computers”.
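The idea is to factor a square matrix A as A = LU, with L lower triangular and U upper triangular. A minimal Doolittle-style sketch, assuming no row exchanges are needed (every pivot encountered is nonzero; production code would add partial pivoting):

```python
# Minimal Doolittle-style LU decomposition sketch (no pivoting;
# assumes every pivot encountered is nonzero).

def lu_decompose(a):
    n = len(a)
    # L starts as the identity; its unit diagonal is the Doolittle convention.
    L = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    U = [[0.0] * n for _ in range(n)]
    for i in range(n):
        # Row i of U.
        for j in range(i, n):
            U[i][j] = a[i][j] - sum(L[i][k] * U[k][j] for k in range(i))
        # Column i of L (below the diagonal).
        for j in range(i + 1, n):
            L[j][i] = (a[j][i] - sum(L[j][k] * U[k][i] for k in range(i))) / U[i][i]
    return L, U

A = [[4.0, 3.0], [6.0, 3.0]]
L, U = lu_decompose(A)
# L = [[1, 0], [1.5, 1]], U = [[4, 3], [0, -1.5]]
```

Once the factorization is known, each system Ax = b reduces to two cheap triangular solves, which is the practical payoff on a computer.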
The setting for all of linear algebra is a vector space. Intuitively, this is just a collection of objects, which we call vectors, together with rules for combining vectors to obtain other vectors.
The introduction and development of the notion of a matrix and the subject of linear algebra followed the development of determinants, which arose from the study of coefficients of systems of linear equations. Leibniz, one of the two founders of calculus, used determinants in 1693, and Cramer presented his determinant-based formula for solving systems of linear equations (today known as Cramer's Rule) in 1750. In contrast, the first implicit use of matrices occurred in Lagrange's work on bilinear forms in the late 1700s.
About 4000 years ago, the Babylonians knew how to solve a system of two linear equations in two unknowns (a 2 × 2 system). In their famous Nine Chapters on the Mathematical Art (c. 200 BC), the Chinese solved 3 × 3 systems by working solely with their (numerical) coefficients. These were prototypes of matrix methods.
Linear algebra is the simplest way to study functions of many variables, which in engineering usually arise from the discretization of a concept stated in terms of a continuum, e.g. the law governing the relation between stresses and strains in a structure. Linear programming, a mathematical technique widely used in economics, finds the maximum or minimum of a linear function of many variables subject to linear constraints.
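A linear objective over a region cut out by linear inequalities attains its optimum at a vertex of that region, so tiny problems can be solved by enumerating vertices. A sketch on a hypothetical two-variable example (the objective and constraints below are invented for illustration):

```python
# Tiny linear-programming sketch: maximize 3x + 2y subject to
# x + y <= 4, x + 3y <= 6, x >= 0, y >= 0.  The optimum of a linear
# objective lies at a vertex of the feasible region, so we intersect
# constraint boundaries pairwise and keep the feasible corner points.

from itertools import combinations

# Each constraint as (a, b, c), meaning a*x + b*y <= c.
constraints = [(1, 1, 4), (1, 3, 6), (-1, 0, 0), (0, -1, 0)]

def feasible(p, tol=1e-9):
    x, y = p
    return all(a * x + b * y <= c + tol for a, b, c in constraints)

vertices = []
for (a1, b1, c1), (a2, b2, c2) in combinations(constraints, 2):
    det = a1 * b2 - a2 * b1
    if det == 0:
        continue  # parallel boundary lines never intersect
    # Solve the 2x2 system a1*x + b1*y = c1, a2*x + b2*y = c2.
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    if feasible((x, y)):
        vertices.append((x, y))

best = max(vertices, key=lambda p: 3 * p[0] + 2 * p[1])
# best is the corner (4, 0), where 3x + 2y = 12.
```

Vertex enumeration grows combinatorially, which is why real solvers use the simplex method or interior-point methods instead.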
Linear algebra provides the foundational setting for the study of multivariable mathematics, which in turn is the bedrock upon which most modern theories of mathematical physics rest, including classical mechanics (rigid body mechanics), continuum mechanics (the mechanics of deformable material bodies), relativistic mechanics, quantum mechanics, etc. At the heart of linear algebra is the notion of a (linear) vector space, an abstract mathematical structure introduced to make rigorous the classical, intuitive concept of a vector as a physical quantity possessing the two attributes of length and direction.
Linear algebra is the study of systems of linear equations and their transformation properties. Linear algebra allows the analysis of rotations in space, least squares fitting, solution of coupled differential equations, determination of a circle passing through three given points, as well as many other problems in mathematics, physics, and engineering.
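The circle-through-three-points problem illustrates how a seemingly nonlinear task reduces to a linear system: writing the circle as x² + y² + Dx + Ey + F = 0, each point gives one linear equation in the unknowns D, E, F. A sketch using Gaussian elimination (the function name `circle_through` is hypothetical):

```python
# Sketch: the circle through three points, found by solving the 3x3
# linear system D*x + E*y + F = -(x^2 + y^2), one equation per point.

from math import sqrt

def circle_through(p1, p2, p3):
    # Augmented rows [x, y, 1 | -(x^2 + y^2)] for each point.
    rows = [[x, y, 1.0, -(x * x + y * y)] for x, y in (p1, p2, p3)]
    # Forward elimination with partial pivoting.
    for i in range(3):
        pivot = max(range(i, 3), key=lambda r: abs(rows[r][i]))
        rows[i], rows[pivot] = rows[pivot], rows[i]
        for r in range(i + 1, 3):
            f = rows[r][i] / rows[i][i]
            for c in range(i, 4):
                rows[r][c] -= f * rows[i][c]
    # Back substitution for (D, E, F).
    sol = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):
        sol[i] = (rows[i][3] - sum(rows[i][c] * sol[c] for c in range(i + 1, 3))) / rows[i][i]
    D, E, F = sol
    center = (-D / 2, -E / 2)
    radius = sqrt(center[0] ** 2 + center[1] ** 2 - F)
    return center, radius

center, radius = circle_through((1, 0), (0, 1), (-1, 0))
# These three points lie on the unit circle: center (0, 0), radius 1.
```

If the three points are collinear, the elimination hits a zero pivot, matching the geometric fact that no circle passes through them.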
Every eigenvalue λ and corresponding eigenvector v of a given square matrix A satisfy the equation Av = λv.
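One standard way to compute a dominant eigenpair numerically is power iteration: repeatedly apply A and normalize. A sketch (the 2×2 symmetric matrix below is an arbitrary illustrative choice whose eigenvalues are 3 and 1):

```python
# Sketch: power iteration on A = [[2, 1], [1, 2]].  Repeatedly
# applying A and normalizing converges to the dominant eigenvector;
# the Rayleigh quotient then estimates the dominant eigenvalue.

from math import sqrt

A = [[2.0, 1.0], [1.0, 2.0]]  # eigenvalues 3 and 1

def matvec(m, v):
    return [sum(m[i][j] * v[j] for j in range(len(v))) for i in range(len(m))]

v = [1.0, 0.0]  # any start vector not orthogonal to the dominant eigenvector
for _ in range(50):
    w = matvec(A, v)
    norm = sqrt(sum(x * x for x in w))
    v = [x / norm for x in w]

# Rayleigh quotient: lam = v^T A v for unit v.
lam = sum(matvec(A, v)[i] * v[i] for i in range(2))
# lam is approximately 3, and v is approximately (1/sqrt(2), 1/sqrt(2)),
# so A v = lam v holds to numerical precision.
```

The iterate converges at a rate governed by the ratio of the two largest eigenvalue magnitudes, here 1/3 per step.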
Car designers analyze eigenvalues in order to damp out vibration and noise so that the occupants have a quiet ride. Eigenvalue analysis is also used in the design of car stereo systems, so that sound is directed correctly for the listening pleasure of the driver and passengers.