Assume $a_{ij}\neq 0$. We say that row $k$ is reduced by row $i$ with respect to the $(i,j)$ entry if a scalar multiple of row $i$ is subtracted from row $k$ such that the $(k,j)$ entry becomes $0$.
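This reduction step can be sketched as follows; `reduce_row` is a hypothetical helper name, and the matrix is represented as a list of row lists:

```python
def reduce_row(A, k, i, j):
    """Reduce row k by row i w.r.t. the (i, j) entry:
    subtract the multiple A[k][j] / A[i][j] of row i from row k,
    so that the (k, j) entry becomes 0. Assumes A[i][j] != 0."""
    factor = A[k][j] / A[i][j]
    A[k] = [akl - factor * ail for akl, ail in zip(A[k], A[i])]

A = [[2, 1],
     [4, 5]]
reduce_row(A, 1, 0, 0)  # row 1 becomes [0.0, 3.0]
```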

Gaussian elimination

Let $A$ be a matrix with $m$ rows and $n$ columns. Rows can be labeled as "completed" or "uncompleted".

Initially, all rows are labeled as "uncompleted".

For $i=0,\ldots,m-1$ do:

Let $a_{st}$ be the leftmost nonzero element of the uncompleted rows, i.e., a nonzero entry of an uncompleted row whose column index $t$ is minimal.

($a_{st}$ is well defined if and only if not all uncompleted rows are zero.)

If $a_{st}$ is well defined, then label the $s$th row as "completed" and reduce all uncompleted rows by the $s$th row with respect to the $(s,t)$ entry.

If $a_{st}$ is not well defined, then label the first uncompleted row as "completed".
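The loop above can be sketched as follows. This is a minimal illustration, not the page's demonstration code: the `completed` and `uncompleted` lists are bookkeeping for the row labels, and the result lists the rows in the order they were completed:

```python
def gaussian_elimination(A):
    """Run the elimination on a list-of-rows matrix A (modified in place).
    Returns the rows permuted so completed rows appear top to bottom
    in the order they were completed."""
    m, n = len(A), len(A[0])
    completed = []                 # indices of completed rows, in order
    uncompleted = list(range(m))
    for _ in range(m):
        # Find the leftmost nonzero entry a_st among the uncompleted rows.
        pivot = None
        for t in range(n):
            for s in uncompleted:
                if A[s][t] != 0:
                    pivot = (s, t)
                    break
            if pivot is not None:
                break
        if pivot is None:
            # All uncompleted rows are zero: complete the first uncompleted row.
            completed.append(uncompleted.pop(0))
            continue
        s, t = pivot
        uncompleted.remove(s)
        completed.append(s)
        # Reduce every remaining uncompleted row by row s w.r.t. (s, t).
        for k in uncompleted:
            factor = A[k][t] / A[s][t]
            A[k] = [akl - factor * asl for akl, asl in zip(A[k], A[s])]
    return [A[r] for r in completed]

A = [[0, 2, 4],
     [1, 1, 1],
     [2, 2, 2]]
print(gaussian_elimination(A))  # [[1, 1, 1], [0.0, 2.0, 4.0], [0.0, 0.0, 0.0]]
```

Note that the rows are never physically swapped during the loop; the final permutation places completed rows on top, matching the observation below that completed rows end up above uncompleted ones.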

Demonstration

[Interactive demo: choose the matrix dimensions, precision (in digits), and magnitude of the matrix entries, then step through the elimination; the permuted rows can be shown or hidden.]

Observations

The algorithm proceeds row by row, from top to bottom.

In each step, the completed rows lie above the uncompleted rows.