(Does matrix algebra fall under what "algebra" means here?)
SVD can be applied to text classification.
One can store all document information in a term-frequency matrix $M=(m_{ij})$: each row corresponds to an article, each column to a word, and each entry $m_{ij}$ is some word-frequency measure (commonly TF–IDF) of the $j$th word in the $i$th article. Performing SVD on $M$ gives $M=U\Sigma V^*$ (with $U$ of size $m\times r$, $\Sigma$ of size $r\times r$ and $V$ of size $n\times r$, for $m$ articles, $n$ words and rank $r$), where $U$, $\Sigma$ and $V$ may be called the article–concept matrix, the singular-value matrix and the word–concept matrix, respectively. Each column of $U$ can be viewed as one category of articles sharing a topic, its entries measuring how strongly each article belongs to that topic; each row of $V^*$ can be viewed as one category of words with similar meanings, its entries measuring the importance of each word within that category. Finally, the diagonal entries of $\Sigma$ measure the strength of each concept, linking the article side to the word side.
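The decomposition above can be sketched numerically. The following is a minimal example on a hypothetical toy corpus (4 articles, 5 words, raw counts instead of TF–IDF for readability); the matrix values and the choice of rank $k=2$ are illustrative assumptions, not from the text.

```python
import numpy as np

# Hypothetical toy term-frequency matrix: 4 articles (rows) x 5 words (columns).
# In practice each entry would be a TF-IDF weight rather than a raw count.
M = np.array([
    [2, 1, 0, 0, 0],   # article 0: uses "sports"-like words 0-1
    [3, 2, 0, 0, 1],   # article 1: uses "sports"-like words 0-1
    [0, 0, 3, 2, 0],   # article 2: uses "finance"-like words 2-3
    [0, 1, 2, 3, 0],   # article 3: uses "finance"-like words 2-3
], dtype=float)

# Thin SVD: M = U @ diag(s) @ Vt, with s sorted in decreasing order.
U, s, Vt = np.linalg.svd(M, full_matrices=False)
print(U.shape, s.shape, Vt.shape)   # (4, 4) (4,) (4, 5)

# Keep only the k strongest concepts (largest singular values).
k = 2
M_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Column j of U[:, :k]: how strongly each article expresses concept j.
# Row i of Vt[:k, :]: concept i as a weighted combination of words.
# The rank-k product approximates M; the residual shrinks as k grows.
print(np.linalg.norm(M - M_k))
```

With these particular counts, the two leading concepts separate the "sports" articles from the "finance" articles, which is the clustering behavior the paragraph describes.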