# Introduction

In many fields of engineering, linear transforms play a key role in modeling and solving real-world problems. Often these transforms have an inherent structure that reduces the degrees of freedom in their parametrization. Moreover, this structure allows the action of the transform on a given vector to be described more efficiently than in the general case.

This structure can be exploited in two ways. First, storing these transforms as matrices, on computers typically an array of numbers in $\mathbb{C}$ or $\mathbb{R}$, may be unnecessary: each structure admits a more concise representation, which reduces memory consumption when working with these linear transforms. Second, the structure allows the transform to be applied to a vector more efficiently, which can lower the algorithmic complexity and thus save computing time.
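As a concrete, library-independent illustration of both benefits, consider a circulant matrix: it is fully defined by its first column, so only $n$ numbers need to be stored instead of $n^2$, and its action on a vector is a circular convolution, which the FFT computes in $O(n \log n)$ instead of $O(n^2)$. The following sketch uses plain NumPy; the function name is our own, not part of any particular library.

```python
import numpy as np

def circulant_matvec(c, x):
    """Multiply the circulant matrix with first column c by the vector x.

    Uses the FFT-based circular convolution, O(n log n) time and O(n)
    storage, instead of forming the dense n x n matrix.
    """
    return np.real(np.fft.ifft(np.fft.fft(c) * np.fft.fft(x)))

# Sanity check against the explicit dense circulant matrix.
rng = np.random.default_rng(0)
n = 8
c = rng.standard_normal(n)
x = rng.standard_normal(n)

# Column j of the circulant matrix is c cyclically shifted by j.
C = np.stack([np.roll(c, j) for j in range(n)], axis=1)
assert np.allclose(C @ x, circulant_matvec(c, x))
```

The same pattern, replacing a dense matrix-vector product by a structure-aware algorithm, applies to Toeplitz, Hadamard, Fourier and many other structured transforms.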

Still, these structural benefits have to be exploited, and it is not always easy to do so in a safe and reusable way. Moreover, in applications one often thinks of a linear transform as a matrix, and one's workflow is streamlined around this view, which is only natural but does not directly exploit the structure.

So there is a gap between what is natural and what is efficient. This is the gap fastmat tries to bridge: it lets you work with the provided objects as if they were ordinary matrices represented as arrays of numbers, while the algorithms that make up the internals are highly adapted to the specific structure at hand. It provides a set of tools for working with linear transforms that hides the algorithmic complexity and exposes the benefits in memory and computational efficiency without much overhead.

This way you can focus on what really matters to you, such as research and the development of algorithms, and leave the internals to fastmat.

# Types of Matrices

If you want to find out what types of structures we provide, have a look at the Classes. There is a whole zoo of them!

# Algorithms

We provide some algorithms that make use of the speedups offered by fastmat, enabling easy production use out of the box.
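Why do fast transforms speed up such algorithms? Many iterative methods touch the matrix only through matrix-vector products, so any structured operator with a fast product plugs in directly. As a hedged sketch (plain NumPy, not the fastmat API; the function name is our own), here is power iteration estimating the dominant eigenvalue of a symmetric operator given nothing but its matvec:

```python
import numpy as np

def power_iteration(matvec, n, iters=200, seed=0):
    """Estimate the dominant eigenvalue of a symmetric n x n operator.

    The operator is accessed only through its matvec callable, so a
    structured fast transform accelerates every iteration directly.
    """
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(n)
    v /= np.linalg.norm(v)
    for _ in range(iters):
        w = matvec(v)
        v = w / np.linalg.norm(w)
    # Rayleigh quotient of the converged direction.
    return v @ matvec(v)

# Usage: a diagonal operator, whose matvec costs O(n) instead of O(n^2).
d = np.array([1.0, 3.0, 5.0])
lam = power_iteration(lambda x: d * x, n=3)
assert abs(lam - 5.0) < 1e-6
```

Replacing the diagonal operator by any fast structured transform changes nothing in the algorithm, only the cost of each iteration.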

# Auxiliary Submodules

To dig into the documentation of our internals, have a look at the Submodules.

# SciPy Interface

We provide a neat and simple interface to the linear operators offered by SciPy, which allows the same lazy evaluation as the fastmat classes. See the SciPy Interface for further information.
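SciPy's `scipy.sparse.linalg.LinearOperator` follows the same idea of representing a matrix by its action rather than its entries. As a sketch of how such an operator interoperates with SciPy's solvers (standard SciPy API; the variable names and the choice of example operator are ours), this wraps an FFT-based circulant product and hands it to the conjugate gradient solver:

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

n = 16
# A symmetric positive definite circulant: a shifted periodic 1-D Laplacian.
c = np.zeros(n)
c[0], c[1], c[-1] = 3.0, -1.0, -1.0
ch = np.fft.fft(c)  # eigenvalues of the circulant, here all real and >= 1

# The operator is defined by its matvec alone; no dense matrix is formed.
A = LinearOperator(
    shape=(n, n),
    matvec=lambda x: np.real(np.fft.ifft(ch * np.fft.fft(x))),
    dtype=np.float64,
)

b = np.ones(n)
x, info = cg(A, b)          # CG only ever calls A's matvec
assert info == 0
assert np.allclose(A.matvec(x), b, atol=1e-4)
```

Because the solver only ever calls the operator's matvec, the fast structured product is used in every iteration, which is exactly the lazy-evaluation behavior the fastmat classes provide.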

# Publications

Since we created a package for scientific computing, it makes sense to use it for science. Below we list all publications that make use of our package to varying degrees. If you used fastmat in your publication, we are happy to reference it here:

If you use fastmat in your own work, we kindly ask you to cite the above-mentioned white paper as an acknowledgement.

# Public Appearances

Sometimes we also get out into the wild and present the package. The talks we have given can be found below.

# Contributions

There are many ways you can contribute as an individual. We are happy about feature requests, bug reports and, of course, contributions in the form of additional features. To these ends, please stop by at GitHub, where we organize the work on the package.

# Affiliations and Credits

Currently, the project is jointly maintained by Sebastian Semper and Christoph Wagner at the EMS group at TU Ilmenau.