Basically, I want to take a matrix and transform it so that its mean is 0 and its variance is 1. I'm using numpy arrays, so if numpy can already do this, that's better, but I can implement it myself as long as I can find an algorithm.
Edit: never mind, nimrodm has a better implementation.
Take each element, subtract the mean from it, and then divide by the standard deviation.
Shoot me, I don't know Python. In general, the above is:
    mu = mean(A)                 // mean of all elements of A
    sig = standardDeviation(A)   // standard deviation of all elements of A
    for (i = 0; i < rows; i++)
    {
        for (j = 0; j < cols; j++)
        {
            A[i][j] = (A[i][j] - mu) / sig;  // shift to mean 0, scale to variance 1
        }
    }
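For reference, here is a minimal, loop-based sketch of the same idea in plain Python, with the mean and standard deviation computed explicitly. The name standardize is just illustrative, and it assumes A is a list of equal-length lists of numbers:

    # Loop-based sketch: shift every element to mean 0, scale to std 1.
    def standardize(A):
        rows, cols = len(A), len(A[0])
        n = rows * cols
        mu = sum(sum(row) for row in A) / n  # mean of all elements
        # population variance: average squared distance from the mean
        var = sum((A[i][j] - mu) ** 2
                  for i in range(rows) for j in range(cols)) / n
        sig = var ** 0.5  # standard deviation
        for i in range(rows):
            for j in range(cols):
                A[i][j] = (A[i][j] - mu) / sig
        return A

    print(standardize([[1.0, 2.0], [3.0, 4.0]]))
    # [[-1.341..., -0.447...], [0.447..., 1.341...]]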
The following subtracts the mean of A from each element (the new mean is 0), then normalizes the result by the standard deviation.
    import numpy as np
    A = (A - np.mean(A)) / np.std(A)
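A quick sanity check on a small example matrix (values assumed for illustration): after the transformation, the overall mean should be ~0 and the standard deviation 1.

    import numpy as np

    A = np.array([[1.0, 2.0],
                  [3.0, 4.0]])
    A = (A - np.mean(A)) / np.std(A)
    print(np.mean(A))  # ~0.0 (up to floating-point error)
    print(np.std(A))   # 1.0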
The above standardizes the entire matrix as a whole. If A has multiple dimensions and you want to standardize each column individually, specify the axis:
    import numpy as np
    A = (A - np.mean(A, axis=0)) / np.std(A, axis=0)
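The same check for the per-column version, again on a small assumed matrix: each column ends up with mean ~0 and standard deviation 1.

    import numpy as np

    A = np.array([[1.0, 10.0],
                  [2.0, 20.0],
                  [3.0, 30.0]])
    A = (A - np.mean(A, axis=0)) / np.std(A, axis=0)
    print(np.mean(A, axis=0))  # ~[0. 0.]
    print(np.std(A, axis=0))   # [1. 1.]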
Always verify by hand what these one-liners are doing before integrating them into your code. A simple change in orientation or dimension can drastically (and silently) change the operations numpy performs on them.
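As one hypothetical illustration of how orientation matters: feed the per-column one-liner a single row and you end up dividing by a zero standard deviation, which yields nan with nothing but a RuntimeWarning.

    import numpy as np

    row = np.array([[1.0, 2.0, 3.0]])      # shape (1, 3): one row
    col = np.array([[1.0], [2.0], [3.0]])  # shape (3, 1): one column

    # With axis=0 each column is standardized on its own. For `row`,
    # every column holds a single element, so its std is 0 and the
    # division produces nan (numpy only emits a RuntimeWarning).
    print((row - np.mean(row, axis=0)) / np.std(row, axis=0))  # [[nan nan nan]]
    print((col - np.mean(col, axis=0)) / np.std(col, axis=0))  # [[-1.224...] [0.] [1.224...]]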