
The Walsh-Hadamard transform

The fast Walsh-Hadamard transform (WHT) provides a path to fast random projections, which in turn enable locality-sensitive hashing, unbiased dimensionality reduction or expansion, compressive sensing, associative memory, extreme learning machines, and reservoir computing.
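For reference, here is a minimal NumPy sketch of the transform itself: an unnormalized in-place butterfly that runs in O(n log n) for power-of-two lengths. The dtype, normalization, and pure-Python loop are my own illustrative choices; an HPC library would use a cache-aware, vectorized implementation such as FFHT (linked at the end of the post).

```python
import numpy as np

def fwht(x):
    """Fast Walsh-Hadamard transform (unnormalized, Hadamard ordering).
    Length of x must be a power of 2; O(n log n) operations."""
    x = np.array(x, dtype=np.float64)
    n = x.size
    assert n & (n - 1) == 0, "length must be a power of 2"
    h = 1
    while h < n:
        for i in range(0, n, 2 * h):
            # Butterfly: combine each pair of half-blocks into sum and difference.
            a = x[i:i + h].copy()
            b = x[i + h:i + 2 * h].copy()
            x[i:i + h] = a + b
            x[i + h:i + 2 * h] = a - b
        h *= 2
    return x
```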

I hope you have code for it in your HPC/AI libraries!

The simplest way to get a random projection is to apply a predetermined random pattern of sign flips to an array of data, followed by the WHT.  Simply by binarizing the output you get a locality-sensitive hash.  If the bit outputs are +1/-1, then weighting each one and summing gives a recalled value.  To train: recall, calculate the error, and divide it by the number of bits; then add or subtract that amount from each weight, according to the sign of the corresponding bit, to drive the error to zero.  That gives an associative memory whose capacity is just short of the number of bits.
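A concrete sketch of that recipe follows. The dimension, the fixed sign pattern, the 1/sqrt(n) scaling, and the single-pass training loop are all my own illustrative choices rather than a prescribed design; the fwht helper is repeated from the sketch above so the block runs on its own.

```python
import numpy as np

rng = np.random.default_rng(0)

def fwht(x):
    """Unnormalized fast Walsh-Hadamard transform (power-of-2 length), as above."""
    x = np.array(x, dtype=np.float64)
    h = 1
    while h < x.size:
        for i in range(0, x.size, 2 * h):
            a, b = x[i:i + h].copy(), x[i + h:i + 2 * h].copy()
            x[i:i + h], x[i + h:i + 2 * h] = a + b, a - b
        h *= 2
    return x

n = 256                                    # dimension, power of 2 (assumption)
signs = rng.choice([-1.0, 1.0], size=n)    # predetermined random sign-flip pattern

def project(x):
    """Random projection: sign flips followed by the WHT, scaled to keep variance."""
    return fwht(signs * x) / np.sqrt(n)

def lsh_bits(x):
    """Binarize the projection to get a locality-sensitive hash in {-1, +1}."""
    return np.where(project(x) >= 0.0, 1.0, -1.0)

# Associative memory: one weight per bit, recall = weighted sum of the +/-1 bits.
weights = np.zeros(n)

def recall(x):
    return np.dot(weights, lsh_bits(x))

def train(x, target):
    """Spread the recall error evenly over the bits so this pattern recalls exactly."""
    global weights
    bits = lsh_bits(x)
    err = target - np.dot(weights, bits)
    weights += (err / n) * bits

# Usage: store a few (key, value) pairs, then read the last one back.
keys = [rng.standard_normal(n) for _ in range(10)]
vals = rng.standard_normal(10)
for k, v in zip(keys, vals):
    train(k, v)
print(recall(keys[-1]), vals[-1])          # last stored pattern is recalled exactly
```

With a single training sweep the most recently stored pattern is recalled exactly; patterns stored earlier pick up a small interference term, which repeated sweeps tend to reduce as long as the number of stored patterns stays below the number of bits.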

Instead of random sign flipping, you can use weighting to get more sophisticated projections (see the sketch after these links):

https://arxiv.org/abs/1610.06209

https://github.com/FALCONN-LIB/FFHT/issues/26
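The paper and issue above develop weighted, structured variants properly; the sketch below is only my simple illustration of the general idea, chaining a few fixed random diagonal weightings with WHTs. The Gaussian diagonals and the block count are assumptions of mine, not the construction from the paper, and the fwht helper is again repeated for self-containment.

```python
import numpy as np

rng = np.random.default_rng(1)

def fwht(x):
    """Unnormalized fast Walsh-Hadamard transform (power-of-2 length), as above."""
    x = np.array(x, dtype=np.float64)
    h = 1
    while h < x.size:
        for i in range(0, x.size, 2 * h):
            a, b = x[i:i + h].copy(), x[i + h:i + 2 * h].copy()
            x[i:i + h], x[i + h:i + 2 * h] = a + b, a - b
        h *= 2
    return x

n = 256
num_blocks = 3                                              # assumed block count
diagonals = [rng.standard_normal(n) for _ in range(num_blocks)]  # fixed random weights

def weighted_projection(x):
    """Chain of (fixed diagonal weighting -> WHT) blocks.
    With +/-1 diagonals this reduces to the sign-flip projection;
    continuous weights mix the input more smoothly."""
    y = x.astype(np.float64)
    for d in diagonals:
        y = fwht(d * y) / np.sqrt(n)    # rescale so the magnitude stays comparable
    return y

x = rng.standard_normal(n)
print(weighted_projection(x)[:4])
```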