Clifford Algebras, Neural Nets, and the Brain

In “Back Propagation in a Clifford Algebra”, Pearson and Bisset (1992) discuss the interesting problem of replacing the real numbers in a neural net by elements of a Clifford algebra.  They replace the sigmoid activation function with

$$f(x) = \frac{x}{c + |x|/r}$$

where $c$ and $r$ are positive real constants and $|x|$ is the norm of the element in the Clifford algebra.  The derivation of the backpropagation algorithm is otherwise straightforward.
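To make the activation concrete, here is a minimal sketch (not Pearson and Bisset's code) that applies $f(x) = x/(c + |x|/r)$ to a quaternion, viewed as a four-component element of a Clifford algebra; the constants $c$ and $r$ and the coefficient representation are illustrative choices.

```python
import numpy as np

def clifford_activation(x, c=1.0, r=1.0):
    """Apply f(x) = x / (c + |x|/r) to a multivector.

    x is given by its coefficient vector; |x| is the Euclidean norm
    of the coefficients, which for quaternions coincides with the
    usual quaternion norm.
    """
    norm = np.linalg.norm(x)
    return x / (c + norm / r)

# Example: the quaternion 1 + 2i + 3j + 4k as a coefficient array.
q = np.array([1.0, 2.0, 3.0, 4.0])
print(clifford_activation(q))                   # shrinks q, preserving its direction
print(np.linalg.norm(clifford_activation(q)))   # always less than r when c > 0
```

The function rescales the element toward the ball of radius $r$ without changing its direction, so the "phase" of the multivector passes through the nonlinearity intact.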

In a later article, “Neural Networks in the Clifford Domain”, the same authors explain how complex numbers, quaternions, or Clifford algebras can convey electrical phase information between neurons, which might be necessary for a more accurate representation of how the brain actually works.  Signal processing and image processing applications may also benefit. They write,

“It is conjectured that complex valued feed-forward networks will be able to achieve better representations of problems that map into the complex domain naturally (such as phase and frequency information) than if the components of the signal were split up and presented to a real valued feed forward network.”
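A small illustrative sketch of the point in the quotation: a single complex-valued weight handles magnitude and phase together in one multiplication, whereas a real-valued network that is fed the split signal $[\mathrm{Re}(x), \mathrm{Im}(x)]$ has to learn the same rotation-and-scaling as an unconstrained $2 \times 2$ matrix. The weights and inputs below are made up for the example.

```python
import numpy as np

# Complex-valued neuron: one complex weight rotates and scales the input,
# so phase information propagates through the product w * x directly.
w = 0.8 * np.exp(1j * np.pi / 4)     # gain 0.8, phase shift of 45 degrees
x = 2.0 * np.exp(1j * np.pi / 6)     # input with its own phase
z = w * x
print(np.abs(z), np.angle(z))        # magnitudes multiply, phases add

# Real-valued equivalent acting on the split signal [Re(x), Im(x)]:
# the same operation is a 2x2 rotation-and-scaling matrix, a structure
# a generic real-valued network would have to discover on its own.
W = 0.8 * np.array([[np.cos(np.pi/4), -np.sin(np.pi/4)],
                    [np.sin(np.pi/4),  np.cos(np.pi/4)]])
v = W @ np.array([x.real, x.imag])
print(v)                             # same numbers, but the phase structure is implicit
```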