Self-proclaimed "quantum mechanic" Seth Lloyd, an engineering professor at MIT and author of "Programming the Universe," has a new approach, called quantum machine learning, that applies quantum states to big-data problems. He claims that a 300-qubit quantum computer would be enough to simulate the entire universe, and he has a design for a quantum RAM to prove it: R. Colin Johnson @NextGenLog
Seth Lloyd, the self-proclaimed "quantum mechanic," recently revealed a patented new method for tackling big data: quantum machine learning.
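The 300-qubit figure presumably traces to the exponential size of a quantum register's state space: 300 qubits span 2^300, roughly 2 x 10^90 basis states, which already exceeds the commonly cited estimate of about 10^80 particles in the observable universe. Here is a minimal Python sketch of that back-of-the-envelope arithmetic; the connection to the claim is my own reading, not something spelled out in the article:

```python
# Back-of-the-envelope check (illustration only, not from the article):
# n qubits span a state space of 2**n basis states / amplitudes.
# The usual intuition behind "300 qubits" is that 2**300 already exceeds
# the rough estimate of ~10**80 particles in the observable universe.

n_qubits = 300
state_space = 2 ** n_qubits            # number of basis states for n qubits
particles_in_universe = 10 ** 80       # rough, commonly quoted estimate

print(f"2^{n_qubits} ~= {state_space:.3e}")                      # ~2.037e+90
print(f"Exceeds ~10^80 particles: {state_space > particles_in_universe}")
```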