# A Probabilistic Quantum Computer? Surely, You’re Joking, Dr. Feynman!

Renowned physicist Dr. Richard Feynman continues to have a profound impact on how the quantum computing industry defines itself. The birth of the industry can be traced to the talks he delivered and the papers he wrote on the subject. He apparently believed that Nature was probabilistic; fueled by the Copenhagen Interpretation of Quantum Mechanics, that remains the dominant belief amongst the scientific community.

Driven by his legendary curiosity, Feynman began to ponder what a computer that simulated physics would be like. In so doing, he arrived at proposals to simulate space, simulate time, and simulate Nature’s probability. Such simulations, or more exactly ‘imitations’ of Nature, could only be done by operating in the same way that Nature operates. Quantum operations such as superposition and entanglement would need to be leveraged so that such a computer would not become exponentially large as it sought to imitate larger volumes of Nature.

This idea of using quantum mechanisms to imitate physics within certain space-time boundaries became the basis of Feynman’s proposition to build a quantum computer. That in itself is a powerful idea. But what marginalizes it is the interpretation that Nature, and therefore physics at the quantum level, is probabilistic. This idea now seems to have been elevated to the level of *law*, evident in the thrust that quantum computers can only be *probabilistic* quantum computers, and has become the foundation of any effort to build quantum computers. But one has to pause here and ask how accurate such a probabilistic view of Nature really is.


Matter and Life, after all, arise and maintain a coherent and similar architecture regardless of granularity: across quantum particles, across atoms, across molecular plans in cells. (This has been part of the research work I have engaged in, summarized in a 10-book series on Cosmology of Light and a series of IEEE-published papers.) If this is the case, then that order must emanate from the quantum level itself.

To imagine, therefore, that a quantum object exists in infinite simultaneous states until measured, and that the act of measurement causes it to exhibit a state that is then sufficient for emulating Nature, or even equivalent to the entirety of what Nature is, is facile: an oversimplification of the real dynamics. In studying particle manifestation, yes, there may be infinite ways in which a particle shows up, varying by position or momentum, for example. But those states are surface states, and they need to be thought of as controlled by some meta-state that ensures the integrity of the function it represents.

Further, when looking at the variation in particle manifestation, scientists may apply probability and statistics to arrive at probabilistic outcomes regarding particle manifestation. But unless insight into the deeper arbitrating function surfaces, nothing has really been explained. Hence the title of this blog post: *“A Probabilistic Quantum Computer? Surely, You’re Joking, Dr. Feynman!”*, a play on the title of the book *Surely You’re Joking, Mr. Feynman!*

To gain insight into the set of arbitrating functions that exist at the quantum level, a different type of quantum computer is needed, one that will more accurately “read” what is happening. My purpose in writing the paper *“Enhancing Feynman’s Quantum Computational Positioning to Inject New Possibility into the Foundations of the Quantum Computing Industry”* was therefore not only to succinctly summarize the origin of today’s quantum computing industry but, more importantly, to suggest crucial enhancements to get to the **right type of quantum computer**.

This paper was accepted and delivered at IEEE IEMCON 2022 just a few days ago.

In this paper, I briefly cover the following:

- Feynman’s foundational positions with respect to quantum computation
- A different interpretation of quanta and quantum computing based on a vaster theory of light
- Enhancement of Feynman’s initial position based on the light-based quantum-computational model
- Summary of non-probabilistic reinterpretations of Heisenberg’s Uncertainty Principle, Schrödinger’s Wave Equation, and Euler’s Identity, all foundational in how the possibilities of quantum computation are perceived
- Suggestions on different developmental bases for the fledgling Quantum Computing industry

QIQuantum represents an alternative *Vision* for the quantum computing industry and a small push to open a very large door, yielding a pathway and journey into many unknowns. Many hands will be required to successfully open this door: physicists, chemists, biologists, nanotechnologists, mathematicians, engineers, and businesspeople, amongst others. In addition to the baseline Cosmology of Light book series and the series of IEEE papers, it is also necessary to offer guidance — based on the alternative vision — to different business stakeholders. That is why I am also beginning a series of articles on Forbes to do this. The first article was published just a few days ago:

Simply put, I view the probabilistic quantum computer as a first step only, and the vision represented by QIQuantum as something absolutely needed to push us more securely into any envisioned quantum future. Perhaps this sentiment is best summarized by Feynman himself: *“We are at the very beginning of time for the human race. It is not unreasonable that we grapple with problems. But there are tens of thousands of years in the future. Our responsibility is to do what we can, learn what we can, improve the solutions, and pass them on.”*