Minimum Viable Quantum Computational Whole — I

Pravir Malik
4 min read · Aug 6, 2023



In the double-slit experiment, individual photons projected one at a time through two slits build up a wave-like interference pattern on a screen behind them. It is assumed that as each photon leaves the source projector, it not only maintains its dual particle-wave nature (the wave aspect being shared with the past and future photons that belong to the experiment) but also takes on a vast range of values along all paths between projector and screen, until it reaches the screen and settles on a position that preserves the overall wave-like interference pattern.

This pattern is, of course, the outcome of the entire experiment, giving the sense that each photon is connected to the past and future of all photons in the experiment, as though an earlier photon already knew which position it had to fill to ensure a coherent final wave-like interference pattern. In the Copenhagen interpretation of quantum mechanics, the photon (or any other projected quantum object) is said to be in a superposition of infinitely many possible states until it collapses at the moment of measurement, in this case when it hits the screen.
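For reference, and not as anything specific to this post's argument, the Copenhagen picture just described is usually written as a superposition over possible screen positions whose detection outcomes follow the Born rule; the cross term below is what produces the fringes. This is the standard textbook formulation:

```latex
% Standard two-slit formulation (textbook notation):
% the photon state is a superposition over screen positions x,
% and detection probabilities follow the Born rule.
\[
  |\psi\rangle = \int \psi(x)\,|x\rangle\,dx,
  \qquad
  P(x) = |\psi(x)|^{2}
\]
% With amplitudes \psi_1 and \psi_2 for the two slits, the cross term
% 2\,\mathrm{Re}[\psi_1^{*}\psi_2] is what produces the interference fringes.
\[
  \psi(x) = \psi_{1}(x) + \psi_{2}(x)
  \;\Rightarrow\;
  P(x) = |\psi_{1}(x)|^{2} + |\psi_{2}(x)|^{2}
       + 2\,\mathrm{Re}\!\left[\psi_{1}^{*}(x)\,\psi_{2}(x)\right]
\]
```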

There are a couple of ways to look at this. If one views the experiment bottom-up, taking a reductionist approach from the point of view of each photon, then the perception of photons connected to past and future and assuming infinite values is perhaps reinforced. Alternatively, if the same experiment is viewed as a totality, the suggestion surfaces of a minimum viable whole wrapped in the space-time configuration encompassing the experiment. A functional aspect then comes to the forefront, in which the individual photons contributing to the final interference pattern can be thought of as having a functional basis, as summarized by the following graphic:

If we were to focus on each individual photon along its path, and then encompass all the paths and outcomes of all the photons in the experiment through statistics and probability, we might arrive at a probability-based, wave-like function that lets the accumulated pattern of numerous photons assume the wave-like interference pattern perceived at the end. But is such decomposition and recomposition using statistics and probability the same thing as operating at a minimum viable quantum computational whole, in which all functions of type f(x) belong to some meta-function F(X) that in fact presides over its space-time embodiment? We may take the route of equating our mathematical function with the interpretation that Nature actually operates that way; this is what the father of the quantum computing industry appears to have assumed (re: Feynman post). But is this correct?
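To make the decomposition-and-recomposition route concrete, here is a minimal sketch in Python (with idealized far-field amplitudes and hypothetical values for wavelength, slit separation, and screen distance) that builds up the fringe pattern one sampled detection at a time from the Born-rule density. It illustrates the statistical reconstruction being questioned above; it is not a claim about how Nature, or any particular quantum computer, actually produces the pattern.

```python
import numpy as np

# Idealized two-slit setup; all parameter values below are hypothetical.
wavelength = 500e-9   # photon wavelength in metres (assumed)
slit_sep = 50e-6      # slit separation d in metres (assumed)
distance = 1.0        # slit-to-screen distance L in metres (assumed)

# Screen positions and far-field amplitudes for the two paths.
x = np.linspace(-0.02, 0.02, 2000)
phase = np.pi * slit_sep * x / (wavelength * distance)
psi1 = np.exp(+1j * phase)   # amplitude via slit 1
psi2 = np.exp(-1j * phase)   # amplitude via slit 2

# Born rule: detection probability density proportional to |psi1 + psi2|^2.
density = np.abs(psi1 + psi2) ** 2
density /= density.sum()

# "Recomposition by statistics": detect photons one at a time by sampling
# the density, then histogram the hits to recover the fringe pattern.
rng = np.random.default_rng(0)
hits = rng.choice(x, size=10_000, p=density)
counts, _ = np.histogram(hits, bins=100, range=(x.min(), x.max()))

print(counts)  # fringes appear only in the aggregate of many single detections
```

The point of the sketch is only that the statistical route reproduces the pattern numerically; whether that is equivalent to operating at the minimum viable quantum computational whole, F(X), is precisely the question this series raises.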

Further, is it appropriate that we similarly focus on quantum dynamics at the level of a manufactured qubit, ultimately resorting to a process of reductionism instead of maintaining the minimum viable quantum computational whole where the possible gains of quantum computation exist? Is it appropriate to piece together such reduced quantum-level operation via statistics and probability to arrive at a computational outcome? In this approach, aren't we in fact losing touch with the whole, F(X), and reconstituting an inaccurate semblance of it through a falsely manufactured outcome, bolstered by our belief that this is how Nature operates?

A clue that we may be deviating from how Nature operates at quantum levels lies in the difficulties that standard quantum computational approaches face: decoherence, instability, the difficulty of propagating quantum states, poor scalability, and so on.

Taking an example from the realm of space and time: there is a limit to how accurately we can perceive and measure space and time, which in turn necessitates the reconceptualization that these may be granular rather than continuous phenomena. As some have hypothesized, it would require immense amounts of energy to probe the smallest quantum of space or time, which in turn would cause a collapse into a local black hole, where the known laws of physics break down. In other words, it is sometimes important to recognize when a direction or conception has to change.

If this is the case, is it then necessary to step away from our current approach to quantum computation, which focuses on an imagined whole seemingly contained by a manufactured qubit, and move to some other minimum required level of wholeness, in order to perceive a different logic of quanta?

I will dive into these considerations in future posts. But for now, know that there is already a series of Forbes articles that begins to highlight this notion of minimum viable quantum computational wholes, as made evident by natural, ubiquitous, successful, scalable, stable quantum computers based on atoms, molecular plans, and cells.

Perhaps it is these that need to be perceived and leveraged differently, thereby also allowing the possibilities of quantum computation to incubate differently.


Minimum Viable Quantum Computational Whole — II

Index to Cosmology of Light Links
