Questioning Quantum Mechanics

Some e-mails I sent to the NPA dissidents Yahoo group regarding quantum mechanics and quantum entanglement:


Hi there,

I have been looking into Aspect's experiment too, and I think there are quite a few remarkable things in there. First of all, it says that about 5 * 10^7 photon pairs are emitted per second by the source. That's a lot. Then it says that singles rates are over 10^4/s, that dark rates are about 10^2/s, and that the accidental rate is about 10/s. Finally, actual rates are in the order of 0-40/s.

Let's say for the sake of the argument that the "actual rate" is about 100/s. That means that 100% - (100 / 5*10^7) * 100% = 99.9998% of the supposedly generated pairs are somehow not labeled as "actual". I don't know, but it seems to me that if you throw away that many of your data points because they don't fit the desired results, you are doing something wrong.
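To make the arithmetic explicit, here is a minimal back-of-the-envelope check in Python (the 100/s "actual rate" is the for-the-sake-of-argument figure from above, not a number from the paper):

    pairs_emitted = 5e7    # photon pairs emitted per second by the source, as quoted
    actual_rate = 100.0    # assumed "actual" coincidence rate per second

    # Fraction of emitted pairs that never show up as "actual" coincidences.
    fraction_unaccounted = 1.0 - actual_rate / pairs_emitted
    print(f"{fraction_unaccounted:.6%}")   # prints 99.999800%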

Either way, there is another solution to the problem that I haven't seen anywhere, but since I have only spent a day or two researching this so far, I might have missed it.

Either way, the point is that the whole experiment is designed to distinguish between two hypotheses:

1. Quantum entanglement with "spooky action at a distance";

2. a completely a priori deterministic process.

However, when you have photons, i.e. electromagnetic waves, interacting with a "polarizer", then one should at least have an idea about how this interaction takes place. And one should ask the question whether the polarization process is completely random, or whether it might introduce some kind of conditionality that is not being accounted for.

Now you are talking about a physical "polarizer" consisting of atoms.

These things have a temperature. Inside the material, lots of electromagnetic oscillations take place, which among other things results in the radiation of infrared light with a wavelength in the order of 1-10 um. Now the wavelengths of the photons used in this experiment are 551.3 nm and 422.7 nm, so only a factor of roughly 2-25 less than the wavelengths radiated by the polarizer material, assuming it is operating at a temperature of about 300 K. I haven't read anything in the paper which suggests that any kind of temperature control is being applied to the polarizers, nor that these would be operating at much lower temperatures.
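As a quick sanity check on this comparison (standard physics, my own illustration): Wien's displacement law gives the peak wavelength of 300 K thermal radiation, which we can compare against the two photon wavelengths from the experiment:

    b = 2.897771955e-3    # Wien's displacement constant, m*K
    T = 300.0             # assumed polarizer temperature, K
    peak = b / T          # ~9.66e-6 m, i.e. roughly 10 um

    for lam in (551.3e-9, 422.7e-9):    # the two photon wavelengths, in m
        print(f"{peak / lam:.1f}x")     # ~17.5x and ~22.9x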

Now QM is based on the well-known double-slit experiment, which according to Wikipedia "demonstrates the fundamentally probabilistic nature of quantum mechanical phenomena". But does it?

Let's think about this for a while. According to Quantum Mechanics, the light falling onto the double slit consists of "photons" that are randomly emitted by *single* electrons, particles with a diameter of a fraction of the wavelength of the emitted light, when they change orbit around their nucleus at *random* moments.

Then, at the other side of the slits, we get a nice interference pattern. In other words: at the exit of the two slits, we have two signals that are nicely in phase, thus creating an interference pattern, while at the entrance we supposedly have one big pile of random garbage.

Now if we really want to believe this, then it would be a waste of time to design phased-array antennas for radar applications, for example. Supposedly all you need to do is place a (double-)slit plate in front of your antenna and your outgoing signal will be in phase automagically. I mean, if it works like that at light frequencies, why shouldn't it at the upper RF bands?

In other words: if you have a nice in-phase signal going *out* of your dual slits, you must also have an *incoming* signal that is in phase. So, the incoming light in the dual-slit experiment *must* be in phase in order for an interference pattern to appear, and can therefore not be the result of a "fundamentally probabilistic" phenomenon, and thus QM is fundamentally flawed. Light can be absorbed and/or (re)emitted by means of electrons changing orbits around their nuclei, but they do not do so at random. They do so in harmony or resonance with one another.
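The phase argument can be illustrated with a minimal classical-wave sketch (my own illustration; the geometry and numbers are made up for the example): two slit sources with a fixed relative phase produce fringes, while averaging over a random relative phase per emission event washes them out:

    import numpy as np

    wavelength = 500e-9                 # m, example visible-light wavelength
    k = 2 * np.pi / wavelength
    d, L = 50e-6, 1.0                   # slit separation and screen distance, m
    x = np.linspace(-0.02, 0.02, 2001)  # positions on the screen, m

    r1 = np.hypot(L, x - d / 2)         # path lengths from each slit
    r2 = np.hypot(L, x + d / 2)

    def intensity(dphi):
        # Two point sources; dphi is the relative phase between them.
        return np.abs(np.exp(1j * k * r1) + np.exp(1j * (k * r2 + dphi))) ** 2

    coherent = intensity(0.0)           # fixed relative phase: deep fringes
    rng = np.random.default_rng(0)
    incoherent = np.mean([intensity(p) for p in rng.uniform(0, 2 * np.pi, 500)],
                         axis=0)        # random phase per event: fringes average out

    print(coherent.max() / coherent.min())      # very large: strong fringes
    print(incoherent.max() / incoherent.min())  # close to 1: almost flat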

In other words: what the dual slit experiment shows is the fundamentally harmonic nature of, well, basically all of physical reality...

Now if a light source indeed emits photons by means of multiple atoms resonating with one another, then the same kind of process is likely to happen within a polarizer.

In other words: the way incoming photons interact with a polarizer is very complex, but since we know that wavelengths roughly an order of magnitude longer than the wavelength of the incoming photons are present at least at the surface of the polarizer material, the process that takes place is what is known as "heterodyning", the nonlinear mixing of waves, which results in the difference frequencies and their harmonics coming to the forefront.
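A minimal numeric sketch of heterodyning (my own illustration; the frequencies are arbitrary): multiplying two sinusoids, as a nonlinear mixer does, produces components at the difference and sum frequencies:

    import numpy as np

    fs = 10_000.0                    # sample rate, Hz (arbitrary for the demo)
    t = np.arange(0, 1.0, 1.0 / fs)  # one second of samples
    f1, f2 = 1000.0, 1150.0          # two example tone frequencies, Hz

    # Multiplication (nonlinear mixing) of the two tones: by the product-to-sum
    # identity, this contains components at f2 - f1 and f2 + f1.
    mixed = np.sin(2 * np.pi * f1 * t) * np.sin(2 * np.pi * f2 * t)

    spectrum = np.abs(np.fft.rfft(mixed))
    freqs = np.fft.rfftfreq(len(t), 1.0 / fs)
    print(freqs[spectrum > 0.5 * spectrum.max()])   # [ 150. 2150.]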

Now if both polarizers have about the same temperature, then, considering that the waves present at their boundaries are time-dependent, the process by which the photons are either absorbed or passed through the polarizer is also time-dependent. There are specific "windows of opportunity" in time that determine whether or not a photon can pass through. These occur when the ensemble of waves that is present at the boundary/surface of the polarizer meets certain conditions in terms of the phase of the (superposition of the) waves that are present.

This leads to the situation that the "coincidence rate" is determined by the overlap of these "windows of opportunity", which also depend on the angles of the polarizers with respect to the trajectory of the photons.
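As a toy illustration of this hypothesis (entirely my own sketch, not a model of the actual experiment; the window frequency, duty cycle, and angle-to-phase mapping are all made up), one can simulate two periodic transmission windows and count coincidences:

    import numpy as np

    rng = np.random.default_rng(0)

    def window_open(t, angle, f=100.0, duty=0.3):
        # Toy "window of opportunity": transmission is allowed only during a
        # fraction of each cycle, with a phase offset set by the analyzer angle.
        phase = (f * t + angle / np.pi) % 1.0
        return phase < duty

    t = rng.uniform(0.0, 10.0, size=100_000)  # random photon-pair arrival times
    for b in (0.0, np.pi / 8, np.pi / 4):     # example angles for one analyzer
        coincidence = np.mean(window_open(t, 0.0) & window_open(t, b))
        print(f"angle {b:.3f} rad: coincidence fraction {coincidence:.3f}")

The coincidence fraction in this toy model varies with the relative analyzer angle, which is the qualitative behavior the hypothesis is meant to capture.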

Now of course this process is so complicated and so fast that we cannot control it experimentally, and thus we can only describe it using probabilities.

In other words: we now have a hypothesis with which we can probably explain the observed behavior without the need for "spooky action at a distance", which is a ridiculous idea indeed.

Regards,

-- Arend --


Hi all,

There is a fundamental problem that has not been accounted for in ALL the experiments that have been conducted on the subject of "proving" entanglement, which is that it is impossible to conduct any kind of measurement that does not in one way or another involve the interaction of electromagnetic waves. Because of the particle-wave duality principle, we know for certain that at the nano-scale, electromagnetics is the dominant phenomenon that has to be taken into account.

So, in order to understand and explain the experiments that have been conducted, we have to understand the interactions between the measured particles or photons and the measurement equipment, which should be considered to be some kind of antenna structure (array).

One important consideration is the behavior of EM waves in the vicinity of the antenna structure, which differs from what happens further away from the structure. These are the so-called near and far fields, which are still not well understood by mainstream science, which resorts to the introduction of "virtual particles" in order to straighten things out:

http://en.wikipedia.org/wiki/Near_and_far_field

"In the quantum view of electromagnetic interactions, far-field effects are manifestations of real photons, whereas near-field effects are due to a mixture of real and virtual photons. Virtual photons composing near-field fluctuations and signals, have effects that are of far shorter range than those of real photons."

On my blog, you can find an article about what is wrong with Einstein's relativity theory and why we should return to the good old aether theory, with among other things a reference to the work of Dr. Cantrell, who shows that no disproof of the aether theory actually exists. Only the assumption that the Earth is moving relative to the aether has been disproven. You will also find references to Paul Stowe's excellent work on an ingeniously simple and elegant aether theory of everything:

http://www.tuks.nl/wiki/index.php/Main/Ruins96YearsEinsteinRelativity
http://www.tuks.nl/wiki/index.php/Main/StowePersonalEMail
http://www.tuks.nl/wiki/index.php/Main/StoweFoundationUnificationPhysics

Now since both the relativity theory and QM are fundamentally flawed, the only reasonable theory we have to date that has not been falsified is the aether theory. And Paul Stowe's outstanding work lays it all right in front of your eyes. Wholeheartedly recommended!

So, when we accept the existence of an aether with fluid-like properties and follow Stowe's thesis that gravity = Grad E (the gradient of the electric field), with the magnetic field being the rotation of the aether, we have a proper theoretical and conceptual foundation to work with.

And when we consider the existence of a fluid-like medium wherein all particles and EM waves exist and propagate, we can also beautifully visualize what particles look like. A very important ingredient in this picture are the vortices/rotating toruses that sustain the magnetic (rotational) component of an EM particle or photon, and thus we come to particles looking something like this:

[two illustrations of vortex/torus particle structures, not reproduced here]

It is often said that a picture tells more than a thousand words, and these two pictures really tell it all.

All right. So, we have an area around any antenna structure that is called "the near field", and we have the "far field", which consists of photons or particles with that very strange wave-particle duality character.

If a real aether medium with fluid-like properties exists, this can be simply explained by postulating that in the near field, the vicinity of the boundary between two materials with a different propagation constant (or density), we have the good old transverse waves, while in the far field the propagation mode of the near-field transverse wave transforms into these torus-like structures known as "particles" or "photons".

In order to consider how incoming photons would interact with a polarizer or other piece of filtering or measurement equipment, we can use an analogy: the surface of any piece of measurement equipment is very similar to the surface of a pond, across which all kinds of waves propagate. Of course, these waves are not confined to the surface. At the surface, we have transverse waves, and underneath the surface we also have longitudinal waves. The same goes for the area above the surface, where the air (medium) has to move along with the surface waves, and thus a wave is also present above the surface.

Then, the photons interacting with the measurement equipment act akin to rocks thrown into a pond at an angle. What happens then depends not only on the angle and velocity of the incoming rock/photon, but also on the particular state of the surface at the point of entry.

And since the waves that are present on the surface of the measurement equipment are a superposition of harmonic waves, we get an interaction that fundamentally depends on the particular state of the (electromagnetically) harmonically vibrating surface at the moment and place of impact.

It is clear that this results in a certain probability distribution that determines whether or not a photon or particle can pass through the surface of a (polarization) filter, a distribution which is, among other things, *harmonically* dependent on time.

In other words: all the experiments that pretend to "prove" the so-called "entanglement" idea are in reality the umpteenth example of experiments that support the existence of the aether, a simple and elegant theory that can do without resorting to totally out-of-the-blue concepts like the "collapse of the wave function", "spooky action at a distance" and "virtual particles" in order to define away the violations of just about all fundamental laws of nature. And since it also explains what gravity is in a simple and elegant way, Occam's razor comes to mind:

http://en.wikipedia.org/wiki/Occam's_razor

"Occam's razor (also written as Ockham's razor from William of Ockham, and in Latin lex parsimoniae) is a principle of parsimony, economy, or succinctness used in logic and problem-solving. It states that among competing hypotheses, the hypothesis with the fewest assumptions should be selected."

The aether theory only needs one assumption:

A real, physical medium with fluid-like properties exists...

Regards,

-- Arend --


Hi Arend

I followed the link to Paul Stowe's paper on unification. I applaud his efforts; however, he has made the classic mistake of not having someone in physics proofread his writings before publishing. Within the first page, his calculation for the speed of light is off by a factor of 6. The calculation for the proton and electron mass introduces new variables that are not defined in the nomenclature at the bottom of the paper. I realize it is extra work to clearly define every variable, but if one wants to communicate a new theory in physics, it is an expected requirement. He could be on to something with the vortex fluid for mass, but the toroidal ring model looks very much like the Dirac electron model. Added to this, the paper is somewhat too incomplete to follow his thoughts or pin down his math to see if he has it right. He needs to rewrite before running it up the flagpole. I, for one, am not saluting.

John H


Hi John,

Thank you for looking into it. I assume you refer to this paper: http://www.tuks.nl/wiki/index.php/Main/StoweFoundationUnificationPhysics

Which equation are you referring to when you say he is off by a factor of 6 in his calculation for the speed of light? I appear to be unable to spot it...

The new variable you are talking about in the calculation for the proton and electron mass is defined in equation 8:

-:-

Next, let's consider a volume that corresponds to Kelvin's vortex sponge. One must realize that changes in any one ring affect all others. As one ring dilates or contracts (resulting in volume/area changes of the ring), it must result in a responding sympathetic distortion of all other rings (quantum non-locality?) if the total volume of the system is to remain constant. The resulting coupling factor will consist of the square root of 3 (1.73... [the relationship of particle speed to transverse wave speed]) and the geometry of the ring (4 PI^2 [the area geometry constant of a toroidal ring]). This should lead to a numeric term that would be constant for these types of interactions. This term, designated a, then becomes:

 a = Sqrt(3) * (4 PI^2) {eq. 8}

which is 136.757 / 2 ... This is extremely close to inverse alpha (the quantum fine structure constant) at 137.036 / 2... with a difference of only 0.2%. We will later show that these and other differences are related to the magnetic moment anomaly {MMA = 1.001165923} already identified in physics. In the case above, 2a (MMA)^2 = 137.074 vs. 137.036...

-:-
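For what it's worth, the numbers in the quoted passage are easy to verify with a quick check (my own, not from the paper):

    import math

    a = math.sqrt(3) * 4 * math.pi ** 2   # eq. 8
    inv_alpha = 137.036                   # inverse fine-structure constant
    mma = 1.001165923                     # magnetic moment anomaly, as quoted

    print(2 * a)                                # 136.757...
    print(abs(2 * a - inv_alpha) / inv_alpha)   # ~0.002, i.e. the quoted 0.2%
    print(2 * a * mma ** 2)                     # ~137.076, vs. 137.036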

Either way, I won't dispute that there are some rough edges here and there, and that it needs more work, even a lot more work, in order to arrive at a complete theory.

However, what is very appealing to me are the concepts underneath the math. Personally, I am not too good with math. Sure, I wrestled through the math we got at university and passed the exams, but it is not my strongest subject, and my education was over 15 years ago. What I am good at is analyzing the physics by visualizing what happens in my head. To me, the underlying concepts are the most important.

What I see in Stowe's work is the idea of defining an ideal superfluid in terms of Newtonian physics and then working out how this connects, among other things, electromagnetism and gravity. Now I am an electrical engineer and have studied, among other things, the work of Eric Dollard and various free energy systems, most notably the car on water by Stan Meyer. One of the most intriguing theoretical controversies that came forward is the question of whether or not longitudinal electric waves can propagate through the vacuum. This is claimed to be the case by Eric Dollard, and I believe it to be true.

What I see wrong with the current EM model of mainstream science is that it sees "charge carriers" as the cause for EM fields to exist, while charge carriers are particles, which are also waves according to the wave-particle duality principle. Now supposedly EM waves can propagate through the vacuum because they are particles, but longitudinal waves cannot, because they supposedly need free charge carriers to be present in order to be able to propagate.

To me, it is clear that a deeper cause must be present for EM waves to exist, and that is why I am a proponent of an aether theory.

One problem that comes up is that longitudinal waves are pure pressure waves and do not have a magnetic component. Now you need some kind of momentum in order to be able to propagate a wave, which in the normal EM case is the magnetic component. For the longitudinal case, we have no concept for the momentum part.

Now when you go to Stowe's aether concept, you have a superfluid wherein the magnetic field is defined as the rotation of the fluid. So, within current EM theory, only the rotational part of the aether momentum is modelled. The non-rotational part, which is also present, is totally ignored. With Stowe's version, we do not have that problem, and thus we have a foundation for extending the current understanding of the EM field such that longitudinal waves can be described.
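The split being referred to here is, mathematically, the Helmholtz decomposition of a vector field into a rotational (divergence-free) part and a non-rotational (curl-free) part. A minimal numeric sketch (standard textbook math, my own illustration; the example field is made up):

    import numpy as np

    n = 64
    x = np.linspace(0, 2 * np.pi, n, endpoint=False)
    X, Y = np.meshgrid(x, x, indexing="ij")

    # Example field: a rotational (swirl) part plus a non-rotational (source) part.
    vx = -np.sin(Y) + np.sin(X)
    vy = np.sin(X) + np.sin(Y)

    kx = 2 * np.pi * np.fft.fftfreq(n, d=2 * np.pi / n)   # integer wavenumbers
    KX, KY = np.meshgrid(kx, kx, indexing="ij")
    k2 = KX ** 2 + KY ** 2
    k2[0, 0] = 1.0                        # avoid division by zero at k = 0

    vxh, vyh = np.fft.fft2(vx), np.fft.fft2(vy)
    proj = (KX * vxh + KY * vyh) / k2     # k . v(k) / |k|^2
    vLx = np.fft.ifft2(KX * proj).real    # curl-free (longitudinal) part
    vLy = np.fft.ifft2(KY * proj).real
    vTx, vTy = vx - vLx, vy - vLy         # divergence-free (rotational) remainder

    # For this example the exact split is (sin X, sin Y) + (-sin Y, sin X):
    print(np.allclose(vLx, np.sin(X)), np.allclose(vLy, np.sin(Y)))   # True True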

Then, his statement that "gravity = Grad E" is also intriguing, because it offers an understanding of the experiments conducted by T.T. Brown. By integrating gravity into the model like this, we get a complete aether theory, which also resolves the question of whether or not the aether is "entrained" (see Cantrell: http://www.infinite-energy.com/iemagazine/issue59/adissidentview.html ).

So, to me, Stowe's theory conceptually solves a lot of the problems we have with current EM theory, because the wave-particle duality principle all but demands gravity to be an integral part of EM theory. There may be many rough edges, and many refinements may be needed, but to me the conceptual foundation appears to be rock-solid.