German physicist Werner Heisenberg made great contributions to a vital area of modern physics, quantum theory – most notably with his Uncertainty Principle.
Quantum theory describes the world on the smallest scales, at the atomic and subatomic level. It is the most tested theory in physics and the foundation of some of the technology that has transformed the world in the last 50 years.
Quantum mechanics, as it is known, explains the behaviour of the component parts of atoms.
They do not conform to the normal laws of physics we experience every day – and, as a result, the theory is often hard for the non-specialist to understand, or even to believe.
Two of the founding fathers of this branch of physics are Heisenberg (1901-1976) and Austrian physicist Erwin Schrödinger (1887–1961).
It all began with an idea from German physicist Max Planck (1858–1947).
In 1900, he said that only certain amounts of energy are possible: the ‘allowed’ energies of a particle, such as an electron, depend upon the situation.
Albert Einstein (1879–1955) used Planck’s idea to work out that radiation, such as light, is composed of particles: photons.
In 1913, Danish physicist Niels Bohr (1885–1962) applied Planck’s idea to electrons in atoms.
At the centre of an atom is a concentration of positive charge, the nucleus.
Electrons orbit the nucleus – and the further from it they are, the higher their energy.
If Planck’s suggestion was correct, only certain orbits would be possible – nothing in between.
Using his idea, Bohr correctly suggested that electrons could jump up to a higher orbit by absorbing energy, and give off energy by jumping back down a level (moving closer to the nucleus).
The ‘spare’ energy would become one of Einstein’s photons: as electrons move between orbits, they absorb and emit light.
Bohr even worked out what the energy levels should be for the simplest atoms.
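Bohr's result for hydrogen, the simplest atom, can be sketched numerically. The formula below (the standard textbook form, not quoted from this article) gives the allowed energies as −13.6 eV divided by the square of the orbit number, and a jump between two levels emits a photon whose energy is the difference between them:

```python
# Sketch of Bohr's energy levels for hydrogen: E_n = -13.6 eV / n^2.
# A jump from a higher to a lower level emits a photon carrying the
# energy difference, whose wavelength follows from E = h*c / lambda.

H = 6.626e-34    # Planck's constant (J*s)
C = 2.998e8      # speed of light (m/s)
EV = 1.602e-19   # one electronvolt in joules

def energy_level(n):
    """Energy of the nth allowed orbit in hydrogen, in eV."""
    return -13.6 / n**2

def emitted_wavelength_nm(n_high, n_low):
    """Wavelength (nm) of the photon emitted in a jump from n_high to n_low."""
    delta_e = (energy_level(n_high) - energy_level(n_low)) * EV  # joules
    return H * C / delta_e * 1e9  # metres -> nanometres

# The jump from the third orbit to the second produces the famous
# red line in hydrogen's spectrum:
print(round(emitted_wavelength_nm(3, 2)), "nm")  # roughly 656 nm
```

Matching predictions like this against the measured colours of hydrogen's light was what convinced physicists that Bohr's quantised orbits were real.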
Physicists refined and extended Bohr’s idea to more complicated atoms, and all was well.
But in the 1920s things took a strange turn. Several experiments supported Einstein’s idea that electromagnetic radiation, normally thought of as waves, also behaved as particles.
Meanwhile the French physicist Louis de Broglie (1892–1987) suggested the opposite might also be true: particles such as electrons also behave as waves.
Several experiments proved his bizarre idea correct.
This ‘wave-particle duality’ has some strange consequences, since waves behave very differently from particles.
Waves, for example, spread out over large areas – how could a particle do that?
Electrons – only discovered in 1897 – had always behaved as particles, like solid balls.
But experiments showed that they have a ‘wavelength’ – just like light.
Higher-energy electrons have a shorter wavelength than lower-energy electrons.
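De Broglie's relation makes this concrete: a particle's wavelength is Planck's constant divided by its momentum, so faster electrons have shorter waves. A minimal sketch (standard physics, not drawn from this article; valid only for non-relativistic electron energies):

```python
# Sketch of de Broglie's relation, lambda = h / p: the higher an
# electron's energy (hence momentum), the shorter its wavelength.
import math

H = 6.626e-34    # Planck's constant (J*s)
M_E = 9.109e-31  # electron mass (kg)
EV = 1.602e-19   # one electronvolt in joules

def de_broglie_wavelength_nm(energy_ev):
    """Wavelength (nm) of an electron with the given kinetic energy.
    Uses the non-relativistic momentum p = sqrt(2*m*E)."""
    momentum = math.sqrt(2 * M_E * energy_ev * EV)
    return H / momentum * 1e9

for e in (10, 100, 1000):
    print(e, "eV ->", round(de_broglie_wavelength_nm(e), 3), "nm")
```

At around 100 eV the wavelength comes out near 0.1 nm, comparable to the spacing of atoms in a crystal – which is why crystals could diffract electrons and so reveal their wave nature.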
In 1926, Schrödinger produced a brilliant mathematical equation. Based on the physics of wave motion, it describes the behaviour of tiny particles such as electrons.
The equation, reportedly arrived at during a Christmas holiday with his mistress in the Alps, works very well and supports the idea of matter behaving as waves.
Physicists interpret the wave nature of particles in terms of probability: the wave describes how likely a particle is to be found in a particular location, and with a particular energy.
But one of the strange consequences of quantum theory forms the basis of Heisenberg’s Uncertainty Principle – that you cannot know the exact position and velocity of a subatomic particle at the same time: the more precisely one is pinned down, the less precisely the other can be known.
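The trade-off can be put in numbers. In its standard form (not quoted from this article), the principle says the uncertainty in position times the uncertainty in momentum can never be smaller than half the reduced Planck constant – so confining an electron to an atom-sized region forces a huge spread in its possible speed:

```python
# Sketch of the Uncertainty Principle: delta_x * delta_p >= hbar / 2.
# Pinning down an electron's position tightly forces a large minimum
# uncertainty in its momentum, and hence in its speed.

HBAR = 1.055e-34  # reduced Planck's constant (J*s)
M_E = 9.109e-31   # electron mass (kg)

def min_speed_spread(delta_x_m):
    """Smallest possible uncertainty in speed (m/s) for an electron
    whose position is known to within delta_x_m metres."""
    delta_p = HBAR / (2 * delta_x_m)  # minimum momentum uncertainty
    return delta_p / M_E

# Confine the electron to roughly an atom's width (~1e-10 m) and its
# speed becomes uncertain by hundreds of kilometres per second:
print(round(min_speed_spread(1e-10)), "m/s")
```

This is no failure of our instruments: it is a fundamental limit built into nature, which is why the Uncertainty Principle sits at the heart of quantum theory.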
Quantum theory has done more than give physicists a mathematical window on the reality of tiny systems like atoms.
It is routinely applied in many technologies, including lasers and microelectronics, and in making extremely accurate atomic clocks.