Monday, April 19, 2010

Electroweak Unification: The Science Behind Voltes V?

Supposedly powered by ultraelectromagnetic technology, does the science behind Voltes V mirror theoretical physicists’ search for a unified theory within the Standard Model?


By: Ringo Bones


When the Voltes V series first aired back in 1977 – 1978, many a fan wondered whether the technology behind this gigantic (58-meter) anthropomorphic robot was based more on science fact than science fiction. It is safe to assume that a significant number of Voltes V fans – and probably its real-life creators and writers – closely followed the latest trends in theoretical physics, especially when you consider that much of the whiz-bang technology used in the series mirrors the theoretical work of Sheldon L. Glashow, Abdus Salam, and Steven Weinberg that eventually won them the 1979 Nobel Prize in Physics for their complementary research on the Weinberg-Salam Theory of Weak Interactions.

The ultraelectromagnetic technology used by Dr. Armstrong (in the English version of Voltes V) that made Voltes V possible is probably 200 to 500 years more advanced than the technology that existed back in the late 1970s. But some of us already have a hint of how to make this technology possible via the cutting-edge ideas of theoretical physics of the time. If we someday manage to unify the two very disparate pillars of modern physics – namely Einstein’s general relativity and quantum mechanics – will we ever achieve a technology like that used in Voltes V?

Even though the Voltes V series never told us how far Boazanian theoretical physicists have gone in pushing their particle accelerators toward the Strong-Electroweak Unification scale at 10,000,000,000,000,000 gigaelectron volts (10¹⁶ GeV) or the Planck Scale at 1,000,000,000,000,000,000 gigaelectron volts (10¹⁸ GeV), it does imply – given the abilities of Voltes V and of their Beast Fighters – that they probably got very close to the Planck Scale energy level. Even with late-1970s technology – the Stanford Linear Accelerator was probably the most advanced particle accelerator at the time – and the experimental confirmation of Abdus Salam and Steven Weinberg’s theoretical work, we are still probably a few centuries away from achieving Voltes V-like technology. Getting there requires the verification of the Standard Model – or the Standard Model with Supersymmetry factored in – and you can only do that by building ever more powerful particle accelerators.
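
As a back-of-the-envelope check on those numbers, here is a short Python sketch; the values are standard textbook constants and rough accelerator figures of my own choosing, not anything taken from the series or this article. Note that the conventional Planck energy works out to roughly 10¹⁹ GeV, while dividing by √(8π) gives the “reduced” Planck scale of a few times 10¹⁸ GeV, closer to the figure quoted above.

```python
# Back-of-the-envelope check of the energy scales mentioned above. All
# values are standard textbook constants or rough accelerator figures,
# not anything taken from the Voltes V series itself.
import math

hbar = 1.054571817e-34       # reduced Planck constant, J*s
c    = 2.99792458e8          # speed of light, m/s
G    = 6.67430e-11           # Newton's constant, m^3 kg^-1 s^-2
J_PER_GEV = 1.602176634e-10  # joules per gigaelectron volt

# Planck energy: E_P = sqrt(hbar * c^5 / G)
planck_energy_gev = math.sqrt(hbar * c**5 / G) / J_PER_GEV

gut_scale_gev = 1e16    # rough Strong-Electroweak (grand) unification scale
slac_gev      = 50.0    # order of magnitude of late-1970s SLAC beam energies
lhc_beam_gev  = 7e3     # LHC design beam energy, 7 TeV per proton

print(f"Planck energy ~ {planck_energy_gev:.2e} GeV")
print(f"Late-1970s SLAC to GUT scale: ~{math.log10(gut_scale_gev / slac_gev):.0f} orders of magnitude")
print(f"LHC beams to GUT scale:       ~{math.log10(gut_scale_gev / lhc_beam_gev):.0f} orders of magnitude")
```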

Back in the real world, to complete our quest for the verification of the Standard Model, we still need to confirm the existence of scalar fields – fields that don’t carry a sense of direction, unlike the electric and magnetic fields and the other fields of the Standard Model – and to find out how many types of them there are. This necessitates the verification and confirmation of the existence of new elementary particles – often called Higgs particles, like the famed Higgs Boson – that can be recognized as the quanta of these fields. Given the recent mainstream-press fanfare over the Higgs Boson – often dubbed the “God Particle” – theoretical physicists have every reason to expect that this task will be accomplished by the Large Hadron Collider (LHC) at CERN before 2020. Many in the scientific community say that the “new science” created by uncovering the properties of the Higgs Boson will eventually explain the inner workings of the dark matter and dark energy lurking in intergalactic space, but will it?
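
For a sense of where that scalar field’s energy scale sits, here is a minimal sketch using the standard textbook relation between the Fermi constant (measured in muon decay) and the vacuum expectation value of the Higgs field, v = (√2 G_F)^(-1/2); the numbers are standard values, not anything specific to this article.

```python
# A minimal numerical sketch of where the electroweak "scalar field" scale
# sits, using the standard textbook relation between the Fermi constant
# and the Higgs field's vacuum expectation value: v = (sqrt(2) * G_F)^(-1/2).
# G_F is known from muon decay; nothing here comes from the article itself.
import math

G_F = 1.1663787e-5   # Fermi constant in GeV^-2
v = 1.0 / math.sqrt(math.sqrt(2.0) * G_F)

print(f"Higgs vacuum expectation value v ~ {v:.0f} GeV")  # ~246 GeV
# The physical Higgs boson is the quantum of oscillations of this scalar
# field around v; its mass was still unknown when this post was written.
```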

An overwhelming majority of theoretical physicists fell in love with the Standard Model because it is a quantum field theory of a special kind – one that is “renormalizable”. The term renormalizable goes back to the 1940s, when physicists were learning how to use the first quantum field theories to calculate small shifts in atomic energy levels. They soon found out that calculations using quantum field theory have a nasty habit of producing infinite quantities – a situation that usually means a theory is badly flawed or is being pushed beyond its limits of validity, like trying to use Einstein’s general relativity to describe the conditions at a black hole’s singularity.
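
The flavour of the problem can be shown with a toy example of my own – purely illustrative, not an actual Standard Model calculation: a logarithmically divergent integral grows without bound as the cutoff is raised, yet the difference between its values at two physical scales stays finite, and that cutoff-independent difference is all that experiments ever see.

```python
# A toy illustration (not an actual Standard Model calculation) of the kind
# of divergence renormalization deals with. The integral of dk/k from a
# probe scale mu up to a cutoff Lambda grows without bound as Lambda -> inf,
# but the *difference* between its values at two physical scales mu1, mu2
# is independent of the cutoff.
import math

def log_divergent_piece(mu, cutoff):
    """Integral of dk/k from mu to cutoff = ln(cutoff / mu)."""
    return math.log(cutoff / mu)

mu1, mu2 = 1.0, 91.0   # two arbitrary probe scales in GeV (91 GeV ~ Z mass)

for cutoff in (1e3, 1e9, 1e16):
    a = log_divergent_piece(mu1, cutoff)   # blows up as the cutoff grows...
    b = log_divergent_piece(mu2, cutoff)
    print(f"cutoff={cutoff:.0e}  I(mu1)={a:6.2f}  I(mu2)={b:6.2f}  "
          f"difference={a - b:.4f}")       # ...but the difference stays fixed
```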

In time, physicists found a way to deal with these infinite quantities by absorbing them into a redefinition, or renormalization, of just a few physical constants – such as the charge and mass of the electron. It is also worth pointing out that the minimal version of the Standard Model – with just one scalar particle – has 18 of these constants. Theories in which this procedure worked were called renormalizable, and they had a simpler structure than nonrenormalizable theories.
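
For the record, one common way of arriving at that count of 18 is tallied below; this bookkeeping is mine, not the article’s, and conventions differ – the strong CP angle is often listed as a 19th parameter.

```python
# One common tally of the 18 adjustable constants of the minimal Standard
# Model mentioned above (conventions vary; the strong CP angle theta_QCD
# is often quoted as a 19th parameter).
parameter_counts = {
    "quark masses":               6,   # u, d, s, c, b, t
    "charged lepton masses":      3,   # e, mu, tau
    "CKM quark-mixing angles":    3,
    "CKM CP-violating phase":     1,
    "gauge couplings":            3,   # SU(3), SU(2), U(1)
    "Higgs sector (mass + vev)":  2,   # or mass + quartic self-coupling
}

total = sum(parameter_counts.values())
print(f"Total free constants: {total}")   # 18
```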

Given the capabilities of the particle accelerators that we already have – or have used so far – theoretical physicists coined the term “hierarchy problem”, which seems to indicate our ignorance of how the universe works. Experiments probing up to an energy of about 200 gigaelectron volts – probably a bit higher in the immediate future once the LHC at CERN becomes operational again – have revealed an assortment of particles up to the Top Quark, and interactions up to the Electroweak Unification scale – the one that won Glashow, Salam, and Weinberg the 1979 Nobel Prize in Physics – that are remarkably well described by the Standard Model.
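
The size of that hierarchy is easy to put a number on – a one-line piece of arithmetic comparing the electroweak scale with the Planck scale; the figures below are the standard textbook ones (v ≈ 246 GeV, E_Planck ≈ 1.22 × 10¹⁹ GeV), not the article’s.

```python
# The "hierarchy problem" in one line of arithmetic: the electroweak scale
# that experiments have probed is absurdly tiny compared with the Planck
# scale, and the Standard Model offers no explanation for the gap.
electroweak_scale_gev = 246.0    # Higgs vacuum expectation value
planck_scale_gev      = 1.22e19  # conventional Planck energy

ratio = electroweak_scale_gev / planck_scale_gev
print(f"Electroweak / Planck scale ratio ~ {ratio:.1e}")  # ~2e-17
```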

Yet the Standard Model has nothing to say about the two further interaction energy scales – the Strong-Electroweak Unification scale near the 10,000,000,000,000,000-gigaelectron volt (10¹⁶ GeV) level, and the Planck Scale – characteristic of quantum gravity, believed to hold at the singularities of black holes – at around 1,000,000,000,000,000,000 gigaelectron volts (10¹⁸ GeV). During the Reagan Administration, though, the US president did provide funds for the Strategic Defense Initiative (SDI) / Star Wars missile defense program to build a 1,000,000,000,000,000,000-GeV-capable particle beam deemed powerful enough to shoot down incoming nuclear ballistic missile warheads with 100% certainty.

In reality, there is virtually no chance that we will be able to do experiments – let alone build a “death ray” – involving processes at particle energies in the 10,000,000,000,000,000-gigaelectron volt (10¹⁶ GeV) region, because with our present technology – i.e. 2010-era technology – the diameter of a particle accelerator is proportional to the energy to which the particles are accelerated. To accelerate particles to an energy level of 10¹⁶ GeV would require a particle accelerator a few light-years across. Even if some genius found another way to concentrate macroscopic amounts of energy on a single particle, the rates of interesting processes at these energies would be too slow to yield useful information. There’s even an episode of Voltes V in which they encounter a similar problem: a laser powerful enough to penetrate a Beast Fighter’s new top-secret armor requires almost all the energy generated by Camp Big Falcon.
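
That “few light-years across” figure can be sanity-checked with a rough scaling sketch, assuming a circular machine filled with LHC-strength (roughly 8-tesla) dipole magnets and using the ultra-relativistic bending-radius relation R ≈ E/(qBc). All the numbers below are my assumptions rather than anything from the article, and they land within an order of magnitude of the claim.

```python
# A rough check of the "a few light-years across" claim above, assuming a
# circular machine filled with LHC-strength dipole magnets. For an
# ultra-relativistic proton the bending radius is R ~ E / (q * B * c),
# so the machine's size scales linearly with beam energy. All numbers
# below are assumptions for this sketch, not figures from the article.
e_charge   = 1.602176634e-19   # proton charge, C
c          = 2.99792458e8      # speed of light, m/s
B          = 8.33              # LHC dipole field, tesla
LIGHT_YEAR = 9.4607e15         # metres
J_PER_GEV  = 1.602176634e-10   # joules per GeV

def bending_radius_m(energy_gev):
    """Bending radius of an ultra-relativistic proton in a field B."""
    return (energy_gev * J_PER_GEV) / (e_charge * B * c)

for energy_gev in (7e3, 1e16):   # LHC beam energy vs. the GUT scale
    r = bending_radius_m(energy_gev)
    print(f"E = {energy_gev:.0e} GeV -> radius ~ {r:.2e} m, "
          f"diameter ~ {2 * r / LIGHT_YEAR:.1e} light-years")
```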
