QuarkBits [Sept. 1, 2020]

Can scientists become supervillains?
The need for interdisciplinary approaches to science
by Daniel Jenkins (EuroPLEx fellow, University of Regensburg)

Science fiction movies often portray science as something done in isolation: one very intelligent person sits in a basement laboratory and discovers some secret proving that all science before was wrong, normally followed by explosions, natural disasters, or said scientist becoming a supervillain.

A frame from the film Maniac (1934) with Horace B. Carpenter as “Dr. Meirschultz” [source: Wikimedia Commons].

However, this is never how science is done in real life; no one works in isolation. In reality, progress is made by many different fields working together, with large, often international, collaborations of people working on every small step. This has become increasingly true in the modern age: as the problems get harder and harder, the collaborations needed become bigger and bigger. Quantum mechanics was developed by a large group of scientists, most of whom will be forgotten by history, and Special and General Relativity were not developed by Einstein alone: despite what some may think, he was in constant contact with physicists and mathematicians, especially during the development of General Relativity.

This need for interdisciplinary research is prevalent in almost all areas of science, but the example closest to my heart, as I am involved in it, is the relationship between High-Performance Computing (HPC), Quantum Chromodynamics (QCD), and astro-particle physics. The work I undertake is within the QCD part of the chain, with the goal of determining so-called “sigma terms”: these quantify the response of protons and neutrons to interactions with scalar particles, such as the Higgs boson. These interactions allow us to extract information about the internal structure of protons and neutrons and, more interestingly, to establish connections with the nature of dark matter.
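For readers who enjoy seeing the definitions, the sigma terms can be written compactly as nucleon matrix elements of scalar quark densities. The sketch below uses standard notation and assumes isospin symmetry for the light quarks; it is the conventional textbook form rather than anything specific to our project.

```latex
% Pion-nucleon and strange sigma terms as nucleon matrix elements
% of scalar quark densities (standard definitions, with an
% isospin-symmetric light-quark mass assumed):
\begin{align}
  \sigma_{\pi N} &= m_l \,\langle N |\, \bar{u}u + \bar{d}d \,| N \rangle ,
  \qquad m_l = \tfrac{1}{2}\,(m_u + m_d) , \\
  \sigma_{s}     &= m_s \,\langle N |\, \bar{s}s \,| N \rangle .
\end{align}
% Each sigma term measures how the nucleon mass responds to the
% corresponding quark mass, and hence how strongly the nucleon
% couples to a scalar probe such as the Higgs boson.
```

Intuitively, the larger a sigma term, the more of the nucleon’s mass “talks” to a scalar particle scattering off it.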

As many readers will know, dark matter is the most widely accepted explanation of numerous astrophysical and cosmological observations: it hypothesises the existence of matter without electromagnetic interactions, which we therefore cannot see (hence the term “dark”), yet which influences planets and stars gravitationally. The most prevalent models for the possible constituents of dark matter involve the concept of Weakly Interacting Massive Particles, or WIMPs for short. These are new particles with masses much larger than those of the known elementary particles; they could lie in the range of energies accessible at the LHC (where they could be created in proton collisions), or be found by experiments that aim to detect dark matter particles through collisions with atomic nuclei (such as XENON, CDMS, and many others). Other models, currently very much in vogue, propose instead new particles at the other extreme of the energy hierarchy: particles with masses much smaller than those of the known particles. Recent results from the XENON1T experiment admit, among other explanations, an interpretation in terms of this hypothetical type of dark matter.

Large-scale distribution of dark matter [NASA/ESA/Richard Massey (California Institute of Technology); available on Wikimedia Commons].

In the context of the relation between QCD and dark matter models, we wish to avoid a Russell’s teapot situation, i.e. reacting to an experiment that disproves a model by saying that the experiment is not sensitive enough, an objection that could be repeated forever without ever accepting the experimental refutation. So we need to know in detail how a dark matter candidate particle would interact with regular matter, and this is where sigma terms come into play. The strength of the interaction between dark matter models and regular matter is, in part, determined by these sigma terms, so knowing their values is essential to predict where we expect to see a signal.
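To make that last point concrete, a common parametrisation in the direct-detection literature expresses the effective scalar WIMP-nucleon coupling in terms of the sigma terms. The form below is a schematic version of that standard expression; the couplings $a_q$ are model-dependent inputs describing how the WIMP couples to each quark flavour, not something fixed by QCD.

```latex
% Schematic scalar WIMP-nucleon coupling f_N, with model-dependent
% WIMP-quark couplings a_q; the sigma terms enter through f_Tq:
\begin{equation}
  \frac{f_N}{m_N} = \sum_{q=u,d,s} f^{(N)}_{T_q}\,\frac{a_q}{m_q}
  \;+\; \frac{2}{27}\, f^{(N)}_{T_G} \sum_{q=c,b,t} \frac{a_q}{m_q} ,
  \qquad
  f^{(N)}_{T_q} = \frac{\sigma_q}{m_N} ,
  \quad
  f^{(N)}_{T_G} = 1 - \sum_{q=u,d,s} f^{(N)}_{T_q} .
\end{equation}
% The spin-independent cross section scales as f_N^2, so uncertainties
% in the sigma terms propagate directly into the predicted signal.
```

Since the predicted cross section scales with the square of this coupling, shrinking the uncertainty on the sigma terms directly sharpens the targets that experiments like XENON1T are shooting at.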

The relation between QCD and HPC is a touch more direct. To perform the necessary calculations we need to carry out simulations on supercomputers. QCD describes the interactions between quarks and gluons, which make up protons and neutrons, and the complexity of these interactions makes them intractable for analytical methods, i.e. pencil-and-paper calculations employing at most modest computational resources. To understand the subnuclear structure we approximate spacetime as a lattice, similar to the structure of a crystal, in which quarks and gluons live. This allows the quantities under study to be defined in rigorous terms and calculated numerically by simulating the dynamics. The intricate details would require a blog post or two, so they are omitted for now; the most important point is that formulating efficient algorithms for these simulations, together with preparing massively-parallel codes, is an integral part of the work.
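To give a flavour of what “simulating the dynamics on a lattice” means without the full QCD machinery, here is a deliberately minimal toy in Python: a Metropolis Monte Carlo simulation of the two-dimensional Ising model. This is not lattice QCD (there are no quarks or gluons here), and all parameters are illustrative choices, but it shows the same importance-sampling logic that real simulations scale up to thousands of supercomputer nodes.

```python
import numpy as np

# Toy 2D Ising model with Metropolis updates. Not QCD, but the same
# Monte Carlo idea used (on a vastly larger scale) in lattice simulations.
rng = np.random.default_rng(seed=0)
L = 16                        # lattice extent (L x L sites); illustrative
beta = 0.44                   # inverse temperature, near the critical point
spins = rng.choice([-1, 1], size=(L, L))

def sweep(spins):
    """One Metropolis sweep: propose a spin flip at every site."""
    for i in range(L):
        for j in range(L):
            # Sum the four nearest neighbours with periodic boundaries,
            # just as lattice QCD wraps its simulation box around on itself.
            nn = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
                  + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
            dE = 2.0 * spins[i, j] * nn           # energy cost of the flip
            if dE <= 0 or rng.random() < np.exp(-beta * dE):
                spins[i, j] *= -1                 # accept the flip

# Thermalise, then measure the magnetisation on a few configurations.
for _ in range(200):
    sweep(spins)
measurements = []
for _ in range(100):
    sweep(spins)
    measurements.append(abs(spins.mean()))
print(f"average |magnetisation| ~ {np.mean(measurements):.3f}")
```

A production lattice QCD code replaces these spin flips with updates of gluon field variables and adds the quark dynamics, which is exactly where the efficient algorithms and massively-parallel programming mentioned above become indispensable.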

The supercomputer Summit [source: Wikimedia Commons].

While this small network of connections between scientific disciplines may not have the same wow factor as Einstein’s relativity, or Hawking connecting thermodynamics with black holes, I hope it highlights the necessity of interdisciplinary approaches, and how important it is for scientists, and people in general, to work together to solve the problems of the modern world. Further, I hope this assuages any fears about scientific supervillains appearing anytime soon, since that would require a lot of us to turn evil at the same time, and that won’t happen, right?

Original story translated with permission from Investigación y Ciencia.