This is an "accessible to anyone" introduction to my research, starting from the general ("science") to the specific ("BSM phenomenology"). It originally appeared on An American Physics Student in England and has since also appeared on the US LHC blogs.
Physicists may be more interested in a technical description.
Science is a branch of human knowledge associated with the rational, objective, and empirical study of the natural world. The primary mode of generating such knowledge is the scientific method, by which hypotheses are checked against experiments. Science differs from the humanities in its subject and from the arts in its method.
Scientific fact is based on observation. Causal explanations for these observations are theories that must be rigorously checked against experiment. It is worth highlighting that a "theory," in the scientific sense, both explains observed phenomena and predicts further observable phenomena. In this way scientific theories are falsifiable and differ from the common use of the word "theory," which implies opinion or speculation. A theory may end up being incorrect when subjected to further experiments, but this is a feature rather than a shortcoming of the scientific method.
Physics is the branch of science concerned with the fundamental laws of nature. Branches of physics study atoms (and all things subatomic), materials in different phases (condensed matter), dynamics of different systems (e.g. geophysics, general relativity), outer space (astrophysics and cosmology), and applications to other sciences (biophysics, physical chemistry). In some sense physics is the "purest" science in that it is an interface between fundamental models of nature and experiments.
Unlike the other sciences, physicists can roughly be divided into theorists and experimentalists. Theorists are primarily concerned with mathematical models of nature that can be used to explain experimental data. Experimentalists are primarily concerned with testing theories and acquiring new data that may point to science beyond current theories. This divide occurs because of the high degree of specialization required to study nature at the level of physics: theorists must be fluent in advanced mathematical methods, while experimentalists must be clever at building apparatuses and interpreting data.
Particle physics is the branch of physics concerned with the smallest building blocks of nature. In the past century, the "particles" that physicists considered "smallest" have gone from atoms, to nuclei, to protons, to quarks (not to mention electrons and their cousins). We have also learned how to think of the fundamental forces of nature in terms of force-mediating particles such as the photon.
Why do we study these particles? One reason is that we hope that by studying the basic building blocks of the universe we can understand composite objects better (reductionism). There is also a philosophical/aesthetic appeal associated with understanding what the ultimate basic building blocks of the universe should look like.
The current canon of particle physics is called "The Standard Model" and was mostly completed in the 1970s. It is a kind of quantum field theory called a non-abelian gauge theory (this means it is based on certain kinds of symmetries) and explains the strong and weak nuclear forces as well as electromagnetism. It has passed every direct experimental test (up to some recent modifications in the neutrino sector) with flying colors and is regarded as a stunning success.
We know, however, that the Standard Model is incomplete. This is not to say that it is wrong, but that it is an effective theory for the distance scales that we have probed. In the same sense, Maxwell's equations are an effective theory for electromagnetism above the atomic scale; below that scale quantum effects become relevant and a different effective theory, quantum electrodynamics, takes over.
The reason why effective theories are reasonable is that nature tends to only care about physics at the scale you are probing. For example, when a chef bakes a cake, there are several chemical reactions that occur as the batter bakes. At the heart of these chemical reactions are statistical and quantum effects which are ultimately explained by the Standard Model, which, in turn, may ultimately be explained by a more fundamental theory such as string theory. The chef, however, does not need to know particle physics, quantum mechanics, or even chemistry to bake the cake; the chef has an "effective theory" of how to bake cakes that is based on measuring cupfuls of ingredients.
In the same way the Standard Model is an effective theory for physics at the length scales we have probed. (Particle physicists measure scales in electron volts, which are inversely proportional to length; we have probed scales up to around the hundreds of giga-electron volt range.) There must be more to the story at smaller scales, but that new physics has no appreciable effect at the scales we've currently been able to study. One of the major "missing pieces" in the Standard Model is a quantum theory of gravity.
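To make the inverse relationship between energy and length concrete, here is a standard unit conversion (not from the original post) using the combination ħc ≈ 197 MeV·fm:

```latex
% Length scale probed at energy E, via \hbar c \approx 197\ \mathrm{MeV\cdot fm}:
\ell \sim \frac{\hbar c}{E},
\qquad
E = 100\ \mathrm{GeV}
\;\Rightarrow\;
\ell \sim \frac{197\ \mathrm{MeV\cdot fm}}{10^{5}\ \mathrm{MeV}}
\approx 2\times 10^{-3}\ \mathrm{fm}.
```

So probing hundreds of giga-electron volts corresponds to resolving distances around a thousandth of a femtometer, far smaller than a proton.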
For a more heuristic explanation of effective theories that includes some pictures, see this post on the US LHC blogs.
Theoretical particle physics focuses on ways we can understand nature beyond the Standard Model. There are roughly two kinds of particle theory: phenomenology and formal theory. Phenomenologists attempt to study the next level of effective theory by looking for signals of physics beyond the Standard Model in experiments and constructing new models. Formal theorists attempt to answer the bigger question of finding a fundamental "theory of everything": a complete theory that describes nature down to the smallest length scales (i.e. not just an effective theory). Most formal theory today focuses on string theory.
Since the characteristic scale of gravity is well beyond anything that is experimentally accessible in our lifetimes, formal theory often comes up against the barrier of experimental assessment. Much of the motivation for string theory comes from the hope that it can be a self-consistent theory of quantum gravity.
My interests, on the other hand, are on the phenomenological side, where theory and experiment engage in a back-and-forth dance.
Particle phenomenology is often used as a blanket term for theoretical particle physics that is not string theory. It generally refers to particle theory that is more closely related to experiment, with theory and experiment each suggesting new research directions to the other. It is an exciting time to be in this subfield since the Large Hadron Collider (LHC) will open up new sectors of nature to scientific inquiry.
Some phenomenologists study finer details of the Standard Model; these include ongoing studies of CP violation (such as Japan's BELLE experiment and SLAC's BaBar experiment) and neutrino physics (SuperK in Japan, various experiments in the US). There is also a subgroup of phenomenologists who work on the theory of strong interactions (i.e. quarks and gluons), called quantum chromodynamics (QCD), which is notorious for being nonperturbative. Most QCD research involves applying new mathematics (such as twistor methods) or computer simulations on discretized space (lattice QCD) to extract more accurate predictions from the theory.
While these are both very promising directions, my primary research interest is what happens when our current effective theory breaks down. The answer is almost certainly that it is replaced by another effective theory, perhaps motivated by string theory, that sheds further light on the structure of nature.
This is often called "beyond the Standard Model" phenomenology. It deals with ways to extend the Standard Model past its range of validity and, hopefully, include any new physics we discover at the Large Hadron Collider. There are several sources of data for particle physics, including astrophysics and cosmology, but colliders still represent our best controlled experiments.
There are good reasons to believe that there should be physics "beyond the Standard Model" within the reach of the LHC even though quantum gravity is well beyond that range. For one, from astrophysical observations we know that there is a class of massive particles called "dark matter" that is responsible for the clustering of galaxies. Within reasonable assumptions, such a particle should be produced at the LHC. Another reason is the mass of the Higgs boson, which seems to suggest a "UV completion" at the TeV scale.
The two most prominent ideas in BSM phenomenology are supersymmetry and extra dimensions. Supersymmetry (SUSY) adds extra quantum dimensions to spacetime that lead to each particle having a "supersymmetric partner." This is analogous to each particle having an antiparticle. Extra dimensional scenarios extend our spacetime with classical dimensions, allowing our known particles to resonate in these extra directions to produce new "Kaluza-Klein" particles.
For the past ten or twenty years, BSM phenomenology has been centered around model building, i.e. developing new theories or reworking old theories that can solve the problems of the Standard Model. With the LHC turning on, however, the BSM community has shifted towards developing bottom-up data-driven approaches to new physics. The big question when the LHC turns on will be whether we can identify signals that are beyond the Standard Model. This is not a trivial thing since piecing together experimental signatures at a particle collider is very much a detective mystery in its own right; luckily this task is shared by experimental particle physicists.
The BSM phenomenology community has been waiting patiently for new data and trying to squeeze the most that it can out of old sources of data. We hope to see new and unexpected things at the LHC that we can then spend another couple of decades thinking about.
Interested in learning more about particle physics? Check out a series of blog posts I've written using Feynman diagrams to teach the Standard Model to a general audience: Part 1, Part 2, Part 3, Part 4, Part 5. Check my post listing for further installments.