The new accelerator, the Large Hadron Collider (LHC), will recreate some of the conditions, including the enormous energy density, that existed only a billionth of a second after the universe was born nearly 14 billion years ago.
The LHC is housed in a tunnel about 27 kilometers in circumference, buried some 75 meters beneath the Franco-Swiss border.
One of the most important tasks facing the LHC is to detect the so-called Higgs boson, a heavy particle predicted by theory but never yet observed.
Nearly 2,000 scientists from 150 research laboratories in 34 countries have taken part in the LHC program, which cost about $8 billion, a portion of it funded by the United States.
The LHC, an underground ring built in the countryside near Geneva, is now complete. You can think of it as the largest and most powerful microscope in the history of science. It gives us an unprecedented ability to probe physical processes at the shortest distances (down to about a billionth of a billionth of a meter) and the highest energies achieved to date. For more than a decade, particle physicists have been eagerly awaiting the chance to explore the physical world at the so-called terascale, the scale of processes involving energies of around 1 trillion electron volts (1 TeV). Scientists expect new and significant physics to reveal itself at the terascale, such as the elusive Higgs particle, which is thought to give other particles their mass, and the particles that make up dark matter, the dominant form of matter in the universe.
After nine years of construction, the massive machine is expected to produce its first particle beams this year. As planned, the LHC will be commissioned in a series of steps: from a single beam, to two beams, to colliding beams; from lower energies up to the terascale; and from weak test beams to the intense beams needed for rapid data collection, which are far harder to control. Each step of the commissioning is difficult and will require more than 5,000 scientists, engineers and graduate students working together to overcome the challenges. Physicists have prepared extensively to explore this high-energy frontier, and in the fall of 2007 I visited the LHC project teams to get a first-hand look. Despite repeated delays in the project, everyone I spoke with was confident of success. The particle physics community is eagerly awaiting the first results from the LHC. Frank Wilczek of the Massachusetts Institute of Technology (MIT), speaking of the LHC's prospects, echoes a view widely shared in the physics community: that the LHC will usher in a "golden age of physics."
Each collision ejects a large number of particles, the vast majority of which are known, but with the occasional novelty.
Supermachine
As soon as it starts running, the LHC will produce proton beams of far higher energy than ever before. Its roughly 7,000 magnets, cooled by liquid helium to below 2 kelvins to keep them superconducting, guide and focus two proton beams travelling at 99.9999991% of the speed of light. Each proton carries an energy of 7 trillion electron volts, about 7,000 times the energy contained in a proton's rest mass (recall Einstein's mass-energy relation E = mc²). The current record holder for the highest energy is the Tevatron, the proton-antiproton collider at the Fermi National Accelerator Laboratory in Batavia, Illinois; the LHC will produce protons with roughly seven times that energy. And, according to its design parameters, the LHC's beam intensity (also called luminosity) will be 40 times that of the Tevatron's. When the LHC is running at its highest energy, the total energy carried by all the particles circling the giant ring will be roughly the kinetic energy of 900 cars travelling at 100 kilometers per hour, or enough, if used to boil water, to brew about 2,000 liters of coffee.
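For readers who like to check the arithmetic, here is a rough back-of-envelope sketch in Python. The bunch count, bunch population, car mass and coffee-making temperatures are my own assumed values, not figures from the machine's designers, so the results land in the same ballpark as the numbers above rather than matching them exactly.

```python
# Back-of-envelope check of the beam-energy comparisons above.
# Assumptions not taken from the text: 2,808 bunches per beam,
# 1.15e11 protons per bunch, a 1,500 kg car, and water heated
# from 20 C to 95 C for the coffee.

EV_TO_J = 1.602e-19                 # joules per electron volt
PROTON_REST_ENERGY_EV = 0.938e9     # proton rest-mass energy, ~0.938 GeV
BEAM_ENERGY_EV = 7.0e12             # 7 TeV per proton

# Beam energy relative to the proton's rest-mass energy (E = mc^2)
print(f"beam energy / rest energy   ~ {BEAM_ENERGY_EV / PROTON_REST_ENERGY_EV:,.0f}")

# Total energy stored in the two circulating beams
bunches, protons_per_bunch = 2808, 1.15e11
stored_joules = 2 * bunches * protons_per_bunch * BEAM_ENERGY_EV * EV_TO_J
print(f"energy stored in both beams ~ {stored_joules / 1e6:.0f} MJ")

# Kinetic energy of a car at 100 km/h, for comparison
car_mass_kg, speed_m_per_s = 1500.0, 100 / 3.6
car_kinetic = 0.5 * car_mass_kg * speed_m_per_s**2
print(f"equivalent cars at 100 km/h ~ {stored_joules / car_kinetic:.0f}")

# Water heated from 20 C to 95 C, as for coffee
joules_per_litre = 4186 * (95 - 20)   # specific heat of water x temperature rise
print(f"litres of near-boiling water ~ {stored_joules / joules_per_litre:.0f}")
```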
The protons are distributed into about 3,000 bunches spread around the collider ring's 27-kilometer circumference. Each bunch contains up to 100 billion protons, and at the collision points each bunch is squeezed into a needle-like shape just a few centimeters long and 16 micrometers thick (roughly the width of the thinnest human hair). At the ring's four collision points these needles pass through one another, producing more than 600 million particle collisions every second. Physicists call these collisions events; strictly speaking, they are not collisions of proton on proton but of quarks and gluons, the smaller particles that make up protons. The most violent collisions release about 2 trillion electron volts, roughly one seventh of the energy carried by the colliding protons. (For the same reason, even though the protons and antiprotons circulating in the Tevatron reach energies of 1 trillion electron volts, its collisions fall about a factor of five short of the terascale.)
Four giant detectors are built around the ring's four collision points; the largest would half-fill Notre Dame cathedral in Paris, and the heaviest contains more iron than the Eiffel Tower. These detectors will record and measure the thousands of particles produced in each collision. Despite their enormous size, the detectors must be assembled with extreme precision: some components have to be positioned to within 50 micrometers of one another.
Each of the two largest detectors has nearly 100 million data channels and could generate enough data to fill 100,000 CDs every second, a stack that would reach the moon in about six months. Rather than try to record it all, the experiments use so-called trigger and data-acquisition systems. These systems act like spam filters, keeping only the 100 or so events per second that look most valuable and sending their data to CERN's central LHC computing facility for archiving and later analysis.
CERN, the European Organization for Nuclear Research, is home to the LHC. There, a cluster of thousands of computers converts the filtered raw data into a more compact format for physicists to sift through and study. The physicists analyze the data over a so-called grid: a network of tens of thousands of PCs at research institutions around the world, connected to 12 large hub centers in Asia, Europe and North America, which are in turn linked to CERN by dedicated fiber-optic lines.
In early November 2007, the last connections between adjacent magnets on the ring were made; by mid-December, one of the ring's eight sectors had been cooled to the low temperature needed for operation, and cooling of a second sector had begun. Earlier, one sector had been cooled, powered up, and then warmed back to room temperature. The sectors are tested individually and then commissioned together as a whole. Once the machine passes inspection, a proton beam will be injected into the ring, running along one of the two beam pipes that circle the 27-kilometer circumference, and full operation is scheduled to begin in September 2008.
The chain of smaller accelerators that feed the LHC's main ring has already been certified as able to inject protons into the LHC at an energy of 0.45 trillion electron volts. The first beam injection will be a critical step: LHC scientists will start with low-intensity beams to minimize the risk of damaging the machine's hardware. Only after carefully studying how those test beams behave inside the LHC, and precisely correcting the guiding magnetic fields, will they inject more intense beams. The first time the LHC runs at its design energy of 7 trillion electron volts, only a single bunch of protons will circulate in each direction, clockwise and counterclockwise. When the machine eventually runs at full capacity, about 3,000 bunches will circulate in each direction.
The full commissioning of the LHC accelerator, however carefully undertaken, is sure to uncover problems, and how much time engineers and scientists will need to fix them is impossible to predict. If a sector has to be warmed back to room temperature for repairs, the start-up of the LHC will be delayed by several more months.
The LHC's four giant detectors, each serving one experiment, are known as ATLAS, ALICE, CMS and LHCb. They, too, are on tight schedules and must be completed before beam commissioning can begin. Some very fragile components are still being installed, such as the so-called vertex locator, which was placed inside LHCb in mid-November 2007. During my visit I, who as a graduate student years ago specialized in theory rather than experiment, was struck by the thousands of cables packed densely together, the very cables that carry the data collected by the detectors. Each cable bears its own unique label and has to be plugged, carefully and correctly, into the right socket and then tested.
Processing massive amounts of data
When the LHC operates at its design luminosity, each meeting of two needle-like proton bunches will produce about 20 collision events. With only 25 nanoseconds between successive bunch crossings, the particles sprayed outward by one crossing will not even have left the outer layers of the detector before the next crossing occurs. Detector elements in different layers respond in characteristic ways to the particular kinds of particles passing through them. Each event yields about 1 megabyte (MB) of data, which adds up to roughly a petabyte (PB, a billion megabytes) every two seconds or so, streaming out over millions of communication channels.
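A quick calculation, again only a sketch, shows how fast the raw data pile up; the fraction of 25-nanosecond slots that actually carry a bunch is my assumption rather than a figure from the experiments.

```python
# Rough estimate of the raw data rate implied by the numbers above.
# Taken from the text: bunch crossings every 25 ns, about 20 collision
# events per crossing, about 1 MB of detector data per event.
# Assumed here: roughly 80% of the 25 ns slots actually hold a bunch.

crossing_rate = 1 / 25e-9            # 40 million potential crossings per second
filled_fraction = 0.8                # assumed fraction of slots carrying a bunch
events_per_crossing = 20
bytes_per_event = 1e6                # ~1 MB

bytes_per_second = crossing_rate * filled_fraction * events_per_crossing * bytes_per_event
pb_per_second = bytes_per_second / 1e15
print(f"raw data rate   ~ {pb_per_second:.1f} PB per second")
print(f"1 PB piles up in ~ {1 / pb_per_second:.1f} seconds")
```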
A multi-level trigger system reduces this flood of data to a manageable level. The first level collects and analyzes data from only some of the detector's subsystems, picking out the most promising crossings on the basis of a few simple, independent signatures; a crossing is selected, for example, if it contains a high-energy muon flying off at a large angle to the beam axis. This so-called level-1 trigger, implemented in hundreds of dedicated logic modules built into the hardware, passes about 100,000 bunch crossings per second on to the next stage, the higher-level trigger, for further analysis.
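Purely as an illustration, a toy version of that muon selection might look like the few lines below; the 20 GeV threshold is invented, and the real level-1 trigger is built from dedicated hardware, not software like this.

```python
# Toy sketch of the level-1 muon selection described above: keep a bunch
# crossing if it contains a muon whose momentum, projected at right angles
# to the beam axis, is large. The threshold is hypothetical.
import math

PT_THRESHOLD_GEV = 20.0   # invented transverse-momentum cut

def level1_keep(muon_momentum_gev, angle_to_beam_rad):
    """Return True if the muon's transverse momentum exceeds the threshold."""
    transverse_momentum = muon_momentum_gev * math.sin(angle_to_beam_rad)
    return transverse_momentum > PT_THRESHOLD_GEV

print(level1_keep(50.0, 1.2))    # energetic, wide-angle muon -> keep (True)
print(level1_keep(50.0, 0.05))   # energetic but nearly parallel to the beam -> discard
print(level1_keep(5.0, 1.2))     # wide-angle but low-energy muon -> discard
```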
Unlike the level-1 trigger, the higher-level trigger receives all the data coming out of the detector's millions of channels. Its software runs on a large computer cluster, and for each crossing passed along by the level-1 trigger it has, on average, about 10 milliseconds of processing time, enough to "reconstruct" each event. Reconstruction means finding the common point of origin of the particles in the event and fully characterizing each particle's properties, including its energy, momentum and trajectory.
The higher-level trigger passes along about 100 events per second, uploading them to the hubs of the LHC Computing Grid, the LHC's global network of computing resources. The grid combines the processing power of all the computing centers on the network: users simply log on to the grid from their home institute and can draw on that collective power for their data analysis.
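Chaining the two stages together, the rates quoted above imply an enormous overall reduction, which a few lines of arithmetic make vivid (the 1-megabyte event size is carried over from the previous section):

```python
# Overall data reduction achieved by the two trigger stages, using the
# rates quoted in the text: ~40 million bunch crossings per second at the
# input, ~100,000 crossings per second surviving the level-1 trigger, and
# ~100 events per second finally sent to the computing grid.

input_rate = 40e6       # bunch crossings per second (nominal)
after_level1 = 100e3    # crossings per second passed to the higher-level trigger
after_hlt = 100         # events per second sent on for archiving
event_size_bytes = 1e6  # ~1 MB per event, as above

print(f"level-1 keeps about 1 crossing in {input_rate / after_level1:,.0f}")
print(f"the higher-level trigger keeps about 1 in {after_level1 / after_hlt:,.0f} of those")
print(f"overall: about 1 event kept in {input_rate / after_hlt:,.0f}")
print(f"data flowing to storage: ~{after_hlt * event_size_bytes / 1e6:.0f} MB per second")
```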
The LHC Computing Grid is organized in tiers. Tier 0, housed at CERN, consists of thousands of commercially available computer processors, from ordinary desktop PCs to the latest pizza-box-sized black "blade" servers, stacked in racks. CERN is still ordering computers to add to it. To make the most of their money, those in charge, like many home computer users, do not buy the newest and most powerful machines but the ones offering the best value for the price.
The data-acquisition systems of the LHC's four detectors will send their data to Tier 0, where it will be stored on magnetic tape. In an age when DVD burners and flash memory have long been commonplace, saving data on tape may seem outdated, but François Grey of CERN's computing center says it is the most cost-effective and secure way to do it.
Once up and running, the LHC will produce massive amounts of data continuously, and CERN's thousands of servers, linked together, will provide the computing power needed to manage it.
Tier 0 will distribute the data to 12 Tier 1 computing centers: one at CERN itself and 11 at major research institutions around the world, including Fermilab and Brookhaven National Laboratory in the United States as well as centers in Europe, Asia and Canada. The raw, unprocessed data thus effectively exist in two copies, one held at CERN and a second spread around the globe. The Tier 1 centers also keep the data in a compact format that makes it easy for physicists to carry out many kinds of analyses.
The complete LHC Computing Grid also includes Tier 2 centers, mostly smaller computing clusters at universities and research institutes, whose machines provide distributed processing power for data analysis across the grid.
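To picture how the tiers fit together, here is a schematic sketch; apart from Fermilab and Brookhaven, the center names and the simple round-robin hand-out of data blocks are placeholders of my own, standing in for the grid's far more sophisticated bookkeeping.

```python
# Schematic sketch of the tiered fan-out described above: Tier 0 at CERN
# records the raw data and distributes it among the Tier 1 centers, which
# in turn serve the Tier 2 clusters attached to them. Except for Fermilab
# and Brookhaven, the names below are invented placeholders.
from collections import defaultdict
from itertools import cycle

TIER1_CENTERS = ["Fermilab", "Brookhaven", "Tier1-Europe-A", "Tier1-Europe-B", "Tier1-Asia"]

def distribute(datasets, tier1_centers):
    """Assign each block of raw data from Tier 0 to a Tier 1 center.
    A round-robin rule stands in for the grid's real placement logic."""
    assignment = defaultdict(list)
    for dataset, center in zip(datasets, cycle(tier1_centers)):
        assignment[center].append(dataset)
    return assignment

raw_blocks = [f"run0001_block{i:03d}" for i in range(10)]
for center, blocks in distribute(raw_blocks, TIER1_CENTERS).items():
    print(f"{center:<15} holds {blocks}")
```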
At 3:30 p.m. BST today, this giant collider was fired up.