
CERN–2006–004
8 May 2006

ORGANISATION EUROPÉENNE POUR LA RECHERCHE NUCLÉAIRE

CERN EUROPEAN ORGANIZATION FOR NUCLEAR RESEARCH

Fifty years of research at CERN, from past to future

Academic Training Lectures, 13–16 September 2004, CERN

Editor: G. Giudice

Geneva 2006


CERN–200 copies printed–May 2006


Abstract

This report contains the text of the four presentations given as Academic Training Lectures on the occasion of the fiftieth anniversary of CERN. Fifty years of work in the fields of accelerators, theory, experiment, and computing are briefly recounted.


Preface

The Academic Training Committee, which I am honoured to chair, has traditionally played a very active role in offering, to all members of the CERN community, opportunities to deepen and broaden their scientific knowledge. Among its activities, it organizes about 20 lecture series a year, for a total of 80–100 hours of lessons on topics in high-energy physics and related fields of applied physics, engineering, and computing. In 2004, on the occasion of CERN’s 50th anniversary, the Academic Training Committee wanted to organize some special lectures, to mark this important event for our laboratory. We had the idea of inviting four speakers who could present the most important scientific developments that occurred at CERN during the last 50 years in the four main research fields: accelerators, theory, experiment, and computing. We were not aiming at a precise account of the history of CERN, but rather at personal recollections of the scientific atmosphere during those years of great research progress and discoveries, as told by people who actively participated in those events. For this reason, we encouraged the four speakers to present their personal experience and to express their particular point of view on the past and the future of their respective fields of research. As a result, between 13 and 16 September 2004, in the main CERN Auditorium, Kurt Hübner, Gabriele Veneziano, Daniel Treille, and David Williams gave superb and fascinating lectures on research at CERN during these 50 years. The lectures were very well received, with nostalgic feelings by the older generations, and as a source of inspiration for the younger component of the large attendance. Many people responded by asking the Committee to publish the write-ups of the talks to preserve the presented material.

We hope that this document will remain as a witness of the exciting atmosphere that led to the great intellectual achievements of research in high-energy physics at CERN, and as a stimulus to pursue the exploration of the particle world in the future with the same enthusiastic spirit.

Gian Giudice


Contents

Preface (G. Giudice)

Fifty years of research at CERN, from past to future: The Accelerators (K. Hübner)

Fifty years of research at CERN, from past to future: Theory (G. Veneziano)

Fifty years of research at CERN, from past to future: Experiment (D. Treille)

Fifty years of research at CERN, from past to future: Computing (D. O. Williams)


Fifty years of research at CERN, from past to future: The accelerators

K. Hübner
CERN, Geneva, Switzerland

Abstract

The evolution of the CERN accelerator complex since its inception is summarized. Emphasis is put on the salient features and highlights of the different facilities and on what has been learnt at each stage in terms of accelerator physics and technology. Possible future accelerator options for CERN are discussed at the end.

1 Introduction

This report, based on lectures given in the Academic Training Programme on the occasion of CERN’s fiftieth anniversary, summarizes the evolution of the CERN accelerator complex with an emphasis on the highlights in accelerator physics and technology. The description of the early part of the development is based on the available documents [1–3] and the reminiscences of senior colleagues who have been my teachers and friends. The later evolution is presented with a more personal touch, based on my own observations made after I joined the legendary CERN Accelerator Research Division in 1964, when the studies for the upgrading of the Proton Synchrotron (PS) and the construction of the Intersecting Storage Rings (ISR) and the Super Proton Synchrotron (SPS) were in full swing [2–4]. The achievements presented here have been made mainly by CERN staff, but the continuous support and the significant contributions from our colleagues outside CERN must be gratefully acknowledged.

The first part treats the early accelerators, the Synchro-Cyclotron (SC) and the PS, and their evolution. The second part deals with the accelerators built by CERN after its expansion into France, the ISR and the SPS. The third part is dedicated to the period when CERN had to choose from a number of options which finally led to a family of powerful particle colliders, starting with the conversion of the SPS into a proton–antiproton collider, followed by the Large Electron–Positron collider (LEP) and then by the Large Hadron Collider (LHC), which will be housed in the tunnel drilled for LEP. The fourth part sketches the accelerator options for the future, which have already been under study for a number of years since, as the accelerators become more and more complex, the lead times for a project become very long. Finally, a very personal account is presented of what we have learnt in terms of management in all these large projects.

I have given only a few references. A selection of references for further reading on the future options for CERN is presented.

The lectures and this paper are dedicated to Mervyn Hine, distinguished accelerator physicist and man of vision, who made eminent and varied contributions to the build-up of the accelerator complex at CERN. He joined CERN in 1952 and became one of the leading figures for the construction of the PS. As Director for Applied Physics, responsible for planning under J. B. Adams and V. Weisskopf, he laid the foundations for the expansion of CERN beyond the SC and PS. He died in April 2004.

2 Early times

2.1 Starting conditions

In May 1952 the provisional CERN Council decided on the creation of two Study Groups: the first Group to study a proton Synchro-Cyclotron (SC), led by Cornelis Bakker of Amsterdam, and the second Group to explore a weak-focusing proton synchrotron similar to the 3 GeV Cosmotron being commissioned at Brookhaven National Laboratory. The leader of the second Group was Odd Dahl of Bergen. The energies of these accelerators were discussed in a meeting in June 1952 in Copenhagen, and the recommendation reported by Heisenberg, to aim for 600 MeV for the SC and 10 to 20 GeV for the synchrotron, was accepted at the second Council session in June 1952.

A group led by Odd Dahl visited BNL in summer 1952 and learnt about the alternating-gradient (AG) focusing principle [5]. Although a design of a Cosmotron-like synchrotron had been worked out, it was decided after this visit to forthwith design a synchrotron embodying the new principle, which held the promise that a higher energy could be reached for the same cost. In October 1952 the still provisional Council fortunately accepted this bold proposal, which meant abandoning the scaled-up weak-focusing Cosmotron-type 10 GeV synchrotron and studying instead a 30 GeV proton synchrotron based on the untried AG principle.

The audacity of Council in making a large step in the energy of the synchrotron, using a scheme which had never been tested, can be appreciated from Fig. 1(a), where the energy of various types of accelerators in the US is plotted as a function of the year of start-up. It also gives the energy of the first two AG synchrotrons, the PS at CERN and the AGS at BNL, as a function of the time when the proposal was made. (The Stanford Mark III 1.2 GeV electron linear accelerator (linac), which was completed by 1950, is not shown.) Figure 1(b) shows these two proposals with the accelerators in Europe.


Fig. 1: (a) The energies of the proposed PS and AGS in units of 0.1 GeV are plotted as a function of the time of the proposal, in comparison with the energy of the US accelerators as a function of their start-up. Synchro-cyclotrons (diamonds); electron synchrotrons (small squares); non-AG proton synchrotrons (triangles); PS and AGS proposals (large squares); (b) the same plot for European accelerators.


Obviously, this decision immediately put Europe in the forefront of accelerator physics. CERN started on an equal footing with the US and ahead of other European countries. This helped the UK to decide to join CERN, even though the UK was the leader in accelerator technology in Europe at that time. However, it was considered an “adventurous high-risk high-gain course of action” according to J. B. Adams, or, as R. Peierls put it more poetically, “for awful gamble stands AG but if it works or not we’ll see”.

In the long term, this choice turned out to be decisive for the development of the CERN accelerator complex, as only a synchrotron with AG focusing could provide the high-brilliance beams required for colliders with a high luminosity.

Others did not trust the new AG principle, and weak-focusing proton synchrotrons such as the 10 GeV Synchro-Phasotron (start of construction in 1952) at JINR/Dubna, the 7 GeV Nimrod (1957) at RAL in the UK, and the 12 GeV ZGS (1959) at ANL in the USA were still constructed.

2.2 The CERN Synchro-Cyclotron

In October 1952 the provisional CERN Council also decided to go ahead immediately with the construction of the 600 MeV SC, well before the construction of the PS, in order to secure an early start in meson physics and as a training ground for European experimental physicists and accelerator technology. The design was almost complete in 1953, fifteen months after the setting-up of the Group, which then moved to Geneva in 1954.

Construction started in 1955 and the first beam was obtained in 1957, immediately at maximum energy. Figure 2 shows this accelerator.

Fig. 2: The CERN 600 MeV Synchro-Cyclotron (SC) with the red iron yoke, the coils, and, in the foreground, the cylindrical rotating condenser for frequency modulation

The physics programme started in 1958 and a number of excellent results were obtained, such as the discovery of the rare decay π⁻ → e⁻ν and of the pion ‘beta decay’, π⁺ → π⁰e⁺ν, the g − 2 measurement of the µ, and many other contributions to µ physics. Until the start-up of the PS, the SC was the centre of scientific life at CERN, providing the physicists with an opportunity to gain experience. However, this attraction of the SC had an adverse impact on the early scientific programme at the PS. The SC was later taken over by nuclear physics in 1964, in particular for experiments with the isotope separator on line, ISOLDE.


A number of improvements were made during the long faithful service of the SC. The most notable was the work done in 1973/74, which included a new ion source, a rotary condenser for raising the dee voltage and the repetition rate, and a new extraction channel. This led to an increase of the internal beam intensity from 1 µA to 4 µA, and the extraction efficiency went up by a factor of 10, to 70%, as expected from the design. Further modifications were made later for acceleration of ions with a charge-to-mass ratio of 1/3 to 1/4, up to carbon.

The SC was stopped at the end of 1990, after the decision to move ISOLDE to the Booster Synchrotron of the PS (PSB), which had a higher proton-beam power than the SC.

2.3 The CERN Proton Synchrotron (PS)

The PS Design Group was initially spread out over Europe, but it quickly became obvious that such a project needed a centralized team. The first members of the PS group moved to Geneva in autumn 1953, and started to seriously review the design after the approval by Council in October 1953, with a reduction in the energy to 25 GeV in order to contain the cost.

The advantage of the AG principle was that the size and, therefore, the cost of the vacuum chamber, the magnets and their foundations, and the tunnel cross-section could be reduced dramatically. However, it made the synchrotron much more sensitive to the errors in the magnetic field and alignment. To find the right balance required a number of iterations, generating a sizeable fluctuation of the parameters during the design until the start of construction in 1954. This is illustrated in Table 1, which gives the total weight of the magnets at the different design stages. The last column gives the final weight, which is still rather reasonable. In comparison, the weight of the magnets of the 10 GeV Synchro-Phasotron at JINR/Dubna is 36 000 t.

Table 1: The evolution of the weight of the PS magnets during design

Date               January 1953   April 1953   March 1954   As constructed
Magnet weight (t)           800       10 000         3300             3800

Figure 3 shows the PS combined-function magnets, which provide a dipole field for bending with an alternating gradient for beam focusing.

Fig. 3: The combined-function magnets of the PS


The ground breaking took place in May 1954. The construction of the ring with a circumference of 628 m was completed in 1959, and the proton beam energy reached 28 GeV (kinetic) in December 1959.

The PS group had learnt beam physics in a circular accelerator with AG focusing; how to produce magnets with tight tolerances; how to design stable supports for them; how to align them; how to properly control the beam and the accelerating radio-frequency system; and, last but not least, how to manage a large project, invaluable know-how for the future of CERN.

The experimental programme started in spring 1960 but was hampered by the lack of secondary beam equipment and adequate particle detectors, and of experienced physicists. This was partly a consequence of the SC programme having cornered substantial resources which were then not available for the work with the front-line accelerator. Hence CERN could not exploit the advantage of a six-month head start of the PS over the AGS at BNL.

2.4 Upgrading the PS

In order to be better armed for the fierce, but friendly competition, the PS has been continuously improved and adapted to serve as injector for a number of accelerators.

In a first vigorous step, the efficiency of the secondary beam production was improved in comparison with that of an internal target by extracting the primary proton beam onto an external target:

– A fast extracted beam was produced for the first time in 1963, by a technique allowing the extraction of the totality or part of the beam bunches from the ring within one turn. This was accomplished by a novel kicker magnet with a very short rise-time, moved over the beam at the end of the acceleration cycle when the beam was small. This led to a spectacular increase in the neutrino flux, which, in addition, was enhanced by the magnetic horn, another device pioneered at CERN and installed in 1961. The kicker magnet was replaced in 1969 by a full-aperture kicker in order to remove the constraining kicker aperture. Figure 4 shows the extracted proton beam passing through a row of scintillators, with a PS ring magnet at the right.

– A slow extraction system was brought into operation in 1963, providing a long spill (about 1 s) to the experiments and increasing the duty cycle; this was especially appreciated by the experiments with electronic detectors.

In order to raise the mean proton current, a new power supply and rf system were installed to increase the repetition rate by a factor 2 in the mid 1960s.

Since these two new extraction methods worked very well, it became conceivable to increase the PS intensity per pulse, which would have been impossible with the internal targets (the latter were contaminating the accelerator, heated too much, and were eradicated completely only in 1980). The number of particles per pulse N was limited by space-charge effects at injection. Since Nmax ∝ βγ² for a given invariant emittance of the beam, where β, γ are the usual relativistic parameters, a synchrotron consisting of four superposed rings, the PS Booster (PSB), was inserted between the 50 MeV Linear Accelerator (Linac1) and the PS in order to increase the injection energy of the latter to 800 MeV. A theoretical gain of a factor 8 in intensity was expected from this new accelerator, which was constructed in 1968–1972. The PSB came on-line just in time for the experiments with the Gargamelle bubble chamber, which led to the discovery of neutral currents, one of the prestigious accomplishments of CERN.
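The expected factor of 8 can be checked numerically from the quoted scaling Nmax ∝ βγ², comparing injection at 50 MeV (Linac1) with 800 MeV (PSB). This is a quick sketch, not from the report; only the standard proton rest energy is assumed:

```python
# Numerical check of the space-charge scaling N_max ∝ βγ² for protons
# injected at 50 MeV versus 800 MeV kinetic energy.
import math

M_P = 938.272  # proton rest energy in MeV

def beta_gamma2(t_kin_mev: float) -> float:
    """Return β·γ² for a proton with the given kinetic energy in MeV."""
    gamma = 1.0 + t_kin_mev / M_P
    beta = math.sqrt(1.0 - 1.0 / gamma**2)
    return beta * gamma**2

gain = beta_gamma2(800.0) / beta_gamma2(50.0)
print(round(gain, 1))  # 8.3, consistent with the quoted factor of 8
```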

The third step was the construction of a new 50 MeV linac (Linac2), equipped with a new proton source, in 1973–1978.

Fig. 4: The first extracted proton beam at the PS in 1963

The steady evolution of the record number of protons accelerated in the PS and PSB is shown in Fig. 5. As early as 1960, 2.2 × 10¹³ protons per pulse were reached. A particular effort was demanded from these two accelerators during the last neutrino experiments in the CERN West Hall (CHORUS and NOMAD) in 1999, where the PS provided on average 2 × 10¹³ protons per pulse. Since the PS was originally designed for 5–8 × 10⁹ protons per pulse, a good factor of 2000 has been gained over the years through continuous upgrading.

Fig. 5: Evolution of the PS and PSB peak intensity

The PSB impressed not only with a spectacular growth in intensity over the years, but also with the step-wise increase in proton energy by an amazing 75%, from 0.8 to 1.4 GeV (kinetic). It was possible to raise the field of the magnets, which, however, no longer operate in the long-term cost minimum.

Linac1 served the PS as its first proton injector and as the injector for a very successful ion programme. The first phase started in 1976 with deuterons for the ISR, which were also served with alpha particles in 1980. The second phase was the production of S¹⁶⁺ for the SPS from 1990 to 1992. In 1990, in a collaboration with other laboratories, construction began of a new Pb linac, Linac3. It became operational in 1994, and the PS started providing the SPS with ²⁰⁸Pb ions, which are fully stripped after extraction from the PS. Linac1 continued to provide p and H⁻ to the storage ring LEAR (see Section 4.3) for tests until the end of 1996, and was dismantled in 1997.

Furthermore, the PS was modified to permit the acceleration of positrons and electrons to 3.5 GeV for LEP (see Section 4.4) by adding a pre-injector, and by introducing a damping wiggler and an additional rf system in the late 1980s. A new control system allowed fast switching between cycles within a supercycle. An example of such a supercycle in Fig. 6 shows the interleaved acceleration of Pb ions, protons, positrons, and electrons and, in between, the deceleration of antiprotons for LEAR.

Fig. 6: Example of a PS supercycle [6]

The highlights of the work of the accelerator groups during the upgrading of the PS were the design and running-in of the novel fast and slow extractions, the mastering of the high-intensity proton beams and of the ion beams, the merging of the proton bunches for antiproton production by using more than one rf system, and the refining of the computer control system, allowing the creation and manipulation of supercycles without time-consuming commissioning.

3 Expansion into France with the ISR and SPS

3.1 The Intersecting Storage Rings (ISR)

While the SC and the PS were being built, the study for the second generation of CERN accelerators started at the end of 1956 and gradually swung towards a proton–proton collider. In addition, from 1961 onwards, a study of a 300 GeV proton synchrotron was conducted. In 1965, at the end of a lengthy and intense debate, it was decided to first construct the ISR, though a number of physicists were not in favour, as the ISR would not provide secondary beams and was considered to be too much of a shot in the dark.

The ISR were constructed from 1966 to 1970 on land in France, adjacent to the original CERN site in Switzerland. It operated from 1971 to 1983 for physics. The combined-function magnet lattice, providing AG focusing, formed two independent, interleaved rings, intersecting in eight points. Five of these points were used for experiments. The ISR were housed in a big tunnel constructed by the cut-and-fill method. A view of the ISR at intersection point 5 is shown in Fig. 7.

The circumference of the orbits was 943 m, exactly 1.5 times the circumference of the PS. The maximum beam momentum was 31.4 GeV/c. With dc proton currents up to 40 A (single beam up to 57 A!), it reached a luminosity of 1.4 × 10³² cm⁻² s⁻¹ in the superconducting low-beta insertion, a factor 35 above the design luminosity. It was a pioneer of accelerator technology:

– ultra-high vacuum and ion clearing (residual pressure 10⁻¹³ Torr in the experimental areas),
– low-impedance vacuum envelope,
– high-stability power supplies (only 0.1 ppm ripple in the dipole current),
– superconducting low-beta insertion to squeeze the beam in the intersection point (increasing the luminosity by a factor of 6.5).
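As a back-of-envelope cross-check of the figures quoted above (my own arithmetic, not from the report), the stated factors imply a design luminosity of 4 × 10³⁰ cm⁻² s⁻¹, and roughly 2 × 10³¹ cm⁻² s⁻¹ without the low-beta squeeze:

```python
# Relating the quoted ISR luminosity figures: peak 1.4e32 cm^-2 s^-1,
# a factor 35 above design, with the low-beta insertion contributing
# a factor 6.5.
peak = 1.4e32                    # achieved peak luminosity, cm^-2 s^-1
design = peak / 35               # implied design luminosity
without_lowbeta = peak / 6.5     # implied luminosity without the squeeze

print(f"{design:.1e}")           # 4.0e+30
print(f"{without_lowbeta:.1e}")  # 2.2e+31
```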


Fig. 7: View of the ISR at intersection point 5

Figure 8 shows as an example the evolution of the pressure in the vacuum chamber, averaged over the circumference. The design figure was 10⁻⁹ Torr. The excellent vacuum allowed physics runs with the beam coasting for 60 h, with beam lifetimes in excess of many months, which rendered excellent background conditions.

Fig. 8: The evolution of the average vacuum pressure in the ISR [7]

Since it was the first hadron collider, the ISR provided a unique opportunity to study effects which were predicted by theory, such as beam–beam effects, space-charge detuning, beam–equipment interaction, and intra-beam scattering, and to discover unexpected phenomena such as pressure rise due to multipacting. Two prominent examples are given.

The invention of non-destructive beam diagnostics for coasting beams with Schottky noise was of enormous impact. Figure 9 shows the beam noise picked up by an electrode.


Fig. 9: Longitudinal Schottky scans of coasting proton beams of 10, 15, and 19 A [8]

The signal is proportional to the square root of the proton density as a function of beam momentum. The scan with 19 A beam current (bottom trace) shows a dent in the distribution due to beam loss at a non-linear resonance.

This Schottky noise signal became an indispensable tool for monitoring the average momentum, the momentum spread, and the density evolution of the coasting beam without disturbing the beam. Online correction of the space-charge tune-shift became possible, because the betatron tunes of the stack edges or of a resonance could be determined with high precision from the fast and slow transverse-wave Schottky signals.

This discovery of the transverse Schottky signals led to another unique accomplishment, the resurrection of the idea and the first experimental test of stochastic cooling, invented by S. van der Meer in 1968. Figure 10 shows the result of the first test. The beam height is blowing up on account of multiple scattering on the rest gas when the cooling is off.

Fig. 10: Measurement of relative beam height as a function of time with stochastic cooling on and off [9]

Later, physics runs with colliding proton–antiproton beams took place with a more refined cooling system, so that an antiproton beam (see Section 4.2) could circulate up to 345 h without any significant deterioration.
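The sampling principle behind stochastic cooling can be illustrated with a toy Monte Carlo (my own sketch, not the actual ISR or AA hardware; the sample size `ns`, the gain, and the turn count are purely illustrative). Each turn, every particle is corrected by the mean error of a small random sample containing it; with good mixing between turns, the beam variance shrinks roughly as exp(−turns/ns):

```python
# Toy model of stochastic cooling (illustrative sketch only): per turn,
# each particle is kicked by the mean of a random sample of ns particles
# containing it. With full mixing, the variance decays ~exp(-turns/ns).
import random
import statistics

def cooling_turn(x, ns, gain=1.0):
    """One turn: shuffle (perfect mixing), then subtract gain times the
    sample mean from every particle in each group of ns particles."""
    random.shuffle(x)
    for i in range(0, len(x), ns):
        sample = x[i:i + ns]
        mean = sum(sample) / len(sample)
        for j in range(i, i + len(sample)):
            x[j] -= gain * mean

random.seed(1)
ns, turns = 50, 200
beam = [random.gauss(0.0, 1.0) for _ in range(10_000)]
before = statistics.pstdev(beam)
for _ in range(turns):
    cooling_turn(beam, ns)
after = statistics.pstdev(beam)
variance_ratio = (after / before) ** 2
print(0.005 < variance_ratio < 0.05)  # close to exp(-200/50) ≈ 0.018
```

In a real machine the effective sample size is set by the system bandwidth W, which is why cooling rates are of order W/N per unit time: the wider the bandwidth, the smaller the sample each particle shares its correction with.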


The ISR were decommissioned in 1984 in order to free resources for LEP. Very interesting physics results were obtained with this machine, but the main discoveries in this period, the J/ψ and the Υ, were made elsewhere, though they could have been made at the ISR, very likely with some of the experiments that had been proposed but unfortunately were degraded or turned down.

3.2 The Super Proton Synchrotron (SPS)

Work on the concept of a 300 GeV proton synchrotron started in the early 1960s at CERN. After the decision in favour of the ISR, the discussions resumed but were soon deadlocked over the choice of the site, elsewhere in Europe, for which 22 offers had been received. The issue was settled in favour of a site in France near CERN, with the argument of using the PS as injector and other savings being made from synergies if the SPS were built close to CERN.

The project was approved in 1970. After some fluctuations between an initial energy of 150 GeV with missing magnets and 600 GeV with superconducting magnets, the decision was made in 1973 to aim for 400 GeV with conventional magnets. A magnet lattice with a circumference of 6912 m, eleven times the PS circumference, provides AG focusing. In contrast to the ISR, it is of the separated-function type, i.e., the bending is provided by pure dipole magnets and the focusing by separate quadrupole magnets (Fig. 11). This allows the attainment of a magnetic field in the dipoles about 50% higher than with combined-function magnets.

Fig. 11: View of the SPS tunnel with dipole magnets (red) and quadrupoles (blue)

The first beam circulated at top energy in mid 1976, and the design intensity of 10¹³ protons per pulse was reached in the same year. Later the top energy was increased to 450 GeV and the intensity reached 4.5 × 10¹³ protons per pulse. In the period 1982 to 1991, the SPS played a key role in CERN’s antiproton programme, discussed in Section 4.2. The SPS has accelerated ions from 1990 onwards (S in 1990 to 1992, and Pb from 1994), underlining the flexibility of this synchrotron, which served also as LEP injector, accelerating electrons and positrons from 3.5 GeV to 22 GeV. This required the installation of an additional powerful rf system, careful shielding against synchrotron radiation, and new injection and extraction systems. At present, the SPS is being modified and improved to become the LHC injector.

Two large experimental halls (West, North) provide the required floor space for the experimentation with secondary beams of highest energy. The West Area, including the neutrino beams, was operational from January 1977, a few months after the start-up of the SPS, avoiding the mistake made with the PS. The neutrino programme in the West Area was terminated in 1998, but a new neutrino beam (CNGS) is under construction in order to send neutrinos to the Gran Sasso Laboratory of INFN, 730 km away. CNGS will start operation in 2006, continuing the search for neutrino oscillations in Europe which commenced with the West Hall beams.

New expertise was acquired and new techniques have been learnt during SPS construction and commissioning: deep tunnelling with a precision of 2 cm over 1.2 km; direct powering from the grid with reactive power compensation (peak active power 150 MW); computer control from the start (both had been done already on the PSB, albeit on a smaller scale); and very refined, high-efficiency beam extraction techniques (fast, slow, fast–slow) to serve users requiring very different spill times and intensities, often within one and the same acceleration cycle.

4 The quest for world leadership

4.1 Search for the next steps

Knowing the long lead-times of a project, the search for the next steps started in 1974, after completion of the ISR, while the SPS was still under construction. A variety of options was examined up to 1978:

– CHEEP: 27 GeV electrons in an additional ring, colliding with 270 GeV protons in the SPS,
– LSR/SISR: pp collider with 400 GeV per beam,
– MISR: 60 GeV protons in a storage ring built from ISR magnets, to collide with SPS protons,
– SCISR: two new superconducting rings for pp collisions with 120 GeV per beam in the ISR,
– proton–antiproton collisions in the SPS, requiring the construction of an antiproton source,
– Large Electron–Positron (LEP) ring in a new tunnel.

In 1978, the decision was made to go for the proton–antiproton collisions in the SPS as the medium-term project, and to concentrate on LEP as the future long-term flagship, with the hope that it would be powerful enough to discover the Higgs.

It is also useful to recall the international context of these decisions: the study of ISABELLE, a pp collider with 200 GeV per beam at BNL, was in full swing. Hence CERN had to take a quick and economic approach to be first to discover the predicted intermediary W and Z bosons. Incidentally, ISABELLE was stopped at the construction stage in 1983 in favour of the superconducting pp collider (SSC) with 40 TeV in the centre of mass, which in turn was abandoned in 1993. The study of the proton–antiproton option at FNAL had been stopped in 1978, to a large extent because the vacuum system of the FNAL ring could not be upgraded with reasonable effort to reach the required beam lifetime.

4.2 The antiproton programme in the SPS

In order to achieve a reasonable luminosity in the proton–antiproton collision in the SPS, the phase-space density of the secondary antiproton beam, produced by protons hitting a target, had to be increased by many orders of magnitude by stochastic cooling. However, only transverse stochastic cooling had been demonstrated at the ISR. Hence it was decided in February 1977 to experimentally prove momentum cooling and simultaneous stochastic cooling in all three dimensions in a new test ring, the Initial Cooling Experiment (ICE). The latter was constructed and commissioned in record time, and successfully completed its mission in May 1978. In June, the decision was made to perform the proton–antiproton experiment in the SPS. Figure 12 shows an example of momentum cooling in ICE.

ICE also established a new lower limit for the antiproton lifetime (32 h at rest), increasing it by nine orders of magnitude, which was fortunately about three times the minimum required beam lifetime in the upgraded SPS.
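The quoted numbers can be turned into a quick arithmetic check (my own sketch, not from the report): nine orders of magnitude below 32 h puts the previous lifetime limit around 0.1 ms, and one third of 32 h is just under 11 h, consistent with the 10 h SPS beam lifetime mentioned later.

```python
# Arithmetic implied by the ICE result: a 32 h lower limit on the
# antiproton lifetime, nine orders of magnitude above the previous
# limit, and about three times the minimum lifetime needed in the SPS.
new_limit_h = 32.0
previous_limit_s = new_limit_h * 3600 / 1e9  # old limit implied by "nine orders"
min_required_h = new_limit_h / 3             # implied minimum usable lifetime

print(f"{previous_limit_s:.1e}")  # 1.2e-04, i.e. about 0.1 ms
print(round(min_required_h, 1))   # 10.7
```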

Thereafter, the Antiproton Accumulator (AA) was built very quickly from 1979 to 1980. It was astorage ring, downstream of the target hit by 26 GeV/c protons from the PS, with 157 m circumference

FIFTY YEARS OF RESEARCH AT CERN, FROM PAST TO FUTURE: THE ACCELERATORS

Fig. 12: Distribution of protons as a function of momentum before (wide rectangle) and after cooling (narrow peak) in ICE

operating at 3.5 GeV/c and equipped with powerful stochastic cooling devices. In the SPS, the vacuum system had to be upgraded from the 200 nTorr (design) to better than 2 nTorr, low-beta insertions were installed in straight sections 4 and 5 for the UA2 and UA1 experiments, electrostatic deflectors were produced to separate the beams, each consisting of 6 bunches, in 9 of the 12 crossing points, and the rf system was upgraded. New transfer lines to and from the AA and from the PS to the SPS (TT70) were built. In addition, transfer line TT6 was added to channel antiprotons to the ISR. Figure 13 shows the layout of the accelerators.

Fig. 13: Accelerators and beam lines for the antiproton–proton experiment in the SPS

Later, in order to shorten the cooling times and to obtain a higher antiproton density, the Antiproton Collector (AC) ring, with a circumference of 182 m, was added around the AA in 1987. Also operated at 3.5 GeV/c, its purpose was to capture more antiprotons, thanks to its much larger acceptance. After reduction of the momentum spread by bunch rotation of the injected beam, the stack was subjected to fast three-dimensional pre-cooling and then transferred to the AA. The combined action of the AC and AA eventually raised the daily antiproton production rate by a factor of 6 and the six-dimensional antiproton phase-space density by up to a factor of 4 × 10^9. Although 10^12 antiprotons per day were accumulated, enough for about two fills of the SPS, where the beam lifetime was 10 h, this left little margin, and the whole operation was a continuous cliff-hanger: there was always the risk of losing the stack in the AA or losing the beam during acceleration in either the PS or the SPS, with the resulting loss of a whole day.

K. HUBNER

The energy of the colliding beams was 273 GeV between 1982 and 1985. It was raised to 315 GeV from 1987 to 1991, when the programme was terminated. Figure 14 shows the performance of the SPS as collider in terms of peak luminosity and integrated luminosity per year.

Fig. 14: Performance of the SPS antiproton–proton collider [10]

The luminosity was rather low before the advent of the AC but still sufficient to firmly establish the existence of the W and Z in the runs of 1982 and 1983, which won Carlo Rubbia and Simon van der Meer the Nobel Prize in 1984. The large 4π detectors, which had been denied to the ISR, had been decisive.

The rewards for CERN were extremely important not only in particle physics but also in accelerator physics and technology. For the first time the beam–beam effect with bunched hadron beams could be studied, including long-range beam–beam forces and the effect of very unequal beam sizes at the interaction points, invaluable experience for the LHC. The bunches collided in three crossing points: in the two experiments and at the point mid-way in between. In order to avoid beam collisions at the other nine of the twelve possible crossing points when operating with six bunches per beam, the orbits of the protons and antiprotons were separated in these nine points by distorting the orbits into a 'Pretzel' shape by means of electrostatic separators, providing important experience for LEP. New civil-engineering know-how was acquired with the digging of the large caverns in the molasse bedrock for the two experiments UA1 and UA2, which turned out to be extremely useful for LEP and the LHC.

4.3 The low-energy antiproton programme

The existence of a powerful antiproton source motivated the construction of the Low-Energy Antiproton Ring (LEAR), a buffer and decelerator ring, producing antiprotons with a kinetic energy of 5 to 1270 MeV in long spills for the physics community interested in low-energy physics with antiprotons. It was built between 1980 and 1982 and operated until the end of 1996.

In order to counteract the blow-up of the beam during deceleration below the kinetic injection energy of 180 MeV, stochastic cooling and electron cooling had to be used. A highlight was the ultra-slow ejection, with a noise signal feeding the particles into a non-linear resonance, so that smooth spills of up to 10 h became possible. The ring also had an internal target which served to produce the first anti-hydrogen atoms ever seen. Unfortunately, only a handful of these atoms were produced, and not at rest as required for precise tests of the charge-parity-time (CPT) conservation theorem.

Hence, in order to pursue this line of research seriously and, it was hoped, with sufficient quantities of anti-hydrogen at rest, the AC ring was modified between 1998 and 2000 to become the Antiproton Decelerator (AD), providing a very stable antiproton beam at 5 MeV after deceleration from 2.7 GeV. Again, the beam blow-up inherent to deceleration is compensated by stochastic and electron cooling. A highlight is a special linear Radio-Frequency Quadrupole Decelerator (RFQD) lowering


further the kinetic energy of the extracted beam to a value which can be chosen at will between 120 and 10 keV; this increases the intensity by a factor of 50 compared to beams having their energy lowered by degraders.

4.4 The Large Electron–Positron Ring (LEP)

The design of the future CERN flagship started in 1975. It went through a number of iterations, each the result of a compromise between the desiderata of the physics community and feasibility, as shown in Table 2. The last column gives the final design parameters.

Table 2: LEP design iterations

                       1977  1978  1979  1984
Beam energy (GeV)       100    70    86    55
Circumference (km)       52    22    31    27
Number of experiments     8     8     8     4
RF power (MW)           109    74    96    16

The discussions about a possible site were cut short by proposing the PS/SPS as injector chain, complemented with a new Lepton Pre-Injector (LPI) feeding the PS with either electrons or positrons at 600 MeV. The LPI was constructed with the help of, and in close collaboration with, LAL/Orsay.

After approval at the end of 1981, construction started immediately, and the first beams collided in LEP in 1989 at 46 GeV per beam, at the Z0 resonance (LEP1). In the framework of the LEP2 programme, the beam energy was increased in steps from 1995 onwards, depending on the installation schedule of the superconducting cavities. In 1997 the threshold for W-pair production was reached, and the final energy of 104.5 GeV per beam was attained by LEP2 in 2000, when all available spares were mobilized in a dramatic final spurt to find the elusive Higgs particle. The accelerator was dismantled in 2001 to make space for the LHC.

A number of technical challenges had to be met by innovative designs. Some examples follow. The cheap and rigid steel–concrete dipole magnets consisted of spaced iron laminations with cement mortar in between (27% steel filling), kept together by four pre-stressing rods running over the whole length of 6 m. The concentration of the magnetic field in the laminations avoided reproducibility problems at injection energy, where the average dipole field was only 240 G, while the field in the laminations stayed below 5 kG at maximum energy, avoiding any saturation effect. In order to avoid disturbance of the solenoid field in the detectors, warm-bore, iron-free, superconducting, low-beta quadrupoles had to be developed.

Since the bending field at injection was below the threshold of distributed sputter-ion pumps, a non-evaporable getter (NEG) pumping system was used for the first time in the LEP dipoles and dealt effectively with the large gas load induced by synchrotron radiation at high energy. A cross-section of the vacuum chamber in the dipoles is shown in Fig. 15.

In order to minimize power consumption, the 352 MHz Cu rf cavities operating at ambient temperature were coupled to single-cell spherical storage cavities operating with a mode having vanishing fields on the walls and, therefore, very low losses. The coupling was chosen so that the electromagnetic energy was stored in the storage cavities in the intervals between the bunches.

Since the required rf voltage scales with γ^4, it had to be increased massively from the 0.4 GV required at the Z0 resonance to reach the W threshold and beyond. Thus superconducting cavities were required to keep the power bill within reasonable bounds. The management had the vision and courage to start R&D for these novel and complex components as early as 1979, i.e., long before the approval of LEP1, and by 1989, 20 bulk-Nb cavities operating at 352 MHz and at 4.5 K could be ordered from industry. However, for the next batch of 160 cavities, CERN switched to the Nb-film technology, where
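The γ^4 scaling can be made concrete with the standard expression for the energy radiated per turn by an electron, U0 [MeV] ≈ 0.0885 E^4 [GeV] / ρ [m]. The bending radius of roughly 3100 m used below is an assumed LEP-like value for illustration, not a figure from the text.

```python
# Energy radiated per turn by an electron in a storage ring:
#   U0 [MeV] = 0.0885 * E^4 [GeV] / rho [m]   (standard synchrotron-radiation result)
# rho ~ 3100 m is an assumed LEP-like bending radius, for illustration only.

def loss_per_turn_mev(energy_gev: float, rho_m: float = 3100.0) -> float:
    """Synchrotron-radiation energy loss per turn for an electron, in MeV."""
    return 0.0885 * energy_gev**4 / rho_m

u0_z = loss_per_turn_mev(45.6)    # LEP1, at the Z0 resonance
u0_w = loss_per_turn_mev(104.5)   # LEP2 maximum

print(f"Loss per turn at 45.6 GeV:  {u0_z:7.1f} MeV")
print(f"Loss per turn at 104.5 GeV: {u0_w:7.1f} MeV")
print(f"Ratio: {u0_w / u0_z:.1f}  vs (104.5/45.6)^4 = {(104.5 / 45.6)**4:.1f}")
```

With these assumptions the loss per turn grows from roughly 0.12 GeV at LEP1 to about 3.4 GeV at 104.5 GeV, consistent with the several GV of rf voltage LEP2 eventually needed.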


Fig. 15: Vacuum chamber in the dipole magnet, made of an extruded Al profile (1) with the elliptic beam channel, three water cooling ducts (2), and surrounded by a lead shield (3). The NEG pump (4) is connected by longitudinal slots (5) [11].

the film is obtained by sputtering Nb on Cu sheets. This technology has inherent advantages such as better thermal stability against quenching, savings on Nb material, insensitivity to stray magnetic fields, and a higher quality factor. The cavity production was an example of successful transfer to industry of technology developed at CERN.

LEP2 required a number of modifications and improvements apart from the upgrading of the rf system: two new klystron galleries had to be dug and equipped; a cryogenic system with long transfer lines and four refrigerators, each providing initially 6 kW and later 12 kW, was required for cooling all the superconducting cavities and the increased number of superconducting quadrupoles.

The final rf configuration comprised 272 Nb-film cavities operating on average at 7.5 MV/m (nominal 6 MV/m), 16 bulk-Nb cavities (4.5 MV/m), and 56 Cu cavities (1 MV/m), fed by 43 klystrons of the 1 MW class. It was the largest superconducting rf system ever, with 490 m total active length of superconducting cavities, which together with the Cu system eventually provided an rf circumferential voltage of 3.6 GV. Figure 16 shows the evolution of the available rf voltage and the beam energy of LEP2 over the years.
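As a rough consistency check of these figures (a back-of-envelope sketch, not from the text: the ~1.7 m active length per superconducting cavity is inferred by dividing the quoted 490 m by the 288 superconducting cavities, and the ~2 m active length per Cu cavity is an assumption):

```python
# Back-of-envelope check of the quoted ~3.6 GV circumferential rf voltage.
# Assumed: the 490 m of SC active length is shared equally by 272 + 16 = 288
# SC cavities; ~2 m active length per Cu cavity is a guess for illustration.

sc_len = 490.0 / (272 + 16)     # ~1.70 m active length per SC cavity (inferred)
v_film = 272 * sc_len * 7.5     # Nb-film cavities at 7.5 MV/m average
v_bulk = 16 * sc_len * 4.5      # bulk-Nb cavities at 4.5 MV/m
v_cu   = 56 * 2.0 * 1.0         # Cu cavities at 1 MV/m, ~2 m each (assumed)

total_mv = v_film + v_bulk + v_cu
print(f"Estimated total rf voltage: {total_mv / 1000:.2f} GV")
```

The estimate comes out within a few per cent of the quoted 3.6 GV, given the rounded gradients and the assumed cavity lengths.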


Fig. 16: Evolution of beam energy, available and nominal rf voltage of LEP2 [12]

LEP1 reached an integrated luminosity of 206 pb^−1 at the Z0 energy, and 784 pb^−1 at and above the W threshold were collected with LEP2. LEP underwent practically continuous improvement, which pushed the integrated luminosity per year from 9 pb^−1 in 1990 to the record of 254 pb^−1 in 1999.

New challenges had to be mastered during LEP construction and operation. The civil-engineering know-how acquired with the SPS had to be applied on a large scale for the construction of the 27 km


circumference tunnel and the big experimental halls. One of the largest rf and cryogenic systems for accelerators had to be designed, installed, and operated. Operation with strong synchrotron radiation damping of the beams and very strong beam–beam effects provided new insights for accelerator physics. Refined methods for beam energy calibration had to be developed, reducing the contribution to the uncertainty in the W mass to only 10 MeV. The operations group learnt to deal with perturbations from earth currents caused by near-by dc-operated trains, which produced a relative variation of the magnetic dipole field of 2 × 10^−4; with changes in the circumference of LEP caused by tides, changes in the underground water table, and rain, leading to variations of 1 mm in circumference and resulting in deviations of 10 MeV in beam energy; and, last but not least, with beer bottles found in the vacuum chamber after a shutdown.

Although a large number of unique physics results were obtained from LEP, solidly confirming the Standard Model, the LEP machine energy was not enough to produce the elusive Higgs particle or the postulated supersymmetric particles. Whether it was a wrong decision in 1996 to discontinue the industrial production of the superconducting cavities and, therefore, not to exploit LEP fully with 384 Nb-film superconducting cavities providing 4.8 GV to reach 220 GeV in the centre of mass, will only be known from the results of the Tevatron and LHC in a few years' time.

4.5 The Large Hadron Collider (LHC)

The option to complement or to replace LEP later with a large hadron collider had been in the minds of European particle physicists since the mid-1970s. The design of the LHC started in 1983, and the project was approved in two steps: a two-stage approach was first proposed in 1994, with a missing-magnet scheme providing 5 TeV per proton beam which could be upgraded to 7 TeV by adding the missing complement of magnets; in 1996, when substantial contributions from non-member states had been secured, a single-stage construction with 7 TeV top energy was accepted. The start of operation is scheduled for 2007.

The nominal luminosity is 1 × 10^34 cm^−2 s^−1 with protons of 7 TeV per beam, and 1 × 10^27 cm^−2 s^−1 when operating with Pb ions of 2.8 TeV/u. The counter-rotating beams are horizontally separated in the arcs and go from the inner arc to the outer arc and vice versa at the four crossing points, where the beams interact in the detectors at a small crossing angle.

The project team has to master a number of challenges in technology and accelerator physics. The most important components are the 1232 dipole magnets, with a mass of 28 t and a length of 14.5 m (iron yoke). The magnetic field is 8.3 T, bending the orbits of the counter-rotating particles, which are separated by only 194 mm. A cross-section of the very compact 'two-in-one' magnet is shown in Fig. 17.

The superconducting coils are wound with cables made from 6–7 µm Nb-Ti alloy filaments embedded in a Cu matrix, which has to absorb the heat dissipation in case the magnet quenches. The cable carries 12 kA. An elaborate quench protection is required, as the stored electromagnetic energy is 7 MJ per magnet and the heat capacity of the cable is very much reduced at this low temperature. The non-magnetic steel collars keep the coil under compression over the whole operating range in order to avoid any movement of the cables or the coils despite the very strong electromagnetic forces (2 MN/m acting transversely per coil quadrant). The magnets are cooled by superfluid liquid He at 1.7 K generated by the modified and upgraded LEP cryogenic plants.
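The quoted numbers fix the scale of the quench-protection problem; as a small arithmetic sketch (the per-magnet inductance below is derived from the quoted 7 MJ and 12 kA, it is not stated in the text):

```python
# Stored magnetic energy E = L * I^2 / 2  =>  L = 2E / I^2.
# 7 MJ per dipole, 12 kA, and 1232 dipoles are the figures quoted in the text.

E_magnet = 7e6      # J, stored energy per dipole (quoted)
I = 12e3            # A, operating current (quoted)

L_magnet = 2 * E_magnet / I**2    # effective inductance per dipole
E_total = 1232 * E_magnet         # energy stored in all dipoles together

print(f"Inductance per dipole: {L_magnet * 1e3:.0f} mH")
print(f"Total stored energy in the dipoles: {E_total / 1e9:.1f} GJ")
```

About 100 mH per dipole and roughly 8.6 GJ in the whole ring of dipoles, which is why an elaborate energy-extraction and quench-protection system is indispensable.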

The vacuum required for a beam lifetime of about 100 h is provided by the pumping action of the vacuum tube at 1.9 K. However, the gas layer built up on this cold tube has to be protected against direct synchrotron radiation (0.2 W/m) emitted by the protons, making the LHC the first proton storage ring where synchrotron radiation has an impact on the hardware design. In addition, the proton bunches produce primary electrons by ionization of the rest gas, which are transversely accelerated by the electric field of the bunches and hit the walls, where the number of electrons gets amplified as a result of secondary emission. This beam-induced multipacting, an effect discovered at the ISR, will produce a substantial heat load.


Fig. 17: Cross-section of the LHC dipole magnet [13]

In order to avoid desorption of the adsorbed gas by direct synchrotron radiation emitted by the proton beam or by photoelectrons, and to absorb the heat load, an elaborate beam screen (Fig. 18) cooled to 5–20 K protects the cold vacuum chamber and its gas layer. A slight saw-tooth on the surface of the screen reduces photon reflectivity, and coating the warm vacuum chambers with a special getter lowers the secondary emission yield. Scrubbing the walls with synchrotron radiation and electrons for a while during running-in will further reduce the secondary emission yield. This will be performed at lower beam energy and/or with increased bunch spacing to limit the heat load.

Fig. 18: Model of the LHC beam screen [13]

Important beam-dynamics issues have to be tackled: i) the long-range beam–beam forces in the 120 parasitic crossings near the interaction points, because the beams contain 2808 bunches each and, therefore, the two beams 'see' each other in a common vacuum pipe before the actual interaction points; ii) multibunch instabilities, especially those induced by the electron clouds, which cannot escape to the walls because the spacing of the bunches is only 25 ns.


Figure 19 shows the CERN accelerator complex with the LHC in the LEP tunnel, the SPS and the PS complex with all the transfer lines, and the experimental areas of the PSB (ISOLDE), the PS (East Area, neutron Time-of-Flight facility (n-ToF)), and of the SPS (West and North Areas). The CERN Neutrino beam to Gran Sasso (CNGS) is under construction.

Fig. 19: The CERN accelerator complex

The new LEIR ring is also shown, as an important element in the ion-injection scheme of the LHC. It is nothing but LEAR modified to become a buffer accumulation ring for ions, inserted between the fast-cycling Pb linac, Linac 3, and the slow-cycling PS. It will convert four to five long linac pulses of low density into two bunches of high density at 4.2 MeV/u by means of electron cooling, which is more efficient than stochastic cooling at this low energy, before accelerating these bunches to 72 MeV/u and transferring them to the PS.

LEIR is a good example of the CERN tradition of fully exploiting past investment, as done, for example, with the PS and SPS, both of which served as LEP injectors in parallel with the fixed-target programme and will be used likewise for the LHC.

5 Future accelerator-based options for CERN

5.1 Overview

Since there is increasing awareness of the long lead-times of large projects, it is timely that the discussion of this issue has already started in the physics community and in the CERN Council; it will probably continue for a number of years, even when the first results of the LHC become available. This situation will be very similar to the one we experienced earlier during the passionate, lengthy debates and scrutiny of options in the early 1960s after the construction of the PS, and in the middle of the 1970s after the completion of the ISR, although this time the discussion will be more global and will include our colleagues from other regions of the world.

There are three basic directions to take: hadron colliders, lepton colliders, and advanced neutrinobeams. Each can lead either to exploratory experiments or to precision physics.

The most straightforward option for CERN seems to be the upgrade of the luminosity of the LHC by an order of magnitude to 10^35 cm^−2 s^−1. The upgrade of the LHC energy by a factor of two, yielding


a centre-of-mass energy of 28 TeV, has also been studied, although the step in energy is small, and the length of the shutdown required to install new magnets, and probably also to upgrade the injectors, is somewhat of a deterrent. A Very Large Hadron Collider (VLHC) was considered but turns out to be excluded by the geology around CERN. According to the study by our colleagues in the USA, led by FNAL, it would require a circumference of more than 200 km to reach 40 TeV and, eventually, 200 TeV in the centre of mass.

The VLHC also appears less attractive as the next large new facility because there seems to be a broad consensus in the international physics community that a linear electron–positron collider in the centre-of-mass range between 0.5 TeV and 1 TeV is the best choice. Although the energy reach of the latter is rather limited, it is considered to be complementary to the LHC and could provide hints beyond LHC energies, provided its luminosity is high enough to perform precision measurements. One could imagine CERN joining an international consortium to construct this International Linear Collider (ILC). In parallel with the studies for such a TeV-class collider based on the extension of known technology, conducted in particular by DESY, KEK, and SLAC, a collaboration led by CERN has for a number of years been studying a technology which would allow the exploration of the high-energy frontier beyond the LHC with a Compact LInear Collider (CLIC), also operating with positrons and electrons. This frontier could also be reached with µ–µ colliders in the TeV class, but the technologies required for the production of intense and dense µ beams appear so challenging that some disenchantment has replaced the initial enthusiasm.

Advanced neutrino beams also appear to be an interesting option, in particular since neutrino physics is rather decoupled from the LHC results, and a number of synergies can be imagined with nuclear physics and with the condensed-matter physics community promoting a European spallation source. In a first instance, a more powerful but still conventional neutrino beam ('Superbeam') has been considered, based on a powerful superconducting linac (4 MW beam power) using LEP components and feeding a new accumulation and compressor ring in the ISR. The next stage, using the same linac as proton driver, could be a 'Neutrino Factory' based on a µ-decay ring providing pure neutrino beams. A competing idea is to add a decay ring to the SPS in which an intense beam of β emitters would coast, producing a pure electron–neutrino beam. In both of these decay rings, charge-conjugate beams can also be produced: in the first case by using negative muons instead of positive ones, in the second case by choosing a positron emitter instead of an electron emitter.

5.2 LHC upgrade

A staged approach has been suggested [14] and is briefly sketched here. First, the luminosity would be raised in steps. Examination of the formula for the luminosity shows the parameters which can be influenced:

L = nb frev Nb^2 F / (4πσ*^2) ,    (1)

where nb is the number of bunches per beam, frev the revolution frequency, Nb the number of protons per bunch, σ* the r.m.s. beam radius at the interaction points (round beams), and F = 1/(1 + (θσz/2σ*)^2)^(1/2) a form factor of order 1 depending on σ*, the beam crossing angle θ, and the bunch length σz. For the LHC, its value is 0.9 for θ = 0.3 mrad.
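Formula (1) can be checked against the nominal LHC parameters. The revolution frequency, bunch population, and beam radius below (frev ≈ 11 245 Hz, Nb = 1.15 × 10^11, σ* ≈ 16.7 µm) are the commonly quoted nominal values, assumed here for illustration; nb = 2808 and F = 0.9 appear in the text.

```python
import math

# Luminosity L = nb * frev * Nb^2 * F / (4*pi*sigma*^2), Eq. (1).
# nb and F are from the text; frev, Nb, and sigma* are assumed nominal values.

nb    = 2808        # bunches per beam (text)
frev  = 11245.0     # revolution frequency in Hz (assumed, ~c / 26.7 km)
Nb    = 1.15e11     # protons per bunch (assumed nominal value)
sigma = 16.7e-4     # r.m.s. beam radius at the IP in cm (assumed nominal value)
F     = 0.9         # crossing-angle form factor (text)

L = nb * frev * Nb**2 * F / (4 * math.pi * sigma**2)
print(f"L = {L:.2e} cm^-2 s^-1")
```

With these numbers the formula gives close to the nominal 10^34 cm^−2 s^−1 quoted earlier for the LHC.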

In Phase 0, the maximum bunch intensity, limited by the beam–beam effects, is raised by colliding the beams in interaction points 1 (ATLAS) and 5 (CMS) only, and the crossing angle is decreased by modifying the layout of magnetic elements at these crossing points. This results in an increase of a factor of 2 to 3 in luminosity.

Phase 1 requires new insertion quadrupoles to increase the beam focusing (β* = 0.5 → 0.25 m), resulting in a σ* decreased by √2. Very likely these quadrupoles would have superconducting coils made from Nb3Sn, a superconductor with a higher critical field but very brittle (VLHC technology). Increasing nb and Nb further results in a luminosity between 5 and 7 × 10^34 cm^−2 s^−1.
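The factor gained in Phase 1 follows from σ* = √(β*ε) at fixed emittance ε: halving β* shrinks σ* by √2 and, by Eq. (1), doubles the luminosity (ignoring the small accompanying change of the form factor F). A minimal sketch:

```python
import math

# sigma* scales as sqrt(beta*) at fixed emittance, and L scales as 1/sigma*^2.
beta_old, beta_new = 0.5, 0.25   # m, Phase 1 values from the text

sigma_ratio = math.sqrt(beta_new / beta_old)   # sigma*_new / sigma*_old
lumi_gain = 1.0 / sigma_ratio**2               # luminosity gain, F held fixed

print(f"sigma* shrinks by {sigma_ratio:.3f} (= 1/sqrt(2)); luminosity gain = {lumi_gain:.1f}")
```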


Phase 2 is an energy upgrade, a major upheaval which cannot be envisaged before the next decade. In order to reach 14 TeV per beam compared to the present nominal 7 TeV, new magnets are required amongst many other components, in particular new dipoles with a magnetic field of 15 T compared to the present 8 T; the injection energy has to be raised to 1 TeV by equipping the SPS with pulsed superconducting magnets. Both these modifications demand a vigorous R&D programme in superconducting materials and magnet design. It is not likely that this energy upgrade could be accomplished before 2020, which diminishes the attractiveness of the proposition.

5.3 Electron–positron linear collider

Schemes for a linear collider with a centre-of-mass energy between 0.5 and 1 TeV and a luminosity around 3 to 6 × 10^34 cm^−2 s^−1 have been the subject of an intense R&D programme led by DESY, KEK, and SLAC [15]. Recently, the International Technology Review Panel (ITRP) recommended concentrating on the linac technology based on superconducting bulk-Nb accelerating structures operating at 1.3 GHz with an accelerating gradient of 25–35 MV/m, as pioneered by the TESLA collaboration based at DESY. In the TESLA proposal, the facility had a length of 33 km, providing 0.8 TeV in the centre of mass. Figure 20 shows a schematic layout of TESLA with its X-ray free-electron laser (FEL) option, which will not be adopted for the ILC.

Fig. 20: Schematic layout of TESLA [16]

After the decision on technology, the next step towards the ILC is the setting-up of a central team and regional teams. They will review the TESLA design and its challenges, e.g., the very long dog-bone damping rings (2 × 17 km); the non-conventional positron production through laser light generated by the electron beam at top energy, which tightly couples the positron operation to the electron one; the focusing of the beams to σ*x/y = 400 nm/3 nm; and the safe handling of 18 MW beam power per beam. Test facilities are under construction to test the basic components and to demonstrate the engineering margins required for such a big project.

In order to reach a very high energy in a reasonable length, CLIC [17] would use a very high accelerating gradient (150 MV/m) generated in normal-conducting accelerating structures operating at 30 GHz, though the optimum frequency may be somewhat lower. The choice of gradient and frequency makes the linac very short, such that 3 TeV in the centre of mass are reached within an overall length of


about 33 km (10 km for 0.5 TeV). The linac is also very compact in its transverse dimension, as the outer diameter of the accelerating structures is of the order of 10 cm. Figure 21 shows a cross-section of the CLIC tunnel, which has the same diameter as the LEP/LHC tunnel.


Fig. 21: Cross-section of the CLIC tunnel. The positions of the drive linac and main linac are indicated by arrows. The linacs are linked by a horizontal waveguide [17].

The novel power-generation scheme of CLIC does not use a vast array of klystrons but a number of drive beams which run parallel to the main beam along a considerable length (≈640 m). The intense electron drive-beam pulses of 150 A in 130 ns are decelerated from 2 to 0.2 GeV, generating in the decelerating structures a 30 GHz rf power pulse which powers the structures accelerating the low-intensity main beam pulse (1 A in 102 ns) up to 1.5 TeV.

Although CLIC holds great promise, its linac technology is certainly less mature than that of the ILC, whose feasibility is fairly well established thanks to the TESLA Test Facility at DESY. The most important challenges for CLIC are the high accelerating gradient (150 MV/m) and the generation and control of the high-power drive beam. A disadvantage of CLIC is that its basic rf unit is relatively large, requiring a substantial investment for a full-scale test.

5.4 Advanced neutrino beams

There are basically two production mechanisms for neutrino beams: i) the decay of π and K mesons; ii) the decay of beta emitters.

5.4.1 Neutrinos from mesons or muons

The first process yields neutrinos either directly through the π and K decay or through the subsequent µ decay:

p → target → π+ (K+) → µ+ + νµ    (2a)

µ+ → e+ + νe + ν̄µ    (2b)


Process (2a) has been the traditional way of producing neutrino beams, still used at present (KEK/K2K) and in the medium-term future (FNAL/NuMI, CERN/CNGS). The neutrino 'superbeams' under study are based on it, using very powerful proton accelerators providing a beam power in the MW range at the target. Selecting negative π and K by focusing yields the charge-conjugate particles. Note that this neutrino beam is always contaminated with neutrinos from the π and K of the opposite polarity, which cannot be eliminated completely by the focusing of the secondary particles. Further sources of contamination are the neutrinos from µ decay and the neutrinos from the decay of Kµ3, Ke3, and neutral kaons.

Process (2b) uses muons decaying in a suitable storage ring, resulting in a very pure neutrino beam of only two species. Since one hopes to obtain very intense beams from such a facility, it was baptized the 'Neutrino Factory'. Figure 22 shows the schematic layout of such a facility.

Fig. 22: Schematic layout of the European Neutrino Factory Complex [18]

The driver is a superconducting linac based on LEP components, feeding negative H ions to an accumulation ring where they are stripped during the multiturn injection process. After compression in the twin ring, the 4 MW beam hits the target. The resulting π and K decay to muons, which subsequently have their phase-space density increased in a cooling device. Acceleration to 50 GeV has to be very rapid on account of the limited lifetime of the muons. It takes place in a linac followed by two recirculating linacs before the muons are injected into the decay ring. This is not a small facility: for example, the total rf voltage required is 15 GV, about four times the voltage installed in LEP2.

The critical design issues related to the high beam power are the proton driver, the target, and the downstream µ collection system. The target has to withstand the shock of the proton beam, and the wide-band π collection system the intense radiation. The hardware must be designed to minimize induced radioactivity and to permit proper maintenance. The hitherto untried technology of ionization cooling of the µ beam has to be tested thoroughly. A Muon Ionization Cooling Experiment (MICE) has been worked out in detail, and it is hoped that it will be conducted fairly soon using a µ beam either at PSI or at ISIS/RAL. A further challenge is the rapid acceleration of the muons, which have a very limited lifetime (cτµ = 658 m at rest), requiring high accelerating gradients at low energy so as to increase their lifetime, which is proportional to γ, as quickly as possible.
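The kinematics behind the rapid-acceleration requirement can be sketched from cτµ = 658 m: the laboratory decay length is γβcτµ, so a 50 GeV muon survives over some 300 km of flight path while a muon at a few hundred MeV decays within a couple of kilometres. The muon mass of 105.66 MeV used below is the standard value, not a figure from the text.

```python
import math

# Laboratory decay length of a muon: L = gamma * beta * (c*tau),
# with c*tau = 658 m at rest (text) and m_mu = 105.66 MeV (standard value).
C_TAU = 658.0     # m (text)
M_MU  = 0.10566   # GeV

def decay_length_m(energy_gev: float) -> float:
    """Mean decay length in the laboratory for a muon of the given total energy."""
    gamma = energy_gev / M_MU
    beta = math.sqrt(1.0 - 1.0 / gamma**2)
    return gamma * beta * C_TAU

for e in (0.3, 2.0, 50.0):
    print(f"E = {e:5.1f} GeV -> decay length {decay_length_m(e) / 1000:8.1f} km")
```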


5.4.2 Neutrinos from beta emitters

The idea is to generate suitable beta emitters by the ISOL technique, developed at CERN and currently used at ISOLDE, to accelerate them using the existing PS and SPS, and to let them decay in a new storage ring. There they would produce an intense, pure neutrino beam of only one species, with a narrow opening angle due to the high Lorentz factor, which is γ ≈ 150 at SPS top energy. Examples of suitable beta decays are

6He → 6Li + e− + ν̄e ,
18Ne → 18F + e+ + νe .    (3)

The He isotope has a lifetime of 0.8 s at rest; the Ne isotope has 1.6 s, which would give comfortable lifetimes of 120 s and 96 s, respectively, at SPS top energy. The He isotope is obtained from the reaction Be(n, α) induced by spallation neutrons produced by protons hitting a heavy-metal target; the Ne isotope is produced by protons hitting a MgO target.
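The quoted dilated lifetimes are simple time dilation, τlab = γ τrest, with the rest lifetimes given here and the Lorentz factors (γ = 150 for 6He, γ = 60 for 18Ne) quoted for the decay ring:

```python
# Time dilation of the beta-emitter lifetimes: tau_lab = gamma * tau_rest.
# Rest lifetimes and Lorentz factors are the values quoted in the text.

ions = {"6He": (0.8, 150), "18Ne": (1.6, 60)}   # (tau_rest in s, gamma)

dilated = {ion: gamma * tau for ion, (tau, gamma) in ions.items()}
for ion, tau_lab in dilated.items():
    print(f"{ion}: dilated lifetime {tau_lab:.0f} s")
```

This reproduces the 120 s and 96 s given in the text.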

A possible schematic layout of such a neutrino source (called a 'beta-beam') using the CERN infrastructure is shown in Fig. 23. A new injection system for the PS is required, with a Superconducting Proton Linac (SPL) as front-end providing protons at 2.2 GeV. The ions diffusing out of the target are collected, fully ionized, and bunched in an Electron Cyclotron Resonance (ECR) device and, subsequently, accelerated in a linac to about 100 MeV/c before being injected into a bunching ring which precedes a Rapid Cycling Synchrotron (RCS) [19].

[Fig. 23 schematic: proton driver (SPL) → ISOL target and ion source → beam preparation (pulsed ECR) → ion acceleration (linac) → acceleration to medium energy (bunching ring and RCS) → PS → SPS → decay ring with Bρ = 1500 Tm, B = 5 T, C = 7000 m, Lss = 2500 m; 6He: γ = 150, 18Ne: γ = 60. Existing machines plus new additions.]

Fig. 23: Schematic layout of a neutrino source based on beta emitters [19]

The accelerator issues are the generation of an ion beam of sufficient intensity, the detailed understanding of the transmission efficiency of the many accelerators in this chain, the control of the inevitable losses during acceleration of the decaying ions in order to avoid contamination of the accelerators, and the accumulation of the ions in a very few bunches in the storage ring. Issues common to all the other neutrino sources are the handling of the hot target and the activated components downstream of the target, and the politically delicate and not easily controllable procedures to obtain and to retain the authorizations for the operation of a MW-class proton driver and target.

Since the front-end linac is virtually the same as the one needed for the superbeam, for which it has to provide 4 MW of beam power, and since only 200 kW on target are required for the neutrino source based on beta emitters, the idea has been aired of having both facilities built at CERN with the two different neutrino beams pointing to the same detector. The advantage would be that data on νµ → νe

FIFTY YEARS OF RESEARCH AT CERN, FROM PAST TO FUTURE: THE ACCELERATORS

and νe → νµ oscillations and, in a later run, on the oscillations of the charge conjugates, could be collected simultaneously, significantly speeding up the data taking.

As the nuclear physics community is also very interested in spallation neutrons to generate radioactive ion beams, possible synergies are under investigation in a study sponsored by the European Union.

6 Conclusions, or what we have learnt in the past 50 years

It was the continuous stream of good and attractive projects that created a growing user community giving unfailing support to CERN. The strategy has included the full exploitation of existing facilities, either by upgrading or re-use after modification, and the timely stopping of facilities when they were no longer on the leading edge. This has led to the evolution of the accelerator park given in Fig. 24.

Fig. 24: Evolution of CERN’s accelerator park

Examination of the graph indicates that colliders have a short lifetime compared to that of accelerators. The latter have turned out to be the real work-horses of CERN, serving as particle sources for the fixed-target programme, providing the indispensable test beams, and faithfully playing the role of injectors for leading-edge colliders. The SPS was even turned into a very successful collider for a while.

There are a number of lessons from this success story. First, CERN has always gone for projects and has carefully avoided losing time and reputation with uncommitted R&D. It realized very soon that large projects have long lead-times and that, therefore, one has to think ahead, which nowadays means thinking in terms of decades. Obviously, key technologies must be mastered, preferably by CERN itself, before inviting tenders from industry. Industrial production has to be monitored meticulously, and CERN must have the resources to intervene immediately with advice and active help in case a problem arises.

Working on the operation and upgrading of existing accelerators while simultaneously either conducting R&D or constructing new facilities has turned out to be extremely beneficial for CERN. Operation immediately gains from the new insights obtained and techniques learned in R&D, and the design of the new facility is based on the experience and know-how acquired in operation, which keeps the project team grounded in reality.


When drawing up a project it is imperative to work as closely as possible with the users, but the parameters should be chosen based on existing know-how and, whenever possible, on full-scale tests of all critical components, and not on the desiderata of the users, which do, however, provide the indispensable guidelines. Sufficient engineering margins guarantee long-term reliability and flexibility, and should not be sacrificed to exaggerated competition leading to rushed decisions that are bitterly regretted afterwards. All this is best underlined by citing from J. B. Adams's farewell talk to Council in December 1980.

The question of how much flexibility to build into a machine is obviously a matter of judgement, and sometimes the machine designers are better judges than the physicists, who are anxious to start their research as soon as possible. But whatever compromise is reached about flexibility, one should certainly avoid taking risks with the reliability of the machine, because then all its users suffer for as long as it is in service, and the worst thing of all is to launch accelerator projects irrespective of whether or not one knows how to overcome the technical problems. That is the surest way of ending up with an expensive machine of doubtful reliability, later than was promised, and a physicist community which is thoroughly dissatisfied.

CERN will have to adapt to the large size and complexity of future facilities at the high-energy frontier; this will increase the lead-times and require a global approach involving not only the European particle physics laboratories but also those of other regions, an approach already timidly started with LEP but then adopted for the LHC.

Although the infrastructure of CERN is one of its well-known assets, the most important one is the carefully selected young staff who have been hired in recent years. It is extremely important that they are not only taught in academic lectures like this one, but that they work with and are encouraged by experienced colleagues, as I was myself when I came to CERN as a young Fellow and the late Mervyn Hine, though then Director of Planning, took a personal interest in my work, for which to this day I am still extremely grateful.

Acknowledgements

It is a pleasure to thank G. Fernqvist, W. Herr, R. Hohbach, K. Schindl, and C. J. Zilverschoon for discussions and advice. H. Koziol and B. de Raad carefully read the manuscript and made numerous suggestions, for which I am very grateful.

References
[1] A. Hermann, L. Belloni, J. Krige, U. Mersits and D. Pestre, History of CERN, Vol. I: Launching the European Organization for Nuclear Research (North-Holland, Amsterdam, 1987).
[2] A. Hermann, J. Krige, U. Mersits, D. Pestre and L. Weiss, History of CERN, Vol. II: Building and Running the Laboratory 1954–1965 (North-Holland, Amsterdam, 1990).
[3] CERN Annual Reports 1955–2003, http://library.cern.ch/cern_publications/annual_report.html
[4] J. Krige (Ed.), History of CERN, Vol. III (Elsevier, Amsterdam, 1996).
[5] N.C. Christofilos, unpublished manuscript (1950); E. Courant, M.S. Livingston and H. Snyder, Phys. Rev. 88 (1952) 1190.
[6] D.J. Simon, Proc. 5th EPAC, Sitges, Eds. S. Myers et al. (IOP, Bristol, 1996), p. 295.
[7] K. Johnsen, Report CERN 84-13 (1984).
[8] J. Borer et al., Proc. 9th Int. Conf. on High-Energy Accelerators, Stanford (AEC, Washington, D.C., 1974), p. 53.
[9] P. Bramham et al., Nucl. Instrum. Methods 125 (1975) 201.
[10] G. Brianti, Eur. Phys. J. C 34 (2004) 15.
[11] O. Gröbner, Vacuum 43 (1992) 27.


[12] R.W. Assmann, Proc. LHC Workshop, Chamonix, 2001, Eds. J. Poole et al., Report CERN-SL-2001-003 DI (2001), p. 323.
[13] O.S. Bruning et al. (Eds.), Report CERN-2004-003 (2004), Vol. I.
[14] O.S. Bruning et al., CERN LHC Project Report 626 (2002).
[15] International Linear Collider Technical Review Committee, Second Report, Report SLAC-R-606 (2003).
[16] J. Andruszkow et al. (R. Brinkmann et al., Eds.), Report DESY 2001-011 (2001).
[17] The CLIC Study Team (G. Guignard, Ed.), Report CERN 2000-008 (2000).
[18] M. Aleksa et al. (P. Gruber, Ed.), Report CERN/PS/202-080 (PP).
[19] B. Autin et al., J. Phys. G: Nucl. Part. Phys. 29 (2003) 1785.


Fifty years of research at CERN, from past to future: Theory

G. Veneziano
CERN, Geneva, Switzerland, and Collège de France, 75005 Paris, France

Abstract

A personal account is given of how work at CERN-TH has evolved during the past 50 years, together with some personal thoughts on how it may develop in the near future. No serious attempt is made towards achieving completeness, objectivity, or historical precision. For a more systematic — and somewhat complementary — study of CERN-TH (at age 40), the account by J. Iliopoulos is highly recommended.

1 The past

The past 50 years were ones of remarkable achievements in our understanding of nature at its deepest level and, in particular, a truly revolutionary epoch in theoretical physics. It was, in many respects, a two-fold revolution:

– in our understanding of the physical world, and
– in our research tools and work style.

I shall discuss these two aspects in turn.

1.1 Revolutions in our understanding

In terms of novelty, this half-century was second only to its predecessor, the one that shook the world of physics at the beginning of the last century through two profound conceptual revolutions: Quantum Mechanics and Special Relativity.

These revolutions came, almost simultaneously (as exemplified by two of Einstein's famous annus mirabilis papers), as two conceptually unrelated developments. For the high-energy theorist at CERN — and elsewhere — much of what has happened since had to do with finding a consistent framework where these two new paradigms could happily coexist . . . and that can possibly explain the data. A combination of ingenious experiments (see Daniel Treille's talk), technical developments (accelerators, computing; see the talks given by Kurt Hubner and David Williams), and theoretically sound ideas made all this an incredibly successful story.

I shall try to retrace the developments of our understanding of elementary particles and fundamental forces as seen through the eyes of a theorist at CERN1. The world of high-energy physics, and particularly of theory, has no geographical or political barriers. This makes it impossible to isolate CERN-TH activities from what happened world-wide. For this reason I shall occasionally mention developments and episodes that took place elsewhere.

I was lucky enough to be active — and at the beginning of my career — by the mid-1960s, the start of a 'golden decade' during which our theoretical ideas truly underwent a 'phase transition'.

I shall insert this crucial decade inside a broader picture and also mention other remarkable theoretical (if not yet experimental) revolutions that took place in the more recent past. I shall spend comparatively little time on that recent history, since on the one hand it is better known to most of you and,

1 Actually, I joined CERN only in 1976. But, even during the previous decade, I visited TH at regular intervals in or around the summer (basically every year from 1967 to 1974). This allowed me to take the 'temperature' of CERN-TH at regular intervals and to feel its evolution from year to year.


on the other, it has not survived (yet!) the test of time. I shall divide this part of my account into four periods:

– 1900–1954: B.C. (meaning 'Before CERN');
– 1954–1974: Genesis of the Standard Model;
– 1974–1984: Consolidation of the Standard Model and first steps beyond it;
– 1984–2004: Leaps forward.

I shall then come to other aspects of the way CERN-TH has evolved (tools, style) and conclude with some thoughts — and worries — about where we may be heading.

In Fig. 1 I have tried to represent how our understanding of the four known basic interactions (electromagnetic, weak, strong, gravitational) has evolved with time. A characteristic feature of this sketch is that the horizontal dividing lines have been progressively removed. As early as the 19th century, Maxwell erased the separation between electricity and magnetism. In spite of appearances, electric and magnetic phenomena are different manifestations of one and the same physical principle as seen in different reference frames. This 'theory merging' process continued in the 20th century, partly because of the theorists' quest for simplicity, but even more so under the push of experimental discoveries. Forces whose macroscopic manifestations are completely different (strong and weak, for instance) turned out to be described, at the microscopic level, by very similar theoretical concepts, those at the basis of gauge field theories.

At the beginning of the 20th century only the two long-range forces, electromagnetism and gravity, were known. Later on, the weak and strong forces, both characterized by their short range, were discovered. In spite of this range-wise distinction, theoretical progress proceeded independently, for many decades, for gravity and for the other three forces. For this reason, I shall start my journey by 'getting rid' of the bottom part of Fig. 1, only to come back to it later.

1.1.1 Gravitation and cosmology (1915–1974)

Ten years after introducing special relativity, Einstein, starting from the Galilean universality of free fall, proposed his theory of General Relativity, in which gravity is seen as a manifestation of space–time curvature. At the beginning his proposal caused much scepticism, but just a few years later some striking tests (deflection of light, precession of Mercury's perihelion) convinced the physics community.

Not much later, the first cosmological applications of General Relativity were made by Alexandre Friedmann (1922); these were confirmed with the discovery, in 1929, of the red shift and Hubble's law, and with their interpretation in terms of an expanding Universe. This eventually led (1935) to the standard (Friedmann–Lemaître–Robertson–Walker) hot big bang model, itself confirmed by the discovery of the cosmic microwave background and applied successfully to the production of the light elements (big bang nucleosynthesis).

During all those years, in spite of its mathematical beauty and experimental successes, General Relativity attracted very little interest, if any, inside the high-energy physics community and at CERN-TH: after all, everybody knew that gravity is irrelevant for elementary-particle physics! Furthermore, not much effort was made to go beyond the classical theory: for the macroscopic world it was aiming at, classical General Relativity looked entirely sufficient and satisfactory.

1.1.2 B.C.: Before CERN! (1900–1954)

These were very important years for the advancement of theoretical and experimental physics. On the one hand, quantum mechanics was formulated in precise mathematical terms by Heisenberg, Schroedinger, Dirac and others (even if its interpretation remained puzzling2) and, on the other, the weak and strong

2 The most significant breakthrough in this area was the work done by John Bell at CERN (1964–1966), the celebrated Bell inequalities, which, after the experiments were carried out, would probably have given CERN-TH its first Nobel prize.


interactions were discovered and their theory started to be developed.

It is interesting that the theories of the latter two interactions, in spite of their vastly different phenomenology, followed parallel histories from the beginning until they found their final place within the Standard Model. Indeed, Fermi's theory of the weak interactions dates back to 1934, while Yukawa's model of the strong interactions, with his postulate of the existence of a meson, later called the pion, came just one year later.

Right after World War II (Shelter Island, June 1947), Quantum Field Theory (QFT) had its first realization with QED, the first example of a quantum-relativistic theory with precise predictions. Its striking successes in explaining the Lamb shift and (g − 2)e swiftly followed and consecrated QED as the quantum theory of electromagnetism.

1.1.3 1954–1974: the golden decades

These were the years during which the Standard Model of particle physics was conceived and developed. But, before theorists gave their full confidence to QFT once again, they had a moment of crisis, starting within QED itself.

Two problems indeed tempered the enthusiasm generated by QED's smashing successes:

– The apparent divergence of perturbation theory. But, even more,
– the renormalization-group evolution of the fine-structure constant and the associated so-called zero-charge problem. L. Landau has been quoted as saying, in 1955: “The Lagrangian is dead. It should be buried, with all due honours of course.”
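The zero-charge problem that so troubled Landau can be stated in one formula. At one loop (a standard result, added here for illustration), the effective QED coupling grows with the momentum scale as

```latex
\alpha_{\mathrm{eff}}(Q^2) \;=\;
  \frac{\alpha(m_e^2)}{\,1 - \dfrac{\alpha(m_e^2)}{3\pi}\,\ln\dfrac{Q^2}{m_e^2}\,}\,,
```

so the denominator vanishes at a finite, if astronomically large, scale Q² = m_e² exp(3π/α) (the 'Landau pole'); insisting that the theory make sense at arbitrarily high scales drives the renormalized charge to zero, hence 'zero charge'.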

These problems, together with those encountered in providing a full quantum version of either the Fermi or the Yukawa theory, caused a certain uneasiness with QFT within the HEP community: S-matrix theory gained in popularity, particularly for the strong interactions under the flag of G. Chew's bootstrap program. Within that S-matrix climate, new ideas on how to handle the strong interaction sprang up.

However, before discussing those, let us blow up the central part of Fig. 1 (as shown in Fig. 2) in order to summarize the highly non-trivial 'road map' that, through many false starts and partially successful attempts, eventually led to the Standard Model as we know it today.

Again, developments in the weak and strong interactions went hand in hand. In the weak interactions the discoveries of strange particles (1953–1955) and of parity non-conservation (1956) paved the way to the first improvement of Fermi's theory, based on the current–current (JJ) interaction. The two currents were soon identified (1957) as being just a precise combination of vector and axial-vector currents, leading to maximal parity violation (only one helicity carrying the weak interaction). The vector and axial-vector currents were understood to be conserved (CVC, 1958) and partially conserved (PCAC, 1961), respectively, and to be 'rotated', in what we now call flavour space, by the Cabibbo angle (a CERN-TH paper, 1963).
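In modern (admittedly anachronistic) notation, the Cabibbo 'rotation' means that the hadronic charged current couples not to the down quark itself but to the combination

```latex
d' \;=\; \cos\theta_C\, d \;+\; \sin\theta_C\, s\,, \qquad \theta_C \simeq 13^\circ\,,
```

so that strangeness-changing transitions are suppressed by a factor sin θ_C relative to strangeness-conserving ones, as the data on strange-particle decays required.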

Interestingly, the hadronic pieces of these same currents were also playing an important role in the strong interactions. They were at the basis of the isospin and (flavour) SU(3) symmetries that successfully described the hadronic spectra (see Gell-Mann and Ne'eman's 'eightfold way', 1961). The hypothesis of spontaneous breaking of chiral symmetry led, through PCAC, to a successful theory of the masses and interactions of the pions (and of the other light pseudoscalars) as pseudo-Nambu–Goldstone bosons. These hadronic currents satisfied the so-called current algebra, and theorists got busy trying to 'saturate' current-algebra relations through families of hadronic states (see the program of S. Fubini and collaborators). The growing number of (almost daily!) discovered resonances got neatly classified through T. Regge's theory (1959), which, thanks to the work of G. Chew and S. Mandelstam, became a theory of high-energy scattering. The description of multiparticle production at high energy through the multiperipheral model (Amati–Fubini–Stanghellini, early 1960s) was another important achievement in that same direction.


In the next decade (1964–1974) both areas blossomed. In the weak interactions we may recall the discovery of CP violation (1964) and the seminal papers of Glashow, Salam and Weinberg, which did not receive a lot of attention at the time. The important Bell–Jackiw anomaly paper came out of CERN-TH in 1969, while the GIM mechanism (1970) was successfully confirmed by the J/Ψ discovery in 1974. Meanwhile, in 1971, the renormalizability of the GSW theory had been proved by Gerard 't Hooft, and this quickly led to the establishment of the present electroweak theory.

Let me briefly recall how I lived that crucial moment, as a young post-doc at MIT, through a visit of 't Hooft and a lunch seminar by Steven Weinberg. Weinberg had just joined the MIT faculty in the new Center for Theoretical Physics led by V. Weisskopf. Before 't Hooft's paper on the renormalizability of spontaneously broken gauge theories, Weinberg would rarely talk about his 1967 model. He was working mainly on chiral Lagrangians and even got interested in the dual-resonance model (see a paper written with M. Ademollo and myself). When 't Hooft's article appeared, Steve immediately grasped its importance, invited Gerard to MIT for a seminar (and tough private discussions), and then he himself gave one of the traditional lunch (journal club) seminars. After filling the blackboard with the electroweak Lagrangian, he added something like: “This is not a particularly pretty Lagrangian, neither is it clear whether it has anything to do with the real world, but it seems to define a consistent quantum field theory of the electroweak interactions from which definite predictions can be drawn”. It marked the beginning of a new era in particle physics, that of electroweak unification.

Parallel striking developments were going on in the strong-interaction domain. Current algebra and flavour symmetries were nicely described, mathematically, in terms of quark fields. Quarks could also explain a puzzle with Fermi statistics, at the price of attributing to them, besides the usual 'flavour' label, one called 'colour'. Gradually, these mathematical entities acquired respectability as truly important degrees of freedom in hadronic physics, and colour was transformed from a book-keeping device into a dynamical concept, the source of a field that is responsible for the unusual behaviour of quarks. Indeed, processes like e+ e− → hadrons (at Cambridge's electron accelerator), deep-inelastic lepton–hadron scattering (at SLAC), and hard hadron–hadron scattering (at CERN's ISR) were all showing that quarks behaved as almost free particles, and yet nobody, in spite of many searches, had ever seen a free quark!

Before I go further into that story, let me mention that important developments also occurred in the S-matrix approach to strong interactions, such as those that gave, out of the current-algebra programme and Regge theory, superconvergence (1966), finite-energy sum rules, duality (1967), and, eventually, dual-resonance models and string theory.

I can add a couple of personal stories on that (strong-interaction) side. In the summer of 1967, I was attending the Erice summer school: there were plenty of interesting talks, but two sentences by M. Gell-Mann particularly stuck in my mind:

– “Nature only reads books in free field theory”;
– Thanks to the newly discovered Dolen–Horn–Schmid duality, the possibility of a 'cheap bootstrap' program (as opposed to Chew's expensive one!) had emerged.

From the autumn of 1967 to the summer of 1968, a Harvard (M. Ademollo)–Weizmann Institute (H. Rubinstein, M. Virasoro and myself) collaboration applied the 'cheap bootstrap' idea to the reaction ππ → πω, with very encouraging results. In the summer of 1968, while at CERN on the way to my first post-doc at MIT, I completed a paper I had begun just before leaving Israel. After discussing with several people at CERN-TH3, I submitted it to Nuovo Cimento. By the time of the Vienna Conference (August–September) it had already received much attention and was soon named the dual-resonance model: no one knew it at the time, but string theory was born!

3 I wish to stress here how important CERN-TH was, and still is, as a reference point, particularly for European theorists: a place where they can find experts in all branches of high-energy theory, check their ideas, learn new things, and get inspiration. This, of course, besides its role in supporting and guiding CERN's experimental programme.


However, this development had a hard time being accepted by the establishment: it was neither quantum field theory nor a pure S-matrix bootstrap à la Chew. A sentence by Fubini (around 1970) expressed well the feelings of the day: “A piece of 21st century physics that fell accidentally in this century”. A couple of years later, the string reinterpretation of the dual-resonance model did not make its magic properties less mysterious, but at least it made the whole game more 'respectable'. Even then, the community's attitude towards it was very polarized: love it or hate it!

A third, crucial episode came for me at a seminar at CERN-TH by Sid Coleman in 1973. I remember its title very well:

The price of asymptotic freedom

and its abstract:

Non-Abelian gauge theories.

One finally did have an answer to Gell-Mann's puzzle: Nature is reading books in asymptotically free quantum field theory! At the same time, non-Abelian gauge theories could explain the absence of free quarks by the highly non-trivial behaviour of the strong force at large distances, and could solve Landau's zero-charge problem. Another pillar had been added to our understanding of elementary particles and their non-gravitational interactions, and the Standard Model was now complete!
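The 'price' in Coleman's title can be written down explicitly. For an SU(N_c) gauge theory with n_f quark flavours, the one-loop beta function reads (a standard result, quoted here for illustration):

```latex
\beta(g) \;=\; -\,\frac{g^3}{16\pi^2}\left(\frac{11}{3}\,N_c - \frac{2}{3}\,n_f\right)\,,
```

which is negative for n_f < 11N_c/2: the coupling shrinks at short distances (asymptotic freedom, explaining the quasi-free quarks seen in hard scattering) and grows at long distances (no free quarks), with the opposite sign to the QED running that produced Landau's zero-charge problem.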

What about string theory? Well, it had to concede defeat. Yet, for many of us, it was difficult to give up such an elegant construction. I still felt that there was something deep and unique in the way string theory was reorganizing the diagrams of perturbation theory, something very unlike what any normal quantum field theory does. I kept working on a 'planar bootstrap' and a topological expansion. Soon after, however, QCD naturally provided its own way of achieving all that. This came from 't Hooft's 1974 CERN paper, where he showed that, in a suitable large-N limit, QCD diagrams get reorganized precisely so as to mimic a string theory! That was enough for me (and, I guess, for most string theorists of the time). Those interested in strong interactions (sadly?) turned a page and switched to working on QCD. In just a few years, quantum field theory had made a spectacular comeback, and it was the turn of S-matrix theory to fall into disgrace.

Meanwhile, in a 1974 paper that went practically unnoticed, J. Scherk and J. Schwarz proposed reinterpreting string theory as a theory of quantum gravity . . .

1.1.4 1974–1984: Standard Model confirmations and attempts to go beyond

The following decade can be characterized as a period of consolidation and experimental verification of the theoretical ideas that had just blossomed.

On the experimental side (see Daniel Treille's talk for details) let me recall, besides the already mentioned J/Ψ discovery, two of CERN's most glorious achievements: the discovery of neutral currents and that of the W and Z bosons.

On the theoretical side, Cabibbo's theory was extended to the CKM matrix, naturally incorporating a CP-violating parameter in the Standard Model. In 1974 Ken Wilson proposed lattice gauge theory, and not much later the first numerical proofs of confinement in QCD came about. Perturbative QCD was greatly developed, justifying a QCD-corrected version of Feynman's parton model; quark and gluon jets, as well as their shapes, were predicted and found. Instantons were discovered, offering an elegant solution to the U(1) problem, while resurrecting that of strong CP violation (for which a solution within the Standard Model is still lacking). Various expansions of the 1/N kind were proposed, which bridged different approaches to hard and soft strong-interaction physics.

This decade also witnessed the first bold attempts to go beyond the Standard Model, with the first Grand Unified Theories (GUTs), the prediction of a long but finite proton lifetime, and the unsuccessful


search for its decay in non-accelerator experiments. Supersymmetry (SUSY) was invented and developed4. It was then generalized into a new theory of gravity, called supergravity, with the hope, soon dashed, that some suitable version of it (e.g., D = 11 supergravity) could be consistently quantized. Theory was quickly moving to energy scales (the GUT scale, gravity's Planck scale) never considered before to be of relevance to particle physics.

1.1.5 1984–2004: the leap forward

Another personal story. In the autumn of 1984, Tom Banks arrived from the US for a short visit to CERN. As soon as we met he asked: “Have you heard about SO(32)?” I probably answered: “What do you mean?” A few days later he gave a talk about a recent paper by M. Green and J. Schwarz which provided the first example of an anomaly-free superstring theory with chiral fermions, i.e., of a theory that could in principle contain both the Standard Model of particle physics and quantum gravity. It marked the beginning of an exciting decade in theoretical physics that we may characterize as 'The (S)-matrix reloaded'.

The paper by Green and Schwarz became known as the first string revolution; string theory came back into the spotlight as nothing less than a 'Theory of Everything' (TOE). Some beautiful early mathematical developments led to over-optimistic claims that not only would one soon arrive at an experimentally viable TOE, but also that the choice would be uniquely determined by theoretical consistency. As time went by, those hopes started to fade away but, in any case, the conceptual barrier between gravitational and gauge interactions was gone for ever.

The attitude towards QFT also changed. In the sixties and seventies QFT had been regarded as a complete framework. In the eighties theorists started to view it as an effective low-energy theory that cannot be extrapolated above a certain energy scale without an 'ultraviolet completion', string theory being just one example of such a completion. In this light, Landau's zero-charge problem disappears, provided the ultraviolet cut-off is kept finite. I can recall at this point a brief encounter I had at CERN with A. Sakharov, in or around 1987. He had become very interested in string theory (see his Memoirs) and came to ask me about several issues that interested (or bothered?) him. He had a particularly burning question: is the induced-gravity idea (Sakharov, 1968) borne out in string theory? I gave him a straight answer: no! In string theory, gravity is already present at tree level; loops just renormalize Newton's constant by a finite amount (in Sakharov's induced gravity, instead, the Newton constant is entirely determined by radiative corrections). Years later, after his death, I came back to the issue and concluded that the induced-gravity idea is also possible in string theory and could actually be a very interesting option . . .

I shall go quickly to the last decade (1994–2004), during which the so-called second string revolution took place. It brought many new theoretical tools and ideas through the concept of (mem)branes and of new dualities interconnecting different string theories in various regimes. The number of consistent string theories went down to a single one, called M-theory but, unfortunately, the number of solutions went up by many orders of magnitude, making the dream of uniqueness move further and further away.

These developments, however, reconciled much of the theory community — until then sharply divided into enthusiasts and violent critics — with string theory. The new 'stringy' techniques offered powerful tools for studying gauge theories in various regimes, as wonderfully exemplified by the Seiberg–Witten 1994 solution of N = 2 super-Yang–Mills theory. Not only that: these developments paved the way to the possibility that the extra dimensions of space envisaged by string theory may be 'large', or that our own Universe is just a membrane immersed in this higher-dimensional space (brane-world scenarios). In both cases, new physical phenomena may be expected to show up at accessible energies (such as those to be reached at the Large Hadron Collider) and/or at short distances (e.g., in submillimetric deviations

4 First discovered (in the West) in dual-resonance models, it became a bona fide quantum field theory after the work of J. Wess and B. Zumino (CERN, 1974).


from Newton’s law).

There have also been quite spectacular achievements in explaining, through string theory, the microscopic origin of black-hole entropy and in establishing a holographic bridge between gravity and gauge theories. Hope is growing that, one day, we shall finally 'close the circle' by understanding what kind of strings describes the long-distance behaviour of QCD, and how theorists in the sixties and seventies were led to string theory by following an amazing 'red herring'.

Finally, this decade brought the fields of particle physics and of astro-cosmo-particle physics closer than ever before. Fruitful interactions between particle theory and cosmology had actually started much earlier, in particular with theories of baryogenesis based on realizing, within the Standard Model or its extensions, Sakharov's famous three criteria, as well as with the theory of inflation, developed by particle theorists in the eighties. At the same time, experiments in astrophysics and cosmology made a quantum jump in precision and led to striking discoveries, some of which were totally unexpected (dark matter, dark energy, ultra-high-energy cosmic rays).

Non-accelerator experiments gave another striking piece of news: neutrino masses and oscillations, discovered in both solar- and atmospheric-neutrino experiments. Although a slight extension of the Standard Model can accommodate these phenomena, and theorists had largely anticipated this possibility, they were caught by surprise by the way the neutrino-mixing pattern appears to be realized. Thus, astrophysical and cosmological data appear to offer the much-sought-after crises that theorists need to make progress (quoting S. Weinberg's "physics thrives in crises").

1.2 Revolutions in our work's tools and style
The tools theorists use have greatly evolved during the last 50 years. In particular:

– Mathematical tools
The level of mathematical sophistication needed in order to take part in (or just to follow) some theoretical developments has reached unprecedented levels. This is particularly true of string theory, an area of research that not only exploits the most advanced mathematics but has also contributed to its development (see E. Witten's Fields Medal). This is reflected in the mathematical background of the average young theorist today (no comment on his/her average physics background!).

– Computing tools
I still remember the computing facilities I had at my disposal when I did my Master's thesis in Florence in 1965: punching cards to compute a one-dimensional integral on a big IBM machine took a substantial chunk of my time. Nowadays I can do that integral in a fraction of a second on my laptop! Lattice calculations, for instance, could not have been conceived of at that time. They still rely today on the exponential growth of computing power.
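The contrast can be made concrete. A one-dimensional integral of the kind mentioned above takes microseconds today; here is a minimal sketch using a plain trapezoidal rule (the integrand and step count are, of course, just an illustration, not the original computation):

```python
import time
from math import sin, pi

def trapezoid(f, a, b, n=100_000):
    """Trapezoidal rule for a one-dimensional integral —
    the kind of job that once required a deck of punched cards."""
    h = (b - a) / n
    return h * (0.5 * f(a) + 0.5 * f(b) + sum(f(a + i * h) for i in range(1, n)))

t0 = time.perf_counter()
result = trapezoid(sin, 0.0, pi)   # exact value is 2
elapsed = time.perf_counter() - t0
print(f"integral = {result:.6f}, computed in {elapsed:.4f} s")
```

On any modern laptop the elapsed time printed is indeed a small fraction of a second.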

As with our tools, our style of research has changed in many respects:

– Specialization
In the early days of my career I could basically follow all that was happening in high-energy theory: there were fewer areas, they were closer to each other, and the number of new papers per month was manageable. Today, every field is very specialized, and trying to follow what is going on (or even looking at what is new on the archives) requires a big, time-consuming effort. Language barriers across different areas of research are often another obstacle, and people do not seem to make the effort to be understood beyond their inner circle. Perhaps as a result, I find discussions at TH seminars less frequent and interesting than they used to be 30 years ago.

– Communication, search tools
Again a huge revolution has taken place. I remember when, during the Weizmann–Harvard collaboration, we were sending long letters full of calculations over the Atlantic to get an answer, at best, a couple of weeks later (very exceptionally, we would dare to make a short phone call). Since then, we have learned to take advantage of:

– fax,
– e-mail,
– electronic archives,
– the Web, and its search engines.

Unfortunately all these goodies come with some side-effects:

– Mathematics

Warning to theorists: "Excessive use of mathematics can be dangerous for your health", i.e., there can be a danger of becoming so addicted to mathematics that one forgets about the original physics goals.

– Computers
There can be a tendency to lean too much on numerical computations, without trying each time to understand what exactly one has put in and what exactly the output means. There is an all-too-widespread attitude of using the computer as an all-encompassing black box.

– Electronic archives
These are very useful indeed. However, they appear to encourage quick writing and even quicker reading of papers. The flood of papers is such that most theorists use an 'on-line trigger' (like our experimental friends) to select what to read (or at least to glance at).

2 The future
2.1 New theoretical ideas
The past 30 years have not been stingy in terms of new theoretical ideas. Quite the contrary: theorists have proved their great imagination again and again. The problem, if any, was elsewhere: the lack of experimental checks on whether any of those ideas had much to do with real life! I would say that, in general, the relative status of Theory and Experiment determines to a large extent the way we make progress in physics. We can indeed consider, roughly, three cases:

– Experiment (much) ahead of Theory
Examples are the whole of particle physics of the 1950s and 1960s, with the exception of QED; more recently, the nature and origin of the CKM matrix, and astro- and cosmo-particle physics.
I would characterize this situation as being tough (for theorists), but healthy and challenging.

– Experiment and Theory at a similar level
Examples are QED, HEP in the 1970s and 1980s, and the Standard Model facing LEP, HERA, and Tevatron data.
I would call this situation a perfect fit, although it may become too perfect, and a bit boring, when experiments keep finding no disagreement with theory. Finally:

– Theory much ahead of Experiment
Also here there is no lack of examples: the beginning of General Relativity, and much of high-energy theory in the last two decades, with supersymmetry, superstrings, and the like.
I would characterize this case as being easy but dangerous, always for the theorists, of course.

In other words: very precise experimental data mean little without a theory matching that precision, while theoretical research, without the guide of experiments, tends to make random walks. Possibly, while waiting for great news from the LHC, theory should appeal more to astro- and cosmo-particle data, where the situation looks much like that of particle physics in the 1950s and 1960s: lots of good data and very little understanding (see dark matter, dark energy, cosmic microwave background, ultra-high-energy cosmic rays, gamma-ray bursts . . . ).


2.2 New tools, new styles
Computing power is going to grow, allowing us to tackle problems we have not been able to deal with in the past (e.g., in lattice gauge theories and numerical relativity). Means of communication will also improve, offering easier and easier access to information.

But we have to watch the side-effects . . . particularly on the youngest generation!

What could be at stake?

– Free choice of research
The job market constrains more and more the choices young people have on the kind of research they want to carry out.

– Embarking on risky projects
The system penalizes projects that go out of the mainstream. Even if one achieves some nice results there, chances are that the community will realize it too late . . .

– Enlarging and deepening one's knowledge
The pressure to produce is so high that little time is left for broadening one's culture beyond a very limited range of topics.

– Interactions between different areas of theory and with experiments
As people concentrate more and more on a narrow area, interactions between the various groups of theorists, and with experimentalists, become harder and harder to maintain.

Dangers exist but, if we are aware of them, they can be overcome. Another golden era, I am pretty sure, is just around the corner . . .

Bibliography
J. Iliopoulos, Physics in the CERN TH Division, report in the CERN History Series, CHS-39 (December 1993).


Fig. 1: [Timeline diagram, 1900 to '04 ('CERN B.C.' to today): the evolution of the four interactions — EM (Maxwell), Weak (Fermi, 1934), Strong (Yukawa, 1935), Gravity (Newton; Einstein, 1916) — through Special Relativity and Quantum Mechanics, Shelter Island ('47) and QED ('49); then ('54–'74) SU(2), SU(3), Current Algebra, PCAC, SSB, V–A, CVC, the S-matrix, duality and Strings on one road, and QFT, Colour, QCD and asymptotic freedom, SU(2)L×U(1), GWS, GIM, SU(3)C on the other; and onwards ('84) to GUTs?, SUSY?, SUGRA?, SUPERSTRINGS?, (g−2), triviality, Cosmology, Strings ('04)]

Fig. 2: [Diagram, 1954–1974: the two roads of strong-interaction theory — S-Matrix (sum rules, nuclear democracy, superconvergence, FESR, duality, bootstrap, DRM, String Theory) and QFT (naïve quark model, Colour, quarks are for real, AF & Conf., QCD, Lattice ('74), 1/N ('74)) — alongside SU(2)I → SU(4), SU(3)F → SU(6), Current Algebra, global SSB, PS-mesons as QNGBs ('61), Regge ('59), MP ('62); and the weak-interaction milestones: Strangeness ('53–'55), P ('56), JJ int., V–A ('57), CVC ('58), PCAC ('61), Cabibbo ('63), CP ('64), GSW ('67), ABJ ('69), GIM ('70) → J/ψ ('74), local SSB, renormalizability of GSW ('71)]


Fifty years of research at CERN, from past to future: Experiment

D. Treille
CERN, Geneva, Switzerland

Abstract
An overview of fifty years of physics at CERN is given.

1 Introduction
I was asked to give a personal, selective, and informal review of half a century of physics at CERN. Actually I shall try to be as impersonal and objective as I can, and to cover as many of the important topics as possible. Nevertheless, there is no way to do justice to every actor, and I apologize in advance for omissions and for the unavoidable selectivity in my choice of topics. I shall be informal: by this I mean that I shall evoke the shadows as well as the lights.

In the preparation of this talk, I borrowed a lot from several colleagues, in particular the authors of the 25-year and 50-year Physics Reports [1, 2], and from various historical accounts already published [3].

My own vision of CERN was that of an outsider before 1968, and of an insider afterwards. I started my career at LAL Orsay, working at an electron linac on the nucleon form factors, which had been discovered in 1956 at Stanford (Nobel Prize awarded to R. Hofstadter in 1961), and on pion photoproduction, heavy-lepton searches, etc. Orsay exploited the first e+e– ring, AdA, which came from Frascati. Then the ACO e+e– ring allowed a detailed study of the vector mesons ρ, ω and φ. At the time ACO had to run flat out to reach the φ. A further ring of higher energy, COPPELIA, was considered, but the project did not go through, and the DCI ring was built instead. By that time, a group from LAL, led by P. Lehmann, decided to start an experimental programme at CERN and I joined it.

From these years of apprenticeship, I had learned that electron scattering and e+e– physics could be of high cleanliness and quality, that important discoveries may be made elsewhere than in your own laboratory, and that an e+e– ring never has enough centre-of-mass energy.

It was also clear that a cultural gap seemed to exist between the electron and hadron communities. I am pleased to see that this seems to be over. The usual statement that a hadron collider is a discovery machine and an electron collider a measurement machine is a loose one, not supported by history. Electron–proton or e+e– collisions have revealed the structure of the proton, deep inelastic scattering (DIS), the J/Ψ, the τ, charm, jets, and the gluon. It has now been realized that an e+e– collider is a threshold machine: if the available energy √s is below the interesting one, you miss the possibility of a direct observation. The series of 'just missed' is long: the ADONE trauma (J/Ψ, τ), DORIS (Υ), TRISTAN (Z0), maybe LEP200 (h0). Obtaining relevant indirect information is not guaranteed either. For instance, SUSY at LEP: in spite of all the accurate electroweak measurements performed, we still do not know whether SUSY, even light SUSY, is a reality or not. By contrast, a hadron machine is not a threshold one but offers broadband beams of partons with tails at high energy, as Fig. 1 illustrates. One can, within some limits, trade energy against luminosity. A hadron collider offers a high potential for discovery, provided it is equipped with the right detectors. An e+e– machine is ideal for accurate measurements and for the detection of difficult final states, for example invisible decays.


Fig. 1: The scenery of hadronic and leptonic collisions (courtesy U. Amaldi)

2 Synoptic view of fifty years of particle physics
Table 1 below presents in chronological order some of the most important steps and innovations in four different sectors: theoretical physics, experimental physics, machines, and detectors. The 'ideal' physicist is the one who, at a given time, thinks in the proper terms (for instance, after 1964 in terms of quarks and after 1973 in terms of the Standard Model (SM)), and goes to the most promising machine to perform experiments with the most efficient detectors.

A comment: it was always a great pleasure and help for me to be able to visualize the physical event: scanning of bubble chamber pictures, the Ω optical pictures, the UA1 Megatek, the DELPHI or ALEPH event displays. Will future trends preserve that possibility?

Let me add some personal remarks:

In CERN scientific life, there were several very intense and euphoric periods: the era of bubble chamber physics, culminating in the discovery of Neutral Currents, the harvest of results from the SPS, the W and Z discovery, LEP1 and LEP200, etc. No doubt the LHC will be another one.

But we also went through some periods which, although they may have been intellectually stimulating, were quite depressing as seen from CERN: for instance, during the ‘1974 Revolution’ which followed the discovery of the J/Ψ in the United States, one had the feeling that ‘life was elsewhere’. Personally I also consider LEP200 as an unfinished symphony, as I shall illustrate later.

For a long time, CERN had a diversified physics programme. Most physicists had several 'irons in the fire': analysing one experiment, running another, preparing a third. Currently, because of much longer time scales and in spite of the immensity of the task, there is a risk of monoculture and boredom. Could one conceive that young physicists share (unequally) their activity between the main line and a 'small' stimulating one?


Table 1: Some of the most important steps and innovations in fifty years of particle physics

1945–49
– Theory: 46 Gamow: Big Bang; 48–49 Feynman, Schwinger, etc.: QED
– Physics: 47 π; 47 Lamb shift; 47–53 strange particles
– Machines: 45 synchrotron (McMillan)
– Detectors: 47–50 scintillation counters

1950–54
– Theory: 53 Gell-Mann (V particles); 54 CPT theorem
– Physics: 51 Λ0, K−, Ξ; 53 νe (Reines)
– Detectors: 53 BC (Glaser)

1955–59
– Theory: 58 Prentki–d'Espagnat; 59 Regge
– Physics: 55 antiproton; 56 P violation
– Machines: 55 GeV electrons at Stanford; 57 SC; 58 first g − 2; 59 PS
– Detectors: 59 30 cm BC

1960–64
– Theory: 61 Goldstone, Salam, Glashow; Gell-Mann (8-fold way); 63 Cabibbo theory; 64 quarks: Gell-Mann, Zweig; 64 Higgs mechanism; 64 Bell inequalities
– Physics: 63 Σ–Λ parity; 64 Ω−; 64 CP violation
– Machines: 61 AGS; 63 first muon ring; 64 ISOLDE
– Detectors: 61 81 cm BC; 64 2 m BC

1965–69
– Theory: 66 Greenberg: colour; 67 Sakharov conditions; 67 Weinberg: ew Lagrangian; 68 Salam: ew Lagrangian; 68 Veneziano's model; 69 Bjorken scaling, partons
– Physics: 69 DIS
– Machines: 67 Budker e-cooling; 69 second muon ring
– Detectors: 68 MWPC

1970–74
– Theory: 70 GIM mechanism; 71–72 't Hooft renormalization; 73 asymptotic freedom; 73 Kobayashi–Maskawa; 73 QCD + SM; 73 quark counting; 73–74 Wess–Zumino
– Physics: 72 ISR: hard collisions; 73 Neutral Currents; 73 ISR: σpp increases; 74 J/Ψ
– Machines: 71 ISR; 74 stochastic cooling proven in ISR
– Detectors: 70 Gargamelle; 72 OMEGA; 73 Split Field Magnet; 73 BEBC

1975–79
– Theory: 77 Altarelli–Parisi; 77 Peccei–Quinn; 79 LEP Summer Study
– Physics: 75–77 τ; 75 jets in e+e–; 76 charm; 77 Υ; 78 PV at SLAC; BEBC, ..: scaling violation; 79 gluon at PETRA
– Machines: 76 start of SPS; 76 p̄p idea; 77 ICE
– Detectors: 78 † Gargamelle; 78 ν beam-dump

1980–84
– Theory: phenomenology of SUSY; supergravity
– Physics: 81 beauty at CLEO; 82 jets in hadronic collisions; 83 W, Z discovery; 83 EMC effect; 84 search for ν oscillations
– Machines: 81 p̄p collider; 82 LHC feasibility study; 83 LEAR
– Detectors: 81 Si microstrips; RCBC

1985–89
– Theory: 87 La Thuile workshop
– Physics: 86–87 Bs mixing, UA1, Argus; 88 proton spin crisis; 89 LEP: 3 ν
– Machines: 86 AC; 89 LEP
– Detectors: 85 end of BC; 89 LEP detectors

1990
– 90 large ED; LHC Aachen Workshop; 90 low-β; UA2


3 The mutations of the detector methods
One should remember that the ancestors of modern detectors are not so far away. While the Wilson chamber was born in 1911 and the Geiger–Müller counter in 1928, one had to wait until the years 1947–50 to see the first scintillators; the Bubble Chamber (BC) was invented in 1953, the MultiWire Proportional Chamber (MWPC) in 1968.

The first two decades of CERN life were the reign of bubble chambers. While the first decade was partly apprenticeship, the second saw the successes of the large liquid-hydrogen (LH) and heavy-liquid (HL) chambers. By the end of the life of this technique, rapid-cycling BCs were successfully used for charm physics. The methods of analysis of their pictures were evolving in parallel and would deserve a paper by themselves [4].

Triggered detectors then entered the game. From 1949 to 1959 spark chambers (SC) were developed as tracking devices. The first massive use of SCs was at BNL in 1962, when evidence for the νμ was found. The various readout methods changed from film to filmless ones: acoustic, vidicon, wires. The OMEGA optical spectrometer was operated in 1972. The first online computers appeared in 1964.

1968 saw the 'revolution' of G. Charpak, inventor of the MultiWire Proportional Chamber (MWPC), which then led to the drift chamber and the multistep chambers, a series culminating in the time projection chamber. The first large-scale uses of MWPCs occurred in 1971: the CERN–Heidelberg experiment on CP violation, by J. Steinberger et al. at the PS, and the Split Field Magnet, suggested by J. Steinberger and used by G. Charpak, A. Minten, P.G. Innocenti et al. (at a cost of 23 MSF; unfortunately the detector was blind around 90º).

Cherenkov detectors, from threshold devices and focusing ones (DISC) to the Ring Imaging technique (RICH), have played a key role in particle identification. The RICH of T. Ypsilantis and J. Seguinot, after being used during the LEP era (DELPHI, SLD, heavy ions), seems to have a promising future ahead, an example being LHCb.

Concerning calorimetry, important breakthroughs were the understanding and exploitation of the compensation mechanism and the move towards pointing geometries, from the 'Spaghetti' to the ATLAS 'Accordion'. Liquid argon and scintillating crystals are still quite fashionable.

Hybrid systems, from BEBC to the European Hybrid Spectrometer, have played an important role, in particular for charm physics.

Currently the ongoing revolution is the rise of silicon detectors. While the goal was flavour tagging in the LEP era, one is now planning to use them as the main tracker. This represents a big step, from the square metre typical of a LEP microvertex detector to hundreds of square metres, as for the CMS tracker at the LHC. Clearly this progress was made possible by various breakthroughs in microelectronics, in particular the advent of submicron processes, intrinsically radiation-hard, a very welcome feature for LHC experiments.

In all these mutations the crucial role of detector R&D (in particular the strong DRD programme undertaken to learn how to exploit the full luminosity of the LHC) is obvious. It is clear as well that, if we want to progress further, some R&D has to continue. Whatever the physics goal may be (SuperLHC, ILC, CLIC, future fixed targets), some of the requirements are similar. For optimal tracking one needs real 3D information obtained from pixels, micro or macro, rather than from stereo microstrips; maybe vectorial information (position and direction); and thinner, faster, still harder and cheaper detectors. As for calorimetry, one is still far from a completely satisfactory design regarding granularity and the integration of the electromagnetic and hadronic parts.


Figures 2 and 3 below give an idea of the evolution with time of the various classes of detectors. What is plotted is simply the number of entries in the SPIRES search engine whose title refers to a given technique. The fall of the curves does not imply that the technique is no longer used, but means that it no longer deserves a publication by itself.
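The counting behind such curves is simple to reproduce. A minimal sketch in Python, assuming a hypothetical list of (year, title) records standing in for the real SPIRES database (the titles and the helper `entries_per_year` are invented for illustration):

```python
from collections import Counter

# Hypothetical stand-in for SPIRES records: (year, paper title) pairs.
records = [
    (1965, "A large heavy-liquid bubble chamber"),
    (1972, "Operation of multiwire proportional chambers"),
    (1975, "Drift chamber readout for spectrometers"),
    (1990, "Radiation hardness of silicon microstrip detectors"),
    (1992, "A silicon vertex detector for LEP"),
]

def entries_per_year(records, keyword):
    """Count, per year, the titles mentioning a given technique."""
    return Counter(year for year, title in records
                   if keyword.lower() in title.lower())

print(entries_per_year(records, "bubble chamber"))  # a single 1965 entry here
print(entries_per_year(records, "silicon"))         # entries in 1990 and 1992
```

Plotting these per-year counts against time, one technique per curve, gives figures of the kind shown below.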

Fig. 2: Evolution with time of the various classes of detectors

Fig. 3: More details on silicon detectors

4 The bubble chamber era
After the Second World War, the situation in the US and in Europe was quite different, politically and technically. Europe started with a period of 'every man for himself'. The various countries and CERN had to learn the techniques required and how to collaborate.

In 1953 the Bubble Chamber (BC) was invented by Glaser. L. Alvarez, in Berkeley, soon started to play a leading role in its development. CERN’s interest in this technique started in 1955.

In 1956 many realizations and projects already existed around the world and the LHBC (liquid hydrogen) programme was adopted at CERN.

In 1955–56 J. Steinberger was developing propane chambers at Columbia University. In 1958 the 10 cm HBC came into use at the synchrocyclotron, and lasted until 1960.

1959 saw the commissioning of the 72 inch chamber in Berkeley, recognized as the “most courageous of scientific decisions”.

From 1957 to 1960 CERN built the 30 cm LHBC, which was then in activity at the PS from 1960 to 1962. In 1960 the BP3 HLBC (heavy liquid) from Ecole Polytechnique in France was operated at CERN.

After 1960, having reached its maturity, the European bubble chamber community was able to react rapidly to new challenges and to take the right decisions which led to its celebrated successes. In 1961 the Track Chamber Division was created under C. Peyrou, its mission being to run the 30 cm BC, to build the 2 m one, and to get hydrogen liquefiers.

In 1961, the 81 cm LHBC from CEA started operating at CERN. In 1961 the 1 m HLBC from Ecole Polytechnique in Paris came to CERN. The magnetic horn, invented by S. van der Meer and completed in 1962, led to a very welcome increase in neutrino fluxes.

By the end of 1963, 2 m BCs were being operated in Berkeley and Brookhaven. In January 1964 R.P. Shutt's proposal appeared, arguing that times were ripe to build very large chambers. The reaction in Europe was fast. In February–April 1964 Gargamelle was defined (A. Lagarrigue). By mid-1964 the 152 cm British National HBC was put into service at CERN. In December 1964 the realization of a large LHBC was recommended and the Tripartite Technical Group was created. 1965 saw the start of operation of the 2 m HBC (cost: 3 MSF; operated until 1977; 40 million pictures taken). At the end of 1965 came the decision to build Gargamelle, commissioned in December 1970. The construction of BEBC lasted from 1967 to 1972; in 1975 it was fully operational.

Given the initial situation described above, it is not surprising that the first important discoveries of the BC era were made elsewhere than at CERN.

Let us quote:

– At the LBL cyclotron in 1950: π0 → 2γ (Panofsky, Steinberger)

– At other machines:

1953: BNL Cosmotron, 3.3 GeV proton synchrotron, exploited until 1966
– 1953: V0 events; 1956: KL; 1957: Σ0; 1961: ρ

1955: LBL Bevatron, 6 GeV protons
– 1955: discovery of the antiproton; 1958: anti-Λ; 1959: Ξ0; 1960: anti-Σ0; 1961: K*(892), Λ(1400), ω, η; 1964: η′; 1968: Nobel Prize awarded to L. Alvarez

July 1960: AGS, 33 GeV
– 1964: discovery of Ω−, discovery of CP violation

Nevertheless one must recognize many important contributions to hadron spectroscopy from CERN in the first years. For instance, in 1962, using K− at rest, the 81 cm HBC allowed the measurement of the relative Σ–Λ parity, found to be > 0, and the observation of the anti-Ξ (Fig. 4).

Later came the great successes of the big chambers, crowned by the discovery of Neutral Currents. Finally, an important role in charm physics was played by the last generation of small rapid-cycling chambers. Moreover, bubble chamber activities have gradually built the strong and diverse CERN technical expertise. It is from them that Europe and CERN found the way to collaborate.

Fig. 4: A Σ–Λ event (left); the first Ξ–anti-Ξ event (right)


Fig. 5: C. Peyrou and BEBC

4.1 Early non-BC physics at the Synchrocyclotron [5]

Considering the π→eν channel: in 1955 J. Steinberger at Nevis Lab had set an upper limit on its branching ratio Re (< 0.6 × 10–4) and this started to be puzzling. However, in 1958, G. Fidecaro et al. got evidence for its existence and found Re = (1.22 ± 0.30) × 10–4. This was in agreement with the expected e–μ universality of the axial coupling and became a cornerstone of our understanding of V–A interactions.
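The expected value can be recalled from helicity suppression in the V–A theory: the ratio of the two leptonic decay widths depends only on the lepton and pion masses (shown here at tree level, as a reminder):

```latex
R_e = \frac{\Gamma(\pi\to e\nu)}{\Gamma(\pi\to\mu\nu)}
    = \frac{m_e^2}{m_\mu^2}
      \left(\frac{m_\pi^2 - m_e^2}{m_\pi^2 - m_\mu^2}\right)^{2}
    \approx 1.28\times 10^{-4},
```

in agreement, within errors, with the Fidecaro et al. measurement (radiative corrections lower the prediction slightly).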

For the process μ→e γ, it was the reverse. Some evidence of its existence had been reported. However, in 1962, Conversi, di Lella, Rubbia et al. showed that the process was forbidden, both for real and virtual γ. The idea of lepton-flavour conservation was introduced.

Let us quote also the first observation of the pion β-decay π+ →π0e+ν.

Experiments on the g – 2 of the muon started at the SC in 1958. One should also mention the determination of the helicity of the muon from π → μ ν at the PS.

4.2 Neutral Currents (NC)

I borrowed much from several excellent chronicles about Neutral Currents [6–8].

From the 1.2 m HL chamber there was an upper limit on the ratio of elastic NC to Charged Current (CC) events (< 0.03), published in 1964. This limit was simply a mistake, uncovered by M. Paty in 1965.

After the Siena 1963 conference, A. Lagarrigue, A. Rousset, and P. Musset worked out a proposal for a neutrino detector aiming at increasing the event rate by an order of magnitude. L. Leprince-Ringuet called it Gargamelle. The team was composed of physicists from X-Orsay and of former members of the NPA 1 m BC. It included seven European laboratories and one guest laboratory.

In 1971–72 a viable theory of weak interactions existed and the question was clearly posed: are there NCs, yes or no? The competitors in the field were Gargamelle and the HPWF counter experiment at NAL.

Even in November 1968, at a Gargamelle meeting in Milan, the words 'Neutral Currents' were not pronounced. This problem had low priority, and Deep Inelastic Scattering (DIS) was more exciting then.


But in 1971 everything was ready. A careful classification of event types was devised. NC events looked like neutron star events without a muon. The goal was to separate neutrino-induced from neutron-induced stars.

In December 1972 one electron event was found at Aachen. The search for hadronic NC candidates went on. Candidates were observed, with a flat spatial distribution, and caused much excitement. However, counter arguments were put forward and one realized that the only way to conclude was to prove that the number of neutron-induced events was small. Monte Carlo simulation then showed its strength: a neutron background program with no free parameter was used, complemented by other approaches. It allowed the conclusion that the NC sample was not dominated by neutron stars (J. Fry and D. Haidt, CERN Report 75-01). At the Bonn 1973 conference, C.N. Yang announced the discovery of Neutral Currents.

However, by that time the HPWF signal had disappeared. This cast dismay and distrust over the whole matter, and one more year of work was needed to conclude. Finally, at the London 1974 conference, evidence for NC was confirmed by Gargamelle and HPWF.

This major discovery established the similarity in structure of the weak and electromagnetic interactions. The term 'electroweak' came into being. This was the experimental beginning of the SM. The W mass was then predicted to be around 70 GeV.
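The number quoted follows from the tree-level relation of the electroweak theory, with the value of the mixing angle favoured at the time (shown here as a reminder):

```latex
M_W = \left(\frac{\pi\alpha}{\sqrt{2}\,G_F}\right)^{1/2}\frac{1}{\sin\theta_W}
    \simeq \frac{37.3\ \mathrm{GeV}}{\sin\theta_W}
    \approx 70\ \mathrm{GeV}
    \quad\text{for } \sin^2\theta_W \approx 0.3 .
```

With the modern value of the mixing angle, and including radiative corrections, the same relation gives the measured mass of about 80 GeV.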

Fig. 6: André Lagarrigue and visitors
Fig. 7: André Lagarrigue

Fig. 8: Gargamelle
Fig. 9: A neutral-current event on an electron


Fig. 10: S. van der Meer
Fig. 11: A. Salam with P. Musset (right)

Fig. 12: A. Salam receiving the Nobel Prize

4.3 Other important results from the bubble chambers

Besides Neutral Currents, the large bubble chambers brought other most interesting results.

By comparing the structure functions of the nucleon derived from neutrino and muon scattering, Gargamelle proved that quarks do indeed have fractional electric charge (Fig. 13).

Gargamelle demonstrated that the neutrino–nucleon cross-section rises linearly with energy (Fig. 14).


Finally, by combining their results, BEBC and Gargamelle got the first evidence for scaling violation (Fig. 15), namely the proof that the nucleon structure functions actually evolve with the resolution power of the observation.

An amusing fact is illustrated by Fig. 16. At that time, the value found for the Weinberg angle was in agreement with non-supersymmetric Grand Unification. This predicted a rather low GU scale and a proton lifetime of 10^30 years or so. This prompted programmes aimed at detecting proton decay, like Kamiokande. They did not find proton decay, because the value of the Weinberg angle was actually incorrect. But they later discovered the non-zero neutrino mass via ν oscillations!
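The order of magnitude of that lifetime can be recalled from the standard dimensional estimate for gauge-boson-mediated proton decay (the numbers are indicative only):

```latex
\tau_p \;\sim\; \frac{M_X^4}{\alpha_{GU}^2\, m_p^5}
\;\sim\; 10^{28\text{--}31}\ \text{years}
\quad\text{for } M_X \sim 10^{14\text{--}15}\ \mathrm{GeV},
```

so a low unification scale M_X directly implies a proton lifetime within reach of kiloton-scale detectors.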

Fig. 13: The charges of quarks are fractional
Fig. 14: The rise of the neutrino cross-section

Fig. 15: First evidence for scaling violation


Fig. 16: Ad augusta per angusta: the status of the mixing angle in 1981

5 The ISR
The Intersecting Storage Rings (ISR) was the first hadron collider ever built. Following M. Jacob we distinguish:

– “A brilliant start”: 1971–74

An important early discovery at the ISR was the rising total proton–proton cross-section in 1971. The measurements led us to conclude that, while the proton size increases with energy, its profile stays the same.

Another major finding was the discovery of the large pT phenomena at the ISR in 1973 (Fig. 17): it demonstrated that partons were behaving as pointlike objects for the strong interaction as well as for the electromagnetic one.

The study of Diffractive Dissociation told how energy hadronizes, independently of the nature and energy of the colliding particles.

– “A somewhat difficult period”: 1975–77

The reason, obviously, is that several important discoveries (J/ψ, charm, beauty) within reach of the ISR were actually made in the US (see the synoptic table), in part because the ISR detectors were not adequate.

– “A very active and interesting programme”: 1978–83

Among other topics one can mention the studies of lepton-pair production: J/ψ, Υ, dilepton masses up to 20 GeV, properties of the Drell–Yan process. However, charm and beauty semileptonic decays were ignored in these data analyses. Concerning the production of heavy-quark bound states, like pp → χc, the results obtained at that time are still competitive.

The first clear mass peaks of charmed hadron production in hadronic interactions were observed in 1979, following several earlier indications from the measurements of single electron and lepton pairs. However, the charm cross-section determination turned out to be troublesome. The production of a beauty baryon at the ISR was announced but was for a long time a controversial matter.

FIFTY YEARS OF RESEARCH AT CERN, FROM PAST TO FUTURE: EXPERIMENT


The production of direct photons was studied, from their discovery in 1979 to the end of the ISR (Fig. 18). These measurements played an important role in testing QCD. Gluon Compton scattering was found to be dominant, the sub-leading process being quark–antiquark annihilation. This fact allowed the determination of the gluon distribution inside the proton. The study of the two-photon final states was also performed. A comparison of the yields with antiproton–proton collisions would have been interesting, but unfortunately the luminosity was insufficient.

The search for jets in hadronic collisions started with a rather confusing period, both in fixed-target experiments and at the ISR. By 1982 the AFS and R110 experiments at the ISR had an acceptance appropriate for seeing jets. But the results of the proton–antiproton collider came at about the same time, and these ‘stole the limelight’, although different x-regions were probed in the two sets of measurements.

Whatever the area of the physics programme, the machine worked magnificently: the ISR were a most efficient R&D laboratory for accelerator physics. In particular, they allowed the testing of the idea of stochastic cooling, a key to later successes.

By the end of the ISR in 1983, M. Jacob, quoting Mark Antony in Shakespeare’s Julius Caesar, said: “I come to bury Caesar, not to praise him”, while V. Weisskopf argued that it does not matter where discoveries are made.

Nevertheless, the morale at CERN was low. In one of the pictures, V. Weisskopf seems to exhort us to turn to higher authorities. That is what we did in a quite oecumenical way.

Fig. 17: The high-pT signal at the ISR Fig. 18: Direct photons and the dominant diagram


Fig. 19: L. Lederman discovering beauty Fig. 20: An exhortation of V. Weisskopf

Fig. 21: Visit of the Pope Fig. 22: Visit of the Dalai Lama

6 The proton–antiproton collider

Actually the real cure came from the success of the proton–antiproton collider.

A look at the synoptic tables shows how fast the decision-making and realization of this programme were. The collider was a great machine, owing largely to the use of stochastic cooling. It fed two outstanding and quite innovative detectors. For the first time, concepts like the hermeticity of a detector, the redundancy of the procedures, etc., were fully taken into account. The figures below show the UA1 detector and a sketch of its powerful tracker.

As we all know, “la physique était au rendez-vous”. Besides the most celebrated discovery of the W and Z bosons, close to the predicted masses, the collider brought a clear picture of hard hadronic scattering [9].


Fig. 23: The UA1 detector Fig. 24: A sketch of the UA1 tracker

Fig. 25: First Z event in UA1 Fig. 26: Lego plots of a W and Z event


Fig. 27: From the discovery to the final UA2 results

Fig. 28: Evolution of the Z mass, from its prediction to LEP final results

Let me tell you more about hard hadronic scattering.

In 1981 the fixed-target NA5 experiment, with full azimuthal calorimetric coverage, did not see any jet structure. The situation in hadronic scattering was thus very different from that in e+e− scattering, in which a clear two-jet structure had been observed since 1975. Even after the first run of the collider the results were not decisive. However, in 1982, first UA2, then UA1 obtained clear, uncontroversial evidence for jets. UA2, for instance, had full azimuthal coverage and a polar-angle coverage from 40° to 140°.


Many results then followed, among them:

– the determination of the angular distribution of parton–parton scattering,

– the determination of the proton structure functions, showing the role of gluons at small x,

– the production of direct photons, found to agree with NLO QCD,

– the measurement of the total pt of two-jet systems, agreeing also with QCD,

– the studies of multijet final states, of the W transverse momentum, of heavy flavour production (beauty), all behaving as expected.

The collider thus proved the physical reality of partons inside the proton and opened the door to quantitative tests of QCD.

Much progress has been obtained since then, in particular by the Tevatron proton–antiproton and HERA electron–proton colliders. What remains to be done, concerning QCD, in order to guarantee a proper search programme at the LHC, is impressive as well.

7 The SPS Programme

The programme was carefully prepared in 1970–74 by a prospective activity on the SPS beam lines and on its physics programme, in particular at the ECFA Tirrenia Meetings.

7.1 Neutrinos

This programme has been one of the flagships of CERN activities. From its start in 1976, the SPS narrowband beam, up to 200 GeV, and the wideband beam, 10² times more intense but peaked at low energy, fed, in addition to the bubble chambers BEBC and Gargamelle (which ended in 1978), the giant spectrometers CDHS and CHARM, and later CHORUS and NOMAD.

These experiments performed more and more refined measurements of scaling violation, of αS, and of the nucleon’s gluon content. In conjunction with electron and muon DIS, they confirmed, as we saw, the fractional electric charges of the u and d quarks (F2eN = (5/18) F2νN). Since parity violation allows neutrinos to distinguish quarks from antiquarks, their scattering provided the structure functions of the ‘sea’ quarks. Opposite-sign dimuons were exploited to get the structure function of the sea of strange quarks in the nucleon.
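The 5/18 factor quoted above follows from the squared quark charges: charged-lepton DIS weights each flavour by its squared charge, while neutrino DIS weights all flavours equally. For an isoscalar target, keeping only u and d (a textbook simplification, neglecting the strange sea),

\[
\frac{F_2^{eN}}{F_2^{\nu N}} \simeq \frac{e_u^2+e_d^2}{2}
=\frac{1}{2}\left(\frac{4}{9}+\frac{1}{9}\right)=\frac{5}{18}.
\]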

Later the neutrino programme included a beam-dump experiment, and finally moved to the search for oscillations, unfortunately in a domain of parameters in which, as we know now, they could not manifest themselves.

7.2 Muons

This programme has been active from 1978 to now. The most efficient and clean SPS muon beam (R. Clifft, N. Doble) first fed the NA2 (EMC) and NA4 (BCDMS) experiments, which took data until 1985. Afterwards, EMC became NMC (the New, or Nuclear, Muon Collaboration) and then SMC (the Spin Muon Collaboration). The programme is currently continuing with COMPASS.

Scaling violations, first seen in bubble-chamber neutrino data, were studied in great detail using the Altarelli–Parisi formalism (1977). Various tests of perturbative QCD and a precise measurement of the strong coupling constant were performed.

In 1983 the EMC effect (Fig. 29) came as a surprise: the structure functions of the nucleon depend on the nucleus in which it is immersed. In 1988 another surprise was the proton spin crisis (Fig. 30), deduced from the study of spin-dependent structure functions. The total quark spin, contrary to expectations, is only a small fraction of the proton spin. These two effects are far from being explained, and other programmes, COMPASS, polarized HERA, HERMES, polarized RHIC p–p, will continue to shed light on them.

Fig. 29: The EMC effect

Fig. 30: The spin crisis

7.3 Hadron and photon beams

This very rich programme includes about 70 experiments extending over 25 years. The main themes of physics can be classified as follows:

– Hard scattering

– Heavy flavours

– Hadron spectroscopy

– Soft processes.

Hadron spectroscopy covers in particular the spectroscopy of light quarks, glueball searches, baryonium searches. Soft processes include studies of particle production, elastic scattering, low-mass μ+μ− pairs, soft photons, etc.

Forced to be selective, I have chosen to illustrate the first two items.


7.4 Hard scattering

The main activity was the study of high-mass lepton pairs, i.e., the Drell–Yan mechanism. A highlight was the discovery of the K factor in NA3 (Fig. 31) and WA11 in 1982, namely direct evidence for higher-order QCD corrections. Being essentially a multiplicative factor of the cross-section, constant in Bjorken x, this K factor did not preclude the study of structure functions, in particular those of unstable hadrons like pions and kaons.

Fig. 31: Evidence for the K factor from NA3

Among other experiments, one can quote the Omega Beam Dump in 1978, measuring J/ψ and Drell–Yan dimuon production by a variety of projectiles, and which provided the very first results obtained from the SPS (Fig. 32).


Fig. 32: The Omega beam dump, set-up (a) and results (b)


Another active area was the study of prompt single photons and di-photons produced in hadronic reactions, bringing some first-hand information on the hard processes.

Photoproduction, besides charm studies (see later), provided the first high-energy measurements of the Compton effects on quarks in 1985–86: the QED Compton effect, i.e., elastic scattering of the photon on a quark (Fig. 33), and the QCD Compton effect, a process in which the incident photon is changed into a gluon.

Fig. 33: Evidence for QED Compton from NA14

The comparison of experimental data with the existing theoretical QCD estimates, computed at the relevant order of perturbation, contributed strongly to validating the theory.

7.5 Heavy flavours

I shall dwell a bit more on heavy-flavour physics, a very active sector as well (about 20 CERN experiments), not only because of its impact on QCD testing, but also because it was a most innovative sector in terms of high-spatial-resolution silicon detectors, like microstrips, CCDs, and active targets. It also used ‘old’ techniques, like emulsions coupled with large spectrometers, and rapid-cycling bubble chambers. All types of beams were exploited: hadrons, photons (which offer a charm fraction ten times higher than hadrons do), and hyperons. It contributed mostly to charm physics, but also provided some results on beauty, obviously a more difficult physics topic requiring higher energies.

Figure 34 illustrates a few remarkable results: the associated production of charm in emulsions (WA58) and of beauty in the microstrips of WA92, the very pure signal of Λc → pKπ from NA32, thanks to its CCD detectors, and the signal of the Ω0c from WA89, the shortest lifetime ever measured that way (5 × 10−14 s).

Figure 35 recalls two beautiful experiments performed at CERN.


Fig. 34: A few results from heavy-flavour physics (see text)

Fig. 35: Two remarkable experiments: the π0 lifetime (CERN SPS, 1985) and the pixel set-up of NA57

8 Accurate tests of gauge theories

8.1 g-ology

The g – 2 programmes have extended over 40 years. As John Adams once said: “g – 2 is not an experiment: it is a way of life”. Indeed some of the initiators are still currently involved. From my point of view it represents one of the most beautiful achievements in particle physics, experimentally and theoretically.

The g factor is the constant that determines how much magnetic moment μ results from a given amount of charge, mass, and spin s: μ = g (e/2m) s. For the electron or muon the Dirac equation gives g = 2. Actually a slight difference arises because of radiative phenomena (Fig. 36), and the quantity g − 2 turns out to be an amazing testing ground of theory, QED and electroweak, since all interactions contribute to it. The present measurements of g are

– electron: 2.0023193043718 ± 0.0000000000075, i.e., a few parts per trillion;

– muon: 2.0023318416 ± 0.0000000012, ‘only’ a few parts per billion, but with the potentially interesting effects boosted by the muon-to-electron mass-squared factor (~40 000).


The confrontation between experiment and theory has been a “tennis game with well-matched players on either side of the net”: a 40-year-long experimental programme matching a 40-year-long programme of more and more refined calculations. At first order, a = α/2π ≈ 0.00116, giving g = 2.00232. There are 7 two-loop diagrams; 72 three-loop diagrams, computed in 1995 by T. Kinoshita at Cornell (to better than one part per million); 891 four-loop diagrams, for which numerical computations have been under way since the early 1980s; and 12672 five-loop ones. The role of hadronic light-by-light scattering is important and actually leads to the present residual ambiguity in the interpretation.
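The first-order value quoted above is easy to reproduce; a minimal sketch (the constants are standard low-energy values, not from the lecture):

```python
import math

# Leading-order QED contribution to the lepton anomaly a = (g - 2)/2:
# Schwinger's result a = alpha/(2*pi), the same for e and mu.
alpha = 1 / 137.035999          # fine-structure constant (low-energy value)
a1 = alpha / (2 * math.pi)      # first-order anomaly
g1 = 2 * (1 + a1)               # corresponding g factor

# Sensitivity of a_mu to heavy virtual states is enhanced relative to
# a_e by the mass-squared ratio, the "40 000" factor quoted in the text.
boost = (105.6584 / 0.5110) ** 2

print(f"a(1 loop) = {a1:.5f}")      # ~0.00116
print(f"g(1 loop) = {g1:.5f}")      # ~2.00232
print(f"(m_mu/m_e)^2 ~ {boost:.0f}")
```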

Fig. 36: (a) The one-loop contribution to g – 2; (b) a key higher-order contribution

The value of g − 2 is obtained by measuring the beat between the rotation of muons around a ring and the rotation of their spin. The movement of the spin is described by the Bargmann–Michel–Telegdi equation.

In 1958 a first experiment was done at the SC. The actors are shown below.

In 1963 came the idea of a new experiment at the PS with 1.2 GeV muons involving E. Picasso, S. van der Meer, F. Farley, F. Krienen et al. The signal was observed for ~130 microseconds. A disagreement of 1.7 σ between experiment and theory led to a correction of the latter.

A better experiment started in 1969, using the so-called magic γ factor at 3.1 GeV, the energy at which the electric field does not affect the g − 2 precession, so that one can use electric quadrupoles together with a uniform magnetic field. The signal was observed for 534 microseconds, and the agreement with theory was excellent.
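The ‘magic’ energy follows from the standard anomalous-precession formula: the electric-field term is proportional to a − 1/(γ² − 1) and cancels at γ = √(1 + 1/a). A minimal sketch, assuming today’s approximate value of the muon anomaly:

```python
import math

# Magic gamma: the E-field contribution to the anomalous precession
# omega_a is proportional to (a - 1/(gamma^2 - 1)); it vanishes when
# gamma = sqrt(1 + 1/a).
a_mu = 0.00116592      # muon anomaly (g - 2)/2, approximate
m_mu = 0.1056584       # muon mass in GeV

gamma_magic = math.sqrt(1 + 1 / a_mu)
e_magic = gamma_magic * m_mu       # total muon energy in GeV

print(f"magic gamma  = {gamma_magic:.2f}")   # ~29.3
print(f"magic energy = {e_magic:.2f} GeV")   # ~3.1 GeV, as in the text
```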

In his tribute to E. Picasso, F. Farley [10] underlined the excellent atmosphere in the group: “no wars, no clashes”.

Figures 37 to 43 illustrate various aspects of the g – 2 programme.

Fig. 37: The evolution of the uncertainty on aμ (in 10⁻⁹) in the CERN experiments


Fig. 38: The signal in the Brookhaven experiment


Fig. 39: The present status of the confrontation between experiment and theory

Fig. 40: The team of the first g – 2 experiment Fig. 41: The first ring


Fig. 42: The second ring Fig. 43: The Brookhaven ring

9 LEP and ADLO

Fig. 44: The LEP scenery

ADLO is the name of the four experiments, ALEPH, DELPHI, L3 and OPAL, which combined their results.

LEP is a well-celebrated success story. The machine, planned very accurately 15 years in advance, worked magnificently, surpassing all expectations (see Table 2). The superconducting radiofrequency cavities, thanks to careful conditioning and balancing work, reached an accelerating field substantially higher than expected. The vacuum, thanks to the large-scale use of getter pumping, was outstanding, and this contributed to very favourable experimental conditions.

LEP was exploited by four high-quality multipurpose detectors. The programme led to clean and subtle physics results: in particular, it demonstrated the validity of the SM at the loop level. Through their effects as virtual objects, it gave results on particles too heavy to be ‘really’ produced, the top quark and the Higgs boson. It showed that the coupling ‘constants’ of the three forces, electromagnetic, weak, and strong, are actually running and converge exactly at a very high scale (~10¹⁶ GeV) in the supersymmetric version of the SM. We briefly quote below the main LEP messages.

We recall that, besides the main programme, two additional variants of LEP were studied. Even if they were not realized, these studies had interesting consequences. The first was an attempt to exploit the longitudinal polarization of e± at LEP. It turned out to be complicated and, since it would have delayed the LEP energy upgrade, it was not implemented. It led to the invention of a clever scheme, the Blondel scheme, which may play an important role in the future at a Z factory, and to the exploitation of transverse polarization, the key to the precise measurement of the Z mass. The second variant aimed at a much increased luminosity through multibunch schemes (the Pretzel scheme with equidistant bunches, later replaced by bunch trains). Actually up to 16 bunches were used, and LEP can be considered as a first approximation of a Z factory and therefore of a heavy-flavour factory.

Table 2: LEP performance: foreseen and achieved

                                      Foreseen (55/95 GeV)   Achieved (46/98 GeV)
Current per bunch                     0.75 mA                1.00 mA
Total current                         6 mA                   8.4 mA / 6.2 mA
Beam–beam vertical parameter          0.03                   0.045 / 0.083
Ratio of emittances                   4.0%                   0.4%
Maximal luminosity (10³⁰ cm⁻² s⁻¹)    16 / 27                34 / 100
β*x                                   1.75 m                 1.25 m
β*y                                   7 cm                   4 cm

Fig. 45: The RF voltage around LEP Fig. 46: The vacuum chamber and its getter pumping

In all respects LEP experiments did better, sometimes much better, than expected as shown by Table 3 below.

Among the instrumental breakthroughs which contributed to these successes, let us quote:

– the use of high-performance luminometers and the most accurate theoretical knowledge of the relevant Bhabha cross-section (Figs. 50, 51), a key ingredient of neutrino counting (Fig. 52);

– the exploitation for beauty tagging of high-quality silicon microvertex detectors, allowing excellent purity for a still reasonable efficiency (Fig. 53).


Fig. 47: The LEP inauguration Fig. 48: The ALEPH experiment

Fig. 49: Some important actors of LEP: H. Schopper and E. Picasso (left); A. Hofmann and S. Myers with M. Bourquin, Rector of the University of Geneva and President of CERN Council (June 2001) (centre and right)

Table 3: Expected and final accuracies of various measurements at LEP

                   Expected                        Final
MZ (MeV)           ±10–15 (stat), ±17 (syst)       ±2.1
ΓZ (MeV)           ±50                             ±2.3
Normalization      3%                              < 1‰
MW (MeV)           50–60 (stat), > 100 (syst)      ADLO: 42
AFB(μμ)                                            Twice better
τ polarization                                     2.5 times better
Rb                                                 3 to 6 times better
AFB(b)                                             3 times better


Table 4: The absolute normalization at LEP (%)

        ALEPH   DELPHI   L3      OPAL
1992    0.15    0.38     0.50    0.41
1995    0.080   0.09     0.068   0.034

Fig. 50: Evolution of the luminosity theoretical error at LEP1

Fig. 51: The ALEPH luminometer Fig. 52: Final result on neutrino counting

Fig. 53: Some aspects of LEP microvertex detectors, a comparison with SLD CCD and performances

D. TREILLE

62

Page 71: Cern History

Figures 54 to 56 illustrate well-known results from LEP. Figure 57 shows that, within the strict SM frame, the Higgs boson is felt to be light, say < 250 GeV.

An important but trivial quantum effect is the running of the fine structure constant from low energy to the Z mass, Δα. In addition, do we feel other effects due in particular to the top and Higgs? The answer is a clear yes [11]. The effects are summarized in the Δρ and Δr factors, whose numerical values give indirect information on these particles, as shown below.

– From the Z branching ratio into beauty, Rb: the predicted value, neglecting radiative corrections, is Rb0 = 0.2183; the measured one is Rbexp = 0.21644 ± 0.00065. From the 2.8σ discrepancy one derives Mt = 155 ± 20 GeV.

– From the Z leptonic width: one finds gAl = −0.50123 ± 0.00026, 4.7σ away from the canonical value of −1/2. One deduces Δρ = 0.005 ± 0.001.

– From the W mass and its relation with Gμ: one finds Δr = 0.033 ± 0.002, or ΔrW = Δr − Δα = −0.026 ± 0.002, 13σ away from 0!
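For reference, the relation between the W mass and Gμ alluded to above can be written, in a common convention (a textbook formula, not taken from the lecture),

\[
M_W^2\left(1-\frac{M_W^2}{M_Z^2}\right)=\frac{\pi\alpha}{\sqrt{2}\,G_\mu}\;\frac{1}{1-\Delta r}\,,
\]

so a non-zero Δr extracted from the measured masses and Gμ directly signals radiative corrections beyond the running of α.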

Fig. 54: The Z lineshape and the progress brought by LEP for Nν, MZ and sin2θW


Fig. 55: The W mass measurement, direct and indirect

Fig. 56: The top mass measurements, indirect and direct

Fig. 57: The indications on the Higgs mass, valid in the frame of the Standard Model


Figure 58 shows the running of the coupling constants of the three forces, due again to the role, as virtual objects, of the particles existing in the model considered. As I have said already, it leads to a perfect convergence at about 10¹⁶ GeV in the supersymmetric version of the SM with light superpartners.
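The one-loop running behind Fig. 58 can be sketched in a few lines; the inputs at MZ and the one-loop coefficients below are standard textbook values (hypercharge in GUT normalization), used here as an illustration rather than the actual fit shown in the figure:

```python
import math

# One-loop running of the three inverse couplings:
#   alpha_i^{-1}(Q) = alpha_i^{-1}(MZ) - b_i/(2*pi) * ln(Q/MZ)
MZ = 91.19                         # GeV
ALPHA_INV_MZ = [59.0, 29.6, 8.45]  # U(1)_Y (GUT norm.), SU(2)_L, SU(3)_c
B_SM = [41 / 10, -19 / 6, -7]      # Standard Model coefficients
B_MSSM = [33 / 5, 1, -3]           # MSSM, superpartners near MZ

def alpha_inv(Q, b):
    """Inverse couplings at scale Q (GeV) for beta coefficients b."""
    t = math.log(Q / MZ) / (2 * math.pi)
    return [a - bi * t for a, bi in zip(ALPHA_INV_MZ, b)]

Q_GUT = 2e16  # GeV, roughly where the MSSM couplings meet
mssm = alpha_inv(Q_GUT, B_MSSM)
sm = alpha_inv(Q_GUT, B_SM)
print("MSSM:", [round(v, 1) for v in mssm])  # all three ~24
print("SM:  ", [round(v, 1) for v in sm])    # no common crossing
```

With these inputs the three MSSM lines indeed meet near 2 × 10¹⁶ GeV, while the SM set misses a common crossing point.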

Fig. 58: Evidence for the convergence of couplings, exact in the SUSY frame

Figures 59 to 62 illustrate a less bright topic: the search for the SM-like Higgs boson at LEP200. A lower limit was set at 115 GeV. The future, at the LHC or the Tevatron, will tell whether a weak signal at that mass was real or not. Eighty more superconducting RF cavities at LEP would have allowed us to answer, yes or no, the question of the existence of the light boson (< 130 GeV) predicted by minimal SUSY.

Fig. 59: Higgs search at LEP200: diagram, cross-sections, the final mass spectrum of ADLO


Fig. 60: The potential impact that a modest energy rise of LEP200 with 80 more superconducting RF cavities would have had

Fig. 61: The potential of the Tevatron concerning the Higgs search, left: the foreseen integrated luminosity; right: Higgs sensitivity study

Fig. 62: Peter Higgs: ‘l’art d’être grand-père’


10 CERN’s contribution to the validation of QCD

Let us recall the main steps that led to the establishment of the validity of QCD.

On the experimental side the important dates (all three having led to Nobel prizes) are

– 1956: the discovery of proton structure at Stanford

– 1969: the first evidence of high-energy deep-inelastic e–p scattering (DIS) at Stanford

– 1974: the ‘November revolution’, namely the discovery of the J/ψ at SLAC and Brookhaven.

On the theoretical side let us quote:

– 1966: Greenberg invents the colour quantum number

– 1969: interpretation of DIS: Bjorken scaling, Feynman’s parton model, Bjorken proposes to study deep inelastic Compton scattering

– 1972: Gribov–Lipatov predict scaling violation

– 1973: discovery of asymptotic freedom (Fig. 63 shows the actors). Formulation of QCD. Quark counting rules for hard hadron scattering.

Fig. 63: D. Gross, F. Wilczek, D. Politzer (Nobel Laureates 2004)

As we saw already, all CERN programmes, from bubble chamber to LEP, have brought important contributions to the build-up and accurate checking of QCD. We can summarize the situation by Figs. 64 and 65. Figure 64 shows the very coherent set of determinations of the strong coupling constant (given at the MZ scale): the impact of CERN programmes in this respect is obvious. Figure 65 demonstrates that this coupling constant is indeed running.

QCD is a very remarkable theory. One can consider for instance the ‘QCD lite’ version, obtained by setting mu and md to zero and considering only the first generation of constituents. It is not a big change to our vision of things. Indeed the nucleon mass, and thus the mass of the baryonic universe, is due not to the mass of the constituents (mgluon = 0; mu, md = a few MeV) but to their dynamics. See “Mass without mass” by F. Wilczek [12]. In that case a single parameter is left: αS(MZ), or equivalently ΛQCD. For the consequences, see F. Wilczek again [13].

A mystery left in QCD concerns the zero or very small value of a potential CP-violating term, θQCD. The axion hypothesis was formulated as a possible explanation. Here too, CERN is involved with the question through the CAST experiment, which looks for such a particle emitted from the Sun. About the axion problem, I recommend a delicious paper by P. Sikivie, “The pooltable analogy of the axion” [14].


Fig. 64: The set of αs (MZ) measurements Fig. 65: The running of αs

Another domain concerning QCD is heavy-ion (HI) collisions, a very active sector of CERN physics at the SPS. The idea is that these collisions may provide conditions such that nuclear matter undergoes a phase transition in which, crudely speaking, individual nucleons ‘melt’ into a quark–gluon plasma. This is the reverse of what happened a few microseconds after the Big Bang (Fig. 66), when quarks got confined into nucleons. Under ordinary conditions it is inconceivable to get quarks free again. On the other hand, the CERN experiments have provided striking signals that could result from such a phase transition, like the ‘melting’ of the J/ψ in HI collisions, as shown by Fig. 67. The HI collider RHIC at Brookhaven and the future experiment ALICE at the LHC, colliding lead ions, will continue these studies.

Fig. 66: Location of LEP and LHC physics in the diagram describing the evolution of the Universe

Fig. 67: The ‘melting’ of the J/ψ as seen by NA50


11 CP violation and the origin of matter

Fig. 68: Parity violation (a, above left); CP conservation (b, above right); CP violation in the K0 system (c, below)

In our present Universe the ratio of the number of baryons to the number of photons is 6 × 10⁻¹⁰, a very small but definitely (and fortunately for our existence) non-zero number. A symmetry principle operates for electric charge, guaranteeing the exact neutrality of the Universe, but here an asymmetry seems to exist.

In 1954, the CPT theorem (Lüders) was demonstrated. In the theories we consider currently, the triple mirror CPT must be a good symmetry.

In 1956, the discovery of parity (P) violation [Fig. 68(a)] in the weak interaction was a major revolution. One then trusted the validity of the CP symmetry [Fig. 68(b)], the product of P and charge conjugation C. However, in 1964 CP violation was discovered through the observation of the ‘forbidden’ mode K0L → 2π (Fitch, Cronin, Christenson, and Turlay). Figure 68(c) shows evidence of CP violation.

In 1967, Sakharov gave his famous three conditions needed to guarantee the present matter–antimatter asymmetry. CP violation is indeed one of them. But in the SM the level of CP violation is insufficient to fulfil the goal. SUSY could be an answer.

It is impossible to do justice to all CERN experiments which dealt with CP.


Briefly: in 1966 an η-decay experiment at CERN showed that there is no evidence for C violation in the electromagnetic decay η → 3π, as had been suggested.

The year 1970 saw the measurement of the relative phase of the K0L → π0π0 and K0S → π0π0 amplitudes. It was shown that CP violation was accompanied by a violation of T to an amount compatible with CPT invariance, whereas T invariance with CPT violation was ruled out. In 1974 a new determination of the K0 → π+π− decay parameters was performed.

By that time, there was full agreement with the Superweak Model (Wolfenstein, 1964). Its mixing parameter ε is equal to the ratio of the CP-violating to CP-conserving decay amplitudes of KL and KS. CP violation was seen as a small effect showing up in K0 decay because of a small admixture of K01 (the CP = +1 eigenstate, allowed to decay into 2π) in the K0L: K0L = K02 + εK01.

But Kobayashi and Maskawa realized that, with three generations of elementary particles, CP violation can be incorporated in a natural way in the SM. Then it becomes possible that the CP-odd eigenstate K2 can decay directly into 2π . The corresponding parameter ε′ is generally not zero.

Let us resume the history at that stage.

12 Direct CP violation: NA31, NA48

12.1 NA31

The proposal was submitted at the end of 1981; the data were taken in 1986–89, with first results available in the summer of 1987. This was an elegant detector, with the KS beam mounted on a train. 4 × 10⁵ KL → 2π0 decays were recorded. It measured the double ratio R of the KS and KL decay rates to 2π0 and π+π−. The result was R = 0.980 ± 0.004 ± 0.005, i.e., a non-zero Re(ε′/ε) at 3σ. However, this result was not confirmed by the Fermilab experiment E731. The Superweak Model was only ‘wounded’.
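In the standard small-|ε′| approximation the double ratio reads R ≈ 1 − 6 Re(ε′/ε), so the quoted numbers can be cross-checked in a few lines (the quadrature combination of the two errors is my simplification, not the experiment’s):

```python
import math

# NA31 double ratio: R = [KL->2pi0 / KS->2pi0] / [KL->pi+pi- / KS->pi+pi-]
# In the small-eps' limit, R ~ 1 - 6*Re(eps'/eps).
R = 0.980
stat, syst = 0.004, 0.005
sigma_R = math.hypot(stat, syst)     # naive quadrature combination

re_eps = (1 - R) / 6                 # Re(eps'/eps)
sigma = sigma_R / 6

print(f"Re(eps'/eps) = {re_eps:.4f} +- {sigma:.4f}")
print(f"significance = {re_eps / sigma:.1f} sigma")   # ~3 sigma, as stated
```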

12.2 NA48

Approved in 1991, NA48 took data from 1997 to 2001. Its remarkable features were the liquid-krypton calorimeter and the magnet spectrometer. The experiment exploited the cancellations of systematics occurring in the double ratio, and the statistics was increased by an order of magnitude. New results also came from KTeV. Direct CP violation was finally established beyond doubt, at the 9σ level (Fig. 69), with a 10% confidence level for the consistency of all results. This implies an asymmetry between the decay rates of neutral kaons and their antiparticles into π+π− (2π0) of 5.5 × 10⁻⁶ (11 × 10⁻⁶), respectively.

Fig. 69: The evolution of the ε′/ε measurement with time


12.3 CPLear

In this experiment particles and antiparticles are concurrently produced from antiproton–proton annihilations. Thanks to the very good performance of the detector and the excellent quality of the beam, a wide range of studies were made which provided high-precision tests of T violation, CP violation, CPT invariance, and possible extensions of Quantum Mechanics (QM).

One labels the K0 by its strangeness, obtained from the accompanying charged K, and observes its subsequent evolution in time under the weak interaction, which does not conserve strangeness. One thus follows the oscillations of strangeness as well as the decay process.
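The strangeness oscillation being followed is the textbook one; neglecting CP violation, the probability for a state born as a K⁰ to be detected as an anti-K⁰ at proper time t is

\[
P\!\left(K^0\to\overline{K}{}^0;\,t\right)=\frac{1}{4}\left[e^{-\Gamma_S t}+e^{-\Gamma_L t}-2\,e^{-(\Gamma_S+\Gamma_L)t/2}\cos(\Delta m\,t)\right],
\]

with Γ_S, Γ_L the KS and KL decay widths and Δm their mass difference.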

CPLear has provided the first and only direct observation of non-invariance under time-reversal: this requires one to know the strangeness of the neutral K at two moments of its life, production and decay. The semileptonic mode eπν was used for the latter. The violation was established at the 5σ level.

A direct measurement of the CPT parameter was performed. It showed, 50 times more accurately than before, that CPT is conserved: the K0–anti-K0 mass and decay-width differences are found to be compatible with 0, as they should be according to CPT invariance, at a sensitivity of a few 10⁻¹⁸ GeV.

Considering the tests of QM, CPLear has measured special asymmetries whose values are compatible with the non-separability of the wave function of the K0–anti-K0 system, as predicted by QM, and rule out the separability hypothesis.

12.4 CP violation in B physics

It is usually admitted that this was discovered at the B factories. However, it is clear that LEP, and other programmes, also had ‘leur mot à dire’ (their say) on this topic. Showing that CP is violated amounts to showing that a triangle, the Unitarity Triangle, built from quantities measured in beauty and heavy-flavour physics, is not flat. As one can see from Fig. 70(a), the LEP era brought clear evidence that this is so, by providing an accurate determination of the tip of the triangle. Then the B factories measured its angle β directly [Fig. 70(b)]. The fair agreement between the two methods is another great success of the SM. However, the confrontation is continuing at an increased level of sensitivity.


Fig. 70: (a) The indirect determination of the tip of the Unitarity Triangle during the LEP era; (b) the measurement of the angle β of the Unitarity Triangle at the B factories


One can thus draw, a bit like in the case of the top mass, Fig. 71, showing both the indirect (through the knowledge of the tip of the UT) and the direct measurements of the angle β.

Fig. 71: Indirect (points) and direct (stars) determinations of the angle β of the Unitarity Triangle

12.5 CPT and antihydrogen

As we said before, the CPT theorem states in particular that particle and antiparticle have exactly the same mass and total width, etc. However, for instance in the frame of superstrings, one may still question the validity of the assumptions which led to this theorem.

The validity of CPT symmetry has been tested at the 10⁻¹² level for leptons and the 10⁻¹⁰ level for baryons. Since the energy difference ΔE/E between the 1S and 2S levels in hydrogen is measured at the ~10⁻¹⁴ level, there is potentially room for improvement through spectroscopic methods. Another reason to be interested in antiatoms is that there is no direct experimental proof of the equality of gravitational effects on matter and antimatter.

In 1993 came the idea that antiprotons circulating in the LEAR ring could create e+e− pairs in the Coulomb field of a gas-jet target and eventually capture the e+. Indeed, in 1995 the PS210 experiment at LEAR observed nine antihydrogen atoms moving at a speed of 0.9c, which created a great media stir.

In 1996, TRAP at LEAR was able to trap both e+ and antiprotons, but this occurred just before the machine was stopped. One therefore had to wait for the Antiproton Decelerator (AD) to pursue the programme. In 2002, the ATHENA and ATRAP experiments at the AD observed signals from ~50 000 and ~17 000 antihydrogen atoms, respectively. However, these antiatoms are still rather fast movers, while their energy should be below 0.06 meV for them to be trapped, a prerequisite for envisaging spectroscopic studies. The R&D is ongoing.

12.6 The study of rare decays

Among the roads that deserve to be followed in the future, I think that the study of rare decays, not only of beauty, but also of charm, strange particles, and the muon is a most promising one.

One may recall that a Unitarity Triangle can be built from kaon physics alone, provided one has access to very rare modes (Fig. 72). The NA48 experiment at CERN is paving the way, for instance by measuring, as a preliminary step, the KS decay mode shown on the right-hand side of Fig. 72.

Among the future plans for fixed-target experiments at CERN, the study of rare modes of kaons is certainly a highlight.

D. TREILLE


Fig. 72: The Unitarity Triangle from kaon physics and a beautiful measurement by NA48

During the last few years one might have had doubts about the unitarity of the first row of the CKM matrix, as shown in Fig. 73. However, new measurements, at CERN (NA48) and elsewhere, of K and hyperon decays have led to a slight revision of the numerical value of the second element, Vus, which is simply the sine of the famous Cabibbo angle. The situation is now much better and this potentially embarrassing (and exciting) anomaly is on the way to disappearing.
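The first-row unitarity test referred to here can be made explicit; the relation below is the standard one (it is not spelled out in the text), with a comment on why the revision of Vus matters:

```latex
% First-row unitarity of the CKM matrix:
%   |V_{ud}|^2 + |V_{us}|^2 + |V_{ub}|^2 = 1 ,
% with V_{us} \simeq \sin\theta_C (the Cabibbo angle).
% |V_{ud}| comes mainly from superallowed nuclear beta decays and
% |V_{us}| from semileptonic K (and hyperon) decays; |V_{ub}|^2 is of
% order 10^{-5} and negligible at this precision, so a slight revision
% of V_{us} shifts the sum directly.
\begin{equation}
  |V_{ud}|^2 + |V_{us}|^2 + |V_{ub}|^2 = 1 ,
  \qquad |V_{us}| \simeq \sin\theta_C .
\end{equation}
```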

Fig. 73: A problem with the unitarity of the CKM matrix? (Courtesy Rainer Wanke)

13 The future

It is clear that worldwide planning and distribution of large programmes is mandatory. Currently Europe has its share: the LHC, the flagship of collider physics for the next decades.

The first goal of particle physics is to go up in energy and observe (directly!) the particles and effects we think we ‘feel’ (a light Higgs?) or suspect (SUSY?). The LHC, with its huge mass reach, is the right machine to do so. The LHC programme has to be a success. We all know that it will be very demanding: more than 80% of CERN resources are being devoted to it. Moreover, the physics results from the LHC are mandatory to guide the next steps.

What else should one foresee at CERN?

One should guarantee a minimal, well-targeted programme of non-LHC physics: fixed-target (I alluded above to rare decays), low-energy antiprotons, etc.

One should ensure a boost of the CLIC R&D programme, so as to have answers about its feasibility by the time LHC results indicate the road to be followed. This is the goal of the CTF3 facility.


One should guarantee, by proceeding in carefully controlled steps, the possibility of having more protons available at CERN, which can benefit several areas, from ISOLDE to the LHC and an eventual neutrino programme.

One should have some limited level of involvement in astroparticle physics and astrophysics. CERN cannot (and does not) ignore the news ‘from the sky’; neither can CERN invest massively in such programmes. The scheme of ‘recognized experiments’ is a first step. One clearly needs a European and even a worldwide co-ordination of astroparticle physics: CERN, through its Council, may have a key role to play in this respect.

13.1 A few personal remarks concerning the LHC

The initial LHC energy is not a big concern. If it helps the machine commissioning, one can compromise and run slightly below the nominal energy. In particular, to minimize the risks taken with the machine components and the experiments, it is likely that the rise in luminosity will be rather slow.

Concerning its physics, one should diversify the phenomenologies under study, and in particular escape the tyranny of the MSSM. Right or wrong, large extra dimensions or Little Higgs models are most welcome because they suggest new topologies to be studied!

Besides ‘narrow-bump hunting’ one should consider cases where a more elaborate measurement is needed and focus on problems such as normalization, control of systematics in the determination of distributions, etc.

Since, with few exceptions (Higgs, missing energy), one does not really know what the signals to look for could be, we should favour a topological approach to the search programme: namely, explore ALL final-state topologies that are accessible, i.e., those for which the SM background is well mastered and the experimental acceptance is good and well known. The former implies a huge effort in the field of QCD, the latter much calibration work exploiting simple, well-known physics channels.

The LHC will have only two (big) experiments. An army of physicists will construct a large number of distributions extending over a vast energy domain. This may lead to many seemingly anomalous features and statistical fluctuations. Which policy should one adopt to confront their findings in due course?

13.2 Miscellaneous and personal comments

There are quite obvious spin-offs from our discipline: detectors (crystals, pixels, etc.), magnets, accelerators, the Web, the GRID, methods of data treatment and analysis, etc., from which other fields, like medicine, also profit. It is important to develop and highlight them.

However, we must first ‘sell’ our science for its scientific content. I think it is possible to convey its messages, even to the general public. This needs work and practice. From my personal experience, listeners whom we have a chance to interest prefer to be ‘tirés vers le haut’ (pulled upwards), rather than treated as half-wits. The impact of a well-explained visit to CERN can be decisive. A first condition is to show our own real enthusiasm about our research: “Enjoy, and share your enjoyment.”

There exists a clear synergy between particle physics and astroparticle physics/cosmology. We should naturally develop and exploit it. In particular, since we are confronted with cosmological problems, we should not fear ‘philosophical’ questions and be prepared to react intelligently and honestly to them.

Some ethical rules have to be respected and promoted: we should avoid haste, in particular rushing from one project to the next, and we must fully exploit our machines and programmes. We have to maintain and improve our peer-review procedures, check the quality of our publications (in particular on the Web), and find ways to ensure the visibility of talented young individuals inside huge collaborations.


Given the size and complexity of future programmes, we cannot try to do everything everywhere. We must accept and contribute to a worldwide planning and sharing of activities. ‘Par la force des choses’ (by force of circumstance) this will happen for the large accelerators and colliders, but it has still to be established in astroparticle physics.

As we did in the past, we must ensure the continuation of a strong and co-ordinated R&D in accelerators and detectors, which is the key to our future.

We should try to get closer to other physics fields with which we can have fruitful reciprocal relations (ideas, methods), like atomic physics, condensed-matter physics, and ultracold neutrons.

14 Conclusion

In spite of many omissions, I think I have given a balanced account of half a century of physics at CERN. What is most remarkable is that CERN, even though over fifty years old and a victim of abusive slimming treatments, is more enterprising than ever. It is obviously a great opportunity for CERN to have the LHC starting in a couple of years. Let us hope that it will soon shed light on which physics is hiding behind the SM.

Acknowledgements

I would like to thank warmly R. Barate, L. Di Lella, K. Hübner, L. Pape, and F. Pauss for their careful reading of the manuscript and their comments. I would also like to thank the STP service for their excellent work.

References

[1] Highlights of 25 years of physics at CERN, Phys. Rep. 62 (1980).
[2] CERN–The second 25 years, Phys. Rep. 403–404 (2004).
[3] U. Mersits, Reports CERN CHS-21 and CHS-23 (1987); L. Weiss, CERN CHS-26 (1988); D. Pestre, CERN CHS-36 (1992).
[4] J. Krige, Report CERN CHS-20 (1987).
[5] L. Di Lella, Phys. Rep. 225 (1993) 45.
[6] D. Haidt, CERN Courier 44 No. 8 (2004) 21–24; Eur. Phys. J. C34 (2004) 25–31.
[7] A. Rousset, Nucl. Phys. Proc. Suppl. 36 (1994) 339–361.
[8] D. Perkins, CERN Courier 43 No. 5 (2003) Suppl. 15–19; http://cerncourier.com/main/article/43/5/22 .
[9] D. Denegri, DAPNIA-04-247 (2004); CERN–The second 25 years, Phys. Rep. 403–404 (2004).
[10] F. Farley, G minus 2 plus Emilio, CERN-OPEN-2002-006 (1992).
[11] F. Teubert, Int. J. Mod. Phys. A20 (2005) 5174–83; LEP EW Group, http://lepewwg.web.cern.ch/LEPEWWG/ .
[12] F. Wilczek, Mass without mass, Phys. Today 53 No. 1 (2000) 13–14.
[13] F. Wilczek, QCD and natural philosophy, MIT-CTP-3328 (2002), arXiv: physics/0212025.
[14] P. Sikivie, Phys. Today 49 No. 12 (1996) 22–27.


Fifty years of research at CERN, from past to future: Computing

D. O. Williams
CERN, Geneva, Switzerland

Abstract

Computing in the broadest sense has a long history, and Babbage (1791–1871), Hollerith (1860–1929), Zuse (1910–1995), many other early pioneers, and the wartime code-breakers all made important breakthroughs. CERN was founded as the first valve-based digital computers were coming onto the market. Computing at CERN will be considered from various viewpoints, including the various ‘wares’ (hardware, software, netware, peopleware and more recently middleware) that it incorporates, and the impact which it has had at CERN, on particle physics in general, and on other sciences.

For medical reasons David Williams has been unable to complete the section on fifty years of CERN computing in time for publication with the other material. He hopes to complete this work for publication towards the end of 2006. In the meanwhile his PowerPoint slides are included here.


CERN-50: 50 years of Research at CERN

From past to future (Computing)

Personal Reflections and Photographs

David Williams
CERN Academic Training Programme

16 September 2004

See cern.ch/David.Williams/public/CERN50computingPub.ppt

Caveats
I knew that this job would not be easy – it was even harder than I thought.
I stopped as Division Leader at the end of 1996, and I have had very little to do with physics computing since ~end 1998, which is two Internet generations.
If you want to prepare a proper historical review (book) you should identify the themes, have them treated and analysed by specialists, and it would take a couple of years, and I might even try to do that one day.
Today I merely try to paint a picture, making some technical comments. Surely a personal picture, with all the biases that implies. And in no way am I trying to assign credit to individuals. I know that I will fail to mention some important aspects of the puzzle, and many people who have made important contributions.

I concentrate on the early years – you are living the later ones and don’t need to be told about those! But I have not talked to two people that I should have talked to about the very early days (George Erskine and Henk Slettenhaar).

The lighting of many of the photos is not wonderful, and many of the early ones are very “posed”. We have tried and not always succeeded to assemble “collages” of many of our colleagues.

FIFTY YEARS OF RESEARCH AT CERN, FROM PAST TO FUTURE: COMPUTING


The nature of the problem – People and their interactions

I suspect that the LHC experiments are running at the limit of what is feasible…
Not the amount of funds that can be assembled
Or the complexity of the detectors
But the possibility of keeping such a large number of very smart people working enthusiastically and actively towards a common scientific goal

Until the mid-1980s HEP’s “computing problem” was often thought to be about obtaining enough processor power
Then we worried about storage capacity
The real problem has always been, in my opinion, getting people to collaborate on a solution

Acknowledgements

Miguel Marquina – who helped me to prepare this, and who has scanned a lot of photos for me (more than I am using)
CERN Photolab
Many others who have provided information and photos, and answered my questions
All of the people – CERN people, CERN users, outside labs and computer suppliers’ staff – who did the real work behind fifty years of “Computing at CERN”
The mistakes – and there will surely be several – are mine alone

D. O. WILLIAMS


Outline

Setting the scene – the world in 1954 and 2004
The early days – roughly to 1967
Some early technology – punched cards
Later Computer Centre machines
Software
Measuring Machines
Online computing and DAQ
Onsite Networking
Offsite Networking
Controlling Accelerators
Various other things (Databases and other special applications, EU projects, Emulators, CERN School of Computing, HEP-CCC)
Data Handling
Information Handling
Prospects for the Future
Wim Klein

First thoughts
Computing at CERN has little to do with Research into Computing.
It is the huge challenge of using leading-edge technologies to provide top quality services for the CERN community, so that they can prepare, run and process their experiments as well (and competitively) as possible.

Over time the physics challenges mainly stay conceptually the same, but the available technology – and its cost – keeps evolving.
Online feedback AND Worldwide access to data and information, Worldwide participation in all phases of an experiment, are the basic (enduring) challenges.
What did not work or was too complex – or was impossibly expensive – last year may well work – or be affordable – next year.


SETTING THE SCENE
THE WORLD IN 1954 AND 2004

WHERE DID WE COME FROM?

1954

Europe was still recovering from World War II

Historically computing had been driven by the need for accurate compilation of tables – especially for navigation – (Babbage “mathematical engine”) and for census purposes (Hollerith)

More recently (~previous 20 years) it had been largely driven by military needs – code breaking and bomb simulation


1954 world timeline – English Literature

Lord of the Flies (Golding)
Lord of the Rings Vol 1 – The Fellowship of the Ring – and Vol 2 – The Two Towers (Tolkien)
Lucky Jim (Kingsley Amis)
Under Milk Wood (Dylan Thomas)
Under the Net (Murdoch)

1954 world timeline – General events

First polio vaccination
First kidney transplant
First four-minute mile (Bannister)
Battle of Dien Bien Phu
Algerian War of Independence starts
General Nasser becomes prime minister of Egypt
Bikini Atoll hydrogen bomb test
First Russian hydrogen bomb test
USS Nautilus launched
Sen. McCarthy active (investigating Army etc.) but year ends with his condemnation by Senate vote
US Supreme Court decision in Oliver Brown v Board of Education of Topeka KA (and others)


1954 world timeline – More

Nobel Prizes
– Physics: Born (quantum mechanics) and Bothe (coincidence circuit)
– Chemistry: Pauling (chemical bond …)
– Medicine: Enders, Weller and Robbins (for the cultivation of polio virus – leading to vaccines)
– Literature: Hemingway
– Peace: UNHCR

Born
– Cherie Blair
– Condoleezza Rice

Died
– Alan Turing
– Enrico Fermi

NATIONAL LEADERS

“Who wants to be a millionaire”

1954 political leaders of Russia, USA, UK, France, Germany??


First Secretary of the CP of the Soviet Union


Photo: Karsh

René Coty (President from 16 Jan)

Joseph Laniel (PM until June 18)

Pierre Mendès-France (PM from June 18)


Photo: Haus der Geschichte der BRD, Bonn

MARRIED AND DIVORCED IN 1954


Married 14 Jan Divorced 5 Oct

1954 computing timeline

Computers were valve-based

I was 10 years old and had never seen a computer. I first saw one, a Stantec Zebra, in ~1956, and then started taking out the ~3 books in the local library about computing.

I saw EDSAC 1 in Cambridge in ~1958 as it was being dismantled.


Stantec Zebra

Built in Newport by STC from the original concept of van der Poel (Delft). 8k 33-bit words. Valve-based. Cost 23 kGBP (then ~280 kCHF), and was one of the cheapest general-purpose machines of the time

Note: Fantastic paper-tape equipment and telegraph-style typewriter

EDSAC 1. Courtesy, Cambridge Computer Laboratory Archive


More 50s and 60s computing timeline

1947 (Dec): Point contact transistor invented
1951 (Sep): Major Bell Labs symposium on working junction transistors
1955: Wilkes invents microprogramming – programming the instruction set
1956: First magnetic disk system sold (IBM RAMAC)
~1956: FORTRAN under development
1959: IBM 1401 first shipped. Transistorised. Punched card input. 12,000 sold over 12 years
1960: PDP-1 launched (18-bit words)
1964: PDP-8 launched (12-bit words)
1964: System/360 launched (4 × 8-bit byte words)

IBM RAMAC disk – 1956
7-bit storage. Capacity ~4.4 Mbytes
Held on 50 24-inch platters with data recorded on both sides on 100 tracks
A computer in its own right
RAMAC == Random Access Method of Accounting and Control
Leased for 40 k$ per year


Personal timeline

August 1966: Williams to CERN

At that time I had programmed the Titan (prototype Ferranti Atlas-2) in Autocode and a DEC (Digital Equipment) PDP-7 – a follow-on machine to the PDP-1 – equipped with an interactive display, in assembler.
A multi-user service and a stand-alone “mini-computer” which you could work on alone in the evening


Courtesy: Martin Pool
Note: Chair! Teletype (KSR-33), PTR and DECtape (~256 kB/reel)
DEC 340 Display (attached to PDP-7) with light-pen


2004

Longest period of peace in (much of) Europe since ever

Not intended as an exhaustive list
Napoleonic Wars 1803–1815
Belgian War of Independence 1830–1832
Italian Wars of Independence 1848–1866
Crimean War 1853–1856
Danish–Prussian War 1864
Austro–Prussian War 1866
Franco–Prussian War 1870–1871
World War I 1914–1918
World War II 1939–1945

van Hove


END OF SCENE SETTING

DOWN TO WORK!

THE EARLY DAYS

ROUGHLY TO 1967

From CERN annual reports
Mainly verbatim quotes
(but I did some paraphrasing)


1956/57

1956
– Following in the steps of Harwell and Saclay … CERN ordered an electronic computer of a new type (Ferranti Mercury) in May 1956. Because of unforeseen difficulties experienced by the producers of this machine, however, there will undoubtedly be some considerable delay in its delivery …
– An experienced mathematician-physicist has been recruited to run the future Computer Section. Further staff will eventually be recruited, but the section will not be very large since …

1957
– The Mercury computer will probably not be installed until the summer of 1958
– … prepare programmes for use on the English Electric Deuce and the IBM 704 in Paris
– By the end of 1957 there were 2 staff plus 1 fellow in the Computer Section

1958–59

1958
– The Ferranti Mercury computer arrived in the summer and its acceptance tests were completed by mid-October.
– … Autocode is fairly easily learnt by scientists who have had no experience of computers
– (referring to the IBM 704 in Paris) in each case the programming was done in Fortran
– The speeds at which the available paper-tape devices can transmit data to and from the computer are in no way comparable with the computing speed
– Leading personnel … took part in the “Symposium on the Mechanisation of Thought” at NPL Teddington in November
– The staff of the Computing Group, numbering about 10 at the end of 1958, will have to be doubled in the course of 1959

1959
– … to supplement and eventually replace the present Mercury computer by hiring an IBM 704 from the latter part of 1960. The installation will be equipped with an 8’000 word core store (32 kB), 8 tape units and various other ancillary equipment. It is due to start operation in the autumn of 1960
– In October regular two-shift working was introduced on the Mercury … Nevertheless there is already a backlog of computing work.
– The computer service has been used by about 40 customers, who wrote more than a hundred Autocode programmes


Mercury: note paper-tape equipment and real online typewriter

Ferranti Mercury

Auffret and Slettenhaar


1960–61

1960
– In early November a much larger and faster computer, the IBM 709, was delivered to the site and in 1961 seven analysing machines should be in operation using the IBM 709 for data processing
– To house it a prefabricated building has been put up with all necessary air-conditioning and stabilized power supplies
– Full-time operation of the Mercury was introduced in April 1960. At week-ends (on Saturdays and sometimes on Sundays) it has been possible to work on one or more shifts, and do some maintenance and upgrades
– First mention of Flying Spot Digitisers (Hough and Powell)

1961
– It became clear during the year that even more attention will have to be given to data handling and analysis, that as … …, will require more and more machine time from computers with faster operation and larger memories

IBM 709


PDG of IBM France, Baron de Waldner, at the official opening?

1962 – general

1962 – The importance of data handling as a central problem for the laboratory has become even more obvious with the growing flood of bubble chamber pictures and the beginning of what is likely to be a comparable flow of spark chamber data
– The 709 worked an average of 1.5 shifts during the year
– The Mercury ran 24 hours × 5 days throughout the year, with occasional running at week-ends
– A system … is being written to run (Mercury) Autocode programmes on the IBM 709!
– The IBM 1401 computer was delivered in November (for I/O handling). CERN's third computer? (and first transistorised machine)


1962 – the "offline chain"

1962 – A new generation of programmes for the analysis of measurements of bubble chamber photographs was brought into use. The new programmes make use of the greater speed of the IBM 709 computer, …, they are written in Fortran …
– Paper tape measurements read by REAP (Mercury), then to the 709 via magtape
– Geometry reconstruction by THRESH
– Kinematic analysis by GRIND (which later needed an "ignore infinity" card on the CDC 6600!)

From the Electronics Experiments Committee, 19 Sept 1962
Preiswerk (chair), Cassells, Rubbia, Wetherell, Dick
The situation … with respect to the analysis of spark chamber photographs was briefly discussed. With the influx of 10^5 pictures from the PS diboson experiment S1 and … it is clear that the available facilities are rapidly becoming overloaded. A general discussion of the whole picture analysis problem would be forthcoming from Hine's study group.
A possible and untapped source of effort might be available from universities. Participation by university personnel in the data taking stage would be essential.


1963

By September the IBM 709 computer was replaced by a transistorised IBM 7090, increasing by about a factor 4 the total computing capacity available at CERN.
On the basis of the report of the "European Committee on the Future Computing Needs of CERN", the DG proposed to the Council the purchase of a large time-sharing computer – the CDC 6600 – to replace CERN's present computer by early 1965.
The computing capacity (… estimated to be at least 15× CERN's present capacity) will allow considerable development of data handling techniques used in HEP. It is planned to exploit fully the time-sharing properties of the computer in order to allow simultaneous operation of various on-line applications (counter experiments, film measuring devices, etc.) together with the more conventional computing work of the Laboratory. A great, but unfortunately not very realistic, vision.

IBM 7090 – October 1963

Hans Klein and Swoboda (IBM)


More 1963

REAP moved from Mercury to 709
BAKE-SLICE-SUMX introduced
THRESH for heavy liquids being developed
Manuals written!
A four-week lecture course in Oct–Nov on the programme chain was attended by 120 physicists from all over the world
Preparations to use the Mercury online with a sonic spark chamber experiment

1964

The CDC 6600 will be delivered in Jan 1965 and the change-over from the IBM 7090 is planned to take 3 months
By the end of the year the 7090 was operating on a 24×7 basis, processing an average of 350 jobs/day
Second IBM 1401 installed
General Mercury service stopped in April and the machine moved to online work. 1 km data link to the South Hall (missing mass) and a similar link to the SC vidicon experiment
Sonic spark chamber analysis programmes were developed for an experiment using the SDS 920 computer online
Programme Library started – with 80 general programmes
Definition of CERN Fortran (to provide compatibility across machines)
Detailed proposal being made to connect several IEPs to the CDC 6600
New standard interface defined with CDC to enable HPDs to be attached directly to the 6600


From the Electronics Experiments Committee, 13 January 1964

Puppi (chair), Heintze, Lock, Munday, Preiswerk, Sens, Winter
Data Handling Facilities
Several experiments experience severe difficulties in view of the limitations on the available data handling facilities. The EEC recommends as a general policy that experiments which have priority on the accelerators also have priority on the data handling facilities. It is considered of the greatest importance to organize the data analysis in such a way that a feed-back to the data taking phase of the experiment is possible.

(i.e. you want online feedback!)

From the March 1964 Informal Meeting on Film-Less Spark Chamber Techniques and Associated Computer Use

General Discussion on Online Computer Use (led by Macleod)
As far as online computers are concerned, there are those who are in favour and those who are against, and, by and large, these appear to correspond to those who have online computers and those who do not! There is nobody who has been near an on-line computer who has said that he did not like it.
If I can start with Lindenbaum, he is essentially in favour of everything. He is not reticent about it, and he said that he thinks that his kind of experiment could use a 6600 full-time plus any other computers one can find on the East Coast!


1965

The IBM 7090 computer was completely overloaded from the beginning of the year and it was necessary to process work away from CERN until June
The CDC 6600 was delivered in January, but delivery delays with the SIPROS time-sharing operating system, and technical problems with the hardware itself, prevented operation at anything like the planned capacity
The IBM lease was terminated at the end of July
Work on SIPROS was concentrated at CERN. The work is making good progress and it is planned to introduce SIPROS in January 1966
Hardware reliability problems led to a 4-week overhaul in mid-October – when I think that it was the only computer onsite (apart from the Mercury)

CDC 6600 (console, card readers, line printers)


What's the problem? Lipps, Trenel, Deller and two CDC analysts

1966

Most of the work of DD Division was centred around the difficult task of bringing the main computer, the CDC 6600, into full and reliable service, which was largely accomplished, with some temporary set-backs, by the end of the year
Control Data installed a 3400 to take some of the load, upgraded to a 3800 in August. Planned to upgrade the 3800 to a 6500 in spring 1967
It was recognised that computing had become such an integral part of the scientific work of CERN that an interruption of even a day or so due to computer breakdown caused serious disruption to the work of the Laboratory
Tender for the first ADP computer
HPD1 and Luciole online to the 6600, but running at only ~60% of the speed available on a dedicated 7090
Two IEPs online to the 6600, but poor performance (memory) led to dropping that solution, and a CDC 3100 was installed to control the IEPs in August
Data link SDS 920 – CDC 6600 used up to 5 hours/day, planned to lead to FOCUS. Another data link to an IBM 1800


CDC 3800: Bernard Petiot at the console

1967

Experience confirmed that the running-in problems of 1965 and 1966 had been overcome
CDC 6400-6500 installed and by end of 1967 both machines were running CERN SCOPE (i.e. not SIPROS, which was dead) and drum and large disk had been added
Jobs rose to 5'800/week, from ~400 users
10'000 tapes in the tape library
FOCUS system started on CDC 3100. Intended primarily for a limited number of experiments at the PS using small online computers which will be connected by datalinks
Graphics on CDC 3100 (display with light-pen)


CDC 6500: tape reel display

AND A SUMMARY OF ALL OF THAT?


Summary (1/2)

CERN had moved from one valve-based Mercury computer in October 1958, via the IBM 709 (Nov 1960–Sept 1963), then the transistorised 7090 (Sept 1963–July 1965) and the CDC 6600 (from Jan 1965), to maybe twenty transistorised computers in 1967
CDC 3100s hovering around the 6600/6400
Several computers controlling film-measurement machines
And others controlling experiments online
And starting on real special-purpose activity like ADP and accelerator control

Summary (2/2)

We had learned the hard way that computing is an integral part of the life of a scientific laboratory
That software and hardware reliability are vital
And that the technology was not yet ready to handle efficiently the mix of a general-purpose computing load with devices such as IEPs and HPDs on the same machine
The long lead-times between order and delivery (and for acceptance testing) come from another world
We were also learning to balance expenditure between the big central computer and the computers at the experiments (not really suitable for full-blown analysis codes)


SOME EARLY TECHNOLOGY

PUNCHED CARDS

Card copying (and listing): plug boards on right; also card supply


Magnified enough you can spot card decks from Aguilar, Dufey, Gavillet, Kleinknecht, Lindelof and Salmeron

Card readers in real use


Looking for the bug

LATER COMPUTER CENTRE MACHINES

Ferranti Mercury
IBM 709/7090
CDC 6600 (et al.)
CDC 7600 (et al.)
together with IBM 195/3081/3090 and Siemens
HOPE/CSF/SHIFT servers
together with VAX 780/750/8600


The location of the "Computer Centre"

The Mercury was installed on the ground floor of Building 1 – on the Jura side – in what are now ATLAS offices. You can find the location by looking for the bell which is still there
I'm not sure exactly where the IBM 709/7090 was installed – I suspect close to, but not exactly where, the CDC was subsequently installed
The CDC 6600 plus 6500 was installed where the Print Shop is now
B513 was constructed during 1971 and the machines were moved there during 1972 (and maybe early 1973)

B513 under construction in 1971


The Move to B513The Move to B513

I have always believed, with the wisdom of hindsight, that it I have always believed, with the wisdom of hindsight, that it was a huge mistake that the new Computer Centre was was a huge mistake that the new Computer Centre was moved so far way in 1972moved so far way in 1972It took almost all DD/CN/IT staff away from the natural It took almost all DD/CN/IT staff away from the natural centre of the lab, and made easy integration with the centre of the lab, and made easy integration with the experiments much harderexperiments much harderOf course, I must recognise that there was an immense Of course, I must recognise that there was an immense amount of good vision behind the move too, and we amount of good vision behind the move too, and we certainly are benefiting from the space as we move towards certainly are benefiting from the space as we move towards LHC computingLHC computingIMO just in the wrong place IMO just in the wrong place –– we built B32 and the hostels we built B32 and the hostels much closer to CERNmuch closer to CERN’’s natural centre afterwardss natural centre afterwards

B513 B513 –– rere--installation of 6600 in 1972installation of 6600 in 1972


CDC 7600 wiring


IBM 370/168 – the CERN Unit

Siemens 7880 in 1985


CSF in 1992

VAX Area in 1993


Cray X-MP/48 (1988–1993)

Servers (NICE etc.)


SHIFT

CERN ACTS AS THE NATURAL CENTRE FOR A LARGE COMMUNITY


Weekly interactive users 1987–2000

[Chart: number of users each week, from 0 to ~12,000, plotted by week from 8701 to 0024, for CERNVM, VXCERN, WGS and PLUS, and Windows (95 and NT)]

SOFTWARE

Does Moore's Law only apply to hardware?


Did we make any progress with software?

Answer has to be YES
In 1960 an experiment involved (I guess) several hundreds, up to a few thousands, of lines of code, running on one single computer, and which had been written by no more than ~3 programmers
In 2000–2005 an LHC experiment involves several tens of millions of l.o.c., running on 1000–10'000 processors, and written by at least ~300 programmers
So, over 40 years, 10^4–10^5–10^6 more l.o.c., involving O(100)× more authors
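Those growth factors can be checked with a quick back-of-the-envelope calculation. Note that both endpoints below are the rough estimates quoted above, not measured values:

```python
# Rough check of the code-growth factors quoted above.
# Both endpoints are the lecture's estimates, not measured values.
loc_1960 = (300, 3_000)              # several hundreds up to a few thousands of lines
loc_lhc = (30_000_000, 80_000_000)   # "several tens of millions" of lines

low = loc_lhc[0] / loc_1960[1]       # smallest plausible growth factor
high = loc_lhc[1] / loc_1960[0]      # largest plausible growth factor

print(f"growth factor roughly {low:.0e} to {high:.0e}")
```

With these assumed endpoints the ratio spans 10^4 to a few × 10^5; taking the most extreme pairing (a ~100-line 1960 program against ~10^8 LHC lines) reaches the 10^6 quoted as the upper end.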

Delivering software to the end-user

We (fortunately!) no longer need to duplicate card decks or magnetic tape to deliver software to colleagues
We deliver it over the network – to the workstation
If we are sharing it on a professional basis it can be configured to some agreed specification
The world of CERNlib, ASIS, AFS, CVS, SRT, …
The steady improvements here helped to deliver several really major packages which changed the face of physics computing ...


The "offline programmers"

(Macleod, YGC†), Bock, Zoll†, Kellner, Pagiola, Burmeister, Bruyant, Norton
Brun, Palazzi, Grote, Metcalf, Onions
Giani, Apostolakis

The Program Librarians
James, Renshall, Carminati, Marquina, McLaren


Languages and Operating Systems

Many software engineers thought that physicists were renegades and would only ever write in FORTRAN. But things moved.
F-77, and also C (portable assembler) and C++ (object orientation)
Basic, then Java for interactivity
Not to forget scripting languages – primarily to interact with OSs
SCOPE, MVS+Wylbur, VM/CMS, VMS, Windows, Unix, Linux
Digital and Norsk Data died for not recognising Unix quickly enough
And in the future …? What role for Open Source?

Some important CERN software suites

REAP, THRESH, GRIND, SLICE
SUMX and then HBOOK
PATCHY and ZEBRA
GEANT3 and then GEANT4
Leading on to PAW, ROOT, POOL and all the other software that will form the basis of LHC analysis software
Not to forget things like the AIS software, EDMS, MAD and other accelerator design packages, etc.


PAW

Discussed in MEDDLE at the September 1986 meeting (first meeting with John Thresher present as responsible Director)
"Towards a Physics Analysis Workstation" (December 1985) – Bock, Brun, Pape and Revol
As the world moved to graphics and interaction, trying to bring "everything" together to create a convenient interactive interface for the physicist doing analysis
A real de facto standard and huge success

René Brun: presentation of PAW at the April 1989 CHEP in Oxford


PAW demonstration at Oxford CHEP, April 1989


GEANT

The basis of much (not all) of CERN's physics simulation work
A series of releases aimed at increasing functionality and reliability …
Of which 3.14 (early days), 3.15 (end '91) and 3.21 (spring '94) were some of the most important for LEP
Effort then evolved/switched to GEANT4 (OO)

OPAL: GEANT 3.21


GEANT4 (direct) graphic of ATLAS beamline test

ALICE ROOT display


MEASURING MACHINES

As we have seen, these formed a critical part of the early computing load

Multiple versions

IEPs were needed for many different films and chambers (both bubble- and spark-)
Initially punching paper-tape – later controlled by computer
Then machines with more/less automation and different scanning technologies: FSDs (mechanically generated spot scans), Spiral Readers (spiralling slit scan starting from the vertex) and CRT scanners (Luciole, PEPR, ERASME)
Until the arrival of machines of the PDP-6/10 class there was a continuing problem about how much programmed feed-back could be made available to the operator – not enough memory space & floating-point power to run real reconstruction
But lots of good mechanical and electronic engineering


Spiral Reader – 1969

ERASME – 1971


Hough and Powell

Various HPD (bubble chamber) programmers and users:
Moorhead, Sambles, Joosten, Evershed, Ferran (2x), Quercigh, Celnikier, Morrison. Missing: Gerard, Lord, Altaber, Antonioz-Blanc, Howie, French, Atherton


Rinus Verkerk, Adolfo Fucci and one other (kneeling)

ONLINE COMPUTING AND DAQ

The online connection of CERN experiments to the Computer Centre had a long history (as we have already seen)


IBM 360/44: Per Scharff-Hansen at the console

OMEGA CII 10070


ONSITE NETWORKING

Focus
RIOS
Terminals
CERNET
The world of TCP/IP

FOCUS in August 1969
Yule at console; Gerard-Weights standing on right


1972 – a proto-RIOS in B112 (West Area)
Michael Metcalf, Anton Frölich, and an unknown person at the printer

UA1 analysis? Note the Gandalf terminal switches


CERNET Modcomp team

Gigaswitch interoperability tests


OFFSITE NETWORKING

Broadly speaking we have moved from external connectivity at kbps in the early 1980s to Mbps in the early 1990s to Gbps in the early 2000s. By LHC start-up or soon after we are likely to reach a total of 100 Gbps. We will need that to handle the grid data flows.
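The kbps-to-Gbps progression sketched above can be turned into a rough growth rate. A minimal sketch, using only the rounded orders of magnitude from the text (not measured data):

```python
# Offsite bandwidth growth: ~1 kbps (early 1980s) to ~1 Gbps (early 2000s).
# The two endpoint figures are the rounded orders of magnitude quoted above.
kbps_1983 = 1e3      # bits per second, early 1980s
gbps_2003 = 1e9      # bits per second, early 2000s
years = 20

total_growth = gbps_2003 / kbps_1983          # factor of a million overall
annual_factor = total_growth ** (1 / years)   # ~2.0: roughly doubling every year

print(f"overall growth: x{total_growth:.0f}")
print(f"average annual factor: x{annual_factor:.2f}")
```

In other words, three orders of magnitude per decade corresponds to an average doubling of external connectivity every year.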

François Flückiger connecting to Transpac?


Internet2 Land Speed Record – IPv4 Single and Multiple Streams

Raw metrics
– 27 February 2003
– 1.1 terabytes
– 1 hour, 1 minute, 40 seconds
– 10,037 kilometers

Results
– 2.38 gigabits per second
– Sunnyvale to Geneva, by way of Chicago
– 23,888.06 terabit-meters/second
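The headline figures on the slide are mutually consistent, as a quick check shows (all numbers are taken from the slide itself):

```python
# Internet2 Land Speed Record, 27 February 2003 -- figures from the slide above.
seconds = 1 * 3600 + 1 * 60 + 40    # 1 h 1 min 40 s = 3700 s
bits = 1.1e12 * 8                   # 1.1 (decimal) terabytes, in bits

rate_gbps = bits / seconds / 1e9    # throughput implied by volume / time
print(round(rate_gbps, 2))          # -> 2.38, matching the quoted rate

# The record metric is (quoted rate) x distance, in terabit-meters per second
metric = 2.38e9 * 10_037e3 / 1e12
print(round(metric, 2))             # -> 23888.06, as quoted
```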


Record-breaking team

CONTROLLING ACCELERATORS

Control systems play a crucial role, especially during the start-up of new accelerators and their subsequent operation

The SPS control system was responsible for several important computing advances (though the PS always had stronger real-time constraints). And the Nord story is an important part of CERN's European TT history.

I could find more SPS than PS pictures


Michael Crowley-Milling

Nord-10s for the SPS controls

Raymond Rausch


SPS console
Bent Stumpe at console

Controls Display in 1976


Controls Display (p-bar) in 1991

VARIOUS OTHER THINGS

ADP (and then AIS)
EU-funded computing activity
Griddery
Control systems at experiments
And still many other things


ADP 360/30 in March 1968

ADP 360/50 in April 1971
Note the very early model IBM 3330 disks
Lionel Grosset (invisible) at the console. Jean Gros standing


IT database specialists

Shiers, Santiago, Segura, Düllmann (Moorhead inset)

EU-funded computing activities

Bob Dobinson was a pioneer and was involved in several different projects (especially transputers and advanced networking)
Meiko CS-2 was a European example of the classic US "place a supercomputer at a national lab and they will do something useful with it" approach. In our case we used the Meiko to run NA-48 data taking etc.
And since 2001 DataGrid and EGEE (Fabrizio Gagliardi)

Not to forget the partial support for direct CERN connectivity to GEANT (we share the Swiss connection with the Swiss national network SWITCH)
And DataTAG and others


Grids

See next

LCG Deployment Status, 25 May 2004 (courtesy: Oliver Keeble)

Summary
– LCG-2 is running as a production service
– Anticipate further improvements in infrastructure
– Broadening of participation and increase in available resources
– In 2004 we must show that we can handle the data – meeting the Data Challenges is the key goal of 2004

[Timeline graphic, 2004 to 2007 (first data), with milestones: demonstrate core data handling and batch analysis; decisions on final core middleware; installation and commissioning; initial service in operation]


Some different sorts of computing for HEP

Calculations – for experiments – accelerators – theory – engineering
Data acquisition
Data storage and processing
Simulation
Interactivity
Graphics
VR
Community computing
– Email, program and data "management", (Web access)
– Remote conferencing and collaborative tools
Information access (Web)
Databases
Networking
Symbolic computing
Control systems
Lattice QCD calculations

DATA HANDLING

The major challenge by far for CERN computing lies in the data


Tapes being sent up from B513 basement

What they found when they got upstairs


3480 cartridge (200 MB)

STK silos in 2000


Data handling software

Moving between file structures (understood by machines) and structures of events grouped into various stages of analysis
FATMEN at LEP
A crucial determinant for Grid software – can we get wide consensus on how to address our data?

INFORMATION HANDLING

More of what I just mentioned
The world is hoping that computing can bring more order – and perhaps some intelligence (we have a bad track record!) – to all sorts of information. Semantic Web – Semantic Grid etc.


Text Processing

Something that we/I tend to forget
All over the world scientists "did it themselves" and took part in standardisation efforts
A good review in the 35th Anniversary Computer Newsletter Edition, by Michel Goossens

The Web

Something that we (especially Tim BL) did right
We had the problem – info sharing for LEP experiments
We had the infrastructure – reasonable networks had just started
We had just enough smart people
And we managed to put the critical software in the public domain


Robert Cailliau (+ Tim BL and Ted Nelson)

Photo: Hakon Lie


PROSPECTS FOR THE FUTURE

(COMPUTING AT CERN)

Complexity

Computing will never be easy because of complexity
We probably don't explain the source of the complexity enough
Take a serious application from 1970 – say 10'000 lines of code (l.o.c.)
Assume that every 10 l.o.c. we encounter a 2-way logical decision, where the route taken depends on the input data, or on the prior state of calculations. That is a very conservative estimate.
There are, therefore, 2^1000 possible routes through our code
2^1000 = (2^10)^100 ≈ 10^300 routes! A googol cubed.
LHC experiments will use a total of some 10^7 lines of code!!
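The routes estimate above can be checked exactly, since Python's integers have arbitrary precision; this short sketch just re-derives the slide's figures:

```python
import math

# The slide's assumptions: a 10'000-line application from 1970,
# with one 2-way logical decision every 10 lines of code.
lines_of_code = 10_000
decisions = lines_of_code // 10     # -> 1000 branch points
routes = 2 ** decisions             # exact big-integer value of 2^1000

# 2^1000 has 1000*log10(2) ~ 301 decimal digits, i.e. roughly 10^300
# possible routes -- "a googol cubed", as the slide says.
print(decisions)                              # -> 1000
print(math.floor(decisions * math.log10(2)))  # -> 301
```

The approximation on the slide comes from 2^10 = 1024 ≈ 10^3, so (2^10)^100 ≈ 10^300; the exact value is a shade over 10^301.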

[Bock and Zoll in the 1972 CERN Courier – Central Computers in Bubble Chamber analysis] "The computer spends most of its time processing unproblematic events. The programmer, on the other hand, spends most of their time foreseeing – or discovering – possible difficulties and programming the computer to deal with them. The computer programs for bubble chamber experiments start with elegant and simple ideas, and end up complex and sophisticated."


The fight against complexity

Decompose problems – several small and (fairly) simple systems tend to beat complex monoliths
Try to think clearly
Be prepared to handle errors (easier said than done)

The natural tension between particle physics and computing

CERN experiments have to be mission-driven
CERN computer specialists cannot be entirely mission-driven
– They must pay attention to how computing technology (all different wares) is evolving
– And how the economics of the industry evolves
– In order to find optimal solutions "for tomorrow"
Some elements of CERN computing need to be handled centrally, and others must be handled directly by the experiments
Finding the correct balance in that respect is an ongoing job for the IT and PH department managements


The impact of HEP computing on the wider world?

Impact on other sciences
Impact on society in general
Impact on developing countries

The +ve side of computing
– Info access
– Great help for organising meetings and travel
– Enriching
The –ve side
– Spam
– Crooks
– Mind-numbing

Programmers

To be a good programmer some part of your mind must sympathise with a machine – which has to do what it is told, so must be told – in all gory detail – exactly what it must do.
If you can communicate well with a machine can you also communicate well with people?
And can you communicate without your machine?


The companies

In 50 years of CERN computing several companies have been major players – and we should probably say major contributors to the work
CDC, Cray, DEC/Digital, HP, IBM, Norsk Data
– At various times DEC and IBM had Joint Project teams of a serious size
Now Open Lab provides a more general possibility for such interactions
Is that all only on the side of the hardware? What about software suppliers – Oracle, Microsoft?, Numerical Algorithms Group (NAG)?

Our successes and maybe some failures

Web has to be the biggest success
But also CERNLIB as a sharing mechanism
And a lot of widely-used packages
– REAP, THRESH, GRIND, SLICE
– HBOOK, PAW
– GEANT, GEANT4
– Data acquisition software (PDP, Nord-10(0) and 50(0), HP, VAX)
– Packages from outside "pure HEP"
And our external networking (especially CIXP)

Why did we have less impact (overall) on the development of early Internet standards?
– CERNET (1973 (concepts) to 1977 (deployment) to 1988 (turn-off)) served our onsite needs superbly but was CERN-specific
– HEP deployed DECnet very successfully in the wide area (plus wide use of mainframe solutions) and was slow to look at "peer-to-peer"
– But we were probably slow to catch on to INTER-networking


… some failures

We have been quite strong in the "industrial" deployment of software development, and software development distributed over the wide-area, but have never been recognised as such by the broad community. Why?
– PATCHY was extremely advanced
– As were many of the ideas for machine independence behind "CERN Fortran"
– And distributed software development

I think that our problem here has been that
– (a) the people involved were very busy – driven by experiment deadlines and
– (b) we did not see it as our job to make this information available to others – either potential users in other sciences or to computer scientists and (commercial) software engineers who should have been kept informed

Computing tends to be too introverted
As I said at the start – the real problem has always been, in my opinion, getting people to collaborate on a solution

But the successes predominate

By a long way
And it was:
– Challenging
– Fascinating
– Lots of hard work
– And great FUN


WIM KLEIN

Intro for the younger members of the audience:
In the early days computing was quite difficult. Wim Klein was a "human computer" who provided numerical assistance, especially to generations of CERN theoretical physicists

He retired from CERN in 1976


FINAL REMARKS

The PhotoLab Archives – time is running out

There is a lot of fascinating photographic material in the Photolab Archives, but only a tiny fraction has been scanned, and the rest is almost entirely without any descriptions.
From the early years of CERN (1950s and 60s) many of the people shown on the photos are now >80 (or dead) and even the people who have some idea what the pictures might be about will soon retire.
We urgently need to do a mass scan of the photo archive and make it Web-available so that past and present CERN staff are able to provide descriptions.
We have to find the money to do this – it's our contribution to preserving a minimum "historical heritage"


Take-home messages

Computing for particle physics is challenging – and it is likely to stay that way
One big change of the next decade will be the move to fully mobile computing
We can also hope that there will be moves to improve the way in which most information is organised and made accessible
It will be important for CERN (and particle physics) to invest enough in computing infrastructure (including people, servers, network bandwidth and performance monitoring) since that is vital for each person's professional working efficiency

And make sure that computing stays FUN for both users and providers

THE END

Or rather, the end of what I dare detain you with
