This transcript may not be quoted, reproduced or redistributed in whole or in part by any means except with the written permission of the American Institute of Physics.
This transcript is based on a tape-recorded interview deposited at the Center for History of Physics of the American Institute of Physics. The AIP's interviews have generally been transcribed from tape, edited by the interviewer for clarity, and then further edited by the interviewee. If this interview is important to you, you should consult earlier versions of the transcript or listen to the original tape. For many interviews, the AIP retains substantial files with further information about the interviewee and the interview itself. Please contact us for information about accessing these materials.
Please bear in mind that: 1) This material is a transcript of the spoken word rather than a literary product; 2) An interview must be read with the awareness that different people's memories about an event will often differ, and that memories can change with time for many reasons including subsequent experiences, interactions with others, and one's feelings about an event. Disclaimer: This transcript was scanned from a typescript, introducing occasional spelling errors. The original typescript is available.
In footnotes or endnotes please cite AIP interviews like this:
Interview of Donald F. Geesaman by Catherine Westfall on 2012 June 18,
Niels Bohr Library & Archives, American Institute of Physics,
College Park, MD USA,
For multiple citations, "AIP" is the preferred abbreviation for the location.
Dr. Geesaman, a distinguished Argonne National Laboratory physicist and former head of the Argonne Physics Division, explained the beginning of his career and what he remembers of the early days of the experimental program at the Thomas Jefferson National Accelerator Facility (JLab). He also discussed the major achievements of the program.
This is Catherine Westfall. I’m with Don Geesaman. We’re at Argonne, and it is the 18th of June, 2012. Could you just very quickly say, you got your bachelor’s degree from some engineering school in Colorado. It’s short, because then you came here.
I got my undergraduate degree in Mineral Engineering, which was Engineering Physics at Colorado School of Mines, in 1971. Then I went to the State University of New York at Stony Brook to become a nuclear theorist, which quickly converted into becoming an experimentalist. Got my Master’s and PhD at Stony Brook in ’76. Came to Argonne as a postdoc. I was promoted to staff in 1978.
And then you were division director from?
I was division director from 2000 to 2007.
OK, thanks. Now we can skip to the beginnings of the JLab experimental program.
So the electron accelerator had been proposed in 1979 by the Nuclear Science Advisory Committee (NSAC) report “A Long Range Plan for Nuclear Science.” And the community then got together to outline what the scientific case would be, and that was done with the Blue Book that came out of the MIT workshop. Most of the physics that was in that was of course the physics we were already doing. The point was to do it much, much better. So we considered topics to explore the structure of the nucleon, particularly nucleon form factors, but most people didn’t think that would lead to discovery; it was just doing more precise measurements and pushing measurements to higher momentum transfer. There was also a lot of effort aimed at exploring the physics of nuclei: single-particle levels, structure of nuclei, and nuclear shapes. Then we also wanted to use the nucleus as a laboratory, so we talked about parity violation and weak interactions, issues being studied right then at Los Alamos at LAMPF, and hadrons in the nuclear medium. So there was a nice assembly of physics topics. If you look at that, you can see very clearly how that led to the Jefferson Lab program. The Barnes Committee was then formed to decide what the parameters should be for the accelerator. That was in 1982. I think it was heavily influenced by Stan Brodsky and his ideas about QCD structure — he believed the QCD structure, the core structure of hadrons, would be visible at very low momentum transfer. This was a time when most of the rest of the nuclear community felt that the nucleus was made of protons, neutrons, and mesons. At some point you would clearly make contact with quarks, but it wasn’t clear you’d be able to see them directly at the energy scales of this new accelerator. So the accelerator was originally thought to be a 2 GeV accelerator, and the Barnes Subcommittee members were the first ones to decide that that wasn’t a big enough step.
They (and particularly Brodsky) wanted to go to higher energies and argued that a 4 GeV accelerator was needed. So all of us who were working on building proposals — the Virginia group, Jim McCarthy, the Argonne group — went back and said, “All right, we can come up with a proposal for a 4 GeV accelerator.” The question was always, “What is the right number?” You know, two, four, six? If you intended to build a 2 GeV accelerator, you would have probably focused much more on the properties of nuclei in the research program. By going to a 4 GeV accelerator, the focus was naturally more on the high-energy range and the properties of baryons and mesons. And of course what then happened was that MIT, which had been viewed as the leading candidate for the facility, decided that their accelerator was not so easily upgradable to 4 GeV. So they devised a 2 GeV proposal, and because the Barnes Committee had asked for 4 GeV, their proposal was not as competitive. I think that’s why the competition ended up being basically between Virginia and Argonne.
So that was the contest between Argonne and Jim McCarthy at UVA.
So the next step was really the revolution in the perception in the community, which happened when the European Muon Collaboration (EMC) showed that the quark structure of a nucleon bound in a nucleus was different from that of a free nucleon. And that offered a completely different perspective. I think of it as a paradigm shift. Some people think that’s too strong. But at that time, we had major problems in nuclear physics. There were fundamental aspects of the nucleus that we were calculating and we were getting the wrong answer. And if you say that the quark structure of a proton changes inside a nucleus, then our conventional way of doing nuclear calculations might, in fact, not be right. And so these QCD effects then could explain why we have trouble getting the binding energy of helium-4, the binding energy in nuclear matter, or why we have problems with other aspects of nuclear structure. The shapes of nuclei did not follow mean field calculations at the very central densities. They were quite successful on the surface, but not in the center of the nucleus. And that was where you might expect the densities to be highest and things to change the most. So that then really invigorated the idea that we needed to understand the QCD structure of nuclei, and if we had to do that, it meant we also had to understand the QCD structure of protons and neutrons and mesons better. So the community went off and tried to do what experiments it could with existing facilities. So I went to Fermilab and did deep inelastic muon scattering. We had the Nuclear Physics at SLAC (NPAS) program at SLAC with Roy Holt. This very successful experiment showed that, in fact, when you do photon-induced deuteron breakup, you saw quark counting behavior, meaning that the energy dependence of the cross-section seemed to count the number of quarks in the deuteron. And that was viewed as the clearest evidence that you could directly probe QCD effects in nuclei.
It actually still to this day is a sort of unique piece of physics that seems to show quark counting behavior, even though we know many other regions where people expected quark counting behavior and don’t see it. So it was perhaps a bit of serendipity. But in any event, that’s one of the early experiments that was run, to extend those measurements in deuteron photodisintegration. I was doing experiments at Bates to understand proton propagation in nuclei. That was a quite successful experiment that showed that protons have a very long mean free path in nuclei. And then there was the prediction from QCD that you could see color transparency — you should see the proton mean free path get longer for a proton that’s made with a hard, short-distance electronuclear reaction. Bob McKeown and Richard Milner, who were then at Caltech, proposed the experiment at SLAC to search for this color transparency. A bunch of groups around the country, including all our Argonne staff members, joined in that. Richard Milner then moved to MIT. That experiment was very successful, but showed no evidence of this QCD-expected behavior up to a certain energy. My point of view was that we needed to study phenomena both at the quark level, with very high energy probes directly probing the quarks, and at the proton and meson level. We could only believe we had the right answer if we got a consistent picture — that, for example, the size of the proton changed or some other behavior happened that was consistent. So I was doing these hard-scale experiments, hard meaning short-distance experiments. At Argonne we were also involved in the HERMES experiment. So what we proposed was a series of experiments at Jefferson Lab that would look at phenomena at the hadronic level: proton knockout reactions to study color transparency as well as nuclear structure. That would also mean studying the pion field inside a nucleus.
That was, in fact, done in an experiment led by Hal Jackson. To study kaon production and lambda-nucleon interactions, there was also an experiment led by Ben Zeidman. The idea was also to extend Roy Holt’s measurements on deuteron photodisintegration to even higher energy. Because my experiment, (e,e’p) proton knockout, and Roy’s experiments were perhaps the simplest experiments you could do, they were the ones that were chosen for the commissioning experiments in Hall C, the first hall that would get beam. (This was even though my experiment had very low priority, a B-minus priority, I think, although Roy’s experiment did have an A priority, as I recall.)
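[Editor’s note, not part of the spoken record: the “quark counting behavior” Dr. Geesaman describes is the constituent counting rule of Brodsky and Farrar, under which an exclusive cross-section at fixed center-of-mass angle falls as a power of the squared total energy s set by the total number n of elementary fields in the reaction. For deuteron photodisintegration this predicts:]

```latex
\frac{d\sigma}{dt}\left(\gamma d \to p n\right) \propto s^{-(n-2)} = s^{-11},
\qquad n = 1_{\gamma} + 6_{d} + 3_{p} + 3_{n} = 13
```

[The SLAC NPAS and later JLab measurements mentioned here tested this s^{-11} scaling of the cross-section.]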
Yes, you had a B-minus and he had an A priority. I am interested in understanding the stage up to the time of the first Program Advisory Committee, PAC 1.
So what I just said explains much of the lead-up to PAC 1, which was probably in the late ’80s. The original PACs didn’t consider proposals. Instead, they considered the framework to use for the selection of experimental equipment. And then in ’89 to ’91, they started considering proposals. So for example, the D(γ,p) proposal was, I believe, approved at the first PAC that considered proposals. I think it was actually PAC 4 or something like that. My (e,e’p) experiment, because it was a lower priority, actually wasn’t approved until two years later, when we had the results from the Bates experiment and some of the results from SLAC, and so people could see the value of doing this experiment. So the Argonne group had five different experimental proposals, all aimed at exploring phenomena at the hadronic and the quark level. We were not involved in Hall B because I was not convinced that the baryon spectroscopy program would yield much that was new to teach us about the structure of the hadron. There were single experiments that I could see might be very important, but it just seemed to me it was better to study form factors and deep inelastic scattering. So the experimental program started. It took off. There was, of course, a huge backlog because the PAC kept approving experiments, and that was good and bad. It brought in a large community, but it meant that for a lot of the experiments there was not much running time. So Larry Cardman, when he took over as Associate Director for physics, instituted the Jeopardy program to try to deal with this backlog, to make sure that we were running the best experiments. And it seemed the users wanted to have a large number of experiments approved, and if you were going to take that tack, I thought that was a perfectly reasonable approach. In high-energy physics, or at Brookhaven, the approach is to approve one apparatus, like the CLAS detector in Hall B.
You know, at RHIC you have two big detectors, and then you might have a beam use proposal saying this is what you’re going to do, as opposed to individual measurements. But the perception was that the community at Jefferson Lab didn’t want to go that way. So that was fine. I think things developed. In my opinion, as a result of the leadership from Bob McKeown and then Doug Beck later on in parity violating electron scattering, it was very clear that the parity violation measurements, though difficult, would really provide some unique insight on the proton structure. And that’s borne great fruit. It turned out that Jefferson Lab was a wonderful place to do that. They produced a pure polarized beam, in large part thanks to Charlie Sinclair, who led the source group, and then his successor, Matt Poelker, who was a postdoc with us at Argonne. Actually, when I was chair of the users group — which was around 2001 or a little earlier — we decided we needed to start thinking about what the next set of experimental equipment should be. Experimenters were having great results with the initial experiments that had been envisioned, but we decided we should be thinking about how to broaden the program. This was when Hermann Grunder and Nathan Isgur were contemplating how the lab as a whole should go to the next project. So they incorporated our workshops, actually, in developing the case for a 12 GeV upgrade. Higher energies had been talked about, even before CEBAF turned on. The question was whether to stick with 4 GeV or to step further into the QCD regime. They were able to push the accelerator to close to 6 GeV. That lets you do some other experiments. 12 GeV opened another set of experiments.
When I was chair of the Program Advisory Committee, Nathan Isgur instituted a series of workshops that we’d have before each of the Program Advisory Committee meetings, or occasionally after each one, where you’d take a sector of the program, and the PAC would have presentations, just informally discuss it, and try to list where are the new ways it could go, and what are the ideas? It was actually something that I enjoyed as PAC chair, because I was also put in the position of having to summarize these discussions, and I really found that they helped crystallize the PAC’s ideas about what was important to do in the future. So Nathan brought the 12 GeV plan to us, and the ideas they had. We helped contribute to that, tried to flesh ideas out, tried to say what needed more work, and tried to give the lab as much feedback as we could about how to develop the arguments for going to 12 GeV. The trick was, of course, always to find something that was unique. You could clearly do the same kind of experiments you did at higher momentum transfer, and there was a lot of value in that, but to sell a new major project, you have to be able to argue that you will do work that is unique. What was near and dear to Nathan’s heart was baryon spectroscopy, and as I said, I was not a huge fan of that. But that was the area where he could do something that was truly unique, by doing meson spectroscopy with polarized photon beams, in particular looking for exotic mesons and hybrids. And there you could lay out a scientifically credible case that to do that program, you really needed energies of about 10 to 12 GeV. If you had less than that, you couldn’t do that program well enough to get results. So that provided a threshold to determine what the capabilities of the accelerator should be, and thus the design, so that plans could be solidified.
Whereas if you didn’t have that sort of step, then you could always argue, “Well, you could just go 8 or 10 or 12 or 20,” and it was always difficult to make the argument about what the top energy should be. This process gave us a real step function so that we could say, “You have to go here.” And that, in turn, put a scale to the cost of the project, and then other planning was coordinated around that starting point.
Okay. So that’s how you got to the 12 GeV.
It was a combination of scientific and technical reasons. 12 GeV was, in some sense, the maximum credible energy you could manage with the current tunnels and the performance of superconducting cavities, coupled to a physics argument: that you needed at least that energy to be able to do a certain piece of physics. That’s a very simple argument — if you want to look for hybrid mesons, then polarized photons are the best probe. And no one had done that before, so it was virgin territory. So there’s no question it could be done much, much better, and if we see hybrid mesons clearly, then it would be a very nice scientific coup for Jefferson Lab.
So it’s interesting. So there was this sense that you should be looking for what we might call exotic phenomena. I remember talking with Dirk Walecka. He always thought it was most important to simply make these finer measurements rather than search for new phenomena.
Well I actually tend to agree with Dirk’s point of view rather than Nathan’s.
How interesting. There is this tension, then, inherent in this branch of physics.
Right. It’s certainly true. And again, the problem is always: what sets the parameters that determine how to design the accelerator? How do you justify picking this target for your accelerator? And it was Nathan and the meson spectroscopy people who ended up making the underlying decisions. Alex Dzierba from Indiana had been doing experiments at Brookhaven looking for exotic mesons, and there had been some controversy about the experiments. There was a collaboration that fragmented, and only part of the collaboration published some of the work and the others repudiated it. It’s a very subtle and sophisticated analysis, particularly if you’re using pion beams, as they were, because the probability of forming these states with a pion beam is relatively small. And that’s the real advantage of photon beams. With a pion beam you have a quark and antiquark coupled to zero angular momentum. In all the low-lying hybrid states, the quark and antiquark have to be aligned. So it means with a pion beam you have to flip the spin of one of the quarks, and that’s not very easy to do. With the polarized photon beam, you immediately produce two quarks that are aligned, so all you have to do is excite the glue. So that was the scientific argument, and it’s a very simple scientific argument. It’s possible that nature will make all these states. The issue is: are these states going to be narrow? If these states are narrow, then they will be seen very clearly and everyone will agree with what they are. If the states are very broad, then there is considerable argument about what you’ve learned, because questions are raised about whether you have properly extracted signal from background and that sort of thing.
What do you think are the highlights of the entire 16 or 17 years?
To my mind, the whole program on nucleon form factors has been a highlight because the result was very unexpected. And again, the original experiment that used polarization transfer to study nucleon form factors was only given a B-plus priority. People said, “Yeah, this is nice, we’ll measure it more accurately, but we’re not going to find anything new.” And of course, they found something completely different. So I would point to that. The strange quark form factors have totally changed our picture of the QCD structure of the proton. And that’s still very challenging for many of the modelers. That’s not to say there weren’t modelers before it happened who predicted you would see such behaviors. Those models have gone out of fashion, because in the old days they looked quite different. So I think the parity violation measurements of the strange quark form factors, the electromagnetic form factors, have clearly been a major success. I think the high momentum transfer data on deuterium, helium-3, helium-4, is sort of in the Walecka picture, which gives us insight into how QCD manifests itself in nuclei. The work on short-range correlations has been, I think, very important, because we knew from theory that short-range correlations had to be there. We knew they had a renormalizing effect in changing spectroscopic factors. But to directly see the short-range correlations in (e,e’), two-nucleon, and three-nucleon knockout reactions I think has been very important. One of the issues that is relevant for this is, of course, that at the same time effective field theories have become very influential in nuclear physics. Effective field theories are based on very long-distance phenomena, and they integrate over short-distance behavior.
And so I think the whole discussion of how short-distance phenomena are important in nuclei versus how they show up in these effective field theory descriptions has taught us something about both effective field theories and about what we’re actually measuring. So I think that’s important. The other things that I think are quite important are the few quantitative tests of QCD. A recent example is the measurement of the axial anomaly — really a measurement of the lifetime of the π0 — for which there’s a very clear prediction from QCD. There was a more accurate measurement of this that resolved some of the tension: there were two previous measurements that had large disagreements with each other and with the theory. The JLab experiment showed that QCD was giving a quantitative result. I think Kawtar Hafidi’s (ANL) measurement of color transparency in electroproduction of rho mesons is, again, a prediction of QCD. This is an effect that we actually published from Fermilab in 1995, but statistically our results were limited. It was only two to three sigma, depending on how you ask the question. This experiment has shown very nicely that in mesons we see color transparency. There’s still no evidence of color transparency with baryons, and I think that that is certainly not clearly understood. I give a talk about how QCD has misled us on a variety of things, and about some of those trade-offs. The thing I’m sort of disappointed about, because I’m in this business to study nuclei, is what is still unresolved — or undelivered — of what JLab promised. That is, that we would really learn something convincing about the nucleus. I think part of that was the fact that, of the high resolution spectrometers in Hall A, which were supposed to do this physics, one of them was not able to go up to the momentum it was designed for, and the resolution was not quite what people had hoped.
And it was partly because the community moved away from having a taste for doing these nuclear electron scattering experiments. I think the link between what’s done at Jefferson Lab and what’s done at NSCL at Michigan State or at ATLAS at ANL is very strong, and I would have certainly liked to see more beautiful data of that nature. Certainly people appreciate the fact that the parity violating measurements on lead that measure the neutron radius of lead have a big impact for low-energy nuclear physics. And that experiment has barely started. It really needs another factor of two in precision, which is exactly what the PAC said when we approved the experiment. It was only worth doing at a certain level of precision, and they haven’t reached that yet. So there are other things I wished could have come out of Jefferson Lab, but I think it’s had a long and very successful program. I think for 12 GeV, there’s a clear path forward, a lot of excellent experiments. I think Argonne people are spokespersons for about ten experiments now. So we’re clearly very excited about the program for the future.
This is the “Report on the Workshop on Future Directions in Electromagnetic Nuclear Physics,” commonly called the Blue Book because its cover is blue.