This transcript may not be quoted, reproduced or redistributed in whole or in part by any means except with the written permission of the American Institute of Physics.
This transcript is based on a tape-recorded interview deposited at the Center for History of Physics of the American Institute of Physics. The AIP's interviews have generally been transcribed from tape, edited by the interviewer for clarity, and then further edited by the interviewee. If this interview is important to you, you should consult earlier versions of the transcript or listen to the original tape. For many interviews, the AIP retains substantial files with further information about the interviewee and the interview itself. Please contact us for information about accessing these materials.
Please bear in mind that: 1) This material is a transcript of the spoken word rather than a literary product; 2) An interview must be read with the awareness that different people's memories about an event will often differ, and that memories can change with time for many reasons including subsequent experiences, interactions with others, and one's feelings about an event. Disclaimer: This transcript was scanned from a typescript, introducing occasional spelling errors. The original typescript is available.
In footnotes or endnotes please cite AIP interviews like this:
Interview of Peter Nugent by Ursula Pavlish on 2007 July 30, Niels Bohr Library & Archives, American Institute of Physics, College Park, MD USA, www.aip.org/history-programs/niels-bohr-library/oral-histories/32975
For multiple citations, "AIP" is the preferred abbreviation for the location.
How Dr. Nugent joined the SCP in 1996; his transition from supernova theorist to theorist-observer; K-corrections; his work on cosmological fits; writing the Nature paper of 1998; computer visualizations of data; work with Adam Riess of the rival team; observing runs in Chile; the turning point in January 1998 toward believing the accelerating-universe result; computer simulations; blinking of data.
It is July 30, 2007, and I am here at Lawrence Berkeley National Laboratory, in the Computational Sciences division, to interview Dr. Peter Nugent. He was in the Supernova Cosmology Project collaboration, and still is. We will be talking today about the history, up through the 1998 discovery. First, Dr. Nugent, may I ask how you came to join the SCP?
My graduate school research was on spectrum synthesis models of Type Ia supernovae. I had applied for quite a few jobs in 1996. At that time, Saul Perlmutter had just recently come out to the University of Oklahoma, probably the year before, to give a lecture. He heard that I was looking for a job, and he said, “Well, why don’t you come on an observing run with us in Chile.” They were just finishing up their first seven supernovae, and starting in on a few more in the spring of 1996. He invited me to an observing run. That was my first trip down there. I liked it. I got to meet Isobel Hook and Alex Kim, two other members of the project. Later that May, the National Energy Research Supercomputing Center got transferred to Lawrence Berkeley Lab, and Saul was able to obtain a grant through them to create a position for me. I was brought in as their token theorist for Type Ia supernovae. At the time, nobody in the group had a background in Astronomy with respect to supernovae at all. I was the first one brought in. I was quickly shanghaied into going on observing runs. The theorist got transferred to becoming an observer. That is how I started out.
Was that an efficient use of your theoretical background?
[laughs] Back in those days, it was absolutely and completely chaotic to do these observing runs for supernovae. It was the best use of manpower at the time. It kept me busy, that is for sure. I still continued working on my other assignments, in 14-hour days. It kept things going. I learned a lot about observing. It gave me the perspective to think about how to put together an observing run to address theoretical questions. Vice versa, it gave me the ability to talk to others in the group about the latest and greatest theories on Type Ia supernovae. I could say, “If you want to search for something, this is how I would design a project.” Bringing me into the group made it a lot easier to communicate between theorists and observers.
When you say that you were also working on other things at that time, was that a follow-up on your dissertation, on spectrum synthesis models?
Yes. I was also in a unique position of being one of the few people who collaborated with a lot of other people on the High-z Team. During that time, when I first joined the lab, Adam Riess had also just come out to UC Berkeley. We became friends and started working on a bunch of projects together. We worked with Alexei Filippenko and Robert Kirshner and many other people in the supernova community. I kept up all of that work at the same time, while working on the high-z supernovae for the Supernova Cosmology Project.
Is your work in theoretical models of supernovae work that impacts the team?
Yes. One of the things that I did as part of the project was to look at the spectra of the supernovae as a function of time, especially in the ultraviolet. One of the things we realized, as we went to higher and higher redshift, is that we would get more and more of the ultraviolet light of the supernovae, compared to what they would get for the nearby objects. They were typically looking in the blue or in the red. We were looking in the ultraviolet. One of our tests was to figure out what was going on in the ultraviolet, what were the variations, and how they would affect the cosmological parameters from the measurement. This all falls into a subject called K-corrections, correcting for the light that you observe, compared to the light that you are trying to compare to, since we were never going to get a perfect match.
What does the K symbolize?
I think Sandage wrote the original paper on K-corrections. K was just the letter that they pulled out of their hat to symbolize the correction for the mismatch between the filters at low redshift and at high redshift.
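As a point of reference, and not a quotation from the interview, the single-band K-correction is commonly written as follows; the cross-filter generalization the group needed compares a high-redshift observation in one filter to the rest-frame magnitude in another:

```latex
% Single-band K-correction: F is the supernova's rest-frame spectral energy
% distribution, S is the filter transmission, z is the redshift.
K(z) = 2.5\,\log_{10}\!\left[(1+z)\,
       \frac{\int F(\lambda)\,S(\lambda)\,d\lambda}
            {\int F\!\big(\lambda/(1+z)\big)\,S(\lambda)\,d\lambda}\right],
\qquad
m_{\rm obs} = M + \mu(z) + K(z),
```

where mu(z) is the distance modulus. The color dependence discussed later in the interview is about how uncertainty in F propagates into K.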
Were you doing that with old data as well as with new data?
We were using all the data that we could get a hold of. There was a big project going on imaging nearby supernovae that Mario Hamuy had going. Also, there was a whole bunch of data from the SInS group, the Supernova Intensive Studies led by Bob Kirshner out of Harvard, which had obtained ultraviolet spectra of nearby Type Ia supernovae and other supernovae. They studied supernova 1987A as well. Then, the International Ultraviolet Explorer, IUE, obtained a lot of data from the 1980s through the early 1990s. What I did was put together a compendium of all this data, and I started to look for trends between these supernovae and their spectral properties, and also between the supernovae and their colors, to see what was going to affect us in determining these K-corrections. This was something that I then shared with both teams. I am always the type of person who shares their science. In each of the papers, both the paper by Adam Riess and the one by Saul Perlmutter, they make reference to the paper that I was putting together. A lot of the work that went on in the ultraviolet was based on what I had done with my theoretical models. It was a nice combination of the work I did before joining the SCP and the work I learned to do here.
Was your model of the ultraviolet spectrum of the supernovae, a model of how the supernovae explode?
Yes. Basically, what it is, is that somebody will produce a model for a supernova explosion. Then, I will take a snapshot at 10 days after explosion, and 20 days after the explosion, and then I will do a spectrum synthesis run. That is a code that takes in all the abundances of all the materials, the densities, how fast they are moving, and then computes the theoretical spectrum from that. We can vary things like the brightness of the supernovae, and we can see how the colors will change throughout the supernovae. It was this that led me to look for certain trends in the supernovae spectra, to figure out what we would see or not see at high redshift.
Did you mostly work alone, or did you collaborate with people on this project?
I worked a lot with Alex Kim on that project. It was a big effort between the two of us. He had started some of the work before in a paper he had done, and I took it to the next level. A lot of the work was solitary; working by myself with models. Then, I would interact by talking with Alex and Saul.
You say that was one of your goals in the project. There were others?
That was one. The other one was doing all the cosmological fits. We had a nice allocation of supercomputing time. Saul is definitely one for examining every possibility: let us try this, let us try that, and let us see what happens if we include these supernovae, throw out those supernovae. If you look in our paper, there are a dozen different cases where we tried different things, to split up the data in as many ways as possible. I was one of the people running all of the fits to that data.
What was it like as all the fits were coming in? Was there a lot of anticipation?
It was so much drudgery. [laughs] It is one of those things where I think early on we knew things were changing from the way people had thought about Astronomy. Back in 1997, we discovered supernova 1997ap, which we published in a Nature letter. At that time, when we were putting that together, we started to realize that, well, this guy is really off the Hubble Diagram compared to where we thought it should be. When we saw that, and we had better quality data for supernovae, where we had spectra for all of them, so we knew they were all Type Ia's, we knew they were normal-looking Type Ia's, when we started to refine the data sets it became clear that this is where things were headed. There was a paper that Adam Riess and I put out just before all this, “Snapshot Distances to Type Ia Supernovae,” and even there, in that paper, you could get a feeling, even though it was not the full data set, that good quality data, even in a small amount, were hinting at something different than what was the standard. It was a slow learning process. It came on and off. Once you get more and more supernovae (one thing that has always amazed me as a theorist is just how similar these Type Ia supernovae are), once you get them and you get good quality control on your data, it was really just driving us in this direction. I do not think there was a “eureka” moment, per se. I think there was just a constant buildup of more and more data leading in that direction. For me, it also made a lot of sense. I had started working on the Hubble constant and using Type Ia supernovae to measure the Hubble constant. There was a huge battle between the people who believed in a Hubble constant of 50 and the people who believed in a Hubble constant of 100. The Type Ia supernovae suggested a lower value, around 70. A value somewhere between 60 and 70 is what most everybody who has published a supernova Type Ia paper comes up with. With that, there was a huge problem for the age of the universe: if Omega_M was one, then the age of the universe was inconsistent with that Hubble constant by a lot. Because of that, people in my field who believed that the Hubble constant was that value either had to bite the bullet and say that there was something really fundamentally wrong with our understanding of the age of stars, or there was something wrong with our understanding of the cosmological parameters Omega_M and Omega_Lambda, since they control what the age of the universe is as well. To me, personally, this made sense. It tied up a whole bunch of things. That said, I was rather naïve when it came to Cosmology and understanding what a radical change this was from the perceived view at the time. To me, it seemed that a lot of things fell into place.
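To make the age argument concrete (a back-of-the-envelope number, not a figure quoted in the interview): for a flat, matter-only universe the age is two thirds of the Hubble time,

```latex
% Age of an Omega_M = 1, Omega_Lambda = 0 (Einstein-de Sitter) universe:
t_0 = \frac{2}{3H_0} \approx \frac{2}{3}\times 14\ \mathrm{Gyr} \approx 9\ \mathrm{Gyr}
\quad\text{for}\quad H_0 \simeq 70\ \mathrm{km\,s^{-1}\,Mpc^{-1}},
```

which is shorter than the stellar and globular-cluster age estimates of the time, whereas adding a cosmological constant (for example Omega_M near 0.3 and Omega_Lambda near 0.7) pushes the age up toward 13 to 14 Gyr with the same Hubble constant.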
Do you have particularly vivid memories or recollections of the process of scientific discovery?
Yes. All through the spring of 1998 as we were putting the paper together, I remember that Saul was definitely one for saying, “Okay we tried the analysis this way, now let us do it this other way.” I did not have a vacation for my first four years at the lab here, without Saul calling me up while I was on vacation. I vividly remember receiving calls in the spring of 1998 much to my wife’s chagrin, while I was on vacation in Maine, with Saul saying, “We need to run these fits, we need to do this, and we need to do that.” I remember that well. It was a very good process. Saul left no stone unturned as he was trying to make sure that these results were the right ones.
Did you have to then continue that analysis on vacation?
Oh yes, oh yes. It was: log in over the old dial-up modems, run it, point Saul to a file where I had the runs stored. So, yes, there was a whole bunch of that. I have saved all my email ever since email started, and I was looking back at those times. There were a lot of emails back and forth between Saul and me, saying: we should look at this, we should look at that. I realized as I was looking back that I was supposed to be on vacation at the time.
Did you do this also while writing the paper? Did you participate in the actual writing of the paper?
Yes. I wrote a large chunk of the Nature paper that came out in 1998, on supernova 1997ap. In the ‘42 paper’ that came out, I wrote a couple of small sections, but I was mostly involved in pushing through the various cosmological fits to the subsets of data that Saul wanted analyzed. That was my biggest chunk of work there.
Did you use visualization at all when you were doing that?
Oh yes, lots of it, lots of it. It was all: how do you take 3D, 4D grids, and slap them down to look at them in different ways. It was writing a bunch of code first to do the fits, and then writing a bunch of code to visualize what they looked like.
When you write code like that, do you know what you want it to look like before you write the code?
No, God no. [laughs] You say: I want to see Omega_M and Omega_Lambda in this plot and I want to reduce all these variables. But you leave things wide open and let things fall where they will fall.
I am familiar with this basic Omega_Lambda over Omega_M plot, and I was talking with Saul Perlmutter earlier today, and he said that he decided that Omega_M should be on the x-axis. That is in 2D.
Yes. He wrote a paper with Ariel Goobar on that and that was the way that they decided to show it. However, there are a few other parameters in there that have been, how should I say it, compressed down, so that all their uncertainty is reflected in that graph. So, there is the relationship between the brightness of the supernova and its light curve, which is one of the parameters in the fit there. There is also the uncertainty on the Hubble constant; that is put in there as well. Now, you do not have to know the absolute value of the Hubble constant. You can pick any one and you will still get the same plot there. But, the uncertainty in that value propagates through to those plots. Then, when we tried other fits as well, fitting what the extinction law would look like, all of those parameters are called nuisance parameters. They are not things that are interesting to us cosmological parameter-wise, but they do influence the widths of those curves, the shapes of those curves, and so we would constantly work out ways to compress those down into those axes.
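A minimal sketch of the kind of grid fit being described here, with the combined absolute-magnitude and Hubble-constant offset treated as a nuisance parameter that is minimized out at each grid point. The data arrays are placeholders, and the stretch and extinction corrections the SCP also fit are not modeled; this is illustrative, not the group's pipeline:

```python
# Illustrative chi-square grid over (Omega_M, Omega_Lambda) for a toy Hubble diagram.
# The supernova arrays below are made up; the "script-M" offset (absolute magnitude
# plus Hubble constant) is a nuisance parameter minimized analytically at each point.
import numpy as np
from scipy.integrate import quad

def hubble_free_lumdist(z, om, ol):
    """Luminosity distance in units of c/H0 for matter density om,
    cosmological constant ol, and curvature 1 - om - ol."""
    ok = 1.0 - om - ol
    E = lambda zp: np.sqrt(max(om * (1 + zp)**3 + ok * (1 + zp)**2 + ol, 1e-12))
    chi, _ = quad(lambda zp: 1.0 / E(zp), 0.0, z)
    if abs(ok) < 1e-8:
        s = chi
    elif ok > 0:
        s = np.sinh(np.sqrt(ok) * chi) / np.sqrt(ok)
    else:
        s = np.sin(np.sqrt(-ok) * chi) / np.sqrt(-ok)
    return (1.0 + z) * max(s, 1e-12)

# Placeholder data: redshifts, K- and stretch-corrected effective magnitudes, errors.
z_sn  = np.array([0.10, 0.35, 0.50, 0.65, 0.83])
m_eff = np.array([19.0, 21.9, 22.8, 23.4, 24.0])
sig_m = np.array([0.15, 0.20, 0.20, 0.25, 0.25])
w = 1.0 / sig_m**2

om_grid = np.linspace(0.0, 1.5, 61)
ol_grid = np.linspace(-0.5, 2.0, 101)
chi2 = np.zeros((ol_grid.size, om_grid.size))

for i, ol in enumerate(ol_grid):
    for j, om in enumerate(om_grid):
        # Model magnitude up to an additive constant (the nuisance offset).
        mu = np.array([5.0 * np.log10(hubble_free_lumdist(z, om, ol)) for z in z_sn])
        offset = np.sum(w * (m_eff - mu)) / np.sum(w)   # best-fit offset at this point
        chi2[i, j] = np.sum(w * (m_eff - mu - offset)**2)

# Confidence contours in the plane follow from delta-chi-square levels
# (2.30, 6.17, 11.8 for 68.3%, 95.4%, 99.7% with two interesting parameters).
i_best, j_best = np.unravel_index(np.argmin(chi2), chi2.shape)
print("Minimum chi-square at Omega_M = %.2f, Omega_Lambda = %.2f"
      % (om_grid[j_best], ol_grid[i_best]))
```

Minimizing or marginalizing over the nuisance offset analytically is what lets a three- or four-dimensional grid be shown as two-dimensional contours, which is the "compressing down" described above.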
Are your visualizations forms of this one?
Yes. And then, of course, all of the other parameters too. We would look at: what does the uncertainty in the stretch-peak brightness relationship look like? We also worked on the other parameters. Back then, it was called Quintessence. It was the search for the answer to the question: is there any temporal evolution in the cosmological constant? Is it truly a constant? That is another dimension that you would look for. You would say: okay, I am going to cut out that part of the Omega_M, Omega_Lambda plane where their sum total is one. That is the preferred view, for some people. Then, you would say: okay, does it look like Omega_Lambda is varying with time? So, we would do those plots as well.
You did not publish many of those, right?
Oh, God. There were hundreds and hundreds of plots that we did not publish. The one for q, or Omega_w, or whatever they wound up calling it, there is a plot in the very end of the ‘42’ paper.
The deceleration parameter?
It is not the deceleration parameter. It is what people call ‘w’ now. We might have called it ‘w’ then.
You made that plot?
Yes. That was one of the first things that I did on my own. I read up on a paper by Paul Steinhardt and found an error in it where he had dropped a negative sign; I remember emailing him about it. It was a simple, trivial little mistake. I worked on doing the fits to that parameter.
I have your Nature paper.
This is in the second one. You have the big one there?
You were not thinking of Lambda as Dark Energy, were you? Was it one possibility that you were thinking of? Quintessence would be replacing Dark Energy?
Yes, let us see here. Let us see if they showed this at the end. This was the Hubble constant one. There we go, this one: Omega_M versus the equation of state, w. This was inspired by Steinhardt’s work. I saw him write this down: what if the equation of state was not what we thought it was, could you do this? I wound up doing this plot, showing it to Saul, and said: Hey, this would be good to stick in another paper. Let us do another paper on this. He said: No, let us just throw it into the end of this one. That is where it went, in the end.
It packed this paper even more.
Yes, even more so. We added this in. This was the last thing I worked on right before the paper went out. So, it is consistent with there not being any difference. This is the big hunt now, narrowing these contours down.
The equation of state w, in terms of Omega_M. This was one of the last things done before we popped it in the paper. You can see that we were biased in our views of what the equation of state, w, should be. Here we just had it lie between zero and negative one. So, we assumed that anywhere outside of this graph, there was no probability. Today, all the rage is considering: is it possible that we could be even smaller than negative one, could it be negative one point one or negative one point five, which to me made no sense at the time. One of the things is, once you have opened up this can of worms, the theorists have a bunch of fun and go off in crazy directions. That was our little bias then.
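For reference, and not as a quotation from the interview, the equation-of-state parameter being discussed is usually defined by how the dark-energy pressure relates to its density, which in turn fixes how that density evolves for constant w:

```latex
% Equation of state and the resulting redshift dependence of the dark-energy density:
w = \frac{p_{\rm DE}}{\rho_{\rm DE}\,c^{2}},
\qquad
\rho_{\rm DE}(z) = \rho_{{\rm DE},0}\,(1+z)^{3(1+w)},
```

so w = -1 is a true cosmological constant, w = 0 behaves like matter, and the "smaller than negative one" values mentioned above make the dark-energy density grow with time.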
What other parts of the paper were your parts?
Certainly, all these different cases, case A, B, C, D, E, F, etc., all these were different runs that I did, different fits that I put through.
You produced these?
Yes. I produced those. Two other people produced them as well. Saul always wanted different people to check each other. I had runs that I did on the supercomputer that had to be confirmed by other people. Saul certainly did not want an error popping in. One that I did by myself, since nobody else could do it, is the following: there are two approaches here, a frequentist approach and a Bayesian approach. The Bayesian approach assumes that ‘what you see is what you get’ on the plots here, that Omega_M cannot be less than zero. So, all the power of the statistics goes into that area. But we did a different fit here, using the frequentist approach. I did that on the supercomputers. That was a very, very time-consuming way to do it.
It did not end up giving a different result?
It slightly changed things but nothing fundamental. It was just a very nice confirmation of the fact that both methods were sound. One of the things there that people were worried about, and rightfully so, is: okay, so we are saying that there is no chance for Omega_Lambda to be zero. How well can we state that? Can we state that with 90% certainty? Or 97%? Or what? That level hinged very much on how you treated the supernovae: which supernovae you brought into the fit, which ones you discarded. The great thing is, you can look through all the fits there and you can see every way we tried it. Even just picking the statistical method to analyze the result influences how well you can say that. Saul wanted to try it in a more traditional, conservative sense.
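A rough way to state the Bayesian version of the point just made (this is the generic construction, with the physical prior Omega_M >= 0, not a description of the paper's exact machinery):

```latex
% Probability that Omega_Lambda is positive, given a likelihood L over the grid
% and a prior restricting Omega_M to non-negative values:
P(\Omega_\Lambda > 0 \mid \mathrm{data}) =
\frac{\int_{0}^{\infty} d\Omega_M \int_{0}^{\infty} d\Omega_\Lambda\,
      \mathcal{L}(\Omega_M, \Omega_\Lambda)}
     {\int_{0}^{\infty} d\Omega_M \int_{-\infty}^{\infty} d\Omega_\Lambda\,
      \mathcal{L}(\Omega_M, \Omega_\Lambda)}.
```

The frequentist construction described above instead builds confidence regions from the distribution of the fit statistic itself, which is presumably why it was so much more time-consuming to run.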
When you joined the team, were they already saying Lambda could be zero?
Oh, no. When I joined the team, their ‘first seven’ paper had come out, which said it looked like Lambda certainly was consistent with zero, but they had very little data. They had very few Type Ia's that were confirmed spectroscopically. No, when I showed up, Lambda equals zero was what most people were still thinking. After 1997, people started dropping that thought.
After the year 1997?
Yes, after 1997, after we started analyzing the data from that one and then of course all the other supernovae that came in between the time we had that one and the 42 paper. It was just starting to become more and more apparent.
Is that pretty standard? People call it the ‘42’ paper?
That is what I call it.
I heard that you would play football with Adam Riess around that time?
Oh, yes. I played football with Adam. Let me see. I was the host for his bachelor party right before he got engaged. So, yes, we hung out a lot back then, together. We worked on a paper on a 1997 supernova which I think even today is still the highest redshift Type Ia supernova observed, at a redshift of 1.7. So yes, we worked a lot together.
There was not much cross-talk between the teams in 1997 and 1998?
Just between Adam and me, a lot between Adam and me. We talked every day about the work and what was going on, how to make things better. I knew where they were. He knew where we were. We were fairly open in that regard.
But before he started getting the results for that team? Or actually while?
Oh, while; during the entire time. We showed up within a month of each other out here at Berkeley. We were constantly talking. I do not know that he did as much observing. He went to Keck to observe for their team. I think on that paper, he mostly spent his time reducing the data and getting the fits. Whereas I would see Brian Schmidt, who I knew well, down in Chile on observing runs.
Did you find it much of an adjustment coming from theory to do observing runs?
Oh yes. But even more so it was an adjustment coming to a place that was all high-energy physicists; that is a very different thing. It opened my eyes up to statistics at a completely different level. It opened my eyes up to how big projects are managed and handled. It is very different, very different. I think I went down to Chile about ten times for observing runs. It was a fun experience. When you are a theorist, and you do not sit in front of telescopes, it is one thing to see a beautiful image that somebody shows you in class. It is quite another when you are taking that image yourself, and you are saying: my God, there are thousands upon thousands of galaxies in this image. There are more galaxies than stars. Hopefully in one of them there is a supernova. You come back, you subtract the data, and you see a supernova.
But you would do the subtraction back here?
Yes. The data transfers from Chile were very, very slow in those days. I know the other team did their subtractions in place. They had a lot of members who were on the staff of CTIO at the time. They would use the computers down there. We would transfer the data back up to LBL. It would take 24 hours at least to get all the data up. On many observing runs, I remember, we would have a two-night run, get the data, and I would take a tape of all the data and fly back to LBL to hand it in.
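A toy sketch of the subtract-and-look idea described in the last few answers. Real pipelines handle astrometric alignment, point-spread-function matching, and flux scaling; here the images are synthetic and assumed already aligned:

```python
# Toy difference imaging: subtract a reference image from a new image of the same
# field and flag pixels that stand well above the noise. All data here are synthetic.
import numpy as np

rng = np.random.default_rng(0)
reference = rng.normal(100.0, 5.0, size=(256, 256))   # earlier image of the field
new_image = rng.normal(100.0, 5.0, size=(256, 256))   # later image, same pointing
new_image[120, 80] += 200.0                           # inject a "new" point source

diff = new_image - reference
noise = np.std(diff)
candidates = np.argwhere(diff > 5.0 * noise)          # pixels more than 5 sigma high
print("candidate positions (row, col):", candidates)
```

Blinking, mentioned later in the interview, is the manual version of the same comparison: display the two frames alternately and let the eye catch what changed.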
Would you be the one designated person on those observing runs?
No. We did it in huge teams. We would typically do a few-month campaign. We would try to do two campaigns back to back. There would be the reference images, and then the new images. About a week or so later we would get another new set of images. I would go down on either of those runs, or sometimes both of them. Alex Kim went on a lot of those observing runs.
Would you and Alex Kim have gone together at the same time?
Yes, always two people. There were always two of us that went down. Usually one would fly back with the data between the runs. The other one would stay down there while another person flew down later on.
Stay down for the next run?
Yes, stay down for that week-long gap while we get some more follow-up data.
By this time, things were going pretty smoothly at the observatory?
Yes. We had it down pretty well in terms of finding things. The biggest challenge was how tightly things were timed together. If you talk to other observers, they go out, they get their data; it is a much more casual observing experience. For us, everything is so time-critical. You want to find the supernovae as they are getting brighter. You want to take spectra of them at a place like Keck when they are the brightest. Many times, we were trying to send back data, find supernovae, and then the very next night observe them at Keck. There were huge time constraints, making sure everything went through just right. That was challenging. That was my first observing experience, and so I always thought that was the normal way people observed. Then I went later on with other people, and I was like, “You are not thinking about what to do if a cloud comes. What am I going to do next?” It was one of the things that made me a very efficient observer.
What would one cloud do?
If the seeing got worse or clouds came over. You would typically have two fields or maybe three fields during the night that you were pointing at. So, if you had clouds come in over one area, the question was: could you get to the other fields in time to start observing those, since the clouds were not over there? Or if the seeing got worse and degraded by 20 percent or so, would you expose for longer, to try to make up for the difference? Or would you just hit other fields? What were you going to sacrifice because of that? If the weather looked like it was going to be bad the next night, would you get half the data tonight, or try to get half on all the fields, or would you get half the number of exposures that you wanted? These were all decisions that Saul would try to impress upon you to make. Saul was almost always on the phone with you while you were observing as well. Or chatting; it was not the chat you have today. We used ytalk, a UNIX program that we had back then. We would chat with the people at LBL while they were subtracting the data. It was quite, quite hectic.
Did you talk about data analysis with Adam?
I helped him with the K-corrections. I said, “I found this really interesting correlation between the K-corrections and the color of the supernovae.” I came down to campus one day, and gave a lecture on it, and showed how it improved things dramatically. It does. Without it, the data would probably be about twenty percent worse, at least. You would wind up with the same result, you just would not have as much confidence in it. Adam and I talked about how to get rid of nuisance parameters in the fit. What do you do with the Hubble constant? Or, what is your light curve shape parameter? How is that influencing things? We talked about things like that. We did not come out and say, are you guys finding Omega_Lambda is nonzero, or anything like that, because we had worked on a couple of papers where that was already obvious. The snapshot distance paper that we put together, when we were working on that, it was obvious at that time. That was after the Nature paper came out. That one supernova at a redshift of 0.83 made a huge difference and was even included in their analysis. Because the spectrum was so good, we were absolutely convinced that it was a Type Ia. The redshift was good, there was HST data on it, it was observed very well. It was one of those, “Okay, this is what it is saying.” Then you get ten or twelve other supernovae that say the same thing.
They would have seen the Nature paper in January?
In January, 1998. At that time it was becoming quite obvious that with good quality data, that is where things were headed. Then it was just a matter of trying to go through and do the analysis as carefully as possible, to get it out as quickly as possible.
What is the color calibration of the K-corrections like?
It was something that one of the guys, Mario Hamuy, had seen for nearby supernovae: that the K-correction, or the correction for lost or added light, is very dependent on the color of the supernova. I put together these templates and tested it every which way, and showed that if you in fact took the colors of the observed supernovae, or took the colors from the supernova light curve shapes, the K-corrections were tied very, very closely to those colors. Basically, if you just yanked a K-correction out of a table for a supernova at day ten, whether it was a broad light curve for a brighter guy or a narrower one for a fainter guy, you could screw up your answer by a considerable amount. What this says is: just use the colors that are inherent, and you can improve your estimate of the K-correction greatly. So, that was one of the things that everybody started doing.
Are there any date stamps that you can put on this? This is all very helpful. But, if one were to write a historical chronology, it would be difficult to say when exactly things occurred.
The wonders of saved email!
What do you think are the most important events?
Certainly when the Nature paper came out, to me, that was the single most important event for either team in the sense that we knew we could start finding them at those redshifts. That is a big thing. The game gets a lot easier at higher redshifts. There is a lot of uncertainty when you are down at a redshift of 0.3. We need the statistics; we need huge statistics in order to see it. With just one or two supernovae at a redshift of 0.8 it is pretty easy to see the accelerating universe. I think that when that paper came out, that was the moment when everybody realized that the game was to go through and get just as good quality data as we had for that one supernova on the rest of them and try to prove that is the way things were going. To me, that sticks out as one of the most important dates.
That would have been around the AAS meeting?
Yes. It probably would have been the fall of 1997 when that paper came together. We certainly knew where the results were back then. The Nature paper came out in, I think the first issue of Nature, in January of 1998. That is when, I think, we knew, what was going on.
Would you pair that with the 1999 paper as an essential coupling? I am a graduate student in History of Science. My advisor has a book, How Experiments End. Not end as in when experimental programs end, but, say, when the muon is discovered: when is it decided that a new particle is needed and not a new theory? In this case, that the universe is accelerating, that there is Dark Energy.
Do you want my view as if I was trying to look at myself from the outside, or when do I think I am convinced of everything?
Maybe both if you can. When you personally believed it and…
When I personally believed it was right about the time in January 1998, because at that point we had enough supernovae where I was convinced. I had, of course, a lot more criteria for what is a good quality supernova to be included in a fit. I needed to see a spectrum that told me it was a Type Ia, no doubt. I needed to see a good light curve that went over peak brightness and got a good measurement of the shape. At that time, I know Alex Kim, and I know myself, and I know several other people had started just playing with those supernovae. From those alone, you could already see evidence that Omega_M was definitely not equal to one and likely that there was a cosmological constant. That is the time in my head. I would say that between the Nature paper and Adam’s paper coming out toward the end of 1998 and Saul’s paper coming out in 1999, we presented some of the results for the 42 paper at the AAS meeting earlier in 1998. I know that gave great confidence to Adam and the others that their data was not messing with them in one way or another, that it was real, that they were seeing the same thing. By the time our paper came out, then I was convinced, because I could not see any other way around it. Like I said, for me it probably came a lot easier than for other people. Particle physicists probably had a very hard time swallowing it since they were not already thinking this way. For me, it made sense because it tied up so many loose ends in Cosmology at that time, in Astrophysics. I came to this from an Astronomer’s point of view. If you believed in a low Hubble constant, then in order to make all these other things fit, you needed something like the cosmological constant.
So for you it would be January, 1998 and the paper of 1999?
Yes. Or, by the time the AAS meeting came around. When we had the poster up there that showed our fits, it clearly showed what the results were. We called them preliminary, because we had not studied them every which way to Sunday. We had certainly studied them enough as far as I was concerned. [laughs] I was like, “Okay, I have run enough fits; I know that there is no way we could be fooling ourselves.” Certainly by the middle of 1998 I was absolutely convinced that this is where it was going.
Did you participate in, or were you influenced by, the Fermilab workshop in the spring of 1998? There was a straw poll taken to see how many people in the audience believed the two teams’ results.
Oh yes, I remember that.
I am thinking of other times when the community began to believe the results.
It was funny. I think that when the results first came out, there was a lot of immediate backlash from the community saying: this does not make sense. But I swear, by one year later, by 2000, everybody had bought into it, lock, stock, and barrel. There were very, very few dissenters that I encountered. Like I said, from the astronomy side of things, it put a lot of the pieces of the puzzle together and made sense. I think the people that had a hard time with it were those that just thought this was so crazy that it could not be. Omega_Lambda is one of the variables that you have in the solution to a set of differential equations. Einstein put it in there because he thought the universe was static, so he needed something to balance gravity out. Once you say there is a Big Bang and the universe is expanding, well, you do not need it. That is not necessarily true either. You do not have to have it, but there is nothing that says it cannot be there. In my personal opinion, I am not sure that any of it is right, in the sense that I absolutely believe that the universe is accelerating, but what that means, in terms of whether General Relativity is right and whether this value here is fundamental, is another question. General Relativity and Einstein’s view may not be the be-all and end-all for explaining cosmology. Something else may come along. If you look at it from the historical perspective, we are very much in a similar time, I would say, compared to the turn of the previous century, when you had these things coming up in Quantum Mechanics, which had not even been defined yet, but there were some real disturbing things that the theory that was out there could not explain. Then boom, all of a sudden Quantum Mechanics comes along, and General Relativity, and Special Relativity. A whole bunch of things are explained, very succinctly. Pieces of the puzzle fit. I think we are sitting at that same point right now. We have found something really incredible. Yes, our equations can account for it. But what does that mean? The equations do not tell us. There is Dark Energy. Okay. That is a very apt description. We do not know what it is, but it acts like energy and it is causing space to expand.
Every equation comes with an interpretation.
Exactly. This is not a good state to be in, as far as I am concerned. I think we now need to have a person come up with, well, what is Dark Energy. Likewise, there is Dark Matter as well. A completely different thing, but we are just as completely in the dark about it. I think somebody will come along, the next Einstein. Whoever that person is will all of a sudden show us.
Is it likely that person will be from one of the two teams?
Absolutely not. I do not think that there is a theoretical physicist among us at that level. Honestly, I think we are too old. These are things that are best done by people who are very young, whose minds are not polluted by what the conventional wisdom is. If you look at Einstein, that is clearly what made him superior to most of the scientists at that time. He was able to think outside the box. We need somebody to do that. That is my opinion of where it is going to go. I think we definitely need to push the measurements farther. We need to nail down the cosmological parameters to a much, much better level. That can hopefully start driving the theorists toward coming up with a solution. To me, it is a number that goes into a differential equation. It explains things, but that does not do it for me.
Does it change your view of the universe?
Well, it certainly did. If you sit back and think about it, all of a sudden there is another force that was never accounted for before, this antigravity that is out there. That is pretty radical. In that sense it changes the view of the universe. It also changes the view of how the universe will end. If it continues on, and it is just a cosmological constant that is not varying with time, things are going to get farther and farther away from each other. It will be a very cold end to the universe.
As The New York Times is fond of lamenting.
Yes, yes. But hey, we can go in there and discover in the next few years with better observations that it is not a constant, it is varying. That could change that whole view there.
So the experiment has not ended.
No, not even close.
You would call it an experiment, yourself being a theorist working on observations?
Yes, absolutely. There was: hey, this is a proposal. We are going out there to try to measure the cosmological parameters. This is still the only way to measure the acceleration directly. Every other method in cosmology is an indirect measurement of it. I have done some work with Type II supernovae, to measure the cosmological parameters. That is just starting off. It is almost like the Type Ia business 10 years ago. That is the only other method that is out there on the horizon that can measure this effect directly. Everything else is baryonic acoustic oscillations coupled with CMB measurements, which can show you that this is happening. In my mind, it will continue on. It has to be refined. Right now the Supernova Legacy Survey, working at CFHT, has done a great job of taking this experiment to the next level. They have hundreds and hundreds of supernovae at high redshift.
They already have found?
Yes, yes. They already have in hand probably 300 supernovae, right now.
Do you know how many have been found total? I hear that there are some plans to find thousands of supernovae.
Right, clearly that is what they want to do with the next generation. If you talk to Saul about SNAP, the idea there was to find 1,000 or 2,000 or something like that. I would say that in hand right now, all the supernovae combined, we are not talking all published, but all the ones with good light curves, there has got to be 1,000, at least 600 or 700.
Getting back to the other level of theory, of how supernovae work, will the new data, do you think, help?
Yes. The scariest thing about this whole thing is that we still do not know what the progenitors of Type Ia supernovae are. We see them as beautiful distance indicators. They are very similar. We have figured out ways to calibrate the ones that are different, so we know how bright they are. But we still do not know what explodes to make them, and exactly how it explodes. There are some very good ideas. There is a big push right now in the Department of Energy to understand this. Stan Woosley at UC Santa Cruz has a project which I am involved with; it is the SciDAC program, Scientific Discovery through Advanced Computing, which hopes to address this. The idea is to work through various explosion models, and take the light curves and spectra of the supernovae from these models to say: okay, well, if this is what the progenitor of a Type Ia supernova is, then this is how it may evolve with redshift. Right now we just have one parameter, basically.
Using simulations?
Yes, exactly.
Did you do simulations in your 1998 work?
Yes, I did simulations at that time of the supernovae we found. Of course there is much better data nearby, but the next generation thing, is to try to make the data at high redshift just as good as the nearby ones so that you can compare and contrast them at the same level.
I am working as a historian so I keep wanting to get back to 1998.
That is fine with me.
Related to what you are talking about, it makes me think back. Did you write the simulations up in the paper? Did I miss it?
No, no, no. They were included. Let’s see.
This SciDAC project…
This is brand new and recent.
It is new, but if you are involved in it, it makes me think that they asked you because you have done similar work before.
Absolutely, absolutely. We looked at some explosion models in here and their evolutionary effects; this is section 4.4 of the paper. We had done studies, at the time, based on some of this work. Somebody said: one of the things that can evolve with redshift is metallicity. As the universe gets older, more supernovae explode, polluting the hydrogen and helium with more and more metals, and therefore this could change what a supernova looks like.
Is there a time limit to how far back it is possible to find a supernova?
This is a very interesting question. It gets at the heart of what is the progenitor of a supernova. If we knew what the progenitor was, we could say definitely. There are a couple of ways to go about this. If you have two White Dwarfs merge, you would have to wait for the stars to become White Dwarfs. That could take 50 million years. But if you need two 0.7 solar-mass White Dwarfs, and only two 0.7’s, that could be a billion years. If it has to accrete mass from another star, then you have to wait for one star to evolve to a White Dwarf and the other star to get big enough to accrete material over. That could be 100 million years, it could be a billion years. We do not know what the answer is. What these people had done, Peter Hoeflich is one of them, was study: well, here is a proposed model for them. Let us see how they would evolve as a function of redshift. One of the theoretical things that I worked on at that time was: well, let us compute the spectra of these models and see what they look like. What we found was that in the blue and visual bands, things did not change much at all, no matter what the metallicity was, but things in the ultraviolet changed a lot. What it told us was that the ultraviolet could be a very dangerous place to base your cosmological measurements on. But it gave us great confidence in using B-band photometry to make our cosmological measurements. That is one of the things I worked on the side here that was incorporated into the paper.
I do not exactly understand how a simulation could tell you that.
It is not going to help you read off a cosmological parameter. But, say you blow up a supernova 100 different ways, and every single time you do it, you see that the B-band light correlates very nicely with the light curve shape. That gives you confidence that, even though I do not know exactly what a Type Ia supernova is or where it came from, I have made up a whole bunch of models that clearly go way beyond what it could be, and every single time it says that the B-band looks good. Conversely, when we saw huge differences in the ultraviolet, it does not tell us that the ultraviolet is necessarily a bad place to use; it just makes us worried that if we have results that are based upon the ultraviolet, we had better not put too much weight on them unless we verify them 20 different ways. At the end of the day, what this paper got us to, once it came out, was that we were no longer limited statistically with the supernovae. It was all systematics. That is the big push now. The systematics include all these little nitpicky things like evolution: does the metallicity affect the brightness of the supernovae, do the galaxies, the environments where the supernovae are born, somehow affect things. That is the big push.
That is where this paper got your team. Did the High-z team’s comparable analysis in their paper also get to that point?
They did not have as many, and they had a couple of our supernovae in there. So, I do not think that their paper quite pushed it to that level.
They had your supernovae in there?
Yes, I believe two of our supernovae were included in their analysis. One of them was that 1997ap supernova.
From your Nature paper?
Exactly. That was a very helpful one, because I think it was the highest redshift confirmed one. I believe they had one at a slightly higher redshift, but they had no spectrum of it so it was just a guess as to whether or not it was a Ia. There was one other one too. I do not think their paper quite had things pushed down to: we are now stuck at the systematics level.
That is something that people still say today, right?
Exactly. Yes. In addition to which, I think people would also say that it is not the high redshift supernovae anymore. It is the low redshift supernovae. We need many more low redshift supernovae to do this.
Those are more difficult to find?
They are much more difficult to find. You need to find them sufficiently far away so that they are in the Hubble Flow, so that the peculiar velocities of the galaxies are small compared to the expansion of the universe. So you get very good relative distances to them. That makes it even more difficult. The combination of all those things… That is why the CfA, Alexei Filippenko’s group down at UC Berkeley, our group here with the Nearby Supernova Factory, and the other big group, the Sloan people, are all working very hard on the nearby supernova question, because that will be the next jump. When that data comes out and is available, then we will be able to push the high-redshift data even farther. That is what people are waiting for.
I have a funny question now. Did the supernova in the Nature paper have a name? Not that you think of right away?
Nothing that comes to mind.
It is not very significant.
We experimented with all those crazy little cute names back then. I remember right after we got the spectrum, Alex Kim and I put it up on our door, outside the office where we worked on it because we thought it was so neat. We always called it the redshift of 0.83 or 97ap. Everybody knew what you were talking about.
So you had an office up on the fifth floor?
Yes, on the fifth floor. We were in this crazy office. I remember the first time I showed up I saw a folding lawn chair there. I turned to Alex and I said, “What is this for?” He said, “Oh, you will find out.” Sure enough, when the observing runs started and the data started coming through, you were in the office 24/7. That was the place you could crash and take a nap when you needed it. It was a bit hectic.
You put the spectrum of that up?
We put it up on our door. I remember it sitting there for a long long time.
Before you even did the paper?
Oh yes.
People walking by could see it?
Oh, yes, yes. Right after we took the spectrum at Keck, I remember seeing the reduced spectrum from Rob Knop in his notebook. They were the ones who took it. I started playing with it and comparing it to other supernovae. That was clearly the supernova that changed things. It inspired us. We knew that we were on the right track. It was the first one that we got follow-up time for at the Hubble Space Telescope. It was a director’s discretionary run to take data. It was just a very nice supernova all the way around.
That is a funny story about the lawn chair. The tape is probably running down. Are there any other personal or scientific stories which you think the records should have?
No. To me, it was fun. It was exhausting. It was tiring. What I liked most about it is that there were two teams working on it. It was nice to go see the guys I had worked with in the supernova field when I went down to Chile: to see Brian, to see Mark Phillips, to see Nick Suntzeff, toiling away doing the same thing we were, in the next room over. Remember, we all wanted the same time at the telescope. We wanted to observe a month before new moon, and then a week before new moon, and then have the spectra taken at new moon. So, we would always wind up within three or four days of each other and see them. We would always see them, whether they were going out to Hawaii to take spectra at Keck or down in Chile, wherever. “How is it going?” It was all good.
The observatories have separate telescopes?
No, no. We would have three days on the telescope, then we would be off, then they would be on the same telescope the next three nights.
Really?
Yes.
Three and three like that?
Two and two, three and three. It all depended on how much time we got for the year. The big work-horses at this time were the Victor Blanco 4-meter telescope in Chile, and the Keck telescope in Hawaii for taking the spectra. We all knew the best places, and that the best time of the year to observe was between October and March. If you wanted to do two runs during the year, you would have a run in October and a run in January. We both wanted the same stuff, so we would always see each other down there. I shared many a meal with those guys as they were working on their data, and saw them go through their search pipeline when I was down there. They saw good times and bad times, just like we had. When the software goes crazy and all of a sudden you are almost back to blinking the data to find the supernovae. I remember seeing Mark Phillips do that once.
What is blinking the data?
Just flashing two images back and forth. It is next to impossible to find a supernova that way. That is the way, of course, Fritz Zwicky did it many years ago. Mark said, “There is some problem with the software. Brian is working on it. He should have it back in an hour. In the meantime I am having fun and I am looking at data.” I looked at him. I had done it myself too. More times than not, you just find asteroids. I remember seeing stuff like that.
That is the way of trying to see it in real time?
Yes, and you know what, it is so hard to see some of these supernovae. If you looked at the data and blinked it, there are 10,000 sources on an image and you are looking for one of them to get slightly brighter or slightly off center. You have no chance.
The limits of human perception.
I found one at Keck.
Dr. Peter Nugent was just explaining how in December 1998 he found a supernova by blinking.
Blinking the data. Blinking a month-old image with new images that Greg Aldering and I were taking at the Keck telescope. We were searching for very high redshift supernovae with Keck. We were trying to find supernovae above a redshift of one. And we did. We found one that we nicknamed Albinoni. While we were there I was getting a little bored, so I decided to start blinking the data. That just means throwing up one image on top of another really quickly to see if anything has changed. Sure enough, I found a supernova that was very nearby. It was very unlikely that we would find it. Rob Knop and I calculated that the likelihood of finding it was something like one in a thousand. But it was easy to see by blinking. That was quite fun. But you definitely do not want to search for high-redshift supernovae that way. It would strain your eyes and you would not get much.
That is the way Zwicky found them?
Yes. Zwicky would take photographic images and find them by blinking old photographic images with new ones. There are quite a few amateur astronomers who do this. One of them, Reverend Evans from Australia, just uses a small telescope to go out and observe. He stares at very nearby galaxies. He has a photographic memory, so he knows when something new is up there. He has found quite a few supernovae, my guess would be in the neighborhood of 20 but maybe more, just purely by staring at nearby galaxies.
Does he send them to professionals?
He sends them off to the International Astronomical Union circulars which keep track of all the supernovae discovered. His record is in there somewhere, but it has got to be getting close to what Zwicky did. Zwicky was back in the 1930s and 1940s and 1950s.
Thank you very much for your time.
No problem.
You just said [off the tape] that this was a huge team effort.
Yes, and it was non-trivial, because we had people spread out over the entire world doing these observing runs, from Chile to Berkeley to Hawaii. One of the things that would happen many times was, we relied on the fact that the Earth’s rotation is rather slow, so that night would end in Chile before night would end in Hawaii, and we would get a few more supernovae to send there. Boy, if the rotation of the Earth had been any faster, we would not have been able to do that and we would have missed our observing night. That is the way it was. You would get the schedule from the telescopes the way it was; you would try to push in a different direction to make it convenient for you, but it would not happen. I know the other team had similar types of issues. It was a huge team effort. We had tens of scientists working on either taking the data, subtracting the data to find the supernovae, or taking the spectra later on. Then, not to mention all the people that worked on the reduction of the photometric data and the spectroscopic data. People also worked on the analysis, on what the contours of the cosmological fits are. It was a huge team effort by both groups.
You would say it was the two groups rather than outside theorists coming in.
No, it was contained within each of the groups. Each of the groups had their own techniques for doing things. The great thing was, other than a couple of supernovae that we published previously which they used, they were independent sets of supernovae. Both the data reduction of the photometry and the analysis of how we fit for the peak brightness of the supernovae and the reduced brightness were done differently. I think it was because of that that the results were accepted so quickly in the community. It was because two groups, doing it in two very different ways, came up with the same answer. If it had just been one group or the other, it probably would have been another five years down the road, until somebody else came along and tried repeating the same experiment, before the results would have received the same power and notoriety that they did.
Thank you very much.
No problem.