
Oral History Transcript — Dr. Reynald Pain

This transcript may not be quoted, reproduced or redistributed in whole or in part by any means except with the written permission of the American Institute of Physics.

This transcript is based on a tape-recorded interview deposited at the Center for History of Physics of the American Institute of Physics. The AIP's interviews have generally been transcribed from tape, edited by the interviewer for clarity, and then further edited by the interviewee. If this interview is important to you, you should consult earlier versions of the transcript or listen to the original tape. For many interviews, the AIP retains substantial files with further information about the interviewee and the interview itself. Please contact us for information about accessing these materials.

Please bear in mind that: 1) This material is a transcript of the spoken word rather than a literary product; 2) An interview must be read with the awareness that different people's memories about an event will often differ, and that memories can change with time for many reasons including subsequent experiences, interactions with others, and one's feelings about an event. Disclaimer: This transcript was scanned from a typescript, introducing occasional spelling errors. The original typescript is available.





Interview with Dr. Reynald Pain
By Ursula Pavlish
At Lawrence Berkeley National Laboratory
July 31, 2007


Reynald Pain; July 31, 2007

ABSTRACT: Reynald Pain (member of the Supernova Cosmology Project): Particle physicist, joined the SCP in 1993. Worked on software to detect supernovae. Simulated supernovae, varying their brightness to aid in real object detection; 1993-1996. Participated in first batch find of approximately 10 supernovae in 1995 at the CTIO 4-meter telescope. Supernova work links to his simulation work as a particle physicist at an e+e- experiment at LEP, at CERN. Original interest in joining the project was to try to detect Dark Matter. On the conceptual attractiveness of Dark Energy as vacuum energy, to him as a particle physicist. The importance of historical timing in how experiments end. On the change in sociology of the SCP as more astronomers joined the group. In France, Pain started a group after SCP's success, as a subset of the SCP.

Transcript

Pavlish:

I am here at Lawrence Berkeley National Laboratory to speak with Professor Reynald Pain. I would first like to ask you, Professor Pain, what are the milestones or turning points in the Supernova Cosmology Project, from your point of view, and how did you come to join the group?

Pain:

I joined the group in September or October of 1993. At that time, I was coming here to the Center for Particle Astrophysics. I was on sabbatical from my work. I am a particle physicist. I was working at CERN for some years. Then I thought that I wanted to do something else. They said to me, here, “You may choose between various options.” I visited various groups, including Saul Perlmutter’s. He explained to me what he was doing. I said, “Well, I should try this. It sounds very interesting. Why not?” [laughs] This was pretty different from what I was doing in particle physics because there, I was working on experiments at accelerators. Also, it was very unlikely something I would do when I went back to France.

Pavlish:

It was a one year-long sabbatical?

Pain:

Yes. Often people stay for two years. I do not teach. I just do research. Provided that we get funding locally, we can stay for two years.

Pavlish:

What sort of analyses did you do during that time, and how did your work relate, or not, to your earlier work in particle physics?

Pain:

Yes. Here, I worked on the software that was used to detect the supernovae, to find the supernovae. What was quite new, I think, in the software was that there were a lot of images to process, much more than astronomers usually do here. We had to use a camera to look at a large area of sky. For that you have to process many images. We had to put in place a procedure to retrieve the images from the telescope, to calibrate them in some way, and then to do the image subtraction to try to find the supernova candidates. That is one thing I worked on. Another thing I did is to validate somehow the software suite used to find the supernovae. There is a way, which is commonly used in particle physics, which is to do simulations. We put fake supernovae in the images, and then try to retrieve them, to find those fake supernovae. It is a good way to test how efficient you are in finding supernovae. Whether you are missing a fraction of them, or whatever.
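The inject-and-recover test Pain describes can be sketched in a few lines. This is an illustrative reconstruction in Python, not the SCP's actual pipeline; the Gaussian point-spread function, the noise levels, and every name and number here are assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

def add_fake_supernova(image, x, y, flux, psf_sigma=1.5):
    """Inject a Gaussian point source (a fake supernova) into an image."""
    yy, xx = np.mgrid[0:image.shape[0], 0:image.shape[1]]
    psf = np.exp(-((xx - x) ** 2 + (yy - y) ** 2) / (2 * psf_sigma ** 2))
    return image + flux * psf / psf.sum()

def detected(search, reference, threshold):
    """Subtract the reference image and flag any pixel above threshold."""
    return (search - reference).max() > threshold

# Reference image: flat sky background plus Gaussian noise.
reference = 100.0 + rng.normal(0.0, 5.0, size=(64, 64))

# Inject fakes at random positions and count how many are recovered.
n_trials, recovered = 200, 0
for _ in range(n_trials):
    search = reference + rng.normal(0.0, 5.0, size=reference.shape)
    x, y = rng.uniform(5, 59, size=2)
    search = add_fake_supernova(search, x, y, flux=500.0)
    if detected(search, reference, threshold=30.0):
        recovered += 1

efficiency = recovered / n_trials
print(f"recovery efficiency: {efficiency:.2f}")
```

Repeating the loop at several flux levels gives the fraction of supernovae the software misses at each brightness, which is the check Pain describes.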

Pavlish:

You would put the fake supernovae in real acquired images?

Pain:

Yes. That is the main thing I worked on, so that we could validate the software that we were using to find the supernovae. At that time, it was quite important because we had to process many, many images before we could find one, which had a supernova in it. We were only finding, typically only two or three supernovae per observing period, sometimes a bit more. When you do not find something, you are not sure if it is because it is not there, or because you were not able to find it. You need to test the tools you are using so that you are confident that if you do not find anything it is because there is nothing to find, not because you are just missing it.

Pavlish:

Did the software get quite good at finding the simulated supernovae?

Pain:

Yes, yes. The good thing with simulated supernovae is that you can play around, and test many situations. You can calculate the efficiency and draw an efficiency curve as to how well you do under certain conditions. We knew that at a certain magnitude, at a certain depth, or at a given redshift if you want, we could find supernovae. Once you chart this thing, it is very good, because you know that if you survey a large enough area of the sky, then you are sure to find a certain number of supernovae, plus or minus an uncertainty. The first thing we really were happy about was in 1995, when we found the first batch, maybe 10 or so supernovae, in one run, in one observing period. At that time, we had two or three nights in which we were observing the sky. These were for reference images. Then we went back a month later to observe the same area of the sky. We did subtraction of the images from the same area of the sky. In two or three nights, I think we got ten or so supernovae. It was the first time that people could see so many supernovae, found in one batch. We issued an Astronomical Telegram (IAU). It was quite impressive.
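The "certain number plus or minus an uncertainty" Pain mentions follows from simple counting statistics. The rate, area, and efficiency below are made-up illustrative figures, not SCP values.

```python
import math

# Illustrative numbers only: an assumed supernova rate per square degree
# per observing period, a surveyed area, and a detection efficiency
# of the kind measured with injected fake supernovae.
rate_per_sq_deg = 2.5      # assumed rate of detectable supernovae
area_sq_deg = 4.0          # area covered between reference and search runs
efficiency = 0.85          # assumed recovery fraction from fake-supernova tests

expected = rate_per_sq_deg * area_sq_deg * efficiency
sigma = math.sqrt(expected)  # Poisson uncertainty on the expected count

print(f"expected supernovae: {expected:.1f} +/- {sigma:.1f}")
```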

Pavlish:

Did the team celebrate back here at Berkeley?

Pain:

Yes. I forget what we did. We were very pleased with that one. It was the first big batch of several supernovae found. Before that, we found a few each time while observing, say two or three, but not much more than that. It was very difficult. Here we were very lucky. Everything went well. We were at CTIO. The weather was good. The software was working perfectly. Everything went fine.

Pavlish:

Did you follow the software to the telescopes when the observing runs were being done or did the software stay here?

Pain:

The software was done here at LBL. At that time, the machines were considered quite powerful. PCs, which now are out of date in terms of speed, space, and everything. But at that time they were considered to be very good. We had a few of these machines. We could process the images quite rapidly. We needed to do that within 24 hours so that the supernova stayed visible, and then we could do the spectroscopy at the KECK telescope. It had to be done fast. So, we were working overnight. This particular run was done with the CTIO 4-meter telescope. The data was transferred here. That is something we had to do, to monitor the transfer. At the time, networks were much slower. The images were also much smaller. But still, we had many images to transfer. So we would transfer these images and run the software here at LBL.

Pavlish:

The image comes in as numbers from the Internet, and it takes the numbers as input?

Pain:

Now, detectors have much bigger areas. But at that time, when we started, the detector had maybe only one CCD. The images were 1,000 by 1,000 pixels, something like that. Each pixel is a word in computing, so two or four bytes.
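Pain's figures imply the data volumes involved. The arithmetic below uses the numbers he gives (a 1,000 by 1,000 pixel CCD, two bytes per pixel); the image count and the network speed are assumptions for illustration.

```python
# Rough data-volume arithmetic for the era Pain describes.
width = height = 1000          # pixels per CCD image (his figure)
bytes_per_pixel = 2            # one 16-bit word per pixel (his figure)

image_bytes = width * height * bytes_per_pixel
print(f"one image: {image_bytes / 1e6:.1f} MB")  # 2.0 MB

n_images = 100                 # a night's worth of search images (assumed)
link_speed = 1e6 / 8           # assumed 1 Mbit/s link, in bytes per second
transfer_hours = n_images * image_bytes / link_speed / 3600
print(f"transfer time for {n_images} images: {transfer_hours:.1f} hours")
```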

Pavlish:

In your particle physics work before that at CERN, had you done similar simulations of particle events?

Pain:

Yes. That is something, which is done regularly in particle physics experiments. There is a team working on simulation, making fake events, trying to reproduce reality. We even invent models, to see if we can find new particles. There are people working on simulations. This simulation data is processed with the same software as the real data, so that you know how good you are at finding these events.

Pavlish:

What kinds of particles were you simulating there?

Pain:

At CERN, I was working on an e+e- experiment at LEP. We were doing many things, like searching for new particles, called supersymmetric particles. We did simulations, and then looked to see if the real data had these kinds of events that we had simulated. In the e+e- interaction, electrons collide with positrons, to produce all sorts of particles, depending on the energy. People also simulate events that exist, like particles which decay into other particles, so that they can well understand the detector. That is something which is commonly done.

Pavlish:

Speaking of parallels between particle astrophysics and particle physics, are you doing those simulations after the detector is already online? Or, are you doing the simulations in advance? The detector equals the telescope?

Pain:

Yes. It is also done even before we do the experiment. We want to evaluate how well we will do, and also to make decisions about what types of detectors we will want. There are various types we can use. Even before we reach the experiment, we do these kinds of simulations. We do them while we are doing the experiment, because we want to understand the details. It is really something which is done in parallel. In fact, what is commonly done also is to try to produce ten times more simulated events, compared to what we expect from the detector in the real experiment, so that we understand what we are doing. Those detectors are quite complex. Here, it is less complex, in terms of detectors. There is only one sort of detector, the CCD. The telescope itself gives you some constraint. But we can simulate things. We do not simulate all the details. We mix real data and the simulation, which is done also in particle physics, but now less often. Now, people are trying to simulate everything from first principles, in particle physics. In astronomy, we cannot really do that. It is too complicated. Although, in this case the detector is simpler because it is just one kind of detector. Still, there are things which are hard to simulate, like the atmosphere.

Pavlish:

Do you have to simulate the atmosphere?

Pain:

No.

Pavlish:

Did you ever for fun, make anything else, other than supernovae?

Pain:

Yes, you take an object like a star, which is a point-like object, and you put it any place on the image. If it is not in a galaxy, it can be something different from a supernova. The simulation is, that there is a new star appearing anyplace in the field, and we want to find it whatever it is.

Pavlish:

You limit the z of where you are simulating?

Pain:

We limit the brightness of the object. That is one of the parameters that we use. We vary the brightness, so if it is very faint, it is like simulating a very distant one.

Pavlish:

Did you use any of the same code you had used in particle physics?

Pain:

No.

Pavlish:

You wrote the code anew?

Pain:

Yes.

Pavlish:

In particle physics, you have a team of simulators. Here, were you the only one doing simulation?

Pain:

No. Once one has done the program, everybody works a bit on it. The group is not very big. We had Alex Kim, and a couple of other people. Once we had done the software, it was a collective effort, and we were also producing these fake images, and processing them in the same way. It is not the work of one person only, but it is a small team effort, much smaller than in particle physics. There are different tasks. A simulation can use some software which was written not for the simulation, but to find objects. To find objects in images, we had one computer scientist, Ivan Small, who wrote most of the software that needed to be fast and efficient. He wrote a lot of this. He was a computer scientist. We were just scientists. We could recycle some of the routines that he used to find the supernovae, to do the fake supernovae simulation. It is the contribution of many people to various parts. That is one example. There is another piece of software that does alignment of images. This also was written by Ivan Small. We need that also for the simulation. We want to see if we can find fake supernovae, so we have to align the images as well. Another piece that has to be done: at the end we want to be sure that we are not missing anything, so we look at every event we found, including the fake ones, and there is a scanning program which helps to visualize the candidates. The same program was used for the real events, and for the fake supernovae. We want the simulated events to be as similar to the real events as possible, and so we want to train ourselves to see how they look. There are things that you can do with very complicated software that you work on for years and years. But sometimes, the eye and the brain are much more efficient than trying to describe all the possible cases, and write complete software to find all the possible situations.

Pavlish:

You mean, it is easier for the eye and the brain to see those visualizations that the computer produced, than to have the computer do some determined output from it?

Pain:

Yes. The computer calculates: This one looks like a star. This one is close to the galaxy. So there are many steps to decide on whether you think this is a real candidate or not. We have to train ourselves with real data. At that time, the human eye could do a lot better than any computer program. We have learned a lot. We find supernovae by the hundreds. We now have this neural network kind of software, which does this kind of thing. It is trained on real data. It learns how to recognize a supernova, and then decides. We have so many supernova candidates, now, that we cannot look at them all. It is good; we can train the program with the many supernovae. It just happens that it works out just right.

Pavlish:

When you were doing this work, it must have been a different experience to look at simulated supernovae, than to look at real ones. Or is it pretty much the same?

Pain:

The simulated ones are much nicer. Reality is always more complicated than simulations, more complex, let’s say. Well, sometimes we could not say if it was a real one or a fake one. But if it was a bright object, it would be easy to see if it was a fake one. The real difficulty was to find the faint objects. A very simple software can find the bright objects. But we were really working at the limit of the instrument, of what we could do. There are two competing things. One is that we want to go as deep as possible, because we wanted to find faint objects. But you want to survey as large as possible an area, because we want to find as many objects as possible. This is a competition, because if we go too deep, then we will not have time to survey a large area of the sky. So, we were at the limit. We tried to go as deep as possible to capture the z=0.5 supernovae, but in the meantime we wanted to catch a lot, so we needed a large area of the sky. We were always at the limit of detection, basically.
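The depth-versus-area competition Pain describes can be made concrete. All numbers below are assumed for illustration; the only physics used is that the limiting flux of a sky-noise-limited exposure improves roughly as one over the square root of the exposure time, so the limiting magnitude deepens by 1.25 * log10 of the exposure-time ratio.

```python
import math

# Fixed total observing time split between depth and sky coverage.
total_time_min = 600.0          # total time on sky (assumed)
field_of_view_sq_deg = 0.25     # camera footprint (assumed)

for t_per_field in (5.0, 10.0, 20.0, 40.0):
    # Longer exposures per field mean fewer fields, hence less area.
    area = field_of_view_sq_deg * total_time_min / t_per_field
    # Magnitude gained relative to a 5-minute exposure.
    depth_gain = 1.25 * math.log10(t_per_field / 5.0)
    print(f"{t_per_field:5.1f} min/field -> {area:5.1f} sq deg, "
          f"+{depth_gain:.2f} mag deeper")
```

Doubling the exposure per field halves the area surveyed but gains only about 0.38 magnitudes of depth, which is why the survey sat "at the limit" in both directions.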

Pavlish:

While you were doing this work, were there group meetings every two days?

Pain:

We had meetings in one of the rooms down the hall here, I think in 5215, the room number. Everybody was in front of his computer. We were not that many. We were standing in the middle of the room and discussing the project.

Pavlish:

You were working on the data at the same time as when you held group meetings?

Pain:

Yes. Then, later on, after 1995 or so, when we formally created the SCP, we started to have regular group meetings. Before that, the supernova group was more informal.

Pavlish:

What about the cosmology aspect of it? Had you considered a nonzero Lambda before this?

Pain:

No. In fact, when I joined the group, the idea was to try to measure Omega_Matter, to find the matter content. It was an indirect way of getting to Dark Matter. This is a connection with particle physics. There was this idea of non-baryonic matter, maybe some new particles, supersymmetric particles or something like that. In a sense, we have a direct way to look for Dark Matter or possible unknown massive particles, in accelerators. The indirect way was to measure the geometry of the universe, to weigh the universe. This was the reason I went in this direction, to weigh Dark Matter. The reason I joined the SCP initially was to find the Dark Matter.

Pavlish:

So you were not thinking of Dark Energy?

Pain:

Not at all. Nobody was, really. Nobody was.

Pavlish:

Nor Lambda.

Pain:

No. This came about in 1996, I think. In the beginning, we were trying to measure Omega_M. That was it. We were in a Lambda equals zero, a cosmological constant equals zero, universe. We were not even speaking of it. For us, there was no cosmological constant.

Pavlish:

That is interesting, because in fact Dark Matter has not come to be one of the results of this collaboration.

Pain:

No. There is still a relation. The main finding was that the cosmological constant is probably nonzero. That is the main finding; that the universe is accelerating. But if you combine this measure with some other measure, you find that you need to have Dark Matter in the universe. So it is not just the cosmological constant. It is also the fact that there is a need for Dark Matter. Many people knew already, or thought already, that there was a need for Dark Matter. So it is not a discovery in the sense that it was completely new. But it was a confirmation that there is a need for some Dark Matter. In fact, in the beginning when we were looking for Dark Matter, we thought that there would be much more Dark Matter than we think that there is today. Everyone was convinced that the universe was flat, and that you had to have a lot of Dark Matter to make it flat. If there is no cosmological constant, you need a lot of Dark Matter. So after this discovery took place, there was a need for less Dark Matter. But still, there was a need for some. At least, we know that is so when you combine this result with other measurements of cosmology.

Pavlish:

Did your particle physics colleagues, back in France for example, understand what you were doing and sympathize with this work? Or, were they instead asking, “Are you becoming an astronomer?”

Pain:

Yes, exactly. Most of them, not all of them, but most of them thought that I was doing something else for some time and that I would come back to do the real work. My Institute is a Particle Physics Institute, so working with telescopes was not in the main line of the work. Some people thought that it was not a good thing to do.

Pavlish:

Did you see the SCP change over its lifetime, or over your tenure with it? Was there a change in dynamics of the group?

Pain:

Yes, there was a change. This period when I was here, from 1993-1996, then I came back to work with it a bit, it was very few people — Saul Perlmutter, Gerson Goldhaber, Alex Kim, Ariel Goobar, myself, and Isobel Hook was here in 1995, I think. Then, we had Peter Nugent, Rob Knop, Greg Aldering join at some point. The group grew a lot when things started to work. At the beginning it was very hard to get telescope time. Saul was very good at it, at trying to get telescope time. Without telescope time, one cannot do anything. Then, it became easier and easier to get telescope time and to get supernovae. The group grew much more.

Pavlish:

Would you say that there were phases in the project, based on the people working on it at one time? Would you say that these were two phases of the experiment — Saul Perlmutter, Gerson Goldhaber, Alex Kim, Ariel Goobar, Isobel Hook, and yourself as one phase, and then Peter Nugent, Rob Knop, Greg Aldering joining, as another?

Pain:

It was a smooth transition. It is hard to say. Certainly, at the beginning it was not obvious that something would come out of it. When it started to work, doors opened, and there were more opportunities to continue to expand. I do not think there was a big transition. It was an evolution.

Pavlish:

Do you now think of Lambda as Dark Energy? How does it come together with your particle physics understanding? Do you think of it as vacuum energy?

Pain:

That is an interesting part; there is a connection here too. When it appeared that there may be a reason for a cosmological constant, I personally was much more in favor of saying that this was vacuum energy, because of my particle physics background. I liked the idea that there is some vacuum energy. The question of the value is a big problem, because if it is vacuum energy, either it is zero or it is a very big number depending on what theory you ascribe to. This number does not seem right. But conceptually it is a very appealing and simple explanation. It gives a ratio of pressure to density of minus one, which is what we are measuring today. We already know that there are particles, that there is vacuum energy. It does not require any more sophisticated new theory. Now the problem is that it does not work quantitatively. But, conceptually, I still think that it is a very nice explanation. Maybe there is something that we do not understand, and it will turn out to be vacuum energy of some sort.

Pavlish:

When you were doing simulations of supernova explosions, did you think about the theory of that at all?

Pain:

When I say that we were doing simulations, we were not simulating the explosions themselves. We were simulating the detection process. This part, I know much less. I am following what is happening. There are modelers of the explosions who are working in isolation from the experiment. Sometimes there are connections. They try to reproduce what we see. It is not the same work as what we do. It is very complicated, practically speaking, to do modeling. There are people who are experts in modeling. It was, and maybe still is, the case that the analysis of the data, the observations, needs to reach a certain level of maturity before the two sides can work together.

Pavlish:

They are waiting for your results, for you to find more and more objects?

Pain:

Yes. Experts are talking to each other from time to time, but not really working together because they are two different fields. I do not have a good feeling about whether this is well understood or not. It seems to me that it is not yet at the stage where we can compare the data and the theory, and say whether a theory is wrong or right.

Pavlish:

Now I would like to ask some philosophy of science and some factual questions. My advisor, historian of physics Peter Galison, has written a book called How Experiments End, in which he analyzes how discoveries are made, in a little more sophisticated way, not asking for eureka moments of discovery. He gives the ideas of different philosophers of science: Thomas Kuhn held that experiments or observations have to fit into a preconceived theory; the social constructivists say that experiments end when the interests of scientists are satisfied. To Galison neither account is accurate. He does a few case studies in twentieth century physics of how experiments end, the discovery of neutral currents being one from particle physics. I would like to ask, in your opinion, when do experiments or observations come to their conclusions? How does consensus form around a scientific finding? I may suggest that an experiment ends when everyone in the research group agrees on a result, or when it has gained general acceptance in the greater scientific community. But, I do not really know. Paired with this question: can we put a time stamp on the discovery, in your view?

Pain:

Maybe it is not exactly answering your question, but one thing which I think is important is the timing. Before, it would not have been possible to do this experiment, because the computers and the detectors on the telescope could not have been big enough. I think this was a unique condition, to have the big detectors on the telescope and to have the computers to analyze the data, which, by the way, are not two independent things, because they both use silicon. This is something which I think is very important to realize. Now I turn to the acceptance of the discovery itself. First of all, the thing is not completely a revolution, because in Einstein’s theory having a nonzero cosmological constant is a possibility. It is not like we are breaking General Relativity. From a theoretical point of view, it is not a big thing to accept the existence of a new term like the cosmological constant, which is mathematically like vacuum energy. So I think it was not like we were on a completely exotic theory. It is General Relativity Theory. The cosmological constant, by the way, had come and gone a few times already. It is just coming back now.

Pavlish:

How does that change our view of the universe, or how science works, to have a discarded concept brought back? Do you think this changes things?

Pain:

You can say it is not that it was discovered; it had been approximated to a value of zero. It is physics. It is not mathematics. In physics, we approximate something very small to zero. We could say that before, it was just not detected. For me it was not hard to accept. From my background in the theoretical physics that I studied at University, this was a concept that was not difficult to accept.

Pavlish:

Even though when you would have taken courses, they would have said that Lambda equals zero?

Pain:

Yes. Mathematically, you look at the equation and you can have a term like that. So, ok, it happened to be zero. But you look at the equation, and see that it could be nonzero. For people in particle physics, from that point of view, it might be easier to accept this thing than for people who are just thinking that it is a new constant in the universe, without any justification. That is the way I look at it. It is not a revolution. It is not like discarding the theory of gravitation. This, I would be very reluctant to do. Today, there are people who are trying to do things beyond Einstein’s theory of Gravitation. This is hard for me to accept most of the time. That is the way I look at it. Now, when people recognized in the group that the results were showing a nonzero cosmological constant, even if there was no strong interaction with the other people doing the work, there were rumors that they were also finding results different than what was expected, which was also a nonzero cosmological constant. The data was strongly showing that there was no way the thing would be compatible with a zero cosmological constant. I remember Gerson Goldhaber was showing these plots during the meetings, where he was finding a negative mass density for the universe. He said, “Look, there is something wrong. I am finding a negative Omega_Matter.” Everyone was checking his results. We said, “Well, everyone seemed ok.”

Pavlish:

Did you check the way that Goldhaber did, or differently?

Pain:

Everyone checked his own part of the analysis. People who were working on measuring the brightness of the objects checked that. People who were working on the distance, redshifts, checked that. Others, calibration. Everyone was looking at his own part, and saying, “I do not find any problem.”

Pavlish:

Was that stressful, to be so responsible that everyone had done one part, and you were finding this compound result?

Pain:

I do not remember any stress. It was quite relaxed.

Pavlish:

But you were at those group meetings in the fall of 1997 when he was showing the plots?

Pain:

Yes. The first data is from 1993, 1994, in that paper. But then there is data from 1995-1997. We had quite some time to get used to the fact that the data was showing something different than what was expected. It was not a surprise. It came slowly, I think.

Pavlish:

You would do the analysis while you were going through, collecting the data. You would analyze perhaps a dozen supernovae at a time?

Pain:

Yes. We accumulated the supernovae gradually, collecting enough and starting to analyze them roughly, not in detail at first. Then, at the end, the whole sample was reanalyzed all together with the latest tools, the latest calibration that was needed. I think the fact that they were fainter than expected had been around from 1995 probably. When Gerson started to show these plots, it must have been 1995, because I was still around here permanently. In 1997 I was here for only part of the time. I was back in France already. It must have been in 1995. Then, he was adding new supernovae to his plot and still the supernovae were staying too faint. We had time to get used to the idea, in a sense. It was not completely obvious to publish. It is always difficult. When you are in a group, you get used to the idea. But to publish is a new effort. It is a difficult thing, because then the result goes to the rest of the world. Many, many things have to be looked at many, many times to be sure that there are no mistakes.

Pavlish:

Any particular stories you remember that you would like to share with the archive?

Pain:

No, not really.

Pavlish:

This is very helpful.

Pain:

I hope it is.

Pavlish:

I heard that this group consisted of members mostly from particle physics backgrounds, while the other group’s members came from astronomy backgrounds. People say that, but one does not really get a good sense of it until one talks to someone like you, a particle physicist. It is really interesting that you used similar skills in simulation here in the SCP as well as in your work at CERN.

Pain:

Saul Perlmutter was trained in this lab. But Ariel Goobar is from CERN. In fact, I knew him at CERN. He did his PhD in particle physics. If something changed afterwards in the sociology of the SCP, it is that more astronomers came and joined the group. There is a noticeable difference, I think. Greg Aldering and Peter Nugent are more astronomers. Gerson Goldhaber is also a particle physicist. The initial group probably needed some expertise in Astronomy, and so Saul Perlmutter realized that and started hiring people, when he could, with Astronomy backgrounds. It certainly changed the sociology of the group.

Pavlish:

There was some press in France after the result came out?

Pain:

Yes, there was a bit of press, but not as much as here. First of all, traditionally, there is much less, although that is changing. In my Institute, it was not well understood until well after the discovery.

Pavlish:

How much after the discovery?

Pain:

When I went back to Paris in 1996, I wanted to start a small group, and it took me a couple of years to convince my colleagues. We had to have preliminary results, at the end of 1997 or so, 1998, when it was clearly effective and I got the green light to start a group there to join the SCP. There was some reluctance initially from some people, because it was unusual. After that we got very good support. Now, the Institute is very supportive of this project, of measuring Dark Energy. In a sense, the fact that it is Dark Energy helps, in terms of particle physicists thinking highly of the project. Now, the theoretical aspect of vacuum energy, Dark Energy, is physics that particle physics wants to understand. It is really the same science. The tools we are using are different. We use telescopes. That is a big difference, because usually people who use telescopes are from Astronomy departments. It is unusual for particle physicists to invest some thought into Astronomy. But this is also changing. The new detectors, astronomical tools, to find Dark Energy, have features that make them similar to particle physics detectors. Calibration aspects, monitoring stability, are similar. It is a business that is really in between now, and accepted as such. But in the beginning it was not obvious that it would be that way.

Pavlish:

Thank you very much for your time. I enjoyed learning from your knowledge of the subject.