Photo courtesy of Robert Jennings
This transcript may not be quoted, reproduced or redistributed in whole or in part by any means except with the written permission of the American Institute of Physics.
This transcript is based on a tape-recorded interview deposited at the Center for History of Physics of the American Institute of Physics. The AIP's interviews have generally been transcribed from tape, edited by the interviewer for clarity, and then further edited by the interviewee. If this interview is important to you, you should consult earlier versions of the transcript or listen to the original tape. For many interviews, the AIP retains substantial files with further information about the interviewee and the interview itself. Please contact us for information about accessing these materials.
Please bear in mind that: 1) This material is a transcript of the spoken word rather than a literary product; 2) An interview must be read with the awareness that different people's memories about an event will often differ, and that memories can change with time for many reasons including subsequent experiences, interactions with others, and one's feelings about an event. Disclaimer: This transcript was scanned from a typescript, introducing occasional spelling errors. The original typescript is available.
In footnotes or endnotes please cite AIP interviews like this:
Interview of Robert Jennings by David Zierler on June 22, 2020,
Niels Bohr Library & Archives, American Institute of Physics,
College Park, MD USA,
www.aip.org/history-programs/niels-bohr-library/oral-histories/47211
For multiple citations, "AIP" is the preferred abbreviation for the location.
Interview with Robert Jennings, retired since 2018 from the FDA’s Center for Devices and Radiological Health, where he was a research physicist. He recounts his childhood in Southern California and the formative influence of Sputnik on his physics education. Jennings discusses his undergraduate experience at Occidental and his master’s work at UCLA, and he describes his postgraduate work at the NASA Ames Research Center, where he worked on optical detectors. He explains his decision to pursue a PhD at Dartmouth, where he studied under John Merrill and worked on Tonks-Dattner resonances. Jennings describes the circumstances leading to his postdoctoral research in Brazil at the Institute of Atomic Energy, where he worked on medical radiation in the Division of Solid-State Physics. He discusses his subsequent research with John Cameron at the University of Wisconsin’s Medical Physics section to develop spectroscopy systems. Jennings explains that the expertise he developed in radiation and modeling in Wisconsin served as his entrée to the FDA, which excited him as the place where the most impactful research was happening at the time. He surveys the major projects he was involved with over his career, including human visual signal detection, quality assessment of medical devices, improving mammography diagnostics, tomosynthesis, and CT scanners. At the end of the interview, Jennings surveys the fundamental developments that have advanced over the course of his forty-plus-year career at the FDA, his major contributions in tissue simulation science, and why he believes AI will become increasingly central to advances in medical imaging.
This is David Zierler, Oral Historian for the American Institute of Physics. It is June 22nd, 2020. It is my great pleasure to be here with Dr. Robert Jennings. Bob, thank you so much for being with me today.
Oh, I'm happy to be part of this project.
OK. So, to start, would you please tell me your most recent title and institutional affiliation.
I retired at the end of 2018 and my position was research physicist at the Food and Drug Administration’s Center for Devices and Radiological Health (CDRH), which is one of the centers in the FDA. And, within CDRH, I was in the Office of Science and Technology Laboratories, and within that office, I was in the Division of Imaging, Diagnostics, and Software Reliability.
OK. All right. Good. So, let's take it back to the beginning. Bob, tell me a little bit about where you come from. Where did you grow up?
I am a Pearl Harbor baby.
Oh!
I was born in Minnesota, almost exactly 266 days after the bombing of Pearl Harbor, 266 days being the nominal human period of gestation.
[laugh]
My dad worked for the U.S. Army Corps of Engineers during the war and ended up in California at some point. I don't know that history. So, when I was a year and a half old, the family moved to California, and I grew up in Los Angeles.
Oh, I see. And where were your parents born?
My mother was born in Austin, Minnesota, which is famous for the Hormel packing plant, or maybe not so famous, but, anyway, that's where it is. And my dad was born in Lakota, North Dakota.
And what was your father's career before the military?
He was a civil engineer and he worked for the Minnesota Highway Department.
I see.
So, the lore of the family is all about how they lived all over the state as he would go from one project site to another.
Mm-hmm.
Interesting stories like, in the wintertime, the first thing you did was find a farmhouse where you could leave your lunch so it wouldn't be frozen by lunchtime.
Yeah.
And then, I guess when my dad got to California after his work with the Corps of Engineers was over, he took a position with a civil engineering firm in LA. And, as soon as the war was over, the pent-up demand for housing and the desirability of living in California produced a boom that lasted essentially for the rest of his life. He was at ground zero for the tremendous growth of Los Angeles and the San Fernando Valley and all of that.
And what neighborhood did you grow up in, Bob?
The first place that I remember is called Park La Brea, and La Brea is the La Brea tar pits. (Brea is the Spanish word for tar.)
Right.
It was a big housing development. We lived in a townhouse. When I was 6, we moved to West LA. The neighborhood was called Rancho Park, not too far from UCLA. And that's where I lived until I went off to college and later on got a job and moved to Northern California.
And when did you start exhibiting an interest in math and science?
I just remember things that my mom told me, like, when I was a little tiny kid, I asked her how they make doorknobs.
[laugh] Yeah.
I don't know that I showed any particular talent for it until I was in junior high.
And then what happened in junior high?
I just found that I was pretty good at it.
Did you go to public school throughout your childhood?
Yes.
And, when it was time to think about college, were you thinking specifically that you wanted to major in physics, or that decision came during college?
I don't really remember thinking much about it at all. It seemed like the natural thing to do. One factor that I’m sure had an influence was being placed in “PSSC” physics in high school. PSSC stands for Physical Science Study Committee, which was formed in response to the launch of Sputnik. Sputnik was launched in 1957, when I was entering high school. By 1959 the PSSC had produced a curriculum and a syllabus for a physics course that was somewhat more advanced than the “regular” physics being taught at the time. The course included several experiments, which I found interesting.
Another factor was that I had the good fortune to participate in a program that was joint with UCLA and the LA Public Schools. They picked a few people from a number of Los Angeles high schools to take courses at UCLA during their senior year, and one of the courses that I took was—I believe it was called analytic geometry. They weren't teaching calculus in high school at that time, and it was algebra and geometry and trigonometry. And I enjoyed that and did well.
Why did you choose Occidental?
[laugh] I was turned down by Stanford and I sent the application late to Harvey Mudd, and Occidental was recommended by the people from UCLA. I'm an introvert, always was, always will be, and they suggested that a small school like Oxy would be a very good place for me.
And did you have a good experience there?
Yes.
Now, I see that you were a double major in math and physics.
Yes.
Was that based on advice that you were given, that that was a smart way to go?
Three or four of us in the physics major looked at how much math we were getting and thought it would be cool to have a second major by your name. So, it didn’t involve a whole lot of extra courses.
Well, that's good.
And it seemed like a good idea at the time.
Right, right. And what kind of physics were you most interested in as an undergraduate? Were you just sort of broadly taking classes, or you were already developing more professional-oriented interests as an undergrad?
I was just taking a regular list of courses. My primary interest in things that are related to physics was making cars go fast. I was a hotrodder.
Really? What kind of cars did you like to work on?
It was later on that I really got into it in a big way, but that was in the era of the Corvair with the rear engine.
Mm-hmm.
And I got one that had been breathed on by Andy Granatelli. I don't know if you know that name.
I don't. Who is he?
He and his brothers were originally from Chicago, and they had all kinds of irons in the fire. Andy Granatelli is best known for multiple appearances at Indianapolis, but he had a business which involved putting superchargers on small cars. And so, I had a blown Corvair.
And would you race?
No. Did go to the dragstrip a couple of times just to see what kind of performance we could get. That was the one place where you could get an elapsed time and a top speed for a quarter mile legally.
[laugh] And then, Bob, at what point did you realize that you wanted to continue on in a graduate program in physics?
I wasn't really focused on a particular branch of physics. But, when I graduated from college, it was easy to go to UCLA. It was nearby. I could live at home. And I thought it would be a good way to learn a little bit more about the field.
Mm-hmm. And was your intention only to get the masters there or were you thinking you were going to stay there for the PhD?
I wasn't really as focused as I should have been. I was more interested in tinkering with cars. So, at the end of two years, I was invited to leave. At that point I decided that I should look around and find a job, which is how I ended up at NASA’s Ames Research Center.
Now, did you have a thesis for the master's program?
No.
So, you just completed enough coursework to get the master's?
Yes.
And what job did you get after UCLA?
I think it was called research physicist. I worked with a group that was working on optical detectors, particularly photomultiplier tubes, at NASA’s Ames Research Center.
And what were some of the big projects that were going on at the time at Ames?
I wasn't involved in any of the wind tunnels, but the big deal at Ames was all the different wind tunnels. There was the 40 x 80, and that's 40 feet x 80 feet, so it was a very large physical facility for low-velocity stuff. There were hypersonic tunnels, so called blow-down tunnels, where, since you can't generate on a continual basis extremely high velocities, you pump up a chamber and then let the high-pressure air loose, typically by punching a hole in a “window” (a metal diaphragm), and you get your data as the pressure reduces. So, it's not a steady-state kind of a thing.
I shared an office with a physicist who—he described himself as a “hack physicist,” but he knew how everything in the world worked, as far as I could tell. And he was extremely helpful to me in getting an appreciation for the nuts and bolts of doing physics. One of his jobs was supporting another scientist who was studying circadian rhythms, the way an organism behaves during the day and the evening, the daylight cycle. And so, they had a very large facility with chickens who were exposed to lights of different colors on varying time schedules.
And my office colleague's responsibility was to work on producing an appropriate environment and then keep the system running. He used a 2500-watt xenon arc lamp, which sometimes would not light, and so there was a complicated system of a guard going by every so often at night, and if the light wasn't on, then he would call my colleague, who would then go and try to figure out why the xenon arc lamp had not fired. One aspect of the experimental setup was that there was a conveyor belt to catch the feathers and bird droppings and run them to a garbage disposal. With all the feathers, the garbage disposal would sometimes become clogged up. And so that was my colleague's other responsibility, keeping the garbage disposal running. So that was kind of the joke, that the project was the chicken shit project.
But we worked on developing high-sensitivity photomultipliers, and I got the job of designing hardware that allowed us to do the testing in an efficient way, and he gave me some guidance on that. I did most of the work, but it would've been much less elegant without his input.
And, Bob, what was the end use for this work? Who was using these instruments?
The guy who headed the project said, “We love astronomers.” The idea was that he wanted to build PMTs that were extremely sensitive, to optimize measurements. I don't know the details, but there were spectroscopic measurements and motion measurements. I don't know what all. But the idea was to advance the basic science. The technique that we were working on was using total internal reflection to bounce light through a semitransparent photocathode. In normal use, the light comes in perpendicular to the photocathode and makes a single pass through the photocathode. But if you can bring the light in using an optically coupled prism with the entrance face at 45 degrees, you can make it bounce along between the photocathode and the glass end window of the tube, thus making multiple passes through the photocathode, and you can collect more of the light in the photocathode and generate more electrons. The technique doesn't work for imaging, but it improves the sensitivity by a significant factor.
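[Editorial illustration, not part of the interview: a minimal Python sketch of why multiple passes through a semitransparent photocathode raise sensitivity. The single-pass absorption fraction and the number of bounces are hypothetical placeholders, and real tubes have reflection and collection losses that this ignores.]

    # Rough gain from bouncing light through a semitransparent photocathode,
    # assuming a single-pass absorption fraction "a" and ignoring all other losses.
    def absorbed_fraction(a, passes):
        """Fraction of incident light absorbed after `passes` traversals."""
        return 1.0 - (1.0 - a) ** passes

    single = absorbed_fraction(0.15, 1)   # hypothetical 15% single-pass absorption
    multi = absorbed_fraction(0.15, 8)    # hypothetical 8 bounces via total internal reflection
    print(f"single pass: {single:.2f}, multi-pass: {multi:.2f}, gain: {multi / single:.1f}x")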
And, Bob, at what point did you realize that you wanted to go back to school?
Just looking around at where I was and where I fit in the hierarchy, I realized that I wasn't going to be able to do the kinds of things I wanted to do unless I had an advanced degree beyond the master's.
And how did Dartmouth come together? Why did you end up there?
A friend of mine from Oxy was in graduate school there. He had switched from physics to math, but he said, “Why don't you come and visit at Christmastime and we'll teach you how to ski?” That didn't work out, but...
[laugh]
And you can meet some people in the physics department. And the prospect of going to a new place back east and where it gets really cold seemed like a lot of fun, and I liked the people that I met in the physics department. And my wife said, “Yeah, let's do it.”
At what point—when did you get married?
My wife is a year younger than I am, so she graduated from Oxy a year after I did. I'd been at UCLA for a year and, when she graduated, she got a Fulbright and she spent that in Chile. The Christian Democrats had, for the first time, gained the presidency. Anyway, she had been in Chile for a year. She came back, I was at the end of the two years at UCLA, and we were trying to figure out how to find both of us positions in the same location. She was looking at SAIS, the School of Advanced International Studies at Hopkins, and at Stanford, and I was looking at the Naval Research Lab in DC and at Ames. After a number of false starts, we ended up with her getting an opening at Stanford in Latin American studies and I got the job at Ames. So that was in the Fall of 1966. We were married in December of that year.
And what was your wife able to put together for herself while you were studying at Dartmouth?
It's an interesting story. The high school had offered a teaching position in history to someone who, at the last minute, refused to sign a loyalty oath, so the school was stuck without a history teacher. My wife had a master's degree from Stanford, which is a pretty good credential, and the school said, if you will agree to take education courses in the summer at Dartmouth, we will hire you as a history teacher. And so, she spent the next four years teaching history in Hanover High School.
Bob, what kind of physics were you looking to do at Dartmouth?
I didn't really have anything in mind when I got there.
Did you think of yourself more as an experimentalist or a theorist, or how did you see your skills matching up with the kind of physics that you wanted to study?
Absolutely experimentalist. Later on, I asked the professor who was my thesis advisor for a reference letter, and he said I was a pretty good experimentalist.
And who was that, Bob? Who was your thesis advisor?
His name is John Merrill. New graduate students interviewed with faculty members looking to take on new group members for their research groups, and he had an opening. He asked me to work on setting up an experiment for the department’s advanced lab. I guess what I did worked out ok.
[laugh] What kinds of things were you doing that impressed him so much?
When I first started talking with him and his group, he was in charge of the advanced lab, which was for, I think, juniors and seniors, and it was truly high-level experimental stuff. And they wanted to set up a new experiment on what's called Tonks-Dattner resonances. The idea is that if you have a plasma and you shine microwaves on it at the plasma frequency, you'll get reflections. The idea was to set up a microwave generating system and generate a plasma with an ordinary fluorescent lamp tube. The plan was to pulse the tube and observe the microwave reflections as the plasma decayed. I set it up in three or four days, and he was surprised.
I must admit that the only reason things went together so quickly was that a number of more advanced students and one very talented technician provided all kinds of assistance. In particular, the use of the fluorescent tube had already been worked out by students in the plasma physics group as a very inexpensive method for doing plasma physics experiments. They very generously provided advice and a number of hardware bits and pieces.
[laugh] Not just at how fast you set it up, but probably also how well you set it up. Bob, what was your dissertation thesis on? What did you work on?
The title was “Energy Gaps and Infrared Spectra by Superconductive Tunneling.”
And can you describe the project a little bit?
A physicist named Ivar Giaever and a couple of others got a Nobel Prize for experiments in electron tunneling in superconductors. The tunnel junctions were made of aluminum, aluminum oxide as an insulating layer, and lead. The transition temperature of lead is 7.2 degrees. So you're automatically there if you're in liquid helium, which is at 4.2 degrees. The tunneling current doesn't start until you exceed the band gap in the lead superconductor. Are you familiar with the density of states diagram?
No, I'm not.
There are base states and then allowed states at higher energies, and there's a gap in what, in a regular conductor would be a continuum. And so, you don't get any current until the voltage applied to the junction exceeds that gap. There are no allowable states for the electrons to go from one side of the junction to the other until you get to a slightly higher energy, which is the energy gap. And so that was what the Nobel Prize was for. It's relatively straightforward to set up that kind of a system—once Nobel Prize winners have shown you what to do. You need a vacuum system with the ability to put enough power in to vaporize aluminum and lead, and so you lay down an aluminum strip, you open the bell jar to air, and the aluminum immediately oxidizes and forms an insulating layer. And then you pump it down again and you evaporate the lead over it. And so, that's your experimental sample.
But it turns out that if you put junk, dirt, in this case vacuum pump oil, in the junction, then there are allowable states in that gap. And so, you can interrogate that energy distribution just by observing the tunneling. And the way that the electronics is done is that you apply an oscillating current with a very small voltage amplitude. And, if you think about the current, when you get to an energy where one of these impurities allows the tunneling to occur, you get an increase in the current. And if you take the second derivative of that, you get a peak. And that turns out to be the infrared spectrum of the junk. So, with a lock-in amplifier and an extremely high impedance source, you can drive the current to be what you need it to be. So, you can record the infrared spectrum with the lock-in amplifier.
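[Editorial illustration, not part of the interview: a minimal numerical sketch in Python of the point above, that a small step in junction conductance at an impurity (vibrational) energy shows up as a peak in the second derivative of the tunneling current. The step height, mode energy, and smearing width are hypothetical, and a real measurement uses a lock-in amplifier rather than numerical differentiation.]

    import numpy as np

    V = np.linspace(0.0, 0.5, 2001)           # bias voltage, arbitrary units
    E_mode = 0.2                              # hypothetical impurity mode energy
    width = 0.005                             # smearing of the conductance step

    # Conductance picks up a small, smoothed step once the bias exceeds the mode energy.
    G = 1.0 + 0.05 * 0.5 * (1.0 + np.tanh((V - E_mode) / width))
    I = np.cumsum(G) * (V[1] - V[0])          # current = integral of conductance

    d2I = np.gradient(np.gradient(I, V), V)   # numerical second derivative d^2I/dV^2
    print("peak of d2I/dV2 near V =", V[np.argmax(d2I)])   # close to the mode energy, 0.2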
Bob, did you see this research as having practical implications, or was this mostly about basic science?
I'm not aware of anybody using that technique for analytic purposes, but it certainly was interesting. And one of the things that I looked at was the temperature dependence of that process, so that was an interesting piece of basic science.
Bob, after you defended, what were your ambitions at that point, what did you hope to do?
Get a job.
[laugh] Were you thinking more industry or teaching? What were you most interested in?
Are you familiar with the way things were back then?
I know that in the early ‘70s the job market was pretty tight, if that's what you're referring to.
Yes. I forget the author's name, but his report was in Physics Today, an article about how the country was in love with PhD physicists right after the war. After all, they developed radar, they built the bomb, they saved us from fascism. And the author quoted somebody in the industry saying about PhD physicists, “You keep minting them, we'll keep hiring them.”
[laugh] Right.
And that fascination with advanced-degree physicists had pretty much run its course by the time that I finished up. And so-- Well, let me tell you about a colleague who finished up a little bit later than I did. He wrote letters to 70 departments looking for a teaching job. Not only did he not get a job, he didn't get a single response. So that's how bad the job market was. He ended up getting a job setting up X-ray lithography systems for making computer chips for computer games. And I don't know what's become of him since, but it was a really bad job market.
And it turned out that another person in the physics department who finished up a little before I did had married a woman who was teaching in the nursing school at the medical center in Hanover. She had been in the Peace Corps, in Brazil, and she wanted to go back, and so they went. Then, when he heard that I was finishing, he wrote me that he was coming back to the United States to do a postdoc and would I like to come and take his position for a year. My wife, having lived in South America was fluent in Spanish, and during her Fulbright she had spent several months in Brazil and studied Portuguese, so she thought that was a great idea to go to Brazil. And so that's how I got to Brazil. That was where I got the job.
Was this an exciting opportunity for you, or was it really the only opportunity for you?
Well, it was the first one that I came across, and it seemed like a really interesting thing to do. The connection to the place that I ended up working was that the colleague in graduate school that told me about the position had worked on elucidating the mechanism of thermoluminescence in lithium fluoride. Do you know what TLD is?
I've heard of it.
Lots of different crystals, when you irradiate them, will be changed in such a way that, when you heat them up, they will emit light. And the amount of light is in proportion to the amount of radiation. So, they are very useful in dosimetry where you need something small and something that records the history that it experiences. One of the better-known applications, aside from just regular radiation dosimetry, is in dating of artifacts, typically pottery. If you find an old water jar or bowl or whatever, you grind it up, separate out the inorganic crystals. You read them out by heating them, and then, after the fact, you do the calibration by irradiating them with a known dose of radiation and doing the readout again. And so, from the amount of radiation that they've accumulated, you can estimate how long they've been around, the idea being, as pottery, when they're fired, they're zeroed out. And so, what you read is what they've accumulated in their lifetime after the piece was created.
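[Editorial illustration, not part of the interview: the dating arithmetic described above, as a short Python sketch. Every number here is a made-up placeholder; in practice the environmental dose rate has to be measured or estimated for the burial site.]

    # Hedged worked example of thermoluminescence dating: read the natural signal,
    # calibrate with a known dose, then divide the accumulated dose by the annual dose rate.
    natural_signal = 4.2        # light output from the ground-up sherd (arbitrary units)
    cal_signal = 1.0            # light output after a known calibration dose
    cal_dose_gray = 1.5         # the known calibration dose, in gray

    accumulated_dose = natural_signal / cal_signal * cal_dose_gray   # ~6.3 Gy
    annual_dose_rate = 0.003    # assumed environmental dose rate, gray per year

    age_years = accumulated_dose / annual_dose_rate
    print(f"estimated age: {age_years:.0f} years")   # ~2100 years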
Was there an expectation for you to speak Portuguese, or was English the shared language in São Paulo?
My friend was still there in Brazil when I got there. And he did me the great favor of, before I got there, telling everybody, “Don't speak English with Bob. If you speak English with Bob, he'll never learn Portuguese.”
[laugh]
So, they were nice to me. They spoke English when I really needed to know what was going on. But I took being there pretty seriously, and so my wife and I both studied Portuguese at—it was called the Brazilian US Cultural Union. And, in fact, it was just a language school. Almost all the students were Brazilians who wanted to learn English so they could go to the United States and could do business.
Dartmouth has what I think is a marvelous language program. Before we left, I had studied for a few weeks. And so, I knew some preliminary Portuguese, and that allowed me to score high enough on the diagnostic test that the language school used so that I managed to get into the upper of the two levels of Portuguese that they taught, so my wife and I were in the same course. We spent, I think it was about six months, going twice a week, three hours each time, to take the course. And, by that time, I was reasonably fluent. But I still have the accent of a gringo, a foreigner.
Uh-huh.
But I can teach classes, I can give hour-long lectures in Portuguese.
And was the job there—were you teaching as well as conducting research or was it research only?
The way it worked was that my primary place of employment was the Atomic Energy Laboratory, Institute of Atomic Energy, which, at that time, was part of the University of São Paulo. It was on the campus, and it was right next to the physics department. And the person that I worked for was a division director in the Institute of Atomic Energy and a full professor in the physics department. And I don't know the politics or the mechanics of it, but I worked at the IEA, the Institute of Atomic Energy, but I was paid by the physics department. And so, part of the deal was that I would teach a lab course at the physics institute, typically an evening course. To get me started, they paired me with a Brazilian teacher. But later on, I could work by myself.
Were you connected, when you were in Brazil, with your colleagues in the United States in terms of keeping abreast of the research that was going on in the States, or was it mostly you were fully concentrating on what was happening in your institute in São Paulo?
Mostly concentrating on what was going on in São Paulo.
And what were some of the major projects that you were involved in during your time there?
I was in the Division of Solid-State Physics, and the-- Well, there were a number of activities that I was not involved in having to do with looking at the effects of radiation. The thing that I was most involved with had to do with TLD, and the idea was to use it for dosimetry for medical applications. One woman was working on a TLD necklace. The TLD chips are—TLD-100 is a product of what used to be called Harshaw Chemical Company. A long time ago they were bought by a French company. Anyway, they're an eighth of an inch by an eighth of an inch by 35 thousandths, so they're tiny. And what she was doing was putting them together in an array that could be worn by patients undergoing radiation therapy for thyroid cancer. So you'd have this relatively unobtrusive, noninvasive array of radiation detectors, and it allows you to map out the spatial distribution of radiation. So, putting it together and figuring out how to calibrate it and all of that was what her thesis was. I wasn't involved in it, but I was aware of what was going on.
The biggest thing that I worked on was building a TLD reader. You had the issue of, how do you measure the light? How do you do the heating in a reproducible way? The technique had been championed by a person named John Cameron who was at the University of Wisconsin, and he had built a reader, and it was in pieces in the sense that the PMT HV supply was off-the-shelf, just a simple DC power supply, the PMT output was measured with an electrometer and recorded with a general-purpose strip chart recorder. We used a digital voltmeter to measure the signal coming from a thermocouple that was spot welded to the bottom of the little pan which was the heating element for heating the chips. The device he built was elegant but not very robust, so the person who ran the lab where I worked wanted to build something that was able to withstand the rigors of being used by undergraduates and graduates who were relatively inexperienced. So, my job was designing and arranging with local machine shops to build all the pieces that we needed.
And, by that time, I could speak Portuguese well enough that I could negotiate and explain what I wanted quite well. São Paulo is a big, modern city and there's no lack of all kinds of first-rate engineering and mechanical talent. So it was a simple matter to find, in the local neighborhood, people who could build the kind of stuff that we needed. The light detection was typically done with a photomultiplier. I had experience with photomultipliers, so those things all fit.
And, Bob, to foreshadow a little bit, were you getting more interested in the kind of work that involved human health research that would be so integral to your time at the FDA? Were you thinking along these lines in São Paulo?
I was happy to be doing something that was useful in a medical sense, but I really didn't have a thought about doing something like that as a continuation of my career.
Mm-hmm. And then, was it a two-year program? Was there a set time when you were set to leave, or did you move on to Wisconsin because you were interested in your next opportunity?
My reason for leaving was a personal one. Both of my parents were in ill health, and I wanted to—well, two things. I wanted them to see our child, who was born in Brazil, and I also wanted to spend time with them. And so, we decided that it was time to go back to the United States, and we would just see what happened. I didn't have definite plans aside from just doing a job search and looking for things to do.
And, it turns out that John Cameron, whom I just mentioned, was the head of the Medical Physics Section of the Radiology Department at the University of Wisconsin in Madison. It later became an actual department of the university. My colleague from grad school, when he went back for his postdoc, had gone to Wisconsin because of the TLD connection. He let me know that another professor in the radiology department had obtained a grant from NIH to look at contrast mechanisms in radiographic imaging, and he was looking for two post-docs to work on his grant. At that time I was looking around. Do you know the name Schlumberger? It looks like "Schlum-burger," but it's French, so it's pronounced "Schlum-ber-zhay."
OK.
I also interviewed with Schlumberger. They are one of the big players in oil well development equipment, and they were looking for a physicist to build a neutron generator. I forget exactly what the reaction is. It's something like protons on beryllium at 16 MeV generates neutrons. The idea is that you can use neutrons to activate stuff a mile down an oil well and figure out what the mixture you're drilling through is composed of.
Schlumberger was looking for a physicist, and I knew that there was a job opening in the radiology department at Wisconsin. So, I went and interviewed with the professor in Wisconsin. He has since passed away, but he was another one of these guys that knew how everything worked and could make just about anything work. And he offered me the job before I heard from Schlumberger, and so that's how I got to Wisconsin. But I think part of my getting the job there had to do with the fact that John Cameron was the head of the department and knew the people I had worked with in Brazil.
Part of what the professor that I went to work for wanted to do was look at the mechanics of X-ray imaging. In particular, he wanted to measure X-ray spectra, and that was something I had done in my lab work as an undergraduate, and during my time in Brazil. So, with part of his grant money, he bought a germanium detector and the necessary electronics, and I set up a spectroscopy system and we wrote some papers on X-ray spectra. And that's one of the things that I'm recognized for these days.
And what is that designed to do? What do you use that for?
It's a basic input for modeling of X-ray systems. The X-ray spectrum from an X-ray tube consists of a continuum like you have with sunlight, with some structural features superimposed that are characteristic of the anode material. The details of the spectra are central to understanding the relationship between contrast and the dose delivered to the patient. X-ray imaging of patients is of necessity noise limited, because you always want to use the smallest dose to the patient that gives you the information you need, so the images can be “grainy.” And so, optimization of the trade off between the radiation dose and the contrast is a common research topic. Everybody does it, let me put it that way. Everybody wants to know how that trade-off works.
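[Editorial illustration, not part of the interview: a back-of-envelope Python sketch of the contrast-versus-dose trade-off, using the simplest photon-counting (Rose-model) picture in which SNR is roughly contrast times the square root of the number of detected photons. The attenuation coefficients, lesion thickness, and photon counts are hypothetical placeholders, not measured spectra.]

    import math

    def subject_contrast(mu_lesion, mu_background, thickness_cm):
        """Beer-Lambert subject contrast of a lesion against its background (mu in 1/cm)."""
        return 1.0 - math.exp(-(mu_lesion - mu_background) * thickness_cm)

    def snr(contrast, photons_detected):
        """Poisson-limited signal-to-noise ratio: SNR ~ C * sqrt(N)."""
        return contrast * math.sqrt(photons_detected)

    # Lower energy: bigger attenuation difference (more contrast), but fewer photons
    # reach the detector for the same dose to the patient.
    print("low-energy beam: ", snr(subject_contrast(0.80, 0.60, 0.5), 5_000))
    print("high-energy beam:", snr(subject_contrast(0.35, 0.30, 0.5), 20_000))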
Bob, was your work during this period directly related to most of the other bigger projects that were happening in the lab or was this sort of a more focused project that only you and a few colleagues were working on?
At the time that I was hired, the person that hired me also hired a Kiwi, a New Zealander, who had just finished a PhD in nuclear physics. People understood what we were doing. My colleague was computer literate. When I got out of undergraduate school, Oxy had just gotten its first computer. It was an IBM 16-something—1620. And I never got close to it. The professors in the science departments used all of its time when it was first there. And, at UCLA, I had no access. It wasn't until I got to Dartmouth that I did any programming. Dartmouth is where BASIC was developed.
What is BASIC?
It's an acronym but it's a computer language. Do you know the name John Kemeny?
I've heard of him.
John Kemeny was a math professor. He's best known to people outside of math as the chairman of the commission that examined the Three Mile Island accident. He and a number of colleagues developed BASIC. They had support from—I forget what the order was; at one point, GE and then Honeywell, or Honeywell and then GE. There was this huge, time-shared computer, probably no more powerful than a Raspberry Pi, if you know what that is.
Yes, I do. [laugh]
So, when I was doing my thesis work, I was encouraged to do everything, as much as I could, with computers. If you were a nobody, you got an ASR 33. That's a teletype, and everything was typed out by the teletype or typed in by the user. If you were somebody important, you got an ASR 35, which had a paper tape punch. And so, part of my experience at Dartmouth was learning to read the paper tape enough to know where the start of a program was, that carriage return line feed—I have strayed a little bit but--
That's all right.
What was your BASIC question?
Well, I was just curious if the work that you were involved in at Wisconsin was directly related to the larger research projects in the lab, or it was mostly a project that you and a few colleagues were working on?
My colleague, the computer-literate person, in addition to exposing me to the use of computers, was also, I think, instrumental in getting people in the department interested in doing computer simulations. I was a user of the HP-45, a programmable handheld calculator, but a lot of people were just beginning to get into using computers. That was in the days when the record for data transfer rate was set by a guy in tennis shoes with an armful of eight-track tapes running across the quad. It was not easy to use the computer because you had to have a deck of cards and you would submit them and then come back the next day and get a high-speed printer printout of your results. Then you'd have to think about what you're going to do to fix the problems that had cropped up, that kind of stuff.
Mm-hmm.
Looking back on it, I was not aware of it at the time, but I think people were jealous of the fact that I got to use the germanium detector and multichannel analyzer. Those were exotic things. The germanium detector was a relatively new technology, and it offered a considerably higher resolution than anything that was previously available, unless you wanted to use crystal spectroscopy.
And what was it so good at detecting?
Well, it had very high resolution for X-rays. And so, one of the things about my experience is how I have benefitted so much, often without being aware of it, from people who really knew how to do stuff. One of the things about germanium detectors is that they count X-rays one by one.
What does “one by one” mean, Bob?
They respond to an individual X-ray and produce a signal that's proportional to its energy.
OK.
And the unfortunate thing about radiography is that the patients are usually alive, and they breathe, and their hearts beat, and they twitch. And so, there's a real need to make your image in a very, very short time, a fraction of a second if you can manage. And that's exactly the opposite of what a germanium detector wants, because it has a count rate limitation related to the time that it takes to process each individual signal. And the benefit that I received from the guy that hired me is that he was a superb engineer. He knew where all the junk was in the whole radiology department, and he found an old, abandoned X-ray generator. He was able to modify it so that it would operate at very, very low currents, which meant a low count rate for the germanium detector, which allowed you to collect data easily. You couldn't put a detector in front of a regular X-ray machine and use a pulse like you would for X-raying a person and get reasonable results. And so, I didn't realize just how important it was that he did that. But he did it in a couple of days, and I was off and running without, at the time, being aware of just what a contribution it was that he had made to my being able to get data.
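[Editorial illustration, not part of the interview: a short Python sketch of the count-rate limitation mentioned above, using the standard non-paralyzable dead-time model, measured rate = true rate / (1 + true rate × dead time). The 20-microsecond processing time is an assumed, illustrative value.]

    def measured_rate(true_rate_cps, dead_time_s):
        """Non-paralyzable dead-time model for a pulse-counting detector."""
        return true_rate_cps / (1.0 + true_rate_cps * dead_time_s)

    tau = 20e-6   # assumed 20 microseconds to process each pulse
    for n in (1e3, 1e4, 1e5, 1e6):
        m = measured_rate(n, tau)
        print(f"true {n:>9.0f} cps -> measured {m:>9.0f} cps ({100 * m / n:.0f}% recorded)")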
And when did you realize the magnitude of the contribution?
Years later. During my time at Wisconsin with my colleague, the Kiwi, we were able to do some simple-minded but, I think, important modeling that showed that in mammography, the source that had been developed was close but not right on the optimal operating point. The source of choice at the time was a molybdenum target tube. The reason is that the characteristic lines for molybdenum are at a very low energy. Breast imaging is all soft tissue, and so, to get contrast, you need to have a very low energy. A French company, CGR (Compagnie Générale de Radiologie), was selling a device with a molybdenum anode that had been developed as part of the general development of mammography. Previously, a tungsten target tube was used, and that relied just on the continuum. It was typically higher energy. The doses were quite high.
The resolution of normal imaging systems was not adequate for the small things that you'd like to see in a mammogram, the calcifications which are tiny, on the order of one or a few hundred microns. So, the image detector of choice was plain film, whereas for regular imaging you use intensifying screens, which are light-emitting devices that are high-Z materials, high atomic number materials, which are very good at absorbing X-rays, whereas silver-based emulsion is terribly leaky. The screens expose film with visible light and thus reduce the amount of radiation needed to form an image, but the effect of the screens is to produce some blur. So, the low energy source and the development of high-resolution imaging systems to go with it were major developments right at the time when I was at the University of Wisconsin. And so, being able to do modeling of that combination was essentially what got me my job at FDA.
I see.
I gave a paper on it. And I don't know if the name Dave Brown came up?
No, it did not.
He was the head of the branch that I ended up in. And did the name Bob Wagner come up?
Yes.
Bob Wagner worked in the branch that Dave headed.
OK.
And they were incredibly smart, super high-powered, and they were very much interested in the whole optimization issue. And the paper that I gave appealed to them, and so they invited me to come and work with them.
Now, when you started there, was that as a full-time federal employee, or you started as a contract position?
I was a full-time federal employee.
Mm-hmm. And what was exciting to you about joining FDA?
You get to play with the big boys.
[laugh] So can you talk about that? I'm curious to hear how this developed. By saying “the big boys,” is this to suggest that the most impactful research in your field is really happening at the FDA at this point?
There were a lot of good people in the field. Bob Wagner was a superstar in the field and recognized as such at the time that I went to work at FDA, so that was a big draw for me. Bob Wagner's start in medical imaging involved studying the work of Albert Rose and Otto Schade. Do you know those names?
I don't.
TV was invented or developed, I think, in the late ‘20s. It didn't become a consumer item until much later, but there were some very, very smart scientists at the RCA labs who developed the image science. And Bob Wagner had realized that what they had was relevant to medical imaging, and had studied their work, and adapted it to X-ray imaging. And so, he was taking advantage of excellent work that had been done previously, but then adding to it. There were other centers of excellence. The University of Chicago, I think, was probably the most important one. Kunio Doi was the head of it. I don't know if he's still working. A woman named Maryellen Giger is now the head of that. But the Chicago Lab and Bob Wagner were doing a lot of the basic measurement science and modeling that underpins a lot of what's going on today. So, for me, the chance to go and work in that environment was a huge piece of good luck either way.
Yeah. Right. And what did you see in terms of the technical and research skills that you had developed? What did you see as your primary contributions that you were going to be able to offer FDA?
I was pretty good at getting stuff to work.
[laugh] Generally?
Well, to give you an example, one of the things that Bob Wagner and others had worked on was the statistical efficiency of human perception. Shannon is a name that comes up—Shannon information. You can rigorously define the amount of information in an image, and it would be nice to know how much of that information can be extracted by human observers, radiologists, that is. At some point, a scientist named Art Burgess, who was at the University of British Columbia and was interested in studying the same thing, got a sabbatical. He went to Cambridge and studied with a perceptual scientist named Horace Barlow, and then he came and worked with us. And the attraction, aside from Bob Wagner being somebody who was knowledgeable in the field, is that we had one of the very first computerized gray-level displays. Man, TV that can show digital images!
That's a big deal?
Yes, at least it was at the time. And so, Art and Bob worked on the science, and I was their programmer. As I like to describe it, the image display device was a large electronics box. I didn’t understand very much about what was inside of that box, but our two computer experts did know, and they knew before I did what I would need in terms of FORTRAN-callable subroutines. With the support I got from them, I could set any bit in the box. I learned how to program it, and I became fairly adept at doing so. Art could tell me what he wanted, and, after a few days, I'd have a version of it. And then he would tell me what was wrong with it, and I would go home at night with my armful of paper printout, modify it, come back in, sit down at the terminal and modify the code and try it out. And then he'd tell me what else was wrong with it. But what we did in six months impressed the heck out of Art. And I think—
And what were you able to do in six months?
I think our paper was the first one to rigorously examine the efficiency of human visual signal detection. It was a starting point for a whole field of study.
How so? How was it a starting point for this developing field?
A lot of people wanted to study the same thing, and so this put the field on a firm footing in terms of the steering. And it demonstrated that the new computer technology had a definite part to play in these studies.
Bob, I'm curious in terms of, like, feedback for knowing that this research was doing what you had intended it to do, what kind of data were you getting to demonstrate that you were on the right track, that this instrumentation was being successful and that you were encouraged to continue in this line of research?
First, Art and Bob were really excited. Second, the first report on our work was published in Science. The research was well-received in the radiology community. And you've spoken with Roger Schneider.
Right.
Roger wholeheartedly supported this whole endeavor. He saw how important it was. After all, he put together the whole lab that could do these things. I think it was helped by a number of people in the radiology community, radiologists who were interested in it, thought that it was important and useful. And so, I think their contribution, in addition to the scientific contributions, was to get buy-in from the radiology community that this was important stuff.
Did you get a sense of when this research was really starting to positively impact diagnosis? Would you get that kind of information or was that sort of too far away from what you were doing?
I think I was a little bit removed from anyplace that might have come from. But we did, on occasion, have visits from well-known radiologists who wanted to see what we were doing and compare notes with Bob Wagner. Bob Wagner was the draw. He was the person people came to see.
And, substantively, were you working on the same things as Wagner or different projects?
He was my supervisor for several years when I first got to FDA, and so the work with Art Burgess was during that period. He didn’t order me to do this or do that; I pretty much knew what I was supposed to do and just did it.
Mm-hmm.
But if there was ever a question about what approach to take, then that was always done in consultation.
And, Bob, I'm curious, how much of your research was about developing new technologies in-house yourself, and how much was it analyzing stuff that was coming in from industry?
A major concentration in everything we did was developing the capability to analyze the quality of medical devices. So one of the things we did was to publish documents to—[lost connection]
In terms of your work with Wagner, you were saying that you were largely self-directed and you would go to him when you had any issues that you needed help resolving.
Yeah, pretty much.
You were also talking about—I asked you about the extent to which you were doing research on new technology yourself and how much was it assessing what outside companies were bringing to FDA for approval.
There was always a focus on technology assessment. I mentioned that mammography was becoming a useful modality.
And, Bob, what year would this have been when you're talking about mammography becoming a useful modality?
Mid to late ‘70s.
Mm-hmm.
The two things that I mentioned—I don't know if either one was the driver, but the existence of a suitable X-ray source and the existence of an image receptor with sufficient resolution, and sufficient resolution at sufficient sensitivity—were things that had to happen all together. The doses when plain film was used—plain film means film without screens—were so high that physicians were reluctant to use mammography as a screening modality. So DuPont, which was a major film manufacturer back then, brought out a product called Lo Dose, which is a screen-film combination. CGR brought out their Senographe. Seno is breast in French, and so a Senographe is a device making a picture of a breast.
And, also, what I consider to be one of the great papers of the 20th century, some people at Memorial Sloan Kettering actually managed to measure the radiation dose delivered by a mammographic examination using the new technology. Previously, people used a number of different metrics. Do you know radiation measures like exposure and dose, the difference between those?
Yeah.
OK. Some people were talking about incident exposure, and the relationship between that and dose depends strongly on the actual spectrum. They were talking about midline dose, all kinds of things, and the people at Memorial Sloan Kettering—and here comes TLD again—they got together with—I don't know how she would have described herself; as a research physiologist, perhaps—Helen Woodard, who was knowledgeable about physiology and tissue composition. They dissected several mastectomy specimens and subjected them to chemical analysis so that they knew the elemental composition of breast tissue, or various parts of breast tissue, that is to say adipose and glandular tissue. They got help from David White, an Englishman with a long history of producing elementally accurate tissue substitutes. Actually, not elementally so much as having the same spectral radiation properties. The whole area of tissue simulation has been a big one. I'm not sure I have the history exactly right, but I was told at some point, I think somebody at Wisconsin told me, Thomas Edison's lab assistant lost both of his arms to radiation burns.
Oh, wow!
And so that would've been around 1915. People recognized early on that X-rays are harmful, and so, if you want to make a test object, you absolutely do not get to use your patient. And so, making phantoms has been a really big deal, not in terms of the size of the industry, but there are several companies that make a living out of making various kinds of test objects for evaluating X-ray systems. Anyway, they got David White to put together something that had the X-ray properties and the shape of a compressed breast, and they put TLD chips in it so that they were able to measure the radiation dose as a function of depth and get the overall value. And that demonstrated that the actual dose values were considerably lower than the incident exposure values.
Which tells you what, Bob?
Which makes it OK to use mammography as a screening exam.
Oh, I see. And were you involved in making this determination?
No.
Who was? Where did that come from?
The scientific data were developed by Memorial Sloan Kettering and David White. The policy took a while to develop.
OK.
One of the, what somebody at NIH referred to as “radiology rags,” that is to say, a commercial publication on the field of radiology, ran a first-page-after-the-cover editorial saying “Enlightenment at Last” when that paper was published. [laugh]
[laugh] Uh-huh.
Finally, we know what it is we're doing to patients.
And what was that discovery that allowed people to know what they were actually doing to patients?
Well, they measured the depth dose and so they could get the integral of it and therefore have a measure of risk associated with the examination, the risk being induction of cancer.
And what was your involvement in this? What were you doing at the time?
Watching.
Mm-hmm.
Those were the days when things were not quite so tight in terms of budget, and so one thing we did was buy from Memorial Sloan Kettering a couple of the phantoms that they made, just to support the project. I have, for a long time, had a strong interest in tissue simulation and computational methods for finding combinations that worked. And, in fact, that's something that I hope to finish up shortly.
Oh, this is an ongoing project?
Yes.
You've been at it a long time.
Yes. [laugh]
What are some of the major goals of the project?
My goal is to develop—actually, it's developed—and demonstrate a highly accurate, simple-to-use computational formalism for finding combinations of materials that mimic various tissues. ICRU, the International Commission on Radiation Units and Measurements, along with some of the people in that Memorial Sloan Kettering project and David White, have published a document that, among other things, includes the average composition of various tissues, such as internal organs, bones, skin, etc. This information, along with data from NIST on elemental X-ray cross sections, allows you to calculate the X-ray properties using the sum rule, which states that the mass attenuation coefficient is the mass-fraction-weighted sum of the coefficients of the elements involved. You can calculate what a particular tissue—what the properties of the tissue are as a function of X-ray energy. And, using that data, you can find the relative amounts of various materials that will give you the optimal simulation of the tissue that you're interested in.
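[Editorial illustration, not part of the interview: a minimal Python sketch of the approach described above. The mixture rule takes the mass attenuation coefficient of a mixture as the mass-fraction-weighted sum of its constituents', and a non-negative least-squares fit (with a sum-to-one constraint) finds candidate-material fractions that track a target tissue across energy. The coefficients below are placeholders; real values would come from tabulations such as NIST's XCOM and the ICRU tissue compositions.]

    import numpy as np
    from scipy.optimize import nnls

    energies_keV = np.array([15, 20, 30, 40, 60, 80, 100])

    # Rows: candidate base materials; columns: mu/rho at each energy (cm^2/g, hypothetical).
    candidates = np.array([
        [1.20, 0.70, 0.35, 0.26, 0.20, 0.18, 0.17],   # "material A"
        [3.50, 1.60, 0.60, 0.38, 0.25, 0.21, 0.19],   # "material B"
        [0.90, 0.55, 0.30, 0.24, 0.19, 0.17, 0.16],   # "material C"
    ])
    target_tissue = np.array([1.60, 0.85, 0.40, 0.29, 0.21, 0.19, 0.175])  # hypothetical

    # Mixture rule: mu/rho(mixture) = sum_i w_i * mu/rho(material_i).
    # Append a heavily weighted row so the fractions are pushed toward summing to 1.
    w = 100.0
    A = np.vstack([candidates.T, w * np.ones((1, candidates.shape[0]))])
    b = np.concatenate([target_tissue, [w]])

    fractions, _ = nnls(A, b)                 # non-negative mass fractions
    mixture = candidates.T @ fractions
    print("fractions:", np.round(fractions, 3))
    print("max relative error:", np.max(np.abs(mixture - target_tissue) / target_tissue))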
Bob, I'm curious if your research was used in sort of the broader debates in the 1980s and the 1990s about who should get a mammogram and how often?
We thought a lot about it. I think, for a lot of radiologists, and rightly so, it was considered something that they (the medical community) should decide. FDA was always interested in having solid information, so there were two programs, NEXT and BENT, both acronyms. BENT is Breast Exposure Nationwide Trends. NEXT is Nationwide Evaluation of X-ray Trends. The idea was to develop measurement protocols for measuring exposure or dose from various procedures. For the BENT program, the procedure was limited to breast imaging. For NEXT, the procedure studied varied from year to year. Different exams were selected, chest X-ray or abdomen or pelvis, spine, things like that.
And so, FDA's contribution was to tell what the average radiation insult was, and what the range was. This information allowed individual facilities to compare their values with national averages, and make changes if warranted. But I think screening recommendations came from places like the ABR, the American Board of Radiology. They're over in Virginia.
And I'm curious if your own views about these questions have changed over time in terms of who should get a mammogram and how often?
I guess I've generally taken the view that I don't know enough about the biology of it to say anything definitive, and so my concentration has always been to minimize, within the limits of the technology that's available, the radiation insult.
Have your attempts to minimize the radiation exposure, has that been an ongoing research project or that was sort of bounded in time, you were working on that particular problem during a specific set of years?
We supported a project at the University of Southern California (USC) to develop a mammography system that could produce mammograms with image quality equal to the current state of the art with the smallest possible dose to the patient. We got to the point of having a company build a system to our specifications, and it had an unusual configuration. One of the things that happens with X-rays is that they scatter. The useful part of the image comes from the photons that either go right on through or are stopped completely. And the scattered radiation goes off at various angles, and so it doesn't represent the source from which it came. And so, it's sort of like a gauzy curtain.
Anyway, one of the ways to deal with that is with something called a grid, which is typically an array of linear attenuating strips. The ones commonly used in mammography are lead with fish paper. Fish paper is not used here for its normal purpose—it's just oil-impregnated insulating paper, used in the early days of electronics as an insulating material—but its useful characteristics are that it can be fairly uniform in thickness and it's reasonably stiff. So, you just have an array of strips of this stuff, edge on, alternating with lead strips. Anything that's travelling at an angle will hit one of the lead strips and be absorbed. The problem then is that, at the low energies used for mammography, even oil-impregnated paper has significant attenuation, and so it increases the dose.
So, part of the tradeoff is how much attenuation do you allow in order to get a better picture? And the system we developed, instead of using the little, tiny lead strips, used steel strips that were quite tall, with air as the interspace material. The housing to hold the whole thing was quite large, so it was unclear whether the system would work well for patients with an unusual body habitus or obesity. It's a large device that hangs down below the breast support.
Is your sense that mammography has gotten safer over the years?
Certainly, since the early days, there have been quite large improvements. Based on work done by physicists, including myself, spectra that are more suitable than the molybdenum target spectrum have come into use. The image receptors have gotten incrementally better. And, at some point, it was recognized that the inherent tissue structure of the breast made detection of cancers more difficult. Art Burgess was one of the earliest researchers to realize the importance of this “anatomical noise.” That recognition led to the development of tomosynthesis. Have you heard of tomosynthesis?
I have not, no.
It's sometimes referred to as 3D breast imaging. The tomo refers to tomography; it means making a picture of a slice. But, instead of doing a full circle—and there are breast CT machines now that do—instead of taking many, many exposures around the circumference of the object, a limited number of exposures are made at discrete angles, and the reconstruction algorithms have been developed to deal with that geometry. So, instead of having all of these structures superimposed—different layers superimposed one on another—you get something that is more like a slice.
The description of the product as 3D breast imaging is a little bit misleading, because there is still stuff from other layers that “leaks” into any particular slice that you choose to look at, but the technique has certainly improved detection. I think the doses are in the same range as conventional mammography, typically somewhat higher, but I think the current consensus is that you're getting something for the price you're paying in dose. And it's gotten to be the norm, at least in places where patients and providers have resources, meaning money. I don't know what happens in other places.
Now, Bob, of course, you're not an epidemiologist, but I'm curious if you have a sense of, in assessing how mammography has gotten safer over time, don't you also need to know the extent to which mammography has ever been harmful to people? In other words, do you have a sense of when there's an actual danger, an observable danger in getting a mammography?
I think the difficulty is the one that goes along with all of radiological imaging, which is that the effects are so small that they're very difficult to measure. You'd have to have a gigantic cohort. And one of the difficulties with a huge cohort is ensuring that you're doing the same thing to every member of the cohort, or else controlling for the differences. And there's a lot of variability in practice.
So it is difficult to know when and who might be harmed by a mammogram?
Yes.
And so, the motivation to make mammograms safer, at the end of the day, is based not on objective, concrete evidence but basically on trendlining and assumptions about what must be safer?
I think the reasoning—as you say, I'm not an active participant in the conversation—but the way I look at it is, if you know of a way that's usable—not just in absolute “can you possibly do it at all” sense but usable in an economic sense, in a convenience sense—if you know of a way to do it for less dose, it's malpractice not to use that.
Right.
And that's just a whole bunch of caveats about what's reasonable, what's allowable.
Is there anything, Bob, that's lost diagnostically in the attempt to make mammograms safer?
I don't think I know how to answer that. I think that the physicist copout is, that's for the radiologist to determine.
Yeah.
It's hard to know in a statistically robust sense how much a reduction in image quality would harm your diagnostic ability.
Bob, over the course of your career at the FDA, what else besides mammography were you involved in, or was mammography basically the main issue that occupied your attention?
Mammography has been a major part, and it stopped being super important when we were unable to get to the point of a clinical trial with the optimized system. It was clumsy, but, on a test object, it could do as well as the state of the art for about 30% of the dose used by a state-of-the-art clinical system. And there are just a number of things that happened that prevented it from getting to the point of a clinical trial.
And what do you think some of those major issues were that prevented that?
Well, the unusual configuration, the large grid that made patient positioning problematic.
And what year did that happen when the clinical trial did not go forward?
I'd have to look it up.
Was it close to when you retired in 2018?
No. We moved from Rockville to White Oak in 2007, and it was three or four years before that.
I see. And so, Bob, after that, what were some of the other projects that you were engaged in before your retirement?
When we moved to White Oak, another person had started a project to build a bench-top version of a tomosynthesis system. Tomosynthesis normally, when it's in the rest position, looks like a regular mammography machine. The difference is that, in a regular machine when you do a lateral, the whole C-arm—which is the mechanical thing that could be construed to be the letter C with an X-ray tube at one tip and the image receptor at the other—the whole thing rotates. In a tomosynthesis system, the image receptor and the X-ray tube are decoupled and so the X-ray tube moves to different angular positions while the breast support and image receptor stay in one place. You can still do laterals. You can change that whole thing to lying on its side. The bench-top system, conceptually, involves taking the whole thing and laying it down backwards so that the image receptor sits in one place and the tube rotates on a horizontal arc to various positions along the arc.
So the reasoning is, we're not in the business of doing mammography, and the bosses don't want to spend three or four hundred thousand dollars on a machine that only has clinical utility in a single configuration, but we still want to be able to evaluate the technology, so we'll build this thing on a table top. And the benefit is that it can simulate most of the geometries that are used by the various manufacturers of these systems. The differences are how many views you use and what angles you use. Some people have a lot of views closely spaced. Others have a larger angular range. So it would be—well, it is a device that allows you to evaluate the technology in the many forms that it shows up in.
The other thing is the choice between continuous motion and step and shoot. The exam is done with the breast compressed, and I wouldn't know, but, for a lot of women, it's very unpleasant. And so, you would like to complete the exam as quickly as possible, which is the motivation for doing continuous motion. So, you go along and every so often you pulse the X-ray tube, but it's moving and so the focal spot is moving. And so that induces a slight amount of blur that you don't have in step and shoot. With step and shoot you're moving some fairly massive pieces and so it takes some time to do that; first of all, to get it going, and second of all to get it stopped so it's not bouncing around. But you don't have the moving focal spot issue. So, it allowed us to do that kind of evaluation, as well. I was not involved in the initiation of the project, but the person who wanted to do it got her PhD and went someplace else. And so, I became the manager of the project, and we had various people doing the implementation. The hardware was managed with a software product called LabVIEW. Are you familiar with that?
I'm not, no.
It's a software product that is adept at handling servos and stepper motors, and at sending signals to various devices to tell them to do something. I'm not very conversant with it myself. We either hired people who knew LabVIEW, or hired people and trained them—sent them to school to learn LabVIEW. And so that has been used off and on to look at various aspects of tomosynthesis technology. So that was a case of wanting to do something new, but also wanting to maintain our tradition of doing product evaluation.
Bob, was that the last major project during your tenure at FDA?
Well, I think the tissue simulation is the biggest thing. For me, it was a major theme that started when I worked at the University of Wisconsin. A couple of things have happened since I first became interested in this. I first wrote code to do the calculations—I think I contributed it to the AAPM software exchange in 1982 or maybe 1984. The first thing is that the code that most closely matches what I have today I never used at the time because, if you pick a number of energy points in the range that you want to examine, the method I ended up using back then uses just those N points—a number that you pick—but the other method used N(N-1)/2 points; for N = 100, that's 4,950 points instead of 100. And, in the days of a PC with an 80386 and a math coprocessor, that resulted in run times of several tens of minutes, and I just didn't see the benefit. The other thing that happened is that the data on X-ray attenuation coefficients have improved dramatically. At the time I started, I was using what's known as the McMaster compilation, something from the Lawrence Radiation Laboratory.
Mm-hmm.
Lawrence Livermore Radiation Laboratory.
The Rad Lab, yeah.
And it was a compilation with polynomial fits to the best data at the time: one for coherent scattering, one for Compton, and then as many pieces as there are edges in the photoelectric effect, plus one. So, there are separate polynomials for the two scattering effects, and then individual polynomials for the various pieces of the photoelectric effect. Subsequent to that, just before I went to Brazil in 1987, John Hubbell, who was renowned for his dedication to the development and publication of data on the interactions of photons with matter, and who was the backyard neighbor of Tom Fewell, learned that I was going to Brazil, and he gave Tom Fewell a diskette with the data and the code for XCOM, which is still the current version of the X-ray attenuation coefficient data that NIST publishes. And, instead of using single polynomials for the scattering coefficients and polynomials that cover the entire range between edges, it uses a quasi-logarithmic spacing—approximately equal steps in the log of X-ray energy—to do the fitting, so it's much more precise.
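As a rough illustration of how attenuation data tabulated on a logarithmic energy grid are commonly used, here is a minimal Python sketch. The energies and coefficients below are placeholders, not XCOM values, and log-log interpolation is a widespread convention rather than a description of the XCOM code itself.

import numpy as np

# Hypothetical tabulated mass attenuation coefficients (cm^2/g) on a
# quasi-logarithmic energy grid (keV); values are placeholders, not XCOM data.
energies_kev = np.array([10.0, 15.0, 20.0, 30.0, 40.0, 60.0, 80.0, 100.0])
mu_over_rho  = np.array([5.33, 1.67, 0.78, 0.38, 0.27, 0.21, 0.18, 0.17])

def interp_mu(e_kev):
    """Estimate mu/rho at e_kev by log-log interpolation, which is close to
    linear for smooth cross-section data between absorption edges."""
    log_mu = np.interp(np.log(e_kev), np.log(energies_kev), np.log(mu_over_rho))
    return np.exp(log_mu)

print(interp_mu(25.0))  # mu/rho estimate at 25 keV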
And so, I switched to using that. But the other thing is that my Raspberry Pi can probably do in a minute or two things which take about a second on a garden-variety six-year-old Core i5 PC. So, the precision has gotten a lot better, and execution times are no longer an issue. And then I have realized a few things that I think have relevance to therapy treatment planning for photons: it's based on electron density. You can infer from CT scans what the electron density at a particular point in the patient is, and you can use that as input to Monte Carlo treatment plans.
How does that help? What's the diagnostic value in this method?
It's another way of getting the electron density. Until the advent of dual-energy CT, where you do the same scan trajectory but with different spectra, the calibration was done by scanning a phantom with elements of known electron density and then fitting a CT-number-to-electron-density curve. And it turns out that curve is bilinear, with a certain amount of uncertainty at the crossover point. With the way of looking at things that I've worked out—and this is not unique—if you have a dual-energy CT scanner and you do the two scans, you can go directly to the electron density with a single calibration curve. Whether that makes enough difference that anybody would bother to do the vetting that's required to make people trust it is another question. But it's a new way of looking at things.
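A minimal sketch of the single-energy bilinear calibration he contrasts with the dual-energy approach, assuming a handful of phantom measurements of CT number versus relative electron density; all numbers here are illustrative placeholders, not clinical calibration data.

import numpy as np

# Illustrative calibration points: (CT number in HU, relative electron density).
# A real calibration comes from scanning a phantom; these values are made up.
hu  = np.array([-1000.0, -100.0, 0.0, 60.0, 300.0, 800.0, 1200.0])
red = np.array([0.00, 0.93, 1.00, 1.05, 1.16, 1.45, 1.70])

# "Bilinear" here means two straight-line segments joined near the
# soft-tissue/bone crossover (about 100 HU in this sketch), fit separately.
crossover = 100.0
soft = hu <= crossover
bone = hu >= 0.0  # overlapping a little keeps the fit anchored near the joint

p_soft = np.polyfit(hu[soft], red[soft], 1)
p_bone = np.polyfit(hu[bone], red[bone], 1)

def hu_to_red(h):
    """Piecewise-linear conversion of a CT number to relative electron density."""
    return np.polyval(p_soft if h <= crossover else p_bone, h)

print(hu_to_red(-50.0), hu_to_red(500.0))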
And what are the major promises of this research?
In terms of accuracy, it's an incremental improvement. I do claim that I have better agreement between tissues and the materials that my software produces than other methods give. It's also, in my opinion, a major improvement in ease of use. You can type in the name of a tissue, which is in a library, the names of three candidate materials, and out pops your answer. So, you type four words and you get an answer in a second. And, up until now, people have struggled with getting it right.
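The kind of calculation being described can be sketched as a small constrained least-squares problem: find non-negative mass fractions of candidate materials whose combined attenuation curve best matches the target tissue. This is a generic illustration with invented placeholder curves, not the software from the interview; real work would use ICRU tissue compositions and NIST/XCOM cross sections.

import numpy as np
from scipy.optimize import nnls

# Energy grid (keV) and invented mass attenuation curves (cm^2/g) for a target
# tissue and three candidate materials -- placeholders, not real ICRU/NIST data.
E = np.array([15.0, 20.0, 25.0, 30.0, 40.0, 50.0])
tissue    = np.array([1.58, 0.79, 0.51, 0.38, 0.27, 0.23])
materials = np.array([
    [1.90, 0.90, 0.56, 0.40, 0.28, 0.23],   # candidate A
    [1.10, 0.62, 0.43, 0.34, 0.26, 0.22],   # candidate B
    [2.80, 1.30, 0.78, 0.52, 0.33, 0.26],   # candidate C
]).T                                        # shape (n_energies, 3)

# Solve for non-negative mass fractions w minimizing ||materials @ w - tissue||,
# with a heavily weighted extra equation enforcing sum(w) = 1.
weight = 1000.0
A = np.vstack([materials, weight * np.ones((1, 3))])
b = np.concatenate([tissue, [weight]])
w, _ = nnls(A, b)

print("mass fractions:", np.round(w, 3))
print("max relative error:", np.max(np.abs(materials @ w - tissue) / tissue))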
Well, Bob, now that we've gotten to sort of the modern day in terms of your career narrative, and for the last portion of our interview, I'd like to ask you a few questions reflecting on your more than 40 years of service at the FDA. It's a significant span of time, and so I'm curious overall, just sort of very broadly considered, in what ways has the mission from your vantagepoint at FDA changed over your 40-plus years there, and in what ways has it stayed the same?
When I first arrived there, we were the Bureau of Radiological Health, and so we worried about color TV sets, which were CRTs and emitted X-rays. And the engineering folks were busy building devices that could scan a large area rather than having somebody take an ion chamber and move it around recording numbers. Then in, I think it was 1982, we were merged with the Bureau of Medical Devices, and so along came responsibility for all kinds of things that we had not worried about before. The one thing that comes to mind, which is probably not a biggie, but prosthetic devices, artificial limbs, probably things like ventilators and just a whole range of things. And so, the emphasis decreased somewhat on radiation control.
Bob, what do you see as your greatest achievement in your career?
My kids.
Say again.
My kids.
[laugh] Let me refine the question. What do you see as your greatest achievement in your professional career? And I know you might tell me you don't want to hurt your arm by patting yourself too much, but still, I want to ask you.
I'm known for a number of things. Tissue simulation is one of them. Probably the people most interested are the people that manufacture phantoms. Another area where I played a pivotal role was in the development of X-ray spectra from current mammography X-ray tubes. In general, X-ray tubes are double ended, and there's a very practical reason for that. They operate at voltages up to 150 kV, kilovolts. And insulation for 150 kV is a pain. It's bulky. And so most X-ray tubes are double ended so each cable only has to deal with 75 or 80 kV.
And until mammography came along, you pretty much had to build a double-ended tube. Mammography operates typically in the range of 20 to 30 kilovolts applied to the tube. And a single small cable is quite sufficient for that in terms of insulation. So, all the X-ray tubes that were coming along were single ended, but we had only double-ended generators. I found a company that would take a commercial mammography system and modify it. We didn’t need the tube stand and the C-arm and all of that. We just needed the generator and the tubes. And the engineer of that company—who appreciated the importance of what we wanted to do and was another one of these people who did things that made my life easy—built us a generator that was a one-off modification of his company’s commercial device.
The generator also had to have a very low current capability so that you don't exceed the count-rate capability of the detector, and the engineer provided that as well. And so, I was able to obtain the generator and a Mo-target X-ray tube, and another manufacturer was happy to make a W-target tube. The manufacturer of a third type of tube said, “Yeah, we could loan it to you.” And I said, “Well, how soon do you need it back?” And they said, “Well, once it's used, we can't really sell it.” [laugh] So they essentially gave it to us without saying it was a gift. But they wanted to know the data just as much as we did and everybody else. That tube ran with reversed polarity with respect to the other tubes. Our benefactor who produced the new generator adapted it so that it could drive the third type of tube as well.
So, we ended up getting a generator that could drive all three types of tubes, at the low currents needed to do spectroscopy. With those assets in hand, we went to work and measured spectra at a wide range of tube potentials, 18 to 42 kV. The paper that resulted from that is one of the most cited in Medical Physics, the journal. We collaborated with John Boone at UC Davis who wrote the code for a computer model that was based on our data. John was the lead author. Spectral measurements were primarily Tom Fewell's work, with me and another person helping. But I was the one that went and begged for the money, and I guess I had enough standing that they gave it to me.
The issue of mimicking patients in various ways has been one of the things that worked out fairly well. There's a technology called phototiming which is used for the control of X-ray machines. It is also called automatic exposure control (AEC). Up to a certain point in the development of radiology, the amount of X-rays was decided based on the technologist looking at the patient, or maybe making a measurement, and then going to a chart on the wall with the appropriate settings. And so, if the patient was anomalous in one way or another, the exposure was not correct and had to be repeated, which meant additional radiation to the patient.
Phototiming refers to a device behind the patient, that is to say, on the far side of the patient with respect to the X-ray tube, with a screen that emits light, and a photo detector. And so, you set a level for the integrated amount of light that corresponds to a properly exposed film, and you shut off the X-rays when you get to that point. To evaluate those devices, as for example in the NEXT program, you need to have a test object that simulates the patient. Since different facilities may use different X-ray machine settings, i.e., different X-ray spectra, the test object must accurately simulate the patient for all X-ray spectra.
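Conceptually, the termination logic he describes amounts to integrating the detector signal during the exposure and cutting the beam once a calibrated target is reached. The toy loop below, with made-up numbers, is only meant to illustrate that idea, not any particular system.

import random

# Schematic of phototimed (AEC) exposure termination.
target_signal = 100.0      # calibrated value corresponding to a properly exposed image
dt = 0.001                 # integration time step (s)
integrated = 0.0
t = 0.0

while integrated < target_signal:
    # stand-in for the photodetector reading behind the patient during this step
    detector_rate = 2500.0 * (1.0 + 0.05 * random.uniform(-1, 1))  # arbitrary units/s
    integrated += detector_rate * dt
    t += dt

print(f"beam terminated after {t*1000:.1f} ms, integrated signal {integrated:.1f}")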
And so, with Tom Fewell doing spectroscopy, we looked at what were regarded at the time as reasonably tissue-equivalent, patient-equivalent phantoms. And, based on that, we designed combinations of Lucite and aluminum, i.e., acrylic plastic and aluminum, that would elicit from the X-ray machine an exposure pretty close to what an average patient would elicit, for a wide range of exposure parameters. The result was the so-called LucAl phantoms, Luc for Lucite and Al for aluminum. There's one for chest, one for abdomen and lumbosacral spine, and I think a couple more for pediatrics. They're not wildly popular. They're big and heavy and bulky. But I continue to get requests for the paper from Asia and Africa and places where phototiming is still used. So, for me, that was pretty cool. A lot of people don't think it's that big a deal. In the US, generator technology has advanced to the point that the digital image receptor functions as its own AEC.
But you think that remains to be seen, ultimately where this is headed?
I think it's been rendered not relevant anymore because the manufacturers have developed a technology to monitor both the tube voltage and tube current and integrate the tube current. And, based on that, they will deliver at the console that the operator is using an estimated dose. So, what this thing was developed to do is no longer necessary.
(Recent IEC (International Electrotechnical Commission) activity indicates interest in the LucAl chest phantom for use with dual-energy projection chest imaging.)
Well, Bob, I think, for my last question—it's been so fun hearing your perspective on all of these issues going back decades and how the technology and how the institutions and the partnerships have changed over the years. I'm curious, for my last question, sort of forward looking, especially because you're still active in researching some elements of this: What are you personally most excited about in terms of your own ongoing contributions, and more broadly as technology changes and diagnostics improves, what do you think are the big things to look out for in the future?
I have never been closely involved in MR imaging, but it's an area where the hits just keep on coming. There's always somebody with a new pulse sequence or a new approach to doing things. Diffusion tensor imaging, all these things that people can dream up to do with these machines. The other thing in general that's happening is people are putting these different modalities together so that you have PET-CT, MR-PET. Do you know what PET is?
I do.
And I think part of the reason that I'm retired is that, as much fun and important as experimental physics is, artificial intelligence (AI), an area in which I don’t have much expertise, will become a large and increasingly important aspect of medical imaging. And software reliability is going to become exceedingly important. You hear all this stuff about what we're going to be able to do. Well, it doesn't matter a bit if you don't do it right. So, software reliability and validation issues are a place where the FDA definitely has a role when AI is used in medical applications, and where FDA needs to think about concentrating its resources.
And so, how can AI improve software reliability?
It's more like software reliability can improve AI. You have to make sure that you're doing what you think you're doing.
Oh, I see, I see. And, in the end, even though it's hard to understand how these things happen, you're confident that better software leads to better diagnostics leads to better clinical outcomes. Is that a fair way of putting all that together?
Yeah, yeah. There are lots of things that people are figuring out when they have access to a million images instead of a thousand. They can see patterns that weren't perceptible with limited data.
Well, Bob, it's been so fun talking with you today. You've shared with me a wealth of knowledge, and I want to really thank you so much for spending the time with me.
Well, I hope there wasn't too much fluff.
[laugh] Not even a little bit.