This transcript may not be quoted, reproduced or redistributed in whole or in part by any means except with the written permission of the American Institute of Physics.
This transcript is based on a tape-recorded interview deposited at the Center for History of Physics of the American Institute of Physics. The AIP's interviews have generally been transcribed from tape, edited by the interviewer for clarity, and then further edited by the interviewee. If this interview is important to you, you should consult earlier versions of the transcript or listen to the original tape. For many interviews, the AIP retains substantial files with further information about the interviewee and the interview itself. Please contact us for information about accessing these materials.
Please bear in mind that: 1) This material is a transcript of the spoken word rather than a literary product; 2) An interview must be read with the awareness that different people's memories about an event will often differ, and that memories can change with time for many reasons including subsequent experiences, interactions with others, and one's feelings about an event. Disclaimer: This transcript was scanned from a typescript, introducing occasional spelling errors. The original typescript is available.
In footnotes or endnotes please cite AIP interviews like this:
Interview of Akio Arakawa by Paul Edwards on 1997 July 17,
Niels Bohr Library & Archives, American Institute of Physics,
College Park, MD USA,
For multiple citations, "AIP" is the preferred abbreviation for the location.
In this interview, Akio Arakawa discusses topics such as: University of California, Los Angeles (UCLA); meteorology; his family and education; University of Tokyo; Japan Meteorological Agency; Hidetoshi Arakawa; fluid dynamics and thermodynamics; Michael Schlesinger; weather prediction; FORTRAN; UNIVAC; Yale Mintz; Chuck Leith; Mark Rhodes; Joseph Smagorinsky; Jule Charney; John Von Neumann; Syukuro Manabe; Geophysical Fluid Dynamics Laboratory (GFDL); International Business Machines Corporation (IBM); Pierre Morel; David Randall; climate models; National Aeronautics and Space Administration (NASA); Milton Halem; Jim Hansen; United States Department of Transportation (DOT); Rand Corporation; Max Suarez; National Center for Atmospheric Research (NCAR); National Science Foundation (NSF); Thomas Rosmond; National Academy of Sciences; carbon dioxide.
I’m interviewing Professor Akio Arakawa in his office at UCLA. It’s the 17th of July, 1997, and we are here to talk about your career as a meteorologist and a climate scientist. I see from your CV that you were in Japan until I guess 1961. Is that right?
And you were born in 1927.
I’d like to hear a bit about your early life, your family, your early education, and how you became a meteorologist.
Okay. At the high school I was interested in science in general, so there was no question about my future at the time, the direction I was going after that. But I had to decide whether to pursue pure science or science applied in the community, and finally I decided to go the route of pure science. So I entered the University of Tokyo in the Department of Physics.
Tell me about your family and what happened for you during World War II, because you were in Japan at that time.
I have two brothers, and the oldest brother was a college student at that time, but his studies were interrupted in 1943, or something like that, and then he became a Navy officer. He was stationed on one of the islands near Okinawa. There was no battle on that island, so he was safe. So a few months after the end of the war he came back. My second brother also happened to become a soldier.
So they are both older than you?
Yes, yes. And I don’t remember how that went, but then about a year before the end of the war he became an army soldier, but he never — he was in the Tokyo area.
So he too was never involved in the fighting.
Not actually in the fighting. He might have gotten involved in some kind of fighting — there were air raids and things. Now, I was a high school student. Well, actually the educational system at that time in Japan was different from the system here and from the system in Japan now.
How did the system work then?
A six-year elementary school — grammar school — and that’s the same. After that, a five-year middle school; that’s the core. Then after that there are varieties: those who want to go to university go to high school, and the high school goes for three years; those who wanted to get some professional training go to professional or vocational schools. And some private universities have their own schools, junior schools, so that’s three years, and then the three-year university. Okay, so 6-5-3, and then 3.
So you would not have graduated from high school until you were 19 or 20, I guess, because if you start when you’re five: six years of elementary school, 11; five years of middle school, 16; and then another three years of high school.
Right, but my case is a bit special in that the five-year middle school and the high school were combined under the university, skipping one year, so it was a special school that was seven years altogether. So I entered that school after the regular elementary school, and then I entered the University of Tokyo, after the war.
So you were graduating from high school just when the war was ending, I guess.
Yes. I was, I would say, a fifth-year student of that school that was altogether seven years. So I was not drafted. I was 18 or something like that, but I had to go to the factory and I also had to go to the fire station.
The fire stations. So you were a fireman.
Yes. School continued, but every three days I had to go to the fire station.
Every three days you had to?
Yes. That’s quite an experience. And when there is an air raid, of course it is completely dark, but they run the fire engines — they had to — and it is very dangerous. Sometimes fires are so bad or so high that firemen cannot do anything, so sometimes — Once we were trapped by fire, so we had to put the backwash [?].
You were putting out fires that were started by incendiary bombs.
Oh yes, right. So after graduating from that school, which was called a high school, I entered the University of Tokyo, and I spent three years majoring in physics.
Okay, majored in physics. All right, just one more thing on your background: what did your father do?
Oh, my father — no, not a scientist. He was working for a construction company. He used to be a government employee, but as far back as I remember he worked for a construction company. Not as an engineer, but just a worker.
Okay. Did your brothers also go to university? [Yes.] And what became of them?
My oldest brother, who went to the Navy, came back after the war and got a job. He was very good at English, even when he was a student. During the wartime he didn’t have a chance to practice, but there was a social group at the university that kind of practiced.
A club you can go practice.
Yes, for English conversation and speech, and he was very good at it. He won the national contest of English speech, or something like that, more than once. So after he came back from the Navy, he got a job at the Japan Travel Bureau — at that time a semi-governmental travel agency, a huge organization — in the foreign department. He was there some years, then he was recruited by American Express and became the manager of the Tokyo branch of American Express — the travel department, not the bank. Then afterwards he became head of the Hong Kong Sightseeing Society (well, I’m translating literally) in Japan. So now he is retired. My second brother majored in Economics at the university. He went to graduate school. And he worked for a few companies — he changed companies — but he is basically an accounting type of person. He has also retired. So both of my brothers did quite different work. I am the only scientist in my family.
Okay. So you went to the University of Tokyo, starting I guess in 1945 or ’46?
I think ’47. Yes, ’47, yes.
Okay. And majored in physics. Were you interested in meteorology at that point?
Not at all! [Laughs] I was definitely interested in physics. When I was at the high school, I was fascinated by physics.
Okay. What area in particular interested you?
At the high school there was really not the — oh, I didn’t have any specific idea. But, well, still, I guess modern physics. There were a few people who shared a similar interest, so we got together and read a book on modern physics as a group — not a course like I took at the university, but I was interested in doing that at the high school level. Actually we had a very good physics teacher, so probably that influenced me very much in choosing physics.
Right. So at the university and after, how did it happen that you became interested in meteorology?
I graduated from the University in 1950. That’s only five years after the end of the war. Industry in Japan — just about everything — was completely destroyed, and there were no jobs at all for physics graduates. At that time there were only 30 graduates in physics from the University of Tokyo. Now they produce a hundred.
There were only how many at that time?
Thirty per year.
Okay, so not a lot.
Hm-mmm [yes]. And I was planning to go to graduate school — I felt I hadn’t learned enough as an undergraduate, so I was thinking of going to graduate school. But around that time I got engaged, so I decided that I’d rather find a job. But there were hardly any jobs available: there were the 30 students who wanted jobs, but only three jobs available, and one of those was from the Japan Meteorological Agency. Before that, meteorology hadn’t occurred to me at all. But after seeing that, I thought maybe meteorology is interesting. I was a physics major, but I was a bit more interested in the applied side of physics. And the University of Tokyo Physics Department had some association with the Japan Meteorological Agency. For example, a professor of that department taught at the school — the Japan Meteorological Agency had a school to train meteorologists.
And they were looking specifically for physicists?
More generally. So physicists, geophysicists, and so on. So I decided to apply. They gave an examination, and I thought that the examination would be all meteorology, which I didn’t know anything about, so I bought a couple of books and studied real hard. My friends in geophysics also took the exam — a few of us. Then it turned out the exam was not on meteorology at all; it was on physics and mathematics. So I had thought my position was a disadvantage, but it simply turned out to be advantageous — more advantageous than for the geophysics majors, in a sense. So I passed. Now, the Japan Meteorological Agency is very tough on physics graduates. Of course I was interviewed, then asked what kind of things I had been interested in, so I said such-and-such and so on — of course, problems of physics. Then they told me, “You know you cannot do that at the Japan Meteorological Agency.” Of course I knew, but I said that meteorology sounds interesting. [Chuckles] Then they put me on a weather ship.
On a weather ship?
Right. They said they wanted us to experience the whole situation first.
Okay. Were they training people to be forecasters? Is that the purpose of what you were doing?
No, not as forecasters. They wanted to train us — well, actually four or five people entered the Japan Meteorological Agency through that exam — and they wanted us to become the future leaders of the agency in various other ways, not to be forecasters.
So they were going to give you some observational training first, but theoretical…
You were going to go on to become theoretical meteorologists.
Right. So they put us on the weather ship.
What was that like? What did you do?
There were two sites at which the weather ships were stationed, so it’s like a weather station on land but over the ocean: the ships go there just to stay there and make observations, just as the weather stations on land do.
So at a fixed point.
Fixed point, yes, fixed point.
Okay. For how long did you stay?
There are two points. One is east of Japan — east and north of Japan. That point was chosen because it is along the air route from the United States into Japan, so that if a signal was sent, the airplane can [???], and that is one of the missions.
Okay. And how far away from Japan is that particular point?
I wish I had a map. But about a thousand kilometers, so 600 miles or so.
That’s pretty far.
Maybe more. That is the one that is northeast of Japan. And the other point is south of Japan; that is for watching typhoons, and that’s only summertime. The Agency had six ships — very old; actually they were Navy ships, much smaller than a destroyer, and they were old ones. And the voyages take time: one ship stayed there about 20 days, but it may take three days to go there and three days to come back, so we were about that long on the ship. Then I think it was a good idea, that training, because I happened to become interested in meteorology through concern for the ship’s own safety.
Right, right, that’s true.
So they were making observations. And of course on the ship there were about 15 or more meteorologists — observers, at least. They would draw the weather maps and so on, and try to make their own forecasts, things like that.
So was this all instruments at sea level? Did you lift any balloons or anything vertical?
Balloons also. Balloons, some. Basically three types of observations: the upper-air observations, using balloons to measure the temperature and the pressure, and also the wind by tracking the balloons; the surface observations — very standard: temperature, pressure, and wind; and the marine observations.
Sea surface. How deep?
Well, that’s another story. Routinely, it went to 600 meters deep. At one time I was very curious about the measurement of the deeper ocean. I don’t know how they do it now, but at that time the measurement was done by having a thermometer — a sepital [?] thermometer — on a wire and dipping it into the water. Then a weight is dropped and slides down along the wire, and it hits the thermometer. Then the thermometer turns over, then it opens, and it takes in the seawater there and measures the temperature. And also when we pull it up, we get a sample for the [???] analysis. And that they do at different depths. The operational observations, I forget, went down to 600 meters also. I also went there [chuckles] — I was very curious, so I went all the way down, as far as I could go, 2500 meters, and the wire was cut, so I lost part of it.
Was it the only wire you had?
It was an expensive thermometer, so that excited them. [Laughs] Very, very embarrassing. But they were generous; I was not so…
Not punished too badly? [Chuckles]
I was not punished at all, but just I had to later let the [???] to approach it, that’s all. [Laughs]
So how long did you do the ship observations?
Actually one year, so that involved six or seven voyages. There was about a six-month preparation period before that. So about one and a half years after I entered the JMA, it was about time I should be transferred somewhere. But I didn’t know which direction I should go within JMA, and I asked the advice of the chief of the meteorologist group on that weather ship. I had two possibilities. One didn’t fit my interest at all — just statistics, kind of cryometer [?] diagnostics. Just by coincidence, my former boss had been transferred there, and he asked me if I was interested in going there. That was one possibility. The other possibility was to continue on a ship in marine observation — the marine and underwater section of JMA also asked me if I was interested. So I had a hard time deciding. Forecasting didn’t occur to me. Then the head of the meteorologist group on the ship advised me, why don’t you see a person at the Meteorology Institute, a famous meteorological scientist. The Meteorology Institute belongs to JMA, the Japan Meteorological Agency. There was a very famous person whose last name is also Arakawa — Hidetoshi Arakawa; he is very famous. So I went to see him for advice.
He’s not related to you, by any chance?
No, no. But he was the head of the forecast research division of that Institute. Then he said, “Why don’t you come here?” So I was lucky to join the Institute under that division. The place was a very good one within JMA to do scientific work. So that’s how I got involved in that branch of meteorology, somehow related to forecasting.
So tell me what you did there.
First I was put in a research group — three members, the other two four or five years senior to me — doing basically data analysis: drawing vertical cross sections, plotting the data, and analyzing the structure of a front, things like that.
How did you do that? Was this all done by hand?
All by hand. No computer at all.
Was there any calculating equipment, mechanical punch card, or just a regular calculator?
No, just there were…
He’s doing the gesture of a rotating handle.
We just had a manual one. And we had an electric calculator — oh, that was very fancy. So that was data analysis. Although it is a research institute, still we were supposed to do something useful for forecasting, and pure-theory kind of work was considered somehow useless at that time; data analysis looked more useful.
You say that as if you think it wasn’t actually more useful. What was the quality of that analysis at that time? Did you think you were getting useful things from doing this?
Well, data analysis is very important, and there was tremendous knowledge in it for us. But data doesn’t give you a forecast: knowledge about how the atmosphere behaves, or what has already been observed, doesn’t automatically tell you what will happen. Okay, so — I’m not sure you know the terms dynamic meteorology and synoptic meteorology.
Explain what those mean to me. I think I understand them, but…
Dynamic meteorology is theoretical, and basically applies fluid dynamics and thermodynamics to atmospheric phenomena. So dynamic meteorology is theoretical — how a cyclone develops and why, trying to understand based on basic physical principles. Synoptic meteorology is a somewhat misleading term, but the standard analysis of the data is done based on the meteorological data collected synoptically.
Synoptically, meaning all at one time?
At one time, yes. So basically weather-map-type analysis; that’s why it’s called synoptic meteorology. Not just a point observation — try to look at the atmosphere more widely, but for each time. But the word has been generalized to almost any kind of data analysis with a broader view. So: data analysis and theory. The objectives of those two are not so different. Actually, a very important objective of meteorology is forecasting, and dynamic meteorology tries to approach the problem theoretically, while synoptic meteorology tries to accumulate knowledge from observations. But at that time — I’m now talking about shortly after 1950, when I entered the Meteorology Institute — those two fields were quite separated. The dynamical, theoretical side can be [???]: you can do very clever mathematical analysis, but it may not be relevant. [Chuckles] And in synoptic meteorology we handle lots of data, but in many cases it stays at the phenomenological description.
As I understand it, the synoptic meteorologists tried to find regularities in patterns of behavior, so they sort of match the current state to something that had been seen before, and try to predict based on that.
That is obviously one possible approach, and in principle many forecasters follow that approach. When a forecaster looks at the weather map, it may resemble some other weather map from the past that he saw; he may not use it in an organized way, but experienced forecasters basically follow that thought. Then there was numerical weather prediction.
Yes, around 1950.
Right. Now the basis for that is theoretical. They first had a very simple model, theoretically idealized, but applied it to the actual forecast problem — only a 24-hour forecast, but they had some success.
You’re talking about the ENIAC forecasts of 1950?
Yes. Before that there was a thing, too, in the 1920s by Richardson.
Right, I knew about that, too.
And it was very complicated — too complicated for that time, so it was a complete failure. It predicted a pressure change over a spot in Europe of 140 millibars or so, while the actual change was about 2 millibars. Because all he did was calculate the initial tendency and extrapolate it. You have to really follow the evolution.
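The trap Richardson fell into — extrapolating an instantaneous tendency over six hours instead of following the evolution — can be illustrated with a toy calculation. This is only a sketch with invented numbers, not Richardson's actual data: the "pressure" here is a slow, 2-millibar-scale drift plus a small but very fast wobble standing in for the noise in his initial data.

```python
import math

# Toy "surface pressure" (mb) as a function of time (hours): a slow drift
# plus a small but very fast oscillation standing in for gravity-wave noise.
def pressure(t):
    slow = 1000.0 - 0.3 * t                        # ~2 mb drop over 6 hours
    fast = 0.5 * math.sin(2 * math.pi * t / 0.1)   # 0.5 mb wobble, 6-minute period
    return slow + fast

dt = 1e-4
tendency0 = (pressure(dt) - pressure(0.0)) / dt    # initial tendency, mb/hour

extrapolated = tendency0 * 6.0                     # Richardson-style: tendency x 6 h
actual = pressure(6.0) - pressure(0.0)             # actually following the evolution

print(f"extrapolated 6-hour change: {extrapolated:+.1f} mb")   # huge
print(f"actual 6-hour change:       {actual:+.1f} mb")         # small
```

The initial tendency is dominated by the fast wobble (about 31 mb per hour in this toy setup), so the straight-line extrapolation predicts a change of well over 100 mb, while the true six-hour change is under 2 mb — the same order-of-magnitude mismatch as the 140-versus-2-millibar story above.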
How aware were you in Japan in the early ’50s of these experiments in America with computer models?
We were aware pretty quickly — maybe as soon as it was published we had access, so maybe another year or two of lag. Of course there were many other studies that followed this, so those kinds of things became available to us at the forecast research branch of the Meteorology Institute. Then we were very excited, because everybody was basically theoretically oriented, but they had to do… [chuckles]
They had to do all the calculations by hand, is that what you’re saying?
No. But then this one really tried to — well, like this: to make a weather prediction based on theory, but one which can be practically very useful — and that was the ideal thing to do for the Forecast Research Division. So we were very enthusiastic about learning and doing our own research in this field.
So you’re beginning to see the possibility of the dynamic and synoptic coming together through…
I think I said somewhere in here, dynamic and synoptic meteorologists began to communicate with each other. Still, this was just starting in the early 1950s; we didn’t have any computers or anything like that. Now, there was a graphical method that was originally proposed by Fjørtoft, a Norwegian. For the weather map, we usually use the contours of the height of the 500-millibar surface.
So single level, 500 millibar.
Single level, 500-millibar height. Then it was a [???]. We basically tried to do the same thing they did with a machine, but graphically — graphical additions, subtractions. When we needed the difference over two grid intervals, we had the contours drawn on tracing paper, and with two copies, we just shift one.
According to the wind?
No, according to what we want. That corresponds to the grid interval. When we use a computer, we have these grids, and the arithmetic is done on the grid-point values: for example, the difference between two neighboring grid points is taken, and the difference of the difference is taken, and so on. So, to correspond, we have two maps on tracing paper — semi-transparent paper — and a shift of one map corresponds to one grid interval; then we see the two sets of contours in different colors, and graphically draw the contours of the difference.
I see. So you take two maps. The second one is shifted by one grid interval, and then you take the difference between the two.
When we want to take the difference between two grid points. Now, addition can be done the same way. Then in that way we calculate the vorticity, the rate of spin. In this model, basically…
He’s talking about a [???] model.
Right, the [???] vorticity, okay, from the wind. So now the vorticity can be obtained by repeated application of that procedure, just as I explained — it is really the difference of a difference. Then after that, there are the contours, so we know approximately the wind direction, and we shift the vorticity graphically by the wind. Then after the shifting, or the drifting, say over one day, we have to invert the process we did before. Before, we took the difference of the difference; now the inverse of that process corresponds to solving a differential equation — we know the difference, and what we want now is the field itself, so we have to do it like that. All of that can be done — crudely, of course — graphically. That is a very nice way of actually visualizing what the numerical model is doing. So we did a lot of that kind of thing. This new approach was not known at all outside of the Research Division, so we went to various local weather centers and suggested they experience that kind of graphical method — we held a kind of seminar there so that they could participate in that kind of thing, just trying to educate people in…
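The tracing-paper operations described above — shift a map by one grid interval, difference it, then difference again — map directly onto grid-point arithmetic. Here is a minimal one-dimensional sketch in Python (the grid size, spacing, and test field are all invented for illustration): the difference of the difference is the discrete Laplacian, which, applied to a streamfunction, gives the vorticity of a barotropic model.

```python
import math

N, dx = 16, 1.0
# A smooth toy "streamfunction" along a periodic circle of grid points.
psi = [math.sin(2 * math.pi * i / N) for i in range(N)]

def shift(f, k):
    """Slide the whole field by k grid intervals (periodic), like shifting tracing paper."""
    return [f[(i + k) % len(f)] for i in range(len(f))]

def laplacian(f):
    """Difference of the difference: (f[i+1] - 2 f[i] + f[i-1]) / dx**2."""
    plus, minus = shift(f, 1), shift(f, -1)
    return [(p - 2.0 * c + m) / dx**2 for p, c, m in zip(plus, f, minus)]

zeta = laplacian(psi)   # "vorticity" of the toy streamfunction

# For a pure sine wave the discrete Laplacian is exactly -k2 * psi,
# where k2 is the discrete wavenumber squared:
k2 = (2.0 * math.sin(math.pi / N) / dx) ** 2
err = max(abs(z + k2 * p) for z, p in zip(zeta, psi))
print(err)   # round-off level, effectively zero
```

Inverting the procedure — recovering psi from a known zeta — is the "we know the difference; what we want is the field" step the interview mentions: a discrete Poisson problem, solved on a computer iteratively rather than graphically.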
The last thing we were talking about is whether moving images had ever been made of these graphs, and Prof. Arakawa said no, not to his knowledge; that was later.
Everything was manual. So I think that kind of experience is very good, even after computers became available. After I came over here, at one time I had to teach basically the map analysis in the laboratory, so that the students would get that experience, and apparently they enjoyed it. One of the students in that class was Michael Schlesinger — probably you’ve heard his name?
Really? Yes, yes!
Yes, he ran the model at the University of Illinois. At that time he was an engineering student, almost ready for a Ph.D. But then he just took the meteorology course as a basic requirement, and he became interested in meteorology and became my student.
That was here?
Yes. Anyway, going back. So numerical weather prediction became exciting in the early 1950s. At that time, numerical weather prediction was understood to have a very broad theoretical basis in meteorology, so it could hardly be distinguished from dynamical meteorology, except that numerical weather prediction dealt with actual data.
One more question about this period, the early 1950s. To what extent was the work that you did and the work in your group about the totality of the global atmosphere, as opposed to a large area that would cover Japan?
Not the work that I’m talking about in the early 1950s. The data analysis we became involved in was local, for Japan. And the numerical weather prediction we studied with excitement in the early 1950s, but it was not really new, creative research. Still, that was a very exciting period.
He’s pointing to this article about cumulus parameterization history.
Oh, much before that. Cumulus parameterization — that is because of this conference. But I personally had started a different interest: the general circulation of the atmosphere. Numerical weather prediction predicted the behavior of individual cyclones. General circulation is more global in nature, and tries to understand the mean structure and the mean motion — why there are the strong winds [???] in the middle latitudes, why there are subtropical highs, things like that. I personally was interested in that very much. Maybe that was influenced by the fact that when I was a physics major I was very much interested in statistical mechanics, so I was interested in something statistical. At that time there was [???] in that area also — not in Japan; mainly in this country there was data analysis from that point of view, which requires upper-level synoptic data, and they tried to understand how cyclones transport heat and momentum from the tropics to middle latitudes. That kind of analysis was very extensively done. [Flips through pages] This is highly condensed. Well, right here: data analysis, which requires a very extensive effort — there is lots of data for the global atmosphere — and that kind of analysis was done by Prof. Starr’s group at MIT and also here by Prof. Bjerknes, who is very eminent, actually the founder of this department. They used the hemispheric data for some months, and then they calculated how individual disturbances transfer the basic physical quantities, and tried to understand and explain why the observed general circulation is as it is. So that gave the understanding that the dynamics of the cyclones — the individual disturbances, which are responsible for weather — and the dynamics of the general circulation, which is responsible for climate, are not really separate things. So that was my personal interest.
And did you learn about Bjerknes and all this work mostly from publications?
Did you ever come to this country before?
No. [Look at materials.]
He’s showing me a final report from March 1955 called “Investigations of the General Circulation of the Atmosphere” by Bjerknes and Yale Mintz from the UCLA Department of Meteorology.
You will notice how I studied!
Yes, I see, it’s very dog-eared.
[Laughs] Yes. This is the one exciting development in the middle of the 1950s. Another exciting development was the laboratory experiments, basically at Harvard, using a water tank.
The dishpan experiment?
Dishpan was the early days. Later people used a cylinder, put in a core that was cooled while the rim was heated, and rotated it. That kind of experiment produced something similar to the flow in the atmosphere — wavy flow, and something like cyclones develop, like that. But that depends on the rotation rate and also the heating rate. So that gave much greater understanding, and there were also theoretical papers — many of them. But at that time nobody in Japan was studying such things, so I felt I should let other people know what was going on in that general circulation area, and so I wrote a monograph in Japanese.
Right. Actually I had been studying those laboratory experiments in the middle of the 1950s.
Oh, this is Saunders? What is that, a radarscope?
From the top. It’s tracers [?] or something like that in the water; it is rotating.
One and a half feet.
I didn’t have access to publication, except as a technical report. This is very early. In this kind of figure I summarized it, including my speculation also. Later other people also published this kind of figure based on data. This is my speculation, and this is observation, which is taken from that report, like that. And some theoretical aspects.
So this is kind of a review of what’s going on in the general circulation.
What was known at that time. It’s called “Modern Theory of the General Circulation of the Atmosphere.” So that was my big project. This 1958 report is, even now, people say, useful. This was very new at that time. Okay, now: in 1955 numerical prediction became operational. Japan introduced its first big [???] computer in 1959.
What computer was that?
The IBM 704. But before that we had been studying NWP without really having a computer. My collaborator at the Meteorology Institute and I used a very tiny computer at the Fuji film company — they had a research institute. It had about 100 words of memory [laughs]. We tried to do something. I think these results were produced by that computer.
Really? He’s showing some graphs, looks like global maps or something.
This is a time-tendency forecast. This is the latitude of the [???] of the eastward velocity. So, kind of very simple — it had only 128 words of memory. I believe 128.
Did you ever encounter in this period any analogue computers?
Not personally. No, I don’t recall. That graphical method is some kind of computer [laughs].
Yes, it’s much like that, but done by hand.
But I don’t recall. Then in 1959 JMA introduced the [???] computer. I was a member of the Meteorology Institute, but really the group at the Meteorology Institute Forecast Research Division was promoting the idea of numerical weather prediction and pushed the JMA to buy a computer. So a few of us moved from the Meteorology Institute to a new section of JMA, and we started NWP in Japan. Before that we had to learn how to use the computer. There were no professional programmers; we had to do it ourselves.
FORTRAN existed by that point, but was it available to you?
The thing we had to learn was pure machine language, and then a new kind of program called an assembly program appeared. It corresponds to the machine language, but instead of machine code it is a symbolic language. When we heard of FORTRAN, we really didn’t know what it does. It didn’t sound like a computer program [???], so we didn’t know how important it was. Then at that time there were two candidates for computers for JMA to buy: one was IBM, and the other was UNIVAC, those two choices. And they were both comparable. We actually programmed something like this model for those two computers, and compared the computer speed and so on.
You programmed that model for both of those?
No, not only me; the group. I did work, though.
You programmed the mathematical model from scratch, rather than borrowing their computer code.
No, no, no, but with machine language.
It must have taken a long time!
Well, but that was completely irrelevant, because what really made the difference was that IBM had FORTRAN and UNIVAC didn’t. Only then did we begin to appreciate FORTRAN. Then we used FORTRAN as a group and programmed basically this model and a bit more advanced model.
He’s talking about the ENIAC models, the Charney, Fjörtoft and von Neumann model.
Yes, but the dynamical model. The FORTRAN — without having a computer, even the Japanese IBM people didn’t know much about FORTRAN; we studied it together. We programmed, then we sent the program to the United States. We had a Japanese colleague who happened to be in the United States, and he ran it and no error at all! [Laughs] So without that computer, we couldn’t have done the work.
Explain that again. I’m sorry, I missed it. You sent it here and your colleague ran it, and it didn’t work?
No, it worked without any errors. We had no experience of actually running, but…
You were lucky, and very skilled.
Well, just careful. In 1959 the computer arrived and we started operational forecasts. I think Japan might have been number three. The first was the United States, then second was Sweden, then Japan was maybe the third, in 1959. I should tell you, I entered the Japanese Meteorology Agency immediately after I graduated from the University of Tokyo in physics, so I didn’t go to graduate school, and I never took any course in meteorology in my whole life.
All on-the-job training.
Yes, and studying on the side. But at that time the Japanese Meteorology Agency was also [???] and I was able to [static; inaudible sentence] generation of these papers. So that’s how I did it.
So when did you get that degree?
1961. But before that I had some publications. There was talk before we began about the possibility of coming to the United States, that this was a possibility, and Ermeens [?], he was there. He didn’t know me much, but a professor at the University of Tokyo recommended me to him.
Someone he knew? The person that recommended you to him knew him?
Right, a University of Tokyo professor of meteorology. I was not his student because I didn’t study meteorology, but he knew me, so he recommended me to Ermeens. There was some talk going on, and that was kind of an ideal possibility. My personal interest was general circulation, and my job was numerical weather prediction, and what Ermeens was thinking at that time was that my job would be general circulation modeling. So the objective was my interest, and the method used was basically that of numerical weather prediction. Then in 1960 (okay, I’m going back a little bit), about one year or one and a half years after the initiation of NWP in Japan, there was a big international symposium on NWP in Tokyo. Now that is a very famous symposium, and practically all the important people came. The International Symposium on Numerical Weather Prediction. All the important people involved in NWP in this country came by the same [?] train [chuckles], so if something had happened to that train, the — [laughter].
But you didn’t. That’s good.
And there were Europeans, including Fjörtoft. Basically everybody came. At that time they had experience of operational NWP. Of course those were exciting, but we began to recognize there were many, many problems. Up to this point, the models were basically dynamical — no heating, no water vapor condensation. But at this time people recognized the importance of other processes and so on, and the need for a more comprehensive model, the general circulation model, so that symposium was right at this turning point. So that was very important. Then I came here in the spring of 1961.
When you came from Japan, did you bring your NWP model with you?
Not the model itself. That was the operational model, basically a forecast for just two days or so. But what I started to work on here was the general circulation model, which is a theoretical model run over many months, even many years. The first thing I had to solve was to have a computational algorithm that is stable. With the standard methods, the model just blows up.
I’ve read through a number of your publications, and it’s clear that that issue was one of the most important things that you did: working out a way that the models could stay computationally stable over a long period.
I don’t know about most important, but the most spectacular — in fact it was spectacular.
Why do you say that?
Because before that work, the computations could not be extended beyond a certain time, usually just a few days. And if we wanted to extend the computation longer, we had to introduce an artificially very large smoothing effect.
Right. Then we can continue, but that changes the [???], so then we’re simulating a different world, not our own atmosphere. So my point of view was that such dissipative smoothing is a physical process which may well be important, but it should be designed based on physical reasoning, not just to keep the computation going. So the model should be free from blowing up even without any diffusion at all. Then we are free to add any amount of diffusion based on physical considerations. The results with the standard method available at that time just didn’t look analogous to the real world to me. After some time the model went off to its own strange solution. The mathematical idea of numerical stability that existed at that time just couldn’t explain that, but I strongly felt that this is not our world, this is a simulated world. So I started to question what was wrong. At first it was for this simple model, and then it was not so difficult.
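[The conservation idea Arakawa is describing can be sketched numerically. The code below is an editorial illustration, not Arakawa's actual scheme: for the nonlinear advection term of a 1-D Burgers-type equation on a periodic grid, a naive centered discretization lets the "energy" sum(u²) drift, while a skew-symmetric form — the same principle behind the energy-conserving schemes Arakawa later developed — keeps it constant exactly, with no artificial smoothing needed.]

```python
import numpy as np

def centered_d(u, dx):
    """Centered first difference on a periodic grid (an antisymmetric operator)."""
    return (np.roll(u, -1) - np.roll(u, 1)) / (2.0 * dx)

def tendencies(u, dx):
    """Two discretizations of the advection tendency -u*du/dx.

    naive: -u * D(u)               does not conserve sum(u**2)
    skew:  -(u*D(u) + D(u*u))/3    conserves sum(u**2) exactly, because
                                   D is antisymmetric (u . D(v) = -D(u) . v)
    """
    naive = -u * centered_d(u, dx)
    skew = -(u * centered_d(u, dx) + centered_d(u * u, dx)) / 3.0
    return naive, skew

rng = np.random.default_rng(0)
n = 64
dx = 2.0 * np.pi / n
u = rng.standard_normal(n)
naive, skew = tendencies(u, dx)
# Instantaneous rate of change of the energy sum(u**2)/2 under each tendency:
drift_naive = float(np.dot(u, naive))   # generally nonzero: spurious source/sink
drift_skew = float(np.dot(u, skew))     # zero to machine precision
```

The point is that stability here comes from the structure of the differencing itself, so any diffusion added afterward can be chosen on physical grounds alone.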
Let’s talk a little about the first things that you did with Yale Mintz here. I’ve read this very nice tribute to him that you wrote. I assume from this that he has died, but it’s not actually stated there. When did he die?
There were a couple of other people interested in general circulation models at that point. There was Phillips in 1956 who had done some numerical experiments. There was the Smagorinsky group and Manabe at GFDL. Though you may have been a little bit ahead of them here, so I’m trying to figure out who…
It depends on the way you look at it. Phillips is a pioneer with his publication in 1956. His results are even included in my monograph. That is a real milestone paper. That is almost the beginning of the story. Everybody you mentioned was inspired by Phillips’ work. But Phillips’ model was very simple; it kind of borrowed the numerical weather prediction model and just put in prescribed heating, without questioning how heating is produced. Then later on his model blew up computationally, so he couldn’t extend it. But still he got very interesting results before it blew up.
The other person who was also working on this around this time was Chuck Leith. Did you have any contact with him? Did you ever see his film?
No. That was around 1960. Then basically there were three groups about to start doing this kind of thing. It’s hard to say who started first. Smagorinsky first started producing their regular model, and it was very simple. Here it was still a regular model, but the model here was the first model which introduced realistic topography and land-sea distribution, and it was global from the beginning. Smagorinsky’s model was hemispheric; basically he used a stereographic projection, which cannot cover the entire globe. Now, this model had many processes, but he used very large diffusion, a smoothing effect, so that his flow is beautiful, but the flow there is too regular — the atmosphere is much more irregular. It’s beautiful, everything is moving. Now our results were the first results which showed the very irregular nature of atmospheric flow. That’s why… May I show you?
Sure, of course. [Recording paused and resumed.] Back to this. You were saying that they introduced some error into the Leith model and into your model and into the others to see what would happen, and the Leith model settles down after an initial blip, but errors grow in the Mintz-Arakawa model and also in the Smagorinsky model.
Of the three, this is the most reasonable. A small initial deviation grows in nature, and that’s why there is the character [?] of predictability. That is the main issue being discussed here. And so if the error is as large as the difference between randomly selected maps in the same season, then the prediction is useless. So, to a large extent, this behavior is due to the very small or practically no artificial diffusive smoothing effect, and that is a consequence of my work, of what I decided. That had a large impact. I recognized the problem here, and solving that problem was rather easy. The real physical problem appeared around 1965.
Let’s go back to your collaboration with Mintz, because I understood you to say in this article that he was not a programmer and never programmed the computer; that was all you. But you’re saying that he was already interested in extending a numerical weather prediction model into a general circulation model by the time you arrived, and part of the reason he wanted you here was to do this.
Tell me about your collaboration with him. What was it like? What was his contribution, what was yours? How did you work together?
I tried to write in that paper that there were — [Chuckles] Strange that we never wrote a joint paper. We had the Mark Rhodes paper, but no paper with just him and me. As I wrote here, every day we spent an hour or so discussing.
On the phone and in the computer center.
Yes. Mintz was not really a theoretician, and I was theoretically oriented. Actually I think I was narrow — I was basically a dynamical meteorologist. He was very broad, so I learned a lot from him and broadened myself. But he also trusted my judgment, especially when dynamics was involved, which was almost everything. He had very good insight, and he had new ideas every day. First he wanted to check each idea with me, and I usually would object [chuckles] to more than half of his ideas, but the rest were very good ideas.
Another thing you said in the article is that you pushed him to slow down and try to get the computational…
Yes. He was not a theoretician, and he never really worked on the mathematical aspects. So he wanted me to just start constructing the general circulation model as soon as I arrived here, but I knew from my own experience at the JMA, and partly from the literature, that the model blows up unless we do something artificial, so there was no point in just constructing it without overcoming that problem, if we were interested in the long-term behavior of the circulation. When I came here, that problem was clear to me. I didn’t know how to solve it, but I was determined that that was the first thing to do. So I tried to persuade him how important this was. And at the same time, there was the additional element of competition with Smagorinsky, who was developing his own model. Of course I didn’t have any such feeling, but Mintz apparently did. Understandably — he was responsible for getting money; I didn’t care!
So was there a kind of race going there, at least from his point of view?
Leith kind of quit after producing beautiful theoretical… But of course Smagorinsky had the Geophysical Fluid Dynamics Laboratory, and of course they were determined. Here it was a very small group at the university, so in my mind we didn’t have to compete with the big laboratory, but Mintz was a very ambitious man. So he was nervous, I would say.
How many others were in your group here, besides you and him?
At that time, just him and me. Right now I’m talking about the period of what may be described as the Arakawa-Mintz model; from 1961 to 1963 I was a visitor here. Just him and me, that’s all. He had a couple of graduate students working on other things.
And you had to sort of steal computer time on the weekends, I understand.
Right. That was a consequence of Mintz’s enthusiasm and his administrative capability. I could not do that. The University administration arranged that, okay. I think I wrote in here that during the weekends we had a computer that belonged to the Business School, and they allowed us to use the computer during the weekends, if we operated it ourselves. Fortunately I already knew how to operate such a machine at the time.
Do you remember what that machine was?
It was an IBM 709. I was familiar with the IBM 704 because I was using that in Japan. By the way, that machine had only 8K memory, and an additional 8K memory was on the drum, so we used the tapes a lot. For one piece of research I presented at that symposium in 1960, a kind of advanced dynamical model, I had to do every detail of the computation on the tapes — all the mathematics I had to do on the tapes [laughs].
What about the 709, what kind of capacities did it have?
32K — oh, that was tremendous! [Laughs]
Of course we used FORTRAN and its compiler, but it took a long time. It might take an hour to compile, so we could not afford to make a mistake. If there was some drastic error, then an hour was wasted, so we were very careful. Now, if we did find an error, then what we used to do was punch out the machine language in binary form, then change that punch card by using scotch tape to fill a hole, and the card goes through the reproducing machine, and we would get the new card that way.
So you fixed the card in machine language.
Yes, we knew machine language.
Give me an idea of how long these programs were, what size approximately.
The operation involved data analysis and many other things, so I don’t recall how long that part was. This here, it is described [pages through report] — about 2,000 FORTRAN statements, 2,000 cards. Not so big at all. That’s a very early version. So I was a visitor here from 1961 to ’63, and that was the period in which I basically constructed the so-called Mintz-Arakawa Model, the early version. The later version was also called the Mintz-Arakawa Model, but the early version was without the condensation of water vapor. So I went back in 1963, and at that time I was informally offered a position here.
How long did you stay in Japan before you came back?
Two years. I had to go back to Japan anyway because of a visa problem. I had an exchange visa called a J visa. Changing that visa to a more permanent one required, at that time, a two-year stay outside of the United States. I had to go back to Japan, and I had a job at the JMA.
What was your job at the JMA when you went back?
It was in the NWP section, developing new numerical weather prediction models, and it involved operations.
So did you take the model you made here back with you?
No, no. The model here is a general circulation model for long-term integration. The JMA’s is a forecast model. Not at that time — much later, models produced here went to Japan, but not at that stage.
Also, I understand from the articles that the methods for getting rid of the computational instabilities didn’t really matter for the NWP models because they were so short term. Is that right, or did you add those to those models there?
It was fine in the usual case, but there were a few times in a year that the problem appeared, even in the two-day forecasts. So JMA really had a very embarrassing time; they could not send the prognostic chart to the weather stations because the model blew up.
So did you add your techniques to those models?
Oh yes, then that worked beautifully, and it’s a more economical version of it.
What did Mintz do with the model, if anything, while you were back in Japan for those two years?
He kept running the model. By then he had a programmer.
Who was that? Do you remember?
Well, that was a little later, Dennis Subsay. The results are reproduced here [flips through papers]. That was actually run — I didn’t run it; it was run here. That was the [???] time after I left. Mintz’s programmer actually produced this. Well. Later on there were a few more programmers, but Dennis Subsay was one of the earliest ones.
So anyway, you were back in Japan for two years working for the JMA.
Yes. I really had the time to decide whether to return. Of course it’s a very big decision.
By the way, in the first two years you were here, ’61 to ’63, did your family come with you?
Not right away, but they joined me five months later, something like that. First I wanted to settle myself. My wife and my son, who was two years old at that time, came later.
So coming back was a big decision because it also involved them.
Oh yes, there was the family, especially my son. He was here from two years old to four years old, so he picked up English very basically by playing with the other children. When he was four years old we sent him to kindergarten. In Japan, even for kindergarten there is an interview, sort of. They showed my son figures of animals and asked what is this, and he only answered in English, which was not understandable to them, so it was not so easy. [Laughs]
So he flunked the test because he knew too much! [Laughter]
He didn’t know Japanese. Then he studied for two years, and when he reached six, school age, that is when we finally decided to come here. So his whole education is here; he was educated as an American. It was a very big change and a very big impact for him. So it was a very big decision, and I really couldn’t make up my mind. But Mintz wrote me every month! [Laughter]
And he finally convinced you, obviously. Very persistent. [Lunch Break]
This is a manuscript Charney gave me.
1960 — something, 4 or 1? Hard to tell. Copy of Applied Physical Sciences by Charney and Monk.
This manuscript, especially the first few pages, talks about von Neumann’s proposal. Von Neumann called a conference of meteorologists and advocated the idea of numerical weather prediction, but Rossby was hesitant. This is the only written source on that. I heard that story from Charney personally.
When we broke off, we were talking about you had been in Japan for two years working for the JMA, and then you decided to come back to this country because Mintz had been writing you letters once a month.
Actually I couldn’t decide, and then automatically everything was arranged. [Chuckles] But I held the position at the JMA for a while just in case I changed my mind, which I doubted. Anyway, what I wanted was to continue this kind of work, and this was the only place I could do it. Certainly not in Japan. So I decided to come here in 1965 as regular faculty. Then the first thing was that I had to start teaching. That was a tremendous strain with the language barrier, and I didn’t have teaching experience in Japan. But in the model chart, this is now model number two [???]. The number really doesn’t characterize the sophistication of the model, but still it is useful to identify the model. That is the model that was developed when I was visiting here.
We’re looking at the chart of the history of the UCLA GCM. Then the bottom level here…
Then that’s the next generation of that model, but the key addition is the water vapor mixing ratio, so now it carries water vapor that produces rain. That is done. The main issue, as far as I’m concerned, that had to be overcome as a computational difficulty (in the very first basic model) was the basic dynamics, the numerical part. Here moisture comes in, which makes the physics very complicated and very challenging, with many things that can happen. That’s how I got involved in the problem of cumulus parameterization. In the tropics the water vapor condenses inside small cumulus clouds, and the model by no means resolves all those. But the heat of condensation released in those cumulus clouds collectively influences the larger-scale circulation. In fact, they are the most important heat source in the atmosphere.
What is again? Cumulus convection?
Heat released inside cumulus convection, in cumulus clouds. Usually we think the sun drives the atmosphere, but that’s not true. The sun gives energy to us — the atmosphere is a coupled system — but a large part of the insolation reaches the ground and is absorbed by the underlying surface, not much by the atmosphere itself. If the underlying surface is land, then when the temperature becomes hot, the heat is given to the atmosphere. Over the ocean it is a completely different story. Energy is absorbed, but the ocean has a very large heat capacity. The energy is there, and the ocean currents move that energy around. Then evaporation takes place, and the ocean loses the energy. Still the atmosphere doesn’t feel the energy, but the evaporated water vapor is now carried around by the atmosphere, and only after it condenses in the atmosphere does it act as an energy source. So the link between the sun and the atmosphere is kind of remote. The southern hemisphere, which is covered mostly by ocean, doesn’t show much seasonal change — it doesn’t care too much where the sun is. But the northern hemisphere has more land, so it does have seasonal change. But still, a large part of the tropics is covered by ocean. So the ocean is crucial in our climate system. Then there was the question of how to formulate the collective effects of the heat of condensation inside those cumulus clouds. That is obviously a very important problem. That is the problem of what we call cumulus parameterization.
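[The pathway Arakawa describes can be put in round numbers. The figures below are an editorial illustration using representative modern global-mean estimates (W/m²), not values from the interview; they show why the surface, and latent heat in particular, dominates the heating of the air.]

```python
# Representative global-mean energy fluxes in W/m^2, rounded; illustrative only.
budget = {
    "solar_incoming_top":     340,   # arriving at the top of the atmosphere
    "solar_reflected":        100,   # reflected back to space
    "solar_absorbed_atmos":    80,   # absorbed directly by the atmosphere
    "solar_absorbed_surface": 160,   # most sunlight heats the surface, not the air
    "surface_latent_heat":     85,   # evaporation; released later in cumulus clouds
    "surface_sensible_heat":   20,   # direct heating of air by the warm surface
}

# The sun heats the surface about twice as strongly as it heats the air,
# and latent heat (condensation in clouds) dominates the surface-to-air flux.
```

This is the sense in which the link from the sun to the atmosphere is "remote": the largest single heat source for the air is condensation of water vapor that evaporated elsewhere.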
So you began to deal with this in the second half of the 1960-‘65 period.
I began to recognize that this problem was very important. After I think I had overcome the computational problem, I decided that was the next problem we had to overcome.
So how did you do it in that period? You added water vapor in, and the other quantities there.
At GFDL, Manabe and Smagorinsky, that group, was ahead of us in the treatment of moisture. I think they were ahead of us in the computational aspects, including covering the global atmosphere and the solar radiation. But their work and ours were sort of complementary. Here, of course, we had to do many things at the same time, but scientifically my own emphasis was the cumulus parameterization problem, though at this phase I had just begun to become aware of the importance of the problem. I tentatively proposed the way the [???] map recognized the ’60s, but this showed up briefly and I explained that.
The chart shows that this two-level model was then developed at the Rand Corporation and Oregon State University in the 1970s.
This was the model IBM used.
You’re talking about the first Mintz-Arakawa model in the first half of the ’60s.
Hm-mmm [yes]. Now, both Mintz and I, especially I, were very lazy about writing, especially during this period. We had a good excuse: we were so busy developing the model that we just didn’t have time to write.
Were you doing a lot of communicating with other groups by giving talks or visitors coming here or you going to other places?
Well, of course we knew pretty well what the other groups were doing. Of course not day-to-day contact; just at conferences and so on. Not really close. Now, Mintz published the results from this first Mintz-Arakawa model. Now, the documentation of this model — oh, this is part 2; this is not exactly it. These were IBM people at San Jose, and this person flew here twice a week when I gave lectures on numerical weather prediction, and they also went over the code of this model very carefully and produced the documentation.
Why were they interested at IBM?
At this time it was a spectacular application of their computer.
What did these people do? What were their backgrounds? Were they computer scientists?
Also fluid dynamicists. But he didn’t know meteorology at all. That’s why he took my course, flying in from San Jose. I also went over and visited them pretty frequently. So we produced documents. This is only part 2, the computational part; there is a part 1 which described the model itself, but I cannot find it. [Looks through papers in room.] Part 1 is in this form, but this is what was produced. This is Langlois and Kwok, a description from 1969, but it really was this model.
This is Numerical Simulation of Weather and Climate, Technical Report No. 3.
That was published from the group here.
Published by the Department of Meteorology, UCLA. The title is “Description of the Mintz-Arakawa Numerical General Circulation Model” by W. E. Langlois and H. C. W. Kwok, first of February 1969, but it is in fact a description of the 1961 to ’63 Mintz-Arakawa model. There’s also a part two called “Numerical Simulation of Weather and Climate, Computational Aspects” by the same people, published by the IBM Research Laboratory, Large-scale Scientific Computations Department, in San Jose, and this does not have a date.
Still, that model is very simple, but it has moisture for rain, and there is a very simple formulation of the effect.
I have a couple more questions about the report, before we go on. How did you hook up with them in the first place? How did they find out what you were doing?
Well, IBM would know immediately. I don’t know about now, but the Computation Center here was very well supported by IBM, and they were there. They were very much interested in what’s going on. And meteorology always uses the most powerful machine available at the time. One of the first applications of their computers and so on was weather prediction. And since then, whatever new machine came, we could immediately use it.
But the main channel was that their technical support team was often here, so they knew what you were doing and they got interested and came down to check it out.
And they wanted to have in-house people who were familiar with the model.
Do you know if IBM went on to develop the model in any way after this period?
They applied this model to… I forget; I know his publication. They really didn’t develop the model any further, but they used the model. Now, this version of the model, the second version, became very famous — when people say Mintz-Arakawa model, many probably mean this model, the second one. That is what was exported to the Rand Corporation. There was a group headed by Gates, a group of people, and they used our model. First they studied the model in every detail and produced these documents.
This is a report by the Rand Corporation prepared for the Advanced Research Projects Agency. The report number is R-877-ARPA. It’s called “A Documentation of the Mintz-Arakawa Two-Level Atmospheric General Circulation Model,” and the authors are W. L. Gates, E. S. Batten, A. B. Kahle, and A. B. Nelson. The date is December 1971.
This even includes the results and even the code. This was distributed worldwide, so this became the prototype of many models in the world at that time.
This report is what was distributed worldwide?
Apparently. So the Soviet Union, they used this model, or somebody even combined this model with a [?] model.
Really? Do you know who that was?
I really don’t remember. I may find something later on.
I’d be very interested in that, if you find it.
In China they used the model. This was a very sophisticated model, although very simple, too. Mathematically it was a little simple, but the physics was pretty sophisticated. So it became very popular. Well, very largely that was due to the [???], because the full documentation was distributed. No other model had such detailed documentation at that time. So anybody could reproduce these results.
You have down here also Oregon State University.
Now, that door opened automatically because Gates’s group at the Rand Corporation moved to Oregon State University, so he took the people and added people there. Michael Schlesinger was one of them. So Gates took that entire group to Oregon State with the model. So automatically it became Oregon State’s. And Michael Schlesinger later moved to the University of Illinois with the model. That’s what I meant: these institutions made their own changes, naturally, developmentally, but it can all be traced back to this model.
Any other groups that you remember that took this model up directly? You mentioned Rand and Oregon State, Schlesinger at Illinois, and this Soviet group and the Chinese. Any names you can think of of people who did that would be very helpful.
I don’t recall any others than these. Still, general circulation model activities were confined to only just a few centers. Of course GFDL, Steve Minores [?] group, developed their own GCM. Up to this point they had a nine-level model with moisture and so on. And Leith quit; his interest turned to something else. Around here, NCAR, the [???] Washington group, developed their own model.
Also the second half of the ’60s.
Those were all developed only in the United States. There were no other places.
What about the Swedes, the Swedish meteorology groups, had they built a GCM by this point?
No. They had an NWP model, short range; not a GCM. So those three centers — not only the only three in the United States, but the only three centers in the world. I hope I’m not missing anything.
That’s all that I know of.
Of course in the 1970s, that’s another story. I think there must have been other users of this model; those were all that I remember. Oh — there was also this model — the French.
The French! I was going to ask you about…
Pierre Morel, he used this model (the early one). Not for climate simulation. He was consistently interested in measuring the atmosphere by balloons — releasing balloons — and they had a project, dating from the 1960s, of releasing many balloons and tracing them. They were hoping that from the movement of the balloons they could construct the wind field. But to find [?] the launching sites and so on — to think about releasing the balloons — they used the model to simulate [???].
That brings up something else about the French. One of the articles of yours I found was published by you and Mintz with somebody named Sadourny, a Frenchman, published in 1968, and that was on the use of a 20-triangle geodesic grid. There’s a model of it sitting on the file cabinet in here. So it is a triangular spherical icosahedral grid, and that article describes a kind of initial test of this grid as a way of dealing with the globe. Whatever became of that? Did you ever use it? Did he go on to develop it somewhere else?
He went on. The problem is how to cover the sphere more or less evenly. It is a surprisingly difficult problem, and no single mapping can cover the entire globe. With some mappings the poles go to infinity; one map can basically cover only one hemisphere. So a difficulty in mapping always appears as a difficulty in a global model. Smagorinsky used the stereographic mapping, and that is really tangential to the sphere at one pole, so it cannot cover the other pole. So there are two stereographic projections, one tangential at the North Pole and the other at the South Pole, and near the equator those two somehow have to be coupled. But the grid points in the one projection and the grid points in the other projection don’t coincide, so interpolation has to be done. Here from the beginning we used spherical coordinates, just longitude and latitude, but the problem there is that near the pole all the meridians converge to one point, so there are computational problems.
The time steps get very short at the pole.
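The shrinking time step can be made concrete with a few lines. The following Python sketch is purely illustrative (the cos-latitude spacing formula is standard geometry; the 300 m/s fast-wave speed and the function names are my assumptions, not values from the UCLA model):

```python
import math

def zonal_spacing_m(lat_deg, dlon_deg, radius_m=6.371e6):
    """East-west distance between adjacent points of a latitude-longitude
    grid, in meters; it shrinks as the cosine of the latitude."""
    return radius_m * math.cos(math.radians(lat_deg)) * math.radians(dlon_deg)

def cfl_dt_s(lat_deg, dlon_deg, wave_speed_ms=300.0):
    """Largest time step (seconds) for which a wave of the given speed
    still satisfies the CFL condition dt < dx / c."""
    return zonal_spacing_m(lat_deg, dlon_deg) / wave_speed_ms

# On a 5-degree grid the stable step collapses toward the pole:
for lat in (0.0, 60.0, 85.0):
    print(f"lat {lat:4.0f}: dx = {zonal_spacing_m(lat, 5.0) / 1000.0:7.1f} km, "
          f"dt < {cfl_dt_s(lat, 5.0):7.1f} s")
```

At 85° the east-west spacing, and hence the stable step, is less than a tenth of its equatorial value, which is the difficulty Arakawa's polar filtering addresses below.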
Oh yes. But I invented a way to overcome that difficulty which allows longer time steps in spite of that problem.
How does that work?
It was beautiful, and still many models are using basically the same approach.
So explain it. What is it?
Now, that kind of problem doesn't appear if you use the so-called spectral method, but that's a later development. Basically, near the poles the meridians get very close, and so the grid interval in the west-east direction becomes very short. Time steps have to be short compared to the time for the fastest wave to propagate from one grid point to the next point. Now, what I did: because the resolution there is too high compared to other latitudes, the speed of the wave in the west-east direction is very fast, or I should say, short waves compared to other latitudes are picked up, and they don't do anything good, because at other latitudes such waves are not resolved anyway. So basically what I did is filter those waves.
Filter them out?
Not eliminate, but let them slow down, because they were unnecessary to start with, and unnecessarily fast. So that technique is polar filtering. I invented that. That's why we were able to use the spherical coordinates; then it was just straightforward to cover the entire globe.
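The idea can be sketched as a Fourier filter along one latitude circle. This Python fragment is a simplified illustration of polar filtering, not the actual filter used in the UCLA model; the particular damping form and the 60° critical latitude are assumptions:

```python
import numpy as np

def polar_filter(field, lat_deg, lat_crit_deg=60.0):
    """Damp (not remove) short zonal waves on one latitude circle.

    Fourier amplitudes of zonal wavenumber m are multiplied by a factor
    that is 1 for well-resolved waves and < 1 where the local east-west
    spacing is finer than at the critical latitude, slowing the offending
    waves rather than eliminating them."""
    n = field.size
    spec = np.fft.rfft(field)
    m = np.arange(spec.size)
    dlon = 2.0 * np.pi / n
    sines = np.maximum(np.sin(0.5 * m * dlon), 1e-12)  # avoid 0 at m = 0
    factor = np.minimum(
        1.0,
        np.cos(np.radians(lat_deg)) / (np.cos(np.radians(lat_crit_deg)) * sines),
    )
    return np.fft.irfft(spec * factor, n=n)
```

Equatorward of the critical latitude the factor is 1 for every wavenumber, so the field passes through unchanged; near the pole only the short waves are damped.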
Late ’60s, early ’70s.
Here we bypassed the problem by not going all the way to the pole; we stopped somewhere around 75°.
Right, in the first model you stopped at about 75°.
But here it goes all the way to the pole because of that coordinate.
We were talking about the original one, also late ’60s.
Okay, that is around here. So the problem of covering the entire globe with an approximately uniform resolution was an important issue, and it's still important. Basically we used the mathematical theorem that the maximum number of faces covering the sphere by regular polygons is 20; we can't go more than that. There is a mathematical proof, I hope, but I don't know. [Chuckles] We can have the cube, we can increase the number of faces, but not more than 20. The dome at the Arctic station, the dome, if you look closely, is based on that icosahedron, 20 sides. And an example is a soccer ball, the white and the black. If you count the number of white faces, it's 20. So everybody has the same problem!
He’s showing me the model of the sphere, rectangular grid.
This is hexagon.
So 20 hexagons.
I should go back to that triangle. I developed this. This is a triangle, and then it just splits into smaller triangles like that. Then we can cover the sphere approximately by smaller triangles, so we can keep splitting into smaller triangles. But this is a hexagon, and here it is a pentagon, 5 sides.
I see, so there are some hexagons and some pentagons on the surface.
So we had a mix. That is [???] because now I know the maximum number of faces is 20, so at the corners we have to have pentagons. Okay, so this is a rather well-established technique to approximately cover the sphere, already known, so we just used that. And in that paper, basically we used the Charney model and simply just adapted it for this thing (using the icosahedral grid), and that worked beautifully. One thing that surprised me: the technique I found to make computations stable in long-term integration, I found that in 1966, but that is for a regular square grid or rectangular grid. Now, if we had this hexagonal grid to start with, then the simplest thing, which everyone will do first automatically, satisfies the [???] requirement. So if meteorologists had used this kind of grid from the beginning, I wouldn't have had to edit the earlier Mintz! [Laughter] Or, the existence of that kind of computational instability was probably not known at that time.
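The subdivision Arakawa demonstrates with the desk model can be sketched in a few lines. The following Python fragment is an illustrative reconstruction, not code from the 1968 paper: it splits each of the 20 icosahedral triangles into four at every step and projects the new vertices onto the sphere. Exactly 12 vertices always keep 5 neighbors, which is the built-in pentagon singularity he describes:

```python
import math

def icosphere(subdivisions):
    """Cover the sphere with triangles by repeatedly splitting the 20
    faces of an icosahedron and re-projecting onto the unit sphere."""
    t = (1.0 + 5.0 ** 0.5) / 2.0
    def normalize(v):
        n = math.sqrt(sum(c * c for c in v))
        return tuple(c / n for c in v)
    verts = [normalize(v) for v in [
        (-1, t, 0), (1, t, 0), (-1, -t, 0), (1, -t, 0),
        (0, -1, t), (0, 1, t), (0, -1, -t), (0, 1, -t),
        (t, 0, -1), (t, 0, 1), (-t, 0, -1), (-t, 0, 1)]]
    faces = [(0, 11, 5), (0, 5, 1), (0, 1, 7), (0, 7, 10), (0, 10, 11),
             (1, 5, 9), (5, 11, 4), (11, 10, 2), (10, 7, 6), (7, 1, 8),
             (3, 9, 4), (3, 4, 2), (3, 2, 6), (3, 6, 8), (3, 8, 9),
             (4, 9, 5), (2, 4, 11), (6, 2, 10), (8, 6, 7), (9, 8, 1)]
    cache = {}  # one new vertex per edge, shared by the two faces
    def midpoint(i, j):
        key = (min(i, j), max(i, j))
        if key not in cache:
            verts.append(normalize(tuple((a + b) / 2.0
                                         for a, b in zip(verts[i], verts[j]))))
            cache[key] = len(verts) - 1
        return cache[key]
    for _ in range(subdivisions):
        new_faces = []
        for a, b, c in faces:
            ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
            new_faces += [(a, ab, ca), (b, bc, ab), (c, ca, bc), (ab, bc, ca)]
        faces = new_faces
    return verts, faces
```

After k subdivisions there are 20·4^k faces and 10·4^k + 2 vertices, and no matter how far the splitting goes, 12 vertices (the original icosahedral corners) keep only 5 neighbors.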
Did anybody pick it up? Has this been used systematically elsewhere? [No.] I mean it sounds like it has some inherent advantages over the rectangular grid.
Sadourny was a visitor from France, and he went back. He continued in Paris.
Do you remember the name of the institution? [Arakawa retrieves address.]
He became the director of that laboratory until recently. So he continued, and naturally used this a little bit more. And he had lots of difficulties, mainly because there is a built-in singularity here, because we have a pentagon instead of a hexagon. Now, no matter how many times we split, here, here, we can split this triangle into smaller triangles as much as we want, but still this pentagon right here remains a pentagon, not a hexagon. So there is a built-in singularity. And even if you split this into smaller triangles, there is a discontinuity around this way, around this way, and so on. Now, when the physics of the model is simple it doesn't bother, but when it becomes more complicated, as in a climate model, then we may have a built-in climate singularity, or we may have heavier rain along these lines. So he had difficulty basically due to that, and in my mind it was very important that we have only two singularities, at the North Pole and the South Pole; with this, what you do is have more. So I myself discontinued it. Of course it was harder for Sadourny to forget about it, but he eventually gave up. So no comprehensive climate general circulation model was constructed based on this. But quite recently, David Randall, I'm pretty sure you'll hear his name, from Colorado State University, he's my ex-student, and he is now very active in general circulation modeling. It is kind of a branch out from the model here. He is reworking the use of this one, and it seems to be promising. If we use the momentum equation, then we have to actually use that vector equation in its component forms. Then the components along the coordinate lines here, it's a nightmare: this can't be the coordinate, that can't be the coordinate. Two components are enough, but the three coordinate lines are a nightmare.
What David Randall is doing: instead of using the original vector equations, he is using derived scalar equations, and then applying them to this kind of grid, but somehow modified manually to eliminate singularities. If we try to do [???] this event, this is it; we can even show it numerically: take the sphere and put many points, and give those points a repelling force together, so that they try to get the largest mean distance from each other. Then they integrate that and they end up with something like this.
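The numerical experiment described, spreading points by a mutual repelling force, can be sketched as follows. This is a toy Python version; the inverse-square force law, step size, and iteration count are my choices, not those of the experiment mentioned:

```python
import numpy as np

def spread_points_on_sphere(n, steps=2000, lr=0.01, seed=0):
    """Let n points on the unit sphere repel each other, re-projecting
    onto the sphere after every step, so they drift toward a maximally
    spread (largest mean distance) configuration."""
    rng = np.random.default_rng(seed)
    p = rng.standard_normal((n, 3))
    p /= np.linalg.norm(p, axis=1, keepdims=True)
    for _ in range(steps):
        diff = p[:, None, :] - p[None, :, :]           # pairwise separations
        dist = np.linalg.norm(diff, axis=-1)
        np.fill_diagonal(dist, np.inf)                 # no self-force
        force = (diff / dist[..., None] ** 3).sum(axis=1)
        p += lr * force
        p /= np.linalg.norm(p, axis=1, keepdims=True)  # stay on the sphere
    return p
```

For n = 12 the points settle near the vertices of an icosahedron, consistent with the grid ending up "something like this".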
Like this spherical triangular one.
That is vectored. So that's like that. Then around the late 1960s, here we began to construct the three-level model, which is basically this model but with another boundary layer just next to the surface.
So the two-level model plus the planetary boundary.
Right. The planetary boundary layer is important for cumulus convection.
This is late ’60s, early ’70s.
Yes, we developed it, but we almost immediately stopped developing that model, because it is a good idea to have another layer, but that puts the interface of the two layers too low and produces some bad effects and so on. So we terminated this version of the model (the three-layer), but NASA became interested in that model, so we exported this version of the three-layer model to NASA, to the Goddard Institute for Space Studies in New York.
The New York group. Who got interested in it and took it up?
Actually at that time the Institute as a whole was basically two groups. One is the group headed by Milton Halem; he is now at Goddard with this. But his group was there, and there was another group headed by Jim Hansen, who is now the director. So both groups. And what they did was change our three-level model to a nine-level model, dividing each layer into three sub-layers in their model, and that became the NASA GISS model. But the cumulus parameterization, which is very important, was designed for the three-level model; our version of this model is a three-layer model. So what they did is what they called strapping. They have a nine-level model, but for the purpose of cumulus convection they combine the three layers into one layer, so then it looks like the three-layer model.
They still use the three-layer cumulus parameterization, but then they move it into nine layers and back and forth. I see.
Cumulus parameterization was a very, very difficult problem, and all I could offer at that time was for three layers.
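The "strapping" trick can be illustrated in a few lines of Python. This is a schematic reconstruction; the mass-weighted averaging and the uniform redistribution of tendencies are my assumptions about the method, not the actual GISS code:

```python
import numpy as np

def strap(values_9, dp_9):
    """Combine 9 model layers into 3 thick layers by mass-weighted
    (pressure-thickness-weighted) averaging, 3 sub-layers per layer."""
    v = np.asarray(values_9, dtype=float).reshape(3, 3)
    w = np.asarray(dp_9, dtype=float).reshape(3, 3)
    return (v * w).sum(axis=1) / w.sum(axis=1)

def unstrap(tendencies_3):
    """Spread each thick-layer tendency uniformly back over its 3 sub-layers."""
    return np.repeat(np.asarray(tendencies_3, dtype=float), 3)

# Hypothetical 9-layer profile as seen by a 3-layer cumulus scheme:
t9 = np.linspace(300.0, 220.0, 9)   # assumed layer temperatures (K)
dp9 = np.full(9, 100.0)             # assumed layer thicknesses (hPa)
t3 = strap(t9, dp9)                 # what the 3-layer parameterization sees
```

The nine-layer state is collapsed to three layers for the convection calculation, and the resulting tendencies are spread back over the nine layers, which is the back-and-forth Edwards summarizes below.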
One of the things I’ve heard about your cumulus parameterization scheme is that in some versions of the model there was an effect called cloud blinking in which when the humidity rises to a certain level in a grid box, the cloud will go off and on at each time step, so that it appears and then disappears over and over again.
That kind of thing can happen with any parameterization if we are not so careful, even with… not with [???], with any condensation. Condensation basically occurs when the air is very humid, and then the precipitation dries the air. If it dries too much, then at the next time step nothing happens, and it waits until the air becomes humid again. So it is a constant over-response of the condensation to the humidity. If we relax it, then it is smoother. It's not unique to any scheme. It can happen with any scheme that implements that kind of direct, instantaneous adjustment.
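The over-response Arakawa describes can be reproduced in a toy grid box. In this Python sketch (purely illustrative; the forcing rate and the over-drying amount are arbitrary), a scheme that dries the box past saturation "blinks" on and off, while one that removes only the excess precipitates steadily:

```python
def run_box(steps, moistening, overdry):
    """Toy condensation in one grid box.  Humidity q is forced upward each
    step; when q exceeds saturation (qs = 1), the excess is condensed,
    plus an extra `overdry` amount mimicking a too-aggressive adjustment.
    overdry = 0 responds smoothly; overdry > 0 switches on and off."""
    qs, q = 1.0, 0.0
    precip = []
    for _ in range(steps):
        q += moistening
        if q > qs:
            p = min(q - qs + overdry, q)
            q -= p
        else:
            p = 0.0
        precip.append(p)
    return precip

steady = run_box(40, 0.3, 0.0)  # rains every step once saturated
blinky = run_box(40, 0.3, 0.5)  # alternates rain / no rain
```

The second run over-dries the box at each condensation event, so nothing happens on the following step until the humidity recovers, which is exactly the on/off "cloud blinking" behavior.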
So this went to this, and then you’re saying Halem’s group moved to Goddard.
Halem's group moved with the model, and Hansen's group stayed and kept the atmospheric model. Of course they made their own developments, and they even have another version, but this was the prototype, and they changed the name and so on.
So the GISS and the GLAS models are the same model.
Yes. Well, of course that was the new model that they developed, so I should have [???]. But some trace still can be made of that. Now, this is about 1970; as far as we are concerned, we stopped developing this version, and then we went to these two versions. These two versions are just sister models. This one is basically a troposphere model; it goes up to 50 millibars, so just above the troposphere. And this model goes all the way up the stratosphere, 50 kilometers or so.
So it’s a 12-layer model that goes to one millibar [Right.], and a 6-layer model that goes to 52 millibars.
Right. And the motivation for going to the stratosphere: right around that time there was a big national project sponsored by the Department of Transportation.
The Climate Impact Assessment Program — I was going to ask you about that.
Yes. So they very much wanted to have a model with the stratosphere, and many such groups participated in that program, and we also supported that program, basically by developing the general circulation model with the stratosphere to predict ozone. Ozone is one of the prognostic variables in this model.
Ozone in addition to water vapor. And the inclusion of ozone, was that basically because of the SST controversy? Or was it something that was based more on the theory?
The motivation for including ozone was that the Department of Transportation was interested in the supersonic transport effect, which would add water vapor to the stratosphere. That may influence the [???] processes there and may destroy ozone. Now these days we know the ozone hole and so on, but at that time of course people didn't know; but people were aware that the water vapor from the supersonic transports may have some influence on ozone. So very extensive research of various kinds was mobilized, not only modeling but observation, people's [???]. It was a big program, and we benefited from that. Of course the National Science Foundation was always a major sponsor; I'm not sure about this period, but from that model on. Then we got very close cooperation with NASA. NASA became also a very important sponsor, and around the 1970s the Department of Transportation.
Let me ask you a couple more questions about sponsorship and funding and so on. Do you have any memory what your budgets were for this work, and the first two or three models you made?
I don’t remember, but I can look it up.
It’s not that important, but I’m just curious about orders of magnitude.
Orders of magnitude, hundreds of thousands per year.
What does that money buy? Was it computer time, equipment?
No. It was our time, and we used the IBM 709 at the Business School. But you always need money to buy computer time. We partly used the University's main computer, which is in this building, and since we were the biggest user of the University computer, we got a very good deal. And again, during the weekends we were able to use the computer at the Medical Center. They have a very big computer also, and it was free. This may not be official [laughs], so please don't go report it. Well, instead we paid for the maintenance, so it was a deal. When something happened during the weekend, we called up the maintenance personnel at our own expense.
This work was funded, or at least the report was made, to the Advanced Research Projects Agency. Did you ever have sponsorship from them, or was that just the Rand group?
Just the Rand group. I have copies of our proposals to NSF somewhere, but I was not really involved in the administrative aspects at this early stage, only later on. Now this one (the CLIMAP model), in this period we did have a very extensive effort in developing this model. CLIMAP was one of the sponsors, but still NSF. By this time Mintz was pretty much kind of out. He still had an interest, but…
So he was retiring?
Not retiring, but he had very broad interests.
So he was going to do other things. Right, I remember that from your article.
So I had to supervise many people. Actually many students participated with us and worked on various aspects. This version of the model is described here.
This is the UCLA Atmospheric General Circulation Model by Arakawa and Mintz, with the participation of A. Katayama, J. W. Kim, W. Schubert, T. Tokioka, Michael Schlesinger, W. Chow, David Randall, and S. Lord. These are notes from a workshop, 25th March to 4th April 1974, at the Department of Meteorology, UCLA.
This is actually from the second workshop, mainly describing this version of the model, the three-layer, for which we had the workshop.
Technical Report Number 7, “Design of the UCLA General Circulation Model” by Arakawa, 1st July 1972, Department of Meteorology, UCLA. You think this probably describes the three-layer model of the early ’70s.
The main things discussed here are still relevant now. So although this is just an informal technical report, it is very well cited. Many things appeared here for the first time, rather informally. So we had a workshop, and this one is based on the notes of that workshop for this model; this one is based on the workshop notes for that model. You see the difference? This time I had to do everything [chuckles], and here I had a task force.
I see, these were all the graduate students.
This is a visitor from Japan, Katayama. This was a postgraduate from here, Kim. This was a graduate student who is now a professor at Colorado State University, Schubert. He is now a professor at Yonsei University in Seoul, Kim. He is now the head of the climate section of JMA, Tokioka; he has become a big shot! Schlesinger you know. Chow is at NASA. David Randall, we already talked about him. And Steve Lord is now the acting or formal head of NCEP. Do you know NCEP? It used to be called the National Meteorological Center, so that is the place in charge of numerical weather prediction within the US Weather Bureau. So everybody has become a big shot, and all participated heavily in developing this model and participated in the writing.
The 12-layer and the 6-layer models went to the Meteorological Research Institute in Japan.
Yes, and they are still using it.
Were you responsible for that, or did they come to you?
Katayama was a visitor from there, and Tokioka, and basically they took the model back. He worked on it while he was here. Actually he was the boss of this, so…
At this point, when you say he took the model back, what exactly did he take? Punch cards, tapes, just code?
I think he took tapes with the source code on them. We relied heavily on punch cards earlier; by this time (in the ’70s) we were on tapes, we didn’t use punch cards.
What sorts of computers are you using on the third, fourth, and fifth models here?
I don't know now, but the Computer Center had a strong tie with IBM, and the computer they had around this time was called a System 360. There are many versions, and I don’t remember which version.
This period is also the point at which the ARPANET began. Did you use that at all? UCLA is one of the very first nodes of the ARPANET.
No, we didn’t use it directly. We began to use the NCAR computer also, and we were irritated by the slowness of bringing data back.
How did you use the NCAR computer? You would send them what, tapes?
We sent tapes to store the results, what we called history tapes. But otherwise we did it electronically.
In what sense? How did you transfer the information?
Just use regular telephone and a modem.
Really? Boy, they were awfully slow back then.
Awfully slow! Awfully slow! That’s why we were irritated; it takes hours for it to come back, and just a limited amount of data.
Interesting. So the advantage to you of using the NCAR computer was that it was so much faster than what you had, even though the data transmission was so slow?
I don’t remember at what point the Cray came into the picture. Of course the Cray was the fastest computer available at the time.
I believe that would have been the late ’60s, early ’70s, because before that it was Control Data, and then Cray left Control Data and formed his own company.
That’s right, so it was around this time. So it was the fastest computer, and we could get free time. We had to write a proposal; it was reviewed both scientifically and by the NCAR computation people, and if it passed we could use the computer free for an allocated amount. Sorry, I forgot to say more precisely which computer I used from year to year.
I know there were many, so.
We used our own computer downstairs, and the NCAR computer. There was a possibility to use the NASA computer, but I don’t think we ever used it…
This is the 15 millibar 6-layer troposphere model.
I think this version was exported to the US Navy and became the prototype of their operational NWP model.
Really? So the second version of the 2-layer model.
That’s not that model entirely, but very close to this model. More importantly, around here, this version of the model was exported to the Navy, and it became their standard operational NWP model. It’s called the NOGAPS model: Navy Operational Global Atmospheric Prediction System. They made their own changes, mainly to make the model faster. The model we developed here is a research model; we didn’t care too much about efficiency. In an operational center, of course, efficiency is very important.
Why did they use your model instead of one of the NWP models that already existed? Was it because they needed a global model?
By this time a global model was not unique; many people had one. But our model was considered to have the most advanced physics, and many other things too.
Again, did they come to you? Who was it that was responsible for picking up your model there?
The Naval Postgraduate School institute or laboratory in Monterey; they keep changing the name, I lost track! Once it was called the Navy Environmental Prediction Research Laboratory, but they changed the name. So we had a scientific connection. When the Navy wanted to have their own weather prediction operation, they tried to use some aspects of this model.
Just a general question about what happens when these models go to these institutions. How often did you get phone calls or letters from people who were working on changing the model and needed to know things from you?
With this IBM… as a consultant to them, I’d be with them maybe once a month for a period of a year or more, probably more. Now, the Rand Corporation is so close, and especially later on our own people moved there; I forget the details and so on, but Schlesinger and [???], these two of our own people, moved. And so that was for this model. Here, well, I was a consultant for Rand. And at this workshop, these data center people, and I frequently visited them. And there were [???] and [???] maybe once every half a year or something like that. And that was not much of a problem, because these people were directly involved in developing this model.
Right, so they needed less consultation, sure.
And for the Navy, the key person there is Tom Rosmond. He gave us a contract, money and support, so that was additional support. And he used to keep his desk open, and of course telephone calls. But this time I didn’t have to be involved so much in technical matters, in how the program of the model worked in every detail, so most of the questions could be handled by a programmer. Now, this is the prototype of the models we are using now, mostly this version. There was a 29-layer model, but that’s experimental.
So now we’re talking about the last two models on this chart that start in the early ’80s, a 9-layer and a 15-layer.
Right. Of course we are still making many changes now, but this model is running now. Also, this model, first the 9-layer model but now the 15-layer model, is coupled with the ocean, so we are running a coupled atmosphere-ocean general circulation model, and that is a very exciting thing, which I should explain here. I’ve given it a new name. Very exciting.
On this chart (here we’re looking at the cumulus parameterization article) you have coupled AOGCMs starting in the late ’80s and continuing. Was there any talk about trying to couple those complicated general circulation models before that time?
Oh, definitely. Manabe was a pioneer, and at least in the same group context, so not global but a sector. He did couple his atmospheric GCM with the ocean GCM, but efforts like that in the community were kind of unique and in a sense isolated. Now, from around here, there was a massive move toward coupled models worldwide.
Right, okay. So let’s go back to these, and tell me more about these last two models, the 9- and 15-layer. So here you’re adding, this is the planetary boundary layer?
The boundary pressure, so basically the planetary boundary layer depth. So what’s unique about the model here is that we predict the depth of the planetary boundary layer. At the top of the boundary layer, especially with clouds, there is a very sharp gradient, and a very complicated process is going on, so we want to follow that.
Actually, just for the record, let me make sure I have all the abbreviations on this chart, all the quantities on this chart, noted down. This I understand. D and B…
Other versions explain this. I have seen the version. Okay, these are horizontal grids, which were first explained in this report, my 1972 report: the A-B grid structure. So different ways of distributing the prognostic variables. And this was a technical report, very informal. The more formal publication of this kind of stuff came later.
I have this actually. That came out in the book, I think, 1977.
So usually people reference this one for this grid, and people call them the Arakawa A grid, Arakawa B grid, Arakawa C grid, D grid, E grid. I’m given credit for all of this! [Laughs]
Okay. So that’s the D, B and C here.
Yes, D, B, C. This you understand. These are additional [???] variables; additional means compared to the standard ones. The standard ones are the velocity components and the potential temperature, or the temperature itself. Q is the water vapor mixing ratio. O3 is ozone, in this model. GW is ground wetness. GT is ground temperature. S is snow amount over land: snow mass, mass of snow. PB is the pressure at the PBL top. Delta is the discontinuity of variables across the PBL top, the jump. This paper really describes this version, the 6-level and 12-level. And then this is the PBL; the PBL top is predicted as PB, okay, and any possible jump discontinuity across the PBL top is also predicted. Perhaps the most important, or what is cited most frequently, is my paper with Schubert on cumulus parameterization, Arakawa-Schubert. That appeared in 1974.
You think that’s the thing of yours that’s most cited?
Yes. Well, that paper is very notorious; people say it is very difficult to understand. I thought it obvious, and I still think it’s obvious, but foolishly I made the paper too long; it should have been split into parts. And I used too many equations in order to be precise. So that makes it difficult to follow. And Charney was a very good friend. He was much senior to me, and very good to me. He was very interested in that paper, and when I presented basically that material a few years before that, Charney was excited and he flew here just to discuss it. So he was one of the very few who got the point where other people didn’t, but even he said that that paper has 150 equations and it takes 150 hours to understand. [Laughs]
Wow! If it took him that long, I can imagine how long it took other people.
But he immediately got the point. He didn’t get the point at all when I gave the talk, but during the coffee break immediately after that: “Oh, I see! That’s great.” So he was consistently interested in it. But I forgot to say: here I had to stay with the 3-level model because I didn’t have the cumulus parameterization for more than three levels. That was one of the motivations to attack that problem more generally, and that came out as the Arakawa-Schubert paper. And that is implemented in this one, so it can be applied to any number of levels; it is very general, and perhaps I tried to make it too general, so it became too difficult to understand. So, my 1966 paper on the computational design: people call it the Arakawa Jacobian paper. Basically the advection process can be expressed in the form of a Jacobian operator. So the 1966 paper is known as the Arakawa Jacobian paper, and the Arakawa-Schubert paper of 1974; those are the most frequently cited papers of mine. So now we have a new version to which we added more prognostic variables, such as ice, and they have snowfall. This is the Central Weather Bureau, Taiwan. CSU is where David Randall is.
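For reference, the scheme of the 1966 paper can be written compactly. The following Python fragment is a standard rendering of the Arakawa Jacobian on a doubly periodic uniform grid (the periodic boundary and unit grid spacing are illustrative choices, not from the original FORTRAN). Its defining property, checked here only numerically, is that the domain sums of J, psi·J, and zeta·J all vanish, so the advection term conserves mean vorticity, kinetic energy, and enstrophy:

```python
import numpy as np

def arakawa_jacobian(psi, zeta, d=1.0):
    """Arakawa's (1966) discrete Jacobian J(psi, zeta) on a doubly
    periodic uniform grid of spacing d: the average of three
    second-order forms, chosen so that the discrete advection conserves
    mean vorticity, energy, and enstrophy."""
    def s(f, di, dj):
        # s(f, 1, 0)[i, j] == f[i+1, j], with periodic wrap-around
        return np.roll(np.roll(f, -di, axis=0), -dj, axis=1)

    j_pp = ((s(psi, 1, 0) - s(psi, -1, 0)) * (s(zeta, 0, 1) - s(zeta, 0, -1))
          - (s(psi, 0, 1) - s(psi, 0, -1)) * (s(zeta, 1, 0) - s(zeta, -1, 0)))
    j_px = (s(psi, 1, 0) * (s(zeta, 1, 1) - s(zeta, 1, -1))
          - s(psi, -1, 0) * (s(zeta, -1, 1) - s(zeta, -1, -1))
          - s(psi, 0, 1) * (s(zeta, 1, 1) - s(zeta, -1, 1))
          + s(psi, 0, -1) * (s(zeta, 1, -1) - s(zeta, -1, -1)))
    j_xp = (s(psi, 1, 1) * (s(zeta, 0, 1) - s(zeta, 1, 0))
          - s(psi, -1, -1) * (s(zeta, -1, 0) - s(zeta, 0, -1))
          - s(psi, -1, 1) * (s(zeta, 0, 1) - s(zeta, -1, 0))
          + s(psi, 1, -1) * (s(zeta, 1, 0) - s(zeta, 0, -1)))
    return (j_pp + j_px + j_xp) / (12.0 * d * d)
```

Any one of the three forms alone fails to conserve the quadratic quantities; it is the equal-weight average that makes long-term integrations computationally stable, which is the result the interview refers to.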
And GLA shows up here again. Is it the same group?
Yes, same group, same division, but not necessarily the same people.
Is it the same person? Is it still Halem who is…?
No, Halem moved to a different department, so the people changed.
Okay, GLA picked it up again. How did that happen?
This report describes this version. Now, these two models: the person whose name is Max Suarez, he is a very smart person. He got his Ph.D. from Princeton, but he joined here as an assistant professor, and he became one of my closest colleagues. Really one of the best colleagues I have ever had, and a very sharp guy. He worked on the development of these two versions of the model.
Okay. And did he then leave?
Then he left and moved to NASA, to the Goddard Laboratory for Atmospheres, and naturally worked with the model there. But no, he developed his own version of the model. It has many things taken from this model, but with new aspects also. So that is Max Suarez’s model at NASA. Now NASA has many later models, so.
Okay. And the Livermore, who picked that up?
Actually I don’t have direct contact with them, but I have a colleague, Carlos Mechoso; he is also a professor here. He is usually called Roberto, which is his middle name. He also came from GFDL, but he has been on the faculty here for many years, and he is now a professor. And he is really taking over leading the general circulation model group. I’m retired, formally at least. Now, the connection with Lawrence Livermore: that was basically through him. Lawrence Livermore is interested in parallel computers, interested in recoding our model for a parallel computer. So that’s why we have that close relation. So they have basically our model, but their recoded version of our model. The Central Weather Bureau: this is one of my colleagues from Taiwan, and well, she’s a very good one. She was a post-doc, and she was at NCAR for a while, and then went back to Taiwan. She didn’t take the model directly; they got our model through the Navy. The CWB and the U.S. Navy have some relation.
Okay. Well listen, I have many more questions, and I’m feeling like we’re both getting tired, so maybe we should stop for today and resume tomorrow. Does that sound okay?