The Skills Gap For Fortran Looms Large In HPC

Back in the dawn of time, which is four decades ago in computer science and which was before technical computing went mainstream with the advent of Unix workstations and their beefy server cousins, the computer science students we knew at college had taught themselves BASIC on either TRS-80s or Commodore VICs and they went to college to learn something useful like COBOL and maybe got a smattering of C and Pascal, or occasionally even RPG, for variety. And more often than not, they learned these “real” programming languages on an IBM mainframe or minicomputer and sometimes on a DEC VAX.

The nerds all learned to program Fortran, which was two years older than COBOL, which came on the scene in 1959, and which was used to digitize the complex formulas underpinning scientific workloads. Again, usually on either an IBM mainframe or a DEC VAX. By the time we learned a little Fortran at Penn State in the mid-1980s as part of an aerospace engineering degree on an IBM 3081, the jobs were all interactive but the hallways were still lined with walls of punched card boxes as legacy program storage. They represented a very weird past that bore more resemblance to a loom than a computer, as far as we were concerned. At that very moment, the university was installing Sun Microsystems workstations in its first technical computing lab, and in came Unix and C along with Fortran. And everything began to change. Penn State never became a supercomputing powerhouse like some American universities did, but the computing that was done there was typical of the time.

Fast forward to today: if you look at the course catalog for aerospace engineering at Penn State, the only Fortran available for students to play with is one that is somehow tied to OneAPI, which makes no sense to us because OneAPI is supposed to be data parallel C++. The programming courses in the aerospace engineering degree are dominated by C++, and C and MATLAB are also used. What happened to Fortran?

A better question might be: What is going to happen to Fortran? That is precisely the question posed in a report put together by two researchers at Los Alamos National Laboratory, which has quite a few Fortran applications that are used as part of the US Department of Energy’s stewardship of the nuclear weapons stockpile for the United States. (We covered the hardware issues relating to managing that stockpile a few weeks ago, and now we are coincidentally talking about separate but related software issues.) The researchers who have formalized and quantified the growing concerns that many in the HPC community have talked about privately concerning Fortran are Galen Shipman, a computer scientist, and Timothy Randles, the computational systems and software environment program manager for the Advanced Simulation and Computing (ASC) program of the DOE, which funds the big supercomputer projects at the major nuke labs, which also include Sandia National Laboratories and Lawrence Livermore National Laboratory.

The report they put together, called An Evaluation Of Risks Associated With Relying On Fortran For Mission Critical Codes For The Next 15 Years, can be downloaded here. It is an interesting report, particularly in that Shipman and Randles included comments from reviewers that offered contrarian views to the ones that they held, just to give a sense that this assessment for Fortran is not necessarily universal. But from our reading, it sure looks like everyone in the HPC community that has Fortran codes has some concerns at the very least.

Employment Is Job One

Just for fun, we broadened the set of search queries on Indeed.com that they ran at the beginning of the report, to provide some context for the skills shortage facing the DOE labs that still have legacy Fortran applications, four decades after Fortran was a core part of an engineering education and was not unfamiliar to computer science graduates, either.

If you go to Indeed.com and scan for Fortran jobs today, you will find 1,050 openings in the United States. Even COBOL has 1,228 jobs. This is in stark contrast to C++, which has 32,896 job openings, with C/C++ being mentioned in 15,192 openings. Java has 54,838 job openings, and Python has 83,591 openings. Only RPG, which is the analog to COBOL on the Power Systems platform running the IBM i operating system, has, at 659 openings, fewer jobs than Fortran. (And considering that there are still 120,000 companies worldwide using that IBM i platform, that says more about the constancy of the RPG programmers and their companies than it does about the strength of the market.)

Perhaps this is the case with the Fortran programmers of the world, too. But we can assure you that the RPG shops of the world are having their own skills crisis as experienced programmers start retiring – or getting sick or dying in some cases – and will do so at a growing pace in the coming years. Fortran is having no easier time, and neither is COBOL.

The good news for some HPC simulations and models, both inside of the National Nuclear Security Administration program at the DOE and in the HPC community at large, is that many large-scale physics codes have been rewritten or coded from scratch in C++, and moreover Python has become the dominant language for analysis applications – just like in the world at large. There is still a large pool of Java programmers who work on system and application programs written in that language, but any time you need performance, you usually don’t choose Java. If some Java application or framework is important enough, then it is often ported to C++ for performance reasons. (Java is too far away from the iron, even if it is in theory easier to program.)

The skills issue with Fortran is apparently not just about learning Fortran, but more about being associated with Fortran and all of the legacy baggage that comes with its vintage, and about its low marketability going forward.

“It should be noted that training staff in the use of Fortran is not a major challenge if the staff member has sufficient experience in another programming language,” the Los Alamos researchers write. “Attracting (and retaining) staff in these large Fortran projects may prove more difficult. It is also possible that as the pool of Fortran developers continues to decrease, the demand for this skill set on legacy code bases across the industry will remain flat for quite some time, meaning increased competition for the relatively few developers with deep Fortran expertise. This has the potential to further erode retention and our ability to compete on salary.”

This is a different problem from the technical development of Fortran itself, which was explained well in the State of Fortran 2022 edition published by the IEEE last March, which you can see here. But even this report admits Fortran has its issues, outlining them thus:

“First, the lack of a standard library, a common resource in modern programming languages, makes mundane general-purpose programming tasks difficult. Second, building and distributing Fortran software has been relatively difficult, especially for newcomers to the language. Third, Fortran does not have a community maintained compiler like Python, Rust or Julia has, that can be used for prototyping new features and is used by the community as a basis for writing tools related to the language. Finally, Fortran has not had a prominent dedicated website – an essential element for new users to discover Fortran, learn about it, and get help from other Fortran programmers. In the same way, Fortran is no longer widely taught to university students or valued as a useful skill by industry. As a consequence, adoption of new users has been stagnating, large scientific Fortran projects have been migrating to other languages, and the communities of Fortran programmers remained scattered and isolated.”

And as a consequence, as the report from the Los Alamos researchers points out, in many cases the DOE’s HPC centers may be the only customers pushing for Fortran support on future dataflow engines or other novel architectures that might come along. This is the real crusher. The sense we get from both reports is that the lack of a standard library is not so much of an issue when it comes to CPU-only parallel processing, where Fortran works well. Support for GPU accelerators is weaker and more fragmented, and the researchers called out the current “Frontier” exascale system at Oak Ridge National Laboratory and the impending “El Capitan” exascale system that is set to go into Lawrence Livermore National Laboratory, whose software ecosystems support C and C++ applications a whole lot better than they do Fortran applications. “Multiple competing standards for Fortran-based GPU programming with varied levels of robustness and support exist today (Fortran OpenMP Target offload and OpenACC),” the researchers write. “Neither of these technologies is robustly supported on the AMD GPU (MI250) today.”
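
To make that fragmentation concrete, here is the same trivial kernel spelled both ways. This is a minimal sketch of our own, not code from the labs or from the report, and the loop itself is a made-up example:

    ! saxpy.f90: one made-up loop, offloaded two different ways
    program saxpy_offload
      implicit none
      integer, parameter :: n = 1000000
      real :: x(n), y(n), a
      integer :: i

      a = 2.0
      x = 1.0
      y = 0.0

      ! The OpenMP target offload spelling (compiled with, e.g., -fopenmp)
      !$omp target teams distribute parallel do map(to: x) map(tofrom: y)
      do i = 1, n
        y(i) = a * x(i) + y(i)
      end do

      ! The OpenACC spelling of the very same loop (e.g., -fopenacc)
      !$acc parallel loop copyin(x) copy(y)
      do i = 1, n
        y(i) = a * x(i) + y(i)
      end do

      print *, 'y(1) = ', y(1)
    end program saxpy_offload

Mainline compilers will accept both sets of directives, but whether either one actually offloads to a particular GPU, and how quickly it runs when it does, depends on which compiler and which accelerator you have. That variability is precisely the robustness problem the report describes.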

To be fair, that is why Oak Ridge and Lawrence Livermore got access to cheap flops – so they can help fix the software issues. Hence, the Exascale Computing Project in the United States is working on Flang, an open source Fortran compiler built on the LLVM compiler toolchain, which is a kind of middleware that sits between a language front end and the back ends that generate instructions for particular compute engines. CodeSourcery, now a part of Siemens Software – yes, a division of that German industrial giant – is working on OpenMP target offload and OpenACC backends for the gfortran compiler in the open source GCC compiler stack that is often affiliated with Linux. “Both efforts are largely reactionary, due to poor community and technology provider support for Fortran on advanced technologies,” Shipman and Randles say in the report.

Things often are, would be our observation. The question is whether the reaction will be strong enough to overcome inertia. . . .

Here are the seven risks that Fortran faces in HPC, as Shipman and Randles see them:

  • It is very likely that we will be unable to staff Fortran projects with top-rate computer scientists and computer engineers.
  • There is an even chance that we will be unable to staff Fortran projects with top-rate computational scientists and physicists.
  • There is an even chance that continued maintenance of Fortran codes will lead to high human or financial maintenance costs.
  • It is very unlikely that codes that rely on Fortran will have poor performance on future CPU technologies.
  • It is likely that codes that rely on Fortran will have poor performance for GPU technologies.
  • It is very likely that Fortran will preclude effective use of important advances in computing technology.
  • There is an even chance that Fortran will inhibit introduction of new features or physics that can be introduced with other languages.

“Our assessments lead us to the view that continued use of Fortran in our mission critical codes poses unique challenges for LANL,” they say, and indeed for any other HPC center that relies on Fortran. “While Fortran will continue to be supported at some level, particularly on CPU-based systems, the outlook for advanced technology systems is dim. The ability to leverage broader and more modern open source technologies/frameworks is unlikely, increasing the cost of new physics and feature development.”

While not bleak, that assessment is also not encouraging. But, maybe it will encourage the powers that be, who control the purse strings, to invest more in Fortran tools and Fortran programmers. This will be a heck of a lot cheaper than porting all of those applications. But, in the longest of runs, maybe that will be necessary anyway as Fortran programmers age out and the pool of programmers who can – and will – take on NNSA codes and not only master them, but enhance them, shrinks.

Our observation would be this. Back in the early 1990s, it was not precisely cool to be a nuclear weapons designer. But the Comprehensive Test Ban Treaty of 1996 and the massive investment in simulation and modeling, coupled with the test data from over 1,000 nuclear bomb explosions, gave a different set of people new tools to tackle a bigger problem, and it attracted some of the smartest minds on the planet to design systems and software to take on the simulation task. Fortran needs something to make it cool again, something more than nuclear weapons, which for a lot of the youth these days is not a plus, but a minus. It certainly was for us, which is how we ended up writing The Next Platform, among other things, and coming to a kind of détente of our own with the fact that these weapons do exist and that they do perhaps provide some kind of deterrent to World War III. If you want to sell it to this generation, that is how it might be done. As crazy as that might sound.


47 Comments

  1. I was under the impression that fortran-lang.org was a page entirely dedicated to FORTRAN and dissemination of information and standards. As for standard tooling, this does not seem to hurt the C/C++ case. Not familiar with HPC world, but on Linux, Sun and IBM I can build FORTRAN with GNU and vendor tools and use CMake to manage projects.

  2. The idea of a valuable legacy worth passing down from one generation to the other requires forming the next generation in a way that can appreciate the efforts and fruits of the previous. This is easy with a legacy of gold coins–less so with Fortran. Are the people who sold the family farm because it was more fashionable to live in the city the same who tossed out Fortran for lack of curly braces?

    Just like the Cray-style vector processors went out of style after years of dominance, it may happen that GPU offload technologies will go out of style in favor of something else. For example, maybe the evolution of the Fujitsu a64fx and the Intel Xeon Max will lead to CPUs that run Fortran faster than the GPUs can run CUDA or HIP. Could the Vector+1 extensions to the Power ISA being proposed by Red Semiconductor coupled with on-package high-bandwidth memory turn out to be an even better fit for Fortran?

    It’s not clear to me whether adding language features built around GPU technologies would be consistent with the longevity goals that Fortran codes have enjoyed so far. I also wonder if CUDA programs will even be useful after another 10 years.

  3. I genuinely liked Fortran when I learned its basics. However, the lack of job openings concerned me even though I’m a Rust developer. I couldn’t find big active open-source projects in Fortran so I couldn’t see how it’s actually used.

  4. Fortran was the first language taught in my first computer science classes (both in High School in 1974, then in college Fall 1975).

    I never ended up using Fortran professionally though…My initial decade as a programmer was more various assemblers then on to C.

    Perhaps if I had gotten a job at the NASA facility that I drove by commuting to college every day, Fortran would have had a bigger impact on my programming career.

    The last 10 years of my IT career, ending in 2021 at age 64 as a Linux systems admin of 5,000 servers, was mostly shell scripting along with a bit of Python.

  5. Fortran was (and is) not used just for nuclear weapons! There is plenty of Fortran code still around for quantum physics, protein modeling, climate modeling, astrophysics, fluid dynamics, etc. Admittedly most of it is legacy code, even though still widely used, and newer projects tend to use other languages, but the notion that Fortran needs something “other than nuclear weapons” to make it cool doesn’t make sense to me. It has and always had other things to do. Now, whether Fortran can ever be made cool is a separate question. 🙂

      • There are lots of nuke people. Galen and Tim are not in charge of nuclear weapons, or anything close to that. They are computer people in support of the mission software activities.

        You’re making a mountain out of a mole hill. You should interview somebody who is actually responsible for the nuclear mission software activities.

        • I don’t think I ever implied that they were anything other than software people supporting the mission. As for mountains and molehills, I reported on what they said and added some references to another report as well as some ideas of my own about actually learning Fortran a long time ago and thinking it was pretty cool and was surprised it was not really required the same way as when I was in school.

          • I liked this article; kinda like a walk down memory lane … and it talks about both the folks that are con-FORTRAN (Shipman, Randles) and those that are pro-FORTRAN (Kedward, Aradi, … — “the State of Fortran”). Worryingly, Table 2 in that paper (the pro-one) shows that there’s only one free-to-use multi-platform compiler that fully supports F2018 (and F2008): Intel Classic ifort. Other compilers seem to have stopped their full support at F95 and F2003. Still, I think it would be nice to interview some mission software folks and see what they think of the situation (I’m intrigued to know their perspective, if any — interviews are cool 8^b).

  6. Funny, with that name of yours I always had you down as a Brit…

    We weren’t far apart then: in 1980/81 I took a quarter of evening classes at a community college in BASIC programming on a TRS-80 as a senior in High-School (South-Western Ohio), then did another in Fortran and Cobol on an IBM mainframe with these wonderful 3270 terminals. Paper punchers were still in the same room and I did take home both card stacks and tape copies of my collected works before returning to Berlin: I was there as an exchange student.

    I revisited Fortran two times after. One was an industrial project re-factoring an industrial control system from PDP/11-05 machines (probably assembler originally) to an LSI-11 based variant using Fortran-77. Then I worked for many years with a team of meteorologists who had vast libraries of code simulating the effects of smog and pollution under the influence of humidity and sunshine, using Navier-Stokes equations for the chemistry and much simpler statistical models for the air-flow in box models. There my main job was to find a CDC 6000 replacement, which turned out to be a series of 80386/80486 computers with the faster Weitek 1167/4167 co-processors and a matching compiler on ISC System V Release 3.

    I’d been running Microport Sys V.2 on my 80286 even before that (including a DOS box!), and it’s been mostly x86 since, with a heavy emphasis always on infrastructure, high-availability but also HPC. There was some early work with Suns and mainframes and I’ve always tried to bring their best qualities (the network is the computer and virtualization) to x86 infras.

    I did a lot of coding early, and during my computer science studies, especially for the thesis, but for some reason my professional career was far more in operations, where I kept reading code and fixing bugs made by others, but wrote ever less code.

    Much later in my career I was still dealing rather indirectly with a huge body of software written in Cobol, but there its main effect was that I could talk with those people who maintained it on a mainframe: I’d been there, done that too, I knew the lingo and I could look at the code, which very often produced the interface files that caused us grief on the C++ or SQL end of things.

    Today my coding skills are probably as good as my Latin, which has completely rusted out while I became near native in four modern human languages, having to become fluent in French in my fifties and in Spanish in my thirties. The worst issue is that for a long time I could never decide which one (programming language) to use as a primary… there were so many and I liked most of them better than my first three (BASIC, Fortran & Cobol).

    I loved Prolog, ML and Occam, but they were impractical as daily drivers. I also liked Lisp, Scheme and Smalltalk, but again they were far from mainline. I did lots of Turbo-Pascal at one point, then it was mostly ANSI-C, but when it came to OO, far too much really badly written C++ code turned me off and then there were 4GLs or simply others to write the code.

    I still enjoy reading good code almost as much as I enjoy reading good baroque sheet music, …doesn’t mean I can do it myself.

    • My many lines of forebears were thrown out of several European countries in the 1600s and 1700s and came here on various boats. I’m as American as they get. But still mostly Anglo-Saxon with some Irish, German, Dutch, and Italian thrown in for good measure.

  7. “The nerds all learned to program Fortran, which was two years younger than COBOL, which came on the scene in 1959,” you seem to have inverted history. The original Fortran predated the original COBOL. yes, COBOL was available in 1959 (leveraged Hopper’s Flowmatic) but the original IBM Fortran was from 1957 making it *older* not younger than COBOL (and also Algol).

  8. 20 years ago I was at NCSA, and disturbed to learn that over half (and growing) of the code was C++, even though FORTRAN ran at that time about 30% faster. (Caveat: I was in a compiler group and the gap may have closed since then…)

    From my perspective, that was like funding a full machine costing many hundreds of millions of dollars, and throwing away a third in pursuit of ease-of-programming. But that was, and likely still is, the reality of software development.

    The only option given that is to simply fund a rewrite of the nuclear sim codes.

  9. The reason that you see it mentioned in the OneAPI course is that Intel rolled the Intel Fortran Compiler into the OneAPI toolkits.

  10. I am an old FORTRAN programmer and biochemist who worked in the field of bioinformatics and mathematical biology.

    The concern elsewhere in the computer programming community is that human programmers will be replaced by Artificial Intelligence (AI). Here is an example of a FORTRAN program written by ChatGPT.

    Query to ChatGPT: Write a computer program in FORTRAN that calculates the particle energy density (EPD) in a fission explosion where EPD=(f/2)nkT, where n is the particle-number density, T is the temperature, f is the number of degrees of freedom, and k is the Boltzmann constant.

    Answer from ChatGPT:

    Sure, here’s a sample program in FORTRAN that calculates the particle energy density using the formula EPD=(f/2)nkT:

    program ParticleEnergyDensity

    implicit none

    ! Declare variables
    real :: n, T, f, k, EPD

    ! Initialize variables
    n = 1.0E24 ! particle number density in particles/m^3
    T = 1.0E7 ! temperature in Kelvin
    f = 3.0 ! degrees of freedom (assuming a gas)
    k = 1.3806E-23 ! Boltzmann constant in J/K

    ! Calculate energy density
    EPD = (f/2.0) * n * k * T

    ! Print results
    write(*,*) "Particle energy density: ", EPD, " J/m^3"

    end program ParticleEnergyDensity

    In this example, we assume that we’re dealing with a gas, so the number of degrees of freedom is set to 3. You can modify the values of n, T, f, and k to match your specific scenario. The program calculates the energy density and prints it to the console.

    • John (interesting!) — is your perspective that “wow this is great” or “this sucks because …”? I’ve cut-pasted the code in an editor, switched the double-quotes back to ASCII (normal ones) in the write statement, and saved the result as a file: particle.f95. Then: gfortran particle.f95, and then ./a.out gives:

      Particle energy density: 207090000. J/m^3

      is this witchcraft good, bad, or ugly? (… as usual: asking for a friend 8^p )

      • Hubert, if I understand the point of this article by Mr. Timothy Prickett Morgan, it is that the field of nuclear research and bomb development drove the development of scientific languages and FORTRAN in particular, FORTRAN is still used, and there is a shortage of FORTRAN programmers. The point I am making is that with AI one does not necessarily need more programmers, as the new AI systems like ChatGPT can write the software for you. Notice that the query I posed to ChatGPT was minimal and ChatGPT seems to understand a lot about the physics. For example, it knows the units of measure for each of the variables in the equation. I did not tell it that. The old adage that computers are stupid and programmers must tell them everything is no longer true. AI can figure things out for itself and produce valid code without being told everything.

      • Hubert, to answer your question more directly, it is good because it allows scientists (in this case nuclear physicists) to ask AI like ChatGPT to write code for them. The physicists themselves do not need to be computer programmers. That may put computer programmers out of work, but again there is a diminishing number of FORTRAN, or Fortran 2003, or F programmers out there.

        • Thanks for this perspective! We were talking about it around the lunch table today, and one French engineer essentially agreed quite unanimously that GPT produced good witchcraft, but the French postal worker, retired teachers, tax officer, and criminologist, expressed nuanced enthusiasm, being positive mostly if new AI tools meant they could work fewer hours per week, and fewer years overall, leveraging the augmented productivity that the tech promises, to enhance their quality of life (without layoffs or salary cuts). For the DOE nuclear bomb Fortran at play here, I guess it would be valuable for the Fed boffins to get student interns piloting GPT code production over the summer (with various levels of complexity), test it out against their existing codebases, and report back on the level of success of the enterprise. A win-win as students (the Nation’s future) would get simultaneously trained in GPT/AI, Fortran, and nuclear bombs!

    • Your example is enlightening, but why have AI produce code in old languages? Why not put AI to the task of upgrading the body of work in COBOL and Fortran to newer languages? Not one of them, but all of them. Then they could all be compared for results. The dwindling number of programmers could be the ones who test these programs and certify them for replacement. No one has to lose their job. No one has to wring their hands because of lack of programmers. My two cents.

      • Jim, you are correct that I could have asked ChatGPT to write the code in a variety of languages and for a variety of operating systems. Furthermore, the knowledge experts in the domain (such as nuclear physics) will still need to test the code to validate its correctness. I believe the point of Mr. Timothy Prickett Morgan’s article is that the field of nuclear physics needs more FORTRAN programmers specifically. Many scientific fields still use FORTRAN and specifically High Performance Computing (HPC) Fortran, such as weather modeling and rocketry. It would be easy to reproduce the same result by asking ChatGPT to write the code in C, C++, C#, or any other language.

        • Thanks John, but I’m talking about breaking the cycle of running out of programmers for older languages. Otherwise, we’ll still be talking about this in 20 years. I think Ondřej Certik is moving in the wrong direction though it may keep him in his comfort zone to stick with what he knows. There are many methodologies for transitioning from legacy code and you have to start sometime. AI has just made it easier.

          • And just today (May 9), IBM released fm.code for this purpose, under the watsonx.ai brand (possibly in partnership with Meta), based on some foundation model (fm), but apparently distinct from LLM. Hopefully a trustworthy very-high-level language approach to programming!

  11. The heavy use of legacy technology (mainframes, FORTRAN, COBOL, etc.) in government is a big concern. In the corporate world, this would be seen as technology debt. Yes, there are no parallels for the computational speeds that FORTRAN offers. However, the lack of talent is real. It’d be great to focus on developing translators to newer languages and creating performance enhancements for math libraries for these. Even though these programs in FORTRAN and COBOL are stable and have been in use for a long time, they create a challenge for modifications, and that, I am sure, is a likely factor in resistance to changes in tax codes and closing down loopholes.

  12. I learned BASIC on a college HP mini computer back in the late 70s. I then realized that I didn’t have to pay for college, I could just buy the textbooks and teach myself. So I dropped out, bought a WATFIV book, which I still have, and taught myself. I then got a clerical job and used tuition reimbursement to learn cobol. A few years later I got my first programming job and learned assembler. Many years later, I learned C, C++, Java, PERL, and heaven knows how many other languages. The point is, it’s really not that hard to learn a new language once one learns loops, conditionals, recursion, and many other patterns. It’s just a different syntax.

    Fortran brings with it some very powerful mathematical processing capabilities that are much more difficult to understand than the code needed to use them. It’s easier to find mathematicians and teach them fortran than the other way around.

    None of this is as hard as some make it out to be. Smart and clever people will fill in any gaps once the pay is commensurate with the need. It’s called capitalism and supply and demand will provide the resources.

    If they can afford them ..

    • Ayup. I was a math major who was a self-taught assembler coder on a piece of offbeat hardware (programmed by flipping switches), then learned BASIC and FORTRAN which ran on a CDC-6400. WOW, whatta change!

    • John, you wrote, “Fortran brings with it some very powerful mathematical processing capabilities that are much more difficult to understand than the code needed to use them.”
      I learned FORTRAN-66 as my second computer language, and later, FORTRAN-77. Which “powerful mathematical processing capabilities” were you writing about? Are these some things added in later versions of FORTRAN? I know that object capabilities have been added to both modern FORTRAN and modern COBOL …

  13. I’m a graduate of the Penn State Aerospace Engineering program (’06 BS/’08 MS). While I don’t recall any course options for formal training in Fortran, there was one course taught by Dr. Barnes W. McCormick which required submitting homework written in Fortran, specifically, FORTRAN 77. This required students to learn it to a usable degree, which most of us did during the course. Dr. McCormick was born in 1926 and was set in his ways–I later came to understand and even respect the Fortran requirement. The aerospace engineering graduate degree programs also offered a minor in high performance computing. Graduate students had lectures and lab experience with the HPCs on campus. I believe at least one exercise used the Fortran language and other niche things I’ve forgotten.

    While Intel’s ironically-named OneAPI promotes Data Parallel C++, the software stack is just rebranding an amalgamation of many of the legacy libraries and tooling which Intel has supported for many years, including Intel’s Fortran compiler. I’m quite certain that modern HPC systems also support the GNU Fortran compiler, including Penn State systems. I suspect the curriculum simply neglected to mention the GNU and/or LLVM toolchains.

    The DOE, as discussed in their report, clearly does have an issue. It seems to me that many engineers younger than me do not have the same skill set I had out of college. The report reviewers mentioned Fortran wasn’t hard to learn, but people won’t use it at a deep level and they see it as a skill with low marketability.

    While I was on the OpenSHMEM specification committee, we had decided to deprecate support for the OpenSHMEM Fortran API (version 1.4). There was no one on the committee with the depth of Fortran experience necessary to develop and maintain the Fortran API, let alone the Fortran reference code. We proposed legacy Fortran would be supported through C bindings and I recall only a single voice was reluctant. So something was lost there, but it’s hard to say that it could have gone any other way.

  14. I am fluent in (ICL 1904) Fortran, but I’m 70 years old and retired. It was the first programming language I learned. It is, was, and always will be a garbage language.

  15. I took my boy to his first day at College, then I wandered around the math wing to see what’s what. I found the study area which literally had a stack of old textbooks for give-away, and nobody touched them! They even had a late edition of Swokowski’s Calculus, published a bit after he had already passed away (1990s), but hey: newer than my copy.
    Lo and behold, a ca.1985 copy of Fortran77 by Borse! Soft cover, sure, but brand-new condition and drawing absolutely zero interest. Of Course I snapped em up.
    Good reading, good times!

  16. Nuclear weapons are a ‘deterrent’ when they are in the hands of human beings. As we have seen from the psychotic mass murders that Americans committed against the Japanese people just for fun, Americans are not capable of responsibly holding nuclear weapons, like human beings are. We will take the weapons from you, and then suppress your murderous instincts.

  17. FORTRAN on a CDC 6600 was my first language – used for astrophysics work back in the glorious 60s. As with all compiler languages, it’s vital for the coder to understand features of the FORTRAN compiler – and having a clue about math is helpful too. Otherwise results can lead you down a very strange path.

  18. Judging by the relatively large response to this article, maybe you should start a companion blog called
    The Previous Platform. Old gits like me could weigh in on all sorts of things.

    More seriously, the FORTRAN problem in HPC is just a special case of the general problem of keeping old technology going. It doesn’t matter whether the “old” thing is still good or not. It is the perception that it is *obsolete* that repels people not near the end of their careers. Once that perception has taken hold it is well-nigh impossible to undo it.

    I’m not sure what the HPC community should do. The technology exists to mechanically translate from one programming language to another (I’ve built such a beast). It can be done in a way to preserve the semantics of the original application; that is, produce the same results given the same inputs. However, if the original program has accumulated a lot of technical debt (i.e., it’s ugly), you end up with an ugly program that happens to be in a new language. If the original was highly tuned (a given for HPC), you end up with a slower, ugly program that is really the same as the old one, just with a new “skin.” However, the process of translation (really, static analysis) would still be useful. A “good enough” result would provide the basis for moving forward. A “bad” result would tell you the application must be replaced with something entirely new. I hope they’ve at least considered this approach.

    • Well, I tend to keep all of my publications alive as long as I can, and have been publishing The Four Hundred, aimed at IBM’s minicomputers (yes, I still call them that inside my own head) for 34 years now. And I think The Previous Platform for Gits is brilliant. You wanna contribute? I mean, more than just this comment.

      • Didn’t they make a movie about The Four Hundred? Oh, wait, that was something else.

        I do have a serious interest in computing history. The Previous Platform could be a set of retrospectives that put today’s computing landscape into perspective. It could serve an educational purpose. Alas, I think it would take Herculean levels of moderation to keep it from turning into a bunch of us telling war stories, some of them even true. Come to think of it, “On Call” and “Who, Me?” at The Register have that covered pretty well…

  19. First of all, the article (and a lot of people reading and replying) don’t seem to know and understand the difference between FORTRAN and Fortran. They all do not know that the latter is “free format” code, without the restrictions that FORTRAN had with its strict adherence to the 80 column punch card format.
    Also, there are standard libraries for really standard problems in Fortran, but more specific for the area where it is being used. Plain mathematical problems require different focus than engineering problems. Or astronomical calculations.
    NAG, LINPACK, IMSL, SOFA or CERNLIB are just some of those standard libraries, most of them available both in FORTRAN (77) and in Fortran (95 and newer). And by and large, FORTRAN/Fortran is much more compatible between different compiler implementations than is the case with most other compiler languages, including COBOL. The solutions are out there, you just have to look in the right place…
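
    To illustrate the free format point, here is the same trivial loop in both forms, as a quick sketch (column positions are counted after the leading indent this page adds):

    C     FIXED FORM (FORTRAN 77): a C in column 1 marks a comment,
    C     labels live in columns 1-5, column 6 is for continuation,
    C     and statements must sit in columns 7-72.
          PROGRAM SUMSQ
          INTEGER I, S
          S = 0
          DO 10 I = 1, 10
             S = S + I*I
       10 CONTINUE
          PRINT *, S
          END

    ! Free form (Fortran 90 and later): statements can start anywhere
    ! on the line, ! marks a comment, and & continues a line.
    program sumsq
      implicit none
      integer :: i, s
      s = 0
      do i = 1, 10
        s = s + i*i
      end do
      print *, s
    end program sumsq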

    • Well, I’m glad that, unlike PYTHON, they’ve let go of the strict “column punch card format” (a real nuisance)! But I also thought they had moved to StUdLyCaPs and “FoRtRaN” (for greater visibility)! (eh-eh-eh!)

  20. Fortran did go through a flurry of pre-processors like Ratfor (Software Tools, Kernighan, Plauger, 1976) which turned FORTRAN into a dialect of C. Ratfor flourished until about 1985 when it slowly faded.

  21. Thanks to Timothy Prickett Morgan for this important FORTRAN article and for his many important publications and contributions, including ITJungle.
    I started at IBM as a Systems Engineer in 1962 and we were expected to be able to directly support programmers at many IBM customers in many programming languages, including FORTRAN (on the IBM 1620), COBOL, Assembler, PL/1, and RPG, after supporting them in Unit Record (punch card machines like the IBM 402 and 407), and Autocoder and SPS (Symbolic Programming System) on the IBM 1401 and 1440 computers.

    Mostly all out of the IBM reference manuals after perhaps a one week IBM school, and with essentially no help (no internet) at the customer location.

    Today, at age 84, I still write IBM i RPG and IBM i COBOL, and (I guess) IBM i C++ (Fully Free RPG) from home on a $250 a month Cloud computer which is a million times faster, and incredibly cheaper and easier, than the original IBM computers of the early 1960s.

    TPM is very effectively documenting that history as well as the future in the Next Platform.

  22. I certainly like the PastPlatforms idea, got plenty to write about, not so sure many would want to read. A bit like ‘Something for the week-end’ on the Reg, I guess.

    What I find missing from the discussion is the culture aspect of Fortran. Fortran was the very first attempt to do better than machine or assembly language, which was still a lot better than the true Harvard design with external sequencing logic.

    The entire discipline of software engineering hadn’t been invented yet when the first generations of Fortran programmers went to work.

    And those guys were physicists, mathematicians, engineers, some of them even linguists (ok, those Chomskys might have been the first to defect to the Algol camp): they wanted to solve a problem, they were no Donald Knuth, no Edsger Dijkstra, or any of the other Gods who had invented computer science by the time I studied it twenty years later.

    (Funnily enough those engineers and scientists seem to have gone from Fortran straight to Python…, sorry I couldn’t help myself)

    Looking at the posts here, “Fortran is so much faster” is still a recurring theme, yet since I spent at least some of my many years at University generating compilers, I was inclined to just scoff at that sentiment: by the time the code generator goes to work the original programming language is typically long lost and the GCC or LLVM back-ends will generate the same efficiency for any binary, no matter what the original imperative language source code was. There is no inherent advantage to Fortran over C or Ada in terms of generated code (can’t blame array order for everything), we’re not talking Python here.

    But then I remembered how we got imbued with a love for recursion, for pointers, trees and all kinds of data structures far beyond arrays, common blocks or even Cobol records in our computer science classes. And I do recall, that I tried to trick that Fortran-77 compiler on the PDP-11/24 into recursion: It would catch any direct attempt, but all it took was an intermediate proxy function to see how quickly the stack on a (16-bit) PDP-11 would bomb.
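
    From memory, the trick looked roughly like this; a sketch reconstructed from memory, not the original code:

    C     A DIRECT CALL OF FACT FROM FACT WAS REJECTED, BUT ROUTING
    C     THE CALL THROUGH A PROXY SLIPPED PAST THE COMPILER CHECK.
          INTEGER FUNCTION FACT(N)
          INTEGER N, PROXY
          EXTERNAL PROXY
          IF (N .LE. 1) THEN
             FACT = 1
          ELSE
             FACT = N * PROXY(N - 1)
          END IF
          END

          INTEGER FUNCTION PROXY(N)
          INTEGER N, FACT
          EXTERNAL FACT
          PROXY = FACT(N)
          END

    Whether that bombed the stack or just returned garbage depended on whether the implementation kept locals on a stack at all, since F77 itself never sanctioned recursion.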

    The elegance or even efficiency of coding in Pascal or Modula, perhaps even Lisp drove our intellectual development, not finishing a science or engineering problem, before our batch job got kicked for running out of the allotted compute slice.

    C programmers ran wild with pre-processors, Ratfor seems to have turned everyone into a language designer for some time, with code that’s perhaps deservedly unrecoverable even by the best AI. Some C guys went crazy with bit masking and shifting and otherworldly optimizations that were unfortunately impossible to understand a few days later. Some of my dearest colleagues spent hours trying to save every bit and every statement, others functionalized every little action, but both never went through the trouble of reading the generated machine code, which sometimes annihilated all their effort, sometimes just inlined or turned a tail recursion into the loop it was meant to be. It’s funny to hear that common sub-expression elimination is a no-no on CUDA cores, because VRAM is so much slower than re-computing in the register file…

    But I also remember some C-code that needed to do some bit masking for ISO 8583 messages and used logarithmic functions to get there: I immediately knew it could have only been programmed by a guy with a physics PhD! (he then went on to do a bubble sort on data which came from a group-by SQL-query, so I had him fired as an unmanageable risk).

    But with Fortran it must have been rather similar, every science discipline and school probably developed their own style and some of the best known and most widely used libraries were those you never had to look into: did I mention that very few programmers, scientists or engineers were touch typists and liked to tire their weary fingers on comments? There is a reason why most variables in Fortran are single letter and it’s not just because Fortran-60 didn’t allow them longer (or later variants only looked at the first six letters).

    Reverse-engineering the algorithmic idea in the original author’s head from the written code might be hard for the author himself, if he (or indeed she) was still alive. And then I’ve met far too many students even in my computer science classes, who believed their testing experiments more than my inductive deliberations on the correctness of their code.

    I’ve heard of HPC Fortran code that survives, mostly because the human community around that code is still active, brains can still be picked and pickled into some degree of refactoring. But there is a lot of code out there that needs to be actively retired before it harms someone, very little chance it will tell on its own.

    • Good heavens, I sure do love reading these kinds of comments. Many thanks to you all who took the time, and continue to do so on other stories. This is a collaborative effort, and I appreciate you all greatly.

    • Towards the end of my first decade as a C programmer I was hired into a group writing military simulations in Fortran in the 80s. They were dependent on a huge legacy of Fortran code and the math libraries available. They were porting from VAX to Sun workstations. But their simulation required a specialty UI developed for basic menus and command interface and it did not exist for the Sun systems. They would have to hire the author of that system to come out of retirement and port his code. I looked at the problem as a C programmer and scoffed at the need for this. I told my boss that I would drag the group into the 70s and write an open source X motif wrapper for their menu system. I needed C routines to link the motif code to, and so I started trying to link the Fortran binaries with my C UI routine. I used the C-preprocessor to read the Fortran program’s common block and generate a matching C structure that was an overlay of the internal memory of the Fortran code and allowed the C wrapper to communicate with the Fortran code. I then added a Motif menu system that could read the old definition language they used for their previous menu systems. Tada! Drop in menu replacement using their legacy product and code.
      I thought that was cool but my boss was stunned when he figured out I was using code to generate code on the fly (at compile time). From then on my little kludge was an advertised new feature of their 4th generation code.
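
      The shape of that kludge, reconstructed from memory with invented names, was roughly:

      C     Fortran side: a named common block, which the Unix
      C     linkers of the day exported as the symbol simdat_
            SUBROUTINE STEP
            COMMON /SIMDAT/ SPEED, HEADING, NUNITS
            REAL SPEED, HEADING
            INTEGER NUNITS
            SPEED = SPEED * 0.99
            END
      C
      C     C side, shown here as a comment: a struct with the same
      C     layout, bound to that same linker symbol, let the Motif
      C     wrapper read and write the simulation state directly:
      C
      C       struct simdat { float speed, heading; int nunits; };
      C       extern struct simdat simdat_;

      Fragile, of course, since it only works if the two compilers agree on layout and symbol naming, which is exactly why generating the struct from the common block at compile time beat maintaining it by hand.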

  23. I can agree hiring has been painful. I get asked by junior (usually self-taught) devs “should I look into Fortran” which leads to a discussion about there being some money in the Fortran niche but at a cost to your training time, so set your rates accordingly.

    I think the tooling story around Fortran is the worst I’ve seen. Tooling has made the Fortran experience stand out as much more unpleasant, even hostile, than most of the other languages I have worked with and I think this contributes to new devs avoiding it once they come into contact with it. The [major vendor] Fortran compiler integration with Visual Studio has been fragile and grueling. If a shop has a big enough Fortran base that they don’t want to just port over, then the code is big enough to warrant tools to navigate, analyze, refactor, debug and test it. None of that analyzes across projects. Yet, it seems there is resistance in the broader Fortran culture and community to advocating for tools, IDEs, and doc tooling on par with other languages. This has been along the lines of “I don’t need all that popping up in my way.” and “you just need to become a good enough programmer to hold it in your head.” Some of us ask for improved Fortran tools but tool vendors start to see it as niche AND get told the programmers don’t want it, anyway.

    I enjoy visiting the renaissance fair, and I know how to knap flint… I had a point here…which is lost now that I realize I am hand-crafting Fortran manipulation tools while I am watching for further reports from my devs that the Fortran debugger slows down and displays incorrect info so I can file a ticket with the vendor. Ah! I don’t want to live in primitive times, knap flint, or write programming language tools, especially for Fortran. Yet, here I am. Vendors are bailing. C++ is a terrible language to parse and analyze and we are awash in tools for C++.

    We have met the enemy, and they is us.

    • I, too, have two bows, two swords, and know how to knap flint. Perhaps we don’t trust civilization as much as others? HA! Or we are just retro and curious. Or maybe a little of all of that.
