Bio-Inspired Flight — Who Is Air Force Basic Research


Technological advances are constantly increasing the human potential for developing very small things. For the US Air Force this means revolutionary designs in future air vehicles, providing warfighters with tools that enhance situational awareness and the capacity to engage rapidly, precisely, and with minimal collateral damage. When it comes to improving flight mechanics in these vehicles, what better place to look for inspiration than bats, birds, or bugs? These natural flyers have been perfecting their flight techniques for millions of years.

In this video, meet the researchers AFOSR is funding to develop designs for flight vehicles that will have revolutionary impacts on the future Air Force.

Basic Research at AFOSR: Ensuring Our National Security & Making Our Lives Easier


What is Basic Research?
Basic research is the foundation of all scientific and engineering discovery and progress.  It is what leads to new inventions and concepts—many of which are revolutionary.  And the great thing about basic research is the mystery of it: while basic research investigators may start off trying to prove a particular theory, many times they end up going off in an entirely new direction, or their results are ultimately employed in a dramatically different way than they initially envisioned.

Where We Came From & Why Basic Research is Important
Dr. Vannevar Bush, the Director of the Office of Scientific Research and Development during World War II, was the first to formally address the issue of post-war defense research. His July 1945 report, Science, the Endless Frontier, clearly made the case for a civilian-based, civilian-controlled research program. The leadership of the soon-to-be-independent United States Air Force was also committed to a civilian, or extramural, program, but one under its control, and followed through by establishing its own research arm in February 1948. The U.S. Army and Navy established their research organizations as well. The Air Force, recognizing the importance of basic research, established AFOSR in 1951, dedicated specifically to mining the basic research talent in U.S. universities and industry laboratories. Overseas offices were subsequently established to identify promising foreign research accomplishments.

How Basic Research Impacts You
One of the principal investigators whom we fund recently characterized the long-term successful results of what we do as “the stealth utility of innovation.” An example to make the point: as a laser expert, he noted that it was military basic research that funded the invention of the laser, beginning in 1951. He pointed out that if all the lasers in the world stopped working, the world would come to a stop as well. Lasers are at the heart of our timekeeping, our transportation network (the Global Positioning System), our energy system, and entertainment, finance, and electronics applications. This singular “stealth utility,” which regulates much of our state-of-the-art technology, is the result of defense-funded basic research and is taken for granted by everyone. It exists because AFOSR and our sister service organizations made the research possible—not only for our mutual defense but for a wide variety of beneficial applications for society in general. In future posts we will discuss the reach and application of many of these “stealthy” discoveries that not only ensure our security, but work invisibly in the background of our society, making our lives a lot easier: from lasers to computers, from nanotechnology to aerospace, from bio-inspired devices to holographic displays, and what’s in store for the future as well. What technology could you not live without?

An Interview with Dr. John Luginsland: The Plasma & Electro-Energetic Physics Program Manager


We had a chance to sit down with Dr. Luginsland recently to learn about his program and the cool physics research he’s funding. As the manager of the Plasma & Electro-Energetic Physics Program, he oversees a diverse portfolio of AFOSR-funded programs and finds the best of the best to fund.

Dr. John Luginsland, AFOSR Program Manager for Plasma & Electro-Energetic Physics

Position: AFOSR Program Manager
Program: Plasma & Electro-Energetic Physics
Years with AFOSR: 2 years, 7 months
Society Memberships: The Institute of Electrical and Electronics Engineers (IEEE) – Nuclear and Plasma Science Society, American Physical Society (APS) – Division of Plasma Physics, Society for Industrial and Applied Mathematics
Favorite Websites: Slashdot, MIT Technology Review, APS – Physics
Presentations: Video of Dr. Luginsland’s Spring Review presentation.

What brought you to AFOSR?
I’ve been with AFOSR since December of 2009, but I like to say, “I’ve always worked for AFOSR,” because my graduate work at the University of Michigan was funded by AFOSR. Then AFOSR funded my postdoc through the National Research Council (NRC) postdoc program at the Air Force Research Laboratory, based at Kirtland Air Force Base. After that, I transitioned to being a staff member of the Directed Energy Directorate of AFRL and worked in a lab that received basic research funding from AFOSR for a number of years. Later I went to industry for a number of years, and in 2009, when my predecessor at AFOSR retired, the AFOSR Director at that time, Dr. Brendan Godfrey, suggested I apply for the job, and here I am.

How do you think AFOSR is different from other basic research organizations?
What I really like about AFOSR is that there’s actually a real tension between two missions. First and foremost, we’re a basic science organization so we find the best science we can and fund it. At the same time, we’re a mission research organization because we work for the Air Force so we have to simultaneously look for the very best science we can fund and also answer the mail, if you will, for the Air Force in terms of technology that will help the Air Force going forward. And I think that actually having to answer both of those questions simultaneously gives a degree of focus that the other funding agencies don’t have.

You’re the Plasma and Electro-Energetic Program Manager. What is plasma?
Plasma is the fourth state of matter: if you heat a solid you get a liquid, if you heat a liquid you get a gas, and if you heat a gas you get a plasma. Plasma is actually the most ubiquitous state of matter in the entire universe – 99% of the universe is in the plasma state, just not the part we live in here on Earth.

How does electro-energetics fit in?
It takes energy to get into the plasma state, so oftentimes we do that with electrical energy to go from the solid, to the liquid, to the gas, to the plasma.

My program is fundamentally looking at how we make plasmas, how we make them in energy-efficient ways, and then, once you’ve got something in the plasma state, what you can uniquely do in that state that you can’t do in other states of matter.

Could you give us examples of how your program is benefiting the Air Force?
One major area that we fund is called directed energy technology. Plasma will let you access or make electromagnetic waves. So one big thing that we do is plasma physics that leads to radar sources and other sources of coherent radiation. A really high-powered electromagnetic source actually creates a plasma and then draws energy out of that plasma to make electromagnetic waves for radar – picking up airplanes, doing electronic warfare, doing high-powered, long-range communications. All of that is based to some degree on plasma physics.

The other big exciting thing that we’re working on right now is trying to find good ways to decontaminate water. As it turns out, Fairfax County [VA] has two facilities that produce ozone, and they do it through chemical means. They do this at a city-block-sized facility, and it is actually what purifies the water that we drink in Fairfax County. What we’re trying to do is shrink that block-sized facility into something that’s basically truck-sized, and we’re using a plasma to do that. This plasma, which is energetic in a way that chemicals aren’t, lets you make ozone much more efficiently and thereby use that ozone to clean up water and things like that. So you can imagine this is portable and could go into a forward operating base scenario, whereas the block-sized monstrosity can’t.

What direction do you see your program going in in the future?
The really exciting area that I think is starting to emerge is that we’re starting to look at very small plasmas. What happens then is we start adding not just the classical physics that we’re used to in the plasma physics community but really pulling in quantum mechanics. That changes the physics associated with the plasmas and actually makes them quite a bit easier to make. So it takes less energy to make them, and we’re still getting all the benefits of plasma, but it’s not requiring a block-sized thing to do it. We’re starting to do it in very small packages.

What is your process and timeline for choosing proposals?
So I always think it’s good for people to email me and have a quick email discussion sort of at the beginning of the calendar year after they look at the Broad Agency Announcement for details about what we’re looking to fund.

I look to get white papers in the late spring/early summer, say the May/June time period. You can submit them online.

White papers should be three to five pages long. I’d like there to be at least an estimate of the level of effort but for the most part what I’m looking for is what the unique science is you’re trying to do. What’s unique? What are you bringing to the portfolio that hasn’t been there before?

And then after that, I typically try and give feedback within a month.

I like to receive full proposals in the late summer – August/September – to try to get them in before the fiscal year rolls over in October.

I make funding decisions very early in the fiscal year – October/November.

Have a question for Dr. Luginsland? Please leave it in the comments below.

Nanotechnology: Moving Beyond Small Thinking


The recently published National Geographic special issue titled “100 Scientific Discoveries That Changed the World” leads off with a research program that began in 1997, when we funded a Northwestern University researcher by the name of Chad Mirkin. AFOSR took a chance on a process called Dip-Pen Nanolithography (DPN) – what Dr. Mirkin himself called “a far-out idea and a paradigm shift in scanning probe microscopy” – but it indeed proved to be an idea that changed the world.

Highlighted in the journal Science in January 1999, DPN is a technology that builds nanoscale structures and patterns by drawing molecules directly onto a substrate. The process employs an Atomic Force Microscope (AFM), whose tip has the innate capability to precisely place items and draw lines at the nanoscale. The AFM was basically an extremely small paintbrush. Mirkin’s fundamental contribution was recognizing that it could be used to print structures on a surface through materials, rather than through an energy delivery process – the latter being the approach taken by all previous researchers.

DPN has led to the development of powerful new nanofabrication tools, ways of miniaturizing gene chips and pharmaceutical screening devices, methods for making and repairing photomasks used in the microelectronics industry, and high-throughput methods for discovering structures important in biology, medicine, and catalysis. Since 1997 Dr. Mirkin has authored over 480 manuscripts, holds over 440 patents and applications, and is the founder of four companies, which specialize in commercializing nanotechnology applications.

Professor Chad Mirkin recently spoke at two AFOSR events on the following topics: A Chemist’s Approach to Nanofabrication: Towards a “Desktop Fab” and Nanotechnology: Moving Beyond Small Thinking.

A Chemist’s Approach to Nanofabrication: Towards a “Desktop Fab”

Nanotechnology: Moving Beyond Small Thinking


We met with virtual meeting expert Matthew Bigman, lead analyst at VT-ARC, who shared his tips and answered our questions on how to host and participate in successful virtual events and meetings.

Matthew Bigman has been with VT-ARC BRICC for five years – two of them spent working remotely from Alaska – as a lead research analyst, facilitating and running meetings and research projects. Without further ado, here are his tips and tricks for making working from home, well, work.

What kinds of things do you need to do to prep for a successful virtual meeting (testing equipment, etc.)?

If you’re participating in a virtual meeting and haven’t tested the software/hardware before, join at least 15 minutes early to familiarize yourself with the programs you will be using. Have everything you’ll need within arm’s reach before the meeting starts.

Etiquette for Virtual Meetings

What kinds of tools do you recommend for conducting a successful virtual meeting? (Note: this can be anything from having a good headset to software tools and a good support team)

As a good rule of thumb – have a good headset/webcam, a piece of software that lets you see and talk with participants and a program that lets you screen share important documents like PowerPoints and notepads.

Can you provide some tips on virtual meeting etiquette?

  • Keep your mic muted unless talking or intending to talk.
  • Recognize that the meetings are going to be different than you are used to and be open to new ideas.
  • Roll with distractions (to a degree), as in this home-work environment there are variables outside of individuals’ control (irritable children, fighter jets for team members who work on bases).
  • Pad your estimates for how long elements of a meeting may take in the virtual environment. When asking questions, take a sip of water to fill dead space and give people time to reply.
  • Try to break up the flow of your meeting every 15 minutes to maintain engagement.
  • Don’t eat on camera, even for what would normally be a lunch meeting.

What are some good practices for keeping people engaged during a meeting?

As noted earlier, a good rule of thumb is to change up the interactivity level, or do some kind of activity like asking for a chat response, every 15 minutes to maintain engagement. Ask individuals to keep their webcams on so you can pick up visual cues regarding energy and engagement levels. Use the tools in your chosen software, such as digital whiteboards or breakout rooms, to further increase interactivity. And don’t be afraid to take breaks to give people a chance to collect their thoughts and reengage.

“Recognize that meetings are going to be different than you are used to and be open to new ideas”

What are your thoughts on virtual ice breakers?

Ice breakers have a poor reputation, but I believe that in a virtual environment they are even more important. Well-designed opening exercises give participants a chance to familiarize themselves with virtual tools, check that their technology is working, and provide interactive breaks and engagement. This can be as simple as introducing yourself in the chat or changing name tags to reflect goals for the meeting.

How many people does it take to run a large virtual meeting and can you give advice about logistics, e.g. do you have a person dedicated to facilitating, another to answer questions on chat, etc.?

At least two for large, interactive virtual meetings, maybe three if you need a dedicated rapporteur. Typically, you want a lead facilitator running the meeting and maintaining the agenda. The other facilitator works to monitor the chat, moderate, and provide live high-level notes and recaps of the major outputs and discussion points being generated. The rapporteur, if used, focuses on more detailed and precise notes.

How do you manage requests for information that come up during the meeting? Do you send people to a central portal or library?

Ideally, any documents for the meeting should have been linked or sent out ahead of time. Otherwise, using a central database like a SharePoint or APAN site is the best way to share documents. The chat is a great place to post links and documents, depending on your program. But much like printing out slides ahead of time, you likely want to share documents in advance too.

Virtual Meeting Engagement Tips

Do you record your virtual meetings?

That’s going to partially depend on the policies of your organization, but typically no. Some meetings, ones that are more presentations than interactive meetings, may be easily presented as a recording for those who missed the first meeting. But with a highly interactive event, with multiple breakouts, presenting organized notes would likely be easier. Finally, for events, like an unconference, which is a mix of the two, you may only want to record the key speakers.

What is the maximum time you would suggest a virtual meeting should last, and how do you manage break times?

At the heart of the question, the answer is none per se. Any event, even an all-day conference, can be simulated in a virtual environment so long as you maintain a level of engagement and change up your interactions. Much like an in-person conference, you want to keep the audience engaged. It is a lot easier to sneak out of a virtual meeting if you’re unengaged though, so you have to work harder. As a rule, if you can change up the flow and interactivity level of your meeting every 15 to 20 minutes, you can maintain engagement and eyes on your meeting over the course of an entire day, with breaks of course.

Can you suggest some resources?

Official federal guidance –
Official state guidance –

Week in Photos: 12/31/17 – 1/6/18

January 2, 2018

Tulane awarded $3.67 million grant for quantum computing

Tulane University professor Michael Mislove has received a $3.67 million grant from the U.S. Air Force Office of Scientific Research to help develop cutting-edge technology related to quantum computing. The goal of the project is to develop tools and related methodologies for designing and analyzing programming languages for quantum computers, which are being designed to complete tasks and solve problems far more efficiently and faster than today’s computers.

January 1, 2018

Single metalens focuses all colors of the rainbow in one point

Metalenses — flat surfaces that use nanostructures to focus light — promise to revolutionize optics by replacing the bulky, curved lenses currently used in optical devices with a simple, flat surface. But, these metalenses have remained limited in the spectrum of light they can focus well. Now a team of researchers at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) has developed the first single lens that can focus the entire visible spectrum of light — including white light — in the same spot and in high resolution. This has only ever been achieved in conventional lenses by stacking multiple lenses.

The power of one small basic research investment

By Molly Lachance and Brianna Hodges, Air Force Office of Scientific Research

A small basic research investment by the Air Force Research Laboratory, in partnership with the National Science Foundation, has created a community of origami researchers who in six short years have transitioned to working on applied technologies for the Air Force.

The story begins in 2012 when AFRL’s Air Force Office of Scientific Research partnered with NSF to develop the Origami Design for Integration of Self-assembling Systems for Engineering Innovation program. Over the years, NSF and AFOSR have invested nearly $28M in this program with the goal of creating the mathematical and material foundation for self-folding origami systems and commercializing the concepts. The origami research community received a big boost when Congress decided to include a $5 million congressional interest item in the fiscal year 2017 budget and made plans for a $4.8 million CII in fiscal year 2018.

AFOSR issued the funding opportunity announcement for the first CII and a team from Georgia Institute of Technology and Florida International University won in open competition. Concurrently, AFRL began taking notice of origami basic research as a promising concept for transition to Air Force applications and provided an additional $20 million of funding to a number of small teams around the lab. AFOSR stayed in the mix by managing that investment and creating a venue for university and AFRL researchers to collaborate and share knowledge.

Now, six years after its initial investment, AFRL researchers are developing origami antennas deployable in space. This type of technology requires a multidisciplinary approach, leveraging the knowledge of original and new university partners as well as AFRL experts from the Materials and Manufacturing Directorate, Aerospace Systems Directorate, Sensors Directorate, and Space Vehicles Directorate.

“We’re interested in really compelling scientific and engineering challenges that can lead to applications in the future,” said Ken Goretta, AFOSR program officer. “Compelling science and Air Force relevance drive us to invest, and origami antennas have that.”

This community met on September 13 for a workshop on origami antennas and electromagnetics and September 14 for a kick-off meeting for the new Center for Physically Reconfigurable and Deployable Multifunctional Antennas located at Florida International University.

The goal of the center, which celebrated its grand opening with a ribbon cutting September 15, is to develop innovative and advanced origami-based antenna technologies for next-generation Air Force and Department of Defense systems.

“We want to use the center as an opportunity to create and train a diverse workforce with state-of-the-art training and antenna programs for our nation, and create a pipeline of very well-trained engineers who can go on to work in the government,” said Dr. Stavros Georgakopoulos, TAC Center Director and Inventor, Professor of Electrical and Computer Engineering at FIU.

Georgakopoulos says he never imagined, after being funded initially, that his grant co-funded by AFOSR would be so transformative for his career. “The Air Force Research Lab has a very strong presence. We are going to collaborate, we are going to use some of their expertise, and we are going to do a more interdisciplinary type of work. So we are very excited.”

For more information about the history and intent of the project, as well as its significance to the Air Force and the Department of Defense, visit

This story is an example of how AFRL creates asymmetric S&T advantage for the Air Force by making small strategic investments that create communities and conversations with far-reaching scientific impact.

Forty years as an AFOSR PI: Rod Bartlett’s Personal History

Written By Rod Bartlett

When most people read a popular account of scientific progress, the focus is on the ‘big name’ projects that one knows from the press: solar energy storage, hydrogen fuel, reduction of greenhouse gases, cures for cancer, etc. – and on the ‘experimental’ tools used to address these issues, which measure the success or failure of some hypothesis. That, after all, is the scientific method. But as observations are made, science tries to construct an underlying, organizing ‘theory’ that explains the experiment and will explain untold other future observations. An example is the difference between Newton showing that a prism splits light into many different colors – an experiment – and deriving, from his very general laws, the equations that explain the observed optics of the prism. The latter enables ‘predicting’ untold other optical phenomena in the absence of experiment.

Therefore, in this case Sherlock’s admonition that it is ‘dangerous to theorize without the facts’ needs some modification for ‘predictive’ theory. When the equations are correct and can be solved, the results have to be true. Today, that kind of predictive theory is what has been developed by quantum mechanics, which in Dirac’s phrase underlies ‘all of chemistry.’ Except, in his opinion, ‘the equations are too difficult to be soluble.’ The latter is no longer true. All those highly visible ‘big name’ projects depend upon chemistry, and chemistry deals with, in Mulliken’s phrase, ‘what the electrons are really doing in molecules.’ With this knowledge, the energies of reactions, the activation barriers that control which reactions occur, and the spectroscopic fingerprints that identify the molecules become known.

The description of electrons requires the solution of the familiar, quantized equations of quantum mechanics for the electronic ‘wavefunctions’ and their energies, HΨk = EkΨk. But the H in these equations describes the Coulombic interactions among a molecule’s ‘many electrons’. That means the water molecule’s 10 electrons produce a ‘10-body’ problem (45 electron-electron interactions); benzene’s 42 electrons, a ‘42-body’ one (861 interactions); and a piece of DNA, many more. Yet we can only solve the Schrödinger equation of QM exactly for one electron, the hydrogen atom. So we are faced with having to develop mathematical and computational tools that allow sufficiently accurate solutions of such many-electron problems to obtain the secrets of the molecules in question. When we are able to do that, we have a direct route to facts that are not typically amenable to experimental observation, like those for molecules under extreme conditions as in explosions, or in interstellar space, or the detection and identification of rocket plumes, or the design of new concepts for fuels, among many other applications. Providing these solutions is the science of quantum chemistry.
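The pair counts here are simple combinatorics – n electrons interact in n(n-1)/2 pairs. A minimal sketch (the function name is ours, for illustration only, taking benzene as C6H6 with 42 electrons):

```python
# Number of pairwise Coulomb interactions among n electrons: n(n-1)/2.
def pair_interactions(n: int) -> int:
    return n * (n - 1) // 2

print(pair_interactions(10))  # water, 10 electrons -> 45
print(pair_interactions(42))  # benzene (C6H6), 42 electrons -> 861
```

The quadratic growth in pair interactions is one reason the many-electron problem becomes intractable so quickly as molecules grow.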

But one major problem remained in its application: the problem of ‘electron correlation’. Electrons are charged particles, meaning they interact instantaneously through Coulomb forces that cause their motions to be ‘correlated’, and these interactions are missing from an average (‘mean-field’) approximation like the well-known Hartree-Fock theory. The latter approximates Ψ0 by Φ0, the familiar molecular orbital approximation that provides the conceptual interpretation of much of chemistry. Quantum chemical solutions defining Φ0 have been practical for many applications since the sixties, but the relatively small ‘correlation’ contribution that distinguishes the correct solution is critical to a ‘predictive’ theory for bond energies, activation barriers, spectra, and structure – indeed, chemistry. As such, it has been the dominant unsolved problem in quantum chemistry for about 50 years.

In our forty years of AFOSR support, a number of notable advances have been made in the solution of the correlation problem. As a young scientist at Battelle in Columbus, Ohio, I approached Ralph Kelley, an AFOSR program manager in physics, about support. I told him about using many-body perturbation theory (MBPT) and its diagrammatic framework, borrowed from quantum field theory and Feynman diagrams, to treat ‘electron correlation.’ He and AFOSR enabled me to start as an AFOSR PI in 1978.

As a postdoc at Johns Hopkins with Robert Parr, I had been given the freedom to pursue the many-body theory I had begun as an NSF postdoc at Aarhus University in Denmark in 1973. My collaborator David Silver, from the Hopkins Applied Physics Lab, and I had written the first papers in chemistry in 1974-76 showing the potential power of MBPT. Prior work was due to Hugh Kelly in physics, who applied MBPT to atoms, but molecules require a very different treatment, so these were the first such applications. The reason it is called many-body perturbation theory (MBPT) is that the theory is based on the linked-diagram theorem of Brueckner and Goldstone, which guarantees correct scaling with the number of electrons. Linked diagrams describe the electron-electron interactions in the most compact way. The energy of one of these quantum states has to be ‘extensive’, so it should grow correctly with the number of electrons, a feature we later termed ‘size-extensivity’ as the rationale for all many-body treatments. It should be obvious that when all the units (or atoms) in a molecule are too far apart to interact, the correct energy should be the sum of the energies of the units; yet this condition is not met by the variational configuration interaction approximations that were in dominant use during those 50 years. The many manifestations of size-extensivity were not to be fully realized until the turn of the century. Today, it is deemed a fundamental property that all worthy electronic structure approximations should satisfy.
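The size-extensivity condition can be stated compactly. As a sketch, for two generic units A and B pulled far enough apart that they no longer interact:

```latex
E(A \cdots B) \;\longrightarrow\; E(A) + E(B)
\qquad \text{as} \qquad R_{AB} \to \infty
```

Truncated configuration interaction violates this additivity, which is why its errors grow with system size, while properly linked many-body expansions satisfy it at every truncation.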

Two years after our initial MBPT papers, John Pople decided to apply this method, but chose to call it Moeller-Plesset perturbation theory (MPPT) as he tried to avoid the less familiar diagrammatic tools we used. But his terminology hides the fundamental rationale for these many-body methods: the identification of ‘linked diagrams’ guarantees size-extensivity, and this feature is not apparent in ordinary perturbation theory. Today, the MBPT=MPPT methods for solving the Schrödinger equation are in virtually all quantum chemical programs. A search of the Web of Science shows that though there were only a handful of citations in the ’70s, and a couple of hundred until ~1989, there are now more than 295,700 citations to the method and 8,105 papers written about it.

But perturbation methods are limited to some order, and since the correlation correction is not small (in extreme forms it accounts for phase transitions in solids and superconductivity), a far more powerful many-body approach is to sum many such linked-diagram terms that describe correlation to infinite order. This is the idea of coupled-cluster (CC) theory, which shows that the correct, infinite-order MBPT wavefunction for any system is Ψ0 = exp(T)|Φ0>. The exponential form guarantees size-extensivity. The cluster separation T = T1 + T2 + T3 + …, where the subscripts indicate one-electron, two-electron, three-electron, … clusters, provides a framework for a wealth of approximations determined by the number of clusters retained, like CCSD for single and double ones. The size-extensive property is at work at any truncation of T, providing superior solutions to any that had been previously obtained for the same computational effort. This is because the CC wavefunction, even limited to T2, the double-excitation cluster operator, automatically contains all products like (1/2)T2^2, which are ‘quadruple’ excitations, and (1/6)T2^3, ‘hextuple,’ etc. As T1 and T3 are added, one rapidly exhausts the effects of electron correlation, converging to the exact solution.
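The quadruple and hextuple factors quoted here are just the Taylor coefficients of the exponential, since the k-th power of T2 generates 2k-fold excitations. A quick standard-library check (illustrative only):

```python
from fractions import Fraction
from math import factorial

# exp(T2) = sum over k of T2^k / k!; the k-th power of the
# double-excitation cluster operator produces 2k-fold excitations.
for k in range(1, 4):
    coeff = Fraction(1, factorial(k))
    print(f"T2^{k}: {2 * k}-fold excitations, coefficient {coeff}")
```

The 1/2 and 1/6 recovered for k = 2 and k = 3 are exactly the quadruple- and hextuple-excitation products of the exponential ansatz.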

The first general applications of CC theory, i.e., CCD for just T2, were reported in 1978 by us and by Pople in back-to-back papers. Then George Purvis and I first reported CCSD (CC for single and double excitations) in 1982. Our CC papers were supported by AFOSR.

Because of the products included in exp(T2), unlike CI, most ‘quadruple’ effects are already included in CCSD, so the next most important term is due to T3. In our next AFOSR work (1984) we reported the first general inclusion of triple excitations (CCSDT-1), followed by the first non-iterative approximation, CCSD[T], in 1985. A better non-iterative approximation, CCSD(T), which added one small term to [T], was introduced by the Pople group (1989) without a rigorous derivation. We presented that in 1993. The latter is now called the ‘gold standard’ in quantum chemical calculations. In 1987 we reported the full CCSDT method for the first time, sometimes called the ‘platinum standard’, followed later by full quadruples, CCSDTQ, and pentuples, CCSDTQP! In this way we were able to show the rapid convergence of CC theory to the exact result, documenting its predictive character. Another citation check shows that, from virtually no mention in the seventies to less than a hundred citations in the eighties, CC theory has now spawned 28,780 papers and over 700,000 citations.

Another advantage that calculations have over experiment is the flexibility of application. In a second project with AFOSR, the physics program manager with responsibility for non-linear optics (NLO), Col. Gordon Wepfer, showed me experimental results for electric-field-induced second and third harmonic generation experiments in the gas phase, compared to the theory of the time. The theory was hopeless! NLO effects are critical to all kinds of problems, from protecting pilots' eyes from lasers to doing selective surface chemistry. They are, in principle, amenable to quantum chemistry, as they depend upon the higher terms in the expansion of a molecule's energy in the presence of (frequency-dependent) electric fields. These quantities are called hyperpolarizabilities, as they are higher-order generalizations of the well-known dipole polarizability of a molecule. I could not promise that we could resolve the discrepancy between theory and experiment, but with our new CC/MBPT methods I could promise to do calculations for such quantities with the best correlated quantum chemistry in existence. It took a few years and required some new theory for the treatment of the frequency-dependent effects, but, indeed, we were able to explain the observed experimental values for the first time.
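
In the static limit, the idea can be sketched with a finite-field toy calculation. The script below uses invented model parameters, not real molecular data: it expands the energy in a static field F as E(F) = E0 − μF − (1/2)αF^2 − (1/6)βF^3 − (1/24)γF^4 and recovers the polarizability α and first hyperpolarizability β by central-difference differentiation at F = 0, one standard numerical route to such quantities.

```python
# Finite-field sketch of (hyper)polarizabilities, with made-up parameters.
# E(F) = E0 - mu*F - (1/2)*alpha*F**2 - (1/6)*beta*F**3 - (1/24)*gamma*F**4,
# so alpha = -d2E/dF2 and beta = -d3E/dF3, evaluated at F = 0.
def energy(F, E0=-76.0, mu=0.7, alpha=9.8, beta=-18.0, gamma=1800.0):
    """Model field-dependent energy (illustrative numbers, atomic units)."""
    return E0 - mu*F - alpha*F**2/2 - beta*F**3/6 - gamma*F**4/24

h = 0.005  # finite-field step; small enough to control truncation error
# Central-difference estimates of the 2nd and 3rd derivatives at F = 0:
d2 = (energy(h) - 2*energy(0.0) + energy(-h)) / h**2
d3 = (energy(2*h) - 2*energy(h) + 2*energy(-h) - energy(-2*h)) / (2*h**3)
alpha_est = -d2
beta_est = -d3
print(f"alpha ~ {alpha_est:.3f}, beta ~ {beta_est:.3f}")
```

A real finite-field calculation replaces the model `energy` with correlated CC/MBPT energies computed at each field strength; the frequency-dependent effects described in the text required new response theory rather than this static recipe.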

Another illustration of the flexibility of application occurred when Capt. Pat Saatzer of the Rocket Lab asked us at Battelle to provide a theory complement to two experimental efforts, one directed by John Fenn, later a Nobel Laureate, to determine the cross sections for vibrational excitations when components of combusted fuel collide with O atoms in the upper atmosphere. The idea is that, depending upon the products in the fuel, a knowledge of these signatures allows one to identify whose missile it is. This kind of problem requires the combined efforts of molecular dynamics and quantum chemistry: the latter to provide the potential energy surface of interactions between the molecules and O atoms, and the former, carried out by Mike Redmon, to add the time-dependent aspects. Both experiments failed, leaving only the theory to provide the cross sections required in the deployment of detectors.

A third illustration deals with NMR spectra. NMR has two components: a vector term that gives the chemical shift and a scalar term that provides the J-J spin-coupling constants for molecules. As the latter connects any two atoms in a molecule through its electronic density, the analogy with a chemical bond has inspired a lot of discussion. Once again inspired by the lack of agreement between theory and experiment, as pointed out in some review articles, Ajith Perera and I decided to apply our new CC/MBPT tools to resolving this issue. Once again these methods were remarkably successful, providing the first 'predictive' theory of J-J coupling constants. We went on to use them to help resolve the long-running argument between George Olah and H. C. Brown about the existence of non-classical carbon bonding, and, with Janet Del Bene, to study the 'two-bond' coupling across an H-bond in nucleic acid bases. By measuring the latter, one can infer the location of the H-bond, which cannot be seen in X-ray analysis. We also offered a J-J signature for the meaning of a strong, weak, or normal H-bond.

The next major theory effort for AFOSR was our further development of CC/MBPT, now focused on excited states. In the Schrödinger equation above, the 'k' indicates one of the many quantized solutions to the problem. The others are important to electronic spectroscopy and photochemistry, among many other needs. In its original formulation, CC/MBPT provided very accurate results for one state, but we changed that by introducing what we call equation-of-motion (EOM) CC in work spanning 1984-1992. This enables one to add a spectrum of excited states on top of a CC solution for the ground state. EOM also permits 'excited' states that differ from the ground state in the number of electrons, as in ionization in photoelectron spectroscopy (IP-EOM-CC), adding an electron (EA-EOM-CC), kicking out two electrons (DIP-EOM-CC), or adding two (DEA-EOM-CC). Hence, one now has a wide array of ways to describe 'what the electrons are doing in molecules' in a wealth of different situations. Subjecting EOM-CC (sometimes called CC linear response) to the same measure of use as the other two developments shows over 23,500 citations and 690 papers using these methods today.

Armed with all these tools, we turned to a fascinating problem that arose in the new high-energy density material (HEDM) program, geared toward new ideas for 'revolutionary' improvements in rocket fuels. I submitted a proposal entitled "An Investigation of Metastability in Molecules" to Drs. Larry Curtiss and Larry Burggraf in AFOSR chemistry that asked how much energy could be stored in a molecule with a barrier to decomposition sufficient to keep it around long enough to be useful. Later Dr. Mike Berman became the program manager, and he remains my program manager today. My proposal planned to use our predictive set of quantum chemical tools to address this question. Unlike synthesis, which is difficult, expensive, and dangerous, quantum chemical applications can explore prospects that exhibit different principles to see if any might be worthy of further study.

One strategy for storing energy in molecules would be to force some atoms to bind in unfamiliar ways, a concept we termed 'geometric metastability.' A case in point is the tetrahedral form of N4. As the normal form of P4 is a tetrahedron, and N and P are isovalent, such a molecule makes sense. But while P2 is not very stable compared to P4, the N2 triple bond is one of the strongest bonds known, and four N atoms energetically prefer two N2 molecules to four singly bonded N atoms in a tetrahedron. That, of course, is exactly what one would like: if the four N atoms could be put into a tetrahedron, and if there were a barrier to decomposition that would keep it around, then under stimulus all the energy in N4 could be released to N2 molecules. Our calculations show that N4 would release 190 kcal/mol upon decomposition to two N2 molecules and would be held together by a barrier of 40 kcal/mol, once the four atoms could be put into the tetrahedron. That, of course, is the difficulty. Although there have been some potential experimental observations, perhaps the best one is from mass spectrometry, where its isoelectronic analogue, N3O+, has been seen.
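
As a quick sanity check on why such a molecule would interest a rocket-fuel program, the release quoted above converts to a specific energy as follows; only the 190 kcal/mol figure and the standard atomic weight of nitrogen are used, and the comparison line is the conventional TNT equivalent of 4.184 kJ/g.

```python
# Specific energy of tetrahedral N4 from the computed release quoted above.
KCAL_TO_KJ = 4.184
release_kcal_per_mol = 190.0       # N4 -> 2 N2, from the text
molar_mass_n4 = 4 * 14.007         # g/mol, standard atomic weight of N
specific_energy = release_kcal_per_mol * KCAL_TO_KJ / molar_mass_n4
tnt_equivalents = specific_energy / 4.184   # conventional TNT: 4.184 kJ/g
print(f"{specific_energy:.1f} kJ/g, ~{tnt_equivalents:.1f}x TNT")
```

That works out to roughly 14 kJ per gram, several times the conventional TNT equivalent, provided the kinetic barrier keeps the molecule intact until it is triggered.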

Another of our predictions was the existence of the pentazole anion, N5-. Again, this makes perfect sense in terms of its bonding, even achieving extra stability via its pi-electron aromaticity, as in benzene. In this case we predict a barrier to decomposition of 27 kcal/mol. It has now been observed in negative-ion mass spectrometry as a byproduct of a known pentazole-containing molecule. The targets for the HEDM project originated with theory that spun off further work by DARPA and NASA, with the former pursuing serious synthetic efforts. Recently, another of our predictions, N8, seems to have been seen experimentally. Some of these species have also been seen in high-pressure experiments.

Everyone in the computational field would love to be able to make accurate calculations using an effective one-particle theory, so that all the complicated two-particle terms that must be described in CC theory could be avoided. This is the impetus for the development of Kohn-Sham density functional theory (KS-DFT). But unlike CC, there is no way to converge to the right answer, since the correct density functional is not known in any useful way. Instead, thousands of density-based approximations to KS-DFT are made and used to get answers quickly, without any guarantee of veracity.

In our current work for AFOSR we have tried to improve upon such an approach by insisting upon a rigorous foundation. That foundation starts with our formulation of a 'correlated orbital theory' (COT). It was derived by manipulating the formally exact IP/EA-EOM-CC equations into an effective one-particle form, whose eigenvalues have to correspond to the energy required to remove any electron from the molecule (IP) or to add an electron to the molecule (EA). This approach augments the mean-field Hartree-Fock approximation with a correlation orbital potential (COP)!

Since KS-DFT is a special case of COT, one can use this rigorous theory as a model to assess the accuracy of various DFT approximations. Finding that none satisfy our conditions, we took some well-known forms and, by virtue of the 2-4 parameters in them, fit them to satisfy our eigenvalue property, initially only for water's five IPs. In this way we introduced QTP(00). Two new minimally parameterized approximations, QTP(01) and QTP(02), followed. All provide accurate one-particle spectra, to some threshold, from the eigenvalue attached to each MO, proven by testing them against 401 experimental values from 63 molecules. QTP(01) gets all valence IPs accurate to ~10%. An important application is core ionization and excitation, where QTP(00) is without peer; it accurately describes the core spectra of all the amino acids. Unlike any other DFT approximation, QTP(02) correctly describes the EA, both bound and unbound. The QTP family also gives excellent activation barriers, excited-state excitation energies, and the molecular densities themselves. As the avowed goal of DFT is to provide accurate densities, the QTP functionals do that better than most.

This QTP family defines what we call 'consistent' KS-DFT approximations, since one cannot get the IP eigenvalues right without a good KS potential. The connection between the orbital eigenvalues and IPs also requires that the excitations given by adiabatic time-dependent DFT (TDDFT) be correct for excitation into the continuum, i.e., ionization. Further, an accurate potential mitigates the debilitating self-interaction error of KS-DFT, whereby electrons incorrectly interact with themselves. When we insist upon 'consistency,' we are a step closer to our goal of mimicking the predictive results of CC theory in a highly efficient one-particle theory. This is another testament to the CC revolution that was begun and nurtured by AFOSR!

Besides the AFOSR work mentioned here, it is important to recognize that other aspects of our formative many-body developments benefitted from exceptional support from ONR (Bobby Junker) and ARO (Mikal Ciftan), and their successors. But it is true that ALL these accomplishments are uniquely a research product of the DoD agencies who had the foresight to back them in their infancy. I am extremely appreciative of the confidence shown in our effort over these 40 years.

The Sky is The Limit for Agile Air Force Science Test and Evaluation

Dr. Brett Pokines, AFOSR Program Officer for the Test Science for Test and Evaluation (T&E) program, hosted a three-day conference at the Doolittle Institute in August, maximizing access for the Test and Evaluation community at Eglin AFB in western Florida.

Highlights from the meeting include:

  • Keynote speaker Dr. Elisabetta Jerome, Technical Advisor for Armament and Weapon Test and Evaluation, shared the Air Force Test Center (AFTC) strategic vision and the integral role basic science investments play in the needs, changes, and opportunities within the AFTC Enterprise. AFTC's span of operations covers 32 locations equipped with over 200 ground test facilities. The Test Center includes 100 aircraft across 21 different variants, 12 unique test cells, and three major installations.
  • Lt Colonel Daniel Montes, Director of Curriculum at the USAF Test Pilot School (TPS), presented links between the TPS mission and fundamental T&E research. The USAF TPS is home to the Air Force's top pilots, navigators, and engineers, who conduct flight tests and gather the representative data needed to carry out test missions.
  • Lt Colonel Randy Gordon, Test and Evaluation Lead for AFWERX, highlighted the cultural changes taking place within the T&E community and encouraged review participants to push beyond traditional developmental thinking. AFWERX is driven by innovation to bring tomorrow’s tools to the warfighter today.
  • Mr. Keith Kirk, Experimentation Program Manager in the Air Force Strategic Development Planning and Experimentation Directorate, provided a compelling example in his talk as Co-lead for Phase II of "The Light Attack Aircraft Experiment," which evaluated light attack aircraft capabilities to inform an expeditious procurement process. The program was instrumental in demonstrating how T&E and acquisition teams are reimagining how capabilities are brought to the Air Force.

Opportunities to network, collaborate, and connect with stakeholders set the stage for groundbreaking work in the Science Test and Evaluation community in the countdown to 2030. The conference provided an engaging platform for the T&E community to align opportunity, capability, and innovation in support of the Air Force mission.