In 2012, DOE granted Washington’s team and their project, the Climate End Station, a total of 86 million processor-hours through the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program. The team has 56 million processor-hours on Jaguar and 30 million processor-hours on Argonne National Laboratory’s supercomputer to generate climate simulations. The Jaguar allocation alone is equivalent to the power of 28 million dual-core laptops running for one hour. However, unlike millions of separate laptops, Jaguar’s massive array of parallel processors is interconnected, allowing it to perform millions of calculations simultaneously and making more complex simulations possible.
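The laptop comparison appears to apply to the Jaguar share alone: 56 million processor-hours spread over two cores per laptop gives 28 million laptop-hours. A quick back-of-the-envelope check (my own arithmetic, not from the DOE announcement):

```python
# Rough arithmetic behind the "28 million dual-core laptops" comparison.
jaguar_hours = 56_000_000    # processor-hours allocated on Jaguar
argonne_hours = 30_000_000   # processor-hours allocated at Argonne
total_hours = jaguar_hours + argonne_hours   # 86 million total

cores_per_laptop = 2                          # dual-core laptop
laptop_hours = jaguar_hours / cores_per_laptop
print(total_hours, int(laptop_hours))         # 86000000 28000000
```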
For example, our team’s first baseline experiment will require a total of 200 simulated years. For such a simulation to occur within the timeframe of a single supercomputer resource allocation (typically one year), an integration rate of multiple simulated years per day is required. A high-resolution, century-long climate experiment of the type contemplated here requires enormous amounts of computer time – on the order of 8 million CPU-hours per simulated century.
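The quoted budget can be checked with simple arithmetic. The 16-million-CPU-hour total and the minimum integration rate below are my own multiplications from the figures in the quote; in practice the machine is shared, which is why the quote demands multiple simulated years per day rather than this bare lower bound:

```python
# Sketch of the climate allocation math implied by the quote above.
cpu_hours_per_century = 8_000_000
cpu_hours_per_sim_year = cpu_hours_per_century / 100     # 80,000
total_sim_years = 200
total_cpu_hours = total_sim_years * cpu_hours_per_sim_year  # 16 million

# Finishing 200 simulated years inside a one-year allocation requires
# at least this many simulated years per wall-clock day of running:
min_rate = total_sim_years / 365
print(int(total_cpu_hours), round(min_rate, 2))  # 16000000 0.55
```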
But how does that compare to modeling the aircraft crash into the World Trade Center’s north tower?
One reported run took 100 hours on 8 processors to cover 0.5 seconds of simulated time; another took 30 hours on 16 processors to cover 0.37 seconds. That works out to 800 CPU-hours and 480 CPU-hours respectively.
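The CPU-hour figures follow directly from wall-clock hours times processor count, and a single simulated climate year (at the 80,000 CPU-hours per year quoted earlier) costs roughly a hundred of these impact runs. A sketch using only the numbers quoted above:

```python
# CPU-hours = wall-clock hours * number of processors.
run1 = 100 * 8    # 800 CPU-hours for 0.5 sec of simulated impact
run2 = 30 * 16    # 480 CPU-hours for 0.37 sec

# One simulated climate year, from the 8 million CPU-hours/century figure.
climate_per_sim_year = 8_000_000 / 100   # 80,000 CPU-hours

print(run1, run2, climate_per_sim_year / run1)  # 800 480 100.0
```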
It took about 80 hours using a high-performance computer containing 16 processors to produce the first simulation
Impact simulations were performed using the nonlinear finite-element-based dynamic analysis software LS-DYNA [version 970 r5434a SMP] (LSTC 2005) on the IBM multi-processor nanoregatta computer system at Purdue University. Typically, we simulated the first 0.5 second of the time after impact and used an adaptive incremental approach, resulting in an average time-step of 1.0×10⁻⁶ sec.
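The quoted settings imply the number of explicit time-steps per run (my arithmetic, not a figure from the paper):

```python
# Time-step count implied by the LS-DYNA settings quoted above.
simulated_time = 0.5   # seconds of post-impact behavior simulated
avg_dt = 1.0e-6        # average adaptive time-step, seconds

steps = simulated_time / avg_dt
print(int(steps))      # 500000 explicit time-steps per run
```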
Simulating the collapse of a WTC tower should be relatively trivial compared to a climate simulation. There were 2,800 perimeter panels from the 9th floor to the top, fewer than 2,000 column sections in the core, and probably about 9,000 horizontal beams in the core. So 50,000 components would probably be enough for a good WTC model, and producing one at some point in the last 10 years should not have been a problem. It is not much compared to a climate simulation with millions of cells.
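Tallying the rough component counts above shows how much headroom a 50,000-component budget leaves (the allowance for floor trusses, slabs, and connections is my own assumption, not a count from the text):

```python
# Rough structural inventory of one tower, from the counts quoted above.
perimeter_panels = 2_800
core_column_sections = 2_000   # "fewer than 2,000" -- treated as an upper bound
core_beams = 9_000             # approximate

major_members = perimeter_panels + core_column_sections + core_beams
print(major_members)  # 13800 -- well under a 50,000-component budget,
                      # leaving room for floor trusses, slabs, and connections
```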
How much computing power was needed to design the WTC? The design work was done in the early 1960s. The SR-71 Blackbird was flying at 2,000 mph in 1964, which is more impressive than a skyscraper; the groundbreaking for the WTC was in 1966. So the computing power available at the time was not very impressive compared to the 1980s, when some early climate models were run, and yet the buildings stood for 28 years and withstood 100 mph winds on several occasions. I have not heard of the Empire State Building or any other skyscraper failing because of Sandy’s fury.
So it is certainly curious that, with the computing power available almost 40 years after completion of the towers, we cannot get a good computer simulation of the supposed collapses built on publicly available, human-readable data, and yet people who believe good climate simulations are possible have no problem with the lack of satisfactory building-collapse models.
So with global warming we are dealing with a huge object with many unknowns yet to be resolved. But with the 9/11 problem we have a man-made object of a kind of which hundreds have been built around the world, and yet not everyone will concede that we should have information as simple as the tons of steel and tons of concrete on every level. The steel itself involved a kind of feedback loop: the more steel placed near the top, the more steel was needed below to support it.
9/11 is a scientific farce. It is hardly sophisticated enough to be dignified with being called a fraud.