# Supercomputing: An exascale-sized challenge?

### Automatic TRANSCRIPT

The terms supercomputing and high-performance computing, or HPC, are one and the same, but you hear HPC being used more these days. There's a growing call for the democratisation of supercomputers, which historically has been tricky, because supercomputers weren't really created to do just any old bit of computing. Here's Jacob Balma, an HPC and engineering research scientist from Hewlett Packard Enterprise.

The first supercomputers were invented to solve a very specific problem: a hydrodynamics problem for simulating nuclear weapons. So during World War Two, they were trying to develop these nuclear weapons, and they had to do what's called a numerical simulation. It's essentially a fluid dynamics simulation, and so they needed to run that problem numerically through a computer. That's kind of how Alan Turing and John von Neumann came up with the architecture and, sort of, the algorithm for running numerical methods on these systems.

Supercomputers have come a long way in a relatively short amount of time, thanks to innovation in materials, experimental architectures, industry pioneers like Seymour Cray, and ever-increasing processing speeds. But for the most part, until very recently, these machines have been the domain of scientists.

The prospect of solving some of the really hard problems that have plagued humanity forever, that's, like, in our sights now. Like, we could feasibly solve problems like cancer, and all these crazy permutations of coronavirus, and any of these really scary viruses that are going to come out.

Okay, listeners, this is one of those episodes that starts normal and gets a little complicated. I'm going to get a little geeky, but I really wanted to get to the bottom of these powerful machines, and I wanted to understand what makes a supercomputer so, well, super. So I called up Bill Mannel, vice president and general manager of high-performance computing at Hewlett Packard Enterprise.
A supercomputer is a lot of processors, memory, and a high-speed interconnect tied together. Supercomputing provides the hardware infrastructure, if you will, to do parallel computing. Parallel computing is important because, typically, you'd want to break up a problem across multiple processors, or multiple servers or nodes. Typically, what you do is break up the data. That's called data partitioning, where little chunks are put on each server, on each processor, and then worked on independently, and then at the end you bring it all together.

This parallel computing is kind of a big deal. It's what makes certain numerical problems ideal for supercomputers. As for what makes them special, or different from the one on my desk: it's mostly because they're optimized around solving scientific and engineering problems, in terms of how they can partition data, how they can manipulate the data, and how they can keep a certain amount of data in memory at one time, so you're not always moving data back and forth to a drive or to a network, for example.

Okay, key point here: it's all about data. Starting from those initial fluid dynamics simulations, these big machines have been used for all kinds of modelling, from weather to weapons. Alongside their partner Intel, Bill and his team are in the process of deploying a supercomputer called Aurora, one of the world's first exascale computers, at the US Department of Energy.

What's exascale? Well, to answer that, I need to note that supercomputer speed is measured in floating-point operations per second, aka FLOPS, which is basically just the number of calculations it can do in a second. An exascale computer is a computer that's able to do at least a billion billion floating-point calculations per second. There's no single chip in the world that can do that, so you've got to bring together a lot of chips into one system that allow you to accomplish it. It's basically ten to the eighteenth, in terms of the number of floating-point operations.
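To put "a billion billion" in perspective, here's a quick back-of-the-envelope check (my own arithmetic, not from the interview): an exaflop is 10^18 operations per second, so a machine sustaining one gigaflop (10^9 operations per second) would need roughly 31.7 years to do what an exascale system does in one second.

```python
# Back-of-the-envelope: how big is an exaflop?
exaflops = 10**9 * 10**9          # "a billion billion" operations per second
assert exaflops == 10**18         # i.e. ten to the eighteenth

gigaflops = 10**9                 # one billion operations per second
seconds_needed = exaflops // gigaflops          # 10**9 seconds at one gigaflop
years = seconds_needed / (365.25 * 24 * 3600)   # convert seconds to years
print(round(years, 1))            # about 31.7 years per exascale-second
```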
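The split-work-combine pattern Bill describes, data partitioning, can be sketched in a few lines of Python. This is a toy single-machine illustration using a process pool (a real supercomputer distributes chunks across many nodes, typically with something like MPI), and the function names are mine, not HPE's.

```python
from multiprocessing import Pool

def partial_sum_of_squares(chunk):
    # Each worker processes its own little chunk of the data independently.
    return sum(x * x for x in chunk)

def parallel_sum_of_squares(data, workers=4):
    # Data partitioning: cut the input into one chunk per worker.
    size = (len(data) + workers - 1) // workers
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with Pool(workers) as pool:
        partials = pool.map(partial_sum_of_squares, chunks)  # chunks worked on in parallel
    return sum(partials)  # at the end, bring it all together

if __name__ == "__main__":
    data = list(range(1_000))
    print(parallel_sum_of_squares(data))  # same answer as the serial sum(x * x for x in data)
```

The same three steps, partition, compute independently, combine, scale from four local processes up to the thousands of nodes in a machine like Aurora.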