Dirk Van Essendelft, Machine Learning and Data Science Engineer at the National Energy Technology Laboratory (NETL), shares how his team is using the Cerebras CS-1 to accelerate computational fluid dynamics.

Learn more about our work with NETL: https://www.cerebras.net/cerebras-customer-spotlight-overview/spotlight-national-energy-technology-laboratory/

TIMESTAMPS
0:47 National Energy Technology Laboratory’s mission to meet the nation’s carbon-neutral energy needs
2:18 Department of Energy’s Energy Earthshots Initiative
3:24 NETL investing in Cerebras CS system to dramatically accelerate modeling of fossil energy, carbon management, carbon capture and more
3:58 Field equation modeling, fundamental to all sciences and engineering
5:17 Using the Cerebras Wafer-Scale Engine for modeling field equations
6:17 Benefits of WSE’s memory bandwidth
7:20 NETL’s domain-specific language for field equations, the WSE-FE API
9:15 Cerebras WSE benchmarking vs. the Joule 2 supercomputer — 830x performance improvement and impressive scaling
11:18 WSE is 200x faster than Joule 2 for OpenFOAM
11:34 Massive leap in performance that the nation needs
11:40 Software architecture overview — Python magics to process operations
11:32 How to solve arbitrary field equations — WSE array
15:14 Dynamic bytecode interpreter (illustrative sketch after the timestamps)
16:25 What gets run on the WSE
17:26 Expanding model complexity
17:38 Solving the temperature field in a fluid moving across a hot plate, with non-symmetric matrices and a non-linear update (illustrative sketch after the timestamps)
18:45 Solving the full Navier-Stokes equations with SIMPLE
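
The dynamic bytecode interpreter mentioned at 15:14 is only described at a high level in the talk. As a rough, hypothetical illustration of the general idea (this is not NETL's WSE-FE implementation; the opcodes, register model, and Laplacian helper below are invented), a host program can lower field expressions into a short list of instructions that a small interpreter dispatches over whole arrays:

```python
# Hypothetical illustration of a tiny bytecode interpreter for field operations.
# This is NOT NETL's WSE-FE implementation; the opcodes, register model, and
# laplacian helper are invented to show the general dispatch idea.
import numpy as np

def laplacian(f, dx):
    """5-point Laplacian stencil; boundary cells are left at zero."""
    out = np.zeros_like(f)
    out[1:-1, 1:-1] = (
        f[1:-1, 2:] + f[1:-1, :-2] + f[2:, 1:-1] + f[:-2, 1:-1]
        - 4.0 * f[1:-1, 1:-1]
    ) / dx**2
    return out

# Each opcode writes a whole-field result into a named register.
OPS = {
    "add":   lambda regs, dst, a, b:  regs.__setitem__(dst, regs[a] + regs[b]),
    "scale": lambda regs, dst, a, c:  regs.__setitem__(dst, c * regs[a]),
    "lap":   lambda regs, dst, a, dx: regs.__setitem__(dst, laplacian(regs[a], dx)),
}

def run(program, regs):
    """Dispatch each (opcode, *operands) instruction over the field registers."""
    for opcode, *operands in program:
        OPS[opcode](regs, *operands)
    return regs

# Example program: one explicit diffusion step  T <- T + dt * alpha * lap(T)
dx, dt, alpha = 0.1, 1e-3, 1.0
regs = {"T": np.random.rand(16, 16)}
program = [
    ("lap",   "lapT", "T", dx),
    ("scale", "rhs",  "lapT", dt * alpha),
    ("add",   "T",    "T", "rhs"),
]
run(program, regs)
print("updated field mean:", float(regs["T"].mean()))
```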
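
The hot-plate problem at 17:38 is a classic advection-diffusion field equation. The sketch below is a minimal NumPy version on an ordinary CPU, using an explicit upwind/central finite-difference update rather than the implicit, non-symmetric solve described in the talk; the grid size, velocity, and diffusivity are assumed for illustration:

```python
# Hypothetical CPU sketch (not WSE-FE code): temperature field of a fluid
# moving left-to-right across a hot bottom plate, relaxed toward steady state
# with explicit pseudo-time stepping of the advection-diffusion equation
#   dT/dt = alpha * (T_xx + T_yy) - u * T_x
import numpy as np

nx, ny = 64, 32                  # grid points in x (flow) and y (wall-normal)
dx = dy = 1.0 / (ny - 1)         # uniform grid spacing (assumed domain size)
u = 1.0                          # uniform x-velocity of the fluid (assumed)
alpha = 0.05                     # thermal diffusivity (assumed)
dt = 0.2 * min(dx / u, dx * dx / (4 * alpha))   # stable explicit step

T = np.zeros((ny, nx))           # temperature field, indexed T[y, x]
T[0, :] = 1.0                    # hot plate along the bottom wall

for _ in range(20000):
    Tn = T.copy()
    # first-order upwind advection for u > 0:  u * dT/dx
    adv = u * (Tn[1:-1, 1:-1] - Tn[1:-1, :-2]) / dx
    # central-difference diffusion:  alpha * (T_xx + T_yy)
    diff = alpha * (
        (Tn[1:-1, 2:] - 2 * Tn[1:-1, 1:-1] + Tn[1:-1, :-2]) / dx**2
        + (Tn[2:, 1:-1] - 2 * Tn[1:-1, 1:-1] + Tn[:-2, 1:-1]) / dy**2
    )
    T[1:-1, 1:-1] = Tn[1:-1, 1:-1] + dt * (diff - adv)
    # boundary conditions: hot plate, cold inlet and free stream, outflow
    T[0, :] = 1.0                # hot plate along the bottom wall
    T[:, 0] = 0.0                # cold fluid entering at the inlet
    T[-1, :] = 0.0               # undisturbed free stream at the top
    T[:, -1] = T[:, -2]          # zero-gradient outlet

print("temperature one cell above the plate at the outlet:", round(float(T[1, -1]), 3))
```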

#energy #climatechange #HPC #AI #deeplearning #computationalfluiddynamics
