The Hopper H200 GPU is built on the same GH100 silicon as the H100, but upgrades the memory subsystem to 141 GB of HBM3e, arranged as six 24 GB stacks (141 GB of the physical 144 GB is enabled) on a 6,144-bit memory interface, for roughly 4.8 TB/s of memory bandwidth. It also forms the GPU half of the GH200 Grace Hopper Superchip, which pairs it with a 72-core Grace CPU backed by 480 GB of ECC LPDDR5X memory. On the compute side, the H200 delivers up to roughly 4 petaflops of FP8 AI performance.
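The memory figures above can be sanity-checked with a little arithmetic. The per-pin data rate used below is an assumption chosen to match the quoted bandwidth, not an official specification:

```python
# Back-of-the-envelope check of the H200 memory subsystem figures.
# The 6.25 Gb/s per-pin rate is an assumed value, not a published spec.
STACKS = 6
BITS_PER_STACK = 1024    # each HBM3e stack exposes a 1,024-bit interface
PIN_RATE_GBPS = 6.25     # assumed per-pin data rate in Gb/s

bus_width = STACKS * BITS_PER_STACK                    # 6,144-bit interface
bandwidth_tbs = bus_width * PIN_RATE_GBPS / 8 / 1000   # Gb/s -> GB/s -> TB/s

print(bus_width)         # 6144
print(bandwidth_tbs)
```

Six stacks at 1,024 bits each give the 6,144-bit bus, and at the assumed pin rate that works out to about 4.8 TB/s.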
APO is a technology that optimizes an application's resource allocation and thread scheduling in real time, based on the application's observed characteristics and demands. It combines hardware and software components, including a dedicated APO engine, an APO driver, and an APO library, to analyze application behavior and adjust it on the fly.
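As a rough illustration of characteristics-driven scheduling, the sketch below assigns threads to core pools based on a simple telemetry signal. Everything here is hypothetical: the core IDs, the 0.5 threshold, and the "compute-bound fraction" metric are invented stand-ins, since APO's actual heuristics and interfaces are not described in detail:

```python
# Hypothetical sketch of characteristics-driven thread placement, loosely
# analogous to what a real-time optimizer like APO does. Core pools and
# the threshold are invented for illustration only.
PERF_CORES = [0, 1, 2, 3]   # assumed "performance" core IDs
EFF_CORES = [4, 5, 6, 7]    # assumed "efficiency" core IDs

def place_threads(profiles):
    """Assign each thread to a core pool based on its observed profile.

    profiles: dict mapping thread name -> fraction of time the thread is
    compute-bound (0.0..1.0), standing in for the runtime telemetry a
    scheduler-optimizing layer would gather.
    """
    placement = {}
    for name, compute_bound in profiles.items():
        pool = PERF_CORES if compute_bound >= 0.5 else EFF_CORES
        placement[name] = pool
    return placement

placement = place_threads({"render": 0.9, "audio": 0.2, "io": 0.1})
# the compute-heavy "render" thread lands on the performance cores,
# while the lightweight threads go to the efficiency cores
```

A real implementation would re-run this placement continuously as the profiles change, which is what distinguishes on-the-fly optimization from a one-time static assignment.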