Introducing support for GPU workloads and even larger Pods in GKE Autopilot | C2C Community


  • 29 September 2022


Run your AI/ML, video transcoding, and other GPU workloads on Google's fully managed Kubernetes platform! NVIDIA T4 and A100 GPUs are now available on GKE Autopilot.

The great thing about running GPU workloads on Autopilot is that all you need to do is specify your GPU requirements in your Pod configuration, and Google Cloud takes care of the rest. There's no need to install drivers separately, or to worry about non-GPU Pods running on your valuable GPU nodes, because Autopilot handles GPU configuration and Pod placement automatically.
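As a rough sketch of what "specify your GPU requirements in your Pod configuration" looks like, the manifest below requests a single NVIDIA T4 via a node selector and a GPU resource limit (the container image and Pod name here are illustrative placeholders, not from the announcement):

```yaml
# Hypothetical example: a Pod requesting one NVIDIA T4 GPU on GKE Autopilot.
apiVersion: v1
kind: Pod
metadata:
  name: gpu-workload-example
spec:
  nodeSelector:
    # Autopilot selects and provisions a GPU node matching this accelerator type.
    cloud.google.com/gke-accelerator: nvidia-tesla-t4
  containers:
    - name: training-container
      image: us-docker.pkg.dev/my-project/my-repo/my-gpu-app:latest  # placeholder image
      resources:
        limits:
          # Number of GPUs the container needs; Autopilot handles driver setup.
          nvidia.com/gpu: 1
```

With a spec like this, Autopilot provisions an appropriately sized GPU node for the Pod, so there is no separate node pool or driver installation step to manage.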

 

Click the link below to read more details.

 
