Google Cloud demonstrated an LLM training job over 50,000+ TPU v5e chips, in JAX! #18478
hawkinsp asked this question in Show and tell
- https://cloud.google.com/blog/products/compute/the-worlds-largest-distributed-llm-training-job-on-tpu-v5e
This was done using JAX and MaxText (https://github.com/google/maxtext), amongst other libraries.
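To give a flavor of the approach (not the actual MaxText code): JAX expresses multi-chip training by laying devices out in a mesh and sharding arrays across it, and the same program scales from one CPU to tens of thousands of TPU chips. A minimal data-parallel sketch, with a toy linear model standing in for the real transformer:

```python
import numpy as np
import jax
import jax.numpy as jnp
from jax.sharding import Mesh, PartitionSpec, NamedSharding

# 1-D device mesh over whatever accelerators are available.
# The blog post's job used 50,000+ TPU v5e chips; locally this
# may be a single CPU device, and the code is unchanged.
mesh = Mesh(np.array(jax.devices()), axis_names=("data",))

# Shard the batch along the "data" axis; replicate the parameters.
batch_sharding = NamedSharding(mesh, PartitionSpec("data"))
param_sharding = NamedSharding(mesh, PartitionSpec())

def loss_fn(w, x, y):
    # Toy linear regression loss; MaxText trains a transformer instead.
    return jnp.mean((x @ w - y) ** 2)

# jit compiles the computation; the compiler inserts the collectives
# (e.g. the gradient all-reduce) implied by the shardings.
grad_fn = jax.jit(jax.grad(loss_fn))

w = jax.device_put(jnp.zeros((4,)), param_sharding)
x = jax.device_put(jnp.ones((8, 4)), batch_sharding)  # batch of 8
y = jax.device_put(jnp.ones((8,)), batch_sharding)

grads = grad_fn(w, x, y)
print(grads.shape)  # (4,)
```

Note the batch dimension must be divisible by the number of devices along the "data" axis; the real job additionally shards the model itself, which MaxText handles for you.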