Its supercomputer can now train a GPT-3 model with 175 billion parameters in under 4 minutes.