Artificial intelligence requires the simultaneous execution of a large number of low-precision calculations, and GPUs are designed with thousands of processing cores that can all work in parallel. Meta's AI supercomputer currently houses 6,080 Nvidia graphics processing units and will grow to nearly 16,000 GPUs by mid-summer this year, which Meta says will make it the fastest AI supercomputer in the world. Meta's researchers have been developing the supercomputer for over two years and are now using it to train AI models for natural-language processing and computer vision research.
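The trade-off behind "low-precision calculations" can be illustrated with a small sketch. This is not Meta's code; it is a minimal NumPy example (run on a CPU) showing that 16-bit floating-point arithmetic, the kind GPUs accelerate massively, produces results very close to 32-bit arithmetic while using half the memory per value.

```python
import numpy as np

# Low-precision (fp16) arithmetic trades a little accuracy for
# throughput: GPUs can execute many more fp16 operations per cycle
# across their thousands of cores. This CPU sketch only illustrates
# the numerical effect, not the hardware speedup.
rng = np.random.default_rng(0)
a = rng.standard_normal((256, 256)).astype(np.float32)
b = rng.standard_normal((256, 256)).astype(np.float32)

full = a @ b  # fp32 reference result
half = (a.astype(np.float16) @ b.astype(np.float16)).astype(np.float32)

# fp16 closely tracks fp32 but carries visible rounding error.
rel_err = np.abs(full - half).max() / np.abs(full).max()
print(f"max relative error with fp16: {rel_err:.4f}")
```

Training frameworks exploit exactly this property with mixed-precision schemes, keeping most arithmetic in 16-bit formats while accumulating sensitive quantities in 32-bit.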
Several hundred employees were involved in the project, according to the company, including researchers from partners Nvidia Inc., Penguin Computing Inc., and Pure Storage Inc.
The team aims to train models with more than a trillion parameters on data sets as large as an exabyte, which is nearly 36,000 years of high-quality video. Beyond scale, the key objective is to ingest troves of data to build AI models that can think like a human brain, drawing on multiple inputs, such as voice and visual recognition, and delivering a contextual understanding of situations.
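The exabyte-to-video comparison can be sanity-checked with quick arithmetic. Assuming the decimal convention of 1 exabyte = 10^18 bytes (an assumption, since the source does not specify), 36,000 years of video at that total size implies a bitrate of about 7 Mbit/s, which is indeed a typical high-quality HD streaming rate:

```python
# Sanity check: what bitrate makes one exabyte last 36,000 years?
# Assumption: 1 exabyte = 10**18 bytes (decimal SI units).
SECONDS_PER_YEAR = 365.25 * 24 * 3600
years = 36_000

bytes_per_second = 1e18 / (years * SECONDS_PER_YEAR)
megabits_per_second = bytes_per_second * 8 / 1e6

print(f"implied video bitrate: {megabits_per_second:.1f} Mbit/s")
```

The result lands around 7 Mbit/s, consistent with the article's "high-quality video" framing.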