
Meta plans to develop an open-source AGI with a massive GPU arsenal


In the fiercely competitive landscape of artificial intelligence, Meta CEO Mark Zuckerberg has thrown his hat into the ring, announcing the company’s pursuit of open-sourcing general intelligence.

Following in the footsteps of OpenAI and Google DeepMind CEO Demis Hassabis, Zuckerberg aims to push the boundaries of AI capabilities by developing an Artificial General Intelligence (AGI), even though the industry lacks a universally agreed-upon definition of AGI.

A significant move in this direction is relocating Meta’s AI research group, FAIR (Fundamental AI Research), to the same division as the team building generative AI products across Meta’s applications.

The race for AI talent has never been more intense, with companies scrambling for a limited pool of skilled researchers and engineers. As The Verge reports, companies are offering lucrative compensation packages exceeding $1 million annually to secure top-tier talent and gain a competitive edge.

While Zuckerberg refrains from providing a specific timeline or definition for AGI, he clarifies that Meta is gearing up for a significant leap in computational power. By the end of the year, Meta is poised to own around 350,000 of Nvidia’s H100 GPUs, widely regarded as the industry’s preferred chips for developing generative AI, amounting to roughly 600,000 H100-equivalents of compute once the company’s other GPUs are counted.

Artificial General Intelligence (AGI) remains a theoretical concept.

“We’re building an absolutely massive amount of infrastructure to support this. By the end of this year, we’re going to have around 350,000 NVIDIA H100s, or around 600,000 H100 equivalents of compute, if you include other GPUs. We’re currently training Llama 3, and we’ve got an exciting roadmap of future models that we’re going to keep training responsibly and safely too,” said Zuckerberg.

All of this would be easier to assess if the industry had a standard definition of AGI. Even Zuckerberg acknowledges the industry-wide uncertainty regarding AGI’s precise nature and arrival.

“You can quibble about if general intelligence is akin to human-level intelligence, or is it like human-plus, or is it some far-future super intelligence. But to me, the important part is actually the breadth of it, which is that intelligence has all these different capabilities where you have to be able to reason and have intuition,” explained Zuckerberg.

Meta’s renewed emphasis on AGI is influenced by the release of Llama 2, its latest large language model. Although the company initially deemed the ability to generate code irrelevant to its applications, it recognised the importance of honing this skill for advancing AI capabilities.

Meta is now training Llama 3 and is set to incorporate code-generating capabilities, echoing Google’s recent Gemini model.

Even as Meta ramps up its efforts towards AGI development, a report earlier this week revealed that around 187,000 companies are feeding Meta’s data-hungry servers with people’s data.

In the News: X brings video and audio calls to Android users

Kumar Hemant

Deputy Editor at Candid.Technology. Hemant writes at the intersection of tech and culture and has a keen interest in science, social issues and international relations.
