Highlights:
- Developers leverage Lambda’s cloud platform for AI model training, fine-tuning, and inference.
- Lambda is set to launch the Scalar Server, an AI appliance designed for on-premises deployment in enterprise data centers.
Lambda Labs Inc., a startup specializing in cloud platforms optimized for artificial intelligence models, has recently secured USD 480 million in Series D funding.
The Series D funding round, co-led by Andra Capital and SGW, values Lambda at USD 2.5 billion. Notable participants include Nvidia Corp., Super Micro Computer Inc., and esteemed computer scientist Andrej Karpathy.
Stephen Balaban, Co-founder and Chief Executive Officer, said, “Lambda is investing billions of dollars to build the software platform and infrastructure powering AI.”
Developers utilize Lambda’s cloud platform to train, fine-tune, and run inference on AI models. The company has disclosed that its data centers host over 25,000 Nvidia GPUs, and it is currently integrating Nvidia’s latest Blackwell B200 GPUs into its cloud infrastructure.
Developers can rapidly deploy AI clusters with up to 512 Nvidia H100 chips using the platform’s 1-Click Clusters. This feature remains a key selling point: although the H100 (introduced in 2022) predates the Blackwell B200, it is still widely used.
Within a 1-Click Cluster, each GPU server is equipped with a 24-terabyte flash storage pool dedicated to AI application data. In addition, three standard CPU-powered servers handle inbound network traffic from applications and run the management software that coordinates the GPUs.
Lambda’s cloud servers utilize an internally developed software suite called Lambda Stack. Built upon the Ubuntu Linux distribution, Lambda Stack includes preconfigured versions of essential AI development tools. This integration automates much of the manual work typically involved in setting up and updating these tools, providing a seamless environment for AI developers.
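To illustrate the kind of ready-to-use environment this describes, below is a minimal sketch of a check a developer might run on a freshly provisioned machine to confirm the preinstalled framework can see the GPUs. It assumes PyTorch is among the preconfigured tools (an assumption based on the article’s description of Lambda Stack, not a documented guarantee) and uses only standard PyTorch APIs.

```python
# Minimal environment check on a machine with preinstalled AI tooling
# (e.g., a Lambda Stack image). Assumes PyTorch is among the preconfigured
# frameworks -- an assumption for illustration, not a confirmed detail.
import torch

print(f"PyTorch version: {torch.__version__}")

if torch.cuda.is_available():
    # Enumerate the GPUs visible to the framework and report their names.
    for idx in range(torch.cuda.device_count()):
        print(f"GPU {idx}: {torch.cuda.get_device_name(idx)}")

    # Run a small matrix multiply on the first GPU to confirm that the
    # CUDA toolchain and drivers are wired up correctly.
    x = torch.randn(1024, 1024, device="cuda")
    y = x @ x
    print(f"GPU matmul OK, result shape: {tuple(y.shape)}")
else:
    print("No CUDA-capable GPU detected by PyTorch.")
```

On a correctly configured image, a check like this should report the driver-visible GPUs without any manual CUDA or framework installation, which is the manual setup work the article says Lambda Stack automates.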
In addition to its flagship cloud platform, Lambda offers a range of AI-optimized hardware products.
Lambda is preparing to release the Scalar Server, an AI appliance designed for on-premises deployment in enterprise data centers. The system ships with eight H100 graphics cards. Lambda additionally offers AI appliances built on Nvidia’s DGX appliance architecture, which likewise feature eight GPUs.
Beyond its server offerings, Lambda also provides the Vector series of AI-optimized desktop computers. These machines are designed for developers and can be used for tasks such as testing newly trained AI models. The most capable model in the series, the Vector Pro, ships with four Nvidia chips and a collection of preinstalled AI development tools.
Lambda plans to use its latest funding to enhance its cloud infrastructure and software, with a key focus on Lambda Chat, a service that provides free access to open-source large language models.