
Software-Defined Infrastructure Improves Utilization

Composable infrastructure gives enterprises cloud flexibility for 70% less than public clouds.


The IT Press Tour met with Sumit Puri, CEO of Liqid, to learn how the company is helping data centers evolve.


Sumit founded Liqid seven years ago with the vision of helping transform the data center into software-defined infrastructure by building two primary pieces of technology: first, a software layer that orchestrates off-the-shelf hardware, and second, a fabric interconnect technology. The company's mission is to turn statically configured servers into racks of dynamically configurable infrastructure by integrating with the frameworks that already exist.


The Liqid Matrix software and fabric technology were created to deliver composable infrastructure. This software-defined infrastructure reduces OpEx by improving utilization 2 to 3X. Markets with immediate demand include AI, cloud, high-performance computing, and the edge.


Data centers are transforming. Gartner believes static infrastructure is dead because it cannot accelerate and scale at the rate growing enterprises require. Dynamically configurable infrastructure is the future, and the software-defined data center is the endpoint. Disaggregation and software composability will be the two pillars supporting a successful transition.


Disaggregated Composable Infrastructure


With disaggregated composable infrastructure, everything is a la carte. Liqid takes pools of resources -- compute, storage, GPUs, FPGAs, and networking cards -- and puts them on a switch fabric. Software then dynamically composes servers based on the customer's compute needs. The server is dynamic: when a user needs another GPU for their application, they reprogram the fabric and add or remove devices from the server depending on what the application layer needs.
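

In code terms, the idea looks roughly like the sketch below -- a minimal illustration with hypothetical pool and server objects, not the actual Liqid Matrix API:

```python
# A minimal sketch of composability (hypothetical objects, not the Liqid
# Matrix API): devices sit in shared pools on the fabric, and a "server"
# is simply the set of devices currently bound to a CPU node.
from dataclasses import dataclass, field

@dataclass
class FabricPool:
    gpus: list = field(default_factory=lambda: ["gpu0", "gpu1", "gpu2", "gpu3"])
    nvme: list = field(default_factory=lambda: ["ssd0", "ssd1"])

@dataclass
class ComposedServer:
    host: str
    devices: list = field(default_factory=list)

def attach(pool_devices, server, count):
    """Move devices from the shared pool into the server's device tree."""
    for _ in range(count):
        server.devices.append(pool_devices.pop())

def release(pool_devices, server):
    """Return the server's devices to the free pool for the next workload."""
    pool_devices.extend(server.devices)
    server.devices.clear()

pool = FabricPool()
node = ComposedServer(host="node-07")
attach(pool.gpus, node, count=2)   # the application now sees two extra GPUs
release(pool.gpus, node)           # ...which go back to the pool when it is done
```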


Customers can build previously impossible server configurations, such as a top-end AI system with 16 Nvidia A100 GPUs, 16 Mellanox networking devices, and 16 storage devices connected to a single system with peer-to-peer technology enabled.


Many enterprises are evaluating whether they want to own infrastructure or give all their money to a public cloud provider. Liqid gives customers that need to own infrastructure cloud-like agility. Bank of America saved $2 billion in 2020 by using its own infrastructure rather than AWS, and Nike is exploring alternatives to a $700 million per year public cloud bill.


Liqid is not competing with any provider. They strive to improve every infrastructure footprint with their fabric and software. Customers can use any hardware, software, tools, drivers, and application layer. If something works on a static server, it will work on a composed server. The same hardware goes from being static to dynamically configurable. Enterprises achieve cloud flexibility for 70% less cost per year by improving utilization.


Infrastructure Use Cases


Dynamic cloud is a popular use case. On-prem infrastructure today is a game of matching servers to workloads, and small workloads on big servers waste infrastructure. On-prem cloud users want to study the inbound workload, spin up exactly what they need to get it done, and then, as soon as the workload finishes, free the $10,000 GPU back to the free pool.
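

That lifecycle looks roughly like the following sketch, using a hypothetical gpu_lease() helper rather than any real orchestrator API:

```python
# A minimal sketch of the lease-and-release pattern (hypothetical helper,
# not a real orchestrator API): a GPU is borrowed from the free pool only
# for the lifetime of the job and returned the moment the job finishes.
from contextlib import contextmanager

free_gpus = ["gpu0", "gpu1", "gpu2", "gpu3"]   # shared pool on the fabric

@contextmanager
def gpu_lease(count):
    leased = [free_gpus.pop() for _ in range(count)]
    try:
        yield leased                    # the workload runs with these GPUs
    finally:
        free_gpus.extend(leased)        # freed as soon as the workload ends

with gpu_lease(count=1) as gpus:
    print(f"job running on {gpus}; {len(free_gpus)} GPUs still free")
print(f"job finished; {len(free_gpus)} GPUs back in the free pool")
```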


For example, media and entertainment companies in Hollywood have daytime and nighttime workloads against a shared pool of GPUs. During the day, the GPU pool goes to the AI team. At night, when the team goes home, a policy reprograms the fabric and the same GPUs do video rendering. Dynamic reallocation of resources drives greater hardware utilization.


Liqid recently replaced Cray in two Department of Defense (DoD) contracts for composable systems with a combined 15 petaflops to run physics-based, AI, and ML applications for the US High-Performance Computing Modernization Program. The system ranks among the top 20 most powerful high-performance computing (HPC) platforms in the world.


Liqid is working with multiple telco customers on edge use cases. The edge is constrained by power, floor space, and cooling. Dynamic reconfiguration can take a rack of infrastructure that is 20% utilized and push utilization to 50 or 60%, essentially cutting infrastructure needs, power, floor space, and cooling in half while also removing the need to send a human to the edge -- a difficult and expensive proposition.
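

Back-of-the-envelope arithmetic with the figures above (assumed numbers, not Liqid's sizing math) shows why the claim holds:

```python
# For a fixed amount of work, the hardware (and its power, floor space, and
# cooling) needed scales inversely with utilization. Assumed numbers only.
baseline_util = 0.20    # statically provisioned edge rack, 20% utilized
composed_util = 0.55    # midpoint of the 50-60% reached after recomposition

hardware_needed = baseline_util / composed_util
print(f"Hardware required: {hardware_needed:.0%} of the original footprint")
# -> roughly 36%, i.e. better than "cut in half"
```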


A lot of edge use cases are around retail. The future of retail is not standing in front of a human to transact. The future of retail is a consumer grabbing a bottle of mustard and walking out of the store: a camera sees the mustard, recognizes the shopper's face, and sends a bill to their iPhone. All of that is driven by AI at the edge. The transaction can't happen in real time if the data packet has to travel back to the core data center -- it doesn't work. Those workloads are driving infrastructure at the edge.


Another use case shared by an edge customer is the autonomous forklift. Pretty soon, forklifts in distribution centers will drive around without a human. If the response time for a data packet ends up being two seconds, that forklift can run over and kill somebody. That kind of latency is unacceptable in autonomous vehicles.


The Future of Infrastructure


Sumit believes we are moving from user-defined to policy-based infrastructure. Right now, people specify the size and features of their servers. In the future, people will tell their servers how to act: when this server goes above 80% storage used, provide more storage; after 10:00 pm, when GPU use by humans is down, provision GPUs for automated processing for eight hours.
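

Expressed as code, such policies might look something like this sketch, with a hypothetical compose_request() helper standing in for the fabric orchestrator (not Liqid's actual policy language):

```python
# A minimal sketch of policy-based composition (hypothetical rules and helper).
from datetime import datetime, time

def compose_request(server, device, count):
    # Stand-in for a real call to the fabric orchestrator.
    print(f"{server}: attach {count} x {device}")

def evaluate_policies(server, storage_used_pct, now):
    # Rule 1: when storage use crosses 80%, attach another NVMe drive.
    if storage_used_pct > 80:
        compose_request(server, "nvme", 1)
    # Rule 2: after 10:00 pm, hand idle GPUs to automated batch processing.
    if now.time() >= time(22, 0):
        compose_request(server, "gpu", 4)

evaluate_policies("ai-node-01", storage_used_pct=85.0,
                  now=datetime(2022, 6, 1, 22, 30))
```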


Eventually, we will have a machine-based environment where AI studies the usage patterns of applications and hardware and dictates how hardware is reconfigured and recomposed, ultimately removing the human from the equation.


