Our Investment in TidalScale & the Data Center of Tomorrow

By Louis Rajczi

Every day, new applications are being developed that either process or generate more data than existed on Planet Earth 50 years ago, and the pace of that development shows no sign of slowing down. Many factors are contributing to this trend, only some of which we will touch on here, including:

  • The continued decrease in computing costs, making it cheaper than ever to develop, test, and run advanced applications;

  • The use of artificial intelligence (AI) and machine learning (ML) to help solve complex business problems; and

  • The proliferation of connected Internet of Things (IoT) devices, most of which generate a tremendous amount of data and whose primary purpose is to provide (near) real-time information to central systems and applications.


While the explosion in advanced applications has undoubtedly driven improved business, health, and safety insights, it has also sparked an insatiable appetite for “more” from business leaders, researchers, and consumers. As with anything, though, “more” comes at a cost. More accurate insights require more data with which to train applications, which in turn requires more computing horsepower to process. Unfortunately, the widespread demand for compute power is now growing at a rate that outpaces technical innovation in memory, processing, and I/O. This tends either to limit the types of problems that can economically be addressed or to force companies to manipulate their data through techniques like sharding, which allows large datasets to be processed with more modest resources, but at the cost of potentially removing an application’s ability to find an optimal solution.
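To make that trade-off concrete, here is a toy sketch (entirely our own illustration; the dataset and the “closest pair” task are hypothetical, not drawn from any real workload). Sharding lets each worker handle a piece of the data that fits in modest memory, but an answer that spans a shard boundary can be missed entirely:

```python
# Toy illustration of the sharding trade-off. The dataset and the
# "closest pair" task are hypothetical, chosen only to show how a
# per-shard answer can miss the global optimum.
from itertools import combinations

def closest_pair(points):
    """Brute-force search for the two closest 1-D points."""
    return min(combinations(points, 2), key=lambda pair: abs(pair[0] - pair[1]))

data = [1.0, 4.0, 8.0, 9.95, 10.0, 15.0, 22.0, 30.0]

# With enough memory for the whole dataset, we find the true optimum.
print(closest_pair(data))                    # (9.95, 10.0)

# Sharding lets each worker fit its piece in modest memory, but the
# best pair straddles the shard boundary and is never considered.
shards = [data[:4], data[4:]]
best_per_shard = [closest_pair(shard) for shard in shards]
print(min(best_per_shard, key=lambda pair: abs(pair[0] - pair[1])))  # (8.0, 9.95)
```

Recovering the global answer would require additional cross-shard passes and coordination, which is precisely the kind of cost alluded to above.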

In evaluating this problem, we found ourselves looking back to the 1990s, when the inverse situation was true: massive advances in the cost effectiveness of compute processing had outstripped the computing needs of most applications. Companies like VMware capitalized on that trend by providing environments in which multiple applications could run on virtual systems and share resources, each behaving as if it were running on its own server. As the needs of applications changed, the virtual system adapted as well, ensuring that any infrastructure investment increased overall utilization and improved the cost effectiveness of business operations.

With history as our guide, one might imagine that an ideal solution to the current problem could involve turning the ‘VMware solution’ on its head: “stack” the resources of multiple servers together and make those resources available to applications as needed, in such a way that applications are unable to distinguish between a virtual stacked environment and a supercomputer platform. This is the exact approach taken by our newest portfolio company, TidalScale, based in Campbell, CA.

Our fundamental belief is that TidalScale’s core solution, which the company calls “Software-Defined Servers,” will change the way the data center of tomorrow is structured. Software-Defined Servers pool multiple commodity systems into a single virtual machine, matching the size of the computing resources to the needs of an application. In other words, for the first time, organizations can use existing servers to create (on demand and without any modification required to their operating systems or applications) server capacity of virtually any size. TidalScale’s solution works for both on-premises and cloud data centers, allowing maximum flexibility in today’s hybrid world.
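As a rough mental model only (this is our own sketch, not TidalScale’s implementation or API; the node specifications and function names are hypothetical), one can think of a software-defined server as stacking commodity nodes until their pooled resources cover an application’s request:

```python
# Hypothetical sketch: aggregating commodity nodes into one right-sized
# virtual server. This models the sizing concept only, not TidalScale's
# actual product or API.
from dataclasses import dataclass

@dataclass
class Node:
    cores: int
    ram_gb: int

def provision(nodes, need_cores, need_ram_gb):
    """Stack nodes until the pooled resources cover the request."""
    pooled_cores = pooled_ram = 0
    for node in nodes:
        if pooled_cores >= need_cores and pooled_ram >= need_ram_gb:
            break
        pooled_cores += node.cores
        pooled_ram += node.ram_gb
    if pooled_cores < need_cores or pooled_ram < need_ram_gb:
        raise RuntimeError("pool too small for requested virtual server")
    # The application sees a single machine with the combined resources.
    return Node(cores=pooled_cores, ram_gb=pooled_ram)

pool = [Node(16, 256), Node(16, 256), Node(32, 512)]
vm = provision(pool, need_cores=40, need_ram_gb=700)
print(vm)  # Node(cores=64, ram_gb=1024): one 'software-defined server'
```

The hard part in practice, of course, is making the pooled memory and I/O behave coherently across physical machines without modifying the guest operating system; the sketch above captures only the sizing idea.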

While, on the surface, TidalScale might seem to be simply solving an enterprise infrastructure problem, we are far more bullish on its potential impact. The applications described in the opening paragraph of this post do not reside only in the realm of esoteric, niche-focused scientific research. These applications have the potential to affect everyone: from advances in healthcare to traffic and pollution management, from learning and training to wildlife conservation, from energy management to security. To the extent that TidalScale’s server virtualization solution improves the processing, memory, and I/O capabilities of tomorrow’s data centers, we will be able to continue asking more of tomorrow’s most cutting-edge applications.

We are excited to be supporting a world-class team at TidalScale, which includes industry veterans and pioneers like Gary Smerdon, Ike Nassi, and so many others. We are also honored to be part of a growing roster of supportive investors and strategic partners like Bain Capital Ventures, Hummer Winblad, Infosys, SK Hynix, Sapphire Ventures, and Samsung. The future looks bright for TidalScale, and we look forward to the new insights and improvements companies can unlock with this technology.