A new requirement for the evolution of computing and analytics

AI storage is needed as the boundaries of traditional computing and analytics move into a new era


The world of supercomputing has grown rapidly to incorporate AI, advanced data analytics and cloud computing. The era of serial data access is coming to an end, with parallel file systems replacing traditional network file system (NFS) deployments.
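The difference between serial and parallel data access can be illustrated with a short, generic sketch in Python. This is a hypothetical illustration, not DDN's API or a real parallel file system client: the file is split into fixed-size chunks that workers fetch concurrently, which is the access pattern a parallel file system can serve from many storage targets at once, while an NFS mount funnels every read through a single server.

```python
# Illustrative sketch of chunked, concurrent reads (hypothetical example,
# not any vendor's API). A parallel file system can serve each chunk from
# a different storage target; over NFS, all reads contend for one server.
import os
import tempfile
from concurrent.futures import ThreadPoolExecutor

CHUNK = 1024 * 1024  # 1 MiB per read


def read_chunk(path, offset, size):
    """Read one chunk at a given offset (each worker does this concurrently)."""
    with open(path, "rb") as f:
        f.seek(offset)
        return f.read(size)


def parallel_read(path, workers=8):
    """Split the file into fixed-size chunks and fetch them concurrently."""
    size = os.path.getsize(path)
    offsets = range(0, size, CHUNK)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # map() preserves chunk order, so joining reassembles the file.
        parts = pool.map(lambda off: read_chunk(path, off, CHUNK), offsets)
    return b"".join(parts)


if __name__ == "__main__":
    # Quick self-check against an ordinary serial read of a temp file.
    with tempfile.NamedTemporaryFile(delete=False) as tmp:
        tmp.write(os.urandom(3 * CHUNK + 123))
        path = tmp.name
    assert parallel_read(path) == open(path, "rb").read()
    os.unlink(path)
```

The speed-up depends entirely on the backend: with a parallel file system each concurrent read can land on a different storage target, while the same code over a single NFS server gains little.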

This change has matched the rise of AI, with investment in the technology hitting a new high in 2021. Microsoft, for example, invested $1 billion in an artificial intelligence venture co-founded by Elon Musk.

The AI imperative

This shifting of the boundaries of traditional computing and analytics has resulted in several data issues that need to be addressed:

Data skills – there is a need to find new data science talent and to maintain and update skills in a rapidly changing software environment.

Information source – there is a need to ingest large volumes of data from wide-area sources via a variety of ingestion methods at rates well beyond traditional computing requirements.

Data processing – another type of data processing is needed to implement large-scale GPU environments to provide the parallelism needed for real-time training and inference.

Data governance – there is a need to label, track and manage this data (forever), and to share it between organizations under the right security policies, with explainable AI at the application level and data availability at the platform level.
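The data-processing requirement above can be sketched in the same generic way. In this hypothetical Python example, threads stand in for GPU workers: many consumers pull batches concurrently, which is the load pattern that large-scale training and inference impose on the data layer.

```python
# Minimal sketch of batch-parallel processing (hypothetical example).
# Threads stand in for GPU workers; the point is that many consumers
# process batches at once, so storage must sustain concurrent access.
from concurrent.futures import ThreadPoolExecutor


def process_batch(batch):
    """Stand-in for one training or inference step on a batch of samples."""
    return sum(x * x for x in batch)


def run_data_parallel(samples, batch_size=4, workers=4):
    """Split samples into batches and process them concurrently."""
    batches = [samples[i:i + batch_size]
               for i in range(0, len(samples), batch_size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # Results come back in batch order, independent of worker timing.
        return list(pool.map(process_batch, batches))


if __name__ == "__main__":
    print(run_data_parallel(list(range(8)), batch_size=4, workers=2))
    # two batches: [0, 1, 2, 3] -> 14 and [4, 5, 6, 7] -> 126
```

In a real cluster the workers would be GPUs running training steps, but the consequence for storage is the same: aggregate read throughput must keep every worker fed at once.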

Data Storage in the Age of AI

During The IT Press Tour in San Francisco, James Coomer, senior vice president of products at DDN, explained that “data is the source code of AI, data is imperative for AI, and storage is imperative for AI.

“Storage cannot be an afterthought because it is essential for data ingestion, provisioning, management, labeling and longevity, which is essential for AI.”

AI storage is needed to address the emerging data issues caused by the shift away from traditional computing and analytics:

  1. Data skills: AI storage provides data scientists with streamlined, concurrent and continuous workflows.
  2. Information source: AI storage helps scale projects cost-effectively, with lightning-fast data ingestion speeds.
  3. Data processing: AI storage offers tight AI integration with optimized performance.
  4. Data governance: AI storage secures a silo-free approach with advanced workload insights.

AI storage in action

DDN Storage has historically focused on traditional data storage for unstructured data and big data in the enterprise, government, and academic sectors.

Today, the company is redefining the imperatives that drive it, focusing on AI storage with its A³I solution, which is at the heart of its growth strategy.

In practice, DDN has served for the past two years as the primary storage provider for NVIDIA, delivering the performance, scalability and flexibility needed to drive innovation.

According to Karl Freund, an analyst at Cambrian AI Research, NVIDIA controls “nearly 100%” of the AI algorithm training market and operates several AI clusters of its own.

Following this success, DDN is powering the UK’s most powerful supercomputer, Cambridge 1, which went live in 2021 and is focused on transforming AI-powered healthcare research.

The AI storage provider is also working with Recursion, the drug discovery company.

“Our large-scale data needs require fast ingestion, optimized processing, and reduced application runtimes,” said Kris Howard, systems engineer at Recursion.

Working with DDN, the drug discovery company realized up to 20x lower costs and increased opportunities to accelerate the drug discovery pipeline with new levels of AI capability.

Recursion previously ran these workloads in the cloud, but now runs them more efficiently, and at better value for money, on-premises.

“DDN has pioneered large-scale data acceleration to tackle what ordinary storage can’t. We create data environments for innovators to create the future. We are the world’s largest AI storage provider, proven across a range of different industries and customers, from financial services to life sciences,” Coomer added.

Case studies: in detail

1. Transforming Cancer Care with Managed Services: DDN as a Service for Precision Oncology


  • Seeks to defeat cancer on a global scale through proprietary blood tests, massive datasets and advanced analytics.
  • Uses bioinformatics and HPC to analyze genome sequence data for liquid biopsies.

Data challenge:

  • Demanding performance and reliability requirements.
  • Previous systems suffered major hardware failures.

The DDN solution:

  • Parallel file system storage as a managed service.
  • Advantages: evergreen upgrades, capacity and performance on demand, and an opex model.

2. Simplify data management for a global financial services and venture capital firm


  • A financial organization centered on mathematics and programming, bringing a scientific approach to financial products.

Data challenge:

  • Metadata-heavy applications strained existing NFS-based storage performance.
  • A dataset of over 50 PB presented a management challenge for the existing systems.

The DDN solution:

  • Efficient performance from fewer systems – easier to manage and scale in the future.
  • DDN’s consultative approach and data expertise helped the firm identify and solve its specific problems.

3. Transform research data storage: from management and maintenance to universal resources


  • A leading academic life science research organization in California.
  • Provides data storage and management for a wide variety of projects and requirements.

Data challenge:

  • The previous system was on-premises and entirely self-managed.
  • Scalability, performance, and stability demands exceeded its capabilities.

The DDN solution:

  • Simplified and reliable infrastructure with a roadmap for growth and additional capabilities.
  • Easy access for researchers, no need to change workflows.
  • Plans to further accelerate research with GPU clusters aided by DDN expertise.
