Model Zoo Overview#
The Cerebras Model Zoo is a comprehensive repository of deep learning models optimized for Cerebras hardware. The collection showcases best practices for constructing models that exploit the capabilities of the Cerebras Wafer-Scale Cluster. With a focus on models developed in PyTorch, the Model Zoo provides detailed instructions for deploying neural network jobs on Cerebras hardware, guiding users through the compile, validate, and train workflow.
Note
From Release 2.2.0 onwards, the Cerebras Model Zoo has been reorganized to enhance clarity and usability. Users should note that this reorganization has resulted in updated paths that may require adjustments in previous projects.
List of models#
The Cerebras Model Zoo includes models spanning the NLP, vision, and multimodal domains.
To deploy your neural network jobs on the Cerebras Wafer-Scale Cluster, refer to the Quick start guide. This guide will walk you through the necessary steps to compile, validate, and train models from this Model Zoo using your preferred framework.
Model portability#
The Cerebras Model Zoo facilitates model portability, providing tools that help users adapt existing models or craft new ones with Cerebras APIs. It supports a spectrum of users, from beginners to advanced, with varying levels of integration and customization:
Beginners
Start with the Cerebras data preprocessing tools and the model implementations found in the Cerebras Model Zoo.
Intermediate users
Integrate your own data preprocessing methods by referring to the section Create your own data preprocessing.
Advanced users
Define your own PyTorch model or code using the run function in the Cerebras Model Zoo and the Supported Operations API.
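To make the advanced path concrete, the sketch below shows, in plain Python, the general shape of a run-style entry point that wraps a user-defined model. All names here (run, model_fn, data_fn, ToyModel) are hypothetical and only illustrate the pattern; consult the Model Zoo source for the actual run function signature.

```python
# Hypothetical sketch of a run-style entry point accepting a
# user-defined model; names are illustrative, not the real API.

def run(model_fn, data_fn, params):
    """Build the model from params, then step over batches."""
    model = model_fn(params)
    losses = []
    for batch in data_fn(params):
        losses.append(model.step(batch))
    return losses

class ToyModel:
    """Stand-in for a user-defined PyTorch model."""
    def __init__(self, scale):
        self.scale = scale

    def step(self, batch):
        # A stand-in for a forward/backward pass: return a fake "loss".
        return self.scale * sum(batch)

params = {"scale": 0.5, "batches": [[1, 2], [3, 4]]}
history = run(
    model_fn=lambda p: ToyModel(p["scale"]),
    data_fn=lambda p: iter(p["batches"]),
    params=params,
)
print(history)  # → [1.5, 3.5]
```

The point of the pattern is that the training loop stays in the framework while the user supplies only the model and data callables.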
Modules overview#
The Model Zoo’s structure is designed to be user-friendly, offering an organized array of models, datasets, tools, and utilities. Key features include:
Reorganized models and datasets categorized by classes (NLP, Vision, Multimodal).
Registry APIs that enable querying of paths and supported combinations for each model.
Config classes designed to encapsulate parameters in YAML files, providing a clear class hierarchy and ensuring the validation of configurations used with models.
Enhanced data preprocessing tools for NLP models, significantly improving performance.
With the reorganization, paths within the Model Zoo have been updated. Users should refer to the updated documentation to align their projects with the new structure.
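To illustrate the idea behind config classes (the actual classes in the Model Zoo are richer), a minimal sketch using Python dataclasses shows how parameters parsed from a YAML file can be validated against a clear class hierarchy. The field names below are hypothetical:

```python
# Minimal sketch of config-class-style validation using stdlib
# dataclasses; field names are hypothetical and only illustrate the
# validate-on-construction idea behind the Model Zoo's config classes.
from dataclasses import dataclass

@dataclass
class OptimizerConfig:
    learning_rate: float
    weight_decay: float = 0.0

    def __post_init__(self):
        # Reject invalid parameters as soon as the config is built.
        if self.learning_rate <= 0:
            raise ValueError("learning_rate must be positive")

@dataclass
class TrainConfig:
    batch_size: int
    optimizer: OptimizerConfig

def from_dict(params):
    """Build a TrainConfig from a nested dict (as parsed from YAML)."""
    opt = OptimizerConfig(**params["optimizer"])
    return TrainConfig(batch_size=params["batch_size"], optimizer=opt)

cfg = from_dict({"batch_size": 32, "optimizer": {"learning_rate": 1e-3}})
print(cfg.optimizer.learning_rate)  # → 0.001
```

Encapsulating YAML parameters this way surfaces typos and out-of-range values at load time rather than mid-training.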
The structure of the Cerebras Model Zoo is illustrated in the diagram below:
List of modules in the Model Zoo#
The principal modules in the Cerebras Model Zoo include:
Configuration classes and base-level configuration (learn more about config classes here).
Data processing scripts and loaders for vision and NLP models.
Layers and modules for model building and adaptation.
A variety of loss functions for different model training phases.
Implementations of models across the NLP, vision, and multimodal domains.
Tools for the data preprocessing stages.
Utilities for model conversion and configuration.
Common utilities shared across different models.
The Trainer API, which facilitates training and validating Model Zoo models.
This redesign aims to streamline the user experience, making it easier for ML developers and researchers to explore, experiment, and develop solutions efficiently within the Cerebras ecosystem. The documentation further enriches this experience, offering examples and guidance on utilizing the Model Zoo’s Registry APIs and Config Classes, thereby enhancing the model development and deployment process.
Updated directory paths in Model Zoo as of R2.2.0#
The paths in the recently restructured Cerebras Model Zoo have been updated. Below is a table that compares the new paths with their corresponding older versions.
| Old Path (relative to $MODELZOO) | New Path (relative to $MODELZOO) |
|---|---|
| common/pytorch/*, common/run_utils/*, common/model_utils/* | common/ |
| common/pytorch/layers | layers/ |
| common/pytorch/input, common/input | data/ |
| models/nlp/*, vision/pytorch/*, multimodal/pytorch/* | models/nlp/*, models/vision/*, models/multimodal/* |
| data_preparation/* | data_preparation/ |
| common/pytorch/model_utils/ | tools/ |
| fc_mnist/ | fc_mnist/ |
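Projects pinned to the pre-2.2.0 layout can be updated mechanically. The sketch below encodes the fixed-prefix rows of the table above as a lookup; wildcard rows such as models/nlp/* map file by file and are omitted, and the filename in the usage line is hypothetical:

```python
# Old-to-new path prefixes taken from the R2.2.0 reorganization table.
# Wildcard entries (e.g. models/nlp/*) are omitted: they map file-by-file.
PATH_MAP = {
    "common/pytorch/layers": "layers/",
    "common/pytorch/input": "data/",
    "common/input": "data/",
    "common/pytorch/model_utils/": "tools/",
}

def migrate(path):
    """Rewrite a path that starts with a known old prefix."""
    for old, new in PATH_MAP.items():
        if path.startswith(old):
            return new + path[len(old):].lstrip("/")
    return path  # unchanged if no prefix matches

# "attention.py" is an illustrative filename, not a real Model Zoo file.
print(migrate("common/pytorch/layers/attention.py"))  # → layers/attention.py
```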