Get transparent insights into Vertex AI Workbench pricing. Explore its pay-as-you-go model for powerful notebook instances, helping you scale your AI/ML projects efficiently on Google Cloud.
Confidential Vertex AI Workbench
Google Cloud is extending Vertex AI’s support for Confidential Computing. Vertex AI Workbench customers can now strengthen their data privacy posture with Confidential Computing, which is currently in preview. With a few clicks, this integration adds an extra layer of privacy and data protection.
Vertex AI Notebooks
Choose between Vertex AI Workbench and Colab Enterprise. Use the full Vertex AI platform to take data science projects from data discovery to prototype to production.
Advantages
- Easy exploration and analysis: Streamlined in-notebook data access and machine learning through integrations with BigQuery, Dataproc, Spark, and Vertex AI (see the query sketch after this list).
- Rapid prototyping and model development: Move from data to training at scale with Vertex AI Training, drawing on elastic compute for exploration and prototyping.
- End-to-end notebook workflows: You can execute your training and deployment processes on Vertex AI from a single location by using Vertex AI Workbench or Colab Enterprise.
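As one concrete illustration of in-notebook data access, the sketch below queries a public BigQuery dataset from a Workbench or Colab Enterprise notebook with the google-cloud-bigquery Python client; the project ID is a placeholder and the query itself is arbitrary.

```python
# Minimal sketch: query BigQuery from a notebook and load the result
# into a pandas DataFrame. Assumes the notebook runs with default
# application credentials; the project ID below is a placeholder.
from google.cloud import bigquery

client = bigquery.Client(project="your-gcp-project")  # hypothetical project ID

sql = """
    SELECT name, SUM(number) AS total
    FROM `bigquery-public-data.usa_names.usa_1910_current`
    GROUP BY name
    ORDER BY total DESC
    LIMIT 10
"""

df = client.query(sql).to_dataframe()  # runs the query and pulls results locally
print(df.head())
```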
Important characteristics
Colab Enterprise: Colab Enterprise combines the Colab notebook experience created by Google Research, used by more than 7 million data scientists, with Google Cloud enterprise-grade security and compliance. Quickly get started in a collaborative, serverless, zero-configuration environment.
AI-powered coding assistance, such as code completion and code generation, makes building AI/ML models in Python simpler by cutting down on repetitive code so you can concentrate on your data and models.
Vertex AI Workbench: Advanced customization features and a JupyterLab experience are offered by Vertex AI Workbench.
Fully managed compute: Vertex AI Workbench notebooks offer enterprise-ready computing infrastructure that is fully managed, scalable, and equipped with user management and security features.
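As a rough sketch of programmatic provisioning, the snippet below creates a notebook instance with the google-cloud-notebooks client library (the v1, user-managed notebooks API); the project, zone, and instance names are placeholders, and newer Workbench instance types may use a different API surface.

```python
# Minimal sketch: create a notebook instance via the Notebooks API
# (google-cloud-notebooks, v1 / user-managed notebooks). Project, zone,
# and instance name are placeholders.
from google.cloud import notebooks_v1

client = notebooks_v1.NotebookServiceClient()

instance = notebooks_v1.Instance(
    machine_type="n1-standard-4",
    vm_image=notebooks_v1.VmImage(
        project="deeplearning-platform-release",
        image_family="common-cpu-notebooks",
    ),
)

operation = client.create_instance(
    parent="projects/your-gcp-project/locations/us-central1-a",
    instance_id="example-workbench-instance",
    instance=instance,
)
print(operation.result())  # blocks until the instance is provisioned
```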
Interactive data and ML experience: Easily connect to Google Cloud’s big data solutions to explore data and train machine learning models.
Portal to complete end-to-end ML training: Develop and deploy AI solutions on Vertex AI with minimal transition.
Simplified data access: Extensions give easy access to your entire data estate, including BigQuery, Data Lake, Dataproc, and Spark. Easily scale up or scale out to meet your AI and analytics requirements.
Explore data sources through a catalogue: Write SQL and Spark queries directly from a notebook cell, with auto-complete and syntax awareness.
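For example, once the BigQuery cell magic that ships with the google-cloud-bigquery library is loaded, a notebook cell can contain plain SQL; the two blocks below are separate IPython notebook cells, the dataset is a public sample, and the result variable name is arbitrary.

```python
# Run once per notebook session to register the BigQuery cell magic
# (it ships with the google-cloud-bigquery library).
%load_ext google.cloud.bigquery
```

```python
%%bigquery top_names
-- The %%bigquery magic runs this SQL and stores the result in a
-- pandas DataFrame named top_names.
SELECT name, SUM(number) AS total
FROM `bigquery-public-data.usa_names.usa_1910_current`
GROUP BY name
ORDER BY total DESC
LIMIT 5
```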
Data visualization: Integrated, intelligent visualization tools make it simple to surface insights from your data.
Cost-effective, hands-off infrastructure: Compute is fully managed for you, and auto shutdown and idle timeout help keep total cost of ownership down.
Simplified enterprise security: Built-in Google Cloud security controls, with easy authentication and single sign-on to other Google Cloud services.
Spark and data lakes in one place: Run your engine of choice from Vertex AI Workbench, including TensorFlow, PyTorch, and Spark.
Deep integration with MLOps and training: Connect notebooks to well-established operations workflows with a few clicks. Use notebooks for hyperparameter optimization, scheduled or triggered continuous training, or distributed training. Deep integration with Vertex AI services lets you implement MLOps from the notebook without additional processes or code rewriting.
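As a hedged sketch of what launching managed training from a notebook can look like, the snippet below submits a Vertex AI custom training job with the google-cloud-aiplatform SDK; the project, bucket, script name, and container URI are placeholders.

```python
# Minimal sketch: submit a Vertex AI custom training job from a notebook
# using the google-cloud-aiplatform SDK. Project, bucket, script, and
# container URI are placeholders.
from google.cloud import aiplatform

aiplatform.init(
    project="your-gcp-project",
    location="us-central1",
    staging_bucket="gs://your-staging-bucket",
)

job = aiplatform.CustomTrainingJob(
    display_name="notebook-launched-training",
    script_path="train.py",  # local training script in the notebook environment
    # Illustrative prebuilt training container URI; substitute a current one.
    container_uri="us-docker.pkg.dev/vertex-ai/training/tf-cpu.2-12.py310:latest",
)

job.run(
    machine_type="n1-standard-4",
    replica_count=1,
    args=["--epochs", "5"],
)
```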
Smooth CI/CD: With Kubeflow Pipelines integration, notebooks are a proven, battle-tested deployment target.
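To illustrate the Kubeflow Pipelines side, here is a minimal KFP v2 pipeline authored and compiled in a notebook; the names are placeholders, and the compiled spec could subsequently be submitted to Vertex AI Pipelines (for example with aiplatform.PipelineJob).

```python
# Minimal sketch: define and compile a Kubeflow Pipelines (KFP v2)
# pipeline from a notebook. Names are placeholders; the compiled JSON
# can be submitted to Vertex AI Pipelines afterwards.
from kfp import dsl, compiler


@dsl.component
def say_hello(name: str) -> str:
    message = f"Hello, {name}!"
    print(message)
    return message


@dsl.pipeline(name="notebook-authored-pipeline")
def hello_pipeline(recipient: str = "Vertex AI"):
    say_hello(name=recipient)


compiler.Compiler().compile(
    pipeline_func=hello_pipeline,
    package_path="hello_pipeline.json",
)
```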
Notebook viewer: Share regularly updated notebook cell output for reporting and record-keeping purposes.
Vertex AI Workbench Pricing
Pricing for Vertex AI Workbench instances depends on the virtual machine (VM) configuration you select: the total price is the sum of the costs of the VMs you use. When you use Compute Engine machine types and attach accelerators, the accelerator cost is the accelerator price multiplied by the number of machine hours. A worked example follows the tables below.
Your Vertex AI Workbench instance is charged according to its state:
CPU and accelerator usage is billed while the instance is in the STARTING, PROVISIONING, ACTIVE, UPGRADING, ROLLBACKING, RESTORING, STOPPING, or SUSPENDING states.
CPU
Machine Type | Price per vCPU/hour (USD) |
---|---|
N1 | $0.0439224 |
N2 | $0.0439224 |
E2 | $0.0303065 |
A2 | $0.0439224 |
Accelerator
Accelerator Type | Price per Hour (USD) |
---|---|
Nvidia Tesla A100 | $4.400862 |
Nvidia Tesla A100 80GB | Unavailable |
Nvidia Tesla T4 | Unavailable |
Nvidia Tesla V100 | $3.72 |
Nvidia Tesla P100 | $4.400862 |
Disk storage is charged while the instance is in the STARTING, PROVISIONING, ACTIVE, UPGRADING, ROLLBACKING, RESTORING, STOPPING, STOPPED, SUSPENDING, or SUSPENDED states.
Pricing for managed notebooks and user-managed notebooks is published separately and is not covered here.
Disks
Disk Type | Price per GB per Month (USD) |
---|---|
Hyperdisk Extreme provisioned space | $0.174 |
Balanced provisioned space | $0.12 |
Extreme provisioned space | $0.15 |
SSD provisioned space | $0.204 |
Standard provisioned space | $0.048 |
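Putting the tables together, the sketch below shows how a month’s bill for a single instance might be estimated; the usage figures (machine shape, hours, disk size) are made-up assumptions for illustration only.

```python
# Minimal sketch: estimate one month's cost for a hypothetical instance
# using the rates listed above. Usage hours and sizes are made-up
# assumptions purely for illustration.
VCPU_HOUR_N1 = 0.0439224      # N1 price per vCPU/hour (USD)
GPU_HOUR_V100 = 3.72          # Nvidia Tesla V100 price per hour (USD)
DISK_GB_MONTH_SSD = 0.204     # SSD provisioned space per GB per month (USD)

vcpus = 4                     # e.g. an n1-standard-4
active_hours = 160            # assumed billable hours in the month
gpus = 1
disk_gb = 100

cpu_cost = vcpus * active_hours * VCPU_HOUR_N1
gpu_cost = gpus * active_hours * GPU_HOUR_V100
disk_cost = disk_gb * DISK_GB_MONTH_SSD   # disks are billed per GB-month

total = cpu_cost + gpu_cost + disk_cost
print(f"CPU: ${cpu_cost:.2f}, GPU: ${gpu_cost:.2f}, "
      f"Disk: ${disk_cost:.2f}, Total: ${total:.2f}")
```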
Other Google Cloud resources that you use with Vertex AI Workbench, whether from managed or user-managed notebooks, may also incur charges. For example, running SQL queries from a notebook can incur BigQuery fees, and using customer-managed encryption keys incurs charges for Cloud Key Management Service key operations. Related services frequently used in machine learning workflows, such as Deep Learning Containers, Deep Learning VM Images, and AI Platform Pipelines, are billed for the compute and storage resources they consume, just like Compute Engine and Cloud Storage.