Install PipelineAI Open Source

While we recommend the hosted PipelineAI Community Edition for evaluating PipelineAI, the instructions below show how to set up PipelineAI in your own cloud-based or on-premise environment.

Note: Support is limited for this offering, but we would love your feedback, bug reports, and feature requests HERE.

PipelineAI Standalone

The standalone PipelineAI Community Edition uses a single Docker image that runs in any CPU- or GPU-based environment that supports Docker (for CPUs) or Nvidia-Docker (for GPUs).
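As a rough sketch of what running the standalone image looks like, the commands below use plain Docker for CPUs and Nvidia-Docker for GPUs. The image name and port are placeholders, not the official values; follow the environment-specific links below for the exact command.

```bash
# CPU-only host: run the standalone Community Edition image with plain Docker.
# NOTE: "pipelineai/community" and port 8080 are placeholders -- substitute the
# image name and ports from the environment-specific instructions linked below.
docker run -d --name pipelineai -p 8080:8080 pipelineai/community

# GPU host: launch the same image through Nvidia-Docker so the container can use the GPUs.
nvidia-docker run -d --name pipelineai-gpu -p 8080:8080 pipelineai/community
```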

Nvidia GPU

Docker

AWS GPU

Google Cloud GPU

AWS CPU

Google Cloud CPU

PipelineAI Distributed

PipelineAI uses Kubernetes for Docker container management and orchestration.

Kubernetes
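Whichever environment you choose below, the distributed setup assumes a working Kubernetes cluster and a kubectl configured to reach it. A quick sanity check using standard kubectl commands (nothing PipelineAI-specific):

```bash
# Confirm kubectl is installed and can reach the cluster that will host PipelineAI.
kubectl version
kubectl get nodes

# List everything currently running, to verify cluster access before installing anything.
kubectl get pods --all-namespaces
```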

PipelineAI Cluster

Local

Mini Kubernetes Cluster + PipelineAI Community Edition on a Local Laptop or Low-Memory Server
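A minimal sketch of standing up such a local cluster with Minikube; the CPU and memory values are illustrative guesses for a laptop, not official PipelineAI requirements:

```bash
# Start a small single-node Kubernetes cluster on the local machine.
# The CPU/memory values are illustrative; adjust to what your laptop can spare.
minikube start --cpus 4 --memory 8192

# Point kubectl at the new local cluster and confirm the node is ready.
kubectl config use-context minikube
kubectl get nodes
```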

AWS

Full Kubernetes Cluster + PipelineAI Community Edition on AWS

AWS

This requires large instance types with at least 50 GB of RAM, 8 CPUs, and 100 GB of disk.
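One generic way to provision AWS worker nodes that meet these minimums is eksctl; this is a sketch, not PipelineAI's documented procedure. The cluster name is a placeholder, and r5.2xlarge (8 vCPUs, 64 GB RAM) is just one instance type that satisfies the requirements:

```bash
# Create an EKS cluster whose worker nodes satisfy the 50 GB RAM / 8 CPU / 100 GB disk minimums.
# "pipelineai-cluster" is a placeholder name; r5.2xlarge = 8 vCPUs, 64 GB RAM.
eksctl create cluster \
  --name pipelineai-cluster \
  --node-type r5.2xlarge \
  --nodes 2 \
  --node-volume-size 100
```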

Google Cloud

Full Kubernetes Cluster + PipelineAI Community Edition on Google Cloud

Google Cloud Platform

This requires large instance types with at least 50 GB of RAM, 8 CPUs, and 100 GB of disk.
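On Google Cloud, a GKE node pool using n1-highmem-8 (8 vCPUs, 52 GB RAM) is one machine type that meets these minimums. A generic sketch; the cluster name and zone are placeholders:

```bash
# Create a GKE cluster whose nodes meet the 50 GB RAM / 8 CPU / 100 GB disk minimums.
# "pipelineai-cluster" and the zone are placeholders; n1-highmem-8 = 8 vCPUs, 52 GB RAM.
gcloud container clusters create pipelineai-cluster \
  --zone us-central1-a \
  --machine-type n1-highmem-8 \
  --num-nodes 2 \
  --disk-size 100

# Fetch credentials so kubectl targets the new cluster.
gcloud container clusters get-credentials pipelineai-cluster --zone us-central1-a
```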

Azure

Full Kubernetes Cluster + PipelineAI Community Edition on Azure

Azure Cluster

This requires large instance types with at least 50 GB of RAM, 8 CPUs, and 100 GB of disk.
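On Azure, an AKS node size such as Standard_E8s_v3 (8 vCPUs, 64 GB RAM) meets these minimums. A generic sketch; the resource group and cluster name are placeholders:

```bash
# Create an AKS cluster whose nodes meet the 50 GB RAM / 8 CPU / 100 GB disk minimums.
# The resource group and cluster name are placeholders; Standard_E8s_v3 = 8 vCPUs, 64 GB RAM.
az group create --name pipelineai-rg --location eastus

az aks create \
  --resource-group pipelineai-rg \
  --name pipelineai-cluster \
  --node-count 2 \
  --node-vm-size Standard_E8s_v3 \
  --node-osdisk-size 100

# Fetch credentials so kubectl targets the new cluster.
az aks get-credentials --resource-group pipelineai-rg --name pipelineai-cluster
```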

On-Premise

Full Kubernetes Cluster + PipelineAI Community Edition On-Premise

On-Premise Cluster
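For on-premise hardware, one common way to bootstrap the Kubernetes cluster itself is kubeadm; this is a minimal sketch, not PipelineAI's documented procedure. The pod-network CIDR shown matches Flannel's default and depends on which network plugin you choose:

```bash
# On the control-plane node: initialize the cluster.
# 10.244.0.0/16 is Flannel's default pod network; adjust for your CNI plugin.
sudo kubeadm init --pod-network-cidr=10.244.0.0/16

# Configure kubectl for the current user, as instructed by kubeadm init's output.
mkdir -p $HOME/.kube
sudo cp /etc/kubernetes/admin.conf $HOME/.kube/config
sudo chown $(id -u):$(id -g) $HOME/.kube/config

# On each worker node: join the cluster using the token and hash printed by kubeadm init.
# sudo kubeadm join <control-plane-ip>:6443 --token <token> --discovery-token-ca-cert-hash <hash>
```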


Try our Free Community Edition!

Click HERE to get started with our basic Community Edition.

Register for PipelineAI Advanced Edition


Take a Quick Tour!


Help Us Improve

Option 1: Edit the documentation directly.

Option 2: Create a GitHub issue.

More Resources