Supported Instance Types

| AWS Instance Type | Memory / Cores | Equivalent DBUs¹ |
|---|---|---|
| **Memory Optimized** | | |
| r3.xlarge², r4.xlarge | 30 GB / 4 cores | 1 |
| r3.2xlarge², r4.2xlarge | 60 GB / 8 cores | 2 |
| r3.4xlarge², r4.4xlarge | 120 GB / 16 cores | 4 |
| r3.8xlarge², r4.8xlarge | 240 GB / 32 cores | 8 |
| r5.large | 16 GB / 2 cores | 0.45 |
| r5.xlarge | 32 GB / 4 cores | 0.90 |
| r5.2xlarge | 64 GB / 8 cores | 1.80 |
| r5.4xlarge | 128 GB / 16 cores | 3.60 |
| r5.12xlarge | 384 GB / 48 cores | 10.80 |
| **Compute Optimized** | | |
| c3.2xlarge², c4.2xlarge | 15 GB / 8 cores | 1 |
| c3.4xlarge², c4.4xlarge | 30 GB / 16 cores | 2 |
| c3.8xlarge², c4.8xlarge | 60 GB / 32 cores | 4 |
| c5.xlarge | 8 GB / 4 cores | 0.61 |
| c5.2xlarge | 16 GB / 8 cores | 1.21 |
| c5.4xlarge | 32 GB / 16 cores | 2.43 |
| c5.9xlarge | 72 GB / 36 cores | 5.46 |
| **Storage Optimized** | | |
| i3.xlarge | 30 GB / 4 cores | 1 |
| i3.2xlarge | 60 GB / 8 cores | 2 |
| i3.4xlarge | 120 GB / 16 cores | 4 |
| i3.8xlarge | 240 GB / 32 cores | 8 |
| i3.16xlarge | 488 GB / 64 cores | 16 |
| i2.xlarge | 30 GB / 4 cores | 1.5 |
| i2.2xlarge | 60 GB / 8 cores | 3 |
| i2.4xlarge | 120 GB / 16 cores | 6 |
| i2.8xlarge | 240 GB / 32 cores | 12 |
| **General Purpose** | | |
| m4.large | 8 GB / 2 cores | 0.4 |
| m4.xlarge | 15 GB / 4 cores | 0.75 |
| m4.2xlarge | 30 GB / 8 cores | 1.5 |
| m4.4xlarge | 60 GB / 16 cores | 3 |
| m4.10xlarge | 160 GB / 40 cores | 8 |
| m4.16xlarge | 256 GB / 64 cores | 12 |
| m5.xlarge | 16 GB / 4 cores | 0.69 |
| m5.2xlarge | 32 GB / 8 cores | 1.37 |
| m5.4xlarge | 64 GB / 16 cores | 2.74 |
| m5.12xlarge | 192 GB / 48 cores | 8.23 |
| GPU Accelerated | Memory / Cores | GPU (Memory / Cores) | Equivalent DBUs¹ |
|---|---|---|---|
| p2.xlarge | 60 GB / 4 cores | 12 GB / 2,496 cores | 1.22 |
| p2.8xlarge | 480 GB / 32 cores | 96 GB / 19,968 cores | 9.76 |
| p2.16xlarge | 720 GB / 64 cores | 192 GB / 39,936 cores | 19.52 |
| p3.2xlarge | 61 GB / 8 cores | 16 GB / 5,120 CUDA cores + 640 Tensor cores | 4.15 |
| p3.8xlarge | 244 GB / 32 cores | 64 GB / 20,480 CUDA cores + 2,560 Tensor cores | 16.6 |
| p3.16xlarge | 488 GB / 64 cores | 128 GB / 40,960 CUDA cores + 5,120 Tensor cores | 33.2 |

¹ Databricks Unit (DBU): a unit of processing capability per hour.
² Deprecated. Support for this instance type will end on August 28, 2019.
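
As a quick illustration of how the Equivalent DBUs column is used, the sketch below estimates DBU consumption for a hypothetical cluster as per-node hourly rate × number of nodes × hours. The rates are copied from the table above; the cluster shape, runtime, and the `estimated_dbus` helper are illustrative assumptions, not official Databricks tooling or pricing logic.

```python
# Minimal sketch: estimate DBU consumption from the per-instance rates above.
# Rates are the "Equivalent DBUs" values from the table; only a few are listed here.
DBU_RATES = {
    "i3.xlarge": 1.0,    # Storage Optimized
    "r4.2xlarge": 2.0,   # Memory Optimized
    "m5.xlarge": 0.69,   # General Purpose
}

def estimated_dbus(instance_type: str, node_count: int, hours: float) -> float:
    """Estimated DBUs consumed = per-node hourly rate x nodes x hours."""
    return DBU_RATES[instance_type] * node_count * hours

# Hypothetical example: a driver plus 4 workers on i3.xlarge running for 3 hours.
driver = estimated_dbus("i3.xlarge", 1, 3)
workers = estimated_dbus("i3.xlarge", 4, 3)
print(f"Estimated DBUs: {driver + workers:.2f}")  # 15.00
```

Actual billing depends on the Databricks plan and workload tier; this only shows how the per-hour DBU ratings scale with cluster size and runtime.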