Create a Channel
Create a channel named databricks.

Create and apply a policy

Create a policy for the databricks channel with the following filters:
Platform Is not linux-64, and Platform Is not noarch
Apply the policy to the databricks channel you created earlier. For more information, see Applying a Policy.
Build a Custom Docker Image
Create a directory named dcs-conda by running the following command:
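The command itself is not shown in this excerpt; a minimal version, assuming the directory is created in your current working directory, would be:

```shell
# Create the working directory for the custom image (name taken from the text above)
mkdir dcs-conda
```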
Navigate into the dcs-conda directory and create a Dockerfile file inside it:
Add the following content to the Dockerfile file, depending on your Databricks Runtime version:
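The Dockerfile contents are not shown in this excerpt. The sketch below outlines one way to build a conda-based image for Databricks Container Services; the base image tag, install path, and environment file name are all assumptions, so match them to your Databricks Runtime version:

```dockerfile
# Sketch only: the base image tag and paths are assumptions.
FROM databricksruntime/standard:13.3-LTS

# Install Miniconda into a location the cluster can use
RUN curl -fsSL https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh -o /tmp/miniconda.sh \
 && bash /tmp/miniconda.sh -b -p /opt/conda \
 && rm /tmp/miniconda.sh

# Build the environment described by env.yml (created in the next step)
COPY env.yml /tmp/env.yml
RUN /opt/conda/bin/conda env create -f /tmp/env.yml
```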
Create an env.yml file inside the dcs-conda directory:
Add the following content to the env.yml file:
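The env.yml contents are not shown in this excerpt. A minimal sketch, assuming your Anaconda organization is named ORG-NAME and uses the databricks channel created earlier:

```yaml
# Hypothetical example: replace ORG-NAME with your Anaconda organization name.
name: dcs
channels:
  - https://conda.anaconda.org/ORG-NAME/databricks
  - defaults
dependencies:
  - python=3.10
  - pip
```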
Launch a Cluster using Databricks Container Service
Enabling Databricks Container Service

To enable Databricks Container Service, set enableDcs to true, as in the following example:

Note: Databricks Runtime for Machine Learning does not support Databricks Container Service.

To use Unity Catalog volumes on a Databricks Container Service cluster, add the following Spark configuration to the cluster:

spark.databricks.unityCatalog.volumes.enabled true
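The enableDcs example referenced above is not shown in this excerpt. One way to set it is through the Databricks workspace-conf REST API; the request body below is a sketch, and the endpoint and key name should be checked against the Databricks documentation for your workspace:

```json
{
  "enableDcs": "true"
}
```

This payload would be sent as a PATCH request to /api/2.0/workspace-conf using an admin access token.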
Docker Image URL examples
Docker Hub: <organization>/<repository>:<tag>
Amazon ECR: <aws-account-id>.dkr.ecr.<region>.amazonaws.com/<repository>:<tag>
Azure Container Registry: <your-registry-name>.azurecr.io/<repository-name>:<tag>
Create a Notebook and connect it to your cluster
Verify your conda installation
!conda --help runs the command in the current shell. %sh conda --help starts a subshell, which is useful for multi-line scripts, but might not have the same environment or path.

Install MLflow from your Anaconda organization channel
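The installation command is not shown in this excerpt. A sketch of a notebook cell, assuming your organization channel is hosted on anaconda.org under ORG-NAME:

```shell
# Hypothetical: replace ORG-NAME with your Anaconda organization name.
!conda install -y -c https://conda.anaconda.org/ORG-NAME/databricks mlflow
```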