As a Data Science & AI Workbench administrator, you can create custom environments that include specific packages and their dependencies. You can then create a custom installer for an environment and ship it to HDFS for use in Spark jobs.

Custom installers enable IT and Hadoop administrators to maintain close control of a Hadoop cluster while also making Python and R libraries available to the data scientists who need them. They provide an easy way to ship multiple custom Anaconda distributions to multiple Hadoop clusters.
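For example, once an environment archive has been shipped to HDFS, a Spark job can distribute it to every node and run against its Python interpreter. The following is a minimal sketch for a YARN cluster; the archive path (hdfs:///tools/custom_anaconda.tar.gz) and the alias ANACONDA are hypothetical and depend on where your installer places the environment.

```python
from pyspark.sql import SparkSession

# Minimal sketch: run a Spark job against a custom Anaconda environment
# shipped to HDFS. The archive path and alias below are placeholders.
spark = (
    SparkSession.builder
    .appName("custom-env-example")
    # Distribute the environment archive to every node; "#ANACONDA" is the
    # name the archive is unpacked under in each container's working directory.
    .config("spark.yarn.dist.archives",
            "hdfs:///tools/custom_anaconda.tar.gz#ANACONDA")
    # Point the application master and the executors at the Python
    # interpreter inside the unpacked environment.
    .config("spark.yarn.appMasterEnv.PYSPARK_PYTHON", "./ANACONDA/bin/python")
    .config("spark.executorEnv.PYSPARK_PYTHON", "./ANACONDA/bin/python")
    .getOrCreate()
)
```

An equivalent spark-submit invocation using --archives and PYSPARK_PYTHON works the same way; the important detail is that the interpreter path is relative to each container's working directory.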
Click on an environment name to view details about the packages included in the environment, then click Edit.
Change the channels and/or packages included in the environment, and enter a version number for the updated environment before clicking Save. The new version is displayed in the list of environments.
Select the environment in the list, click Create installer, and then select the type of installer you want to create. Workbench creates the installer and displays it in the Installers list.
If you created a management pack, install it on your Hortonworks HDP cluster and add it to your local Ambari server to make it available to users. For more information, see this blog post about generating custom management packs.

If you created a parcel, install it on your CDP cluster to make it available to users, or download the parcel directly onto your CDP cluster as described here.
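After the installer is in place, a quick smoke test from a Spark job can confirm that executors are actually resolving packages from the custom environment. This check is not part of the Workbench workflow; the package name below (numpy) is a placeholder for any package your environment includes.

```python
from pyspark.sql import SparkSession

# Reuse (or create) a session; assumes the custom environment is already
# configured as in the earlier sketch.
spark = SparkSession.builder.getOrCreate()

def report_env(_):
    import sys
    import numpy  # placeholder for any package shipped in the custom environment
    yield (sys.executable, numpy.__version__)

# Collect one report per partition; every row should point at the custom
# environment's Python interpreter rather than the system one.
results = (spark.sparkContext
           .parallelize(range(4), 4)
           .mapPartitions(report_env)
           .collect())
for executable, version in results:
    print(executable, version)
```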
Use the icons at the top of the page to manage your environment. You can edit, duplicate, delete, and view logs and resolved packages for the selected environment.