Set hadoop.proxyuser.livy to your authenticated hosts, users, or groups. To allow Livy to impersonate users, set the value to all (*), or to a list of specific users or groups. A user named livy must exist on every machine. You can add this user to each machine by running the following command on each node:
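A minimal sketch of creating that user, assuming standard Linux user management:

```shell
# Create the livy user with a home directory; run with root privileges on each node.
sudo useradd -m livy
```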
To troubleshoot, set the logging level to DEBUG in the conf/log4j.properties file. The environment variables that Livy requires can be set in your .bashrc file, or in the conf/livy-env.sh file that's used to configure the Livy server.
These values are accurate for a Cloudera install of Spark with Java version 1.8:
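For example, conf/livy-env.sh might contain the following exports (the paths are assumptions for a Cloudera parcel install; adjust them to your machines):

```shell
export SPARK_HOME=/opt/cloudera/parcels/CDH/lib/spark
export HADOOP_CONF_DIR=/etc/hadoop/conf
export JAVA_HOME=/usr/java/jdk1.8.0_121-cloudera
```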
The livy.server.port setting in conf/livy.conf is the same port that will generally appear in the Sparkmagic user configuration.
The minimum required parameter is livy.spark.master. Other possible values include the following:

local[*] - for testing purposes
yarn-cluster - for use with the YARN resource allocation system
spark://masterhost:7077 - if the Spark scheduler is on a different host

For YARN, the deploy mode should also be set to cluster for Livy. The livy.conf file, typically located in $LIVY_HOME/conf/livy.conf, may include settings similar to the following:
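A minimal sketch of such a file, assuming a YARN deployment (the values are illustrative):

```
livy.spark.master = yarn
livy.spark.deploy-mode = cluster
livy.server.port = 8998
```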
Create the required Kerberos principals and keytab files with kadmin.local. Without valid credentials, requests to Livy fail with errors such as GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos credentials). The principals are hostname and domain dependent, so edit the following example according to your Kerberos settings:
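A sketch of the principal and keytab creation (the hostname and the EXAMPLE.COM realm are assumptions):

```shell
# Run on the KDC; substitute your own host and realm.
sudo kadmin.local -q "addprinc -randkey livy/ip-172-31-3-131.ec2.internal@EXAMPLE.COM"
sudo kadmin.local -q "addprinc -randkey HTTP/ip-172-31-3-131.ec2.internal@EXAMPLE.COM"
sudo kadmin.local -q "xst -k livy-ip-172-31-3-131.ec2.internal.keytab livy/ip-172-31-3-131.ec2.internal@EXAMPLE.COM"
sudo kadmin.local -q "xst -k HTTP-ip-172-31-3-131.ec2.internal.keytab HTTP/ip-172-31-3-131.ec2.internal@EXAMPLE.COM"
```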
This produces two keytab files: livy-ip-172-31-3-131.ec2.internal.keytab and HTTP-ip-172-31-3-131.ec2.internal.keytab.
Restart livy-server for the changes to take effect.
Add the Kerberos principals and keytab files to the conf/livy.conf configuration file, as shown:
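The corresponding livy.conf entries might look like the following (the principal names, realm, and keytab paths are assumptions based on the keytabs above):

```
livy.server.auth.type = kerberos
livy.server.auth.kerberos.principal = HTTP/ip-172-31-3-131.ec2.internal@EXAMPLE.COM
livy.server.auth.kerberos.keytab = /etc/security/keytabs/HTTP-ip-172-31-3-131.ec2.internal.keytab
livy.server.launch.kerberos.principal = livy/ip-172-31-3-131.ec2.internal@EXAMPLE.COM
livy.server.launch.kerberos.keytab = /etc/security/keytabs/livy-ip-172-31-3-131.ec2.internal.keytab
```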
The livy.server.access-control.enabled = true setting is only required if you're going to also whitelist the allowed users with the livy.server.access-control.allowed-users <user> key. Next, add krb5.conf to the global configuration using the following command:
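The exact anaconda-enterprise-cli invocation varies by Anaconda Enterprise version; the subcommand and flags below are assumptions, so check anaconda-enterprise-cli --help before running:

```shell
# Hypothetical invocation; verify the subcommand and flag names for your version.
anaconda-enterprise-cli spark-config --config krb5.conf /etc/krb5.conf
```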
This generates a file named anaconda-config-files-secret.yaml, with the data converted for Anaconda Enterprise.
Use the following command to upload the yaml file to the server:
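Assuming the file name generated above, the upload is a standard kubectl operation (use kubectl apply -f instead if the secret does not exist yet):

```shell
sudo kubectl replace -f anaconda-config-files-secret.yaml
```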
/etc/krb5.conf will then be populated with the appropriate data.
To enable SSL, update livy.conf with the keystore details. For example:
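A sketch of the relevant keys (the path and passwords are placeholders):

```
livy.keystore = /path/to/keystore.jks
livy.keystore.password = <keystore password>
livy.key-password = <key password>
```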
Each user must then configure Sparkmagic in ~/.sparkmagic/config.json. For example:
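A minimal sketch of the file (the URL is a placeholder for your own Livy server):

```json
{
  "kernel_python_credentials": {
    "username": "",
    "password": "",
    "url": "https://localhost:8998",
    "auth": "None"
  },
  "ignore_ssl_errors": true
}
```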
Here, ignore_ssl_errors is set to true because this configuration uses self-signed certificates. Your production cluster setup may be different. If there's a syntax error in the .json file, all Sparkmagic kernels will fail to launch. You can test your Sparkmagic configuration by running the following command in a shell: python -m json.tool config.json.
The Livy server URL takes the form https://<livy host>:<livy port>.
To test your SSL-enabled Livy server, run the following Python code in an interactive shell to create a session:
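A sketch, assuming the requests library is installed and the placeholder URL is replaced with your own:

```python
import json
import requests  # third-party HTTP client

LIVY_URL = "https://localhost:8998"  # placeholder: https://<livy host>:<livy port>

def session_payload(kind="pyspark"):
    """Build the JSON body for Livy's POST /sessions endpoint."""
    return json.dumps({"kind": kind})

# verify=False accepts the self-signed certificate used in this setup;
# point verify at a CA bundle in production instead.
try:
    response = requests.post(
        LIVY_URL + "/sessions",
        data=session_payload(),
        headers={"Content-Type": "application/json"},
        verify=False,
        timeout=10,
    )
    print(response.status_code, response.json())
except requests.exceptions.RequestException as exc:
    print("Could not reach Livy:", exc)
```

A successful request returns the new session's JSON description, including its id and state.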
Create a keystore.p12 file using the following command:
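A sketch using openssl; the certificate and key file names are placeholders, and the first command merely manufactures a throwaway self-signed pair for illustration:

```shell
# Illustration only: create a throwaway self-signed certificate and key.
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=livy" \
  -keyout private.key -out certificate.pem -days 365

# Bundle the certificate and private key into a PKCS#12 keystore.
openssl pkcs12 -export -in certificate.pem -inkey private.key \
  -name livy -out keystore.p12 -passout pass:changeit
```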
Use keytool to convert it to a keystore.jks file:
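A sketch of the conversion, assuming the keystore.p12 file created above and its password:

```shell
keytool -importkeystore \
  -srckeystore keystore.p12 -srcstoretype PKCS12 -srcstorepass changeit \
  -destkeystore keystore.jks -deststoretype JKS -deststorepass changeit
```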
If you don't already have the root certificate rootca.crt, you can run the following command to extract it from your Anaconda Enterprise installation:
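A sketch of the extraction; the secret name anaconda-enterprise-certs is an assumption, so list your secrets with kubectl get secrets to confirm it:

```shell
kubectl get secret anaconda-enterprise-certs \
  -o 'go-template={{index .data "rootca.crt"}}' | base64 --decode > rootca.crt
```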
Add rootca.crt to the keystore.jks file:
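A sketch of the import, assuming the keystore password used above:

```shell
keytool -import -trustcacerts -alias rootca \
  -file rootca.crt -keystore keystore.jks \
  -storepass changeit -noprompt
```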
Add the keystore.jks file to the livy.conf file. For example:
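A sketch of the addition (the path and password are placeholders):

```
livy.keystore = /path/to/keystore.jks
livy.keystore.password = changeit
```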
If the return value is 0, you've successfully configured Livy to use HTTPS.
Install the ca-certificates package:
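On a RHEL/CentOS node (an assumption; use your distribution's own package manager otherwise):

```shell
sudo yum install -y ca-certificates
```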
Save rootca.crt as a new file:
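A sketch assuming the RHEL/CentOS trust store layout:

```shell
sudo cp rootca.crt /etc/pki/ca-trust/source/anchors/rootca.crt
sudo update-ca-trust extract
```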
You can also edit the anaconda-project.yml file for the project and set the environment variable there. See Hadoop / Spark for more information.