Installed the latest version of JupyterHub, installed the spawner, and added the properties below to the HDFS core-site.xml. We used the spawner in a local environment. After logging into JupyterHub and starting a server, we get the error shown in the error log below.

core-site.xml:
```xml
<property>
  <name>hadoop.proxyuser.oozie.hosts</name>
  <value>nodes DNS names</value>
</property>
<property>
  <name>hadoop.proxyuser.jupyterhub.groups</name>
  <value>*</value>
</property>
```
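(Not part of the original report, but worth noting: changes to `hadoop.proxyuser.*` properties in core-site.xml only take effect after the NameNode is restarted or told to reload them. A minimal sketch of the reload step, assuming the `hdfs` CLI is on PATH and is run by a user with HDFS superuser privileges:)

```python
# Ask the NameNode to reload proxy-user (impersonation) settings from
# core-site.xml without a restart. Assumes superuser privileges.
import subprocess

subprocess.run(
    ["hdfs", "dfsadmin", "-refreshSuperUserGroupsConfiguration"],
    check=True,
)
```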
jupyterhub_config.py:

```python
c.JupyterHub.bind_url = 'http://NodeDNS:4322'
c.JupyterHub.cookie_secret_file = '/etc/jupyterhub/jupyterhub_cookie_secret'
c.JupyterHub.db_url = 'sqlite:////var/jupyterhub/jupyterhub.sqlite'

# Enable yarnspawner
c.JupyterHub.spawner_class = 'yarnspawner.YarnSpawner'

# Make the JupyterHub internal communication accessible from other machines
# in the cluster
c.JupyterHub.hub_ip = ''

c.YarnSpawner.cmd = '/opt/jupyterhub/minicondabash/bin/python -m yarnspawner.singleuser'
c.YarnSpawner.prologue = 'conda activate /opt/jupyterhub/minicondabash'

# Resource limits per user
c.YarnSpawner.mem_limit = '2 G'
c.YarnSpawner.cpu_limit = 1

c.Authenticator.admin_users = {'jupyterhub'}
c.JupyterHub.authenticator_class = 'dummyauthenticator.DummyAuthenticator'

# Optionally add a shared global password to be used by all users
c.DummyAuthenticator.password = "######"
```
Error Log
500 : Internal Server Error
```
Error in Authenticator.pre_spawn_start: DriverError
Failed to submit application, exception:
Permission denied: user=jupyterhub, access=WRITE,
inode="/user/jupyterhub/.skein/application_1592762333481_0016":hdfs:hdfs:drwxr-xr-x
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:292)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:213)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1955)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1939)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1922)
    at org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:71)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:4140)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:1102)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:630)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:640)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2351)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2347)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1865)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2345)
```
You can try restarting your server from the home page.
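The trace shows the Skein driver trying to create its staging directory under `/user/jupyterhub`, which is owned by `hdfs:hdfs` with mode `drwxr-xr-x`, so the `jupyterhub` user has no write access there. A minimal sketch of one possible fix, inferred from the error above rather than a confirmed resolution: pre-create the user's HDFS home directory and hand ownership to `jupyterhub`. It assumes the `hdfs` CLI is on PATH and is run as the HDFS superuser; the path and user come from the error log.

```python
# Create /user/jupyterhub in HDFS and give it to the jupyterhub user so
# the Skein driver can write its .skein staging files. Run as the HDFS
# superuser (path/owner taken from the error log above).
import subprocess

subprocess.run(["hdfs", "dfs", "-mkdir", "-p", "/user/jupyterhub"], check=True)
subprocess.run(
    ["hdfs", "dfs", "-chown", "jupyterhub:jupyterhub", "/user/jupyterhub"],
    check=True,
)
```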