For security reasons we cannot expose user passwords in Zeppelin, so we have to write hashed passwords into shiro.ini. How do we enable hashed passwords in Zeppelin?
Open shiro.ini and, in the [users] section, replace every user's password with its SHA-256 hash, for example:
xianglei = ba4ae0f17be1449007b955f97f7d1ca967ec72da3f39047adcc3c62eb02524b5, admin,admaster
Many tools can generate this hash, so we will not go into detail here; the small sketch below is one quick way to produce it.
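If you do not have a tool at hand, a minimal plain-Java sketch using java.security.MessageDigest (class and variable names here are only illustrative) prints the hex digest expected by shiro.ini:

import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;

public class Sha256PasswordHash {
  public static void main(String[] args) throws Exception {
    // Plain-text password to hash; replace with the real one.
    String password = "secret";
    MessageDigest digest = MessageDigest.getInstance("SHA-256");
    byte[] hash = digest.digest(password.getBytes(StandardCharsets.UTF_8));
    // Convert the raw bytes to the lowercase hex string used in shiro.ini.
    StringBuilder hex = new StringBuilder();
    for (byte b : hash) {
      hex.append(String.format("%02x", b));
    }
    System.out.println(hex);
  }
}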
Then, in the [main] section, add these lines:
sha256Matcher = org.apache.shiro.authc.credential.Sha256CredentialsMatcher
iniRealm.credentialsMatcher = $sha256Matcher
Finally, restart Zeppelin and you are done.
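For reference, the relevant parts of shiro.ini now look like this for the example user above:

[users]
xianglei = ba4ae0f17be1449007b955f97f7d1ca967ec72da3f39047adcc3c62eb02524b5, admin,admaster

[main]
sha256Matcher = org.apache.shiro.authc.credential.Sha256CredentialsMatcher
iniRealm.credentialsMatcher = $sha256Matcher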
A file system permission error in Hive JDBC
java.lang.NoClassDefFoundError: com/google/protobuf/ProtocolMessageEnum
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at java.lang.Class.getDeclaredConstructors0(Native Method)
	at java.lang.Class.privateGetDeclaredConstructors(Class.java:2595)
	at java.lang.Class.getConstructor0(Class.java:2895)
	at java.lang.Class.getConstructor(Class.java:1731)
	at org.apache.hive.service.cli.HiveSQLException.newInstance(HiveSQLException.java:243)
	at org.apache.hive.service.cli.HiveSQLException.toStackTrace(HiveSQLException.java:209)
	at org.apache.hive.service.cli.HiveSQLException.toStackTrace(HiveSQLException.java:235)
	at org.apache.hive.service.cli.HiveSQLException.toStackTrace(HiveSQLException.java:235)
	at org.apache.hive.service.cli.HiveSQLException.toStackTrace(HiveSQLException.java:235)
	at org.apache.hive.service.cli.HiveSQLException.toCause(HiveSQLException.java:196)
	at org.apache.hive.service.cli.HiveSQLException.<init>(HiveSQLException.java:108)
	at org.apache.hive.jdbc.Utils.verifySuccess(Utils.java:241)
	at org.apache.hive.jdbc.Utils.verifySuccessWithInfo(Utils.java:227)
	at org.apache.hive.jdbc.HiveQueryResultSet.next(HiveQueryResultSet.java:364)
	at org.apache.commons.dbcp2.DelegatingResultSet.next(DelegatingResultSet.java:191)
	at org.apache.commons.dbcp2.DelegatingResultSet.next(DelegatingResultSet.java:191)
	at org.apache.zeppelin.jdbc.JDBCInterpreter.getResults(JDBCInterpreter.java:478)
	at org.apache.zeppelin.jdbc.JDBCInterpreter.executeSql(JDBCInterpreter.java:592)
	at org.apache.zeppelin.jdbc.JDBCInterpreter.interpret(JDBCInterpreter.java:692)
	at org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:101)
	at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:616)
	at org.apache.zeppelin.scheduler.Job.run(Job.java:175)
	at org.apache.zeppelin.scheduler.ParallelScheduler$JobRunner.run(ParallelScheduler.java:162)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
	at java.util.concurrent.FutureTask.run(FutureTask.java:262)
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:178)
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:292)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: com.google.protobuf.ProtocolMessageEnum
	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	... 42 more
It looks like a jar dependency error, but it is actually an HDFS permission error.
The real cause is that the file Hive reads is owned by one user with its permission set to 640, so when another user tries to read it the error above appears. Hive, however, reports it as if a jar were missing on the machine.
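To confirm that it is a permission problem rather than a missing jar, check the owner and mode of the table's files. A small sketch using the Hadoop FileSystem API (the warehouse path below is only an example, and it assumes the Hadoop configuration is on the classpath) could look like this:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CheckHdfsPermission {
  public static void main(String[] args) throws Exception {
    // Example path to the table data; adjust to your warehouse layout.
    Path tablePath = new Path("/user/hive/warehouse/some_db.db/some_table");
    FileSystem fs = FileSystem.get(new Configuration());
    for (FileStatus status : fs.listStatus(tablePath)) {
      // A mode of 640 (rw-r-----) means only the owner and group members can read the file.
      System.out.printf("%s %s:%s %s%n",
          status.getPermission(), status.getOwner(), status.getGroup(), status.getPath());
    }
  }
}

The same information can be seen with hdfs dfs -ls on that path; once the owner, group or mode is fixed so the querying user can read the files, the Zeppelin error disappears.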
How to disable certain keywords in Zeppelin?
In some special scenarios we need to block certain keywords in Zeppelin, and the only way to do that is to change the code.
In zeppelin-interpreter/src/main/java/org/apache/zeppelin/interpreter/remote/RemoteInterpreterServer.java, we add the following code:
public static HashSet<String[]> blockedCodeString = new HashSet<>();

static {
  // Block dangerous Python imports.
  blockedCodeString.add(new String[]{"import", "os"});
  blockedCodeString.add(new String[]{"import", "sys"});
  blockedCodeString.add(new String[]{"import", "subprocess"});
  blockedCodeString.add(new String[]{"import", "pty"});
  blockedCodeString.add(new String[]{"import", "socket"});
  blockedCodeString.add(new String[]{"import", "commands"});
  blockedCodeString.add(new String[]{"import", "paramiko"});
  blockedCodeString.add(new String[]{"import", "pexpect"});
  blockedCodeString.add(new String[]{"import", "BaseHTTPServer"});
  blockedCodeString.add(new String[]{"import", "ConfigParser"});
  blockedCodeString.add(new String[]{"import", "platform"});
  blockedCodeString.add(new String[]{"import", "popen2"});
  blockedCodeString.add(new String[]{"import", "copy"});
  blockedCodeString.add(new String[]{"import", "SocketServer"});
  blockedCodeString.add(new String[]{"import", "sysconfig"});
  blockedCodeString.add(new String[]{"import", "tty"});
  blockedCodeString.add(new String[]{"import", "xmlrpclib"});
  // Block sensitive system paths.
  blockedCodeString.add(new String[]{"etc"});
  blockedCodeString.add(new String[]{"boot"});
  blockedCodeString.add(new String[]{"dev"});
  blockedCodeString.add(new String[]{"lib"});
  blockedCodeString.add(new String[]{"lib64"});
  blockedCodeString.add(new String[]{"lost+found"});
  blockedCodeString.add(new String[]{"mnt"});
  blockedCodeString.add(new String[]{"proc"});
  blockedCodeString.add(new String[]{"root"});
  blockedCodeString.add(new String[]{"sbin"});
  blockedCodeString.add(new String[]{"selinux"});
  blockedCodeString.add(new String[]{"usr"});
  // Block dangerous shell commands and other keywords.
  blockedCodeString.add(new String[]{"passwd"});
  blockedCodeString.add(new String[]{"useradd"});
  blockedCodeString.add(new String[]{"userdel"});
  blockedCodeString.add(new String[]{"rm"});
  blockedCodeString.add(new String[]{"akka "});
  blockedCodeString.add(new String[]{"groupadd"});
  blockedCodeString.add(new String[]{"groupdel"});
  blockedCodeString.add(new String[]{"mkdir"});
  blockedCodeString.add(new String[]{"rmdir"});
  blockedCodeString.add(new String[]{"ping"});
  blockedCodeString.add(new String[]{"nc"});
  blockedCodeString.add(new String[]{"telnet"});
  blockedCodeString.add(new String[]{"ftp"});
  blockedCodeString.add(new String[]{"scp"});
  blockedCodeString.add(new String[]{"ssh"});
  blockedCodeString.add(new String[]{"ps"});
  blockedCodeString.add(new String[]{"hostname"});
  blockedCodeString.add(new String[]{"uname"});
  blockedCodeString.add(new String[]{"vim"});
  blockedCodeString.add(new String[]{"nano"});
  blockedCodeString.add(new String[]{"top"});
  blockedCodeString.add(new String[]{"cat"});
  blockedCodeString.add(new String[]{"more"});
  blockedCodeString.add(new String[]{"less"});
  blockedCodeString.add(new String[]{"chkconfig"});
  blockedCodeString.add(new String[]{"service"});
  blockedCodeString.add(new String[]{"netstat"});
  blockedCodeString.add(new String[]{"iptables"});
  blockedCodeString.add(new String[]{"ip"});
  blockedCodeString.add(new String[]{"route "});
  blockedCodeString.add(new String[]{"curl"});
  blockedCodeString.add(new String[]{"wget"});
  blockedCodeString.add(new String[]{"sysctl"});
  blockedCodeString.add(new String[]{"touch"});
  blockedCodeString.add(new String[]{"scala.sys.process"});
  blockedCodeString.add(new String[]{"0.0.0.0"});
  blockedCodeString.add(new String[]{"58.215.191"});
  blockedCodeString.add(new String[]{"git"});
  blockedCodeString.add(new String[]{"svn"});
  blockedCodeString.add(new String[]{"hg"});
  blockedCodeString.add(new String[]{"cvs"});
  blockedCodeString.add(new String[]{"exec"});
  blockedCodeString.add(new String[]{"ln"});
  blockedCodeString.add(new String[]{"kill"});
  blockedCodeString.add(new String[]{"rsync"});
  blockedCodeString.add(new String[]{"lsof"});
  blockedCodeString.add(new String[]{"crontab"});
  blockedCodeString.add(new String[]{"libtool"});
  blockedCodeString.add(new String[]{"automake"});
  blockedCodeString.add(new String[]{"autoconf"});
  blockedCodeString.add(new String[]{"make"});
  blockedCodeString.add(new String[]{"gcc"});
  blockedCodeString.add(new String[]{"cc"});
}

......

@Override
public RemoteInterpreterResult interpret(String noteId, String className, String st,
    RemoteInterpreterContext interpreterContext) throws TException {
  if (logger.isDebugEnabled()) {
    logger.debug("st:\n{}", st);
  }
  Interpreter intp = getInterpreter(noteId, className);
  InterpreterContext context = convert(interpreterContext);
  context.setClassName(intp.getClassName());
  Scheduler scheduler = intp.getScheduler();
  InterpretJobListener jobListener = new InterpretJobListener();
  InterpretJob job = new InterpretJob(
      interpreterContext.getParagraphId(),
      "remoteInterpretJob_" + System.currentTimeMillis(),
      jobListener,
      JobProgressPoller.DEFAULT_INTERVAL_MSEC,
      intp,
      st,
      context);
  InterpreterResult result;
  try {
    // anyMatch() returns the blocked keyword found in the paragraph text
    // and throws if nothing matches.
    String matchesStrings = anyMatch(st, blockedCodeString);
    result = new InterpreterResult(Code.ERROR, "Contains dangerous code : " + matchesStrings);
  } catch (Exception me) {
    // No blocked keyword matched, so run the paragraph as usual.
    scheduler.submit(job);
    while (!job.isTerminated()) {
      synchronized (jobListener) {
        try {
          jobListener.wait(1000);
        } catch (InterruptedException e) {
          logger.info("Exception in RemoteInterpreterServer while interpret, jobListener.wait", e);
        }
      }
    }
    if (job.getStatus() == Status.ERROR) {
      result = new InterpreterResult(Code.ERROR, Job.getStack(job.getException()));
    } else {
      result = (InterpreterResult) job.getReturn();
      // in case of job abort in PENDING status, result can be null
      if (result == null) {
        result = new InterpreterResult(Code.KEEP_PREVIOUS_RESULT);
      }
    }
  }
  return convert(result, context.getConfig(), context.getGui());
}

......
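The interpret() method above calls an anyMatch() helper whose implementation is not shown here. Judging from the way it is used (it returns the offending keyword when the paragraph contains one and throws when nothing matches), a minimal sketch of it could look like this:

// Hypothetical sketch of the helper used above: each String[] entry is a group of
// keywords that must all appear in the paragraph text for it to be treated as dangerous.
private String anyMatch(String code, HashSet<String[]> blocked) throws Exception {
  for (String[] group : blocked) {
    boolean allPresent = true;
    for (String keyword : group) {
      if (!code.contains(keyword)) {
        allPresent = false;
        break;
      }
    }
    if (allPresent) {
      // Report the matched keyword group back to the caller.
      return java.util.Arrays.toString(group);
    }
  }
  // Nothing matched; the caller's catch branch treats this as "safe to run".
  throw new Exception("no blocked keyword matched");
}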