org.scalatest.exceptions.TestFailedException: spark-submit returned with exit code 1. Command line: './bin/spark-submit' '--name' 'prepare testing tables' '--master' 'local[2]' '--conf' 'spark.ui.enabled=false' '--conf' 'spark.master.rest.enabled=false' '--conf' 'spark.sql.hive.metastore.version=1.2.1' '--conf' 'spark.sql.hive.metastore.jars=maven' '--conf' 'spark.sql.warehouse.dir=/home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/sql/hive/target/tmp/org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite/warehouse-f050d33c-2efb-4f69-944b-1110063c669e' '--conf' 'spark.sql.test.version.index=1' '--driver-java-options' '-Dderby.system.home=/home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/sql/hive/target/tmp/org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite/warehouse-f050d33c-2efb-4f69-944b-1110063c669e' '/home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/sql/hive/target/tmp/org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite/test9150525760230957656.py'
2019-07-31 09:03:15.473 - stderr> 19/07/31 09:03:15 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2019-07-31 09:03:16.354 - stderr> Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
2019-07-31 09:03:16.357 - stderr> 19/07/31 09:03:16 INFO SparkContext: Running Spark version 2.4.3
2019-07-31 09:03:16.393 - stderr> 19/07/31 09:03:16 INFO SparkContext: Submitted application: prepare testing tables
2019-07-31 09:03:16.463 - stderr> 19/07/31 09:03:16 INFO SecurityManager: Changing view acls to: jenkins
2019-07-31 09:03:16.463 - stderr> 19/07/31 09:03:16 INFO SecurityManager: Changing modify acls to: jenkins
2019-07-31 09:03:16.463 - stderr> 19/07/31 09:03:16 INFO SecurityManager: Changing view acls groups to:
2019-07-31 09:03:16.464 - stderr> 19/07/31 09:03:16 INFO SecurityManager: Changing modify acls groups to:
2019-07-31 09:03:16.464 - stderr> 19/07/31 09:03:16 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(jenkins); groups with view permissions: Set(); users with modify permissions: Set(jenkins); groups with modify permissions: Set()
2019-07-31 09:03:16.981 - stderr> 19/07/31 09:03:16 INFO Utils: Successfully started service 'sparkDriver' on port 38524.
2019-07-31 09:03:17.01 - stderr> 19/07/31 09:03:17 INFO SparkEnv: Registering MapOutputTracker
2019-07-31 09:03:17.029 - stderr> 19/07/31 09:03:17 INFO SparkEnv: Registering BlockManagerMaster
2019-07-31 09:03:17.032 - stderr> 19/07/31 09:03:17 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
2019-07-31 09:03:17.032 - stderr> 19/07/31 09:03:17 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
2019-07-31 09:03:17.128 - stderr> 19/07/31 09:03:17 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-afc7ed70-d733-4354-970f-4d8990ca63d0
2019-07-31 09:03:17.156 - stderr> 19/07/31 09:03:17 INFO MemoryStore: MemoryStore started with capacity 366.3 MB
2019-07-31 09:03:17.175 - stderr> 19/07/31 09:03:17 INFO SparkEnv: Registering OutputCommitCoordinator
2019-07-31 09:03:17.285 - stderr> 19/07/31 09:03:17 INFO Executor: Starting executor ID driver on host localhost
2019-07-31 09:03:17.365 - stderr> 19/07/31 09:03:17 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 35258.
2019-07-31 09:03:17.366 - stderr> 19/07/31 09:03:17 INFO NettyBlockTransferService: Server created on amp-jenkins-worker-02.amp:35258
2019-07-31 09:03:17.367 - stderr> 19/07/31 09:03:17 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
2019-07-31 09:03:17.399 - stderr> 19/07/31 09:03:17 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, amp-jenkins-worker-02.amp, 35258, None)
2019-07-31 09:03:17.402 - stderr> 19/07/31 09:03:17 INFO BlockManagerMasterEndpoint: Registering block manager amp-jenkins-worker-02.amp:35258 with 366.3 MB RAM, BlockManagerId(driver, amp-jenkins-worker-02.amp, 35258, None)
2019-07-31 09:03:17.405 - stderr> 19/07/31 09:03:17 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, amp-jenkins-worker-02.amp, 35258, None)
2019-07-31 09:03:17.405 - stderr> 19/07/31 09:03:17 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, amp-jenkins-worker-02.amp, 35258, None)
2019-07-31 09:03:17.796 - stderr> 19/07/31 09:03:17 INFO SharedState: Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('/home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/sql/hive/target/tmp/org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite/warehouse-f050d33c-2efb-4f69-944b-1110063c669e').
2019-07-31 09:03:17.796 - stderr> 19/07/31 09:03:17 INFO SharedState: Warehouse path is '/home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/sql/hive/target/tmp/org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite/warehouse-f050d33c-2efb-4f69-944b-1110063c669e'.
2019-07-31 09:03:18.56 - stderr> 19/07/31 09:03:18 INFO StateStoreCoordinatorRef: Registered StateStoreCoordinator endpoint
2019-07-31 09:03:21.346 - stderr> 19/07/31 09:03:21 INFO HiveUtils: Initializing HiveMetastoreConnection version 1.2.1 using maven.
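The --conf flags in the spark-submit command above map directly onto SparkSession builder options. A minimal PySpark sketch of an equivalent session follows, assuming a placeholder warehouse path; the suite itself generates a per-run temporary directory and Python script that are not shown in this log:

    from pyspark.sql import SparkSession

    # Placeholder path; the suite points this at a per-run temporary directory.
    warehouse = "/tmp/warehouse-placeholder"

    spark = (SparkSession.builder
             .appName("prepare testing tables")
             .master("local[2]")
             .config("spark.ui.enabled", "false")
             .config("spark.sql.hive.metastore.version", "1.2.1")  # Hive 1.2.1 metastore client
             .config("spark.sql.hive.metastore.jars", "maven")     # resolve those client jars via Ivy/Maven
             .config("spark.sql.warehouse.dir", warehouse)
             .enableHiveSupport()
             .getOrCreate())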
2019-07-31 09:03:21.353 - stderr> http://www.datanucleus.org/downloads/maven2 added as a remote repository with the name: repo-1 2019-07-31 09:03:21.356 - stderr> Ivy Default Cache set to: /home/jenkins/.ivy2/cache 2019-07-31 09:03:21.356 - stderr> The jars for the packages stored in: /home/jenkins/.ivy2/jars 2019-07-31 09:03:21.396 - stderr> :: loading settings :: url = jar:file:/tmp/test-spark/spark-2.4.3/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml 2019-07-31 09:03:21.47 - stderr> org.apache.hive#hive-metastore added as a dependency 2019-07-31 09:03:21.47 - stderr> org.apache.hive#hive-exec added as a dependency 2019-07-31 09:03:21.47 - stderr> org.apache.hive#hive-common added as a dependency 2019-07-31 09:03:21.47 - stderr> org.apache.hive#hive-serde added as a dependency 2019-07-31 09:03:21.47 - stderr> com.google.guava#guava added as a dependency 2019-07-31 09:03:21.47 - stderr> org.apache.hadoop#hadoop-client added as a dependency 2019-07-31 09:03:21.472 - stderr> :: resolving dependencies :: org.apache.spark#spark-submit-parent-938b276c-7b20-46cf-8a9f-c06bc7eedecb;1.0 2019-07-31 09:03:21.473 - stderr> confs: [default] 2019-07-31 09:03:21.926 - stderr> found org.apache.hive#hive-metastore;1.2.2 in central 2019-07-31 09:03:21.984 - stderr> found org.apache.hive#hive-serde;1.2.2 in central 2019-07-31 09:03:22.041 - stderr> found org.apache.hive#hive-common;1.2.2 in central 2019-07-31 09:03:22.087 - stderr> found org.apache.hive#hive-shims;1.2.2 in central 2019-07-31 09:03:22.13 - stderr> found org.apache.hive.shims#hive-shims-common;1.2.2 in central 2019-07-31 09:03:22.176 - stderr> found commons-logging#commons-logging;1.1.3 in central 2019-07-31 09:03:22.201 - stderr> found log4j#log4j;1.2.16 in central 2019-07-31 09:03:22.23 - stderr> found log4j#apache-log4j-extras;1.2.17 in central 2019-07-31 09:03:22.281 - stderr> found com.google.guava#guava;14.0.1 in central 2019-07-31 09:03:22.315 - stderr> found commons-lang#commons-lang;2.6 in central 2019-07-31 09:03:22.347 - stderr> found org.apache.thrift#libthrift;0.9.2 in central 2019-07-31 09:03:22.374 - stderr> found org.slf4j#slf4j-api;1.7.5 in central 2019-07-31 09:03:22.398 - stderr> found org.apache.httpcomponents#httpclient;4.4 in central 2019-07-31 09:03:22.42 - stderr> found org.apache.httpcomponents#httpcore;4.4 in central 2019-07-31 09:03:22.442 - stderr> found commons-codec#commons-codec;1.4 in central 2019-07-31 09:03:22.475 - stderr> found org.apache.zookeeper#zookeeper;3.4.6 in central 2019-07-31 09:03:22.521 - stderr> found org.slf4j#slf4j-log4j12;1.7.5 in central 2019-07-31 09:03:22.558 - stderr> found jline#jline;2.12 in central 2019-07-31 09:03:22.591 - stderr> found io.netty#netty;3.7.0.Final in central 2019-07-31 09:03:22.618 - stderr> found org.apache.hive.shims#hive-shims-0.20S;1.2.2 in central 2019-07-31 09:03:22.65 - stderr> found org.apache.hive.shims#hive-shims-0.23;1.2.2 in central 2019-07-31 09:03:22.682 - stderr> found org.apache.hadoop#hadoop-yarn-server-resourcemanager;2.6.0 in central 2019-07-31 09:03:22.732 - stderr> found org.apache.hadoop#hadoop-annotations;2.6.0 in central 2019-07-31 09:03:22.762 - stderr> found com.google.inject.extensions#guice-servlet;3.0 in central 2019-07-31 09:03:22.79 - stderr> found com.google.inject#guice;3.0 in central 2019-07-31 09:03:22.818 - stderr> found javax.inject#javax.inject;1 in central 2019-07-31 09:03:22.845 - stderr> found aopalliance#aopalliance;1.0 in central 2019-07-31 09:03:22.863 - stderr> found 
org.sonatype.sisu.inject#cglib;2.2.1-v20090111 in central 2019-07-31 09:03:22.881 - stderr> found asm#asm;3.2 in central 2019-07-31 09:03:22.907 - stderr> found com.google.protobuf#protobuf-java;2.5.0 in central 2019-07-31 09:03:22.933 - stderr> found commons-io#commons-io;2.4 in central 2019-07-31 09:03:22.957 - stderr> found com.sun.jersey#jersey-json;1.14 in central 2019-07-31 09:03:22.998 - stderr> found org.codehaus.jettison#jettison;1.1 in central 2019-07-31 09:03:23.022 - stderr> found com.sun.xml.bind#jaxb-impl;2.2.3-1 in central 2019-07-31 09:03:23.044 - stderr> found javax.xml.bind#jaxb-api;2.2.2 in central 2019-07-31 09:03:23.067 - stderr> found javax.xml.stream#stax-api;1.0-2 in central 2019-07-31 09:03:23.09 - stderr> found javax.activation#activation;1.1 in central 2019-07-31 09:03:23.112 - stderr> found org.codehaus.jackson#jackson-core-asl;1.9.2 in central 2019-07-31 09:03:23.134 - stderr> found org.codehaus.jackson#jackson-mapper-asl;1.9.2 in central 2019-07-31 09:03:23.157 - stderr> found org.codehaus.jackson#jackson-jaxrs;1.9.2 in central 2019-07-31 09:03:23.18 - stderr> found org.codehaus.jackson#jackson-xc;1.9.2 in central 2019-07-31 09:03:23.203 - stderr> found com.sun.jersey.contribs#jersey-guice;1.9 in central 2019-07-31 09:03:23.232 - stderr> found org.apache.hadoop#hadoop-yarn-common;2.6.0 in central 2019-07-31 09:03:23.264 - stderr> found org.apache.hadoop#hadoop-yarn-api;2.6.0 in central 2019-07-31 09:03:23.337 - stderr> found org.apache.commons#commons-compress;1.4.1 in central 2019-07-31 09:03:23.361 - stderr> found org.tukaani#xz;1.0 in central 2019-07-31 09:03:23.394 - stderr> found org.mortbay.jetty#jetty-util;6.1.26 in central 2019-07-31 09:03:23.418 - stderr> found com.sun.jersey#jersey-core;1.14 in central 2019-07-31 09:03:23.441 - stderr> found com.sun.jersey#jersey-client;1.9 in central 2019-07-31 09:03:23.478 - stderr> found commons-cli#commons-cli;1.2 in central 2019-07-31 09:03:23.516 - stderr> found com.sun.jersey#jersey-server;1.14 in central 2019-07-31 09:03:23.56 - stderr> found org.apache.hadoop#hadoop-yarn-server-common;2.6.0 in central 2019-07-31 09:03:23.587 - stderr> found org.fusesource.leveldbjni#leveldbjni-all;1.8 in central 2019-07-31 09:03:23.797 - stderr> found org.apache.hadoop#hadoop-yarn-server-applicationhistoryservice;2.6.0 in central 2019-07-31 09:03:23.859 - stderr> found commons-collections#commons-collections;3.2.2 in central 2019-07-31 09:03:23.881 - stderr> found org.apache.hadoop#hadoop-yarn-server-web-proxy;2.6.0 in central 2019-07-31 09:03:23.9 - stderr> found commons-httpclient#commons-httpclient;3.0.1 in central 2019-07-31 09:03:23.922 - stderr> found junit#junit;4.11 in central 2019-07-31 09:03:23.942 - stderr> found org.hamcrest#hamcrest-core;1.3 in central 2019-07-31 09:03:23.964 - stderr> found org.mortbay.jetty#jetty;6.1.26 in central 2019-07-31 09:03:24.012 - stderr> found org.apache.hive.shims#hive-shims-scheduler;1.2.2 in central 2019-07-31 09:03:24.031 - stderr> found joda-time#joda-time;2.5 in central 2019-07-31 09:03:24.045 - stderr> found org.apache.ant#ant;1.9.1 in central 2019-07-31 09:03:24.059 - stderr> found org.apache.ant#ant-launcher;1.9.1 in central 2019-07-31 09:03:24.073 - stderr> found org.json#json;20090211 in central 2019-07-31 09:03:24.088 - stderr> found com.google.code.findbugs#jsr305;3.0.0 in central 2019-07-31 09:03:24.101 - stderr> found org.apache.avro#avro;1.7.5 in central 2019-07-31 09:03:24.112 - stderr> found com.thoughtworks.paranamer#paranamer;2.3 in central 2019-07-31 09:03:24.124 
- stderr> found org.xerial.snappy#snappy-java;1.0.5 in central 2019-07-31 09:03:24.137 - stderr> found net.sf.opencsv#opencsv;2.3 in central 2019-07-31 09:03:24.15 - stderr> found com.twitter#parquet-hadoop-bundle;1.6.0 in central 2019-07-31 09:03:24.162 - stderr> found com.jolbox#bonecp;0.8.0.RELEASE in central 2019-07-31 09:03:24.172 - stderr> found org.apache.derby#derby;10.10.2.0 in central 2019-07-31 09:03:24.183 - stderr> found org.datanucleus#datanucleus-api-jdo;3.2.6 in central 2019-07-31 09:03:24.195 - stderr> found org.datanucleus#datanucleus-core;3.2.10 in central 2019-07-31 09:03:24.205 - stderr> found org.datanucleus#datanucleus-rdbms;3.2.9 in central 2019-07-31 09:03:24.215 - stderr> found commons-pool#commons-pool;1.5.4 in central 2019-07-31 09:03:24.227 - stderr> found commons-dbcp#commons-dbcp;1.4 in central 2019-07-31 09:03:24.239 - stderr> found javax.jdo#jdo-api;3.0.1 in central 2019-07-31 09:03:24.252 - stderr> found javax.transaction#jta;1.1 in central 2019-07-31 09:03:24.263 - stderr> found org.antlr#antlr-runtime;3.4 in central 2019-07-31 09:03:24.274 - stderr> found org.antlr#stringtemplate;3.2.1 in central 2019-07-31 09:03:24.284 - stderr> found antlr#antlr;2.7.7 in central 2019-07-31 09:03:24.295 - stderr> found org.apache.thrift#libfb303;0.9.2 in central 2019-07-31 09:03:24.31 - stderr> found org.apache.hive#hive-exec;1.2.2 in central 2019-07-31 09:03:24.327 - stderr> found org.apache.hive#hive-ant;1.2.2 in central 2019-07-31 09:03:24.357 - stderr> found org.apache.velocity#velocity;1.5 in central 2019-07-31 09:03:24.377 - stderr> found oro#oro;2.0.8 in central 2019-07-31 09:03:24.439 - stderr> found org.antlr#ST4;4.0.4 in central 2019-07-31 09:03:24.461 - stderr> found org.apache.ivy#ivy;2.4.0 in central 2019-07-31 09:03:24.479 - stderr> found org.codehaus.groovy#groovy-all;2.1.6 in central 2019-07-31 09:03:24.496 - stderr> found org.apache.calcite#calcite-core;1.2.0-incubating in central 2019-07-31 09:03:24.509 - stderr> found org.apache.calcite#calcite-avatica;1.2.0-incubating in central 2019-07-31 09:03:24.522 - stderr> found org.apache.calcite#calcite-linq4j;1.2.0-incubating in central 2019-07-31 09:03:24.551 - stderr> found net.hydromatic#eigenbase-properties;1.1.5 in central 2019-07-31 09:03:24.564 - stderr> found org.codehaus.janino#janino;2.7.6 in central 2019-07-31 09:03:24.578 - stderr> found org.codehaus.janino#commons-compiler;2.7.6 in central 2019-07-31 09:03:24.592 - stderr> found stax#stax-api;1.0.1 in central 2019-07-31 09:03:24.612 - stderr> found org.apache.hadoop#hadoop-client;2.7.3 in central 2019-07-31 09:03:24.635 - stderr> found org.apache.hadoop#hadoop-common;2.7.3 in central 2019-07-31 09:03:24.66 - stderr> found org.apache.hadoop#hadoop-annotations;2.7.3 in central 2019-07-31 09:03:24.679 - stderr> found org.apache.commons#commons-math3;3.1.1 in central 2019-07-31 09:03:24.689 - stderr> found xmlenc#xmlenc;0.52 in central 2019-07-31 09:03:24.699 - stderr> found commons-httpclient#commons-httpclient;3.1 in central 2019-07-31 09:03:24.72 - stderr> found commons-net#commons-net;3.1 in central 2019-07-31 09:03:24.735 - stderr> found log4j#log4j;1.2.17 in central 2019-07-31 09:03:24.751 - stderr> found commons-configuration#commons-configuration;1.6 in central 2019-07-31 09:03:24.767 - stderr> found commons-digester#commons-digester;1.8 in central 2019-07-31 09:03:24.776 - stderr> found commons-beanutils#commons-beanutils;1.7.0 in central 2019-07-31 09:03:24.786 - stderr> found commons-beanutils#commons-beanutils-core;1.8.0 in central 
2019-07-31 09:03:24.795 - stderr> found org.slf4j#slf4j-api;1.7.10 in central 2019-07-31 09:03:24.804 - stderr> found org.codehaus.jackson#jackson-core-asl;1.9.13 in central 2019-07-31 09:03:24.812 - stderr> found org.codehaus.jackson#jackson-mapper-asl;1.9.13 in central 2019-07-31 09:03:24.829 - stderr> found com.google.code.gson#gson;2.2.4 in central 2019-07-31 09:03:24.841 - stderr> found org.apache.hadoop#hadoop-auth;2.7.3 in central 2019-07-31 09:03:24.863 - stderr> found org.apache.directory.server#apacheds-kerberos-codec;2.0.0-M15 in central 2019-07-31 09:03:24.874 - stderr> found org.apache.directory.server#apacheds-i18n;2.0.0-M15 in central 2019-07-31 09:03:24.886 - stderr> found org.apache.directory.api#api-asn1-api;1.0.0-M20 in central 2019-07-31 09:03:24.902 - stderr> found org.apache.directory.api#api-util;1.0.0-M20 in central 2019-07-31 09:03:24.925 - stderr> found org.apache.htrace#htrace-core;3.1.0-incubating in central 2019-07-31 09:03:24.942 - stderr> found javax.servlet.jsp#jsp-api;2.1 in central 2019-07-31 09:03:24.954 - stderr> found org.slf4j#slf4j-log4j12;1.7.10 in central 2019-07-31 09:03:24.97 - stderr> found org.apache.hadoop#hadoop-hdfs;2.7.3 in central 2019-07-31 09:03:24.997 - stderr> found io.netty#netty-all;4.0.23.Final in central 2019-07-31 09:03:25.135 - stderr> found xerces#xercesImpl;2.9.1 in central 2019-07-31 09:03:25.149 - stderr> found xml-apis#xml-apis;1.3.04 in central 2019-07-31 09:03:25.171 - stderr> found org.apache.hadoop#hadoop-mapreduce-client-app;2.7.3 in central 2019-07-31 09:03:25.188 - stderr> found org.apache.hadoop#hadoop-mapreduce-client-common;2.7.3 in central 2019-07-31 09:03:25.206 - stderr> found org.apache.hadoop#hadoop-yarn-common;2.7.3 in central 2019-07-31 09:03:25.229 - stderr> found org.apache.hadoop#hadoop-yarn-api;2.7.3 in central 2019-07-31 09:03:25.322 - stderr> found org.codehaus.jackson#jackson-jaxrs;1.9.13 in central 2019-07-31 09:03:25.337 - stderr> found org.codehaus.jackson#jackson-xc;1.9.13 in central 2019-07-31 09:03:25.4 - stderr> found org.apache.hadoop#hadoop-yarn-client;2.7.3 in central 2019-07-31 09:03:25.415 - stderr> found org.apache.hadoop#hadoop-mapreduce-client-core;2.7.3 in central 2019-07-31 09:03:25.431 - stderr> found org.apache.hadoop#hadoop-yarn-server-common;2.7.3 in central 2019-07-31 09:03:25.454 - stderr> found org.apache.hadoop#hadoop-mapreduce-client-shuffle;2.7.3 in central 2019-07-31 09:03:25.473 - stderr> found org.apache.hadoop#hadoop-mapreduce-client-jobclient;2.7.3 in central 2019-07-31 09:03:25.568 - stderr> :: resolution report :: resolve 4045ms :: artifacts dl 51ms 2019-07-31 09:03:25.568 - stderr> :: modules in use: 2019-07-31 09:03:25.57 - stderr> antlr#antlr;2.7.7 from central in [default] 2019-07-31 09:03:25.57 - stderr> aopalliance#aopalliance;1.0 from central in [default] 2019-07-31 09:03:25.57 - stderr> asm#asm;3.2 from central in [default] 2019-07-31 09:03:25.57 - stderr> com.google.code.findbugs#jsr305;3.0.0 from central in [default] 2019-07-31 09:03:25.57 - stderr> com.google.code.gson#gson;2.2.4 from central in [default] 2019-07-31 09:03:25.571 - stderr> com.google.guava#guava;14.0.1 from central in [default] 2019-07-31 09:03:25.571 - stderr> com.google.inject#guice;3.0 from central in [default] 2019-07-31 09:03:25.571 - stderr> com.google.inject.extensions#guice-servlet;3.0 from central in [default] 2019-07-31 09:03:25.571 - stderr> com.google.protobuf#protobuf-java;2.5.0 from central in [default] 2019-07-31 09:03:25.571 - stderr> com.jolbox#bonecp;0.8.0.RELEASE from 
central in [default] 2019-07-31 09:03:25.571 - stderr> com.sun.jersey#jersey-client;1.9 from central in [default] 2019-07-31 09:03:25.572 - stderr> com.sun.jersey#jersey-core;1.14 from central in [default] 2019-07-31 09:03:25.572 - stderr> com.sun.jersey#jersey-json;1.14 from central in [default] 2019-07-31 09:03:25.572 - stderr> com.sun.jersey#jersey-server;1.14 from central in [default] 2019-07-31 09:03:25.572 - stderr> com.sun.jersey.contribs#jersey-guice;1.9 from central in [default] 2019-07-31 09:03:25.572 - stderr> com.sun.xml.bind#jaxb-impl;2.2.3-1 from central in [default] 2019-07-31 09:03:25.573 - stderr> com.thoughtworks.paranamer#paranamer;2.3 from central in [default] 2019-07-31 09:03:25.573 - stderr> com.twitter#parquet-hadoop-bundle;1.6.0 from central in [default] 2019-07-31 09:03:25.573 - stderr> commons-beanutils#commons-beanutils;1.7.0 from central in [default] 2019-07-31 09:03:25.573 - stderr> commons-beanutils#commons-beanutils-core;1.8.0 from central in [default] 2019-07-31 09:03:25.573 - stderr> commons-cli#commons-cli;1.2 from central in [default] 2019-07-31 09:03:25.574 - stderr> commons-codec#commons-codec;1.4 from central in [default] 2019-07-31 09:03:25.574 - stderr> commons-collections#commons-collections;3.2.2 from central in [default] 2019-07-31 09:03:25.574 - stderr> commons-configuration#commons-configuration;1.6 from central in [default] 2019-07-31 09:03:25.574 - stderr> commons-dbcp#commons-dbcp;1.4 from central in [default] 2019-07-31 09:03:25.574 - stderr> commons-digester#commons-digester;1.8 from central in [default] 2019-07-31 09:03:25.575 - stderr> commons-httpclient#commons-httpclient;3.1 from central in [default] 2019-07-31 09:03:25.575 - stderr> commons-io#commons-io;2.4 from central in [default] 2019-07-31 09:03:25.575 - stderr> commons-lang#commons-lang;2.6 from central in [default] 2019-07-31 09:03:25.575 - stderr> commons-logging#commons-logging;1.1.3 from central in [default] 2019-07-31 09:03:25.576 - stderr> commons-net#commons-net;3.1 from central in [default] 2019-07-31 09:03:25.576 - stderr> commons-pool#commons-pool;1.5.4 from central in [default] 2019-07-31 09:03:25.576 - stderr> io.netty#netty;3.7.0.Final from central in [default] 2019-07-31 09:03:25.577 - stderr> io.netty#netty-all;4.0.23.Final from central in [default] 2019-07-31 09:03:25.577 - stderr> javax.activation#activation;1.1 from central in [default] 2019-07-31 09:03:25.577 - stderr> javax.inject#javax.inject;1 from central in [default] 2019-07-31 09:03:25.578 - stderr> javax.jdo#jdo-api;3.0.1 from central in [default] 2019-07-31 09:03:25.578 - stderr> javax.servlet.jsp#jsp-api;2.1 from central in [default] 2019-07-31 09:03:25.578 - stderr> javax.transaction#jta;1.1 from central in [default] 2019-07-31 09:03:25.579 - stderr> javax.xml.bind#jaxb-api;2.2.2 from central in [default] 2019-07-31 09:03:25.579 - stderr> javax.xml.stream#stax-api;1.0-2 from central in [default] 2019-07-31 09:03:25.579 - stderr> jline#jline;2.12 from central in [default] 2019-07-31 09:03:25.579 - stderr> joda-time#joda-time;2.5 from central in [default] 2019-07-31 09:03:25.579 - stderr> log4j#apache-log4j-extras;1.2.17 from central in [default] 2019-07-31 09:03:25.579 - stderr> log4j#log4j;1.2.17 from central in [default] 2019-07-31 09:03:25.579 - stderr> net.hydromatic#eigenbase-properties;1.1.5 from central in [default] 2019-07-31 09:03:25.58 - stderr> net.sf.opencsv#opencsv;2.3 from central in [default] 2019-07-31 09:03:25.58 - stderr> org.antlr#ST4;4.0.4 from central in [default] 2019-07-31 
09:03:25.58 - stderr> org.antlr#antlr-runtime;3.4 from central in [default] 2019-07-31 09:03:25.58 - stderr> org.antlr#stringtemplate;3.2.1 from central in [default] 2019-07-31 09:03:25.58 - stderr> org.apache.ant#ant;1.9.1 from central in [default] 2019-07-31 09:03:25.58 - stderr> org.apache.ant#ant-launcher;1.9.1 from central in [default] 2019-07-31 09:03:25.58 - stderr> org.apache.avro#avro;1.7.5 from central in [default] 2019-07-31 09:03:25.581 - stderr> org.apache.calcite#calcite-avatica;1.2.0-incubating from central in [default] 2019-07-31 09:03:25.581 - stderr> org.apache.calcite#calcite-core;1.2.0-incubating from central in [default] 2019-07-31 09:03:25.581 - stderr> org.apache.calcite#calcite-linq4j;1.2.0-incubating from central in [default] 2019-07-31 09:03:25.581 - stderr> org.apache.commons#commons-compress;1.4.1 from central in [default] 2019-07-31 09:03:25.581 - stderr> org.apache.commons#commons-math3;3.1.1 from central in [default] 2019-07-31 09:03:25.581 - stderr> org.apache.derby#derby;10.10.2.0 from central in [default] 2019-07-31 09:03:25.581 - stderr> org.apache.directory.api#api-asn1-api;1.0.0-M20 from central in [default] 2019-07-31 09:03:25.581 - stderr> org.apache.directory.api#api-util;1.0.0-M20 from central in [default] 2019-07-31 09:03:25.582 - stderr> org.apache.directory.server#apacheds-i18n;2.0.0-M15 from central in [default] 2019-07-31 09:03:25.582 - stderr> org.apache.directory.server#apacheds-kerberos-codec;2.0.0-M15 from central in [default] 2019-07-31 09:03:25.582 - stderr> org.apache.hadoop#hadoop-annotations;2.7.3 from central in [default] 2019-07-31 09:03:25.582 - stderr> org.apache.hadoop#hadoop-auth;2.7.3 from central in [default] 2019-07-31 09:03:25.582 - stderr> org.apache.hadoop#hadoop-client;2.7.3 from central in [default] 2019-07-31 09:03:25.582 - stderr> org.apache.hadoop#hadoop-common;2.7.3 from central in [default] 2019-07-31 09:03:25.582 - stderr> org.apache.hadoop#hadoop-hdfs;2.7.3 from central in [default] 2019-07-31 09:03:25.582 - stderr> org.apache.hadoop#hadoop-mapreduce-client-app;2.7.3 from central in [default] 2019-07-31 09:03:25.583 - stderr> org.apache.hadoop#hadoop-mapreduce-client-common;2.7.3 from central in [default] 2019-07-31 09:03:25.583 - stderr> org.apache.hadoop#hadoop-mapreduce-client-core;2.7.3 from central in [default] 2019-07-31 09:03:25.583 - stderr> org.apache.hadoop#hadoop-mapreduce-client-jobclient;2.7.3 from central in [default] 2019-07-31 09:03:25.583 - stderr> org.apache.hadoop#hadoop-mapreduce-client-shuffle;2.7.3 from central in [default] 2019-07-31 09:03:25.583 - stderr> org.apache.hadoop#hadoop-yarn-api;2.7.3 from central in [default] 2019-07-31 09:03:25.583 - stderr> org.apache.hadoop#hadoop-yarn-client;2.7.3 from central in [default] 2019-07-31 09:03:25.583 - stderr> org.apache.hadoop#hadoop-yarn-common;2.7.3 from central in [default] 2019-07-31 09:03:25.583 - stderr> org.apache.hadoop#hadoop-yarn-server-applicationhistoryservice;2.6.0 from central in [default] 2019-07-31 09:03:25.584 - stderr> org.apache.hadoop#hadoop-yarn-server-common;2.7.3 from central in [default] 2019-07-31 09:03:25.584 - stderr> org.apache.hadoop#hadoop-yarn-server-resourcemanager;2.6.0 from central in [default] 2019-07-31 09:03:25.584 - stderr> org.apache.hadoop#hadoop-yarn-server-web-proxy;2.6.0 from central in [default] 2019-07-31 09:03:25.584 - stderr> org.apache.hive#hive-ant;1.2.2 from central in [default] 2019-07-31 09:03:25.584 - stderr> org.apache.hive#hive-common;1.2.2 from central in [default] 2019-07-31 09:03:25.584 - 
stderr> org.apache.hive#hive-exec;1.2.2 from central in [default] 2019-07-31 09:03:25.584 - stderr> org.apache.hive#hive-metastore;1.2.2 from central in [default] 2019-07-31 09:03:25.584 - stderr> org.apache.hive#hive-serde;1.2.2 from central in [default] 2019-07-31 09:03:25.585 - stderr> org.apache.hive#hive-shims;1.2.2 from central in [default] 2019-07-31 09:03:25.585 - stderr> org.apache.hive.shims#hive-shims-0.20S;1.2.2 from central in [default] 2019-07-31 09:03:25.585 - stderr> org.apache.hive.shims#hive-shims-0.23;1.2.2 from central in [default] 2019-07-31 09:03:25.585 - stderr> org.apache.hive.shims#hive-shims-common;1.2.2 from central in [default] 2019-07-31 09:03:25.585 - stderr> org.apache.hive.shims#hive-shims-scheduler;1.2.2 from central in [default] 2019-07-31 09:03:25.585 - stderr> org.apache.htrace#htrace-core;3.1.0-incubating from central in [default] 2019-07-31 09:03:25.585 - stderr> org.apache.httpcomponents#httpclient;4.4 from central in [default] 2019-07-31 09:03:25.586 - stderr> org.apache.httpcomponents#httpcore;4.4 from central in [default] 2019-07-31 09:03:25.586 - stderr> org.apache.ivy#ivy;2.4.0 from central in [default] 2019-07-31 09:03:25.586 - stderr> org.apache.thrift#libfb303;0.9.2 from central in [default] 2019-07-31 09:03:25.586 - stderr> org.apache.thrift#libthrift;0.9.2 from central in [default] 2019-07-31 09:03:25.586 - stderr> org.apache.velocity#velocity;1.5 from central in [default] 2019-07-31 09:03:25.586 - stderr> org.apache.zookeeper#zookeeper;3.4.6 from central in [default] 2019-07-31 09:03:25.586 - stderr> org.codehaus.groovy#groovy-all;2.1.6 from central in [default] 2019-07-31 09:03:25.587 - stderr> org.codehaus.jackson#jackson-core-asl;1.9.13 from central in [default] 2019-07-31 09:03:25.587 - stderr> org.codehaus.jackson#jackson-jaxrs;1.9.13 from central in [default] 2019-07-31 09:03:25.587 - stderr> org.codehaus.jackson#jackson-mapper-asl;1.9.13 from central in [default] 2019-07-31 09:03:25.587 - stderr> org.codehaus.jackson#jackson-xc;1.9.13 from central in [default] 2019-07-31 09:03:25.587 - stderr> org.codehaus.janino#commons-compiler;2.7.6 from central in [default] 2019-07-31 09:03:25.588 - stderr> org.codehaus.janino#janino;2.7.6 from central in [default] 2019-07-31 09:03:25.588 - stderr> org.codehaus.jettison#jettison;1.1 from central in [default] 2019-07-31 09:03:25.588 - stderr> org.datanucleus#datanucleus-api-jdo;3.2.6 from central in [default] 2019-07-31 09:03:25.588 - stderr> org.datanucleus#datanucleus-core;3.2.10 from central in [default] 2019-07-31 09:03:25.588 - stderr> org.datanucleus#datanucleus-rdbms;3.2.9 from central in [default] 2019-07-31 09:03:25.588 - stderr> org.fusesource.leveldbjni#leveldbjni-all;1.8 from central in [default] 2019-07-31 09:03:25.588 - stderr> org.json#json;20090211 from central in [default] 2019-07-31 09:03:25.589 - stderr> org.mortbay.jetty#jetty;6.1.26 from central in [default] 2019-07-31 09:03:25.589 - stderr> org.mortbay.jetty#jetty-util;6.1.26 from central in [default] 2019-07-31 09:03:25.589 - stderr> org.slf4j#slf4j-api;1.7.10 from central in [default] 2019-07-31 09:03:25.589 - stderr> org.slf4j#slf4j-log4j12;1.7.10 from central in [default] 2019-07-31 09:03:25.589 - stderr> org.sonatype.sisu.inject#cglib;2.2.1-v20090111 from central in [default] 2019-07-31 09:03:25.59 - stderr> org.tukaani#xz;1.0 from central in [default] 2019-07-31 09:03:25.59 - stderr> org.xerial.snappy#snappy-java;1.0.5 from central in [default] 2019-07-31 09:03:25.59 - stderr> oro#oro;2.0.8 from central in [default] 
2019-07-31 09:03:25.591 - stderr> stax#stax-api;1.0.1 from central in [default]
2019-07-31 09:03:25.592 - stderr> xerces#xercesImpl;2.9.1 from central in [default]
2019-07-31 09:03:25.592 - stderr> xml-apis#xml-apis;1.3.04 from central in [default]
2019-07-31 09:03:25.592 - stderr> xmlenc#xmlenc;0.52 from central in [default]
2019-07-31 09:03:25.592 - stderr> :: evicted modules:
2019-07-31 09:03:25.592 - stderr> log4j#log4j;1.2.16 by [log4j#log4j;1.2.17] in [default]
2019-07-31 09:03:25.593 - stderr> org.slf4j#slf4j-api;1.7.5 by [org.slf4j#slf4j-api;1.7.10] in [default]
2019-07-31 09:03:25.593 - stderr> org.slf4j#slf4j-log4j12;1.7.5 by [org.slf4j#slf4j-log4j12;1.7.10] in [default]
2019-07-31 09:03:25.593 - stderr> org.apache.hadoop#hadoop-annotations;2.6.0 by [org.apache.hadoop#hadoop-annotations;2.7.3] in [default]
2019-07-31 09:03:25.593 - stderr> org.codehaus.jackson#jackson-core-asl;1.9.2 by [org.codehaus.jackson#jackson-core-asl;1.9.13] in [default]
2019-07-31 09:03:25.593 - stderr> org.codehaus.jackson#jackson-mapper-asl;1.9.2 by [org.codehaus.jackson#jackson-mapper-asl;1.9.13] in [default]
2019-07-31 09:03:25.593 - stderr> org.codehaus.jackson#jackson-jaxrs;1.9.2 by [org.codehaus.jackson#jackson-jaxrs;1.9.13] in [default]
2019-07-31 09:03:25.593 - stderr> org.codehaus.jackson#jackson-xc;1.9.2 by [org.codehaus.jackson#jackson-xc;1.9.13] in [default]
2019-07-31 09:03:25.593 - stderr> org.apache.hadoop#hadoop-yarn-common;2.6.0 by [org.apache.hadoop#hadoop-yarn-common;2.7.3] in [default]
2019-07-31 09:03:25.593 - stderr> org.apache.hadoop#hadoop-yarn-api;2.6.0 by [org.apache.hadoop#hadoop-yarn-api;2.7.3] in [default]
2019-07-31 09:03:25.593 - stderr> org.apache.hadoop#hadoop-yarn-server-common;2.6.0 by [org.apache.hadoop#hadoop-yarn-server-common;2.7.3] in [default]
2019-07-31 09:03:25.594 - stderr> commons-httpclient#commons-httpclient;3.0.1 by [commons-httpclient#commons-httpclient;3.1] in [default]
2019-07-31 09:03:25.594 - stderr> junit#junit;4.11 transitively in [default]
2019-07-31 09:03:25.594 - stderr> org.hamcrest#hamcrest-core;1.3 transitively in [default]
2019-07-31 09:03:25.594 - stderr> com.google.code.findbugs#jsr305;1.3.9 by [com.google.code.findbugs#jsr305;3.0.0] in [default]
2019-07-31 09:03:25.594 - stderr> com.google.guava#guava;11.0.2 by [com.google.guava#guava;14.0.1] in [default]
2019-07-31 09:03:25.594 - stderr> org.apache.avro#avro;1.7.4 by [org.apache.avro#avro;1.7.5] in [default]
2019-07-31 09:03:25.594 - stderr> org.apache.httpcomponents#httpclient;4.2.5 by [org.apache.httpcomponents#httpclient;4.4] in [default]
2019-07-31 09:03:25.594 - stderr> io.netty#netty;3.6.2.Final by [io.netty#netty;3.7.0.Final] in [default]
2019-07-31 09:03:25.594 - stderr> com.sun.jersey#jersey-core;1.9 by [com.sun.jersey#jersey-core;1.14] in [default]
2019-07-31 09:03:25.594 - stderr> com.sun.jersey#jersey-server;1.9 by [com.sun.jersey#jersey-server;1.14] in [default]
2019-07-31 09:03:25.594 - stderr> com.sun.jersey#jersey-json;1.9 by [com.sun.jersey#jersey-json;1.14] in [default]
2019-07-31 09:03:25.594 - stderr> ---------------------------------------------------------------------
2019-07-31 09:03:25.595 - stderr> | | modules || artifacts |
2019-07-31 09:03:25.595 - stderr> | conf | number| search|dwnlded|evicted|| number|dwnlded|
2019-07-31 09:03:25.595 - stderr> ---------------------------------------------------------------------
2019-07-31 09:03:25.595 - stderr> | default | 145 | 0 | 0 | 22 || 123 | 0 |
2019-07-31 09:03:25.595 - stderr> ---------------------------------------------------------------------
2019-07-31 09:03:25.642 - stderr> :: retrieving :: org.apache.spark#spark-submit-parent-938b276c-7b20-46cf-8a9f-c06bc7eedecb
2019-07-31 09:03:25.642 - stderr> confs: [default]
2019-07-31 09:03:25.731 - stderr> 0 artifacts copied, 123 already retrieved (0kB/89ms)
2019-07-31 09:03:25.902 - stderr> 19/07/31 09:03:25 INFO IsolatedClientLoader: Downloaded metastore jars to /tmp/hive-v1_2-8bc683c2-f316-4161-9200-1d49b8a0c223
2019-07-31 09:03:26.622 - stderr> 19/07/31 09:03:26 INFO HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
2019-07-31 09:03:26.65 - stderr> 19/07/31 09:03:26 INFO ObjectStore: ObjectStore, initialize called
2019-07-31 09:03:26.8 - stderr> 19/07/31 09:03:26 INFO Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
2019-07-31 09:03:26.8 - stderr> 19/07/31 09:03:26 INFO Persistence: Property datanucleus.cache.level2 unknown - will be ignored
2019-07-31 09:03:36.04 - stderr> 19/07/31 09:03:36 INFO ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
2019-07-31 09:03:37.881 - stderr> 19/07/31 09:03:37 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
2019-07-31 09:03:37.882 - stderr> 19/07/31 09:03:37 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
2019-07-31 09:03:38.132 - stderr> 19/07/31 09:03:38 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
2019-07-31 09:03:38.132 - stderr> 19/07/31 09:03:38 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
2019-07-31 09:03:38.219 - stderr> 19/07/31 09:03:38 INFO Query: Reading in results for query "org.datanucleus.store.rdbms.query.SQLQuery@0" since the connection used is closing
2019-07-31 09:03:38.221 - stderr> 19/07/31 09:03:38 INFO MetaStoreDirectSql: Using direct SQL, underlying DB is DERBY
2019-07-31 09:03:38.224 - stderr> 19/07/31 09:03:38 INFO ObjectStore: Initialized ObjectStore
2019-07-31 09:03:38.505 - stderr> 19/07/31 09:03:38 INFO HiveMetaStore: Added admin role in metastore
2019-07-31 09:03:38.511 - stderr> 19/07/31 09:03:38 INFO HiveMetaStore: Added public role in metastore
2019-07-31 09:03:38.571 - stderr> 19/07/31 09:03:38 INFO HiveMetaStore: No user is added in admin role, since config is empty
2019-07-31 09:03:38.682 - stderr> 19/07/31 09:03:38 INFO HiveMetaStore: 0: get_all_databases
2019-07-31 09:03:38.684 - stderr> 19/07/31 09:03:38 INFO audit: ugi=jenkins ip=unknown-ip-addr cmd=get_all_databases
2019-07-31 09:03:38.703 - stderr> 19/07/31 09:03:38 INFO HiveMetaStore: 0: get_functions: db=default pat=*
2019-07-31 09:03:38.704 - stderr> 19/07/31 09:03:38 INFO audit: ugi=jenkins ip=unknown-ip-addr cmd=get_functions: db=default pat=*
2019-07-31 09:03:38.705 - stderr> 19/07/31 09:03:38 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MResourceUri" is tagged as "embedded-only" so does not have its own datastore table.
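The Ivy resolution above is what spark.sql.hive.metastore.jars=maven triggers: the Hive 1.2.1 client jars are resolved into the local Ivy cache and then handed to the isolated metastore classloader. As a sketch only (not what the suite does), re-resolution on every run can be avoided by supplying a pre-populated classpath instead; the directory below is a placeholder:

    # Sketch: point the isolated Hive client at already-downloaded jars
    # instead of resolving them from Maven on each run.
    spark = (SparkSession.builder
             .config("spark.sql.hive.metastore.version", "1.2.1")
             .config("spark.sql.hive.metastore.jars", "/opt/hive-1.2.1-jars/*")  # placeholder classpath
             .enableHiveSupport()
             .getOrCreate())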
2019-07-31 09:03:38.773 - stderr> 19/07/31 09:03:38 INFO SessionState: Created local directory: /tmp/f399e1fe-8b7b-4ed6-9232-1f649d612162_resources
2019-07-31 09:03:38.778 - stderr> 19/07/31 09:03:38 INFO SessionState: Created HDFS directory: /tmp/hive/jenkins/f399e1fe-8b7b-4ed6-9232-1f649d612162
2019-07-31 09:03:38.782 - stderr> 19/07/31 09:03:38 INFO SessionState: Created local directory: /tmp/jenkins/f399e1fe-8b7b-4ed6-9232-1f649d612162
2019-07-31 09:03:38.787 - stderr> 19/07/31 09:03:38 INFO SessionState: Created HDFS directory: /tmp/hive/jenkins/f399e1fe-8b7b-4ed6-9232-1f649d612162/_tmp_space.db
2019-07-31 09:03:38.79 - stderr> 19/07/31 09:03:38 INFO HiveClientImpl: Warehouse location for Hive client (version 1.2.2) is /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/sql/hive/target/tmp/org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite/warehouse-f050d33c-2efb-4f69-944b-1110063c669e
2019-07-31 09:03:38.796 - stderr> 19/07/31 09:03:38 INFO HiveMetaStore: 0: get_database: default
2019-07-31 09:03:38.796 - stderr> 19/07/31 09:03:38 INFO audit: ugi=jenkins ip=unknown-ip-addr cmd=get_database: default
2019-07-31 09:03:38.803 - stderr> 19/07/31 09:03:38 INFO HiveMetaStore: 0: get_table : db=default tbl=data_source_tbl_1
2019-07-31 09:03:38.803 - stderr> 19/07/31 09:03:38 INFO audit: ugi=jenkins ip=unknown-ip-addr cmd=get_table : db=default tbl=data_source_tbl_1
2019-07-31 09:03:38.818 - stderr> 19/07/31 09:03:38 INFO HiveMetaStore: 0: get_database: default
2019-07-31 09:03:38.818 - stderr> 19/07/31 09:03:38 INFO audit: ugi=jenkins ip=unknown-ip-addr cmd=get_database: default
2019-07-31 09:03:38.822 - stderr> 19/07/31 09:03:38 INFO HiveMetaStore: 0: get_database: default
2019-07-31 09:03:38.822 - stderr> 19/07/31 09:03:38 INFO audit: ugi=jenkins ip=unknown-ip-addr cmd=get_database: default
2019-07-31 09:03:38.858 - stderr> 19/07/31 09:03:38 INFO HiveMetaStore: 0: get_database: default
2019-07-31 09:03:38.858 - stderr> 19/07/31 09:03:38 INFO audit: ugi=jenkins ip=unknown-ip-addr cmd=get_database: default
2019-07-31 09:03:38.861 - stderr> 19/07/31 09:03:38 INFO HiveMetaStore: 0: get_database: default
2019-07-31 09:03:38.861 - stderr> 19/07/31 09:03:38 INFO audit: ugi=jenkins ip=unknown-ip-addr cmd=get_database: default
2019-07-31 09:03:39.008 - stderr> 19/07/31 09:03:39 INFO FileOutputCommitter: File Output Committer Algorithm version is 1
2019-07-31 09:03:39.009 - stderr> 19/07/31 09:03:39 INFO SQLHadoopMapReduceCommitProtocol: Using output committer class org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter
2019-07-31 09:03:39.423 - stderr> 19/07/31 09:03:39 INFO CodeGenerator: Code generated in 254.207918 ms
2019-07-31 09:03:39.566 - stderr> 19/07/31 09:03:39 INFO SparkContext: Starting job: sql at NativeMethodAccessorImpl.java:0
2019-07-31 09:03:39.59 - stderr> 19/07/31 09:03:39 INFO DAGScheduler: Got job 0 (sql at NativeMethodAccessorImpl.java:0) with 1 output partitions
2019-07-31 09:03:39.591 - stderr> 19/07/31 09:03:39 INFO DAGScheduler: Final stage: ResultStage 0 (sql at NativeMethodAccessorImpl.java:0)
2019-07-31 09:03:39.593 - stderr> 19/07/31 09:03:39 INFO DAGScheduler: Parents of final stage: List()
2019-07-31 09:03:39.596 - stderr> 19/07/31 09:03:39 INFO DAGScheduler: Missing parents: List()
2019-07-31 09:03:39.604 - stderr> 19/07/31 09:03:39 INFO DAGScheduler: Submitting ResultStage 0 (MapPartitionsRDD[2] at sql at NativeMethodAccessorImpl.java:0), which has no missing parents
2019-07-31 09:03:39.779 - stderr> 19/07/31 09:03:39 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 150.1 KB, free 366.2 MB)
2019-07-31 09:03:39.821 - stderr> 19/07/31 09:03:39 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 54.9 KB, free 366.1 MB)
2019-07-31 09:03:39.827 - stderr> 19/07/31 09:03:39 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on amp-jenkins-worker-02.amp:35258 (size: 54.9 KB, free: 366.2 MB)
2019-07-31 09:03:39.83 - stderr> 19/07/31 09:03:39 INFO SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:1161
2019-07-31 09:03:39.847 - stderr> 19/07/31 09:03:39 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 0 (MapPartitionsRDD[2] at sql at NativeMethodAccessorImpl.java:0) (first 15 tasks are for partitions Vector(0))
2019-07-31 09:03:39.848 - stderr> 19/07/31 09:03:39 INFO TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
2019-07-31 09:03:39.903 - stderr> 19/07/31 09:03:39 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, executor driver, partition 0, PROCESS_LOCAL, 8080 bytes)
2019-07-31 09:03:39.914 - stderr> 19/07/31 09:03:39 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
2019-07-31 09:03:40.045 - stderr> 19/07/31 09:03:40 INFO CodeGenerator: Code generated in 24.764432 ms
2019-07-31 09:03:40.053 - stderr> 19/07/31 09:03:40 INFO FileOutputCommitter: File Output Committer Algorithm version is 1
2019-07-31 09:03:40.054 - stderr> 19/07/31 09:03:40 INFO SQLHadoopMapReduceCommitProtocol: Using output committer class org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter
2019-07-31 09:03:40.123 - stderr> 19/07/31 09:03:40 INFO FileOutputCommitter: Saved output of task 'attempt_20190731090339_0000_m_000000_0' to file:/home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/sql/hive/target/tmp/org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite/warehouse-f050d33c-2efb-4f69-944b-1110063c669e/data_source_tbl_1/_temporary/0/task_20190731090339_0000_m_000000
2019-07-31 09:03:40.124 - stderr> 19/07/31 09:03:40 INFO SparkHadoopMapRedUtil: attempt_20190731090339_0000_m_000000_0: Committed
2019-07-31 09:03:40.143 - stderr> 19/07/31 09:03:40 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 2116 bytes result sent to driver
2019-07-31 09:03:40.151 - stderr> 19/07/31 09:03:40 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 262 ms on localhost (executor driver) (1/1)
2019-07-31 09:03:40.155 - stderr> 19/07/31 09:03:40 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
2019-07-31 09:03:40.162 - stderr> 19/07/31 09:03:40 INFO DAGScheduler: ResultStage 0 (sql at NativeMethodAccessorImpl.java:0) finished in 0.533 s
2019-07-31 09:03:40.167 - stderr> 19/07/31 09:03:40 INFO DAGScheduler: Job 0 finished: sql at NativeMethodAccessorImpl.java:0, took 0.599652 s
2019-07-31 09:03:40.25 - stderr> 19/07/31 09:03:40 INFO FileFormatWriter: Write Job 0efa482b-e8b6-4308-a546-52f942f8e325 committed.
2019-07-31 09:03:40.258 - stderr> 19/07/31 09:03:40 INFO FileFormatWriter: Finished processing stats for write job 0efa482b-e8b6-4308-a546-52f942f8e325.
2019-07-31 09:03:40.452 - stderr> 19/07/31 09:03:40 INFO ContextCleaner: Cleaned accumulator 35
2019-07-31 09:03:40.452 - stderr> 19/07/31 09:03:40 INFO ContextCleaner: Cleaned accumulator 17
2019-07-31 09:03:40.452 - stderr> 19/07/31 09:03:40 INFO ContextCleaner: Cleaned accumulator 22
2019-07-31 09:03:40.452 - stderr> 19/07/31 09:03:40 INFO ContextCleaner: Cleaned accumulator 21
2019-07-31 09:03:40.452 - stderr> 19/07/31 09:03:40 INFO ContextCleaner: Cleaned accumulator 28
2019-07-31 09:03:40.452 - stderr> 19/07/31 09:03:40 INFO ContextCleaner: Cleaned accumulator 19
2019-07-31 09:03:40.452 - stderr> 19/07/31 09:03:40 INFO ContextCleaner: Cleaned accumulator 29
2019-07-31 09:03:40.452 - stderr> 19/07/31 09:03:40 INFO ContextCleaner: Cleaned accumulator 24
2019-07-31 09:03:40.452 - stderr> 19/07/31 09:03:40 INFO ContextCleaner: Cleaned accumulator 18
2019-07-31 09:03:40.452 - stderr> 19/07/31 09:03:40 INFO ContextCleaner: Cleaned accumulator 13
2019-07-31 09:03:40.452 - stderr> 19/07/31 09:03:40 INFO ContextCleaner: Cleaned accumulator 11
2019-07-31 09:03:40.452 - stderr> 19/07/31 09:03:40 INFO ContextCleaner: Cleaned accumulator 30
2019-07-31 09:03:40.452 - stderr> 19/07/31 09:03:40 INFO ContextCleaner: Cleaned accumulator 15
2019-07-31 09:03:40.452 - stderr> 19/07/31 09:03:40 INFO ContextCleaner: Cleaned accumulator 32
2019-07-31 09:03:40.453 - stderr> 19/07/31 09:03:40 INFO ContextCleaner: Cleaned accumulator 33
2019-07-31 09:03:40.453 - stderr> 19/07/31 09:03:40 INFO ContextCleaner: Cleaned accumulator 25
2019-07-31 09:03:40.453 - stderr> 19/07/31 09:03:40 INFO ContextCleaner: Cleaned accumulator 31
2019-07-31 09:03:40.453 - stderr> 19/07/31 09:03:40 INFO ContextCleaner: Cleaned accumulator 12
2019-07-31 09:03:40.453 - stderr> 19/07/31 09:03:40 INFO ContextCleaner: Cleaned accumulator 16
2019-07-31 09:03:40.453 - stderr> 19/07/31 09:03:40 INFO ContextCleaner: Cleaned accumulator 26
2019-07-31 09:03:40.453 - stderr> 19/07/31 09:03:40 INFO ContextCleaner: Cleaned accumulator 34
2019-07-31 09:03:40.453 - stderr> 19/07/31 09:03:40 INFO ContextCleaner: Cleaned accumulator 23
2019-07-31 09:03:40.453 - stderr> 19/07/31 09:03:40 INFO ContextCleaner: Cleaned accumulator 20
2019-07-31 09:03:40.453 - stderr> 19/07/31 09:03:40 INFO ContextCleaner: Cleaned accumulator 14
2019-07-31 09:03:40.466 - stderr> 19/07/31 09:03:40 INFO BlockManagerInfo: Removed broadcast_0_piece0 on amp-jenkins-worker-02.amp:35258 in memory (size: 54.9 KB, free: 366.3 MB)
2019-07-31 09:03:40.469 - stderr> 19/07/31 09:03:40 INFO ContextCleaner: Cleaned accumulator 27
2019-07-31 09:03:40.478 - stderr> 19/07/31 09:03:40 INFO HiveMetaStore: 0: get_database: default
2019-07-31 09:03:40.478 - stderr> 19/07/31 09:03:40 INFO audit: ugi=jenkins ip=unknown-ip-addr cmd=get_database: default
2019-07-31 09:03:40.481 - stderr> 19/07/31 09:03:40 INFO HiveMetaStore: 0: get_table : db=default tbl=data_source_tbl_1
2019-07-31 09:03:40.481 - stderr> 19/07/31 09:03:40 INFO audit: ugi=jenkins ip=unknown-ip-addr cmd=get_table : db=default tbl=data_source_tbl_1
2019-07-31 09:03:40.487 - stderr> 19/07/31 09:03:40 INFO HiveMetaStore: 0: get_database: default
2019-07-31 09:03:40.487 - stderr> 19/07/31 09:03:40 INFO audit: ugi=jenkins ip=unknown-ip-addr cmd=get_database: default
2019-07-31 09:03:40.491 - stderr> 19/07/31 09:03:40 INFO HiveMetaStore: 0: get_table : db=default tbl=data_source_tbl_1
2019-07-31 09:03:40.491 - stderr> 19/07/31 09:03:40 INFO audit: ugi=jenkins ip=unknown-ip-addr cmd=get_table : db=default tbl=data_source_tbl_1
2019-07-31 09:03:40.556 - stderr> 19/07/31 09:03:40 WARN HiveExternalCatalog: Couldn't find corresponding Hive SerDe for data source provider json. Persisting data source table `default`.`data_source_tbl_1` into Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.
2019-07-31 09:03:40.781 - stderr> 19/07/31 09:03:40 INFO HiveMetaStore: 0: create_table: Table(tableName:data_source_tbl_1, dbName:default, owner:jenkins, createTime:1564588998, lastAccessTime:0, retention:0, sd:StorageDescriptor(cols:[FieldSchema(name:col, type:array<string>, comment:from deserializer)], location:null, inputFormat:org.apache.hadoop.mapred.SequenceFileInputFormat, outputFormat:org.apache.hadoop.hive.ql.io.HiveSequenceFileOutputFormat, compressed:false, numBuckets:-1, serdeInfo:SerDeInfo(name:null, serializationLib:org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, parameters:{path=file:/home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/sql/hive/target/tmp/org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite/warehouse-f050d33c-2efb-4f69-944b-1110063c669e/data_source_tbl_1, serialization.format=1}), bucketCols:[], sortCols:[], parameters:{}, skewedInfo:SkewedInfo(skewedColNames:[], skewedColValues:[], skewedColValueLocationMaps:{})), partitionKeys:[], parameters:{spark.sql.sources.schema.part.0={"type":"struct","fields":[{"name":"i","type":"integer","nullable":true,"metadata":{}}]}, spark.sql.sources.schema.numParts=1, spark.sql.sources.provider=json, spark.sql.create.version=2.4.3}, viewOriginalText:null, viewExpandedText:null, tableType:MANAGED_TABLE, privileges:PrincipalPrivilegeSet(userPrivileges:{}, groupPrivileges:null, rolePrivileges:null))
2019-07-31 09:03:40.781 - stderr> 19/07/31 09:03:40 INFO audit: ugi=jenkins ip=unknown-ip-addr cmd=create_table: Table(tableName:data_source_tbl_1, dbName:default, owner:jenkins, createTime:1564588998, lastAccessTime:0, retention:0, sd:StorageDescriptor(cols:[FieldSchema(name:col, type:array<string>, comment:from deserializer)], location:null, inputFormat:org.apache.hadoop.mapred.SequenceFileInputFormat, outputFormat:org.apache.hadoop.hive.ql.io.HiveSequenceFileOutputFormat, compressed:false, numBuckets:-1, serdeInfo:SerDeInfo(name:null, serializationLib:org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, parameters:{path=file:/home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/sql/hive/target/tmp/org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite/warehouse-f050d33c-2efb-4f69-944b-1110063c669e/data_source_tbl_1, serialization.format=1}), bucketCols:[], sortCols:[], parameters:{}, skewedInfo:SkewedInfo(skewedColNames:[], skewedColValues:[], skewedColValueLocationMaps:{})), partitionKeys:[], parameters:{spark.sql.sources.schema.part.0={"type":"struct","fields":[{"name":"i","type":"integer","nullable":true,"metadata":{}}]}, spark.sql.sources.schema.numParts=1, spark.sql.sources.provider=json, spark.sql.create.version=2.4.3}, viewOriginalText:null, viewExpandedText:null, tableType:MANAGED_TABLE, privileges:PrincipalPrivilegeSet(userPrivileges:{}, groupPrivileges:null, rolePrivileges:null))
2019-07-31 09:03:40.79 - stderr> 19/07/31 09:03:40 INFO log: Updating table stats fast for data_source_tbl_1
2019-07-31 09:03:40.791 - stderr> 19/07/31 09:03:40 INFO log: Updated size of table data_source_tbl_1 to 8
2019-07-31 09:03:41.13 - stderr> 19/07/31 09:03:41 INFO HiveMetaStore: 0: get_table : db=default tbl=hive_compatible_data_source_tbl_1
2019-07-31 09:03:41.13 - stderr> 19/07/31 09:03:41 INFO audit: ugi=jenkins ip=unknown-ip-addr cmd=get_table : db=default tbl=hive_compatible_data_source_tbl_1
2019-07-31 09:03:41.133 - stderr> 19/07/31 09:03:41 INFO HiveMetaStore: 0: get_database: default
2019-07-31 09:03:41.133 - stderr> 19/07/31 09:03:41 INFO audit: ugi=jenkins ip=unknown-ip-addr cmd=get_database: default
2019-07-31 09:03:41.135 - stderr> 19/07/31 09:03:41 INFO HiveMetaStore: 0: get_database: default
2019-07-31 09:03:41.135 - stderr> 19/07/31 09:03:41 INFO audit: ugi=jenkins ip=unknown-ip-addr cmd=get_database: default
2019-07-31 09:03:41.138 - stderr> 19/07/31 09:03:41 INFO HiveMetaStore: 0: get_database: default
2019-07-31 09:03:41.138 - stderr> 19/07/31 09:03:41 INFO audit: ugi=jenkins ip=unknown-ip-addr cmd=get_database: default
2019-07-31 09:03:41.141 - stderr> 19/07/31 09:03:41 INFO HiveMetaStore: 0: get_database: default
2019-07-31 09:03:41.141 - stderr> 19/07/31 09:03:41 INFO audit: ugi=jenkins ip=unknown-ip-addr cmd=get_database: default
2019-07-31 09:03:41.161 - stderr> 19/07/31 09:03:41 INFO ParquetFileFormat: Using default output committer for Parquet: org.apache.parquet.hadoop.ParquetOutputCommitter
2019-07-31 09:03:41.171 - stderr> 19/07/31 09:03:41 INFO FileOutputCommitter: File Output Committer Algorithm version is 1
2019-07-31 09:03:41.173 - stderr> 19/07/31 09:03:41 INFO SQLHadoopMapReduceCommitProtocol: Using user defined output committer class org.apache.parquet.hadoop.ParquetOutputCommitter
2019-07-31 09:03:41.173 - stderr> 19/07/31 09:03:41 INFO FileOutputCommitter: File Output Committer Algorithm version is 1
2019-07-31 09:03:41.173 - stderr> 19/07/31 09:03:41 INFO SQLHadoopMapReduceCommitProtocol: Using output committer class org.apache.parquet.hadoop.ParquetOutputCommitter
2019-07-31 09:03:41.284 - stderr> 19/07/31 09:03:41 INFO SparkContext: Starting job: sql at NativeMethodAccessorImpl.java:0
2019-07-31 09:03:41.287 - stderr> 19/07/31 09:03:41 INFO DAGScheduler: Got job 1 (sql at NativeMethodAccessorImpl.java:0) with 1 output partitions
2019-07-31 09:03:41.287 - stderr> 19/07/31 09:03:41 INFO DAGScheduler: Final stage: ResultStage 1 (sql at NativeMethodAccessorImpl.java:0)
2019-07-31 09:03:41.287 - stderr> 19/07/31 09:03:41 INFO DAGScheduler: Parents of final stage: List()
2019-07-31 09:03:41.287 - stderr> 19/07/31 09:03:41 INFO DAGScheduler: Missing parents: List()
2019-07-31 09:03:41.288 - stderr> 19/07/31 09:03:41 INFO DAGScheduler: Submitting ResultStage 1 (MapPartitionsRDD[4] at sql at NativeMethodAccessorImpl.java:0), which has no missing parents
2019-07-31 09:03:41.39 - stderr> 19/07/31 09:03:41 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 147.7 KB, free 366.2 MB)
2019-07-31 09:03:41.393 - stderr> 19/07/31 09:03:41 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 52.8 KB, free 366.1 MB)
2019-07-31 09:03:41.399 - stderr> 19/07/31 09:03:41 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on amp-jenkins-worker-02.amp:35258 (size: 52.8 KB, free: 366.2 MB)
2019-07-31 09:03:41.4 - stderr> 19/07/31 09:03:41 INFO SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:1161
2019-07-31 09:03:41.401 - stderr> 19/07/31 09:03:41 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 1 (MapPartitionsRDD[4] at sql at NativeMethodAccessorImpl.java:0) (first 15 tasks are for partitions Vector(0))
2019-07-31 09:03:41.402 - stderr> 19/07/31 09:03:41 INFO TaskSchedulerImpl: Adding task set 1.0 with 1 tasks
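The metastore dumps and writer setup above pin down what the generated script is doing at this point: data_source_tbl_1 is a JSON-provider table with a single integer column i (persisted in the Spark-specific format, hence the WARN), and hive_compatible_data_source_tbl_1 is a Parquet table written with the default Snappy codec. The exact statements live in the generated test*.py file, which this log does not include; a plausible minimal equivalent is:

    # Hedged reconstruction of the table-preparation step; the real SQL is in the
    # generated test script and may differ in detail.
    spark.sql("CREATE TABLE data_source_tbl_1 USING json AS SELECT 1 AS i")
    spark.sql("CREATE TABLE hive_compatible_data_source_tbl_1 USING parquet AS SELECT 1 AS i")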
2019-07-31 09:03:41.403 - stderr> 19/07/31 09:03:41 INFO TaskSetManager: Starting task 0.0 in stage 1.0 (TID 1, localhost, executor driver, partition 0, PROCESS_LOCAL, 8080 bytes)
2019-07-31 09:03:41.404 - stderr> 19/07/31 09:03:41 INFO Executor: Running task 0.0 in stage 1.0 (TID 1)
2019-07-31 09:03:41.439 - stderr> 19/07/31 09:03:41 INFO FileOutputCommitter: File Output Committer Algorithm version is 1
2019-07-31 09:03:41.439 - stderr> 19/07/31 09:03:41 INFO SQLHadoopMapReduceCommitProtocol: Using user defined output committer class org.apache.parquet.hadoop.ParquetOutputCommitter
2019-07-31 09:03:41.44 - stderr> 19/07/31 09:03:41 INFO FileOutputCommitter: File Output Committer Algorithm version is 1
2019-07-31 09:03:41.44 - stderr> 19/07/31 09:03:41 INFO SQLHadoopMapReduceCommitProtocol: Using output committer class org.apache.parquet.hadoop.ParquetOutputCommitter
2019-07-31 09:03:41.445 - stderr> 19/07/31 09:03:41 INFO CodecConfig: Compression: SNAPPY
2019-07-31 09:03:41.446 - stderr> 19/07/31 09:03:41 INFO CodecConfig: Compression: SNAPPY
2019-07-31 09:03:41.466 - stderr> 19/07/31 09:03:41 INFO ParquetOutputFormat: Parquet block size to 134217728
2019-07-31 09:03:41.466 - stderr> 19/07/31 09:03:41 INFO ParquetOutputFormat: Parquet page size to 1048576
2019-07-31 09:03:41.466 - stderr> 19/07/31 09:03:41 INFO ParquetOutputFormat: Parquet dictionary page size to 1048576
2019-07-31 09:03:41.466 - stderr> 19/07/31 09:03:41 INFO ParquetOutputFormat: Dictionary is on
2019-07-31 09:03:41.466 - stderr> 19/07/31 09:03:41 INFO ParquetOutputFormat: Validation is off
2019-07-31 09:03:41.466 - stderr> 19/07/31 09:03:41 INFO ParquetOutputFormat: Writer version is: PARQUET_1_0
2019-07-31 09:03:41.466 - stderr> 19/07/31 09:03:41 INFO ParquetOutputFormat: Maximum row group padding size is 8388608 bytes
2019-07-31 09:03:41.466 - stderr> 19/07/31 09:03:41 INFO ParquetOutputFormat: Page size checking is: estimated
2019-07-31 09:03:41.467 - stderr> 19/07/31 09:03:41 INFO ParquetOutputFormat: Min row count for page size check is: 100
2019-07-31 09:03:41.467 - stderr> 19/07/31 09:03:41 INFO ParquetOutputFormat: Max row count for page size check is: 10000
2019-07-31 09:03:41.516 - stderr> 19/07/31 09:03:41 INFO ParquetWriteSupport: Initialized Parquet WriteSupport with Catalyst schema:
2019-07-31 09:03:41.516 - stderr> {
2019-07-31 09:03:41.516 - stderr>   "type" : "struct",
2019-07-31 09:03:41.516 - stderr>   "fields" : [ {
2019-07-31 09:03:41.516 - stderr>     "name" : "i",
2019-07-31 09:03:41.516 - stderr>     "type" : "integer",
2019-07-31 09:03:41.516 - stderr>     "nullable" : false,
2019-07-31 09:03:41.516 - stderr>     "metadata" : { }
2019-07-31 09:03:41.516 - stderr>   } ]
2019-07-31 09:03:41.516 - stderr> }
2019-07-31 09:03:41.516 - stderr> and corresponding Parquet message type:
2019-07-31 09:03:41.516 - stderr> message spark_schema {
2019-07-31 09:03:41.516 - stderr>   required int32 i;
2019-07-31 09:03:41.516 - stderr> }
2019-07-31 09:03:41.516 - stderr>
2019-07-31 09:03:41.516 - stderr>
2019-07-31 09:03:41.563 - stderr> 19/07/31 09:03:41 INFO CodecPool: Got brand-new compressor [.snappy]
2019-07-31 09:03:41.622 - stderr> 19/07/31 09:03:41 INFO InternalParquetRecordWriter: Flushing mem columnStore to file. allocated memory: 8
2019-07-31 09:03:42.001 - stderr> java.io.FileNotFoundException: /tmp/test-spark/spark-2.4.3/jars/snappy-java-1.1.7.3.jar (No such file or directory)
2019-07-31 09:03:42.002 - stderr> java.lang.NullPointerException
2019-07-31 09:03:42.002 - stderr> at org.xerial.snappy.SnappyLoader.extractLibraryFile(SnappyLoader.java:243)
2019-07-31 09:03:42.002 - stderr> at org.xerial.snappy.SnappyLoader.findNativeLibrary(SnappyLoader.java:355)
2019-07-31 09:03:42.002 - stderr> at org.xerial.snappy.SnappyLoader.loadNativeLibrary(SnappyLoader.java:176)
2019-07-31 09:03:42.002 - stderr> at org.xerial.snappy.SnappyLoader.loadSnappyApi(SnappyLoader.java:154)
2019-07-31 09:03:42.002 - stderr> at org.xerial.snappy.Snappy.<clinit>(Snappy.java:47)
2019-07-31 09:03:42.002 - stderr> at org.apache.parquet.hadoop.codec.SnappyCompressor.compress(SnappyCompressor.java:67)
2019-07-31 09:03:42.002 - stderr> at org.apache.hadoop.io.compress.CompressorStream.compress(CompressorStream.java:81)
2019-07-31 09:03:42.002 - stderr> at org.apache.hadoop.io.compress.CompressorStream.finish(CompressorStream.java:92)
2019-07-31 09:03:42.002 - stderr> at org.apache.parquet.hadoop.CodecFactory$HeapBytesCompressor.compress(CodecFactory.java:165)
2019-07-31 09:03:42.002 - stderr> at org.apache.parquet.hadoop.ColumnChunkPageWriteStore$ColumnChunkPageWriter.writePage(ColumnChunkPageWriteStore.java:95)
2019-07-31 09:03:42.002 - stderr> at org.apache.parquet.column.impl.ColumnWriterV1.writePage(ColumnWriterV1.java:147)
2019-07-31 09:03:42.002 - stderr> at org.apache.parquet.column.impl.ColumnWriterV1.flush(ColumnWriterV1.java:235)
2019-07-31 09:03:42.002 - stderr> at org.apache.parquet.column.impl.ColumnWriteStoreV1.flush(ColumnWriteStoreV1.java:122)
2019-07-31 09:03:42.002 - stderr> at org.apache.parquet.hadoop.InternalParquetRecordWriter.flushRowGroupToStore(InternalParquetRecordWriter.java:172)
2019-07-31 09:03:42.002 - stderr> at org.apache.parquet.hadoop.InternalParquetRecordWriter.close(InternalParquetRecordWriter.java:114)
2019-07-31 09:03:42.002 - stderr> at org.apache.parquet.hadoop.ParquetRecordWriter.close(ParquetRecordWriter.java:165)
2019-07-31 09:03:42.002 - stderr> at org.apache.spark.sql.execution.datasources.parquet.ParquetOutputWriter.close(ParquetOutputWriter.scala:42)
2019-07-31 09:03:42.003 - stderr> at org.apache.spark.sql.execution.datasources.FileFormatDataWriter.releaseResources(FileFormatDataWriter.scala:57)
2019-07-31 09:03:42.003 - stderr> at org.apache.spark.sql.execution.datasources.FileFormatDataWriter.commit(FileFormatDataWriter.scala:74)
2019-07-31 09:03:42.003 - stderr> at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$3.apply(FileFormatWriter.scala:247)
2019-07-31 09:03:42.003 - stderr> at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$3.apply(FileFormatWriter.scala:242)
2019-07-31 09:03:42.003 - stderr> at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1394)
2019-07-31 09:03:42.003 - stderr> at org.apache.spark.sql.execution.datasources.FileFormatWriter$.org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask(FileFormatWriter.scala:248)
2019-07-31 09:03:42.003 - stderr> at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1.apply(FileFormatWriter.scala:170)
2019-07-31 09:03:42.003 - stderr> at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1.apply(FileFormatWriter.scala:169)
2019-07-31 09:03:42.003 - stderr> at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
2019-07-31 09:03:42.003 - stderr> at org.apache.spark.scheduler.Task.run(Task.scala:121)
2019-07-31 09:03:42.003 - stderr> at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
2019-07-31 09:03:42.003 - stderr> at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
2019-07-31 09:03:42.003 - stderr> at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
2019-07-31 09:03:42.003 - stderr> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
2019-07-31 09:03:42.003 - stderr> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
2019-07-31 09:03:42.003 - stderr> at java.lang.Thread.run(Thread.java:748)
2019-07-31 09:03:42.008 - stderr> 19/07/31 09:03:42 ERROR Utils: Aborting task
2019-07-31 09:03:42.008 - stderr> org.xerial.snappy.SnappyError: [FAILED_TO_LOAD_NATIVE_LIBRARY] null
2019-07-31 09:03:42.009 - stderr> at org.xerial.snappy.SnappyLoader.loadNativeLibrary(SnappyLoader.java:187)
2019-07-31 09:03:42.009 - stderr> at org.xerial.snappy.SnappyLoader.loadSnappyApi(SnappyLoader.java:154)
2019-07-31 09:03:42.009 - stderr> at org.xerial.snappy.Snappy.<clinit>(Snappy.java:47)
2019-07-31 09:03:42.009 - stderr> at org.apache.parquet.hadoop.codec.SnappyCompressor.compress(SnappyCompressor.java:67)
2019-07-31 09:03:42.009 - stderr> at org.apache.hadoop.io.compress.CompressorStream.compress(CompressorStream.java:81)
2019-07-31 09:03:42.009 - stderr> at org.apache.hadoop.io.compress.CompressorStream.finish(CompressorStream.java:92)
2019-07-31 09:03:42.009 - stderr> at org.apache.parquet.hadoop.CodecFactory$HeapBytesCompressor.compress(CodecFactory.java:165)
2019-07-31 09:03:42.009 - stderr> at org.apache.parquet.hadoop.ColumnChunkPageWriteStore$ColumnChunkPageWriter.writePage(ColumnChunkPageWriteStore.java:95)
2019-07-31 09:03:42.009 - stderr> at org.apache.parquet.column.impl.ColumnWriterV1.writePage(ColumnWriterV1.java:147)
2019-07-31 09:03:42.009 - stderr> at org.apache.parquet.column.impl.ColumnWriterV1.flush(ColumnWriterV1.java:235)
2019-07-31 09:03:42.009 - stderr> at org.apache.parquet.column.impl.ColumnWriteStoreV1.flush(ColumnWriteStoreV1.java:122)
2019-07-31 09:03:42.009 - stderr> at org.apache.parquet.hadoop.InternalParquetRecordWriter.flushRowGroupToStore(InternalParquetRecordWriter.java:172)
2019-07-31 09:03:42.009 - stderr> at org.apache.parquet.hadoop.InternalParquetRecordWriter.close(InternalParquetRecordWriter.java:114)
2019-07-31 09:03:42.009 - stderr> at org.apache.parquet.hadoop.ParquetRecordWriter.close(ParquetRecordWriter.java:165)
2019-07-31 09:03:42.009 - stderr> at org.apache.spark.sql.execution.datasources.parquet.ParquetOutputWriter.close(ParquetOutputWriter.scala:42)
2019-07-31 09:03:42.009 - stderr> at org.apache.spark.sql.execution.datasources.FileFormatDataWriter.releaseResources(FileFormatDataWriter.scala:57)
2019-07-31 09:03:42.009 - stderr> at org.apache.spark.sql.execution.datasources.FileFormatDataWriter.commit(FileFormatDataWriter.scala:74)
2019-07-31 09:03:42.009 - stderr> at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$3.apply(FileFormatWriter.scala:247)
2019-07-31 09:03:42.009 - stderr> at
org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$3.apply(FileFormatWriter.scala:242) 2019-07-31 09:03:42.009 - stderr> at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1394) 2019-07-31 09:03:42.009 - stderr> at org.apache.spark.sql.execution.datasources.FileFormatWriter$.org$apache$spark$sql$execution$datasources$Fi

sbt.ForkMain$ForkError: org.scalatest.exceptions.TestFailedException: spark-submit returned with exit code 1.
Command line: './bin/spark-submit' '--name' 'prepare testing tables' '--master' 'local[2]' '--conf' 'spark.ui.enabled=false' '--conf' 'spark.master.rest.enabled=false' '--conf' 'spark.sql.hive.metastore.version=1.2.1' '--conf' 'spark.sql.hive.metastore.jars=maven' '--conf' 'spark.sql.warehouse.dir=/home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/sql/hive/target/tmp/org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite/warehouse-f050d33c-2efb-4f69-944b-1110063c669e' '--conf' 'spark.sql.test.version.index=1' '--driver-java-options' '-Dderby.system.home=/home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/sql/hive/target/tmp/org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite/warehouse-f050d33c-2efb-4f69-944b-1110063c669e' '/home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/sql/hive/target/tmp/org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite/test9150525760230957656.py'

2019-07-31 09:03:15.473 - stderr> 19/07/31 09:03:15 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2019-07-31 09:03:16.354 - stderr> Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
2019-07-31 09:03:16.357 - stderr> 19/07/31 09:03:16 INFO SparkContext: Running Spark version 2.4.3
2019-07-31 09:03:16.393 - stderr> 19/07/31 09:03:16 INFO SparkContext: Submitted application: prepare testing tables
2019-07-31 09:03:16.463 - stderr> 19/07/31 09:03:16 INFO SecurityManager: Changing view acls to: jenkins
2019-07-31 09:03:16.463 - stderr> 19/07/31 09:03:16 INFO SecurityManager: Changing modify acls to: jenkins
2019-07-31 09:03:16.463 - stderr> 19/07/31 09:03:16 INFO SecurityManager: Changing view acls groups to: 
2019-07-31 09:03:16.464 - stderr> 19/07/31 09:03:16 INFO SecurityManager: Changing modify acls groups to: 
2019-07-31 09:03:16.464 - stderr> 19/07/31 09:03:16 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(jenkins); groups with view permissions: Set(); users  with modify permissions: Set(jenkins); groups with modify permissions: Set()
2019-07-31 09:03:16.981 - stderr> 19/07/31 09:03:16 INFO Utils: Successfully started service 'sparkDriver' on port 38524.
2019-07-31 09:03:17.01 - stderr> 19/07/31 09:03:17 INFO SparkEnv: Registering MapOutputTracker
2019-07-31 09:03:17.029 - stderr> 19/07/31 09:03:17 INFO SparkEnv: Registering BlockManagerMaster
2019-07-31 09:03:17.032 - stderr> 19/07/31 09:03:17 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
2019-07-31 09:03:17.032 - stderr> 19/07/31 09:03:17 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
2019-07-31 09:03:17.128 - stderr> 19/07/31 09:03:17 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-afc7ed70-d733-4354-970f-4d8990ca63d0
2019-07-31 09:03:17.156 - stderr> 19/07/31 09:03:17 INFO MemoryStore: MemoryStore started with capacity 366.3 MB
2019-07-31 09:03:17.175 - stderr> 19/07/31 09:03:17 INFO SparkEnv: Registering OutputCommitCoordinator
2019-07-31 09:03:17.285 - stderr> 19/07/31 09:03:17 INFO Executor: Starting executor ID driver on host localhost
2019-07-31 09:03:17.365 - stderr> 19/07/31 09:03:17 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 35258.
2019-07-31 09:03:17.366 - stderr> 19/07/31 09:03:17 INFO NettyBlockTransferService: Server created on amp-jenkins-worker-02.amp:35258
2019-07-31 09:03:17.367 - stderr> 19/07/31 09:03:17 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
2019-07-31 09:03:17.399 - stderr> 19/07/31 09:03:17 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, amp-jenkins-worker-02.amp, 35258, None)
2019-07-31 09:03:17.402 - stderr> 19/07/31 09:03:17 INFO BlockManagerMasterEndpoint: Registering block manager amp-jenkins-worker-02.amp:35258 with 366.3 MB RAM, BlockManagerId(driver, amp-jenkins-worker-02.amp, 35258, None)
2019-07-31 09:03:17.405 - stderr> 19/07/31 09:03:17 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, amp-jenkins-worker-02.amp, 35258, None)
2019-07-31 09:03:17.405 - stderr> 19/07/31 09:03:17 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, amp-jenkins-worker-02.amp, 35258, None)
2019-07-31 09:03:17.796 - stderr> 19/07/31 09:03:17 INFO SharedState: Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('/home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/sql/hive/target/tmp/org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite/warehouse-f050d33c-2efb-4f69-944b-1110063c669e').
2019-07-31 09:03:17.796 - stderr> 19/07/31 09:03:17 INFO SharedState: Warehouse path is '/home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/sql/hive/target/tmp/org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite/warehouse-f050d33c-2efb-4f69-944b-1110063c669e'.
2019-07-31 09:03:18.56 - stderr> 19/07/31 09:03:18 INFO StateStoreCoordinatorRef: Registered StateStoreCoordinator endpoint
2019-07-31 09:03:21.346 - stderr> 19/07/31 09:03:21 INFO HiveUtils: Initializing HiveMetastoreConnection version 1.2.1 using maven.
2019-07-31 09:03:21.353 - stderr> http://www.datanucleus.org/downloads/maven2 added as a remote repository with the name: repo-1
2019-07-31 09:03:21.356 - stderr> Ivy Default Cache set to: /home/jenkins/.ivy2/cache
2019-07-31 09:03:21.356 - stderr> The jars for the packages stored in: /home/jenkins/.ivy2/jars
2019-07-31 09:03:21.396 - stderr> :: loading settings :: url = jar:file:/tmp/test-spark/spark-2.4.3/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
2019-07-31 09:03:21.47 - stderr> org.apache.hive#hive-metastore added as a dependency
2019-07-31 09:03:21.47 - stderr> org.apache.hive#hive-exec added as a dependency
2019-07-31 09:03:21.47 - stderr> org.apache.hive#hive-common added as a dependency
2019-07-31 09:03:21.47 - stderr> org.apache.hive#hive-serde added as a dependency
2019-07-31 09:03:21.47 - stderr> com.google.guava#guava added as a dependency
2019-07-31 09:03:21.47 - stderr> org.apache.hadoop#hadoop-client added as a dependency
2019-07-31 09:03:21.472 - stderr> :: resolving dependencies :: org.apache.spark#spark-submit-parent-938b276c-7b20-46cf-8a9f-c06bc7eedecb;1.0
2019-07-31 09:03:21.473 - stderr> 	confs: [default]
2019-07-31 09:03:21.926 - stderr> 	found org.apache.hive#hive-metastore;1.2.2 in central
2019-07-31 09:03:21.984 - stderr> 	found org.apache.hive#hive-serde;1.2.2 in central
2019-07-31 09:03:22.041 - stderr> 	found org.apache.hive#hive-common;1.2.2 in central
2019-07-31 09:03:22.087 - stderr> 	found org.apache.hive#hive-shims;1.2.2 in central
2019-07-31 09:03:22.13 - stderr> 	found org.apache.hive.shims#hive-shims-common;1.2.2 in central
2019-07-31 09:03:22.176 - stderr> 	found commons-logging#commons-logging;1.1.3 in central
2019-07-31 09:03:22.201 - stderr> 	found log4j#log4j;1.2.16 in central
2019-07-31 09:03:22.23 - stderr> 	found log4j#apache-log4j-extras;1.2.17 in central
2019-07-31 09:03:22.281 - stderr> 	found com.google.guava#guava;14.0.1 in central
2019-07-31 09:03:22.315 - stderr> 	found commons-lang#commons-lang;2.6 in central
2019-07-31 09:03:22.347 - stderr> 	found org.apache.thrift#libthrift;0.9.2 in central
2019-07-31 09:03:22.374 - stderr> 	found org.slf4j#slf4j-api;1.7.5 in central
2019-07-31 09:03:22.398 - stderr> 	found org.apache.httpcomponents#httpclient;4.4 in central
2019-07-31 09:03:22.42 - stderr> 	found org.apache.httpcomponents#httpcore;4.4 in central
2019-07-31 09:03:22.442 - stderr> 	found commons-codec#commons-codec;1.4 in central
2019-07-31 09:03:22.475 - stderr> 	found org.apache.zookeeper#zookeeper;3.4.6 in central
2019-07-31 09:03:22.521 - stderr> 	found org.slf4j#slf4j-log4j12;1.7.5 in central
2019-07-31 09:03:22.558 - stderr> 	found jline#jline;2.12 in central
2019-07-31 09:03:22.591 - stderr> 	found io.netty#netty;3.7.0.Final in central
2019-07-31 09:03:22.618 - stderr> 	found org.apache.hive.shims#hive-shims-0.20S;1.2.2 in central
2019-07-31 09:03:22.65 - stderr> 	found org.apache.hive.shims#hive-shims-0.23;1.2.2 in central
2019-07-31 09:03:22.682 - stderr> 	found org.apache.hadoop#hadoop-yarn-server-resourcemanager;2.6.0 in central
2019-07-31 09:03:22.732 - stderr> 	found org.apache.hadoop#hadoop-annotations;2.6.0 in central
2019-07-31 09:03:22.762 - stderr> 	found com.google.inject.extensions#guice-servlet;3.0 in central
2019-07-31 09:03:22.79 - stderr> 	found com.google.inject#guice;3.0 in central
2019-07-31 09:03:22.818 - stderr> 	found javax.inject#javax.inject;1 in central
2019-07-31 09:03:22.845 - stderr> 	found aopalliance#aopalliance;1.0 in central
2019-07-31 09:03:22.863 - stderr> 	found org.sonatype.sisu.inject#cglib;2.2.1-v20090111 in central
2019-07-31 09:03:22.881 - stderr> 	found asm#asm;3.2 in central
2019-07-31 09:03:22.907 - stderr> 	found com.google.protobuf#protobuf-java;2.5.0 in central
2019-07-31 09:03:22.933 - stderr> 	found commons-io#commons-io;2.4 in central
2019-07-31 09:03:22.957 - stderr> 	found com.sun.jersey#jersey-json;1.14 in central
2019-07-31 09:03:22.998 - stderr> 	found org.codehaus.jettison#jettison;1.1 in central
2019-07-31 09:03:23.022 - stderr> 	found com.sun.xml.bind#jaxb-impl;2.2.3-1 in central
2019-07-31 09:03:23.044 - stderr> 	found javax.xml.bind#jaxb-api;2.2.2 in central
2019-07-31 09:03:23.067 - stderr> 	found javax.xml.stream#stax-api;1.0-2 in central
2019-07-31 09:03:23.09 - stderr> 	found javax.activation#activation;1.1 in central
2019-07-31 09:03:23.112 - stderr> 	found org.codehaus.jackson#jackson-core-asl;1.9.2 in central
2019-07-31 09:03:23.134 - stderr> 	found org.codehaus.jackson#jackson-mapper-asl;1.9.2 in central
2019-07-31 09:03:23.157 - stderr> 	found org.codehaus.jackson#jackson-jaxrs;1.9.2 in central
2019-07-31 09:03:23.18 - stderr> 	found org.codehaus.jackson#jackson-xc;1.9.2 in central
2019-07-31 09:03:23.203 - stderr> 	found com.sun.jersey.contribs#jersey-guice;1.9 in central
2019-07-31 09:03:23.232 - stderr> 	found org.apache.hadoop#hadoop-yarn-common;2.6.0 in central
2019-07-31 09:03:23.264 - stderr> 	found org.apache.hadoop#hadoop-yarn-api;2.6.0 in central
2019-07-31 09:03:23.337 - stderr> 	found org.apache.commons#commons-compress;1.4.1 in central
2019-07-31 09:03:23.361 - stderr> 	found org.tukaani#xz;1.0 in central
2019-07-31 09:03:23.394 - stderr> 	found org.mortbay.jetty#jetty-util;6.1.26 in central
2019-07-31 09:03:23.418 - stderr> 	found com.sun.jersey#jersey-core;1.14 in central
2019-07-31 09:03:23.441 - stderr> 	found com.sun.jersey#jersey-client;1.9 in central
2019-07-31 09:03:23.478 - stderr> 	found commons-cli#commons-cli;1.2 in central
2019-07-31 09:03:23.516 - stderr> 	found com.sun.jersey#jersey-server;1.14 in central
2019-07-31 09:03:23.56 - stderr> 	found org.apache.hadoop#hadoop-yarn-server-common;2.6.0 in central
2019-07-31 09:03:23.587 - stderr> 	found org.fusesource.leveldbjni#leveldbjni-all;1.8 in central
2019-07-31 09:03:23.797 - stderr> 	found org.apache.hadoop#hadoop-yarn-server-applicationhistoryservice;2.6.0 in central
2019-07-31 09:03:23.859 - stderr> 	found commons-collections#commons-collections;3.2.2 in central
2019-07-31 09:03:23.881 - stderr> 	found org.apache.hadoop#hadoop-yarn-server-web-proxy;2.6.0 in central
2019-07-31 09:03:23.9 - stderr> 	found commons-httpclient#commons-httpclient;3.0.1 in central
2019-07-31 09:03:23.922 - stderr> 	found junit#junit;4.11 in central
2019-07-31 09:03:23.942 - stderr> 	found org.hamcrest#hamcrest-core;1.3 in central
2019-07-31 09:03:23.964 - stderr> 	found org.mortbay.jetty#jetty;6.1.26 in central
2019-07-31 09:03:24.012 - stderr> 	found org.apache.hive.shims#hive-shims-scheduler;1.2.2 in central
2019-07-31 09:03:24.031 - stderr> 	found joda-time#joda-time;2.5 in central
2019-07-31 09:03:24.045 - stderr> 	found org.apache.ant#ant;1.9.1 in central
2019-07-31 09:03:24.059 - stderr> 	found org.apache.ant#ant-launcher;1.9.1 in central
2019-07-31 09:03:24.073 - stderr> 	found org.json#json;20090211 in central
2019-07-31 09:03:24.088 - stderr> 	found com.google.code.findbugs#jsr305;3.0.0 in central
2019-07-31 09:03:24.101 - stderr> 	found org.apache.avro#avro;1.7.5 in central
2019-07-31 09:03:24.112 - stderr> 	found com.thoughtworks.paranamer#paranamer;2.3 in central
2019-07-31 09:03:24.124 - stderr> 	found org.xerial.snappy#snappy-java;1.0.5 in central
2019-07-31 09:03:24.137 - stderr> 	found net.sf.opencsv#opencsv;2.3 in central
2019-07-31 09:03:24.15 - stderr> 	found com.twitter#parquet-hadoop-bundle;1.6.0 in central
2019-07-31 09:03:24.162 - stderr> 	found com.jolbox#bonecp;0.8.0.RELEASE in central
2019-07-31 09:03:24.172 - stderr> 	found org.apache.derby#derby;10.10.2.0 in central
2019-07-31 09:03:24.183 - stderr> 	found org.datanucleus#datanucleus-api-jdo;3.2.6 in central
2019-07-31 09:03:24.195 - stderr> 	found org.datanucleus#datanucleus-core;3.2.10 in central
2019-07-31 09:03:24.205 - stderr> 	found org.datanucleus#datanucleus-rdbms;3.2.9 in central
2019-07-31 09:03:24.215 - stderr> 	found commons-pool#commons-pool;1.5.4 in central
2019-07-31 09:03:24.227 - stderr> 	found commons-dbcp#commons-dbcp;1.4 in central
2019-07-31 09:03:24.239 - stderr> 	found javax.jdo#jdo-api;3.0.1 in central
2019-07-31 09:03:24.252 - stderr> 	found javax.transaction#jta;1.1 in central
2019-07-31 09:03:24.263 - stderr> 	found org.antlr#antlr-runtime;3.4 in central
2019-07-31 09:03:24.274 - stderr> 	found org.antlr#stringtemplate;3.2.1 in central
2019-07-31 09:03:24.284 - stderr> 	found antlr#antlr;2.7.7 in central
2019-07-31 09:03:24.295 - stderr> 	found org.apache.thrift#libfb303;0.9.2 in central
2019-07-31 09:03:24.31 - stderr> 	found org.apache.hive#hive-exec;1.2.2 in central
2019-07-31 09:03:24.327 - stderr> 	found org.apache.hive#hive-ant;1.2.2 in central
2019-07-31 09:03:24.357 - stderr> 	found org.apache.velocity#velocity;1.5 in central
2019-07-31 09:03:24.377 - stderr> 	found oro#oro;2.0.8 in central
2019-07-31 09:03:24.439 - stderr> 	found org.antlr#ST4;4.0.4 in central
2019-07-31 09:03:24.461 - stderr> 	found org.apache.ivy#ivy;2.4.0 in central
2019-07-31 09:03:24.479 - stderr> 	found org.codehaus.groovy#groovy-all;2.1.6 in central
2019-07-31 09:03:24.496 - stderr> 	found org.apache.calcite#calcite-core;1.2.0-incubating in central
2019-07-31 09:03:24.509 - stderr> 	found org.apache.calcite#calcite-avatica;1.2.0-incubating in central
2019-07-31 09:03:24.522 - stderr> 	found org.apache.calcite#calcite-linq4j;1.2.0-incubating in central
2019-07-31 09:03:24.551 - stderr> 	found net.hydromatic#eigenbase-properties;1.1.5 in central
2019-07-31 09:03:24.564 - stderr> 	found org.codehaus.janino#janino;2.7.6 in central
2019-07-31 09:03:24.578 - stderr> 	found org.codehaus.janino#commons-compiler;2.7.6 in central
2019-07-31 09:03:24.592 - stderr> 	found stax#stax-api;1.0.1 in central
2019-07-31 09:03:24.612 - stderr> 	found org.apache.hadoop#hadoop-client;2.7.3 in central
2019-07-31 09:03:24.635 - stderr> 	found org.apache.hadoop#hadoop-common;2.7.3 in central
2019-07-31 09:03:24.66 - stderr> 	found org.apache.hadoop#hadoop-annotations;2.7.3 in central
2019-07-31 09:03:24.679 - stderr> 	found org.apache.commons#commons-math3;3.1.1 in central
2019-07-31 09:03:24.689 - stderr> 	found xmlenc#xmlenc;0.52 in central
2019-07-31 09:03:24.699 - stderr> 	found commons-httpclient#commons-httpclient;3.1 in central
2019-07-31 09:03:24.72 - stderr> 	found commons-net#commons-net;3.1 in central
2019-07-31 09:03:24.735 - stderr> 	found log4j#log4j;1.2.17 in central
2019-07-31 09:03:24.751 - stderr> 	found commons-configuration#commons-configuration;1.6 in central
2019-07-31 09:03:24.767 - stderr> 	found commons-digester#commons-digester;1.8 in central
2019-07-31 09:03:24.776 - stderr> 	found commons-beanutils#commons-beanutils;1.7.0 in central
2019-07-31 09:03:24.786 - stderr> 	found commons-beanutils#commons-beanutils-core;1.8.0 in central
2019-07-31 09:03:24.795 - stderr> 	found org.slf4j#slf4j-api;1.7.10 in central
2019-07-31 09:03:24.804 - stderr> 	found org.codehaus.jackson#jackson-core-asl;1.9.13 in central
2019-07-31 09:03:24.812 - stderr> 	found org.codehaus.jackson#jackson-mapper-asl;1.9.13 in central
2019-07-31 09:03:24.829 - stderr> 	found com.google.code.gson#gson;2.2.4 in central
2019-07-31 09:03:24.841 - stderr> 	found org.apache.hadoop#hadoop-auth;2.7.3 in central
2019-07-31 09:03:24.863 - stderr> 	found org.apache.directory.server#apacheds-kerberos-codec;2.0.0-M15 in central
2019-07-31 09:03:24.874 - stderr> 	found org.apache.directory.server#apacheds-i18n;2.0.0-M15 in central
2019-07-31 09:03:24.886 - stderr> 	found org.apache.directory.api#api-asn1-api;1.0.0-M20 in central
2019-07-31 09:03:24.902 - stderr> 	found org.apache.directory.api#api-util;1.0.0-M20 in central
2019-07-31 09:03:24.925 - stderr> 	found org.apache.htrace#htrace-core;3.1.0-incubating in central
2019-07-31 09:03:24.942 - stderr> 	found javax.servlet.jsp#jsp-api;2.1 in central
2019-07-31 09:03:24.954 - stderr> 	found org.slf4j#slf4j-log4j12;1.7.10 in central
2019-07-31 09:03:24.97 - stderr> 	found org.apache.hadoop#hadoop-hdfs;2.7.3 in central
2019-07-31 09:03:24.997 - stderr> 	found io.netty#netty-all;4.0.23.Final in central
2019-07-31 09:03:25.135 - stderr> 	found xerces#xercesImpl;2.9.1 in central
2019-07-31 09:03:25.149 - stderr> 	found xml-apis#xml-apis;1.3.04 in central
2019-07-31 09:03:25.171 - stderr> 	found org.apache.hadoop#hadoop-mapreduce-client-app;2.7.3 in central
2019-07-31 09:03:25.188 - stderr> 	found org.apache.hadoop#hadoop-mapreduce-client-common;2.7.3 in central
2019-07-31 09:03:25.206 - stderr> 	found org.apache.hadoop#hadoop-yarn-common;2.7.3 in central
2019-07-31 09:03:25.229 - stderr> 	found org.apache.hadoop#hadoop-yarn-api;2.7.3 in central
2019-07-31 09:03:25.322 - stderr> 	found org.codehaus.jackson#jackson-jaxrs;1.9.13 in central
2019-07-31 09:03:25.337 - stderr> 	found org.codehaus.jackson#jackson-xc;1.9.13 in central
2019-07-31 09:03:25.4 - stderr> 	found org.apache.hadoop#hadoop-yarn-client;2.7.3 in central
2019-07-31 09:03:25.415 - stderr> 	found org.apache.hadoop#hadoop-mapreduce-client-core;2.7.3 in central
2019-07-31 09:03:25.431 - stderr> 	found org.apache.hadoop#hadoop-yarn-server-common;2.7.3 in central
2019-07-31 09:03:25.454 - stderr> 	found org.apache.hadoop#hadoop-mapreduce-client-shuffle;2.7.3 in central
2019-07-31 09:03:25.473 - stderr> 	found org.apache.hadoop#hadoop-mapreduce-client-jobclient;2.7.3 in central
2019-07-31 09:03:25.568 - stderr> :: resolution report :: resolve 4045ms :: artifacts dl 51ms
2019-07-31 09:03:25.568 - stderr> 	:: modules in use:
2019-07-31 09:03:25.57 - stderr> 	antlr#antlr;2.7.7 from central in [default]
2019-07-31 09:03:25.57 - stderr> 	aopalliance#aopalliance;1.0 from central in [default]
2019-07-31 09:03:25.57 - stderr> 	asm#asm;3.2 from central in [default]
2019-07-31 09:03:25.57 - stderr> 	com.google.code.findbugs#jsr305;3.0.0 from central in [default]
2019-07-31 09:03:25.57 - stderr> 	com.google.code.gson#gson;2.2.4 from central in [default]
2019-07-31 09:03:25.571 - stderr> 	com.google.guava#guava;14.0.1 from central in [default]
2019-07-31 09:03:25.571 - stderr> 	com.google.inject#guice;3.0 from central in [default]
2019-07-31 09:03:25.571 - stderr> 	com.google.inject.extensions#guice-servlet;3.0 from central in [default]
2019-07-31 09:03:25.571 - stderr> 	com.google.protobuf#protobuf-java;2.5.0 from central in [default]
2019-07-31 09:03:25.571 - stderr> 	com.jolbox#bonecp;0.8.0.RELEASE from central in [default]
2019-07-31 09:03:25.571 - stderr> 	com.sun.jersey#jersey-client;1.9 from central in [default]
2019-07-31 09:03:25.572 - stderr> 	com.sun.jersey#jersey-core;1.14 from central in [default]
2019-07-31 09:03:25.572 - stderr> 	com.sun.jersey#jersey-json;1.14 from central in [default]
2019-07-31 09:03:25.572 - stderr> 	com.sun.jersey#jersey-server;1.14 from central in [default]
2019-07-31 09:03:25.572 - stderr> 	com.sun.jersey.contribs#jersey-guice;1.9 from central in [default]
2019-07-31 09:03:25.572 - stderr> 	com.sun.xml.bind#jaxb-impl;2.2.3-1 from central in [default]
2019-07-31 09:03:25.573 - stderr> 	com.thoughtworks.paranamer#paranamer;2.3 from central in [default]
2019-07-31 09:03:25.573 - stderr> 	com.twitter#parquet-hadoop-bundle;1.6.0 from central in [default]
2019-07-31 09:03:25.573 - stderr> 	commons-beanutils#commons-beanutils;1.7.0 from central in [default]
2019-07-31 09:03:25.573 - stderr> 	commons-beanutils#commons-beanutils-core;1.8.0 from central in [default]
2019-07-31 09:03:25.573 - stderr> 	commons-cli#commons-cli;1.2 from central in [default]
2019-07-31 09:03:25.574 - stderr> 	commons-codec#commons-codec;1.4 from central in [default]
2019-07-31 09:03:25.574 - stderr> 	commons-collections#commons-collections;3.2.2 from central in [default]
2019-07-31 09:03:25.574 - stderr> 	commons-configuration#commons-configuration;1.6 from central in [default]
2019-07-31 09:03:25.574 - stderr> 	commons-dbcp#commons-dbcp;1.4 from central in [default]
2019-07-31 09:03:25.574 - stderr> 	commons-digester#commons-digester;1.8 from central in [default]
2019-07-31 09:03:25.575 - stderr> 	commons-httpclient#commons-httpclient;3.1 from central in [default]
2019-07-31 09:03:25.575 - stderr> 	commons-io#commons-io;2.4 from central in [default]
2019-07-31 09:03:25.575 - stderr> 	commons-lang#commons-lang;2.6 from central in [default]
2019-07-31 09:03:25.575 - stderr> 	commons-logging#commons-logging;1.1.3 from central in [default]
2019-07-31 09:03:25.576 - stderr> 	commons-net#commons-net;3.1 from central in [default]
2019-07-31 09:03:25.576 - stderr> 	commons-pool#commons-pool;1.5.4 from central in [default]
2019-07-31 09:03:25.576 - stderr> 	io.netty#netty;3.7.0.Final from central in [default]
2019-07-31 09:03:25.577 - stderr> 	io.netty#netty-all;4.0.23.Final from central in [default]
2019-07-31 09:03:25.577 - stderr> 	javax.activation#activation;1.1 from central in [default]
2019-07-31 09:03:25.577 - stderr> 	javax.inject#javax.inject;1 from central in [default]
2019-07-31 09:03:25.578 - stderr> 	javax.jdo#jdo-api;3.0.1 from central in [default]
2019-07-31 09:03:25.578 - stderr> 	javax.servlet.jsp#jsp-api;2.1 from central in [default]
2019-07-31 09:03:25.578 - stderr> 	javax.transaction#jta;1.1 from central in [default]
2019-07-31 09:03:25.579 - stderr> 	javax.xml.bind#jaxb-api;2.2.2 from central in [default]
2019-07-31 09:03:25.579 - stderr> 	javax.xml.stream#stax-api;1.0-2 from central in [default]
2019-07-31 09:03:25.579 - stderr> 	jline#jline;2.12 from central in [default]
2019-07-31 09:03:25.579 - stderr> 	joda-time#joda-time;2.5 from central in [default]
2019-07-31 09:03:25.579 - stderr> 	log4j#apache-log4j-extras;1.2.17 from central in [default]
2019-07-31 09:03:25.579 - stderr> 	log4j#log4j;1.2.17 from central in [default]
2019-07-31 09:03:25.579 - stderr> 	net.hydromatic#eigenbase-properties;1.1.5 from central in [default]
2019-07-31 09:03:25.58 - stderr> 	net.sf.opencsv#opencsv;2.3 from central in [default]
2019-07-31 09:03:25.58 - stderr> 	org.antlr#ST4;4.0.4 from central in [default]
2019-07-31 09:03:25.58 - stderr> 	org.antlr#antlr-runtime;3.4 from central in [default]
2019-07-31 09:03:25.58 - stderr> 	org.antlr#stringtemplate;3.2.1 from central in [default]
2019-07-31 09:03:25.58 - stderr> 	org.apache.ant#ant;1.9.1 from central in [default]
2019-07-31 09:03:25.58 - stderr> 	org.apache.ant#ant-launcher;1.9.1 from central in [default]
2019-07-31 09:03:25.58 - stderr> 	org.apache.avro#avro;1.7.5 from central in [default]
2019-07-31 09:03:25.581 - stderr> 	org.apache.calcite#calcite-avatica;1.2.0-incubating from central in [default]
2019-07-31 09:03:25.581 - stderr> 	org.apache.calcite#calcite-core;1.2.0-incubating from central in [default]
2019-07-31 09:03:25.581 - stderr> 	org.apache.calcite#calcite-linq4j;1.2.0-incubating from central in [default]
2019-07-31 09:03:25.581 - stderr> 	org.apache.commons#commons-compress;1.4.1 from central in [default]
2019-07-31 09:03:25.581 - stderr> 	org.apache.commons#commons-math3;3.1.1 from central in [default]
2019-07-31 09:03:25.581 - stderr> 	org.apache.derby#derby;10.10.2.0 from central in [default]
2019-07-31 09:03:25.581 - stderr> 	org.apache.directory.api#api-asn1-api;1.0.0-M20 from central in [default]
2019-07-31 09:03:25.581 - stderr> 	org.apache.directory.api#api-util;1.0.0-M20 from central in [default]
2019-07-31 09:03:25.582 - stderr> 	org.apache.directory.server#apacheds-i18n;2.0.0-M15 from central in [default]
2019-07-31 09:03:25.582 - stderr> 	org.apache.directory.server#apacheds-kerberos-codec;2.0.0-M15 from central in [default]
2019-07-31 09:03:25.582 - stderr> 	org.apache.hadoop#hadoop-annotations;2.7.3 from central in [default]
2019-07-31 09:03:25.582 - stderr> 	org.apache.hadoop#hadoop-auth;2.7.3 from central in [default]
2019-07-31 09:03:25.582 - stderr> 	org.apache.hadoop#hadoop-client;2.7.3 from central in [default]
2019-07-31 09:03:25.582 - stderr> 	org.apache.hadoop#hadoop-common;2.7.3 from central in [default]
2019-07-31 09:03:25.582 - stderr> 	org.apache.hadoop#hadoop-hdfs;2.7.3 from central in [default]
2019-07-31 09:03:25.582 - stderr> 	org.apache.hadoop#hadoop-mapreduce-client-app;2.7.3 from central in [default]
2019-07-31 09:03:25.583 - stderr> 	org.apache.hadoop#hadoop-mapreduce-client-common;2.7.3 from central in [default]
2019-07-31 09:03:25.583 - stderr> 	org.apache.hadoop#hadoop-mapreduce-client-core;2.7.3 from central in [default]
2019-07-31 09:03:25.583 - stderr> 	org.apache.hadoop#hadoop-mapreduce-client-jobclient;2.7.3 from central in [default]
2019-07-31 09:03:25.583 - stderr> 	org.apache.hadoop#hadoop-mapreduce-client-shuffle;2.7.3 from central in [default]
2019-07-31 09:03:25.583 - stderr> 	org.apache.hadoop#hadoop-yarn-api;2.7.3 from central in [default]
2019-07-31 09:03:25.583 - stderr> 	org.apache.hadoop#hadoop-yarn-client;2.7.3 from central in [default]
2019-07-31 09:03:25.583 - stderr> 	org.apache.hadoop#hadoop-yarn-common;2.7.3 from central in [default]
2019-07-31 09:03:25.583 - stderr> 	org.apache.hadoop#hadoop-yarn-server-applicationhistoryservice;2.6.0 from central in [default]
2019-07-31 09:03:25.584 - stderr> 	org.apache.hadoop#hadoop-yarn-server-common;2.7.3 from central in [default]
2019-07-31 09:03:25.584 - stderr> 	org.apache.hadoop#hadoop-yarn-server-resourcemanager;2.6.0 from central in [default]
2019-07-31 09:03:25.584 - stderr> 	org.apache.hadoop#hadoop-yarn-server-web-proxy;2.6.0 from central in [default]
2019-07-31 09:03:25.584 - stderr> 	org.apache.hive#hive-ant;1.2.2 from central in [default]
2019-07-31 09:03:25.584 - stderr> 	org.apache.hive#hive-common;1.2.2 from central in [default]
2019-07-31 09:03:25.584 - stderr> 	org.apache.hive#hive-exec;1.2.2 from central in [default]
2019-07-31 09:03:25.584 - stderr> 	org.apache.hive#hive-metastore;1.2.2 from central in [default]
2019-07-31 09:03:25.584 - stderr> 	org.apache.hive#hive-serde;1.2.2 from central in [default]
2019-07-31 09:03:25.585 - stderr> 	org.apache.hive#hive-shims;1.2.2 from central in [default]
2019-07-31 09:03:25.585 - stderr> 	org.apache.hive.shims#hive-shims-0.20S;1.2.2 from central in [default]
2019-07-31 09:03:25.585 - stderr> 	org.apache.hive.shims#hive-shims-0.23;1.2.2 from central in [default]
2019-07-31 09:03:25.585 - stderr> 	org.apache.hive.shims#hive-shims-common;1.2.2 from central in [default]
2019-07-31 09:03:25.585 - stderr> 	org.apache.hive.shims#hive-shims-scheduler;1.2.2 from central in [default]
2019-07-31 09:03:25.585 - stderr> 	org.apache.htrace#htrace-core;3.1.0-incubating from central in [default]
2019-07-31 09:03:25.585 - stderr> 	org.apache.httpcomponents#httpclient;4.4 from central in [default]
2019-07-31 09:03:25.586 - stderr> 	org.apache.httpcomponents#httpcore;4.4 from central in [default]
2019-07-31 09:03:25.586 - stderr> 	org.apache.ivy#ivy;2.4.0 from central in [default]
2019-07-31 09:03:25.586 - stderr> 	org.apache.thrift#libfb303;0.9.2 from central in [default]
2019-07-31 09:03:25.586 - stderr> 	org.apache.thrift#libthrift;0.9.2 from central in [default]
2019-07-31 09:03:25.586 - stderr> 	org.apache.velocity#velocity;1.5 from central in [default]
2019-07-31 09:03:25.586 - stderr> 	org.apache.zookeeper#zookeeper;3.4.6 from central in [default]
2019-07-31 09:03:25.586 - stderr> 	org.codehaus.groovy#groovy-all;2.1.6 from central in [default]
2019-07-31 09:03:25.587 - stderr> 	org.codehaus.jackson#jackson-core-asl;1.9.13 from central in [default]
2019-07-31 09:03:25.587 - stderr> 	org.codehaus.jackson#jackson-jaxrs;1.9.13 from central in [default]
2019-07-31 09:03:25.587 - stderr> 	org.codehaus.jackson#jackson-mapper-asl;1.9.13 from central in [default]
2019-07-31 09:03:25.587 - stderr> 	org.codehaus.jackson#jackson-xc;1.9.13 from central in [default]
2019-07-31 09:03:25.587 - stderr> 	org.codehaus.janino#commons-compiler;2.7.6 from central in [default]
2019-07-31 09:03:25.588 - stderr> 	org.codehaus.janino#janino;2.7.6 from central in [default]
2019-07-31 09:03:25.588 - stderr> 	org.codehaus.jettison#jettison;1.1 from central in [default]
2019-07-31 09:03:25.588 - stderr> 	org.datanucleus#datanucleus-api-jdo;3.2.6 from central in [default]
2019-07-31 09:03:25.588 - stderr> 	org.datanucleus#datanucleus-core;3.2.10 from central in [default]
2019-07-31 09:03:25.588 - stderr> 	org.datanucleus#datanucleus-rdbms;3.2.9 from central in [default]
2019-07-31 09:03:25.588 - stderr> 	org.fusesource.leveldbjni#leveldbjni-all;1.8 from central in [default]
2019-07-31 09:03:25.588 - stderr> 	org.json#json;20090211 from central in [default]
2019-07-31 09:03:25.589 - stderr> 	org.mortbay.jetty#jetty;6.1.26 from central in [default]
2019-07-31 09:03:25.589 - stderr> 	org.mortbay.jetty#jetty-util;6.1.26 from central in [default]
2019-07-31 09:03:25.589 - stderr> 	org.slf4j#slf4j-api;1.7.10 from central in [default]
2019-07-31 09:03:25.589 - stderr> 	org.slf4j#slf4j-log4j12;1.7.10 from central in [default]
2019-07-31 09:03:25.589 - stderr> 	org.sonatype.sisu.inject#cglib;2.2.1-v20090111 from central in [default]
2019-07-31 09:03:25.59 - stderr> 	org.tukaani#xz;1.0 from central in [default]
2019-07-31 09:03:25.59 - stderr> 	org.xerial.snappy#snappy-java;1.0.5 from central in [default]
2019-07-31 09:03:25.59 - stderr> 	oro#oro;2.0.8 from central in [default]
2019-07-31 09:03:25.591 - stderr> 	stax#stax-api;1.0.1 from central in [default]
2019-07-31 09:03:25.592 - stderr> 	xerces#xercesImpl;2.9.1 from central in [default]
2019-07-31 09:03:25.592 - stderr> 	xml-apis#xml-apis;1.3.04 from central in [default]
2019-07-31 09:03:25.592 - stderr> 	xmlenc#xmlenc;0.52 from central in [default]
2019-07-31 09:03:25.592 - stderr> 	:: evicted modules:
2019-07-31 09:03:25.592 - stderr> 	log4j#log4j;1.2.16 by [log4j#log4j;1.2.17] in [default]
2019-07-31 09:03:25.593 - stderr> 	org.slf4j#slf4j-api;1.7.5 by [org.slf4j#slf4j-api;1.7.10] in [default]
2019-07-31 09:03:25.593 - stderr> 	org.slf4j#slf4j-log4j12;1.7.5 by [org.slf4j#slf4j-log4j12;1.7.10] in [default]
2019-07-31 09:03:25.593 - stderr> 	org.apache.hadoop#hadoop-annotations;2.6.0 by [org.apache.hadoop#hadoop-annotations;2.7.3] in [default]
2019-07-31 09:03:25.593 - stderr> 	org.codehaus.jackson#jackson-core-asl;1.9.2 by [org.codehaus.jackson#jackson-core-asl;1.9.13] in [default]
2019-07-31 09:03:25.593 - stderr> 	org.codehaus.jackson#jackson-mapper-asl;1.9.2 by [org.codehaus.jackson#jackson-mapper-asl;1.9.13] in [default]
2019-07-31 09:03:25.593 - stderr> 	org.codehaus.jackson#jackson-jaxrs;1.9.2 by [org.codehaus.jackson#jackson-jaxrs;1.9.13] in [default]
2019-07-31 09:03:25.593 - stderr> 	org.codehaus.jackson#jackson-xc;1.9.2 by [org.codehaus.jackson#jackson-xc;1.9.13] in [default]
2019-07-31 09:03:25.593 - stderr> 	org.apache.hadoop#hadoop-yarn-common;2.6.0 by [org.apache.hadoop#hadoop-yarn-common;2.7.3] in [default]
2019-07-31 09:03:25.593 - stderr> 	org.apache.hadoop#hadoop-yarn-api;2.6.0 by [org.apache.hadoop#hadoop-yarn-api;2.7.3] in [default]
2019-07-31 09:03:25.593 - stderr> 	org.apache.hadoop#hadoop-yarn-server-common;2.6.0 by [org.apache.hadoop#hadoop-yarn-server-common;2.7.3] in [default]
2019-07-31 09:03:25.594 - stderr> 	commons-httpclient#commons-httpclient;3.0.1 by [commons-httpclient#commons-httpclient;3.1] in [default]
2019-07-31 09:03:25.594 - stderr> 	junit#junit;4.11 transitively in [default]
2019-07-31 09:03:25.594 - stderr> 	org.hamcrest#hamcrest-core;1.3 transitively in [default]
2019-07-31 09:03:25.594 - stderr> 	com.google.code.findbugs#jsr305;1.3.9 by [com.google.code.findbugs#jsr305;3.0.0] in [default]
2019-07-31 09:03:25.594 - stderr> 	com.google.guava#guava;11.0.2 by [com.google.guava#guava;14.0.1] in [default]
2019-07-31 09:03:25.594 - stderr> 	org.apache.avro#avro;1.7.4 by [org.apache.avro#avro;1.7.5] in [default]
2019-07-31 09:03:25.594 - stderr> 	org.apache.httpcomponents#httpclient;4.2.5 by [org.apache.httpcomponents#httpclient;4.4] in [default]
2019-07-31 09:03:25.594 - stderr> 	io.netty#netty;3.6.2.Final by [io.netty#netty;3.7.0.Final] in [default]
2019-07-31 09:03:25.594 - stderr> 	com.sun.jersey#jersey-core;1.9 by [com.sun.jersey#jersey-core;1.14] in [default]
2019-07-31 09:03:25.594 - stderr> 	com.sun.jersey#jersey-server;1.9 by [com.sun.jersey#jersey-server;1.14] in [default]
2019-07-31 09:03:25.594 - stderr> 	com.sun.jersey#jersey-json;1.9 by [com.sun.jersey#jersey-json;1.14] in [default]
2019-07-31 09:03:25.594 - stderr> 	---------------------------------------------------------------------
2019-07-31 09:03:25.595 - stderr> 	|                  |            modules            ||   artifacts   |
2019-07-31 09:03:25.595 - stderr> 	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
2019-07-31 09:03:25.595 - stderr> 	---------------------------------------------------------------------
2019-07-31 09:03:25.595 - stderr> 	|      default     |  145  |   0   |   0   |   22  ||  123  |   0   |
2019-07-31 09:03:25.595 - stderr> 	---------------------------------------------------------------------
2019-07-31 09:03:25.642 - stderr> :: retrieving :: org.apache.spark#spark-submit-parent-938b276c-7b20-46cf-8a9f-c06bc7eedecb
2019-07-31 09:03:25.642 - stderr> 	confs: [default]
2019-07-31 09:03:25.731 - stderr> 	0 artifacts copied, 123 already retrieved (0kB/89ms)
2019-07-31 09:03:25.902 - stderr> 19/07/31 09:03:25 INFO IsolatedClientLoader: Downloaded metastore jars to /tmp/hive-v1_2-8bc683c2-f316-4161-9200-1d49b8a0c223
2019-07-31 09:03:26.622 - stderr> 19/07/31 09:03:26 INFO HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
2019-07-31 09:03:26.65 - stderr> 19/07/31 09:03:26 INFO ObjectStore: ObjectStore, initialize called
2019-07-31 09:03:26.8 - stderr> 19/07/31 09:03:26 INFO Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
2019-07-31 09:03:26.8 - stderr> 19/07/31 09:03:26 INFO Persistence: Property datanucleus.cache.level2 unknown - will be ignored
2019-07-31 09:03:36.04 - stderr> 19/07/31 09:03:36 INFO ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
2019-07-31 09:03:37.881 - stderr> 19/07/31 09:03:37 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
2019-07-31 09:03:37.882 - stderr> 19/07/31 09:03:37 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
2019-07-31 09:03:38.132 - stderr> 19/07/31 09:03:38 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
2019-07-31 09:03:38.132 - stderr> 19/07/31 09:03:38 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
2019-07-31 09:03:38.219 - stderr> 19/07/31 09:03:38 INFO Query: Reading in results for query "org.datanucleus.store.rdbms.query.SQLQuery@0" since the connection used is closing
2019-07-31 09:03:38.221 - stderr> 19/07/31 09:03:38 INFO MetaStoreDirectSql: Using direct SQL, underlying DB is DERBY
2019-07-31 09:03:38.224 - stderr> 19/07/31 09:03:38 INFO ObjectStore: Initialized ObjectStore
2019-07-31 09:03:38.505 - stderr> 19/07/31 09:03:38 INFO HiveMetaStore: Added admin role in metastore
2019-07-31 09:03:38.511 - stderr> 19/07/31 09:03:38 INFO HiveMetaStore: Added public role in metastore
2019-07-31 09:03:38.571 - stderr> 19/07/31 09:03:38 INFO HiveMetaStore: No user is added in admin role, since config is empty
2019-07-31 09:03:38.682 - stderr> 19/07/31 09:03:38 INFO HiveMetaStore: 0: get_all_databases
2019-07-31 09:03:38.684 - stderr> 19/07/31 09:03:38 INFO audit: ugi=jenkins	ip=unknown-ip-addr	cmd=get_all_databases	
2019-07-31 09:03:38.703 - stderr> 19/07/31 09:03:38 INFO HiveMetaStore: 0: get_functions: db=default pat=*
2019-07-31 09:03:38.704 - stderr> 19/07/31 09:03:38 INFO audit: ugi=jenkins	ip=unknown-ip-addr	cmd=get_functions: db=default pat=*	
2019-07-31 09:03:38.705 - stderr> 19/07/31 09:03:38 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MResourceUri" is tagged as "embedded-only" so does not have its own datastore table.
2019-07-31 09:03:38.773 - stderr> 19/07/31 09:03:38 INFO SessionState: Created local directory: /tmp/f399e1fe-8b7b-4ed6-9232-1f649d612162_resources
2019-07-31 09:03:38.778 - stderr> 19/07/31 09:03:38 INFO SessionState: Created HDFS directory: /tmp/hive/jenkins/f399e1fe-8b7b-4ed6-9232-1f649d612162
2019-07-31 09:03:38.782 - stderr> 19/07/31 09:03:38 INFO SessionState: Created local directory: /tmp/jenkins/f399e1fe-8b7b-4ed6-9232-1f649d612162
2019-07-31 09:03:38.787 - stderr> 19/07/31 09:03:38 INFO SessionState: Created HDFS directory: /tmp/hive/jenkins/f399e1fe-8b7b-4ed6-9232-1f649d612162/_tmp_space.db
2019-07-31 09:03:38.79 - stderr> 19/07/31 09:03:38 INFO HiveClientImpl: Warehouse location for Hive client (version 1.2.2) is /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/sql/hive/target/tmp/org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite/warehouse-f050d33c-2efb-4f69-944b-1110063c669e
2019-07-31 09:03:38.796 - stderr> 19/07/31 09:03:38 INFO HiveMetaStore: 0: get_database: default
2019-07-31 09:03:38.796 - stderr> 19/07/31 09:03:38 INFO audit: ugi=jenkins	ip=unknown-ip-addr	cmd=get_database: default	
2019-07-31 09:03:38.803 - stderr> 19/07/31 09:03:38 INFO HiveMetaStore: 0: get_table : db=default tbl=data_source_tbl_1
2019-07-31 09:03:38.803 - stderr> 19/07/31 09:03:38 INFO audit: ugi=jenkins	ip=unknown-ip-addr	cmd=get_table : db=default tbl=data_source_tbl_1	
2019-07-31 09:03:38.818 - stderr> 19/07/31 09:03:38 INFO HiveMetaStore: 0: get_database: default
2019-07-31 09:03:38.818 - stderr> 19/07/31 09:03:38 INFO audit: ugi=jenkins	ip=unknown-ip-addr	cmd=get_database: default	
2019-07-31 09:03:38.822 - stderr> 19/07/31 09:03:38 INFO HiveMetaStore: 0: get_database: default
2019-07-31 09:03:38.822 - stderr> 19/07/31 09:03:38 INFO audit: ugi=jenkins	ip=unknown-ip-addr	cmd=get_database: default	
2019-07-31 09:03:38.858 - stderr> 19/07/31 09:03:38 INFO HiveMetaStore: 0: get_database: default
2019-07-31 09:03:38.858 - stderr> 19/07/31 09:03:38 INFO audit: ugi=jenkins	ip=unknown-ip-addr	cmd=get_database: default	
2019-07-31 09:03:38.861 - stderr> 19/07/31 09:03:38 INFO HiveMetaStore: 0: get_database: default
2019-07-31 09:03:38.861 - stderr> 19/07/31 09:03:38 INFO audit: ugi=jenkins	ip=unknown-ip-addr	cmd=get_database: default	
2019-07-31 09:03:39.008 - stderr> 19/07/31 09:03:39 INFO FileOutputCommitter: File Output Committer Algorithm version is 1
2019-07-31 09:03:39.009 - stderr> 19/07/31 09:03:39 INFO SQLHadoopMapReduceCommitProtocol: Using output committer class org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter
2019-07-31 09:03:39.423 - stderr> 19/07/31 09:03:39 INFO CodeGenerator: Code generated in 254.207918 ms
2019-07-31 09:03:39.566 - stderr> 19/07/31 09:03:39 INFO SparkContext: Starting job: sql at NativeMethodAccessorImpl.java:0
2019-07-31 09:03:39.59 - stderr> 19/07/31 09:03:39 INFO DAGScheduler: Got job 0 (sql at NativeMethodAccessorImpl.java:0) with 1 output partitions
2019-07-31 09:03:39.591 - stderr> 19/07/31 09:03:39 INFO DAGScheduler: Final stage: ResultStage 0 (sql at NativeMethodAccessorImpl.java:0)
2019-07-31 09:03:39.593 - stderr> 19/07/31 09:03:39 INFO DAGScheduler: Parents of final stage: List()
2019-07-31 09:03:39.596 - stderr> 19/07/31 09:03:39 INFO DAGScheduler: Missing parents: List()
2019-07-31 09:03:39.604 - stderr> 19/07/31 09:03:39 INFO DAGScheduler: Submitting ResultStage 0 (MapPartitionsRDD[2] at sql at NativeMethodAccessorImpl.java:0), which has no missing parents
2019-07-31 09:03:39.779 - stderr> 19/07/31 09:03:39 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 150.1 KB, free 366.2 MB)
2019-07-31 09:03:39.821 - stderr> 19/07/31 09:03:39 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 54.9 KB, free 366.1 MB)
2019-07-31 09:03:39.827 - stderr> 19/07/31 09:03:39 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on amp-jenkins-worker-02.amp:35258 (size: 54.9 KB, free: 366.2 MB)
2019-07-31 09:03:39.83 - stderr> 19/07/31 09:03:39 INFO SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:1161
2019-07-31 09:03:39.847 - stderr> 19/07/31 09:03:39 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 0 (MapPartitionsRDD[2] at sql at NativeMethodAccessorImpl.java:0) (first 15 tasks are for partitions Vector(0))
2019-07-31 09:03:39.848 - stderr> 19/07/31 09:03:39 INFO TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
2019-07-31 09:03:39.903 - stderr> 19/07/31 09:03:39 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, executor driver, partition 0, PROCESS_LOCAL, 8080 bytes)
2019-07-31 09:03:39.914 - stderr> 19/07/31 09:03:39 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
2019-07-31 09:03:40.045 - stderr> 19/07/31 09:03:40 INFO CodeGenerator: Code generated in 24.764432 ms
2019-07-31 09:03:40.053 - stderr> 19/07/31 09:03:40 INFO FileOutputCommitter: File Output Committer Algorithm version is 1
2019-07-31 09:03:40.054 - stderr> 19/07/31 09:03:40 INFO SQLHadoopMapReduceCommitProtocol: Using output committer class org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter
2019-07-31 09:03:40.123 - stderr> 19/07/31 09:03:40 INFO FileOutputCommitter: Saved output of task 'attempt_20190731090339_0000_m_000000_0' to file:/home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/sql/hive/target/tmp/org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite/warehouse-f050d33c-2efb-4f69-944b-1110063c669e/data_source_tbl_1/_temporary/0/task_20190731090339_0000_m_000000
2019-07-31 09:03:40.124 - stderr> 19/07/31 09:03:40 INFO SparkHadoopMapRedUtil: attempt_20190731090339_0000_m_000000_0: Committed
2019-07-31 09:03:40.143 - stderr> 19/07/31 09:03:40 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 2116 bytes result sent to driver
2019-07-31 09:03:40.151 - stderr> 19/07/31 09:03:40 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 262 ms on localhost (executor driver) (1/1)
2019-07-31 09:03:40.155 - stderr> 19/07/31 09:03:40 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool 
2019-07-31 09:03:40.162 - stderr> 19/07/31 09:03:40 INFO DAGScheduler: ResultStage 0 (sql at NativeMethodAccessorImpl.java:0) finished in 0.533 s
2019-07-31 09:03:40.167 - stderr> 19/07/31 09:03:40 INFO DAGScheduler: Job 0 finished: sql at NativeMethodAccessorImpl.java:0, took 0.599652 s
2019-07-31 09:03:40.25 - stderr> 19/07/31 09:03:40 INFO FileFormatWriter: Write Job 0efa482b-e8b6-4308-a546-52f942f8e325 committed.
2019-07-31 09:03:40.258 - stderr> 19/07/31 09:03:40 INFO FileFormatWriter: Finished processing stats for write job 0efa482b-e8b6-4308-a546-52f942f8e325.
2019-07-31 09:03:40.452 - stderr> 19/07/31 09:03:40 INFO ContextCleaner: Cleaned accumulator 35
2019-07-31 09:03:40.452 - stderr> 19/07/31 09:03:40 INFO ContextCleaner: Cleaned accumulator 17
2019-07-31 09:03:40.452 - stderr> 19/07/31 09:03:40 INFO ContextCleaner: Cleaned accumulator 22
2019-07-31 09:03:40.452 - stderr> 19/07/31 09:03:40 INFO ContextCleaner: Cleaned accumulator 21
2019-07-31 09:03:40.452 - stderr> 19/07/31 09:03:40 INFO ContextCleaner: Cleaned accumulator 28
2019-07-31 09:03:40.452 - stderr> 19/07/31 09:03:40 INFO ContextCleaner: Cleaned accumulator 19
2019-07-31 09:03:40.452 - stderr> 19/07/31 09:03:40 INFO ContextCleaner: Cleaned accumulator 29
2019-07-31 09:03:40.452 - stderr> 19/07/31 09:03:40 INFO ContextCleaner: Cleaned accumulator 24
2019-07-31 09:03:40.452 - stderr> 19/07/31 09:03:40 INFO ContextCleaner: Cleaned accumulator 18
2019-07-31 09:03:40.452 - stderr> 19/07/31 09:03:40 INFO ContextCleaner: Cleaned accumulator 13
2019-07-31 09:03:40.452 - stderr> 19/07/31 09:03:40 INFO ContextCleaner: Cleaned accumulator 11
2019-07-31 09:03:40.452 - stderr> 19/07/31 09:03:40 INFO ContextCleaner: Cleaned accumulator 30
2019-07-31 09:03:40.452 - stderr> 19/07/31 09:03:40 INFO ContextCleaner: Cleaned accumulator 15
2019-07-31 09:03:40.452 - stderr> 19/07/31 09:03:40 INFO ContextCleaner: Cleaned accumulator 32
2019-07-31 09:03:40.453 - stderr> 19/07/31 09:03:40 INFO ContextCleaner: Cleaned accumulator 33
2019-07-31 09:03:40.453 - stderr> 19/07/31 09:03:40 INFO ContextCleaner: Cleaned accumulator 25
2019-07-31 09:03:40.453 - stderr> 19/07/31 09:03:40 INFO ContextCleaner: Cleaned accumulator 31
2019-07-31 09:03:40.453 - stderr> 19/07/31 09:03:40 INFO ContextCleaner: Cleaned accumulator 12
2019-07-31 09:03:40.453 - stderr> 19/07/31 09:03:40 INFO ContextCleaner: Cleaned accumulator 16
2019-07-31 09:03:40.453 - stderr> 19/07/31 09:03:40 INFO ContextCleaner: Cleaned accumulator 26
2019-07-31 09:03:40.453 - stderr> 19/07/31 09:03:40 INFO ContextCleaner: Cleaned accumulator 34
2019-07-31 09:03:40.453 - stderr> 19/07/31 09:03:40 INFO ContextCleaner: Cleaned accumulator 23
2019-07-31 09:03:40.453 - stderr> 19/07/31 09:03:40 INFO ContextCleaner: Cleaned accumulator 20
2019-07-31 09:03:40.453 - stderr> 19/07/31 09:03:40 INFO ContextCleaner: Cleaned accumulator 14
2019-07-31 09:03:40.466 - stderr> 19/07/31 09:03:40 INFO BlockManagerInfo: Removed broadcast_0_piece0 on amp-jenkins-worker-02.amp:35258 in memory (size: 54.9 KB, free: 366.3 MB)
2019-07-31 09:03:40.469 - stderr> 19/07/31 09:03:40 INFO ContextCleaner: Cleaned accumulator 27
2019-07-31 09:03:40.478 - stderr> 19/07/31 09:03:40 INFO HiveMetaStore: 0: get_database: default
2019-07-31 09:03:40.478 - stderr> 19/07/31 09:03:40 INFO audit: ugi=jenkins	ip=unknown-ip-addr	cmd=get_database: default	
2019-07-31 09:03:40.481 - stderr> 19/07/31 09:03:40 INFO HiveMetaStore: 0: get_table : db=default tbl=data_source_tbl_1
2019-07-31 09:03:40.481 - stderr> 19/07/31 09:03:40 INFO audit: ugi=jenkins	ip=unknown-ip-addr	cmd=get_table : db=default tbl=data_source_tbl_1	
2019-07-31 09:03:40.487 - stderr> 19/07/31 09:03:40 INFO HiveMetaStore: 0: get_database: default
2019-07-31 09:03:40.487 - stderr> 19/07/31 09:03:40 INFO audit: ugi=jenkins	ip=unknown-ip-addr	cmd=get_database: default	
2019-07-31 09:03:40.491 - stderr> 19/07/31 09:03:40 INFO HiveMetaStore: 0: get_table : db=default tbl=data_source_tbl_1
2019-07-31 09:03:40.491 - stderr> 19/07/31 09:03:40 INFO audit: ugi=jenkins	ip=unknown-ip-addr	cmd=get_table : db=default tbl=data_source_tbl_1	
2019-07-31 09:03:40.556 - stderr> 19/07/31 09:03:40 WARN HiveExternalCatalog: Couldn't find corresponding Hive SerDe for data source provider json. Persisting data source table `default`.`data_source_tbl_1` into Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.
2019-07-31 09:03:40.781 - stderr> 19/07/31 09:03:40 INFO HiveMetaStore: 0: create_table: Table(tableName:data_source_tbl_1, dbName:default, owner:jenkins, createTime:1564588998, lastAccessTime:0, retention:0, sd:StorageDescriptor(cols:[FieldSchema(name:col, type:array<string>, comment:from deserializer)], location:null, inputFormat:org.apache.hadoop.mapred.SequenceFileInputFormat, outputFormat:org.apache.hadoop.hive.ql.io.HiveSequenceFileOutputFormat, compressed:false, numBuckets:-1, serdeInfo:SerDeInfo(name:null, serializationLib:org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, parameters:{path=file:/home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/sql/hive/target/tmp/org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite/warehouse-f050d33c-2efb-4f69-944b-1110063c669e/data_source_tbl_1, serialization.format=1}), bucketCols:[], sortCols:[], parameters:{}, skewedInfo:SkewedInfo(skewedColNames:[], skewedColValues:[], skewedColValueLocationMaps:{})), partitionKeys:[], parameters:{spark.sql.sources.schema.part.0={"type":"struct","fields":[{"name":"i","type":"integer","nullable":true,"metadata":{}}]}, spark.sql.sources.schema.numParts=1, spark.sql.sources.provider=json, spark.sql.create.version=2.4.3}, viewOriginalText:null, viewExpandedText:null, tableType:MANAGED_TABLE, privileges:PrincipalPrivilegeSet(userPrivileges:{}, groupPrivileges:null, rolePrivileges:null))
2019-07-31 09:03:40.781 - stderr> 19/07/31 09:03:40 INFO audit: ugi=jenkins	ip=unknown-ip-addr	cmd=create_table: Table(tableName:data_source_tbl_1, dbName:default, owner:jenkins, createTime:1564588998, lastAccessTime:0, retention:0, sd:StorageDescriptor(cols:[FieldSchema(name:col, type:array<string>, comment:from deserializer)], location:null, inputFormat:org.apache.hadoop.mapred.SequenceFileInputFormat, outputFormat:org.apache.hadoop.hive.ql.io.HiveSequenceFileOutputFormat, compressed:false, numBuckets:-1, serdeInfo:SerDeInfo(name:null, serializationLib:org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, parameters:{path=file:/home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/sql/hive/target/tmp/org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite/warehouse-f050d33c-2efb-4f69-944b-1110063c669e/data_source_tbl_1, serialization.format=1}), bucketCols:[], sortCols:[], parameters:{}, skewedInfo:SkewedInfo(skewedColNames:[], skewedColValues:[], skewedColValueLocationMaps:{})), partitionKeys:[], parameters:{spark.sql.sources.schema.part.0={"type":"struct","fields":[{"name":"i","type":"integer","nullable":true,"metadata":{}}]}, spark.sql.sources.schema.numParts=1, spark.sql.sources.provider=json, spark.sql.create.version=2.4.3}, viewOriginalText:null, viewExpandedText:null, tableType:MANAGED_TABLE, privileges:PrincipalPrivilegeSet(userPrivileges:{}, groupPrivileges:null, rolePrivileges:null))	
2019-07-31 09:03:40.79 - stderr> 19/07/31 09:03:40 INFO log: Updating table stats fast for data_source_tbl_1
2019-07-31 09:03:40.791 - stderr> 19/07/31 09:03:40 INFO log: Updated size of table data_source_tbl_1 to 8
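[Note] The create_table call above persists `default`.`data_source_tbl_1` with provider json, which is what triggers the earlier WARN about no corresponding Hive SerDe. A minimal PySpark sketch of the kind of statement that produces such a metastore entry (an assumption; the actual statements in the submitted test script are not shown in this log):

    # Sketch (assumption): a JSON-provider table has no Hive SerDe, so Spark
    # stores it in its own metastore format and logs the "NOT compatible with
    # Hive" warning seen above.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.enableHiveSupport().getOrCreate()
    spark.sql("CREATE TABLE data_source_tbl_1 (i INT) USING json")
    spark.sql("INSERT INTO data_source_tbl_1 SELECT 1")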
2019-07-31 09:03:41.13 - stderr> 19/07/31 09:03:41 INFO HiveMetaStore: 0: get_table : db=default tbl=hive_compatible_data_source_tbl_1
2019-07-31 09:03:41.13 - stderr> 19/07/31 09:03:41 INFO audit: ugi=jenkins	ip=unknown-ip-addr	cmd=get_table : db=default tbl=hive_compatible_data_source_tbl_1	
2019-07-31 09:03:41.133 - stderr> 19/07/31 09:03:41 INFO HiveMetaStore: 0: get_database: default
2019-07-31 09:03:41.133 - stderr> 19/07/31 09:03:41 INFO audit: ugi=jenkins	ip=unknown-ip-addr	cmd=get_database: default	
2019-07-31 09:03:41.135 - stderr> 19/07/31 09:03:41 INFO HiveMetaStore: 0: get_database: default
2019-07-31 09:03:41.135 - stderr> 19/07/31 09:03:41 INFO audit: ugi=jenkins	ip=unknown-ip-addr	cmd=get_database: default	
2019-07-31 09:03:41.138 - stderr> 19/07/31 09:03:41 INFO HiveMetaStore: 0: get_database: default
2019-07-31 09:03:41.138 - stderr> 19/07/31 09:03:41 INFO audit: ugi=jenkins	ip=unknown-ip-addr	cmd=get_database: default	
2019-07-31 09:03:41.141 - stderr> 19/07/31 09:03:41 INFO HiveMetaStore: 0: get_database: default
2019-07-31 09:03:41.141 - stderr> 19/07/31 09:03:41 INFO audit: ugi=jenkins	ip=unknown-ip-addr	cmd=get_database: default	
2019-07-31 09:03:41.161 - stderr> 19/07/31 09:03:41 INFO ParquetFileFormat: Using default output committer for Parquet: org.apache.parquet.hadoop.ParquetOutputCommitter
2019-07-31 09:03:41.171 - stderr> 19/07/31 09:03:41 INFO FileOutputCommitter: File Output Committer Algorithm version is 1
2019-07-31 09:03:41.173 - stderr> 19/07/31 09:03:41 INFO SQLHadoopMapReduceCommitProtocol: Using user defined output committer class org.apache.parquet.hadoop.ParquetOutputCommitter
2019-07-31 09:03:41.173 - stderr> 19/07/31 09:03:41 INFO FileOutputCommitter: File Output Committer Algorithm version is 1
2019-07-31 09:03:41.173 - stderr> 19/07/31 09:03:41 INFO SQLHadoopMapReduceCommitProtocol: Using output committer class org.apache.parquet.hadoop.ParquetOutputCommitter
2019-07-31 09:03:41.284 - stderr> 19/07/31 09:03:41 INFO SparkContext: Starting job: sql at NativeMethodAccessorImpl.java:0
2019-07-31 09:03:41.287 - stderr> 19/07/31 09:03:41 INFO DAGScheduler: Got job 1 (sql at NativeMethodAccessorImpl.java:0) with 1 output partitions
2019-07-31 09:03:41.287 - stderr> 19/07/31 09:03:41 INFO DAGScheduler: Final stage: ResultStage 1 (sql at NativeMethodAccessorImpl.java:0)
2019-07-31 09:03:41.287 - stderr> 19/07/31 09:03:41 INFO DAGScheduler: Parents of final stage: List()
2019-07-31 09:03:41.287 - stderr> 19/07/31 09:03:41 INFO DAGScheduler: Missing parents: List()
2019-07-31 09:03:41.288 - stderr> 19/07/31 09:03:41 INFO DAGScheduler: Submitting ResultStage 1 (MapPartitionsRDD[4] at sql at NativeMethodAccessorImpl.java:0), which has no missing parents
2019-07-31 09:03:41.39 - stderr> 19/07/31 09:03:41 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 147.7 KB, free 366.2 MB)
2019-07-31 09:03:41.393 - stderr> 19/07/31 09:03:41 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 52.8 KB, free 366.1 MB)
2019-07-31 09:03:41.399 - stderr> 19/07/31 09:03:41 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on amp-jenkins-worker-02.amp:35258 (size: 52.8 KB, free: 366.2 MB)
2019-07-31 09:03:41.4 - stderr> 19/07/31 09:03:41 INFO SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:1161
2019-07-31 09:03:41.401 - stderr> 19/07/31 09:03:41 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 1 (MapPartitionsRDD[4] at sql at NativeMethodAccessorImpl.java:0) (first 15 tasks are for partitions Vector(0))
2019-07-31 09:03:41.402 - stderr> 19/07/31 09:03:41 INFO TaskSchedulerImpl: Adding task set 1.0 with 1 tasks
2019-07-31 09:03:41.403 - stderr> 19/07/31 09:03:41 INFO TaskSetManager: Starting task 0.0 in stage 1.0 (TID 1, localhost, executor driver, partition 0, PROCESS_LOCAL, 8080 bytes)
2019-07-31 09:03:41.404 - stderr> 19/07/31 09:03:41 INFO Executor: Running task 0.0 in stage 1.0 (TID 1)
2019-07-31 09:03:41.439 - stderr> 19/07/31 09:03:41 INFO FileOutputCommitter: File Output Committer Algorithm version is 1
2019-07-31 09:03:41.439 - stderr> 19/07/31 09:03:41 INFO SQLHadoopMapReduceCommitProtocol: Using user defined output committer class org.apache.parquet.hadoop.ParquetOutputCommitter
2019-07-31 09:03:41.44 - stderr> 19/07/31 09:03:41 INFO FileOutputCommitter: File Output Committer Algorithm version is 1
2019-07-31 09:03:41.44 - stderr> 19/07/31 09:03:41 INFO SQLHadoopMapReduceCommitProtocol: Using output committer class org.apache.parquet.hadoop.ParquetOutputCommitter
2019-07-31 09:03:41.445 - stderr> 19/07/31 09:03:41 INFO CodecConfig: Compression: SNAPPY
2019-07-31 09:03:41.446 - stderr> 19/07/31 09:03:41 INFO CodecConfig: Compression: SNAPPY
2019-07-31 09:03:41.466 - stderr> 19/07/31 09:03:41 INFO ParquetOutputFormat: Parquet block size to 134217728
2019-07-31 09:03:41.466 - stderr> 19/07/31 09:03:41 INFO ParquetOutputFormat: Parquet page size to 1048576
2019-07-31 09:03:41.466 - stderr> 19/07/31 09:03:41 INFO ParquetOutputFormat: Parquet dictionary page size to 1048576
2019-07-31 09:03:41.466 - stderr> 19/07/31 09:03:41 INFO ParquetOutputFormat: Dictionary is on
2019-07-31 09:03:41.466 - stderr> 19/07/31 09:03:41 INFO ParquetOutputFormat: Validation is off
2019-07-31 09:03:41.466 - stderr> 19/07/31 09:03:41 INFO ParquetOutputFormat: Writer version is: PARQUET_1_0
2019-07-31 09:03:41.466 - stderr> 19/07/31 09:03:41 INFO ParquetOutputFormat: Maximum row group padding size is 8388608 bytes
2019-07-31 09:03:41.466 - stderr> 19/07/31 09:03:41 INFO ParquetOutputFormat: Page size checking is: estimated
2019-07-31 09:03:41.467 - stderr> 19/07/31 09:03:41 INFO ParquetOutputFormat: Min row count for page size check is: 100
2019-07-31 09:03:41.467 - stderr> 19/07/31 09:03:41 INFO ParquetOutputFormat: Max row count for page size check is: 10000
2019-07-31 09:03:41.516 - stderr> 19/07/31 09:03:41 INFO ParquetWriteSupport: Initialized Parquet WriteSupport with Catalyst schema:
2019-07-31 09:03:41.516 - stderr> {
2019-07-31 09:03:41.516 - stderr>   "type" : "struct",
2019-07-31 09:03:41.516 - stderr>   "fields" : [ {
2019-07-31 09:03:41.516 - stderr>     "name" : "i",
2019-07-31 09:03:41.516 - stderr>     "type" : "integer",
2019-07-31 09:03:41.516 - stderr>     "nullable" : false,
2019-07-31 09:03:41.516 - stderr>     "metadata" : { }
2019-07-31 09:03:41.516 - stderr>   } ]
2019-07-31 09:03:41.516 - stderr> }
2019-07-31 09:03:41.516 - stderr> and corresponding Parquet message type:
2019-07-31 09:03:41.516 - stderr> message spark_schema {
2019-07-31 09:03:41.516 - stderr>   required int32 i;
2019-07-31 09:03:41.516 - stderr> }
2019-07-31 09:03:41.516 - stderr> 
2019-07-31 09:03:41.516 - stderr>        
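[Note] The ParquetWriteSupport message above shows a non-nullable integer column `i` mapped to `required int32 i` in the Parquet schema. A hedged sketch of a write that yields this mapping (paths and names are illustrative); in Spark 2.4 the Parquet codec defaults to snappy via spark.sql.parquet.compression.codec, which is why the Snappy compressor is requested next:

    # Sketch (assumption): a non-nullable IntegerType column becomes
    # "required int32 i" in the Parquet message type, and the page data is
    # snappy-compressed unless the codec conf is changed.
    from pyspark.sql import SparkSession
    from pyspark.sql.types import IntegerType, StructField, StructType

    spark = SparkSession.builder.getOrCreate()
    schema = StructType([StructField("i", IntegerType(), nullable=False)])
    df = spark.createDataFrame([(1,)], schema)
    df.write.mode("overwrite").parquet("/tmp/hive_compatible_data_source_tbl_1")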
2019-07-31 09:03:41.563 - stderr> 19/07/31 09:03:41 INFO CodecPool: Got brand-new compressor [.snappy]
2019-07-31 09:03:41.622 - stderr> 19/07/31 09:03:41 INFO InternalParquetRecordWriter: Flushing mem columnStore to file. allocated memory: 8
2019-07-31 09:03:42.001 - stderr> java.io.FileNotFoundException: /tmp/test-spark/spark-2.4.3/jars/snappy-java-1.1.7.3.jar (No such file or directory)
2019-07-31 09:03:42.002 - stderr> java.lang.NullPointerException
2019-07-31 09:03:42.002 - stderr> 	at org.xerial.snappy.SnappyLoader.extractLibraryFile(SnappyLoader.java:243)
2019-07-31 09:03:42.002 - stderr> 	at org.xerial.snappy.SnappyLoader.findNativeLibrary(SnappyLoader.java:355)
2019-07-31 09:03:42.002 - stderr> 	at org.xerial.snappy.SnappyLoader.loadNativeLibrary(SnappyLoader.java:176)
2019-07-31 09:03:42.002 - stderr> 	at org.xerial.snappy.SnappyLoader.loadSnappyApi(SnappyLoader.java:154)
2019-07-31 09:03:42.002 - stderr> 	at org.xerial.snappy.Snappy.<clinit>(Snappy.java:47)
2019-07-31 09:03:42.002 - stderr> 	at org.apache.parquet.hadoop.codec.SnappyCompressor.compress(SnappyCompressor.java:67)
2019-07-31 09:03:42.002 - stderr> 	at org.apache.hadoop.io.compress.CompressorStream.compress(CompressorStream.java:81)
2019-07-31 09:03:42.002 - stderr> 	at org.apache.hadoop.io.compress.CompressorStream.finish(CompressorStream.java:92)
2019-07-31 09:03:42.002 - stderr> 	at org.apache.parquet.hadoop.CodecFactory$HeapBytesCompressor.compress(CodecFactory.java:165)
2019-07-31 09:03:42.002 - stderr> 	at org.apache.parquet.hadoop.ColumnChunkPageWriteStore$ColumnChunkPageWriter.writePage(ColumnChunkPageWriteStore.java:95)
2019-07-31 09:03:42.002 - stderr> 	at org.apache.parquet.column.impl.ColumnWriterV1.writePage(ColumnWriterV1.java:147)
2019-07-31 09:03:42.002 - stderr> 	at org.apache.parquet.column.impl.ColumnWriterV1.flush(ColumnWriterV1.java:235)
2019-07-31 09:03:42.002 - stderr> 	at org.apache.parquet.column.impl.ColumnWriteStoreV1.flush(ColumnWriteStoreV1.java:122)
2019-07-31 09:03:42.002 - stderr> 	at org.apache.parquet.hadoop.InternalParquetRecordWriter.flushRowGroupToStore(InternalParquetRecordWriter.java:172)
2019-07-31 09:03:42.002 - stderr> 	at org.apache.parquet.hadoop.InternalParquetRecordWriter.close(InternalParquetRecordWriter.java:114)
2019-07-31 09:03:42.002 - stderr> 	at org.apache.parquet.hadoop.ParquetRecordWriter.close(ParquetRecordWriter.java:165)
2019-07-31 09:03:42.002 - stderr> 	at org.apache.spark.sql.execution.datasources.parquet.ParquetOutputWriter.close(ParquetOutputWriter.scala:42)
2019-07-31 09:03:42.003 - stderr> 	at org.apache.spark.sql.execution.datasources.FileFormatDataWriter.releaseResources(FileFormatDataWriter.scala:57)
2019-07-31 09:03:42.003 - stderr> 	at org.apache.spark.sql.execution.datasources.FileFormatDataWriter.commit(FileFormatDataWriter.scala:74)
2019-07-31 09:03:42.003 - stderr> 	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$3.apply(FileFormatWriter.scala:247)
2019-07-31 09:03:42.003 - stderr> 	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$3.apply(FileFormatWriter.scala:242)
2019-07-31 09:03:42.003 - stderr> 	at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1394)
2019-07-31 09:03:42.003 - stderr> 	at org.apache.spark.sql.execution.datasources.FileFormatWriter$.org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask(FileFormatWriter.scala:248)
2019-07-31 09:03:42.003 - stderr> 	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1.apply(FileFormatWriter.scala:170)
2019-07-31 09:03:42.003 - stderr> 	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1.apply(FileFormatWriter.scala:169)
2019-07-31 09:03:42.003 - stderr> 	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
2019-07-31 09:03:42.003 - stderr> 	at org.apache.spark.scheduler.Task.run(Task.scala:121)
2019-07-31 09:03:42.003 - stderr> 	at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
2019-07-31 09:03:42.003 - stderr> 	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
2019-07-31 09:03:42.003 - stderr> 	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
2019-07-31 09:03:42.003 - stderr> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
2019-07-31 09:03:42.003 - stderr> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
2019-07-31 09:03:42.003 - stderr> 	at java.lang.Thread.run(Thread.java:748)
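[Note] The FileNotFoundException above shows that snappy-java-1.1.7.3.jar is missing from /tmp/test-spark/spark-2.4.3/jars/, so SnappyLoader cannot extract its native library and page compression fails, aborting the write task below. A hedged workaround sketch, assuming it is acceptable to avoid the native Snappy dependency for this write (spark.sql.parquet.compression.codec is an existing Spark SQL conf; "gzip" and "uncompressed" need no Snappy native library):

    # Sketch (assumption): switch the Parquet codec before writing so the
    # Snappy native loader is never exercised.
    spark.conf.set("spark.sql.parquet.compression.codec", "gzip")
    # or when launching:
    #   ./bin/spark-submit --conf spark.sql.parquet.compression.codec=uncompressed ...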
2019-07-31 09:03:42.008 - stderr> 19/07/31 09:03:42 ERROR Utils: Aborting task
2019-07-31 09:03:42.008 - stderr> org.xerial.snappy.SnappyError: [FAILED_TO_LOAD_NATIVE_LIBRARY] null
2019-07-31 09:03:42.009 - stderr> 	at org.xerial.snappy.SnappyLoader.loadNativeLibrary(SnappyLoader.java:187)
2019-07-31 09:03:42.009 - stderr> 	at org.xerial.snappy.SnappyLoader.loadSnappyApi(SnappyLoader.java:154)
2019-07-31 09:03:42.009 - stderr> 	at org.xerial.snappy.Snappy.<clinit>(Snappy.java:47)
2019-07-31 09:03:42.009 - stderr> 	at org.apache.parquet.hadoop.codec.SnappyCompressor.compress(SnappyCompressor.java:67)
2019-07-31 09:03:42.009 - stderr> 	at org.apache.hadoop.io.compress.CompressorStream.compress(CompressorStream.java:81)
2019-07-31 09:03:42.009 - stderr> 	at org.apache.hadoop.io.compress.CompressorStream.finish(CompressorStream.java:92)
2019-07-31 09:03:42.009 - stderr> 	at org.apache.parquet.hadoop.CodecFactory$HeapBytesCompressor.compress(CodecFactory.java:165)
2019-07-31 09:03:42.009 - stderr> 	at org.apache.parquet.hadoop.ColumnChunkPageWriteStore$ColumnChunkPageWriter.writePage(ColumnChunkPageWriteStore.java:95)
2019-07-31 09:03:42.009 - stderr> 	at org.apache.parquet.column.impl.ColumnWriterV1.writePage(ColumnWriterV1.java:147)
2019-07-31 09:03:42.009 - stderr> 	at org.apache.parquet.column.impl.ColumnWriterV1.flush(ColumnWriterV1.java:235)
2019-07-31 09:03:42.009 - stderr> 	at org.apache.parquet.column.impl.ColumnWriteStoreV1.flush(ColumnWriteStoreV1.java:122)
2019-07-31 09:03:42.009 - stderr> 	at org.apache.parquet.hadoop.InternalParquetRecordWriter.flushRowGroupToStore(InternalParquetRecordWriter.java:172)
2019-07-31 09:03:42.009 - stderr> 	at org.apache.parquet.hadoop.InternalParquetRecordWriter.close(InternalParquetRecordWriter.java:114)
2019-07-31 09:03:42.009 - stderr> 	at org.apache.parquet.hadoop.ParquetRecordWriter.close(ParquetRecordWriter.java:165)
2019-07-31 09:03:42.009 - stderr> 	at org.apache.spark.sql.execution.datasources.parquet.ParquetOutputWriter.close(ParquetOutputWriter.scala:42)
2019-07-31 09:03:42.009 - stderr> 	at org.apache.spark.sql.execution.datasources.FileFormatDataWriter.releaseResources(FileFormatDataWriter.scala:57)
2019-07-31 09:03:42.009 - stderr> 	at org.apache.spark.sql.execution.datasources.FileFormatDataWriter.commit(FileFormatDataWriter.scala:74)
2019-07-31 09:03:42.009 - stderr> 	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$3.apply(FileFormatWriter.scala:247)
2019-07-31 09:03:42.009 - stderr> 	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$3.apply(FileFormatWriter.scala:242)
2019-07-31 09:03:42.009 - stderr> 	at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1394)
2019-07-31 09:03:42.009 - stderr> 	at org.apache.spark.sql.execution.datasources.FileFormatWriter$.org$apache$spark$sql$