Problems encountered while installing Spark

1. When starting Spark SQL, the following error is reported:

Caused by: org.datanucleus.store.rdbms.connectionpool.DatastoreDriverNotFoundException:

The specified datastore driver ("com.mysql.jdbc.Driver ") was not found in the CLASSPATH. Please check your CLASSPATH specification, and the name of the driver.

Solution: add the MySQL JDBC driver to Spark's classpath by configuring the following in $SPARK_HOME/conf/spark-env.sh:

export SPARK_CLASSPATH=$HIVE_HOME/lib/mysql-connector-java-5.1.6-bin.jar
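Note that SPARK_CLASSPATH is deprecated in Spark 1.0 and later; as an alternative sketch, the connector jar can instead be passed when launching spark-sql (the jar path below assumes the same mysql-connector-java-5.1.6-bin.jar location as above, so adjust it to the version actually installed):

# pass the MySQL driver jar to both the driver classpath and the executors
spark-sql --driver-class-path $HIVE_HOME/lib/mysql-connector-java-5.1.6-bin.jar \
          --jars $HIVE_HOME/lib/mysql-connector-java-5.1.6-bin.jar

Either approach works; editing spark-env.sh applies the setting to every launch, while the command-line flags only affect the current session.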