1. Configure Kylin's Spark parameters
1)cd $KYLIN_HOME/conf
2)vim kylin.properties
kylin.engine.spark-conf.spark.master=yarn
kylin.engine.spark-conf.spark.submit.deployMode=cluster
kylin.engine.spark-conf.spark.yarn.queue=default
kylin.engine.spark-conf.spark.driver.memory=2G
kylin.engine.spark-conf.spark.executor.memory=4G
kylin.engine.spark-conf.spark.executor.instances=40
kylin.engine.spark-conf.spark.yarn.executor.memoryOverhead=1024
kylin.engine.spark-conf.spark.shuffle.service.enabled=true
kylin.engine.spark-conf.spark.eventLog.enabled=true
kylin.engine.spark-conf.spark.eventLog.dir=hdfs\:///kylin/spark-history
kylin.engine.spark-conf.spark.history.fs.logDirectory=hdfs\:///kylin/spark-history
kylin.engine.spark-conf.spark.yarn.archive=hdfs://10.4.7.16:8020/kylin/spark/spark-libs.jar
kylin.engine.spark-conf.spark.io.compression.codec=org.apache.spark.io.SnappyCompressionCodec
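The executor count and memory sizes above are only examples and should be tuned to the capacity of the YARN queue. As an alternative to a fixed spark.executor.instances, Spark's dynamic allocation can be passed through the same kylin.engine.spark-conf. prefix; a minimal sketch (the property names are standard Spark settings, but verify them against your Kylin and Spark versions), relying on the external shuffle service already enabled above:
kylin.engine.spark-conf.spark.dynamicAllocation.enabled=true
kylin.engine.spark-conf.spark.dynamicAllocation.minExecutors=1
kylin.engine.spark-conf.spark.dynamicAllocation.maxExecutors=40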
3)cd $KYLIN_HOME
4)jar cv0f spark-libs.jar -C $KYLIN_HOME/spark/jars/ ./
5)hdfs dfs -mkdir -p /kylin/spark/
6)hdfs dfs -put spark-libs.jar /kylin/spark/
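Optionally verify the upload before restarting Kylin (standard HDFS shell command; the path must match the spark.yarn.archive setting configured above):
hdfs dfs -ls /kylin/spark/spark-libs.jar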
7)$KYLIN_HOME/bin/kylin.sh stop && $KYLIN_HOME/bin/kylin.sh start
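After the restart, the Kylin web UI should be reachable again on its default port 7070; if it is not, the startup log can be followed with, for example:
tail -f $KYLIN_HOME/logs/kylin.log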
2. Modify the Cube configuration
1) Model page
For kylin_sales_cube, click Action, then click Edit
2)Cube Designer
2.1)Cube info
2.2)Dimensions
2.3)Measures
2.4)Refresh Setting
2.5)Advanced Setting
Change the Cube Engine type from MapReduce to Spark
2.6)Configuration Overwrites
Click "+Property" and add the property "kylin.engine.spark.rdd-partition-cut-mb" with the value "500"
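This property caps the estimated amount of cuboid data (in MB) placed into each Spark RDD partition, so a larger value yields fewer, bigger partitions for this cube. If the partition count still needs bounding, related overrides can be added the same way; the names below follow the Kylin Spark engine documentation, but confirm them and their defaults for your Kylin version:
kylin.engine.spark.min-partition=1
kylin.engine.spark.max-partition=5000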
3. Build the Cube
1)${KYLIN_HOME}/spark/sbin/start-history-server.sh hdfs://10.4.7.16:8020/kylin/spark-history
2) Check the Monitor page
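The build job and its step-by-step logs show up on Kylin's Monitor page. Once the Spark cubing steps are running, executor-level details can also be inspected in the Spark history server started in step 1), which listens on port 18080 by default, e.g. http://10.4.7.16:18080 (hypothetical address; substitute the host where the history server actually runs).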