SparkSQL External Datasource Quick Start: Avro
Download the source and build:
git clone https://github.com/databricks/spark-avro.git
cd spark-avro
sbt/sbt package
Maven GAV:
groupId: com.databricks.spark
artifactId: spark-avro_2.10
version: 0.1
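If the build is also published to your local repository (for example with sbt/sbt publish-local), the same coordinates can be declared as an sbt dependency instead of handling the jar manually. A minimal sketch of the build.sbt line, using the GAV above:

// build.sbt -- depend on the locally published spark-avro artifact (sketch)
libraryDependencies += "com.databricks.spark" % "spark-avro_2.10" % "0.1"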
Add the built jar to the classpath in $SPARK_HOME/conf/spark-env.sh:
export SPARK_CLASSPATH=/home/spark/software/source/spark_package/spark-avro/target/scala-2.10/spark-avro_2.10-0.1.jar:$SPARK_CLASSPATH
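Alternatively, instead of editing spark-env.sh, the jar can be supplied when launching the shell via the --jars flag; the path below is simply the artifact produced by the build above:

spark-shell --jars /home/spark/software/source/spark_package/spark-avro/target/scala-2.10/spark-avro_2.10-0.1.jar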
Download the test data:
wget https://github.com/databricks/spark-avro/raw/master/src/test/resources/episodes.avro
Scala API:
import org.apache.spark.sql.SQLContext
val sqlContext = new SQLContext(sc)

import com.databricks.spark.avro._
// Load the Avro file as a SchemaRDD
val episodes = sqlContext.avroFile("file:///home/spark/software/data/episodes.avro")

import sqlContext._
// Select the title column
episodes.select('title).collect()
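Because avroFile returns a SchemaRDD, it can also be registered as a temporary table and queried with SQL from the same shell session. A minimal sketch continuing the code above (the table name "episodes" is arbitrary):

// Inspect the schema inferred from the Avro file
episodes.printSchema()

// Register the SchemaRDD as a temporary table and query it with SQL
episodes.registerTempTable("episodes")
sqlContext.sql("SELECT title FROM episodes").collect()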
SQL:
CREATE TEMPORARY TABLE episodes
USING com.databricks.spark.avro
OPTIONS (path "file:///home/spark/software/data/episodes.avro");

SELECT * FROM episodes;
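With the temporary table in place, ordinary Spark SQL queries work against it; for example, selecting only the title field used in the Scala snippet above (assuming that field exists in episodes.avro):

SELECT title FROM episodes LIMIT 5;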