Compiling Intel OAP

This post walks through installing conda on Linux and switching it to a domestic (China) mirror, then downloading and compiling Intel's Arrow fork, including editing a config file, creating a conda environment, installing dependencies, and enabling the relevant features through CMake options. It then covers downloading and building OAP and setting the corresponding Spark configuration. The whole process spans source checkout, dependency installation, compilation, and configuration.

1. Install conda and replace its default channels with a domestic (China) mirror
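The post does not show the mirror configuration itself. One common choice is the Tsinghua (TUNA) mirror; a minimal sketch of `~/.condarc`, assuming that mirror, might look like:

```yaml
# ~/.condarc -- example only; any domestic conda mirror works the same way
channels:
  - defaults
show_channel_urls: true
default_channels:
  - https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
  - https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/r
```

Run `conda clean -i` afterwards so the package index cache is rebuilt against the new mirror.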

2. Download Intel's Arrow fork:

git clone https://github.com/Intel-bigdata/arrow.git
cd arrow && git checkout native-sql-engine-clean

 

# Pin the LLVM toolchain to version 7 in ci/conda_env_gandiva.yml:
vim ci/conda_env_gandiva.yml
clangdev=7
llvmdev=7

conda create -y -n pyarrow-dev -c conda-forge \
    --file ci/conda_env_unix.yml \
    --file ci/conda_env_cpp.yml \
    --file ci/conda_env_python.yml \
    --file ci/conda_env_gandiva.yml \
    compilers \
    python=3.7 \
    pandas
conda activate pyarrow-dev

3. Install the test frameworks:

yum install gtest-devel
yum install gmock

4. Build Arrow:

cd arrow && git checkout native-sql-engine-clean
git submodule update --init --recursive
mkdir -p cpp/release-build
cd cpp/release-build
cmake -DARROW_GANDIVA_JAVA=ON -DARROW_GANDIVA=ON -DARROW_PARQUET=ON -DARROW_HDFS=ON -DARROW_BOOST_USE_SHARED=ON -DARROW_JNI=ON -DARROW_WITH_SNAPPY=ON -DARROW_WITH_LZ4=ON -DARROW_FILESYSTEM=ON -DARROW_JSON=ON -DARROW_DATASET=ON ..
make -j
make install

# build java
cd ../../java
# change property 'arrow.cpp.build.dir' to the relative path of cpp build dir in gandiva/pom.xml
mvn clean install -P arrow-jni -am -Darrow.cpp.build.dir=../cpp/release-build/release/ -DskipTests 
# if you are behind a proxy, also pass the SOCKS proxy settings
mvn clean install -P arrow-jni -am -Darrow.cpp.build.dir=../cpp/release-build/release/ -DskipTests -DsocksProxyHost=${proxyHost} -DsocksProxyPort=1080 
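The `arrow.cpp.build.dir` property mentioned in the comment above is declared in `gandiva/pom.xml`. A rough sketch of the edit (the surrounding XML may differ by branch; only the property value matters):

```xml
<!-- gandiva/pom.xml -- point the JNI build at the C++ build output -->
<properties>
  <arrow.cpp.build.dir>../cpp/release-build/release/</arrow.cpp.build.dir>
</properties>
```

Passing `-Darrow.cpp.build.dir=...` on the command line, as in the mvn invocations above, overrides this property without editing the file.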

 

5. Build OAP

git clone https://github.com/Intel-bigdata/OAP.git
cd OAP && git checkout branch-nativesql-spark-3.0.0
cd oap-native-sql
cd cpp/
mkdir build/
cd build/
cmake .. -DTESTS=ON
make -j

6. Spark configuration:

spark-defaults.conf:

spark.sql.sources.useV1SourceList     avro
spark.sql.join.preferSortMergeJoin    false
spark.sql.extensions                  com.intel.oap.ColumnarPlugin
spark.shuffle.manager                 org.apache.spark.shuffle.sort.ColumnarShuffleManager
spark.shuffle.compress                true
spark.io.compression.codec            lz4
spark.executorEnv.LD_LIBRARY_PATH     ${ld_library_path}/libs
spark.executorEnv.ARROW_LIBHDFS3_DIR  ${ld_library_path}/libs
