Bought an ARM board online (http://item.taobao.com/item.htm?spm=a1z09.2.9.21.0jIdiD&id=44152364999&_u=d1en2tf048c). What sold me was the CPU: a Rockchip RK3288 with four ARM Cortex-A17 cores at 1.8 GHz and a Mali-T764 GPU, and it ships with Ubuntu 14.04 preinstalled.
After it arrived I switched it straight to Ubuntu, logged in over SSH, and did some setup:
1. Disable lightdm. I never use a GUI on Linux, so:
sudo service lightdm stop
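A `stop` only lasts until the next reboot. Ubuntu 14.04 boots with Upstart, where a job can be masked with an override file; a sketch for keeping lightdm from coming back, assuming the stock /etc/init/lightdm.conf job:

```shell
# tell Upstart to start lightdm only when explicitly asked to
echo manual | sudo tee /etc/init/lightdm.override
```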
2. Change the time zone
ln -sf /usr/share/zoneinfo/Asia/Shanghai /etc/localtime
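The symlink takes effect immediately; on Ubuntu 14.04 some tools also read `/etc/timezone`, so it's worth updating that too (`echo Asia/Shanghai > /etc/timezone`). A quick sanity check of the zone data, no root needed:

```shell
# Asia/Shanghai is UTC+8 with no daylight saving, so the offset is constant
TZ=Asia/Shanghai date +%z   # prints +0800
```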
3. Tune sshd and restart it (UseDNS no skips the reverse-DNS lookup at login, which avoids slow SSH connects):
echo "UseDNS no" >> /etc/ssh/sshd_config
service ssh restart
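One caveat: the `>>` append adds a duplicate line every time it's re-run. A guarded version, demonstrated here against a scratch copy (on the board, point CONF at /etc/ssh/sshd_config):

```shell
CONF=$(mktemp)                       # scratch copy for the demo
printf 'Port 22\n' > "$CONF"
# append only if no UseDNS line exists yet
grep -q '^UseDNS' "$CONF" || echo 'UseDNS no' >> "$CONF"
grep -q '^UseDNS' "$CONF" || echo 'UseDNS no' >> "$CONF"   # re-running is a no-op
grep -c '^UseDNS' "$CONF"            # prints 1, not 2
```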
4. Install gcc and friends. Be sure to run update first, otherwise package downloads are likely to fail with HTTP 404 errors. Alternatively, just download clang directly.
apt-get update
apt-get install build-essential
Then it was time to try it out:
1. Run Spark
1. Download the JDK (Hard Float ABI): http://www.oracle.com/technetwork/java/javase/downloads/jdk7-arm-downloads-2187468.html
2. Download Spark: http://www.apache.org/dyn/closer.cgi/spark/spark-1.3.1/spark-1.3.1-bin-hadoop2.4.tgz
3. Configure Spark. By default Spark uses snappy-java for data compression, but it goes through a native .so, which throws on ARM, so all the compress options have to be disabled:
# cat conf/spark-defaults.conf
spark.serializer = org.apache.spark.serializer.KryoSerializer
spark.ui.showConsoleProgress = false
spark.local.dir = /dev/shm/spark
# java.lang.IllegalArgumentException
# at org.apache.spark.io.SnappyCompressionCodec.<init>(CompressionCodec.scala:152)
spark.broadcast.compress = false
spark.shuffle.spill.compress = false
spark.shuffle.compress = false
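An alternative I haven't verified on this board: rather than disabling compression entirely, Spark can be switched to its pure-Java LZF codec, which avoids the native snappy library, with one extra line in conf/spark-defaults.conf:

```
spark.io.compression.codec = lzf
```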
4. bin/spark-shell
sc.textFile("/opt/100k").flatMap(line => line.split("\\W+")).map(word => (word, 1)).reduceByKey((a, b) => a + b).count
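The job above splits each line on `\W+`, sums per-word counts, and then `count` returns the number of distinct words. Plain shell can cross-check that on a small sample (the sample file here is made up):

```shell
printf 'hello world hello spark\n' > /tmp/100k-sample
# split on non-word characters, dedupe, count distinct words
grep -oE '[[:alnum:]_]+' /tmp/100k-sample | sort -u | wc -l   # prints 3
```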
2. Build Nginx
1. Download the Nginx source (http://nginx.org/download/nginx-1.8.0.tar.gz)
2. Build it. Note that configure will complain about missing packages:
apt-get install libpcre3 libpcre3-dev openssl libssl-dev
./configure --prefix=/opt/nginx
make -j
make install
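One thing to note: even with libssl-dev installed, the stock `./configure` does not build TLS support; the module has to be requested explicitly. A sketch with the same prefix as above:

```shell
./configure --prefix=/opt/nginx --with-http_ssl_module
make -j && make install
```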
3. Run it
/opt/nginx/sbin/nginx
After all this it behaves just like any other Ubuntu 14.04 box, apart from the weaker performance.