Spark Official Guide Translation (Part 1)

Spark Overview

Apache Spark is a fast and general-purpose cluster computing system. It provides high-level APIs in Java, Scala, Python and R, and an optimized engine that supports general execution graphs. It also supports a rich set of higher-level tools including Spark SQL for SQL and structured data processing, MLlib for machine learning, GraphX for graph processing, and Spark Streaming.


Spark Overview

Apache Spark is a fast, general-purpose cluster computing system. It provides high-level APIs for Java, Scala, Python, and R, as well as an optimized engine that supports general execution graphs. It also supports a rich set of higher-level tools, including Spark SQL for SQL and structured data processing, MLlib for machine learning, GraphX for graph processing, and Spark Streaming.
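
To give a feel for what "high-level APIs" means here, below is a minimal Scala sketch (not taken from the official docs) that could be pasted into spark-shell, which predefines the SparkContext as sc and, in Spark 2.0, a SparkSession as spark:

// Minimal sketch of the core (RDD) API, assuming it runs inside spark-shell,
// where `sc` (SparkContext) and `spark` (SparkSession) are already defined.
val data  = sc.parallelize(1 to 100)        // distribute a local collection
val evens = data.filter(_ % 2 == 0)         // lazy transformation
println(evens.count())                      // action: triggers execution, prints 50

// Roughly the same computation through the Spark SQL / DataFrame API:
spark.range(1, 101).filter("id % 2 = 0").count()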

Downloading

Get Spark from the downloads page of the project website. This documentation is for Spark version 2.0.1. Spark uses Hadoop’s client libraries for HDFS and YARN. Downloads are pre-packaged for a handful of popular Hadoop versions. Users can also download a “Hadoop free” binary and run Spark with any Hadoop version by augmenting Spark’s classpath.

If you’d like to build Spark from source, visit Building Spark.

Spark runs on both Windows and UNIX-like systems (e.g. Linux, Mac OS). It’s easy to run locally on one machine — all you need is to have java installed on your system PATH, or the JAVA_HOME environment variable pointing to a Java installation.

Spark runs on Java 7+, Python 2.6+/3.4+ and R 3.1+. For the Scala API, Spark 2.0.1 uses Scala 2.11. You will need to use a compatible Scala version (2.11.x).

Downloading

Skipping the translation for this part...

Running the Examples and Shell

Spark comes with several sample programs. Scala, Java, Python and R examples are in the examples/src/main directory. To run one of the Java or Scala sample programs, use bin/run-example <class> [params] in the top-level Spark directory. (Behind the scenes, this invokes the more general spark-submit script for launching applications). For example,

./bin/run-example SparkPi 10

Running the Examples and Shell

Spark ships with several example programs. The Scala, Java, Python, and R examples are in the examples/src/main directory. To run one of the Java or Scala example programs, use the bin/run-example <class> [params] command in the top-level Spark directory. (Behind the scenes, this invokes the more general spark-submit script to launch the application.) For example,

./bin/run-example SparkPi 10
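
For reference, the SparkPi example estimates π with a Monte Carlo simulation. The following is a simplified Scala sketch of the same idea (not the exact source of the bundled example), written as if pasted into spark-shell where sc is predefined; the slices value plays the role of the [params] argument above:

// Simplified sketch of what SparkPi computes: a Monte Carlo estimate of pi.
val slices = 10
val n = 100000 * slices
val inside = sc.parallelize(1 to n, slices).map { _ =>
  val x = math.random * 2 - 1               // random point in the square [-1, 1] x [-1, 1]
  val y = math.random * 2 - 1
  if (x * x + y * y <= 1) 1 else 0          // 1 if the point falls inside the unit circle
}.reduce(_ + _)
println(s"Pi is roughly ${4.0 * inside / n}")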


You can also run Spark interactively through a modified version of the Scala shell. This is a great way to learn the framework.

./bin/spark-shell --master local[2]
The --master option specifies the master URL for a distributed cluster, or local to run locally with one thread, or local[N] to run locally with N threads. You should start by using local for testing. For a full list of options, run Spark shell with the --help option.


You can likewise run Spark interactively through a modified version of the Scala shell. This is a great way to learn the framework.

./bin/spark-shell --master local[2]
The --master option specifies the master URL for a distributed cluster; local runs locally with one thread, and local[N] runs locally with N threads. For testing, you should start by using local. For a full list of options, run the Spark shell with the --help option.
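
As a small illustration of what an interactive session looks like: once the shell above starts, the SparkContext is already available as sc, and you can type Scala expressions at the prompt. The README.md path below is an assumption that the shell was launched from the top-level Spark directory:

sc.master                                   // e.g. res0: String = local[2]
val lines = sc.textFile("README.md")        // assumes the shell was started in the Spark directory
lines.filter(_.contains("Spark")).count()   // number of lines that mention "Spark"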


Spark also provides a Python API. To run Spark interactively in a Python interpreter, use bin/pyspark:

./bin/pyspark --master local[2]

Example applications are also provided in Python. For example,

./bin/spark-submit examples/src/main/python/pi.py 10


Spark also provides a Python API. To run Spark interactively in a Python interpreter, use bin/pyspark:

./bin/pyspark --master local[2]

Example applications are also provided in Python. For example,

./bin/spark-submit examples/src/main/python/pi.py 10


Spark also provides an experimental R API since 1.4 (only DataFrames APIs included). To run Spark interactively in an R interpreter, use bin/sparkR:

./bin/sparkR --master local[2]

Example applications are also provided in R. For example,

./bin/spark-submit examples/src/main/r/dataframe.R

Same as above, but for R; skipping the translation.

Launching on a Cluster

The Spark cluster mode overview explains the key concepts in running on a cluster. Spark can run both by itself, or over several existing cluster managers. It currently provides several options for deployment:

Launching on a Cluster

The Spark cluster mode overview explains the key concepts of running on a cluster. Spark can run either on its own or on top of several existing cluster managers. It currently provides several deployment options:
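
For this version, the cluster managers covered by the deployment docs are Spark's own standalone mode, Apache Mesos, and Hadoop YARN. As a rough sketch of what gets launched on a cluster, here is a minimal self-contained Scala application (the name SimpleApp is made up for illustration). Note that it does not hard-code a master URL; spark-submit supplies one via --master when the packaged jar is submitted to whichever cluster manager you use:

import org.apache.spark.{SparkConf, SparkContext}

object SimpleApp {
  def main(args: Array[String]): Unit = {
    // No setMaster() here: spark-submit passes the master URL of the cluster.
    val conf = new SparkConf().setAppName("SimpleApp")
    val sc   = new SparkContext(conf)
    val total = sc.parallelize(1 to 1000).map(_ * 2).reduce(_ + _)
    println(s"total = $total")
    sc.stop()
  }
}

Packaged into a jar (for example with sbt package), such an application would be handed to bin/spark-submit together with the --master URL of the chosen cluster manager.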


Where to Go from Here

Programming Guides:

API Docs:

Deployment Guides:

Other Documents:

External Resources:


Not translating this part either; these are all just navigation links.
