Checking the Spark Version on Linux

1. Go to the bin directory under the Spark installation:

[root@hadoop001 bin]# pwd
/usr/local/software/spark3.2/bin

2. List the scripts available there:

[root@hadoop001 bin]# ls
beeline               find-spark-home.cmd  pyspark2.cmd     spark-class       sparkR2.cmd       spark-shell.cmd  spark-submit
beeline.cmd           load-spark-env.cmd   pyspark.cmd      spark-class2.cmd  sparkR.cmd        spark-sql        spark-submit2.cmd
docker-image-tool.sh  load-spark-env.sh    run-example      spark-class.cmd   spark-shell       spark-sql2.cmd   spark-submit.cmd
find-spark-home       pyspark              run-example.cmd  sparkR            spark-shell2.cmd  spark-sql.cmd
3. Launch the Spark shell; the startup banner reports the version:

[root@hadoop001 bin]# ./spark-shell
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
24/07/31 14:59:27 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
24/07/31 14:59:29 WARN util.Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
Spark context Web UI available at http://hadoop001:4041
Spark context available as 'sc' (master = spark://hadoop001:7077, app id = app-20240731145931-0002).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 3.2.2
      /_/

Using Scala version 2.12.15 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_161)
Type in expressions to have them evaluated.
Type :help for more information.

scala>
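The welcome banner above shows the installed version: Spark 3.2.2, built against Scala 2.12.15, running on Java 1.8.0_161. Type :quit to leave the shell. If you only need the version number, running ./spark-submit --version from the same bin directory prints the same banner without starting an interactive session or allocating a Web UI port. As a minimal sketch, the version number can also be pulled out of that banner text with standard tools (the sample line below is copied from the session above):

```shell
# Quick check without starting a shell (prints the banner to stderr):
#   ./spark-submit --version

# Extracting just the version number from banner text.
# The sample line is taken verbatim from the spark-shell output above.
banner='   /___/ .__/\_,_/_/ /_/\_\   version 3.2.2'
version=$(echo "$banner" | grep -o 'version [0-9.]*' | awk '{print $2}')
echo "Spark version: $version"
```

This prints "Spark version: 3.2.2" for the banner shown in this session.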
