When I run all three of these commands in a unix shell / terminal, they work fine and return exit status 0:
unix_shell> ls -la
unix_shell> hadoop fs -ls /user/hadoop/temp
unix_shell> s3-dist-cp --src ./abc.txt --dest s3://bucket/folder/
Now I am trying to run these same commands as external processes via the Scala process API; sample code is given below:
import scala.sys.process._
val cmd_1 = "ls -la"
val cmd_2 = "hadoop fs -ls /user/hadoop/temp/"
val cmd_3 = "/usr/bin/s3-dist-cp --src /tmp/sample.txt --dest s3://bucket/folder/"
val cmd_4 = "s3-dist-cp --src /tmp/sample.txt --dest s3://bucket/folder/"
val exitCode_1 = (stringToProcess(cmd_1)).! // works fine and produces result
val exitCode_2 = (stringToProcess(cmd_2)).! // works fine and produces result
val exitCode_3 = (stringToProcess(cmd_3)).! // it just hangs, yielding nothing
val exitCode_4 = (stringToProcess(cmd_4)).! // it just hangs, yielding nothing
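
For debugging, a variant that captures the child process's stdout and stderr instead of inheriting them looks like this (a minimal sketch using the standard ProcessLogger from scala.sys.process; the paths are the same placeholders as above):

import scala.sys.process._

// Collect stdout and stderr separately so any error message printed by
// s3-dist-cp is visible instead of being lost.
val out = new StringBuilder
val err = new StringBuilder
val logger = ProcessLogger(
  line => out.append(line).append('\n'), // stdout lines
  line => err.append(line).append('\n')  // stderr lines
)

// The tokenized Process(Seq(...)) form avoids any surprises from the
// whitespace-based splitting that stringToProcess applies.
val exitCode = Process(Seq("s3-dist-cp",
                           "--src", "/tmp/sample.txt",
                           "--dest", "s3://bucket/folder/")).!(logger)

println(s"exitCode=$exitCode")
println(s"stdout:\n$out")
println(s"stderr:\n$err")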
The only difference between cmd_3 and cmd_4 above is that cmd_3 uses the absolute path.
I am explicitly passing the relevant dependency in my spark-submit script, as shown below:
--jars hdfs:///user/hadoop/s3-dist-cp.jar
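
For context, the full spark-submit invocation looks roughly like this (a minimal sketch; the main class and application jar names are placeholders, only the --jars entry is from my actual script):

unix_shell> spark-submit \
    --class com.example.MyJob \
    --jars hdfs:///user/hadoop/s3-dist-cp.jar \
    my-app.jar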
Your input/suggestions would be helpful. Thanks!