import my_script; spark = my_script.Sparker(sc); spark.collapse('./data/')
Everything works fine. But when I try to do the same thing from the command line with spark-submit, I get an error:
Command: /usr/local/spark/bin/spark-submit my_script.py collapse ./data/
File "/usr/local/spark/python/pyspark/rdd.py", line 352, in func
return f(iterator)
File "/usr/local/spark/python/pyspark/rdd.py", line 1576, in combineLocally
merger.mergeValues(iterator)
File "/usr/local/spark/python/pyspark/shuffle.py", line 245, in mergeValues
for k, v in iterator:
File "/.../my_script.py", line 173, in _json_args_to_arr
js = cls._json(line)
RuntimeError: uninitialized staticmethod object
my_script:
...
if __name__ == "__main__":
    args = sys.argv[1:]
    if args[0] == 'collapse':
        directory = args[1]
        from pyspark import SparkContext
        sc = SparkContext(appName="Collapse")
        spark = Sparker(sc)
        spark.collapse(directory)
        sc.stop()
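For reference, the part of my_script that the traceback points at looks roughly like this. This is only a minimal sketch reconstructed from the frame names in the traceback (_json_args_to_arr and cls._json); the key handling, the JSON parsing, and the exact pipeline in collapse are simplified and assumed:

import json

class Sparker(object):
    def __init__(self, sc):
        self.sc = sc

    @staticmethod
    def _json(line):
        # hypothetical helper: parse one text line as JSON
        return json.loads(line)

    @classmethod
    def _json_args_to_arr(cls, iterator):
        # hypothetical generator matching the traceback frame: it calls
        # the staticmethod through cls, and its code gets pickled and
        # shipped to the executors by PySpark
        for line in iterator:
            js = cls._json(line)
            yield js['key'], js  # hypothetical key/value pair

    def collapse(self, directory):
        # hypothetical pipeline shape: a per-partition parse followed by
        # a by-key aggregation, which would produce the combineLocally /
        # mergeValues frames seen in the traceback
        pairs = self.sc.textFile(directory).mapPartitions(self._json_args_to_arr)
        return pairs.groupByKey().collect()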
Why is this happening? What is the difference between running pyspark and running spark-submit that would cause this divergence, and how can I make this work with spark-submit?
Edit: I tried running this from a bash shell with pyspark my_script.py collapse ./data/ and I got the same error. The only time everything works is when I am in a Python shell and import the script.