If I launch pyspark and then run the following commands, everything works fine:

import my_script; spark = my_script.Sparker(sc); spark.collapse('./data/')

However, if I try to do the same thing from the command line with spark-submit, I get an error.

Command: /usr/local/spark/bin/spark-submit my_script.py collapse ./data/

The traceback ends with:
File "/usr/local/spark/python/pyspark/rdd.py", line 352, in func
return f(iterator)
File "/usr/local/spark/python/pyspark/rdd.py", line 1576, in combineLocally
merger.mergeValues(iterator)
File "/usr/local/spark/python/pyspark/shuffle.py", line 245, in mergeValues
for k, v in iterator:
File "/.../my_script.py", line 173, in _json_args_to_arr
js = cls._json(line)
RuntimeError: uninitialized staticmethod object
My script: ...
if __name__ == "__main__":
args = sys.argv[1:]
if args[0] == 'collapse':
directory = args[1]
from pyspark import SparkContext
sc = SparkContext(appName="Collapse")
spark = Sparker(sc)
spark.collapse(directory)
sc.stop()
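The class definition itself is elided above ("..."). For reference, the traceback points at a @staticmethod being called through the class object from inside another method that gets shipped to the workers; a stripped-down sketch of that pattern looks roughly like the following. Only the names Sparker, collapse, _json, and _json_args_to_arr come from the traceback; the method bodies here are placeholders, not my actual code.

import json

class Sparker(object):
    def __init__(self, sc):
        self.sc = sc

    @staticmethod
    def _json(line):
        # placeholder body: parse one input line
        return json.loads(line)

    @classmethod
    def _json_args_to_arr(cls, line):
        # the frame shown in the traceback: the staticmethod is reached
        # through the class object (cls._json)
        js = cls._json(line)
        # placeholder body: turn the parsed line into a (key, values) pair
        return js[0], js[1:]

    def collapse(self, directory):
        # placeholder body: ship the parser to the workers and finish with a
        # combine-style shuffle (mergeValues in pyspark/shuffle.py is the
        # combiner used by reduceByKey/combineByKey)
        pairs = self.sc.textFile(directory).map(self._json_args_to_arr)
        return pairs.reduceByKey(lambda a, b: a + b).collect()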
Why does this happen? What is the difference between running pyspark and running spark-submit that causes this divergence? And how can I get this to work with spark-submit?
Edit: I tried running this from the bash shell with pyspark my_script.py collapse ./data/ and got the same error. Everything only works when I am in a Python shell and import the script.