I just installed PySpark 2.2.0 using conda (Python v3.6 on Windows 7 64-bit, Java v1.8):
$conda install pyspark
It downloaded and seemed to install correctly with no errors. Now when I run pyspark on the command line, it just tells me "The system cannot find the path specified."
$pyspark
The system cannot find the path specified.
The system cannot find the path specified.
I tried adding the pyspark directory to my PATH environment variable, but that didn't seem to work either; maybe I am giving the wrong path? Does the Java path also need to be specified in the PATH environment variable? Can anyone please advise? Thanks
Solution
PySpark from PyPI (i.e. installed with pip or conda) does not contain the full PySpark functionality; it is only intended for use with an existing Spark installation.
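On Windows, "The system cannot find the path specified" from the pyspark launcher usually means that SPARK_HOME or JAVA_HOME points at a directory that does not exist. A minimal sketch of the fix, assuming you have downloaded and unpacked a full Spark distribution yourself (the paths below are placeholders; substitute your actual JDK and Spark directories):

$set JAVA_HOME=C:\path\to\jdk
$set SPARK_HOME=C:\path\to\spark
$set PATH=%PATH%;%JAVA_HOME%\bin;%SPARK_HOME%\bin
$pyspark

Note that set only affects the current cmd session; for a permanent fix, define the variables under System Properties > Environment Variables. Running Spark locally on Windows may additionally require winutils.exe, with HADOOP_HOME pointing at the directory containing bin\winutils.exe.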