I have installed the Spark release spark-2.2.0-bin-hadoop2.7.
I'm using Windows 10.
My Java version is 1.8.0_144.
I have set my environment variables:
SPARK_HOME D:\spark-2.2.0-bin-hadoop2.7
HADOOP_HOME D:\Hadoop (where I put bin\winutils.exe)
PYSPARK_DRIVER_PYTHON ipython
PYSPARK_DRIVER_PYTHON_OPTS notebook
and I added D:\spark-2.2.0-bin-hadoop2.7\bin to Path.
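(For reference, these variables can also be set from a command prompt with setx; a minimal sketch, assuming the paths listed above, and noting that setx only takes effect in newly opened command windows:)

    rem Assumed install locations, taken from the paths listed above
    setx SPARK_HOME "D:\spark-2.2.0-bin-hadoop2.7"
    setx HADOOP_HOME "D:\Hadoop"
    rem Make pyspark start the Jupyter notebook driver through ipython
    setx PYSPARK_DRIVER_PYTHON "ipython"
    setx PYSPARK_DRIVER_PYTHON_OPTS "notebook"
    rem Path is easiest to extend through Control Panel > System > Environment Variables,
    rem since setx truncates values longer than 1024 characters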
When I launch pyspark from the command line, I get this error:
ipython is not recognized as an internal or external command
I also tried setting PYSPARK_DRIVER_PYTHON to jupyter, but it gives the same error (not recognized as an internal or external command).
Any help please?
Solution
Search your machine for the ipython application; in my case it is in "c:\Anaconda3\Scripts". Then add that path to the PATH environment variable.
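A minimal sketch of that fix from a command prompt, assuming the Anaconda Scripts folder is at c:\Anaconda3\Scripts (adjust to wherever ipython.exe actually lives on your machine):

    rem Check whether ipython is already reachable on PATH
    where ipython
    rem If it is not found, append the Anaconda Scripts folder to the user PATH
    rem (setx truncates values over 1024 characters; the Environment Variables dialog is safer for long PATHs)
    setx PATH "%PATH%;c:\Anaconda3\Scripts"
    rem Open a NEW command window, then verify and launch
    where ipython
    pyspark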