
findspark not working after installation #18

Open
behpouriahi opened this issue Feb 24, 2018 · 9 comments

Comments

@behpouriahi

Hi, I used pip3 install findspark. After the installation completed I tried to use import findspark, but it said No module named 'findspark'. I don't know what the problem is here.

@minrk
Owner

minrk commented Feb 26, 2018

Typically that means that pip3 and your Python interpreter are not the same. Try comparing head -n 1 $(which pip3) and print(sys.executable) in your Python session.
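For example, something like this (a rough sketch, assuming a Linux/macOS shell and a python3 interpreter):

    head -n 1 $(which pip3)                          # interpreter pip3 installs into
    python3 -c "import sys; print(sys.executable)"   # interpreter you actually run
    # if the two paths differ, install with the interpreter itself:
    python3 -m pip install findspark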

@justinnaldzin

Make sure your SPARK_HOME environment variable is correctly assigned.
Does this work for you? ls $SPARK_HOME
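For example (a rough sketch; /opt/spark is only a placeholder path, and it assumes findspark itself imports fine):

    ls $SPARK_HOME                  # should list bin/, jars/, python/, ...
    export SPARK_HOME=/opt/spark    # set it if it's empty (placeholder path)
    # findspark.init() falls back to SPARK_HOME when no path is given
    python3 -c "import findspark; findspark.init(); import pyspark; print(pyspark.__version__)"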

@JunqiLoveCoding

I'm facing the same issue now. I installed findspark on my laptop but cannot import it in a Jupyter notebook.

@nursace

nursace commented Nov 17, 2019

> I'm facing the same issue now. I installed findspark on my laptop but cannot import it in a Jupyter notebook.

Were you able to solve your issue? I have the same problem.

@mazino2d

mazino2d commented Feb 4, 2020

I have the same issue too :(

@nmay231

nmay231 commented May 26, 2020

I would suggest using something to keep pip and python/jupyter pointing to the same installation. Pyenv (while it's not its main goal) does this pretty well. Just install jupyter and findspark after installing pyenv and setting a version with pyenv (global | local) VERSION.

You should be able to use python -m pip install ... to install or otherwise interact with pip. Doing this with IPython should work as well.

If you are using jupyter, run jupyter --paths. I get this.

config:
    /home/nmay/.jupyter
    /home/nmay/.pyenv/versions/3.8.0/etc/jupyter
    /usr/local/etc/jupyter
    /etc/jupyter
data:
    /home/nmay/.local/share/jupyter
    /home/nmay/.pyenv/versions/3.8.0/share/jupyter   <-- This is the important path
    /usr/local/share/jupyter
    /usr/share/jupyter
runtime:
    /home/nmay/.local/share/jupyter/runtime

In my case, it's /home/nmay/.pyenv/versions/3.8.0/share/jupyter (since I use pyenv). The python and pip binaries that run with jupyter will be located at /home/nmay/.pyenv/versions/3.8.0/bin/python and <path>/bin/pip. You could alias these (e.g. jupyter-pip) and install findspark with them.
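Concretely, something along these lines (3.8.0 is just the version from my setup, already installed via pyenv install):

    pyenv global 3.8.0                       # or: pyenv local 3.8.0
    python -m pip install jupyter findspark
    python -m pip show findspark             # shows which site-packages it landed in
    jupyter notebook                         # this jupyter now shares the same interpreter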

Hope that helps 👍

@D4N005H

D4N005H commented May 6, 2021

In case you're using Jupyter, open Anaconda Prompt (Anaconda3) from the Start menu. Then use this command to force findspark to be installed into the environment Jupyter runs in.
conda install -c conda-forge findspark
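If the notebook still can't import it, it's worth checking that the package really landed in the environment the kernel runs from, along these lines (a sketch, run from the same Anaconda Prompt):

    conda install -c conda-forge findspark
    python -c "import findspark; print(findspark.__file__)"   # sanity check in the same env
    python -c "import sys; print(sys.executable)"             # should match the notebook kernel's interpreter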

@rakib06

rakib06 commented Aug 31, 2021

I installed findspark in the conda base env, and then I could solve it. Here is my shell history (doskey /history):

    conda deactivate
    conda activate
    python
    conda list
    pip3 install pyspark
    pip install pyspark
    conda install pyspark
    pip install findspark
    pip3 install findspark
    conda install findspark
    conda deactivate
    conda activate spark_env
    jupyter notebook
    doskey /history

@Shiva10k

Shiva10k commented Jan 7, 2023

> Hi, I used pip3 install findspark. After the installation completed I tried to use import findspark, but it said No module named 'findspark'. I don't know what the problem is here.

Please restart your Jupyter notebook kernel and it will solve your problem.
