
biomart_annotations still creates cache when use_cache=False #2861

Closed
3 tasks done

PauBadiaM opened this issue Feb 16, 2024 · 2 comments

Comments

@PauBadiaM
Contributor

Please make sure these conditions are met

  • I have checked that this issue has not already been reported.
  • I have confirmed this bug exists on the latest version of scanpy.
  • (optional) I have confirmed this bug exists on the master branch of scanpy.

What happened?

When using sc.queries.biomart_annotations, the file .pybiomart.sqlite is still generated even when use_cache=False is passed.

This is a problem because the generated hidden file interferes with Omnipath and makes it crash. use_cache used to work, but it no longer does.

Thank you for your time.

Minimal code sample

import scanpy as sc

annot = sc.queries.biomart_annotations(
    'hsapiens',
    ['ensembl_gene_id', 'external_gene_name'],
    use_cache=False
)
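As a stopgap until the underlying cache bug is fixed, one option is to delete the stray file after the query. This is a hypothetical sketch, not part of the scanpy API: the helper name is mine, and it assumes the cache lands in the working directory (the default for the `.pybiomart.sqlite` file mentioned above).

```python
from pathlib import Path


def remove_pybiomart_cache(directory: str = ".") -> bool:
    """Delete a stray .pybiomart.sqlite cache file if present.

    Hypothetical workaround helper: returns True if a file
    was removed, False if there was nothing to clean up.
    """
    cache_file = Path(directory) / ".pybiomart.sqlite"
    if cache_file.exists():
        cache_file.unlink()
        return True
    return False
```

Calling this right after sc.queries.biomart_annotations(...) would keep the hidden file from lingering and interfering with Omnipath, assuming the cache is written to the current working directory.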

Error output

No response

Versions

-----
anndata     0.10.3
scanpy      1.9.8
-----
PIL                 10.2.0
asttokens           NA
attr                23.1.0
attrs               23.1.0
brotli              1.1.0
cattr               NA
cattrs              NA
certifi             2023.11.17
cffi                1.16.0
charset_normalizer  3.3.2
colorama            0.4.6
comm                0.1.4
cycler              0.12.1
cython_runtime      NA
dateutil            2.8.2
debugpy             1.8.0
decorator           5.1.1
executing           2.0.1
future              0.18.3
h5py                3.10.0
idna                3.4
igraph              0.11.2
ipykernel           6.26.0
jedi                0.19.1
joblib              1.3.2
kiwisolver          1.4.5
leidenalg           0.10.1
llvmlite            0.41.1
matplotlib          3.8.3
mpl_toolkits        NA
natsort             8.4.0
numba               0.58.1
numpy               1.26.4
packaging           23.2
pandas              2.2.0
parso               0.8.3
pexpect             4.8.0
pickleshare         0.7.5
platformdirs        3.11.0
prompt_toolkit      3.0.39
psutil              5.9.5
ptyprocess          0.7.0
pure_eval           0.2.2
pyarrow             15.0.0
pybiomart           0.2.0
pydev_ipython       NA
pydevconsole        NA
pydevd              2.9.5
pydevd_file_utils   NA
pydevd_plugins      NA
pydevd_tracing      NA
pygments            2.16.1
pyparsing           3.1.1
pytz                2024.1
requests            2.31.0
requests_cache      1.1.1
scipy               1.11.3
session_info        1.0.0
six                 1.16.0
sklearn             1.3.2
socks               1.7.1
stack_data          0.6.2
texttable           1.7.0
threadpoolctl       3.2.0
tornado             6.3.3
traitlets           5.13.0
typing_extensions   NA
url_normalize       1.4.3
urllib3             2.0.7
wcwidth             0.2.9
yaml                6.0.1
zmq                 25.1.1
zoneinfo            NA
-----
IPython             8.17.2
jupyter_client      8.6.0
jupyter_core        5.5.0
-----
Python 3.11.6 | packaged by conda-forge | (main, Oct  3 2023, 10:40:35) [GCC 12.3.0]
Linux-6.5.0-17-generic-x86_64-with-glibc2.35
-----
Session information updated at 2024-02-16 11:40
@ivirshup
Member

Do you know when this used to work?

And would you be up for submitting a bugfix?

@PauBadiaM
Contributor Author

Hi @ivirshup, it worked as recently as six months ago. As discussed in saezlab/omnipath#54 (comment), the problem turned out to be a bug in the requests_cache package. Installing the latest version from GitHub solves the issue, so I'll close this. Thanks!
