
Module 'accuracy' doesn't exist on the Hugging Face Hub either. #456

Open
AceCHQ opened this issue May 6, 2023 · 17 comments

Comments

@AceCHQ

AceCHQ commented May 6, 2023

I don't know how to solve this. Can you help me? Thank you. The error is: Module 'accuracy' doesn't exist on the Hugging Face Hub either.

@Imaginny

Imaginny commented May 9, 2023

I have the same issue. I installed the evaluate module (version 0.4.0) in my virtual environment using pip install evaluate.
When I execute this:

import evaluate
metric = evaluate.load("accuracy")

I get the following error message:

---------------------------------------------------------------------------
FileNotFoundError                         Traceback (most recent call last)
Cell In[14], line 3
      1 import evaluate
----> 3 metric = evaluate.load("accuracy")
      4 model.eval()
      5 for batch in test_dataloader:

File ~/virtual_python/document_classification-venv/lib/python3.10/site-packages/evaluate/loading.py:731, in load(path, config_name, module_type, process_id, num_process, cache_dir, experiment_id, keep_in_memory, download_config, download_mode, revision, **init_kwargs)
    703 """Load a `evaluate.EvaluationModule`.
    704 
    705 Args:
   (...)
    728     `evaluate.EvaluationModule`
    729 """
    730 download_mode = DownloadMode(download_mode or DownloadMode.REUSE_DATASET_IF_EXISTS)
--> 731 evaluation_module = evaluation_module_factory(
    732     path, module_type=module_type, revision=revision, download_config=download_config, download_mode=download_mode
    733 )
    734 evaluation_cls = import_main_class(evaluation_module.module_path)
    735 evaluation_instance = evaluation_cls(
    736     config_name=config_name,
    737     process_id=process_id,
   (...)
    743     **init_kwargs,
    744 )

File ~/virtual_python/document_classification-venv/lib/python3.10/site-packages/evaluate/loading.py:681, in evaluation_module_factory(path, module_type, revision, download_config, download_mode, force_local_path, dynamic_modules_path, **download_kwargs)
    679         if not isinstance(e1, (ConnectionError, FileNotFoundError)):
    680             raise e1 from None
--> 681         raise FileNotFoundError(
    682             f"Couldn't find a module script at {relative_to_absolute_path(combined_path)}. "
    683             f"Module '{path}' doesn't exist on the Hugging Face Hub either."
    684         ) from None
    685 else:
    686     raise FileNotFoundError(f"Couldn't find a module script at {relative_to_absolute_path(combined_path)}.")

FileNotFoundError: Couldn't find a module script at /home/foo/bar/accuracy/accuracy.py. Module 'accuracy' doesn't exist on the Hugging Face Hub either.

@mariosasko
Contributor

Hi! The Hub was experiencing an outage around this time, so it should work now :).

@amina-mardiyyah

It doesn't work for me. It was working fine yesterday, but since this morning it's been throwing this error. None of the metrics from evaluate seem to be working, actually.

[screenshot of the error]

@nithotep

I installed the module with Anaconda: conda install -c huggingface -c conda-forge evaluate

I get the exact same error from the same code as @AceCHQ.

@nithotep

In my case it was a firewall problem. The Hugging Face Hub was not down; my firewall prevented me from connecting to it.
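
A quick way to check whether the problem is the network rather than the library is to test Hub connectivity directly (a minimal sketch; requests is assumed to be available, as it is pulled in by evaluate's dependencies):

import requests

# If this request fails, the problem is the network (firewall/proxy),
# not the evaluate library itself.
try:
    resp = requests.get("https://huggingface.co", timeout=10)
    print("Hub reachable, status:", resp.status_code)
except requests.exceptions.RequestException as exc:
    print("Cannot reach the Hub:", exc)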

@lyq1152227095

Maybe you can install the datasets package first. If that doesn't solve the problem, you can copy the metric's code into your own directory. I think something is going wrong when downloading the metrics.

@RoversCode

Hello, I have the same issue. If I use a proxy in the terminal, it temporarily works. However, if I run the code in VS Code it fails again, unless I also configure the proxy in VS Code. It seems to be a network issue.
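
If a proxy is needed, one way to make it apply inside Python itself (not just the shell) is to set the standard proxy environment variables before importing evaluate (a minimal sketch; the proxy address below is a placeholder):

import os

# Hypothetical proxy address; replace with your own.
os.environ["HTTP_PROXY"] = "http://127.0.0.1:7890"
os.environ["HTTPS_PROXY"] = "http://127.0.0.1:7890"

import evaluate
metric = evaluate.load("accuracy")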

@yzy1421

yzy1421 commented Jul 12, 2023

pip install accuracy
It works for me.

@dongcun1231

Download the metrics directory to a local path:

git clone https://github.com/huggingface/evaluate.git

Then load the metric from the local script:

import evaluate
metric = evaluate.load("evaluate/metrics/accuracy/accuracy.py")
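
Once loaded from the local script, the metric behaves just like the Hub version, for example (toy inputs; three of the four predictions match):

predictions = [0, 1, 1, 0]
references = [0, 1, 0, 0]
print(metric.compute(predictions=predictions, references=references))
# {'accuracy': 0.75}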

@amina-mardiyyah

For me, I solved it in two ways.

  1. Downgrading my transformers version to 4.28 (reason: it just seems to be the most stable version for every experiment I currently need to run, and the most compatible with my other dependencies).
  2. It seems that, for some reason, my .cache directory keeps getting redirected to a different folder whenever I restart my virtual machine. Setting a persistent environment variable in my conda environment helps, since the library then picks up the paths I have already configured; a sketch follows below. It works fine now.
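
One way to pin the cache to a fixed location is to set HF_HOME before the Hugging Face libraries are imported (a minimal sketch; the path below is a placeholder):

import os

# Hypothetical fixed cache location; set this before importing evaluate
# so it does not fall back to the default ~/.cache/huggingface.
os.environ["HF_HOME"] = "/home/user/hf_cache"

import evaluate
metric = evaluate.load("accuracy")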

@robinsonmhj

I suggest putting the necessary modules into the wheel, so that everything is available when installing evaluate. It doesn't make sense to download the module from the Hugging Face Hub, as not all environments allow internet access, for security reasons.

@LronDC

LronDC commented Dec 8, 2023

I suggest putting the necessary modules into the wheel, so that everything is available when installing evaluate. It doesn't make sense to download the module from the Hugging Face Hub, as not all environments allow internet access, for security reasons.

+1, /cc @lvwerra Can we make this happen? Offline environments are really a major use case.
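
Until the modules ship in the wheel, a workable offline pattern is to vendor the metric scripts and load them by path (a minimal sketch; the clone location is a placeholder, and the HF_EVALUATE_OFFLINE flag is assumed to mirror HF_DATASETS_OFFLINE, so verify it against your version):

import os

# Assumes the repo was cloned beforehand on a machine with network access, e.g.:
#   git clone https://github.com/huggingface/evaluate.git /opt/evaluate
# HF_EVALUATE_OFFLINE=1 should stop the library from trying the Hub at all
# (assumption: this flag mirrors HF_DATASETS_OFFLINE).
os.environ["HF_EVALUATE_OFFLINE"] = "1"

import evaluate
metric = evaluate.load("/opt/evaluate/metrics/accuracy/accuracy.py")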

@failable

Same issue

@MH-lxj

MH-lxj commented Mar 22, 2024

Same issue.

@MichaelMedek

Same issue

@ShenJinglong

Download the metrics directory to a local path:
git clone https://github.com/huggingface/evaluate.git
metric = evaluate.load("evaluate/metrics/accuracy/accuracy.py")

It works for me, thanks!

@Edenzzzz

It's mostly a network issue. Loading failed inside Docker for me, but I successfully downloaded the metric outside of it.
