
BrokenPipeError: [Errno 32] Broken pipe #3

Open
cfortunylombra opened this issue Jun 8, 2021 · 1 comment

Comments

@cfortunylombra

When I run the code from the PowerShell command line with the default arguments (except for cv_dir) and only the R32_C10 model, it stops at this line:

train(epoch)

It enters the for-loop, but during one of the iterations it gets stuck at this line inside the definition of train:

for batch_idx, (inputs, targets) in tqdm.tqdm(enumerate(trainloader), total=len(trainloader)):

The error message is the following:

(stanford) PS E:\EUMETSAT\PatchDrop>  python classifier_training.py
Files already downloaded and verified
Files already downloaded and verified
Files already downloaded and verified
Files already downloaded and verified
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "E:\Program Files\Anaconda3\envs\stanford\lib\multiprocessing\spawn.py", line 105, in spawn_main
    exitcode = _main(fd)
  File "E:\Program Files\Anaconda3\envs\stanford\lib\multiprocessing\spawn.py", line 114, in _main
    prepare(preparation_data)
  File "E:\Program Files\Anaconda3\envs\stanford\lib\multiprocessing\spawn.py", line 225, in prepare
    _fixup_main_from_path(data['init_main_from_path'])
  File "E:\Program Files\Anaconda3\envs\stanford\lib\multiprocessing\spawn.py", line 277, in _fixup_main_from_path
    run_name="__mp_main__")
  File "E:\Program Files\Anaconda3\envs\stanford\lib\runpy.py", line 263, in run_path
    pkg_name=pkg_name, script_name=fname)
  File "E:\Program Files\Anaconda3\envs\stanford\lib\runpy.py", line 96, in _run_module_code
    mod_name, mod_spec, pkg_name, script_name)
  File "E:\Program Files\Anaconda3\envs\stanford\lib\runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "E:\EUMETSAT\PatchDrop\classifier_training.py", line 143, in <module>
    train(epoch)
  File "E:\EUMETSAT\PatchDrop\classifier_training.py", line 51, in train
    for batch_idx, (inputs, targets) in tqdm.tqdm(enumerate(trainloader), total=len(trainloader)):
  File "E:\Program Files\Anaconda3\envs\stanford\lib\site-packages\torch\utils\data\dataloader.py", line 355, in __iter__
Traceback (most recent call last):
      File "classifier_training.py", line 143, in <module>
return self._get_iterator()
train(epoch)  File "E:\Program Files\Anaconda3\envs\stanford\lib\site-packages\torch\utils\data\dataloader.py", line 301, in _get_iterator

      File "classifier_training.py", line 51, in train
return _MultiProcessingDataLoaderIter(self)
for batch_idx, (inputs, targets) in tqdm.tqdm(enumerate(trainloader), total=len(trainloader)):  File "E:\Program Files\Anaconda3\envs\stanford\lib\site-packages\torch\utils\data\dataloader.py", line 914, in __init__

  File "E:\Program Files\Anaconda3\envs\stanford\lib\site-packages\torch\utils\data\dataloader.py", line 355, in __iter__
    w.start()
return self._get_iterator()  File "E:\Program Files\Anaconda3\envs\stanford\lib\multiprocessing\process.py", line 112, in start

      File "E:\Program Files\Anaconda3\envs\stanford\lib\site-packages\torch\utils\data\dataloader.py", line 301, in _get_iterator
self._popen = self._Popen(self)
return _MultiProcessingDataLoaderIter(self)  File "E:\Program Files\Anaconda3\envs\stanford\lib\multiprocessing\context.py", line 223, in _Popen

      File "E:\Program Files\Anaconda3\envs\stanford\lib\site-packages\torch\utils\data\dataloader.py", line 914, in __init__
return _default_context.get_context().Process._Popen(process_obj)
  File "E:\Program Files\Anaconda3\envs\stanford\lib\multiprocessing\context.py", line 322, in _Popen
        w.start()return Popen(process_obj)

  File "E:\Program Files\Anaconda3\envs\stanford\lib\multiprocessing\process.py", line 112, in start
  File "E:\Program Files\Anaconda3\envs\stanford\lib\multiprocessing\popen_spawn_win32.py", line 46, in __init__
        self._popen = self._Popen(self)prep_data = spawn.get_preparation_data(process_obj._name)

  File "E:\Program Files\Anaconda3\envs\stanford\lib\multiprocessing\context.py", line 223, in _Popen
  File "E:\Program Files\Anaconda3\envs\stanford\lib\multiprocessing\spawn.py", line 143, in get_preparation_data
        return _default_context.get_context().Process._Popen(process_obj)_check_not_importing_main()

  File "E:\Program Files\Anaconda3\envs\stanford\lib\multiprocessing\context.py", line 322, in _Popen
  File "E:\Program Files\Anaconda3\envs\stanford\lib\multiprocessing\spawn.py", line 136, in _check_not_importing_main
        return Popen(process_obj)is not going to be frozen to produce an executable.''')

  File "E:\Program Files\Anaconda3\envs\stanford\lib\multiprocessing\popen_spawn_win32.py", line 89, in __init__
RuntimeError    : reduction.dump(process_obj, to_child)
        An attempt has been made to start a new process before the
        current process has finished its bootstrapping phase.

        This probably means that you are not using fork to start your
        child processes and you have forgotten to use the proper idiom
        in the main module:

            if __name__ == '__main__':
                freeze_support()
                ...

        The "freeze_support()" line can be omitted if the program
        is not going to be frozen to produce an executable.

  File "E:\Program Files\Anaconda3\envs\stanford\lib\multiprocessing\reduction.py", line 60, in dump
    ForkingPickler(file, protocol).dump(obj)
BrokenPipeError: [Errno 32] Broken pipe

Does anyone know how to fix this issue?

@DiggyBee

DiggyBee commented Aug 5, 2021

I think this is an issue with the num_workers argument of the DataLoader. On Windows you need to set num_workers=0.
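
In case it helps, here is a minimal sketch of the two usual workarounds for this spawn-related error on Windows, assuming the DataLoader is built in classifier_training.py (the dataset, batch size, and variable names below are illustrative, not the repo's actual code):

import torch
import torchvision
import torchvision.transforms as transforms

def build_loader():
    # Illustrative CIFAR-10 loader, not the repository's actual setup.
    trainset = torchvision.datasets.CIFAR10(
        root='./data', train=True, download=True,
        transform=transforms.ToTensor())
    # Workaround 1: num_workers=0 keeps data loading in the main process,
    # so Windows never has to spawn (and re-import) worker processes.
    return torch.utils.data.DataLoader(
        trainset, batch_size=128, shuffle=True, num_workers=0)

def train(epoch, trainloader):
    for batch_idx, (inputs, targets) in enumerate(trainloader):
        pass  # forward/backward pass goes here

# Workaround 2: keep num_workers > 0 if you prefer, but guard the entry
# point so spawned workers can re-import this module without re-running
# the training loop.
if __name__ == '__main__':
    trainloader = build_loader()
    for epoch in range(10):
        train(epoch, trainloader)

With the __main__ guard in place, the worker processes that the DataLoader spawns can safely re-import classifier_training.py, which is what the RuntimeError in the traceback is asking for.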
