Solution

The ImportError comes from a version mismatch: the installed transformers tries to import `is_mlu_available` from `accelerate.utils`, but the installed accelerate release is too old to export that name. Upgrading accelerate fixes it:
pip install --upgrade accelerate
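Before relaunching training, you can sanity-check the installed accelerate version. The minimum version below is an assumption based on the traceback (`is_mlu_available` appeared in accelerate around the 0.29 line); pin it to whatever your transformers release actually requires:

```python
def parse_version(v: str) -> tuple:
    # Crude numeric parse, good enough for "0.27.2"-style version strings.
    return tuple(int(p) for p in v.split(".")[:3] if p.isdigit())

# Assumed minimum: accelerate.utils.is_mlu_available is missing before ~0.29.
MIN_ACCELERATE = (0, 29, 0)

def needs_upgrade(installed: str) -> bool:
    # True if the installed accelerate predates the symbol the traceback needs.
    return parse_version(installed) < MIN_ACCELERATE

# Example: read the real version with importlib.metadata.version("accelerate")
# and feed it in here.
print(needs_upgrade("0.27.2"))   # an older release that triggers the error
print(needs_upgrade("0.30.1"))   # a newer release that exports the symbol
```

If `needs_upgrade` reports `True` after running `pip install --upgrade accelerate`, check that pip upgraded the package inside the same conda environment (`medicalgpt`) that launches the training script.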
The exception raised when running MedicalGPT:
W0724 23:44:20.975000 140034745251648 torch/distributed/run.py:757]
W0724 23:44:20.975000 140034745251648 torch/distributed/run.py:757] *****************************************
W0724 23:44:20.975000 140034745251648 torch/distributed/run.py:757] Setting OMP_NUM_THREADS environment variable for each process to be 1 in default, to avoid your system being overloaded, please further tune the variable for optimal performance in your application as needed.
W0724 23:44:20.975000 140034745251648 torch/distributed/run.py:757] *****************************************
Traceback (most recent call last):
File "/root/miniconda3/envs/medicalgpt/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1586, in _get_module
return importlib.import_module("." + module_name, self.__name__)
File "/root/miniconda3/envs/medicalgpt/lib/python3.10/importlib/__init__.py", line 126, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 883, in exec_module
File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
File "/root/miniconda3/envs/medicalgpt/lib/python3.10/site-packages/transformers/trainer.py", line 221, in <module>
from accelerate.utils import (
ImportError: cannot import name 'is_mlu_available' from 'accelerate.utils' (/root/miniconda3/envs/medicalgpt/lib/python3.10/site-packages/accelerate/utils/__init__.py)
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/opt/ai-server/MedicalGPT/supervised_finetuning.py", line 33, in <module>
from transformers import (
File "<frozen importlib._bootstrap>", line 1075, in _handle_fromlist
File "/root/miniconda3/envs/medicalgpt/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1576, in __getattr__
module = self._get_module(self._class_to_module[name])
File "/root/miniconda3/envs/medicalgpt/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1588, in _get_module
raise RuntimeError(
RuntimeError: Failed to import transformers.trainer because of the following error (look up to see its traceback):
cannot import name 'is_mlu_available' from 'accelerate.utils' (/root/miniconda3/envs/medicalgpt/lib/python3.10/site-packages/accelerate/utils/__init__.py)