Error message:
(llamaf) PS E:\Project\LLaMA-Factory> llamafactory-cli.exe webui
* Running on local URL: http://127.0.0.1:3000
Traceback (most recent call last):
  File "E:\Users\Namta\miniconda3\envs\llamaf\lib\runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "E:\Users\Namta\miniconda3\envs\llamaf\lib\runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "E:\Users\Namta\miniconda3\envs\llamaf\Scripts\llamafactory-cli.exe\__main__.py", line 7, in <module>
    sys.exit(main())
  File "E:\Project\LLaMA-Factory\src\llamafactory\cli.py", line 122, in main
    run_web_ui()
  File "E:\Project\LLaMA-Factory\src\llamafactory\webui\interface.py", line 94, in run_web_ui
    create_ui().queue().launch(share=gradio_share, server_name="127.0.0.1", server_port=3000, inbrowser=True)
  File "E:\Users\Namta\miniconda3\envs\llamaf\lib\site-packages\gradio\blocks.py", line 2709, in launch
    raise ValueError(
ValueError: When localhost is not accessible, a shareable link must be created. Please set share=True or check your proxy settings to allow access to localhost.
Solution: start the service in the Windows Services manager (services.msc). This problem is intermittent, and I could not find a fix posted online; I worked this one out myself.
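
If restarting the service does not help, the error text itself suggests two alternative workarounds. The sketch below assumes the usual trigger for this ValueError: a system proxy intercepting Gradio's localhost reachability check. NO_PROXY is the standard environment variable honored by the HTTP clients Gradio uses for that check, and GRADIO_SHARE is the variable LLaMA-Factory's run_web_ui() reads to decide the share flag passed to launch(); try them in the same PowerShell session before launching:

# Workaround 1: exempt localhost from any system/session proxy so Gradio's
# reachability check can reach 127.0.0.1 directly.
$env:NO_PROXY = "localhost,127.0.0.1"

# Workaround 2: have LLaMA-Factory pass share=True to launch(), which skips
# the localhost check and creates a temporary public Gradio link instead.
$env:GRADIO_SHARE = "1"

llamafactory-cli.exe webui

Either variable only affects the current session. NO_PROXY keeps everything local, while GRADIO_SHARE=1 trades the localhost check for a public share URL, so prefer the first unless you actually want remote access.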