This trial was done in a Colab environment.
First, install the required package:
!pip install transformers
Next, try the 350M model:
from transformers import pipeline
generator = pipeline('text-generation', model="facebook/opt-350m", do_sample=True, num_return_sequences=5)
generator("please introduce yourself")
The returned results are as follows:
[{'generated_text': 'please introduce yourself - what is your favorite book?'},
{'generated_text': "please introduce yourself to our members.\n\nWhat's your name, and where from, and how"},
{'generated_text': 'please introduce yourself\n\nhello,\ni am actually about to do a major in psychology, i'},
{'generated_text': 'please introduce yourself!\nHey there! What kind of music are you trying to make?\nI'},
{'generated_text': "please introduce yourself\nIt's me, Mike! I made my own reddit"}]
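The pipeline returns a list of dicts, each with a 'generated_text' key that still contains the prompt. A minimal post-processing sketch, using two of the sample results above as data (no model call needed):

```python
# Output in the same shape the pipeline returns:
# a list of dicts with a 'generated_text' key.
outputs = [
    {'generated_text': 'please introduce yourself - what is your favorite book?'},
    {'generated_text': "please introduce yourself\nIt's me, Mike! I made my own reddit"},
]

prompt = "please introduce yourself"

# Strip the prompt prefix to keep only the newly generated continuation.
continuations = [o['generated_text'][len(prompt):].strip() for o in outputs]

for text in continuations:
    print(repr(text))
```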
Probably because the model is fairly small, its answers are noticeably weaker than ChatGPT's long-form responses.
Finally, try the 1.3B model:
from transformers import pipeline
generator = pipeline('text-generation', model="facebook/opt-1.3b", do_sample=True, num_return_sequences=5)
generator("please introduce yourself")
The returned results are as follows:
[{'generated_text': "please introduce yourself to me in the back, i can't believe i forgot to do the little welcome"},
{'generated_text': "please introduce yourself\nI'm the guy! Glad to see someone recognizes my username. :)\nwhat"},
{'generated_text': 'please introduce yourself and give us some more info about yourself!\nThe girl in this picture is a'},
{'generated_text': "please introduce yourself as a mod in this thread.\nI'm not a mod. However I"},
{'generated_text': 'please introduce yourself.\nWhat is up!? I am a huge fan of the show, I started'}]
Judging from the results, the 1.3B model is noticeably better than the 350M one. However, due to RAM limits, larger models could not be loaded in Colab.
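A rough back-of-envelope estimate shows why larger checkpoints don't fit: loading weights in fp32 takes about 4 bytes per parameter, and free-tier Colab offers roughly 12 GB of RAM (the parameter counts and the Colab figure below are approximate assumptions, not measured values):

```python
def model_ram_gb(n_params, bytes_per_param=4):
    """Rough RAM needed just to hold the weights (fp32 = 4 bytes/param)."""
    return n_params * bytes_per_param / 1024**3

# Approximate parameter counts for a few OPT checkpoints.
for name, n in [("opt-350m", 350e6), ("opt-1.3b", 1.3e9),
                ("opt-2.7b", 2.7e9), ("opt-6.7b", 6.7e9)]:
    print(f"{name}: ~{model_ram_gb(n):.1f} GB (fp32), "
          f"~{model_ram_gb(n, 2):.1f} GB (fp16)")
```

By this estimate, opt-6.7b already needs ~25 GB in fp32. Loading in half precision (e.g. passing `torch_dtype=torch.float16` when building the pipeline) roughly halves the footprint and might let a somewhat larger checkpoint fit, though activations and overhead take additional memory beyond the weights.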