Environment:
Python: 3.10.12
Development environment: Jupyter
What can Phoenix do?
Arize Phoenix is a platform for monitoring, debugging, and optimizing the performance and behavior of machine learning models. It helps data scientists and engineers understand how their models behave in production, providing real-time monitoring and visual analysis so that adjustments and optimizations can be made promptly.
Pull the code
Repository: https://github.com/Arize-ai/phoenix
Docs: https://docs.arize.com/phoenix
git clone https://github.com/Arize-ai/phoenix.git
Docker deployment
➜ phoenix git:(main) ✗ docker-compose up -d
[+] Running 4/4
✔ Network phoenix_default Created 0.1s
✔ Volume "phoenix_database_data" Created 0.0s
✔ Container phoenix-db-1 Started 1.9s
✔ Container phoenix-phoenix-1 Started 2.2s
➜ phoenix git:(main) ✗ docker-compose ps
NAME IMAGE COMMAND SERVICE CREATED STATUS PORTS
phoenix-db-1 postgres "docker-entrypoint.s…" db 9 minutes ago Up 9 minutes 0.0.0.0:32769->5432/tcp, :::32769->5432/tcp
phoenix-phoenix-1 phoenix-phoenix "/usr/bin/python3.11…" phoenix 9 minutes ago Up 9 minutes 0.0.0.0:4317->4317/tcp, :::4317->4317/tcp, 0.0.0.0:6006->6006/tcp, :::6006->6006/tcp, 9090/tcp
Install dependencies (Jupyter)
%pip install arize-phoenix arize-phoenix-evals openai openinference-instrumentation-openai opentelemetry-sdk opentelemetry-exporter-otlp
LLM configuration
import os
# Set environment variables
# os.environ['http_proxy'] = "http://192.168.11.242:8889"
# os.environ['https_proxy'] = "http://192.168.11.242:8889"
# OpenAI API
# os.environ["OPENAI_API_KEY"] = "sk-****"
# os.environ["OPENAI_BASE_URL"] = "https://api.openai.com/v1"
# Lingyiwanwu (Yi) API
os.environ["OPENAI_API_KEY"] = "sk-****"
os.environ["OPENAI_BASE_URL"] = "https://api.lingyiwanwu.com/v1"
os.environ["PHOENIX_HOST"] = "0.0.0.0"
os.environ["PHOENIX_COLLECTOR_ENDPOINT"] = "http://0.0.0.0:6006"
os.environ["PHOENIX_PORT"] = "6006"
os.environ["PHOENIX_GRPC_PORT"] = "4317"
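These variables determine where the exporters send spans. As a quick sanity check, the sketch below (stdlib only, mirroring the values set above) assembles the two collector addresses: note that the HTTP OTLP traces endpoint includes the /v1/traces path, while the gRPC endpoint is just host and port.

```python
import os

# Mirror the values configured in the cell above
os.environ["PHOENIX_HOST"] = "0.0.0.0"
os.environ["PHOENIX_PORT"] = "6006"
os.environ["PHOENIX_GRPC_PORT"] = "4317"

host = os.environ["PHOENIX_HOST"]
# HTTP OTLP traces endpoint (used by the HTTP span exporter below)
http_endpoint = f"http://{host}:{os.environ['PHOENIX_PORT']}/v1/traces"
# gRPC OTLP endpoint (used if you pick the gRPC span exporter instead)
grpc_endpoint = f"http://{host}:{os.environ['PHOENIX_GRPC_PORT']}"
print(http_endpoint)  # → http://0.0.0.0:6006/v1/traces
print(grpc_endpoint)  # → http://0.0.0.0:4317
```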
Initialization
from opentelemetry import trace as trace_api
from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace.export import SimpleSpanProcessor, ConsoleSpanExporter
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import (
OTLPSpanExporter as GRPCSpanExporter,
)
from opentelemetry.exporter.otlp.proto.http.trace_exporter import (
OTLPSpanExporter as HTTPSpanExporter,
)
from openinference.instrumentation.openai import OpenAIInstrumentor
# Export spans to the Phoenix collector over HTTP
span_phoenix_processor = SimpleSpanProcessor(HTTPSpanExporter(endpoint="http://0.0.0.0:6006/v1/traces"))
# Register the processors with the tracer provider
tracer_provider = trace_sdk.TracerProvider()
tracer_provider.add_span_processor(span_processor=span_phoenix_processor)
tracer_provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))
trace_api.set_tracer_provider(tracer_provider=tracer_provider)
OpenAIInstrumentor().instrument()
Chat with the LLM
import openai
openai_client = openai.OpenAI()
response = openai_client.chat.completions.create(
    # model="gpt-3.5-turbo",
    model="yi-large",
    messages=[{"role": "user", "content": "Write a poem about snow mountains"}],
    max_tokens=20,  # keep the demo response short
)
print(response.choices[0].message.content)
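Hosted endpoints can return transient errors (rate limits, timeouts), which would otherwise abort the notebook mid-demo. Below is a minimal retry helper with exponential backoff; `with_retries` is a hypothetical name, not part of the openai SDK, and the exception handling should be narrowed to your provider's error types.

```python
import time

def with_retries(fn, attempts=3, base_delay=1.0):
    """Call fn(); on failure, back off exponentially and retry."""
    for i in range(attempts):
        try:
            return fn()
        except Exception:
            if i == attempts - 1:
                raise  # out of attempts, surface the error
            time.sleep(base_delay * 2 ** i)

# With the client above you would wrap the call like:
# response = with_retries(lambda: openai_client.chat.completions.create(...))

# Self-contained demo: a function that fails twice, then succeeds
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient")
    return "ok"

print(with_retries(flaky, base_delay=0.01))  # → ok
```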
Summary
The trace feature is similar to LangSmith: it lets you visually trace and debug requests, and it also supports evaluation and dataset collection. See the official docs for more features.
Please credit the source when reposting. Thanks!