The difference between base_url and site_url

This article explains the difference between base_url and site_url in the CodeIgniter framework and their roles in configuring site paths. base_url returns the site's base URL, while site_url generates a full URL that routes through the entry (index) file.

When working with the CodeIgniter (CI) framework, redirects and path handling come up constantly, and base_url() and site_url() are easy to confuse. Here is where the difference lies.

site_url() builds a URL that goes through the entry file, while base_url() returns a URL relative to the site root. Both are driven by two settings in the config file:

// In the CI config file (application/config/config.php):
/* site root URL */
$config['base_url'] = "http://test.com/";
/* entry (index) file */
$config['index_page'] = "index.php";

Printing the output of each shows the difference:

site_url("ask/2") returns http://test.com/index.php/ask/2
base_url("ask/2") returns http://test.com/ask/2
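The behaviour above can be sketched as a small Python model. This is a simplified illustration, not CI's actual implementation (the real site_url() also handles options such as url_suffix); the config keys mirror the PHP settings shown earlier:

```python
# Simplified model of CI's URL helpers, driven by the same two settings.
config = {
    "base_url": "http://test.com/",
    "index_page": "index.php",
}

def base_url(uri=""):
    # base_url() appends the segment directly to the site root.
    return config["base_url"] + uri

def site_url(uri=""):
    # site_url() routes the segment through the entry file first.
    index = config["index_page"]
    prefix = index + "/" if index else ""
    return config["base_url"] + prefix + uri

print(site_url("ask/2"))  # http://test.com/index.php/ask/2
print(base_url("ask/2"))  # http://test.com/ask/2
```

Note that if $config['index_page'] is set to an empty string (common when URL rewriting hides index.php), the two calls produce the same URL.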
