Scipy.Optimize

Table of Contents

1. Scipy.Optimize.Minimize

Example 1: Minimizing a nonlinear function subject to constraints

Example 2: Finding model parameters in machine learning


 

1.Scipy.Optimize.Minimize

Source:   https://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.minimize.html#scipy.optimize.minimize

 

scipy.optimize.minimize(fun, x0, args=(), method=None, jac=None, hess=None, hessp=None, bounds=None, constraints=(), tol=None, callback=None, options=None)

fun callable

The objective function to be minimized.

fun(x, *args) -> float

where x is an 1-D array with shape (n,) and args is a tuple of the fixed parameters needed to completely specify the function.

# x is a 1-D array; the fixed parameters required by the objective are supplied as a tuple via the args argument below

x0 ndarray, shape (n,)

Initial guess. Array of real elements of size (n,), where ‘n’ is the number of independent variables.

# Initial guess for the independent variables

args tuple, optional

Extra arguments passed to the objective function and its derivatives (the fun, jac and hess functions).

# Extra arguments passed to the objective function

method:  str or callable, optional

Type of solver, e.g. 'BFGS', 'SLSQP' or 'Newton-CG'. If not given, a default is chosen based on whether the problem has constraints or bounds.

# The optimization algorithm to use
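A minimal sketch tying these parameters together: minimizing f(x) = (x - a)², where the constant a is passed through args as a fixed parameter (the function f and the value of a are made up here for illustration).

```python
import numpy as np
from scipy.optimize import minimize

def f(x, a):
    # x arrives as a 1-D array of shape (n,); a comes from args
    return (x[0] - a) ** 2

res = minimize(f, x0=np.array([0.0]), args=(3.0,), method='BFGS')
print(res.x)  # close to [3.]
```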

Example 1: Minimizing a nonlinear function subject to constraints

Source: http://apmonitor.com/che263/index.php/Main/PythonOptimization

min x1·x4·(x1 + x2 + x3) + x3

s.t.

x1·x2·x3·x4 ≥ 25

x1² + x2² + x3² + x4² = 40

1 ≤ x1, x2, x3, x4 ≤ 5

Initial guess: x0 = (1, 5, 5, 1)

Scipy.Optimize.Minimize is demonstrated for solving a nonlinear objective function subject to general inequality and equality constraints.

from scipy.optimize import minimize

def objective(x):
    x1, x2, x3, x4 = x
    return x1 * x4 * (x1 + x2 + x3) + x3

def constraint1(x):
    # inequality constraint: x1*x2*x3*x4 >= 25
    return x[0] * x[1] * x[2] * x[3] - 25.0

def constraint2(x):
    # equality constraint: x1^2 + x2^2 + x3^2 + x4^2 = 40
    sum_sq = 40
    for i in range(4):
        sum_sq = sum_sq - x[i] ** 2
    return sum_sq

x0 = [1, 5, 5, 1]
print(objective(x0))
# 16
b = (1.0, 5.0)
bnds = (b, b, b, b)
cons1 = {'type': 'ineq', 'fun': constraint1}
cons2 = {'type': 'eq', 'fun': constraint2}
cons = [cons1, cons2]
sol = minimize(objective, x0, method='SLSQP', bounds=bnds, constraints=cons)
print(sol)
    fun: 17.01401724563517
     jac: array([14.57227015,  1.37940764,  2.37940764,  9.56415057])
 message: 'Optimization terminated successfully.'
    nfev: 30
     nit: 5
    njev: 5
  status: 0
 success: True
       x: array([1.        , 4.7429961 , 3.82115462, 1.37940765])
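As a sanity check (a sketch, using the solution vector reported above), both constraints can be evaluated at sol.x; the inequality residual should be non-negative and the equality residual near zero, up to the solver's tolerance.

```python
import numpy as np

# solution reported by SLSQP above
x = np.array([1.0, 4.7429961, 3.82115462, 1.37940765])

ineq = x[0] * x[1] * x[2] * x[3] - 25.0  # should be >= 0 (constraint is active, so ~0)
eq = 40.0 - np.sum(x ** 2)               # should be ~0

print(ineq, eq)
```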

Example 2: Finding model parameters in machine learning

In Andrew Ng's cs229 machine learning course:

In assignment 1 (linear_regression), the parameters are found by gradient descent, manually setting a learning rate and a number of iterations.

In assignment 2 (logistic_regression), we instead use scipy.optimize.minimize to find the parameters.

For full details, please refer to the assignment; only the key code is shown here.

Source : https://study.163.com/course/courseMain.htm?courseId=1004570029&_trace_c_p_k2_=00b4467624554d9fae0c292a58af0a45

theta = np.zeros(3)

def cost(theta, X, y):
    # cross-entropy loss for logistic regression
    h = sigmoid(X @ theta)
    return np.mean(-y * np.log(h) - (1 - y) * np.log(1 - h))

def gradient(theta, X, y):
    # gradient of the cost with respect to theta
    return (1 / len(X)) * X.T @ (sigmoid(X @ theta) - y)

import scipy.optimize as opt
res = opt.minimize(fun=cost, x0=theta, args=(X, y), method='Newton-CG', jac=gradient)
print(res)
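The snippet above assumes X, y and sigmoid are already defined by the assignment's data-loading code. A self-contained runnable sketch on synthetic data (the dataset and true_theta here are stand-ins, not the assignment's data):

```python
import numpy as np
import scipy.optimize as opt

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost(theta, X, y):
    # cross-entropy loss for logistic regression
    h = sigmoid(X @ theta)
    return np.mean(-y * np.log(h) - (1 - y) * np.log(1 - h))

def gradient(theta, X, y):
    # gradient of the cost with respect to theta
    return (1 / len(X)) * X.T @ (sigmoid(X @ theta) - y)

# synthetic dataset: intercept column plus two random features
rng = np.random.default_rng(0)
X = np.hstack([np.ones((100, 1)), rng.normal(size=(100, 2))])
true_theta = np.array([0.5, 2.0, -1.0])
y = (sigmoid(X @ true_theta) > rng.random(100)).astype(float)

theta = np.zeros(3)
res = opt.minimize(fun=cost, x0=theta, args=(X, y), method='Newton-CG', jac=gradient)
print(res.x)  # estimates near true_theta, up to sampling noise
```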


 
