Minimum Distance
Write a Python function def min_dist(u1, u2, u3). The function takes in 3 lists of floats, each of length k, representing k-dimensional vectors. It returns a list representing a k-dimensional vector v such that
|v - u_1|^2 + |v - u_2|^2 + |v - u_3|^2 is as small as possible.
(Recall that the norm of a vector |w| is sqrt(w_1^2 + w_2^2 + ... + w_k^2).)
You MUST use gradient descent in order to complete this task. Make any reasonable algorithmic choices.
This came up in an interview: given three vectors u1, u2, u3, use gradient descent to find the vector v that makes |v - u_1|^2 + |v - u_2|^2 + |v - u_3|^2 as small as possible. In fact, v is simply the mean of the three vectors, but the point of the exercise is to solve for it with gradient descent.
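Since the problem statement asks for a function with the signature min_dist(u1, u2, u3) operating on plain lists, here is a minimal NumPy sketch that follows the requested interface. The gradient of the loss with respect to v is 2*((v-u1) + (v-u2) + (v-u3)), which is used directly; the lr and steps defaults are my own choices, not part of the original question.

```python
import numpy as np

def min_dist(u1, u2, u3, lr=0.1, steps=100):
    """Gradient descent for the v minimizing |v-u1|^2 + |v-u2|^2 + |v-u3|^2.

    u1, u2, u3: lists of floats of equal length k. Returns a list of floats.
    """
    u1 = np.asarray(u1, dtype=float)
    u2 = np.asarray(u2, dtype=float)
    u3 = np.asarray(u3, dtype=float)
    v = np.zeros_like(u1)  # arbitrary starting point
    for _ in range(steps):
        # gradient of the summed squared distances w.r.t. v
        grad = 2.0 * ((v - u1) + (v - u2) + (v - u3))
        v -= lr * grad
    return v.tolist()
```

With lr=0.1 each step contracts the error toward the mean by a factor of 0.4, so 100 steps is far more than enough for float precision; min_dist([1.0, 2.0], [3.0, 4.0], [5.0, 6.0]) returns (approximately) [3.0, 4.0], the coordinate-wise mean.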
import torch

torch.manual_seed(2333)  # note: np.random.seed does not seed torch's RNG
k = 10
u1 = torch.randn([k, 1])
u2 = torch.randn([k, 1])
u3 = torch.randn([k, 1])
v = torch.zeros([k, 1], requires_grad=True)

# loss function: elementwise squared distances to the three vectors
def f(u1, u2, u3, v):
    return (v - u1) ** 2 + (v - u2) ** 2 + (v - u3) ** 2

# optimizer: a plain SGD step
def sgd(params, lr):
    with torch.no_grad():
        for param in params:
            param -= lr * param.grad
            param.grad.zero_()

lr = 0.1
for epoch in range(10):
    y = f(u1, u2, u3, v)
    y.sum().backward()
    sgd([v], lr)

print(v.data)
print((u1 + u2 + u3) / 3)

Sample output from one run (v, then the mean; they agree to about four decimal places):
tensor([[-0.6027],
[ 0.1890],
[ 0.9928],
[-0.1126],
[-0.1857],
[-0.1601],
[ 0.3190],
[-0.3286],
[-0.0229],
[ 0.4333]])
tensor([[-0.6027],
[ 0.1890],
[ 0.9929],
[-0.1126],
[-0.1857],
[-0.1601],
[ 0.3190],
[-0.3287],
[-0.0229],
[ 0.4333]])
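As a sanity check on why gradient descent lands on the mean (my own verification, not part of the original post): the gradient of the loss is 2*((v-u1)+(v-u2)+(v-u3)) = 6*(v - mean), so the mean is the unique stationary point, and since the loss is convex it is the global minimum. Perturbing away from the mean can therefore only increase the loss:

```python
import torch

torch.manual_seed(0)
k = 10
us = [torch.randn(k) for _ in range(3)]
mean = sum(us) / 3

def loss(v):
    # total squared distance from v to the three vectors
    return sum(((v - u) ** 2).sum() for u in us)

# random perturbations of the mean never decrease the loss
for _ in range(100):
    w = mean + 0.1 * torch.randn(k)
    assert loss(w) >= loss(mean)
```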