These are my study notes for Andrew Ng's machine learning course. The lecture videos are available at:
https://study.163.com/course/introduction.htm?courseId=1004570029#/courseDetail?tab=1
Given users' ratings of movies, we recommend movies a user is likely to enjoy:
- Build the collaborative filtering cost function (the formulas are written out after this list)
- Build the collaborative filtering gradient function
- Optimize the parameters and the features simultaneously
- Predict a new user's ratings for the movies
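For reference, the regularized cost that the code below implements, in the course's notation ($x^{(i)}$: feature vector of movie $i$; $\theta^{(j)}$: parameter vector of user $j$; $r(i,j)=1$ when user $j$ rated movie $i$):

$$
J = \frac{1}{2}\sum_{(i,j):\,r(i,j)=1}\big((\theta^{(j)})^{T}x^{(i)} - y^{(i,j)}\big)^{2}
  + \frac{\lambda}{2}\sum_{j=1}^{n_u}\sum_{k=1}^{n}\big(\theta_{k}^{(j)}\big)^{2}
  + \frac{\lambda}{2}\sum_{i=1}^{n_m}\sum_{k=1}^{n}\big(x_{k}^{(i)}\big)^{2}
$$

with gradients

$$
\frac{\partial J}{\partial x_{k}^{(i)}} = \sum_{j:\,r(i,j)=1}\big((\theta^{(j)})^{T}x^{(i)} - y^{(i,j)}\big)\theta_{k}^{(j)} + \lambda x_{k}^{(i)},
\qquad
\frac{\partial J}{\partial \theta_{k}^{(j)}} = \sum_{i:\,r(i,j)=1}\big((\theta^{(j)})^{T}x^{(i)} - y^{(i,j)}\big)x_{k}^{(i)} + \lambda \theta_{k}^{(j)}.
$$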
The raw data are the num_movies x num_users rating matrix Y and the indicator matrix R, where R(i, j) = 1 if the j-th user rated the i-th movie.
When writing the function, vectorized (matrix) operations are both more efficient and closer to the concise form of the formulas.
function [J, grad] = cofiCostFunc(params, Y, R, num_users, num_movies, ...
num_features, lambda)
%COFICOSTFUNC Collaborative filtering cost function
% [J, grad] = COFICOSTFUNC(params, Y, R, num_users, num_movies, ...
% num_features, lambda) returns the cost and gradient for the
% collaborative filtering problem.
%
% Unfold the X and Theta matrices from params
X = reshape(params(1:num_movies*num_features), num_movies, num_features);
Theta = reshape(params(num_movies*num_features+1:end), ...
num_users, num_features);
% You need to return the following values correctly
J = 0;
X_grad = zeros(size(X));
Theta_grad = zeros(size(Theta));
% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost function and gradient for collaborative
% filtering. Concretely, you should first implement the cost
% function (without regularization) and make sure it
% matches our costs. After that, you should implement the
% gradient and use the checkCostFunction routine to check
% that the gradient is correct. Finally, you should implement
% regularization.
%
% Notes: X - num_movies x num_features matrix of movie features
% Theta - num_users x num_features matrix of user features
% Y - num_movies x num_users matrix of user ratings of movies
% R - num_movies x num_users matrix, where R(i, j) = 1 if the
% i-th movie was rated by the j-th user
%
% You should set the following variables correctly:
%
% X_grad - num_movies x num_features matrix, containing the
% partial derivatives w.r.t. to each element of X
% Theta_grad - num_users x num_features matrix, containing the
% partial derivatives w.r.t. to each element of Theta
%
% Cost: element-wise multiplying by R keeps only the (i, j) entries with
% R(i, j) = 1, i.e. ratings that actually exist. Unregularized version:
% J = (1/2) * sum(sum(((X * Theta' - Y) .* R) .^ 2));
% Regularized cost. The ... continuations are required: without them the
% two regularization terms become separate statements and are discarded.
J = (1/2) * sum(sum(((X * Theta' - Y) .* R) .^ 2)) ...
    + (lambda / 2) * sum(sum(Theta .^ 2)) ...
    + (lambda / 2) * sum(sum(X .^ 2));
% Regularized gradients, fully vectorized (drop the lambda terms for the
% unregularized gradient check):
X_grad = ((X * Theta' - Y) .* R) * Theta + lambda * X;
Theta_grad = ((X * Theta' - Y) .* R)' * X + lambda * Theta;
% =============================================================
grad = [X_grad(:); Theta_grad(:)];
end
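Before training, it is worth verifying the analytic gradient numerically, as the comments above suggest. The exercise provides a checkCostFunction routine for this; a minimal hand-rolled central-difference check might look like the following (illustrative only, not the course's implementation):

% Compare the analytic gradient against central differences.
epsilon = 1e-4;
[J, grad] = cofiCostFunc(params, Y, R, num_users, num_movies, num_features, lambda);
numgrad = zeros(size(params));
for p = 1:numel(params)
    e = zeros(size(params));
    e(p) = epsilon;
    Jp = cofiCostFunc(params + e, Y, R, num_users, num_movies, num_features, lambda);
    Jm = cofiCostFunc(params - e, Y, R, num_users, num_movies, num_features, lambda);
    numgrad(p) = (Jp - Jm) / (2 * epsilon);
end
% The relative difference should be tiny (around 1e-9) if grad is correct.
disp(norm(numgrad - grad) / norm(numgrad + grad));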
Finally, predict the new user's ratings and recommend the top few movies:
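A minimal sketch of the training-and-prediction step, assuming the exercise's fmincg optimizer and the mean-normalized Ynorm/Ymean produced by the course's ex8_cofi.m script (variable names here are illustrative):

% Randomly initialize features and parameters, folded into one vector
% as cofiCostFunc expects.
X = randn(num_movies, num_features);
Theta = randn(num_users, num_features);
initial_parameters = [X(:); Theta(:)];

% Minimize the cost with fmincg; Ynorm/Ymean come from mean-normalizing
% Y so that a user with no ratings defaults to each movie's average.
options = optimset('GradObj', 'on', 'MaxIter', 100);
lambda = 10;
theta = fmincg(@(t) cofiCostFunc(t, Ynorm, R, num_users, num_movies, ...
                                 num_features, lambda), ...
               initial_parameters, options);

% Unfold the learned parameters and predict every rating at once.
X = reshape(theta(1:num_movies*num_features), num_movies, num_features);
Theta = reshape(theta(num_movies*num_features+1:end), num_users, num_features);
p = X * Theta';
my_predictions = p(:, 1) + Ymean;   % column 1 = the new user

% Sort so the highest predicted ratings come first.
[~, ix] = sort(my_predictions, 'descend');

The mean normalization is what makes predictions for a brand-new user sensible: with near-zero learned parameters, the prediction falls back to each movie's average rating rather than zero.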