MATLAB Deep Learning: Neural Network

Three typical schemes

[Figure: the three typical weight-update schemes (SGD, batch, and mini-batch)]

SGD: Stochastic Gradient Descent

  1. Functions DeltaSGD and Sigmoid. DeltaSGD applies the delta rule once per training point; Sigmoid is the activation function.
function W = DeltaSGD(W, X, D)
  alpha = 0.9;              % learning rate

  N = 4;                    % number of training points
  for k = 1:N
    x = X(k, :)';           % k-th input, as a column vector
    d = D(k);               % k-th correct output

    v = W*x;                % weighted sum
    y = Sigmoid(v);         % output of the node

    e     = d - y;          % error
    delta = y*(1-y)*e;      % error scaled by the sigmoid derivative

    dW = alpha*delta*x;     % delta rule

    W(1) = W(1) + dW(1);    % weights are updated after every training point
    W(2) = W(2) + dW(2);
    W(3) = W(3) + dW(3);
  end
end

function y = Sigmoid(x)
  y = 1 / (1 + exp(-x));
end
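For reference, the update implemented above is the delta rule for a single sigmoid node. With learning rate $\alpha$, correct output $d$, and output $y = \varphi(v)$ where $\varphi$ is the sigmoid,

$$\delta = \varphi'(v)\,e = y(1-y)(d-y), \qquad w_i \leftarrow w_i + \alpha\,\delta\,x_i$$

using the identity $\varphi'(v) = y(1-y)$, which is exactly the `delta = y*(1-y)*e` line in the code.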

BGD: Batch Gradient Descent

function W = DeltaBatch(W, X, D)
  alpha = 0.9;              % learning rate

  dWsum = zeros(3, 1);      % accumulator for the weight updates

  N = 4;                    % number of training points
  for k = 1:N
    x = X(k, :)';
    d = D(k);

    v = W*x;
    y = Sigmoid(v);

    e     = d - y;
    delta = y*(1-y)*e;

    dW = alpha*delta*x;     % delta rule

    dWsum = dWsum + dW;     % accumulate; do not update the weights yet
  end
  dWavg = dWsum / N;        % average update over the whole training set

  W(1) = W(1) + dWavg(1);   % weights are updated once per epoch
  W(2) = W(2) + dWavg(2);
  W(3) = W(3) + dWavg(3);
end
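The difference from DeltaSGD is when the weights change: DeltaSGD updates them immediately after each training point (N updates per epoch), while DeltaBatch accumulates the N individual updates and applies their average once per epoch:

$$\Delta w_i = \frac{1}{N}\sum_{k=1}^{N}\alpha\,\delta_k\,x_{k,i}$$

Each batch update is smoother but slower to react, which is why batch learning converges more slowly in the comparison below.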

Training data

  • In supervised learning, each training dataset should consist of input and correct output pairs.
    { input, correct output }
clear all
           
X = [ 0 0 1;                 % each row: two inputs plus a bias input fixed to 1
      0 1 1;
      1 0 1;
      1 1 1;
    ];

D = [ 0                      % correct output for each row of X
      0
      1
      1
    ];


E1 = zeros(1000, 1);                  % per-epoch training error, SGD
E2 = zeros(1000, 1);                  % per-epoch training error, batch

W1 = 2*rand(1, 3) - 1;                % random initial weights in [-1, 1]
W2 = W1;                              % both methods start from the same weights

for epoch = 1:1000           % train
  W1 = DeltaSGD(W1, X, D);
  W2 = DeltaBatch(W2, X, D);

  es1 = 0;
  es2 = 0;
  N   = 4;
  for k = 1:N
    x = X(k, :)';
    d = D(k);
    
    v1  = W1*x;
    y1  = Sigmoid(v1);
    es1 = es1 + (d - y1)^2;
    
    v2  = W2*x;
    y2  = Sigmoid(v2);
    es2 = es2 + (d - y2)^2;
  end
  E1(epoch) = es1 / N;
  E2(epoch) = es2 / N;
end

plot(E1, 'r')
hold on
plot(E2, 'b:')
xlabel('Epoch')
ylabel('Average of Training error')
legend('SGD', 'Batch')

[Figure: average training error per epoch; SGD (solid red) falls faster than batch (dotted blue)]
This program trains the neural network for 1,000 epochs with each method, DeltaSGD and DeltaBatch. At every epoch it feeds the training data into the network and computes the mean squared error of the output (E1 for SGD, E2 for batch). Once the 1,000 epochs are complete, it plots the mean error at each epoch. As the figure shows, SGD reduces the training error faster than the batch method; SGD learns faster.
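For completeness, here is a minimal sketch of the third scheme from the figure at the top, mini-batch learning. The function name DeltaMinibatch and the batch size of 2 are illustrative choices, not from the original post; it assumes the same Sigmoid function and three-element weight vector as above. The idea is to average the delta-rule update over small groups of training points, sitting between SGD's per-point updates and batch's whole-set average:

function W = DeltaMinibatch(W, X, D)
  % Mini-batch delta rule: average the update over small groups of
  % training points, then apply it once per group.
  alpha     = 0.9;                    % learning rate
  batchSize = 2;                      % illustrative mini-batch size
  N = size(X, 1);
  for first = 1:batchSize:N
    idx   = first:min(first + batchSize - 1, N);
    dWsum = zeros(3, 1);              % accumulator, one entry per weight
    for k = idx
      x = X(k, :)';
      d = D(k);
      y = Sigmoid(W*x);
      delta = y*(1-y)*(d - y);
      dWsum = dWsum + alpha*delta*x;
    end
    dWavg = dWsum / numel(idx);       % average over the mini-batch
    W = W + dWavg';                   % one update per mini-batch
  end
end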

Test data

clear all
           
X = [ 0 0 1;
      0 1 1;
      1 0 1;
      1 1 1;
    ];

D = [ 0
      0
      1
      1
    ];
      
W = 2*rand(1, 3) - 1;         % random initial weights in [-1, 1]

for epoch = 1:10000           % train
  W = DeltaSGD(W, X, D);
end

N = 4;                        % inference
for k = 1:N
  x = X(k, :)';
  v = W*x;
  y = Sigmoid(v)              % no semicolon, so each output is displayed
end
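If training succeeds, the four displayed values of y should be close to the correct outputs 0, 0, 1, and 1. In other words, the network has learned that the correct output of this data equals the first element of the input.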

Training of Multi-Layer Neural Network

Back-Propagation Algorithm

Cross Entropy Function
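For reference, the cross entropy cost for a single sigmoid output $y$ with correct output $d$ is

$$E = -\left[\,d\,\ln y + (1-d)\,\ln(1-y)\,\right]$$

When the sigmoid is paired with this cost, the learning rule simplifies to $\delta = d - y$: the $y(1-y)$ factor from the sigmoid derivative cancels, which speeds up learning compared with the squared-error cost used above.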
