Machine Learning Lab Assignment 3

OBJECTIVE OF THIS ASSIGNMENT
To understand how the Backpropagation algorithm learns the weight values of a multilayer network, and how the hidden layer (the number of hidden nodes) affects the performance of the backpropagation algorithm.

ASSIGNMENT
Implement a two-layer neural network with the BACKPROPAGATION learning algorithm. Your implementation must be flexible enough to handle any reasonable size of two-layer neural network; an implementation that assumes a fixed number of inputs, a fixed number of outputs, or a fixed number of hidden nodes is not sufficient. To be more specific, your program must accept any number of input attributes (≦ 50), any number of hidden nodes (≦ 50), and any number of outputs (≦ 50).
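
A minimal sketch of such a network, assuming sigmoid units in both layers, squared-error loss, and per-example (stochastic) gradient-descent updates; the assignment does not prescribe these choices, and the class and method names (TwoLayerNet, fit, predict) are illustrative only.

```python
import numpy as np

class TwoLayerNet:
    """One hidden layer of sigmoid units, trained with backpropagation."""

    def __init__(self, n_in, n_hidden, n_out, lr=0.1, seed=0):
        rng = np.random.default_rng(seed)
        # Small random initial weights; the extra column holds the bias weight.
        self.W1 = rng.uniform(-0.5, 0.5, (n_hidden, n_in + 1))
        self.W2 = rng.uniform(-0.5, 0.5, (n_out, n_hidden + 1))
        self.lr = lr

    @staticmethod
    def _sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def _forward(self, x):
        x_b = np.append(x, 1.0)                              # input plus bias term
        h_b = np.append(self._sigmoid(self.W1 @ x_b), 1.0)   # hidden layer plus bias
        o = self._sigmoid(self.W2 @ h_b)                     # output layer
        return x_b, h_b, o

    def predict(self, x):
        return self._forward(np.asarray(x, dtype=float))[2]

    def fit(self, X, T, epochs=1000):
        X = np.asarray(X, dtype=float)
        T = np.asarray(T, dtype=float)
        for _ in range(epochs):
            for x, t in zip(X, T):
                x_b, h_b, o = self._forward(x)
                # Backpropagation of the error: delta terms for sigmoid units.
                delta_o = (t - o) * o * (1.0 - o)
                delta_h = h_b[:-1] * (1.0 - h_b[:-1]) * (self.W2[:, :-1].T @ delta_o)
                # Gradient-descent weight updates after each training example.
                self.W2 += self.lr * np.outer(delta_o, h_b)
                self.W1 += self.lr * np.outer(delta_h, x_b)
        return self
```

Because the layer sizes are constructor arguments, the same class covers every required configuration up to 50 inputs, 50 hidden nodes, and 50 outputs; the fixed epoch count here stands in for whatever stopping criterion is reported in the experimental setting.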

TESTING PROBLEMS

  1. BINARY (TWO-CLASS) CLASSIFICATION PROBLEM
    PARITY PROBLEM: USE ONE OUTPUT NEURON
    The input consists of a fixed number of binary attributes; the desired target is 1 if the input pattern contains an odd number of 1's and 0 otherwise. The training set consists of all valid input patterns together with their desired targets.
    Apply your program to parity problems with two inputs, four inputs, and eight inputs; a sketch of generating these training sets appears after this list.
    For each input size, report which neural network architecture (i.e., which number of hidden nodes) obtained the best performance. Analyze and explain your findings.
  2. MULTICLASS-CLASSIFICATION PROBLEM
    K-CLASS CLASSIFICATION PROBLEM: USE K OUTPUT NEURONS
    Find an open K-class classification dataset (for example, from the UCI Machine Learning Repository). The number of classes in the dataset must be at least three (K ≧ 3). Use p% of the examples as the training set and the remaining (100 - p)% as the test set; a sketch of the split and the K-output target encoding appears after this list.
    (a) Display the training accuracy and testing accuracy.
    (b) Report which neural network architecture (i.e., which number of hidden nodes) obtained the best performance. Analyze and explain your findings.
    (c) (10% Bonus) Plot a graph showing the number of training epochs vs. the training accuracy.
    (d) (10% Bonus) Rerun the experiment with a different training/test split (i.e., change p). Analyze and explain your findings.
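
For the parity problems in item 1, the complete training set can be enumerated directly, since it consists of all 2^n binary input patterns. A sketch under the same assumptions as above (the parity_dataset helper and the chosen hidden-layer sizes are illustrative):

```python
import itertools
import numpy as np

def parity_dataset(n_inputs):
    """All 2**n_inputs binary patterns; target is 1 iff a pattern has an odd number of 1's."""
    X = np.array(list(itertools.product([0, 1], repeat=n_inputs)), dtype=float)
    T = (X.sum(axis=1) % 2).reshape(-1, 1)   # single output neuron
    return X, T

# Compare architectures on, e.g., the four-input parity problem.
X, T = parity_dataset(4)
for n_hidden in (2, 4, 8, 16):
    net = TwoLayerNet(n_in=4, n_hidden=n_hidden, n_out=1).fit(X, T, epochs=2000)
    outputs = np.array([net.predict(x) for x in X])
    accuracy = np.mean((outputs > 0.5) == T)
    print(f"hidden nodes: {n_hidden:2d}  training accuracy: {accuracy:.2f}")
```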
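
For the K-class problem in item 2, the K output neurons can be matched to one-hot target vectors, and the p% split can be drawn with a random permutation. A sketch under the same assumptions; the helper names and the variables X, labels, and K stand for whatever dataset is actually chosen and are hypothetical:

```python
import numpy as np

def one_hot(labels, k):
    """Encode integer class labels 0..k-1 as k-dimensional target vectors."""
    T = np.zeros((len(labels), k))
    T[np.arange(len(labels)), labels] = 1.0
    return T

def split(X, T, p=0.7, seed=0):
    """Use a fraction p of the examples for training and the rest for testing."""
    idx = np.random.default_rng(seed).permutation(len(X))
    cut = int(p * len(X))
    return X[idx[:cut]], T[idx[:cut]], X[idx[cut:]], T[idx[cut:]]

def accuracy(net, X, T):
    """Fraction of examples whose largest output matches the one-hot target."""
    outputs = np.array([net.predict(x) for x in X])
    return np.mean(outputs.argmax(axis=1) == T.argmax(axis=1))

# X (input attributes) and labels (class indices 0..K-1) are assumed to be
# loaded from the chosen dataset as NumPy arrays.
T = one_hot(labels, K)
X_tr, T_tr, X_te, T_te = split(X, T, p=0.7)
net = TwoLayerNet(n_in=X.shape[1], n_hidden=10, n_out=K).fit(X_tr, T_tr, epochs=500)
print("training accuracy:", accuracy(net, X_tr, T_tr))
print("testing accuracy: ", accuracy(net, X_te, T_te))
```

For the bonus plot in (c), the same accuracy helper can be recorded after each call to fit with epochs=1 and the stored values plotted against the epoch number; rerunning with a different p covers (d).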

WHAT TO HAND IN
Please hand in the following items:
(a) Your source code file, executable file, data files, report file and so on.
(b) Your report file should contain:

  1. Data description
  2. Experimental setting
    – Architecture of each neural network (number of input, hidden and output neurons)
    – Parameters (learning rate, stopping criteria)
  3. Results and discussion
    – Discuss the results: for example, compare the performance of the different network architectures, explain how the experimental results agree with the theory, and add anything else you consider important.