Original: Dive into Deep Learning
Dive into Deep Learning. 1 Introduction. Most neural networks are built from a few principles: using linear or nonlinear units alternately, which are called layers, and using gradient descent to update network parameters, which is based on the chain rule (back propagation, BP) in
2021-12-19 22:15:28 204
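The two principles named in that introduction, stacking layers and updating parameters by gradient descent through the chain rule, fit in a few lines of PyTorch. The snippet below is a minimal sketch of my own (toy data, a single linear layer, a hand-written update), not code from the post.

```python
import torch

# toy data and one linear "layer" with parameters w, b (illustrative only)
x = torch.randn(8, 3)
y = torch.randn(8, 1)
w = torch.randn(3, 1, requires_grad=True)
b = torch.zeros(1, requires_grad=True)

lr = 0.1
for _ in range(100):
    loss = ((x @ w + b - y) ** 2).mean()  # forward pass through the layer
    loss.backward()                       # back propagation: the chain rule fills w.grad, b.grad
    with torch.no_grad():
        w -= lr * w.grad                  # gradient-descent update
        b -= lr * b.grad
        w.grad.zero_()
        b.grad.zero_()
```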
Original: [NLP] Why does Network Need Memory
Why? Here is an enlightening example. Slot filling, which means finding the pieces of information that we need, is a significant task in a variety of NLP applications. Here are two sentences as examples: arrive Taipei on November 2nd / leave Taipei on November 2nd. We ar
2021-12-19 22:12:07 254 1
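The "arrive Taipei" vs "leave Taipei" pair is exactly where a memoryless model fails: both sentences contain the same word "Taipei", so only a network that remembers the previous word can label it as a destination in one case and a departure point in the other. Below is a minimal sketch assuming a plain RNN, with a toy vocabulary and slot set of my own; it is not the post's code.

```python
import torch
import torch.nn as nn

vocab = {"arrive": 0, "leave": 1, "Taipei": 2, "on": 3, "November": 4, "2nd": 5}
slots = ["other", "destination", "departure", "time"]  # illustrative slot labels

embed = nn.Embedding(len(vocab), 16)
rnn = nn.RNN(input_size=16, hidden_size=32, batch_first=True)
classify = nn.Linear(32, len(slots))

sentence = ["arrive", "Taipei", "on", "November", "2nd"]
tokens = torch.tensor([[vocab[w] for w in sentence]])

hidden_states, _ = rnn(embed(tokens))  # the hidden state carries the earlier words forward
slot_logits = classify(hidden_states)  # one slot prediction per word
print(slot_logits.shape)               # torch.Size([1, 5, 4]): "Taipei" is labelled in context
```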
Original: A Little Tips on Constructing Neuronal Network Based on Pytorch(1)
In Python coding, we can divide the whole program into several parts, each with a different function. Using Manual Experiment of Fully Connected Neuronal Network Based on MNIST as an example, in this task we have several subtasks: su
2021-12-12 21:36:25 1268 2
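As a sketch of the kind of split described, the MNIST experiment might be broken into data loading, model construction, training, and evaluation. The function names and the random stand-in data below are my own choices, not the post's.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

def load_data(batch_size=64):
    # stand-in for the MNIST loaders: random tensors shaped like 28x28 images
    x = torch.randn(512, 1, 28, 28)
    y = torch.randint(0, 10, (512,))
    return DataLoader(TensorDataset(x, y), batch_size=batch_size)

def build_model():
    return nn.Sequential(nn.Flatten(), nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))

def train_one_epoch(model, loader, optimizer, loss_fn):
    for x, y in loader:
        optimizer.zero_grad()
        loss_fn(model(x), y).backward()
        optimizer.step()

def evaluate(model, loader):
    correct = sum((model(x).argmax(dim=1) == y).sum().item() for x, y in loader)
    return correct / len(loader.dataset)

loader = load_data()
model = build_model()
train_one_epoch(model, loader, torch.optim.SGD(model.parameters(), lr=0.1), nn.CrossEntropyLoss())
print(evaluate(model, loader))
```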
Original: Manual Experiment of Fully Connected Neuronal Network Based on MNIST
Manual Experiment of Fully Connected Neuronal Network Based on MNIST 1. Experiment Preparation 1.1 Task Overview This task is mainly about constructing an FNN model for handwriting recognition. FNN is a fundamental model in deep learning, which means a
2021-12-12 20:59:25 919
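A "manual" fully connected network for MNIST can be written with explicit weight tensors instead of nn.Linear. The following is a rough sketch under my own assumptions (hidden size 256, a fake batch instead of real MNIST data), not the post's implementation.

```python
import torch
import torch.nn.functional as F

# parameters of a two-layer fully connected network: 28x28 inputs, 10 digit classes
W1 = (0.01 * torch.randn(784, 256)).requires_grad_()
b1 = torch.zeros(256, requires_grad=True)
W2 = (0.01 * torch.randn(256, 10)).requires_grad_()
b2 = torch.zeros(10, requires_grad=True)

def forward(x):
    h = torch.relu(x.reshape(-1, 784) @ W1 + b1)  # hidden layer
    return h @ W2 + b2                            # class scores

# one training step on a fake batch (stand-in for real MNIST images and labels)
x = torch.randn(64, 1, 28, 28)
y = torch.randint(0, 10, (64,))
loss = F.cross_entropy(forward(x), y)
loss.backward()
with torch.no_grad():
    for p in (W1, b1, W2, b2):
        p -= 0.1 * p.grad
        p.grad.zero_()
```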
Original: Manual Experiment of Linear Regression
Manual Experiment of Linear Regression 1. Experiment Preparation 1.1 Task Overview The goal of the task is to become familiar with PyTorch and gradient descent. Detail: given x, which is randomly generated, define a function as the fitting target. Usi
2021-11-26 16:45:34 600 1
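The setup described, randomly generated x, a target function to fit, and gradient descent in PyTorch, might look like the sketch below. The target 3x + 2, the noise level, and the hyperparameters are illustrative choices of mine, not necessarily the post's.

```python
import torch
import torch.nn as nn

x = torch.rand(100, 1) * 10                  # randomly generated inputs
target = lambda t: 3.0 * t + 2.0             # the function we try to fit
y = target(x) + 0.1 * torch.randn_like(x)    # noisy observations

model = nn.Linear(1, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.MSELoss()

for _ in range(500):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()

print(model.weight.item(), model.bias.item())  # should approach 3 and 2
```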