DL
Jason24_Zeng
This author is lazy and hasn't left anything behind…
Char_Level_CNN_Model: Model Introduction
Contents: model introduction; parameter overview; layer-by-layer walkthrough (Input Layer, Embedding Layer, 1D Convolution layers, Flatten Layer, Fully Connected (FC) Layers and Output Layer) with per-layer parameter counts and a predicted total parameter count; building the model; training the model.
Original · 2020-10-23 10:54:53 · 495 views · 1 comment

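A minimal sketch of a character-level CNN with the layer stack listed above, built with tf.keras; the vocabulary size, sequence length, filter settings, and class count are illustrative assumptions, not the article's values.

```python
# Hypothetical character-level CNN: Input -> Embedding -> Conv1D stack
# -> Flatten -> FC layer -> Output. All sizes are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers

vocab_size = 70     # e.g. number of distinct characters
seq_len = 1014      # fixed character-sequence length
num_classes = 4

model = tf.keras.Sequential([
    layers.Input(shape=(seq_len,)),
    layers.Embedding(input_dim=vocab_size, output_dim=16),
    layers.Conv1D(filters=128, kernel_size=7, activation="relu"),
    layers.Conv1D(filters=128, kernel_size=3, activation="relu"),
    layers.Flatten(),
    layers.Dense(256, activation="relu"),
    layers.Dense(num_classes, activation="softmax"),
])
model.summary()  # prints the per-layer parameter counts the post walks through
```
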
A Simple Understanding of Backpropagation Through a ConvNet Layer
This post only discusses the two-dimensional case, i.e. channel = 1 and strides = 1, and the computation within this single layer. ConvNet forward propagation, with a filter of shape [height, length, 1]:
y_{i,j} = \sum_{n=0}^{length}\sum_{m=0}^{height} W_{m,n}\, x_{i+m,\,j+n}
Original · 2020-10-16 17:58:30 · 250 views · 0 comments

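A minimal NumPy sketch of the forward pass above (channel = 1, stride = 1, no padding); the input map and filter values are toy data for illustration.

```python
# Direct implementation of y[i, j] = sum_m sum_n W[m, n] * x[i + m, j + n].
import numpy as np

def conv2d_forward(x: np.ndarray, W: np.ndarray) -> np.ndarray:
    """Valid (no padding), stride-1, single-channel 2D convolution."""
    height, length = W.shape
    out_h = x.shape[0] - height + 1
    out_w = x.shape[1] - length + 1
    y = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            y[i, j] = np.sum(W * x[i:i + height, j:j + length])
    return y

x = np.arange(16, dtype=float).reshape(4, 4)   # toy 4x4 input map
W = np.ones((2, 2))                            # toy 2x2 filter
print(conv2d_forward(x, W))                    # 3x3 output map
```
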
Training a Model on the 20_newsgroup Dataset with Pre-trained GloVe Weights
Contents: pre-training and the 20_newsgroup dataset; loading the samples (previewing the file folder, defining the path to the 20_newsgroup folder, loading data from all child folders in 20_newsgroup); preprocessing the text data (library imports, Tokenizer, pad_sequences); preprocessing the labels; and more.
Translated · 2020-10-15 17:59:12 · 1238 views · 0 comments

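A minimal sketch of how pre-trained GloVe weights are typically loaded into a frozen Keras Embedding layer; the GloVe file name, embedding dimension, and placeholder word_index are assumptions, not the article's exact settings.

```python
# Illustrative sketch: build an embedding matrix from a GloVe text file and
# load it into a Keras Embedding layer that is not trained further.
import numpy as np
import tensorflow as tf

embedding_dim = 100
glove_index = {}
with open("glove.6B.100d.txt", encoding="utf-8") as f:   # assumed file name
    for line in f:
        word, *vec = line.split()
        glove_index[word] = np.asarray(vec, dtype="float32")

# word_index would normally come from a fitted Tokenizer (word -> integer id).
word_index = {"news": 1, "group": 2}   # placeholder vocabulary
embedding_matrix = np.zeros((len(word_index) + 1, embedding_dim))
for word, i in word_index.items():
    if word in glove_index:
        embedding_matrix[i] = glove_index[word]

embedding_layer = tf.keras.layers.Embedding(
    input_dim=embedding_matrix.shape[0],
    output_dim=embedding_dim,
    embeddings_initializer=tf.keras.initializers.Constant(embedding_matrix),
    trainable=False,   # keep the GloVe vectors static during training
)
```
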
Usage of Some tensorflow.keras Functions
Contents: common tensorflow.keras functions; tensorflow.keras.preprocessing; Tokenizer; pad_sequences. For example, the Tokenizer constructor begins:
tf.keras.preprocessing.text.Tokenizer(num_words=None, filters='!"#$%&()*+,-./:;<=>?@[\\]^_`{|}~\t\n', lower=True, …)
Original · 2020-10-15 16:30:35 · 424 views · 0 comments

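A minimal usage sketch of the two preprocessing utilities covered in the post, Tokenizer and pad_sequences; the sample texts and the maxlen value are made up for illustration.

```python
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.preprocessing.sequence import pad_sequences

texts = ["the cat sat on the mat", "the dog barked"]

tokenizer = Tokenizer(num_words=1000, lower=True)
tokenizer.fit_on_texts(texts)                    # build the word -> index vocabulary
sequences = tokenizer.texts_to_sequences(texts)  # lists of integer ids

# Pad (or truncate) every sequence to the same length so they can be batched.
padded = pad_sequences(sequences, maxlen=8, padding="post")
print(tokenizer.word_index)
print(padded)
```
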
Word2Vec skip-gram model
Contents: understanding the word2vec Skip-Gram model; references; the purpose of using word2vec; model structures: 1. Feedforward Neural Net Language Model (NNLM); 2. Recurrent Neural Net Language Model (RNNLM); 3. Parallel Training of Neural Networks; 4. New Log-Linear Models (4.1 CBOW: Continuous Bag-of-Words Model, 4.2 Continuous Skip-gram Model); …
Original · 2020-10-14 19:33:55 · 308 views · 0 comments

Train_word2vec in Convolutional Neural Network for Sentence Classification
Parameters: train_word2vec(sentence_matrix, vocabulary_inv, num_features=300, min_word_count=1, context=10). Five parameters are involved: sentence_matrix, an integer matrix with one row per sentence, padded where a sentence is too short, in which each word is replaced by its value in the vocabulary; vocabulary_inv, a dict{int: str} used to map each value back to the word it represents; num_features, …
Translated · 2020-10-13 12:03:24 · 199 views · 0 comments

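A minimal sketch of how a train_word2vec helper with the parameters described above could be implemented, backed by gensim (argument names follow gensim >= 4.0); this is an assumption about the shape of such a function, not the article's exact code.

```python
import numpy as np
from gensim.models import Word2Vec

def train_word2vec(sentence_matrix, vocabulary_inv,
                   num_features=300, min_word_count=1, context=10):
    # Turn the padded integer matrix back into lists of word strings for gensim.
    sentences = [[vocabulary_inv[idx] for idx in row] for row in sentence_matrix]
    model = Word2Vec(sentences=sentences,
                     vector_size=num_features,   # embedding dimensionality
                     min_count=min_word_count,   # ignore words rarer than this
                     window=context,             # context window size
                     sg=1)                       # 1 = skip-gram training
    # One embedding row per vocabulary index; random vector if gensim dropped the word.
    weights = np.zeros((len(vocabulary_inv), num_features))
    for idx, word in vocabulary_inv.items():
        if word in model.wv:
            weights[idx] = model.wv[word]
        else:
            weights[idx] = np.random.uniform(-0.25, 0.25, num_features)
    return weights
```
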
Convolutional Neural Networks for Sentence Classification: Code Implementation
Contents: preprocessing the data (the raw data and the preprocessed result); preprocessing steps: Load Data (load from file), Split by Words, Generate Labels, Padding, Build Vocabulary, Map Sentences and Labels to Index, Shuffle Data and Split Train/Test Set; Word Embedding (converting words to vectors); …
Translated · 2020-10-13 00:39:24 · 455 views · 0 comments

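A minimal sketch of the preprocessing steps listed above (padding, vocabulary building, mapping sentences and labels to indices, shuffling and splitting); the sample sentences, padding token, and split ratio are assumptions for illustration.

```python
from collections import Counter
import numpy as np

sentences = [["the", "movie", "was", "great"], ["boring", "plot"]]
labels = [1, 0]

# Padding: make every sentence as long as the longest one.
pad_token = "<PAD/>"
max_len = max(len(s) for s in sentences)
padded = [s + [pad_token] * (max_len - len(s)) for s in sentences]

# Build Vocabulary: most frequent words get the smallest indices.
counts = Counter(w for s in padded for w in s)
vocabulary_inv = [w for w, _ in counts.most_common()]      # index -> word
vocabulary = {w: i for i, w in enumerate(vocabulary_inv)}  # word -> index

# Map Sentences and Labels to Index, then Shuffle Data and Split Train/Test Set.
x = np.array([[vocabulary[w] for w in s] for s in padded])
y = np.array(labels)
rng = np.random.default_rng(0)
perm = rng.permutation(len(x))
x, y = x[perm], y[perm]
split = int(0.9 * len(x))
x_train, x_test, y_train, y_test = x[:split], x[split:], y[:split], y[split:]
```
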
Convolutional Neural Network for Sentence Classification: Notes
Highlights of the paper: using just a very simple CNN with little hyperparameter tuning and static vectors, it improves the state of the art (SOTA) on four of seven tasks, including sentiment analysis and question classification. Four models: 2.1 CNN-rand, the baseline model, in which words are randomly initialized as vectors; …
Original · 2020-10-12 00:39:14 · 180 views · 0 comments

NLP_Transformer_Attention_mechanism
Contents: limitations of the techniques before the Transformer: RNNs can capture long-distance dependency information but cannot be parallelized; CNNs can be parallelized but cannot capture long-distance dependencies (pooling or larger kernels are needed to expand the receptive field); traditional attention; how certain layers in NLP neural networks can be interpreted; the Attention Mechanism; important earlier models worth noting: seq2seq, or the RNN encoder-decoder (http://emnlp2014.org/papers/pdf/EMNLP2014179.pdf), which became popular earlier …
Original · 2020-10-11 12:16:12 · 185 views · 2 comments

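As a quick reference for what the attention mechanism computes, here is a minimal NumPy sketch of scaled dot-product attention, the Transformer's variant; the toy shapes are assumptions, not values from the post.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                    # query-key similarity
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)     # softmax over the keys
    return weights @ V                                 # weighted sum of the values

Q = np.random.rand(3, 8)   # 3 query positions, dimension 8
K = np.random.rand(5, 8)   # 5 key positions
V = np.random.rand(5, 8)
print(scaled_dot_product_attention(Q, K, V).shape)     # (3, 8)
```
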