1 Text Classification, Part I - Convolutional Networks
2 Text Classification, Part 2 - sentence level Attentional RNN
3 Text Classification, Part 3 - Hierarchical attention network
4 Deep learning text classification in 100 lines
5 Chinese text classification with CNN
6 Text classification practice with DNNs
7 Sentiment analysis with a char-rnn trained on 80 million reviews
8 Movie review sentiment classification with word vectors / CNN
9 Shallow word-level CNN vs. deep character-level CNN: a comparative study for text classification
10 A walkthrough of implementing text classification in Keras
11 How does Keras handle multilabel classification?
12 Keras hyperparameter tuning, optimization, and assorted settings
13 Python/Keras - accessing ModelCheckpoint callback
14 Practical notes on using the Keras deep learning framework
15 Selecting a GPU and limiting GPU memory usage
import os
import tensorflow as tf
import keras.backend.tensorflow_backend as KTF

# make only the first GPU visible
os.environ["CUDA_VISIBLE_DEVICES"] = "0"

config = tf.ConfigProto()
config.gpu_options.allow_growth = True  # allocate GPU memory on demand instead of grabbing it all
sess = tf.Session(config=config)
KTF.set_session(sess)
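The snippet above targets TF 1.x with standalone Keras. On TensorFlow 2.x there is no `Session` to configure; the equivalent effect comes from `tf.config` (a sketch, assuming TF 2.x is installed):

```python
import os

# Make only the first GPU visible; must be set before TensorFlow initializes.
os.environ["CUDA_VISIBLE_DEVICES"] = "0"

import tensorflow as tf

# Enable on-demand memory growth for each visible GPU (TF 2.x API),
# so TensorFlow does not reserve all device memory up front.
for gpu in tf.config.list_physical_devices("GPU"):
    tf.config.experimental.set_memory_growth(gpu, True)
```

Note that memory growth must be set before any GPU op runs, otherwise TensorFlow raises an error.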
16 Neural network for multi label classification with large number of classes outputs only zero
If you use the TensorFlow backend in Keras, you can define a loss function like this (tested with Keras 2.1.1):
import tensorflow as tf
import keras.backend.tensorflow_backend as tfb
POS_WEIGHT = 10  # multiplier for positive targets, needs to be tuned

def weighted_binary_crossentropy(target, output):
    """
    Weighted binary crossentropy between an output tensor
    and a target tensor. POS_WEIGHT is used as a multiplier
    for the positive targets.

    Combination of the following functions:
    * keras.losses.binary_crossentropy
    * keras.backend.tensorflow_backend.binary_crossentropy
    * tf.nn.weighted_cross_entropy_with_logits
    """
    # transform back to logits
    _epsilon = tfb._to_tensor(tfb.epsilon(), output.dtype.base_dtype)
    output = tf.clip_by_value(output, _epsilon, 1 - _epsilon)
    output = tf.log(output / (1 - output))
    # compute weighted loss
    loss = tf.nn.weighted_cross_entropy_with_logits(targets=target,
                                                    logits=output,
                                                    pos_weight=POS_WEIGHT)
    return tf.reduce_mean(loss, axis=-1)
Then in your model:
model.compile(loss=weighted_binary_crossentropy, ...)
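To see what the loss actually computes, here is a pure-NumPy sanity check of the same formula (`pos_weight * t * -log(p) + (1 - t) * -log(1 - p)`, matching the documented semantics of `tf.nn.weighted_cross_entropy_with_logits`; the example values are assumptions):

```python
import numpy as np

POS_WEIGHT = 10  # same multiplier as in the Keras loss above

def weighted_bce_numpy(target, prob):
    """Weighted binary crossentropy on probabilities:
    positive targets are scaled by POS_WEIGHT, negatives are not."""
    prob = np.clip(prob, 1e-7, 1 - 1e-7)
    loss = -(POS_WEIGHT * target * np.log(prob)
             + (1 - target) * np.log(1 - prob))
    return loss.mean(axis=-1)

target = np.array([[1.0, 0.0, 0.0]])
prob = np.array([[0.5, 0.5, 0.5]])
# At p = 0.5 every position contributes ln(2), but the positive one is
# scaled by 10, so the mean is (10 + 1 + 1) * ln(2) / 3 = 4 * ln(2).
print(weighted_bce_numpy(target, prob))  # → [2.77258872]
```

This makes the effect of `POS_WEIGHT` concrete: misclassifying a positive label costs ten times as much as misclassifying a negative one, which counteracts the all-zeros collapse described in item 16.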
17 How to train a multi-label Classifier
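The common answer to items 11 and 17: treat each label as an independent binary problem — sigmoid output units with binary crossentropy, then threshold each probability separately (0.5 is the usual default; the logit values below are made up for illustration):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Raw scores (logits) from the final Dense layer for one sample, 4 labels.
logits = np.array([2.0, -1.0, 0.3, -3.0])

# Multi-label: each label gets an independent probability via sigmoid
# (unlike softmax, these probabilities need not sum to 1).
probs = sigmoid(logits)

# Predict every label whose probability clears the threshold;
# any number of labels (including zero or all) can fire at once.
predicted = (probs > 0.5).astype(int)
print(predicted)  # → [1 0 1 0]
```

This is why a multi-label Keras model ends in `Dense(n_labels, activation="sigmoid")` rather than `softmax`: softmax would force the labels to compete for probability mass.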