NLP 基于kashgari和BERT实现中文命名实体识别(NER)

本文详细介绍如何利用 kashgari 框架结合预训练的 BERT 模型完成中文命名实体识别(NER)任务,并通过完整实例展示从数据加载、模型训练到效果评估的全过程。
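在运行下面的代码之前,需要先安装 kashgari(本文代码对应 kashgari 1.x 的 API,例如通过 pip install "kashgari>=1.1,<2.0" 安装,并搭配 TensorFlow 1.x;具体版本组合请以自己的环境为准),并从 Google 官方 BERT 仓库下载中文预训练模型 chinese_L-12_H-768_A-12,解压到工作目录。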
from kashgari.corpus import ChineseDailyNerCorpus
import kashgari
from kashgari.embeddings import BERTEmbedding
from kashgari.tasks.labeling import BiLSTM_CRF_Model

# 加载 kashgari 内置的中文人民日报 NER 语料,已划分好 train/validate/test
train_x, train_y = ChineseDailyNerCorpus.load_data('train')
valid_x, valid_y = ChineseDailyNerCorpus.load_data('validate')
test_x, test_y = ChineseDailyNerCorpus.load_data('test')

print(f"train data count: {len(train_x)}")
print(f"validate data count: {len(valid_x)}")
print(f"test data count: {len(test_x)}")
#train data count: 20864
#validate data count: 2318
#test data count: 4636
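# 语料为字符级 BIO 标注,实体类型包括人名(PER)、地名(LOC)、机构名(ORG)。
# 下面两行仅为格式示意(注释中的样例是假设性的,具体内容以实际加载的样本为准):
print(train_x[0])   # 形如 ['中', '国', '队', '在', '北', '京', ...]
print(train_y[0])   # 形如 ['B-LOC', 'I-LOC', 'O', 'O', 'B-LOC', 'I-LOC', ...]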
# 加载 Google 发布的中文 BERT 预训练模型目录(chinese_L-12_H-768_A-12),
# sequence_length=100 表示输入序列会被截断或补齐到 100 个字符
bert_embed = BERTEmbedding('chinese_L-12_H-768_A-12',
                           task=kashgari.LABELING,
                           sequence_length=100)

# 在 BERT 向量之上接 BiLSTM + CRF 做序列标注
model = BiLSTM_CRF_Model(bert_embed)
model.fit(train_x,
          train_y,
          x_validate=valid_x,
          y_validate=valid_y,
          epochs=20,
          batch_size=512)
# 保存训练好的模型(结构 + 权重)
model.save('ner.h5')

model.evaluate(test_x, test_y)
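model.evaluate 会在测试集上输出各实体类别(PER/LOC/ORG)的 precision、recall 和 F1。训练结束后,也可以重新加载保存的模型对新句子做预测。下面是一个示意写法(假设使用 kashgari 1.x 的 kashgari.utils.load_model 接口,示例句子为虚构):

from kashgari.utils import load_model

loaded_model = load_model('ner.h5')
# 输入为字符列表,predict 返回与输入等长的 BIO 标签序列
sentence = list('新华社北京十月一日电')
print(loaded_model.predict([sentence]))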

结果:

train data count: 20864
validate data count: 2318
test data count: 4636
WARNING:root:seq_len: 100
Model: "model_4"
__________________________________________________________________________________________________
Layer (type)                    Output Shape         Param #     Connected to                     
==================================================================================================
Input-Token (InputLayer)        [(None, 100)]        0                                            
__________________________________________________________________________________________________
Input-Segment (InputLayer)      [(None, 100)]        0                                            
__________________________________________________________________________________________________
Embedding-Token (TokenEmbedding [(None, 100, 768), ( 16226304    Input-Token[0][0]                
__________________________________________________________________________________________________
Embedding-Segment (Embedding)   (None, 100, 768)     1536        Input-Segment[0][0]              
__________________________________________________________________________________________________
Embedding-Token-Segment (Add)   (None, 100, 768)     0           Embedding-Token[0][0]            
                                                                 Embedding-Segment[0][0]          
__________________________________________________________________________________________________
Embedding-Position (PositionEmb (None, 100, 768)     76800       Embedding-Token-Segment[0][0]    
__________________________________________________________________________________________________
Embedding-Dropout (Dropout)     (None, 100, 768)     0           Embedding-Position[0][0]         
__________________________________________________________________________________________________
Embedding-Norm (LayerNormalizat (None, 100, 768)     1536        Embedding-Dropout[0][0]          
__________________________________________________________________________________________________
Encoder-1-MultiHeadSelfAttentio (None, 100, 768)     2362368     Embedding-Norm[0][0]             
__________________________________________________________________________________________________
Encoder-1-MultiHeadSelfAttentio (None, 100, 768)     0           Encoder-1-MultiHeadSelfAttention[
__________________________________________________________________________________________________
Encoder-1-MultiHeadSelfAttentio (None, 100, 768)     0           Embedding-Norm[0][0]             
                                                                 Encoder-1-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-1-MultiHeadSelfAttentio (None, 100, 768)     1536        Encoder-1-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-1-FeedForward (FeedForw (None, 100, 768)     4722432     Encoder-1-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-1-FeedForward-Dropout ( (None, 100, 768)     0           Encoder-1-FeedForward[0][0]      
__________________________________________________________________________________________________
Encoder-1-FeedForward-Add (Add) (None, 100, 768)     0           Encoder-1-MultiHeadSelfAttention-
                                                                 Encoder-1-FeedForward-Dropout[0][
__________________________________________________________________________________________________
Encoder-1-FeedForward-Norm (Lay (None, 100, 768)     1536        Encoder-1-FeedForward-Add[0][0]  
__________________________________________________________________________________________________
Encoder-2-MultiHeadSelfAttentio (None, 100, 768)     2362368     Encoder-1-FeedForward-Norm[0][0] 
__________________________________________________________________________________________________
Encoder-2-MultiHeadSelfAttentio (None, 100, 768)     0           Encoder-2-MultiHeadSelfAttention[
__________________________________________________________________________________________________
Encoder-2-MultiHeadSelfAttentio (None, 100, 768)     0           Encoder-1-FeedForward-Norm[0][0] 
                                                                 Encoder-2-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-2-MultiHeadSelfAttentio (None, 100, 768)     1536        Encoder-2-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-2-FeedForward (FeedForw (None, 100, 768)     4722432     Encoder-2-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-2-FeedForward-Dropout ( (None, 100, 768)     0           Encoder-2-FeedForward[0][0]      
__________________________________________________________________________________________________
Encoder-2-FeedForward-Add (Add) (None, 100, 768)     0           Encoder-2-MultiHeadSelfAttention-
                                                                 Encoder-2-FeedForward-Dropout[0][
__________________________________________________________________________________________________
Encoder-2-FeedForward-Norm (Lay (None, 100, 768)     1536        Encoder-2-FeedForward-Add[0][0]  
__________________________________________________________________________________________________
Encoder-3-MultiHeadSelfAttentio (None, 100, 768)     2362368     Encoder-2-FeedForward-Norm[0][0] 
__________________________________________________________________________________________________
Encoder-3-MultiHeadSelfAttentio (None, 100, 768)     0           Encoder-3-MultiHeadSelfAttention[
__________________________________________________________________________________________________
Encoder-3-MultiHeadSelfAttentio (None, 100, 768)     0           Encoder-2-FeedForward-Norm[0][0] 
                                                                 Encoder-3-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-3-MultiHeadSelfAttentio (None, 100, 768)     1536        Encoder-3-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-3-FeedForward (FeedForw (None, 100, 768)     4722432     Encoder-3-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-3-FeedForward-Dropout ( (None, 100, 768)     0           Encoder-3-FeedForward[0][0]      
__________________________________________________________________________________________________
Encoder-3-FeedForward-Add (Add) (None, 100, 768)     0           Encoder-3-MultiHeadSelfAttention-
                                                                 Encoder-3-FeedForward-Dropout[0][
__________________________________________________________________________________________________
Encoder-3-FeedForward-Norm (Lay (None, 100, 768)     1536        Encoder-3-FeedForward-Add[0][0]  
__________________________________________________________________________________________________
Encoder-4-MultiHeadSelfAttentio (None, 100, 768)     2362368     Encoder-3-FeedForward-Norm[0][0] 
__________________________________________________________________________________________________
Encoder-4-MultiHeadSelfAttentio (None, 100, 768)     0           Encoder-4-MultiHeadSelfAttention[
__________________________________________________________________________________________________
Encoder-4-MultiHeadSelfAttentio (None, 100, 768)     0           Encoder-3-FeedForward-Norm[0][0] 
                                                                 Encoder-4-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-4-MultiHeadSelfAttentio (None, 100, 768)     1536        Encoder-4-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-4-FeedForward (FeedForw (None, 100, 768)     4722432     Encoder-4-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-4-FeedForward-Dropout ( (None, 100, 768)     0           Encoder-4-FeedForward[0][0]      
__________________________________________________________________________________________________
Encoder-4-FeedForward-Add (Add) (None, 100, 768)     0           Encoder-4-MultiHeadSelfAttention-
                                                                 Encoder-4-FeedForward-Dropout[0][
__________________________________________________________________________________________________
Encoder-4-FeedForward-Norm (Lay (None, 100, 768)     1536        Encoder-4-FeedForward-Add[0][0]  
__________________________________________________________________________________________________
Encoder-5-MultiHeadSelfAttentio (None, 100, 768)     2362368     Encoder-4-FeedForward-Norm[0][0] 
__________________________________________________________________________________________________
Encoder-5-MultiHeadSelfAttentio (None, 100, 768)     0           Encoder-5-MultiHeadSelfAttention[
__________________________________________________________________________________________________
Encoder-5-MultiHeadSelfAttentio (None, 100, 768)     0           Encoder-4-FeedForward-Norm[0][0] 
                                                                 Encoder-5-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-5-MultiHeadSelfAttentio (None, 100, 768)     1536        Encoder-5-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-5-FeedForward (FeedForw (None, 100, 768)     4722432     Encoder-5-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-5-FeedForward-Dropout ( (None, 100, 768)     0           Encoder-5-FeedForward[0][0]      
__________________________________________________________________________________________________
Encoder-5-FeedForward-Add (Add) (None, 100, 768)     0           Encoder-5-MultiHeadSelfAttention-
                                                                 Encoder-5-FeedForward-Dropout[0][
__________________________________________________________________________________________________
Encoder-5-FeedForward-Norm (Lay (None, 100, 768)     1536        Encoder-5-FeedForward-Add[0][0]  
__________________________________________________________________________________________________
Encoder-6-MultiHeadSelfAttentio (None, 100, 768)     2362368     Encoder-5-FeedForward-Norm[0][0] 
__________________________________________________________________________________________________
Encoder-6-MultiHeadSelfAttentio (None, 100, 768)     0           Encoder-6-MultiHeadSelfAttention[
__________________________________________________________________________________________________
Encoder-6-MultiHeadSelfAttentio (None, 100, 768)     0           Encoder-5-FeedForward-Norm[0][0] 
                                                                 Encoder-6-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-6-MultiHeadSelfAttentio (None, 100, 768)     1536        Encoder-6-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-6-FeedForward (FeedForw (None, 100, 768)     4722432     Encoder-6-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-6-FeedForward-Dropout ( (None, 100, 768)     0           Encoder-6-FeedForward[0][0]      
__________________________________________________________________________________________________
Encoder-6-FeedForward-Add (Add) (None, 100, 768)     0           Encoder-6-MultiHeadSelfAttention-
                                                                 Encoder-6-FeedForward-Dropout[0][
__________________________________________________________________________________________________
Encoder-6-FeedForward-Norm (Lay (None, 100, 768)     1536        Encoder-6-FeedForward-Add[0][0]  
__________________________________________________________________________________________________
Encoder-7-MultiHeadSelfAttentio (None, 100, 768)     2362368     Encoder-6-FeedForward-Norm[0][0] 
__________________________________________________________________________________________________
Encoder-7-MultiHeadSelfAttentio (None, 100, 768)     0           Encoder-7-MultiHeadSelfAttention[
__________________________________________________________________________________________________
Encoder-7-MultiHeadSelfAttentio (None, 100, 768)     0           Encoder-6-FeedForward-Norm[0][0] 
                                                                 Encoder-7-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-7-MultiHeadSelfAttentio (None, 100, 768)     1536        Encoder-7-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-7-FeedForward (FeedForw (None, 100, 768)     4722432     Encoder-7-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-7-FeedForward-Dropout ( (None, 100, 768)     0           Encoder-7-FeedForward[0][0]      
__________________________________________________________________________________________________
Encoder-7-FeedForward-Add (Add) (None, 100, 768)     0           Encoder-7-MultiHeadSelfAttention-
                                                                 Encoder-7-FeedForward-Dropout[0][
__________________________________________________________________________________________________
Encoder-7-FeedForward-Norm (Lay (None, 100, 768)     1536        Encoder-7-FeedForward-Add[0][0]  
__________________________________________________________________________________________________
Encoder-8-MultiHeadSelfAttentio (None, 100, 768)     2362368     Encoder-7-FeedForward-Norm[0][0] 
__________________________________________________________________________________________________
Encoder-8-MultiHeadSelfAttentio (None, 100, 768)     0           Encoder-8-MultiHeadSelfAttention[
__________________________________________________________________________________________________
Encoder-8-MultiHeadSelfAttentio (None, 100, 768)     0           Encoder-7-FeedForward-Norm[0][0] 
                                                                 Encoder-8-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-8-MultiHeadSelfAttentio (None, 100, 768)     1536        Encoder-8-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-8-FeedForward (FeedForw (None, 100, 768)     4722432     Encoder-8-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-8-FeedForward-Dropout ( (None, 100, 768)     0           Encoder-8-FeedForward[0][0]      
__________________________________________________________________________________________________
Encoder-8-FeedForward-Add (Add) (None, 100, 768)     0           Encoder-8-MultiHeadSelfAttention-
                                                                 Encoder-8-FeedForward-Dropout[0][
__________________________________________________________________________________________________
Encoder-8-FeedForward-Norm (Lay (None, 100, 768)     1536        Encoder-8-FeedForward-Add[0][0]  
__________________________________________________________________________________________________
Encoder-9-MultiHeadSelfAttentio (None, 100, 768)     2362368     Encoder-8-FeedForward-Norm[0][0] 
__________________________________________________________________________________________________
Encoder-9-MultiHeadSelfAttentio (None, 100, 768)     0           Encoder-9-MultiHeadSelfAttention[
__________________________________________________________________________________________________
Encoder-9-MultiHeadSelfAttentio (None, 100, 768)     0           Encoder-8-FeedForward-Norm[0][0] 
                                                                 Encoder-9-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-9-MultiHeadSelfAttentio (None, 100, 768)     1536        Encoder-9-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-9-FeedForward (FeedForw (None, 100, 768)     4722432     Encoder-9-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-9-FeedForward-Dropout ( (None, 100, 768)     0           Encoder-9-FeedForward[0][0]      
__________________________________________________________________________________________________
Encoder-9-FeedForward-Add (Add) (None, 100, 768)     0           Encoder-9-MultiHeadSelfAttention-
                                                                 Encoder-9-FeedForward-Dropout[0][
__________________________________________________________________________________________________
Encoder-9-FeedForward-Norm (Lay (None, 100, 768)     1536        Encoder-9-FeedForward-Add[0][0]  
__________________________________________________________________________________________________
Encoder-10-MultiHeadSelfAttenti (None, 100, 768)     2362368     Encoder-9-FeedForward-Norm[0][0] 
__________________________________________________________________________________________________
Encoder-10-MultiHeadSelfAttenti (None, 100, 768)     0           Encoder-10-MultiHeadSelfAttention
__________________________________________________________________________________________________
Encoder-10-MultiHeadSelfAttenti (None, 100, 768)     0           Encoder-9-FeedForward-Norm[0][0] 
                                                                 Encoder-10-MultiHeadSelfAttention
__________________________________________________________________________________________________
Encoder-10-MultiHeadSelfAttenti (None, 100, 768)     1536        Encoder-10-MultiHeadSelfAttention
__________________________________________________________________________________________________
Encoder-10-FeedForward (FeedFor (None, 100, 768)     4722432     Encoder-10-MultiHeadSelfAttention
__________________________________________________________________________________________________
Encoder-10-FeedForward-Dropout  (None, 100, 768)     0           Encoder-10-FeedForward[0][0]     
__________________________________________________________________________________________________
Encoder-10-FeedForward-Add (Add (None, 100, 768)     0           Encoder-10-MultiHeadSelfAttention
                                                                 Encoder-10-FeedForward-Dropout[0]
__________________________________________________________________________________________________
Encoder-10-FeedForward-Norm (La (None, 100, 768)     1536        Encoder-10-FeedForward-Add[0][0] 
__________________________________________________________________________________________________
Encoder-11-MultiHeadSelfAttenti (None, 100, 768)     2362368     Encoder-10-FeedForward-Norm[0][0]
__________________________________________________________________________________________________
Encoder-11-MultiHeadSelfAttenti (None, 100, 768)     0           Encoder-11-MultiHeadSelfAttention
__________________________________________________________________________________________________
Encoder-11-MultiHeadSelfAttenti (None, 100, 768)     0           Encoder-10-FeedForward-Norm[0][0]
                                                                 Encoder-11-MultiHeadSelfAttention
__________________________________________________________________________________________________
Encoder-11-MultiHeadSelfAttenti (None, 100, 768)     1536        Encoder-11-MultiHeadSelfAttention
__________________________________________________________________________________________________
Encoder-11-FeedForward (FeedFor (None, 100, 768)     4722432     Encoder-11-MultiHeadSelfAttention
__________________________________________________________________________________________________
Encoder-11-FeedForward-Dropout  (None, 100, 768)     0           Encoder-11-FeedForward[0][0]     
__________________________________________________________________________________________________
Encoder-11-FeedForward-Add (Add (None, 100, 768)     0           Encoder-11-MultiHeadSelfAttention
                                                                 Encoder-11-FeedForward-Dropout[0]
__________________________________________________________________________________________________
Encoder-11-FeedForward-Norm (La (None, 100, 768)     1536        Encoder-11-FeedForward-Add[0][0] 
__________________________________________________________________________________________________
Encoder-12-MultiHeadSelfAttenti (None, 100, 768)     2362368     Encoder-11-FeedForward-Norm[0][0]
__________________________________________________________________________________________________
Encoder-12-MultiHeadSelfAttenti (None, 100, 768)     0           Encoder-12-MultiHeadSelfAttention
__________________________________________________________________________________________________
Encoder-12-MultiHeadSelfAttenti (None, 100, 768)     0           Encoder-11-FeedForward-Norm[0][0]
                                                                 Encoder-12-MultiHeadSelfAttention
__________________________________________________________________________________________________
Encoder-12-MultiHeadSelfAttenti (None, 100, 768)     1536        Encoder-12-MultiHeadSelfAttention
__________________________________________________________________________________________________
Encoder-12-FeedForward (FeedFor (None, 100, 768)     4722432     Encoder-12-MultiHeadSelfAttention
__________________________________________________________________________________________________
Encoder-12-FeedForward-Dropout  (None, 100, 768)     0           Encoder-12-FeedForward[0][0]     
__________________________________________________________________________________________________
Encoder-12-FeedForward-Add (Add (None, 100, 768)     0           Encoder-12-MultiHeadSelfAttention
                                                                 Encoder-12-FeedForward-Dropout[0]
__________________________________________________________________________________________________
Encoder-12-FeedForward-Norm (La (None, 100, 768)     1536        Encoder-12-FeedForward-Add[0][0] 
__________________________________________________________________________________________________
Encoder-Output (Concatenate)    (None, 100, 3072)    0           Encoder-9-FeedForward-Norm[0][0] 
                                                                 Encoder-10-FeedForward-Norm[0][0]
                                                                 Encoder-11-FeedForward-Norm[0][0]
                                                                 Encoder-12-FeedForward-Norm[0][0]
__________________________________________________________________________________________________
non_masking_layer (NonMaskingLa (None, 100, 3072)    0           Encoder-Output[0][0]             
__________________________________________________________________________________________________
layer_blstm (Bidirectional)     (None, 100, 256)     3277824     non_masking_layer[0][0]          
__________________________________________________________________________________________________
layer_dense (Dense)             (None, 100, 64)      16448       layer_blstm[0][0]                
__________________________________________________________________________________________________
layer_crf_dense (Dense)         (None, 100, 8)       520         layer_dense[0][0]                
__________________________________________________________________________________________________
layer_crf (CRF)                 (None, 100, 8)       64          layer_crf_dense[0][0]            
==================================================================================================
Total params: 104,655,496
Trainable params: 3,294,856
Non-trainable params: 101,360,640
__________________________________________________________________________________________________
Epoch 1/20
41/41 [==============================] - 10803s 263s/step - loss: 15.4359 - accuracy: 0.9463 - val_loss: 89.0157 - val_accuracy: 0.9726
Epoch 2/20
41/41 [==============================] - 6191s 151s/step - loss: 2.9283 - accuracy: 0.9913 - val_loss: 88.1398 - val_accuracy: 0.9763
Epoch 3/20
(逐 batch 的进度条输出从略,原始日志在第 3 个 epoch 中途截断)
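从参数统计可以看出,模型总参数约 1.05 亿,其中 BERT 编码器约 1.01 亿参数默认被冻结(Non-trainable),真正参与训练的只有 BiLSTM、全连接和 CRF 部分的约 329 万参数;训练的主要开销来自 BERT 的前向计算,在上面的日志中每个 epoch 约需 1.5~3 小时。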