ResNet weights for TensorFlow: downloads for each version

Download link: https://github.com/keras-team/keras-applications/releases/tag/resnet

The available weight files and their sizes are listed below; a short sketch of loading one of these files locally follows the table.

| Weight file | Size |
| --- | --- |
| resnet101v2_weights_tf_dim_ordering_tf_kernels.h5 | 171 MB |
| resnet101v2_weights_tf_dim_ordering_tf_kernels_notop.h5 | 163 MB |
| resnet101_weights_tf_dim_ordering_tf_kernels.h5 | 171 MB |
| resnet101_weights_tf_dim_ordering_tf_kernels_notop.h5 | 164 MB |
| resnet152v2_weights_tf_dim_ordering_tf_kernels.h5 | 232 MB |
| resnet152v2_weights_tf_dim_ordering_tf_kernels_notop.h5 | 224 MB |
| resnet152_weights_tf_dim_ordering_tf_kernels.h5 | 232 MB |
| resnet152_weights_tf_dim_ordering_tf_kernels_notop.h5 | 224 MB |
| resnet50v2_weights_tf_dim_ordering_tf_kernels.h5 | 98.1 MB |
| resnet50v2_weights_tf_dim_ordering_tf_kernels_notop.h5 | 90.3 MB |
| resnet50_weights_tf_dim_ordering_tf_kernels.h5 | 98.2 MB |
| resnet50_weights_tf_dim_ordering_tf_kernels_notop.h5 | 90.4 MB |
| resnext101_weights_tf_dim_ordering_tf_kernels.h5 | 170 MB |
| resnext101_weights_tf_dim_ordering_tf_kernels_notop.h5 | 162 MB |
| resnext50_weights_tf_dim_ordering_tf_kernels.h5 | 96.2 MB |
| resnext50_weights_tf_dim_ordering_tf_kernels_notop.h5 | 88.4 MB |
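A minimal sketch of using one of these downloaded files: `tf.keras.applications.ResNet50` accepts either `'imagenet'` or a local file path for its `weights` argument, so you can point it at a file downloaded from the link above. The path below is a placeholder for wherever you saved the file.

```python
import tensorflow as tf

# Placeholder path: adjust to where you downloaded the weight file.
WEIGHTS_PATH = "./resnet50_weights_tf_dim_ordering_tf_kernels.h5"

# `weights` accepts 'imagenet' (auto-download) or a path to a local .h5 file.
model = tf.keras.applications.ResNet50(weights=WEIGHTS_PATH)
model.summary()
```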

What does a "notop" model mean?
It comes down to whether the fully-connected classification layer at the top of the network is included (the `include_top` option in Keras). The notop weights omit that top layer, which is exactly what you want for fine-tuning on your own classes, so these weight files are released separately for that purpose.
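A minimal fine-tuning sketch, assuming the notop weights have been downloaded to a local path and a hypothetical 10-class task; the path and class count are placeholders, while `ResNet50`, `GlobalAveragePooling2D`, and `Dense` are standard `tf.keras` pieces.

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

# Placeholder path to the notop weight file and class count for your own task.
NOTOP_WEIGHTS = "./resnet50_weights_tf_dim_ordering_tf_kernels_notop.h5"
NUM_CLASSES = 10

# include_top=False matches the notop weights: no ImageNet classification layer.
base = tf.keras.applications.ResNet50(
    include_top=False,
    weights=NOTOP_WEIGHTS,
    input_shape=(224, 224, 3),
)
base.trainable = False  # freeze the backbone for the first stage of fine-tuning

# Attach a new classification head for the target task.
x = layers.GlobalAveragePooling2D()(base.output)
outputs = layers.Dense(NUM_CLASSES, activation="softmax")(x)
model = Model(base.input, outputs)

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```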

Below is example code implementing ResNet-18 with TensorFlow 2.x:

```python
import tensorflow as tf
from tensorflow.keras.layers import (Conv2D, BatchNormalization, Activation, Add, Input,
                                     MaxPooling2D, ZeroPadding2D, Dense, Flatten)
from tensorflow.keras.models import Model


def conv_bn_relu(inputs, filters, kernel_size, strides):
    # Conv -> BatchNorm -> ReLU, the basic unit used throughout the network
    x = Conv2D(filters=filters, kernel_size=kernel_size, strides=strides, padding='same')(inputs)
    x = BatchNormalization()(x)
    x = Activation('relu')(x)
    return x


def identity_block(inputs, filters):
    # Residual block with an identity shortcut (input and output shapes match)
    x = conv_bn_relu(inputs, filters, kernel_size=3, strides=1)
    x = conv_bn_relu(x, filters, kernel_size=3, strides=1)
    x = Add()([x, inputs])
    x = Activation('relu')(x)
    return x


def conv_block(inputs, filters, strides):
    # Residual block with a 1x1 projection shortcut, used when the channel
    # count or spatial resolution changes
    shortcut = inputs
    x = conv_bn_relu(inputs, filters, kernel_size=3, strides=strides)
    x = conv_bn_relu(x, filters, kernel_size=3, strides=1)
    shortcut = Conv2D(filters=filters, kernel_size=1, strides=strides)(shortcut)
    shortcut = BatchNormalization()(shortcut)
    x = Add()([x, shortcut])
    x = Activation('relu')(x)
    return x


def resnet18():
    inputs = Input(shape=(224, 224, 3))

    # Stem: 7x7 conv with stride 2, then 3x3 max pooling with stride 2
    x = ZeroPadding2D(padding=(3, 3))(inputs)
    x = Conv2D(filters=64, kernel_size=7, strides=2)(x)
    x = BatchNormalization()(x)
    x = Activation('relu')(x)
    x = MaxPooling2D(pool_size=3, strides=2, padding='same')(x)

    # Four stages of residual blocks, two basic blocks per stage
    # ([2, 2, 2, 2], the ResNet-18 layout)
    x = conv_block(x, filters=64, strides=1)
    x = identity_block(x, filters=64)

    x = conv_block(x, filters=128, strides=2)
    x = identity_block(x, filters=128)

    x = conv_block(x, filters=256, strides=2)
    x = identity_block(x, filters=256)

    x = conv_block(x, filters=512, strides=2)
    x = identity_block(x, filters=512)

    # Classification head. (The reference implementation uses global average
    # pooling here; Flatten also works but adds many more parameters.)
    x = Flatten()(x)
    x = Dense(units=1000, activation='softmax')(x)

    model = Model(inputs=inputs, outputs=x)
    return model
```

The helper functions conv_bn_relu, identity_block, and conv_block build the basic blocks of ResNet-18. The resnet18 function then returns a ResNet-18 model: it defines the input layer, applies the stem (7x7 convolution plus max pooling), stacks the four stages of residual blocks, and finishes with a fully-connected layer that outputs a probability distribution over 1000 classes.
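As a quick sanity check, the model can be instantiated and inspected like any other Keras model; the batch size and random input below are only illustrative.

```python
model = resnet18()
model.summary()

# Run a random batch through the untrained network to confirm the output shape.
dummy = tf.random.normal((2, 224, 224, 3))
preds = model(dummy)
print(preds.shape)  # (2, 1000)
```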