An introductory tutorial on recreating Keras code in PyTorch


DEEP LEARNING FOR BEGINNERS

I love Keras, there I said it! However…


As an applied data scientist, nothing gives me more pleasure than quickly whipping up a functional neural network with as little as three lines of code! However, as I have begun to delve deeper into the dark web of neural nets, I have to accept that PyTorch does give you much greater control over your network's architecture.


Given that most of us are pretty comfortable with Keras (if not, see here for a warm intro to Keras), learning to create a similar network in PyTorch (whilst learning PyTorch basics) isn’t challenging at all. Let’s begin!


Note: We will not be creating a replica of the Keras code as I would like to introduce more features and functionalities of PyTorch in this introductory tutorial!


A quick recap of what the Keras code looks like:

Here is the code snippet (from my previous post on neural networks in Keras) for creating the model architecture, compiling the model, and finally training the model. It is a loan assessment model that outputs whether a loan should be accepted or rejected.


from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from tensorflow.keras.optimizers import Adam

# Model architecture
model_m = Sequential([
    Dense(units=8, input_shape=(2,), activation='relu'),
    Dense(units=16, activation='relu'),
    Dense(units=2, activation='softmax')
])

# Model compilation
model_m.compile(optimizer=Adam(learning_rate=0.0001),
                loss='sparse_categorical_crossentropy',
                metrics=['accuracy'])

# Model training and validation
model_m.fit(x=scaled_train_samples_mult,
            y=train_labels,
            batch_size=10,
            epochs=30,
            validation_split=0.1,
            shuffle=True,
            verbose=2)

To summarize, we have built a network with three Dense layers. Their activation functions are relu, relu, and softmax, respectively. input_shape is the tuple (2,), meaning we have two predictor features. The Adam optimizer with a learning rate of 0.0001 is used to update the network weights iteratively based on the training data. The loss monitored on each iteration is sparse_categorical_crossentropy, and accuracy is used to judge how good the network is. For training, we use 30 epochs with a batch_size of 10. If the meaning of any of the aforementioned terms is unclear, please see here or here.
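To set the stage for what follows, here is a minimal sketch of how the same architecture and training configuration might look in PyTorch. This is an illustrative sketch, not the tutorial's final code: the dummy tensors `x` and `y` stand in for the real loan data, and nn.CrossEntropyLoss applies log-softmax internally, which is why the final Softmax layer from the Keras model is omitted.

```python
import torch
import torch.nn as nn

# Mirror of the Keras architecture: 2 input features -> 8 -> 16 -> 2 classes.
# No final Softmax: nn.CrossEntropyLoss expects raw logits.
model = nn.Sequential(
    nn.Linear(2, 8),
    nn.ReLU(),
    nn.Linear(8, 16),
    nn.ReLU(),
    nn.Linear(16, 2),
)

# Rough equivalents of the Keras compile() arguments:
loss_fn = nn.CrossEntropyLoss()  # ~ sparse_categorical_crossentropy
optimizer = torch.optim.Adam(model.parameters(), lr=0.0001)

# One illustrative training step on dummy data (a batch of 10 samples,
# 2 features each, with integer class labels 0 or 1):
x = torch.randn(10, 2)
y = torch.randint(0, 2, (10,))

optimizer.zero_grad()        # clear gradients from the previous step
loss = loss_fn(model(x), y)  # forward pass + loss
loss.backward()              # backpropagate
optimizer.step()             # update weights
```

Note the design difference this already exposes: where Keras hides the training loop behind model.fit, PyTorch asks you to write the zero_grad/forward/backward/step cycle yourself, which is exactly the extra control mentioned above.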

