While creating a model in PyG, the following error occurred:
Traceback (most recent call last):
  File "C:\Users\Wang\PycharmProjects\pythonProject\唐老师\mmmmmmmm.py", line 31, in <module>
    model = GCN()
  File "C:\Users\Wang\PycharmProjects\pythonProject\唐老师\mmmmmmmm.py", line 12, in __init__
    self.conv1 = GCNConv(dataset.num_features, 4)
  File "F:\anaconda\envs\pyG\lib\site-packages\torch_geometric\nn\conv\gcn_conv.py", line 140, in __init__
    weight_initializer='glorot')
  File "F:\anaconda\envs\pyG\lib\site-packages\torch_geometric\nn\dense\linear.py", line 65, in __init__
    self.reset_parameters()
  File "F:\anaconda\envs\pyG\lib\site-packages\torch_geometric\nn\dense\linear.py", line 78, in reset_parameters
    if isinstance(self.weight, nn.parameter.UninitializedParameter):
AttributeError: module 'torch.nn.parameter' has no attribute 'UninitializedParameter'
The cause is a torch version mismatch: the installed torch does not provide the `UninitializedParameter` attribute.
You can either switch to a compatible torch version or edit the code to remove the reference. Since switching versions means reinstalling, I followed advice found online and fixed the problem by editing the code instead.
After the modification, the `UninitializedParameter` attribute of `torch.nn.parameter` is no longer used, so the error no longer occurs.
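As background, `UninitializedParameter` was reportedly added to `torch.nn.parameter` around PyTorch 1.8 for lazy modules, which is why older installations trigger this `AttributeError`. Before editing the library file, you can confirm whether your torch has the attribute; the snippet below is an illustrative compatibility check (a sketch, not the official fix described later in this post):

```python
import torch
import torch.nn as nn

# On torch versions that lack UninitializedParameter, define a harmless
# placeholder subclass so isinstance() checks in third-party code stay safe.
if hasattr(nn.parameter, 'UninitializedParameter'):
    UninitializedParameter = nn.parameter.UninitializedParameter
else:
    class UninitializedParameter(nn.Parameter):
        """Placeholder for older torch; never matches a real Parameter."""
        pass

# A normally-initialized weight is a Parameter but not an
# UninitializedParameter, so the isinstance() branch is skipped either way.
w = nn.Parameter(torch.empty(4, 3))
print(isinstance(w, UninitializedParameter))  # False
```

This kind of shim avoids touching files inside `site-packages`, though the edit below is the approach I actually used.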
1. Open the file indicated by the traceback above:
File "F:\anaconda\envs\pyG\lib\site-packages\torch_geometric\nn\dense\linear.py"
2. Modify that linear.py file.
Before the modification:
from typing import Optional

import copy
import math

import torch
from torch import nn
from torch import Tensor
import torch.nn.functional as F
from torch.nn.parameter import Parameter

from torch_geometric.nn import inits


class Linear(torch.nn.Module):
    r"""Applies a linear tranformation to the incoming data

    .. math::
        \mathbf{x}^{\prime} = \mathbf{x} \mathbf{W}^{\top} + \mathbf{b}

    similar to :class:`torch.nn.Linear`.
    It supports lazy initialization and customizable weight and bias
    initialization.

    Args:
        in_channels (int): Size of each input sample. Will be initialized
            lazily in case it is given as :obj:`-1`.
        out_channels (int): Size of each output sample.
        bias (bool, optional): If set to :obj:`False`, the layer will not learn
            an additive bias. (default: :obj:`True`)
        weight_initializer (str, optional): The initializer for the weight
            matrix (:obj:`"glorot"`, :obj:`"uniform"`, :obj:`"kaiming_uniform"`
            or :obj:`None`).
            If set to :obj:`None`, will match default weight initialization of
            :class:`torch.nn.Linear`. (default: :obj:`None`)
        bias_initializer (str, optional): The initializer for the bias vector
            (:obj:`"zeros"` or :obj:`None`).
            If set to :obj:`None`, will match default bias initialization of
            :class:`torch.nn.Linear`. (default: :obj:`None`)
    """
    def __init__(self, in_channels: int, out_channels: int, bias: bool = True,
                 weight_initializer: Optional[str] = None,
                 bias_initializer: Optional[str] = None):
        super().__init__()
        self.in_channels = in_channels
        self.out_channels = out_channels
        self.weight_initializer = weight_initializer
        self.bias_initializer = bias_initializer

        if in_channels > 0:
            self.weight = Parameter(torch.Tensor(out_channels, in_channels))
        else:
            self.weight = nn.parameter.UninitializedParameter()
            self._hook = self.register_forward_pre_hook(
                self.initialize_parameters)

        if bias:
            self.bias = Parameter(torch.Tensor(out_channels))