Multi-Sensor Information Fusion and MAML for Fault Diagnosis

An information fusion-based meta transfer learning method under varying operating conditions

From the paper: "An information fusion-based meta transfer learning method for few-shot fault diagnosis under varying operating conditions", Mechanical Systems and Signal Processing.

Abstract

In recent years, meta learning has attracted increasing attention in fault diagnosis because of its advantages in handling small samples and quickly adapting to different diagnosis tasks. However, scenarios with sharply varying operating conditions and the heavy computational burden still limit the effective application of meta learning to transfer fault diagnosis. To address these challenges, a few-shot meta transfer diagnosis method, information fusion-based model-agnostic meta learning (IFMAML), is proposed. First, an information enhancement method based on sparse principal component analysis is introduced to strengthen domain-invariant features and reduce data redundancy. Subsequently, an information fusion strategy for multi-sensor data is proposed, in which the fused signals form the red-green-blue (RGB) channels of an image, enriching the diversity of domain-invariant features and mining the spatial information across sensors. Then, IFMAML, with enhanced latent diagnosis performance and computational efficiency, is developed to tackle the challenging few-shot cross-domain transfer diagnosis under varying operating conditions. Finally, the effectiveness and superiority of the method are verified through two gearbox fault diagnosis case studies. The experimental results show that, compared with several mainstream meta learning methods, the proposed IFMAML achieves higher diagnosis accuracy and can quickly adapt to new gearbox diagnosis scenarios under different operating conditions.

Keywords: multi-sensor information fusion, data enhancement, meta transfer learning, few-shot fault diagnosis.

Main Contributions

1) To address the problem that redundant features mask domain-invariant features during information fusion, a sparse principal component analysis method is proposed. By sparsely representing the high-dimensional data from multiple sensors, it effectively enhances domain-invariant features and reduces data redundancy, thereby achieving information enhancement.

2) To enrich the diversity of multi-sensor domain-invariant features and mine hidden spatial features, an information fusion method based on red-green-blue (RGB) channels is proposed. The changes in image texture and color caused by the fusion can be used to reveal the feature diversity and the intrinsic spatial connections within heterogeneous multi-sensor data.

3) A few-shot transfer diagnosis method, IFMAML, is proposed to solve challenging cross-domain diagnosis tasks under sharply varying operating conditions. Compared with current mainstream meta learning methods, IFMAML is shown to have strong potential in both diagnosis performance and computational efficiency.

4) To verify the generalization ability of the method, experiments are conducted on two gearbox datasets. The analysis covers transfer diagnosis under sharply varying speed conditions and under unknown health conditions, and the results confirm the effectiveness and superiority of the proposed method.

Information Enhancement and Information Fusion

Main code for information fusion

import numpy as np
import os
import scipy.io as scio
import cv2
from normalize255 import normalize, sparse_coding
import time

start = time.perf_counter()
Samples = 10                   # number of fused image samples per class per operating condition
ImageL = 64                    # image side length in pixels
ImageW = ImageL
ImageSize = ImageL * ImageW    # 64 x 64 = 4096 points per channel
dataPoints = ImageSize
signal_cut1 = np.zeros((Samples, ImageSize))   # buffers for the three sensor channels
signal_cut2 = np.zeros((Samples, ImageSize))
signal_cut3 = np.zeros((Samples, ImageSize))

OPERATING = ["V1run_0A", "V1run_0dot4A"]              # operating conditions
FILES = ["H", "IB_IRC", "PB_IRC", "PG_CT", "SG_TM"]   # health/fault classes

data_dir = r'F:\Datasets'      # root folder of the raw .mat files
output_dir = r'F:\data_spca'   # where the fused RGB images are saved

for i in range(len(OPERATING)):
    operate_name = OPERATING[i]

    # Loop over the health/fault classes for the current operating condition.
    for iFault in range(len(FILES)):
        fault_name = FILES[iFault]

        # Load the .mat record of this class under this operating condition.
        txt_name = fault_name + "_" + operate_name + "_1"
        mat_path = os.path.join(data_dir, fault_name, txt_name + '.mat')

        data = scio.loadmat(mat_path)
        data = data[txt_name]

        # Select the three sensor channels: vibration, motor current and torque.
        data_vibration = data[:, 0]
        data_current = data[:, 4]
        data_torque = data[:, 6]

        data_new = np.array([data_vibration, data_current, data_torque])

        # Sparse-PCA-based information enhancement (proposed method).
        data_enhance = sparse_coding(data_new.T, 3, 100, 1, 1e-4)
        print(data_enhance.shape)

        # Randomly pick Samples start indices for cutting 4096-point segments.
        random_series = np.random.randint(0, data_enhance.shape[0] - dataPoints, Samples)

        for i_cut in range(Samples):
            cut_index = random_series[i_cut]
            # Scale each enhanced channel segment to the 0-255 grey-level range.
            signal_cut1[i_cut, :] = normalize(data_enhance[cut_index: cut_index + dataPoints, 0])
            signal_cut2[i_cut, :] = normalize(data_enhance[cut_index: cut_index + dataPoints, 1])
            signal_cut3[i_cut, :] = normalize(data_enhance[cut_index: cut_index + dataPoints, 2])

        for i_image in range(Samples):
            img_pre1, img_pre2, img_pre3 = signal_cut1[i_image], signal_cut2[i_image], signal_cut3[i_image]
            # Reshape each 4096-point segment into a 64 x 64 grey image.
            img1 = np.reshape(img_pre1, (ImageL, ImageW))
            img2 = np.reshape(img_pre2, (ImageL, ImageW))
            img3 = np.reshape(img_pre3, (ImageL, ImageW))

            # Stack the three sensors as the R, G and B channels of one fused image.
            imgRGB = np.dstack((img1, img2, img3))
            imgRGB = cv2.transpose(imgRGB)

            output_path_subF = os.path.join(output_dir, operate_name, fault_name)
            if not os.path.exists(output_path_subF):
                os.makedirs(output_path_subF)
            image_save_path = os.path.join(output_path_subF, fault_name + "_" + str(i_image) + '.png')
            cv2.imwrite(image_save_path, imgRGB.astype(np.uint8))

end = time.perf_counter()
print('Running time: %s Seconds' % (end - start))
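
As a quick sanity check (a minimal sketch, not part of the original script, and assuming the fused images above were saved to output_dir), one sample can be opened to confirm it is a 64 x 64 three-channel image:

from PIL import Image

sample_path = os.path.join(output_dir, OPERATING[0], FILES[0], FILES[0] + "_0.png")
sample = Image.open(sample_path)
print(sample.size, sample.mode)   # expected: (64, 64) RGB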

Sparse principal component analysis: normalize255.py

import numpy as np
import scipy.stats
from functools import reduce
def normalize(x):
    # Min-max scale x to the 0-255 grey-level range and round to integer values.
    y_max = 255
    y_min = 0
    x_max = np.max(x)
    x_min = np.min(x)
    out = np.around((y_max - y_min) * (x - x_min) / (x_max - x_min) + y_min)
    return out

def sparse_coding(x, k=3, max_iter=100, lma=1, tol=1e-4):
    """Sparse-PCA-style information enhancement.

    x        : (n, d) multi-sensor signal matrix
    k        : number of components to keep
    max_iter : maximum number of update iterations
    lma      : ridge (sparsity) penalty lambda
    tol      : reconstruction-error tolerance for early stopping
    Returns the (n, k) matrix of enhanced components alpha.
    """
    x = scipy.stats.zscore(x)
    n = x.shape[0]
    d = x.shape[1]
    x = x - np.mean(x)
    w = np.random.randn(d, k)
    for i in range(max_iter):
        alpha = np.dot(x, w)
        w_old = w.copy()   # copy, otherwise the in-place update below would also change w_old
        for j in range(k):
            # Ridge-regularized update of the j-th loading vector:
            # w_j = X.T X w_j / (w_j.T X.T X w_j + lambda)
            w[:, j] = np.dot(x.T, np.dot(x, w_old[:, j])) / (
                reduce(np.dot, (w_old[:, j].T, x.T, x, w_old[:, j])) + lma)
        # Stop early once the reconstruction error is small enough.
        err = np.linalg.norm(x - np.dot(alpha, np.transpose(w)), 'fro')
        if err < tol:
            break

    return alpha
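
For reference, here is a minimal usage sketch of sparse_coding on stand-in data (the random array below is purely illustrative; in the fusion script the input is the stacked vibration/current/torque signal):

import numpy as np

raw = np.random.randn(4096, 3)                 # stand-in for [vibration, current, torque]
enhanced = sparse_coding(raw, k=3, max_iter=100, lma=1, tol=1e-4)
print(enhanced.shape)                          # (4096, 3): three enhanced components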

Training IFMAML

For this part, please refer to GitHub: 519473104/IFMAML: An information fusion-based meta transfer learning method for few-shot fault diagnosis under varying operating conditions (github.com).
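
The full IFMAML training code is in the repository above. For readers who only want the core training idea, the following is a minimal MAML-style meta-update sketch written in PyTorch; it is not the authors' implementation, and the model, task format, and hyper-parameters are illustrative assumptions only.

import torch
import torch.nn.functional as F

def maml_meta_step(model, tasks, meta_optimizer, inner_lr=0.01):
    """One meta-update over a batch of few-shot tasks.

    Each task is assumed to be a tuple (support_x, support_y, query_x, query_y)
    built from the fused RGB images, e.g. an N-way K-shot episode sampled from
    one operating condition.
    """
    meta_loss = 0.0
    for support_x, support_y, query_x, query_y in tasks:
        # Inner loop: one gradient step on the support set using fast weights.
        fast_weights = {name: p for name, p in model.named_parameters()}
        support_logits = torch.func.functional_call(model, fast_weights, (support_x,))
        support_loss = F.cross_entropy(support_logits, support_y)
        grads = torch.autograd.grad(support_loss, list(fast_weights.values()),
                                    create_graph=True)
        fast_weights = {name: p - inner_lr * g
                        for (name, p), g in zip(fast_weights.items(), grads)}

        # Outer loop: evaluate the adapted fast weights on the query set.
        query_logits = torch.func.functional_call(model, fast_weights, (query_x,))
        meta_loss = meta_loss + F.cross_entropy(query_logits, query_y)

    # Meta-update of the shared initialization across all tasks in the batch.
    meta_loss = meta_loss / len(tasks)
    meta_optimizer.zero_grad()
    meta_loss.backward()
    meta_optimizer.step()
    return meta_loss.item()

In this sketch the meta-learned initialization is what enables fast adaptation: after meta-training on episodes from the source operating conditions, a few support samples from a new condition are enough to fine-tune the model for the target diagnosis task.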
