Image Generation: A Deep Dive into SD LoRA Models


Introduction

Anyone who has generated images with Stable Diffusion (SD) is familiar with LoRA. But what does a LoRA model file actually look like, what does it store, and how does it take effect? This post explores these questions in detail.

1. How LoRA Works

LoRA (Low-Rank Adaptation) is a parameter-efficient fine-tuning method. Rather than updating a full weight matrix, it attaches a small low-rank update to selected layers of an existing SD model and trains only that update, which is enough to noticeably steer the generation results. (Most LoRA models adjust the attention weights inside the UNet's Transformer blocks, though other layers can be targeted as well.)

Technically, LoRA is quite simple. The basic structure is shown in the figure below:
[Figure: the frozen pretrained weight W0 (blue) alongside the trainable low-rank matrices A and B (orange)]
The blue part is the original pretrained weight; the orange parts are the LoRA matrices A and B that are actually trained.
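As a concrete illustration, here is a minimal PyTorch sketch of this structure. The class name `LoRALinear`, the default rank, and the initialization are illustrative choices of this post, not taken from any particular trainer; real implementations such as kohya-ss hook into existing layers rather than defining new ones.

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """A linear layer with a frozen pretrained weight (the blue part)
    plus a trainable low-rank update B @ A (the orange parts)."""

    def __init__(self, base: nn.Linear, rank: int = 4, alpha: float = 4.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():   # freeze W0
            p.requires_grad = False
        # A: project down to `rank` dims; B: project back up.
        self.lora_down = nn.Linear(base.in_features, rank, bias=False)  # A
        self.lora_up = nn.Linear(rank, base.out_features, bias=False)   # B
        self.scale = alpha / rank
        nn.init.normal_(self.lora_down.weight, std=1.0 / rank)
        nn.init.zeros_(self.lora_up.weight)  # the update starts at zero

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # W0 x + scale * B(A(x)); only A and B receive gradients.
        return self.base(x) + self.scale * self.lora_up(self.lora_down(x))
```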

2. Training and Inference

Training: A and B in the figure above are the trainable weights. During training, the blue part (the pretrained weight W0) is frozen and only A and B are optimized; when the model is saved, only the parameters of A and B are written out, which is why LoRA files are so small. The basic steps are as follows:
[Figure: LoRA training procedure]
Inference: the update is merged back into the original weights as W = W0 + BA, so the adapted model runs at exactly the same speed as the unmodified one.
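A sketch of that merge, continuing the `LoRALinear` example above (the scaling factor α/r is explained in the next section):

```python
import torch
import torch.nn as nn

@torch.no_grad()
def merge_lora(layer: LoRALinear) -> nn.Linear:
    """Fold the low-rank update into the base weight: W = W0 + scale * (B @ A).
    The merged layer runs at exactly the original speed."""
    merged = nn.Linear(layer.base.in_features, layer.base.out_features,
                       bias=layer.base.bias is not None)
    delta = layer.lora_up.weight @ layer.lora_down.weight  # (out, in)
    merged.weight.copy_(layer.base.weight + layer.scale * delta)
    if layer.base.bias is not None:
        merged.bias.copy_(layer.base.bias)
    return merged
```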

3. What a LoRA File Contains

For each adapted layer, a LoRA file stores three tensors: `lora_down.weight`, `lora_up.weight`, and `alpha`. The down and up matrices correspond to A and B respectively (down projects the input into the low-rank space, up projects it back out), and alpha is a per-layer scaling constant; in most trainers it is fixed as a hyperparameter rather than learned. The effective weight update for each layer can be expressed as:
ΔW = (alpha / r) · W_up · W_down, with the merged weight W = W0 + ΔW, where r is the LoRA rank (the inner dimension shared by the two matrices).
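To see these three tensors in practice, the sketch below opens a LoRA file with the `safetensors` library and reconstructs ΔW for one attention projection. The file path is a placeholder; the module name is taken from the key list in the next section; conv layers would additionally need their 4-D kernels flattened before the matrix product, which is omitted here.

```python
import torch
from safetensors import safe_open

path = "lcm_lora.safetensors"  # placeholder path to a downloaded LoRA file
module = "lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_q"

with safe_open(path, framework="pt", device="cpu") as f:
    down = f.get_tensor(f"{module}.lora_down.weight")  # A: (rank, in_features)
    up = f.get_tensor(f"{module}.lora_up.weight")      # B: (out_features, rank)
    alpha = f.get_tensor(f"{module}.alpha").item()

rank = down.shape[0]
delta_w = (alpha / rank) * (up @ down)  # the per-layer update ΔW
print(f"rank={rank}, alpha={alpha}, delta_w shape={tuple(delta_w.shape)}")
```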

4. Visualizing LoRA Keys

Take the currently popular LCM LoRA as an example. Its keys look like this (excerpt; the snippet after the list shows how to dump them):

```
lora_unet_down_blocks_0_downsamplers_0_conv.alpha
lora_unet_down_blocks_0_downsamplers_0_conv.lora_down.weight
lora_unet_down_blocks_0_downsamplers_0_conv.lora_up.weight
lora_unet_down_blocks_0_resnets_0_conv1.alpha
lora_unet_down_blocks_0_resnets_0_conv1.lora_down.weight
lora_unet_down_blocks_0_resnets_0_conv1.lora_up.weight
lora_unet_down_blocks_0_resnets_0_conv2.alpha
lora_unet_down_blocks_0_resnets_0_conv2.lora_down.weight
lora_unet_down_blocks_0_resnets_0_conv2.lora_up.weight
lora_unet_down_blocks_0_resnets_0_time_emb_proj.alpha
lora_unet_down_blocks_0_resnets_0_time_emb_proj.lora_down.weight
lora_unet_down_blocks_0_resnets_0_time_emb_proj.lora_up.weight
lora_unet_down_blocks_0_resnets_1_conv1.alpha
lora_unet_down_blocks_0_resnets_1_conv1.lora_down.weight
lora_unet_down_blocks_0_resnets_1_conv1.lora_up.weight
lora_unet_down_blocks_0_resnets_1_conv2.alpha
lora_unet_down_blocks_0_resnets_1_conv2.lora_down.weight
lora_unet_down_blocks_0_resnets_1_conv2.lora_up.weight
lora_unet_down_blocks_0_resnets_1_time_emb_proj.alpha
lora_unet_down_blocks_0_resnets_1_time_emb_proj.lora_down.weight
lora_unet_down_blocks_0_resnets_1_time_emb_proj.lora_up.weight
lora_unet_down_blocks_1_attentions_0_proj_in.alpha
lora_unet_down_blocks_1_attentions_0_proj_in.lora_down.weight
lora_unet_down_blocks_1_attentions_0_proj_in.lora_up.weight
lora_unet_down_blocks_1_attentions_0_proj_out.alpha
lora_unet_down_blocks_1_attentions_0_proj_out.lora_down.weight
lora_unet_down_blocks_1_attentions_0_proj_out.lora_up.weight
lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_k.alpha
lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_k.lora_down.weight
lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_k.lora_up.weight
lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_out_0.alpha
lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_out_0.lora_down.weight
lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_out_0.lora_up.weight
lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_q.alpha
lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_q.lora_down.weight
lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_q.lora_up.weight
lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_v.alpha
lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_v.lora_down.weight
lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_v.lora_up.weight
lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_k.alpha
lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_k.lora_down.weight
lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_k.lora_up.weight
lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_out_0.alpha
lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_out_0.lora_down.weight
lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_out_0.lora_up.weight
lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_q.alpha
lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_q.lora_down.weight
lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_q.lora_up.weight
lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_v.alpha
lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_v.lora_down.weight
lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_v.lora_up.weight
lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_ff_net_0_proj.alpha
lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_ff_net_0_proj.lora_down.weight
lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_ff_net_0_proj.lora_up.weight
lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_ff_net_2.alpha
lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_ff_net_2.lora_down.weight
lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_ff_net_2.lora_up.weight
lora_unet_down_blocks_1_attentions_0_transformer_blocks_1_attn1_to_k.alpha
lora_unet_down_blocks_1_attentions_0_transformer_blocks_1_attn1_to_k.lora_down.weight
lora_unet_down_blocks_1_attentions_0_transformer_blocks_1_attn1_to_k.lora_up.weight
lora_unet_down_blocks_1_attentions_0_transformer_blocks_1_attn1_to_out_0.alpha
lora_unet_down_blocks_1_attentions_0_transformer_blocks_1_attn1_to_out_0.lora_down.weight
lora_unet_down_blocks_1_attentions_0_transformer_blocks_1_attn1_to_out_0.lora_up.weight
lora_unet_down_blocks_1_attentions_0_transformer_blocks_1_attn1_to_q.alpha
lora_unet_down_blocks_1_attentions_0_transformer_blocks_1_attn1_to_q.lora_down.weight
lora_unet_down_blocks_1_attentions_0_transformer_blocks_1_attn1_to_q.lora_up.weight
lora_unet_down_blocks_1_attentions_0_transformer_blocks_1_attn1_to_v.alpha
lora_unet_down_blocks_1_attentions_0_transformer_blocks_1_attn1_to_v.lora_down.weight
lora_unet_down_blocks_1_attentions_0_transformer_blocks_1_attn1_to_v.lora_up.weight
lora_unet_down_blocks_1_attentions_0_transformer_blocks_1_attn2_to_k.alpha
lora_unet_down_blocks_1_attentions_0_transformer_blocks_1_attn2_to_k.lora_down.weight
lora_unet_down_blocks_1_attentions_0_transformer_blocks_1_attn2_to_k.lora_up.weight
lora_unet_down_blocks_1_attentions_0_transformer_blocks_1_attn2_to_out_0.alpha
lora_unet_down_blocks_1_attentions_0_transformer_blocks_1_attn2_to_out_0.lora_down.weight
lora_unet_down_blocks_1_attentions_0_transformer_blocks_1_attn2_to_out_0.lora_up.weight
lora_unet_down_blocks_1_attentions_0_transformer_blocks_1_attn2_to_q.alpha
lora_unet_down_blocks_1_attentions_0_transformer_blocks_1_attn2_to_q.lora_down.weight
lora_unet_down_blocks_1_attentions_0_transformer_blocks_1_attn2_to_q.lora_up.weight
lora_unet_down_blocks_1_attentions_0_transformer_blocks_1_attn2_to_v.alpha
lora_unet_down_blocks_1_attentions_0_transformer_blocks_1_attn2_to_v.lora_down.weight
lora_unet_down_blocks_1_attentions_0_transformer_blocks_1_attn2_to_v.lora_up.weight
lora_unet_down_blocks_1_attentions_0_transformer_blocks_1_ff_net_0_proj.alpha
lora_unet_down_blocks_1_attentions_0_transformer_blocks_1_ff_net_0_proj.lora_down.weight
lora_unet_down_blocks_1_attentions_0_transformer_blocks_1_ff_net_0_proj.lora_up.weight
lora_unet_down_blocks_1_attentions_0_transformer_blocks_1_ff_net_2.alpha
lora_unet_down_blocks_1_attentions_0_transformer_blocks_1_ff_net_2.lora_down.weight
lora_unet_down_blocks_1_attentions_0_transformer_blocks_1_ff_net_2.lora_up.weight
lora_unet_down_blocks_1_attentions_1_proj_in.alpha
lora_unet_down_blocks_1_attentions_1_proj_in.lora_down.weight
lora_unet_down_blocks_1_attentions_1_proj_in.lora_up.weight
lora_unet_down_blocks_1_attentions_1_proj_out.alpha
lora_unet_down_blocks_1_attentions_1_proj_out.lora_down.weight
lora_unet_down_blocks_1_attentions_1_proj_out.lora_up.weight
lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_k.alpha
lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_k.lora_down.weight
lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_k.lora_up.weight
lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_out_0.alpha
lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_out_0.lora_down.weight
lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_out_0.lora_up.weight
lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_q.alpha
lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_q.lora_down.weight
lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_q.lora_up.weight
lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_v.alpha
lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_v.lora_down.weight
lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_v.lora_up.weight
lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_k.alpha
lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_k.lora_down.weight
lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_k.lora_up.weight
lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_out_0.alpha
lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_out_0.lora_down.weight
lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_out_0.lora_up.weight
lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_q.alpha
lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_q.lora_down.weight
lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_q.lora_up.weight
lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_v.alpha
lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_v.lora_down.weight
lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_v.lora_up.weight
lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_ff_net_0_proj.alpha
lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_ff_net_0_proj.lora_down.weight
lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_ff_net_0_proj.lora_up.weight
lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_ff_net_2.alpha
lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_ff_net_2.lora_down.weight
lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_ff_net_2.lora_up.weight
lora_unet_down_blocks_1_attentions_1_transformer_blocks_1_attn1_to_k.alpha
lora_unet_down_blocks_1_attentions_1_transformer_blocks_1_attn1_to_k.lora_down.weight
lora_unet_down_blocks_1_attentions_1_transformer_blocks_1_attn1_to_k.lora_up.weight
lora_unet_down_blocks_1_attentions_1_transformer_blocks_1_attn1_to_out_0.alpha
lora_unet_down_blocks_1_attentions_1_transformer_blocks_1_attn1_to_out_0.lora_down.weight
lora_unet_down_blocks_1_attentions_1_transformer_blocks_1_attn1_to_out_0.lora_up.weight
lora_unet_down_blocks_1_attentions_1_transformer_blocks_1_attn1_to_q.alpha
lora_unet_down_blocks_1_attentions_1_transformer_blocks_1_attn1_to_q.lora_down.weight
lora_unet_down_blocks_1_attentions_1_transformer_blocks_1_attn1_to_q.lora_up.weight
lora_unet_down_blocks_1_attentions_1_transformer_blocks_1_attn1_to_v.alpha
lora_unet_down_blocks_1_attentions_1_transformer_blocks_1_attn1_to_v.lora_down.weight
lora_unet_down_blocks_1_attentions_1_transformer_blocks_1_attn1_to_v.lora_up.weight
lora_unet_down_blocks_1_attentions_1_transformer_blocks_1_attn2_to_k.alpha
lora_unet_down_blocks_1_attentions_1_transformer_blocks_1_attn2_to_k.lora_down.weight
lora_unet_down_blocks_1_attentions_1_transformer_blocks_1_attn2_to_k.lora_up.weight
lora_unet_down_blocks_1_attentions_1_transformer_blocks_1_attn2_to_out_0.alpha
lora_unet_down_blocks_1_attentions_1_transformer_blocks_1_attn2_to_out_0.lora_down.weight
lora_unet_down_blocks_1_attentions_1_transformer_blocks_1_attn2_to_out_0.lora_up.weight
lora_unet_down_blocks_1_attentions_1_transformer_blocks_1_attn2_to_q.alpha
lora_unet_down_blocks_1_attentions_1_transformer_blocks_1_attn2_to_q.lora_down.weight
lora_unet_down_blocks_1_attentions_1_transformer_blocks_1_attn2_to_q.lora_up.weight
lora_unet_down_blocks_1_attentions_1_transformer_blocks_1_attn2_to_v.alpha
lora_unet_down_blocks_1_attentions_1_transformer_blocks_1_attn2_to_v.lora_down.weight
lora_unet_down_blocks_1_attentions_1_transformer_blocks_1_attn2_to_v.lora_up.weight
lora_unet_down_blocks_1_attentions_1_transformer_blocks_1_ff_net_0_proj.alpha
lora_unet_down_blocks_1_attentions_1_transformer_blocks_1_ff_net_0_proj.lora_down.weight
lora_unet_down_blocks_1_attentions_1_transformer_blocks_1_ff_net_0_proj.lora_up.weight
lora_unet_down_blocks_1_attentions_1_transformer_blocks_1_ff_net_2.alpha
lora_unet_down_blocks_1_attentions_1_transformer_blocks_1_ff_net_2.lora_down.weight
lora_unet_down_blocks_1_attentions_1_transformer_blocks_1_ff_net_2.lora_up.weight
lora_unet_down_blocks_1_downsamplers_0_conv.alpha
lora_unet_down_blocks_1_downsamplers_0_conv.lora_down.weight
lora_unet_down_blocks_1_downsamplers_0_conv.lora_up.weight
```
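A key dump like the one above takes only a few lines (same placeholder path as before):

```python
from safetensors import safe_open

with safe_open("lcm_lora.safetensors", framework="pt", device="cpu") as f:
    for key in sorted(f.keys()):
        print(key)
```

Each key is the module's path inside the network with dots replaced by underscores, prefixed with lora_unet_ (text-encoder layers use lora_te_ instead) and suffixed with one of the three tensor names from the previous section.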
