Blend modes control how two objects are combined: when drawing, they decide how the pixels being drawn are mixed with the pixels already in the scene.
Photoshop's 27 blend modes are shown in the figure below.
To implement blend modes, you first need to understand how each one works. The core idea is to run a mathematical operation on the pixels of two layers to produce a new color. Two layers' pixels are involved: a pixel of the bottom layer is called the base color (written a in the formulas below), a pixel of a layer above it is called the blend color (written b below), and the color computed from the base and blend colors is the result color (written c below).
Whether we are handling a base pixel or a blend pixel, we are really operating on its three RGB channel values. As a simple example, in Photoshop's **Normal** blend mode, the result pixel c is computed as Rc = Rb, Gc = Gb, Bc = Bb.
So after the operation, every pixel shows only the blend color and none of the base color. That is all Normal mode does.
In Normal mode, the overlapping area shows only the overlay image
With this principle understood, we can go through the blend-mode formulas one by one.
First, Normal mode. With opacity n, its blend formula is: c = n*b + (1-n)*a
So at the default opacity of 100% we get c = b, and only the upper layer is visible where the layers overlap.
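As a quick sanity check, the opacity formula can be evaluated per channel in a few lines of plain C++ (no OpenCV needed). `normalBlend` is a name chosen for this sketch, with all values normalized to [0, 1]:

```cpp
#include <cassert>

// Normal mode with opacity n: c = n*b + (1 - n)*a, applied to each channel.
// a = base color, b = blend color, n = opacity, all in [0, 1].
float normalBlend(float a, float b, float n) {
    return n * b + (1.0f - n) * a;
}
```

At n = 1 the result is exactly b (only the upper layer shows), and at n = 0 it is exactly a.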
In OpenCV, mResImage is a clone of the base image's pixels (mResImage plays this role in all the OpenCV code below) and m_Layer holds the overlay layer's pixels (likewise throughout the OpenCV code below). A per-pixel operation can be written like this:
mResImage.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC] = m_Layer.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC];
In an OpenGL shader, baseColor is the base pixel, layerColor is the overlay pixel, and blendedColor is the blended result, so a per-pixel operation in the shader looks like this:
blendedColor = layerColor;
With these conventions in place, the blend modes can be examined in a few groups.
1. Lighten and darken modes
These all compare pixels and keep an extreme value. Some take the extreme per channel, e.g. Lighten and Darken, while others take the extreme of the whole pixel, e.g. Darker Color and Lighter Color. The whole-pixel modes never create a new color.
The formulas:
Lighten: c = ( max(Ra, Rb), max(Ga, Gb), max(Ba, Bb) )
Darken: c = ( min(Ra, Rb), min(Ga, Gb), min(Ba, Bb) )
Lighter Color: c = max(a, b)
Darker Color: c = min(a, b)
Many people will ask: doesn't that make the two pairs behave the same?
They clearly do not! Looking back, my notation was probably unclear. It reads better if I write Darker Color as c = minGrey(a, b) and Lighter Color as c = maxGrey(a, b): picking the darker or lighter pixel can be loosely understood as comparing the two pixels' RGB averages (or greyscale/brightness values) and then keeping the whole winning pixel, so no new color is produced.
Below is a comparison made in Photoshop for verification. In Darker Color mode the colors are discrete and discontinuous, showing that pixels are copied as-is without creating new ones; Darken, by contrast, produces a green that exists in neither source image, showing that new pixels are created.
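The per-channel versus whole-pixel distinction can be sketched in plain C++. Here `lighterColor` compares channel sums as a simple stand-in for the brightness comparison (Photoshop's exact weighting is not documented here), and both function names are mine:

```cpp
#include <algorithm>
#include <array>
#include <cassert>

using RGB = std::array<float, 3>;

// Lighten: per-channel max, which may yield a color present in neither input.
RGB lighten(const RGB& a, const RGB& b) {
    return {std::max(a[0], b[0]), std::max(a[1], b[1]), std::max(a[2], b[2])};
}

// Lighter Color: compare the pixels as wholes (here by channel sum) and
// keep one of the two inputs unchanged, so no new color can appear.
RGB lighterColor(const RGB& a, const RGB& b) {
    float fSumA = a[0] + a[1] + a[2];
    float fSumB = b[0] + b[1] + b[2];
    return fSumB > fSumA ? b : a;
}
```

Blending pure red with pure green: `lighten` gives yellow (1, 1, 0), a color neither layer contains, while `lighterColor` returns one of the inputs untouched.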
Other common darken modes are Multiply, Color Burn, and Linear Burn; common lighten modes are Screen, Color Dodge, and Linear Dodge.
Multiply is the most used blend mode of all. It computes c = a * b per channel, i.e. (Ra*Rb, Ga*Gb, Ba*Bb). Any color multiplied with black gives black, and any color multiplied with white stays unchanged. This is the classic subtractive (CMYK) mixing; blending magenta, cyan, and yellow gives the result shown below.
CMYK mixing
The OpenCV blend formula is:
mResImage.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC] = m_BaseImage.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC] * m_Layer.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC]
The GLSL shader blend formula is:
blendedColor = baseColor * layerColor;
Screen is the most common lighten mode, computed as c = 1 - (1-a)*(1-b) per channel. When either the base or the blend color is white, the result is white; blending any color with black leaves it unchanged. This is the classic additive (RGB) mixing; blending red, green, and blue gives the result shown below.
RGB mixing
Color Burn is computed as c = 1 - (1-a)/b (per channel) and Color Dodge as c = a/(1-b) (per channel). The formulas make the special cases obvious: in Color Burn, blending anything over a white base (a = 1) yields white (c = 1), and in Color Dodge, blending anything over a black base (a = 0) yields black.
Linear Burn is c = a + b - 1 (per channel) and Linear Dodge is c = a + b (per channel). From these, Linear Burn over a white base (a = 1) returns the blend color unchanged, and Linear Dodge over a black base (a = 0) returns the blend color unchanged.
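The special cases called out above can be verified with a small plain-C++ sketch. The names are mine, and the clamping to [0, 1] plus the divide-by-zero guards are assumptions, since the text leaves those edge cases unspecified:

```cpp
#include <algorithm>
#include <cassert>

float clamp01(float v) { return std::min(1.0f, std::max(0.0f, v)); }

// Color Burn: c = 1 - (1 - a)/b; over a white base (a = 1) the result is white.
float colorBurn(float a, float b) {
    return b == 0.0f ? 0.0f : clamp01(1.0f - (1.0f - a) / b);
}

// Color Dodge: c = a/(1 - b); over a black base (a = 0) the result is black.
float colorDodge(float a, float b) {
    return b == 1.0f ? 1.0f : clamp01(a / (1.0f - b));
}

// Linear Burn: c = a + b - 1; over a white base (a = 1) it returns b unchanged.
float linearBurn(float a, float b) { return clamp01(a + b - 1.0f); }
```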
Their relative strengths:
Darkening strength: Linear Burn > Color Burn > Multiply
Lightening strength: Linear Dodge > Color Dodge > Screen
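A single sample point illustrates the darkening order (this is a spot check at a = b = 0.6 with names of my choosing, not a proof over the whole range; near the extremes the curves meet):

```cpp
#include <algorithm>
#include <cassert>

float multiplyBlend(float a, float b) { return a * b; }
float colorBurnBlend(float a, float b) {
    // Clamped at 0 so the comparison stays in [0, 1].
    return std::max(0.0f, 1.0f - (1.0f - a) / b);
}
float linearBurnBlend(float a, float b) { return std::max(0.0f, a + b - 1.0f); }
```

At a = b = 0.6: Linear Burn gives 0.2, Color Burn about 0.33, and Multiply 0.36, matching the darkening order above.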
2. Contrast modes
These modes are more involved. Their common trait is making bright areas brighter and dark areas darker: each contrast mode can be viewed as a darken mode combined with a lighten mode. Overlay, for example, applies Multiply to the darker pixels and Screen to the brighter ones.
First, Overlay: c = a <= 0.5 ? 2ab : 1 - 2(1-a)(1-b) (per channel). Hard Light is c = b <= 0.5 ? 2ab : 1 - 2(1-a)(1-b) (per channel). Neat, right? The two are exact mirrors of each other: swap the base and the overlay, and Overlay turns into Hard Light.
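The mirror relationship is easy to confirm per channel in plain C++ (function names mine):

```cpp
#include <cassert>
#include <cmath>

// Overlay branches on the base color a.
float overlayBlend(float a, float b) {
    return a <= 0.5f ? 2.0f * a * b : 1.0f - 2.0f * (1.0f - a) * (1.0f - b);
}

// Hard Light branches on the blend color b, so hardLight(a, b) == overlay(b, a).
float hardLightBlend(float a, float b) {
    return b <= 0.5f ? 2.0f * a * b : 1.0f - 2.0f * (1.0f - a) * (1.0f - b);
}
```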
Soft Light is computed as c = b <= 0.5 ? 2ab + a²(1 - 2b) : 2a(1-b) + sqrt(a)(2b-1) (per channel). The effect resembles Overlay but is softer, with translucent-looking highlights and shadows; it is roughly Lighter Color combined with Darker Color.
Vivid Light is c = b <= 0.5 ? 1 - (1-a)/(2b) : a/(2(1-b)) (per channel). The effect resembles Hard Mix and is usually quite intense; it is roughly Color Dodge combined with Color Burn.
Linear Light is simpler: c = a + 2b - 1 (per channel). For b <= 0.5 this is Linear Burn of a with 2b, and for b > 0.5 it is Linear Dodge of a with 2b - 1, so both halves collapse into a single formula. The effect resembles Vivid Light and is similarly strong.
Pin Light is c = b > 0.5 ? max(2b - 1, a) : min(2b, a) (per channel). The effect is strong and easily produces color patches, blotches, and noise; it is roughly Lighten combined with Darken.
Hard Mix is c = a + b >= 1 ? 1 : 0 (per channel). The result contains only 8 colors: R, G, B, C, M, Y, black, and white (with Fill below 100%, more than 8 colors can appear).
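Because each channel collapses to 0 or 1, every result pixel lands on a corner of the RGB cube, which is where the eight-color claim comes from. A minimal sketch (name mine):

```cpp
#include <cassert>

// Hard Mix: each channel becomes 0 or 1, so a full pixel can only be one of
// the 8 RGB-cube corners: R, G, B, C, M, Y, black, or white.
float hardMix(float a, float b) { return (a + b >= 1.0f) ? 1.0f : 0.0f; }
```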
3. Inversion modes
In the overlapping region, these modes can produce a negative-like (inverted) result.
First, Difference: c = |a - b| (per channel), which has a partial negative effect.
Next, Exclusion: c = a + b - 2ab (per channel), which likewise has a partial negative effect.
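Both formulas invert against a white blend color, which is where the negative-like look comes from; with identical inputs, Difference goes to black. A plain-C++ sketch (names mine):

```cpp
#include <cassert>
#include <cmath>

// Difference: |a - b|; a pixel differenced with itself is 0 (black),
// and with b = 1 the result is 1 - a, a full negative of the base.
float differenceBlend(float a, float b) { return std::fabs(a - b); }

// Exclusion: a + b - 2ab; with b = 1 it also gives 1 - a (a negative),
// but midtones are inverted more gently than in Difference.
float exclusionBlend(float a, float b) { return a + b - 2.0f * a * b; }
```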
4. Cancellation modes
Subtract is c = a - b (per channel). It darkens the image and, combined with Apply Image, is often used for frequency-separation retouching.
Divide is c = a / b (per channel). It brightens the image.
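Two identities make these modes handy for frequency-separation work: subtracting an image from itself gives black, and dividing an image by itself gives white. A sketch (names, clamping, and the divide-by-zero guard are mine):

```cpp
#include <algorithm>
#include <cassert>

// Subtract: c = a - b, clamped at 0; identical inputs cancel to black.
float subtractBlend(float a, float b) {
    return std::max(0.0f, a - b);
}

// Divide: c = a / b, clamped at 1; dividing a pixel by itself gives white.
float divideBlend(float a, float b) {
    return b == 0.0f ? 1.0f : std::min(1.0f, a / b);
}
```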
Below is the OpenGL shader implementation (only the pixel-blending part is shown):
if (blendMode == 0) { // Normal
blendedColor = vec4(layerColor.rgb, baseColor.a);
} else if (blendMode == 1) { // Multiply
blendedColor = vec4(baseColor.rgb * layerColor.rgb, baseColor.a);
} else if (blendMode == 2) { // ColorBurn
blendedColor = vec4(1.0 - (1.0 - baseColor.rgb) / layerColor.rgb, baseColor.a);
} else if (blendMode == 3) { // ColorDodge
blendedColor = vec4(baseColor.rgb / (1.0 - layerColor.rgb), baseColor.a);
} else if (blendMode == 4) { // Screen
blendedColor = vec4(1.0 - (1.0 - baseColor.rgb) * (1.0 - layerColor.rgb), baseColor.a);
} else if (blendMode == 5) { // SoftLight
blendedColor.rgb = mix(2.0 * baseColor.rgb * layerColor.rgb + baseColor.rgb * baseColor.rgb * (1.0 - 2.0 * layerColor.rgb),
sqrt(baseColor.rgb) * (2.0 * layerColor.rgb - 1.0) + 2.0 * baseColor.rgb * (1.0 - layerColor.rgb),
step(0.5, layerColor.rgb));
blendedColor.a = baseColor.a;
} else if (blendMode == 6) { // Lighten
blendedColor.rgb = max(baseColor.rgb, layerColor.rgb);
blendedColor.a = baseColor.a;
} else if (blendMode == 7) { // Darken
blendedColor.rgb = min(baseColor.rgb, layerColor.rgb);
blendedColor.a = baseColor.a;
} else if (blendMode == 8) { // Difference
blendedColor.rgb = abs(baseColor.rgb - layerColor.rgb);
blendedColor.a = baseColor.a;
} else if (blendMode == 9) { // Exclusion
blendedColor = vec4(baseColor.rgb + layerColor.rgb - 2.0 * baseColor.rgb * layerColor.rgb, baseColor.a);
} else if (blendMode == 10) { // Overlay
blendedColor.rgb = mix(2.0 * baseColor.rgb * layerColor.rgb, 1.0 - 2.0 * (1.0 - baseColor.rgb) * (1.0 - layerColor.rgb), step(0.5, baseColor.rgb));
blendedColor.a = baseColor.a;
} else if (blendMode == 11) { // HardLight
blendedColor.rgb = mix(2.0 * baseColor.rgb * layerColor.rgb, 1.0 - 2.0 * (1.0 - baseColor.rgb) * (1.0 - layerColor.rgb), step(0.5, layerColor.rgb));
blendedColor.a = baseColor.a;
} else if (blendMode == 12) { // VividLight
blendedColor.rgb = mix(1.0 - (1.0 - baseColor.rgb) / (2.0 * layerColor.rgb), baseColor.rgb / (2.0 * (1.0 - layerColor.rgb)), step(0.5, layerColor.rgb));
blendedColor.a = baseColor.a;
} else if (blendMode == 13) { // LinearLight
blendedColor = vec4(baseColor.rgb + 2.0 * layerColor.rgb - 1.0, baseColor.a);
} else if (blendMode == 14) { // PinLight
blendedColor.rgb = mix(min(baseColor.rgb, 2.0 * layerColor.rgb), max(baseColor.rgb, 2.0 * layerColor.rgb - 1.0), step(0.5, layerColor.rgb));
blendedColor.a = baseColor.a;
} else if (blendMode == 15) { // LinearBurn
blendedColor = vec4(baseColor.rgb + layerColor.rgb - 1.0, baseColor.a);
} else if (blendMode == 16) { // LinearDodge
blendedColor = vec4(baseColor.rgb + layerColor.rgb, baseColor.a);
}
Below is the OpenCV implementation:
cv::Mat BlendCV::BlendMultiply() {
cv::Mat mResImage = m_BaseImage.clone();
for (int iIndexRow = m_Point.x; iIndexRow < cv::min(m_BaseImage.rows, m_Layer.rows) + m_Point.x; ++iIndexRow) {
for (int iIndexCol = m_Point.y; iIndexCol < cv::min(m_BaseImage.cols, m_Layer.cols) + m_Point.y; ++iIndexCol) {
for (int iIndexC = 0; iIndexC < 3; ++iIndexC) {
mResImage.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC] =
m_BaseImage.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC] *
m_Layer.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC];
}
}
}
return mResImage;
}
cv::Mat BlendCV::BlendColorBurn() {
cv::Mat mResImage = m_BaseImage.clone();
for (int iIndexRow = m_Point.x; iIndexRow < cv::min(m_BaseImage.rows, m_Layer.rows) + m_Point.x; ++iIndexRow) {
for (int iIndexCol = m_Point.y; iIndexCol < cv::min(m_BaseImage.cols, m_Layer.cols) + m_Point.y; ++iIndexCol) {
for (int iIndexC = 0; iIndexC < 3; ++iIndexC) {
mResImage.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC] =
1 - (1 - m_BaseImage.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC]) /
m_Layer.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC];
}
}
}
return mResImage;
}
cv::Mat BlendCV::BlendLinearBurn() {
cv::Mat mResImage = m_BaseImage.clone();
for (int iIndexRow = m_Point.x; iIndexRow < cv::min(m_BaseImage.rows, m_Layer.rows) + m_Point.x; ++iIndexRow) {
for (int iIndexCol = m_Point.y; iIndexCol < cv::min(m_BaseImage.cols, m_Layer.cols) + m_Point.y; ++iIndexCol) {
for (int iIndexC = 0; iIndexC < 3; ++iIndexC) {
mResImage.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC] =
m_BaseImage.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC] +
m_Layer.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC] - 1;
}
}
}
return mResImage;
}
cv::Mat BlendCV::BlendLighten() {
cv::Mat mResImage = m_BaseImage.clone();
for (int iIndexRow = m_Point.x; iIndexRow < cv::min(m_BaseImage.rows, m_Layer.rows) + m_Point.x; ++iIndexRow) {
for (int iIndexCol = m_Point.y; iIndexCol < cv::min(m_BaseImage.cols, m_Layer.cols) + m_Point.y; ++iIndexCol) {
for (int iIndexC = 0; iIndexC < 3; ++iIndexC) {
mResImage.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC] =
cv::max(m_BaseImage.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC],
m_Layer.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC]);
}
}
}
return mResImage;
}
cv::Mat BlendCV::BlendScreen() {
cv::Mat mResImage = m_BaseImage.clone();
for (int iIndexRow = m_Point.x; iIndexRow < cv::min(m_BaseImage.rows, m_Layer.rows) + m_Point.x; ++iIndexRow) {
for (int iIndexCol = m_Point.y; iIndexCol < cv::min(m_BaseImage.cols, m_Layer.cols) + m_Point.y; ++iIndexCol) {
for (int iIndexC = 0; iIndexC < 3; ++iIndexC) {
mResImage.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC] =
1 - (1 - m_BaseImage.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC]) *
(1 - m_Layer.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC]);
}
}
}
return mResImage;
}
cv::Mat BlendCV::BlendColorDodge() {
cv::Mat mResImage = m_BaseImage.clone();
for (int iIndexRow = m_Point.x; iIndexRow < cv::min(m_BaseImage.rows, m_Layer.rows) + m_Point.x; ++iIndexRow) {
for (int iIndexCol = m_Point.y; iIndexCol < cv::min(m_BaseImage.cols, m_Layer.cols) + m_Point.y; ++iIndexCol) {
for (int iIndexC = 0; iIndexC < 3; ++iIndexC) {
mResImage.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC] =
m_BaseImage.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC] /
(1 - m_Layer.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC]);
}
}
}
return mResImage;
}
cv::Mat BlendCV::BlendLinearDodge() {
cv::Mat mResImage = m_BaseImage.clone();
for (int iIndexRow = m_Point.x; iIndexRow < cv::min(m_BaseImage.rows, m_Layer.rows) + m_Point.x; ++iIndexRow) {
for (int iIndexCol = m_Point.y; iIndexCol < cv::min(m_BaseImage.cols, m_Layer.cols) + m_Point.y; ++iIndexCol) {
for (int iIndexC = 0; iIndexC < 3; ++iIndexC) {
mResImage.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC] =
m_BaseImage.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC] +
m_Layer.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC];
}
}
}
return mResImage;
}
cv::Mat BlendCV::BlendOverlay() {
cv::Mat mResImage = m_BaseImage.clone();
for (int iIndexRow = m_Point.x; iIndexRow < cv::min(m_BaseImage.rows, m_Layer.rows) + m_Point.x; ++iIndexRow) {
for (int iIndexCol = m_Point.y; iIndexCol < cv::min(m_BaseImage.cols, m_Layer.cols) + m_Point.y; ++iIndexCol) {
for (int iIndexC = 0; iIndexC < 3; ++iIndexC) {
if (m_BaseImage.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC] > 0.5f) {
mResImage.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC] =
1 -
(1 - 2 * (m_BaseImage.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC] -
0.5f)) *
(1 - m_Layer.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC]);
} else {
mResImage.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC] =
(2 * m_BaseImage.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC]) *
m_Layer.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC];
}
}
}
}
return mResImage;
}
cv::Mat BlendCV::BlendSoftLight() {
cv::Mat mResImage = m_BaseImage.clone();
for (int iIndexRow = m_Point.x; iIndexRow < cv::min(m_BaseImage.rows, m_Layer.rows) + m_Point.x; ++iIndexRow) {
for (int iIndexCol = m_Point.y; iIndexCol < cv::min(m_BaseImage.cols, m_Layer.cols) + m_Point.y; ++iIndexCol) {
for (int iIndexC = 0; iIndexC < 3; ++iIndexC) {
float fBase = m_BaseImage.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC];
float fLayer = m_Layer.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC];
if (fLayer > 0.5f) {
mResImage.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC] =
2 * fBase * (1 - fLayer) + std::sqrt(fBase) * (2 * fLayer - 1);
} else {
mResImage.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC] =
2 * fBase * fLayer + fBase * fBase * (1 - 2 * fLayer);
}
}
}
}
return mResImage;
}
cv::Mat BlendCV::BlendHardLight() {
cv::Mat mResImage = m_BaseImage.clone();
for (int iIndexRow = m_Point.x; iIndexRow < cv::min(m_BaseImage.rows, m_Layer.rows) + m_Point.x; ++iIndexRow) {
for (int iIndexCol = m_Point.y; iIndexCol < cv::min(m_BaseImage.cols, m_Layer.cols) + m_Point.y; ++iIndexCol) {
for (int iIndexC = 0; iIndexC < 3; ++iIndexC) {
float fBase = m_BaseImage.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC];
float fLayer = m_Layer.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC];
if (fLayer > 0.5f) {
mResImage.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC] =
1 - 2 * (1 - fBase) * (1 - fLayer);
} else {
mResImage.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC] =
2 * fBase * fLayer;
}
}
}
}
return mResImage;
}
cv::Mat BlendCV::BlendVividLight() {
cv::Mat mResImage = m_BaseImage.clone();
for (int iIndexRow = m_Point.x; iIndexRow < cv::min(m_BaseImage.rows, m_Layer.rows) + m_Point.x; ++iIndexRow) {
for (int iIndexCol = m_Point.y; iIndexCol < cv::min(m_BaseImage.cols, m_Layer.cols) + m_Point.y; ++iIndexCol) {
for (int iIndexC = 0; iIndexC < 3; ++iIndexC) {
float fBase = m_BaseImage.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC];
float fLayer = m_Layer.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC];
if (fLayer > 0.5f) {
mResImage.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC] =
fBase / (2 * (1 - fLayer));
} else {
mResImage.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC] =
1 - (1 - fBase) / (2 * fLayer);
}
}
}
}
return mResImage;
}
cv::Mat BlendCV::BlendLinearLight() {
cv::Mat mResImage = m_BaseImage.clone();
for (int iIndexRow = m_Point.x; iIndexRow < cv::min(m_BaseImage.rows, m_Layer.rows) + m_Point.x; ++iIndexRow) {
for (int iIndexCol = m_Point.y; iIndexCol < cv::min(m_BaseImage.cols, m_Layer.cols) + m_Point.y; ++iIndexCol) {
for (int iIndexC = 0; iIndexC < 3; ++iIndexC) {
mResImage.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC] =
m_BaseImage.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC] +
2 * m_Layer.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC] - 1;
}
}
}
return mResImage;
}
cv::Mat BlendCV::BlendPinLight() {
cv::Mat mResImage = m_BaseImage.clone();
for (int iIndexRow = m_Point.x; iIndexRow < cv::min(m_BaseImage.rows, m_Layer.rows) + m_Point.x; ++iIndexRow) {
for (int iIndexCol = m_Point.y; iIndexCol < cv::min(m_BaseImage.cols, m_Layer.cols) + m_Point.y; ++iIndexCol) {
for (int iIndexC = 0; iIndexC < 3; ++iIndexC) {
float fBase = m_BaseImage.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC];
float fLayer = m_Layer.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC];
if (fLayer > 0.5f) {
mResImage.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC] =
cv::max(fBase, 2 * fLayer - 1);
} else {
mResImage.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC] =
cv::min(fBase, 2 * fLayer);
}
}
}
}
return mResImage;
}
cv::Mat BlendCV::BlendDifference() {
cv::Mat mResImage = m_BaseImage.clone();
for (int iIndexRow = m_Point.x; iIndexRow < cv::min(m_BaseImage.rows, m_Layer.rows) + m_Point.x; ++iIndexRow) {
for (int iIndexCol = m_Point.y; iIndexCol < cv::min(m_BaseImage.cols, m_Layer.cols) + m_Point.y; ++iIndexCol) {
for (int iIndexC = 0; iIndexC < 3; ++iIndexC) {
mResImage.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC] =
std::abs(m_BaseImage.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC] -
m_Layer.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC]);
}
}
}
return mResImage;
}
cv::Mat BlendCV::BlendExclusion() {
cv::Mat mResImage = m_BaseImage.clone();
for (int iIndexRow = m_Point.x; iIndexRow < cv::min(m_BaseImage.rows, m_Layer.rows) + m_Point.x; ++iIndexRow) {
for (int iIndexCol = m_Point.y; iIndexCol < cv::min(m_BaseImage.cols, m_Layer.cols) + m_Point.y; ++iIndexCol) {
for (int iIndexC = 0; iIndexC < 3; ++iIndexC) {
mResImage.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC] =
m_BaseImage.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC] +
m_Layer.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC] -
2 * m_BaseImage.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC] *
m_Layer.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC];
}
}
}
return mResImage;
}
cv::Mat BlendCV::BlendNormal() {
cv::Mat mResImage = m_BaseImage.clone();
for (int iIndexRow = m_Point.x; iIndexRow < cv::min(m_BaseImage.rows, m_Layer.rows) + m_Point.x; ++iIndexRow) {
for (int iIndexCol = m_Point.y; iIndexCol < cv::min(m_BaseImage.cols, m_Layer.cols) + m_Point.y; ++iIndexCol) {
for (int iIndexC = 0; iIndexC < 3; ++iIndexC) {
mResImage.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC] =
m_Layer.at<cv::Vec3f>(iIndexRow, iIndexCol)[iIndexC];
}
}
}
return mResImage;
}