Exploring Beauty Cameras / Real-Time Camera Filters / Video Codecs / Image Post-Processing / Face Technology on Android — 2.3 Imitating the Frame/Small-Face Mode of the Snow Camera and FaceU

Date: 2022-02-03 20:30:02

GitHub project repository

Back to the table of contents

While playing with various beauty camera apps, I noticed that both FaceU and the Snow camera offer a small-face mode (or frame mode), with an effect like this:

This is the Snow camera:
[screenshot: Snow camera frame mode]
This is FaceU:
[screenshot: FaceU frame mode]

The two don't look quite the same, because the Snow camera's default mode is its scene-optimization mode (I think Snow's frame looks a bit nicer). But calling this a "small face" mode is a stretch — the picture is still exactly the same size!

The goal of this article is to reproduce this kind of frame effect.

Solution

As you can probably tell, the effect simply blurs the scene (with a fairly large blur radius) and then draws the frame, scaled down, on top. Personally I find Snow's version prettier, though that may also come down to pre-processing; FaceU's blur looks quite ordinary, and if you look closely you can even see banding.

Choosing a blur algorithm

Blur is an expensive operation, so of course we want the GPU to do the heavy lifting. The common blur (low-pass) algorithms are FastBlur/BoxBlur — the familiar image-convolution blur plus its various optimizations (integral images, separable kernels, and so on) — and Gaussian blur, which weights the neighbouring pixels according to a Gaussian (normal) distribution.
By the way, Gaussian blur is separable: the two axes are independent, so it can be done as two one-dimensional passes.
If you're interested, there is plenty of material on blur algorithms to dig into.
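As a quick aside on the separable approach: each tap's weight comes from the Gaussian function w(x) ∝ exp(-x²/(2σ²)), normalized so the weights sum to 1. Below is a minimal Java sketch of that computation (purely illustrative, not code from this project):

// Illustrative only: normalized weights of a 1D Gaussian kernel.
// A separable blur applies these weights once horizontally and once vertically.
public static float[] gaussianWeights1D(int radius, float sigma) {
    float[] weights = new float[2 * radius + 1];
    float sum = 0f;
    for (int x = -radius; x <= radius; x++) {
        // Unnormalized Gaussian: exp(-x^2 / (2 * sigma^2))
        float w = (float) Math.exp(-(x * x) / (2.0 * sigma * sigma));
        weights[x + radius] = w;
        sum += w;
    }
    for (int i = 0; i < weights.length; i++) {
        weights[i] /= sum; // normalize so all weights sum to 1
    }
    return weights;
}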
The slightly more involved FastBlur shader is shown below (here the kernel is no longer a rectangle; instead the samples are spread around a circle centered on the pixel):

precision mediump float;
varying vec2 vTextureCoord;
uniform sampler2D sTexture;

uniform vec3 iResolution;

// Returns the Point-th of Points positions on the unit circle, offset by Start.
vec2 Circle(float Start, float Points, float Point)
{
    float Rad = (3.141592 * 2.0 * (1.0 / Points)) * (Point + Start);
    return vec2(sin(Rad), cos(Rad));
}

void mainImage( out vec4 fragColor, in vec2 fragCoord )
{
    vec2 uv = fragCoord.xy;
    vec2 PixelOffset = 1.0 / iResolution.xy;

    float Start = 2.0 / 14.0;
    vec2 Scale = 0.66 * 4.0 * 2.0 * PixelOffset.xy;

    // Average 14 samples taken on a circle around the pixel, plus the pixel itself.
    vec3 N0 = texture2D(sTexture, uv + Circle(Start, 14.0, 0.0) * Scale).rgb;
    vec3 N1 = texture2D(sTexture, uv + Circle(Start, 14.0, 1.0) * Scale).rgb;
    vec3 N2 = texture2D(sTexture, uv + Circle(Start, 14.0, 2.0) * Scale).rgb;
    vec3 N3 = texture2D(sTexture, uv + Circle(Start, 14.0, 3.0) * Scale).rgb;
    vec3 N4 = texture2D(sTexture, uv + Circle(Start, 14.0, 4.0) * Scale).rgb;
    vec3 N5 = texture2D(sTexture, uv + Circle(Start, 14.0, 5.0) * Scale).rgb;
    vec3 N6 = texture2D(sTexture, uv + Circle(Start, 14.0, 6.0) * Scale).rgb;
    vec3 N7 = texture2D(sTexture, uv + Circle(Start, 14.0, 7.0) * Scale).rgb;
    vec3 N8 = texture2D(sTexture, uv + Circle(Start, 14.0, 8.0) * Scale).rgb;
    vec3 N9 = texture2D(sTexture, uv + Circle(Start, 14.0, 9.0) * Scale).rgb;
    vec3 N10 = texture2D(sTexture, uv + Circle(Start, 14.0, 10.0) * Scale).rgb;
    vec3 N11 = texture2D(sTexture, uv + Circle(Start, 14.0, 11.0) * Scale).rgb;
    vec3 N12 = texture2D(sTexture, uv + Circle(Start, 14.0, 12.0) * Scale).rgb;
    vec3 N13 = texture2D(sTexture, uv + Circle(Start, 14.0, 13.0) * Scale).rgb;
    vec3 N14 = texture2D(sTexture, uv).rgb;

    float W = 1.0 / 15.0;

    vec3 color = vec3(0.0, 0.0, 0.0);

    color.rgb =
        (N0 * W) +
        (N1 * W) +
        (N2 * W) +
        (N3 * W) +
        (N4 * W) +
        (N5 * W) +
        (N6 * W) +
        (N7 * W) +
        (N8 * W) +
        (N9 * W) +
        (N10 * W) +
        (N11 * W) +
        (N12 * W) +
        (N13 * W) +
        (N14 * W);

    // Write the averaged color to the output.
    fragColor = vec4(color, 1.0);
}

void main() {
    mainImage(gl_FragColor, vTextureCoord);
}

Using randomized sampling to blur is actually a great choice, but some low-end mobile GPUs handle random-number tricks in shaders very poorly, so this shader is kept around only as a fallback.

But Gaussian blur and BoxBlur fall apart once the radius gets large, because they rely on pre-processing (summation); when blurring in OpenGL the radius is usually around 3, and pushing it much higher produces strange ghosting. GPUImage's Gaussian blur at a radius of 20, for example, ends up looking like this (which of course also depends on the texture size and the GPU):
A blur this bad is just painful to look at:
[screenshot: GPUImage Gaussian blur at radius 20]

So what can we do? There are two options:

  • In GPUImage's iOS project the Gaussian blur shader source is generated dynamically based on the radius; we can likewise handle large radii by writing the taps out in a loop (a sketch of this follows the list)
  • Blur in multiple passes: downscale the image first, blur it, then scale it back up
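For the first option, here is a rough sketch of the idea: the host code assembles a one-dimensional blur fragment shader whose number of taps depends on the radius, one pass per axis. This is not GPUImage's actual generator (the uniform name texelOffset and the sigma choice are my own assumptions), just the general shape of the technique; it reuses the gaussianWeights1D helper sketched earlier.

// Hypothetical sketch: build a 1D blur fragment shader with radius-dependent taps.
// texelOffset would be (1/width, 0) for the horizontal pass and (0, 1/height) for the vertical pass.
public static String buildBlurFragmentShader(int radius) {
    float[] w = gaussianWeights1D(radius, Math.max(1f, radius / 2f));
    StringBuilder sb = new StringBuilder();
    sb.append("precision mediump float;\n")
      .append("varying vec2 vTextureCoord;\n")
      .append("uniform sampler2D sTexture;\n")
      .append("uniform vec2 texelOffset;\n")
      .append("void main() {\n")
      .append("    vec3 sum = vec3(0.0);\n");
    for (int x = -radius; x <= radius; x++) {
        // One weighted sample per tap, written out explicitly in the shader source.
        sb.append(String.format(java.util.Locale.US,
                "    sum += texture2D(sTexture, vTextureCoord + texelOffset * %d.0).rgb * %.6f;\n",
                x, w[x + radius]));
    }
    sb.append("    gl_FragColor = vec4(sum, 1.0);\n")
      .append("}\n");
    return sb.toString();
}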

All right, enough talk — on to the code.

We'll go with the second option. The blur filter chain is composed like this:

private static final int BLUR_RADIUS = 6;
private static final float SCALING_FACTOR = 0.6f;
private ScalingFilter scalingFilter;

public BlurredFrameEffect(Context context) {
    super();
    addFilter(new FastBlurFilter(context).setScale(true));
    addFilter(new GaussianBlurFilter(context).setTexelHeightOffset(BLUR_RADIUS).setScale(true));
    addFilter(new GaussianBlurFilter(context).setTexelWidthOffset(BLUR_RADIUS).setScale(true));
    addFilter(new GaussianBlurFilter(context).setTexelHeightOffset(BLUR_RADIUS));
    addFilter(new GaussianBlurFilter(context).setTexelWidthOffset(BLUR_RADIUS));
    scalingFilter = new ScalingFilter(context).setScalingFactor(SCALING_FACTOR).setDrawOnTop(true);
}

First a FastBlur pass, then separate GaussianBlur passes in the Y and X directions; the first few passes are drawn at reduced size to cut down the amount of computation (admittedly this particular combination is just one I threw together).

The class extends FilterGroup. Besides letting us chain several filters, FilterGroup also lets us package a group of filters up as a single filter, so this frame effect is treated as one special filter that sits after all other processing, just before drawing to the screen.
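To give a sense of how the composite is driven, here is a minimal usage sketch, assuming a GL renderer that already owns the surface size and the texture to be processed (the surrounding renderer is hypothetical; the three methods called are the ones BlurredFrameEffect exposes):

// Hypothetical wiring inside a renderer.
BlurredFrameEffect blurredFrameEffect = new BlurredFrameEffect(context);

// When the GL surface is created: compile the programs of all contained filters.
blurredFrameEffect.init();

// When the surface size is known: propagate it to every filter in the group.
blurredFrameEffect.onFilterChanged(surfaceWidth, surfaceHeight);

// Every frame: blur the incoming texture, then draw the scaled-down frame on top.
blurredFrameEffect.onDrawFrame(cameraTextureId);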

Here's what it looks like:
[screenshot: blurred background]

Not bad — personally I think it already looks a bit nicer than FaceU's.
The next step is to draw the real camera frame on top, scaled down. I use a scaling factor of 0.6 (FaceU looks like roughly 0.5), i.e. every vertex coordinate is scaled to 60% of its original value, like this:

public Plain scale(float scalingFactor) {
    float[] temp = new float[TRIANGLES_DATA.length];
    System.arraycopy(TRIANGLES_DATA, 0, temp, 0, TRIANGLES_DATA.length);
    for (int i = 0; i < temp.length; i++) {
        temp[i] *= scalingFactor;
    }
    mVerticesBuffer = BufferUtils.getFloatBuffer(temp, 0);
    return this;
}
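For example, with a factor of 0.6 the full-screen quad shrinks toward the center of normalized device coordinates (this is just the arithmetic spelled out, not additional project code):

// TRIANGLES_DATA (full-screen quad in NDC):
//   {-1, -1, 0,   1, -1, 0,   -1, 1, 0,   1, 1, 0}
// after scale(0.6f) becomes:
//   {-0.6, -0.6, 0,   0.6, -0.6, 0,   -0.6, 0.6, 0,   0.6, 0.6, 0}
// i.e. a centered quad covering 60% of the viewport in each axis,
// leaving the blurred background visible around it as a frame.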

Let's see the result:
[screenshot: frame / small-face mode]

And that's it — we've basically reproduced the cameras' frame mode. If the blur feels too weak or too strong, you can adjust the blur radius or the blur pipeline.
With five blur passes at a 1280x720 preview resolution (the screenshots come from a phone with a 1280x800 screen, hence the black bars), it runs perfectly smoothly, with no stutter at all, on a lowly phone with a dual-core MTK CPU and a Mali-400 MP GPU. This is certainly not the optimal approach, and I'll revise it later.

Extensions

If you replace the camera with a video, you can probably already see how to do it (if you have no OpenGL background, check out my series on building a panoramic video player).
We can also do what NetEase Cloud Music does: instead of blurring the whole image, crop out a small piece, fit it to the right aspect ratio (or process it further), and blur that as the background (a small sketch follows the screenshot below).
And since everything is built on a filter group, we can stack other filters first and then add the frame; tuned well, this can get close to the Snow camera's look. Here's an (admittedly ugly) sample:
[screenshot: frame effect combined with another filter]
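For the crop-and-blur idea above, a minimal, hypothetical sketch: sample only a centered sub-rectangle of the source by swapping in shrunken texture coordinates before the blur passes. The helper below is illustrative and not part of the project; it only relies on BufferUtils.getFloatBuffer and on Plain accepting custom coordinates via setTexCoordinateBuffer.

// Hypothetical helper: texture coordinates for a centered crop covering
// `fraction` of the source in each axis (0 < fraction <= 1), in the same
// vertex order as the full-screen quad.
public static java.nio.FloatBuffer centeredCropTexCoords(float fraction) {
    float lo = 0.5f - fraction / 2f;
    float hi = 0.5f + fraction / 2f;
    float[] coords = {
            lo, lo,
            hi, lo,
            lo, hi,
            hi, hi
    };
    return BufferUtils.getFloatBuffer(coords, 0);
}

Feeding something like centeredCropTexCoords(0.4f) to the quad of the first blur filter (via setTexCoordinateBuffer) would, in principle, blur only the central 40% of the frame for use as the background.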

Complete code for the relevant classes

The explanation above may have been a bit terse (I was assuming everyone already knows OpenGL — sorry…).

Since every class here is wrapped in several layers (the project is still in the "pile on features" stage, so the class structure may well change later), I'll just paste all the relevant code.

BlurredFrameEffect

package com.martin.ads.omoshiroilib.filter.ext;

import android.content.Context;

import com.martin.ads.omoshiroilib.filter.base.FilterGroup;
import com.martin.ads.omoshiroilib.filter.ext.shadertoy.FastBlurFilter;
import com.martin.ads.omoshiroilib.filter.imgproc.GaussianBlurFilter;

/**
* Created by Ads on 2017/2/16.
*/


public class BlurredFrameEffect extends FilterGroup {

    private static final int BLUR_RADIUS = 6;
    private static final float SCALING_FACTOR = 0.6f;
    private ScalingFilter scalingFilter;

    public BlurredFrameEffect(Context context) {
        super();
        // Blur the frame: first at reduced resolution, then refine at full resolution.
        addFilter(new FastBlurFilter(context).setScale(true));
        addFilter(new GaussianBlurFilter(context).setTexelHeightOffset(BLUR_RADIUS).setScale(true));
        addFilter(new GaussianBlurFilter(context).setTexelWidthOffset(BLUR_RADIUS).setScale(true));
        addFilter(new GaussianBlurFilter(context).setTexelHeightOffset(BLUR_RADIUS));
        addFilter(new GaussianBlurFilter(context).setTexelWidthOffset(BLUR_RADIUS));
        // Draws the incoming (un-blurred) texture, scaled down, on top of the blurred background.
        scalingFilter = new ScalingFilter(context).setScalingFactor(SCALING_FACTOR).setDrawOnTop(true);
    }

    @Override
    public void onDrawFrame(int textureId) {
        super.onDrawFrame(textureId);
        scalingFilter.onDrawFrame(textureId);
    }

    @Override
    public void init() {
        super.init();
        scalingFilter.init();
    }

    @Override
    public void onFilterChanged(int surfaceWidth, int surfaceHeight) {
        super.onFilterChanged(surfaceWidth, surfaceHeight);
        scalingFilter.onFilterChanged(surfaceWidth, surfaceHeight);
    }

    @Override
    public void destroy() {
        super.destroy();
        scalingFilter.destroy();
    }
}

FastBlurFilter

package com.martin.ads.omoshiroilib.filter.ext.shadertoy;

import android.content.Context;

/**
* Created by Ads on 2017/2/16.
* Similar to BoxBlur
*/


public class FastBlurFilter extends ShaderToyAbsFilter {

    private boolean scale;

    public FastBlurFilter(Context context) {
        super(context, "filter/fsh/fast_blur.glsl");
    }

    @Override
    public void onFilterChanged(int surfaceWidth, int surfaceHeight) {
        // When scale is enabled, render at quarter resolution to reduce GPU cost.
        if (!scale)
            super.onFilterChanged(surfaceWidth, surfaceHeight);
        else super.onFilterChanged(surfaceWidth / 4, surfaceHeight / 4);
    }

    public FastBlurFilter setScale(boolean scale) {
        this.scale = scale;
        return this;
    }
}

GaussianBlurFilter

package com.martin.ads.omoshiroilib.filter.imgproc;

import android.content.Context;
import android.opengl.GLES20;

import com.martin.ads.omoshiroilib.filter.base.AbsFilter;
import com.martin.ads.omoshiroilib.glessential.object.Plain;
import com.martin.ads.omoshiroilib.glessential.program.GLSimpleProgram;
import com.martin.ads.omoshiroilib.util.TextureUtils;

/**
* Created by Ads on 2017/2/16.
*/


public class GaussianBlurFilter extends AbsFilter {
    protected GLSimpleProgram glSimpleProgram;

    // Blur offsets in texels along each axis; converted to normalized offsets at draw time.
    private float texelWidthOffset;
    private float texelHeightOffset;

    private boolean scale;

    public GaussianBlurFilter(Context context) {
        super("GaussianBlurFilter");
        glSimpleProgram = new GLSimpleProgram(context, "filter/vsh/gaussian_blur.glsl", "filter/fsh/gaussian_blur.glsl");
        texelWidthOffset = texelHeightOffset = 0;
        scale = false;
    }

    @Override
    public void init() {
        glSimpleProgram.create();
    }

    @Override
    public void onPreDrawElements() {
        super.onPreDrawElements();
        glSimpleProgram.use();
        plain.uploadTexCoordinateBuffer(glSimpleProgram.getTextureCoordinateHandle());
        plain.uploadVerticesBuffer(glSimpleProgram.getPositionHandle());
    }

    @Override
    public void destroy() {
        glSimpleProgram.onDestroy();
    }

    @Override
    public void onDrawFrame(int textureId) {
        onPreDrawElements();
        setUniform1f(glSimpleProgram.getProgramId(), "texelWidthOffset", texelWidthOffset / surfaceWidth);
        setUniform1f(glSimpleProgram.getProgramId(), "texelHeightOffset", texelHeightOffset / surfaceHeight);

        TextureUtils.bindTexture2D(textureId, GLES20.GL_TEXTURE0, glSimpleProgram.getTextureSamplerHandle(), 0);
        GLES20.glViewport(0, 0, surfaceWidth, surfaceHeight);
        plain.draw();
    }

    public GaussianBlurFilter setTexelHeightOffset(float texelHeightOffset) {
        this.texelHeightOffset = texelHeightOffset;
        return this;
    }

    public GaussianBlurFilter setTexelWidthOffset(float texelWidthOffset) {
        this.texelWidthOffset = texelWidthOffset;
        return this;
    }

    @Override
    public void onFilterChanged(int surfaceWidth, int surfaceHeight) {
        // When scale is enabled, render at quarter resolution to reduce GPU cost.
        if (!scale)
            super.onFilterChanged(surfaceWidth, surfaceHeight);
        else super.onFilterChanged(surfaceWidth / 4, surfaceHeight / 4);
    }

    public GaussianBlurFilter setScale(boolean scale) {
        this.scale = scale;
        return this;
    }
}

ScalingFilter

package com.martin.ads.omoshiroilib.filter.ext;

import android.content.Context;

import com.martin.ads.omoshiroilib.filter.base.PassThroughFilter;

/**
* Created by Ads on 2017/2/16.
*/


public class ScalingFilter extends PassThroughFilter {
    private boolean drawOnTop;

    public ScalingFilter(Context context) {
        super(context);
        drawOnTop = false;
    }

    @Override
    public void onPreDrawElements() {
        // When drawing on top of an already-rendered frame, skip the parent's
        // pre-draw step so the blurred background is preserved.
        if (!drawOnTop) super.onPreDrawElements();
    }

    public ScalingFilter setScalingFactor(float scalingFactor) {
        plain.scale(scalingFactor);
        return this;
    }

    public ScalingFilter setDrawOnTop(boolean drawOnTop) {
        this.drawOnTop = drawOnTop;
        return this;
    }
}

Plain

package com.martin.ads.omoshiroilib.glessential.object;

import android.opengl.GLES20;

import com.martin.ads.omoshiroilib.constant.Rotation;
import com.martin.ads.omoshiroilib.util.BufferUtils;
import com.martin.ads.omoshiroilib.util.PlainTextureRotationUtils;
import com.martin.ads.omoshiroilib.util.ShaderUtils;

import java.nio.FloatBuffer;

/**
* Created by Ads on 2016/11/19.
* This class is assumed to only render in FilterGroup
* if you want to render it alone, set isInGroup false
*/


public class Plain {
    private FloatBuffer mVerticesBuffer;
    private FloatBuffer mTexCoordinateBuffer;
    // Full-screen quad in normalized device coordinates, drawn as a triangle strip.
    private final float TRIANGLES_DATA[] = {
            -1.0f, -1.0f, 0f,
            1.0f, -1.0f, 0f,
            -1.0f, 1.0f, 0f,
            1.0f, 1.0f, 0f
    };

    public Plain(boolean isInGroup) {
        mVerticesBuffer = BufferUtils.getFloatBuffer(TRIANGLES_DATA, 0);
        if (isInGroup)
            mTexCoordinateBuffer = BufferUtils.getFloatBuffer(PlainTextureRotationUtils.getRotation(Rotation.NORMAL, false, true), 0);
        else mTexCoordinateBuffer = BufferUtils.getFloatBuffer(PlainTextureRotationUtils.TEXTURE_NO_ROTATION, 0);
    }

    public void uploadVerticesBuffer(int positionHandle) {
        FloatBuffer vertexBuffer = getVerticesBuffer();
        if (vertexBuffer == null) return;
        vertexBuffer.position(0);

        GLES20.glVertexAttribPointer(positionHandle, 3, GLES20.GL_FLOAT, false, 0, vertexBuffer);
        ShaderUtils.checkGlError("glVertexAttribPointer maPosition");
        GLES20.glEnableVertexAttribArray(positionHandle);
        ShaderUtils.checkGlError("glEnableVertexAttribArray maPositionHandle");
    }

    public void uploadTexCoordinateBuffer(int textureCoordinateHandle) {
        FloatBuffer textureBuffer = getTexCoordinateBuffer();
        if (textureBuffer == null) return;
        textureBuffer.position(0);

        GLES20.glVertexAttribPointer(textureCoordinateHandle, 2, GLES20.GL_FLOAT, false, 0, textureBuffer);
        ShaderUtils.checkGlError("glVertexAttribPointer maTextureHandle");
        GLES20.glEnableVertexAttribArray(textureCoordinateHandle);
        ShaderUtils.checkGlError("glEnableVertexAttribArray maTextureHandle");
    }

    public FloatBuffer getVerticesBuffer() {
        return mVerticesBuffer;
    }

    public FloatBuffer getTexCoordinateBuffer() {
        return mTexCoordinateBuffer;
    }

    //only used to flip texture
    public void setTexCoordinateBuffer(FloatBuffer mTexCoordinateBuffer) {
        this.mTexCoordinateBuffer = mTexCoordinateBuffer;
    }

    public void setVerticesBuffer(FloatBuffer mVerticesBuffer) {
        this.mVerticesBuffer = mVerticesBuffer;
    }

    public void resetTextureCoordinateBuffer(boolean isInGroup) {
        mTexCoordinateBuffer = null;
        if (isInGroup)
            mTexCoordinateBuffer = BufferUtils.getFloatBuffer(PlainTextureRotationUtils.getRotation(Rotation.NORMAL, false, true), 0);
        else mTexCoordinateBuffer = BufferUtils.getFloatBuffer(PlainTextureRotationUtils.TEXTURE_NO_ROTATION, 0);
    }

    public void draw() {
        GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
    }

    // Shrinks the quad toward the center of NDC by multiplying every vertex coordinate.
    public Plain scale(float scalingFactor) {
        float[] temp = new float[TRIANGLES_DATA.length];
        System.arraycopy(TRIANGLES_DATA, 0, temp, 0, TRIANGLES_DATA.length);
        for (int i = 0; i < temp.length; i++) {
            temp[i] *= scalingFactor;
        }
        mVerticesBuffer = BufferUtils.getFloatBuffer(temp, 0);
        return this;
    }
}

GitHub project repository

Back to the table of contents