
Android video playback with filters: rendering video with OpenGL ES + MediaPlayer, plus a filter effect

Published: 2024/10/8

I have previously written about playing video with SurfaceView and with TextureView + MediaPlayer, and about decoding AVI with ffmpeg and rendering it in a SurfaceView. Today's post covers playing video with OpenGL ES + MediaPlayer. I once spent close to a year on a camera development team, but I wasn't blogging back then, so I didn't leave behind much to share.

Here is the effect:

(screenshot: the video rendered with a grayscale filter)

The black-and-white (grayscale) effect is implemented in an OpenGL shader, using the CRT luminance model with weights 0.299, 0.587, 0.114.
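The shader's grayscale conversion is just a weighted sum of the RGB channels. Here is a quick plain-Java sanity check of the model (the class and method names are mine, for illustration only):

```java
public class GrayscaleDemo {
    // CRT luminance model used later in the fragment shader:
    // Y = 0.299*R + 0.587*G + 0.114*B, with channels normalized to [0,1] as in GLSL.
    static float luminance(float r, float g, float b) {
        return 0.299f * r + 0.587f * g + 0.114f * b;
    }

    public static void main(String[] args) {
        // The weights sum to 1, so pure white stays white; pure green maps to 0.587 gray.
        System.out.println(Math.round(luminance(1f, 1f, 1f) * 1000) / 1000.0); // 1.0
        System.out.println(Math.round(luminance(0f, 1f, 0f) * 1000) / 1000.0); // 0.587
    }
}
```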

Now for the implementation.

If you have ever used OpenGL to draw a textured image, this will be easy to follow. Unlike a static image, video must be refreshed continuously: every time a new frame arrives, we update the texture and redraw. Playing video with OpenGL is essentially drawing the video onto the screen as a texture.

If you are new to OpenGL, start here: the graphics fundamentals you need for OpenGL.

1. Write the vertex and fragment shaders first (this is my habit; you can also write them later as needed).

Vertex shader:

attribute vec4 aPosition; // vertex position
attribute vec4 aTexCoord; // S/T texture coordinates
varying vec2 vTexCoord;
uniform mat4 uMatrix;
uniform mat4 uSTMatrix;

void main() {
    vTexCoord = (uSTMatrix * aTexCoord).xy;
    gl_Position = uMatrix * aPosition;
}

Fragment shader:

#extension GL_OES_EGL_image_external : require
precision mediump float;
varying vec2 vTexCoord;
uniform samplerExternalOES sTexture;

void main() {
    gl_FragColor = texture2D(sTexture, vTexCoord);
}

samplerExternalOES replaces the sampler2D you would use for an ordinary image texture; it works together with SurfaceTexture to update the texture and convert the pixel format.

2. MediaPlayer output

Initialize MediaPlayer in the GLVideoRenderer constructor:

mediaPlayer = new MediaPlayer();
try {
    mediaPlayer.setDataSource(context, Uri.parse(videoPath));
} catch (IOException e) {
    e.printStackTrace();
}
mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
mediaPlayer.setLooping(true);
mediaPlayer.setOnVideoSizeChangedListener(this);

In onSurfaceCreated, use a SurfaceTexture to set up MediaPlayer's output: create a Surface from the SurfaceTexture and hand it to MediaPlayer as its output surface.

SurfaceTexture's job is to pull new frames from a video or camera stream; you fetch the latest frame by calling updateTexImage.

Note that MediaPlayer's output is usually not RGB (it is typically YUV), while GLSurfaceView needs RGB to display correctly.

So first change the texture-creation code in onSurfaceCreated to this:

textureId = textures[0];
GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureId);
ShaderUtils.checkGlError("glBindTexture mTextureID");
GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MIN_FILTER,
        GLES20.GL_NEAREST);
GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MAG_FILTER,
        GLES20.GL_LINEAR);

What is GLES11Ext.GL_TEXTURE_EXTERNAL_OES for? As mentioned above, the decoder's output is YUV (typically a YUV420 variant). This texture extension performs the YUV-to-RGB conversion automatically, so we don't have to write that conversion ourselves.

Then append the following at the end of onSurfaceCreated:

surfaceTexture = new SurfaceTexture(textureId);
surfaceTexture.setOnFrameAvailableListener(this); // listen for new frames
Surface surface = new Surface(surfaceTexture);
mediaPlayer.setSurface(surface);
surface.release();

if (!playerPrepared) {
    try {
        mediaPlayer.prepare();
        playerPrepared = true;
        mediaPlayer.start(); // only start once prepare() has succeeded
    } catch (IOException t) {
        Log.e(TAG, "media player prepare failed");
    }
}


In onDrawFrame:

synchronized (this) {
    if (updateSurface) {
        surfaceTexture.updateTexImage();              // fetch the new frame
        surfaceTexture.getTransformMatrix(mSTMatrix); // align texture coordinates
        updateSurface = false;
    }
}

When new data is available, updateTexImage updates the texture. The purpose of getTransformMatrix is to make the new texture line up correctly with the texture coordinate system; mSTMatrix is declared exactly like projectionMatrix (a float[16]).
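The handshake between onFrameAvailable (producer) and onDrawFrame (consumer) can be sketched without any GL at all. This is a toy illustration; FrameFlagDemo and its method names are mine, not part of the renderer:

```java
public class FrameFlagDemo {
    private boolean updateSurface = false;

    // Called from the producer side, like onFrameAvailable: just raise the flag.
    synchronized void onFrameAvailable() { updateSurface = true; }

    // Called from the render side, like the synchronized block in onDrawFrame:
    // consume the flag under the same lock. Returns true if a frame was pending.
    synchronized boolean drainFrame() {
        if (updateSurface) {
            updateSurface = false; // here the real code would call updateTexImage()
            return true;
        }
        return false;
    }

    public static void main(String[] args) throws InterruptedException {
        FrameFlagDemo demo = new FrameFlagDemo();
        // Several notifications before a draw collapse into a single texture update.
        Thread producer = new Thread(() -> {
            for (int i = 0; i < 5; i++) demo.onFrameAvailable();
        });
        producer.start();
        producer.join();
        System.out.println(demo.drainFrame()); // true: one pending notification
        System.out.println(demo.drainFrame()); // false: already consumed
    }
}
```

The lock matters because onFrameAvailable runs on a different thread than the GL render thread; without it, a set could race with the test-and-clear.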

private final float[] vertexData = {
        1f, -1f, 0f,
        -1f, -1f, 0f,
        1f, 1f, 0f,
        -1f, 1f, 0f
};

private final float[] textureVertexData = {
        1f, 0f,
        0f, 0f,
        1f, 1f,
        0f, 1f
};

vertexData holds the viewport (clip-space) coordinates of the quad we draw; textureVertexData holds the video texture coordinates that map onto it.

Next, bind these to the shaders. Look up the attribute and uniform locations in onSurfaceCreated:

aPositionLocation = GLES20.glGetAttribLocation(programId, "aPosition");
uMatrixLocation = GLES20.glGetUniformLocation(programId, "uMatrix");
uSTMMatrixHandle = GLES20.glGetUniformLocation(programId, "uSTMatrix");
uTextureSamplerLocation = GLES20.glGetUniformLocation(programId, "sTexture");
aTextureCoordLocation = GLES20.glGetAttribLocation(programId, "aTexCoord");

Then feed the data in onDrawFrame:

GLES20.glUseProgram(programId);
GLES20.glUniformMatrix4fv(uMatrixLocation, 1, false, projectionMatrix, 0);
GLES20.glUniformMatrix4fv(uSTMMatrixHandle, 1, false, mSTMatrix, 0);

vertexBuffer.position(0);
GLES20.glEnableVertexAttribArray(aPositionLocation);
GLES20.glVertexAttribPointer(aPositionLocation, 3, GLES20.GL_FLOAT, false,
        12, vertexBuffer);

textureVertexBuffer.position(0);
GLES20.glEnableVertexAttribArray(aTextureCoordLocation);
GLES20.glVertexAttribPointer(aTextureCoordLocation, 2, GLES20.GL_FLOAT, false,
        8, textureVertexBuffer);

GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureId);
GLES20.glUniform1i(uTextureSamplerLocation, 0);
GLES20.glViewport(0, 0, screenWidth, screenHeight);
GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
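The vertexBuffer and textureVertexBuffer passed to glVertexAttribPointer above must be direct buffers in native byte order, which is what the renderer's constructor sets up. A minimal sketch of that packing, in plain Java with no Android dependencies (BufferDemo is my name for it):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

public class BufferDemo {
    // Pack a float[] into a direct, native-order FloatBuffer, as the renderer's
    // constructor does before handing the data to glVertexAttribPointer.
    static FloatBuffer toDirectBuffer(float[] data) {
        FloatBuffer buf = ByteBuffer.allocateDirect(data.length * 4) // 4 bytes per float
                .order(ByteOrder.nativeOrder())                      // GL expects native byte order
                .asFloatBuffer()
                .put(data);
        buf.position(0); // rewind so GL reads from the start
        return buf;
    }

    public static void main(String[] args) {
        float[] vertexData = {1f, -1f, 0f, -1f, -1f, 0f, 1f, 1f, 0f, -1f, 1f, 0f};
        FloatBuffer buf = toDirectBuffer(vertexData);
        System.out.println(buf.isDirect());  // true
        System.out.println(buf.remaining()); // 12
    }
}
```

Forgetting the position(0) rewind is a classic bug: put() leaves the position at the end, and GL would then read from past the data.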

The complete GLVideoRenderer code:

package com.ws.openglvideoplayer;

import android.content.Context;
import android.graphics.SurfaceTexture;
import android.media.AudioManager;
import android.media.MediaPlayer;
import android.net.Uri;
import android.opengl.GLES11Ext;
import android.opengl.GLES20;
import android.opengl.GLSurfaceView;
import android.opengl.Matrix;
import android.util.Log;
import android.view.Surface;

import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;

/**
 * Created by Shuo.Wang on 2017/3/19.
 */
public class GLVideoRenderer implements GLSurfaceView.Renderer,
        SurfaceTexture.OnFrameAvailableListener, MediaPlayer.OnVideoSizeChangedListener {

    private static final String TAG = "GLRenderer";
    private Context context;
    private int aPositionLocation;
    private int programId;
    private FloatBuffer vertexBuffer;
    private final float[] vertexData = {
            1f, -1f, 0f,
            -1f, -1f, 0f,
            1f, 1f, 0f,
            -1f, 1f, 0f
    };

    private final float[] projectionMatrix = new float[16];
    private int uMatrixLocation;

    private final float[] textureVertexData = {
            1f, 0f,
            0f, 0f,
            1f, 1f,
            0f, 1f
    };
    private FloatBuffer textureVertexBuffer;
    private int uTextureSamplerLocation;
    private int aTextureCoordLocation;
    private int textureId;

    private SurfaceTexture surfaceTexture;
    private MediaPlayer mediaPlayer;
    private float[] mSTMatrix = new float[16];
    private int uSTMMatrixHandle;

    private boolean updateSurface;
    private boolean playerPrepared;
    private int screenWidth, screenHeight;

    public GLVideoRenderer(Context context, String videoPath) {
        this.context = context;
        playerPrepared = false;
        synchronized (this) {
            updateSurface = false;
        }
        vertexBuffer = ByteBuffer.allocateDirect(vertexData.length * 4)
                .order(ByteOrder.nativeOrder())
                .asFloatBuffer()
                .put(vertexData);
        vertexBuffer.position(0);

        textureVertexBuffer = ByteBuffer.allocateDirect(textureVertexData.length * 4)
                .order(ByteOrder.nativeOrder())
                .asFloatBuffer()
                .put(textureVertexData);
        textureVertexBuffer.position(0);

        mediaPlayer = new MediaPlayer();
        try {
            mediaPlayer.setDataSource(context, Uri.parse(videoPath));
        } catch (IOException e) {
            e.printStackTrace();
        }
        mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
        mediaPlayer.setLooping(true);
        mediaPlayer.setOnVideoSizeChangedListener(this);
    }

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        String vertexShader = ShaderUtils.readRawTextFile(context, R.raw.simple_vertex_shader);
        String fragmentShader = ShaderUtils.readRawTextFile(context, R.raw.simple_fragment_shader);
        programId = ShaderUtils.createProgram(vertexShader, fragmentShader);
        aPositionLocation = GLES20.glGetAttribLocation(programId, "aPosition");
        uMatrixLocation = GLES20.glGetUniformLocation(programId, "uMatrix");
        uSTMMatrixHandle = GLES20.glGetUniformLocation(programId, "uSTMatrix");
        uTextureSamplerLocation = GLES20.glGetUniformLocation(programId, "sTexture");
        aTextureCoordLocation = GLES20.glGetAttribLocation(programId, "aTexCoord");

        int[] textures = new int[1];
        GLES20.glGenTextures(1, textures, 0);
        textureId = textures[0];
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureId);
        ShaderUtils.checkGlError("glBindTexture mTextureID");
        // GL_TEXTURE_EXTERNAL_OES: the decoder outputs YUV (typically YUV420),
        // and this extension converts it to RGB for us automatically.
        GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MIN_FILTER,
                GLES20.GL_NEAREST);
        GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MAG_FILTER,
                GLES20.GL_LINEAR);

        surfaceTexture = new SurfaceTexture(textureId);
        surfaceTexture.setOnFrameAvailableListener(this); // listen for new frames
        Surface surface = new Surface(surfaceTexture);
        mediaPlayer.setSurface(surface);
        surface.release();

        if (!playerPrepared) {
            try {
                mediaPlayer.prepare();
                playerPrepared = true;
                mediaPlayer.start(); // only start once prepare() has succeeded
            } catch (IOException t) {
                Log.e(TAG, "media player prepare failed");
            }
        }
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        Log.d(TAG, "onSurfaceChanged: " + width + " " + height);
        screenWidth = width;
        screenHeight = height;
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        GLES20.glClear(GLES20.GL_DEPTH_BUFFER_BIT | GLES20.GL_COLOR_BUFFER_BIT);
        synchronized (this) {
            if (updateSurface) {
                surfaceTexture.updateTexImage();              // fetch the new frame
                surfaceTexture.getTransformMatrix(mSTMatrix); // align texture coordinates
                updateSurface = false;
            }
        }
        GLES20.glUseProgram(programId);
        GLES20.glUniformMatrix4fv(uMatrixLocation, 1, false, projectionMatrix, 0);
        GLES20.glUniformMatrix4fv(uSTMMatrixHandle, 1, false, mSTMatrix, 0);

        vertexBuffer.position(0);
        GLES20.glEnableVertexAttribArray(aPositionLocation);
        GLES20.glVertexAttribPointer(aPositionLocation, 3, GLES20.GL_FLOAT, false,
                12, vertexBuffer);

        textureVertexBuffer.position(0);
        GLES20.glEnableVertexAttribArray(aTextureCoordLocation);
        GLES20.glVertexAttribPointer(aTextureCoordLocation, 2, GLES20.GL_FLOAT, false,
                8, textureVertexBuffer);

        GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureId);
        GLES20.glUniform1i(uTextureSamplerLocation, 0);
        GLES20.glViewport(0, 0, screenWidth, screenHeight);
        GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
    }

    @Override
    synchronized public void onFrameAvailable(SurfaceTexture surface) {
        updateSurface = true;
    }

    @Override
    public void onVideoSizeChanged(MediaPlayer mp, int width, int height) {
        Log.d(TAG, "onVideoSizeChanged: " + width + " " + height);
        updateProjection(width, height);
    }

    private void updateProjection(int videoWidth, int videoHeight) {
        float screenRatio = (float) screenWidth / screenHeight;
        float videoRatio = (float) videoWidth / videoHeight;
        if (videoRatio > screenRatio) {
            Matrix.orthoM(projectionMatrix, 0, -1f, 1f,
                    -videoRatio / screenRatio, videoRatio / screenRatio, -1f, 1f);
        } else {
            Matrix.orthoM(projectionMatrix, 0, -screenRatio / videoRatio,
                    screenRatio / videoRatio, -1f, 1f, -1f, 1f);
        }
    }

    public MediaPlayer getMediaPlayer() {
        return mediaPlayer;
    }
}
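The aspect-ratio logic in updateProjection can be checked in isolation. This plain-Java sketch (ProjectionDemo is my name; no Android required) computes the same orthographic bounds that the renderer passes to Matrix.orthoM, letterboxing or pillarboxing the video so it is never stretched:

```java
public class ProjectionDemo {
    // Mirrors updateProjection above: returns {left, right, bottom, top} for
    // an orthographic projection that preserves the video's aspect ratio.
    static float[] orthoBounds(int screenW, int screenH, int videoW, int videoH) {
        float screenRatio = (float) screenW / screenH;
        float videoRatio = (float) videoW / videoH;
        if (videoRatio > screenRatio) {
            // Video is wider than the screen: enlarge the vertical extent,
            // so the full-screen quad shrinks vertically (letterbox).
            float r = videoRatio / screenRatio;
            return new float[]{-1f, 1f, -r, r};
        } else {
            // Video is taller: enlarge the horizontal extent (pillarbox).
            float r = screenRatio / videoRatio;
            return new float[]{-r, r, -1f, 1f};
        }
    }

    public static void main(String[] args) {
        // 16:9 video on a 9:16 portrait screen: wide video, pad top and bottom.
        float[] b = orthoBounds(1080, 1920, 1920, 1080);
        System.out.println(java.util.Arrays.toString(b));
    }
}
```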

To get the filtered video shown in the screenshot, just apply the 0.299/0.587/0.114 CRT grayscale model. (You can find many more effects online; this is only a starting point.)

Changing the fragment shader is all it takes:

#extension GL_OES_EGL_image_external : require
precision mediump float;
varying vec2 vTexCoord;
uniform samplerExternalOES sTexture;

void main() {
    // gl_FragColor = texture2D(sTexture, vTexCoord);
    vec3 centralColor = texture2D(sTexture, vTexCoord).rgb;
    float gray = 0.299 * centralColor.r + 0.587 * centralColor.g + 0.114 * centralColor.b;
    gl_FragColor = vec4(vec3(gray), 1.0); // keep alpha opaque; vec4(gray) would also set alpha to the luminance
}

(screenshot: the grayscale shader applied during playback)

That's it: we have implemented video playback with OpenGL ES + MediaPlayer, plus a filter effect. A later post will cover the principles and implementation of panoramic (360-degree) video. Stay tuned!
