
Playing video with TextureView + SurfaceTexture + OpenGL ES, and live-streaming tech notes

April 5, 2019

The steps to play a video in Unity are as follows:

Unity 5.6 introduced the VideoPlayer component, which makes video playback fairly straightforward. I did some quick research on it for a project and ran into plenty of pitfalls; a quick Google/Baidu search confirms these issues are common. Some of the simpler ones:

Before getting to the code, let me explain what TextureView, SurfaceTexture, and OpenGL ES actually are, and how I use them to display a video.

Preface: I recently worked on a preliminary research project for live streaming, so I'm writing down how the streaming tech was implemented and how some of the problems were solved along the way, illustrated with the Android implementation.

1. Drag the video you want to play into the Project panel. (Note: the video formats Unity generally supports are .mov, .mpg, .mpeg, .mp4, .avi, and .asf.)

1) Playback has no sound

TextureView, as the name suggests, is just a View control that extends View. The official docs describe it like this:
A TextureView can be used to display a content stream. Such a content
stream can for instance be a video or an OpenGL scene. The content
stream can come from the application's process as well as a remote
process.

That is, it can display a content stream, such as a video stream or an OpenGL-rendered scene. The stream can come from the local application process or from a remote process, which sounds a bit convoluted; my understanding is that it can be, for example, either a local video stream or a network video stream.
Note that TextureView renders with hardware acceleration. It is like hardware versus software video decoding: one relies on the GPU, the other on the CPU.
So how do we use this TextureView? OK, this is where SurfaceTexture comes in. From the naming alone you can tell that TextureView is primarily a View, while SurfaceTexture is primarily a Texture. Its official description:
Captures frames from an image stream as an OpenGL ES texture. The image
stream may come from either camera preview or video decode.

In other words, it captures a frame of an image stream as an OpenGL texture. The image stream mainly comes from the camera preview or from video decoding. (I suspect this capability could be put to a lot of uses.)
At this point we have the texture, so OpenGL ES can get to work: it binds the texture and draws it frame by frame onto the TextureView, and that produces the video images we see. (For details on SurfaceTexture and TextureView, see the references linked here.)
That's enough talk; time for some code. Good code should read as pleasantly as literature. Not that my code is that beautiful, but it's something to strive for...

Project structure

2. Add a RawImage to the scene. (Image renders with a Sprite, whereas RawImage renders with a Texture.)

2) Controlling playback progress with a Slider

Code

Let's start with the MainActivity class:

// Imports added for completeness (support-library era, per the article's date)
import android.graphics.SurfaceTexture;
import android.media.MediaPlayer;
import android.os.Bundle;
import android.os.Environment;
import android.support.v7.app.AppCompatActivity;
import android.view.Surface;
import android.view.TextureView;

import java.io.IOException;

public class MainActivity extends AppCompatActivity implements TextureView.SurfaceTextureListener,
        MediaPlayer.OnPreparedListener{
    /** Path to the local video */
    public String videoPath = Environment.getExternalStorageDirectory().getPath()+"/aoa.mkv";
    private TextureView textureView;
    private MediaPlayer mediaPlayer;
    /**
    * Configuration before drawing happens in this object's class.
    * The actual drawing work is done in its subclass VideoTextureSurfaceRenderer.
    */
    private TextureSurfaceRenderer videoRenderer;
    private int surfaceWidth;
    private int surfaceHeight;
    private Surface surface;


    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        textureView = (TextureView) findViewById(R.id.id_textureview);
        // Register a SurfaceTextureListener to watch for the SurfaceTexture
        textureView.setSurfaceTextureListener(this);

    }
    /**
    * Entry point for video playback; called when the SurfaceTexture becomes available.
    */
    private void playVideo() {
        if (mediaPlayer == null) {
            videoRenderer = new VideoTextureSurfaceRenderer(this, textureView.getSurfaceTexture(), surfaceWidth, surfaceHeight);
            surface = new Surface(videoRenderer.getSurfaceTexture());
            initMediaPlayer();
        }
    }

    private void initMediaPlayer() {
        this.mediaPlayer = new MediaPlayer();
        try {
            mediaPlayer.setDataSource(videoPath);
            mediaPlayer.setSurface(surface);
            // Register the listener before preparing so onPrepared can't be missed
            mediaPlayer.setOnPreparedListener(this);
            mediaPlayer.setLooping(true);
            mediaPlayer.prepareAsync();
        } catch (IllegalArgumentException | SecurityException
                | IllegalStateException | IOException e) {
            e.printStackTrace();
        }
    }
    @Override
    public void onPrepared(MediaPlayer mp) {
        try {
            if (mp != null) {
                mp.start(); // playback actually starts here
            }
        } catch (IllegalStateException e) {
            e.printStackTrace();
        }
    }


    @Override
    protected void onResume() {
        super.onResume();
        if (textureView.isAvailable()) {
            playVideo();
        }
    }

    @Override
    protected void onPause() {
        super.onPause();
        if (videoRenderer != null) {
            videoRenderer.onPause();  // remember to stop the video drawing thread
        }
        if (mediaPlayer != null) {
            mediaPlayer.release();
            mediaPlayer =null;
        }
    }

    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
        surfaceWidth = width;
        surfaceHeight = height;
        playVideo();
    }

    @Override
    public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) {

    }

    @Override
    public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
        return false;
    }

    @Override
    public void onSurfaceTextureUpdated(SurfaceTexture surface) {

    }

}

That's the entry class of the program. I won't go into how MediaPlayer actually plays the video source here; there is quite a lot to it, and you can dig into it yourself. One thing worth pointing out: MediaPlayer is usually handed a SurfaceView's SurfaceHolder (via setDisplay()), whereas here I pass a Surface directly to setSurface() (for more on Surface, see the reference). That is where this video player differs from the usual ones. I'll stop this post here; the core drawing work will be written up later when I have time. If there are problems with the above, I hope you'll point them out. Many thanks!
The next post is already written: TextureView + SurfaceTexture + OpenGL ES video playback (part 2).

  • Unity texture plugin and video capture (video source)
    VideoSourceCamera
  • Microphone capture (audio source)
    AudioSourceMIC
  • Video encoding
    VideoEncoder
  • Audio encoding
    AudioEncoder
  • FLV muxing
    MuxerFLV
  • HTTP stream publishing (upload)
    PublisherHttp
  • Stream playback
    play
  • OpenGL graphics and image processing

3. Add a VideoPlayer component under the RawImage, then assign the video to the VideoPlayer by dragging it onto the Video Clip slot.

3) Video screenshots (Texture -> Texture2D)

Starting from this post I'll describe the implementation details of these components and how the dependencies between them are handled.

4. Create a script PlayVodeoOnUGUI. The core code is one line: rawImage.texture = videoPlayer.texture. Assign the video's texture to the RawImage and you can see the video playing.

4) Firing an event when the video ends

(1) The Unity texture plugin

Our streaming project serves Unity, and Unity is a cross-platform game engine whose backend uses DirectX, OpenGL, or OpenGL ES depending on the platform, so the graphics plugin has to be implemented per platform.
(Unity's native plugin documentation)
https://docs.unity3d.com/Manual/NativePluginInterface.html
For live streaming on the Android platform, the main job of the Unity graphics plugin is render-thread notification: video capture, creating the camera texture, image processing (shaders), and feeding the encoder with the video texture all need to happen on Unity's render thread.

  • Unity creates a texture and passes its texture ID to the streaming plugin.
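    A sketch of that hand-off (SetTextureFromUnity is an assumed native entry
    point, not a Unity API):

        // Create the texture in Unity, then pass its native GL ID to the plugin
        Texture2D tex = new Texture2D(1280, 720, TextureFormat.RGBA32, false);
        SetTextureFromUnity(tex.GetNativeTexturePtr());  // [DllImport]-ed from the plugin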

  • Open the camera device and get the capture texture ready:
    mCameraGLTexture =
    new GLTexture(width, height, GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
    GLES20.GL_RGBA);
    note: the camera texture is a special texture type, created with the
    GLES11Ext.GL_TEXTURE_EXTERNAL_OES parameter

  • Callback notifying that a frame of data is ready:
    public void onFrameAvailable(final SurfaceTexture surfaceTexture)
    {
    // Push the image from the capture thread to the render thread for processing
    getProcessor().append (new Task() {
    @Override
    public void run() {
    surfaceTexture.updateTexImage();
    }
    });
    }

    The camera texture also needs a special declaration in the fragment shader:

      #extension GL_OES_EGL_image_external : require
      precision mediump float;
      uniform samplerExternalOES uTexture0;
      varying vec2 texCoordinate;
      void main(){
          gl_FragColor = texture2D(uTexture0, texCoordinate);
      }
    
  • Write the camera texture into Unity's texture.
    There are two ways to copy one texture into another:

    • Via glReadPixels, but this causes a huge memory copy and heavy CPU load.

    • Render to texture:
      mTextureCanvas = new
      GLRenderTexture(mGLTexture); // create the render texture

        void renderCamera2Texture()
        {
            mTextureCanvas.begin();
            cameraDrawObject.draw();
            mTextureCanvas.end();
        }
      

      The implementation of GLRenderTexture looks like this:

        GLRenderTexture(GLTexture tex)
        {
            mTex = tex;
            int fboTex = tex.getTextureID();
            GLES20.glGenFramebuffers(1, bufferObjects, 0);
            GLHelper.checkGlError("glGenFramebuffers");
            fobID = bufferObjects[0];

            // Create the render buffer
            GLES20.glGenRenderbuffers(1, bufferObjects, 0);
            renderBufferId = bufferObjects[0];
            // Bind the frame buffer
            GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fobID);
            GLHelper.checkGlError("glBindFramebuffer");
            // Bind the render buffer and define its dimensions
            GLES20.glBindRenderbuffer(GLES20.GL_RENDERBUFFER, renderBufferId);
            GLHelper.checkGlError("glBindRenderbuffer");
            GLES20.glRenderbufferStorage(GLES20.GL_RENDERBUFFER, GLES20.GL_DEPTH_COMPONENT16, tex.getWidth(), tex.getHeight());
            GLHelper.checkGlError("glRenderbufferStorage");
            // Attach the texture as the framebuffer's color attachment
            GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0, GLES20.GL_TEXTURE_2D, fboTex, 0);
            GLHelper.checkGlError("glFramebufferTexture2D");
            // Attach the render buffer as the depth buffer
            GLES20.glFramebufferRenderbuffer(GLES20.GL_FRAMEBUFFER, GLES20.GL_DEPTH_ATTACHMENT, GLES20.GL_RENDERBUFFER, renderBufferId);
            GLHelper.checkGlError("glFramebufferRenderbuffer");
            // We are done, reset the bindings
            GLES20.glBindRenderbuffer(GLES20.GL_RENDERBUFFER, 0);
            GLHelper.checkGlError("glBindRenderbuffer");
            GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
            GLHelper.checkGlError("glBindFramebuffer");
        }
      
        void begin()
        {
            GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fobID);
            GLHelper.checkGlError("glBindFramebuffer");
            GLES20.glViewport(0, 0, mTex.getWidth(), mTex.getHeight());
            GLHelper.checkGlError("glViewport");
        }
      
        void end()
        {
            GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
        }
      
  • Beauty filter
    Real-time beautification (skin whitening, skin smoothing) is implemented with shaders.
    (For the principles behind the beauty effect, see)
    http://meituplus.com/?p=101
    (for more real-time shader processing, see)
    https://github.com/wuhaoyu1990/MagicCamera

Most of this works without much trouble; the solutions to the problems listed above are given in the marked passages further down. First an overview of how to use the VideoPlayer, then each problem is addressed in turn.

 

(1) Creating a VideoPlayer: you can add the Video Player component under a UI object, or simply right-click in the Hierarchy and choose Video > Video Player. After adding it you'll see the component shown in the image below.

(Image: the Video Player component in the Inspector)

This article focuses on a few of the parameters. Source has two modes: Clip mode, which plays directly through a VideoClip, and URL mode, which plays from a URL. Render Mode selects the render target (Camera, Material, and so on); for UI playback choose Render Texture, which is the mode used in this article. Audio Output Mode has three options: None, Direct (untested), and Audio Source. This article uses Audio Source mode, where you only need to drag an AudioSource component into the VideoPlayer's audio source slot shown in the image above, with no further handling. Occasionally, however, the slot entry disappears after dragging and no sound plays, so it's usually safer to add everything in code, as shown below:

 

      // Add the components in code
        videoPlayer = gameObject.AddComponent<VideoPlayer>();
        //videoPlayer = gameObject.GetComponent<VideoPlayer>();
        audioSource = gameObject.AddComponent<AudioSource>();
        //audioSource = gameObject.GetComponent<AudioSource>();
        videoPlayer.playOnAwake = false;
        audioSource.playOnAwake = false;
        audioSource.Pause();
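Note that adding the AudioSource by itself is not enough; as in the full code below, the VideoPlayer still has to be pointed at it:

        videoPlayer.audioOutputMode = VideoAudioOutputMode.AudioSource;
        videoPlayer.SetTargetAudioSource(0, audioSource);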

 

(2) Controlling video playback is similar to audio/animation playback: VideoPlayer has Play/Pause and similar methods; see the complete code later for specifics.

         
When the video finishes, the loopPointReached event is invoked (I borrowed this name's usage from others; the event is not actually a playback-finished event). As the name suggests, it fires when the video reaches its loop point: when the VideoPlayer's isLooping property is true (i.e. the video loops), it is called at the end of the video, and in my tests it was not called at the end when the video does not loop. To get the callback reliably, set the video to loop and stop playback inside the handler registered on loopPointReached, as sketched below.
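A sketch of that workaround (loop so the event keeps firing, then stop manually in the handler):

    // Emulate a "finished" event: force looping so loopPointReached fires,
    // then stop playback ourselves inside the handler.
    videoPlayer.isLooping = true;
    videoPlayer.loopPointReached += (VideoPlayer vp) =>
    {
        vp.Stop();   // or vp.Pause() if the last frame should stay on screen
        Debug.Log("video finished");
    };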

(3) On the UI option for video display: when you choose Render Texture, you must assign the Target Texture.

      
1) In the Project panel, Create > Render Texture, then drag the new RenderTexture onto the VideoPlayer's corresponding slot.

      
2) In the Hierarchy panel, create UI > RawImage and drag the RenderTexture from the previous step onto the RawImage's Texture.

      
Actually none of that is necessary: VideoPlayer has a texture property, and you can simply assign it to the RawImage's texture in Update, like this:

rawImage.texture = videoPlayer.texture;

For video screenshots you can save the image through videoPlayer.texture, but the Texture must be converted to a Texture2D. Even though Texture2D derives from Texture, you cannot simply cast back down, so the conversion and file-saving code is as follows:

   private void SaveRenderTextureToPNG(Texture inputTex, string file)
    {
        RenderTexture temp = RenderTexture.GetTemporary(inputTex.width, inputTex.height, 0, RenderTextureFormat.ARGB32);
        Graphics.Blit(inputTex, temp);
        Texture2D tex2D = GetRTPixels(temp);
        RenderTexture.ReleaseTemporary(temp);
        File.WriteAllBytes(file, tex2D.EncodeToPNG());
    }

    private Texture2D GetRTPixels(RenderTexture rt)
    {
        RenderTexture currentActiveRT = RenderTexture.active;
        RenderTexture.active = rt;
        Texture2D tex = new Texture2D(rt.width, rt.height);
        tex.ReadPixels(new Rect(0, 0, tex.width, tex.height), 0, 0);
        RenderTexture.active = currentActiveRT;
        return tex;
    }
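Usage is then a single call whenever a frame is wanted; the path here is only an example:

    SaveRenderTextureToPNG(videoPlayer.texture, @"D:\test\shot.png");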

 

Finally, controlling playback progress with a Slider.

Driving the video with a slider pulls in two directions: Update keeps writing videoPlayer.time into the slider, while the slider's value has to be written back into time. Doing the latter in the slider's OnValueChanged(float value) makes the two fight each other and causes problems. Instead, use the UI BeginDrag and EndDrag events: on BeginDrag, stop writing to the slider's value; on EndDrag, apply the seek and resume updating. As shown below.

(Image: EventTrigger entries for BeginDrag and EndDrag on the slider)
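If you'd rather wire the drag events up in code than through the Inspector's EventTrigger component, a sketch that hooks the same OnPointerDown/OnPointerUp methods from the full code below:

    using UnityEngine.EventSystems;

    // Attach BeginDrag/EndDrag callbacks to the slider at runtime
    void HookSliderDrag(Slider slider)
    {
        var trigger = slider.gameObject.AddComponent<EventTrigger>();

        var begin = new EventTrigger.Entry { eventID = EventTriggerType.BeginDrag };
        begin.callback.AddListener(_ => OnPointerDown());  // pause slider updates
        trigger.triggers.Add(begin);

        var end = new EventTrigger.Entry { eventID = EventTriggerType.EndDrag };
        end.callback.AddListener(_ => OnPointerUp());      // seek, then resume
        trigger.triggers.Add(end);
    }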

Full code

using System;
using System.Collections;
using System.Collections.Generic;
using System.IO;
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.Video;

public class VideoController : MonoBehaviour {
    public GameObject screen;
    public Text videoLength;
    public Text currentLength;
    public Slider volumeSlider;
    public Slider videoSlider;

    private string video1Url;
    private string video2Url;
    private VideoPlayer videoPlayer;
    private AudioSource audioSource;
    private RawImage videoScreen;
    private float lastCountTime = 0;
    private float totalPlayTime = 0;
    private float totalVideoLength = 0;

    private bool b_firstVideo = true;
    private bool b_adjustVideo = false;
    private bool b_skip = false;
    private bool b_capture = false;

    private string imageDir =@"D:\test\Test\bwadmRe";

    // Use this for initialization
    void Start () {
        videoScreen = screen.GetComponent<RawImage>();
        string dir = Path.Combine(Application.streamingAssetsPath,"Test");
        video1Url = Path.Combine(dir, "01.mp4");
        video2Url = Path.Combine(dir, "02.mp4");

        // Add the components in code
        videoPlayer = gameObject.AddComponent<VideoPlayer>();
        //videoPlayer = gameObject.GetComponent<VideoPlayer>();
        audioSource = gameObject.AddComponent<AudioSource>();
        //audioSource = gameObject.GetComponent<AudioSource>();
        videoPlayer.playOnAwake = false;
        audioSource.playOnAwake = false;
        audioSource.Pause();

        videoPlayer.audioOutputMode = VideoAudioOutputMode.AudioSource;
        videoPlayer.SetTargetAudioSource(0, audioSource);

        VideoInfoInit(video1Url);
        videoPlayer.loopPointReached += OnFinish;
    }

    #region private method
    private void VideoInfoInit(string url)
    {
        videoPlayer.source = VideoSource.Url;
        videoPlayer.url = url;        

        videoPlayer.prepareCompleted += OnPrepared;
        videoPlayer.isLooping = true;

        videoPlayer.Prepare();
    }

    private void OnPrepared(VideoPlayer player)
    {
        player.Play();
        totalVideoLength = videoPlayer.frameCount / videoPlayer.frameRate;
        videoSlider.maxValue = totalVideoLength;
        videoLength.text = FloatToTime(totalVideoLength);

        lastCountTime = 0;
        totalPlayTime = 0;
    }

    private string FloatToTime(float time)
    {
        int hour = (int)time / 3600;
        int min = (int)(time - hour * 3600) / 60;
        int sec = (int)(time - hour * 3600) % 60;
        string text = string.Format("{0:D2}:{1:D2}:{2:D2}", hour, min, sec);
        return text;
    }

    private IEnumerator PlayTime(int count)
    {
        for(int i=0;i<count;i++)
        {
            yield return null;
        }
        videoSlider.value = (float)videoPlayer.time;
        //videoSlider.value = videoSlider.maxValue * (time / totalVideoLength);
    }

    private void OnFinish(VideoPlayer player)
    {
        Debug.Log("finished");        
    }

    private void SaveRenderTextureToPNG(Texture inputTex, string file)
    {
        RenderTexture temp = RenderTexture.GetTemporary(inputTex.width, inputTex.height, 0, RenderTextureFormat.ARGB32);
        Graphics.Blit(inputTex, temp);
        Texture2D tex2D = GetRTPixels(temp);
        RenderTexture.ReleaseTemporary(temp);
        File.WriteAllBytes(file, tex2D.EncodeToPNG());
    }

    private Texture2D GetRTPixels(RenderTexture rt)
    {
        RenderTexture currentActiveRT = RenderTexture.active;
        RenderTexture.active = rt;
        Texture2D tex = new Texture2D(rt.width, rt.height);
        tex.ReadPixels(new Rect(0, 0, tex.width, tex.height), 0, 0);
        RenderTexture.active = currentActiveRT;
        return tex;
    }
    #endregion

    #region public method
    // Play
    public void OnStart()
    {
        videoPlayer.Play();
    }
    // Pause
    public void OnPause()
    {
        videoPlayer.Pause();
    }
    // Next video
    public void OnNext()
    {
        string nextUrl = b_firstVideo ? video2Url : video1Url;
        b_firstVideo = !b_firstVideo;

        videoSlider.value = 0;
        VideoInfoInit(nextUrl);
    }
    // Volume control
    public void OnVolumeChanged(float value)
    {
        audioSource.volume = value;
    }
    // Video progress control (body disabled; seeking is handled by the pointer events below)
    public void OnVideoChanged(float value)
    {
        //videoPlayer.time = value;
        //print(value);
        //print(value);
    }
    public void OnPointerDown()
    {
        b_adjustVideo = true;
        b_skip = true;
        videoPlayer.Pause();
        //OnVideoChanged();
        //print("down");
    }
    public void OnPointerUp()
    {
        videoPlayer.time = videoSlider.value;

        videoPlayer.Play();
        b_adjustVideo = false;  
        //print("up");
    }
    public void OnCapture()
    {
        b_capture = true;
    }
    #endregion

    // Update is called once per frame
    void Update () {
        if (videoPlayer.isPlaying)
        {            
            videoScreen.texture = videoPlayer.texture;
            float time = (float)videoPlayer.time;
            currentLength.text = FloatToTime(time);

            if(b_capture)
            {
                string name = DateTime.Now.Minute.ToString() + "_" + DateTime.Now.Second.ToString() + ".png";
                SaveRenderTextureToPNG(videoPlayer.texture,Path.Combine(imageDir,name));                
                b_capture = false;
            }

            if(!b_adjustVideo)
            {
                totalPlayTime += Time.deltaTime;
                if (!b_skip)
                {
                    videoSlider.value = (float)videoPlayer.time;
                    lastCountTime = totalPlayTime;
                }
                // After a seek, wait ~0.8s before resuming slider updates so the
                // slider doesn't snap back to the stale time reported by the player
                if (totalPlayTime - lastCountTime >= 0.8f)
                {
                    b_skip = false;
                }
            }
            //StartCoroutine(PlayTime(15));   

        }
    }
}


With the AVPro Video plugin, by contrast, these little problems largely don't come up.

