How to use Android MediaCodec to encode Camera data (YUV420SP)

Thank you for your attention! I want to use the Android MediaCodec APIs to encode video frames acquired from the Camera, but unfortunately I have not been able to get it working. I am still not familiar with the MediaCodec API. Below is my code; I need your help to figure out what I should do.

1. The camera settings:

        Parameters parameters = mCamera.getParameters();
        parameters.setPreviewFormat(ImageFormat.NV21);
        parameters.setPreviewSize(320, 240);
        mCamera.setParameters(parameters);

2. Set up the encoder:

    private void initCodec()
    {
        try
        {
            fos = new FileOutputStream(mVideoFile, false);
        }
        catch (FileNotFoundException e)
        {
            e.printStackTrace();
        }
        mMediaCodec = MediaCodec.createEncoderByType("video/avc");
        MediaFormat mediaFormat = MediaFormat.createVideoFormat("video/avc",
                320,
                240);
        mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 125000);
        mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 15);
        mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar);
        mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
        mMediaCodec.configure(mediaFormat,
                null,
                null,
                MediaCodec.CONFIGURE_FLAG_ENCODE);
        mMediaCodec.start();
        inputBuffers = mMediaCodec.getInputBuffers();
        outputBuffers = mMediaCodec.getOutputBuffers();

    }



    private void encode(byte[] data)
    {
        int inputBufferIndex = mMediaCodec.dequeueInputBuffer(0);
        if (inputBufferIndex >= 0)
        {
            ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
            inputBuffer.clear();
            inputBuffer.put(data);
            mMediaCodec.queueInputBuffer(inputBufferIndex, 0, data.length, 0, 0);
        }
        else
        {
            return;
        }

        MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
        int outputBufferIndex = mMediaCodec.dequeueOutputBuffer(bufferInfo, 0);
        Log.i(TAG, "outputBufferIndex-->" + outputBufferIndex);
        do
        {
            if (outputBufferIndex >= 0)
            {
                ByteBuffer outBuffer = outputBuffers[outputBufferIndex];
                System.out.println("buffer info-->" + bufferInfo.offset + "--"
                        + bufferInfo.size + "--" + bufferInfo.flags + "--"
                        + bufferInfo.presentationTimeUs);
                byte[] outData = new byte[bufferInfo.size];
                outBuffer.get(outData);
                try
                {
                    if (bufferInfo.offset != 0)
                    {
                        fos.write(outData, bufferInfo.offset, outData.length
                                - bufferInfo.offset);
                    }
                    else
                    {
                        fos.write(outData, 0, outData.length);
                    }
                    fos.flush();
                    Log.i(TAG, "out data -- > " + outData.length);
                    mMediaCodec.releaseOutputBuffer(outputBufferIndex, false);
                    outputBufferIndex = mMediaCodec.dequeueOutputBuffer(bufferInfo,
                            0);
                }
                catch (IOException e)
                {
                    e.printStackTrace();
                }
            }
            else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED)
            {
                outputBuffers = mMediaCodec.getOutputBuffers();
            }
            else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED)
            {
                MediaFormat format = mMediaCodec.getOutputFormat();
            }
        } while (outputBufferIndex >= 0);

    }

I guess the problem is in the encode method. The method is used in the Camera preview callback, like this:

                initCodec();
                //mCamera.setPreviewCallback(new MyPreviewCallback());
                mCamera.setPreviewCallback(new PreviewCallback()
                {
                    @Override
                    public void onPreviewFrame(byte[] data, Camera camera)
                    {
                        encode(data);
                    }
                });

I just have no idea how to do this correctly with the MediaCodec API. Can you give me some advice or links about it?

Thank you!


Answers

The YUV420 formats output by the camera are incompatible with the formats accepted by the MediaCodec AVC encoder. In the best case, it’s essentially NV12 vs. NV21 (U and V planes are reversed), requiring a manual reordering. In the worst case, as of Android 4.2, the encoder input format may be device-specific.
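For illustration, here is a minimal sketch of that manual reordering, converting the camera's NV21 buffer to the I420 (COLOR_FormatYUV420Planar) layout requested in the question's configure() call. The method name is my own, and it assumes an even width and height with no row padding:

    // A minimal sketch (not part of the Android SDK): convert NV21, which
    // stores the Y plane followed by interleaved V/U samples, into I420
    // (COLOR_FormatYUV420Planar), which stores Y, then all U, then all V.
    private static byte[] nv21ToI420(byte[] nv21, int width, int height) {
        int ySize = width * height;
        byte[] i420 = new byte[ySize * 3 / 2];
        System.arraycopy(nv21, 0, i420, 0, ySize); // Y plane is unchanged
        int uIndex = ySize;             // U plane starts right after Y
        int vIndex = ySize + ySize / 4; // V plane follows the U plane
        for (int i = ySize; i < nv21.length; i += 2) {
            i420[vIndex++] = nv21[i];     // NV21 stores V first ...
            i420[uIndex++] = nv21[i + 1]; // ... then U
        }
        return i420;
    }

You would then feed the converted buffer to encode() instead of the raw preview data, keeping in mind the device-specific caveat above.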

You’re better off using MediaRecorder to connect the camera hardware to the encoder.

Update: It’s now possible to pass the camera’s Surface preview to MediaCodec, instead of using the YUV data in the ByteBuffer. This is faster and more portable. See the CameraToMpegTest sample here.
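As a rough sketch of the Surface path (API 18+, condensed from the general shape of CameraToMpegTest, with all of the SurfaceTexture/EGL plumbing omitted):

    // Configure the encoder for Surface input rather than ByteBuffer input.
    MediaFormat format = MediaFormat.createVideoFormat("video/avc", 320, 240);
    format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
            MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
    format.setInteger(MediaFormat.KEY_BIT_RATE, 125000);
    format.setInteger(MediaFormat.KEY_FRAME_RATE, 15);
    format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);

    MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
    encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
    // Anything rendered onto this Surface is fed to the encoder; no color
    // conversion and no input-buffer handling on your side. Must be called
    // after configure() and before start().
    Surface inputSurface = encoder.createInputSurface();
    encoder.start();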

I have solved the problem, as follows:

    private synchronized void encode(byte[] data)
    {
        // The change: fetch the buffer arrays on every call instead of
        // caching them once in initCodec().
        inputBuffers = mMediaCodec.getInputBuffers();
        outputBuffers = mMediaCodec.getOutputBuffers();

        int inputBufferIndex = mMediaCodec.dequeueInputBuffer(-1);
        Log.i(TAG, "inputBufferIndex-->" + inputBufferIndex);
        //......

And next, you will find that the color in your encoded video is not right. For more information, please see MediaCodec and Camera: colorspaces don't match.
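On that note, one way to cope with the device-specific input formats mentioned in the accepted answer is to ask the codec what it supports before configuring it. A minimal sketch, using the pre-API-21 MediaCodecList interface (the helper name is mine):

    // Returns a 4:2:0 input color format the encoder for mimeType supports,
    // or -1 if neither planar nor semi-planar is advertised.
    private static int selectColorFormat(String mimeType) {
        for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
            MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
            if (!info.isEncoder()) {
                continue;
            }
            for (String type : info.getSupportedTypes()) {
                if (!type.equalsIgnoreCase(mimeType)) {
                    continue;
                }
                MediaCodecInfo.CodecCapabilities caps =
                        info.getCapabilitiesForType(mimeType);
                for (int format : caps.colorFormats) {
                    if (format == MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar
                            || format == MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar) {
                        return format;
                    }
                }
            }
        }
        return -1;
    }

Feed the returned value into MediaFormat.KEY_COLOR_FORMAT and convert the NV21 preview data to match it.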