
A Complete Analysis of the Android Camera Data Flow

Many articles have already covered Android Camera; this one attempts to pull the pieces together into a summary.

As usual, we start from opening the camera and follow the data flow and memory allocation step by step. The first step in opening the camera is instantiating the Camera activity, at which point onCreate is called. What exactly does this method do? In summary:

1. Instantiate FocusManager

2. Start a CameraOpenThread that runs the whole camera-open sequence: mCameraOpenThread.start();

3. Instantiate PreferenceInflater and initialize some preferences

4. Instantiate TouchManager

5. Instantiate RotateDialogController, and so on

The key part is the few lines below. They have been discussed more than once already, so no further commentary here:

// don't set mSurfaceHolder here. We have it set ONLY within
// surfaceChanged / surfaceDestroyed, other parts of the code
// assume that when it is set, the surface is also set.
SurfaceView preview = (SurfaceView) findViewById(R.id.camera_preview);
SurfaceHolder holder = preview.getHolder();
holder.addCallback(this);

In addition, onCreate starts a camera preview thread (mCameraPreviewThread) that begins the preview once initialization is complete:

Thread mCameraPreviewThread = new Thread(new Runnable() {
    public void run() {
        initializeCapabilities();
        startPreview(true);
    }
});

Once its initialization is done, this thread calls the app layer's startPreview method, and the long march begins.

It is still worth a moment on the camera open sequence itself. The call path from the app's open() downward was covered in earlier articles; the native-layer open() does the following:

1. Register the callback interfaces used by the upper layers

2. Instantiate and initialize the HAL layer. This initialization process, also covered in detail in earlier articles, is very important and directly affects every subsequent operation. A rough sketch of the open path follows below.
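To make that more concrete, here is a minimal sketch of what the native open amounts to under the legacy camera HAL. hw_get_module, CAMERA_HARDWARE_MODULE_ID and the module's open() method table are the stock Android HAL-loading APIs from hardware/hardware.h and hardware/camera.h; the helper function openCameraDevice and its error handling are illustrative, not code from the article:

#include <hardware/hardware.h>
#include <hardware/camera.h>
#include <cstdio>

// Hedged sketch of the legacy (camera HAL v1) native open path: load the
// vendor camera module and open one camera device through its hw_module_t
// method table. Error handling is trimmed to the essentials.
static camera_device_t* openCameraDevice(int cameraId) {
    const hw_module_t* module = NULL;
    // Resolve the camera HAL shared library (e.g. camera.<board>.so).
    if (hw_get_module(CAMERA_HARDWARE_MODULE_ID, &module) < 0) {
        return NULL;
    }
    char camera_id[10];
    snprintf(camera_id, sizeof(camera_id), "%d", cameraId);
    hw_device_t* device = NULL;
    // open() is where the vendor HAL instantiates its CameraHal object and
    // runs the heavyweight initialization the article refers to.
    if (module->methods->open(module, camera_id, &device) < 0) {
        return NULL;
    }
    return reinterpret_cast<camera_device_t*>(device);
}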

Back to our startPreview; let's trace its route step by step:

app layer: does some basic initialization, the most important piece being setPreviewWindow (already analyzed in detail), then calls into the frameworks layer

frameworks layer: merely defines startPreview and forwards it through the JNI layer

JNI layer: a way station that does no processing of its own; it simply calls the next layer's implementation

camera client layer: likewise a relay, passing the call onward over the binder mechanism

camera server layer: initializes the display window attributes, then calls the next layer

hardware interface layer: forwards to the methods in camerahal_module

camera HAL layer: its startPreview implementation is the most important part of the whole preview sequence, described further below

Below that, the HAL dispatches commands through V4LCameraAdapter or OMXCameraAdapter to interact with the kernel driver; a minimal V4L2 sketch of that kernel-facing interaction follows.
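For the V4LCameraAdapter path, the kernel-facing side is plain V4L2. This is a minimal sketch of the stream-on sequence, assuming the device node and capture format have already been configured; the helper name startV4L2Preview is hypothetical:

#include <fcntl.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>
#include <cstring>

// Hedged sketch of what a V4L2-based adapter ultimately does against the
// kernel driver: request mmap buffers, queue them all, and start streaming.
static int startV4L2Preview(const char* devNode, int numBufs) {
    int fd = open(devNode, O_RDWR);
    if (fd < 0) return -1;

    struct v4l2_requestbuffers req;
    memset(&req, 0, sizeof(req));
    req.count = numBufs;
    req.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    req.memory = V4L2_MEMORY_MMAP;
    if (ioctl(fd, VIDIOC_REQBUFS, &req) < 0) return -1;

    // Queue every buffer so the driver has somewhere to put frames.
    for (unsigned i = 0; i < req.count; i++) {
        struct v4l2_buffer buf;
        memset(&buf, 0, sizeof(buf));
        buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        buf.memory = V4L2_MEMORY_MMAP;
        buf.index = i;
        if (ioctl(fd, VIDIOC_QBUF, &buf) < 0) return -1;
    }

    enum v4l2_buf_type type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    return ioctl(fd, VIDIOC_STREAMON, &type);  // frames now flow via VIDIOC_DQBUF
}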

Let's look at the server-layer calls first, since they matter comparatively more:

status_t CameraService::Client::startPreview() {
    LOG1("startPreview (pid %d)", getCallingPid());
    return startCameraMode(CAMERA_PREVIEW_MODE);
}


// start preview or recording
status_t CameraService::Client::startCameraMode(camera_mode mode) {
    LOG1("startCameraMode(%d)", mode);
    Mutex::Autolock lock(mLock);
    status_t result = checkPidAndHardware();
    if (result != NO_ERROR) return result;

    switch(mode) {
        case CAMERA_PREVIEW_MODE:
            if (mSurface == 0 && mPreviewWindow == 0) {
                LOG1("mSurface is not set yet.");
                // still able to start preview in this case.
            }
            return startPreviewMode();
        case CAMERA_RECORDING_MODE:
            if (mSurface == 0 && mPreviewWindow == 0) {
                LOGE("mSurface or mPreviewWindow must be set before startRecordingMode.");
                return INVALID_OPERATION;
            }
            return startRecordingMode();
        default:
            return UNKNOWN_ERROR;
    }
}


status_t CameraService::Client::startPreviewMode() {
    LOG1("startPreviewMode");
    status_t result = NO_ERROR;

    // if preview has been enabled, nothing needs to be done
    if (mHardware->previewEnabled()) {
        return NO_ERROR;
    }

    if (mPreviewWindow != 0) {
        native_window_set_scaling_mode(mPreviewWindow.get(),
                NATIVE_WINDOW_SCALING_MODE_SCALE_TO_WINDOW);
        native_window_set_buffers_transform(mPreviewWindow.get(),
                mOrientation);
    }

#ifdef OMAP_ENHANCEMENT
    disableMsgType(CAMERA_MSG_COMPRESSED_BURST_IMAGE);
#endif

    mHardware->setPreviewWindow(mPreviewWindow);
    result = mHardware->startPreview();
    return result;
}


startPreviewMode again takes care to initialize the preview window; this window has been important throughout. I have not walked through the setPreviewWindow call itself only because it is not the focus of this article, which is no reason to underestimate it. A hedged sketch of how the HAL side typically consumes this window follows below.
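To see why the window deserves respect: after setPreviewWindow, the HAL holds a preview_stream_ops table (the ANativeWindow wrapper defined in hardware/camera.h) and pushes every filled preview frame through it. A hedged sketch of that consumer side; postOnePreviewFrame is an illustrative name, and the gralloc mapping step is omitted:

#include <hardware/camera.h>

// Hedged sketch of the HAL-side consumer of the preview window set by
// setPreviewWindow. dequeue_buffer/enqueue_buffer are real members of
// preview_stream_ops; the surrounding logic is simplified for illustration.
static void postOnePreviewFrame(preview_stream_ops_t* win) {
    buffer_handle_t* buf = NULL;
    int stride = 0;
    // Borrow an empty gralloc buffer from the window's queue.
    if (win->dequeue_buffer(win, &buf, &stride) != 0) return;
    // A real HAL would lock the handle via the gralloc module and copy or
    // render the camera frame into it here (omitted in this sketch).
    // Hand the filled buffer back; SurfaceFlinger composites it on screen.
    win->enqueue_buffer(win, buf);
}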

Now we move the focus to the HAL layer. In the HAL's startPreview, an initialize method runs first and completes some very important setup before the preview proper. My own understanding here is not fully thorough, so take this as a personal reading; two steps deserve particular attention:

1. Allocate the preview buffers

2. The HAL sends the CAMERA_USE_BUFFERS_PREVIEW command to the camera adapter (see the toy model after this list)
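Since TI's CameraHal source is not quoted here, the following is only a toy model of the contract those two steps describe: the HAL allocates the buffers itself, lends them to the adapter under CAMERA_USE_BUFFERS_PREVIEW (a command constant named in the article), and only then starts streaming. Everything else in the snippet, including CameraAdapterModel and the buffer sizes, is made up for illustration:

#include <cstdio>
#include <vector>

enum AdapterCommand { CAMERA_USE_BUFFERS_PREVIEW, CAMERA_START_PREVIEW };

// Toy stand-in for the camera adapter: it only borrows the HAL's buffers.
struct CameraAdapterModel {
    std::vector<void*> ports;   // stands in for the OMX port bindings
    int sendCommand(AdapterCommand cmd, void** bufs = nullptr, int n = 0) {
        switch (cmd) {
        case CAMERA_USE_BUFFERS_PREVIEW:
            // Bind each borrowed buffer to the preview port.
            ports.assign(bufs, bufs + n);
            printf("bound %d preview buffers\n", n);
            return 0;
        case CAMERA_START_PREVIEW:
            printf("streaming on %zu buffers\n", ports.size());
            return 0;
        }
        return -1;
    }
};

int main() {
    // Step 1: the HAL allocates the preview buffers itself.
    std::vector<char> b0(460800), b1(460800);   // e.g. 640x480 NV12-sized
    void* bufs[] = { b0.data(), b1.data() };
    // Step 2: hand them to the adapter, then start the preview.
    CameraAdapterModel adapter;
    adapter.sendCommand(CAMERA_USE_BUFFERS_PREVIEW, bufs, 2);
    adapter.sendCommand(CAMERA_START_PREVIEW);
}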

The HAL's sendCommand method eventually reaches OMXCameraAdapter::UseBuffersPreview, which binds the OMX component's port interface to the CameraBuffers passed in:

status_t OMXCameraAdapter::UseBuffersPreview(CameraBuffer * bufArr, int num)
{
    status_t ret = NO_ERROR;
    OMX_ERRORTYPE eError = OMX_ErrorNone;
    int tmpHeight, tmpWidth;

    LOG_FUNCTION_NAME;

    if (!bufArr) {
        CAMHAL_LOGEA("NULL pointer passed for buffArr");
        LOG_FUNCTION_NAME_EXIT;
        return BAD_VALUE;
    }

    OMXCameraPortParameters *mPreviewData = NULL;
    OMXCameraPortParameters *measurementData = NULL;
    // mPreviewData points at the preview port's parameters
    mPreviewData = &mCameraAdapterParameters.mCameraPortParams[mCameraAdapterParameters.mPrevPortIndex];
    measurementData = &mCameraAdapterParameters.mCameraPortParams[mCameraAdapterParameters.mMeasurementPortIndex];
    mPreviewData->mNumBufs = num;

    if ( 0 != mUsePreviewSem.Count() ) {
        CAMHAL_LOGEB("Error mUsePreviewSem semaphore count %d", mUsePreviewSem.Count());
        LOG_FUNCTION_NAME_EXIT;
        return NO_INIT;
    }

    if (mPreviewData->mNumBufs != num) {
        CAMHAL_LOGEA("Current number of buffers doesnt equal new num of buffers passed!");
        LOG_FUNCTION_NAME_EXIT;
        return BAD_VALUE;
    }

    mStateSwitchLock.lock();

    if ( mComponentState == OMX_StateLoaded ) {

        if (mPendingPreviewSettings & SetLDC) {
            mPendingPreviewSettings &= ~SetLDC;
            ret = setLDC(mIPP);
            if ( NO_ERROR != ret ) {
                CAMHAL_LOGEB("setLDC() failed %d", ret);
            }
        }

        if (mPendingPreviewSettings & SetNSF) {
            mPendingPreviewSettings &= ~SetNSF;
            ret = setNSF(mIPP);
            if ( NO_ERROR != ret ) {
                CAMHAL_LOGEB("setNSF() failed %d", ret);
            }
        }

        if (mPendingPreviewSettings & SetCapMode) {
            mPendingPreviewSettings &= ~SetCapMode;
            ret = setCaptureMode(mCapMode);
            if ( NO_ERROR != ret ) {
                CAMHAL_LOGEB("setCaptureMode() failed %d", ret);
            }
        }

        if (mCapMode == OMXCameraAdapter::VIDEO_MODE) {

            if (mPendingPreviewSettings & SetVNF) {
                mPendingPreviewSettings &= ~SetVNF;
                ret = enableVideoNoiseFilter(mVnfEnabled);
                if ( NO_ERROR != ret ) {
                    CAMHAL_LOGEB("Error configuring VNF %x", ret);
                }
            }

            if (mPendingPreviewSettings & SetVSTAB) {
                mPendingPreviewSettings &= ~SetVSTAB;
                ret = enableVideoStabilization(mVstabEnabled);
                if ( NO_ERROR != ret ) {
                    CAMHAL_LOGEB("Error configuring VSTAB %x", ret);
                }
            }
        }
    }

    ret = setSensorOrientation(mSensorOrientation);
    if ( NO_ERROR != ret ) {
        CAMHAL_LOGEB("Error configuring Sensor Orientation %x", ret);
        mSensorOrientation = 0;
    }

    if ( mComponentState == OMX_StateLoaded ) {
        ///Register for IDLE state switch event
        ret = RegisterForEvent(mCameraAdapterParameters.mHandleComp,
                               OMX_EventCmdComplete,
                               OMX_CommandStateSet,
                               OMX_StateIdle,
                               mUsePreviewSem);
        if (ret != NO_ERROR) {
            CAMHAL_LOGEB("Error in registering for event %d", ret);
            goto EXIT;
        }

        ///Once we get the buffers, move component state to idle state and pass the buffers to OMX comp using UseBuffer
        eError = OMX_SendCommand(mCameraAdapterParameters.mHandleComp,
                                 OMX_CommandStateSet,
                                 OMX_StateIdle,
                                 NULL);
        CAMHAL_LOGDB("OMX_SendCommand(OMX_CommandStateSet) 0x%x", eError);
        GOTO_EXIT_IF((eError != OMX_ErrorNone), eError);
        mComponentState = OMX_StateIdle;
    } else {
        ///Register for Preview port ENABLE event
        ret = RegisterForEvent(mCameraAdapterParameters.mHandleComp,
                               OMX_EventCmdComplete,
                               OMX_CommandPortEnable,
                               mCameraAdapterParameters.mPrevPortIndex,
                               mUsePreviewSem);
        if ( NO_ERROR != ret ) {
            CAMHAL_LOGEB("Error in registering for event %d", ret);
            goto EXIT;
        }

        ///Enable Preview Port
        eError = OMX_SendCommand(mCameraAdapterParameters.mHandleComp,
                                 OMX_CommandPortEnable,
                                 mCameraAdapterParameters.mPrevPortIndex,
                                 NULL);
    }

    ///Configure DOMX to use either gralloc handles or vptrs
    OMX_TI_PARAMUSENATIVEBUFFER domxUseGrallocHandles;
    OMX_INIT_STRUCT_PTR(&domxUseGrallocHandles, OMX_TI_PARAMUSENATIVEBUFFER);
    domxUseGrallocHandles.nPortIndex = mCameraAdapterParameters.mPrevPortIndex;
    domxUseGrallocHandles.bEnable = OMX_TRUE;

    eError = OMX_SetParameter(mCameraAdapterParameters.mHandleComp,
                              (OMX_INDEXTYPE)OMX_TI_IndexUseNativeBuffers, &domxUseGrallocHandles);
    if (eError != OMX_ErrorNone) {
        CAMHAL_LOGEB("OMX_SetParameter - %x", eError);
    }
    GOTO_EXIT_IF((eError != OMX_ErrorNone), eError);

    OMX_BUFFERHEADERTYPE *pBufferHdr;
    for (int index = 0; index < num; index++) {
        OMX_U8 *ptr;
        ptr = (OMX_U8 *)camera_buffer_get_omx_ptr(&bufArr[index]);
        eError = OMX_UseBuffer(mCameraAdapterParameters.mHandleComp,
                               &pBufferHdr,
                               mCameraAdapterParameters.mPrevPortIndex,
                               0,
                               mPreviewData->mBufSize,
                               ptr);
        if (eError != OMX_ErrorNone) {
            CAMHAL_LOGEB("OMX_UseBuffer-0x%x", eError);
        }
        GOTO_EXIT_IF((eError != OMX_ErrorNone), eError);

        pBufferHdr->pAppPrivate = (OMX_PTR)&bufArr[index];
        pBufferHdr->nSize = sizeof(OMX_BUFFERHEADERTYPE);
        pBufferHdr->nVersion.s.nVersionMajor = 1;
        pBufferHdr->nVersion.s.nVersionMinor = 1;
        pBufferHdr->nVersion.s.nRevision = 0;
        pBufferHdr->nVersion.s.nStep = 0;
        // here the port's buffer header is bound to the allocated CameraBuffer
        mPreviewData->mBufferHeader[index] = pBufferHdr;
    }

    if ( mMeasurementEnabled ) {
        for (int i = 0; i < num; i++) {
            OMX_BUFFERHEADERTYPE *pBufHdr;
            OMX_U8 *ptr;
            ptr = (OMX_U8 *)camera_buffer_get_omx_ptr(&mPreviewDataBuffers[i]);
            eError = OMX_UseBuffer(mCameraAdapterParameters.mHandleComp,
                                   &pBufHdr,
                                   mCameraAdapterParameters.mMeasurementPortIndex,
                                   0,
                                   measurementData->mBufSize,
                                   ptr);
            if ( eError == OMX_ErrorNone ) {
                pBufHdr->pAppPrivate = (OMX_PTR *)&mPreviewDataBuffers[i];
                pBufHdr->nSize = sizeof(OMX_BUFFERHEADERTYPE);
                pBufHdr->nVersion.s.nVersionMajor = 1;
                pBufHdr->nVersion.s.nVersionMinor = 1;
                pBufHdr->nVersion.s.nRevision = 0;
                pBufHdr->nVersion.s.nStep = 0;
                measurementData->mBufferHeader[i] = pBufHdr;
            } else {
                CAMHAL_LOGEB("OMX_UseBuffer -0x%x", eError);
                ret = BAD_VALUE;
                break;
            }
        }
    }

    CAMHAL_LOGDA("Registering preview buffers");

    ret = mUsePreviewSem.WaitTimeout(OMX_CMD_TIMEOUT);

    // If something bad happened while we waited
    if (mComponentState == OMX_StateInvalid) {
        CAMHAL_LOGEA("Invalid State after Registering preview buffers Exitting!!!");
        goto EXIT;
    }

    if ( NO_ERROR == ret ) {
        CAMHAL_LOGDA("Preview buffer registration successfull");
    } else {
        if ( mComponentState == OMX_StateLoaded ) {
            ret |= RemoveEvent(mCameraAdapterParameters.mHandleComp,
                               OMX_EventCmdComplete,
                               OMX_CommandStateSet,
                               OMX_StateIdle,
                               NULL);
        } else {
            ret |= SignalEvent(mCameraAdapterParameters.mHandleComp,
                               OMX_EventCmdComplete,
                               OMX_CommandPortEnable,
                               mCameraAdapterParameters.mPrevPortIndex,
                               NULL);
        }
        CAMHAL_LOGEA("Timeout expired on preview buffer registration");
        goto EXIT;
    }

    LOG_FUNCTION_NAME_EXIT;
    return (ret | ErrorUtils::omxToAndroidError(eError));

    ///If there is any failure, we reach here.
    ///Here, we do any resource freeing and convert from OMX error code to Camera Hal error code
EXIT:
    mStateSwitchLock.unlock();
    CAMHAL_LOGEB("Exiting function %s because of ret %d eError=%x", __FUNCTION__, ret, eError);
    performCleanupAfterError();
    CAMHAL_LOGEB("Exiting function %s because of ret %d eError=%x", __FUNCTION__, ret, eError);
    LOG_FUNCTION_NAME_EXIT;
    return (ret | ErrorUtils::omxToAndroidError(eError));
}


The process above admittedly feels abstract; the lower the layer, the harder it is to follow.

With the CameraBuffer allocation handled and the setPreviewWindow initialization done, the real preview can start. The method that actually starts the preview boils down to one point: through the fill-buffer mechanism, the buffers you allocated are handed to the component port, and once the port has processed a frame of video data it places the data into the buffer you gave it. It is like asking someone to serve you rice: you must hand over your bowl; they fill it and hand it back, and only then is there rice in it. In the same way, our buffer now contains data. How do we get that data back? The buffer we give the port starts out empty; when the port has filled it, the component notifies us and hands the buffer back at the same time. That happens through the FillBufferDone callback. A minimal sketch of this exchange comes first, followed by the real callback.
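The rice-bowl exchange maps directly onto two standard OMX IL 1.1 calls: OMX_FillThisBuffer hands an empty bowl to the port, and the component's FillBufferDone callback returns it full. A minimal sketch of that loop, assuming the component handle and buffer headers were set up as in UseBuffersPreview above; the two helper names are illustrative:

#include <OMX_Core.h>
#include <OMX_Component.h>

// Called once preview starts: give every registered (empty) buffer to the
// preview port so the component has bowls it can fill.
static OMX_ERRORTYPE queueAllPreviewBuffers(OMX_HANDLETYPE comp,
                                            OMX_BUFFERHEADERTYPE** hdrs,
                                            int num) {
    for (int i = 0; i < num; i++) {
        OMX_ERRORTYPE e = OMX_FillThisBuffer(comp, hdrs[i]);
        if (e != OMX_ErrorNone) return e;
    }
    return OMX_ErrorNone;
}

// Registered in the OMX_CALLBACKTYPE table when the component handle is
// obtained; the component invokes it when a buffer has been filled.
static OMX_ERRORTYPE onFillBufferDone(OMX_HANDLETYPE comp,
                                      OMX_PTR appData,
                                      OMX_BUFFERHEADERTYPE* hdr) {
    // hdr->pBuffer now holds nFilledLen bytes of frame data: the real
    // adapter forwards it to subscribers (display, video encoder, ...)
    // and then hands the emptied bowl back to the port to be refilled.
    return OMX_FillThisBuffer(comp, hdr);
}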

/*========================================================*/
/* @ fn SampleTest_FillBufferDone :: Application callback */
/*========================================================*/
OMX_ERRORTYPE OMXCameraAdapter::OMXCameraAdapterFillBufferDone(OMX_IN OMX_HANDLETYPE hComponent,
                                                               OMX_IN OMX_BUFFERHEADERTYPE* pBuffHeader)
{
    status_t stat = NO_ERROR;
    status_t res1, res2;
    OMXCameraPortParameters *pPortParam;
    OMX_ERRORTYPE eError = OMX_ErrorNone;
    CameraFrame::FrameType typeOfFrame = CameraFrame::ALL_FRAMES;
    unsigned int refCount = 0;
    BaseCameraAdapter::AdapterState state, nextState;
    BaseCameraAdapter::getState(state);
    BaseCameraAdapter::getNextState(nextState);
    sp<CameraMetadataResult> metadataResult = NULL;
    unsigned int mask = 0xFFFF;
    CameraFrame cameraFrame;
    OMX_OTHER_EXTRADATATYPE *extraData;
    OMX_TI_ANCILLARYDATATYPE *ancillaryData = NULL;
    bool snapshotFrame = false;

    if ( NULL == pBuffHeader ) {
        return OMX_ErrorBadParameter;
    }

#ifdef CAMERAHAL_OMX_PROFILING
    storeProfilingData(pBuffHeader);
#endif

    res1 = res2 = NO_ERROR;

    if ( !pBuffHeader || !pBuffHeader->pBuffer ) {
        CAMHAL_LOGEA("NULL Buffer from OMX");
        return OMX_ErrorNone;
    }

    pPortParam = &(mCameraAdapterParameters.mCameraPortParams[pBuffHeader->nOutputPortIndex]);

    // Find buffer and mark it as filled
    for (int i = 0; i < pPortParam->mNumBufs; i++) {
        if (pPortParam->mBufferHeader[i] == pBuffHeader) {
            pPortParam->mStatus[i] = OMXCameraPortParameters::DONE;
        }
    }

    if (pBuffHeader->nOutputPortIndex == OMX_CAMERA_PORT_VIDEO_OUT_PREVIEW) {

        if ( ( PREVIEW_ACTIVE & state ) != PREVIEW_ACTIVE ) {
            return OMX_ErrorNone;
        }

        if ( mWaitingForSnapshot ) {
            extraData = getExtradata(pBuffHeader->pPlatformPrivate,
                                     (OMX_EXTRADATATYPE) OMX_AncillaryData);
            if ( NULL != extraData ) {
                ancillaryData = (OMX_TI_ANCILLARYDATATYPE*) extraData->data;
                if ((OMX_2D_Snap == ancillaryData->eCameraView)
                        || (OMX_3D_Left_Snap == ancillaryData->eCameraView)
                        || (OMX_3D_Right_Snap == ancillaryData->eCameraView)) {
                    snapshotFrame = OMX_TRUE;
                } else {
                    snapshotFrame = OMX_FALSE;
                }
                mPending3Asettings |= SetFocus;
            }
        }

        ///Prepare the frames to be sent - initialize CameraFrame object and reference count
        // TODO(XXX): ancillary data for snapshot frame is not being sent for video snapshot
        //if we are waiting for a snapshot and in video mode...go ahead and send
        //this frame as a snapshot
        if ( mWaitingForSnapshot && (mCapturedFrames > 0) &&
                (snapshotFrame || (mCapMode == VIDEO_MODE)) ) {
            typeOfFrame = CameraFrame::SNAPSHOT_FRAME;
            mask = (unsigned int)CameraFrame::SNAPSHOT_FRAME;

            // video snapshot gets ancillary data and wb info from last snapshot frame
            mCaptureAncillaryData = ancillaryData;
            mWhiteBalanceData = NULL;
            extraData = getExtradata(pBuffHeader->pPlatformPrivate,
                                     (OMX_EXTRADATATYPE) OMX_WhiteBalance);
            if ( NULL != extraData ) {
                mWhiteBalanceData = (OMX_TI_WHITEBALANCERESULTTYPE*) extraData->data;
            }
        } else {
            typeOfFrame = CameraFrame::PREVIEW_FRAME_SYNC;
            mask = (unsigned int)CameraFrame::PREVIEW_FRAME_SYNC;
        }

        if (mRecording) {
            mask |= (unsigned int)CameraFrame::VIDEO_FRAME_SYNC;
            mFramesWithEncoder++;
        }

        //LOGV("FBD pBuffer = 0x%x", pBuffHeader->pBuffer);

        if ( mWaitingForSnapshot ) {
            if (!mBracketingEnabled &&
                    ((HIGH_SPEED == mCapMode) || (VIDEO_MODE == mCapMode))) {
                notifyShutterSubscribers();
            }
        }

        stat = sendCallBacks(cameraFrame, pBuffHeader, mask, pPortParam);
        mFramesWithDisplay++;
        mFramesWithDucati--;

#ifdef CAMERAHAL_DEBUG
        if (mBuffersWithDucati.indexOfKey((uint32_t)pBuffHeader->pBuffer) < 0) {
            LOGE("Buffer was never with Ducati!! %p", pBuffHeader->pBuffer);
            for (unsigned int i = 0; i < mBuffersWithDucati.size(); i++) LOGE("0x%x", mBuffersWithDucati.keyAt(i));
        }
        mBuffersWithDucati.removeItem((int)pBuffHeader->pBuffer);
#endif

        if (mDebugFcs)
            CAMHAL_LOGEB("C[%d] D[%d] E[%d]", mFramesWithDucati, mFramesWithDisplay, mFramesWithEncoder);

        recalculateFPS();

        createPreviewMetadata(pBuffHeader, metadataResult, pPortParam->mWidth, pPortParam->mHeight);
        if ( NULL != metadataResult.get() ) {
            notifyMetadataSubscribers(metadataResult);
            metadataResult.clear();
        }

        {
            Mutex::Autolock lock(mFaceDetectionLock);
            if ( mFDSwitchAlgoPriority ) {
                //Disable region priority and enable face priority for AF
                setAlgoPriority(REGION_PRIORITY, FOCUS_ALGO, false);
                setAlgoPriority(FACE_PRIORITY, FOCUS_ALGO, true);
                //Disable Region priority and enable Face priority
                setAlgoPriority(REGION_PRIORITY, EXPOSURE_ALGO, false);
                setAlgoPriority(FACE_PRIORITY, EXPOSURE_ALGO, true);
                mFDSwitchAlgoPriority = false;
            }
        }

        sniffDccFileDataSave(pBuffHeader);

        stat |= advanceZoom();

        // On the fly update to 3A settings not working
        // Do not update 3A here if we are in the middle of a capture
        // or in the middle of transitioning to it
        if ( mPending3Asettings &&
                ( (nextState & CAPTURE_ACTIVE) == 0 ) &&
                ( (state & CAPTURE_ACTIVE) == 0 ) ) {
            apply3Asettings(mParameters3A);
        }

    } else if ( pBuffHeader->nOutputPortIndex == OMX_CAMERA_PORT_VIDEO_OUT_MEASUREMENT ) {
        typeOfFrame = CameraFrame::FRAME_DATA_SYNC;
        mask = (unsigned int)CameraFrame::FRAME_DATA_SYNC;

        stat = sendCallBacks(cameraFrame, pBuffHeader, mask, pPortParam);
    } else if ( pBuffHeader->nOutputPortIndex == OMX_CAMERA_PORT_IMAGE_OUT_IMAGE ) {
        OMX_COLOR_FORMATTYPE pixFormat;
        const char *valstr = NULL;

        pixFormat = pPortParam->mColorFormat;

        if ( OMX_COLOR_FormatUnused == pixFormat ) {
            typeOfFrame = CameraFrame::IMAGE_FRAME;
            mask = (unsigned int) CameraFrame::IMAGE_FRAME;
        } else if ( pixFormat == OMX_COLOR_FormatCbYCrY &&
                    ((mPictureFormatFromClient &&
                      !strcmp(mPictureFormatFromClient,
                              CameraParameters::PIXEL_FORMAT_JPEG)) ||
                     !mPictureFormatFromClient) ) {
            // signals to callbacks that this needs to be converted to jpeg
            // before returning to framework
            typeOfFrame = CameraFrame::IMAGE_FRAME;
            mask = (unsigned int) CameraFrame::IMAGE_FRAME;
            cameraFrame.mQuirks |= CameraFrame::ENCODE_RAW_YUV422I_TO_JPEG;
            cameraFrame.mQuirks |= CameraFrame::FORMAT_YUV422I_UYVY;

            // populate exif data and pass to subscribers via quirk
            // subscriber is in charge of freeing exif data
            ExifElementsTable* exif = new ExifElementsTable();
            setupEXIF_libjpeg(exif, mCaptureAncillaryData, mWhiteBalanceData);
            cameraFrame.mQuirks |= CameraFrame::HAS_EXIF_DATA;
            cameraFrame.mCookie2 = (void*) exif;
        } else {
            typeOfFrame = CameraFrame::RAW_FRAME;
            mask = (unsigned int) CameraFrame::RAW_FRAME;
        }

        pPortParam->mImageType = typeOfFrame;

        if ((mCapturedFrames > 0) && !mCaptureSignalled) {
            mCaptureSignalled = true;
            mCaptureSem.Signal();
        }

        if ( ( CAPTURE_ACTIVE & state ) != CAPTURE_ACTIVE ) {
            goto EXIT;
        }

        {
            Mutex::Autolock lock(mBracketingLock);
            if ( mBracketingEnabled ) {
                doBracketing(pBuffHeader, typeOfFrame);
                return eError;
            }
        }

        if (mZoomBracketingEnabled) {
            doZoom(mZoomBracketingValues[mCurrentZoomBracketing]);
            CAMHAL_LOGDB("Current Zoom Bracketing: %d", mZoomBracketingValues[mCurrentZoomBracketing]);
            mCurrentZoomBracketing++;
            if (mCurrentZoomBracketing == ARRAY_SIZE(mZoomBracketingValues)) {
                mZoomBracketingEnabled = false;
            }
        }

        if ( 1 > mCapturedFrames ) {
            goto EXIT;
        }

#ifdef OMAP_ENHANCEMENT_CPCAM
        setMetaData(cameraFrame.mMetaData, pBuffHeader->pPlatformPrivate);
#endif

        CAMHAL_LOGDB("Captured Frames: %d", mCapturedFrames);

        mCapturedFrames--;

#ifdef CAMERAHAL_USE_RAW_IMAGE_SAVING
        if (mYuvCapture) {
            struct timeval timeStampUsec;
            gettimeofday(&timeStampUsec, NULL);
            time_t saveTime;
            time(&saveTime);
            const struct tm * const timeStamp = gmtime(&saveTime);
            char filename[256];
            snprintf(filename, 256, "%s/yuv_%d_%d_%d_%lu.yuv",
                     kYuvImagesOutputDirPath,
                     timeStamp->tm_hour,
                     timeStamp->tm_min,
                     timeStamp->tm_sec,
                     timeStampUsec.tv_usec);
            const status_t saveBufferStatus = saveBufferToFile(((CameraBuffer*)pBuffHeader->pAppPrivate)->mapped,
                                                               pBuffHeader->nFilledLen, filename);
            if (saveBufferStatus != OK) {
                CAMHAL_LOGE("ERROR: %d, while saving yuv!", saveBufferStatus);
            } else {
                CAMHAL_LOGD("yuv_%d_%d_%d_%lu.yuv successfully saved in %s",
                            timeStamp->tm_hour,
                            timeStamp->tm_min,
                            timeStamp->tm_sec,
                            timeStampUsec.tv_usec,
                            kYuvImagesOutputDirPath);
            }
        }
#endif

        stat = sendCallBacks(cameraFrame, pBuffHeader, mask, pPortParam);
    } else if (pBuffHeader->nOutputPortIndex == OMX_CAMERA_PORT_VIDEO_OUT_VIDEO) {
        typeOfFrame = CameraFrame::RAW_FRAME;
        pPortParam->mImageType = typeOfFrame;
        {
            Mutex::Autolock lock(mLock);
            if ( ( CAPTURE_ACTIVE & state ) != CAPTURE_ACTIVE ) {
                goto EXIT;
            }
        }

        CAMHAL_LOGD("RAW buffer done on video port, length = %d", pBuffHeader->nFilledLen);

        mask = (unsigned int) CameraFrame::RAW_FRAME;

#ifdef CAMERAHAL_USE_RAW_IMAGE_SAVING
        if ( mRawCapture ) {
            struct timeval timeStampUsec;
            gettimeofday(&timeStampUsec, NULL);
            time_t saveTime;
            time(&saveTime);
            const struct tm * const timeStamp = gmtime(&saveTime);
            char filename[256];
            snprintf(filename, 256, "%s/raw_%d_%d_%d_%lu.raw",
                     kRawImagesOutputDirPath,
                     timeStamp->tm_hour,
                     timeStamp->tm_min,
                     timeStamp->tm_sec,
                     timeStampUsec.tv_usec);
            const status_t saveBufferStatus = saveBufferToFile(((CameraBuffer*)pBuffHeader->pAppPrivate)->mapped,
                                                               pBuffHeader->nFilledLen, filename);
            if (saveBufferStatus != OK) {
                CAMHAL_LOGE("ERROR: %d , while saving raw!", saveBufferStatus);
            } else {
                CAMHAL_LOGD("raw_%d_%d_%d_%lu.raw successfully saved in %s",
                            timeStamp->tm_hour,
                            timeStamp->tm_min,
                            timeStamp->tm_sec,
                            timeStampUsec.tv_usec,
                            kRawImagesOutputDirPath);
                stat = sendCallBacks(cameraFrame, pBuffHeader, mask, pPortParam);
            }
        }
#endif
    } else {
        CAMHAL_LOGEA("Frame received for non-(preview/capture/measure) port. This is yet to be supported");
        goto EXIT;
    }

    if ( NO_ERROR != stat ) {
        CameraBuffer *camera_buffer;
        camera_buffer = (CameraBuffer *)pBuffHeader->pAppPrivate;
        CAMHAL_LOGDB("sendFrameToSubscribers error: %d", stat);
        returnFrame(camera_buffer, typeOfFrame);
    }

    return eError;

EXIT:
    CAMHAL_LOGEB("Exiting function %s because of ret %d eError=%x", __FUNCTION__, stat, eError);
    if ( NO_ERROR != stat ) {
        if ( NULL != mErrorNotifier ) {
            mErrorNotifier->errorNotify(CAMERA_ERROR_UNKNOWN);
        }
    }
    return eError;
}


That's all for now.
