
Android APP Camera2 Application (04): Video Recording & Saving Flow


Note: this camera-subsystem article series targets Android 10.0 and focuses on the Camera API2 + HAL3 framework.

1 Overview of the recording & video-saving flow

@1 Once the preview has been created, tapping the record button triggers the recording flow. The first step is to stop the preview and prepare to switch to video-recording mode. The key code is shown below:

private void closeCameraCaptureSession() {
    if (mCaptureSession != null) {
        Log.i(TAG, "close Session:" + mCaptureSession.toString());
        mCaptureSession.close();
        mCaptureSession = null;
    }
}
//...
// camera record process, step 1: stop the preview and prepare to switch to video recording
try {
    mCaptureSession.stopRepeating();
    closeCameraCaptureSession();
} catch (CameraAccessException e) {
    e.printStackTrace();
}

@2 Create the MediaRecorder instance and apply its configuration. The key code is shown below:

// camera record process, step 2: configure the MediaRecorder
mVideoFile = new File(mContext.getExternalCacheDir(),
        new SimpleDateFormat("yyyyMMddHHmmss").format(new Date()) + "demo.mp4");
mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);       // audio source
mMediaRecorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);   // video source
mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);  // output container format
mMediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.DEFAULT); // audio encoder; DEFAULT is used here, a real app should choose AAC for compatibility
mMediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.DEFAULT); // video encoder; DEFAULT is used here, a real app should choose H.264 for compatibility
mMediaRecorder.setVideoEncodingBitRate(8 * 1024 * 1920); // bit rate, usually between 1x and 10x (width * height); higher means a clearer video but a larger file
mMediaRecorder.setVideoFrameRate(30);                    // frame rate
mMediaRecorder.setVideoSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());
mMediaRecorder.setOrientationHint(90);
Surface surface = new Surface(mTextureView.getSurfaceTexture());
mMediaRecorder.setPreviewDisplay(surface);
mMediaRecorder.setOutputFile(mVideoFile.getAbsolutePath());
try {
    mMediaRecorder.prepare();
} catch (IOException e) {
    e.printStackTrace();
}
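The DEFAULT encoder choices above are only for the demo; as the comments note, a production app should pick explicit, widely supported codecs. A minimal sketch of that change (the 5x multiplier is just one value inside the 1x to 10x range mentioned above, an assumption rather than a recommendation from the original):

mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
mMediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);   // explicit AAC instead of DEFAULT for compatibility
mMediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);  // explicit H.264 instead of DEFAULT for compatibility
// Bit rate somewhere in the 1x-10x (width * height) range discussed above; 5x is an arbitrary middle value.
mMediaRecorder.setVideoEncodingBitRate(5 * mPreviewSize.getWidth() * mPreviewSize.getHeight());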

@3 Call CameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_RECORD) to create a CaptureRequest.Builder for the new capture request, initialized from the TEMPLATE_RECORD template. The key code is shown below:

// camera record process, step 3: create a CaptureRequest.Builder initialized with CameraDevice.TEMPLATE_RECORD
mPreviewRequestBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_RECORD);
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE,
        CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
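As a side note not taken from the original code: for recording, CONTROL_AF_MODE_CONTINUOUS_VIDEO is the autofocus mode designed for video, so a hedged variant of the line above could be:

// Hedged variant: continuous-video AF instead of continuous-picture AF while recording.
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE,
        CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_VIDEO);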

@4 Call CaptureRequest.Builder.addTarget() to add the MediaRecorder surface and the preview Surface to the request's target list. The key code is shown below:

SurfaceTexture surfaceTexture = mTextureView.getSurfaceTexture();
surfaceTexture.setDefaultBufferSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());
Surface previewSurface = new Surface(surfaceTexture);
Surface recorderSurface = mMediaRecorder.getSurface(); // the Surface that MediaRecorder records from
// camera record process, step 4: add the MediaRecorder surface and the preview surface to the request's target list
mPreviewRequestBuilder.addTarget(previewSurface);
mPreviewRequestBuilder.addTarget(recorderSurface);

@5 Call CameraDevice.createCaptureSession() to create a new capture session by providing the set of target outputs. The method takes three parameters:

List&lt;Surface&gt;: the new set of Surfaces that will receive the captured image data — here the Surface instance that displays the preview, plus the Surface obtained from the MediaRecorder that records the image data.
CameraCaptureSession.StateCallback: the callback used to report the state of the new capture session.
Handler: the handler on which the callback is executed; if the app wants the callback to run directly on the current thread, the handler parameter can be null.

Here we also override onConfigured() and call CameraCaptureSession.setRepeatingRequest() so that this capture session keeps capturing frames repeatedly (CaptureRequest.Builder.set() can also be used at this point to configure capture parameters, for example the 3A algorithms). Finally, MediaRecorder.start() is called to begin capturing data and encoding it to the specified file. The key code is shown below:

// camera record process, step 5: call CameraDevice.createCaptureSession()
mCameraDevice.createCaptureSession(Arrays.asList(previewSurface, recorderSurface),
        new CameraCaptureSession.StateCallback() {
            @Override
            public void onConfigured(CameraCaptureSession cameraCaptureSession) {
                mCaptureSession = cameraCaptureSession;
                Log.i(TAG, "Video Session:" + mCaptureSession.toString());
                mPreviewRequestBuilder.set(CaptureRequest.CONTROL_MODE, CaptureRequest.CONTROL_MODE_AUTO);
                mCameraHandler.post(new Runnable() {
                    @Override
                    public void run() {
                        try {
                            mCaptureSession.setRepeatingRequest(mPreviewRequestBuilder.build(), null, mCameraHandler);
                        } catch (CameraAccessException e) {
                            throw new RuntimeException("Can't visit camera");
                        }
                    }
                });
                mMediaRecorder.start();
            }

            @Override
            public void onConfigureFailed(CameraCaptureSession cameraCaptureSession) {
            }
        }, mCameraHandler);
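Optionally, before the repeating request is built (i.e. around steps 3-4), the recording request could also pin the AE target FPS range so it matches setVideoFrameRate(30). This is not in the original code; a sketch that assumes the device advertises a range ending at 30 fps:

Range<Integer>[] fpsRanges = mCameraCharacteristics.get(
        CameraCharacteristics.CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES);
if (fpsRanges != null) {
    for (Range<Integer> range : fpsRanges) {
        if (range.getUpper() == 30) { // assumption: a [x, 30] range is available on this device
            mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_TARGET_FPS_RANGE, range);
            break;
        }
    }
}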

@6 When the stop-recording button is pressed, call CameraCaptureSession.stopRepeating() to cancel the repeating capture, and CameraCaptureSession.abortCaptures() to discard all pending and in-progress captures as quickly as possible (skipping these two calls makes the app crash when recording stops). The key code is shown below:

Log.d(TAG, "stopRecorder");
try {
    // camera record process, step 6: cancel the repeating capture & abort pending captures
    mCaptureSession.stopRepeating();
    mCaptureSession.abortCaptures();
} catch (CameraAccessException e) {
    e.printStackTrace();
}

@7 Call MediaRecorder.stop() to stop capturing, then restart preview mode (the recorded video can also be processed here so that the system Gallery can see it directly). The key code is shown below:

// camera record process, step 7: stop capturing and restart the preview
mMediaRecorder.stop();
mMediaRecorder.reset();
// Optionally make the video visible to the system Gallery.
mCameraHandler.post(new FileUtils.VideoSaver(mContext, mVideoFile));
startCaptureSession();
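One practical caveat not covered by the original code: MediaRecorder.stop() throws a RuntimeException if it is called before any valid audio/video data has been received, for example when recording is stopped almost immediately after it starts. A defensive sketch:

try {
    mMediaRecorder.stop();
} catch (RuntimeException e) {
    // No valid data was recorded (stopped too early); the output file is unusable.
    e.printStackTrace();
    mVideoFile.delete(); // assumption: discarding the empty file is acceptable here
} finally {
    mMediaRecorder.reset();
}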

To make the video visible to the Gallery, the key reference code is as follows:

public static class VideoSaver implements Runnable {
    private final File mFile;
    Context mContext;

    VideoSaver(Context context, File file) {
        mContext = context;
        mFile = file;
    }

    private ContentValues getVideoContentValues(File paramFile, long paramLong) {
        ContentValues values = new ContentValues();
        values.put(MediaStore.Video.Media.TITLE, paramFile.getName());
        values.put(MediaStore.Video.Media.DISPLAY_NAME, paramFile.getName());
        values.put(MediaStore.Video.Media.MIME_TYPE, "video/mp4");
        values.put(MediaStore.Video.Media.DATE_TAKEN, Long.valueOf(paramLong));
        values.put(MediaStore.Video.Media.DATE_MODIFIED, Long.valueOf(paramLong));
        values.put(MediaStore.Video.Media.DATE_ADDED, Long.valueOf(paramLong));
        values.put(MediaStore.Video.Media.DATA, paramFile.getAbsolutePath());
        values.put(MediaStore.Video.Media.SIZE, Long.valueOf(paramFile.length()));
        return values;
    }

    @Override
    public void run() {
        Log.d(TAG, "recorder video Run");
        ContentResolver localContentResolver = mContext.getContentResolver();
        ContentValues localContentValues = getVideoContentValues(mFile, System.currentTimeMillis());
        Uri localUri = localContentResolver.insert(MediaStore.Video.Media.EXTERNAL_CONTENT_URI, localContentValues);
        OutputStream os = null;
        FileInputStream fis = null;
        byte[] buf = new byte[1024];
        int len;
        try {
            if (localUri != null) {
                fis = new FileInputStream(mFile);
                os = localContentResolver.openOutputStream(localUri);
            }
            if (os != null) {
                while ((len = fis.read(buf)) >= 0) {
                    os.write(buf, 0, len);
                }
            }
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            try {
                if (os != null) {
                    os.close();
                }
                if (fis != null) {
                    fis.close();
                }
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
}

2 Full walkthrough of the camera recording & video-saving code

2.1 Java source code (draft)

The camera flow code is shown below:

class CameraCoreManager {
    private static final String TAG = "CameraDemo";
    private Context mContext;
    private CameraManager mCameraManager;
    private String mCameraId;
    private HandlerThread mCameraThread;
    private Handler mCameraHandler;
    private ImageReader mImageReader;
    private CameraDevice mCameraDevice;
    private CameraCharacteristics mCameraCharacteristics;
    private MediaRecorder mMediaRecorder;
    // Max preview width & height that is guaranteed by Camera2 API
    private static final int MAX_PREVIEW_WIDTH = 1080;
    private static final int MAX_PREVIEW_HEIGHT = 720;
    // A Semaphore to prevent the app from exiting before closing the camera.
    private Semaphore mCameraOpenCloseLock = new Semaphore(1);
    private Size mPreviewSize = new Size(1920, 1080);
    private CaptureRequest.Builder mPreviewRequestBuilder;
    private CaptureRequest.Builder mCaptureRequestBuilder;
    private CaptureRequest.Builder mRecordRequestBuilder;
    private CameraCaptureSession mCaptureSession;
    private int mFacing = CameraCharacteristics.LENS_FACING_BACK;
    private Choreographer.FrameCallback mFrameCallback;
    private SurfaceTexture mSurfaceTexture;
    private File mCameraFile;
    private File mVideoFile;
    private TextureView mTextureView;

    private enum State {
        STATE_PREVIEW,
        STATE_CAPTURE,
    }
    State mState = State.STATE_PREVIEW;

    // camera capture process, step 3: create the ImageReader and set mImageAvailableListener
    private ImageReader.OnImageAvailableListener mImageAvailableListener = new ImageReader.OnImageAvailableListener() {
        @Override
        public void onImageAvailable(ImageReader reader) {
            if (mState == State.STATE_PREVIEW) {
                //Log.d(TAG, "##### onFrame: Preview");
                Image image = reader.acquireNextImage();
                image.close();
            } else if (mState == State.STATE_CAPTURE) {
                Log.d(TAG, "capture one picture to gallery");
                mCameraFile = new File("aa_" + new SimpleDateFormat("yyyyMMddHHmmss").format(new Date()) + ".jpg");
                mCameraHandler.post(new FileUtils.ImageSaver(mContext, reader.acquireLatestImage(), mCameraFile));
                mState = State.STATE_PREVIEW;
            } else {
                Log.d(TAG, "##### onFrame: default/nothing");
            }
        }
    };

    // camera preview process, step 2: instantiate mStateCallback
    private CameraDevice.StateCallback mStateCallback = new CameraDevice.StateCallback() {
        @Override
        public void onOpened(CameraDevice camera) {
            // Overriding onOpened is the key step
            mCameraOpenCloseLock.release();
            mCameraDevice = camera;
            startCaptureSession();
        }

        @Override
        public void onDisconnected(CameraDevice camera) {
            mCameraOpenCloseLock.release();
            camera.close();
            mCameraDevice = null;
        }

        @Override
        public void onError(CameraDevice camera, int error) {
            Log.e("DEBUG", "onError: " + error);
            mCameraOpenCloseLock.release();
            camera.close();
            mCameraDevice = null;
            Log.e("DEBUG", "onError: restart camera");
            stopPreview();
            startPreview();
        }
    };

    CameraCaptureSession.CaptureCallback mCaptureCallback = new CameraCaptureSession.CaptureCallback() {
        @Override
        public void onCaptureCompleted(CameraCaptureSession session, CaptureRequest request, TotalCaptureResult result) {
            super.onCaptureCompleted(session, request, result);
        }

        @Override
        public void onCaptureFailed(CameraCaptureSession session, CaptureRequest request, CaptureFailure failure) {
            super.onCaptureFailed(session, request, failure);
        }
    };

    public CameraCoreManager(Context context, TextureView textureView) {
        mContext = context;
        mCameraManager = (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
        mMediaRecorder = new MediaRecorder();
        mState = State.STATE_PREVIEW;
        mTextureView = textureView;
    }

    public void startPreview() {
        Log.d(TAG, "startPreview");
        if (!chooseCameraIdByFacing()) {
            Log.e(TAG, "Choose camera failed.");
            return;
        }
        mCameraThread = new HandlerThread("CameraThread");
        mCameraThread.start();
        mCameraHandler = new Handler(mCameraThread.getLooper());
        if (mImageReader == null) {
            mImageReader = ImageReader.newInstance(mPreviewSize.getWidth(), mPreviewSize.getHeight(), ImageFormat.JPEG, 2);
            mImageReader.setOnImageAvailableListener(mImageAvailableListener, mCameraHandler);
        } else {
            mImageReader.close();
        }
        openCamera();
    }

    public void stopPreview() {
        Log.d(TAG, "stopPreview");
        closeCamera();
        if (mCameraThread != null) {
            mCameraThread.quitSafely();
            mCameraThread = null;
        }
        mCameraHandler = null;
    }

    private boolean chooseCameraIdByFacing() {
        try {
            String ids[] = mCameraManager.getCameraIdList();
            if (ids.length == 0) {
                Log.e(TAG, "No available camera.");
                return false;
            }
            for (String cameraId : mCameraManager.getCameraIdList()) {
                CameraCharacteristics characteristics = mCameraManager.getCameraCharacteristics(cameraId);
                StreamConfigurationMap map = characteristics.get(
                        CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
                if (map == null) {
                    continue;
                }
                Integer level = characteristics.get(CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL);
                if (level == null || level == CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL_LEGACY) {
                    continue;
                }
                Integer internal = characteristics.get(CameraCharacteristics.LENS_FACING);
                if (internal == null) {
                    continue;
                }
                if (internal == mFacing) {
                    mCameraId = cameraId;
                    mCameraCharacteristics = characteristics;
                    return true;
                }
            }
            mCameraId = ids[1];
            mCameraCharacteristics = mCameraManager.getCameraCharacteristics(mCameraId);
            Integer level = mCameraCharacteristics.get(CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL);
            if (level == null || level == CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL_LEGACY) {
                return false;
            }
            Integer internal = mCameraCharacteristics.get(CameraCharacteristics.LENS_FACING);
            if (internal == null) {
                return false;
            }
            mFacing = CameraCharacteristics.LENS_FACING_BACK;
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
        return true;
    }

    @SuppressLint("MissingPermission")
    public void openCamera() {
        if (TextUtils.isEmpty(mCameraId)) {
            Log.e(TAG, "Open camera failed. No camera available");
            return;
        }
        try {
            if (!mCameraOpenCloseLock.tryAcquire(2500, TimeUnit.MILLISECONDS)) {
                throw new RuntimeException("Time out waiting to lock camera opening.");
            }
            // camera preview process, step 1: open the camera
            mCameraManager.openCamera(mCameraId, mStateCallback, mCameraHandler);
        } catch (InterruptedException | CameraAccessException e) {
            Log.e(TAG, e.getMessage());
        }
    }

    private void closeCamera() {
        try {
            mCameraOpenCloseLock.acquire();
            if (mCaptureSession != null) {
                mCaptureSession.close();
                mCaptureSession = null;
            }
            if (mCameraDevice != null) {
                mCameraDevice.close();
                mCameraDevice = null;
            }
            if (mImageReader != null) {
                mImageReader.close();
                mImageReader = null;
            }
        } catch (InterruptedException e) {
            throw new RuntimeException("Interrupted while trying to lock camera closing.", e);
        } finally {
            mCameraOpenCloseLock.release();
        }
    }

    private void startCaptureSession() {
        mState = State.STATE_PREVIEW;
        if (mCameraDevice == null) {
            return;
        }
        if ((mImageReader != null || mSurfaceTexture != null)) {
            try {
                closeCameraCaptureSession();
                // camera preview process, step 3: create a CaptureRequest.Builder; the templateType distinguishes still capture from preview
                mPreviewRequestBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
                // camera preview process, step 4: add the preview surface as a target of this request
                mPreviewRequestBuilder.addTarget(mImageReader.getSurface());
                List surfaceList = Arrays.asList(mImageReader.getSurface());
                if (mSurfaceTexture != null) {
                    mSurfaceTexture.setDefaultBufferSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());
                    Surface surface = new Surface(mSurfaceTexture);
                    mPreviewRequestBuilder.addTarget(mImageReader.getSurface());
                    mPreviewRequestBuilder.addTarget(surface);
                    // camera preview process, step 5: add the display preview surface to the request's target list
                    surfaceList = Arrays.asList(surface, mImageReader.getSurface());
                }
                Range[] fpsRanges = mCameraCharacteristics.get(CameraCharacteristics.CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES);
                Log.d("DEBUG", "##### fpsRange: " + Arrays.toString(fpsRanges));
                // camera preview process, steps 6 & 7
                // 6: call createCaptureSession
                // 7: instantiate CameraCaptureSession.StateCallback as a parameter and override onConfigured
                mCameraDevice.createCaptureSession(surfaceList, new CameraCaptureSession.StateCallback() {
                    @Override
                    public void onConfigured(CameraCaptureSession session) {
                        if (mCameraDevice == null) return;
                        mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_OFF);
                        mCaptureSession = session;
                        try {
                            if (mCaptureSession != null)
                                // camera preview process, step 8: start the preview with CameraCaptureSession.setRepeatingRequest()
                                mCaptureSession.setRepeatingRequest(mPreviewRequestBuilder.build(), mCaptureCallback, mCameraHandler);
                        } catch (CameraAccessException | IllegalArgumentException | IllegalStateException | NullPointerException e) {
                            e.printStackTrace();
                        }
                    }

                    @Override
                    public void onConfigureFailed(CameraCaptureSession session) {
                        Log.e(TAG, "Failed to configure capture session");
                    }

                    @Override
                    public void onClosed(CameraCaptureSession session) {
                        if (mCaptureSession != null && mCaptureSession.equals(session)) {
                            mCaptureSession = null;
                        }
                    }
                }, mCameraHandler);
            } catch (CameraAccessException e) {
                e.printStackTrace();
                Log.e(TAG, e.getMessage());
            } catch (IllegalStateException e) {
                stopPreview();
                startPreview();
            } catch (UnsupportedOperationException e) {
                e.printStackTrace();
                Log.e(TAG, e.getMessage());
            }
        }
    }

    public void captureStillPicture() {
        try {
            Log.d(TAG, "captureStillPicture");
            mState = State.STATE_CAPTURE;
            if (mCameraDevice == null) {
                return;
            }
            // camera capture process, step 1: create the CaptureRequest.Builder for the still capture
            mCaptureRequestBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
            // camera capture process, step 2: make the ImageReader's surface a target of the CaptureRequest.Builder
            mCaptureRequestBuilder.addTarget(mImageReader.getSurface());
            // auto-focus mode
            mCaptureRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
            // auto-exposure mode
            mCaptureRequestBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH);
            // auto control mode
            mCaptureRequestBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
            // camera rotation
            mCaptureRequestBuilder.set(CaptureRequest.JPEG_ORIENTATION, Surface.ROTATION_0);
            // stop the continuous preview
            mCaptureSession.stopRepeating();
            // camera capture process, steps 5 & 6: capture the still image; onCaptureCompleted runs when it finishes
            mCaptureSession.capture(mCaptureRequestBuilder.build(), new CameraCaptureSession.CaptureCallback() {
                @Override // fired when the capture completes
                public void onCaptureCompleted(CameraCaptureSession session, CaptureRequest request, TotalCaptureResult result) {
                    Log.d(TAG, "onCaptureCompleted");
                    startCaptureSession();
                }
            }, mCameraHandler);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    private void setupMediaRecorder() {
        // camera record process, step 1: stop the preview and prepare to switch to video recording
        try {
            mCaptureSession.stopRepeating();
            closeCameraCaptureSession();
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
        // camera record process, step 2: configure the MediaRecorder
        mVideoFile = new File(mContext.getExternalCacheDir(), new SimpleDateFormat("yyyyMMddHHmmss").format(new Date()) + "demo.mp4");
        mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);       // audio source
        mMediaRecorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);   // video source
        mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);  // output container format
        mMediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.DEFAULT); // audio encoder; DEFAULT here, a real app should choose AAC for compatibility
        mMediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.DEFAULT); // video encoder; DEFAULT here, a real app should choose H.264 for compatibility
        mMediaRecorder.setVideoEncodingBitRate(8 * 1024 * 1920); // bit rate, usually 1x to 10x (width * height); higher means a clearer video but a larger file
        mMediaRecorder.setVideoFrameRate(30); // frame rate; 30 is enough, larger values mostly just grow the file without much visible gain
        mMediaRecorder.setVideoSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());
        mMediaRecorder.setOrientationHint(90);
        Surface surface = new Surface(mTextureView.getSurfaceTexture());
        mMediaRecorder.setPreviewDisplay(surface);
        mMediaRecorder.setOutputFile(mVideoFile.getAbsolutePath());
        try {
            mMediaRecorder.prepare();
        } catch (IOException e) {
            e.printStackTrace();
        }
        SurfaceTexture surfaceTexture = mTextureView.getSurfaceTexture();
        surfaceTexture.setDefaultBufferSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());
        Surface previewSurface = new Surface(surfaceTexture);
        Surface recorderSurface = mMediaRecorder.getSurface(); // the Surface that MediaRecorder records from
        try {
            // camera record process, step 3: create a CaptureRequest.Builder initialized with CameraDevice.TEMPLATE_RECORD
            mPreviewRequestBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_RECORD);
            mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
            // camera record process, step 4: add the MediaRecorder surface and the preview surface to the request's target list
            mPreviewRequestBuilder.addTarget(previewSurface);
            mPreviewRequestBuilder.addTarget(recorderSurface);
            // Note that Arrays.asList(previewSurface, recorderSurface) passes two Surfaces: recording still needs an on-screen
            // preview, so the first Surface is for the preview and the second is the one MediaRecorder records into.
            // camera record process, step 5: call CameraDevice.createCaptureSession()
            mCameraDevice.createCaptureSession(Arrays.asList(previewSurface, recorderSurface), new CameraCaptureSession.StateCallback() {
                @Override
                public void onConfigured(CameraCaptureSession cameraCaptureSession) {
                    mCaptureSession = cameraCaptureSession;
                    Log.i(TAG, "Video Session:" + mCaptureSession.toString());
                    mPreviewRequestBuilder.set(CaptureRequest.CONTROL_MODE, CaptureRequest.CONTROL_MODE_AUTO);
                    mCameraHandler.post(new Runnable() {
                        @Override
                        public void run() {
                            try {
                                mCaptureSession.setRepeatingRequest(mPreviewRequestBuilder.build(), null, mCameraHandler);
                            } catch (CameraAccessException e) {
                                throw new RuntimeException("Can't visit camera");
                            }
                        }
                    });
                    mMediaRecorder.start();
                }

                @Override
                public void onConfigureFailed(CameraCaptureSession cameraCaptureSession) {
                }
            }, mCameraHandler);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    private void closeCameraCaptureSession() {
        if (mCaptureSession != null) {
            Log.i(TAG, "close Session:" + mCaptureSession.toString());
            mCaptureSession.close();
            mCaptureSession = null;
        }
    }

    public void startRecorder() {
        Log.d(TAG, "startRecorder");
        setupMediaRecorder();
    }

    public void stopRecorder() {
        Log.d(TAG, "stopRecorder");
        try {
            // camera record process, step 6: cancel the repeating capture
            mCaptureSession.stopRepeating();
            // discard all pending and in-progress captures
            mCaptureSession.abortCaptures();
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
        // camera record process, step 7: stop capturing and restart the preview
        mMediaRecorder.stop();
        mMediaRecorder.reset();
        // Optionally make the video visible to the system Gallery.
        mCameraHandler.post(new FileUtils.VideoSaver(mContext, mVideoFile));
        startCaptureSession();
    }

    public void setSurfaceTexture(SurfaceTexture surfaceTexture) {
        mSurfaceTexture = surfaceTexture;
    }
}
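The class above hard-codes setOrientationHint(90). A sketch of deriving the hint from the camera's sensor orientation instead, for a back-facing camera (deviceRotationDegrees would come from the display rotation, which this class does not currently track, so treat it as an assumed input):

private int getRecorderOrientationHint(int deviceRotationDegrees) {
    Integer sensorOrientation = mCameraCharacteristics.get(CameraCharacteristics.SENSOR_ORIENTATION);
    if (sensorOrientation == null) {
        return 90; // fall back to the fixed value used in setupMediaRecorder()
    }
    // Back-facing camera: combine the sensor orientation with the device rotation.
    return (sensorOrientation - deviceRotationDegrees + 360) % 360;
}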

The photo-saving and video-saving operations live in the FileUtils class, shown below:

class FileUtils {
    private static final String TAG = "FileUtils";

    // camera capture process, step 4: save the captured image
    public static class ImageSaver implements Runnable {
        private final Image mImage;
        private final File mFile;
        Context mContext;

        ImageSaver(Context context, Image image, File file) {
            mContext = context;
            mImage = image;
            mFile = file;
        }

        @Override
        public void run() {
            Log.d(TAG, "take picture Image Run");
            ByteBuffer buffer = mImage.getPlanes()[0].getBuffer();
            byte[] bytes = new byte[buffer.remaining()];
            buffer.get(bytes);
            ContentValues values = new ContentValues();
            values.put(MediaStore.Images.Media.DESCRIPTION, "This is an qr image");
            values.put(MediaStore.Images.Media.DISPLAY_NAME, mFile.getName());
            values.put(MediaStore.Images.Media.MIME_TYPE, "image/jpeg");
            values.put(MediaStore.Images.Media.TITLE, "Image.jpg");
            values.put(MediaStore.Images.Media.RELATIVE_PATH, "Pictures/");
            Uri external = MediaStore.Images.Media.EXTERNAL_CONTENT_URI;
            ContentResolver resolver = mContext.getContentResolver();
            Uri insertUri = resolver.insert(external, values);
            OutputStream os = null;
            try {
                if (insertUri != null) {
                    os = resolver.openOutputStream(insertUri);
                }
                if (os != null) {
                    os.write(bytes);
                }
            } catch (IOException e) {
                e.printStackTrace();
            } finally {
                mImage.close();
                try {
                    if (os != null) {
                        os.close();
                    }
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        }
    }

    public static class VideoSaver implements Runnable {
        private final File mFile;
        Context mContext;

        VideoSaver(Context context, File file) {
            mContext = context;
            mFile = file;
        }

        private ContentValues getVideoContentValues(File paramFile, long paramLong) {
            ContentValues values = new ContentValues();
            values.put(MediaStore.Video.Media.TITLE, paramFile.getName());
            values.put(MediaStore.Video.Media.DISPLAY_NAME, paramFile.getName());
            values.put(MediaStore.Video.Media.MIME_TYPE, "video/mp4");
            values.put(MediaStore.Video.Media.DATE_TAKEN, Long.valueOf(paramLong));
            values.put(MediaStore.Video.Media.DATE_MODIFIED, Long.valueOf(paramLong));
            values.put(MediaStore.Video.Media.DATE_ADDED, Long.valueOf(paramLong));
            values.put(MediaStore.Video.Media.DATA, paramFile.getAbsolutePath());
            values.put(MediaStore.Video.Media.SIZE, Long.valueOf(paramFile.length()));
            return values;
        }

        @Override
        public void run() {
            Log.d(TAG, "recorder video Run");
            ContentResolver localContentResolver = mContext.getContentResolver();
            ContentValues localContentValues = getVideoContentValues(mFile, System.currentTimeMillis());
            Uri localUri = localContentResolver.insert(MediaStore.Video.Media.EXTERNAL_CONTENT_URI, localContentValues);
            OutputStream os = null;
            FileInputStream fis = null;
            byte[] buf = new byte[1024];
            int len;
            try {
                if (localUri != null) {
                    fis = new FileInputStream(mFile);
                    os = localContentResolver.openOutputStream(localUri);
                }
                if (os != null) {
                    while ((len = fis.read(buf)) >= 0) {
                        os.write(buf, 0, len);
                    }
                }
            } catch (FileNotFoundException e) {
                e.printStackTrace();
            } catch (IOException e) {
                e.printStackTrace();
            } finally {
                try {
                    if (os != null) {
                        os.close();
                    }
                    if (fis != null) {
                        fis.close();
                    }
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        }
    }
}
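Since this series targets Android 10.0, note that the MediaStore.Video.Media.DATA column used by VideoSaver is deprecated under scoped storage. A hedged sketch of ContentValues that relies on RELATIVE_PATH and IS_PENDING instead, mirroring what ImageSaver already does for photos (the "Movies/" location is an assumption, not taken from the original):

private ContentValues getScopedVideoContentValues(File file, long timeMillis) {
    ContentValues values = new ContentValues();
    values.put(MediaStore.Video.Media.DISPLAY_NAME, file.getName());
    values.put(MediaStore.Video.Media.MIME_TYPE, "video/mp4");
    values.put(MediaStore.Video.Media.DATE_TAKEN, timeMillis);
    values.put(MediaStore.Video.Media.RELATIVE_PATH, "Movies/"); // scoped-storage location, assumed
    values.put(MediaStore.Video.Media.IS_PENDING, 1);            // mark as pending until the copy completes
    return values;
}

After copying the bytes through ContentResolver.openOutputStream(), the row would be updated with IS_PENDING set back to 0 so the Gallery can see the finished video.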

The Activity/UI code is shown below:

public class MainActivity extends AppCompatActivity implements View.OnClickListener {
    private static final String TAG = "CameraDemo";
    private ImageButton mTakePictureBtn;
    private Button mBtnVideoStart;
    private Button mBtnVideoStop;
    private CameraCoreManager manager;
    private TextureView mTextureView;
    Lock Mutex;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_camera);
        int rotation = getWindowManager().getDefaultDisplay().getRotation();
        mTakePictureBtn = findViewById(R.id.camera_take_picture);
        mTakePictureBtn.setOnClickListener(this);
        mBtnVideoStart = findViewById(R.id.btn_video_start);
        mBtnVideoStart.setOnClickListener(this);
        mBtnVideoStop = findViewById(R.id.btn_video_stop);
        mBtnVideoStop.setOnClickListener(this);
        mTextureView = findViewById(R.id.texture_view);
        mTextureView.setSurfaceTextureListener(new TextureView.SurfaceTextureListener() {
            @Override
            public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
                manager.setSurfaceTexture(surface);
                manager.startPreview();
            }

            @Override
            public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) {
            }

            @Override
            public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
                return false;
            }

            @Override
            public void onSurfaceTextureUpdated(SurfaceTexture surface) {
            }
        });
        manager = new CameraCoreManager(this, mTextureView);
        Log.d(TAG, "onCreate:init");
    }

    @Override
    public void onResume() {
        super.onResume();
    }

    @Override
    public void onPause() {
        super.onPause();
    }

    @Override
    protected void onDestroy() {
        super.onDestroy();
        manager.stopPreview();
    }

    @Override
    public void onClick(View v) {
        switch (v.getId()) {
            case R.id.camera_take_picture:
                Log.d(TAG, "takepicture");
                manager.captureStillPicture();
                break;
            case R.id.btn_video_start:
                manager.startRecorder();
                break;
            case R.id.btn_video_stop:
                manager.stopRecorder();
                break;
            default:
                break;
        }
    }
}

2.2 Layout file

The layout file involved here is mainly activity_camera.xml; its content is as follows:
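The XML content itself did not survive in this copy of the article. Below is a minimal reconstruction based only on the view IDs referenced in MainActivity (texture_view, camera_take_picture, btn_video_start, btn_video_stop); the layout type and every attribute value are assumptions:

<?xml version="1.0" encoding="utf-8"?>
<!-- Minimal reconstruction; only the IDs come from MainActivity, everything else is assumed. -->
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <TextureView
        android:id="@+id/texture_view"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />

    <ImageButton
        android:id="@+id/camera_take_picture"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_alignParentBottom="true"
        android:layout_centerHorizontal="true" />

    <Button
        android:id="@+id/btn_video_start"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_alignParentBottom="true"
        android:layout_alignParentStart="true"
        android:text="Record" />

    <Button
        android:id="@+id/btn_video_stop"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_alignParentBottom="true"
        android:layout_alignParentEnd="true"
        android:text="Stop" />
</RelativeLayout>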


