Recording audio and video with Camera2 + OpenGL ES + MediaCodec + AudioRecord, writing H264 SEI data

To document my learning process: I was given a requirement to record audio and video based on Camera2 + OpenGL ES + MediaCodec + AudioRecord.

Requirements:

  1. Write SEI user data into every video frame, so that custom per-frame data can be recovered when the stream is decoded later.
  2. When recording is triggered, save the audio and video from N seconds before the trigger to N seconds after it; output files are segmented at 60 s each.
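Requirement 2 boils down to window arithmetic: when the record button is pressed at presentation time T, every cached frame whose timestamp falls in [T - N, T + N] belongs in the output file. A minimal sketch of that rule (class and method names here are my own, not from the project):

```java
// Sketch: the [T - N, T + N] capture window behind "previous N seconds
// plus next N seconds" recording. All times are in microseconds, matching
// MediaCodec.BufferInfo.presentationTimeUs.
public class AlarmWindow {
    public final long startUs;
    public final long endUs;

    public AlarmWindow(long triggerUs, long intervalUs) {
        this.startUs = triggerUs - intervalUs; // N seconds of history
        this.endUs = triggerUs + intervalUs;   // N seconds still to come
    }

    // A cached frame is written to the file iff it falls inside the window.
    public boolean contains(long ptsUs) {
        return ptsUs >= startUs && ptsUs <= endUs;
    }

    public static void main(String[] args) {
        // Trigger at 30 s with N = 10 s: the window is [20 s, 40 s].
        AlarmWindow w = new AlarmWindow(30_000_000L, 10_000_000L);
        System.out.println(w.contains(25_000_000L)); // true
        System.out.println(w.contains(45_000_000L)); // false
    }
}
```

The same arithmetic appears later as mAlarmStartTime / mAlarmEndTime in AudioEncoder and VideoRecorder.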

Up front, this write-up covers the following topics, so you can quickly check whether it has what you are looking for:

  • Using MediaCodec, with createInputSurface() producing a Surface that receives the camera2 frames via EGL
  • Using AudioRecord
  • Using Camera2
  • Basic OpenGL usage
  • A simple example of writing H264 SEI data

The overall design is simple: open the camera, set up the OpenGL environment, then start a video thread to capture video data and an audio thread to capture audio data. Both streams are cached in custom Lists, and finally an encoding thread muxes the video List and the audio List into an MP4. I used Android SDK 28, because saving files is more troublesome on 29 and above. The full project hasn't been uploaded yet; message me if you need it.
All of this functionality is modularized into separate classes. Let's start with the standalone modules.

UI layout

The UI is simple: one GLSurfaceView and two Button controls.


<?xml version="1.0" encoding="utf-8"?>
<androidx.constraintlayout.widget.ConstraintLayout
    xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    tools:context=".MainActivity">

    <android.opengl.GLSurfaceView
        android:id="@+id/glView"
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        app:layout_constraintBottom_toBottomOf="parent"
        app:layout_constraintEnd_toEndOf="parent"
        app:layout_constraintStart_toStartOf="parent"
        app:layout_constraintTop_toTopOf="parent" />

    <Button
        android:id="@+id/recordBtn"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_marginBottom="80dp"
        android:text="Record"
        app:layout_constraintBottom_toBottomOf="parent"
        app:layout_constraintLeft_toLeftOf="parent"
        app:layout_constraintRight_toRightOf="parent" />

    <Button
        android:id="@+id/exit"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_marginTop="20dp"
        android:layout_marginRight="20dp"
        android:text="Exit"
        app:layout_constraintTop_toTopOf="parent"
        app:layout_constraintRight_toRightOf="parent" />
</androidx.constraintlayout.widget.ConstraintLayout>

Camera2

Using the camera2 framework is straightforward. One thing to note: the surface passed into startPreview is the one later handed to mCaptureRequestBuilder.addTarget(surface). That surface is produced in the following basic steps (the code is shown further down):
1. Generate an OpenGL texture: GLES30.glGenTextures(1, mTexture, 0);
2. Wrap the texture in a SurfaceTexture: mSurfaceTexture = new SurfaceTexture(mTexture[0]);
3. Create a Surface from the SurfaceTexture: mSurface = new Surface(mSurfaceTexture);
4. Start the preview: mCamera.startPreview(mSurface);

public class Camera2 {
    private final String TAG = "Abbott Camera2";
    private Context mContext;
    private CameraManager mCameraManager;
    private CameraDevice mCameraDevice;
    private String[] mCamList;
    private String mCameraId;
    private Size mPreviewSize;
    private HandlerThread mBackgroundThread;
    private Handler mBackgroundHandler;
    private CaptureRequest.Builder mCaptureRequestBuilder;
    private CaptureRequest mCaptureRequest;
    private CameraCaptureSession mCameraCaptureSession;

    public Camera2(Context context) {
        mContext = context;
        mCameraManager = (CameraManager) mContext.getSystemService(android.content.Context.CAMERA_SERVICE);
        try {
            mCamList = mCameraManager.getCameraIdList();
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
        mBackgroundThread = new HandlerThread("CameraThread");
        mBackgroundThread.start();
        mBackgroundHandler = new Handler(mBackgroundThread.getLooper());
    }

    public void openCamera(int width, int height, String id) {
        try {
            Log.d(TAG, "openCamera: id:" + id);
            CameraCharacteristics characteristics = mCameraManager.getCameraCharacteristics(id);
            if (characteristics.get(CameraCharacteristics.LENS_FACING) == CameraCharacteristics.LENS_FACING_FRONT) {
                // front-facing camera: no special handling
            }
            StreamConfigurationMap map = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
            mPreviewSize = getOptimalSize(map.getOutputSizes(SurfaceTexture.class), width, height);
            mCameraId = id;
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
        try {
            if (ActivityCompat.checkSelfPermission(mContext, android.Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
                return;
            }
            Log.d(TAG, "mCameraManager.openCamera: " + mCameraId);
            mCameraManager.openCamera(mCameraId, mStateCallback, mBackgroundHandler);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    private Size getOptimalSize(Size[] sizeMap, int width, int height) {
        List<Size> sizeList = new ArrayList<>();
        for (Size option : sizeMap) {
            if (width > height) {
                if (option.getWidth() > width && option.getHeight() > height) {
                    sizeList.add(option);
                }
            } else {
                if (option.getWidth() > height && option.getHeight() > width) {
                    sizeList.add(option);
                }
            }
        }
        if (sizeList.size() > 0) {
            return Collections.min(sizeList, new Comparator<Size>() {
                @Override
                public int compare(Size lhs, Size rhs) {
                    return Long.signum((long) lhs.getWidth() * lhs.getHeight() - (long) rhs.getWidth() * rhs.getHeight());
                }
            });
        }
        return sizeMap[0];
    }

    private final CameraDevice.StateCallback mStateCallback = new CameraDevice.StateCallback() {
        @Override
        public void onOpened(@NonNull CameraDevice camera) {
            mCameraDevice = camera;
        }

        @Override
        public void onDisconnected(@NonNull CameraDevice camera) {
            camera.close();
            mCameraDevice = null;
        }

        @Override
        public void onError(@NonNull CameraDevice camera, int error) {
            camera.close();
            mCameraDevice = null;
        }
    };

    public void startPreview(Surface surface) {
        try {
            mCaptureRequestBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
            mCaptureRequestBuilder.addTarget(surface);
            mCameraDevice.createCaptureSession(Collections.singletonList(surface), new CameraCaptureSession.StateCallback() {
                @Override
                public void onConfigured(@NonNull CameraCaptureSession session) {
                    try {
                        mCaptureRequest = mCaptureRequestBuilder.build();
                        mCameraCaptureSession = session;
                        mCameraCaptureSession.setRepeatingRequest(mCaptureRequest, null, mBackgroundHandler);
                    } catch (CameraAccessException e) {
                        e.printStackTrace();
                    }
                }

                @Override
                public void onConfigureFailed(@NonNull CameraCaptureSession session) {
                }
            }, mBackgroundHandler);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }
}

ImageList

This class is the cache for both the video and the audio data. There is nothing special to explain; it can be used as-is.

public class ImageList {
    private static final String TAG = "Abbott ImageList";
    private Object mImageListLock = new Object();
    int kCapacity;
    private List<ImageItem> mImageList = new CopyOnWriteArrayList<>();

    public ImageList(int capacity) {
        kCapacity = capacity;
    }

    public synchronized void addItem(long Timestamp, ByteBuffer byteBuffer, MediaCodec.BufferInfo bufferInfo) {
        synchronized (mImageListLock) {
            ImageItem item = new ImageItem(Timestamp, byteBuffer, bufferInfo);
            mImageList.add(item);
            if (mImageList.size() > kCapacity) {
                int excessItems = mImageList.size() - kCapacity;
                mImageList.subList(0, excessItems).clear();
            }
        }
    }

    public synchronized List<ImageItem> getItemsInTimeRange(long startTimestamp, long endTimestamp) {
        List<ImageItem> itemsInTimeRange = new ArrayList<>();
        synchronized (mImageListLock) {
            for (ImageItem item : mImageList) {
                long itemTimestamp = item.getTimestamp();
                // check whether the timestamp falls inside the requested range
                if (itemTimestamp >= startTimestamp && itemTimestamp <= endTimestamp) {
                    itemsInTimeRange.add(item);
                }
            }
        }
        return itemsInTimeRange;
    }

    public synchronized ImageItem getItem() {
        return mImageList.get(0);
    }

    public synchronized void removeItem() {
        mImageList.remove(0);
    }

    public synchronized int getSize() {
        return mImageList.size();
    }

    public static class ImageItem {
        private long mTimestamp;
        private ByteBuffer mVideoBuffer;
        private MediaCodec.BufferInfo mVideoBufferInfo;

        public ImageItem(long first, ByteBuffer second, MediaCodec.BufferInfo bufferInfo) {
            this.mTimestamp = first;
            this.mVideoBuffer = second;
            this.mVideoBufferInfo = bufferInfo;
        }

        public synchronized long getTimestamp() {
            return mTimestamp;
        }

        public synchronized ByteBuffer getVideoByteBuffer() {
            return mVideoBuffer;
        }

        public synchronized MediaCodec.BufferInfo getVideoBufferInfo() {
            return mVideoBufferInfo;
        }
    }
}
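Two behaviors matter here: adding past kCapacity evicts the oldest entries from the front, and getItemsInTimeRange selects by timestamp. Since ImageList itself depends on MediaCodec, here is a plain-Java analogue (my own names, payload replaced by a bare timestamp) that exercises both:

```java
import java.util.ArrayList;
import java.util.List;

// Plain-Java analogue of ImageList's bounded cache: once the size exceeds
// the capacity, the oldest entries are dropped from the front of the list.
public class BoundedCache {
    private final int capacity;
    private final List<Long> timestamps = new ArrayList<>();

    public BoundedCache(int capacity) {
        this.capacity = capacity;
    }

    public void add(long ts) {
        timestamps.add(ts);
        if (timestamps.size() > capacity) {
            // same eviction as ImageList.addItem: clear the excess prefix
            timestamps.subList(0, timestamps.size() - capacity).clear();
        }
    }

    // Same selection rule as getItemsInTimeRange: inclusive on both ends.
    public List<Long> inRange(long startTs, long endTs) {
        List<Long> out = new ArrayList<>();
        for (long ts : timestamps) {
            if (ts >= startTs && ts <= endTs) {
                out.add(ts);
            }
        }
        return out;
    }

    public int size() {
        return timestamps.size();
    }
}
```

With capacity 3, adding timestamps 1..5 leaves only 3, 4, 5 in the cache; a range query then returns the inclusive slice.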

GlProgram

A class that creates the OpenGL program. OpenGL ES 3.0 is used.

public class GlProgram {
    public static final String mVertexShader =
            "#version 300 es \n" +
            "in vec4 vPosition;" +
            "in vec2 vCoordinate;" +
            "out vec2 vTextureCoordinate;" +
            "void main() {" +
            "   gl_Position = vPosition;" +
            "   vTextureCoordinate = vCoordinate;" +
            "}";

    public static final String mFragmentShader =
            "#version 300 es \n" +
            "#extension GL_OES_EGL_image_external : require \n" +
            "#extension GL_OES_EGL_image_external_essl3 : require \n" +
            "precision mediump float;" +
            "in vec2 vTextureCoordinate;" +
            "uniform samplerExternalOES oesTextureSampler;" +
            // GLSL ES 3.00 has no built-in gl_FragColor; declare our own output.
            "out vec4 outColor;" +
            "void main() {" +
            "    outColor = texture(oesTextureSampler, vTextureCoordinate);" +
            "}";

    public static int createProgram(String vertexShaderSource, String fragShaderSource) {
        int program = GLES30.glCreateProgram();
        if (0 == program) {
            Log.e("Arc_ShaderManager", "create program error ,error=" + GLES30.glGetError());
            return 0;
        }
        int vertexShader = loadShader(GLES30.GL_VERTEX_SHADER, vertexShaderSource);
        if (0 == vertexShader) {
            return 0;
        }
        int fragShader = loadShader(GLES30.GL_FRAGMENT_SHADER, fragShaderSource);
        if (0 == fragShader) {
            return 0;
        }
        GLES30.glAttachShader(program, vertexShader);
        GLES30.glAttachShader(program, fragShader);
        GLES30.glLinkProgram(program);
        int[] status = new int[1];
        GLES30.glGetProgramiv(program, GLES30.GL_LINK_STATUS, status, 0);
        if (GLES30.GL_FALSE == status[0]) {
            String errorMsg = GLES30.glGetProgramInfoLog(program);
            Log.e("Arc_ShaderManager", "createProgram error : " + errorMsg);
            GLES30.glDeleteShader(vertexShader);
            GLES30.glDeleteShader(fragShader);
            GLES30.glDeleteProgram(program);
            return 0;
        }
        GLES30.glDetachShader(program, vertexShader);
        GLES30.glDetachShader(program, fragShader);
        GLES30.glDeleteShader(vertexShader);
        GLES30.glDeleteShader(fragShader);
        return program;
    }

    private static int loadShader(int type, String shaderSource) {
        int shader = GLES30.glCreateShader(type);
        if (0 == shader) {
            Log.e("Arc_ShaderManager", "create shader error, shader type=" + type + " , error=" + GLES30.glGetError());
            return 0;
        }
        GLES30.glShaderSource(shader, shaderSource);
        GLES30.glCompileShader(shader);
        int[] status = new int[1];
        GLES30.glGetShaderiv(shader, GLES30.GL_COMPILE_STATUS, status, 0);
        if (0 == status[0]) {
            String errorMsg = GLES30.glGetShaderInfoLog(shader);
            Log.e("Arc_ShaderManager", "createShader shader = " + type + "  error: " + errorMsg);
            GLES30.glDeleteShader(shader);
            return 0;
        }
        return shader;
    }
}

OesTexture

This class hooks up the OpenGL program introduced above, drawing the external texture using the vertex and fragment shader coordinates.

public class OesTexture {
    private static final String TAG = "Abbott OesTexture";
    private int mProgram;
    private final FloatBuffer mCordsBuffer;
    private final FloatBuffer mPositionBuffer;
    private int mPositionHandle;
    private int mCordsHandle;
    private int mOESTextureHandle;

    public OesTexture() {
        float[] positions = {
                -1.0f, 1.0f,
                -1.0f, -1.0f,
                1.0f, 1.0f,
                1.0f, -1.0f
        };
        float[] texCords = {
                0.0f, 0.0f,
                0.0f, 1.0f,
                1.0f, 0.0f,
                1.0f, 1.0f,
        };
        mPositionBuffer = ByteBuffer.allocateDirect(positions.length * 4).order(ByteOrder.nativeOrder()).asFloatBuffer();
        mPositionBuffer.put(positions).position(0);
        mCordsBuffer = ByteBuffer.allocateDirect(texCords.length * 4).order(ByteOrder.nativeOrder()).asFloatBuffer();
        mCordsBuffer.put(texCords).position(0);
    }

    public void init() {
        this.mProgram = GlProgram.createProgram(GlProgram.mVertexShader, GlProgram.mFragmentShader);
        if (0 == this.mProgram) {
            Log.e(TAG, "createProgram failed");
        }
        mPositionHandle = GLES30.glGetAttribLocation(mProgram, "vPosition");
        mCordsHandle = GLES30.glGetAttribLocation(mProgram, "vCoordinate");
        mOESTextureHandle = GLES30.glGetUniformLocation(mProgram, "oesTextureSampler");
        GLES30.glDisable(GLES30.GL_DEPTH_TEST);
    }

    public void PrepareTexture(int OESTextureId) {
        GLES30.glUseProgram(this.mProgram);
        GLES30.glEnableVertexAttribArray(mPositionHandle);
        GLES30.glVertexAttribPointer(mPositionHandle, 2, GLES30.GL_FLOAT, false, 2 * 4, mPositionBuffer);
        GLES30.glEnableVertexAttribArray(mCordsHandle);
        GLES30.glVertexAttribPointer(mCordsHandle, 2, GLES30.GL_FLOAT, false, 2 * 4, mCordsBuffer);
        GLES30.glActiveTexture(GLES30.GL_TEXTURE0);
        GLES30.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, OESTextureId);
        GLES30.glUniform1i(mOESTextureHandle, 0);
        GLES30.glDrawArrays(GLES30.GL_TRIANGLE_STRIP, 0, 4);
        GLES30.glDisableVertexAttribArray(mPositionHandle);
        GLES30.glDisableVertexAttribArray(mCordsHandle);
        GLES30.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, 0);
    }
}

The next three classes, VideoRecorder, AudioEncoder and EncodingRunnable, work together and are introduced next.

public class AudioEncoder extends Thread {
    private static final String TAG = "Abbott AudioEncoder";
    private static final int SAVEMP4_INTERNAL = Param.recordInternal * 1000 * 1000;
    private static final int SAMPLE_RATE = 44100;
    private static final int CHANNEL_COUNT = 1;
    private static final int BIT_RATE = 96000;
    private EncodingRunnable mEncodingRunnable;
    private MediaCodec mMediaCodec;
    private AudioRecord mAudioRecord;
    private MediaFormat mFormat;
    private MediaFormat mOutputFormat;
    private long nanoTime;
    int mBufferSizeInBytes = 0;
    boolean mExitThread = true;
    private ImageList mAudioList;
    private MediaCodec.BufferInfo mAudioBufferInfo;
    private boolean mAlarm = false;
    private long mAlarmTime;
    private long mAlarmStartTime;
    private long mAlarmEndTime;
    private List<ImageList.ImageItem> mMuxerImageItem;
    private Object mLock = new Object();
    private MediaCodec.BufferInfo mAlarmBufferInfo;

    public AudioEncoder(EncodingRunnable encodingRunnable) throws IOException {
        mEncodingRunnable = encodingRunnable;
        nanoTime = System.nanoTime();
        createAudio();
        createMediaCodec();
        int kCapacity = 1000 / 20 * Param.recordInternal;
        mAudioList = new ImageList(kCapacity);
    }

    public void createAudio() {
        mBufferSizeInBytes = AudioRecord.getMinBufferSize(SAMPLE_RATE, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        mAudioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, SAMPLE_RATE, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, mBufferSizeInBytes);
    }

    public void createMediaCodec() throws IOException {
        mFormat = MediaFormat.createAudioFormat(MediaFormat.MIMETYPE_AUDIO_AAC, SAMPLE_RATE, CHANNEL_COUNT);
        mFormat.setInteger(MediaFormat.KEY_AAC_PROFILE, MediaCodecInfo.CodecProfileLevel.AACObjectLC);
        mFormat.setInteger(MediaFormat.KEY_BIT_RATE, BIT_RATE);
        mFormat.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, 8192);
        mMediaCodec = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_AUDIO_AAC);
        mMediaCodec.configure(mFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
    }

    public synchronized void setAlarm() {
        synchronized (mLock) {
            Log.d(TAG, "setAudio Alarm enter");
            mEncodingRunnable.setAudioFormat(mOutputFormat);
            mEncodingRunnable.setAudioAlarmTrue();
            mAlarmTime = mAlarmBufferInfo.presentationTimeUs;
            mAlarmEndTime = mAlarmTime + SAVEMP4_INTERNAL;
            if (!mAlarm) {
                mAlarmStartTime = mAlarmTime - SAVEMP4_INTERNAL;
            }
            mAlarm = true;
            Log.d(TAG, "setAudio Alarm exit");
        }
    }

    @Override
    public void run() {
        super.run();
        mMediaCodec.start();
        mAudioRecord.startRecording();
        while (mExitThread) {
            synchronized (mLock) {
                byte[] inputAudioData = new byte[mBufferSizeInBytes];
                int res = mAudioRecord.read(inputAudioData, 0, inputAudioData.length);
                if (res > 0) {
                    if (mAudioRecord != null) {
                        enCodeAudio(inputAudioData);
                    }
                }
            }
        }
        Log.d(TAG, "AudioRecord run: exit");
    }

    private void enCodeAudio(byte[] inputAudioData) {
        mAudioBufferInfo = new MediaCodec.BufferInfo();
        int index = mMediaCodec.dequeueInputBuffer(-1);
        if (index < 0) {
            return;
        }
        ByteBuffer[] inputBuffers = mMediaCodec.getInputBuffers();
        ByteBuffer audioInputBuffer = inputBuffers[index];
        audioInputBuffer.clear();
        audioInputBuffer.put(inputAudioData);
        audioInputBuffer.limit(inputAudioData.length);
        mMediaCodec.queueInputBuffer(index, 0, inputAudioData.length, (System.nanoTime() - nanoTime) / 1000, 0);
        int status = mMediaCodec.dequeueOutputBuffer(mAudioBufferInfo, 0);
        ByteBuffer outputBuffer;
        if (status == MediaCodec.INFO_TRY_AGAIN_LATER) {
        } else if (status == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
            mOutputFormat = mMediaCodec.getOutputFormat();
        } else {
            while (status >= 0) {
                MediaCodec.BufferInfo tmpAudioBufferInfo = new MediaCodec.BufferInfo();
                tmpAudioBufferInfo.set(mAudioBufferInfo.offset, mAudioBufferInfo.size, mAudioBufferInfo.presentationTimeUs, mAudioBufferInfo.flags);
                mAlarmBufferInfo = new MediaCodec.BufferInfo();
                mAlarmBufferInfo.set(mAudioBufferInfo.offset, mAudioBufferInfo.size, mAudioBufferInfo.presentationTimeUs, mAudioBufferInfo.flags);
                outputBuffer = mMediaCodec.getOutputBuffer(status);
                ByteBuffer buffer = ByteBuffer.allocate(tmpAudioBufferInfo.size);
                buffer.limit(tmpAudioBufferInfo.size);
                buffer.put(outputBuffer);
                buffer.flip();
                if (tmpAudioBufferInfo.size > 0) {
                    if (mAlarm) {
                        mMuxerImageItem = mAudioList.getItemsInTimeRange(mAlarmStartTime, mAlarmEndTime);
                        for (ImageList.ImageItem item : mMuxerImageItem) {
                            mEncodingRunnable.pushAudio(item);
                        }
                        mAlarmStartTime = tmpAudioBufferInfo.presentationTimeUs;
                        mAudioList.addItem(tmpAudioBufferInfo.presentationTimeUs, buffer, tmpAudioBufferInfo);
                        if (tmpAudioBufferInfo.presentationTimeUs - mAlarmTime > SAVEMP4_INTERNAL) {
                            mAlarm = false;
                            mEncodingRunnable.setAudioAlarmFalse();
                            Log.d(TAG, "mEncodingRunnable.setAudioAlarmFalse();");
                        }
                    } else {
                        mAudioList.addItem(tmpAudioBufferInfo.presentationTimeUs, buffer, tmpAudioBufferInfo);
                    }
                }
                mMediaCodec.releaseOutputBuffer(status, false);
                status = mMediaCodec.dequeueOutputBuffer(mAudioBufferInfo, 0);
            }
        }
    }

    public synchronized void stopAudioRecord() throws IllegalStateException {
        synchronized (mLock) {
            mExitThread = false;
        }
        try {
            join();
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
        mMediaCodec.stop();
        mMediaCodec.release();
        mMediaCodec = null;
    }
}
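Both caches are sized as `seconds × frames-per-second`: video assumes 25 fps (one frame every 40 ms), audio assumes roughly one AAC frame every 20 ms. Since an AAC frame carries 1024 samples, at 44100 Hz a frame actually spans about 23.2 ms, so the 20 ms figure slightly over-provisions the cache, which is the safe direction for a bound. A sketch of that arithmetic (class name is mine):

```java
// Sketch of the cache-capacity arithmetic used by AudioEncoder and
// VideoRecorder: each cache must hold recordInternal seconds of frames.
public class CacheCapacity {
    // Video: 25 fps => one frame every 40 ms.
    public static int videoCapacity(int seconds) {
        return 1000 / 40 * seconds;
    }

    // Audio: the project assumes ~20 ms per AAC frame.
    public static int audioCapacity(int seconds) {
        return 1000 / 20 * seconds;
    }

    // Actual AAC frame duration: 1024 samples per frame, in milliseconds.
    public static double aacFrameMs(int sampleRate) {
        return 1024.0 * 1000 / sampleRate;
    }
}
```

For a 10-second window this gives a video capacity of 250 frames and an audio capacity of 500 frames.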
public class VideoRecorder extends Thread {
    private static final String TAG = "Abbott VideoRecorder";
    private static final int SAVE_MP4_Internal = 1000 * 1000 * Param.recordInternal;
    // EGL
    private static final int EGL_RECORDABLE_ANDROID = 0x3142;
    private EGLContext mEGLContext = EGL14.EGL_NO_CONTEXT;
    private EGLDisplay mEGLDisplay = EGL14.EGL_NO_DISPLAY;
    private EGLSurface mEGLSurface = EGL14.EGL_NO_SURFACE;
    private EGLContext mSharedContext = EGL14.EGL_NO_CONTEXT;
    private Surface mSurface;
    private int mOESTextureId;
    private OesTexture mOesTexture;
    private ImageList mImageList;
    private List<ImageList.ImageItem> muxerImageItem;
    // Thread
    private boolean mExitThread;
    private Object mLock = new Object();
    private Object object = new Object();
    private MediaCodec mMediaCodec;
    private MediaFormat mOutputFormat;
    private boolean mAlarm = false;
    private long mAlarmTime;
    private long mAlarmStartTime;
    private long mAlarmEndTime;
    private MediaCodec.BufferInfo mBufferInfo;
    private EncodingRunnable mEncodingRunnable;
    private String mSeiMessage;

    public VideoRecorder(EGLContext eglContext, EncodingRunnable encodingRunnable) {
        mSharedContext = eglContext;
        mEncodingRunnable = encodingRunnable;
        int kCapacity = 1000 / 40 * Param.recordInternal;
        mImageList = new ImageList(kCapacity);
        try {
            MediaFormat mediaFormat = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, 1920, 1080);
            mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
            mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 1920 * 1080 * 25 / 5);
            mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 25);
            mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
            mMediaCodec = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
            mMediaCodec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
            mSurface = mMediaCodec.createInputSurface();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void run() {
        super.run();
        try {
            initEgl();
            mOesTexture = new OesTexture();
            mOesTexture.init();
            synchronized (mLock) {
                mLock.wait(33);
            }
            guardedRun();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    private void guardedRun() throws InterruptedException, RuntimeException {
        mExitThread = false;
        while (true) {
            synchronized (mLock) {
                if (mExitThread) {
                    break;
                }
                mLock.wait(33);
            }
            mOesTexture.PrepareTexture(mOESTextureId);
            swapBuffers();
            enCodeVideo();
        }
        Log.d(TAG, "guardedRun: exit");
        unInitEgl();
    }

    private void enCodeVideo() {
        mBufferInfo = new MediaCodec.BufferInfo();
        int status = mMediaCodec.dequeueOutputBuffer(mBufferInfo, 0);
        if (status == MediaCodec.INFO_TRY_AGAIN_LATER) {
        } else if (status == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
            mOutputFormat = mMediaCodec.getOutputFormat();
        } else if (status == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
        } else {
            ByteBuffer outputBuffer = mMediaCodec.getOutputBuffer(status);
            if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
                mBufferInfo.size = 0;
            }
            if (mBufferInfo.size > 0) {
                outputBuffer.position(mBufferInfo.offset);
                outputBuffer.limit(mBufferInfo.offset + mBufferInfo.size); // limit is offset + size
                mSeiMessage = "avcIndex" + String.format("%05d", 0);
                mEncodingRunnable.setTimeUs(mBufferInfo.presentationTimeUs);
                // prepend the SEI NAL to the encoded frame
                ByteBuffer seiData = buildSEIData(mSeiMessage);
                ByteBuffer frameWithSEI = ByteBuffer.allocate(outputBuffer.remaining() + seiData.remaining());
                frameWithSEI.put(seiData);
                frameWithSEI.put(outputBuffer);
                frameWithSEI.flip();
                mBufferInfo.size = frameWithSEI.remaining();
                MediaCodec.BufferInfo tmpBufferInfo = new MediaCodec.BufferInfo();
                tmpBufferInfo.set(mBufferInfo.offset, mBufferInfo.size, mBufferInfo.presentationTimeUs, mBufferInfo.flags);
                if (mAlarm) {
                    muxerImageItem = mImageList.getItemsInTimeRange(mAlarmStartTime, mAlarmEndTime);
                    mAlarmStartTime = tmpBufferInfo.presentationTimeUs;
                    for (ImageList.ImageItem item : muxerImageItem) {
                        mEncodingRunnable.push(item);
                    }
                    mImageList.addItem(tmpBufferInfo.presentationTimeUs, frameWithSEI, tmpBufferInfo);
                    if (mBufferInfo.presentationTimeUs - mAlarmTime > SAVE_MP4_Internal) {
                        Log.d(TAG, "mEncodingRunnable.setVideoAlarmFalse()");
                        Log.d(TAG, tmpBufferInfo.presentationTimeUs + " " + mAlarmTime);
                        mAlarm = false;
                        mEncodingRunnable.setVideoAlarmFalse();
                    }
                } else {
                    mImageList.addItem(tmpBufferInfo.presentationTimeUs, frameWithSEI, tmpBufferInfo);
                }
            }
            // release only after the frame data has been copied out
            mMediaCodec.releaseOutputBuffer(status, false);
        }
    }

    public synchronized void setAlarm() {
        synchronized (mLock) {
            Log.d(TAG, "setAlarm enter");
            mEncodingRunnable.setMediaFormat(mOutputFormat);
            mEncodingRunnable.setVideoAlarmTrue();
            if (mBufferInfo.presentationTimeUs != 0) {
                mAlarmTime = mBufferInfo.presentationTimeUs;
            }
            mAlarmEndTime = mAlarmTime + SAVE_MP4_Internal;
            if (!mAlarm) {
                mAlarmStartTime = mAlarmTime - SAVE_MP4_Internal;
            }
            mAlarm = true;
            Log.d(TAG, "setAlarm exit");
        }
    }

    public synchronized void startRecord() throws IllegalStateException {
        super.start();
        mMediaCodec.start();
    }

    public synchronized void stopVideoRecord() throws IllegalStateException {
        synchronized (mLock) {
            mExitThread = true;
            mLock.notify();
        }
        try {
            join();
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
        mMediaCodec.signalEndOfInputStream();
        mMediaCodec.stop();
        mMediaCodec.release();
        mMediaCodec = null;
    }

    public void requestRender(int i) {
        synchronized (object) {
            mOESTextureId = i;
        }
    }

    private void initEgl() {
        this.mEGLDisplay = EGL14.eglGetDisplay(EGL14.EGL_DEFAULT_DISPLAY);
        if (this.mEGLDisplay == EGL14.EGL_NO_DISPLAY) {
            throw new RuntimeException("EGL14.eglGetDisplay fail...");
        }
        int[] major_version = new int[2];
        boolean eglInited = EGL14.eglInitialize(this.mEGLDisplay, major_version, 0, major_version, 1);
        if (!eglInited) {
            this.mEGLDisplay = null;
            throw new RuntimeException("EGL14.eglInitialize fail...");
        }
        // choose an EGL config for the display (RGBA8888, recordable)
        int[] attrib_list = new int[]{
                EGL14.EGL_SURFACE_TYPE, EGL14.EGL_WINDOW_BIT,
                EGL14.EGL_RENDERABLE_TYPE, EGL14.EGL_OPENGL_ES2_BIT,
                EGL14.EGL_RED_SIZE, 8,
                EGL14.EGL_GREEN_SIZE, 8,
                EGL14.EGL_BLUE_SIZE, 8,
                EGL14.EGL_ALPHA_SIZE, 8,
                EGL14.EGL_DEPTH_SIZE, 16,
                EGL_RECORDABLE_ANDROID, 1,
                EGL14.EGL_NONE};
        EGLConfig[] configs = new EGLConfig[1];
        int[] numConfigs = new int[1];
        boolean eglChose = EGL14.eglChooseConfig(this.mEGLDisplay, attrib_list, 0, configs, 0, configs.length, numConfigs, 0);
        if (!eglChose) {
            throw new RuntimeException("eglChooseConfig [RGBA888 + recordable] ES2 EGL_config_fail...");
        }
        int[] attr_list = {EGL14.EGL_CONTEXT_CLIENT_VERSION, 2, EGL14.EGL_NONE};
        this.mEGLContext = EGL14.eglCreateContext(this.mEGLDisplay, configs[0], this.mSharedContext, attr_list, 0);
        checkEglError("eglCreateContext");
        if (this.mEGLContext == EGL14.EGL_NO_CONTEXT) {
            throw new RuntimeException("eglCreateContext == EGL_NO_CONTEXT");
        }
        int[] surface_attr = {EGL14.EGL_NONE};
        this.mEGLSurface = EGL14.eglCreateWindowSurface(this.mEGLDisplay, configs[0], this.mSurface, surface_attr, 0);
        if (this.mEGLSurface == EGL14.EGL_NO_SURFACE) {
            throw new RuntimeException("eglCreateWindowSurface == EGL_NO_SURFACE");
        }
        Log.d(TAG, "initEgl , display=" + this.mEGLDisplay + " ,context=" + this.mEGLContext + " ,sharedContext= " +
                this.mSharedContext + ", surface=" + this.mEGLSurface);
        boolean success = EGL14.eglMakeCurrent(this.mEGLDisplay, this.mEGLSurface, this.mEGLSurface, this.mEGLContext);
        if (!success) {
            checkEglError("makeCurrent");
            throw new RuntimeException("eglMakeCurrent failed");
        }
    }

    private void unInitEgl() {
        boolean success = EGL14.eglMakeCurrent(mEGLDisplay, EGL14.EGL_NO_SURFACE, EGL14.EGL_NO_SURFACE, EGL14.EGL_NO_CONTEXT);
        if (!success) {
            checkEglError("makeCurrent");
            throw new RuntimeException("eglMakeCurrent failed");
        }
        if (this.mEGLDisplay != EGL14.EGL_NO_DISPLAY) {
            EGL14.eglDestroySurface(this.mEGLDisplay, this.mEGLSurface);
            EGL14.eglDestroyContext(this.mEGLDisplay, this.mEGLContext);
            EGL14.eglTerminate(this.mEGLDisplay);
        }
        this.mEGLDisplay = EGL14.EGL_NO_DISPLAY;
        this.mEGLContext = EGL14.EGL_NO_CONTEXT;
        this.mEGLSurface = EGL14.EGL_NO_SURFACE;
        this.mSharedContext = EGL14.EGL_NO_CONTEXT;
        this.mSurface = null;
    }

    private boolean swapBuffers() {
        if ((null == this.mEGLDisplay) || (null == this.mEGLSurface)) {
            return false;
        }
        boolean success = EGL14.eglSwapBuffers(this.mEGLDisplay, this.mEGLSurface);
        if (!success) {
            checkEglError("eglSwapBuffers");
        }
        return success;
    }

    private void checkEglError(String msg) {
        int error = EGL14.eglGetError();
        if (error != EGL14.EGL_SUCCESS) {
            throw new RuntimeException(msg + ": EGL_ERROR_CODE: 0x" + Integer.toHexString(error));
        }
    }

    private ByteBuffer buildSEIData(String message) {
        // build the SEI NAL: start code + NAL type 6 (SEI) + payload type 5
        int seiSize = 128;
        ByteBuffer seiBuffer = ByteBuffer.allocate(seiSize);
        seiBuffer.put(new byte[]{0, 0, 0, 1, 6, 5});
        // SEI message: one size byte followed by the user data
        String seiMessage = "h264testdata" + message;
        seiBuffer.put((byte) seiMessage.length());
        seiBuffer.put(seiMessage.getBytes());
        seiBuffer.flip();
        return seiBuffer;
    }
}
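buildSEIData above emits an Annex-B SEI NAL unit: start code, NAL type 6, payload type 5 (user data), a one-byte payload size, and the message bytes. A standalone plain-Java sketch of the same layout, plus a matching parser, is below. Note two assumptions of mine: I append an RBSP trailing byte (0x80), which the project's version omits, and like the original I do not handle payloads of 255+ bytes, which would need the 0xFF size-extension scheme from the H.264 spec.

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

// Standalone sketch of the SEI layout used by buildSEIData():
// [00 00 00 01] [06] [05] [size] [payload bytes] [80]
public class SeiNal {
    public static byte[] build(String message) {
        byte[] payload = message.getBytes(StandardCharsets.UTF_8);
        ByteBuffer buf = ByteBuffer.allocate(6 + 1 + payload.length + 1);
        buf.put(new byte[]{0, 0, 0, 1, 6, 5}); // start code + NAL type 6 + payload type 5
        buf.put((byte) payload.length);        // single-byte payload size (< 255 only)
        buf.put(payload);                      // the user data itself
        buf.put((byte) 0x80);                  // rbsp_trailing_bits
        return buf.array();
    }

    // Recover the message from a NAL produced by build().
    public static String parse(byte[] nal) {
        int size = nal[6] & 0xFF;              // payload size sits after the 6 header bytes
        return new String(nal, 7, size, StandardCharsets.UTF_8);
    }
}
```

A decoder that ignores SEI will skip this NAL, while your own tooling can scan for NAL type 6 and read the payload back with parse().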
public class EncodingRunnable extends Thread {
    private static final String TAG = "Abbott EncodingRunnable";
    private Object mRecordLock = new Object();
    private boolean mExitThread = false;
    private MediaMuxer mMediaMuxer;
    private int avcIndex;
    private int mAudioIndex;
    private MediaFormat mOutputFormat;
    private MediaFormat mAudioOutputFormat;
    private ImageList mImageList;
    private ImageList mAudioImageList;
    private boolean itemAlarm;
    private long mAudioImageListTimeUs = -1;
    private boolean mAudioAlarm;
    private int mVideoCapcity = 1000 / 40 * Param.recordInternal;
    private int mAudioCapcity = 1000 / 20 * Param.recordInternal;
    private int recordSecond = 1000 * 1000 * 60;
    long Video60sStart = -1;
    private boolean mIsRecoding = false;

    public EncodingRunnable() {
        mImageList = new ImageList(mVideoCapcity);
        mAudioImageList = new ImageList(mAudioCapcity);
    }

    public void setMediaFormat(MediaFormat outputFormat) {
        if (mOutputFormat == null) {
            mOutputFormat = outputFormat;
        }
    }

    public void setAudioFormat(MediaFormat outputFormat) {
        if (mAudioOutputFormat == null) {
            mAudioOutputFormat = outputFormat;
        }
    }

    public void setMediaMuxerConfig() {
        long currentTimeMillis = System.currentTimeMillis();
        Date currentDate = new Date(currentTimeMillis);
        SimpleDateFormat dateFormat = new SimpleDateFormat("yyyyMMdd_HHmmss", Locale.getDefault());
        String fileName = dateFormat.format(currentDate);
        File mFile = new File(Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DCIM), fileName + ".MP4");
        Log.d(TAG, "setMediaMuxerSavaPath: new MediaMuxer  " + mFile.getPath());
        try {
            mMediaMuxer = new MediaMuxer(mFile.getPath(), MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
        } catch (IOException e) {
            e.printStackTrace();
        }
        avcIndex = mMediaMuxer.addTrack(mOutputFormat);
        mAudioIndex = mMediaMuxer.addTrack(mAudioOutputFormat);
        mMediaMuxer.start();
    }

    public void setMediaMuxerSavaPath() {
        if (!mIsRecoding) {
            mExitThread = false;
            setMediaMuxerConfig();
            setRecording();
            notifyStartRecord();
        }
    }

    @Override
    public void run() {
        super.run();
        while (true) {
            synchronized (mRecordLock) {
                try {
                    mRecordLock.wait();
                } catch (InterruptedException e) {
                    e.printStackTrace();
                }
            }
            MediaCodec.BufferInfo tmpAudioBufferInfo = new MediaCodec.BufferInfo();
            while (mIsRecoding) {
                if (mAudioImageList.getSize() > 0) {
                    ImageList.ImageItem audioItem = mAudioImageList.getItem();
                    tmpAudioBufferInfo.set(audioItem.getVideoBufferInfo().offset,
                            audioItem.getVideoBufferInfo().size,
                            audioItem.getVideoBufferInfo().presentationTimeUs + mAudioImageListTimeUs,
                            audioItem.getVideoBufferInfo().flags);
                    mMediaMuxer.writeSampleData(mAudioIndex, audioItem.getVideoByteBuffer(), tmpAudioBufferInfo);
                    mAudioImageList.removeItem();
                }
                if (mImageList.getSize() > 0) {
                    ImageList.ImageItem item = mImageList.getItem();
                    if (Video60sStart < 0) {
                        Video60sStart = item.getVideoBufferInfo().presentationTimeUs;
                    }
                    mMediaMuxer.writeSampleData(avcIndex, item.getVideoByteBuffer(), item.getVideoBufferInfo());
                    if (item.getVideoBufferInfo().presentationTimeUs - Video60sStart > recordSecond) {
                        Log.d(TAG, "segment length (us): " + (item.getVideoBufferInfo().presentationTimeUs - Video60sStart));
                        mMediaMuxer.stop();
                        mMediaMuxer.release();
                        mMediaMuxer = null;
                        setMediaMuxerConfig();
                        Video60sStart = -1;
                    }
                    mImageList.removeItem();
                }
                if (itemAlarm == false && mAudioAlarm == false) {
                    mIsRecoding = false;
                    Log.d(TAG, "mediaMuxer.stop()");
                    mMediaMuxer.stop();
                    mMediaMuxer.release();
                    mMediaMuxer = null;
                    break;
                }
            }
            if (mExitThread) {
                break;
            }
        }
    }

    public synchronized void setRecording() throws IllegalStateException {
        synchronized (mRecordLock) {
            mIsRecoding = true;
        }
    }

    public synchronized void setAudioAlarmTrue() throws IllegalStateException {
        synchronized (mRecordLock) {
            mAudioAlarm = true;
        }
    }

    public synchronized void setVideoAlarmTrue() throws IllegalStateException {
        synchronized (mRecordLock) {
            itemAlarm = true;
        }
    }

    public synchronized void setAudioAlarmFalse() throws IllegalStateException {
        synchronized (mRecordLock) {
            mAudioAlarm = false;
        }
    }

    public synchronized void setVideoAlarmFalse() throws IllegalStateException {
        synchronized (mRecordLock) {
            itemAlarm = false;
        }
    }

    public synchronized void notifyStartRecord() throws IllegalStateException {
        synchronized (mRecordLock) {
            mRecordLock.notify();
        }
    }

    public synchronized void push(ImageList.ImageItem item) {
        mImageList.addItem(item.getTimestamp(), item.getVideoByteBuffer(), item.getVideoBufferInfo());
    }

    public synchronized void pushAudio(ImageList.ImageItem item) {
        synchronized (mRecordLock) {
            mAudioImageList.addItem(item.getTimestamp(), item.getVideoByteBuffer(), item.getVideoBufferInfo());
        }
    }

    public synchronized void setTimeUs(long l) {
        if (mAudioImageListTimeUs != -1) {
            return;
        }
        mAudioImageListTimeUs = l;
        Log.d(TAG, "setTimeUs: " + l);
    }

    public synchronized void setExitThread() {
        mExitThread = true;
        mIsRecoding = false;
        notifyStartRecord();
        try {
            join();
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }
}
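The 60-second segmentation above is driven purely by presentation timestamps: the first frame of a segment sets the start time, and once a frame's timestamp exceeds start + 60 s the muxer is stopped and a new file is opened. That decision rule can be isolated into a small testable sketch (class name is mine):

```java
// Sketch of EncodingRunnable's segment-rotation rule: rotate the MP4 once
// the current segment spans more than segmentUs of presentation time.
public class SegmentRoller {
    private final long segmentUs;
    private long segmentStartUs = -1; // -1 means "next frame starts a segment"
    private int segmentCount = 0;

    public SegmentRoller(long segmentUs) {
        this.segmentUs = segmentUs;
    }

    // Returns true when the caller should stop the current muxer and open a new file.
    public boolean onFrame(long ptsUs) {
        if (segmentStartUs < 0) {
            segmentStartUs = ptsUs;
            segmentCount++;
        }
        if (ptsUs - segmentStartUs > segmentUs) {
            segmentStartUs = -1; // the next frame opens the next segment
            return true;
        }
        return false;
    }

    public int segments() {
        return segmentCount;
    }
}
```

Feeding 150 s of 25 fps timestamps into a 60 s roller produces rotations at roughly the 60 s and 120 s marks, i.e. three segments in total.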

Finally, Camera2Renderer and MainActivity.

Camera2Renderer

Camera2Renderer implements GLSurfaceView.Renderer; this class drives all of the code above.

```java
public class Camera2Renderer implements GLSurfaceView.Renderer {
    private static final String TAG = "Abbott Camera2Renderer";
    final private Context mContext;
    final private GLSurfaceView mGlSurfaceView;
    private Camera2 mCamera;
    private int[] mTexture = new int[1];
    private SurfaceTexture mSurfaceTexture;
    private Surface mSurface;
    private OesTexture mOesTexture;
    private EGLContext mEglContext = null;
    private VideoRecorder mVideoRecorder;
    private EncodingRunnable mEncodingRunnable;
    private AudioEncoder mAudioEncoder;

    public Camera2Renderer(Context context, GLSurfaceView glSurfaceView, EncodingRunnable encodingRunnable) {
        mContext = context;
        mGlSurfaceView = glSurfaceView;
        mEncodingRunnable = encodingRunnable;
    }

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        mCamera = new Camera2(mContext);
        mCamera.openCamera(1920, 1080, "0");
        mOesTexture = new OesTexture();
        mOesTexture.init();
        mEglContext = EGL14.eglGetCurrentContext();
        mVideoRecorder = new VideoRecorder(mEglContext, mEncodingRunnable);
        mVideoRecorder.startRecord();
        try {
            mAudioEncoder = new AudioEncoder(mEncodingRunnable);
            mAudioEncoder.start();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        GLES30.glGenTextures(1, mTexture, 0);
        GLES30.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, mTexture[0]);
        GLES30.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_NEAREST);
        GLES30.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GL10.GL_TEXTURE_MAG_FILTER, GL10.GL_LINEAR);
        GLES30.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GL10.GL_TEXTURE_WRAP_S, GL10.GL_CLAMP_TO_EDGE);
        GLES30.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GL10.GL_TEXTURE_WRAP_T, GL10.GL_CLAMP_TO_EDGE);
        GLES30.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, 0);
        mSurfaceTexture = new SurfaceTexture(mTexture[0]);
        mSurfaceTexture.setDefaultBufferSize(1920, 1080);
        mSurfaceTexture.setOnFrameAvailableListener(new SurfaceTexture.OnFrameAvailableListener() {
            @Override
            public void onFrameAvailable(SurfaceTexture surfaceTexture) {
                mGlSurfaceView.requestRender();
            }
        });
        mSurface = new Surface(mSurfaceTexture);
        mCamera.startPreview(mSurface);
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        mSurfaceTexture.updateTexImage();
        mOesTexture.PrepareTexture(mTexture[0]);
        mVideoRecorder.requestRender(mTexture[0]);
    }

    public VideoRecorder getVideoRecorder() {
        return mVideoRecorder;
    }

    public AudioEncoder getAudioEncoder() {
        return mAudioEncoder;
    }
}
```
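On every `onDrawFrame`, the renderer above hands the current frame to `VideoRecorder`, and the overall design keeps encoded video and audio samples in list-based caches so that pressing Record can reach back N seconds into the past. The project's actual buffer classes are not shown here, so the following is only a minimal sketch of that idea under assumed names (`TimedFrameCache`, `Frame` are hypothetical): a time-bounded cache that evicts anything older than the configured window.

```java
import java.util.ArrayDeque;

// Hypothetical sketch (not the project's real class) of the time-bounded
// cache implied by the "record from N seconds before the button press"
// requirement: keep only the most recent `windowUs` of encoded frames.
public class TimedFrameCache {
    public static class Frame {
        public final long ptsUs;   // presentation timestamp, microseconds
        public final byte[] data;  // encoded sample bytes
        public Frame(long ptsUs, byte[] data) { this.ptsUs = ptsUs; this.data = data; }
    }

    private final ArrayDeque<Frame> frames = new ArrayDeque<>();
    private final long windowUs;   // size of the retained history window

    public TimedFrameCache(long windowUs) { this.windowUs = windowUs; }

    public synchronized void push(Frame f) {
        frames.addLast(f);
        // Evict frames older than (newest pts - window).
        while (!frames.isEmpty() && f.ptsUs - frames.peekFirst().ptsUs > windowUs) {
            frames.removeFirst();
        }
    }

    public synchronized int size() { return frames.size(); }
}
```

When the alarm fires, a drain step would copy the cached frames to the muxer thread and keep appending live frames until the post-trigger N seconds have elapsed.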

MainActivity itself is simple: it requests the runtime permissions and wires up the renderer, the encoding thread, and the two buttons.

```java
public class MainActivity extends AppCompatActivity {
    private static final String TAG = "Abbott MainActivity";
    private static final String FRAGMENT_DIALOG = "dialog";
    private final Object mLock = new Object();
    private GLSurfaceView mGlSurfaceView;
    private Button mRecordButton;
    private Button mExitButton;
    private Camera2Renderer mCamera2Renderer;
    private VideoRecorder mVideoRecorder;
    private EncodingRunnable mEncodingRunnable;
    private AudioEncoder mAudioEncoder;
    private static final int REQUEST_CAMERA_PERMISSION = 1;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Bug fix: the original checked WRITE_EXTERNAL_STORAGE twice and never
        // checked CAMERA; the first check should be CAMERA.
        if (ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED
                || ContextCompat.checkSelfPermission(this, Manifest.permission.WRITE_EXTERNAL_STORAGE) != PackageManager.PERMISSION_GRANTED
                || ContextCompat.checkSelfPermission(this, Manifest.permission.READ_EXTERNAL_STORAGE) != PackageManager.PERMISSION_GRANTED
                || ContextCompat.checkSelfPermission(this, Manifest.permission.RECORD_AUDIO) != PackageManager.PERMISSION_GRANTED) {
            requestCameraPermission();
            return;
        }
        setContentView(R.layout.activity_main);
        mGlSurfaceView = findViewById(R.id.glView);
        mRecordButton = findViewById(R.id.recordBtn);
        mExitButton = findViewById(R.id.exit);
        mGlSurfaceView.setEGLContextClientVersion(3);
        mEncodingRunnable = new EncodingRunnable();
        mEncodingRunnable.start();
        mCamera2Renderer = new Camera2Renderer(this, mGlSurfaceView, mEncodingRunnable);
        mGlSurfaceView.setRenderer(mCamera2Renderer);
        mGlSurfaceView.setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);
    }

    @Override
    protected void onResume() {
        super.onResume();
        mRecordButton.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View view) {
                synchronized (MainActivity.this) {
                    startRecord();
                }
            }
        });
        mExitButton.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View view) {
                stopRecord();
                Log.d(TAG, "onClick: exit program");
                finish();
            }
        });
    }

    private void requestCameraPermission() {
        if (shouldShowRequestPermissionRationale(Manifest.permission.CAMERA) ||
                shouldShowRequestPermissionRationale(Manifest.permission.WRITE_EXTERNAL_STORAGE) ||
                shouldShowRequestPermissionRationale(Manifest.permission.RECORD_AUDIO)) {
            new ConfirmationDialog().show(getSupportFragmentManager(), FRAGMENT_DIALOG);
        } else {
            requestPermissions(new String[]{Manifest.permission.CAMERA,
                    Manifest.permission.WRITE_EXTERNAL_STORAGE,
                    Manifest.permission.RECORD_AUDIO}, REQUEST_CAMERA_PERMISSION);
        }
    }

    public static class ConfirmationDialog extends DialogFragment {
        @NonNull
        @Override
        public Dialog onCreateDialog(Bundle savedInstanceState) {
            final Fragment parent = getParentFragment();
            return new AlertDialog.Builder(getActivity())
                    .setMessage(R.string.request_permission)
                    .setPositiveButton(android.R.string.ok, new DialogInterface.OnClickListener() {
                        @Override
                        public void onClick(DialogInterface dialog, int which) {
                        }
                    })
                    .setNegativeButton(android.R.string.cancel, new DialogInterface.OnClickListener() {
                        @Override
                        public void onClick(DialogInterface dialog, int which) {
                            Activity activity = parent.getActivity();
                            if (activity != null) {
                                activity.finish();
                            }
                        }
                    })
                    .create();
        }
    }

    private void startRecord() {
        synchronized (mLock) {
            try {
                if (mVideoRecorder == null) {
                    mVideoRecorder = mCamera2Renderer.getVideoRecorder();
                }
                if (mAudioEncoder == null) {
                    mAudioEncoder = mCamera2Renderer.getAudioEncoder();
                }
                mVideoRecorder.setAlarm();
                mAudioEncoder.setAlarm();
                mEncodingRunnable.setMediaMuxerSavaPath();
                Log.d(TAG, "Start Record ");
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    }

    private void stopRecord() {
        if (mVideoRecorder == null) {
            mVideoRecorder = mCamera2Renderer.getVideoRecorder();
        }
        if (mAudioEncoder == null) {
            mAudioEncoder = mCamera2Renderer.getAudioEncoder();
        }
        mEncodingRunnable.setExitThread();
        mVideoRecorder.stopVideoRecord();
        mAudioEncoder.stopAudioRecord();
    }
}
```
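The first requirement listed at the top is writing custom SEI data into each encoded frame. The project's own SEI-writing code is covered in its dedicated section; as background, here is a minimal, hedged sketch of how an H.264 "user data unregistered" SEI NAL unit (Annex-B form) is assembled. The class name `SeiNalBuilder` and the 16-byte UUID value are assumptions made for illustration, not names from the project.

```java
import java.io.ByteArrayOutputStream;
import java.nio.charset.StandardCharsets;

// Hypothetical helper: builds an H.264 user_data_unregistered SEI NAL
// (payload type 5) carrying arbitrary custom bytes.
public class SeiNalBuilder {
    // user_data_unregistered requires a 16-byte UUID prefix; this value is arbitrary.
    private static final byte[] UUID_BYTES =
            "0123456789abcdef".getBytes(StandardCharsets.US_ASCII);

    public static byte[] build(byte[] userData) {
        // RBSP = payload_type, payload_size, uuid, data, rbsp_trailing_bits
        ByteArrayOutputStream rbsp = new ByteArrayOutputStream();
        rbsp.write(5); // payload type 5: user_data_unregistered
        int size = UUID_BYTES.length + userData.length;
        while (size >= 255) { rbsp.write(255); size -= 255; } // ff-byte size coding
        rbsp.write(size);
        rbsp.write(UUID_BYTES, 0, UUID_BYTES.length);
        rbsp.write(userData, 0, userData.length);
        rbsp.write(0x80); // rbsp_trailing_bits: stop bit + padding

        // NAL = start code + NAL header (type 6 = SEI) + emulation-prevented RBSP
        ByteArrayOutputStream nal = new ByteArrayOutputStream();
        nal.write(0); nal.write(0); nal.write(0); nal.write(1); // Annex-B start code
        nal.write(0x06); // forbidden_zero_bit=0, nal_ref_idc=0, nal_unit_type=6
        int zeros = 0;
        for (byte b : rbsp.toByteArray()) {
            // After two zero bytes, 0x00..0x03 must be escaped with 0x03.
            if (zeros >= 2 && (b & 0xFF) <= 3) {
                nal.write(3); // emulation_prevention_three_byte
                zeros = 0;
            }
            nal.write(b);
            zeros = (b == 0) ? zeros + 1 : 0;
        }
        return nal.toByteArray();
    }
}
```

In this project's pipeline such a NAL would be prepended to each encoded frame's buffer before it reaches the muxer, so a later decode pass can scan for type-6 NALs and recover the per-frame data.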
