WebRTC Audio/Video Calls: RTC Live Streaming of Local and Album Video Files

(Figure: effect of RTC live streaming a local video file.)

Live streaming a local video file over RTC relies on AVPlayer and CADisplayLink.

1. Playing a Local Video with AVPlayer

  • What is AVPlayer?

AVPlayer is a class in the AVFoundation framework. It sits close to the underlying media stack and is very flexible, allowing a fully customized video playback UI.

  • What is AVPlayerLayer?

AVPlayerLayer is the layer that renders the video picture during playback.

  • What is CADisplayLink?

CADisplayLink is a timer, like NSTimer, but its callbacks stay in sync with the screen's refresh rate (it is often used to measure the on-screen frame rate).

  • What is AVPlayerItemVideoOutput?

AVPlayerItemVideoOutput is a video output attached to an AVPlayerItem; through it you can obtain each video frame as a CVPixelBufferRef.

Below we implement WebRTC live streaming (against an ossrs server) while a local video is playing.

Set up AVPlayer to play a local video:

AVPlayerItem *playerItem = [[AVPlayerItem alloc] initWithURL:[NSURL fileURLWithPath:videoPath]];

- (void)reloadPlayItem:(AVPlayerItem *)playerItem {
    self.playerItem = playerItem;
    [self initPlayerVideoOutput];
    self.player = [AVPlayer playerWithPlayerItem:self.playerItem];
}

Configure the AVPlayerLayer that displays the playback:

- (void)startPlay {
    if (self.isPlaying) {
        return;
    }
    AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
    self.playerLayer = playerLayer;
    self.playerLayer.backgroundColor = [UIColor clearColor].CGColor;
    self.playerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [self.superView.layer addSublayer:self.playerLayer];
    self.playerLayer.frame = self.superView.bounds;
    [self.player seekToTime:CMTimeMake(0, 1)];
    [self.player play];
    [self startAnimating];
}

Monitor the player state via KVO:

/**
 *  Monitor the player state via KVO
 */
- (void)observeValueForKeyPath:(NSString *)keyPath {
    DebugLog(@"observeValueForKeyPath:%@", keyPath);
    if ([keyPath isEqualToString:@"timeControlStatus"]) {
        // timeControlStatus indicates whether playback is in progress, paused
        // indefinitely, or paused while waiting for suitable network conditions.
        if (@available(iOS 10.0, *)) {
            switch (self.player.timeControlStatus) {
                case AVPlayerTimeControlStatusPaused: {
                    NSLog(@"AVPlayerTimeControlStatusPaused");
                    // paused
                    self.isPlaying = NO;
                } break;
                case AVPlayerTimeControlStatusWaitingToPlayAtSpecifiedRate: {
                    NSLog(@"AVPlayerTimeControlStatusWaitingToPlayAtSpecifiedRate");
                    // waiting
                } break;
                case AVPlayerTimeControlStatusPlaying: {
                    NSLog(@"AVPlayerTimeControlStatusPlaying");
                    // playing
                    self.isPlaying = YES;
                } break;
                default:
                    break;
            }
        } else {
            // Fallback on earlier versions
        }
    }
}

Set up the key piece, AVPlayerItemVideoOutput:

- (void)initPlayerVideoOutput {
    // Output frames in YUV 4:2:0 (NV12) format
    NSDictionary *pixBuffAttributes = @{(id)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange)};
    AVPlayerItemVideoOutput *output = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:pixBuffAttributes];
    [self.playerItem addOutput:output];
    self.playerItemVideoOutput = output;
    [self.playerItemVideoOutput setDelegate:self queue:dispatch_get_main_queue()];
    // With suppressesPlayerRendering set to YES, the item's video would not be
    // rendered by AVPlayer itself, while audio, subtitles, and other media
    // would still be rendered.
    self.playerItemVideoOutput.suppressesPlayerRendering = NO;
}

Then use a CADisplayLink to periodically pull each video frame's CVPixelBufferRef out of the AVPlayerItemVideoOutput:

#pragma mark - DisplayLink

- (void)startDisplayLink {
    if (self.displayLink) {
        return;
    }
    self.displayLink = [CADisplayLink displayLinkWithTarget:[YYWeakProxy proxyWithTarget:self] selector:@selector(handleDisplayLink:)];
    [self.displayLink addToRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
    // self.displayLink.preferredFramesPerSecond = 2;
    self.displayLink.paused = NO;
}

- (void)handleDisplayLink:(CADisplayLink *)displayLink {
    // Ask the video output for the frame that should be on screen at the next vsync
    CFTimeInterval nextVSync = [displayLink timestamp] + [displayLink duration];
    CMTime outputItemTime = [self.playerItemVideoOutput itemTimeForHostTime:nextVSync];
    if ([self.playerItemVideoOutput hasNewPixelBufferForItemTime:outputItemTime]) {
        CVPixelBufferRef pixelBuffer = [self.playerItemVideoOutput copyPixelBufferForItemTime:outputItemTime itemTimeForDisplay:NULL];
        // ... do something with the pixel buffer
        if (self.delegate && [self.delegate respondsToSelector:@selector(videoLivePlayerPixelBufferRef:)]) {
            [self.delegate videoLivePlayerPixelBufferRef:pixelBuffer];
        }
        if (pixelBuffer != NULL) {
            CFRelease(pixelBuffer);
        }
    }
}

- (void)stopDisplayLink {
    [self.displayLink invalidate];
    self.displayLink = nil;
}

In this way, each frame's CVPixelBufferRef is captured via CADisplayLink during playback and then published over WebRTC.

The complete code for obtaining CVPixelBufferRef during playback follows.

SDVideoLivePlayer.h

#import <Foundation/Foundation.h>
#import <UIKit/UIKit.h>
#import <AudioToolbox/AudioToolbox.h>
#import <AVFoundation/AVFoundation.h>

@protocol SDVideoLivePlayerDelegate;

@interface SDVideoLivePlayer : NSObject

@property (nonatomic, strong) UIView *superView;
@property (nonatomic, weak) id<SDVideoLivePlayerDelegate> delegate;

- (instancetype)initWithSuperView:(UIView *)superView;

- (void)reloadPlayItem:(AVPlayerItem *)playerItem;

/// Start playback
- (void)startPlay;

/// Stop playback
- (void)stopPlay;

@end

@protocol SDVideoLivePlayerDelegate <NSObject>

- (void)videoLivePlayerPixelBufferRef:(CVPixelBufferRef)pixelBufferRef;

@end

SDVideoLivePlayer.m

#import "SDVideoLivePlayer.h"

@interface SDVideoLivePlayer ()<AVPlayerItemOutputPullDelegate>

@property (nonatomic, strong) AVPlayer *player;
@property (nonatomic, strong) AVPlayerLayer *playerLayer;
@property (nonatomic, strong) AVPlayerItem *playerItem;
@property (nonatomic, assign) BOOL isPlaying;
@property (nonatomic, strong) AVPlayerItemVideoOutput *playerItemVideoOutput;
@property (nonatomic, strong) CADisplayLink *displayLink;

@end

@implementation SDVideoLivePlayer

- (instancetype)initWithSuperView:(UIView *)superView {
    self = [super init];
    if (self) {
        self.superView = superView;
        self.isPlaying = NO;
        [self addNotifications];
        [self initPlayerVideoOutput];
    }
    return self;
}

- (void)initPlayerVideoOutput {
    // Output frames in YUV 4:2:0 (NV12) format
    NSDictionary *pixBuffAttributes = @{(id)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange)};
    AVPlayerItemVideoOutput *output = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:pixBuffAttributes];
    [self.playerItem addOutput:output];
    self.playerItemVideoOutput = output;
    [self.playerItemVideoOutput setDelegate:self queue:dispatch_get_main_queue()];
    // With suppressesPlayerRendering set to YES, the item's video would not be
    // rendered by AVPlayer itself, while audio, subtitles, and other media
    // would still be rendered.
    self.playerItemVideoOutput.suppressesPlayerRendering = NO;
}

- (void)reloadPlayItem:(AVPlayerItem *)playerItem {
    self.playerItem = playerItem;
    [self initPlayerVideoOutput];
    self.player = [AVPlayer playerWithPlayerItem:self.playerItem];
}

- (void)startPlay {
    if (self.isPlaying) {
        return;
    }
    AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
    self.playerLayer = playerLayer;
    self.playerLayer.backgroundColor = [UIColor clearColor].CGColor;
    self.playerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [self.superView.layer addSublayer:self.playerLayer];
    self.playerLayer.frame = self.superView.bounds;
    [self.player seekToTime:CMTimeMake(0, 1)];
    [self.player play];
    [self startAnimating];
}

- (void)stopPlay {
    self.isPlaying = NO;
    [self.player pause];
    [self stopAnimating];
}

/**
 *  Monitor the player state via KVO
 */
- (void)observeValueForKeyPath:(NSString *)keyPath {
    DebugLog(@"observeValueForKeyPath:%@", keyPath);
    if ([keyPath isEqualToString:@"timeControlStatus"]) {
        // timeControlStatus indicates whether playback is in progress, paused
        // indefinitely, or paused while waiting for suitable network conditions.
        if (@available(iOS 10.0, *)) {
            switch (self.player.timeControlStatus) {
                case AVPlayerTimeControlStatusPaused: {
                    NSLog(@"AVPlayerTimeControlStatusPaused");
                    // paused
                    self.isPlaying = NO;
                } break;
                case AVPlayerTimeControlStatusWaitingToPlayAtSpecifiedRate: {
                    NSLog(@"AVPlayerTimeControlStatusWaitingToPlayAtSpecifiedRate");
                    // waiting
                } break;
                case AVPlayerTimeControlStatusPlaying: {
                    NSLog(@"AVPlayerTimeControlStatusPlaying");
                    // playing
                    self.isPlaying = YES;
                } break;
                default:
                    break;
            }
        } else {
            // Fallback on earlier versions
        }
    }
}

- (void)audioSessionInterrupted:(NSNotification *)notification {
    // Pause when an interruption begins; resume when it ends
    NSDictionary *info = notification.userInfo;
    if ([[info objectForKey:AVAudioSessionInterruptionTypeKey] integerValue] == AVAudioSessionInterruptionTypeBegan) {
        [self.player pause];
    } else {
        [self.player play];
    }
}

- (void)startAnimating {
    [self startDisplayLink];
    self.displayLink.paused = NO;
}

- (void)stopAnimating {
    self.displayLink.paused = YES;
    [self stopDisplayLink];
}

- (void)pauseAnimating {
    self.displayLink.paused = YES;
    [self stopDisplayLink];
}

- (void)resumeAnimating {
    if (!self.displayLink) {
        [self startDisplayLink];
    }
    self.displayLink.paused = NO;
}

#pragma mark - DisplayLink

- (void)startDisplayLink {
    if (self.displayLink) {
        return;
    }
    self.displayLink = [CADisplayLink displayLinkWithTarget:[YYWeakProxy proxyWithTarget:self] selector:@selector(handleDisplayLink:)];
    [self.displayLink addToRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
    // self.displayLink.preferredFramesPerSecond = 2;
    self.displayLink.paused = NO;
}

- (void)handleDisplayLink:(CADisplayLink *)displayLink {
    // Ask the video output for the frame that should be on screen at the next vsync
    CFTimeInterval nextVSync = [displayLink timestamp] + [displayLink duration];
    CMTime outputItemTime = [self.playerItemVideoOutput itemTimeForHostTime:nextVSync];
    if ([self.playerItemVideoOutput hasNewPixelBufferForItemTime:outputItemTime]) {
        CVPixelBufferRef pixelBuffer = [self.playerItemVideoOutput copyPixelBufferForItemTime:outputItemTime itemTimeForDisplay:NULL];
        if (self.delegate && [self.delegate respondsToSelector:@selector(videoLivePlayerPixelBufferRef:)]) {
            [self.delegate videoLivePlayerPixelBufferRef:pixelBuffer];
        }
        if (pixelBuffer != NULL) {
            CFRelease(pixelBuffer);
        }
    }
}

- (void)stopDisplayLink {
    [self.displayLink invalidate];
    self.displayLink = nil;
}

#pragma mark - AVPlayerItemOutputPullDelegate

- (void)outputMediaDataWillChange:(AVPlayerItemOutput *)sender {
    [self stopPlay];
}

#pragma mark - Observers

- (void)addNotifications {
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(replay:) name:AVPlayerItemDidPlayToEndTimeNotification object:nil];
    // Audio playback interrupted (e.g. by an incoming call)
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(audioSessionInterrupted:) name:AVAudioSessionInterruptionNotification object:nil];
    __weak typeof(self) weakSelf = self;
    if (@available(iOS 10.0, *)) {
        [self.KVOController observe:self.player keyPath:@"timeControlStatus" options:NSKeyValueObservingOptionOld|NSKeyValueObservingOptionNew block:^(id _Nullable observer, id _Nonnull object, NSDictionary<NSString *,id> * _Nonnull change) {
            __strong typeof(weakSelf) strongSelf = weakSelf;
            [strongSelf observeValueForKeyPath:@"timeControlStatus"];
        }];
    }
}

- (void)removeNotifications {
    [[NSNotificationCenter defaultCenter] removeObserver:self];
    [self.KVOController unobserveAll];
}

- (void)replay:(NSNotification *)notification {
    if (notification.object == self.player.currentItem) {
        [self.player seekToTime:CMTimeMake(0, 1)];
        [self.player play];
    }
}

- (void)dealloc {
    [self removeNotifications];
}

@end

2. Fetching a Video from the Photo Album

The code below fetches an album video as an AVPlayerItem and hands it to - (void)reloadPlayItem:(AVPlayerItem *)playerItem;

- (void)startPlayAlbumVideo:(SDMediaModel *)mediaModel {
    if (!(mediaModel && mediaModel.phasset)) {
        return;
    }
    __weak typeof(self) weakSelf = self;
    [[PhotoKitManager shareInstance] requestPlayerItemForVideo:mediaModel.phasset completion:^(AVPlayerItem *playerItem) {
        __strong typeof(weakSelf) strongSelf = weakSelf;
        [strongSelf.videoLivePlayer reloadPlayItem:playerItem];
        [strongSelf.videoLivePlayer startPlay];
    } failure:^{
        __strong typeof(weakSelf) strongSelf = weakSelf;
    }];
}

3. Live Streaming the Video over WebRTC

With the CVPixelBufferRef obtained from the CADisplayLink callback, we can now stream over WebRTC; in effect, we broadcast a video prepared in advance.
The earlier post on GPUImage beauty filters for video calls already used RTCVideoFrame. Here we wrap each CVPixelBufferRef into an RTCVideoFrame and pass it to RTCVideoSource's didCaptureVideoFrame method to achieve the final result.

Set up the RTCVideoSource as follows:

- (RTCVideoTrack *)createVideoTrack {
    RTCVideoSource *videoSource = [self.factory videoSource];
    self.localVideoSource = videoSource;
    // On the simulator, capture from a file instead of the camera
    if (TARGET_IPHONE_SIMULATOR) {
        if (@available(iOS 10, *)) {
            self.videoCapturer = [[RTCFileVideoCapturer alloc] initWithDelegate:self];
        } else {
            // Fallback on earlier versions
        }
    } else {
        self.videoCapturer = [[RTCCameraVideoCapturer alloc] initWithDelegate:self];
    }
    RTCVideoTrack *videoTrack = [self.factory videoTrackWithSource:videoSource trackId:@"video0"];
    return videoTrack;
}

- (void)createMediaSenders {
    if (!self.isPublish) {
        return;
    }
    NSString *streamId = @"stream";

    // Audio
    RTCAudioTrack *audioTrack = [self createAudioTrack];
    self.localAudioTrack = audioTrack;
    RTCRtpTransceiverInit *audioTrackTransceiver = [[RTCRtpTransceiverInit alloc] init];
    audioTrackTransceiver.direction = RTCRtpTransceiverDirectionSendOnly;
    audioTrackTransceiver.streamIds = @[streamId];
    [self.peerConnection addTransceiverWithTrack:audioTrack init:audioTrackTransceiver];

    // Video
    RTCVideoTrack *videoTrack = [self createVideoTrack];
    self.localVideoTrack = videoTrack;
    RTCRtpTransceiverInit *videoTrackTransceiver = [[RTCRtpTransceiverInit alloc] init];
    videoTrackTransceiver.direction = RTCRtpTransceiverDirectionSendOnly;
    videoTrackTransceiver.streamIds = @[streamId];
    [self.peerConnection addTransceiverWithTrack:videoTrack init:videoTrackTransceiver];
}

For details, see the earlier post on calling ossrs for audio/video calls from iOS.

Wrap the resulting CVPixelBufferRef into an RTCVideoFrame:

- (RTCVideoFrame *)webRTCClient:(WebRTCClient *)client videoPixelBufferRef:(CVPixelBufferRef)videoPixelBufferRef {
    RTCCVPixelBuffer *rtcPixelBuffer = [[RTCCVPixelBuffer alloc] initWithPixelBuffer:videoPixelBufferRef];
    // RTCVideoFrame expects a nanosecond timestamp; stamp with the current host time
    int64_t timeStampNs = (int64_t)(CACurrentMediaTime() * 1000000000);
    RTCVideoFrame *rtcVideoFrame = [[RTCVideoFrame alloc] initWithBuffer:rtcPixelBuffer
                                                                rotation:RTCVideoRotation_0
                                                             timeStampNs:timeStampNs];
    return rtcVideoFrame;
}

Then deliver the frame through the RTCVideoSource's didCaptureVideoFrame:

[self.localVideoSource capturer:capturer didCaptureVideoFrame:frame];

Other posts
Setting up an ossrs service: https://blog.csdn.net/gloryFlow/article/details/132257196
Calling ossrs for audio/video calls from iOS: https://blog.csdn.net/gloryFlow/article/details/132262724
Fixing the missing picture at high resolutions in WebRTC calls: https://blog.csdn.net/gloryFlow/article/details/132262724
Modifying the bitrate in the SDP: https://blog.csdn.net/gloryFlow/article/details/132263021
GPUImage beauty filters for video calls: https://blog.csdn.net/gloryFlow/article/details/132265842

4. Summary

RTC live streaming of a local video file: AVPlayer plays the video, AVPlayerItemVideoOutput yields CVPixelBufferRef frames, each processed CVPixelBufferRef is wrapped into an RTCVideoFrame, and the frame is delivered through the didCaptureVideoFrame method implemented on WebRTC's localVideoSource. There is a lot of material here, so apologies for any imprecise descriptions.

Original post: https://blog.csdn.net/gloryFlow/article/details/132267068

A learning log: steady progress every day.

