Flutter development in practice: displaying images with an external texture (Texture)

In Flutter, if you want a widget backed by an external (native) texture, you can use the Texture widget. Texture is used to display content such as video frames or a native canvas, and its only required parameter is textureId.

With the external-texture approach, the data carrier between Flutter and the native side is a PixelBuffer. The native data source (camera, video player, and so on) writes frames into the PixelBuffer; Flutter converts the PixelBuffer into an OpenGL ES texture and hands it to Skia for drawing.

[Figure: the Flutter rendering framework]

[Figure: a simplified architecture diagram of the LayerTree]
For an in-depth analysis of external textures, see: https://juejin.im/post/5b7b9051e51d45388b6aeceb

1. Using Texture

import 'package:flutter/material.dart';
import 'package:flutter/services.dart';

void main() {
  runApp(MyApp());
}

class MyApp extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      home: Scaffold(
        appBar: AppBar(
          title: Text('External texture example'),
        ),
        body: Center(
          child: Texture(
            textureId: 12345, // This should be the texture ID received over a platform channel.
          ),
        ),
      ),
    );
  }
}

The textureId above is a placeholder. In real use you must obtain the actual texture ID through a platform channel. For example, if the texture is created in Android native code, you would typically use a MethodChannel to send the texture ID back to the Dart side.
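
As a minimal sketch of that idea (the channel name 'demo_texture_channel' and the 'create' method are hypothetical here, not part of any existing plugin; section 2 below implements a concrete version of the same pattern):

import 'package:flutter/material.dart';
import 'package:flutter/services.dart';

class ExternalTextureDemo extends StatefulWidget {
  const ExternalTextureDemo({Key? key}) : super(key: key);

  @override
  State<ExternalTextureDemo> createState() => _ExternalTextureDemoState();
}

class _ExternalTextureDemoState extends State<ExternalTextureDemo> {
  // Hypothetical channel name; it must match the one registered on the native side.
  static const MethodChannel _channel = MethodChannel('demo_texture_channel');
  int _textureId = -1; // -1 means the native texture has not been created yet.

  @override
  void initState() {
    super.initState();
    _createTexture();
  }

  Future<void> _createTexture() async {
    // 'create' is a hypothetical method name handled by the native plugin;
    // the native side is expected to return the registered texture ID.
    final int id = await _channel.invokeMethod<int>('create') ?? -1;
    if (!mounted) return;
    setState(() => _textureId = id);
  }

  @override
  Widget build(BuildContext context) {
    return _textureId >= 0
        ? Texture(textureId: _textureId)
        : const SizedBox.shrink();
  }
}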

2. Displaying images with an external texture

With the external-texture approach, the carrier between Flutter and native is a PixelBuffer. On iOS, image downloading and caching are delegated to SDWebImage: once SDWebImage has downloaded the UIImage, it is converted into a CVPixelBufferRef.

To do this, an SDTexturePresenter class implements the FlutterTexture protocol; it downloads the image through SDWebImage and converts it into a CVPixelBufferRef. The Flutter side calls the iOS side over a method channel, and the iOS side registers the presenter with the NSObject<FlutterTextureRegistry> *textures via registerTexture and returns the resulting texture ID.

  • Flutter side: an image widget built on Texture
import 'package:flutter/material.dart';
import 'package:flutter/services.dart';

enum NetworkImageBoxFit { Fill, AspectFit, AspectFill }

class NetworkImageWidget extends StatefulWidget {
  const NetworkImageWidget({
    Key? key,
    required this.imageUrl,
    this.boxFit = NetworkImageBoxFit.AspectFill,
    this.width = 0,
    this.height = 0,
    this.placeholder,
    this.errorHolder,
  }) : super(key: key);

  final String imageUrl;
  final NetworkImageBoxFit boxFit;
  final double width;
  final double height;
  final Widget? placeholder;
  final Widget? errorHolder;

  @override
  _NetworkImageWidgetState createState() => _NetworkImageWidgetState();
}

class _NetworkImageWidgetState extends State<NetworkImageWidget> {
  // The channel name is arbitrary, as long as both sides use the same one.
  final MethodChannel _channel = MethodChannel('sd_texture_channel');
  // A valid id returned by the engine is >= 0; -1 means the texture has not been created yet.
  int textureId = -1;

  @override
  void initState() {
    super.initState();
    newTexture();
  }

  @override
  void dispose() {
    super.dispose();
    if (textureId >= 0) {
      _channel.invokeMethod('dispose', {'textureId': textureId});
    }
  }

  BoxFit textureBoxFit(NetworkImageBoxFit imageBoxFit) {
    if (imageBoxFit == NetworkImageBoxFit.Fill) {
      return BoxFit.fill;
    }
    if (imageBoxFit == NetworkImageBoxFit.AspectFit) {
      return BoxFit.contain;
    }
    if (imageBoxFit == NetworkImageBoxFit.AspectFill) {
      return BoxFit.cover;
    }
    return BoxFit.fill;
  }

  Widget showTextureWidget(BuildContext context) {
    return Container(
      color: Colors.white,
      width: widget.width,
      height: widget.height,
      child: Texture(textureId: textureId),
    );
  }

  void newTexture() async {
    int aTextureId = await _channel.invokeMethod('create', {
      'imageUrl': widget.imageUrl, // image URL, or a local image name
      'width': widget.width,
      'height': widget.height,
      'asGif': false, // whether it is a GIF; the platform side could also detect this automatically
    });
    setState(() {
      textureId = aTextureId;
    });
  }

  @override
  Widget build(BuildContext context) {
    Widget body = textureId >= 0
        ? showTextureWidget(context)
        : showDefault() ??
            Container(
              color: Colors.white,
              width: widget.width,
              height: widget.height,
            );
    return body;
  }

  Widget? showDefault() {
    if (widget.placeholder != null) {
      return widget.placeholder;
    }
    if (widget.errorHolder != null) {
      return widget.errorHolder;
    }
    return Container(
      color: Colors.white,
      width: widget.width,
      height: widget.height,
    );
  }
}
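
For reference, this is how the widget above might be used in a page. It is only a hedged sketch: it assumes NetworkImageWidget from the snippet above is in scope, and the image URL is a placeholder.

import 'package:flutter/material.dart';

// Hypothetical usage of the NetworkImageWidget defined above
// (assumes it lives in the same file or is imported); the URL is a placeholder.
class AvatarDemoPage extends StatelessWidget {
  const AvatarDemoPage({Key? key}) : super(key: key);

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(title: const Text('NetworkImageWidget demo')),
      body: const Center(
        child: NetworkImageWidget(
          imageUrl: 'https://example.com/avatar.png', // placeholder URL
          boxFit: NetworkImageBoxFit.AspectFill,
          width: 100.0,
          height: 100.0,
          placeholder: SizedBox(width: 100.0, height: 100.0),
        ),
      ),
    );
  }
}
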
  • iOS side: download the image and convert it to a CVPixelBufferRef

SDTexturePresenter.h

#import <Foundation/Foundation.h>
#import <Flutter/Flutter.h>

@interface SDTexturePresenter : NSObject <FlutterTexture>

@property (copy, nonatomic) void (^updateBlock)(void);

- (instancetype)initWithImageStr:(NSString *)imageStr size:(CGSize)size asGif:(Boolean)asGif;
- (void)dispose;

@end

SDTexturePresenter.m

//
//  SDTexturePresenter.m
//  Pods
//
//  Created by xhw on 2020/5/15.
//

#import "SDTexturePresenter.h"
#import <Foundation/Foundation.h>
//#import <OpenGLES/EAGL.h>
//#import <OpenGLES/ES2/gl.h>
//#import <OpenGLES/ES2/glext.h>
//#import <CoreVideo/CVPixelBuffer.h>
#import <UIKit/UIKit.h>
#import <SDWebImage/SDWebImageDownloader.h>
#import <SDWebImage/SDWebImageManager.h>
#import <ImageIO/ImageIO.h>

static uint32_t bitmapInfoWithPixelFormatType(OSType inputPixelFormat, bool hasAlpha) {
    if (inputPixelFormat == kCVPixelFormatType_32BGRA) {
        uint32_t bitmapInfo = kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Host;
        if (!hasAlpha) {
            bitmapInfo = kCGImageAlphaNoneSkipFirst | kCGBitmapByteOrder32Host;
        }
        return bitmapInfo;
    } else if (inputPixelFormat == kCVPixelFormatType_32ARGB) {
        uint32_t bitmapInfo = kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Big;
        return bitmapInfo;
    } else {
        NSLog(@"Unsupported pixel format");
        return 0;
    }
}

// Check whether a CGImageRef contains an alpha channel.
BOOL CGImageRefContainsAlpha(CGImageRef imageRef) {
    if (!imageRef) {
        return NO;
    }
    CGImageAlphaInfo alphaInfo = CGImageGetAlphaInfo(imageRef);
    BOOL hasAlpha = !(alphaInfo == kCGImageAlphaNone ||
                      alphaInfo == kCGImageAlphaNoneSkipFirst ||
                      alphaInfo == kCGImageAlphaNoneSkipLast);
    return hasAlpha;
}

@interface SDTexturePresenter ()

@property (nonatomic) CVPixelBufferRef target;
@property (nonatomic, assign) CGSize size;
@property (nonatomic, assign) CGSize imageSize;   // actual image size in px
@property (nonatomic, assign) Boolean useExSize;  // whether to use the externally supplied size
@property (nonatomic, assign) Boolean iscopy;

// gif
@property (nonatomic, assign) Boolean asGif;      // whether the image is a GIF
// The properties below are used for GIF playback.
@property (nonatomic, strong) CADisplayLink *displayLink;
@property (nonatomic, strong) NSMutableArray<NSDictionary *> *images;
@property (nonatomic, assign) int now_index;              // index of the frame currently displayed
@property (nonatomic, assign) CGFloat can_show_duration;  // time remaining until the next frame should be shown

@end

@implementation SDTexturePresenter

- (instancetype)initWithImageStr:(NSString *)imageStr size:(CGSize)size asGif:(Boolean)asGif {
    self = [super init];
    if (self) {
        self.size = size;
        self.asGif = asGif;
        self.useExSize = YES; // use the externally supplied size by default
        if ([imageStr hasPrefix:@"http://"] || [imageStr hasPrefix:@"https://"]) {
            [self loadImageWithStrFromWeb:imageStr];
        } else {
            [self loadImageWithStrForLocal:imageStr];
        }
    }
    return self;
}

- (void)dealloc {
}

- (CVPixelBufferRef)copyPixelBuffer {
    // Once copyPixelBuffer has been called, _target is released automatically when the texture id is released.
    // If copyPixelBuffer is never called, _target must be released manually.
    _iscopy = YES;
    CVPixelBufferRetain(_target); // observed at runtime that this retain may not be strictly required
    return _target;
}

- (void)dispose {
    self.displayLink.paused = YES;
    [self.displayLink invalidate];
    self.displayLink = nil;
    if (!_iscopy) {
        CVPixelBufferRelease(_target);
    }
}

// Convert a UIImage into a CVPixelBufferRef that reproduces the original image.
- (CVPixelBufferRef)CVPixelBufferRefFromUiImage:(UIImage *)img size:(CGSize)size {
    if (!img) {
        return nil;
    }
    CGImageRef image = [img CGImage];
    CGFloat frameWidth = size.width;
    CGFloat frameHeight = size.height;
    // Fall back gracefully when no size is passed in from outside.
    if (frameWidth <= 0 || frameHeight <= 0) {
        if (img != nil) {
            frameWidth = CGImageGetWidth(image);
            frameHeight = CGImageGetHeight(image);
        } else {
            frameWidth = 1;
            frameHeight = 1;
        }
    } else if (!self.useExSize && img != nil) {
        // Use the image's own size.
        frameWidth = CGImageGetWidth(image);
        frameHeight = CGImageGetHeight(image);
    }
    BOOL hasAlpha = CGImageRefContainsAlpha(image);
    CFDictionaryRef empty = CFDictionaryCreate(kCFAllocatorDefault, NULL, NULL, 0,
                                               &kCFTypeDictionaryKeyCallBacks, &kCFTypeDictionaryValueCallBacks);
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                             empty, kCVPixelBufferIOSurfacePropertiesKey,
                             nil];
    CVPixelBufferRef pxbuffer = NULL;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, frameWidth, frameHeight,
                                          kCVPixelFormatType_32BGRA,
                                          (__bridge CFDictionaryRef)options, &pxbuffer);
    NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);
    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
    NSParameterAssert(pxdata != NULL);
    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    uint32_t bitmapInfo = bitmapInfoWithPixelFormatType(kCVPixelFormatType_32BGRA, (bool)hasAlpha);
    CGContextRef context = CGBitmapContextCreate(pxdata, frameWidth, frameHeight, 8,
                                                 CVPixelBufferGetBytesPerRow(pxbuffer),
                                                 rgbColorSpace, bitmapInfo);
    NSParameterAssert(context);
    CGContextConcatCTM(context, CGAffineTransformIdentity);
    CGContextDrawImage(context, CGRectMake(0, 0, frameWidth, frameHeight), image);
    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);
    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
    return pxbuffer;
}

#pragma mark - image

- (void)loadImageWithStrForLocal:(NSString *)imageStr {
    if (self.asGif) {
        self.images = [NSMutableArray array];
        [self sd_GIFImagesWithLocalNamed:imageStr];
    } else {
        UIImage *image = [UIImage imageNamed:imageStr];
        self.target = [self CVPixelBufferRefFromUiImage:image size:self.size];
    }
}

- (void)loadImageWithStrFromWeb:(NSString *)imageStr {
    __weak typeof(SDTexturePresenter *) weakSelf = self;
    [[SDWebImageDownloader sharedDownloader] downloadImageWithURL:[NSURL URLWithString:imageStr]
                                                        completed:^(UIImage *_Nullable image, NSData *_Nullable data, NSError *_Nullable error, BOOL finished) {
        if (weakSelf.asGif) {
            // Make sure the frame array exists (otherwise it is only created in the local-image path).
            if (!weakSelf.images) {
                weakSelf.images = [NSMutableArray array];
            }
            for (UIImage *uiImage in image.images) {
                NSDictionary *dic = @{@"duration" : @(image.duration * 1.0 / image.images.count),
                                      @"image" : uiImage};
                [weakSelf.images addObject:dic];
            }
            [weakSelf startGifDisplay];
        } else {
            weakSelf.target = [weakSelf CVPixelBufferRefFromUiImage:image size:weakSelf.size];
            if (weakSelf.updateBlock) {
                weakSelf.updateBlock();
            }
        }
    }];
}

- (void)updategif:(CADisplayLink *)displayLink {
    if (self.images.count == 0) {
        self.displayLink.paused = YES;
        [self.displayLink invalidate];
        self.displayLink = nil;
        return;
    }
    self.can_show_duration -= displayLink.duration;
    if (self.can_show_duration <= 0) {
        NSDictionary *dic = [self.images objectAtIndex:self.now_index];
        if (_target && !_iscopy) {
            CVPixelBufferRelease(_target);
        }
        self.target = [self CVPixelBufferRefFromUiImage:[dic objectForKey:@"image"] size:self.size];
        _iscopy = NO;
        self.updateBlock();
        self.now_index += 1;
        if (self.now_index >= self.images.count) {
            self.now_index = 0;
        }
        self.can_show_duration = ((NSNumber *)[dic objectForKey:@"duration"]).floatValue;
    }
}

- (void)startGifDisplay {
    self.displayLink = [CADisplayLink displayLinkWithTarget:self selector:@selector(updategif:)];
    [self.displayLink addToRunLoop:[NSRunLoop currentRunLoop] forMode:NSRunLoopCommonModes];
}

- (void)sd_GifImagesWithLocalData:(NSData *)data {
    if (!data) {
        return;
    }
    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)data, NULL);
    size_t count = CGImageSourceGetCount(source);
    UIImage *animatedImage;
    if (count <= 1) {
        animatedImage = [[UIImage alloc] initWithData:data];
    } else {
        for (size_t i = 0; i < count; i++) {
            CGImageRef image = CGImageSourceCreateImageAtIndex(source, i, NULL);
            if (!image) {
                continue;
            }
            UIImage *uiImage = [UIImage imageWithCGImage:image
                                                   scale:[UIScreen mainScreen].scale
                                             orientation:UIImageOrientationUp];
            NSDictionary *dic = @{@"duration" : @([self sd_frameDurationAtIndex:i source:source]),
                                  @"image" : uiImage};
            [_images addObject:dic];
            CGImageRelease(image);
        }
    }
    CFRelease(source);
    [self startGifDisplay];
}

- (float)sd_frameDurationAtIndex:(NSUInteger)index source:(CGImageSourceRef)source {
    float frameDuration = 0.1f;
    CFDictionaryRef cfFrameProperties = CGImageSourceCopyPropertiesAtIndex(source, index, nil);
    NSDictionary *frameProperties = (__bridge NSDictionary *)cfFrameProperties;
    NSDictionary *gifProperties = frameProperties[(NSString *)kCGImagePropertyGIFDictionary];
    NSNumber *delayTimeUnclampedProp = gifProperties[(NSString *)kCGImagePropertyGIFUnclampedDelayTime];
    if (delayTimeUnclampedProp) {
        frameDuration = [delayTimeUnclampedProp floatValue];
    } else {
        NSNumber *delayTimeProp = gifProperties[(NSString *)kCGImagePropertyGIFDelayTime];
        if (delayTimeProp) {
            frameDuration = [delayTimeProp floatValue];
        }
    }
    // Treat very small frame delays (< 11 ms) as 100 ms.
    if (frameDuration < 0.011f) {
        frameDuration = 0.100f;
    }
    CFRelease(cfFrameProperties);
    return frameDuration;
}

- (void)sd_GIFImagesWithLocalNamed:(NSString *)name {
    if ([name hasSuffix:@".gif"]) {
        name = [name stringByReplacingCharactersInRange:NSMakeRange(name.length - 4, 4) withString:@""];
    }
    CGFloat scale = [UIScreen mainScreen].scale;
    if (scale > 1.0f) {
        NSData *data = nil;
        if (scale > 2.0f) {
            NSString *retinaPath = [[NSBundle mainBundle] pathForResource:[name stringByAppendingString:@"@3x"] ofType:@"gif"];
            data = [NSData dataWithContentsOfFile:retinaPath];
        }
        if (!data) {
            NSString *retinaPath = [[NSBundle mainBundle] pathForResource:[name stringByAppendingString:@"@2x"] ofType:@"gif"];
            data = [NSData dataWithContentsOfFile:retinaPath];
        }
        if (!data) {
            NSString *path = [[NSBundle mainBundle] pathForResource:name ofType:@"gif"];
            data = [NSData dataWithContentsOfFile:path];
        }
        if (data) {
            [self sd_GifImagesWithLocalData:data];
        }
    } else {
        NSString *path = [[NSBundle mainBundle] pathForResource:name ofType:@"gif"];
        NSData *data = [NSData dataWithContentsOfFile:path];
        if (data) {
            [self sd_GifImagesWithLocalData:data];
        }
    }
}

@end
  • Returning the textureId to the Flutter side over the channel

SDTexturePlugin.h

#import <Flutter/Flutter.h>

@interface SDTexturePlugin : NSObject <FlutterPlugin>

+ (void)registerWithRegistrar:(NSObject<FlutterPluginRegistrar> *)registrar;

@end

SDTexturePlugin.m

#import "SDTexturePlugin.h"
#import "SDTexturePresenter.h"
#import "SDWeakProxy.h"

@interface SDTexturePlugin ()

@property (nonatomic, strong) NSMutableDictionary<NSNumber *, SDTexturePresenter *> *renders;
@property (nonatomic, strong) NSObject<FlutterTextureRegistry> *textures;

@end

@implementation SDTexturePlugin

- (instancetype)initWithTextures:(NSObject<FlutterTextureRegistry> *)textures {
    self = [super init];
    if (self) {
        _renders = [[NSMutableDictionary alloc] init];
        _textures = textures;
    }
    return self;
}

+ (void)registerWithRegistrar:(NSObject<FlutterPluginRegistrar> *)registrar {
    FlutterMethodChannel *channel = [FlutterMethodChannel methodChannelWithName:@"sd_texture_channel"
                                                                binaryMessenger:[registrar messenger]];
    SDTexturePlugin *instance = [[SDTexturePlugin alloc] initWithTextures:[registrar textures]];
    [registrar addMethodCallDelegate:instance channel:channel];
}

- (void)handleMethodCall:(FlutterMethodCall *)call result:(FlutterResult)result {
    NSLog(@"call method:%@ arguments:%@", call.method, call.arguments);
    if ([@"create" isEqualToString:call.method] || [@"acreate" isEqualToString:call.method]) {
        NSString *imageStr = call.arguments[@"imageUrl"];
        Boolean asGif = NO;
        CGFloat width = [call.arguments[@"width"] floatValue] * [UIScreen mainScreen].scale;
        CGFloat height = [call.arguments[@"height"] floatValue] * [UIScreen mainScreen].scale;
        CGSize size = CGSizeMake(width, height);
        SDTexturePresenter *render = [[SDTexturePresenter alloc] initWithImageStr:imageStr size:size asGif:asGif];
        int64_t textureId = [self.textures registerTexture:render];
        render.updateBlock = ^{
            [self.textures textureFrameAvailable:textureId];
        };
        NSLog(@"handleMethodCall textureId:%lld", textureId);
        [_renders setObject:render forKey:@(textureId)];
        result(@(textureId));
    } else if ([@"dispose" isEqualToString:call.method]) {
        if (call.arguments[@"textureId"] != nil && ![call.arguments[@"textureId"] isKindOfClass:[NSNull class]]) {
            SDTexturePresenter *render = [_renders objectForKey:call.arguments[@"textureId"]];
            [_renders removeObjectForKey:call.arguments[@"textureId"]];
            [render dispose];
            NSNumber *numb = call.arguments[@"textureId"];
            if (numb) {
                [self.textures unregisterTexture:numb.longLongValue];
            }
        }
        result(nil); // complete the Dart-side Future
    } else {
        result(FlutterMethodNotImplemented);
    }
}

- (void)refreshTextureWithTextureId:(int64_t)textureId {
}

@end

3. Summary

This post walked through displaying images in Flutter via an external texture: the Dart side renders a Texture widget, while the iOS side implements FlutterTexture, converts the downloaded image into a CVPixelBufferRef, and returns the registered texture ID over a method channel.

Learning notes; a little progress every day.
