Capturing a Top-View Screenshot in Three.js

In a typical project, every model needs a corresponding image, usually a top view, to be displayed in the matching 2D scene. Here is one way to implement this: first clear the textures on the model's materials and set their colors to white, then capture the top view with an orthographic camera and save the result as an image that serves as the model's icon in the 2D scene.
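Before the full class, here is a minimal, self-contained sketch of that idea: fit an orthographic frustum to the model's X/Z footprint and point the camera straight down the Y axis. The function name and the hover height are illustrative assumptions, not part of the project's code.

import * as THREE from 'three';

// Minimal sketch: fit an orthographic camera to a model's footprint and look straight down.
// "model" is assumed to be any THREE.Object3D that is already part of a scene.
function makeTopViewCamera(model: THREE.Object3D): THREE.OrthographicCamera {
    const box = new THREE.Box3().setFromObject(model);
    const size = new THREE.Vector3();
    const center = new THREE.Vector3();
    box.getSize(size);
    box.getCenter(center);

    // Frustum matches the model's X/Z footprint; near/far chosen generously.
    const camera = new THREE.OrthographicCamera(
        size.x / -2, size.x / 2,
        size.z / 2, size.z / -2,
        0.1, 2000
    );
    camera.position.set(center.x, box.max.y + 100, center.z); // hover above the model
    camera.lookAt(center);                                     // look straight down
    camera.updateProjectionMatrix();
    return camera;
}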

This is the code that captures the model's top view:

import * as THREE from 'three';
import { OutlinePostProcess } from './OutlinePostProcess';

export class ModelCapture {
    private renderer: THREE.WebGLRenderer;
    private scene: THREE.Scene;
    private camera: THREE.OrthographicCamera;
    private outlineProcess: OutlinePostProcess;
    private width: number = 240;
    private height: number = 260;

    constructor() {
        this.scene = new THREE.Scene();
        this.renderer = new THREE.WebGLRenderer({
            antialias: true,
            alpha: true,
            preserveDrawingBuffer: true
        });
        this.camera = new THREE.OrthographicCamera(0, 0, 0, 0, 0.1, 2000);
        this.camera.position.set(0, 100, 0);
        this.camera.lookAt(0, 0, 0);

        const ambientLight = new THREE.AmbientLight(0xffffff, 1);
        this.scene.add(ambientLight);

        this.outlineProcess = new OutlinePostProcess(
            this.renderer,
            this.scene,
            this.camera,
            this.width,
            this.height
        );
        this.outlineProcess.setDefaultEnabled(true);
        this.outlineProcess.setEnabled(true);
        this.outlineProcess.makeOutlineDirty();
    }

    public captureModel(model: THREE.Group): void {
        const root = model;
        this.scene.add(root);

        // Size the render area to the model's footprint on the XZ plane
        const boundingBox = new THREE.Box3().setFromObject(root);
        const size = new THREE.Vector3();
        boundingBox.getSize(size);
        this.updateSize(size.x, size.z);

        // Strip textures and whiten the materials so the capture only shows the white shape plus outlines
        root.traverse((child: THREE.Object3D) => {
            if (child instanceof THREE.Mesh) {
                if (Array.isArray(child.material)) {
                    child.material.forEach(material => {
                        if (material.map) material.map = null;
                        material.color = new THREE.Color(1, 1, 1);
                    });
                } else if (child.material && child.material.map) {
                    child.material.map = null;
                    child.material.color = new THREE.Color(1, 1, 1);
                }
            }
        });

        this.outlineProcess.makeOutlineDirty();
        this.outlineProcess.render();

        // Read the rendered frame back as a PNG data URL and show it on the page
        const imageUrl = this.renderer.domElement.toDataURL('image/png');
        const img = document.createElement('img');
        img.id = 'model-capture';
        img.src = imageUrl;
        img.style.position = 'absolute';
        img.style.top = '0';
        img.style.right = '0';
        img.style.width = '20%';
        img.style.height = '20%';
        document.body.appendChild(img);
    }

    // Update the capture size
    public updateSize(width: number, height: number) {
        // Update the renderer size
        this.renderer.setSize(width, height);
        // Update the camera frustum parameters
        this.camera.left = width / -2;
        this.camera.right = width / 2;
        this.camera.top = height / 2;
        this.camera.bottom = height / -2;
        this.camera.updateProjectionMatrix();
        this.outlineProcess.onResize(width, height);
    }
}
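For reference, usage might look roughly like the following; the wiring of GLTFLoader to ModelCapture is my own illustration, not part of the original code (the '/bed.glb' path is the one used later in TopView).

import { GLTFLoader } from 'three/examples/jsm/loaders/GLTFLoader';
import { ModelCapture } from './ModelCapture';

// Hypothetical usage: load a model, then hand a clone to the capture utility.
const capture = new ModelCapture();
const loader = new GLTFLoader();
loader.load('/bed.glb', (gltf) => {
    // captureModel() renders the top view and appends the resulting <img> to the page
    capture.captureModel(gltf.scene.clone());
});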

To make it easier to check the top-view result, an identical scene is set up so that what the capture camera sees can be observed in real time. The code is as follows:

import * as THREE from 'three'
import { OrbitControls } from 'three/examples/jsm/controls/OrbitControls.js';
import { GLTFLoader } from 'three/examples/jsm/loaders/GLTFLoader'
import { OutlinePostProcess } from './OutlinePostProcess';
import { ModelCapture } from './ModelCapture';

export class TopView {
    private static initialized = false;
    private static renderer: THREE.WebGLRenderer;
    private static scene: THREE.Scene;
    private static camera: THREE.OrthographicCamera;
    private static outlineProcess: OutlinePostProcess;
    private static model: THREE.Group;
    private static modelCapture: ModelCapture;

    public static main() {
        TopView.modelCapture = new ModelCapture();
        if (TopView.initialized) {
            return;
        }
        TopView.initialized = true;
        console.log("TopView");

        this.scene = new THREE.Scene();
        const container = document.getElementById('main') as HTMLDivElement;
        if (!container) {
            console.error('Container element not found');
            return;
        }
        this.renderer = new THREE.WebGLRenderer({
            antialias: true,
            alpha: true,
            preserveDrawingBuffer: true
        });
        container.appendChild(this.renderer.domElement);

        this.camera = new THREE.OrthographicCamera(0, 0, 0, 0, 0.1, 2000);
        this.outlineProcess = new OutlinePostProcess(this.renderer, this.scene, this.camera, 240, 260);
        this.updateSize(240, 260);
        window.addEventListener('resize', () => {
            this.updateSize(240, 260);
        });
        this.camera.position.set(0, 100, 0);
        this.camera.lookAt(0, 0, 0);
        (globalThis as any).testCamera = TopView.camera;

        // Add an ambient light
        const ambientLight = new THREE.AmbientLight(0xffffff, 1);
        this.scene.add(ambientLight);

        // Add an axes helper
        const axesHelper = new THREE.AxesHelper(500);
        this.scene.add(axesHelper);

        // Add a grid helper
        const gridHelper = new THREE.GridHelper(1000, 20);
        this.scene.add(gridHelper);

        // Load the GLB model
        const loader = new GLTFLoader();
        loader.load('/bed.glb', (gltf: any) => {
            let root = gltf.scene;
            root.scale.set(0.1, 0.1, 0.1);
            root.rotation.set(0, 0, 0);
            // Get the model's bounding box
            const boundingBox = new THREE.Box3().setFromObject(root);
            const size = new THREE.Vector3();
            boundingBox.getSize(size);
            console.log('Model size:', size);
            TopView.scene.add(root);
            TopView.model = root.clone();
        }, undefined, (error: any) => {
            console.error('Failed to load model:', error);
        });

        // Add scene controls
        const controls = new OrbitControls(this.camera, this.renderer.domElement);
        controls.enableDamping = true;        // enable damping
        controls.dampingFactor = 0.05;        // damping factor
        controls.screenSpacePanning = false;  // disable screen-space panning
        controls.minDistance = 100;           // minimum zoom distance
        controls.maxDistance = 500;           // maximum zoom distance
        controls.maxPolarAngle = Math.PI / 2; // limit the vertical rotation angle

        // Render loop
        function animate() {
            requestAnimationFrame(animate);
            controls.update(); // update the controls
            // TopView.renderer.render(TopView.scene, TopView.camera)
            TopView.outlineProcess.makeOutlineDirty();
            TopView.outlineProcess.render();
        }
        animate();
    }

    // Update the scene size
    public static updateSize(width: number, height: number) {
        // Update the renderer size
        this.renderer.setSize(width, height);
        // Update the camera frustum parameters
        this.camera.left = width / -2;
        this.camera.right = width / 2;
        this.camera.top = height / 2;
        this.camera.bottom = height / -2;
        this.camera.updateProjectionMatrix();
        this.outlineProcess.onResize(width, height);
    }

    // Capture the current scene as seen by the top-view camera
    public static captureScene() {
        this.outlineProcess.makeOutlineDirty();
        this.outlineProcess.render();
        const imageUrl = this.renderer.domElement.toDataURL('image/png');
        const img = document.createElement('img');
        img.id = 'scene-capture';
        img.src = imageUrl;
        img.style.position = 'absolute';
        img.style.top = '0';
        img.style.left = '0';
        img.style.width = '20%';
        img.style.height = '20%';
        document.body.appendChild(img);
    }

    // Capture the loaded model with the capture utility class
    public static captureModel() {
        TopView.modelCapture.captureModel(TopView.model.clone());
    }
}

(globalThis as any).TopView = TopView;
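TopView.main() still has to be called from somewhere. A typical entry point might look like the sketch below; the file layout and the DOMContentLoaded wiring are assumptions, the only hard requirement from the class itself is a container element with id 'main'.

// Hypothetical entry point (e.g. main.ts); the page needs a <div id="main"></div>
import { TopView } from './TopView';

window.addEventListener('DOMContentLoaded', () => {
    TopView.main();
});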

You can run the following in the browser console to call the two methods on TopView and test them:

// Capture the current scene
TopView.captureScene()
// Capture with the screenshot utility class
TopView.captureModel()

In the resulting comparison, the image on the left is the captured scene and the one on the right was produced by the capture utility class.
The outline technique used here was introduced in my previous blog post. The code has been tweaked slightly to make debugging easier, so the source is included below as well.

import * as THREE from "three";
import { EffectComposer, FXAAShader, GammaCorrectionShader, RenderPass, ShaderPass, SMAAPass } from "three/examples/jsm/Addons.js";

export class OutlinePostProcess {
    private _composer!: EffectComposer;
    private _normalIdRenderTarget!: THREE.WebGLRenderTarget;
    private _renderPass!: RenderPass;
    private _outlinePass!: ShaderPass;
    private _fxaaPass!: ShaderPass;
    private _smaaPass!: SMAAPass;
    // Anti-aliasing mode, 0: FXAA, 1: SMAA
    private _aaMode: number = 0;
    private _defaultEnabled: boolean = true;
    private _enabled: boolean = true;
    private _isRenderingNormalId: boolean = false;
    private _normalIdMaterial!: THREE.ShaderMaterial;
    // Avoid re-rendering the outline every frame; no need to render when the scene has not changed
    private _outlineDirty: boolean = true;
    // Whether diagonal sampling is enabled
    private _enableDiagonalSampling: boolean = false;

    constructor(
        private renderer: THREE.WebGLRenderer,
        private scene: THREE.Scene,
        private _camera: THREE.Camera,
        private _width: number,
        private _height: number,
    ) {
        this.initNormalIdMaterial();
        this.initRenderTarget();
        this.initComposer();
    }

    public set camera(camera: THREE.Camera) {
        this._camera = camera;
        this._renderPass.camera = camera;
        this.makeOutlineDirty();
    }

    public get width() {
        const pixelRatio = this.renderer.getPixelRatio();
        return this._width * pixelRatio;
    }

    public get height() {
        const pixelRatio = this.renderer.getPixelRatio();
        return this._height * pixelRatio;
    }

    private initNormalIdMaterial() {
        this._normalIdMaterial = new THREE.ShaderMaterial({
            uniforms: {
                meshID: { value: 0.0 }
            },
            vertexShader: `
                varying vec3 vNormal;
                void main() {
                    vNormal = normalize(normalMatrix * normal);
                    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
                }
            `,
            fragmentShader: `
                uniform float meshID;
                varying vec3 vNormal;

                vec2 encodeNormal(vec3 n) {
                    vec2 enc = normalize(n.xy) * (sqrt(-n.z * 0.5 + 0.5));
                    enc = enc * 0.5 + 0.5;
                    return enc;
                }

                vec2 encodeID(float id) {
                    float tempID = id / 255.0;
                    float highID = floor(tempID);
                    return vec2(highID / 255.0, tempID - highID);
                }

                void main() {
                    vec2 encodedNormal = encodeNormal(normalize(vNormal));
                    vec2 encodedID = encodeID(meshID);
                    gl_FragColor = vec4(encodedNormal, encodedID);
                }
            `
        });
    }

    private switchMaterial(isNormalId: boolean) {
        if (isNormalId === this._isRenderingNormalId) {
            return;
        }
        let meshID = 1;
        const processMesh = (object: THREE.Object3D, parentSkipOutline: boolean = false) => {
            // If the parent node disables the outline, the current node disables it too
            const skipOutline = parentSkipOutline || object.userData.SkipOutline;
            // Check whether the object is visible
            if (!object.visible) {
                return;
            }
            if (object instanceof THREE.Mesh ||
                object instanceof THREE.Line ||
                object instanceof THREE.Points ||
                object instanceof THREE.Sprite) {
                if (isNormalId) {
                    object.userData.originalMaterial = object.material;
                    let normalIdMaterial = object.userData.normalIdMaterial;
                    if (!normalIdMaterial) {
                        normalIdMaterial = this._normalIdMaterial.clone();
                        object.userData.normalIdMaterial = normalIdMaterial;
                    }
                    normalIdMaterial.uniforms.meshID.value = skipOutline ? 0 : meshID++;
                    object.material = normalIdMaterial;
                } else {
                    object.material = object.userData.originalMaterial;
                }
            }
            // Recursively process all child nodes
            object.children.forEach(child => processMesh(child, skipOutline));
        };
        // Start from the scene root
        processMesh(this.scene);
        this._isRenderingNormalId = isNormalId;
    }

    private initRenderTarget() {
        this._normalIdRenderTarget = new THREE.WebGLRenderTarget(
            this.width,
            this.height,
            {
                format: THREE.RGBAFormat,
                type: THREE.FloatType,
                minFilter: THREE.NearestFilter,
                magFilter: THREE.NearestFilter,
                colorSpace: THREE.SRGBColorSpace,
                count: 1
            }
        );
    }

    private initComposer() {
        this._composer = new EffectComposer(this.renderer);

        // Add the main render pass
        this._renderPass = new RenderPass(this.scene, this._camera);
        this._composer.addPass(this._renderPass);

        // Placed after the render pass to fix colors that otherwise come out too dark
        const gammaCorrectionShader = new ShaderPass(GammaCorrectionShader);
        this._composer.addPass(gammaCorrectionShader);

        // Add the outline post-processing pass
        this._outlinePass = new ShaderPass({
            uniforms: {
                tDiffuse: { value: null },
                tNormalId: { value: null },
                resolution: { value: new THREE.Vector2(1 / this.width, 1 / this.height) },
                outlineColor: { value: new THREE.Vector4(0.0, 0.0, 0.0, 1.0) },
                lowIDConfig: { value: 1.0 },
                lowNormalConfig: { value: 0.8 },
                intensityConfig: { value: 0.3 },
                enableDiagonalSampling: { value: this._enableDiagonalSampling }
            },
            vertexShader: `
                varying vec2 vUv;
                void main() {
                    vUv = uv;
                    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
                }
            `,
            fragmentShader: `
                uniform sampler2D tDiffuse;
                uniform sampler2D tNormalId;
                uniform vec2 resolution;
                uniform vec4 outlineColor;
                uniform float lowIDConfig;
                uniform float lowNormalConfig;
                uniform float intensityConfig;
                uniform bool enableDiagonalSampling;
                varying vec2 vUv;

                vec3 decodeNormal(vec2 enc) {
                    vec4 nn = vec4(enc, 0.0, 0.0) * vec4(2.0, 2.0, 0.0, 0.0) + vec4(-1.0, -1.0, 1.0, -1.0);
                    float l = dot(nn.xyz, -nn.xyw);
                    nn.z = l;
                    nn.xy *= sqrt(l);
                    return nn.xyz * 2.0 + vec3(0.0, 0.0, -1.0);
                }

                float decodeID(vec2 enc) {
                    return floor((enc.x * 255.0 + enc.y) * 255.0 + 0.5);
                }

                // Sampling helper
                vec2 sampleDirection(vec2 uv, vec2 offset, vec3 currentNormal, float currentID) {
                    vec4 texSample = texture2D(tNormalId, uv + offset);
                    float id = decodeID(texSample.zw);
                    if (id < 0.5) {
                        return vec2(0.0);
                    }
                    vec3 normalSample = decodeNormal(texSample.xy);
                    float normalDiff = 1.0 - abs(dot(currentNormal, normalSample));
                    float idDiff = abs(currentID - id) < 0.0001 ? 0.0 : 1.0;
                    return vec2(normalDiff, idDiff);
                }

                void main() {
                    vec4 tex = texture2D(tNormalId, vUv);
                    if (tex.x == 0.0 && tex.y == 0.0 && tex.z == 0.0) {
                        gl_FragColor = texture2D(tDiffuse, vUv);
                        return;
                    }
                    float currentID = decodeID(tex.zw);
                    if (currentID < 0.5) {
                        gl_FragColor = texture2D(tDiffuse, vUv);
                        return;
                    }
                    vec3 currentNormal = decodeNormal(tex.xy);

                    // Sample the four axis-aligned directions with the helper
                    vec2 rightSample = sampleDirection(vUv, vec2(resolution.x, 0.0), currentNormal, currentID);
                    vec2 leftSample = sampleDirection(vUv, vec2(-resolution.x, 0.0), currentNormal, currentID);
                    vec2 downSample = sampleDirection(vUv, vec2(0.0, resolution.y), currentNormal, currentID);
                    vec2 upSample = sampleDirection(vUv, vec2(0.0, -resolution.y), currentNormal, currentID);

                    // Optionally sample the diagonal directions
                    float diagonalIdDiff = 0.0;
                    float diagonalNormalDiff = 0.0;
                    if (enableDiagonalSampling) {
                        vec2 rightUpSample = sampleDirection(vUv, vec2(resolution.x, -resolution.y), currentNormal, currentID);
                        vec2 rightDownSample = sampleDirection(vUv, vec2(resolution.x, resolution.y), currentNormal, currentID);
                        vec2 leftUpSample = sampleDirection(vUv, vec2(-resolution.x, -resolution.y), currentNormal, currentID);
                        vec2 leftDownSample = sampleDirection(vUv, vec2(-resolution.x, resolution.y), currentNormal, currentID);
                        diagonalNormalDiff = rightUpSample.x + rightDownSample.x + leftUpSample.x + leftDownSample.x;
                        diagonalIdDiff = rightUpSample.y + rightDownSample.y + leftUpSample.y + leftDownSample.y;
                    }

                    float totalIdDiff = rightSample.y + leftSample.y + downSample.y + upSample.y + diagonalIdDiff * 0.5;
                    float totalNormalDiff = rightSample.x + leftSample.x + downSample.x + upSample.x + diagonalNormalDiff * 0.5;

                    vec2 result = clamp(
                        vec2(totalNormalDiff * lowNormalConfig, totalIdDiff * lowIDConfig) * intensityConfig,
                        0.0,
                        1.0
                    );
                    float outlineStrength = max(result.x, result.y);
                    vec4 sceneColor = texture2D(tDiffuse, vUv);
                    gl_FragColor = mix(sceneColor, outlineColor, outlineStrength * outlineColor.a);
                }
            `
        });
        this._composer.addPass(this._outlinePass);

        if (this._aaMode === 0) {
            // Add the FXAA anti-aliasing pass
            this._fxaaPass = new ShaderPass(FXAAShader);
            this._fxaaPass.material.uniforms.resolution.value.x = 1 / this.width;
            this._fxaaPass.material.uniforms.resolution.value.y = 1 / this.height;
            this._composer.addPass(this._fxaaPass);
        } else {
            // Create the SMAA pass
            this._smaaPass = new SMAAPass(this.width, this.height);
            this._composer.addPass(this._smaaPass);
        }
    }

    public setEnabled(enabled: boolean) {
        this._enabled = enabled;
        if (enabled) {
            this._outlineDirty = true;
        }
    }

    public setDefaultEnabled(t: boolean) {
        this._defaultEnabled = t;
    }

    public get isEnabled(): boolean {
        return this._enabled;
    }

    public get isDefaultEnabled() {
        return this._defaultEnabled;
    }

    public onResize(w: number, h: number) {
        this._width = w;
        this._height = h;
        // Update the renderer size
        this.renderer.setSize(this.width, this.height, false);
        // Update the post-processing sizes
        this._normalIdRenderTarget.setSize(this.width, this.height);
        this._composer.setSize(this.width, this.height);
        this._outlinePass.uniforms.resolution.value.set(1 / this.width, 1 / this.height);
        // Update the anti-aliasing pass size
        if (this._aaMode === 0) {
            this._fxaaPass.material.uniforms.resolution.value.x = 1 / this.width;
            this._fxaaPass.material.uniforms.resolution.value.y = 1 / this.height;
        } else {
            this._smaaPass.setSize(this.width, this.height);
        }
    }

    public render() {
        if (!this._enabled) {
            // If the outline effect is disabled, just do a plain render
            this.renderer.render(this.scene, this._camera);
            return;
        }
        // Render normals and IDs into the render target
        if (this._outlineDirty) {
            this.switchMaterial(true);
            this.renderer.setRenderTarget(this._normalIdRenderTarget);
            this.renderer.render(this.scene, this._camera);
            // note: the flag is left set here, so the normal/ID pass re-runs on every render (handy while debugging)
            this._outlineDirty = true;
        }
        // Update the outline pass texture
        this._outlinePass.uniforms.tNormalId.value = this._normalIdRenderTarget.texture;
        // this.showRenderTarget(this.renderer, this._normalIdRenderTarget, this.width, this.height);
        // Restore the normal materials
        this.switchMaterial(false);
        this.renderer.setRenderTarget(null);
        // Run the post-processing render
        this._composer.render();
    }

    public makeOutlineDirty() {
        this._outlineDirty = true;
    }

    public setLowIDConfig(value: number) {
        this._outlinePass.uniforms.lowIDConfig.value = value;
        this.makeOutlineDirty();
    }

    public getLowIDConfig() {
        return this._outlinePass.uniforms.lowIDConfig.value;
    }

    public setLowNormalConfig(value: number) {
        this._outlinePass.uniforms.lowNormalConfig.value = value;
        this.makeOutlineDirty();
    }

    public getLowNormalConfig() {
        return this._outlinePass.uniforms.lowNormalConfig.value;
    }

    public setIntensityConfig(value: number) {
        this._outlinePass.uniforms.intensityConfig.value = value;
        this.makeOutlineDirty();
    }

    public getIntensityConfig() {
        return this._outlinePass.uniforms.intensityConfig.value;
    }

    // Enable or disable diagonal sampling
    public setEnableDiagonalSampling(enable: boolean) {
        this._enableDiagonalSampling = enable;
        this._outlinePass.uniforms.enableDiagonalSampling.value = enable;
        this.makeOutlineDirty();
    }

    // Whether diagonal sampling is enabled
    public getEnableDiagonalSampling(): boolean {
        return this._enableDiagonalSampling;
    }

    public getOutlineColor(): THREE.Vector4 {
        return this._outlinePass.uniforms.outlineColor.value;
    }

    public setOutlineColor(x: number, y: number, z: number) {
        this._outlinePass.uniforms.outlineColor.value.set(x, y, z, 1);
    }

    public showRenderTarget(render: THREE.WebGLRenderer, target: THREE.WebGLRenderTarget, width: number, height: number) {
        // Pick the correct data type for the render target's format
        let pixels;
        if (target.texture.type === THREE.FloatType) {
            pixels = new Float32Array(width * height * 4);
        } else {
            pixels = new Uint8Array(width * height * 4);
        }
        // Read the render target's pixel data
        render.setRenderTarget(target);
        render.readRenderTargetPixels(target, 0, 0, width, height, pixels);
        render.setRenderTarget(null);

        // Show the canvas data in an img element
        let imgElement = document.getElementById('normalIdTexture') as HTMLImageElement;
        if (!imgElement) {
            imgElement = document.createElement('img');
            imgElement.id = 'normalIdTexture';
            // Style the image so it is visible
            imgElement.style.position = 'fixed';
            imgElement.style.top = '120px';
            imgElement.style.left = '10px';
            imgElement.style.width = '400px';
            imgElement.style.height = 'auto';
            imgElement.style.border = '1px solid #ccc';
            imgElement.style.zIndex = '100000';
            document.body.appendChild(imgElement);
        }

        const canvas = document.createElement('canvas');
        canvas.width = width;
        canvas.height = height;
        const ctx = canvas.getContext('2d');
        if (ctx) {
            let uint8ClampedArray;
            if (pixels instanceof Float32Array) {
                // If the data is a Float32Array, convert it to a Uint8ClampedArray
                uint8ClampedArray = new Uint8ClampedArray(width * height * 4);
                for (let i = 0; i < pixels.length; i++) {
                    uint8ClampedArray[i] = Math.min(255, Math.max(0, pixels[i] * 255));
                }
            } else {
                uint8ClampedArray = new Uint8ClampedArray(pixels);
            }
            // Force the alpha channel to be opaque
            // for (let i = 3; i < pixels.length; i += 4) {
            //     uint8ClampedArray[i] = 255;
            // }
            const imageData = new ImageData(uint8ClampedArray, width, height);
            // Create a temporary canvas to hold the raw image
            const tempCanvas = document.createElement('canvas');
            tempCanvas.width = width;
            tempCanvas.height = height;
            const tempCtx = tempCanvas.getContext('2d');
            if (tempCtx) {
                tempCtx.putImageData(imageData, 0, 0);
                // Flip the image vertically with a canvas transform
                ctx.save();
                ctx.scale(1, -1);
                ctx.translate(0, -height);
                ctx.drawImage(tempCanvas, 0, 0);
                ctx.restore();
            }
        }
        imgElement.src = canvas.toDataURL();
    }
}
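As a quick reference for the class above, wiring the outline pass into a bare renderer/scene/camera might look like the sketch below. The scene contents and parameter values are illustrative assumptions; only the OutlinePostProcess calls come from the class itself.

import * as THREE from 'three';
import { OutlinePostProcess } from './OutlinePostProcess';

// Hypothetical wiring: a minimal scene plus the outline post-process.
const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(240, 260);
document.body.appendChild(renderer.domElement);

const scene = new THREE.Scene();
scene.add(new THREE.AmbientLight(0xffffff, 1));

const camera = new THREE.PerspectiveCamera(45, 240 / 260, 0.1, 2000);
camera.position.set(0, 100, 200);
camera.lookAt(0, 0, 0);

const box = new THREE.Mesh(
    new THREE.BoxGeometry(20, 20, 20),
    new THREE.MeshStandardMaterial({ color: 0x88ccff })
);
scene.add(box);

const outline = new OutlinePostProcess(renderer, scene, camera, 240, 260);
outline.setOutlineColor(0, 0, 0);        // black outline (alpha is forced to 1)
outline.setEnableDiagonalSampling(true); // also compare the four diagonal neighbours

// Objects flagged this way (and their children) are skipped by the outline pass:
// box.userData.SkipOutline = true;

function animate() {
    requestAnimationFrame(animate);
    outline.render(); // the normal/ID pass runs while the dirty flag is set
}
outline.makeOutlineDirty();
animate();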
