Flink Transformation Operators (work in progress)

This post covers Flink's Transformation operators.

Transformation Operators

map (DataStream → DataStream)

Calling map on a DataStream returns a new DataStream. Each element of the input stream is iterated over and passed through the mapping function supplied to map, and the results form the new stream. Every input element produces exactly one output element, so there is a one-to-one mapping between the elements of the original DataStream and those of the new one.
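The one-to-one semantics can be illustrated outside Flink with a plain-Java sketch using java.util.stream; this is an analogy for DataStream.map, not the Flink API itself:

```java
import java.util.List;
import java.util.stream.Collectors;

public class MapSemantics {
    // One input element -> exactly one output element, like DataStream.map.
    static List<Integer> lengths(List<String> words) {
        return words.stream().map(String::length).collect(Collectors.toList());
    }

    public static void main(String[] args) {
        // Three inputs produce exactly three outputs.
        System.out.println(lengths(List.of("flink", "map", "demo"))); // [5, 3, 4]
    }
}
```

The full Flink example below does the same kind of per-element transformation, but uses a RichMapFunction so it can open a JDBC connection once per subtask.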
package com.lyj.sx.flink.day03;

import org.apache.flink.streaming.api.datastream.DataStreamSource;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class MapDemo {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        DataStreamSource<String> dataSource = env.socketTextStream("192.168.25.62", 8899);
        dataSource.map(new QueryCategoryNameFromMySQLFunction()).print();
        env.execute();
    }
}
package com.lyj.sx.flink.day03;

import org.apache.flink.api.common.functions.RichMapFunction;
import org.apache.flink.api.java.tuple.Tuple4;
import org.apache.flink.configuration.Configuration;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class QueryCategoryNameFromMySQLFunction extends RichMapFunction<String, Tuple4<String, String, String, Double>> {

    private Connection connection;
    private PreparedStatement preparedStatement;

    @Override
    public void open(Configuration parameters) throws Exception {
        connection = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/dy_flink?characterEncoding=utf-8", "root", "");
        preparedStatement = connection.prepareStatement("select name from t_category where id = ?");
        super.open(parameters);
    }

    @Override
    public Tuple4<String, String, String, Double> map(String s) throws Exception {
        String[] fields = s.split(",");
        String cid = fields[0];
        preparedStatement.setInt(1, Integer.parseInt(cid));
        ResultSet resultSet = preparedStatement.executeQuery();
        String name = "未知";
        while (resultSet.next()) {
            name = resultSet.getString(1);
        }
        resultSet.close();
        return Tuple4.of(fields[0], fields[1], name, Double.parseDouble(fields[3]));
    }

    @Override
    public void close() throws Exception {
        // Close the statement before the connection (the original closed them in the reverse order).
        if (preparedStatement != null) {
            preparedStatement.close();
        }
        if (connection != null) {
            connection.close();
        }
    }
}

flatMap (DataStream → DataStream)

- Calling flatMap on a DataStream returns a new DataStream. Each input element is iterated over and passed through the function supplied to flatMap, which may emit zero or more output elements per input. "Flattening" means a single input record can be expanded into multiple output records.
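The one-to-many semantics can likewise be sketched in plain Java with java.util.stream's flatMap; again this is an analogy for DataStream.flatMap, not the Flink API:

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class FlatMapSemantics {
    // One input element -> zero or more output elements, like DataStream.flatMap.
    static List<String> words(List<String> lines) {
        return lines.stream()
                .flatMap(line -> Arrays.stream(line.split(" ")))
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        // Two input lines expand into five output words.
        System.out.println(words(List.of("hello flink", "spark hadoop flink")));
    }
}
```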
package com.lyj.sx.flink.day03;

import org.apache.flink.api.common.functions.FlatMapFunction;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.datastream.DataStreamSource;
import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.util.Collector;

public class FlatMapDemo1 {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.createLocalEnvironmentWithWebUI(new Configuration());
        DataStreamSource<String> source = env.socketTextStream("192.168.25.62", 8889);
        SingleOutputStreamOperator<Tuple2<String, Integer>> flatMapData = source.flatMap(
                new FlatMapFunction<String, Tuple2<String, Integer>>() {
                    @Override
                    public void flatMap(String value, Collector<Tuple2<String, Integer>> out) throws Exception {
                        String[] strings = value.split(" ");
                        for (String string : strings) {
                            out.collect(Tuple2.of(string, 1));
                        }
                    }
                });
        flatMapData.print();
        env.execute("pxj");
    }
}
package com.lyj.sx.flink.day03;

import org.apache.flink.api.common.typeinfo.TypeHint;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.datastream.DataStreamSource;
import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class FlatMapDemo2 {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.createLocalEnvironmentWithWebUI(new Configuration());
        DataStreamSource<String> source = env.socketTextStream("192.168.25.62", 8887);
        SingleOutputStreamOperator<Tuple2<String, Integer>> myflatMap = source.transform(
                "MyflatMap",
                TypeInformation.of(new TypeHint<Tuple2<String, Integer>>() {}),
                new MyflatMap());
        myflatMap.print();
        env.execute("pxj");
    }
}
package com.lyj.sx.flink.day03;

import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.api.operators.AbstractStreamOperator;
import org.apache.flink.streaming.api.operators.OneInputStreamOperator;
import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;

public class MyflatMap extends AbstractStreamOperator<Tuple2<String, Integer>>
        implements OneInputStreamOperator<String, Tuple2<String, Integer>> {

    @Override
    public void processElement(StreamRecord<String> element) throws Exception {
        String[] split = element.getValue().split(",");
        for (String s : split) {
            output.collect(element.replace(Tuple2.of(s, 1)));
        }
    }

    @Override
    public void setKeyContextElement(StreamRecord<String> record) throws Exception {
        System.out.println("StreamRecord..............");
        OneInputStreamOperator.super.setKeyContextElement(record);
    }
}

keyBy (DataStream → KeyedStream)

keyBy partitions the stream by the given key: elements with the same key are routed to the same parallel subtask, and the result is a KeyedStream on which keyed aggregations and keyed state can be used.

package com.lyj.sx.flink.day03;

import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.api.datastream.DataStreamSource;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class KeyByDemo1 {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        DataStreamSource<String> source = env.socketTextStream("192.168.25.62", 8888);
        MapFunction<String, Tuple2<String, Integer>> mapFunction = new MapFunction<String, Tuple2<String, Integer>>() {
            Tuple2<String, Integer> t;

            @Override
            public Tuple2<String, Integer> map(String s) throws Exception {
                String[] strings = s.split(" ");
                // map is one-to-one, so only the tuple built for the last word is returned;
                // to emit one tuple per word, use flatMap instead.
                for (String string : strings) {
                    t = Tuple2.of(string, 1);
                }
                return t;
            }
        };
        source.map(mapFunction).print();
        env.execute("pxj");
    }
}
package com.lyj.sx.flink.day03;

import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.api.java.tuple.Tuple;
import org.apache.flink.api.java.tuple.Tuple3;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.datastream.DataStreamSource;
import org.apache.flink.streaming.api.datastream.KeyedStream;
import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class KeyByDemo2 {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.createLocalEnvironmentWithWebUI(new Configuration());
        DataStreamSource<String> source = env.socketTextStream("192.168.25.62", 8886);
        SingleOutputStreamOperator<Tuple3<String, String, Double>> tpStream = source.map(
                new MapFunction<String, Tuple3<String, String, Double>>() {
                    @Override
                    public Tuple3<String, String, Double> map(String s) throws Exception {
                        String[] fields = s.split(",");
                        return Tuple3.of(fields[0], fields[1], Double.parseDouble(fields[2]));
                    }
                });
        // keyBy on field names is deprecated in newer Flink versions; a KeySelector is preferred.
        KeyedStream<Tuple3<String, String, Double>, Tuple> tuple3TupleKeyedStream = tpStream.keyBy("f0", "f1");
        tuple3TupleKeyedStream.print();
        env.execute("pxj");
    }
}
package com.lyj.sx.flink.day03;

import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.api.java.functions.KeySelector;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.datastream.DataStreamSource;
import org.apache.flink.streaming.api.datastream.KeyedStream;
import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class KeyByDemo3 {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.createLocalEnvironmentWithWebUI(new Configuration());
        DataStreamSource<String> source = env.socketTextStream("192.168.25.62", 8886);
        SingleOutputStreamOperator<Tuple2<String, Integer>> map = source.map(
                new MapFunction<String, Tuple2<String, Integer>>() {
                    Tuple2<String, Integer> t;

                    @Override
                    public Tuple2<String, Integer> map(String s) throws Exception {
                        // As in KeyByDemo1, only the last word's tuple is returned.
                        for (String string : s.split(",")) {
                            t = Tuple2.of(string, 1);
                        }
                        return t;
                    }
                });
        map.print();
        KeyedStream<Tuple2<String, Integer>, String> keyedStream = map.keyBy(
                new KeySelector<Tuple2<String, Integer>, String>() {
                    @Override
                    public String getKey(Tuple2<String, Integer> value) throws Exception {
                        return value.f0;
                    }
                });
        keyedStream.print();
        env.execute("pxj");
    }
}
package com.lyj.sx.flink.day03;

import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.api.java.functions.KeySelector;
import org.apache.flink.api.java.tuple.Tuple3;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.datastream.DataStreamSource;
import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class KeyByDemo4 {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.createLocalEnvironmentWithWebUI(new Configuration());
        DataStreamSource<String> source = env.socketTextStream("192.168.25.62", 8886);
        SingleOutputStreamOperator<Tuple3<String, String, Double>> mapped = source.map(
                new MapFunction<String, Tuple3<String, String, Double>>() {
                    @Override
                    public Tuple3<String, String, Double> map(String s) throws Exception {
                        String[] fields = s.split(",");
                        return Tuple3.of(fields[0], fields[1], Double.parseDouble(fields[2]));
                    }
                });
        mapped.keyBy(new KeySelector<Tuple3<String, String, Double>, String>() {
            @Override
            public String getKey(Tuple3<String, String, Double> key) throws Exception {
                // Note: plain concatenation can collide ("ab"+"c" equals "a"+"bc");
                // inserting a separator avoids this.
                return key.f0 + key.f1;
            }
        }).print();
        env.execute("pxj");
    }
}
package com.lyj.sx.flink.day03;

import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.api.common.typeinfo.TypeHint;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.api.java.tuple.Tuple3;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.datastream.DataStreamSource;
import org.apache.flink.streaming.api.datastream.KeyedStream;
import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class KeyByDemo5 {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.createLocalEnvironmentWithWebUI(new Configuration());
        DataStreamSource<String> source = env.socketTextStream("192.168.25.62", 8886);
        // Sample input: 山东省,济南市,5000
        SingleOutputStreamOperator<Tuple3<String, String, Double>> map = source.map(
                new MapFunction<String, Tuple3<String, String, Double>>() {
                    @Override
                    public Tuple3<String, String, Double> map(String s) throws Exception {
                        String[] fields = s.split(",");
                        return Tuple3.of(fields[0], fields[1], Double.parseDouble(fields[2]));
                    }
                });
        KeyedStream<Tuple3<String, String, Double>, Tuple2<String, String>> tuple3Tuple2KeyedStream =
                map.keyBy(t -> Tuple2.of(t.f0, t.f1),
                        TypeInformation.of(new TypeHint<Tuple2<String, String>>() {}));
        tuple3Tuple2KeyedStream.print();
        env.execute("pxj");
    }
}
package com.lyj.sx.flink.day03;

import com.lyj.sx.flink.bean.DataBean;
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.datastream.DataStreamSource;
import org.apache.flink.streaming.api.datastream.KeyedStream;
import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class KeyByDemo6 {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.createLocalEnvironmentWithWebUI(new Configuration());
        DataStreamSource<String> source = env.socketTextStream("192.168.25.62", 8886);
        SingleOutputStreamOperator<DataBean> beanStream = source.map(new MapFunction<String, DataBean>() {
            @Override
            public DataBean map(String s) throws Exception {
                String[] fields = s.split(",");
                return DataBean.of(fields[0], fields[1], Double.parseDouble(fields[2]));
            }
        });
        // Keying by the whole bean requires DataBean to implement hashCode/equals consistently.
        KeyedStream<DataBean, DataBean> keyedStream = beanStream.keyBy(t -> t);
        keyedStream.print();
        env.execute("pxj");
    }
}
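Outside Flink, the effect of keyBy followed by a keyed aggregation (e.g. keyBy(t -> t.f0).sum(1)) can be sketched with plain Java's Collectors.groupingBy; this is an analogy for grouping by key, not the Flink API:

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class KeyBySemantics {
    // Group words by their own value and sum a count of 1 per occurrence,
    // analogous to keyBy(t -> t.f0).sum(1) over (word, 1) tuples in Flink.
    static Map<String, Integer> wordCount(List<String> words) {
        return words.stream()
                .collect(Collectors.groupingBy(w -> w, Collectors.summingInt(w -> 1)));
    }

    public static void main(String[] args) {
        System.out.println(wordCount(List.of("a", "b", "a", "a")));
    }
}
```

Unlike this batch analogy, Flink's keyed aggregation is incremental: each incoming element updates and emits the running result for its key.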

filter (DataStream → DataStream)

- Calling filter on a DataStream returns a new DataStream. Each input element is iterated over and passed through the predicate supplied to filter: if it returns true the element is kept, and if it returns false the element is dropped.
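The keep-or-drop semantics can be sketched in plain Java with java.util.stream's filter; again an analogy for DataStream.filter, not the Flink API:

```java
import java.util.List;
import java.util.stream.Collectors;

public class FilterSemantics {
    // Keep only elements for which the predicate returns true, like DataStream.filter.
    static List<String> startsWithH(List<String> words) {
        return words.stream()
                .filter(w -> w.startsWith("h"))
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        // "flink" is dropped because the predicate returns false for it.
        System.out.println(startsWithH(List.of("hello", "flink", "hadoop")));
    }
}
```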
package com.lyj.sx.flink.day03;

import org.apache.flink.api.common.functions.FilterFunction;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.datastream.DataStreamSource;
import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class FilterDemo1 {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.createLocalEnvironmentWithWebUI(new Configuration());
        DataStreamSource<String> source = env.socketTextStream("192.168.25.62", 8886);
        SingleOutputStreamOperator<String> filter = source.filter(new FilterFunction<String>() {
            @Override
            public boolean filter(String s) throws Exception {
                return s.startsWith("h");
            }
        }).setParallelism(2);
        filter.print();
        env.execute("pxj");
    }
}
package com.lyj.sx.flink.day03;

import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.datastream.DataStreamSource;
import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.operators.StreamFilter;

public class FilterDemo2 {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.createLocalEnvironmentWithWebUI(new Configuration());
        DataStreamSource<String> source = env.socketTextStream("192.168.25.62", 8886);
        SingleOutputStreamOperator<String> transform = source.transform(
                "myfilter",
                TypeInformation.of(String.class),
                new StreamFilter<>(w -> w.startsWith("h")));
        transform.print();
        env.execute("pxj");
    }
}
package com.lyj.sx.flink.day03;

import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.datastream.DataStreamSource;
import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class FilterDemo3 {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.createLocalEnvironmentWithWebUI(new Configuration());
        DataStreamSource<String> source = env.socketTextStream("192.168.25.62", 8886);
        SingleOutputStreamOperator<String> transform = source.transform(
                "MyFilterFunction",
                TypeInformation.of(String.class),
                new MyFilterFunction());
        transform.print();
        env.execute("pxj");
    }
}
package com.lyj.sx.flink.day03;

import org.apache.flink.streaming.api.operators.AbstractStreamOperator;
import org.apache.flink.streaming.api.operators.OneInputStreamOperator;
import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;

public class MyFilterFunction extends AbstractStreamOperator<String>
        implements OneInputStreamOperator<String, String> {

    @Override
    public void processElement(StreamRecord<String> element) throws Exception {
        String elementValue = element.getValue();
        if (elementValue.startsWith("h")) {
            output.collect(element);
        }
    }

    @Override
    public void setKeyContextElement(StreamRecord<String> record) throws Exception {
        OneInputStreamOperator.super.setKeyContextElement(record);
        System.out.println("setKeyContextElement.........");
    }
}

Compiled by: pxj_sx (潘陈)
Date: 2024-06-02 16:06:42
