DataX: Two-Way Data Synchronization Between HBase and MySQL

Reference article: "datax mysql 和hbase的 相互导入" (mutual import between MySQL and HBase with DataX)

Contents

0. Software versions

1. Syncing HBase data to MySQL

1.1 HBase data

1.2 MySQL data

1.3 JSON job file (hbase2mysql.json)

1.4 Successful sync log

2. Syncing MySQL data to HBase

2.1 HBase data

2.2 MySQL data

2.3 JSON job file (mysql2hbase.json)

2.4 Successful sync log

3. Summary


0. Software versions

  1. HBase version: 2.0.5
  2. MySQL version: 8.0.34

1. Syncing HBase data to MySQL

1.1 HBase data

The source table "bigdata:student" contains two rows (rowkeys 1001 and 1002). The scan below shows the data to be synced; it is followed by the full HBase shell session used to create the namespaces and tables and to insert the test rows.

hbase(main):018:0> scan "bigdata:student"
ROW                            COLUMN+CELL
 1001                          column=info:age, timestamp=1712562704820, value=18
 1001                          column=info:name, timestamp=1712562696088, value=lisi
 1002                          column=info:age, timestamp=1712566667737, value=222
 1002                          column=info:name, timestamp=1712566689576, value=\xE5\xAE\x8B\xE5\xA3\xB9
2 row(s)
Took 0.0805 seconds                                                                                                                                                                                                                                           
hbase(main):019:0> 
[atguigu@node001 hbase]$ cd hbase-2.0.5/
[atguigu@node001 hbase-2.0.5]$ bin/hbase shell
For more on the HBase Shell, see http://hbase.apache.org/book.html
hbase(main):002:0> create_namespace 'bigdata'
Took 3.4979 seconds                                                                                                                                                                                                                                           
hbase(main):003:0> list_namespace
NAMESPACE                                                                                                                                                                                                                                                     
EDU_REALTIME                                                                                                                                                                                                                                                  
SYSTEM                                                                                                                                                                                                                                                        
bigdata                                                                                                                                                                                                                                                       
default                                                                                                                                                                                                                                                       
hbase                                                                                                                                                                                                                                                         
5 row(s)
Took 0.1244 seconds                                                                                                                                                                                                                                           
hbase(main):004:0> create_namespace 'bigdata2'
Took 0.5109 seconds                                                                                                                                                                                                                                           
hbase(main):005:0> list_namespace
NAMESPACE                                                                                                                                                                                                                                                     
EDU_REALTIME                                                                                                                                                                                                                                                  
SYSTEM                                                                                                                                                                                                                                                        
bigdata                                                                                                                                                                                                                                                       
bigdata2                                                                                                                                                                                                                                                      
default                                                                                                                                                                                                                                                       
hbase                                                                                                                                                                                                                                                         
6 row(s)
Took 0.0450 seconds                                                                                                                                                                                                                                           
hbase(main):006:0> create 'bigdata:student', {NAME => 'info', VERSIONS =>5}, {NAME => 'msg'}
Created table bigdata:student
Took 4.7854 seconds                                                                                                                                                                                                                                           
=> Hbase::Table - bigdata:student
hbase(main):007:0> create 'bigdata2:student', {NAME => 'info', VERSIONS =>5}, {NAME => 'msg'}
Created table bigdata2:student
Took 2.4732 seconds                                                                                                                                                                                                                                           
=> Hbase::Table - bigdata2:student
hbase(main):008:0> list
TABLE                                                                                                                                                                                                                                                         
EDU_REALTIME:DIM_BASE_CATEGORY_INFO                                                                                                                                                                                                                           
EDU_REALTIME:DIM_BASE_PROVINCE                                                                                                                                                                                                                                
EDU_REALTIME:DIM_BASE_SOURCE                                                                                                                                                                                                                                  
EDU_REALTIME:DIM_BASE_SUBJECT_INFO                                                                                                                                                                                                                            
EDU_REALTIME:DIM_CHAPTER_INFO                                                                                                                                                                                                                                 
EDU_REALTIME:DIM_COURSE_INFO                                                                                                                                                                                                                                  
EDU_REALTIME:DIM_KNOWLEDGE_POINT                                                                                                                                                                                                                              
EDU_REALTIME:DIM_TEST_PAPER                                                                                                                                                                                                                                   
EDU_REALTIME:DIM_TEST_PAPER_QUESTION                                                                                                                                                                                                                          
EDU_REALTIME:DIM_TEST_POINT_QUESTION                                                                                                                                                                                                                          
EDU_REALTIME:DIM_TEST_QUESTION_INFO                                                                                                                                                                                                                           
EDU_REALTIME:DIM_TEST_QUESTION_OPTION                                                                                                                                                                                                                         
EDU_REALTIME:DIM_USER_INFO                                                                                                                                                                                                                                    
EDU_REALTIME:DIM_VIDEO_INFO                                                                                                                                                                                                                                   
SYSTEM:CATALOG                                                                                                                                                                                                                                                
SYSTEM:FUNCTION                                                                                                                                                                                                                                               
SYSTEM:LOG                                                                                                                                                                                                                                                    
SYSTEM:MUTEX                                                                                                                                                                                                                                                  
SYSTEM:SEQUENCE                                                                                                                                                                                                                                               
SYSTEM:STATS                                                                                                                                                                                                                                                  
bigdata2:student                                                                                                                                                                                                                                              
bigdata:student                                                                                                                                                                                                                                               
22 row(s)
Took 0.0711 seconds                                                                                                                                                                                                                                           
=> ["EDU_REALTIME:DIM_BASE_CATEGORY_INFO", "EDU_REALTIME:DIM_BASE_PROVINCE", "EDU_REALTIME:DIM_BASE_SOURCE", "EDU_REALTIME:DIM_BASE_SUBJECT_INFO", "EDU_REALTIME:DIM_CHAPTER_INFO", "EDU_REALTIME:DIM_COURSE_INFO", "EDU_REALTIME:DIM_KNOWLEDGE_POINT", "EDU_REALTIME:DIM_TEST_PAPER", "EDU_REALTIME:DIM_TEST_PAPER_QUESTION", "EDU_REALTIME:DIM_TEST_POINT_QUESTION", "EDU_REALTIME:DIM_TEST_QUESTION_INFO", "EDU_REALTIME:DIM_TEST_QUESTION_OPTION", "EDU_REALTIME:DIM_USER_INFO", "EDU_REALTIME:DIM_VIDEO_INFO", "SYSTEM:CATALOG", "SYSTEM:FUNCTION", "SYSTEM:LOG", "SYSTEM:MUTEX", "SYSTEM:SEQUENCE", "SYSTEM:STATS", "bigdata2:student", "bigdata:student"]
hbase(main):009:0> put 'bigdata:student','1001','info:name','zhangsan'
Took 0.8415 seconds                                                                                                                                                                                                                                           
hbase(main):010:0> put 'bigdata:student','1001','info:name','lisi'
Took 0.0330 seconds                                                                                                                                                                                                                                           
hbase(main):011:0> put 'bigdata:student','1001','info:age','18'
Took 0.0201 seconds                                                                                                                                                                                                                                           
hbase(main):012:0> get 'bigdata:student','1001'
COLUMN                         CELL
 info:age                      timestamp=1712562704820, value=18
 info:name                     timestamp=1712562696088, value=lisi
1 row(s)
Took 0.4235 seconds                                                                                                                                                                                                                                           
hbase(main):013:0> scan student
NameError: undefined local variable or method `student' for main:Object
hbase(main):014:0> scan bigdata:student
NameError: undefined local variable or method `student' for main:Object
hbase(main):015:0> scan "bigdata:student"
ROW                            COLUMN+CELL
 1001                          column=info:age, timestamp=1712562704820, value=18
 1001                          column=info:name, timestamp=1712562696088, value=lisi
1 row(s)
Took 0.9035 seconds                                                                                                                                                                                                                                           
hbase(main):016:0> put 'bigdata:student','1002','info:age','222'
Took 0.0816 seconds                                                                                                                                                                                                                                           
hbase(main):017:0> put 'bigdata:student','1002','info:name','宋壹'
Took 0.0462 seconds                                                                                                                                                                                                                                           
hbase(main):018:0> scan "bigdata:student"
ROW                            COLUMN+CELL
 1001                          column=info:age, timestamp=1712562704820, value=18
 1001                          column=info:name, timestamp=1712562696088, value=lisi
 1002                          column=info:age, timestamp=1712566667737, value=222
 1002                          column=info:name, timestamp=1712566689576, value=\xE5\xAE\x8B\xE5\xA3\xB9
2 row(s)
Took 0.0805 seconds                                                                                                                                                                                                                                           
hbase(main):019:0> 

1.2 MySQL data

The target table test.student, exported from Navicat at 2024-04-08 17:11, i.e. after the sync job in section 1.4 had run:

SELECT VERSION(); -- check the MySQL version

/*
 Navicat Premium Data Transfer

 Source Server         : 大数据-node001
 Source Server Type    : MySQL
 Source Server Version : 80034 (8.0.34)
 Source Host           : node001:3306
 Source Schema         : test

 Target Server Type    : MySQL
 Target Server Version : 80034 (8.0.34)
 File Encoding         : 65001

 Date: 08/04/2024 17:11:56
*/

SET NAMES utf8mb4;
SET FOREIGN_KEY_CHECKS = 0;

-- ----------------------------
-- Table structure for student
-- ----------------------------
DROP TABLE IF EXISTS `student`;
CREATE TABLE `student`  (
  `info` varchar(255) CHARACTER SET utf8mb4 COLLATE utf8mb4_general_ci NULL DEFAULT NULL,
  `msg` varchar(255) CHARACTER SET utf8mb4 COLLATE utf8mb4_general_ci NULL DEFAULT NULL
) ENGINE = InnoDB CHARACTER SET = utf8mb4 COLLATE = utf8mb4_general_ci ROW_FORMAT = Dynamic;

-- ----------------------------
-- Records of student
-- ----------------------------
INSERT INTO `student` VALUES ('111', '111111');
INSERT INTO `student` VALUES ('222', '222222');
INSERT INTO `student` VALUES ('18', 'lisi');
INSERT INTO `student` VALUES ('222', '宋壹');

SET FOREIGN_KEY_CHECKS = 1;
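In this dump, the rows ('18', 'lisi') and ('222', '宋壹') are the two records DataX copied over from HBase rowkeys 1001 and 1002; the remaining two rows were not written by this job. Because the job in section 1.3 uses "writeMode": "insert", re-running it appends the same rows again instead of updating them, so it can be useful to clear the table before repeating the test. A minimal sketch, assuming the mysql command-line client is available and using the host, credentials, and table that appear elsewhere in this post:

# Hypothetical cleanup before re-running the HBase-to-MySQL job;
# host, user, password, and table are the ones used in this post.
mysql -h node001 -uroot -p123456 test -e "TRUNCATE TABLE student;"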

1.3 JSON job file (hbase2mysql.json)

This job reads the info:age and info:name columns from "bigdata:student" with hbase11xreader and inserts them into test.student with mysqlwriter.

{"job": {"content": [{"reader": {"name": "hbase11xreader","parameter": {"hbaseConfig": {"hbase.zookeeper.quorum": "node001:2181"},"table": "bigdata:student","encoding": "utf-8","mode": "normal","column": [{"name": "info:age","type": "string"},{"name": "info:name","type": "string"}],"range": {"startRowkey": "","endRowkey": "","isBinaryRowkey": true}}},"writer": {"name": "mysqlwriter","parameter": {"column": ["info","msg"],"connection": [{"jdbcUrl": "jdbc:mysql://node001:3306/test","table": ["student"]}],"username": "root","password": "123456","preSql": [],"session": [],"writeMode": "insert"}}}],"setting": {"speed": {"channel": "1"}}}
}
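The reader and writer column lists above are matched by position: info:age is written into the info column and info:name into msg. A minimal way to run and verify the job, assuming DataX lives under /opt/module/datax and the JSON above is saved as job/hbase/hbase2mysql.json (the paths used in this post), and that the mysql client is installed:

# Run the HBase-to-MySQL job (paths as used in this post).
cd /opt/module/datax
python bin/datax.py job/hbase/hbase2mysql.json

# Check the result on the MySQL side.
mysql -h node001 -uroot -p123456 test -e "SELECT info, msg FROM student;"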

1.4 Successful sync log

[atguigu@node001 datax]$ python bin/datax.py job/hbase/hbase2mysql.json

DataX (DATAX-OPENSOURCE-3.0), From Alibaba !
Copyright (C) 2010-2017, Alibaba Group. All Rights Reserved.

2024-04-08 17:02:00.785 [main] INFO  VMInfo - VMInfo# operatingSystem class => sun.management.OperatingSystemImpl
2024-04-08 17:02:00.804 [main] INFO  Engine - the machine info  =>

	osInfo:	Red Hat, Inc. 1.8 25.372-b07
	jvmInfo:	Linux amd64 3.10.0-862.el7.x86_64
	cpu num:	4

	totalPhysicalMemory:	-0.00G
	freePhysicalMemory:	-0.00G
	maxFileDescriptorCount:	-1
	currentOpenFileDescriptorCount:	-1

	GC Names	[PS MarkSweep, PS Scavenge]

	MEMORY_NAME                    | allocation_size                | init_size
	PS Eden Space                  | 256.00MB                       | 256.00MB
	Code Cache                     | 240.00MB                       | 2.44MB
	Compressed Class Space         | 1,024.00MB                     | 0.00MB
	PS Survivor Space              | 42.50MB                        | 42.50MB
	PS Old Gen                     | 683.00MB                       | 683.00MB
	Metaspace                      | -0.00MB                        | 0.00MB

2024-04-08 17:02:00.840 [main] INFO  Engine -
{"content":[{"reader":{"name":"hbase11xreader","parameter":{"column":[{"name":"info:age","type":"string"},{"name":"info:name","type":"string"}],"encoding":"utf-8","hbaseConfig":{"hbase.zookeeper.quorum":"node001:2181"},"mode":"normal","range":{"endRowkey":"","isBinaryRowkey":true,"startRowkey":""},"table":"bigdata:student"}},"writer":{"name":"mysqlwriter","parameter":{"column":["info","msg"],"connection":[{"jdbcUrl":"jdbc:mysql://node001:3306/test","table":["student"]}],"password":"******","preSql":[],"session":[],"username":"root","writeMode":"insert"}}}],"setting":{"speed":{"channel":"1"}}
}

2024-04-08 17:02:00.875 [main] WARN  Engine - prioriy set to 0, because NumberFormatException, the value is: null
2024-04-08 17:02:00.881 [main] INFO  PerfTrace - PerfTrace traceId=job_-1, isEnable=false, priority=0
2024-04-08 17:02:00.881 [main] INFO  JobContainer - DataX jobContainer starts job.
2024-04-08 17:02:00.885 [main] INFO  JobContainer - Set jobId = 0
2024-04-08 17:02:03.040 [job-0] INFO  OriginalConfPretreatmentUtil - table:[student] all columns:[
info,msg
].
2024-04-08 17:02:03.098 [job-0] INFO  OriginalConfPretreatmentUtil - Write data [
insert INTO %s (info,msg) VALUES(?,?)
], which jdbcUrl like:[jdbc:mysql://node001:3306/test?yearIsDateType=false&zeroDateTimeBehavior=convertToNull&tinyInt1isBit=false&rewriteBatchedStatements=true]
2024-04-08 17:02:03.099 [job-0] INFO  JobContainer - jobContainer starts to do prepare ...
2024-04-08 17:02:03.099 [job-0] INFO  JobContainer - DataX Reader.Job [hbase11xreader] do prepare work .
2024-04-08 17:02:03.099 [job-0] INFO  JobContainer - DataX Writer.Job [mysqlwriter] do prepare work .
2024-04-08 17:02:03.100 [job-0] INFO  JobContainer - jobContainer starts to do split ...
2024-04-08 17:02:03.100 [job-0] INFO  JobContainer - Job set Channel-Number to 1 channels.
四月 08, 2024 5:02:03 下午 org.apache.hadoop.util.NativeCodeLoader <clinit>
警告: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
四月 08, 2024 5:02:03 下午 org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper <init>
信息: Process identifier=hconnection-0x50313382 connecting to ZooKeeper ensemble=node001:2181
2024-04-08 17:02:03.982 [job-0] INFO  ZooKeeper - Client environment:zookeeper.version=3.4.6-1569965, built on 02/20/2014 09:09 GMT
2024-04-08 17:02:03.983 [job-0] INFO  ZooKeeper - Client environment:host.name=node001
2024-04-08 17:02:03.983 [job-0] INFO  ZooKeeper - Client environment:java.version=1.8.0_372
2024-04-08 17:02:03.983 [job-0] INFO  ZooKeeper - Client environment:java.vendor=Red Hat, Inc.
2024-04-08 17:02:03.983 [job-0] INFO  ZooKeeper - Client environment:java.home=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.372.b07-1.el7_9.x86_64/jre
2024-04-08 17:02:03.983 [job-0] INFO  ZooKeeper - Client environment:java.class.path=/opt/module/datax/lib/commons-io-2.4.jar:/opt/module/datax/lib/groovy-all-2.1.9.jar:/opt/module/datax/lib/datax-core-0.0.1-SNAPSHOT.jar:/opt/module/datax/lib/fluent-hc-4.4.jar:/opt/module/datax/lib/commons-beanutils-1.9.2.jar:/opt/module/datax/lib/commons-codec-1.9.jar:/opt/module/datax/lib/httpclient-4.4.jar:/opt/module/datax/lib/commons-cli-1.2.jar:/opt/module/datax/lib/commons-lang-2.6.jar:/opt/module/datax/lib/logback-core-1.0.13.jar:/opt/module/datax/lib/hamcrest-core-1.3.jar:/opt/module/datax/lib/fastjson-1.1.46.sec01.jar:/opt/module/datax/lib/commons-lang3-3.3.2.jar:/opt/module/datax/lib/commons-logging-1.1.1.jar:/opt/module/datax/lib/janino-2.5.16.jar:/opt/module/datax/lib/commons-configuration-1.10.jar:/opt/module/datax/lib/slf4j-api-1.7.10.jar:/opt/module/datax/lib/datax-common-0.0.1-SNAPSHOT.jar:/opt/module/datax/lib/datax-transformer-0.0.1-SNAPSHOT.jar:/opt/module/datax/lib/logback-classic-1.0.13.jar:/opt/module/datax/lib/httpcore-4.4.jar:/opt/module/datax/lib/commons-collections-3.2.1.jar:/opt/module/datax/lib/commons-math3-3.1.1.jar:.
2024-04-08 17:02:03.984 [job-0] INFO  ZooKeeper - Client environment:java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
2024-04-08 17:02:03.984 [job-0] INFO  ZooKeeper - Client environment:java.io.tmpdir=/tmp
2024-04-08 17:02:03.984 [job-0] INFO  ZooKeeper - Client environment:java.compiler=<NA>
2024-04-08 17:02:03.984 [job-0] INFO  ZooKeeper - Client environment:os.name=Linux
2024-04-08 17:02:03.984 [job-0] INFO  ZooKeeper - Client environment:os.arch=amd64
2024-04-08 17:02:03.984 [job-0] INFO  ZooKeeper - Client environment:os.version=3.10.0-862.el7.x86_64
2024-04-08 17:02:03.987 [job-0] INFO  ZooKeeper - Client environment:user.name=atguigu
2024-04-08 17:02:03.988 [job-0] INFO  ZooKeeper - Client environment:user.home=/home/atguigu
2024-04-08 17:02:03.988 [job-0] INFO  ZooKeeper - Client environment:user.dir=/opt/module/datax
2024-04-08 17:02:03.990 [job-0] INFO  ZooKeeper - Initiating client connection, connectString=node001:2181 sessionTimeout=90000 watcher=hconnection-0x503133820x0, quorum=node001:2181, baseZNode=/hbase
2024-04-08 17:02:04.069 [job-0-SendThread(node001:2181)] INFO  ClientCnxn - Opening socket connection to server node001/192.168.10.101:2181. Will not attempt to authenticate using SASL (unknown error)
2024-04-08 17:02:04.092 [job-0-SendThread(node001:2181)] INFO  ClientCnxn - Socket connection established to node001/192.168.10.101:2181, initiating session
2024-04-08 17:02:04.139 [job-0-SendThread(node001:2181)] INFO  ClientCnxn - Session establishment complete on server node001/192.168.10.101:2181, sessionid = 0x200000707b70025, negotiated timeout = 40000
2024-04-08 17:02:06.334 [job-0] INFO  Hbase11xHelper - HBaseReader split job into 1 tasks.
2024-04-08 17:02:06.335 [job-0] INFO  JobContainer - DataX Reader.Job [hbase11xreader] splits to [1] tasks.
2024-04-08 17:02:06.336 [job-0] INFO  JobContainer - DataX Writer.Job [mysqlwriter] splits to [1] tasks.
2024-04-08 17:02:06.366 [job-0] INFO  JobContainer - jobContainer starts to do schedule ...
2024-04-08 17:02:06.394 [job-0] INFO  JobContainer - Scheduler starts [1] taskGroups.
2024-04-08 17:02:06.402 [job-0] INFO  JobContainer - Running by standalone Mode.
2024-04-08 17:02:06.426 [taskGroup-0] INFO  TaskGroupContainer - taskGroupId=[0] start [1] channels for [1] tasks.
2024-04-08 17:02:06.457 [taskGroup-0] INFO  Channel - Channel set byte_speed_limit to -1, No bps activated.
2024-04-08 17:02:06.458 [taskGroup-0] INFO  Channel - Channel set record_speed_limit to -1, No tps activated.
2024-04-08 17:02:06.529 [taskGroup-0] INFO  TaskGroupContainer - taskGroup[0] taskId[0] attemptCount[1] is started
四月 08, 2024 5:02:06 下午 org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper <init>
信息: Process identifier=hconnection-0x3b2eec42 connecting to ZooKeeper ensemble=node001:2181
2024-04-08 17:02:06.680 [0-0-0-reader] INFO  ZooKeeper - Initiating client connection, connectString=node001:2181 sessionTimeout=90000 watcher=hconnection-0x3b2eec420x0, quorum=node001:2181, baseZNode=/hbase
2024-04-08 17:02:06.740 [0-0-0-reader-SendThread(node001:2181)] INFO  ClientCnxn - Opening socket connection to server node001/192.168.10.101:2181. Will not attempt to authenticate using SASL (unknown error)
2024-04-08 17:02:06.773 [0-0-0-reader-SendThread(node001:2181)] INFO  ClientCnxn - Socket connection established to node001/192.168.10.101:2181, initiating session
2024-04-08 17:02:06.808 [0-0-0-reader-SendThread(node001:2181)] INFO  ClientCnxn - Session establishment complete on server node001/192.168.10.101:2181, sessionid = 0x200000707b70026, negotiated timeout = 40000
2024-04-08 17:02:06.960 [0-0-0-reader] INFO  HbaseAbstractTask - The task set startRowkey=[], endRowkey=[].
2024-04-08 17:02:07.262 [taskGroup-0] INFO  TaskGroupContainer - taskGroup[0] taskId[0] is successed, used[738]ms
2024-04-08 17:02:07.263 [taskGroup-0] INFO  TaskGroupContainer - taskGroup[0] completed it's tasks.
2024-04-08 17:02:16.483 [job-0] INFO  StandAloneJobContainerCommunicator - Total 2 records, 11 bytes | Speed 1B/s, 0 records/s | Error 0 records, 0 bytes |  All Task WaitWriterTime 0.000s |  All Task WaitReaderTime 0.248s | Percentage 100.00%
2024-04-08 17:02:16.483 [job-0] INFO  AbstractScheduler - Scheduler accomplished all tasks.
2024-04-08 17:02:16.484 [job-0] INFO  JobContainer - DataX Writer.Job [mysqlwriter] do post work.
2024-04-08 17:02:16.485 [job-0] INFO  JobContainer - DataX Reader.Job [hbase11xreader] do post work.
2024-04-08 17:02:16.485 [job-0] INFO  JobContainer - DataX jobId [0] completed successfully.
2024-04-08 17:02:16.487 [job-0] INFO  HookInvoker - No hook invoked, because base dir not exists or is a file: /opt/module/datax/hook
2024-04-08 17:02:16.491 [job-0] INFO  JobContainer -
	 [total cpu info] =>
		averageCpu                     | maxDeltaCpu                    | minDeltaCpu
		-1.00%                         | -1.00%                         | -1.00%

	 [total gc info] =>
		 NAME                 | totalGCCount       | maxDeltaGCCount    | minDeltaGCCount    | totalGCTime        | maxDeltaGCTime     | minDeltaGCTime
		 PS MarkSweep         | 1                  | 1                  | 1                  | 0.136s             | 0.136s             | 0.136s
		 PS Scavenge          | 1                  | 1                  | 1                  | 0.072s             | 0.072s             | 0.072s

2024-04-08 17:02:16.491 [job-0] INFO  JobContainer - PerfTrace not enable!
2024-04-08 17:02:16.493 [job-0] INFO  StandAloneJobContainerCommunicator - Total 2 records, 11 bytes | Speed 1B/s, 0 records/s | Error 0 records, 0 bytes |  All Task WaitWriterTime 0.000s |  All Task WaitReaderTime 0.248s | Percentage 100.00%
2024-04-08 17:02:16.495 [job-0] INFO  JobContainer - 
任务启动时刻                    : 2024-04-08 17:02:00
任务结束时刻                    : 2024-04-08 17:02:16
任务总计耗时                    :                 15s
任务平均流量                    :                1B/s
记录写入速度                    :              0rec/s
读出记录总数                    :                   2
读写失败总数                    :                   0

[atguigu@node001 datax]$

2. Syncing MySQL data to HBase

2.1 HBase data

The table "bigdata2:student" starts out empty. After running the DataX sync job, a second scan shows that "bigdata2:student" now contains the synced rows.

hbase(main):019:0> scan "bigdata2:student"
ROW                                                              COLUMN+CELL                                                                                                                                                                                  
0 row(s)
Took 1.6445 seconds                                                                                                                                                                                                                                           
hbase(main):020:0> scan "bigdata2:student"
ROW                            COLUMN+CELL
 111111111                     column=info:age, timestamp=123456789, value=111
 111111111                     column=info:name, timestamp=123456789, value=111111
 18lisi                        column=info:age, timestamp=123456789, value=18
 18lisi                        column=info:name, timestamp=123456789, value=lisi
 222222222                     column=info:age, timestamp=123456789, value=222
 222222222                     column=info:name, timestamp=123456789, value=222222
 333\xE5\xAE\x8B\xE5\xA3\xB9   column=info:age, timestamp=123456789, value=333
 333\xE5\xAE\x8B\xE5\xA3\xB9   column=info:name, timestamp=123456789, value=\xE5\xAE\x8B\xE5\xA3\xB9
4 row(s)
Took 0.3075 seconds                                                                                                                                                                                                                                           
hbase(main):021:0> 

2.2 MySQL data

The source data is the same test.student table shown in section 1.2:

SELECT VERSION(); -- check the MySQL version

/*
 Navicat Premium Data Transfer

 Source Server         : 大数据-node001
 Source Server Type    : MySQL
 Source Server Version : 80034 (8.0.34)
 Source Host           : node001:3306
 Source Schema         : test

 Target Server Type    : MySQL
 Target Server Version : 80034 (8.0.34)
 File Encoding         : 65001

 Date: 08/04/2024 17:11:56
*/

SET NAMES utf8mb4;
SET FOREIGN_KEY_CHECKS = 0;

-- ----------------------------
-- Table structure for student
-- ----------------------------
DROP TABLE IF EXISTS `student`;
CREATE TABLE `student`  (
  `info` varchar(255) CHARACTER SET utf8mb4 COLLATE utf8mb4_general_ci NULL DEFAULT NULL,
  `msg` varchar(255) CHARACTER SET utf8mb4 COLLATE utf8mb4_general_ci NULL DEFAULT NULL
) ENGINE = InnoDB CHARACTER SET = utf8mb4 COLLATE = utf8mb4_general_ci ROW_FORMAT = Dynamic;

-- ----------------------------
-- Records of student
-- ----------------------------
INSERT INTO `student` VALUES ('111', '111111');
INSERT INTO `student` VALUES ('222', '222222');
INSERT INTO `student` VALUES ('18', 'lisi');
INSERT INTO `student` VALUES ('222', '宋壹');

SET FOREIGN_KEY_CHECKS = 1;

2.3 JSON job file (mysql2hbase.json)

This job reads the info and msg columns from test.student with mysqlreader and writes them into "bigdata2:student" with hbase11xwriter.

{"job": {"setting": {"speed": {"channel": 1}},"content": [{"reader": {"name": "mysqlreader","parameter": {"column": ["info","msg"],"connection": [{"jdbcUrl": ["jdbc:mysql://127.0.0.1:3306/test"],"table": ["student"]}],"username": "root","password": "123456","where": ""}},"writer": {"name": "hbase11xwriter","parameter": {"hbaseConfig": {"hbase.zookeeper.quorum": "node001:2181"},"table": "bigdata2:student","mode": "normal","rowkeyColumn": [{"index": 0,"type": "string"},{"index": 1,"type": "string","value": "_"}],"column": [{"index": 0,"name": "info:age","type": "string"},{"index": 1,"name": "info:name","type": "string"}],"versionColumn": {"index": -1,"value": "123456789"},"encoding": "utf-8"}}}]}
}
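Two details of the writer configuration are worth calling out. First, rowkeyColumn builds the HBase rowkey by concatenating its entries in order; because both entries reference source columns (index 0 and index 1), the rowkey becomes the info value followed by the msg value, which is what the scan in section 2.1 shows (rowkeys such as 18lisi and 111111111). The "_" in the second entry appears to be ignored here, since a constant value is only used when index is -1. Second, versionColumn with index -1 stamps every written cell with the fixed version 123456789, matching the timestamps in that scan. A minimal sketch for running and verifying the job, assuming the same DataX layout as before and that HBASE_HOME points at the hbase-2.0.5 install (an assumption; adjust to your environment):

# Run the MySQL-to-HBase job (paths as used in this post).
cd /opt/module/datax
python bin/datax.py job/hbase/mysql2hbase.json

# Verify on the HBase side; the HBase shell reads commands from stdin.
# HBASE_HOME is assumed to point at the hbase-2.0.5 installation.
echo "scan 'bigdata2:student'" | $HBASE_HOME/bin/hbase shell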

2.4 Successful sync log

[atguigu@node001 datax]$ python bin/datax.py job/hbase/mysql2hbase.json

DataX (DATAX-OPENSOURCE-3.0), From Alibaba !
Copyright (C) 2010-2017, Alibaba Group. All Rights Reserved.

2024-04-08 17:44:45.536 [main] INFO  VMInfo - VMInfo# operatingSystem class => sun.management.OperatingSystemImpl
2024-04-08 17:44:45.552 [main] INFO  Engine - the machine info  =>

	osInfo:	Red Hat, Inc. 1.8 25.372-b07
	jvmInfo:	Linux amd64 3.10.0-862.el7.x86_64
	cpu num:	4

	totalPhysicalMemory:	-0.00G
	freePhysicalMemory:	-0.00G
	maxFileDescriptorCount:	-1
	currentOpenFileDescriptorCount:	-1

	GC Names	[PS MarkSweep, PS Scavenge]

	MEMORY_NAME                    | allocation_size                | init_size
	PS Eden Space                  | 256.00MB                       | 256.00MB
	Code Cache                     | 240.00MB                       | 2.44MB
	Compressed Class Space         | 1,024.00MB                     | 0.00MB
	PS Survivor Space              | 42.50MB                        | 42.50MB
	PS Old Gen                     | 683.00MB                       | 683.00MB
	Metaspace                      | -0.00MB                        | 0.00MB

2024-04-08 17:44:45.579 [main] INFO  Engine -
{"content":[{"reader":{"name":"mysqlreader","parameter":{"column":["info","msg"],"connection":[{"jdbcUrl":["jdbc:mysql://127.0.0.1:3306/test"],"table":["student"]}],"password":"******","username":"root","where":""}},"writer":{"name":"hbase11xwriter","parameter":{"column":[{"index":0,"name":"info:age","type":"string"},{"index":1,"name":"info:name","type":"string"}],"encoding":"utf-8","hbaseConfig":{"hbase.zookeeper.quorum":"node001:2181"},"mode":"normal","rowkeyColumn":[{"index":0,"type":"string"},{"index":1,"type":"string","value":"_"}],"table":"bigdata2:student","versionColumn":{"index":-1,"value":"123456789"}}}}],"setting":{"speed":{"channel":1}}
}

2024-04-08 17:44:45.615 [main] WARN  Engine - prioriy set to 0, because NumberFormatException, the value is: null
2024-04-08 17:44:45.618 [main] INFO  PerfTrace - PerfTrace traceId=job_-1, isEnable=false, priority=0
2024-04-08 17:44:45.619 [main] INFO  JobContainer - DataX jobContainer starts job.
2024-04-08 17:44:45.622 [main] INFO  JobContainer - Set jobId = 0
Loading class `com.mysql.jdbc.Driver'. This is deprecated. The new driver class is `com.mysql.cj.jdbc.Driver'. The driver is automatically registered via the SPI and manual loading of the driver class is generally unnecessary.
2024-04-08 17:44:47.358 [job-0] INFO  OriginalConfPretreatmentUtil - Available jdbcUrl:jdbc:mysql://127.0.0.1:3306/test?yearIsDateType=false&zeroDateTimeBehavior=convertToNull&tinyInt1isBit=false&rewriteBatchedStatements=true.
2024-04-08 17:44:47.734 [job-0] INFO  OriginalConfPretreatmentUtil - table:[student] has columns:[info,msg].
2024-04-08 17:44:47.761 [job-0] INFO  JobContainer - jobContainer starts to do prepare ...
2024-04-08 17:44:47.762 [job-0] INFO  JobContainer - DataX Reader.Job [mysqlreader] do prepare work .
2024-04-08 17:44:47.763 [job-0] INFO  JobContainer - DataX Writer.Job [hbase11xwriter] do prepare work .
2024-04-08 17:44:47.764 [job-0] INFO  JobContainer - jobContainer starts to do split ...
2024-04-08 17:44:47.764 [job-0] INFO  JobContainer - Job set Channel-Number to 1 channels.
2024-04-08 17:44:47.773 [job-0] INFO  JobContainer - DataX Reader.Job [mysqlreader] splits to [1] tasks.
2024-04-08 17:44:47.774 [job-0] INFO  JobContainer - DataX Writer.Job [hbase11xwriter] splits to [1] tasks.
2024-04-08 17:44:47.815 [job-0] INFO  JobContainer - jobContainer starts to do schedule ...
2024-04-08 17:44:47.821 [job-0] INFO  JobContainer - Scheduler starts [1] taskGroups.
2024-04-08 17:44:47.825 [job-0] INFO  JobContainer - Running by standalone Mode.
2024-04-08 17:44:47.839 [taskGroup-0] INFO  TaskGroupContainer - taskGroupId=[0] start [1] channels for [1] tasks.
2024-04-08 17:44:47.846 [taskGroup-0] INFO  Channel - Channel set byte_speed_limit to -1, No bps activated.
2024-04-08 17:44:47.846 [taskGroup-0] INFO  Channel - Channel set record_speed_limit to -1, No tps activated.
2024-04-08 17:44:47.870 [taskGroup-0] INFO  TaskGroupContainer - taskGroup[0] taskId[0] attemptCount[1] is started
2024-04-08 17:44:47.876 [0-0-0-reader] INFO  CommonRdbmsReader$Task - Begin to read record by Sql: [select info,msg from student 
] jdbcUrl:[jdbc:mysql://127.0.0.1:3306/test?yearIsDateType=false&zeroDateTimeBehavior=convertToNull&tinyInt1isBit=false&rewriteBatchedStatements=true].
2024-04-08 17:44:48.010 [0-0-0-reader] INFO  CommonRdbmsReader$Task - Finished read record by Sql: [select info,msg from student 
] jdbcUrl:[jdbc:mysql://127.0.0.1:3306/test?yearIsDateType=false&zeroDateTimeBehavior=convertToNull&tinyInt1isBit=false&rewriteBatchedStatements=true].
四月 08, 2024 5:44:49 下午 org.apache.hadoop.util.NativeCodeLoader <clinit>
警告: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
四月 08, 2024 5:44:50 下午 org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper <init>
信息: Process identifier=hconnection-0x26654712 connecting to ZooKeeper ensemble=node001:2181
2024-04-08 17:44:50.143 [0-0-0-writer] INFO  ZooKeeper - Client environment:zookeeper.version=3.4.6-1569965, built on 02/20/2014 09:09 GMT
2024-04-08 17:44:50.143 [0-0-0-writer] INFO  ZooKeeper - Client environment:host.name=node001
2024-04-08 17:44:50.143 [0-0-0-writer] INFO  ZooKeeper - Client environment:java.version=1.8.0_372
2024-04-08 17:44:50.143 [0-0-0-writer] INFO  ZooKeeper - Client environment:java.vendor=Red Hat, Inc.
2024-04-08 17:44:50.143 [0-0-0-writer] INFO  ZooKeeper - Client environment:java.home=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.372.b07-1.el7_9.x86_64/jre
2024-04-08 17:44:50.144 [0-0-0-writer] INFO  ZooKeeper - Client environment:java.class.path=/opt/module/datax/lib/commons-io-2.4.jar:/opt/module/datax/lib/groovy-all-2.1.9.jar:/opt/module/datax/lib/datax-core-0.0.1-SNAPSHOT.jar:/opt/module/datax/lib/fluent-hc-4.4.jar:/opt/module/datax/lib/commons-beanutils-1.9.2.jar:/opt/module/datax/lib/commons-codec-1.9.jar:/opt/module/datax/lib/httpclient-4.4.jar:/opt/module/datax/lib/commons-cli-1.2.jar:/opt/module/datax/lib/commons-lang-2.6.jar:/opt/module/datax/lib/logback-core-1.0.13.jar:/opt/module/datax/lib/hamcrest-core-1.3.jar:/opt/module/datax/lib/fastjson-1.1.46.sec01.jar:/opt/module/datax/lib/commons-lang3-3.3.2.jar:/opt/module/datax/lib/commons-logging-1.1.1.jar:/opt/module/datax/lib/janino-2.5.16.jar:/opt/module/datax/lib/commons-configuration-1.10.jar:/opt/module/datax/lib/slf4j-api-1.7.10.jar:/opt/module/datax/lib/datax-common-0.0.1-SNAPSHOT.jar:/opt/module/datax/lib/datax-transformer-0.0.1-SNAPSHOT.jar:/opt/module/datax/lib/logback-classic-1.0.13.jar:/opt/module/datax/lib/httpcore-4.4.jar:/opt/module/datax/lib/commons-collections-3.2.1.jar:/opt/module/datax/lib/commons-math3-3.1.1.jar:.
2024-04-08 17:44:50.144 [0-0-0-writer] INFO  ZooKeeper - Client environment:java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
2024-04-08 17:44:50.144 [0-0-0-writer] INFO  ZooKeeper - Client environment:java.io.tmpdir=/tmp
2024-04-08 17:44:50.144 [0-0-0-writer] INFO  ZooKeeper - Client environment:java.compiler=<NA>
2024-04-08 17:44:50.144 [0-0-0-writer] INFO  ZooKeeper - Client environment:os.name=Linux
2024-04-08 17:44:50.144 [0-0-0-writer] INFO  ZooKeeper - Client environment:os.arch=amd64
2024-04-08 17:44:50.144 [0-0-0-writer] INFO  ZooKeeper - Client environment:os.version=3.10.0-862.el7.x86_64
2024-04-08 17:44:50.144 [0-0-0-writer] INFO  ZooKeeper - Client environment:user.name=atguigu
2024-04-08 17:44:50.144 [0-0-0-writer] INFO  ZooKeeper - Client environment:user.home=/home/atguigu
2024-04-08 17:44:50.144 [0-0-0-writer] INFO  ZooKeeper - Client environment:user.dir=/opt/module/datax
2024-04-08 17:44:50.145 [0-0-0-writer] INFO  ZooKeeper - Initiating client connection, connectString=node001:2181 sessionTimeout=90000 watcher=hconnection-0x266547120x0, quorum=node001:2181, baseZNode=/hbase
2024-04-08 17:44:50.256 [0-0-0-writer-SendThread(node001:2181)] INFO  ClientCnxn - Opening socket connection to server node001/192.168.10.101:2181. Will not attempt to authenticate using SASL (unknown error)
2024-04-08 17:44:50.381 [0-0-0-writer-SendThread(node001:2181)] INFO  ClientCnxn - Socket connection established to node001/192.168.10.101:2181, initiating session
2024-04-08 17:44:50.427 [0-0-0-writer-SendThread(node001:2181)] INFO  ClientCnxn - Session establishment complete on server node001/192.168.10.101:2181, sessionid = 0x200000707b70028, negotiated timeout = 40000
2024-04-08 17:44:53.794 [taskGroup-0] INFO  TaskGroupContainer - taskGroup[0] taskId[0] is successed, used[5930]ms
2024-04-08 17:44:53.795 [taskGroup-0] INFO  TaskGroupContainer - taskGroup[0] completed it's tasks.
2024-04-08 17:44:57.857 [job-0] INFO  StandAloneJobContainerCommunicator - Total 4 records, 29 bytes | Speed 2B/s, 0 records/s | Error 0 records, 0 bytes |  All Task WaitWriterTime 0.000s |  All Task WaitReaderTime 0.000s | Percentage 100.00%
2024-04-08 17:44:57.858 [job-0] INFO  AbstractScheduler - Scheduler accomplished all tasks.
2024-04-08 17:44:57.858 [job-0] INFO  JobContainer - DataX Writer.Job [hbase11xwriter] do post work.
2024-04-08 17:44:57.859 [job-0] INFO  JobContainer - DataX Reader.Job [mysqlreader] do post work.
2024-04-08 17:44:57.859 [job-0] INFO  JobContainer - DataX jobId [0] completed successfully.
2024-04-08 17:44:57.862 [job-0] INFO  HookInvoker - No hook invoked, because base dir not exists or is a file: /opt/module/datax/hook
2024-04-08 17:44:57.866 [job-0] INFO  JobContainer -
	 [total cpu info] =>
		averageCpu                     | maxDeltaCpu                    | minDeltaCpu
		-1.00%                         | -1.00%                         | -1.00%

	 [total gc info] =>
		 NAME                 | totalGCCount       | maxDeltaGCCount    | minDeltaGCCount    | totalGCTime        | maxDeltaGCTime     | minDeltaGCTime
		 PS MarkSweep         | 1                  | 1                  | 1                  | 0.120s             | 0.120s             | 0.120s
		 PS Scavenge          | 1                  | 1                  | 1                  | 0.095s             | 0.095s             | 0.095s

2024-04-08 17:44:57.867 [job-0] INFO  JobContainer - PerfTrace not enable!
2024-04-08 17:44:57.868 [job-0] INFO  StandAloneJobContainerCommunicator - Total 4 records, 29 bytes | Speed 2B/s, 0 records/s | Error 0 records, 0 bytes |  All Task WaitWriterTime 0.000s |  All Task WaitReaderTime 0.000s | Percentage 100.00%
2024-04-08 17:44:57.876 [job-0] INFO  JobContainer - 
任务启动时刻                    : 2024-04-08 17:44:45
任务结束时刻                    : 2024-04-08 17:44:57
任务总计耗时                    :                 12s
任务平均流量                    :                2B/s
记录写入速度                    :              0rec/s
读出记录总数                    :                   4
读写失败总数                    :                   0

[atguigu@node001 datax]$

3. Summary

This took a whole afternoon. ヾ(◍°∇°◍)ノ゙ Keep at it!

Following the reference article "datax,mysql和hbase的相互导入" and with some help from ChatGPT, it took roughly four hours to put this small end-to-end example together.
