Error message
Error:
Cannot set priority of namenode process 8095
Checking the log file:
2024-03-10 01:36:50,840 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT]
2024-03-10 01:36:51,061 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/tmp/hadoop-root/dfs/data
2024-03-10 01:36:51,131 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from hadoop-metrics2.properties
2024-03-10 01:36:51,174 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled Metric snapshot period at 10 second(s).
2024-03-10 01:36:51,174 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system started
2024-03-10 01:36:51,329 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
2024-03-10 01:36:51,330 INFO org.apache.hadoop.hdfs.server.datanode.BlockScanner: Initialized block scanner with targetBytesPerSec 1048576
2024-03-10 01:36:51,333 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Configured hostname is bigdata1
2024-03-10 01:36:51,334 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
2024-03-10 01:36:51,336 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting DataNode with maxLockedMemory = 0
2024-03-10 01:36:51,349 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened streaming server at /0.0.0.0:9866
2024-03-10 01:36:51,350 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s
2024-03-10 01:36:51,350 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50
2024-03-10 01:36:51,374 INFO org.eclipse.jetty.util.log: Logging initialized @826ms
2024-03-10 01:36:51,436 INFO org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets.
2024-03-10 01:36:51,437 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.datanode is not defined
2024-03-10 01:36:51,440 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
2024-03-10 01:36:51,441 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context datanode
2024-03-10 01:36:51,441 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context logs
2024-03-10 01:36:51,441 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context static
2024-03-10 01:36:51,454 INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port 42864
2024-03-10 01:36:51,455 INFO org.eclipse.jetty.server.Server: jetty-9.3.24.v20180605, build timestamp: 2018-06-05T17:11:56Z, git hash: 84205aa28f11a4f31f2a3b86d1bba2cc8ab69827
2024-03-10 01:36:51,472 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@3576ddc2{/logs,file:///opt/module/hadoop-3.1.3/logs/,AVAILABLE}
2024-03-10 01:36:51,472 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@2e570ded{/static,file:///opt/module/hadoop-3.1.3/share/hadoop/hdfs/webapps/static/,AVAILABLE}
2024-03-10 01:36:51,540 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.w.WebAppContext@6304101a{/,file:///opt/module/hadoop-3.1.3/share/hadoop/hdfs/webapps/datanode/,AVAILABLE}{/datanode}
2024-03-10 01:36:51,543 INFO org.eclipse.jetty.server.AbstractConnector: Started ServerConnector@45a4b042{HTTP/1.1,[http/1.1]}{localhost:42864}
2024-03-10 01:36:51,543 INFO org.eclipse.jetty.server.Server: Started @996ms
2024-03-10 01:36:51,626 INFO org.apache.hadoop.hdfs.server.datanode.web.DatanodeHttpServer: Listening HTTP traffic on /0.0.0.0:9864
2024-03-10 01:36:51,630 INFO org.apache.hadoop.util.JvmPauseMonitor: Starting JVM pause monitor
2024-03-10 01:36:51,648 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: dnUserName = root
2024-03-10 01:36:51,648 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: supergroup = supergroup
2024-03-10 01:36:51,687 INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue: class java.util.concurrent.LinkedBlockingQueue queueCapacity: 1000 scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler
2024-03-10 01:36:51,696 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 9867
2024-03-10 01:36:51,807 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened IPC server at /0.0.0.0:9867
2024-03-10 01:36:51,816 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Refresh request received for nameservices: null
2024-03-10 01:36:51,821 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Unable to get NameNode addresses.
2024-03-10 01:36:51,823 INFO org.eclipse.jetty.server.handler.ContextHandler: Stopped o.e.j.w.WebAppContext@6304101a{/,null,UNAVAILABLE}{/datanode}
2024-03-10 01:36:51,825 INFO org.eclipse.jetty.server.AbstractConnector: Stopped ServerConnector@45a4b042{HTTP/1.1,[http/1.1]}{localhost:0}
2024-03-10 01:36:51,826 INFO org.eclipse.jetty.server.handler.ContextHandler: Stopped o.e.j.s.ServletContextHandler@2e570ded{/static,file:///opt/module/hadoop-3.1.3/share/hadoop/hdfs/webapps/static/,UNAVAILABLE}
2024-03-10 01:36:51,826 INFO org.eclipse.jetty.server.handler.ContextHandler: Stopped o.e.j.s.ServletContextHandler@3576ddc2{/logs,file:///opt/module/hadoop-3.1.3/logs/,UNAVAILABLE}
2024-03-10 01:36:51,827 INFO org.apache.hadoop.ipc.Server: Stopping server on 9867
2024-03-10 01:36:51,828 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Stopping DataNode metrics system...
2024-03-10 01:36:51,828 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system stopped.
2024-03-10 01:36:51,828 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system shutdown complete.
2024-03-10 01:36:51,830 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Shutdown complete.
2024-03-10 01:36:51,831 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: Exception in secureMain
java.io.IOException: No services to connect, missing NameNode address.
	at org.apache.hadoop.hdfs.server.datanode.BlockPoolManager.refreshNamenodes(BlockPoolManager.java:165)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:1441)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:501)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2806)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2714)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2756)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2900)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2924)
2024-03-10 01:36:51,832 INFO org.apache.hadoop.util.ExitUtil: Exiting with status 1: java.io.IOException: No services to connect, missing NameNode address.
The NameNode address is missing, which means Hadoop's core-site.xml is not configured correctly.
Solution
Edit core-site.xml:
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://bigdata1:9820</value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/opt/module/hadoop-3.1.3/data</value>
  </property>
  <property>
    <name>hadoop.http.staticuser.user</name>
    <value>root</value>
  </property>
  <property>
    <name>hadoop.proxyuser.root.hosts</name>
    <value>*</value>
  </property>
  <property>
    <name>hadoop.proxyuser.root.groups</name>
    <value>*</value>
  </property>
</configuration>
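Before restarting the cluster, it can help to confirm the property actually made it into the file. A minimal sketch, assuming a standard install layout (the `/tmp` default and the fallback file it writes are purely illustrative; point `CONF` at your real `$HADOOP_HOME/etc/hadoop/core-site.xml`, e.g. `/opt/module/hadoop-3.1.3/etc/hadoop/core-site.xml`):

```shell
#!/bin/sh
# Hypothetical path for demonstration; use your real core-site.xml.
CONF="${CONF:-/tmp/core-site.xml}"

# For a self-contained demo, write a minimal config if none exists yet.
[ -f "$CONF" ] || cat > "$CONF" <<'EOF'
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://bigdata1:9820</value>
  </property>
</configuration>
EOF

# The log error above means this property was absent at startup.
if grep -q '<name>fs.defaultFS</name>' "$CONF"; then
  echo "fs.defaultFS present"
else
  echo "fs.defaultFS missing: DataNode will exit with 'missing NameNode address'"
fi
```

Once the check passes, restart HDFS (e.g. `stop-dfs.sh` then `start-dfs.sh`) so the new configuration takes effect, and remember to sync the file to every node in the cluster.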
Other possibilities
The Hadoop port may also already be in use, meaning the NameNode cannot bind to port 9870.
First check which process is occupying port 9870:
netstat -tuln | grep 9870
Then kill that process with the command below and restart the Hadoop cluster:
kill -9 <PID>
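`kill -9` works, but it gives the process no chance to clean up; a gentler pattern is to try a plain `kill` (SIGTERM) first and fall back to SIGKILL only if needed. The sketch below demonstrates that pattern on a throwaway `sleep` process rather than a real NameNode (in practice, substitute the PID reported by `netstat`):

```shell
#!/bin/sh
# Stand-in for the process holding the port; replace with the real PID
# from: netstat -tuln | grep 9870
sleep 300 &
PID=$!

# Graceful stop first; escalate to -9 only if SIGTERM is ignored.
kill "$PID" 2>/dev/null || kill -9 "$PID" 2>/dev/null
wait "$PID" 2>/dev/null

# kill -0 sends no signal; it only tests whether the process still exists.
if kill -0 "$PID" 2>/dev/null; then
  echo "process $PID still alive"
else
  echo "process $PID terminated"
fi
```

After the port is free, restart the cluster with `start-dfs.sh` (or `start-all.sh`, depending on your setup).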