Machine Learning / Data Analysis: A Plain-Language Introduction to Decision Trees (with Classification and Regression Examples)

  • 🍨 This post is a study-log entry from the 365-Day Deep Learning Training Camp
  • 🍖 Original author: K同学啊

Preface

  • Machine learning is the foundation of deep learning and data analysis; upcoming posts will cover the common machine learning algorithms.
  • Note: machine learning is also used heavily in mathematical modeling competitions, so the two are worth studying together.
  • The mathematics behind decision tree models is complex, so I strongly recommend reading the books; Statistical Learning Methods and the "Watermelon Book" (Machine Learning) are both excellent.
  • This post only introduces the components of a decision tree; the underlying theory is not covered in detail, and a dedicated follow-up post will dig into it.
  • The school term has just started, so updates may be a little slow; thanks for your patience, and feel free to bookmark, like, and follow.

Table of Contents

  • Decision Tree Model
    • Overview
    • Methods for Building a Decision Tree
  • Classification Example
    • Importing Data and Exploratory Analysis
    • Splitting Features and Target
    • Model Training
    • Prediction Results
  • Regression Example
    • Importing Data
    • Splitting the Data
    • Creating the Model
    • Model Prediction
    • Model Evaluation
    • Plotting the Tree

Decision Tree Model

Overview

Definition (from Statistical Learning Methods): a classification decision tree is a tree-structured model describing how instances are classified. It consists of nodes and directed edges. Nodes come in two types: internal nodes and leaf nodes. An internal node represents a feature (attribute) test, and a leaf node represents a class.

Decision trees and if-then rules

Anyone who has learned a programming language knows the if-else construct, and a decision tree works the same way: if an instance satisfies a condition it goes down one branch, otherwise it goes down the other, and the tests repeat until every instance is routed to a class, finally forming a tree. Note one principle: the rules must be mutually exclusive and complete, so every instance is covered by exactly one path.

The decision tree workflow

Three stages: feature selection, tree construction, and tree pruning.
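As a concrete sketch of the pruning stage, scikit-learn exposes cost-complexity pruning through the `ccp_alpha` parameter. This is just one pruning method among several; the dataset and alpha choice below are my own illustration, not part of the original walkthrough.

```python
# Sketch: cost-complexity pruning with scikit-learn's ccp_alpha.
# A larger alpha prunes more aggressively, yielding a smaller tree.
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

full = DecisionTreeClassifier(random_state=0).fit(X, y)

# cost_complexity_pruning_path lists the effective alphas at which
# subtrees get pruned away, in ascending order; picking a large one
# keeps only the strongest splits.
path = full.cost_complexity_pruning_path(X, y)
pruned = DecisionTreeClassifier(random_state=0,
                                ccp_alpha=path.ccp_alphas[-2]).fit(X, y)

print("nodes before pruning:", full.tree_.node_count)
print("nodes after pruning: ", pruned.tree_.node_count)
```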

Problems decision trees solve

Both regression and classification: if the leaf nodes hold continuous values, the tree is a regression tree; if they hold class labels, it is a classification tree.

Methods for Building a Decision Tree


There is a great deal of mathematics behind decision trees; here we only introduce information gain, the information gain ratio, and the Gini index. For the remaining concepts, consult Statistical Learning Methods or the Watermelon Book; message me if you would like electronic copies.

Tip: for this part you really should read the books. I recommend Statistical Learning Methods and the Watermelon Book; both contain detailed worked examples that aid understanding.

The definitions below are taken from Statistical Learning Methods.

Information gain: the information gain g(D, A) of feature A with respect to training set D is defined as the difference between the empirical entropy H(D) of the set D and the empirical conditional entropy H(D|A) of D given feature A:

$$g(D,A)=H(D)-H(D\mid A)$$

Information gain ratio: the information gain ratio g_R(D, A) of feature A with respect to training set D is defined as the ratio of its information gain g(D, A) to the entropy H_A(D) of D with respect to the values of feature A:

$$g_R(D,A)=\frac{g(D,A)}{H_A(D)}$$

where $H_A(D)=-\sum_{i=1}^{n}\frac{|D_i|}{|D|}\log_2\frac{|D_i|}{|D|}$ and n is the number of distinct values taken by feature A.

Gini index: in a classification problem with K classes, let p_k be the probability that a sample belongs to class k. The Gini index of the distribution is:

$$\mathrm{Gini}(p)=\sum_{k=1}^{K}p_k(1-p_k)=1-\sum_{k=1}^{K}p_k^2$$

Tip: read the books and work through the examples alongside the formulas.
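As a rough illustration of the three criteria above, here is a minimal sketch that computes them on a tiny made-up dataset (the feature values and labels are invented purely for illustration):

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Empirical entropy H(D) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def info_gain(feature, labels):
    """g(D, A) = H(D) - H(D|A): the entropy drop from splitting on the feature."""
    n = len(labels)
    cond = 0.0
    for v in set(feature):
        subset = [y for x, y in zip(feature, labels) if x == v]
        cond += len(subset) / n * entropy(subset)
    return entropy(labels) - cond

def gain_ratio(feature, labels):
    """g_R(D, A) = g(D, A) / H_A(D)."""
    return info_gain(feature, labels) / entropy(feature)

def gini(labels):
    """Gini(p) = 1 - sum(p_k^2)."""
    n = len(labels)
    return 1 - sum((c / n) ** 2 for c in Counter(labels).values())

# Toy data: one binary feature that perfectly separates the two classes.
feature = ['a', 'a', 'b', 'b']
labels  = ['yes', 'yes', 'no', 'no']
print(entropy(labels))             # 1.0 (two equally likely classes)
print(info_gain(feature, labels))  # 1.0 (the split removes all uncertainty)
print(gini(labels))                # 0.5
```

On this toy data the single feature is a perfect split, so the information gain equals the full entropy and the gain ratio is 1; real features on real data land somewhere in between.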

Classification Example

Overview: using the sepal and petal measurements of iris flowers, build a tree that classifies each flower's species.

Importing Data and Exploratory Analysis

import numpy as np
import pandas as pd

url = "https://archive.ics.uci.edu/ml/machine-learning-databases/iris/iris.data"
columns = ['花萼-length', '花萼-width', '花瓣-length', '花瓣-width', 'class']
data = pd.read_csv(url, names=columns)
data
     花萼-length  花萼-width  花瓣-length  花瓣-width           class
0           5.1        3.5         1.4        0.2     Iris-setosa
1           4.9        3.0         1.4        0.2     Iris-setosa
2           4.7        3.2         1.3        0.2     Iris-setosa
3           4.6        3.1         1.5        0.2     Iris-setosa
4           5.0        3.6         1.4        0.2     Iris-setosa
..          ...        ...         ...        ...             ...
145         6.7        3.0         5.2        2.3  Iris-virginica
146         6.3        2.5         5.0        1.9  Iris-virginica
147         6.5        3.0         5.2        2.0  Iris-virginica
148         6.2        3.4         5.4        2.3  Iris-virginica
149         5.9        3.0         5.1        1.8  Iris-virginica

150 rows × 5 columns

# Check the class labels and their counts
data['class'].value_counts()

Result:

class
Iris-setosa        50
Iris-versicolor    50
Iris-virginica     50
Name: count, dtype: int64
# Inspect column dtypes and missing counts
data.info()
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 150 entries, 0 to 149
Data columns (total 5 columns):
 #   Column     Non-Null Count  Dtype  
---  ------     --------------  -----  
 0   花萼-length  150 non-null    float64
 1   花萼-width   150 non-null    float64
 2   花瓣-length  150 non-null    float64
 3   花瓣-width   150 non-null    float64
 4   class      150 non-null    object 
dtypes: float64(4), object(1)
memory usage: 6.0+ KB
# Check for missing values
data.isnull().sum()

Result:

花萼-length    0
花萼-width     0
花瓣-length    0
花瓣-width     0
class        0
dtype: int64
# Summary statistics of the features
data.describe()

Result:

       花萼-length   花萼-width  花瓣-length   花瓣-width
count  150.000000  150.000000  150.000000  150.000000
mean     5.843333    3.054000    3.758667    1.198667
std      0.828066    0.433594    1.764420    0.763161
min      4.300000    2.000000    1.000000    0.100000
25%      5.100000    2.800000    1.600000    0.300000
50%      5.800000    3.000000    4.350000    1.300000
75%      6.400000    3.300000    5.100000    1.800000
max      7.900000    4.400000    6.900000    2.500000
# Correlation between the feature variables
name_corr = ['花萼-length', '花萼-width', '花瓣-length', '花瓣-width']
corr = data[name_corr].corr()
print(corr)
           花萼-length  花萼-width  花瓣-length  花瓣-width
花萼-length   1.000000 -0.109369   0.871754  0.817954
花萼-width   -0.109369  1.000000  -0.420516 -0.356544
花瓣-length   0.871754 -0.420516   1.000000  0.962757
花瓣-width    0.817954 -0.356544   0.962757  1.000000

Note: the feature variables are collinear; for example, petal length and petal width correlate at 0.96.

Splitting Features and Target

X = data.iloc[:, [0, 1, 2, 3]].values   # .values returns a NumPy array
y = data.iloc[:, 4].values

Model Training

from sklearn import tree

model = tree.DecisionTreeClassifier()
model.fit(X, y)   # train the model
# Export the trained tree as text
r = tree.export_text(model)

Prediction Results

# Hand-pick a few samples
x_test = X[[0, 30, 60, 90, 120, 130], :]
y_pred_prob = model.predict_proba(x_test)   # predicted class probabilities
y_pred = model.predict(x_test)              # predicted labels
print("\n=== Model ===")
print(r)
=== Model ===
|--- feature_3 <= 0.80
|   |--- class: Iris-setosa
|--- feature_3 >  0.80
|   |--- feature_3 <= 1.75
|   |   |--- feature_2 <= 4.95
|   |   |   |--- feature_3 <= 1.65
|   |   |   |   |--- class: Iris-versicolor
|   |   |   |--- feature_3 >  1.65
|   |   |   |   |--- class: Iris-virginica
|   |   |--- feature_2 >  4.95
|   |   |   |--- feature_3 <= 1.55
|   |   |   |   |--- class: Iris-virginica
|   |   |   |--- feature_3 >  1.55
|   |   |   |   |--- feature_2 <= 5.45
|   |   |   |   |   |--- class: Iris-versicolor
|   |   |   |   |--- feature_2 >  5.45
|   |   |   |   |   |--- class: Iris-virginica
|   |--- feature_3 >  1.75
|   |   |--- feature_2 <= 4.85
|   |   |   |--- feature_0 <= 5.95
|   |   |   |   |--- class: Iris-versicolor
|   |   |   |--- feature_0 >  5.95
|   |   |   |   |--- class: Iris-virginica
|   |   |--- feature_2 >  4.85
|   |   |   |--- class: Iris-virginica
print("\n=== Test data ===")
print(x_test)
=== Test data ===
[[5.1 3.5 1.4 0.2]
 [4.8 3.1 1.6 0.2]
 [5.  2.  3.5 1. ]
 [5.5 2.6 4.4 1.2]
 [6.9 3.2 5.7 2.3]
 [7.4 2.8 6.1 1.9]]
print("\n=== Predicted class probabilities ===")
print(y_pred_prob)
=== Predicted class probabilities ===
[[1. 0. 0.]
 [1. 0. 0.]
 [0. 1. 0.]
 [0. 1. 0.]
 [0. 0. 1.]
 [0. 0. 1.]]
print("\n=== Predicted classes ===")
print(y_pred)
=== Predicted classes ===
['Iris-setosa' 'Iris-setosa' 'Iris-versicolor' 'Iris-versicolor'
 'Iris-virginica' 'Iris-virginica']
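One caveat: the model above was trained and evaluated on the same 150 rows, so perfect-looking predictions are expected. A common refinement (my addition, not part of the original walkthrough) is to hold out a test set. Here is a sketch using scikit-learn's bundled copy of the iris data, so it runs without downloading the CSV:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

X, y = load_iris(return_X_y=True)

# Hold out 30% of the rows, stratified so each species is represented.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42, stratify=y)

model = DecisionTreeClassifier(random_state=42).fit(X_train, y_train)
acc = accuracy_score(y_test, model.predict(X_test))
print(f"held-out accuracy: {acc:.3f}")
```

Accuracy on unseen rows is the honest estimate of how the tree generalizes; on iris it is still high, but no longer trivially perfect.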

Regression Example

Using three iris measurements, predict the fourth (petal length, as labeled below).

Importing Data

import pandas as pd
import numpy as np

url = "https://archive.ics.uci.edu/ml/machine-learning-databases/iris/iris.data"
names = ['花萼-width', '花萼-length', '花瓣-width', '花瓣-length', 'class']
data = pd.read_csv(url, names=names)
data
     花萼-width  花萼-length  花瓣-width  花瓣-length           class
0         5.1         3.5        1.4         0.2     Iris-setosa
1         4.9         3.0        1.4         0.2     Iris-setosa
2         4.7         3.2        1.3         0.2     Iris-setosa
3         4.6         3.1        1.5         0.2     Iris-setosa
4         5.0         3.6        1.4         0.2     Iris-setosa
..        ...         ...        ...         ...             ...
145       6.7         3.0        5.2         2.3  Iris-virginica
146       6.3         2.5        5.0         1.9  Iris-virginica
147       6.5         3.0        5.2         2.0  Iris-virginica
148       6.2         3.4        5.4         2.3  Iris-virginica
149       5.9         3.0        5.1         1.8  Iris-virginica

150 rows × 5 columns

Splitting the Data

# Split into features and target
X = data.iloc[:, [0, 1, 2]]
y = data.iloc[:, 3]

Creating the Model

from sklearn import tree

model = tree.DecisionTreeRegressor()
model.fit(X, y)   # train the model

Model Prediction

x_test = X.iloc[[0, 1, 50, 51, 100, 120], :]
y_test = y.iloc[[0, 1, 50, 51, 100, 120]]   # a single column
y_pred = model.predict(x_test)

Model Evaluation

# Show the true values alongside the predictions
df = pd.DataFrame()
df['原始值'] = y_test
df['预测值'] = y_pred
df
     原始值   预测值
0    0.2  0.25
1    0.2  0.20
50   1.4  1.40
51   1.5  1.50
100  2.5  2.50
120  2.3  2.30
from sklearn.metrics import mean_absolute_error

# Compute the error
mae = mean_absolute_error(y_test, y_pred)
mae

Result:

0.008333333333333331
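As a quick sanity check on the metric, mean absolute error is just the average of the absolute residuals. The numbers below reproduce the comparison table above by hand:

```python
from sklearn.metrics import mean_absolute_error

y_true = [0.2, 0.2, 1.4, 1.5, 2.5, 2.3]
y_pred = [0.25, 0.2, 1.4, 1.5, 2.5, 2.3]

# Manual definition: mean of |y_true - y_pred|.
mae_manual = sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)
print(mae_manual)                           # ≈ 0.00833 = 0.05 / 6
print(mean_absolute_error(y_true, y_pred))  # same value
```

Only the first sample is off (by 0.05), so the MAE is 0.05 / 6 ≈ 0.00833, matching the result above.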
# Print the tree structure as text
r = tree.export_text(model)
print(r)
# The tree is fairly complex; the plotting code at the end renders it as a figure.
|--- feature_2 <= 2.45
|   |--- feature_1 <= 3.25
|   |   |--- feature_1 <= 2.60
|   |   |   |--- value: [0.30]
|   |   |--- feature_1 >  2.60
|   |   |   |--- feature_0 <= 4.85
|   |   |   |   |--- feature_0 <= 4.35
|   |   |   |   |   |--- value: [0.10]
|   |   |   |   |--- feature_0 >  4.35
|   |   |   |   |   |--- feature_2 <= 1.35
|   |   |   |   |   |   |--- value: [0.20]
|   |   |   |   |   |--- feature_2 >  1.35
|   |   |   |   |   |   |--- feature_1 <= 2.95
|   |   |   |   |   |   |   |--- value: [0.20]
|   |   |   |   |   |   |--- feature_1 >  2.95
|   |   |   |   |   |   |   |--- feature_0 <= 4.65
|   |   |   |   |   |   |   |   |--- value: [0.20]
|   |   |   |   |   |   |   |--- feature_0 >  4.65
|   |   |   |   |   |   |   |   |--- feature_1 <= 3.05
|   |   |   |   |   |   |   |   |   |--- value: [0.20]
|   |   |   |   |   |   |   |   |--- feature_1 >  3.05
|   |   |   |   |   |   |   |   |   |--- value: [0.20]
|   |   |   |--- feature_0 >  4.85
|   |   |   |   |--- feature_0 <= 4.95
|   |   |   |   |   |--- feature_1 <= 3.05
|   |   |   |   |   |   |--- value: [0.20]
|   |   |   |   |   |--- feature_1 >  3.05
|   |   |   |   |   |   |--- value: [0.10]
|   |   |   |   |--- feature_0 >  4.95
|   |   |   |   |   |--- value: [0.20]
|   |--- feature_1 >  3.25
|   |   |--- feature_2 <= 1.55
|   |   |   |--- feature_1 <= 4.30
|   |   |   |   |--- feature_1 <= 3.95
|   |   |   |   |   |--- feature_1 <= 3.85
|   |   |   |   |   |   |--- feature_1 <= 3.65
|   |   |   |   |   |   |   |--- feature_0 <= 5.30
|   |   |   |   |   |   |   |   |--- feature_2 <= 1.45
|   |   |   |   |   |   |   |   |   |--- feature_1 <= 3.55
|   |   |   |   |   |   |   |   |   |   |--- feature_2 <= 1.35
|   |   |   |   |   |   |   |   |   |   |   |--- value: [0.30]
|   |   |   |   |   |   |   |   |   |   |--- feature_2 >  1.35
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 3
|   |   |   |   |   |   |   |   |   |--- feature_1 >  3.55
|   |   |   |   |   |   |   |   |   |   |--- value: [0.20]
|   |   |   |   |   |   |   |   |--- feature_2 >  1.45
|   |   |   |   |   |   |   |   |   |--- value: [0.20]
|   |   |   |   |   |   |   |--- feature_0 >  5.30
|   |   |   |   |   |   |   |   |--- feature_1 <= 3.45
|   |   |   |   |   |   |   |   |   |--- value: [0.40]
|   |   |   |   |   |   |   |   |--- feature_1 >  3.45
|   |   |   |   |   |   |   |   |   |--- value: [0.20]
|   |   |   |   |   |   |--- feature_1 >  3.65
|   |   |   |   |   |   |   |--- feature_0 <= 5.20
|   |   |   |   |   |   |   |   |--- feature_1 <= 3.75
|   |   |   |   |   |   |   |   |   |--- value: [0.40]
|   |   |   |   |   |   |   |   |--- feature_1 >  3.75
|   |   |   |   |   |   |   |   |   |--- value: [0.30]
|   |   |   |   |   |   |   |--- feature_0 >  5.20
|   |   |   |   |   |   |   |   |--- value: [0.20]
|   |   |   |   |   |--- feature_1 >  3.85
|   |   |   |   |   |   |--- value: [0.40]
|   |   |   |   |--- feature_1 >  3.95
|   |   |   |   |   |--- feature_0 <= 5.35
|   |   |   |   |   |   |--- value: [0.10]
|   |   |   |   |   |--- feature_0 >  5.35
|   |   |   |   |   |   |--- value: [0.20]
|   |   |   |--- feature_1 >  4.30
|   |   |   |   |--- value: [0.40]
|   |   |--- feature_2 >  1.55
|   |   |   |--- feature_0 <= 4.90
|   |   |   |   |--- value: [0.20]
|   |   |   |--- feature_0 >  4.90
|   |   |   |   |--- feature_0 <= 5.05
|   |   |   |   |   |--- feature_1 <= 3.45
|   |   |   |   |   |   |--- value: [0.40]
|   |   |   |   |   |--- feature_1 >  3.45
|   |   |   |   |   |   |--- value: [0.60]
|   |   |   |   |--- feature_0 >  5.05
|   |   |   |   |   |--- feature_1 <= 3.35
|   |   |   |   |   |   |--- value: [0.50]
|   |   |   |   |   |--- feature_1 >  3.35
|   |   |   |   |   |   |--- feature_2 <= 1.65
|   |   |   |   |   |   |   |--- value: [0.20]
|   |   |   |   |   |   |--- feature_2 >  1.65
|   |   |   |   |   |   |   |--- feature_1 <= 3.60
|   |   |   |   |   |   |   |   |--- value: [0.20]
|   |   |   |   |   |   |   |--- feature_1 >  3.60
|   |   |   |   |   |   |   |   |--- feature_0 <= 5.55
|   |   |   |   |   |   |   |   |   |--- value: [0.40]
|   |   |   |   |   |   |   |   |--- feature_0 >  5.55
|   |   |   |   |   |   |   |   |   |--- value: [0.30]
|--- feature_2 >  2.45
|   |--- feature_2 <= 4.75
|   |   |--- feature_2 <= 4.15
|   |   |   |--- feature_1 <= 2.65
|   |   |   |   |--- feature_2 <= 3.95
|   |   |   |   |   |--- feature_2 <= 3.75
|   |   |   |   |   |   |--- feature_2 <= 3.15
|   |   |   |   |   |   |   |--- value: [1.10]
|   |   |   |   |   |   |--- feature_2 >  3.15
|   |   |   |   |   |   |   |--- value: [1.00]
|   |   |   |   |   |--- feature_2 >  3.75
|   |   |   |   |   |   |--- feature_0 <= 5.55
|   |   |   |   |   |   |   |--- value: [1.10]
|   |   |   |   |   |   |--- feature_0 >  5.55
|   |   |   |   |   |   |   |--- value: [1.10]
|   |   |   |   |--- feature_2 >  3.95
|   |   |   |   |   |--- feature_0 <= 5.90
|   |   |   |   |   |   |--- feature_0 <= 5.65
|   |   |   |   |   |   |   |--- feature_1 <= 2.40
|   |   |   |   |   |   |   |   |--- value: [1.30]
|   |   |   |   |   |   |   |--- feature_1 >  2.40
|   |   |   |   |   |   |   |   |--- value: [1.30]
|   |   |   |   |   |   |--- feature_0 >  5.65
|   |   |   |   |   |   |   |--- value: [1.20]
|   |   |   |   |   |--- feature_0 >  5.90
|   |   |   |   |   |   |--- value: [1.00]
|   |   |   |--- feature_1 >  2.65
|   |   |   |   |--- feature_0 <= 5.75
|   |   |   |   |   |--- feature_0 <= 5.40
|   |   |   |   |   |   |--- value: [1.40]
|   |   |   |   |   |--- feature_0 >  5.40
|   |   |   |   |   |   |--- value: [1.30]
|   |   |   |   |--- feature_0 >  5.75
|   |   |   |   |   |--- feature_2 <= 4.05
|   |   |   |   |   |   |--- feature_2 <= 3.95
|   |   |   |   |   |   |   |--- value: [1.20]
|   |   |   |   |   |   |--- feature_2 >  3.95
|   |   |   |   |   |   |   |--- value: [1.30]
|   |   |   |   |   |--- feature_2 >  4.05
|   |   |   |   |   |   |--- value: [1.00]
|   |   |--- feature_2 >  4.15
|   |   |   |--- feature_2 <= 4.45
|   |   |   |   |--- feature_0 <= 5.80
|   |   |   |   |   |--- feature_1 <= 2.65
|   |   |   |   |   |   |--- value: [1.20]
|   |   |   |   |   |--- feature_1 >  2.65
|   |   |   |   |   |   |--- feature_1 <= 2.95
|   |   |   |   |   |   |   |--- feature_1 <= 2.80
|   |   |   |   |   |   |   |   |--- value: [1.30]
|   |   |   |   |   |   |   |--- feature_1 >  2.80
|   |   |   |   |   |   |   |   |--- value: [1.30]
|   |   |   |   |   |   |--- feature_1 >  2.95
|   |   |   |   |   |   |   |--- value: [1.20]
|   |   |   |   |--- feature_0 >  5.80
|   |   |   |   |   |--- feature_1 <= 2.95
|   |   |   |   |   |   |--- value: [1.30]
|   |   |   |   |   |--- feature_1 >  2.95
|   |   |   |   |   |   |--- feature_0 <= 6.25
|   |   |   |   |   |   |   |--- value: [1.50]
|   |   |   |   |   |   |--- feature_0 >  6.25
|   |   |   |   |   |   |   |--- value: [1.40]
|   |   |   |--- feature_2 >  4.45
|   |   |   |   |--- feature_0 <= 5.15
|   |   |   |   |   |--- value: [1.70]
|   |   |   |   |--- feature_0 >  5.15
|   |   |   |   |   |--- feature_1 <= 3.25
|   |   |   |   |   |   |--- feature_1 <= 2.95
|   |   |   |   |   |   |   |--- feature_2 <= 4.65
|   |   |   |   |   |   |   |   |--- feature_0 <= 5.85
|   |   |   |   |   |   |   |   |   |--- value: [1.30]
|   |   |   |   |   |   |   |   |--- feature_0 >  5.85
|   |   |   |   |   |   |   |   |   |--- feature_0 <= 6.55
|   |   |   |   |   |   |   |   |   |   |--- value: [1.50]
|   |   |   |   |   |   |   |   |   |--- feature_0 >  6.55
|   |   |   |   |   |   |   |   |   |   |--- value: [1.30]
|   |   |   |   |   |   |   |--- feature_2 >  4.65
|   |   |   |   |   |   |   |   |--- feature_1 <= 2.85
|   |   |   |   |   |   |   |   |   |--- value: [1.20]
|   |   |   |   |   |   |   |   |--- feature_1 >  2.85
|   |   |   |   |   |   |   |   |   |--- value: [1.40]
|   |   |   |   |   |   |--- feature_1 >  2.95
|   |   |   |   |   |   |   |--- feature_2 <= 4.55
|   |   |   |   |   |   |   |   |--- value: [1.50]
|   |   |   |   |   |   |   |--- feature_2 >  4.55
|   |   |   |   |   |   |   |   |--- feature_2 <= 4.65
|   |   |   |   |   |   |   |   |   |--- value: [1.40]
|   |   |   |   |   |   |   |   |--- feature_2 >  4.65
|   |   |   |   |   |   |   |   |   |--- feature_1 <= 3.15
|   |   |   |   |   |   |   |   |   |   |--- value: [1.50]
|   |   |   |   |   |   |   |   |   |--- feature_1 >  3.15
|   |   |   |   |   |   |   |   |   |   |--- value: [1.40]
|   |   |   |   |   |--- feature_1 >  3.25
|   |   |   |   |   |   |--- value: [1.60]
|   |--- feature_2 >  4.75
|   |   |--- feature_2 <= 5.05
|   |   |   |--- feature_0 <= 6.75
|   |   |   |   |--- feature_0 <= 5.80
|   |   |   |   |   |--- value: [2.00]
|   |   |   |   |--- feature_0 >  5.80
|   |   |   |   |   |--- feature_1 <= 2.35
|   |   |   |   |   |   |--- value: [1.50]
|   |   |   |   |   |--- feature_1 >  2.35
|   |   |   |   |   |   |--- feature_0 <= 6.25
|   |   |   |   |   |   |   |--- value: [1.80]
|   |   |   |   |   |   |--- feature_0 >  6.25
|   |   |   |   |   |   |   |--- feature_2 <= 4.95
|   |   |   |   |   |   |   |   |--- feature_1 <= 2.60
|   |   |   |   |   |   |   |   |   |--- value: [1.50]
|   |   |   |   |   |   |   |   |--- feature_1 >  2.60
|   |   |   |   |   |   |   |   |   |--- value: [1.80]
|   |   |   |   |   |   |   |--- feature_2 >  4.95
|   |   |   |   |   |   |   |   |--- feature_0 <= 6.50
|   |   |   |   |   |   |   |   |   |--- value: [1.90]
|   |   |   |   |   |   |   |   |--- feature_0 >  6.50
|   |   |   |   |   |   |   |   |   |--- value: [1.70]
|   |   |   |--- feature_0 >  6.75
|   |   |   |   |--- feature_0 <= 6.85
|   |   |   |   |   |--- value: [1.40]
|   |   |   |   |--- feature_0 >  6.85
|   |   |   |   |   |--- value: [1.50]
|   |   |--- feature_2 >  5.05
|   |   |   |--- feature_1 <= 3.05
|   |   |   |   |--- feature_0 <= 6.35
|   |   |   |   |   |--- feature_0 <= 5.85
|   |   |   |   |   |   |--- feature_1 <= 2.75
|   |   |   |   |   |   |   |--- value: [1.90]
|   |   |   |   |   |   |--- feature_1 >  2.75
|   |   |   |   |   |   |   |--- value: [2.40]
|   |   |   |   |   |--- feature_0 >  5.85
|   |   |   |   |   |   |--- feature_1 <= 2.85
|   |   |   |   |   |   |   |--- feature_1 <= 2.65
|   |   |   |   |   |   |   |   |--- value: [1.40]
|   |   |   |   |   |   |   |--- feature_1 >  2.65
|   |   |   |   |   |   |   |   |--- feature_1 <= 2.75
|   |   |   |   |   |   |   |   |   |--- value: [1.60]
|   |   |   |   |   |   |   |   |--- feature_1 >  2.75
|   |   |   |   |   |   |   |   |   |--- value: [1.50]
|   |   |   |   |   |   |--- feature_1 >  2.85
|   |   |   |   |   |   |   |--- feature_1 <= 2.95
|   |   |   |   |   |   |   |   |--- value: [1.80]
|   |   |   |   |   |   |   |--- feature_1 >  2.95
|   |   |   |   |   |   |   |   |--- value: [1.80]
|   |   |   |   |--- feature_0 >  6.35
|   |   |   |   |   |--- feature_0 <= 7.50
|   |   |   |   |   |   |--- feature_0 <= 7.15
|   |   |   |   |   |   |   |--- feature_1 <= 2.75
|   |   |   |   |   |   |   |   |--- feature_0 <= 6.55
|   |   |   |   |   |   |   |   |   |--- value: [1.90]
|   |   |   |   |   |   |   |   |--- feature_0 >  6.55
|   |   |   |   |   |   |   |   |   |--- value: [1.80]
|   |   |   |   |   |   |   |--- feature_1 >  2.75
|   |   |   |   |   |   |   |   |--- feature_0 <= 6.60
|   |   |   |   |   |   |   |   |   |--- feature_2 <= 5.55
|   |   |   |   |   |   |   |   |   |   |--- feature_2 <= 5.35
|   |   |   |   |   |   |   |   |   |   |   |--- value: [2.00]
|   |   |   |   |   |   |   |   |   |   |--- feature_2 >  5.35
|   |   |   |   |   |   |   |   |   |   |   |--- value: [1.80]
|   |   |   |   |   |   |   |   |   |--- feature_2 >  5.55
|   |   |   |   |   |   |   |   |   |   |--- feature_2 <= 5.70
|   |   |   |   |   |   |   |   |   |   |   |--- value: [2.15]
|   |   |   |   |   |   |   |   |   |   |--- feature_2 >  5.70
|   |   |   |   |   |   |   |   |   |   |   |--- value: [2.20]
|   |   |   |   |   |   |   |   |--- feature_0 >  6.60
|   |   |   |   |   |   |   |   |   |--- feature_0 <= 6.75
|   |   |   |   |   |   |   |   |   |   |--- value: [2.30]
|   |   |   |   |   |   |   |   |   |--- feature_0 >  6.75
|   |   |   |   |   |   |   |   |   |   |--- value: [2.10]
|   |   |   |   |   |   |--- feature_0 >  7.15
|   |   |   |   |   |   |   |--- feature_2 <= 5.95
|   |   |   |   |   |   |   |   |--- value: [1.60]
|   |   |   |   |   |   |   |--- feature_2 >  5.95
|   |   |   |   |   |   |   |   |--- feature_1 <= 2.85
|   |   |   |   |   |   |   |   |   |--- value: [1.90]
|   |   |   |   |   |   |   |   |--- feature_1 >  2.85
|   |   |   |   |   |   |   |   |   |--- value: [1.80]
|   |   |   |   |   |--- feature_0 >  7.50
|   |   |   |   |   |   |--- feature_2 <= 6.80
|   |   |   |   |   |   |   |--- feature_2 <= 6.35
|   |   |   |   |   |   |   |   |--- value: [2.30]
|   |   |   |   |   |   |   |--- feature_2 >  6.35
|   |   |   |   |   |   |   |   |--- feature_1 <= 2.90
|   |   |   |   |   |   |   |   |   |--- value: [2.00]
|   |   |   |   |   |   |   |   |--- feature_1 >  2.90
|   |   |   |   |   |   |   |   |   |--- value: [2.10]
|   |   |   |   |   |   |--- feature_2 >  6.80
|   |   |   |   |   |   |   |--- value: [2.30]
|   |   |   |--- feature_1 >  3.05
|   |   |   |   |--- feature_1 <= 3.25
|   |   |   |   |   |--- feature_2 <= 5.95
|   |   |   |   |   |   |--- feature_0 <= 6.60
|   |   |   |   |   |   |   |--- feature_2 <= 5.40
|   |   |   |   |   |   |   |   |--- feature_0 <= 6.45
|   |   |   |   |   |   |   |   |   |--- value: [2.30]
|   |   |   |   |   |   |   |   |--- feature_0 >  6.45
|   |   |   |   |   |   |   |   |   |--- value: [2.00]
|   |   |   |   |   |   |   |--- feature_2 >  5.40
|   |   |   |   |   |   |   |   |--- value: [1.80]
|   |   |   |   |   |   |--- feature_0 >  6.60
|   |   |   |   |   |   |   |--- feature_2 <= 5.50
|   |   |   |   |   |   |   |   |--- feature_2 <= 5.25
|   |   |   |   |   |   |   |   |   |--- value: [2.30]
|   |   |   |   |   |   |   |   |--- feature_2 >  5.25
|   |   |   |   |   |   |   |   |   |--- value: [2.10]
|   |   |   |   |   |   |   |--- feature_2 >  5.50
|   |   |   |   |   |   |   |   |--- feature_2 <= 5.65
|   |   |   |   |   |   |   |   |   |--- value: [2.40]
|   |   |   |   |   |   |   |   |--- feature_2 >  5.65
|   |   |   |   |   |   |   |   |   |--- value: [2.30]
|   |   |   |   |   |--- feature_2 >  5.95
|   |   |   |   |   |   |--- value: [1.80]
|   |   |   |   |--- feature_1 >  3.25
|   |   |   |   |   |--- feature_0 <= 7.45
|   |   |   |   |   |   |--- feature_2 <= 5.85
|   |   |   |   |   |   |   |--- feature_2 <= 5.65
|   |   |   |   |   |   |   |   |--- feature_2 <= 5.50
|   |   |   |   |   |   |   |   |   |--- value: [2.30]
|   |   |   |   |   |   |   |   |--- feature_2 >  5.50
|   |   |   |   |   |   |   |   |   |--- value: [2.40]
|   |   |   |   |   |   |   |--- feature_2 >  5.65
|   |   |   |   |   |   |   |   |--- value: [2.30]
|   |   |   |   |   |   |--- feature_2 >  5.85
|   |   |   |   |   |   |   |--- feature_0 <= 6.75
|   |   |   |   |   |   |   |   |--- value: [2.50]
|   |   |   |   |   |   |   |--- feature_0 >  6.75
|   |   |   |   |   |   |   |   |--- value: [2.50]
|   |   |   |   |   |--- feature_0 >  7.45
|   |   |   |   |   |   |--- feature_0 <= 7.80
|   |   |   |   |   |   |   |--- value: [2.20]
|   |   |   |   |   |   |--- feature_0 >  7.80
|   |   |   |   |   |   |   |--- value: [2.00]

Plotting the Tree

from sklearn.tree import export_graphviz
import graphviz

# Set a font that can display Chinese characters
from pylab import mpl
mpl.rcParams["font.sans-serif"] = ["SimHei"]

# Generate a DOT description of the tree with export_graphviz
dot_data = export_graphviz(model, out_file=None,
                           feature_names=['花萼-width', '花萼-length', '花瓣-width'],
                           filled=True, rounded=True, special_characters=True)

# Render the DOT source with graphviz
graph = graphviz.Source(dot_data)
graph.render("decision_tree")  # save the figure as a PDF (or another format)
graph.view()                   # open it in the default viewer

The rendered tree is too large to display here; run the code above to draw it.
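If Graphviz is not installed, scikit-learn's built-in `plot_tree` draws the same structure with matplotlib alone. This is a sketch using the bundled iris data so it runs standalone (note the feature order here is sklearn's, not the CSV's); `max_depth=3` truncates the drawing so the deep regression tree stays legible:

```python
import matplotlib.pyplot as plt
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeRegressor, plot_tree

iris = load_iris()
X = iris.data[:, :3]   # first three measurements as features
y = iris.data[:, 3]    # predict the fourth measurement

model = DecisionTreeRegressor(random_state=0).fit(X, y)

# plot_tree returns the list of node annotations it drew.
fig, ax = plt.subplots(figsize=(14, 7))
annotations = plot_tree(model, max_depth=3, filled=True, rounded=True,
                        feature_names=iris.feature_names[:3], ax=ax)
fig.savefig("decision_tree.png")  # saved next to the script
```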


