Python DevOps: Connecting Celery to Redis

Contents

I. Theory

1. Celery

II. Experiments

1. Installing Redis on Windows 11

2. Configuring Celery in a Python 3.8 environment

III. Problems

1. Celery command error

2. Error when running the Celery command

3. ValueError when starting Celery on Windows 11


 

 

 

I. Theory

1. Celery

(1) Concept

Celery is a distributed system written in Python. It is a simple, flexible, and reliable asynchronous task queue focused on real-time processing of large volumes of messages, and it also supports task scheduling.


 

(2) Architecture

Celery's architecture is made up of three parts: the message broker, the task execution unit (worker), and the task result store; a minimal configuration sketch follows the breakdown below.

1) Message broker
Celery does not provide a messaging service itself, but it integrates easily with third-party message brokers such as RabbitMQ and Redis.
2) Task execution unit
The worker is Celery's task execution unit; workers run concurrently across the nodes of a distributed system.
3) Task result store
The task result store holds the results of tasks executed by workers; Celery supports several result backends, including AMQP and Redis.
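How these three components map onto code is visible directly in the Celery constructor. A minimal sketch (the URLs here are examples; the values actually used in this article appear in the experiment section below):

# -*- coding: utf-8 -*-
from celery import Celery

app = Celery(
    'demo',
    broker='redis://localhost:6379/2',   # message broker: where tasks are queued
    backend='redis://localhost:6379/1',  # task result store: where return values are kept
)
# the task execution unit (worker) is a separate process, started with e.g.:
#   celery -A <module_containing_app> worker -l INFO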


 

(3) Features

1) Simple
Celery is easy to use and maintain; it does not require configuration files, and both configuration and usage are straightforward.
2) Highly available
If a task fails or the connection drops during execution, Celery automatically retries the task.
3) Fast
A single Celery process can handle millions of tasks per minute while keeping round-trip latency at the sub-millisecond level.
4) Flexible
Almost every part of Celery can be extended or used on its own, and each component can be customized.

 

(4) Use cases

Celery is a powerful asynchronous processing framework built on a distributed task queue. It lets task execution run completely outside the main program, and tasks can even be dispatched to other hosts. It is typically used for asynchronous tasks (async task) and scheduled tasks (crontab).

1) Asynchronous tasks
Hand time-consuming operations to Celery for asynchronous execution, e.g. sending SMS/email, pushing notifications, audio/video processing.
2) Scheduled tasks
Run something on a schedule, e.g. daily data statistics (see the sketch after this list).
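The sketch below illustrates both use cases in one file. It is only a rough illustration and not part of this article's project: the module, task name, and schedule are hypothetical, and the broker/backend URLs simply reuse the Redis databases configured later in the experiments.

# -*- coding: utf-8 -*-
from celery import Celery
from celery.schedules import crontab

app = Celery('sketch',
             broker='redis://localhost:6379/2',
             backend='redis://localhost:6379/1')

@app.task
def daily_report():
    # placeholder for a real statistics job
    return "report done"

# asynchronous task: the caller returns immediately and a worker does the work
# daily_report.delay()

# scheduled task: celery beat submits this task every day at 00:30
app.conf.beat_schedule = {
    'daily-report': {
        'task': daily_report.name,              # registered task name
        'schedule': crontab(hour=0, minute=30),
    },
}
# beat itself is started separately, e.g.: celery -A <module> beat -l INFO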

 

II. Experiments

1. Installing Redis on Windows 11

(1) Download the latest Redis

Download the Redis-x64-xxx.zip archive to drive D, extract it, and rename the folder to Redis.

(2) View the directory

D:\Redis>dir


(3) Open a cmd window, use cd to change to D:\Redis, and run

redis-server.exe redis.windows.conf


 

(4) Add the Redis path to the system environment variables


 

(5) Open another cmd window, keeping the previous one open, because it is running the Redis server

# switch to the Redis directory and run
redis-cli.exe -h 127.0.0.1 -p 6379


(6) Check that the connection works

# set a key-value pair
set firstKey 123
# read the key back
get firstKey
# quit
exit
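The same check can also be done from Python with the redis package (a small sketch assuming the default host and port used above; this is not one of the original steps):

# -*- coding: utf-8 -*-
import redis

r = redis.Redis(host='127.0.0.1', port=6379, db=0)
r.set('firstKey', 123)
print(r.get('firstKey'))  # b'123'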

 


 

(7) Press Ctrl+C to stop the server started earlier


(8) Register Redis as a Windows service

# in a cmd window, change to the Redis installation directory and register Redis as a Windows service with the following command
redis-server.exe --service-install redis.windows.conf --loglevel verbose
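The Windows build registered here also ships matching service-management switches. The two below are commonly available in the Redis-for-Windows port, but treat them as an assumption and verify against your build:

# stop the service
redis-server --service-stop
# remove the service registration
redis-server --service-uninstall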


(9) Start the Redis service

# run the following command to start the Redis service
redis-server --service-start


(10) Redis has now been added to the Windows services


(11) Open the Redis service and set its startup type to Automatic so that it starts on boot


 

2. Configuring Celery in a Python 3.8 environment

(1) Install celery and redis in PyCharm

# Celery follows the classic producer/consumer pattern: producers create tasks and put them on the queue, consumers take tasks off the queue and execute them. It is mostly used for asynchronous or scheduled tasks.
# Option 1
pip install celery
pip install redis
# Option 2 (Douban mirror)
pip install -i https://pypi.douban.com/simple celery
pip install -i https://pypi.douban.com/simple redis


 

(2) Create the asynchronous task file celery_task.py; this is effectively where the Celery app is registered

# -*- coding: utf-8 -*-
from celery import Celery
import time

app = Celery('demo', backend='redis://localhost:6379/1', broker='redis://localhost:6379/2')

@app.task
def send_email(name):
    print("向%s发送邮件..." % name)    # sending email to <name>...
    time.sleep(5)                       # simulate a slow operation
    print("向%s发送邮件完成" % name)   # finished sending email to <name>
    return "ok"
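With the app registered, the difference between a plain call and an asynchronous call looks roughly like this (illustration only; the asynchronous form is what produce_task.py uses later):

# send_email("david")        # runs in the current process and blocks for ~5 s
# send_email.delay("david")  # enqueues the task; a worker executes it and stores "ok" in the backend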


(3) In the project directory, start a worker to consume tasks

PS D:\soft\pythonProject> celery --app=celerypro.celery_task worker -n node1 -l INFO

 -------------- celery@node1 v5.3.5 (emerald-rush)
--- ***** -----
-- ******* ---- Windows-10-10.0.22621-SP0 2023-11-22 17:26:39
- *** --- * ---
- ** ---------- [config]
- ** ---------- .> app:         test:0x1e6fa358550
- ** ---------- .> transport:   redis://127.0.0.1:6379/2
- ** ---------- .> results:     redis://127.0.0.1:6379/1
- *** --- * --- .> concurrency: 32 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** ----- -------------- [queues]
                .> celery           exchange=celery(direct) key=celery

[tasks]
  . celerypro.celery_task.send_email

[2023-11-22 17:26:39,265: WARNING/MainProcess] d:\soft\python38\lib\site-packages\celery\worker\consumer\consumer.py:507: CPendingDeprecationWarning: The broker_connection_retry configuration setting will no longer determine
[2023-11-22 20:30:08,249: INFO/MainProcess] mingle: searching for neighbors
[2023-11-22 20:30:15,379: INFO/MainProcess] mingle: all alone
[2023-11-22 20:30:25,608: INFO/MainProcess] celery@node1 ready.


 

(4) Press Ctrl+C to exit


(5) Modify celery_task.py and add another task

# -*- coding: utf-8 -*-
from celery import Celery
import time

app = Celery('demo', backend='redis://localhost:6379/1', broker='redis://localhost:6379/2')

@app.task
def send_email(name):
    print("向%s发送邮件..." % name)
    time.sleep(5)
    print("向%s发送邮件完成" % name)
    return "ok"

@app.task
def send_msg(name):
    print("向%s发送短信..." % name)    # sending SMS to <name>...
    time.sleep(5)
    print("向%s发送邮件完成" % name)   # note: the original code prints the "email finished" message here as well
    return "ok"


(6) Start the worker again in the project directory

PS D:\soft\pythonProject> celery --app=celerypro.celery_task worker -n node1 -l INFO

 -------------- celery@node1 v5.3.5 (emerald-rush)
--- ***** ----- 
-- ******* ---- Windows-10-10.0.22621-SP0 2023-11-22 21:01:43
- *** --- * --- 
- ** ---------- [config]
- ** ---------- .> app:         demo:0x29cea446250
- ** ---------- .> transport:   redis://localhost:6379/2
- ** ---------- .> results:     redis://localhost:6379/1
- *** --- * --- .> concurrency: 32 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** ----- -------------- [queues]
                .> celery           exchange=celery(direct) key=celery

[tasks]
  . celerypro.celery_task.send_email
  . celerypro.celery_task.send_msg

[2023-11-22 21:01:43,381: WARNING/MainProcess] d:\soft\python38\lib\site-packages\celery\worker\consumer\consumer.py:507: CPendingDeprecationWarning: The broker_connection_retry configuration setting will no longer determine
[2023-11-22 21:01:43,612: INFO/SpawnPoolWorker-23] child process 23988 calling self.run()
[2023-11-22 21:01:43,612: INFO/SpawnPoolWorker-17] child process 16184 calling self.run()
[2023-11-22 21:01:43,612: INFO/SpawnPoolWorker-21] child process 22444 calling self.run()
[2023-11-22 21:01:43,612: INFO/SpawnPoolWorker-27] child process 29480 calling self.run()
[2023-11-22 21:01:43,612: INFO/SpawnPoolWorker-24] child process 5844 calling self.run()
[2023-11-22 21:01:43,631: INFO/SpawnPoolWorker-25] child process 8896 calling self.run()
[2023-11-22 21:01:43,634: INFO/SpawnPoolWorker-29] child process 28068 calling self.run()
[2023-11-22 21:01:43,634: INFO/SpawnPoolWorker-28] child process 18952 calling self.run()
[2023-11-22 21:01:43,636: INFO/SpawnPoolWorker-26] child process 13680 calling self.run()
[2023-11-22 21:01:43,638: INFO/SpawnPoolWorker-31] child process 25472 calling self.run()
[2023-11-22 21:01:43,638: INFO/SpawnPoolWorker-30] child process 28688 calling self.run()
[2023-11-22 21:01:43,638: INFO/SpawnPoolWorker-32] child process 10072 calling self.run()
[2023-11-22 21:01:45,401: INFO/MainProcess] Connected to redis://localhost:6379/2
[2023-11-22 21:01:45,401: WARNING/MainProcess] d:\soft\python38\lib\site-packages\celery\worker\consumer\consumer.py:507: CPendingDeprecationWarning: The broker_connection_retry configuration setting will no longer determine
whether broker connection retries are made during startup in Celery 6.0 and above.
If you wish to retain the existing behavior for retrying connections on startup,
you should set broker_connection_retry_on_startup to True.
  warnings.warn(
[2023-11-22 21:01:49,477: INFO/MainProcess] mingle: searching for neighbors
[2023-11-22 21:01:56,607: INFO/MainProcess] mingle: all alone
[2023-11-22 21:02:04,753: INFO/MainProcess] celery@node1 ready.


(7) Press Ctrl+C to exit, then create the task-producing file produce_task.py

# -*- coding: utf-8 -*-
from celerypro.celery_task import send_email, send_msg

result = send_email.delay("david")
print(result.id)
result2 = send_msg.delay("mao")
print(result2.id)
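delay() returns an AsyncResult; this script only prints its id, but the same object could also be used to wait for the return value (a hedged illustration, not part of the original file):

# result = send_email.delay("david")
# result.get(timeout=20)  # would block until the worker returns "ok"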


 

(8) Run produce_task.py


(9) Both task IDs are returned


(10) If an error is reported, the eventlet package needs to be installed

PS D:\soft\pythonProject> pip install eventlet

(11) Start the worker again in the project directory, this time with the eventlet pool

PS D:\soft\pythonProject> celery --app=celerypro.celery_task worker -n node1 -l INFO -P eventlet

 -------------- celery@node1 v5.3.5 (emerald-rush)
--- ***** -----
-- ******* ---- Windows-10-10.0.22621-SP0 2023-11-22 21:29:34
- *** --- * ---
- ** ---------- [config]
- ** ---------- .> app:         demo:0x141511962e0
- ** ---------- .> transport:   redis://localhost:6379/2
- ** ---------- .> results:     redis://localhost:6379/1
- *** --- * --- .> concurrency: 32 (eventlet)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** ----- -------------- [queues]
                .> celery           exchange=celery(direct) key=celery

[tasks]
  . celerypro.celery_task.send_email
  . celerypro.celery_task.send_msg

... The broker_connection_retry configuration setting will no longer determine
whether broker connection retries are made during startup in Celery 6.0 and above.
If you wish to retain the existing behavior for retrying connections on startup,
you should set broker_connection_retry_on_startup to True.
  warnings.warn(
[2023-11-22 21:29:48,022: INFO/MainProcess] pidbox: Connected to redis://localhost:6379/2.
[2023-11-22 21:29:52,117: INFO/MainProcess] celery@node1 ready.



(12) Run produce_task.py


(13) The task IDs are generated


(14) Check the task messages in the worker log

[2023-11-22 21:30:35,194: INFO/MainProcess] Task celerypro.celery_task.send_email[c1a473d5-49ac-4468-9370-19226f377e00] received
[2023-11-22 21:30:35,195: WARNING/MainProcess] 向david发送邮件...
[2023-11-22 21:30:35,197: INFO/MainProcess] Task celerypro.celery_task.send_msg[de30d70b-9110-4dfb-bcfd-45a61403357f] received
[2023-11-22 21:30:35,198: WARNING/MainProcess] 向mao发送短信...
[2023-11-22 21:30:40,210: WARNING/MainProcess] 向david发送邮件完成
[2023-11-22 21:30:40,210: WARNING/MainProcess] 向mao发送邮件完成
[2023-11-22 21:30:42,270: INFO/MainProcess] Task celerypro.celery_task.send_msg[de30d70b-9110-4dfb-bcfd-45a61403357f] succeeded in 7.063000000001921s: 'ok'
[2023-11-22 21:30:42,270: INFO/MainProcess] Task celerypro.celery_task.send_email[c1a473d5-49ac-4468-9370-19226f377e00] succeeded in 7.063000000001921s: 'ok'


(15) Create a file result.py to check the task execution result

Use the second ID: de30d70b-9110-4dfb-bcfd-45a61403357f

# -*- coding: utf-8 -*-
from celery.result import AsyncResult
from celerypro.celery_task import app

async_result = AsyncResult(id="de30d70b-9110-4dfb-bcfd-45a61403357f", app=app)

if async_result.successful():
    result = async_result.get()
    print(result)
elif async_result.failed():
    print('执行失败')              # execution failed
elif async_result.status == 'PENDING':
    print('任务等待中被执行')      # task is waiting to be executed
elif async_result.status == 'RETRY':
    print('任务异常后正在重试')    # task raised an error and is being retried
elif async_result.status == 'STARTED':
    print('任务已经开始被执行')    # task has started executing


(16) Run result.py


(17) It prints ok


 

III. Problems

1. Celery command error

(1) Error


(2) Cause

The Celery command syntax differs between versions.
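Roughly speaking (proj below is just a placeholder project name): Celery 4.x accepted the -A/--app option after the worker subcommand, while Celery 5.x requires options that belong to the top-level celery command to come before the subcommand:

# Celery 4.x style (rejected by Celery 5.x)
celery worker -A proj -l info
# Celery 5.x style
celery -A proj worker -l INFO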

Check the full help output:

PS D:\soft\pythonProject> celery --help
Usage: celery [OPTIONS] COMMAND [ARGS]...

  Celery command entrypoint.

Options:
  -A, --app APPLICATION
  -b, --broker TEXT
  --result-backend TEXT
  --loader TEXT
  --config TEXT
  --workdir PATH
  -C, --no-color
  -q, --quiet
  --version
  --skip-checks          Skip Django core checks on startup. Setting the
                         SKIP_CHECKS environment variable to any non-empty
                         string will have the same effect.
  --help                 Show this message and exit.

Commands:
  amqp     AMQP Administration Shell.
  beat     Start the beat periodic task scheduler.
  call     Call a task by name.
  control  Workers remote control.
  events   Event-stream utilities.
  graph    The ``celery graph`` command.
  inspect  Inspect the worker at runtime.
  list     Get info from broker.
  logtool  The ``celery logtool`` command.
  migrate  Migrate tasks from one broker to another.
  multi    Start multiple worker instances.
  purge    Erase all messages from all known task queues.
  report   Shows information useful to include in bug-reports.
  result   Print the return value for a given task id.
  shell    Start shell session with convenient access to celery symbols.
  status   Show list of workers that are online.
  upgrade  Perform upgrade between versions.
  worker   Start worker instance.
PS D:\soft\pythonProject> celery  worker --help
Usage: celery worker [OPTIONS]

  Start worker instance.

  Examples
  --------
  $ celery --app=proj worker -l INFO
  $ celery -A proj worker -l INFO -Q hipri,lopri
  $ celery -A proj worker --concurrency=4
  $ celery -A proj worker --concurrency=1000 -P eventlet
  $ celery worker --autoscale=10,0

Worker Options:
  -n, --hostname HOSTNAME         Set custom hostname (e.g., 'w1@%%h').
                                  Expands: %%h (hostname), %%n (name) and %%d,
                                  (domain).
  -D, --detach                    Start worker as a background process.
  -S, --statedb PATH              Path to the state database. The extension
                                  '.db' may be appended to the filename.
  -l, --loglevel [DEBUG|INFO|WARNING|ERROR|CRITICAL|FATAL]
                                  Logging level.
  -O, --optimization [default|fair]
                                  Apply optimization profile.
  --prefetch-multiplier <prefetch multiplier>
                                  Set custom prefetch multiplier value for
                                  this worker instance.

Pool Options:
  -c, --concurrency <concurrency>
                                  Number of child processes processing the
                                  queue.  The default is the number of CPUs
                                  available on your system.
  -P, --pool [prefork|eventlet|gevent|solo|processes|threads|custom]
                                  Pool implementation.
  -E, --task-events, --events     Send task-related events that can be
                                  captured by monitors like celery events,
                                  celerymon, and others.
  --time-limit FLOAT              Enables a hard time limit (in seconds
                                  int/float) for tasks.
  --soft-time-limit FLOAT         Enables a soft time limit (in seconds
                                  int/float) for tasks.
  --max-tasks-per-child INTEGER   Maximum number of tasks a pool worker can
                                  execute before it's terminated and replaced
                                  by a new worker.
  --max-memory-per-child INTEGER  Maximum amount of resident memory, in KiB,
                                  that may be consumed by a child process
                                  before it will be replaced by a new one.  If
                                  a single task causes a child process to
                                  exceed this limit, the task will be
                                  completed and the child process will be
                                  replaced afterwards. Default: no limit.
  --scheduler TEXT

Daemonization Options:
  -f, --logfile TEXT  Log destination; defaults to stderr
  --pidfile TEXT
  --uid TEXT
  --gid TEXT
  --umask TEXT
  --executable TEXT

Options:
  --help  Show this message and exit.

(3) Solution

Use the corrected command:

PS D:\soft\pythonProject> celery --app=celerypro.celery_task worker -n node1 -l INFO

Success.


 

2. Error when running the Celery command

(1) Error

AttributeError: 'NoneType' object has no attribute 'Redis'


 

(2) Cause

The redis Python package is not installed in the interpreter environment used by PyCharm, so Celery's Redis transport cannot import it.

(3) Solution

Install the redis package.
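A quick sanity check that the package is visible to the interpreter PyCharm uses to run the worker (just a verification aid, not one of the original steps):

pip install redis
python -c "import redis; print(redis.__version__)"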


 

3. ValueError when starting Celery on Windows 11

(1) Error

When developing Celery asynchronous tasks on Windows, the service starts normally with the command celery --app=celerypro.celery_task worker -n node1 -l INFO;

but calling a task with delay() produces the following error:

Task handler raised error: ValueError('not enough values to unpack (expected 3, got 0)')
 


(2) Cause

The eventlet package is not installed. Celery's default prefork pool does not work properly on Windows, so the worker fails with this ValueError when it tries to execute a task; an alternative pool such as eventlet is needed.

(3) Solution

Install the eventlet package:

pip install eventlet


Start the service with the following command:

celery --app=celerypro.celery_task worker -n node1 -l INFO -P eventlet
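If eventlet is not an option, the worker --help output above also lists other pool implementations that avoid the default prefork pool. The variants below were not tested in this article and are mentioned only as alternatives:

celery --app=celerypro.celery_task worker -n node1 -l INFO -P solo
celery --app=celerypro.celery_task worker -n node1 -l INFO -P threads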


 

 
