How do I insert, update, and delete data through Tortoise ORM model classes inside Celery?


`The overall flow saves the data from a POST request into the database. The Order model class is defined with Tortoise ORM. Whenever I try to save the data inside a Celery task I keep getting a "never awaited" warning. The Celery version is 5.3.6. I haven't managed to get it working and would appreciate someone pointing out what I'm doing wrong.`
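
For context, Tortoise's `Model.create()` is a coroutine function, so calling it without `await` only builds a coroutine object; that is exactly what produces the warning. A minimal sketch of the awaited form, using the `Orders` model shown further down:

```python
from models import Orders


async def save_order(data: dict) -> None:
    # Calling Orders.create(**data) without "await" only creates a coroutine
    # object and triggers "RuntimeWarning: coroutine ... was never awaited".
    await Orders.create(**data)  # actually runs the INSERT
```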

database.py
```python
from tortoise.contrib.fastapi import register_tortoise
from fastapi import FastAPI
from etc.settings import settings

TORTOISE_ORM = {
    "connections": {
        "default": {
            "engine": "tortoise.backends.mysql",
            "credentials": {
                "host": "127.0.0.1",
                "port": "3306",
                "user": "username",
                "password": "password",
                "database": "test",
                "minsize": 1,
                "maxsize": 15,
                "charset": "utf8mb4",
                "echo": True,
            },
        },
    },
    "apps": {
        "models": {
            "models": [
                "aerich.models",
                "models.details",
            ],
            "default_connection": "default",
        },
    },
    "use_tz": False,
    "timezone": "Asia/Shanghai",
}


async def register_mysql(app: FastAPI):
    # Register the database with the FastAPI app
    register_tortoise(
        app,
        config=TORTOISE_ORM,
    )
```
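
Since the Celery worker never goes through FastAPI's startup, the same `TORTOISE_ORM` dict can also be used to initialise Tortoise in a standalone process. A minimal sketch for verifying the config outside FastAPI (the file name and the `generate_schemas` call are illustrative):

```python
# check_db.py -- standalone connectivity check using the config above
from tortoise import Tortoise, run_async

from database import TORTOISE_ORM


async def check_connection():
    await Tortoise.init(config=TORTOISE_ORM)
    await Tortoise.generate_schemas(safe=True)  # create missing tables only
    await Tortoise.close_connections()


if __name__ == "__main__":
    run_async(check_connection())
```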


core.Events.py
```python
from typing import Callable

from fastapi import FastAPI

from database import register_mysql
from core.Topic import create_topic


def startup(app: FastAPI) -> Callable:
    """
    FastAPI startup event
    :param app: FastAPI
    :return: app_start
    """

    async def app_start() -> None:
        # Triggered once the app has started
        print("FastAPI started")
        # Register the database
        await register_mysql(app)
        print("Database registered")

    return app_start


def stopping(app: FastAPI) -> Callable:
    """
    FastAPI shutdown event
    :param app: FastAPI
    :return: stop_app
    """

    async def stop_app() -> None:
        # Triggered when the app stops
        print("FastAPI stopped")

    return stop_app
```

main.py
```python
from fastapi import FastAPI
import uvicorn
from fastapi.middleware.cors import CORSMiddleware

from core import Router, Events, Middleware
from etc.settings import settings

application = FastAPI()

# Routes
application.include_router(Router.router)

# Event handlers
application.add_event_handler("startup", Events.startup(application))

# CORS middleware
application.add_middleware(
    CORSMiddleware,
    allow_origins=settings.CORS_ORIGINS,
    allow_credentials=settings.CORS_ALLOW_CREDENTIALS,
    allow_methods=settings.CORS_ALLOW_METHODS,
    allow_headers=settings.CORS_ALLOW_HEADERS,
)

app = application

if __name__ == "__main__":
    uvicorn.run("main:app", port=8090, reload=True, workers=1)
```


task_manager.__init__.py

```python
from celery import Celery

app = Celery("task_manager")                   # create the Celery app
app.config_from_object("task_manager.config")  # load the config module
app.autodiscover_tasks()                       # discover task modules
```
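
With this package layout, the worker would typically be started from the project root with something like the following (the queue name matches `task_default_queue` in the config below):

```bash
celery -A task_manager worker -Q delivery_queue --loglevel=info
```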

task_manager.config.py

```python
from etc.settings import settings

broker_url = settings.CELERY_BROKER_URL
result_backend = None
broker_connection_retry_on_startup = True
accept_content = ["json"]
task_serializer = "json"
result_serializer = "json"
result_expires = 60
timezone = "Asia/Shanghai"
task_default_queue = "delivery_queue"
task_default_exchange = "delivery_exchange"

imports = (
    "task_manager.tasks",
)
```

task_manager.tasks.py

```python
from task_manager import app
from models import Orders


@app.task
def confirm_order_distribute(data):
    Orders.create(**data)  # the coroutine is never awaited here -- this is where the warning comes from
```
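
For reference, one common pattern is to run the coroutine to completion inside the task with its own event loop and initialise Tortoise in the worker process, since the worker does not go through FastAPI's startup. A minimal sketch, assuming `TORTOISE_ORM` from database.py and that the worker has no running event loop; initialising and closing connections on every call is kept simple here rather than efficient:

```python
import asyncio

from tortoise import Tortoise

from task_manager import app
from database import TORTOISE_ORM
from models import Orders


@app.task
def confirm_order_distribute(data):
    async def _save():
        # The Celery worker is a separate process, so the ORM must be
        # initialised here before Orders can be used.
        await Tortoise.init(config=TORTOISE_ORM)
        await Orders.create(**data)
        await Tortoise.close_connections()

    # Running the coroutine to completion is what removes the
    # "coroutine ... was never awaited" warning.
    asyncio.run(_save())
```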

Router.py

from fastapi import APIRouter
from urllib.parse import unquote_plus
from task_manager.task import confirm_order_distribute

router = APIRouter(prefix=‘/delivery’)

@router.post(“/delivery”,summary=“数据处理”)
async def data_view(request:Request):
content_type = request.headers.get(“content-type”, “”)
if “application/x-www-form-urlencoded” in content_type:
form_data = await request.form()
decoded_form_data = {key: unquote_plus(value) for key, value in form_data.items()}
if decoded_form_data :
confirm_order_distribute.delay(decoded_form_data)
return {“data”:“ok”}
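
For testing, a form-urlencoded POST along these lines would enqueue the task (field names here are illustrative, taken from the `Orders` model below):

```bash
curl -X POST http://127.0.0.1:8090/delivery/delivery \
  -H "Content-Type: application/x-www-form-urlencoded" \
  --data-urlencode "id=abc123" \
  --data-urlencode "latitude=31.23" \
  --data-urlencode "longitude=121.47"
```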

models.py

```python
from tortoise import fields, Model


class Orders(Model):
    id = fields.CharField(max_length=32, pk=True, unique=True, description="Id")
    latitude = fields.FloatField(null=True, description="Latitude")
    longitude = fields.FloatField(null=True, description="Longitude")
```

Celery makes it easy to implement scheduled tasks and batch e-mail pushes driven by database data. The following example shows how to use Celery to send batch e-mails on a schedule based on database data:

1. Install Celery and Redis

```bash
pip install celery redis
```

2. Create the Celery instance

```python
# tasks.py
from celery import Celery

app = Celery('tasks', broker='redis://localhost:6379/0')
```

3. Write the e-mail sending function

```python
# tasks.py
import smtplib
from email.mime.text import MIMEText

from myapp.models import User  # assume our User model is defined in myapp.models


@app.task
def send_email(recipient, subject, body):
    # Build the message
    message = MIMEText(body)
    message['From'] = 'sender@example.com'
    message['To'] = recipient
    message['Subject'] = subject

    # Connect to the mail server
    server = smtplib.SMTP('smtp.example.com', 587)
    server.starttls()
    server.login('username', 'password')

    # Send the mail
    server.sendmail('sender@example.com', recipient, message.as_string())

    # Close the connection
    server.quit()
```

4. Write the scheduled tasks

```python
# tasks.py
from datetime import datetime, timedelta

from myapp.models import User


@app.task
def send_batch_emails():
    # Fetch the recipient list from the database
    recipients = User.objects.values_list('email', flat=True)
    subject = 'Test Email'
    body = 'This is a test email.'

    # Send the e-mails
    for recipient in recipients:
        send_email.apply_async(args=[recipient, subject, body])


@app.task
def send_periodic_emails():
    # Send the batch once a day at 10:00
    schedule_time = datetime.now().replace(hour=10, minute=0, second=0)
    if schedule_time < datetime.now():
        schedule_time += timedelta(days=1)
    send_batch_emails.apply_async(eta=schedule_time)
```

5. Start Celery

```bash
celery -A tasks worker --loglevel=info
```

6. Trigger the scheduled e-mails

```python
# call send_periodic_emails() from anywhere
from tasks import send_periodic_emails

send_periodic_emails.delay()
```

In the code above, we use Celery to create an asynchronous task send_email() that sends a single e-mail. We then create send_batch_emails(), which fetches the recipient list from the database, creates a send_email task for each recipient and sends the mails asynchronously.

Finally, send_periodic_emails() schedules the batch for 10:00 each day. It uses Celery's apply_async() method with an eta to set the execution time; call it once after the worker has started.

You can adapt the e-mail content, the way recipients are fetched from the database and the schedule to your own needs, and use a similar approach to implement scheduled batch e-mail pushes driven by database data.
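
As a side note, the scheduling in send_periodic_emails() only queues a single delayed run; Celery's beat scheduler is the usual way to repeat the batch every day. A minimal sketch, assuming the tasks module above (the schedule entry name is arbitrary):

```python
# tasks.py -- run the batch every day at 10:00 via celery beat
from celery.schedules import crontab

app.conf.beat_schedule = {
    "daily-batch-emails": {
        "task": "tasks.send_batch_emails",
        "schedule": crontab(hour=10, minute=0),
    },
}
```

The beat process is then started alongside the worker, e.g. `celery -A tasks beat --loglevel=info`.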
