django-db-queue

Forked from dabapps/django-db-queue, with a shift limit option added. License: BSD 2-Clause "Simplified" License.

Simple database-backed job queue. Jobs are defined in your settings, and are processed by management commands.

Asynchronous tasks are run via a job queue. This system is designed to support multi-step job workflows.

Supported and tested against:

  • Django 3.2 and 4.0
  • Python 3.6, 3.7, 3.8, 3.9 and 3.10

Getting Started

Installation

Install from PyPI

pip install django-db-queue

Add django_dbq to your installed apps

INSTALLED_APPS = [
    ...,
    "django_dbq",
]

Run migrations

manage.py migrate

Upgrading from 1.x to 2.x

Note that version 2.x only supports Django 3.1 or newer. If you need support for Django 2.2, please stick with the latest 1.x release.

Describe your job

In e.g. project.common.jobs:

import logging
import time

logger = logging.getLogger(__name__)


def my_task(job):
    logger.info("Working hard...")
    time.sleep(10)
    logger.info("Job's done!")

Set up your job

In project.settings:

JOBS = {
    "my_job": {
        "tasks": ["project.common.jobs.my_task"],
    },
}

Hooks

Failure Hooks

When an unhandled exception is raised by a job, a failure hook will be called if one exists, enabling you to clean up any state left behind by the failed job. Failure hooks run in your worker process.

A failure hook receives the failed Job instance along with the unhandled exception raised by your failed job as its arguments. Here's an example:

def my_task_failure_hook(job, e):
    ...  # clean up after failed job

To ensure this hook gets run, simply add a failure_hook key to your job config like so:

JOBS = {
    "my_job": {
        "tasks": ["project.common.jobs.my_task"],
        "failure_hook": "project.common.jobs.my_task_failure_hook",
    },
}
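As a slightly fuller sketch of what a failure hook might do (the logging and workspace bookkeeping here are illustrative assumptions, not required by django-db-queue; FakeJob stands in for a real Job instance):

```python
import logging

logger = logging.getLogger(__name__)


class FakeJob:
    """Minimal stand-in for a Job instance (illustration only)."""

    def __init__(self):
        self.pk = 1
        self.workspace = {}


def my_task_failure_hook(job, e):
    # Record the failure so it can be inspected later.
    logger.error("Job %s failed: %s", job.pk, e)
    job.workspace["error"] = str(e)


job = FakeJob()
my_task_failure_hook(job, ValueError("boom"))
```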

Creation Hooks

You can also run creation hooks, which happen just after the creation of your Job instances and are executed in the process in which the job was created, not the worker process.

A creation hook receives your Job instance as its only argument. Here's an example:

def my_task_creation_hook(job):
    ...  # configure something before running your job

To ensure this hook gets run, simply add a creation_hook key to your job config like so:

JOBS = {
    "my_job": {
        "tasks": ["project.common.jobs.my_task"],
        "creation_hook": "project.common.jobs.my_task_creation_hook",
    },
}

Start the worker

In another terminal:

python manage.py worker

Create a job

Using the name you configured for your job in your settings, create an instance of Job.

from django_dbq.models import Job

Job.objects.create(name="my_job")

Prioritising jobs

Sometimes it is necessary for certain jobs to take precedence over others. For example, you may have a worker whose primary purpose is dispatching somewhat important emails to users. However, once an hour, you may need to run a really important job which must start on time and cannot wait in the queue for dozens of emails to be dispatched before it can begin.

In order to make sure that an important job is run before others, you can set the priority field to an integer higher than 0 (the default). For example:

Job.objects.create(name="normal_job")
Job.objects.create(name="important_job", priority=1)
Job.objects.create(name="critical_job", priority=2)

Jobs are ordered by priority (highest to lowest), then by creation time (oldest to newest), and processed in that order.
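The selection order described above can be sketched with plain Python dicts (a stand-in for the worker's database query, for illustration only):

```python
# Hedged sketch: emulate the documented ordering with plain dicts.
# "created" is an increasing counter standing in for a creation timestamp.
jobs = [
    {"name": "critical_job", "priority": 2, "created": 3},
    {"name": "important_job", "priority": 1, "created": 2},
    {"name": "normal_job", "priority": 0, "created": 1},
]

# Highest priority first; oldest first within equal priority.
order = sorted(jobs, key=lambda j: (-j["priority"], j["created"]))
names = [j["name"] for j in order]  # critical_job, important_job, normal_job
```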

Scheduling jobs

If you'd like to create a job but have it run at some time in the future, you can use the run_after field on the Job model:

from datetime import timedelta

from django.utils import timezone

Job.objects.create(
    name="scheduled_job",
    run_after=timezone.now() + timedelta(minutes=10),
)

Of course, the scheduled job will only be run if your python manage.py worker process is running at the time when the job is scheduled to run. Otherwise, it will run the next time you start your worker process after that time has passed.

It's also worth noting that, by default, scheduled jobs run as part of the same queue as all other jobs, and so if a job is already being processed at the time when your scheduled job is due to run, it won't run until that job has finished. If increased precision is important, you might consider using the queue_name feature to run a separate worker dedicated to only running scheduled jobs.

Terminology

Job

The top-level abstraction of a standalone piece of work. Jobs are stored in the database (i.e. they are represented as Django model instances).

Task

Jobs are processed to completion by tasks. These are simply Python functions, which must take a single argument - the Job instance being processed. A single job will often require processing by more than one task to be completed fully. Creating the task functions is the responsibility of the developer. For example:

def my_task(job):
    logger.info("Doing some hard work")
    do_some_hard_work()

Workspace

The workspace is an area that can be used 1) to provide additional arguments to task functions, and 2) to categorize jobs with additional metadata. It is implemented as a Python dictionary, available on the job instance passed to tasks as job.workspace. The initial workspace of a job can be empty, or can contain some parameters that the tasks require (for example, API access tokens, account IDs etc).

When creating a Job, the workspace is passed as a keyword argument:

Job.objects.create(name="my_job", workspace={"key": value})

Then, the task function can access the workspace to get the data it needs to perform its task:

def my_task(job):
    cats_import = CatsImport.objects.get(pk=job.workspace["cats_import_id"])

Tasks within a single job can use the workspace to communicate with each other. A single task can edit the workspace, and the modified workspace will be passed on to the next task in the sequence. For example:

def my_first_task(job):
    job.workspace['message'] = 'Hello, task 2!'

def my_second_task(job):
    logger.info("Task 1 says: %s" % job.workspace['message'])
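To see this hand-off end to end without a database, the two tasks can be exercised against a minimal stand-in for the Job instance (FakeJob is illustrative only, not part of django-db-queue):

```python
import logging

logger = logging.getLogger(__name__)


class FakeJob:
    """Minimal stand-in for a django_dbq Job instance (illustration only)."""

    def __init__(self):
        self.workspace = {}


def my_first_task(job):
    job.workspace["message"] = "Hello, task 2!"


def my_second_task(job):
    text = "Task 1 says: %s" % job.workspace["message"]
    logger.info(text)
    return text


job = FakeJob()
my_first_task(job)
result = my_second_task(job)  # "Task 1 says: Hello, task 2!"
```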

The workspace can be queried like any JSONField. For instance, if you wanted to display a list of jobs that a certain user had initiated, add user_id to the workspace when creating the job:

Job.objects.create(name="foo", workspace={"user_id": request.user.id})

Then filter the query with it in the view that renders the list:

user_jobs = Job.objects.filter(workspace__user_id=request.user.id)

Worker process

A worker process is a long-running process, implemented as a Django management command, which is responsible for executing the tasks associated with a job. There may be many worker processes running concurrently in the final system. Worker processes wait for a new job to be created in the database, then call each associated task in the correct sequence. A worker can be started using python manage.py worker, and a single worker instance is included in the development procfile.

Configuration

Jobs are configured in the Django settings.py file. The JOBS setting is a dictionary mapping a job name (eg import_cats) to a list of one or more task function paths. For example:

JOBS = {
    "import_cats": [
        "apps.cat_importer.import_cats.step_one",
        "apps.cat_importer.import_cats.step_two",
    ],
}

Job states

Jobs have a state field which can have one of the following values:

  • NEW (has been created, waiting for a worker process to run the next task)
  • READY (has run a task before, awaiting a worker process to run the next task)
  • PROCESSING (a task is currently being processed by a worker)
  • STOPPING (the worker process has received a signal from the OS requesting it to exit)
  • COMPLETE (all job tasks have completed successfully)
  • FAILED (a job task failed)

State diagram


API

Model methods

Job.get_queue_depths

If you need to programmatically get the depth of any queue, you can run the following:

from django_dbq.models import Job

...

Job.objects.create(name="do_work", workspace={})
Job.objects.create(name="do_other_work", queue_name="other_queue", workspace={})

queue_depths = Job.get_queue_depths()
print(queue_depths)  # {"default": 1, "other_queue": 1}

Important: When checking queue depths, do not assume that the key for your queue will always be available. Queue depths of zero won't be included in the dict returned by this method.
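Since empty queues have no key at all, a safe lookup pattern is dict.get with a default (the dict below is a plain-Python sketch of one possible return value):

```python
# A possible return value of Job.get_queue_depths() when the "default"
# queue is empty: empty queues are omitted from the dict entirely.
queue_depths = {"other_queue": 1}

depth = queue_depths.get("default", 0)  # 0, rather than a KeyError
```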

Management commands

manage.py delete_old_jobs

There is a management command, manage.py delete_old_jobs, which deletes any jobs from the database which are in state COMPLETE or FAILED and were created more than 24 hours ago. This could be run, for example, as a cron task, to ensure the jobs table remains at a reasonable size.
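For example, a crontab entry to run the cleanup nightly might look like this (the paths are hypothetical placeholders for your own environment):

```
# Hypothetical crontab entry: run cleanup every day at 03:00
0 3 * * * /path/to/venv/bin/python /path/to/project/manage.py delete_old_jobs
```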

manage.py worker

To start a worker:

manage.py worker [queue_name] [--rate_limit]

  • queue_name is optional, and defaults to default
  • The --rate_limit flag is optional, and defaults to 1. It is the minimum number of seconds that must have elapsed before a subsequent job can be run.

manage.py queue_depth

If you'd like to check your queue depth from the command line, you can run manage.py queue_depth [queue_name [queue_name ...]], which returns the number of jobs in the "NEW" or "READY" states for each queue.

Important: If you provide a queue name that is misspelled or has no jobs, a depth of 0 will be returned.

Testing

It may be necessary to supply a DATABASE_PORT environment variable.

Windows support

Windows is supported on a best-effort basis only, and is not covered by automated or manual testing.

Code of conduct

For guidelines regarding the code of conduct when contributing to this repository please review https://www.dabapps.com/open-source/code-of-conduct/
