
# Dify Backend API

## Usage

> [!IMPORTANT]
>
> In the `v0.6.12` release, we deprecated `pip` as the package management tool for the Dify API backend service and replaced it with `poetry`.

1. Start the docker-compose stack

   The backend requires some middleware, including PostgreSQL, Redis, and Weaviate, which can be started together using `docker-compose`.

   ```bash
   cd ../docker
   docker compose -f docker-compose.middleware.yaml -p dify up -d
   cd ../api
   ```
    
2. Copy `.env.example` to `.env`
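   The copy itself is a single `cp` from inside `api/`. The snippet below wraps it in a throwaway directory with a stand-in template only so it runs anywhere; in the real repo, `.env.example` already exists:

   ```shell
   # Sandbox so this snippet is self-contained; in practice, just run the
   # `cp` line from the api/ directory, where .env.example already exists.
   cd "$(mktemp -d)"
   printf 'SECRET_KEY=\n' > .env.example   # stand-in for the real template
   cp .env.example .env                    # the actual step: create your local config
   ```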

3. Generate a `SECRET_KEY` in the `.env` file.

   On Linux (GNU `sed`):

   ```bash
   sed -i "/^SECRET_KEY=/c\SECRET_KEY=$(openssl rand -base64 42)" .env
   ```

   On macOS (BSD `sed`):

   ```bash
   secret_key=$(openssl rand -base64 42)
   sed -i '' "/^SECRET_KEY=/c\\
   SECRET_KEY=${secret_key}" .env
   ```
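   If juggling the GNU/BSD `sed` differences is a hassle, a key of similar strength can be generated with the Python standard library instead; this is a stdlib alternative sketch, not the project's documented method, and you paste the printed line into `.env` yourself:

   ```python
   import secrets

   # Roughly equivalent entropy to `openssl rand -base64 42` (42 random bytes,
   # URL-safe base64, no padding)
   key = secrets.token_urlsafe(42)
   print(f"SECRET_KEY={key}")
   ```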
    
4. Create the environment

   The Dify API service uses Poetry to manage dependencies. Run `poetry shell` to activate the virtual environment.

  5. Install dependencies

    poetry env use 3.10
    poetry install
    

    In case of contributors missing to update dependencies for pyproject.toml, you can perform the following shell instead.

    poetry shell                                               # activate current environment
    poetry add $(cat requirements.txt)           # install dependencies of production and update pyproject.toml
    poetry add $(cat requirements-dev.txt) --group dev    # install dependencies of development and update pyproject.toml
    
6. Run migrations

   Before the first launch, migrate the database to the latest version.

   ```bash
   poetry run python -m flask db upgrade
   ```
    
7. Start the backend

   ```bash
   poetry run python -m flask run --host 0.0.0.0 --port=5001 --debug
   ```
    
8. Start the Dify web service.

9. Set up your application by visiting http://localhost:3000...

10. If you need to debug local async processing, start the worker service:

    ```bash
    poetry run python -m celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,generation,mail,ops_trace
    ```

    The started celery app handles the async tasks, e.g. dataset importing and document indexing.

## Testing

1. Install dependencies for both the backend and the test environment

   ```bash
   poetry install --with dev
   ```
    
2. Run the tests locally, with mocked system environment variables taken from the `tool.pytest_env` section of `pyproject.toml`:

   ```bash
   cd ../
   poetry run -C api bash dev/pytest/pytest_all_tests.sh
   ```
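The mocked variables in `[tool.pytest_env]` are seeded into `os.environ` before any test runs. A minimal standalone sketch of that idea is below; the variable name and value are hypothetical stand-ins, not Dify's actual settings:

```python
import os

# pytest-env style: seed defaults into the environment before tests run,
# without clobbering values the developer has already exported.
MOCKED_ENV = {"DIFY_DEMO_API_KEY": "test-placeholder"}  # hypothetical name/value
for name, value in MOCKED_ENV.items():
    os.environ.setdefault(name, value)

print(os.environ["DIFY_DEMO_API_KEY"])
```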