Setting up FastAPI and PostgreSQL with Docker

Introduction

I set up a FastAPI + PostgreSQL environment. I'm used to the Laravel and Rails frameworks, so getting started with FastAPI felt a bit cumbersome; as a memo to myself, I'm writing down the setup steps. Detailed explanations of each code block are omitted.

For those in a hurry, the finished result is here:
https://github.com/nonamenme/docker-fastapi-postgres
Just run docker-compose up -d on this project and a basic FastAPI environment will be up.

1. Preparation

Prepare your files in the following layout.

─── project
    ├── docker-compose.yml
    ├── Dockerfile
    ├── requirements.txt
    └── fastapi
        └── main.py

・Contents of each file

docker-compose.yml:

version: '3.7'

services:
  fastapi:
    build: .
    volumes:
      - ./fastapi:/app
    ports:
      - 8000:8000
    restart: always
    tty: true
    depends_on:
      - db

  db:
    image: postgres:15
    container_name: postgres-db
    volumes:
      - postgres_data:/var/lib/postgresql/data/
    environment:
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=password
    ports:
      - 5432:5432

volumes:
  postgres_data:

Change the container names and other settings as needed.

Dockerfile:

FROM python:3.9-alpine

ENV LANG C.UTF-8
ENV TZ Asia/Tokyo

WORKDIR /app

# pip installs
COPY ./requirements.txt requirements.txt

RUN apk add --no-cache postgresql-libs \
 && apk add --no-cache --virtual .build-deps gcc musl-dev postgresql-dev \
 && python3 -m pip install -r /app/requirements.txt --no-cache-dir \
 && apk --purge del .build-deps

COPY . /app

# Start FastAPI
CMD ["uvicorn", "main:app", "--reload", "--host", "0.0.0.0", "--port", "8000"]

Adding "--reload" makes changes to main.py take effect immediately.

fastapi/main.py:

import uvicorn
from fastapi import FastAPI

app = FastAPI()

@app.get("/")
async def root():
    return {"message": "Hello World"}

if __name__ == "__main__":
    uvicorn.run(app, host="0.0.0.0", port=8000)

requirements.txt:

fastapi
uvicorn

SQLAlchemy==1.3.22
SQLAlchemy-Utils==0.41.1
alembic==1.5.2
psycopg2==2.8.6
psycopg2-binary==2.9.3
pydantic[email]==1.6.1
python-jose[cryptography]==3.2.0
python-multipart==0.0.6
python-dotenv==1.0.0

2. Start the containers

$ docker-compose up -d

3. Confirm the containers are running

$ docker-compose ps
       Name                     Command               State           Ports         
------------------------------------------------------------------------------------
fast-api_fastapi_1   uvicorn main:app --reload  ...   Up      0.0.0.0:8000->8000/tcp
postgres-db          docker-entrypoint.sh postgres    Up      0.0.0.0:5432->5432/tcp

At this point the app is already up. Open http://localhost:8000 in a browser and

{"message":"Hello World"}

should be displayed.
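Since the handler is just an async function, it can also be exercised directly in Python without a running server. A minimal sketch mirroring the root() handler from main.py above:

```python
import asyncio

# Mirror of the root() handler from main.py, minus the FastAPI decorator
async def root():
    return {"message": "Hello World"}

# Drive the coroutine directly with asyncio; no server involved
result = asyncio.run(root())
print(result)  # {'message': 'Hello World'}
```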


4. Create database migrations with alembic

"This part is a bit tedious..."
・alembic is a database migration tool.

4-1. Enter the app container

$ docker-compose exec fastapi sh

4-2. Create a migration environment with the alembic init command.

/app # alembic init migration
  Creating directory /app/migration ...  done
  Creating directory /app/migration/versions ...  done
  Generating /app/migration/script.py.mako ...  done
  Generating /app/migration/README ...  done
  Generating /app/alembic.ini ...  done
  Generating /app/migration/env.py ...  done
  Please edit configuration/connection/logging settings in '/app/alembic.ini' before proceeding.

At this point, the file structure looks like this.

─── project
    ├── docker-compose.yml
    ├── Dockerfile
    ├── requirements.txt
    └── fastapi
        ├── main.py
+       ├── alembic.ini
+       └── migration
+           ├── versions
+           ├── env.py
+           ├── README
+           └── script.py.mako

4-3. Change the database connection settings

Create a .env file:

/app # touch .env
DATABASE_URL=postgresql://postgres:password@postgres-db:5432/postgres
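The connection string follows the usual URL scheme, so its parts can be picked apart with the standard library; this is handy for checking that the user, password, host (the db container name), and port match the docker-compose settings. A sketch, not part of the setup itself:

```python
from urllib.parse import urlsplit

# The same URL as in .env
url = urlsplit("postgresql://postgres:password@postgres-db:5432/postgres")

print(url.username, url.password)  # postgres password
print(url.hostname, url.port)      # postgres-db 5432
print(url.path.lstrip("/"))        # postgres (the database name)
```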

Create core/config.py.

/app # mkdir core
/app # touch core/config.py

Create a settings file that reads the environment variables.

import os
from functools import lru_cache
from pydantic import BaseSettings

PROJECT_ROOT = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))

class Environment(BaseSettings):
    """ 環境変数を読み込む
    """
    database_url: str

    class Config:
        env_file = os.path.join(PROJECT_ROOT, '.env')

@lru_cache
def get_env():
    """ @lru_cacheで.envの結果をキャッシュする
    """
    return Environment()
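Thanks to @lru_cache, the Environment object is built (and .env read) only once per process; later get_env() calls return the cached instance. A stdlib-only sketch of that behavior, with a stand-in class instead of pydantic's BaseSettings:

```python
from functools import lru_cache

calls = 0

class Environment:
    # Stand-in for the pydantic BaseSettings class above
    def __init__(self):
        global calls
        calls += 1  # count how many times settings are actually constructed

@lru_cache
def get_env():
    return Environment()

a, b = get_env(), get_env()
print(a is b, calls)  # True 1: both calls return the same cached instance
```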

Configure the database connection in migration/env.py:

+ import sys
+ # The core directory can't be found via a relative path, so make it importable
+ sys.path = ['', '..'] + sys.path[1:]

+ import os
+ from core.config import PROJECT_ROOT
+ from dotenv import load_dotenv

from logging.config import fileConfig

from sqlalchemy import engine_from_config
from sqlalchemy import pool

from alembic import context

# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config

# Interpret the config file for Python logging.
# This line sets up loggers basically.
fileConfig(config.config_file_name)

# add your model's MetaData object here
# for 'autogenerate' support
# from myapp import mymodel
# target_metadata = mymodel.Base.metadata
target_metadata = None

# other values from the config, defined by the needs of env.py,
# can be acquired:
# my_important_option = config.get_main_option("my_important_option")
# ... etc.
+ load_dotenv(dotenv_path=os.path.join(PROJECT_ROOT, '.env'))
+ config.set_main_option('sqlalchemy.url', os.getenv('DATABASE_URL'))

def run_migrations_offline():
...
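load_dotenv here simply reads KEY=VALUE pairs from the .env file into os.environ. The idea can be sketched with the stdlib alone (a hypothetical minimal parser, not the real python-dotenv library):

```python
import os
import tempfile

def load_env_file(path):
    """Minimal .env reader: one KEY=VALUE per line, '#' starts a comment."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            key, _, value = line.partition("=")
            # like load_dotenv's default, don't override existing variables
            os.environ.setdefault(key.strip(), value.strip())

# Demo with a temporary .env file
with tempfile.NamedTemporaryFile("w", suffix=".env", delete=False) as f:
    f.write("DATABASE_URL=postgresql://postgres:password@postgres-db:5432/postgres\n")

load_env_file(f.name)
print(os.environ["DATABASE_URL"])
```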

4-4. Change the file names of generated migration files (alembic.ini)

# A generic, single database configuration.

[alembic]
# path to migration scripts
script_location = migration

# template used to generate migration files
# file_template = %%(rev)s_%%(slug)s
+ file_template = %%(year)d%%(month).2d%%(day).2d%%(hour).2d%%(minute).2d_%%(slug)s

# timezone to use when rendering the date
# within the migration file as well as the filename.
# string value is passed to dateutil.tz.gettz()
# leave blank for localtime
# timezone =
+ timezone = Asia/Tokyo

# max length of characters to apply to the
# "slug" field
# truncate_slug_length = 40
...
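In alembic.ini, %% escapes a literal %, so the effective template is %(year)d%(month).2d..._%(slug)s, which alembic fills in with old-style % formatting. A quick check of the filename it produces:

```python
# Effective template after ini unescaping (%% -> %)
template = "%(year)d%(month).2d%(day).2d%(hour).2d%(minute).2d_%(slug)s"

# Values as alembic would supply them for 2023-05-12 18:47
name = template % {
    "year": 2023, "month": 5, "day": 12,
    "hour": 18, "minute": 47, "slug": "create_users_table",
}
print(name)  # 202305121847_create_users_table
```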

4-5. Create a migration file

/app # alembic revision -m "create users table"
  Generating /app/migration/versions/202305121847_create_users_table.py ...  done

A migration file is generated under the migration/versions/ directory.

5. Run the migration

5-1. Edit the migration file

"""create users table

Revision ID: xxxxxxxx
Revises: 
Create Date: YYYY-MM-dd hh:ss:mm.ssssss

"""
from alembic import op
import sqlalchemy as sa


# revision identifiers, used by Alembic.
revision = 'xxxxxxxx'
down_revision = None
branch_labels = None
depends_on = None


def upgrade():
-   pass
+   op.create_table(
+       'users',
+        sa.Column('id', sa.Integer, primary_key=True),
+        sa.Column('name', sa.String(50), nullable=False),
+        sa.Column('login_id', sa.String(50), nullable=False),
+        sa.Column('password', sa.Text(), nullable=False),
+   )

def downgrade():
-   pass
+   op.drop_table('users')
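The upgrade above issues a CREATE TABLE. Roughly the equivalent DDL, shown here against an in-memory SQLite database purely to illustrate (column types differ slightly from PostgreSQL):

```python
import sqlite3

# Approximate DDL equivalent of op.create_table('users', ...) above
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE users (
        id INTEGER PRIMARY KEY,
        name VARCHAR(50) NOT NULL,
        login_id VARCHAR(50) NOT NULL,
        password TEXT NOT NULL
    )
""")
conn.execute("INSERT INTO users (name, login_id, password) VALUES (?, ?, ?)",
             ("alice", "alice01", "hashed"))
row = conn.execute("SELECT id, name FROM users").fetchone()
print(row)  # (1, 'alice')
```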

5-2. Run the migration

/app # alembic upgrade head
INFO  [alembic.runtime.migration] Context impl PostgresqlImpl.
INFO  [alembic.runtime.migration] Will assume transactional DDL.
INFO  [alembic.runtime.migration] Running upgrade  -> xxxxxxxx, create users table

Running this applies the migration and creates the users table in the PostgreSQL database.

6. Generate migration files from models

6-1. Create models.py

/app # touch migration/models.py

from datetime import datetime

from sqlalchemy import create_engine, Column, String, Integer, Text, DateTime
from sqlalchemy.ext.declarative import declarative_base
from core.config import get_env

# Create the Engine
Engine = create_engine(
    get_env().database_url,
    encoding="utf-8",
    echo=False
)

BaseModel = declarative_base()

6-2. Create a User model

from datetime import datetime

from sqlalchemy import create_engine, Column, String, Integer, Text, DateTime
from sqlalchemy.ext.declarative import declarative_base
from core.config import get_env

# Create the Engine
Engine = create_engine(
    get_env().database_url,
    encoding="utf-8",
    echo=False
)

BaseModel = declarative_base()

+ class User(BaseModel):
+    __tablename__ = 'users'
+
+    id = Column(Integer, primary_key=True)
+    name = Column(String(50), nullable=False)
+    login_id = Column(String(50), unique=True, nullable=False)
+    password = Column(Text, nullable=False)
+    created_at = Column(DateTime, default=datetime.now, nullable=False) # added
+    updated_at = Column(DateTime, default=datetime.now, nullable=False) # added

・The differences from the applied migration are created_at, updated_at, and the unique constraint on login_id
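One detail worth noting: default=datetime.now passes the function itself, so SQLAlchemy evaluates it at insert time, whereas default=datetime.now() would freeze a single timestamp at import time. The distinction in plain Python, using a hypothetical resolve() helper that mimics how an ORM resolves a column default:

```python
from datetime import datetime

# A callable default is evaluated on each use...
callable_default = datetime.now
# ...while a called default is evaluated once, here and now
frozen_default = datetime.now()

def resolve(default):
    """Mimic how an ORM resolves a column default (hypothetical helper)."""
    return default() if callable(default) else default

first = resolve(callable_default)
second = resolve(callable_default)
print(first <= second)                            # evaluated fresh per call
print(resolve(frozen_default) == frozen_default)  # always the same value
```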

6-3. Register the created models with alembic (migration/env.py):

import sys
sys.path = ['', '..'] + sys.path[1:]

import os
from core.config import PROJECT_ROOT
from dotenv import load_dotenv

from logging.config import fileConfig

- from sqlalchemy import engine_from_config
- from sqlalchemy import pool

from alembic import context

+ from migration.models import BaseModel, Engine

# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config

# Interpret the config file for Python logging.
# This line sets up loggers basically.
fileConfig(config.config_file_name)

# add your model's MetaData object here
# for 'autogenerate' support
# from myapp import mymodel
# target_metadata = mymodel.Base.metadata
- target_metadata = None
+ # target_metadata = None
+ target_metadata = BaseModel.metadata


# other values from the config, defined by the needs of env.py,
# can be acquired:
# my_important_option = config.get_main_option("my_important_option")
# ... etc.
load_dotenv(dotenv_path=os.path.join(PROJECT_ROOT, '.env'))
config.set_main_option('sqlalchemy.url', os.getenv('DATABASE_URL'))

def run_migrations_offline():
    """Run migrations in 'offline' mode.

    This configures the context with just a URL
    and not an Engine, though an Engine is acceptable
    here as well.  By skipping the Engine creation
    we don't even need a DBAPI to be available.

    Calls to context.execute() here emit the given string to the
    script output.

    """
    url = config.get_main_option("sqlalchemy.url")
    context.configure(
        url=url,
        target_metadata=target_metadata,
        literal_binds=True,
        dialect_opts={"paramstyle": "named"},
    )

    with context.begin_transaction():
        context.run_migrations()


def run_migrations_online():
    """Run migrations in 'online' mode.

    In this scenario we need to create an Engine
    and associate a connection with the context.

    """
-   connectable = engine_from_config(
-       config.get_section(config.config_ini_section),
-       prefix="sqlalchemy.",
-       poolclass=pool.NullPool,
-   )

+   # connectable = engine_from_config(
+   #     config.get_section(config.config_ini_section),
+   #     prefix="sqlalchemy.",
+   #     poolclass=pool.NullPool,
+   # )
+   url = config.get_main_option("sqlalchemy.url")
+   connectable = Engine

    with connectable.connect() as connection:
        context.configure(
+           url=url,
            connection=connection,
            target_metadata=target_metadata
        )

        with context.begin_transaction():
            context.run_migrations()


if context.is_offline_mode():
    run_migrations_offline()
else:
    run_migrations_online()

6-4. Generate the migration file

/app # alembic revision --autogenerate -m "add columns"
INFO  [alembic.runtime.migration] Context impl PostgresqlImpl.
INFO  [alembic.runtime.migration] Will assume transactional DDL.
INFO  [alembic.ddl.postgresql] Detected sequence named 'users_id_seq' as owned by integer column 'users(id)', assuming SERIAL and omitting
INFO  [alembic.autogenerate.compare] Detected added column 'users.created_at'
INFO  [alembic.autogenerate.compare] Detected added column 'users.updated_at'
INFO  [alembic.autogenerate.compare] Detected added unique constraint 'None' on '['login_id']'
  Generating /app/migration/versions/YYYYMMddHHmm_add_columns.py ...  done

Adding the --autogenerate option generates a migration file containing the diff between models.py and the existing migrations.

"""add columns

Revision ID: xxxxxxx
Revises: yyyyyyyyy
Create Date: YYYY-MM-dd HH:mm:ss.ssssss+09:00

"""
from alembic import op
import sqlalchemy as sa


# revision identifiers, used by Alembic.
revision = 'xxxxxxx'
down_revision = 'yyyyyyyyy'
branch_labels = None
depends_on = None


def upgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.add_column('users', sa.Column('created_at', sa.DateTime(), nullable=False))
    op.add_column('users', sa.Column('updated_at', sa.DateTime(), nullable=False))
    op.create_unique_constraint(None, 'users', ['login_id'])
    # ### end Alembic commands ###


def downgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.drop_constraint(None, 'users', type_='unique')
    op.drop_column('users', 'updated_at')
    op.drop_column('users', 'created_at')
    # ### end Alembic commands ###

6-5. Run the migration

alembic upgrade head

7. Wrapping up

With this, the environment setup is complete.
Setting up an environment with FastAPI for the first time was confusing, but also fun.

Related links:

https://qiita.com/Butterthon/items/a55daa0e7f168fee7ef0
https://qiita.com/penpenta/items/c993243c4ceee3840f30
https://qiita.com/hkyo/items/65321d7015121ccf369f
