``` ├── .copier/ ├── .copier-answers.yml.jinja ├── update_dotenv.py ├── .env ├── .gitattributes ├── .github/ ├── DISCUSSION_TEMPLATE/ ├── questions.yml ├── FUNDING.yml ├── ISSUE_TEMPLATE/ ├── config.yml ├── privileged.yml ├── dependabot.yml ├── labeler.yml ├── workflows/ ├── add-to-project.yml ├── deploy-production.yml ├── deploy-staging.yml ├── generate-client.yml ├── issue-manager.yml ├── labeler.yml ├── latest-changes.yml ├── lint-backend.yml ├── playwright.yml ├── smokeshow.yml ├── test-backend.yml ├── test-docker-compose.yml ├── .gitignore ├── .pre-commit-config.yaml ├── .vscode/ ├── launch.json ├── LICENSE ├── README.md ├── SECURITY.md ├── backend/ ├── .dockerignore ├── .gitignore ├── Dockerfile ├── README.md ├── alembic.ini ├── app/ ├── __init__.py ├── alembic/ ├── README ├── env.py ├── script.py.mako ├── versions/ ├── .keep ├── 1a31ce608336_add_cascade_delete_relationships.py ├── 9c0a54914c78_add_max_length_for_string_varchar_.py ├── d98dd8ec85a3_edit_replace_id_integers_in_all_models_.py ├── e2412789c190_initialize_models.py ├── api/ ├── __init__.py ├── deps.py ├── main.py ├── routes/ ├── __init__.py ├── items.py ├── login.py ├── private.py ├── users.py ├── utils.py ├── backend_pre_start.py ├── core/ ├── __init__.py ├── config.py ├── db.py ├── security.py ├── crud.py ├── email-templates/ ├── build/ ├── new_account.html ├── reset_password.html ├── test_email.html ├── src/ ├── new_account.mjml ├── reset_password.mjml ├── test_email.mjml ├── initial_data.py ├── main.py ├── models.py ├── tests/ ├── __init__.py ├── api/ ├── __init__.py ├── routes/ ├── __init__.py ├── test_items.py ├── test_login.py ├── test_private.py ├── test_users.py ├── conftest.py ├── crud/ ├── __init__.py ├── test_user.py ├── scripts/ ├── __init__.py ├── test_backend_pre_start.py ├── test_test_pre_start.py ├── utils/ ├── __init__.py ├── item.py ├── user.py ├── utils.py ├── tests_pre_start.py ├── utils.py ├── pyproject.toml ├── scripts/ ├── format.sh ├── lint.sh ├── prestart.sh ├── test.sh ├── tests-start.sh ├── uv.lock ``` ## /.copier/.copier-answers.yml.jinja ```jinja path="/.copier/.copier-answers.yml.jinja" {{ _copier_answers|to_json -}} ``` ## /.copier/update_dotenv.py ```py path="/.copier/update_dotenv.py" from pathlib import Path import json # Update the .env file with the answers from the .copier-answers.yml file # without using Jinja2 templates in the .env file, this way the code works as is # without needing Copier, but if Copier is used, the .env file will be updated root_path = Path(__file__).parent.parent answers_path = Path(__file__).parent / ".copier-answers.yml" answers = json.loads(answers_path.read_text()) env_path = root_path / ".env" env_content = env_path.read_text() lines = [] for line in env_content.splitlines(): for key, value in answers.items(): upper_key = key.upper() if line.startswith(f"{upper_key}="): if " " in value: content = f"{upper_key}={value!r}" else: content = f"{upper_key}={value}" new_line = line.replace(line, content) lines.append(new_line) break else: lines.append(line) env_path.write_text("\n".join(lines)) ``` ## /.env ```env path="/.env" # Domain # This would be set to the production domain with an env var on deployment # used by Traefik to transmit traffic and aqcuire TLS certificates DOMAIN=localhost # To test the local Traefik config # DOMAIN=localhost.tiangolo.com # Used by the backend to generate links in emails to the frontend FRONTEND_HOST=http://localhost:5173 # In staging and production, set this env var to the frontend host, e.g. 
# FRONTEND_HOST=https://dashboard.example.com # Environment: local, staging, production ENVIRONMENT=local PROJECT_NAME="Full Stack FastAPI Project" STACK_NAME=full-stack-fastapi-project # Backend BACKEND_CORS_ORIGINS="http://localhost,http://localhost:5173,https://localhost,https://localhost:5173,http://localhost.tiangolo.com" SECRET_KEY=changethis FIRST_SUPERUSER=admin@example.com FIRST_SUPERUSER_PASSWORD=changethis # Emails SMTP_HOST= SMTP_USER= SMTP_PASSWORD= EMAILS_FROM_EMAIL=info@example.com SMTP_TLS=True SMTP_SSL=False SMTP_PORT=587 # Postgres POSTGRES_SERVER=localhost POSTGRES_PORT=5432 POSTGRES_DB=app POSTGRES_USER=postgres POSTGRES_PASSWORD=changethis SENTRY_DSN= # Configure these with your own Docker registry images DOCKER_IMAGE_BACKEND=backend DOCKER_IMAGE_FRONTEND=frontend ``` ## /.gitattributes ```gitattributes path="/.gitattributes" * text=auto *.sh text eol=lf ``` ## /.github/DISCUSSION_TEMPLATE/questions.yml ```yml path="/.github/DISCUSSION_TEMPLATE/questions.yml" labels: [question] body: - type: markdown attributes: value: | Thanks for your interest in this project! 🚀 Please follow these instructions, fill every question, and do every step. 🙏 I'm asking this because answering questions and solving problems in GitHub is what consumes most of the time. I end up not being able to add new features, fix bugs, review pull requests, etc. as fast as I wish because I have to spend too much time handling questions. All that, on top of all the incredible help provided by a bunch of community members, that give a lot of their time to come here and help others. That's a lot of work, but if more users came to help others like them just a little bit more, it would be much less effort for them (and you and me 😅). By asking questions in a structured way (following this) it will be much easier to help you. And there's a high chance that you will find the solution along the way and you won't even have to submit it and wait for an answer. 😎 As there are too many questions, I'll have to discard and close the incomplete ones. That will allow me (and others) to focus on helping people like you that follow the whole process and help us help you. 🤓 - type: checkboxes id: checks attributes: label: First Check description: Please confirm and check all the following options. options: - label: I added a very descriptive title here. required: true - label: I used the GitHub search to find a similar question and didn't find it. required: true - label: I searched in the documentation/README. required: true - label: I already searched in Google "How to do X" and didn't find any information. required: true - label: I already read and followed all the tutorial in the docs/README and didn't find an answer. required: true - type: checkboxes id: help attributes: label: Commit to Help description: | After submitting this, I commit to one of: * Read open questions until I find 2 where I can help someone and add a comment to help there. * I already hit the "watch" button in this repository to receive notifications and I commit to help at least 2 people that ask questions in the future. options: - label: I commit to help with one of those options 👆 required: true - type: textarea id: example attributes: label: Example Code description: | Please add a self-contained, [minimal, reproducible, example](https://stackoverflow.com/help/minimal-reproducible-example) with your use case. If I (or someone) can copy it, run it, and see it right away, there's a much higher chance I (or someone) will be able to help you. 
placeholder: | Write your example code here. render: Text validations: required: true - type: textarea id: description attributes: label: Description description: | What is the problem, question, or error? Write a short description telling me what you are doing, what you expect to happen, and what is currently happening. placeholder: | * Open the browser and call the endpoint `/`. * It returns a JSON with `{"message": "Hello World"}`. * But I expected it to return `{"message": "Hello Morty"}`. validations: required: true - type: dropdown id: os attributes: label: Operating System description: What operating system are you on? multiple: true options: - Linux - Windows - macOS - Other validations: required: true - type: textarea id: os-details attributes: label: Operating System Details description: You can add more details about your operating system here, in particular if you chose "Other". validations: required: true - type: input id: python-version attributes: label: Python Version description: | What Python version are you using? You can find the Python version with: \`\`\`bash python --version \`\`\` validations: required: true - type: textarea id: context attributes: label: Additional Context description: Add any additional context information or screenshots you think are useful. ``` ## /.github/FUNDING.yml ```yml path="/.github/FUNDING.yml" github: [tiangolo] ``` ## /.github/ISSUE_TEMPLATE/config.yml ```yml path="/.github/ISSUE_TEMPLATE/config.yml" blank_issues_enabled: false contact_links: - name: Security Contact about: Please report security vulnerabilities to security@tiangolo.com - name: Question or Problem about: Ask a question or ask about a problem in GitHub Discussions. url: https://github.com/fastapi/full-stack-fastapi-template/discussions/categories/questions - name: Feature Request about: To suggest an idea or ask about a feature, please start with a question saying what you would like to achieve. There might be a way to do it already. url: https://github.com/fastapi/full-stack-fastapi-template/discussions/categories/questions ``` ## /.github/ISSUE_TEMPLATE/privileged.yml ```yml path="/.github/ISSUE_TEMPLATE/privileged.yml" name: Privileged description: You are @tiangolo or he asked you directly to create an issue here. If not, check the other options. 👇 body: - type: markdown attributes: value: | Thanks for your interest in this project! 🚀 If you are not @tiangolo or he didn't ask you directly to create an issue here, please start the conversation in a [Question in GitHub Discussions](https://github.com/tiangolo/full-stack-fastapi-template/discussions/categories/questions) instead. - type: checkboxes id: privileged attributes: label: Privileged issue description: Confirm that you are allowed to create an issue here. options: - label: I'm @tiangolo or he asked me directly to create an issue here. required: true - type: textarea id: content attributes: label: Issue Content description: Add the content of the issue here. 
``` ## /.github/dependabot.yml ```yml path="/.github/dependabot.yml" version: 2 updates: # GitHub Actions - package-ecosystem: github-actions directory: / schedule: interval: daily commit-message: prefix: ⬆ # Python - package-ecosystem: pip directory: / schedule: interval: daily commit-message: prefix: ⬆ # npm - package-ecosystem: npm directory: / schedule: interval: daily commit-message: prefix: ⬆ # Docker - package-ecosystem: docker directory: / schedule: interval: weekly commit-message: prefix: ⬆ ``` ## /.github/labeler.yml ```yml path="/.github/labeler.yml" docs: - all: - changed-files: - any-glob-to-any-file: - '**/*.md' - all-globs-to-all-files: - '!frontend/**' - '!backend/**' - '!.github/**' - '!scripts/**' - '!.gitignore' - '!.pre-commit-config.yaml' internal: - all: - changed-files: - any-glob-to-any-file: - .github/** - scripts/** - .gitignore - .pre-commit-config.yaml - all-globs-to-all-files: - '!./**/*.md' - '!frontend/**' - '!backend/**' ``` ## /.github/workflows/add-to-project.yml ```yml path="/.github/workflows/add-to-project.yml" name: Add to Project on: pull_request_target: issues: types: - opened - reopened jobs: add-to-project: name: Add to project runs-on: ubuntu-latest steps: - uses: actions/add-to-project@v1.0.2 with: project-url: https://github.com/orgs/fastapi/projects/2 github-token: ${{ secrets.PROJECTS_TOKEN }} ``` ## /.github/workflows/deploy-production.yml ```yml path="/.github/workflows/deploy-production.yml" name: Deploy to Production on: release: types: - published jobs: deploy: # Do not deploy in the main repository, only in user projects if: github.repository_owner != 'fastapi' runs-on: - self-hosted - production env: ENVIRONMENT: production DOMAIN: ${{ secrets.DOMAIN_PRODUCTION }} STACK_NAME: ${{ secrets.STACK_NAME_PRODUCTION }} SECRET_KEY: ${{ secrets.SECRET_KEY }} FIRST_SUPERUSER: ${{ secrets.FIRST_SUPERUSER }} FIRST_SUPERUSER_PASSWORD: ${{ secrets.FIRST_SUPERUSER_PASSWORD }} SMTP_HOST: ${{ secrets.SMTP_HOST }} SMTP_USER: ${{ secrets.SMTP_USER }} SMTP_PASSWORD: ${{ secrets.SMTP_PASSWORD }} EMAILS_FROM_EMAIL: ${{ secrets.EMAILS_FROM_EMAIL }} POSTGRES_PASSWORD: ${{ secrets.POSTGRES_PASSWORD }} SENTRY_DSN: ${{ secrets.SENTRY_DSN }} steps: - name: Checkout uses: actions/checkout@v4 - run: docker compose -f docker-compose.yml --project-name ${{ secrets.STACK_NAME_PRODUCTION }} build - run: docker compose -f docker-compose.yml --project-name ${{ secrets.STACK_NAME_PRODUCTION }} up -d ``` ## /.github/workflows/deploy-staging.yml ```yml path="/.github/workflows/deploy-staging.yml" name: Deploy to Staging on: push: branches: - master jobs: deploy: # Do not deploy in the main repository, only in user projects if: github.repository_owner != 'fastapi' runs-on: - self-hosted - staging env: ENVIRONMENT: staging DOMAIN: ${{ secrets.DOMAIN_STAGING }} STACK_NAME: ${{ secrets.STACK_NAME_STAGING }} SECRET_KEY: ${{ secrets.SECRET_KEY }} FIRST_SUPERUSER: ${{ secrets.FIRST_SUPERUSER }} FIRST_SUPERUSER_PASSWORD: ${{ secrets.FIRST_SUPERUSER_PASSWORD }} SMTP_HOST: ${{ secrets.SMTP_HOST }} SMTP_USER: ${{ secrets.SMTP_USER }} SMTP_PASSWORD: ${{ secrets.SMTP_PASSWORD }} EMAILS_FROM_EMAIL: ${{ secrets.EMAILS_FROM_EMAIL }} POSTGRES_PASSWORD: ${{ secrets.POSTGRES_PASSWORD }} SENTRY_DSN: ${{ secrets.SENTRY_DSN }} steps: - name: Checkout uses: actions/checkout@v4 - run: docker compose -f docker-compose.yml --project-name ${{ secrets.STACK_NAME_STAGING }} build - run: docker compose -f docker-compose.yml --project-name ${{ secrets.STACK_NAME_STAGING }} up -d ``` ## 
/.github/workflows/generate-client.yml ```yml path="/.github/workflows/generate-client.yml" name: Generate Client on: pull_request: types: - opened - synchronize jobs: generate-client: permissions: contents: write runs-on: ubuntu-latest steps: # For PRs from forks - uses: actions/checkout@v4 # For PRs from the same repo - uses: actions/checkout@v4 if: ( github.event_name != 'pull_request' || github.secret_source == 'Actions' ) with: ref: ${{ github.head_ref }} token: ${{ secrets.FULL_STACK_FASTAPI_TEMPLATE_REPO_TOKEN }} - uses: actions/setup-node@v4 with: node-version: lts/* - uses: actions/setup-python@v5 with: python-version: "3.10" - name: Install uv uses: astral-sh/setup-uv@v6 with: version: "0.4.15" enable-cache: true - name: Install dependencies run: npm ci working-directory: frontend - run: uv sync working-directory: backend - run: uv run bash scripts/generate-client.sh env: VIRTUAL_ENV: backend/.venv SECRET_KEY: just-for-generating-client POSTGRES_PASSWORD: just-for-generating-client FIRST_SUPERUSER_PASSWORD: just-for-generating-client - name: Add changes to git run: | git config --local user.email "github-actions@github.com" git config --local user.name "github-actions" git add frontend/src/client # Same repo PRs - name: Push changes if: ( github.event_name != 'pull_request' || github.secret_source == 'Actions' ) run: | git diff --staged --quiet || git commit -m "✨ Autogenerate frontend client" git push # Fork PRs - name: Check changes if: ( github.event_name == 'pull_request' && github.secret_source != 'Actions' ) run: | git diff --staged --quiet || (echo "Changes detected in generated client, run scripts/generate-client.sh and commit the changes" && exit 1) ``` ## /.github/workflows/issue-manager.yml ```yml path="/.github/workflows/issue-manager.yml" name: Issue Manager on: schedule: - cron: "21 17 * * *" issue_comment: types: - created issues: types: - labeled pull_request_target: types: - labeled workflow_dispatch: permissions: issues: write pull-requests: write jobs: issue-manager: if: github.repository_owner == 'fastapi' runs-on: ubuntu-latest steps: - name: Dump GitHub context env: GITHUB_CONTEXT: ${{ toJson(github) }} run: echo "$GITHUB_CONTEXT" - uses: tiangolo/issue-manager@0.5.1 with: token: ${{ secrets.GITHUB_TOKEN }} config: > { "answered": { "delay": 864000, "message": "Assuming the original need was handled, this will be automatically closed now. But feel free to add more comments or create new issues or PRs." }, "waiting": { "delay": 2628000, "message": "As this PR has been waiting for the original user for a while but seems to be inactive, it's now going to be closed. But if there's anyone interested, feel free to create a new PR." }, "invalid": { "delay": 0, "message": "This was marked as invalid and will be closed now. If this is an error, please provide additional details." 
} } ``` ## /.github/workflows/labeler.yml ```yml path="/.github/workflows/labeler.yml" name: Labels on: pull_request_target: types: - opened - synchronize - reopened # For label-checker - labeled - unlabeled jobs: labeler: permissions: contents: read pull-requests: write runs-on: ubuntu-latest steps: - uses: actions/labeler@v5 if: ${{ github.event.action != 'labeled' && github.event.action != 'unlabeled' }} - run: echo "Done adding labels" # Run this after labeler applied labels check-labels: needs: - labeler permissions: pull-requests: read runs-on: ubuntu-latest steps: - uses: docker://agilepathway/pull-request-label-checker:latest with: one_of: breaking,security,feature,bug,refactor,upgrade,docs,lang-all,internal repo_token: ${{ secrets.GITHUB_TOKEN }} ``` ## /.github/workflows/latest-changes.yml ```yml path="/.github/workflows/latest-changes.yml" name: Latest Changes on: pull_request_target: branches: - master types: - closed workflow_dispatch: inputs: number: description: PR number required: true debug_enabled: description: "Run the build with tmate debugging enabled (https://github.com/marketplace/actions/debugging-with-tmate)" required: false default: "false" jobs: latest-changes: runs-on: ubuntu-latest permissions: pull-requests: read steps: - name: Dump GitHub context env: GITHUB_CONTEXT: ${{ toJson(github) }} run: echo "$GITHUB_CONTEXT" - uses: actions/checkout@v4 with: # To allow latest-changes to commit to the main branch token: ${{ secrets.LATEST_CHANGES }} - uses: tiangolo/latest-changes@0.3.2 with: token: ${{ secrets.GITHUB_TOKEN }} latest_changes_file: ./release-notes.md latest_changes_header: "## Latest Changes" end_regex: "^## " debug_logs: true label_header_prefix: "### " ``` ## /.github/workflows/lint-backend.yml ```yml path="/.github/workflows/lint-backend.yml" name: Lint Backend on: push: branches: - master pull_request: types: - opened - synchronize jobs: lint-backend: runs-on: ubuntu-latest steps: - name: Checkout uses: actions/checkout@v4 - name: Set up Python uses: actions/setup-python@v5 with: python-version: "3.10" - name: Install uv uses: astral-sh/setup-uv@v6 with: version: "0.4.15" enable-cache: true - run: uv run bash scripts/lint.sh working-directory: backend ``` ## /.github/workflows/playwright.yml ```yml path="/.github/workflows/playwright.yml" name: Playwright Tests on: push: branches: - master pull_request: types: - opened - synchronize workflow_dispatch: inputs: debug_enabled: description: 'Run the build with tmate debugging enabled (https://github.com/marketplace/actions/debugging-with-tmate)' required: false default: 'false' jobs: changes: runs-on: ubuntu-latest # Set job outputs to values from filter step outputs: changed: ${{ steps.filter.outputs.changed }} steps: - uses: actions/checkout@v4 # For pull requests it's not necessary to checkout the code but for the main branch it is - uses: dorny/paths-filter@v3 id: filter with: filters: | changed: - backend/** - frontend/** - .env - docker-compose*.yml - .github/workflows/playwright.yml test-playwright: needs: - changes if: ${{ needs.changes.outputs.changed == 'true' }} timeout-minutes: 60 runs-on: ubuntu-latest strategy: matrix: shardIndex: [1, 2, 3, 4] shardTotal: [4] fail-fast: false steps: - uses: actions/checkout@v4 - uses: actions/setup-node@v4 with: node-version: lts/* - uses: actions/setup-python@v5 with: python-version: '3.10' - name: Setup tmate session uses: mxschmitt/action-tmate@v3 if: ${{ github.event_name == 'workflow_dispatch' && github.event.inputs.debug_enabled == 'true' }} with: 
limit-access-to-actor: true - name: Install uv uses: astral-sh/setup-uv@v6 with: version: "0.4.15" enable-cache: true - run: uv sync working-directory: backend - run: npm ci working-directory: frontend - run: uv run bash scripts/generate-client.sh env: VIRTUAL_ENV: backend/.venv - run: docker compose build - run: docker compose down -v --remove-orphans - name: Run Playwright tests run: docker compose run --rm playwright npx playwright test --fail-on-flaky-tests --trace=retain-on-failure --shard=${{ matrix.shardIndex }}/${{ matrix.shardTotal }} - run: docker compose down -v --remove-orphans - name: Upload blob report to GitHub Actions Artifacts if: ${{ !cancelled() }} uses: actions/upload-artifact@v4 with: name: blob-report-${{ matrix.shardIndex }} path: frontend/blob-report include-hidden-files: true retention-days: 1 merge-playwright-reports: needs: - test-playwright - changes # Merge reports after playwright-tests, even if some shards have failed if: ${{ !cancelled() && needs.changes.outputs.changed == 'true' }} runs-on: ubuntu-latest steps: - uses: actions/checkout@v4 - uses: actions/setup-node@v4 with: node-version: 20 - name: Install dependencies run: npm ci working-directory: frontend - name: Download blob reports from GitHub Actions Artifacts uses: actions/download-artifact@v4 with: path: frontend/all-blob-reports pattern: blob-report-* merge-multiple: true - name: Merge into HTML Report run: npx playwright merge-reports --reporter html ./all-blob-reports working-directory: frontend - name: Upload HTML report uses: actions/upload-artifact@v4 with: name: html-report--attempt-${{ github.run_attempt }} path: frontend/playwright-report retention-days: 30 include-hidden-files: true # https://github.com/marketplace/actions/alls-green#why alls-green-playwright: # This job does nothing and is only used for the branch protection if: always() needs: - test-playwright runs-on: ubuntu-latest steps: - name: Decide whether the needed jobs succeeded or failed uses: re-actors/alls-green@release/v1 with: jobs: ${{ toJSON(needs) }} allowed-skips: test-playwright ``` ## /.github/workflows/smokeshow.yml ```yml path="/.github/workflows/smokeshow.yml" name: Smokeshow on: workflow_run: workflows: [Test Backend] types: [completed] jobs: smokeshow: if: ${{ github.event.workflow_run.conclusion == 'success' }} runs-on: ubuntu-latest permissions: actions: read statuses: write steps: - uses: actions/checkout@v4 - uses: actions/setup-python@v5 with: python-version: "3.10" - run: pip install smokeshow - uses: actions/download-artifact@v4 with: name: coverage-html path: backend/htmlcov github-token: ${{ secrets.GITHUB_TOKEN }} run-id: ${{ github.event.workflow_run.id }} - run: smokeshow upload backend/htmlcov env: SMOKESHOW_GITHUB_STATUS_DESCRIPTION: Coverage {coverage-percentage} SMOKESHOW_GITHUB_COVERAGE_THRESHOLD: 90 SMOKESHOW_GITHUB_CONTEXT: coverage SMOKESHOW_GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} SMOKESHOW_GITHUB_PR_HEAD_SHA: ${{ github.event.workflow_run.head_sha }} SMOKESHOW_AUTH_KEY: ${{ secrets.SMOKESHOW_AUTH_KEY }} ``` ## /.github/workflows/test-backend.yml ```yml path="/.github/workflows/test-backend.yml" name: Test Backend on: push: branches: - master pull_request: types: - opened - synchronize jobs: test-backend: runs-on: ubuntu-latest steps: - name: Checkout uses: actions/checkout@v4 - name: Set up Python uses: actions/setup-python@v5 with: python-version: "3.10" - name: Install uv uses: astral-sh/setup-uv@v6 with: version: "0.4.15" enable-cache: true - run: docker compose down -v 
--remove-orphans - run: docker compose up -d db mailcatcher - name: Migrate DB run: uv run bash scripts/prestart.sh working-directory: backend - name: Run tests run: uv run bash scripts/tests-start.sh "Coverage for ${{ github.sha }}" working-directory: backend - run: docker compose down -v --remove-orphans - name: Store coverage files uses: actions/upload-artifact@v4 with: name: coverage-html path: backend/htmlcov include-hidden-files: true ``` ## /.github/workflows/test-docker-compose.yml ```yml path="/.github/workflows/test-docker-compose.yml" name: Test Docker Compose on: push: branches: - master pull_request: types: - opened - synchronize jobs: test-docker-compose: runs-on: ubuntu-latest steps: - name: Checkout uses: actions/checkout@v4 - run: docker compose build - run: docker compose down -v --remove-orphans - run: docker compose up -d --wait backend frontend adminer - name: Test backend is up run: curl http://localhost:8000/api/v1/utils/health-check - name: Test frontend is up run: curl http://localhost:5173 - run: docker compose down -v --remove-orphans ``` ## /.gitignore ```gitignore path="/.gitignore" .vscode node_modules/ /test-results/ /playwright-report/ /blob-report/ /playwright/.cache/ ``` ## /.pre-commit-config.yaml ```yaml path="/.pre-commit-config.yaml" # See https://pre-commit.com for more information # See https://pre-commit.com/hooks.html for more hooks repos: - repo: https://github.com/pre-commit/pre-commit-hooks rev: v4.4.0 hooks: - id: check-added-large-files - id: check-toml - id: check-yaml args: - --unsafe - id: end-of-file-fixer exclude: | (?x)^( frontend/src/client/.*| backend/app/email-templates/build/.* )$ - id: trailing-whitespace exclude: ^frontend/src/client/.* - repo: https://github.com/charliermarsh/ruff-pre-commit rev: v0.2.2 hooks: - id: ruff args: - --fix - id: ruff-format ci: autofix_commit_msg: 🎨 [pre-commit.ci] Auto format from pre-commit.com hooks autoupdate_commit_msg: ⬆ [pre-commit.ci] pre-commit autoupdate ``` ## /.vscode/launch.json ```json path="/.vscode/launch.json" { // Use IntelliSense to learn about possible attributes. // Hover to view descriptions of existing attributes. // For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387 "version": "0.2.0", "configurations": [ { "name": "Debug FastAPI Project backend: Python Debugger", "type": "debugpy", "request": "launch", "module": "uvicorn", "args": [ "app.main:app", "--reload" ], "cwd": "${workspaceFolder}/backend", "jinja": true, "envFile": "${workspaceFolder}/.env", }, { "type": "chrome", "request": "launch", "name": "Debug Frontend: Launch Chrome against http://localhost:5173", "url": "http://localhost:5173", "webRoot": "${workspaceFolder}/frontend" }, ] } ``` ## /LICENSE ``` path="/LICENSE" MIT License Copyright (c) 2019 Sebastián Ramírez Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. 
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ``` ## /README.md # Full Stack FastAPI Template Test Coverage ## Technology Stack and Features - ⚡ [**FastAPI**](https://fastapi.tiangolo.com) for the Python backend API. - 🧰 [SQLModel](https://sqlmodel.tiangolo.com) for the Python SQL database interactions (ORM). - 🔍 [Pydantic](https://docs.pydantic.dev), used by FastAPI, for the data validation and settings management. - 💾 [PostgreSQL](https://www.postgresql.org) as the SQL database. - 🚀 [React](https://react.dev) for the frontend. - 💃 Using TypeScript, hooks, Vite, and other parts of a modern frontend stack. - 🎨 [Chakra UI](https://chakra-ui.com) for the frontend components. - 🤖 An automatically generated frontend client. - 🧪 [Playwright](https://playwright.dev) for End-to-End testing. - 🦇 Dark mode support. - 🐋 [Docker Compose](https://www.docker.com) for development and production. - 🔒 Secure password hashing by default. - 🔑 JWT (JSON Web Token) authentication. - 📫 Email based password recovery. - ✅ Tests with [Pytest](https://pytest.org). - 📞 [Traefik](https://traefik.io) as a reverse proxy / load balancer. - 🚢 Deployment instructions using Docker Compose, including how to set up a frontend Traefik proxy to handle automatic HTTPS certificates. - 🏭 CI (continuous integration) and CD (continuous deployment) based on GitHub Actions. ### Dashboard Login [![API docs](img/login.png)](https://github.com/fastapi/full-stack-fastapi-template) ### Dashboard - Admin [![API docs](img/dashboard.png)](https://github.com/fastapi/full-stack-fastapi-template) ### Dashboard - Create User [![API docs](img/dashboard-create.png)](https://github.com/fastapi/full-stack-fastapi-template) ### Dashboard - Items [![API docs](img/dashboard-items.png)](https://github.com/fastapi/full-stack-fastapi-template) ### Dashboard - User Settings [![API docs](img/dashboard-user-settings.png)](https://github.com/fastapi/full-stack-fastapi-template) ### Dashboard - Dark Mode [![API docs](img/dashboard-dark.png)](https://github.com/fastapi/full-stack-fastapi-template) ### Interactive API Documentation [![API docs](img/docs.png)](https://github.com/fastapi/full-stack-fastapi-template) ## How To Use It You can **just fork or clone** this repository and use it as is. ✨ It just works. ✨ ### How to Use a Private Repository If you want to have a private repository, GitHub won't allow you to simply fork it as it doesn't allow changing the visibility of forks. But you can do the following: - Create a new GitHub repo, for example `my-full-stack`. 
- Clone this repository manually, setting the directory name to the project name you want to use, for example `my-full-stack`:

```bash
git clone git@github.com:fastapi/full-stack-fastapi-template.git my-full-stack
```

- Enter the new directory:

```bash
cd my-full-stack
```

- Set the new origin to your new repository (copy the URL from the GitHub interface), for example:

```bash
git remote set-url origin git@github.com:octocat/my-full-stack.git
```

- Add this repo as another "remote" to allow you to get updates later:

```bash
git remote add upstream git@github.com:fastapi/full-stack-fastapi-template.git
```

- Push the code to your new repository:

```bash
git push -u origin master
```

### Update From the Original Template

After cloning the repository and making changes, you might want to pull the latest changes from this original template.

- Make sure you added the original repository as a remote; you can check it with:

```bash
git remote -v

origin    git@github.com:octocat/my-full-stack.git (fetch)
origin    git@github.com:octocat/my-full-stack.git (push)
upstream    git@github.com:fastapi/full-stack-fastapi-template.git (fetch)
upstream    git@github.com:fastapi/full-stack-fastapi-template.git (push)
```

- Pull the latest changes without merging:

```bash
git pull --no-commit upstream master
```

This will download the latest changes from this template without committing them, so you can check that everything is right before committing.

- If there are conflicts, solve them in your editor.

- Once you are done, commit the changes:

```bash
git merge --continue
```

### Configure

You can then update configs in the `.env` files to customize your configurations.

Before deploying it, make sure you change at least the values for:

- `SECRET_KEY`
- `FIRST_SUPERUSER_PASSWORD`
- `POSTGRES_PASSWORD`

You can (and should) pass these as environment variables from secrets.

Read the [deployment.md](./deployment.md) docs for more details.

### Generate Secret Keys

Some environment variables in the `.env` file have a default value of `changethis`.

You have to replace them with a secret key; to generate secret keys you can run the following command:

```bash
python -c "import secrets; print(secrets.token_urlsafe(32))"
```

Copy the output and use that as the password / secret key. Run the command again to generate another secure key.

## How To Use It - Alternative With Copier

This repository also supports generating a new project using [Copier](https://copier.readthedocs.io).

It will copy all the files, ask you configuration questions, and update the `.env` files with your answers.

### Install Copier

You can install Copier with:

```bash
pip install copier
```

Or better, if you have [`pipx`](https://pipx.pypa.io/), you can run it with:

```bash
pipx install copier
```

**Note**: If you have `pipx`, installing Copier is optional; you could run it directly.

### Generate a Project With Copier

Decide a name for your new project's directory; you will use it below. For example, `my-awesome-project`.
Go to the directory that will be the parent of your project, and run the command with your project's name:

```bash
copier copy https://github.com/fastapi/full-stack-fastapi-template my-awesome-project --trust
```

If you have `pipx` and you didn't install `copier`, you can run it directly:

```bash
pipx run copier copy https://github.com/fastapi/full-stack-fastapi-template my-awesome-project --trust
```

**Note** the `--trust` option is necessary to be able to execute a [post-creation script](https://github.com/fastapi/full-stack-fastapi-template/blob/master/.copier/update_dotenv.py) that updates your `.env` files.

### Input Variables

Copier will ask you for some data, which you might want to have at hand before generating the project.

But don't worry, you can just update any of that in the `.env` files afterwards.

The input variables, with their default values (some auto-generated), are:

- `project_name`: (default: `"FastAPI Project"`) The name of the project, shown to API users (in .env).
- `stack_name`: (default: `"fastapi-project"`) The name of the stack used for Docker Compose labels and project name (no spaces, no periods) (in .env).
- `secret_key`: (default: `"changethis"`) The secret key for the project, used for security, stored in .env, you can generate one with the method above.
- `first_superuser`: (default: `"admin@example.com"`) The email of the first superuser (in .env).
- `first_superuser_password`: (default: `"changethis"`) The password of the first superuser (in .env).
- `smtp_host`: (default: "") The SMTP server host to send emails, you can set it later in .env.
- `smtp_user`: (default: "") The SMTP server user to send emails, you can set it later in .env.
- `smtp_password`: (default: "") The SMTP server password to send emails, you can set it later in .env.
- `emails_from_email`: (default: `"info@example.com"`) The email account to send emails from, you can set it later in .env.
- `postgres_password`: (default: `"changethis"`) The password for the PostgreSQL database, stored in .env, you can generate one with the method above.
- `sentry_dsn`: (default: "") The DSN for Sentry, if you are using it, you can set it later in .env.

## Backend Development

Backend docs: [backend/README.md](./backend/README.md).

## Frontend Development

Frontend docs: [frontend/README.md](./frontend/README.md).

## Deployment

Deployment docs: [deployment.md](./deployment.md).

## Development

General development docs: [development.md](./development.md).

This includes using Docker Compose, custom local domains, `.env` configurations, etc.

## Release Notes

Check the file [release-notes.md](./release-notes.md).

## License

The Full Stack FastAPI Template is licensed under the terms of the MIT license.

## /SECURITY.md

# Security Policy

Security is very important for this project and its community. 🔒

Learn more about it below. 👇

## Versions

The latest version or release is supported.

You are encouraged to write tests for your application and update your versions frequently after ensuring that your tests are passing. This way you will benefit from the latest features, bug fixes, and **security fixes**.

## Reporting a Vulnerability

If you think you have found a vulnerability, and even if you are not sure about it, please report it right away by sending an email to: security@tiangolo.com. Please try to be as explicit as possible, describing all the steps and example code to reproduce the security issue.

I (the author, [@tiangolo](https://twitter.com/tiangolo)) will review it thoroughly and get back to you.
## Public Discussions

Please refrain from publicly discussing a potential security vulnerability. 🙊

It's better to discuss privately and try to find a solution first, to limit the potential impact as much as possible.

---

Thanks for your help!

The community and I thank you for that. 🙇

## /backend/.dockerignore

```dockerignore path="/backend/.dockerignore"
# Python
__pycache__
app.egg-info
*.pyc
.mypy_cache
.coverage
htmlcov
.venv
```

## /backend/.gitignore

```gitignore path="/backend/.gitignore"
__pycache__
app.egg-info
*.pyc
.mypy_cache
.coverage
htmlcov
.cache
.venv
```

## /backend/Dockerfile

``` path="/backend/Dockerfile"
FROM python:3.10

ENV PYTHONUNBUFFERED=1

WORKDIR /app/

# Install uv
# Ref: https://docs.astral.sh/uv/guides/integration/docker/#installing-uv
COPY --from=ghcr.io/astral-sh/uv:0.5.11 /uv /uvx /bin/

# Place executables in the environment at the front of the path
# Ref: https://docs.astral.sh/uv/guides/integration/docker/#using-the-environment
ENV PATH="/app/.venv/bin:$PATH"

# Compile bytecode
# Ref: https://docs.astral.sh/uv/guides/integration/docker/#compiling-bytecode
ENV UV_COMPILE_BYTECODE=1

# uv Cache
# Ref: https://docs.astral.sh/uv/guides/integration/docker/#caching
ENV UV_LINK_MODE=copy

# Install dependencies
# Ref: https://docs.astral.sh/uv/guides/integration/docker/#intermediate-layers
RUN --mount=type=cache,target=/root/.cache/uv \
    --mount=type=bind,source=uv.lock,target=uv.lock \
    --mount=type=bind,source=pyproject.toml,target=pyproject.toml \
    uv sync --frozen --no-install-project

ENV PYTHONPATH=/app

COPY ./scripts /app/scripts

COPY ./pyproject.toml ./uv.lock ./alembic.ini /app/

COPY ./app /app/app

# Sync the project
# Ref: https://docs.astral.sh/uv/guides/integration/docker/#intermediate-layers
RUN --mount=type=cache,target=/root/.cache/uv \
    uv sync

CMD ["fastapi", "run", "--workers", "4", "app/main.py"]
```

## /backend/README.md

# FastAPI Project - Backend

## Requirements

* [Docker](https://www.docker.com/).
* [uv](https://docs.astral.sh/uv/) for Python package and environment management.

## Docker Compose

Start the local development environment with Docker Compose following the guide in [../development.md](../development.md).

## General Workflow

By default, the dependencies are managed with [uv](https://docs.astral.sh/uv/); go there and install it.

From `./backend/` you can install all the dependencies with:

```console
$ uv sync
```

Then you can activate the virtual environment with:

```console
$ source .venv/bin/activate
```

Make sure your editor is using the correct Python virtual environment, with the interpreter at `backend/.venv/bin/python`.

Modify or add SQLModel models for data and SQL tables in `./backend/app/models.py`, API endpoints in `./backend/app/api/`, and CRUD (Create, Read, Update, Delete) utils in `./backend/app/crud.py`.

## VS Code

There are already configurations in place to run the backend through the VS Code debugger, so that you can use breakpoints, pause and explore variables, etc.

The setup is also already configured so you can run the tests through the VS Code Python tests tab.

## Docker Compose Override

During development, you can change Docker Compose settings that will only affect the local development environment, in the file `docker-compose.override.yml`.

The changes to that file only affect the local development environment, not the production environment. So, you can add "temporary" changes that help the development workflow.
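As a quick illustration of how that override file gets applied (this is standard Docker Compose behavior rather than anything specific to this template, and the stack name below is only a placeholder):

```bash
# Locally, plain `docker compose` commands merge docker-compose.yml with
# docker-compose.override.yml automatically, so the development overrides apply:
docker compose watch

# The deploy workflows in .github/workflows/ pass only the main file explicitly,
# so these local overrides never reach staging or production:
docker compose -f docker-compose.yml --project-name my-stack up -d
```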
For example, the directory with the backend code is synchronized into the Docker container, copying the code you change live into the directory inside the container. That allows you to test your changes right away, without having to build the Docker image again. This should only be done during development; for production, you should build the Docker image with a recent version of the backend code. But during development, it allows you to iterate very fast.

There is also a command override that runs `fastapi run --reload` instead of the default `fastapi run`. It starts a single server process (instead of multiple, as would be the case for production) and reloads the process whenever the code changes. Keep in mind that if you have a syntax error and save the Python file, it will break and exit, and the container will stop. After that, you can restart the container by fixing the error and running again:

```console
$ docker compose watch
```

There is also a commented-out `command` override; you can uncomment it and comment out the default one. It makes the backend container run a process that does "nothing", but keeps the container alive. That allows you to get inside your running container and execute commands inside, for example a Python interpreter to test installed dependencies, or start the development server that reloads when it detects changes.

To get inside the container with a `bash` session you can start the stack with:

```console
$ docker compose watch
```

and then in another terminal, `exec` inside the running container:

```console
$ docker compose exec backend bash
```

You should see an output like:

```console
root@7f2607af31c3:/app#
```

That means that you are in a `bash` session inside your container, as a `root` user, under the `/app` directory. This directory has another directory called "app" inside; that's where your code lives inside the container: `/app/app`.

There you can use the `fastapi run --reload` command to run the debug live reloading server.

```console
$ fastapi run --reload app/main.py
```

...it will look like:

```console
root@7f2607af31c3:/app# fastapi run --reload app/main.py
```

and then hit enter. That runs the live reloading server that auto reloads when it detects code changes.

Nevertheless, if it doesn't detect a change but a syntax error, it will just stop with an error. But as the container is still alive and you are in a Bash session, you can quickly restart it after fixing the error, running the same command ("up arrow" and "Enter").

...this previous detail is what makes it useful to have the container alive doing nothing and then, in a Bash session, make it run the live reload server.

## Backend tests

To test the backend run:

```console
$ bash ./scripts/test.sh
```

The tests run with Pytest; modify and add tests in `./backend/app/tests/`.

If you use GitHub Actions the tests will run automatically.

### Test running stack

If your stack is already up and you just want to run the tests, you can use:

```bash
docker compose exec backend bash scripts/tests-start.sh
```

That `/app/scripts/tests-start.sh` script just calls `pytest` after making sure that the rest of the stack is running. If you need to pass extra arguments to `pytest`, you can pass them to that command and they will be forwarded.

For example, to stop on the first error:

```bash
docker compose exec backend bash scripts/tests-start.sh -x
```

### Test Coverage

When the tests are run, a file `htmlcov/index.html` is generated; you can open it in your browser to see the coverage of the tests.
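If you'd rather not open the file directly, one option is to serve the report over HTTP; a minimal sketch, assuming you run it from the `backend/` directory after the tests have generated `htmlcov/` and that you have a local Python available:

```bash
# Run the backend tests (this also regenerates the HTML coverage report):
bash ./scripts/test.sh

# Serve the coverage report and open http://localhost:8080 in your browser:
python -m http.server 8080 --directory htmlcov
```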
## Migrations

As during local development your app directory is mounted as a volume inside the container, you can also run the migrations with `alembic` commands inside the container, and the migration code will be in your app directory (instead of being only inside the container), so you can add it to your git repository.

Make sure you create a "revision" of your models and that you "upgrade" your database with that revision every time you change them, as this is what will update the tables in your database. Otherwise, your application will have errors.

* Start an interactive session in the backend container:

```console
$ docker compose exec backend bash
```

* Alembic is already configured to import your SQLModel models from `./backend/app/models.py`.

* After changing a model (for example, adding a column), inside the container, create a revision, e.g.:

```console
$ alembic revision --autogenerate -m "Add column last_name to User model"
```

* Commit to the git repository the files generated in the alembic directory.

* After creating the revision, run the migration in the database (this is what will actually change the database):

```console
$ alembic upgrade head
```

If you don't want to use migrations at all, uncomment the lines in the file at `./backend/app/core/db.py` that end in:

```python
SQLModel.metadata.create_all(engine)
```

and comment out the line in the file `scripts/prestart.sh` that contains:

```console
$ alembic upgrade head
```

If you don't want to start with the default models and want to remove or modify them from the beginning, without having any previous revision, you can remove the revision files (`.py` Python files) under `./backend/app/alembic/versions/` and then create a first migration as described above.

## Email Templates

The email templates are in `./backend/app/email-templates/`. Here, there are two directories: `build` and `src`. The `src` directory contains the source files that are used to build the final email templates. The `build` directory contains the final email templates that are used by the application.

Before continuing, ensure you have the [MJML extension](https://marketplace.visualstudio.com/items?itemName=attilabuti.vscode-mjml) installed in your VS Code.

Once you have the MJML extension installed, you can create a new email template in the `src` directory. After creating the new email template and with the `.mjml` file open in your editor, open the command palette with `Ctrl+Shift+P` and search for `MJML: Export to HTML`. This will convert the `.mjml` file to a `.html` file, and now you can save it in the build directory.

## /backend/alembic.ini

```ini path="/backend/alembic.ini"
# A generic, single database configuration.

[alembic]
# path to migration scripts
script_location = app/alembic

# template used to generate migration files
# file_template = %%(rev)s_%%(slug)s

# timezone to use when rendering the date
# within the migration file as well as the filename.
# string value is passed to dateutil.tz.gettz()
# leave blank for localtime
# timezone =

# max length of characters to apply to the
# "slug" field
#truncate_slug_length = 40

# set to 'true' to run the environment during
# the 'revision' command, regardless of autogenerate
# revision_environment = false

# set to 'true' to allow .pyc and .pyo files without
# a source .py file to be detected as revisions in the
# versions/ directory
# sourceless = false

# version location specification; this defaults
# to alembic/versions.
When using multiple version # directories, initial revisions must be specified with --version-path # version_locations = %(here)s/bar %(here)s/bat alembic/versions # the output encoding used when revision files # are written from script.py.mako # output_encoding = utf-8 # Logging configuration [loggers] keys = root,sqlalchemy,alembic [handlers] keys = console [formatters] keys = generic [logger_root] level = WARN handlers = console qualname = [logger_sqlalchemy] level = WARN handlers = qualname = sqlalchemy.engine [logger_alembic] level = INFO handlers = qualname = alembic [handler_console] class = StreamHandler args = (sys.stderr,) level = NOTSET formatter = generic [formatter_generic] format = %(levelname)-5.5s [%(name)s] %(message)s datefmt = %H:%M:%S ``` ## /backend/app/__init__.py ```py path="/backend/app/__init__.py" ``` ## /backend/app/alembic/README ``` path="/backend/app/alembic/README" Generic single-database configuration. ``` ## /backend/app/alembic/env.py ```py path="/backend/app/alembic/env.py" import os from logging.config import fileConfig from alembic import context from sqlalchemy import engine_from_config, pool # this is the Alembic Config object, which provides # access to the values within the .ini file in use. config = context.config # Interpret the config file for Python logging. # This line sets up loggers basically. fileConfig(config.config_file_name) # add your model's MetaData object here # for 'autogenerate' support # from myapp import mymodel # target_metadata = mymodel.Base.metadata # target_metadata = None from app.models import SQLModel # noqa from app.core.config import settings # noqa target_metadata = SQLModel.metadata # other values from the config, defined by the needs of env.py, # can be acquired: # my_important_option = config.get_main_option("my_important_option") # ... etc. def get_url(): return str(settings.SQLALCHEMY_DATABASE_URI) def run_migrations_offline(): """Run migrations in 'offline' mode. This configures the context with just a URL and not an Engine, though an Engine is acceptable here as well. By skipping the Engine creation we don't even need a DBAPI to be available. Calls to context.execute() here emit the given string to the script output. """ url = get_url() context.configure( url=url, target_metadata=target_metadata, literal_binds=True, compare_type=True ) with context.begin_transaction(): context.run_migrations() def run_migrations_online(): """Run migrations in 'online' mode. In this scenario we need to create an Engine and associate a connection with the context. """ configuration = config.get_section(config.config_ini_section) configuration["sqlalchemy.url"] = get_url() connectable = engine_from_config( configuration, prefix="sqlalchemy.", poolclass=pool.NullPool, ) with connectable.connect() as connection: context.configure( connection=connection, target_metadata=target_metadata, compare_type=True ) with context.begin_transaction(): context.run_migrations() if context.is_offline_mode(): run_migrations_offline() else: run_migrations_online() ``` ## /backend/app/alembic/script.py.mako ```mako path="/backend/app/alembic/script.py.mako" """${message} Revision ID: ${up_revision} Revises: ${down_revision | comma,n} Create Date: ${create_date} """ from alembic import op import sqlalchemy as sa import sqlmodel.sql.sqltypes ${imports if imports else ""} # revision identifiers, used by Alembic. 
revision = ${repr(up_revision)} down_revision = ${repr(down_revision)} branch_labels = ${repr(branch_labels)} depends_on = ${repr(depends_on)} def upgrade(): ${upgrades if upgrades else "pass"} def downgrade(): ${downgrades if downgrades else "pass"} ``` ## /backend/app/alembic/versions/.keep ```keep path="/backend/app/alembic/versions/.keep" ``` ## /backend/app/alembic/versions/1a31ce608336_add_cascade_delete_relationships.py ```py path="/backend/app/alembic/versions/1a31ce608336_add_cascade_delete_relationships.py" """Add cascade delete relationships Revision ID: 1a31ce608336 Revises: d98dd8ec85a3 Create Date: 2024-07-31 22:24:34.447891 """ from alembic import op import sqlalchemy as sa import sqlmodel.sql.sqltypes # revision identifiers, used by Alembic. revision = '1a31ce608336' down_revision = 'd98dd8ec85a3' branch_labels = None depends_on = None def upgrade(): # ### commands auto generated by Alembic - please adjust! ### op.alter_column('item', 'owner_id', existing_type=sa.UUID(), nullable=False) op.drop_constraint('item_owner_id_fkey', 'item', type_='foreignkey') op.create_foreign_key(None, 'item', 'user', ['owner_id'], ['id'], ondelete='CASCADE') # ### end Alembic commands ### def downgrade(): # ### commands auto generated by Alembic - please adjust! ### op.drop_constraint(None, 'item', type_='foreignkey') op.create_foreign_key('item_owner_id_fkey', 'item', 'user', ['owner_id'], ['id']) op.alter_column('item', 'owner_id', existing_type=sa.UUID(), nullable=True) # ### end Alembic commands ### ``` ## /backend/app/alembic/versions/9c0a54914c78_add_max_length_for_string_varchar_.py ```py path="/backend/app/alembic/versions/9c0a54914c78_add_max_length_for_string_varchar_.py" """Add max length for string(varchar) fields in User and Items models Revision ID: 9c0a54914c78 Revises: e2412789c190 Create Date: 2024-06-17 14:42:44.639457 """ from alembic import op import sqlalchemy as sa import sqlmodel.sql.sqltypes # revision identifiers, used by Alembic. 
revision = '9c0a54914c78' down_revision = 'e2412789c190' branch_labels = None depends_on = None def upgrade(): # Adjust the length of the email field in the User table op.alter_column('user', 'email', existing_type=sa.String(), type_=sa.String(length=255), existing_nullable=False) # Adjust the length of the full_name field in the User table op.alter_column('user', 'full_name', existing_type=sa.String(), type_=sa.String(length=255), existing_nullable=True) # Adjust the length of the title field in the Item table op.alter_column('item', 'title', existing_type=sa.String(), type_=sa.String(length=255), existing_nullable=False) # Adjust the length of the description field in the Item table op.alter_column('item', 'description', existing_type=sa.String(), type_=sa.String(length=255), existing_nullable=True) def downgrade(): # Revert the length of the email field in the User table op.alter_column('user', 'email', existing_type=sa.String(length=255), type_=sa.String(), existing_nullable=False) # Revert the length of the full_name field in the User table op.alter_column('user', 'full_name', existing_type=sa.String(length=255), type_=sa.String(), existing_nullable=True) # Revert the length of the title field in the Item table op.alter_column('item', 'title', existing_type=sa.String(length=255), type_=sa.String(), existing_nullable=False) # Revert the length of the description field in the Item table op.alter_column('item', 'description', existing_type=sa.String(length=255), type_=sa.String(), existing_nullable=True) ``` ## /backend/app/alembic/versions/d98dd8ec85a3_edit_replace_id_integers_in_all_models_.py ```py path="/backend/app/alembic/versions/d98dd8ec85a3_edit_replace_id_integers_in_all_models_.py" """Edit replace id integers in all models to use UUID instead Revision ID: d98dd8ec85a3 Revises: 9c0a54914c78 Create Date: 2024-07-19 04:08:04.000976 """ from alembic import op import sqlalchemy as sa import sqlmodel.sql.sqltypes from sqlalchemy.dialects import postgresql # revision identifiers, used by Alembic. 
revision = 'd98dd8ec85a3' down_revision = '9c0a54914c78' branch_labels = None depends_on = None def upgrade(): # Ensure uuid-ossp extension is available op.execute('CREATE EXTENSION IF NOT EXISTS "uuid-ossp"') # Create a new UUID column with a default UUID value op.add_column('user', sa.Column('new_id', postgresql.UUID(as_uuid=True), default=sa.text('uuid_generate_v4()'))) op.add_column('item', sa.Column('new_id', postgresql.UUID(as_uuid=True), default=sa.text('uuid_generate_v4()'))) op.add_column('item', sa.Column('new_owner_id', postgresql.UUID(as_uuid=True), nullable=True)) # Populate the new columns with UUIDs op.execute('UPDATE "user" SET new_id = uuid_generate_v4()') op.execute('UPDATE item SET new_id = uuid_generate_v4()') op.execute('UPDATE item SET new_owner_id = (SELECT new_id FROM "user" WHERE "user".id = item.owner_id)') # Set the new_id as not nullable op.alter_column('user', 'new_id', nullable=False) op.alter_column('item', 'new_id', nullable=False) # Drop old columns and rename new columns op.drop_constraint('item_owner_id_fkey', 'item', type_='foreignkey') op.drop_column('item', 'owner_id') op.alter_column('item', 'new_owner_id', new_column_name='owner_id') op.drop_column('user', 'id') op.alter_column('user', 'new_id', new_column_name='id') op.drop_column('item', 'id') op.alter_column('item', 'new_id', new_column_name='id') # Create primary key constraint op.create_primary_key('user_pkey', 'user', ['id']) op.create_primary_key('item_pkey', 'item', ['id']) # Recreate foreign key constraint op.create_foreign_key('item_owner_id_fkey', 'item', 'user', ['owner_id'], ['id']) def downgrade(): # Reverse the upgrade process op.add_column('user', sa.Column('old_id', sa.Integer, autoincrement=True)) op.add_column('item', sa.Column('old_id', sa.Integer, autoincrement=True)) op.add_column('item', sa.Column('old_owner_id', sa.Integer, nullable=True)) # Populate the old columns with default values # Generate sequences for the integer IDs if not exist op.execute('CREATE SEQUENCE IF NOT EXISTS user_id_seq AS INTEGER OWNED BY "user".old_id') op.execute('CREATE SEQUENCE IF NOT EXISTS item_id_seq AS INTEGER OWNED BY item.old_id') op.execute('SELECT setval(\'user_id_seq\', COALESCE((SELECT MAX(old_id) + 1 FROM "user"), 1), false)') op.execute('SELECT setval(\'item_id_seq\', COALESCE((SELECT MAX(old_id) + 1 FROM item), 1), false)') op.execute('UPDATE "user" SET old_id = nextval(\'user_id_seq\')') op.execute('UPDATE item SET old_id = nextval(\'item_id_seq\'), old_owner_id = (SELECT old_id FROM "user" WHERE "user".id = item.owner_id)') # Drop new columns and rename old columns back op.drop_constraint('item_owner_id_fkey', 'item', type_='foreignkey') op.drop_column('item', 'owner_id') op.alter_column('item', 'old_owner_id', new_column_name='owner_id') op.drop_column('user', 'id') op.alter_column('user', 'old_id', new_column_name='id') op.drop_column('item', 'id') op.alter_column('item', 'old_id', new_column_name='id') # Create primary key constraint op.create_primary_key('user_pkey', 'user', ['id']) op.create_primary_key('item_pkey', 'item', ['id']) # Recreate foreign key constraint op.create_foreign_key('item_owner_id_fkey', 'item', 'user', ['owner_id'], ['id']) ``` ## /backend/app/alembic/versions/e2412789c190_initialize_models.py ```py path="/backend/app/alembic/versions/e2412789c190_initialize_models.py" """Initialize models Revision ID: e2412789c190 Revises: Create Date: 2023-11-24 22:55:43.195942 """ import sqlalchemy as sa import sqlmodel.sql.sqltypes from alembic import op # revision 
identifiers, used by Alembic. revision = "e2412789c190" down_revision = None branch_labels = None depends_on = None def upgrade(): # ### commands auto generated by Alembic - please adjust! ### op.create_table( "user", sa.Column("email", sqlmodel.sql.sqltypes.AutoString(), nullable=False), sa.Column("is_active", sa.Boolean(), nullable=False), sa.Column("is_superuser", sa.Boolean(), nullable=False), sa.Column("full_name", sqlmodel.sql.sqltypes.AutoString(), nullable=True), sa.Column("id", sa.Integer(), nullable=False), sa.Column( "hashed_password", sqlmodel.sql.sqltypes.AutoString(), nullable=False ), sa.PrimaryKeyConstraint("id"), ) op.create_index(op.f("ix_user_email"), "user", ["email"], unique=True) op.create_table( "item", sa.Column("description", sqlmodel.sql.sqltypes.AutoString(), nullable=True), sa.Column("id", sa.Integer(), nullable=False), sa.Column("title", sqlmodel.sql.sqltypes.AutoString(), nullable=False), sa.Column("owner_id", sa.Integer(), nullable=False), sa.ForeignKeyConstraint( ["owner_id"], ["user.id"], ), sa.PrimaryKeyConstraint("id"), ) # ### end Alembic commands ### def downgrade(): # ### commands auto generated by Alembic - please adjust! ### op.drop_table("item") op.drop_index(op.f("ix_user_email"), table_name="user") op.drop_table("user") # ### end Alembic commands ### ``` ## /backend/app/api/__init__.py ```py path="/backend/app/api/__init__.py" ``` ## /backend/app/api/deps.py ```py path="/backend/app/api/deps.py" from collections.abc import Generator from typing import Annotated import jwt from fastapi import Depends, HTTPException, status from fastapi.security import OAuth2PasswordBearer from jwt.exceptions import InvalidTokenError from pydantic import ValidationError from sqlmodel import Session from app.core import security from app.core.config import settings from app.core.db import engine from app.models import TokenPayload, User reusable_oauth2 = OAuth2PasswordBearer( tokenUrl=f"{settings.API_V1_STR}/login/access-token" ) def get_db() -> Generator[Session, None, None]: with Session(engine) as session: yield session SessionDep = Annotated[Session, Depends(get_db)] TokenDep = Annotated[str, Depends(reusable_oauth2)] def get_current_user(session: SessionDep, token: TokenDep) -> User: try: payload = jwt.decode( token, settings.SECRET_KEY, algorithms=[security.ALGORITHM] ) token_data = TokenPayload(**payload) except (InvalidTokenError, ValidationError): raise HTTPException( status_code=status.HTTP_403_FORBIDDEN, detail="Could not validate credentials", ) user = session.get(User, token_data.sub) if not user: raise HTTPException(status_code=404, detail="User not found") if not user.is_active: raise HTTPException(status_code=400, detail="Inactive user") return user CurrentUser = Annotated[User, Depends(get_current_user)] def get_current_active_superuser(current_user: CurrentUser) -> User: if not current_user.is_superuser: raise HTTPException( status_code=403, detail="The user doesn't have enough privileges" ) return current_user ``` ## /backend/app/api/main.py ```py path="/backend/app/api/main.py" from fastapi import APIRouter from app.api.routes import items, login, private, users, utils from app.core.config import settings api_router = APIRouter() api_router.include_router(login.router) api_router.include_router(users.router) api_router.include_router(utils.router) api_router.include_router(items.router) if settings.ENVIRONMENT == "local": api_router.include_router(private.router) ``` ## /backend/app/api/routes/__init__.py ```py 
path="/backend/app/api/routes/__init__.py" ``` ## /backend/app/api/routes/items.py ```py path="/backend/app/api/routes/items.py" import uuid from typing import Any from fastapi import APIRouter, HTTPException from sqlmodel import func, select from app.api.deps import CurrentUser, SessionDep from app.models import Item, ItemCreate, ItemPublic, ItemsPublic, ItemUpdate, Message router = APIRouter(prefix="/items", tags=["items"]) @router.get("/", response_model=ItemsPublic) def read_items( session: SessionDep, current_user: CurrentUser, skip: int = 0, limit: int = 100 ) -> Any: """ Retrieve items. """ if current_user.is_superuser: count_statement = select(func.count()).select_from(Item) count = session.exec(count_statement).one() statement = select(Item).offset(skip).limit(limit) items = session.exec(statement).all() else: count_statement = ( select(func.count()) .select_from(Item) .where(Item.owner_id == current_user.id) ) count = session.exec(count_statement).one() statement = ( select(Item) .where(Item.owner_id == current_user.id) .offset(skip) .limit(limit) ) items = session.exec(statement).all() return ItemsPublic(data=items, count=count) @router.get("/{id}", response_model=ItemPublic) def read_item(session: SessionDep, current_user: CurrentUser, id: uuid.UUID) -> Any: """ Get item by ID. """ item = session.get(Item, id) if not item: raise HTTPException(status_code=404, detail="Item not found") if not current_user.is_superuser and (item.owner_id != current_user.id): raise HTTPException(status_code=400, detail="Not enough permissions") return item @router.post("/", response_model=ItemPublic) def create_item( *, session: SessionDep, current_user: CurrentUser, item_in: ItemCreate ) -> Any: """ Create new item. """ item = Item.model_validate(item_in, update={"owner_id": current_user.id}) session.add(item) session.commit() session.refresh(item) return item @router.put("/{id}", response_model=ItemPublic) def update_item( *, session: SessionDep, current_user: CurrentUser, id: uuid.UUID, item_in: ItemUpdate, ) -> Any: """ Update an item. """ item = session.get(Item, id) if not item: raise HTTPException(status_code=404, detail="Item not found") if not current_user.is_superuser and (item.owner_id != current_user.id): raise HTTPException(status_code=400, detail="Not enough permissions") update_dict = item_in.model_dump(exclude_unset=True) item.sqlmodel_update(update_dict) session.add(item) session.commit() session.refresh(item) return item @router.delete("/{id}") def delete_item( session: SessionDep, current_user: CurrentUser, id: uuid.UUID ) -> Message: """ Delete an item. 
""" item = session.get(Item, id) if not item: raise HTTPException(status_code=404, detail="Item not found") if not current_user.is_superuser and (item.owner_id != current_user.id): raise HTTPException(status_code=400, detail="Not enough permissions") session.delete(item) session.commit() return Message(message="Item deleted successfully") ``` ## /backend/app/api/routes/login.py ```py path="/backend/app/api/routes/login.py" from datetime import timedelta from typing import Annotated, Any from fastapi import APIRouter, Depends, HTTPException from fastapi.responses import HTMLResponse from fastapi.security import OAuth2PasswordRequestForm from app import crud from app.api.deps import CurrentUser, SessionDep, get_current_active_superuser from app.core import security from app.core.config import settings from app.core.security import get_password_hash from app.models import Message, NewPassword, Token, UserPublic from app.utils import ( generate_password_reset_token, generate_reset_password_email, send_email, verify_password_reset_token, ) router = APIRouter(tags=["login"]) @router.post("/login/access-token") def login_access_token( session: SessionDep, form_data: Annotated[OAuth2PasswordRequestForm, Depends()] ) -> Token: """ OAuth2 compatible token login, get an access token for future requests """ user = crud.authenticate( session=session, email=form_data.username, password=form_data.password ) if not user: raise HTTPException(status_code=400, detail="Incorrect email or password") elif not user.is_active: raise HTTPException(status_code=400, detail="Inactive user") access_token_expires = timedelta(minutes=settings.ACCESS_TOKEN_EXPIRE_MINUTES) return Token( access_token=security.create_access_token( user.id, expires_delta=access_token_expires ) ) @router.post("/login/test-token", response_model=UserPublic) def test_token(current_user: CurrentUser) -> Any: """ Test access token """ return current_user @router.post("/password-recovery/{email}") def recover_password(email: str, session: SessionDep) -> Message: """ Password Recovery """ user = crud.get_user_by_email(session=session, email=email) if not user: raise HTTPException( status_code=404, detail="The user with this email does not exist in the system.", ) password_reset_token = generate_password_reset_token(email=email) email_data = generate_reset_password_email( email_to=user.email, email=email, token=password_reset_token ) send_email( email_to=user.email, subject=email_data.subject, html_content=email_data.html_content, ) return Message(message="Password recovery email sent") @router.post("/reset-password/") def reset_password(session: SessionDep, body: NewPassword) -> Message: """ Reset password """ email = verify_password_reset_token(token=body.token) if not email: raise HTTPException(status_code=400, detail="Invalid token") user = crud.get_user_by_email(session=session, email=email) if not user: raise HTTPException( status_code=404, detail="The user with this email does not exist in the system.", ) elif not user.is_active: raise HTTPException(status_code=400, detail="Inactive user") hashed_password = get_password_hash(password=body.new_password) user.hashed_password = hashed_password session.add(user) session.commit() return Message(message="Password updated successfully") @router.post( "/password-recovery-html-content/{email}", dependencies=[Depends(get_current_active_superuser)], response_class=HTMLResponse, ) def recover_password_html_content(email: str, session: SessionDep) -> Any: """ HTML Content for Password Recovery """ user = 
crud.get_user_by_email(session=session, email=email) if not user: raise HTTPException( status_code=404, detail="The user with this username does not exist in the system.", ) password_reset_token = generate_password_reset_token(email=email) email_data = generate_reset_password_email( email_to=user.email, email=email, token=password_reset_token ) return HTMLResponse( content=email_data.html_content, headers={"subject:": email_data.subject} ) ``` ## /backend/app/api/routes/private.py ```py path="/backend/app/api/routes/private.py" from typing import Any from fastapi import APIRouter from pydantic import BaseModel from app.api.deps import SessionDep from app.core.security import get_password_hash from app.models import ( User, UserPublic, ) router = APIRouter(tags=["private"], prefix="/private") class PrivateUserCreate(BaseModel): email: str password: str full_name: str is_verified: bool = False @router.post("/users/", response_model=UserPublic) def create_user(user_in: PrivateUserCreate, session: SessionDep) -> Any: """ Create a new user. """ user = User( email=user_in.email, full_name=user_in.full_name, hashed_password=get_password_hash(user_in.password), ) session.add(user) session.commit() return user ``` ## /backend/app/api/routes/users.py ```py path="/backend/app/api/routes/users.py" import uuid from typing import Any from fastapi import APIRouter, Depends, HTTPException from sqlmodel import col, delete, func, select from app import crud from app.api.deps import ( CurrentUser, SessionDep, get_current_active_superuser, ) from app.core.config import settings from app.core.security import get_password_hash, verify_password from app.models import ( Item, Message, UpdatePassword, User, UserCreate, UserPublic, UserRegister, UsersPublic, UserUpdate, UserUpdateMe, ) from app.utils import generate_new_account_email, send_email router = APIRouter(prefix="/users", tags=["users"]) @router.get( "/", dependencies=[Depends(get_current_active_superuser)], response_model=UsersPublic, ) def read_users(session: SessionDep, skip: int = 0, limit: int = 100) -> Any: """ Retrieve users. """ count_statement = select(func.count()).select_from(User) count = session.exec(count_statement).one() statement = select(User).offset(skip).limit(limit) users = session.exec(statement).all() return UsersPublic(data=users, count=count) @router.post( "/", dependencies=[Depends(get_current_active_superuser)], response_model=UserPublic ) def create_user(*, session: SessionDep, user_in: UserCreate) -> Any: """ Create new user. """ user = crud.get_user_by_email(session=session, email=user_in.email) if user: raise HTTPException( status_code=400, detail="The user with this email already exists in the system.", ) user = crud.create_user(session=session, user_create=user_in) if settings.emails_enabled and user_in.email: email_data = generate_new_account_email( email_to=user_in.email, username=user_in.email, password=user_in.password ) send_email( email_to=user_in.email, subject=email_data.subject, html_content=email_data.html_content, ) return user @router.patch("/me", response_model=UserPublic) def update_user_me( *, session: SessionDep, user_in: UserUpdateMe, current_user: CurrentUser ) -> Any: """ Update own user. 
""" if user_in.email: existing_user = crud.get_user_by_email(session=session, email=user_in.email) if existing_user and existing_user.id != current_user.id: raise HTTPException( status_code=409, detail="User with this email already exists" ) user_data = user_in.model_dump(exclude_unset=True) current_user.sqlmodel_update(user_data) session.add(current_user) session.commit() session.refresh(current_user) return current_user @router.patch("/me/password", response_model=Message) def update_password_me( *, session: SessionDep, body: UpdatePassword, current_user: CurrentUser ) -> Any: """ Update own password. """ if not verify_password(body.current_password, current_user.hashed_password): raise HTTPException(status_code=400, detail="Incorrect password") if body.current_password == body.new_password: raise HTTPException( status_code=400, detail="New password cannot be the same as the current one" ) hashed_password = get_password_hash(body.new_password) current_user.hashed_password = hashed_password session.add(current_user) session.commit() return Message(message="Password updated successfully") @router.get("/me", response_model=UserPublic) def read_user_me(current_user: CurrentUser) -> Any: """ Get current user. """ return current_user @router.delete("/me", response_model=Message) def delete_user_me(session: SessionDep, current_user: CurrentUser) -> Any: """ Delete own user. """ if current_user.is_superuser: raise HTTPException( status_code=403, detail="Super users are not allowed to delete themselves" ) session.delete(current_user) session.commit() return Message(message="User deleted successfully") @router.post("/signup", response_model=UserPublic) def register_user(session: SessionDep, user_in: UserRegister) -> Any: """ Create new user without the need to be logged in. """ user = crud.get_user_by_email(session=session, email=user_in.email) if user: raise HTTPException( status_code=400, detail="The user with this email already exists in the system", ) user_create = UserCreate.model_validate(user_in) user = crud.create_user(session=session, user_create=user_create) return user @router.get("/{user_id}", response_model=UserPublic) def read_user_by_id( user_id: uuid.UUID, session: SessionDep, current_user: CurrentUser ) -> Any: """ Get a specific user by id. """ user = session.get(User, user_id) if user == current_user: return user if not current_user.is_superuser: raise HTTPException( status_code=403, detail="The user doesn't have enough privileges", ) return user @router.patch( "/{user_id}", dependencies=[Depends(get_current_active_superuser)], response_model=UserPublic, ) def update_user( *, session: SessionDep, user_id: uuid.UUID, user_in: UserUpdate, ) -> Any: """ Update a user. """ db_user = session.get(User, user_id) if not db_user: raise HTTPException( status_code=404, detail="The user with this id does not exist in the system", ) if user_in.email: existing_user = crud.get_user_by_email(session=session, email=user_in.email) if existing_user and existing_user.id != user_id: raise HTTPException( status_code=409, detail="User with this email already exists" ) db_user = crud.update_user(session=session, db_user=db_user, user_in=user_in) return db_user @router.delete("/{user_id}", dependencies=[Depends(get_current_active_superuser)]) def delete_user( session: SessionDep, current_user: CurrentUser, user_id: uuid.UUID ) -> Message: """ Delete a user. 
""" user = session.get(User, user_id) if not user: raise HTTPException(status_code=404, detail="User not found") if user == current_user: raise HTTPException( status_code=403, detail="Super users are not allowed to delete themselves" ) statement = delete(Item).where(col(Item.owner_id) == user_id) session.exec(statement) # type: ignore session.delete(user) session.commit() return Message(message="User deleted successfully") ``` ## /backend/app/api/routes/utils.py ```py path="/backend/app/api/routes/utils.py" from fastapi import APIRouter, Depends from pydantic.networks import EmailStr from app.api.deps import get_current_active_superuser from app.models import Message from app.utils import generate_test_email, send_email router = APIRouter(prefix="/utils", tags=["utils"]) @router.post( "/test-email/", dependencies=[Depends(get_current_active_superuser)], status_code=201, ) def test_email(email_to: EmailStr) -> Message: """ Test emails. """ email_data = generate_test_email(email_to=email_to) send_email( email_to=email_to, subject=email_data.subject, html_content=email_data.html_content, ) return Message(message="Test email sent") @router.get("/health-check/") async def health_check() -> bool: return True ``` ## /backend/app/backend_pre_start.py ```py path="/backend/app/backend_pre_start.py" import logging from sqlalchemy import Engine from sqlmodel import Session, select from tenacity import after_log, before_log, retry, stop_after_attempt, wait_fixed from app.core.db import engine logging.basicConfig(level=logging.INFO) logger = logging.getLogger(__name__) max_tries = 60 * 5 # 5 minutes wait_seconds = 1 @retry( stop=stop_after_attempt(max_tries), wait=wait_fixed(wait_seconds), before=before_log(logger, logging.INFO), after=after_log(logger, logging.WARN), ) def init(db_engine: Engine) -> None: try: with Session(db_engine) as session: # Try to create session to check if DB is awake session.exec(select(1)) except Exception as e: logger.error(e) raise e def main() -> None: logger.info("Initializing service") init(engine) logger.info("Service finished initializing") if __name__ == "__main__": main() ``` ## /backend/app/core/__init__.py ```py path="/backend/app/core/__init__.py" ``` ## /backend/app/core/config.py ```py path="/backend/app/core/config.py" import secrets import warnings from typing import Annotated, Any, Literal from pydantic import ( AnyUrl, BeforeValidator, EmailStr, HttpUrl, PostgresDsn, computed_field, model_validator, ) from pydantic_core import MultiHostUrl from pydantic_settings import BaseSettings, SettingsConfigDict from typing_extensions import Self def parse_cors(v: Any) -> list[str] | str: if isinstance(v, str) and not v.startswith("["): return [i.strip() for i in v.split(",")] elif isinstance(v, list | str): return v raise ValueError(v) class Settings(BaseSettings): model_config = SettingsConfigDict( # Use top level .env file (one level above ./backend/) env_file="../.env", env_ignore_empty=True, extra="ignore", ) API_V1_STR: str = "/api/v1" SECRET_KEY: str = secrets.token_urlsafe(32) # 60 minutes * 24 hours * 8 days = 8 days ACCESS_TOKEN_EXPIRE_MINUTES: int = 60 * 24 * 8 FRONTEND_HOST: str = "http://localhost:5173" ENVIRONMENT: Literal["local", "staging", "production"] = "local" BACKEND_CORS_ORIGINS: Annotated[ list[AnyUrl] | str, BeforeValidator(parse_cors) ] = [] @computed_field # type: ignore[prop-decorator] @property def all_cors_origins(self) -> list[str]: return [str(origin).rstrip("/") for origin in self.BACKEND_CORS_ORIGINS] + [ self.FRONTEND_HOST ] 
PROJECT_NAME: str SENTRY_DSN: HttpUrl | None = None POSTGRES_SERVER: str POSTGRES_PORT: int = 5432 POSTGRES_USER: str POSTGRES_PASSWORD: str = "" POSTGRES_DB: str = "" @computed_field # type: ignore[prop-decorator] @property def SQLALCHEMY_DATABASE_URI(self) -> PostgresDsn: return MultiHostUrl.build( scheme="postgresql+psycopg", username=self.POSTGRES_USER, password=self.POSTGRES_PASSWORD, host=self.POSTGRES_SERVER, port=self.POSTGRES_PORT, path=self.POSTGRES_DB, ) SMTP_TLS: bool = True SMTP_SSL: bool = False SMTP_PORT: int = 587 SMTP_HOST: str | None = None SMTP_USER: str | None = None SMTP_PASSWORD: str | None = None EMAILS_FROM_EMAIL: EmailStr | None = None EMAILS_FROM_NAME: EmailStr | None = None @model_validator(mode="after") def _set_default_emails_from(self) -> Self: if not self.EMAILS_FROM_NAME: self.EMAILS_FROM_NAME = self.PROJECT_NAME return self EMAIL_RESET_TOKEN_EXPIRE_HOURS: int = 48 @computed_field # type: ignore[prop-decorator] @property def emails_enabled(self) -> bool: return bool(self.SMTP_HOST and self.EMAILS_FROM_EMAIL) EMAIL_TEST_USER: EmailStr = "test@example.com" FIRST_SUPERUSER: EmailStr FIRST_SUPERUSER_PASSWORD: str def _check_default_secret(self, var_name: str, value: str | None) -> None: if value == "changethis": message = ( f'The value of {var_name} is "changethis", ' "for security, please change it, at least for deployments." ) if self.ENVIRONMENT == "local": warnings.warn(message, stacklevel=1) else: raise ValueError(message) @model_validator(mode="after") def _enforce_non_default_secrets(self) -> Self: self._check_default_secret("SECRET_KEY", self.SECRET_KEY) self._check_default_secret("POSTGRES_PASSWORD", self.POSTGRES_PASSWORD) self._check_default_secret( "FIRST_SUPERUSER_PASSWORD", self.FIRST_SUPERUSER_PASSWORD ) return self settings = Settings() # type: ignore ``` ## /backend/app/core/db.py ```py path="/backend/app/core/db.py" from sqlmodel import Session, create_engine, select from app import crud from app.core.config import settings from app.models import User, UserCreate engine = create_engine(str(settings.SQLALCHEMY_DATABASE_URI)) # make sure all SQLModel models are imported (app.models) before initializing DB # otherwise, SQLModel might fail to initialize relationships properly # for more details: https://github.com/fastapi/full-stack-fastapi-template/issues/28 def init_db(session: Session) -> None: # Tables should be created with Alembic migrations # But if you don't want to use migrations, create # the tables un-commenting the next lines # from sqlmodel import SQLModel # This works because the models are already imported and registered from app.models # SQLModel.metadata.create_all(engine) user = session.exec( select(User).where(User.email == settings.FIRST_SUPERUSER) ).first() if not user: user_in = UserCreate( email=settings.FIRST_SUPERUSER, password=settings.FIRST_SUPERUSER_PASSWORD, is_superuser=True, ) user = crud.create_user(session=session, user_create=user_in) ``` ## /backend/app/core/security.py ```py path="/backend/app/core/security.py" from datetime import datetime, timedelta, timezone from typing import Any import jwt from passlib.context import CryptContext from app.core.config import settings pwd_context = CryptContext(schemes=["bcrypt"], deprecated="auto") ALGORITHM = "HS256" def create_access_token(subject: str | Any, expires_delta: timedelta) -> str: expire = datetime.now(timezone.utc) + expires_delta to_encode = {"exp": expire, "sub": str(subject)} encoded_jwt = jwt.encode(to_encode, settings.SECRET_KEY, algorithm=ALGORITHM) 
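    # The encoded payload is {"exp": <expiry>, "sub": "<subject as str>"};
    # get_current_user() in app/api/deps.py decodes it with the same SECRET_KEY
    # and ALGORITHM and reads "sub" back through the TokenPayload model.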
return encoded_jwt def verify_password(plain_password: str, hashed_password: str) -> bool: return pwd_context.verify(plain_password, hashed_password) def get_password_hash(password: str) -> str: return pwd_context.hash(password) ``` ## /backend/app/crud.py ```py path="/backend/app/crud.py" import uuid from typing import Any from sqlmodel import Session, select from app.core.security import get_password_hash, verify_password from app.models import Item, ItemCreate, User, UserCreate, UserUpdate def create_user(*, session: Session, user_create: UserCreate) -> User: db_obj = User.model_validate( user_create, update={"hashed_password": get_password_hash(user_create.password)} ) session.add(db_obj) session.commit() session.refresh(db_obj) return db_obj def update_user(*, session: Session, db_user: User, user_in: UserUpdate) -> Any: user_data = user_in.model_dump(exclude_unset=True) extra_data = {} if "password" in user_data: password = user_data["password"] hashed_password = get_password_hash(password) extra_data["hashed_password"] = hashed_password db_user.sqlmodel_update(user_data, update=extra_data) session.add(db_user) session.commit() session.refresh(db_user) return db_user def get_user_by_email(*, session: Session, email: str) -> User | None: statement = select(User).where(User.email == email) session_user = session.exec(statement).first() return session_user def authenticate(*, session: Session, email: str, password: str) -> User | None: db_user = get_user_by_email(session=session, email=email) if not db_user: return None if not verify_password(password, db_user.hashed_password): return None return db_user def create_item(*, session: Session, item_in: ItemCreate, owner_id: uuid.UUID) -> Item: db_item = Item.model_validate(item_in, update={"owner_id": owner_id}) session.add(db_item) session.commit() session.refresh(db_item) return db_item ``` ## /backend/app/email-templates/build/new_account.html ```html path="/backend/app/email-templates/build/new_account.html"
{{ project_name }} - New Account
Welcome to your new account!
Here are your account details:
Username: {{ username }}
Password: {{ password }}
Go to Dashboard

``` ## /backend/app/email-templates/build/reset_password.html ```html path="/backend/app/email-templates/build/reset_password.html"
{{ project_name }} - Password Recovery
Hello {{ username }}
We've received a request to reset your password. You can do it by clicking the button below:
Reset password
Or copy and paste the following link into your browser:
{{ link }}
This password will expire in {{ valid_hours }} hours.

If you didn't request a password recovery you can disregard this email.
``` ## /backend/app/email-templates/build/test_email.html ```html path="/backend/app/email-templates/build/test_email.html"
{{ project_name }}
Test email for: {{ email }}

``` ## /backend/app/email-templates/src/new_account.mjml ```mjml path="/backend/app/email-templates/src/new_account.mjml" {{ project_name }} - New Account Welcome to your new account! Here are your account details: Username: {{ username }} Password: {{ password }} Go to Dashboard ``` ## /backend/app/email-templates/src/reset_password.mjml ```mjml path="/backend/app/email-templates/src/reset_password.mjml" {{ project_name }} - Password Recovery Hello {{ username }} We've received a request to reset your password. You can do it by clicking the button below: Reset password Or copy and paste the following link into your browser: {{ link }} This password will expire in {{ valid_hours }} hours. If you didn't request a password recovery you can disregard this email. ``` ## /backend/app/email-templates/src/test_email.mjml ```mjml path="/backend/app/email-templates/src/test_email.mjml" {{ project_name }} Test email for: {{ email }} ``` ## /backend/app/initial_data.py ```py path="/backend/app/initial_data.py" import logging from sqlmodel import Session from app.core.db import engine, init_db logging.basicConfig(level=logging.INFO) logger = logging.getLogger(__name__) def init() -> None: with Session(engine) as session: init_db(session) def main() -> None: logger.info("Creating initial data") init() logger.info("Initial data created") if __name__ == "__main__": main() ``` ## /backend/app/main.py ```py path="/backend/app/main.py" import sentry_sdk from fastapi import FastAPI from fastapi.routing import APIRoute from starlette.middleware.cors import CORSMiddleware from app.api.main import api_router from app.core.config import settings def custom_generate_unique_id(route: APIRoute) -> str: return f"{route.tags[0]}-{route.name}" if settings.SENTRY_DSN and settings.ENVIRONMENT != "local": sentry_sdk.init(dsn=str(settings.SENTRY_DSN), enable_tracing=True) app = FastAPI( title=settings.PROJECT_NAME, openapi_url=f"{settings.API_V1_STR}/openapi.json", generate_unique_id_function=custom_generate_unique_id, ) # Set all CORS enabled origins if settings.all_cors_origins: app.add_middleware( CORSMiddleware, allow_origins=settings.all_cors_origins, allow_credentials=True, allow_methods=["*"], allow_headers=["*"], ) app.include_router(api_router, prefix=settings.API_V1_STR) ``` ## /backend/app/models.py ```py path="/backend/app/models.py" import uuid from pydantic import EmailStr from sqlmodel import Field, Relationship, SQLModel # Shared properties class UserBase(SQLModel): email: EmailStr = Field(unique=True, index=True, max_length=255) is_active: bool = True is_superuser: bool = False full_name: str | None = Field(default=None, max_length=255) # Properties to receive via API on creation class UserCreate(UserBase): password: str = Field(min_length=8, max_length=40) class UserRegister(SQLModel): email: EmailStr = Field(max_length=255) password: str = Field(min_length=8, max_length=40) full_name: str | None = Field(default=None, max_length=255) # Properties to receive via API on update, all are optional class UserUpdate(UserBase): email: EmailStr | None = Field(default=None, max_length=255) # type: ignore password: str | None = Field(default=None, min_length=8, max_length=40) class UserUpdateMe(SQLModel): full_name: str | None = Field(default=None, max_length=255) email: EmailStr | None = Field(default=None, max_length=255) class UpdatePassword(SQLModel): current_password: str = Field(min_length=8, max_length=40) new_password: str = Field(min_length=8, max_length=40) # Database model, database table inferred from 
class name class User(UserBase, table=True): id: uuid.UUID = Field(default_factory=uuid.uuid4, primary_key=True) hashed_password: str items: list["Item"] = Relationship(back_populates="owner", cascade_delete=True) # Properties to return via API, id is always required class UserPublic(UserBase): id: uuid.UUID class UsersPublic(SQLModel): data: list[UserPublic] count: int # Shared properties class ItemBase(SQLModel): title: str = Field(min_length=1, max_length=255) description: str | None = Field(default=None, max_length=255) # Properties to receive on item creation class ItemCreate(ItemBase): pass # Properties to receive on item update class ItemUpdate(ItemBase): title: str | None = Field(default=None, min_length=1, max_length=255) # type: ignore # Database model, database table inferred from class name class Item(ItemBase, table=True): id: uuid.UUID = Field(default_factory=uuid.uuid4, primary_key=True) owner_id: uuid.UUID = Field( foreign_key="user.id", nullable=False, ondelete="CASCADE" ) owner: User | None = Relationship(back_populates="items") # Properties to return via API, id is always required class ItemPublic(ItemBase): id: uuid.UUID owner_id: uuid.UUID class ItemsPublic(SQLModel): data: list[ItemPublic] count: int # Generic message class Message(SQLModel): message: str # JSON payload containing access token class Token(SQLModel): access_token: str token_type: str = "bearer" # Contents of JWT token class TokenPayload(SQLModel): sub: str | None = None class NewPassword(SQLModel): token: str new_password: str = Field(min_length=8, max_length=40) ``` ## /backend/app/tests/__init__.py ```py path="/backend/app/tests/__init__.py" ``` ## /backend/app/tests/api/__init__.py ```py path="/backend/app/tests/api/__init__.py" ``` ## /backend/app/tests/api/routes/__init__.py ```py path="/backend/app/tests/api/routes/__init__.py" ``` ## /backend/app/tests/api/routes/test_items.py ```py path="/backend/app/tests/api/routes/test_items.py" import uuid from fastapi.testclient import TestClient from sqlmodel import Session from app.core.config import settings from app.tests.utils.item import create_random_item def test_create_item( client: TestClient, superuser_token_headers: dict[str, str] ) -> None: data = {"title": "Foo", "description": "Fighters"} response = client.post( f"{settings.API_V1_STR}/items/", headers=superuser_token_headers, json=data, ) assert response.status_code == 200 content = response.json() assert content["title"] == data["title"] assert content["description"] == data["description"] assert "id" in content assert "owner_id" in content def test_read_item( client: TestClient, superuser_token_headers: dict[str, str], db: Session ) -> None: item = create_random_item(db) response = client.get( f"{settings.API_V1_STR}/items/{item.id}", headers=superuser_token_headers, ) assert response.status_code == 200 content = response.json() assert content["title"] == item.title assert content["description"] == item.description assert content["id"] == str(item.id) assert content["owner_id"] == str(item.owner_id) def test_read_item_not_found( client: TestClient, superuser_token_headers: dict[str, str] ) -> None: response = client.get( f"{settings.API_V1_STR}/items/{uuid.uuid4()}", headers=superuser_token_headers, ) assert response.status_code == 404 content = response.json() assert content["detail"] == "Item not found" def test_read_item_not_enough_permissions( client: TestClient, normal_user_token_headers: dict[str, str], db: Session ) -> None: item = create_random_item(db) response = client.get( 
f"{settings.API_V1_STR}/items/{item.id}", headers=normal_user_token_headers, ) assert response.status_code == 400 content = response.json() assert content["detail"] == "Not enough permissions" def test_read_items( client: TestClient, superuser_token_headers: dict[str, str], db: Session ) -> None: create_random_item(db) create_random_item(db) response = client.get( f"{settings.API_V1_STR}/items/", headers=superuser_token_headers, ) assert response.status_code == 200 content = response.json() assert len(content["data"]) >= 2 def test_update_item( client: TestClient, superuser_token_headers: dict[str, str], db: Session ) -> None: item = create_random_item(db) data = {"title": "Updated title", "description": "Updated description"} response = client.put( f"{settings.API_V1_STR}/items/{item.id}", headers=superuser_token_headers, json=data, ) assert response.status_code == 200 content = response.json() assert content["title"] == data["title"] assert content["description"] == data["description"] assert content["id"] == str(item.id) assert content["owner_id"] == str(item.owner_id) def test_update_item_not_found( client: TestClient, superuser_token_headers: dict[str, str] ) -> None: data = {"title": "Updated title", "description": "Updated description"} response = client.put( f"{settings.API_V1_STR}/items/{uuid.uuid4()}", headers=superuser_token_headers, json=data, ) assert response.status_code == 404 content = response.json() assert content["detail"] == "Item not found" def test_update_item_not_enough_permissions( client: TestClient, normal_user_token_headers: dict[str, str], db: Session ) -> None: item = create_random_item(db) data = {"title": "Updated title", "description": "Updated description"} response = client.put( f"{settings.API_V1_STR}/items/{item.id}", headers=normal_user_token_headers, json=data, ) assert response.status_code == 400 content = response.json() assert content["detail"] == "Not enough permissions" def test_delete_item( client: TestClient, superuser_token_headers: dict[str, str], db: Session ) -> None: item = create_random_item(db) response = client.delete( f"{settings.API_V1_STR}/items/{item.id}", headers=superuser_token_headers, ) assert response.status_code == 200 content = response.json() assert content["message"] == "Item deleted successfully" def test_delete_item_not_found( client: TestClient, superuser_token_headers: dict[str, str] ) -> None: response = client.delete( f"{settings.API_V1_STR}/items/{uuid.uuid4()}", headers=superuser_token_headers, ) assert response.status_code == 404 content = response.json() assert content["detail"] == "Item not found" def test_delete_item_not_enough_permissions( client: TestClient, normal_user_token_headers: dict[str, str], db: Session ) -> None: item = create_random_item(db) response = client.delete( f"{settings.API_V1_STR}/items/{item.id}", headers=normal_user_token_headers, ) assert response.status_code == 400 content = response.json() assert content["detail"] == "Not enough permissions" ``` ## /backend/app/tests/api/routes/test_login.py ```py path="/backend/app/tests/api/routes/test_login.py" from unittest.mock import patch from fastapi.testclient import TestClient from sqlmodel import Session from app.core.config import settings from app.core.security import verify_password from app.crud import create_user from app.models import UserCreate from app.tests.utils.user import user_authentication_headers from app.tests.utils.utils import random_email, random_lower_string from app.utils import generate_password_reset_token def 
test_get_access_token(client: TestClient) -> None: login_data = { "username": settings.FIRST_SUPERUSER, "password": settings.FIRST_SUPERUSER_PASSWORD, } r = client.post(f"{settings.API_V1_STR}/login/access-token", data=login_data) tokens = r.json() assert r.status_code == 200 assert "access_token" in tokens assert tokens["access_token"] def test_get_access_token_incorrect_password(client: TestClient) -> None: login_data = { "username": settings.FIRST_SUPERUSER, "password": "incorrect", } r = client.post(f"{settings.API_V1_STR}/login/access-token", data=login_data) assert r.status_code == 400 def test_use_access_token( client: TestClient, superuser_token_headers: dict[str, str] ) -> None: r = client.post( f"{settings.API_V1_STR}/login/test-token", headers=superuser_token_headers, ) result = r.json() assert r.status_code == 200 assert "email" in result def test_recovery_password( client: TestClient, normal_user_token_headers: dict[str, str] ) -> None: with ( patch("app.core.config.settings.SMTP_HOST", "smtp.example.com"), patch("app.core.config.settings.SMTP_USER", "admin@example.com"), ): email = "test@example.com" r = client.post( f"{settings.API_V1_STR}/password-recovery/{email}", headers=normal_user_token_headers, ) assert r.status_code == 200 assert r.json() == {"message": "Password recovery email sent"} def test_recovery_password_user_not_exits( client: TestClient, normal_user_token_headers: dict[str, str] ) -> None: email = "jVgQr@example.com" r = client.post( f"{settings.API_V1_STR}/password-recovery/{email}", headers=normal_user_token_headers, ) assert r.status_code == 404 def test_reset_password(client: TestClient, db: Session) -> None: email = random_email() password = random_lower_string() new_password = random_lower_string() user_create = UserCreate( email=email, full_name="Test User", password=password, is_active=True, is_superuser=False, ) user = create_user(session=db, user_create=user_create) token = generate_password_reset_token(email=email) headers = user_authentication_headers(client=client, email=email, password=password) data = {"new_password": new_password, "token": token} r = client.post( f"{settings.API_V1_STR}/reset-password/", headers=headers, json=data, ) assert r.status_code == 200 assert r.json() == {"message": "Password updated successfully"} db.refresh(user) assert verify_password(new_password, user.hashed_password) def test_reset_password_invalid_token( client: TestClient, superuser_token_headers: dict[str, str] ) -> None: data = {"new_password": "changethis", "token": "invalid"} r = client.post( f"{settings.API_V1_STR}/reset-password/", headers=superuser_token_headers, json=data, ) response = r.json() assert "detail" in response assert r.status_code == 400 assert response["detail"] == "Invalid token" ``` ## /backend/app/tests/api/routes/test_private.py ```py path="/backend/app/tests/api/routes/test_private.py" from fastapi.testclient import TestClient from sqlmodel import Session, select from app.core.config import settings from app.models import User def test_create_user(client: TestClient, db: Session) -> None: r = client.post( f"{settings.API_V1_STR}/private/users/", json={ "email": "pollo@listo.com", "password": "password123", "full_name": "Pollo Listo", }, ) assert r.status_code == 200 data = r.json() user = db.exec(select(User).where(User.id == data["id"])).first() assert user assert user.email == "pollo@listo.com" assert user.full_name == "Pollo Listo" ``` ## /backend/app/tests/api/routes/test_users.py ```py 
path="/backend/app/tests/api/routes/test_users.py" import uuid from unittest.mock import patch from fastapi.testclient import TestClient from sqlmodel import Session, select from app import crud from app.core.config import settings from app.core.security import verify_password from app.models import User, UserCreate from app.tests.utils.utils import random_email, random_lower_string def test_get_users_superuser_me( client: TestClient, superuser_token_headers: dict[str, str] ) -> None: r = client.get(f"{settings.API_V1_STR}/users/me", headers=superuser_token_headers) current_user = r.json() assert current_user assert current_user["is_active"] is True assert current_user["is_superuser"] assert current_user["email"] == settings.FIRST_SUPERUSER def test_get_users_normal_user_me( client: TestClient, normal_user_token_headers: dict[str, str] ) -> None: r = client.get(f"{settings.API_V1_STR}/users/me", headers=normal_user_token_headers) current_user = r.json() assert current_user assert current_user["is_active"] is True assert current_user["is_superuser"] is False assert current_user["email"] == settings.EMAIL_TEST_USER def test_create_user_new_email( client: TestClient, superuser_token_headers: dict[str, str], db: Session ) -> None: with ( patch("app.utils.send_email", return_value=None), patch("app.core.config.settings.SMTP_HOST", "smtp.example.com"), patch("app.core.config.settings.SMTP_USER", "admin@example.com"), ): username = random_email() password = random_lower_string() data = {"email": username, "password": password} r = client.post( f"{settings.API_V1_STR}/users/", headers=superuser_token_headers, json=data, ) assert 200 <= r.status_code < 300 created_user = r.json() user = crud.get_user_by_email(session=db, email=username) assert user assert user.email == created_user["email"] def test_get_existing_user( client: TestClient, superuser_token_headers: dict[str, str], db: Session ) -> None: username = random_email() password = random_lower_string() user_in = UserCreate(email=username, password=password) user = crud.create_user(session=db, user_create=user_in) user_id = user.id r = client.get( f"{settings.API_V1_STR}/users/{user_id}", headers=superuser_token_headers, ) assert 200 <= r.status_code < 300 api_user = r.json() existing_user = crud.get_user_by_email(session=db, email=username) assert existing_user assert existing_user.email == api_user["email"] def test_get_existing_user_current_user(client: TestClient, db: Session) -> None: username = random_email() password = random_lower_string() user_in = UserCreate(email=username, password=password) user = crud.create_user(session=db, user_create=user_in) user_id = user.id login_data = { "username": username, "password": password, } r = client.post(f"{settings.API_V1_STR}/login/access-token", data=login_data) tokens = r.json() a_token = tokens["access_token"] headers = {"Authorization": f"Bearer {a_token}"} r = client.get( f"{settings.API_V1_STR}/users/{user_id}", headers=headers, ) assert 200 <= r.status_code < 300 api_user = r.json() existing_user = crud.get_user_by_email(session=db, email=username) assert existing_user assert existing_user.email == api_user["email"] def test_get_existing_user_permissions_error( client: TestClient, normal_user_token_headers: dict[str, str] ) -> None: r = client.get( f"{settings.API_V1_STR}/users/{uuid.uuid4()}", headers=normal_user_token_headers, ) assert r.status_code == 403 assert r.json() == {"detail": "The user doesn't have enough privileges"} def test_create_user_existing_username( client: 
TestClient, superuser_token_headers: dict[str, str], db: Session ) -> None: username = random_email() # username = email password = random_lower_string() user_in = UserCreate(email=username, password=password) crud.create_user(session=db, user_create=user_in) data = {"email": username, "password": password} r = client.post( f"{settings.API_V1_STR}/users/", headers=superuser_token_headers, json=data, ) created_user = r.json() assert r.status_code == 400 assert "_id" not in created_user def test_create_user_by_normal_user( client: TestClient, normal_user_token_headers: dict[str, str] ) -> None: username = random_email() password = random_lower_string() data = {"email": username, "password": password} r = client.post( f"{settings.API_V1_STR}/users/", headers=normal_user_token_headers, json=data, ) assert r.status_code == 403 def test_retrieve_users( client: TestClient, superuser_token_headers: dict[str, str], db: Session ) -> None: username = random_email() password = random_lower_string() user_in = UserCreate(email=username, password=password) crud.create_user(session=db, user_create=user_in) username2 = random_email() password2 = random_lower_string() user_in2 = UserCreate(email=username2, password=password2) crud.create_user(session=db, user_create=user_in2) r = client.get(f"{settings.API_V1_STR}/users/", headers=superuser_token_headers) all_users = r.json() assert len(all_users["data"]) > 1 assert "count" in all_users for item in all_users["data"]: assert "email" in item def test_update_user_me( client: TestClient, normal_user_token_headers: dict[str, str], db: Session ) -> None: full_name = "Updated Name" email = random_email() data = {"full_name": full_name, "email": email} r = client.patch( f"{settings.API_V1_STR}/users/me", headers=normal_user_token_headers, json=data, ) assert r.status_code == 200 updated_user = r.json() assert updated_user["email"] == email assert updated_user["full_name"] == full_name user_query = select(User).where(User.email == email) user_db = db.exec(user_query).first() assert user_db assert user_db.email == email assert user_db.full_name == full_name def test_update_password_me( client: TestClient, superuser_token_headers: dict[str, str], db: Session ) -> None: new_password = random_lower_string() data = { "current_password": settings.FIRST_SUPERUSER_PASSWORD, "new_password": new_password, } r = client.patch( f"{settings.API_V1_STR}/users/me/password", headers=superuser_token_headers, json=data, ) assert r.status_code == 200 updated_user = r.json() assert updated_user["message"] == "Password updated successfully" user_query = select(User).where(User.email == settings.FIRST_SUPERUSER) user_db = db.exec(user_query).first() assert user_db assert user_db.email == settings.FIRST_SUPERUSER assert verify_password(new_password, user_db.hashed_password) # Revert to the old password to keep consistency in test old_data = { "current_password": new_password, "new_password": settings.FIRST_SUPERUSER_PASSWORD, } r = client.patch( f"{settings.API_V1_STR}/users/me/password", headers=superuser_token_headers, json=old_data, ) db.refresh(user_db) assert r.status_code == 200 assert verify_password(settings.FIRST_SUPERUSER_PASSWORD, user_db.hashed_password) def test_update_password_me_incorrect_password( client: TestClient, superuser_token_headers: dict[str, str] ) -> None: new_password = random_lower_string() data = {"current_password": new_password, "new_password": new_password} r = client.patch( f"{settings.API_V1_STR}/users/me/password", headers=superuser_token_headers, 
json=data, ) assert r.status_code == 400 updated_user = r.json() assert updated_user["detail"] == "Incorrect password" def test_update_user_me_email_exists( client: TestClient, normal_user_token_headers: dict[str, str], db: Session ) -> None: username = random_email() password = random_lower_string() user_in = UserCreate(email=username, password=password) user = crud.create_user(session=db, user_create=user_in) data = {"email": user.email} r = client.patch( f"{settings.API_V1_STR}/users/me", headers=normal_user_token_headers, json=data, ) assert r.status_code == 409 assert r.json()["detail"] == "User with this email already exists" def test_update_password_me_same_password_error( client: TestClient, superuser_token_headers: dict[str, str] ) -> None: data = { "current_password": settings.FIRST_SUPERUSER_PASSWORD, "new_password": settings.FIRST_SUPERUSER_PASSWORD, } r = client.patch( f"{settings.API_V1_STR}/users/me/password", headers=superuser_token_headers, json=data, ) assert r.status_code == 400 updated_user = r.json() assert ( updated_user["detail"] == "New password cannot be the same as the current one" ) def test_register_user(client: TestClient, db: Session) -> None: username = random_email() password = random_lower_string() full_name = random_lower_string() data = {"email": username, "password": password, "full_name": full_name} r = client.post( f"{settings.API_V1_STR}/users/signup", json=data, ) assert r.status_code == 200 created_user = r.json() assert created_user["email"] == username assert created_user["full_name"] == full_name user_query = select(User).where(User.email == username) user_db = db.exec(user_query).first() assert user_db assert user_db.email == username assert user_db.full_name == full_name assert verify_password(password, user_db.hashed_password) def test_register_user_already_exists_error(client: TestClient) -> None: password = random_lower_string() full_name = random_lower_string() data = { "email": settings.FIRST_SUPERUSER, "password": password, "full_name": full_name, } r = client.post( f"{settings.API_V1_STR}/users/signup", json=data, ) assert r.status_code == 400 assert r.json()["detail"] == "The user with this email already exists in the system" def test_update_user( client: TestClient, superuser_token_headers: dict[str, str], db: Session ) -> None: username = random_email() password = random_lower_string() user_in = UserCreate(email=username, password=password) user = crud.create_user(session=db, user_create=user_in) data = {"full_name": "Updated_full_name"} r = client.patch( f"{settings.API_V1_STR}/users/{user.id}", headers=superuser_token_headers, json=data, ) assert r.status_code == 200 updated_user = r.json() assert updated_user["full_name"] == "Updated_full_name" user_query = select(User).where(User.email == username) user_db = db.exec(user_query).first() db.refresh(user_db) assert user_db assert user_db.full_name == "Updated_full_name" def test_update_user_not_exists( client: TestClient, superuser_token_headers: dict[str, str] ) -> None: data = {"full_name": "Updated_full_name"} r = client.patch( f"{settings.API_V1_STR}/users/{uuid.uuid4()}", headers=superuser_token_headers, json=data, ) assert r.status_code == 404 assert r.json()["detail"] == "The user with this id does not exist in the system" def test_update_user_email_exists( client: TestClient, superuser_token_headers: dict[str, str], db: Session ) -> None: username = random_email() password = random_lower_string() user_in = UserCreate(email=username, password=password) user = 
crud.create_user(session=db, user_create=user_in) username2 = random_email() password2 = random_lower_string() user_in2 = UserCreate(email=username2, password=password2) user2 = crud.create_user(session=db, user_create=user_in2) data = {"email": user2.email} r = client.patch( f"{settings.API_V1_STR}/users/{user.id}", headers=superuser_token_headers, json=data, ) assert r.status_code == 409 assert r.json()["detail"] == "User with this email already exists" def test_delete_user_me(client: TestClient, db: Session) -> None: username = random_email() password = random_lower_string() user_in = UserCreate(email=username, password=password) user = crud.create_user(session=db, user_create=user_in) user_id = user.id login_data = { "username": username, "password": password, } r = client.post(f"{settings.API_V1_STR}/login/access-token", data=login_data) tokens = r.json() a_token = tokens["access_token"] headers = {"Authorization": f"Bearer {a_token}"} r = client.delete( f"{settings.API_V1_STR}/users/me", headers=headers, ) assert r.status_code == 200 deleted_user = r.json() assert deleted_user["message"] == "User deleted successfully" result = db.exec(select(User).where(User.id == user_id)).first() assert result is None user_query = select(User).where(User.id == user_id) user_db = db.execute(user_query).first() assert user_db is None def test_delete_user_me_as_superuser( client: TestClient, superuser_token_headers: dict[str, str] ) -> None: r = client.delete( f"{settings.API_V1_STR}/users/me", headers=superuser_token_headers, ) assert r.status_code == 403 response = r.json() assert response["detail"] == "Super users are not allowed to delete themselves" def test_delete_user_super_user( client: TestClient, superuser_token_headers: dict[str, str], db: Session ) -> None: username = random_email() password = random_lower_string() user_in = UserCreate(email=username, password=password) user = crud.create_user(session=db, user_create=user_in) user_id = user.id r = client.delete( f"{settings.API_V1_STR}/users/{user_id}", headers=superuser_token_headers, ) assert r.status_code == 200 deleted_user = r.json() assert deleted_user["message"] == "User deleted successfully" result = db.exec(select(User).where(User.id == user_id)).first() assert result is None def test_delete_user_not_found( client: TestClient, superuser_token_headers: dict[str, str] ) -> None: r = client.delete( f"{settings.API_V1_STR}/users/{uuid.uuid4()}", headers=superuser_token_headers, ) assert r.status_code == 404 assert r.json()["detail"] == "User not found" def test_delete_user_current_super_user_error( client: TestClient, superuser_token_headers: dict[str, str], db: Session ) -> None: super_user = crud.get_user_by_email(session=db, email=settings.FIRST_SUPERUSER) assert super_user user_id = super_user.id r = client.delete( f"{settings.API_V1_STR}/users/{user_id}", headers=superuser_token_headers, ) assert r.status_code == 403 assert r.json()["detail"] == "Super users are not allowed to delete themselves" def test_delete_user_without_privileges( client: TestClient, normal_user_token_headers: dict[str, str], db: Session ) -> None: username = random_email() password = random_lower_string() user_in = UserCreate(email=username, password=password) user = crud.create_user(session=db, user_create=user_in) r = client.delete( f"{settings.API_V1_STR}/users/{user.id}", headers=normal_user_token_headers, ) assert r.status_code == 403 assert r.json()["detail"] == "The user doesn't have enough privileges" ``` ## /backend/app/tests/conftest.py ```py 
path="/backend/app/tests/conftest.py" from collections.abc import Generator import pytest from fastapi.testclient import TestClient from sqlmodel import Session, delete from app.core.config import settings from app.core.db import engine, init_db from app.main import app from app.models import Item, User from app.tests.utils.user import authentication_token_from_email from app.tests.utils.utils import get_superuser_token_headers @pytest.fixture(scope="session", autouse=True) def db() -> Generator[Session, None, None]: with Session(engine) as session: init_db(session) yield session statement = delete(Item) session.execute(statement) statement = delete(User) session.execute(statement) session.commit() @pytest.fixture(scope="module") def client() -> Generator[TestClient, None, None]: with TestClient(app) as c: yield c @pytest.fixture(scope="module") def superuser_token_headers(client: TestClient) -> dict[str, str]: return get_superuser_token_headers(client) @pytest.fixture(scope="module") def normal_user_token_headers(client: TestClient, db: Session) -> dict[str, str]: return authentication_token_from_email( client=client, email=settings.EMAIL_TEST_USER, db=db ) ``` ## /backend/app/tests/crud/__init__.py ```py path="/backend/app/tests/crud/__init__.py" ``` ## /backend/app/tests/crud/test_user.py ```py path="/backend/app/tests/crud/test_user.py" from fastapi.encoders import jsonable_encoder from sqlmodel import Session from app import crud from app.core.security import verify_password from app.models import User, UserCreate, UserUpdate from app.tests.utils.utils import random_email, random_lower_string def test_create_user(db: Session) -> None: email = random_email() password = random_lower_string() user_in = UserCreate(email=email, password=password) user = crud.create_user(session=db, user_create=user_in) assert user.email == email assert hasattr(user, "hashed_password") def test_authenticate_user(db: Session) -> None: email = random_email() password = random_lower_string() user_in = UserCreate(email=email, password=password) user = crud.create_user(session=db, user_create=user_in) authenticated_user = crud.authenticate(session=db, email=email, password=password) assert authenticated_user assert user.email == authenticated_user.email def test_not_authenticate_user(db: Session) -> None: email = random_email() password = random_lower_string() user = crud.authenticate(session=db, email=email, password=password) assert user is None def test_check_if_user_is_active(db: Session) -> None: email = random_email() password = random_lower_string() user_in = UserCreate(email=email, password=password) user = crud.create_user(session=db, user_create=user_in) assert user.is_active is True def test_check_if_user_is_active_inactive(db: Session) -> None: email = random_email() password = random_lower_string() user_in = UserCreate(email=email, password=password, disabled=True) user = crud.create_user(session=db, user_create=user_in) assert user.is_active def test_check_if_user_is_superuser(db: Session) -> None: email = random_email() password = random_lower_string() user_in = UserCreate(email=email, password=password, is_superuser=True) user = crud.create_user(session=db, user_create=user_in) assert user.is_superuser is True def test_check_if_user_is_superuser_normal_user(db: Session) -> None: username = random_email() password = random_lower_string() user_in = UserCreate(email=username, password=password) user = crud.create_user(session=db, user_create=user_in) assert user.is_superuser is False def 
## /backend/app/tests/crud/__init__.py

```py path="/backend/app/tests/crud/__init__.py"

```

## /backend/app/tests/crud/test_user.py

```py path="/backend/app/tests/crud/test_user.py"
from fastapi.encoders import jsonable_encoder
from sqlmodel import Session

from app import crud
from app.core.security import verify_password
from app.models import User, UserCreate, UserUpdate
from app.tests.utils.utils import random_email, random_lower_string


def test_create_user(db: Session) -> None:
    email = random_email()
    password = random_lower_string()
    user_in = UserCreate(email=email, password=password)
    user = crud.create_user(session=db, user_create=user_in)
    assert user.email == email
    assert hasattr(user, "hashed_password")


def test_authenticate_user(db: Session) -> None:
    email = random_email()
    password = random_lower_string()
    user_in = UserCreate(email=email, password=password)
    user = crud.create_user(session=db, user_create=user_in)
    authenticated_user = crud.authenticate(session=db, email=email, password=password)
    assert authenticated_user
    assert user.email == authenticated_user.email


def test_not_authenticate_user(db: Session) -> None:
    email = random_email()
    password = random_lower_string()
    user = crud.authenticate(session=db, email=email, password=password)
    assert user is None


def test_check_if_user_is_active(db: Session) -> None:
    email = random_email()
    password = random_lower_string()
    user_in = UserCreate(email=email, password=password)
    user = crud.create_user(session=db, user_create=user_in)
    assert user.is_active is True


def test_check_if_user_is_active_inactive(db: Session) -> None:
    email = random_email()
    password = random_lower_string()
    user_in = UserCreate(email=email, password=password, disabled=True)
    user = crud.create_user(session=db, user_create=user_in)
    assert user.is_active


def test_check_if_user_is_superuser(db: Session) -> None:
    email = random_email()
    password = random_lower_string()
    user_in = UserCreate(email=email, password=password, is_superuser=True)
    user = crud.create_user(session=db, user_create=user_in)
    assert user.is_superuser is True


def test_check_if_user_is_superuser_normal_user(db: Session) -> None:
    username = random_email()
    password = random_lower_string()
    user_in = UserCreate(email=username, password=password)
    user = crud.create_user(session=db, user_create=user_in)
    assert user.is_superuser is False


def test_get_user(db: Session) -> None:
    password = random_lower_string()
    username = random_email()
    user_in = UserCreate(email=username, password=password, is_superuser=True)
    user = crud.create_user(session=db, user_create=user_in)
    user_2 = db.get(User, user.id)
    assert user_2
    assert user.email == user_2.email
    assert jsonable_encoder(user) == jsonable_encoder(user_2)


def test_update_user(db: Session) -> None:
    password = random_lower_string()
    email = random_email()
    user_in = UserCreate(email=email, password=password, is_superuser=True)
    user = crud.create_user(session=db, user_create=user_in)
    new_password = random_lower_string()
    user_in_update = UserUpdate(password=new_password, is_superuser=True)
    if user.id is not None:
        crud.update_user(session=db, db_user=user, user_in=user_in_update)
    user_2 = db.get(User, user.id)
    assert user_2
    assert user.email == user_2.email
    assert verify_password(new_password, user_2.hashed_password)
```

## /backend/app/tests/scripts/__init__.py

```py path="/backend/app/tests/scripts/__init__.py"

```

## /backend/app/tests/scripts/test_backend_pre_start.py

```py path="/backend/app/tests/scripts/test_backend_pre_start.py"
from unittest.mock import MagicMock, patch

from sqlmodel import select

from app.backend_pre_start import init, logger


def test_init_successful_connection() -> None:
    engine_mock = MagicMock()

    session_mock = MagicMock()
    exec_mock = MagicMock(return_value=True)
    session_mock.configure_mock(**{"exec.return_value": exec_mock})

    with (
        patch("sqlmodel.Session", return_value=session_mock),
        patch.object(logger, "info"),
        patch.object(logger, "error"),
        patch.object(logger, "warn"),
    ):
        try:
            init(engine_mock)
            connection_successful = True
        except Exception:
            connection_successful = False

        assert (
            connection_successful
        ), "The database connection should be successful and not raise an exception."

        assert session_mock.exec.called_once_with(
            select(1)
        ), "The session should execute a select statement once."
```

## /backend/app/tests/scripts/test_test_pre_start.py

```py path="/backend/app/tests/scripts/test_test_pre_start.py"
from unittest.mock import MagicMock, patch

from sqlmodel import select

from app.tests_pre_start import init, logger


def test_init_successful_connection() -> None:
    engine_mock = MagicMock()

    session_mock = MagicMock()
    exec_mock = MagicMock(return_value=True)
    session_mock.configure_mock(**{"exec.return_value": exec_mock})

    with (
        patch("sqlmodel.Session", return_value=session_mock),
        patch.object(logger, "info"),
        patch.object(logger, "error"),
        patch.object(logger, "warn"),
    ):
        try:
            init(engine_mock)
            connection_successful = True
        except Exception:
            connection_successful = False

        assert (
            connection_successful
        ), "The database connection should be successful and not raise an exception."

        assert session_mock.exec.called_once_with(
            select(1)
        ), "The session should execute a select statement once."
```
## /backend/app/tests/utils/__init__.py

```py path="/backend/app/tests/utils/__init__.py"

```

## /backend/app/tests/utils/item.py

```py path="/backend/app/tests/utils/item.py"
from sqlmodel import Session

from app import crud
from app.models import Item, ItemCreate
from app.tests.utils.user import create_random_user
from app.tests.utils.utils import random_lower_string


def create_random_item(db: Session) -> Item:
    user = create_random_user(db)
    owner_id = user.id
    assert owner_id is not None
    title = random_lower_string()
    description = random_lower_string()
    item_in = ItemCreate(title=title, description=description)
    return crud.create_item(session=db, item_in=item_in, owner_id=owner_id)
```

## /backend/app/tests/utils/user.py

```py path="/backend/app/tests/utils/user.py"
from fastapi.testclient import TestClient
from sqlmodel import Session

from app import crud
from app.core.config import settings
from app.models import User, UserCreate, UserUpdate
from app.tests.utils.utils import random_email, random_lower_string


def user_authentication_headers(
    *, client: TestClient, email: str, password: str
) -> dict[str, str]:
    data = {"username": email, "password": password}

    r = client.post(f"{settings.API_V1_STR}/login/access-token", data=data)
    response = r.json()
    auth_token = response["access_token"]
    headers = {"Authorization": f"Bearer {auth_token}"}
    return headers


def create_random_user(db: Session) -> User:
    email = random_email()
    password = random_lower_string()
    user_in = UserCreate(email=email, password=password)
    user = crud.create_user(session=db, user_create=user_in)
    return user


def authentication_token_from_email(
    *, client: TestClient, email: str, db: Session
) -> dict[str, str]:
    """
    Return a valid token for the user with given email.

    If the user doesn't exist it is created first.
    """
    password = random_lower_string()
    user = crud.get_user_by_email(session=db, email=email)
    if not user:
        user_in_create = UserCreate(email=email, password=password)
        user = crud.create_user(session=db, user_create=user_in_create)
    else:
        user_in_update = UserUpdate(password=password)
        if not user.id:
            raise Exception("User id not set")
        user = crud.update_user(session=db, db_user=user, user_in=user_in_update)

    return user_authentication_headers(client=client, email=email, password=password)
```

## /backend/app/tests/utils/utils.py

```py path="/backend/app/tests/utils/utils.py"
import random
import string

from fastapi.testclient import TestClient

from app.core.config import settings


def random_lower_string() -> str:
    return "".join(random.choices(string.ascii_lowercase, k=32))


def random_email() -> str:
    return f"{random_lower_string()}@{random_lower_string()}.com"


def get_superuser_token_headers(client: TestClient) -> dict[str, str]:
    login_data = {
        "username": settings.FIRST_SUPERUSER,
        "password": settings.FIRST_SUPERUSER_PASSWORD,
    }
    r = client.post(f"{settings.API_V1_STR}/login/access-token", data=login_data)
    tokens = r.json()
    a_token = tokens["access_token"]
    headers = {"Authorization": f"Bearer {a_token}"}
    return headers
```

## /backend/app/tests_pre_start.py

```py path="/backend/app/tests_pre_start.py"
import logging

from sqlalchemy import Engine
from sqlmodel import Session, select
from tenacity import after_log, before_log, retry, stop_after_attempt, wait_fixed

from app.core.db import engine

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

max_tries = 60 * 5  # 5 minutes
wait_seconds = 1


@retry(
    stop=stop_after_attempt(max_tries),
    wait=wait_fixed(wait_seconds),
    before=before_log(logger, logging.INFO),
    after=after_log(logger, logging.WARN),
)
def init(db_engine: Engine) -> None:
    try:
        # Try to create session to check if DB is awake
        with Session(db_engine) as session:
            session.exec(select(1))
    except Exception as e:
        logger.error(e)
        raise e


def main() -> None:
    logger.info("Initializing service")
    init(engine)
    logger.info("Service finished initializing")


if __name__ == "__main__":
    main()
```
## /backend/app/utils.py

```py path="/backend/app/utils.py"
import logging
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from pathlib import Path
from typing import Any

import emails  # type: ignore
import jwt
from jinja2 import Template
from jwt.exceptions import InvalidTokenError

from app.core import security
from app.core.config import settings

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)


@dataclass
class EmailData:
    html_content: str
    subject: str


def render_email_template(*, template_name: str, context: dict[str, Any]) -> str:
    template_str = (
        Path(__file__).parent / "email-templates" / "build" / template_name
    ).read_text()
    html_content = Template(template_str).render(context)
    return html_content


def send_email(
    *,
    email_to: str,
    subject: str = "",
    html_content: str = "",
) -> None:
    assert settings.emails_enabled, "no provided configuration for email variables"
    message = emails.Message(
        subject=subject,
        html=html_content,
        mail_from=(settings.EMAILS_FROM_NAME, settings.EMAILS_FROM_EMAIL),
    )
    smtp_options = {"host": settings.SMTP_HOST, "port": settings.SMTP_PORT}
    if settings.SMTP_TLS:
        smtp_options["tls"] = True
    elif settings.SMTP_SSL:
        smtp_options["ssl"] = True
    if settings.SMTP_USER:
        smtp_options["user"] = settings.SMTP_USER
    if settings.SMTP_PASSWORD:
        smtp_options["password"] = settings.SMTP_PASSWORD
    response = message.send(to=email_to, smtp=smtp_options)
    logger.info(f"send email result: {response}")


def generate_test_email(email_to: str) -> EmailData:
    project_name = settings.PROJECT_NAME
    subject = f"{project_name} - Test email"
    html_content = render_email_template(
        template_name="test_email.html",
        context={"project_name": settings.PROJECT_NAME, "email": email_to},
    )
    return EmailData(html_content=html_content, subject=subject)


def generate_reset_password_email(email_to: str, email: str, token: str) -> EmailData:
    project_name = settings.PROJECT_NAME
    subject = f"{project_name} - Password recovery for user {email}"
    link = f"{settings.FRONTEND_HOST}/reset-password?token={token}"
    html_content = render_email_template(
        template_name="reset_password.html",
        context={
            "project_name": settings.PROJECT_NAME,
            "username": email,
            "email": email_to,
            "valid_hours": settings.EMAIL_RESET_TOKEN_EXPIRE_HOURS,
            "link": link,
        },
    )
    return EmailData(html_content=html_content, subject=subject)


def generate_new_account_email(
    email_to: str, username: str, password: str
) -> EmailData:
    project_name = settings.PROJECT_NAME
    subject = f"{project_name} - New account for user {username}"
    html_content = render_email_template(
        template_name="new_account.html",
        context={
            "project_name": settings.PROJECT_NAME,
            "username": username,
            "password": password,
            "email": email_to,
            "link": settings.FRONTEND_HOST,
        },
    )
    return EmailData(html_content=html_content, subject=subject)


def generate_password_reset_token(email: str) -> str:
    delta = timedelta(hours=settings.EMAIL_RESET_TOKEN_EXPIRE_HOURS)
    now = datetime.now(timezone.utc)
    expires = now + delta
    exp = expires.timestamp()
    encoded_jwt = jwt.encode(
        {"exp": exp, "nbf": now, "sub": email},
        settings.SECRET_KEY,
        algorithm=security.ALGORITHM,
    )
    return encoded_jwt


def verify_password_reset_token(token: str) -> str | None:
    try:
        decoded_token = jwt.decode(
            token, settings.SECRET_KEY, algorithms=[security.ALGORITHM]
        )
        return str(decoded_token["sub"])
    except InvalidTokenError:
        return None
```
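`generate_password_reset_token` and `verify_password_reset_token` are inverses: the first signs the user's email into a short-lived JWT, the second decodes it and returns the email (or `None` if the token is invalid or expired). The snippet below is an illustrative sketch, not part of the repository; the email address is made up and it assumes the app settings (notably `SECRET_KEY`) are loadable as in `.env`.

```py
# Illustrative round trip of the password-reset token helpers, not part of the repository.
from app.utils import generate_password_reset_token, verify_password_reset_token

email = "user@example.com"  # illustrative address
token = generate_password_reset_token(email)

# A freshly generated token verifies back to the same email address.
assert verify_password_reset_token(token) == email

# A tampered token fails signature/decoding checks and yields None.
assert verify_password_reset_token(token + "tampered") is None
```

The password-recovery routes in `api/routes/login.py` combine these helpers with `generate_reset_password_email` and `send_email` to deliver the reset link to the user.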
## /backend/pyproject.toml

```toml path="/backend/pyproject.toml"
[project]
name = "app"
version = "0.1.0"
description = ""
requires-python = ">=3.10,<4.0"
dependencies = [
    "fastapi[standard]<1.0.0,>=0.114.2",
    "python-multipart<1.0.0,>=0.0.7",
    "email-validator<3.0.0.0,>=2.1.0.post1",
    "passlib[bcrypt]<2.0.0,>=1.7.4",
    "tenacity<9.0.0,>=8.2.3",
    "pydantic>2.0",
    "emails<1.0,>=0.6",
    "jinja2<4.0.0,>=3.1.4",
    "alembic<2.0.0,>=1.12.1",
    "httpx<1.0.0,>=0.25.1",
    "psycopg[binary]<4.0.0,>=3.1.13",
    "sqlmodel<1.0.0,>=0.0.21",
    # Pin bcrypt until passlib supports the latest
    "bcrypt==4.0.1",
    "pydantic-settings<3.0.0,>=2.2.1",
    "sentry-sdk[fastapi]<2.0.0,>=1.40.6",
    "pyjwt<3.0.0,>=2.8.0",
]

[tool.uv]
dev-dependencies = [
    "pytest<8.0.0,>=7.4.3",
    "mypy<2.0.0,>=1.8.0",
    "ruff<1.0.0,>=0.2.2",
    "pre-commit<4.0.0,>=3.6.2",
    "types-passlib<2.0.0.0,>=1.7.7.20240106",
    "coverage<8.0.0,>=7.4.3",
]

[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

[tool.mypy]
strict = true
exclude = ["venv", ".venv", "alembic"]

[tool.ruff]
target-version = "py310"
exclude = ["alembic"]

[tool.ruff.lint]
select = [
    "E",  # pycodestyle errors
    "W",  # pycodestyle warnings
    "F",  # pyflakes
    "I",  # isort
    "B",  # flake8-bugbear
    "C4",  # flake8-comprehensions
    "UP",  # pyupgrade
    "ARG001",  # unused arguments in functions
]
ignore = [
    "E501",  # line too long, handled by black
    "B008",  # do not perform function calls in argument defaults
    "W191",  # indentation contains tabs
    "B904",  # Allow raising exceptions without from e, for HTTPException
]

[tool.ruff.lint.pyupgrade]
# Preserve types, even if a file imports `from __future__ import annotations`.
keep-runtime-typing = true
```

## /backend/scripts/format.sh

```sh path="/backend/scripts/format.sh"
#!/bin/sh -e
set -x

ruff check app scripts --fix
ruff format app scripts
```

## /backend/scripts/lint.sh

```sh path="/backend/scripts/lint.sh"
#!/usr/bin/env bash

set -e
set -x

mypy app
ruff check app
ruff format app --check
```

## /backend/scripts/prestart.sh

```sh path="/backend/scripts/prestart.sh"
#! /usr/bin/env bash

set -e
set -x

# Let the DB start
python app/backend_pre_start.py

# Run migrations
alembic upgrade head

# Create initial data in DB
python app/initial_data.py
```

## /backend/scripts/test.sh

```sh path="/backend/scripts/test.sh"
#!/usr/bin/env bash

set -e
set -x

coverage run --source=app -m pytest
coverage report --show-missing
coverage html --title "${@-coverage}"
```

## /backend/scripts/tests-start.sh

```sh path="/backend/scripts/tests-start.sh"
#! /usr/bin/env bash

set -e
set -x

python app/tests_pre_start.py

bash scripts/test.sh "$@"
```