commit 9a7c66e0cc341d9beb4debca9b41ede4b2a26c08
Author: itqop
Date:   Tue Dec 30 15:35:19 2025 +0300

    first version

diff --git a/.claude/settings.local.json b/.claude/settings.local.json
new file mode 100644
index 0000000..fed9a2c
--- /dev/null
+++ b/.claude/settings.local.json
@@ -0,0 +1,7 @@
+{
+  "permissions": {
+    "allow": [
+      "Bash(dir:*)"
+    ]
+  }
+}
diff --git a/.gitignore b/.gitignore
new file mode 100644
index 0000000..74af903
--- /dev/null
+++ b/.gitignore
@@ -0,0 +1,58 @@
+# Python
+__pycache__/
+*.py[cod]
+*$py.class
+*.so
+.Python
+*.egg
+*.egg-info/
+dist/
+build/
+.eggs/
+venv/
+.env
+
+# Node
+node_modules/
+npm-debug.log*
+yarn-debug.log*
+yarn-error.log*
+.pnpm-debug.log*
+
+# Build outputs
+static/
+dist/
+dist-ssr/
+
+# Database
+*.db
+*.db-journal
+*.sqlite
+*.sqlite3
+
+# IDE
+.vscode/
+.idea/
+*.swp
+*.swo
+*~
+
+# OS
+.DS_Store
+Thumbs.db
+
+# Docker
+*.log
+
+# Testing
+.coverage
+htmlcov/
+.pytest_cache/
+
+# Misc
+.env.local
+.env.development.local
+.env.test.local
+.env.production.local
+
+.venv
\ No newline at end of file
diff --git a/CLAUDE.md b/CLAUDE.md
new file mode 100644
index 0000000..18bd661
--- /dev/null
+++ b/CLAUDE.md
@@ -0,0 +1,230 @@
+# CLAUDE.md
+
+This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
+
+## Project Overview
+
+This is a cloud photo and video storage service with a React frontend (SPA hosted in S3) and a Python FastAPI backend. The project uses S3 for file storage and starts with SQLite for metadata, with a migration path to PostgreSQL.
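The SQLite → PostgreSQL migration path relies on schema portability (see "Database Migrations" below: UUIDs stored as TEXT in SQLite, native UUID in PostgreSQL). One common way to keep models identical across both backends — a sketch following the well-known SQLAlchemy "backend-agnostic GUID" recipe, not code from this repository — is a dialect-portable column type:

```python
# Sketch only (not repository code): a dialect-portable UUID column type.
import uuid

from sqlalchemy import CHAR
from sqlalchemy.dialects.postgresql import UUID as PG_UUID
from sqlalchemy.types import TypeDecorator


class GUID(TypeDecorator):
    """Native UUID on PostgreSQL, CHAR(36) text on SQLite and others."""

    impl = CHAR
    cache_ok = True

    def load_dialect_impl(self, dialect):
        if dialect.name == "postgresql":
            return dialect.type_descriptor(PG_UUID(as_uuid=True))
        return dialect.type_descriptor(CHAR(36))

    def process_bind_param(self, value, dialect):
        # Store as text everywhere except PostgreSQL.
        if value is None or dialect.name == "postgresql":
            return value
        return str(value)

    def process_result_value(self, value, dialect):
        # Always hand the application a uuid.UUID.
        if value is None or isinstance(value, uuid.UUID):
            return value
        return uuid.UUID(value)
```

With a type like this, model definitions stay the same on both databases and only `DATABASE_URL` changes.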
+ +**Primary Language:** Russian (comments and documentation may be in Russian) + +## Tech Stack + +### Backend +- **Framework:** FastAPI (ASGI) +- **ORM:** SQLAlchemy 2.x (async) with Alembic migrations +- **Database:** SQLite (MVP) → PostgreSQL (future) +- **S3 SDK:** boto3 or aioboto3 +- **Authentication:** JWT (access + refresh tokens) +- **Background Tasks:** Redis + RQ (recommended for thumbnail generation) +- **Testing:** pytest + httpx +- **Linting/Formatting:** ruff + black (or ruff format) + +### Frontend +- **Framework:** React +- **UI Library:** Material UI (MUI) +- **Build Tool:** Vite +- **Deployment:** Static files in `static/` folder → S3 hosting + +### Infrastructure +- **File Storage:** S3 or S3-compatible (MinIO for development) +- **Queue:** Redis (for background workers) + +## Architecture Principles + +### Layered Architecture (Clean Architecture) +The backend follows strict separation of concerns: + +- **`api/`** - Routes, schemas, dependencies, auth middleware +- **`services/`** - Business logic (upload, library, share management) +- **`repositories/`** - Data access layer (CRUD operations) +- **`infra/`** - S3 client, database session factory, config, background tasks +- **`domain/`** - Domain models and interfaces (as needed) + +### Code Quality Standards +- **SOLID principles** are mandatory +- **DRY (Don't Repeat Yourself)** +- **Docstrings required** for all public methods/classes and main services +- **Minimal code comments** - prefer self-documenting code and docstrings +- **Stable API contract** - maintain OpenAPI/Swagger compatibility + +### Security Requirements +- Never store passwords in plaintext - use argon2id or bcrypt +- All S3 access must use pre-signed URLs with short TTL +- Validate all input with Pydantic +- Check asset ownership in all endpoints +- CORS must be strictly configured +- CSRF protection for cookie-based auth + +## Planned Project Structure + +``` +repo/ + backend/ + src/ + app/ + api/ + v1/ # API routes 
versioned + services/ # Business logic layer + repositories/ # Data access layer + infra/ # Infrastructure (S3, DB, config) + domain/ # Domain models + main.py # Application entry point + alembic/ # Database migrations + tests/ + pyproject.toml + Dockerfile + frontend/ + src/ + public/ + vite.config.js + package.json + static/ # Build output (deployed to S3) + docker-compose.yml +``` + +## Data Model + +### Core Entities + +**users**: User accounts (id, email, password_hash, created_at, updated_at, is_active) + +**assets**: Media files with metadata +- Type: photo | video +- Status: uploading | ready | failed | deleted +- Includes: original_filename, content_type, size_bytes, sha256, captured_at +- S3 keys: storage_key_original, storage_key_thumb +- Soft delete support via deleted_at + +**shares**: Public/private sharing links +- Links to either a single asset or album +- Supports expiration (expires_at) and revocation (revoked_at) +- Optional password protection + +**albums** (v1): Logical grouping of assets + +**tags** (v1): Tagging system for assets + +## S3 Storage Structure + +- **Bucket:** Private, access only via pre-signed URLs +- **Original files:** `u/{user_id}/o/{yyyy}/{mm}/{asset_id}{ext}` +- **Thumbnails:** `u/{user_id}/t/{yyyy}/{mm}/{asset_id}.jpg` +- **Video posters (v1):** `u/{user_id}/p/{yyyy}/{mm}/{asset_id}.jpg` + +## Upload Flow + +1. Frontend requests `POST /api/v1/uploads/create` with file metadata +2. Backend returns pre-signed URL or multipart upload credentials +3. Frontend uploads file directly to S3 +4. Frontend calls `POST /api/v1/uploads/{asset_id}/finalize` +5. 
Backend saves metadata and enqueues thumbnail generation task + +## API Structure + +Base path: `/api/v1` + +### Authentication +- `POST /api/v1/auth/register` +- `POST /api/v1/auth/login` +- `POST /api/v1/auth/logout` +- `GET /api/v1/auth/me` + +### Assets (Library) +- `GET /api/v1/assets` - List with cursor-based pagination +- `GET /api/v1/assets/{asset_id}` +- `DELETE /api/v1/assets/{asset_id}` - Soft delete +- `POST /api/v1/assets/{asset_id}/restore` +- `DELETE /api/v1/assets/{asset_id}/purge` - Hard delete from trash + +### Upload +- `POST /api/v1/uploads/create` +- `POST /api/v1/uploads/{asset_id}/finalize` + +### Access URLs +- `GET /api/v1/assets/{asset_id}/download-url?kind=original|thumb` +- `GET /api/v1/assets/{asset_id}/stream-url` - For video + +### Shares +- `POST /api/v1/shares` - Create share link +- `GET /api/v1/shares/{token}` +- `GET /api/v1/shares/{token}/download-url?asset_id=&kind=` +- `POST /api/v1/shares/{token}/revoke` + +## Environment Variables + +Key backend environment variables (see [tech_spec_cloud_media_storage.md](tech_spec_cloud_media_storage.md) section 13 for full list): + +- `APP_ENV=dev|prod` +- `DATABASE_URL=sqlite+aiosqlite:///./app.db` (or PostgreSQL connection string) +- `S3_ENDPOINT_URL` - For MinIO or custom S3-compatible storage +- `S3_REGION`, `S3_ACCESS_KEY_ID`, `S3_SECRET_ACCESS_KEY` +- `MEDIA_BUCKET` - S3 bucket name +- `SIGNED_URL_TTL_SECONDS=600` +- `JWT_SECRET`, `JWT_ACCESS_TTL_SECONDS`, `JWT_REFRESH_TTL_SECONDS` +- `MAX_UPLOAD_SIZE_BYTES` - File size limit +- `CORS_ORIGINS` - Allowed frontend origins + +## Database Migrations + +- All schema changes MUST go through Alembic migrations +- No raw SQL in application code (except migrations) +- Schema must be compatible with both SQLite (MVP) and PostgreSQL +- UUID stored as TEXT in SQLite, native UUID in PostgreSQL + +## Development Workflow (When Implemented) + +### Backend Development +Expected commands once backend is set up: +- Start server: `uvicorn app.main:app 
--reload` +- Run tests: `pytest` +- Create migration: `alembic revision --autogenerate -m "description"` +- Apply migrations: `alembic upgrade head` +- Format code: `ruff format .` or `black .` +- Lint: `ruff check .` + +### Frontend Development +Expected commands once frontend is set up: +- Install dependencies: `npm install` +- Dev server: `npm run dev` +- Build for production: `npm run build` (outputs to `static/`) +- Lint: `npm run lint` + +### Docker Compose (Development) +Recommended setup: `docker-compose up` to start: +- Backend service +- MinIO (S3-compatible storage) +- Redis (for background tasks) +- PostgreSQL (when migrating from SQLite) + +## MVP Acceptance Criteria + +- User registration and authentication working +- Upload photo and video files (including batch upload of 100+ files) +- Library displays thumbnails for photos +- Photo/video viewer works in browser +- Soft delete to trash with restore capability +- Public share links work without authentication +- SQLite database with migration path to PostgreSQL ready + +## Implementation Phases + +1. **Foundation:** FastAPI skeleton, DB config, migrations, auth +2. **Assets:** CRUD for assets, library listing with pagination, trash management +3. **Frontend MVP:** Login, library grid, upload dialog, viewer, trash UI +4. **Thumbnails:** Background generation and display +5. 
**Shares:** Create and access share links, shared view UI
+
+## Important Notes
+
+- This project follows **Clean Architecture** - respect layer boundaries
+- All file access goes through **pre-signed S3 URLs**, never direct access
+- Use **cursor-based pagination** for listing endpoints
+- **Thumbnails reduce bandwidth** - originals loaded only on demand
+- For large files, use **S3 multipart upload**
+- Background tasks via **Redis + RQ** for thumbnail/poster generation
+- Support both **inline** (MVP acceptable) and **background** thumbnail generation
+
+## Reference Documentation
+
+For detailed technical requirements, see [tech_spec_cloud_media_storage.md](tech_spec_cloud_media_storage.md).
diff --git a/README.md b/README.md
new file mode 100644
index 0000000..e36d321
--- /dev/null
+++ b/README.md
@@ -0,0 +1,230 @@
+# ITCloud - Cloud Photo and Video Storage
+
+A modern cloud storage service for photos and videos with a convenient web interface, built on React + Material UI and Python FastAPI.
+
+## Features
+
+- 📸 **Photo and video upload** - drag & drop support, batch upload
+- 🖼️ **Convenient gallery** - grid with thumbnails, quick viewing
+- 🎬 **Video player** - built-in player for watching videos
+- 🗑️ **Trash** - soft delete with restore capability
+- 🔗 **Sharing** - public links with optional expiration and password
+- 📱 **Responsive design** - works great on mobile devices and desktop
+- 🔐 **Security** - JWT authentication, pre-signed URLs for S3
+
+## Tech Stack
+
+### Backend
+- **FastAPI** - modern asynchronous web framework
+- **SQLAlchemy 2.0** - ORM with async support
+- **SQLite / PostgreSQL** - database (easy migration)
+- **S3 / MinIO** - object storage
+- **Alembic** - database migrations
+- **JWT** - authentication
+
+### Frontend
+- **React 18** - modern UI library
+- **TypeScript** - typed JavaScript
+- **Material UI (MUI)** - ready-made Material Design components
+- **Vite** - fast builds
+- **React Router** - routing
+- **Axios** - HTTP client
+
+## Quick Start
+
+### Prerequisites
+
+- Docker and Docker Compose
+- Node.js 20+ (for frontend development without Docker)
+- Python 3.11+ (for backend development without Docker)
+
+### Running with Docker Compose (recommended)
+
+1. Clone the repository:
+```bash
+git clone
+cd itcloud
+```
+
+2. Start all services:
+```bash
+docker-compose up
+```
+
+This will start:
+- Backend API at http://localhost:8000
+- Frontend at http://localhost:5173
+- MinIO (S3) at http://localhost:9000 (console: http://localhost:9001)
+- Redis at localhost:6379
+
+3. Open your browser and go to http://localhost:5173
+
+### Development without Docker
+
+#### Backend
+
+1. Install dependencies:
+```bash
+cd backend
+pip install poetry
+poetry install
+```
+
+2. Create a `.env` file:
+```bash
+cp .env.example .env
+# Edit .env with your settings
+```
+
+3.
Apply the migrations:
+```bash
+poetry run alembic upgrade head
+```
+
+4. Start the server:
+```bash
+poetry run uvicorn app.main:app --reload --host 0.0.0.0 --port 8000
+```
+
+#### Frontend
+
+1. Install dependencies:
+```bash
+cd frontend
+npm install
+```
+
+2. Create a `.env` file:
+```bash
+echo "VITE_API_URL=http://localhost:8000" > .env.local
+```
+
+3. Start the dev server:
+```bash
+npm run dev
+```
+
+## Project Structure
+
+```
+itcloud/
+├── backend/               # Python FastAPI backend
+│   ├── src/app/
+│   │   ├── api/           # API routes
+│   │   │   └── v1/        # API v1 endpoints
+│   │   ├── services/      # Business logic
+│   │   ├── repositories/  # Data access layer
+│   │   ├── infra/         # Infrastructure (S3, DB, config)
+│   │   └── domain/        # Domain models
+│   ├── alembic/           # Database migrations
+│   ├── tests/             # Tests
+│   └── pyproject.toml     # Python dependencies
+├── frontend/              # React frontend
+│   ├── src/
+│   │   ├── components/    # React components
+│   │   ├── pages/         # Page components
+│   │   ├── services/      # API client
+│   │   ├── hooks/         # Custom hooks
+│   │   ├── types/         # TypeScript types
+│   │   └── theme/         # MUI theme
+│   └── package.json       # Node dependencies
+├── docker-compose.yml     # Docker Compose configuration
+└── CLAUDE.md              # Developer documentation
+```
+
+## API Documentation
+
+After starting the backend, documentation is available at:
+- Swagger UI: http://localhost:8000/docs
+- ReDoc: http://localhost:8000/redoc
+
+## Main Endpoints
+
+### Authentication
+- `POST /api/v1/auth/register` - Register
+- `POST /api/v1/auth/login` - Log in
+- `GET /api/v1/auth/me` - Get the current user
+
+### Files
+- `GET /api/v1/assets` - List files
+- `GET /api/v1/assets/{id}` - File details
+- `DELETE /api/v1/assets/{id}` - Delete (to trash)
+- `POST /api/v1/assets/{id}/restore` - Restore from trash
+- `DELETE /api/v1/assets/{id}/purge` - Delete permanently
+
+### Upload
+- `POST /api/v1/uploads/create` - Create an upload
+- `POST /api/v1/uploads/{id}/finalize` - Finalize an upload
+
+### Sharing
+-
`POST /api/v1/shares` - Create a public link
+- `GET /api/v1/shares/{token}` - Get link information
+- `GET /api/v1/shares/{token}/download-url` - Get a download URL
+
+## Environment Variables
+
+### Backend
+```env
+APP_ENV=dev
+DATABASE_URL=sqlite+aiosqlite:///./app.db
+S3_ENDPOINT_URL=http://localhost:9000
+S3_ACCESS_KEY_ID=minioadmin
+S3_SECRET_ACCESS_KEY=minioadmin
+MEDIA_BUCKET=itcloud-media
+JWT_SECRET=your-secret-key
+CORS_ORIGINS=http://localhost:5173
+```
+
+### Frontend
+```env
+VITE_API_URL=http://localhost:8000
+```
+
+## Migrating to PostgreSQL
+
+1. Update `DATABASE_URL` in `.env`:
+```env
+DATABASE_URL=postgresql+asyncpg://user:password@localhost:5432/itcloud
+```
+
+2. Apply the migrations:
+```bash
+poetry run alembic upgrade head
+```
+
+## Deployment
+
+### Production Frontend Build
+```bash
+cd frontend
+npm run build
+```
+
+The files will be built into `static/` and are ready for hosting in S3 or behind nginx.
+
+### Backend in Production
+1. Use PostgreSQL instead of SQLite
+2. Configure CORS for your domain
+3. Use a strong JWT_SECRET
+4. Set up SSL/TLS
+5. Run gunicorn/uvicorn with multiple workers
+
+## Next Steps (TODO)
+
+- [ ] Redis + RQ for background tasks
+- [ ] Thumbnail generation for photos
+- [ ] Poster generation for videos
+- [ ] EXIF data extraction
+- [ ] Albums
+- [ ] Tags
+- [ ] Metadata search
+- [ ] User quotas
+- [ ] Tests
+
+## License
+
+MIT
+
+## Support
+
+For questions and suggestions, open an issue in the repository.
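The `GET /api/v1/assets` listing described above uses cursor-based pagination rather than OFFSET. As an illustration — an assumed encoding, not the project's actual implementation — an opaque cursor can wrap the sort key of the last row returned:

```python
# Sketch only: an opaque, URL-safe cursor for keyset pagination over
# (created_at, id). Field names here are illustrative assumptions.
import base64
import json
from datetime import datetime


def encode_cursor(created_at: datetime, asset_id: str) -> str:
    """Pack the last returned row's sort key into an opaque token."""
    payload = json.dumps({"created_at": created_at.isoformat(), "id": asset_id})
    return base64.urlsafe_b64encode(payload.encode("utf-8")).decode("ascii")


def decode_cursor(cursor: str) -> tuple[datetime, str]:
    """Recover the sort key; the next page queries rows strictly after it."""
    payload = json.loads(base64.urlsafe_b64decode(cursor.encode("ascii")))
    return datetime.fromisoformat(payload["created_at"]), payload["id"]
```

The backend would then filter on `(created_at, id)` relative to the decoded key instead of using OFFSET, which stays stable while new uploads keep arriving.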
diff --git a/backend/.dockerignore b/backend/.dockerignore new file mode 100644 index 0000000..0c478b4 --- /dev/null +++ b/backend/.dockerignore @@ -0,0 +1,20 @@ +__pycache__ +*.pyc +*.pyo +*.pyd +.Python +*.so +*.egg +*.egg-info +dist +build +.env +.venv +venv +*.db +*.db-journal +.pytest_cache +.coverage +htmlcov +.mypy_cache +.ruff_cache diff --git a/backend/.env.example b/backend/.env.example new file mode 100644 index 0000000..3c2d4e9 --- /dev/null +++ b/backend/.env.example @@ -0,0 +1,35 @@ +# Application +APP_ENV=dev +APP_HOST=0.0.0.0 +APP_PORT=8000 + +# Database +DATABASE_URL=sqlite+aiosqlite:///./app.db +# For PostgreSQL: postgresql+asyncpg://user:password@localhost:5432/itcloud + +# S3 Storage +S3_ENDPOINT_URL=http://localhost:9000 +S3_REGION=us-east-1 +S3_ACCESS_KEY_ID=minioadmin +S3_SECRET_ACCESS_KEY=minioadmin +MEDIA_BUCKET=itcloud-media + +# Security +JWT_SECRET=your-secret-key-change-this-in-production +JWT_ALGORITHM=HS256 +JWT_ACCESS_TTL_SECONDS=900 +JWT_REFRESH_TTL_SECONDS=1209600 + +# Upload limits +MAX_UPLOAD_SIZE_BYTES=21474836480 +SIGNED_URL_TTL_SECONDS=600 + +# CORS +CORS_ORIGINS=http://localhost:5173,http://localhost:3000 + +# Redis +REDIS_URL=redis://localhost:6379/0 + +# Thumbnails +THUMBNAIL_MAX_SIZE=1024 +THUMBNAIL_QUALITY=85 diff --git a/backend/.gitignore b/backend/.gitignore new file mode 100644 index 0000000..5afc107 --- /dev/null +++ b/backend/.gitignore @@ -0,0 +1,50 @@ +# Python +__pycache__/ +*.py[cod] +*$py.class +*.so +.Python +*.egg +*.egg-info/ +dist/ +build/ +.eggs/ + +# Virtual environments +.env +.venv +venv/ +ENV/ + +# Database +*.db +*.db-journal +*.sqlite +*.sqlite3 + +# IDE +.vscode/ +.idea/ +*.swp +*.swo +*~ + +# Testing +.pytest_cache/ +.coverage +htmlcov/ +.tox/ + +# Type checking +.mypy_cache/ +.dmypy.json + +# Linting +.ruff_cache/ + +# Logs +*.log + +# OS +.DS_Store +Thumbs.db diff --git a/backend/Dockerfile b/backend/Dockerfile new file mode 100644 index 0000000..6a88ae9 --- /dev/null +++ b/backend/Dockerfile @@ 
-0,0 +1,28 @@
+FROM python:3.11-slim
+
+WORKDIR /app
+
+# Install system dependencies
+RUN apt-get update && apt-get install -y \
+    gcc \
+    && rm -rf /var/lib/apt/lists/*
+
+# Install poetry
+RUN pip install poetry==1.7.1
+
+# Copy dependency files
+COPY pyproject.toml ./
+
+# Install dependencies (no dev group, no root package: src/ is copied below)
+RUN poetry config virtualenvs.create false \
+    && poetry install --only main --no-root --no-interaction --no-ansi
+
+# Copy application code
+COPY src/ ./src/
+
+# Create directory for SQLite database
+RUN mkdir -p /app/data
+
+ENV PYTHONPATH=/app/src
+
+CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]
diff --git a/backend/alembic.ini b/backend/alembic.ini
new file mode 100644
index 0000000..a3cb705
--- /dev/null
+++ b/backend/alembic.ini
@@ -0,0 +1,112 @@
+# A generic, single database configuration.
+
+[alembic]
+# path to migration scripts
+script_location = alembic
+
+# template used to generate migration file names; The default value is %%(rev)s_%%(slug)s
+file_template = %%(year)d_%%(month).2d_%%(day).2d_%%(hour).2d%%(minute).2d-%%(rev)s_%%(slug)s
+
+# sys.path path, will be prepended to sys.path if present.
+prepend_sys_path = src
+
+# timezone to use when rendering the date within the migration file
+# as well as the filename.
+# If specified, requires the python-dateutil library that can be
+# installed by adding `alembic[tz]` to the pip requirements
+# string value is passed to dateutil.tz.gettz()
+# leave blank for localtime
+# timezone =
+
+# max length of characters to apply to the
+# "slug" field
+# truncate_slug_length = 40
+
+# set to 'true' to run the environment during
+# the 'revision' command, regardless of autogenerate
+# revision_environment = false
+
+# set to 'true' to allow .pyc and .pyo files without
+# a source .py file to be detected as revisions in the
+# versions/ directory
+# sourceless = false
+
+# version location specification; This defaults
+# to alembic/versions.
When using multiple version +# directories, initial revisions must be specified with --version-path. +# The path separator used here should be the separator specified by "version_path_separator" below. +# version_locations = %(here)s/bar:%(here)s/bat:alembic/versions + +# version path separator; As mentioned above, this is the character used to split +# version_locations. The default within new alembic.ini files is "os", which uses os.pathsep. +# If this key is omitted entirely, it falls back to the legacy behavior of splitting on spaces and/or commas. +# Valid values for version_path_separator are: +# +# version_path_separator = : +# version_path_separator = ; +# version_path_separator = space +version_path_separator = os # Use os.pathsep. Default configuration used for new projects. + +# set to 'true' to search source files recursively +# in each "version_locations" directory +# new in Alembic version 1.10 +# recursive_version_locations = false + +# the output encoding used when revision files +# are written from script.py.mako +# output_encoding = utf-8 + +sqlalchemy.url = sqlite+aiosqlite:///./app.db + + +[post_write_hooks] +# post_write_hooks defines scripts or Python functions that are run +# on newly generated revision scripts. 
See the documentation for further +# detail and examples + +# format using "black" - use the console_scripts runner, against the "black" entrypoint +# hooks = black +# black.type = console_scripts +# black.entrypoint = black +# black.options = -l 79 REVISION_SCRIPT_FILENAME + +# lint with attempts to fix using "ruff" - use the exec runner, execute a binary +# hooks = ruff +# ruff.type = exec +# ruff.executable = %(here)s/.venv/bin/ruff +# ruff.options = --fix REVISION_SCRIPT_FILENAME + +# Logging configuration +[loggers] +keys = root,sqlalchemy,alembic + +[handlers] +keys = console + +[formatters] +keys = generic + +[logger_root] +level = WARN +handlers = console +qualname = + +[logger_sqlalchemy] +level = WARN +handlers = +qualname = sqlalchemy.engine + +[logger_alembic] +level = INFO +handlers = +qualname = alembic + +[handler_console] +class = StreamHandler +args = (sys.stderr,) +level = NOTSET +formatter = generic + +[formatter_generic] +format = %(levelname)-5.5s [%(name)s] %(message)s +datefmt = %H:%M:%S diff --git a/backend/alembic/env.py b/backend/alembic/env.py new file mode 100644 index 0000000..7ebf571 --- /dev/null +++ b/backend/alembic/env.py @@ -0,0 +1,77 @@ +"""Alembic environment configuration.""" + +import asyncio +from logging.config import fileConfig + +from sqlalchemy import pool +from sqlalchemy.engine import Connection +from sqlalchemy.ext.asyncio import async_engine_from_config + +from alembic import context + +# Import your models here +from app.domain.models import Base +from app.infra.config import get_settings + +# this is the Alembic Config object +config = context.config + +# Interpret the config file for Python logging +if config.config_file_name is not None: + fileConfig(config.config_file_name) + +# Get database URL from settings +settings = get_settings() +config.set_main_option("sqlalchemy.url", settings.database_url) + +# Add your model's MetaData object here for 'autogenerate' support +target_metadata = Base.metadata + + +def 
run_migrations_offline() -> None: + """Run migrations in 'offline' mode.""" + url = config.get_main_option("sqlalchemy.url") + context.configure( + url=url, + target_metadata=target_metadata, + literal_binds=True, + dialect_opts={"paramstyle": "named"}, + ) + + with context.begin_transaction(): + context.run_migrations() + + +def do_run_migrations(connection: Connection) -> None: + """Run migrations with connection.""" + context.configure(connection=connection, target_metadata=target_metadata) + + with context.begin_transaction(): + context.run_migrations() + + +async def run_async_migrations() -> None: + """Run migrations in async mode.""" + configuration = config.get_section(config.config_ini_section) + configuration["sqlalchemy.url"] = settings.database_url + connectable = async_engine_from_config( + configuration, + prefix="sqlalchemy.", + poolclass=pool.NullPool, + ) + + async with connectable.connect() as connection: + await connection.run_sync(do_run_migrations) + + await connectable.dispose() + + +def run_migrations_online() -> None: + """Run migrations in 'online' mode.""" + asyncio.run(run_async_migrations()) + + +if context.is_offline_mode(): + run_migrations_offline() +else: + run_migrations_online() diff --git a/backend/alembic/script.py.mako b/backend/alembic/script.py.mako new file mode 100644 index 0000000..fbc4b07 --- /dev/null +++ b/backend/alembic/script.py.mako @@ -0,0 +1,26 @@ +"""${message} + +Revision ID: ${up_revision} +Revises: ${down_revision | comma,n} +Create Date: ${create_date} + +""" +from typing import Sequence, Union + +from alembic import op +import sqlalchemy as sa +${imports if imports else ""} + +# revision identifiers, used by Alembic. 
+revision: str = ${repr(up_revision)} +down_revision: Union[str, None] = ${repr(down_revision)} +branch_labels: Union[str, Sequence[str], None] = ${repr(branch_labels)} +depends_on: Union[str, Sequence[str], None] = ${repr(depends_on)} + + +def upgrade() -> None: + ${upgrades if upgrades else "pass"} + + +def downgrade() -> None: + ${downgrades if downgrades else "pass"} diff --git a/backend/pyproject.toml b/backend/pyproject.toml new file mode 100644 index 0000000..e6c8fef --- /dev/null +++ b/backend/pyproject.toml @@ -0,0 +1,52 @@ +[tool.poetry] +name = "itcloud-backend" +version = "0.1.0" +description = "Cloud photo and video storage backend" +authors = ["ITCloud Team"] +readme = "README.md" + +[tool.poetry.dependencies] +python = "^3.11" +fastapi = "^0.109.0" +uvicorn = {extras = ["standard"], version = "^0.27.0"} +sqlalchemy = {extras = ["asyncio"], version = "^2.0.25"} +alembic = "^1.13.1" +pydantic = "^2.5.3" +pydantic-settings = "^2.1.0" +python-jose = {extras = ["cryptography"], version = "^3.3.0"} +passlib = {extras = ["bcrypt"], version = "^1.7.4"} +python-multipart = "^0.0.6" +aiosqlite = "^0.19.0" +asyncpg = "^0.29.0" +boto3 = "^1.34.34" +aioboto3 = "^12.3.0" +redis = "^5.0.1" +rq = "^1.16.1" +pillow = "^10.2.0" +python-magic = "^0.4.27" +loguru = "^0.7.2" +httpx = "^0.26.0" + +[tool.poetry.group.dev.dependencies] +pytest = "^7.4.4" +pytest-asyncio = "^0.23.3" +pytest-cov = "^4.1.0" +ruff = "^0.1.14" +black = "^24.1.1" +mypy = "^1.8.0" + +[build-system] +requires = ["poetry-core"] +build-backend = "poetry.core.masonry.api" + +[tool.ruff] +line-length = 100 +target-version = "py311" + +[tool.black] +line-length = 100 +target-version = ["py311"] + +[tool.pytest.ini_options] +asyncio_mode = "auto" +testpaths = ["tests"] diff --git a/backend/src/app/__init__.py b/backend/src/app/__init__.py new file mode 100644 index 0000000..01105e3 --- /dev/null +++ b/backend/src/app/__init__.py @@ -0,0 +1 @@ +"""ITCloud backend application.""" diff --git 
a/backend/src/app/api/__init__.py b/backend/src/app/api/__init__.py new file mode 100644 index 0000000..5651e3a --- /dev/null +++ b/backend/src/app/api/__init__.py @@ -0,0 +1 @@ +"""API layer.""" diff --git a/backend/src/app/api/dependencies.py b/backend/src/app/api/dependencies.py new file mode 100644 index 0000000..10ccfaf --- /dev/null +++ b/backend/src/app/api/dependencies.py @@ -0,0 +1,66 @@ +"""API dependencies for dependency injection.""" + +from typing import Annotated + +from fastapi import Depends, HTTPException, status +from fastapi.security import HTTPAuthorizationCredentials, HTTPBearer +from sqlalchemy.ext.asyncio import AsyncSession + +from app.domain.models import User +from app.infra.database import get_db +from app.infra.s3_client import S3Client, get_s3_client +from app.infra.security import decode_token +from app.repositories.user_repository import UserRepository + +security = HTTPBearer() + + +async def get_current_user( + credentials: Annotated[HTTPAuthorizationCredentials, Depends(security)], + session: Annotated[AsyncSession, Depends(get_db)], +) -> User: + """ + Get current authenticated user from JWT token. 
+ + Args: + credentials: HTTP authorization credentials + session: Database session + + Returns: + Current user + + Raises: + HTTPException: If token is invalid or user not found + """ + token = credentials.credentials + payload = decode_token(token) + + if not payload or payload.get("type") != "access": + raise HTTPException( + status_code=status.HTTP_401_UNAUTHORIZED, + detail="Invalid authentication credentials", + ) + + user_id = payload.get("sub") + if not user_id: + raise HTTPException( + status_code=status.HTTP_401_UNAUTHORIZED, + detail="Invalid authentication credentials", + ) + + user_repo = UserRepository(session) + user = await user_repo.get_by_id(user_id) + + if not user or not user.is_active: + raise HTTPException( + status_code=status.HTTP_401_UNAUTHORIZED, + detail="User not found or inactive", + ) + + return user + + +# Type aliases for dependency injection +CurrentUser = Annotated[User, Depends(get_current_user)] +DatabaseSession = Annotated[AsyncSession, Depends(get_db)] +S3ClientDep = Annotated[S3Client, Depends(get_s3_client)] diff --git a/backend/src/app/api/schemas.py b/backend/src/app/api/schemas.py new file mode 100644 index 0000000..4b9c942 --- /dev/null +++ b/backend/src/app/api/schemas.py @@ -0,0 +1,144 @@ +"""Pydantic schemas for API requests and responses.""" + +from datetime import datetime +from typing import Optional + +from pydantic import BaseModel, EmailStr, Field + +from app.domain.models import AssetStatus, AssetType + + +# Auth schemas +class UserRegister(BaseModel): + """User registration request.""" + + email: EmailStr + password: str = Field(min_length=8, max_length=100) + + +class UserLogin(BaseModel): + """User login request.""" + + email: EmailStr + password: str + + +class Token(BaseModel): + """JWT token response.""" + + access_token: str + refresh_token: str + token_type: str = "bearer" + + +class UserResponse(BaseModel): + """User information response.""" + + id: str + email: str + is_active: bool + created_at: 
datetime + + model_config = {"from_attributes": True} + + +# Asset schemas +class AssetResponse(BaseModel): + """Asset information response.""" + + id: str + user_id: str + type: AssetType + status: AssetStatus + original_filename: str + content_type: str + size_bytes: int + sha256: Optional[str] = None + captured_at: Optional[datetime] = None + width: Optional[int] = None + height: Optional[int] = None + duration_sec: Optional[float] = None + storage_key_original: str + storage_key_thumb: Optional[str] = None + created_at: datetime + deleted_at: Optional[datetime] = None + + model_config = {"from_attributes": True} + + +class AssetListResponse(BaseModel): + """Paginated list of assets.""" + + items: list[AssetResponse] + next_cursor: Optional[str] = None + has_more: bool + + +class CreateUploadRequest(BaseModel): + """Request to create an upload.""" + + original_filename: str = Field(max_length=512) + content_type: str = Field(max_length=100) + size_bytes: int = Field(gt=0) + + +class CreateUploadResponse(BaseModel): + """Response with upload credentials.""" + + asset_id: str + upload_url: str + upload_method: str = "presigned_post" + fields: Optional[dict] = None + + +class FinalizeUploadRequest(BaseModel): + """Request to finalize an upload.""" + + etag: Optional[str] = None + sha256: Optional[str] = Field(None, max_length=64) + + +# Download URLs +class DownloadUrlResponse(BaseModel): + """Pre-signed download URL.""" + + url: str + expires_in: int + + +# Share schemas +class CreateShareRequest(BaseModel): + """Request to create a share link.""" + + asset_id: Optional[str] = None + album_id: Optional[str] = None + expires_in_seconds: Optional[int] = Field(None, gt=0) + password: Optional[str] = None + + +class ShareResponse(BaseModel): + """Share link information.""" + + id: str + owner_user_id: str + asset_id: Optional[str] = None + album_id: Optional[str] = None + token: str + expires_at: Optional[datetime] = None + created_at: datetime + revoked_at: 
Optional[datetime] = None + + model_config = {"from_attributes": True} + + +class ShareAccessRequest(BaseModel): + """Request to access a shared resource.""" + + password: Optional[str] = None + + +# Error response +class ErrorResponse(BaseModel): + """Standard error response.""" + + error: dict[str, str | dict] diff --git a/backend/src/app/api/v1/__init__.py b/backend/src/app/api/v1/__init__.py new file mode 100644 index 0000000..5a151f5 --- /dev/null +++ b/backend/src/app/api/v1/__init__.py @@ -0,0 +1 @@ +"""API v1 routes.""" diff --git a/backend/src/app/api/v1/assets.py b/backend/src/app/api/v1/assets.py new file mode 100644 index 0000000..78c41c5 --- /dev/null +++ b/backend/src/app/api/v1/assets.py @@ -0,0 +1,174 @@ +"""Assets API routes.""" + +from typing import Optional + +from fastapi import APIRouter, Query, status + +from app.api.dependencies import CurrentUser, DatabaseSession, S3ClientDep +from app.api.schemas import AssetListResponse, AssetResponse, DownloadUrlResponse +from app.domain.models import AssetType +from app.infra.config import get_settings +from app.services.asset_service import AssetService + +router = APIRouter(prefix="/assets", tags=["assets"]) +settings = get_settings() + + +@router.get("", response_model=AssetListResponse) +async def list_assets( + current_user: CurrentUser, + session: DatabaseSession, + s3_client: S3ClientDep, + cursor: Optional[str] = Query(None), + limit: int = Query(50, ge=1, le=200), + type: Optional[AssetType] = Query(None), +): + """ + List user's assets with pagination. 
+ + Args: + current_user: Current authenticated user + session: Database session + s3_client: S3 client + cursor: Pagination cursor + limit: Maximum number of results + type: Filter by asset type + + Returns: + Paginated list of assets + """ + asset_service = AssetService(session, s3_client) + assets, next_cursor, has_more = await asset_service.list_assets( + user_id=current_user.id, + limit=limit, + cursor=cursor, + asset_type=type, + ) + + return AssetListResponse( + items=assets, + next_cursor=next_cursor, + has_more=has_more, + ) + + +@router.get("/{asset_id}", response_model=AssetResponse) +async def get_asset( + asset_id: str, + current_user: CurrentUser, + session: DatabaseSession, + s3_client: S3ClientDep, +): + """ + Get asset by ID. + + Args: + asset_id: Asset ID + current_user: Current authenticated user + session: Database session + s3_client: S3 client + + Returns: + Asset information + """ + asset_service = AssetService(session, s3_client) + asset = await asset_service.get_asset(user_id=current_user.id, asset_id=asset_id) + return asset + + +@router.get("/{asset_id}/download-url", response_model=DownloadUrlResponse) +async def get_download_url( + asset_id: str, + current_user: CurrentUser, + session: DatabaseSession, + s3_client: S3ClientDep, + kind: str = Query("original", regex="^(original|thumb)$"), +): + """ + Get pre-signed download URL for an asset. 
+ + Args: + asset_id: Asset ID + current_user: Current authenticated user + session: Database session + s3_client: S3 client + kind: 'original' or 'thumb' + + Returns: + Pre-signed download URL + """ + asset_service = AssetService(session, s3_client) + url = await asset_service.get_download_url( + user_id=current_user.id, + asset_id=asset_id, + kind=kind, + ) + return DownloadUrlResponse(url=url, expires_in=settings.signed_url_ttl_seconds) + + +@router.delete("/{asset_id}", response_model=AssetResponse) +async def delete_asset( + asset_id: str, + current_user: CurrentUser, + session: DatabaseSession, + s3_client: S3ClientDep, +): + """ + Soft delete an asset (move to trash). + + Args: + asset_id: Asset ID + current_user: Current authenticated user + session: Database session + s3_client: S3 client + + Returns: + Updated asset + """ + asset_service = AssetService(session, s3_client) + asset = await asset_service.delete_asset(user_id=current_user.id, asset_id=asset_id) + return asset + + +@router.post("/{asset_id}/restore", response_model=AssetResponse) +async def restore_asset( + asset_id: str, + current_user: CurrentUser, + session: DatabaseSession, + s3_client: S3ClientDep, +): + """ + Restore a soft-deleted asset. + + Args: + asset_id: Asset ID + current_user: Current authenticated user + session: Database session + s3_client: S3 client + + Returns: + Updated asset + """ + asset_service = AssetService(session, s3_client) + asset = await asset_service.restore_asset(user_id=current_user.id, asset_id=asset_id) + return asset + + +@router.delete("/{asset_id}/purge", status_code=status.HTTP_204_NO_CONTENT) +async def purge_asset( + asset_id: str, + current_user: CurrentUser, + session: DatabaseSession, + s3_client: S3ClientDep, +): + """ + Permanently delete an asset. 
+ + Args: + asset_id: Asset ID + current_user: Current authenticated user + session: Database session + s3_client: S3 client + """ + asset_service = AssetService(session, s3_client) + await asset_service.purge_asset(user_id=current_user.id, asset_id=asset_id) diff --git a/backend/src/app/api/v1/auth.py b/backend/src/app/api/v1/auth.py new file mode 100644 index 0000000..7101c2e --- /dev/null +++ b/backend/src/app/api/v1/auth.py @@ -0,0 +1,59 @@ +"""Authentication API routes.""" + +from fastapi import APIRouter, status + +from app.api.dependencies import CurrentUser, DatabaseSession +from app.api.schemas import Token, UserLogin, UserRegister, UserResponse +from app.services.auth_service import AuthService + +router = APIRouter(prefix="/auth", tags=["auth"]) + + +@router.post("/register", response_model=UserResponse, status_code=status.HTTP_201_CREATED) +async def register(data: UserRegister, session: DatabaseSession): + """ + Register a new user. + + Args: + data: Registration data + session: Database session + + Returns: + Created user information + """ + auth_service = AuthService(session) + user = await auth_service.register(email=data.email, password=data.password) + return user + + +@router.post("/login", response_model=Token) +async def login(data: UserLogin, session: DatabaseSession): + """ + Authenticate user and get access tokens. + + Args: + data: Login credentials + session: Database session + + Returns: + Access and refresh tokens + """ + auth_service = AuthService(session) + access_token, refresh_token = await auth_service.login( + email=data.email, password=data.password + ) + return Token(access_token=access_token, refresh_token=refresh_token) + + +@router.get("/me", response_model=UserResponse) +async def get_current_user_info(current_user: CurrentUser): + """ + Get current user information. 
+ + Args: + current_user: Current authenticated user + + Returns: + User information + """ + return current_user diff --git a/backend/src/app/api/v1/shares.py b/backend/src/app/api/v1/shares.py new file mode 100644 index 0000000..677a4ad --- /dev/null +++ b/backend/src/app/api/v1/shares.py @@ -0,0 +1,132 @@ +"""Share API routes.""" + +from typing import Optional + +from fastapi import APIRouter, Query, status + +from app.api.dependencies import CurrentUser, DatabaseSession, S3ClientDep +from app.api.schemas import ( + CreateShareRequest, + DownloadUrlResponse, + ShareAccessRequest, + ShareResponse, +) +from app.infra.config import get_settings +from app.services.share_service import ShareService + +router = APIRouter(prefix="/shares", tags=["shares"]) +settings = get_settings() + + +@router.post("", response_model=ShareResponse, status_code=status.HTTP_201_CREATED) +async def create_share( + data: CreateShareRequest, + current_user: CurrentUser, + session: DatabaseSession, + s3_client: S3ClientDep, +): + """ + Create a share link. + + Args: + data: Share creation data + current_user: Current authenticated user + session: Database session + s3_client: S3 client + + Returns: + Created share information + """ + share_service = ShareService(session, s3_client) + share = await share_service.create_share( + user_id=current_user.id, + asset_id=data.asset_id, + album_id=data.album_id, + expires_in_seconds=data.expires_in_seconds, + password=data.password, + ) + return share + + +@router.get("/{token}", response_model=ShareResponse) +async def get_share( + token: str, + session: DatabaseSession, + s3_client: S3ClientDep, + password: Optional[str] = Query(None), +): + """ + Get share information by token. 
+ + Args: + token: Share token + session: Database session + s3_client: S3 client + password: Optional password for protected shares + + Returns: + Share information + """ + share_service = ShareService(session, s3_client) + share = await share_service.get_share(token=token, password=password) + return share + + +@router.get("/{token}/download-url", response_model=DownloadUrlResponse) +async def get_share_download_url( + token: str, + session: DatabaseSession, + s3_client: S3ClientDep, + asset_id: str = Query(...), + kind: str = Query("original", regex="^(original|thumb)$"), + password: Optional[str] = Query(None), +): + """ + Get download URL for a shared asset. + + Args: + token: Share token + session: Database session + s3_client: S3 client + asset_id: Asset ID + kind: 'original' or 'thumb' + password: Optional password + + Returns: + Pre-signed download URL + """ + share_service = ShareService(session, s3_client) + url = await share_service.get_share_download_url( + token=token, + asset_id=asset_id, + kind=kind, + password=password, + ) + return DownloadUrlResponse(url=url, expires_in=settings.signed_url_ttl_seconds) + + +@router.post("/{token}/revoke", response_model=ShareResponse) +async def revoke_share( + token: str, + current_user: CurrentUser, + session: DatabaseSession, + s3_client: S3ClientDep, +): + """ + Revoke a share link. 
+ + Args: + token: Share token + current_user: Current authenticated user + session: Database session + s3_client: S3 client + + Returns: + Updated share + """ + share_service = ShareService(session, s3_client) + # First get the share to find its ID + share = await share_service.get_share(token=token) + # Then revoke it + share = await share_service.revoke_share(user_id=current_user.id, share_id=share.id) + return share diff --git a/backend/src/app/api/v1/uploads.py b/backend/src/app/api/v1/uploads.py new file mode 100644 index 0000000..1f29717 --- /dev/null +++ b/backend/src/app/api/v1/uploads.py @@ -0,0 +1,80 @@ +"""Upload API routes.""" + +from fastapi import APIRouter, status + +from app.api.dependencies import CurrentUser, DatabaseSession, S3ClientDep +from app.api.schemas import ( + AssetResponse, + CreateUploadRequest, + CreateUploadResponse, + FinalizeUploadRequest, +) +from app.services.asset_service import AssetService + +router = APIRouter(prefix="/uploads", tags=["uploads"]) + + +@router.post("/create", response_model=CreateUploadResponse, status_code=status.HTTP_201_CREATED) +async def create_upload( + data: CreateUploadRequest, + current_user: CurrentUser, + session: DatabaseSession, + s3_client: S3ClientDep, +): + """ + Create an upload and get pre-signed URL. 
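
From the client's side, the `CreateUploadResponse` above is turned into a multipart POST straight to S3 (create → POST → finalize). A minimal sketch of the form-building step — the example response values are invented, and the field names follow what boto3's `generate_presigned_post` returns:

```python
# Client-side helper (not part of the service): build the multipart form for a
# presigned POST from a CreateUploadResponse-shaped payload.
def build_presigned_post_form(upload: dict, file_bytes: bytes,
                              filename: str) -> tuple[str, dict, dict]:
    """Return (url, data, files) suitable for e.g. httpx.post(url, data=data, files=files)."""
    url = upload["upload_url"]
    # Every pre-signed field must be sent back to S3 verbatim...
    data = dict(upload.get("fields") or {})
    # ...and the file itself goes under the key "file", after the fields.
    content_type = data.get("Content-Type", "application/octet-stream")
    files = {"file": (filename, file_bytes, content_type)}
    return url, data, files

example_response = {  # invented values for illustration
    "asset_id": "a1",
    "upload_url": "https://bucket.example/",
    "upload_method": "presigned_post",
    "fields": {"key": "u/u1/o/2025/01/a1.jpg", "Content-Type": "image/jpeg"},
}
url, data, files = build_presigned_post_form(example_response, b"\xff\xd8", "photo.jpg")
```

After the POST succeeds, the client calls `/uploads/{asset_id}/finalize` with the returned ETag so the asset can move from `UPLOADING` to `READY`.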
+ + Args: + data: Upload creation data + current_user: Current authenticated user + session: Database session + s3_client: S3 client + + Returns: + Upload credentials + """ + asset_service = AssetService(session, s3_client) + asset, presigned_post = await asset_service.create_upload( + user_id=current_user.id, + original_filename=data.original_filename, + content_type=data.content_type, + size_bytes=data.size_bytes, + ) + + return CreateUploadResponse( + asset_id=asset.id, + upload_url=presigned_post["url"], + upload_method="presigned_post", + fields=presigned_post["fields"], + ) + + +@router.post("/{asset_id}/finalize", response_model=AssetResponse) +async def finalize_upload( + asset_id: str, + data: FinalizeUploadRequest, + current_user: CurrentUser, + session: DatabaseSession, + s3_client: S3ClientDep, +): + """ + Finalize upload after file is uploaded to S3. + + Args: + asset_id: Asset ID + data: Finalization data + current_user: Current authenticated user + session: Database session + s3_client: S3 client + + Returns: + Updated asset information + """ + asset_service = AssetService(session, s3_client) + asset = await asset_service.finalize_upload( + user_id=current_user.id, + asset_id=asset_id, + etag=data.etag, + sha256=data.sha256, + ) + return asset diff --git a/backend/src/app/domain/__init__.py b/backend/src/app/domain/__init__.py new file mode 100644 index 0000000..520ac1f --- /dev/null +++ b/backend/src/app/domain/__init__.py @@ -0,0 +1 @@ +"""Domain layer.""" diff --git a/backend/src/app/domain/models.py b/backend/src/app/domain/models.py new file mode 100644 index 0000000..ed8c3e3 --- /dev/null +++ b/backend/src/app/domain/models.py @@ -0,0 +1,106 @@ +"""Domain models for the application.""" + +import enum +from datetime import datetime +from typing import Optional +from uuid import uuid4 + +from sqlalchemy import Boolean, DateTime, Enum, Float, Integer, String, Text, func +from sqlalchemy.orm import Mapped, mapped_column + +from app.infra.database 
import Base + + +def generate_uuid() -> str: + """Generate UUID as string for SQLite compatibility.""" + return str(uuid4()) + + +class AssetType(str, enum.Enum): + """Type of media asset.""" + + PHOTO = "photo" + VIDEO = "video" + + +class AssetStatus(str, enum.Enum): + """Status of media asset.""" + + UPLOADING = "uploading" + READY = "ready" + FAILED = "failed" + DELETED = "deleted" + + +class User(Base): + """User account.""" + + __tablename__ = "users" + + id: Mapped[str] = mapped_column(String(36), primary_key=True, default=generate_uuid) + email: Mapped[str] = mapped_column(String(255), unique=True, nullable=False, index=True) + password_hash: Mapped[str] = mapped_column(String(255), nullable=False) + is_active: Mapped[bool] = mapped_column(Boolean, default=True, nullable=False) + created_at: Mapped[datetime] = mapped_column( + DateTime(timezone=True), server_default=func.now(), nullable=False + ) + updated_at: Mapped[datetime] = mapped_column( + DateTime(timezone=True), server_default=func.now(), onupdate=func.now(), nullable=False + ) + + +class Asset(Base): + """Media asset (photo or video).""" + + __tablename__ = "assets" + + id: Mapped[str] = mapped_column(String(36), primary_key=True, default=generate_uuid) + user_id: Mapped[str] = mapped_column(String(36), nullable=False, index=True) + type: Mapped[AssetType] = mapped_column(Enum(AssetType), nullable=False) + status: Mapped[AssetStatus] = mapped_column( + Enum(AssetStatus), default=AssetStatus.UPLOADING, nullable=False, index=True + ) + original_filename: Mapped[str] = mapped_column(String(512), nullable=False) + content_type: Mapped[str] = mapped_column(String(100), nullable=False) + size_bytes: Mapped[int] = mapped_column(Integer, nullable=False) + sha256: Mapped[Optional[str]] = mapped_column(String(64), nullable=True) + + # Metadata + captured_at: Mapped[Optional[datetime]] = mapped_column(DateTime(timezone=True), nullable=True) + width: Mapped[Optional[int]] = mapped_column(Integer, 
nullable=True) + height: Mapped[Optional[int]] = mapped_column(Integer, nullable=True) + duration_sec: Mapped[Optional[float]] = mapped_column(Float, nullable=True) + + # Storage + storage_key_original: Mapped[str] = mapped_column(String(512), nullable=False) + storage_key_thumb: Mapped[Optional[str]] = mapped_column(String(512), nullable=True) + + # Timestamps + created_at: Mapped[datetime] = mapped_column( + DateTime(timezone=True), server_default=func.now(), nullable=False, index=True + ) + deleted_at: Mapped[Optional[datetime]] = mapped_column( + DateTime(timezone=True), nullable=True, index=True + ) + + +class Share(Base): + """Public share link for assets or albums.""" + + __tablename__ = "shares" + + id: Mapped[str] = mapped_column(String(36), primary_key=True, default=generate_uuid) + owner_user_id: Mapped[str] = mapped_column(String(36), nullable=False, index=True) + asset_id: Mapped[Optional[str]] = mapped_column(String(36), nullable=True, index=True) + album_id: Mapped[Optional[str]] = mapped_column(String(36), nullable=True, index=True) + + token: Mapped[str] = mapped_column(String(64), unique=True, nullable=False, index=True) + expires_at: Mapped[Optional[datetime]] = mapped_column(DateTime(timezone=True), nullable=True) + password_hash: Mapped[Optional[str]] = mapped_column(String(255), nullable=True) + + created_at: Mapped[datetime] = mapped_column( + DateTime(timezone=True), server_default=func.now(), nullable=False + ) + revoked_at: Mapped[Optional[datetime]] = mapped_column( + DateTime(timezone=True), nullable=True, index=True + ) diff --git a/backend/src/app/infra/__init__.py b/backend/src/app/infra/__init__.py new file mode 100644 index 0000000..a831abd --- /dev/null +++ b/backend/src/app/infra/__init__.py @@ -0,0 +1 @@ +"""Infrastructure layer.""" diff --git a/backend/src/app/infra/config.py b/backend/src/app/infra/config.py new file mode 100644 index 0000000..2f8afc4 --- /dev/null +++ b/backend/src/app/infra/config.py @@ -0,0 +1,64 @@ 
+"""Application configuration management.""" + +from functools import lru_cache +from typing import Literal + +from pydantic import Field +from pydantic_settings import BaseSettings, SettingsConfigDict + + +class Settings(BaseSettings): + """Application settings loaded from environment variables.""" + + model_config = SettingsConfigDict( + env_file=".env", + env_file_encoding="utf-8", + case_sensitive=False, + extra="ignore", + ) + + # Application + app_env: Literal["dev", "prod"] = "dev" + app_host: str = "0.0.0.0" + app_port: int = 8000 + + # Database + database_url: str = "sqlite+aiosqlite:///./app.db" + + # S3 Storage + s3_endpoint_url: str | None = None + s3_region: str = "us-east-1" + s3_access_key_id: str + s3_secret_access_key: str + media_bucket: str = "itcloud-media" + + # Security + jwt_secret: str + jwt_algorithm: str = "HS256" + jwt_access_ttl_seconds: int = 900 + jwt_refresh_ttl_seconds: int = 1209600 + + # Upload limits + max_upload_size_bytes: int = 21474836480 # 20GB + signed_url_ttl_seconds: int = 600 + + # CORS + cors_origins: str = "http://localhost:5173" + + @property + def cors_origins_list(self) -> list[str]: + """Parse CORS origins as a list.""" + return [origin.strip() for origin in self.cors_origins.split(",")] + + # Redis + redis_url: str = "redis://localhost:6379/0" + + # Thumbnails + thumbnail_max_size: int = 1024 + thumbnail_quality: int = 85 + + +@lru_cache +def get_settings() -> Settings: + """Get cached application settings.""" + return Settings() diff --git a/backend/src/app/infra/database.py b/backend/src/app/infra/database.py new file mode 100644 index 0000000..7ac44b1 --- /dev/null +++ b/backend/src/app/infra/database.py @@ -0,0 +1,57 @@ +"""Database session management.""" + +from collections.abc import AsyncGenerator +from typing import AsyncIterator + +from sqlalchemy.ext.asyncio import AsyncSession, async_sessionmaker, create_async_engine +from sqlalchemy.orm import DeclarativeBase + +from app.infra.config import get_settings 
+ +settings = get_settings() + +# Create async engine +engine = create_async_engine( + settings.database_url, + echo=settings.app_env == "dev", + future=True, +) + +# Create session factory +AsyncSessionLocal = async_sessionmaker( + engine, + class_=AsyncSession, + expire_on_commit=False, + autocommit=False, + autoflush=False, +) + + +class Base(DeclarativeBase): + """Base class for all database models.""" + + pass + + +async def get_db() -> AsyncGenerator[AsyncSession, None]: + """ + Dependency that provides database session. + + Yields: + AsyncSession: Database session + """ + async with AsyncSessionLocal() as session: + try: + yield session + await session.commit() + except Exception: + await session.rollback() + raise + finally: + await session.close() + + +async def init_db() -> None: + """Initialize database tables.""" + async with engine.begin() as conn: + await conn.run_sync(Base.metadata.create_all) diff --git a/backend/src/app/infra/s3_client.py b/backend/src/app/infra/s3_client.py new file mode 100644 index 0000000..ecc47c1 --- /dev/null +++ b/backend/src/app/infra/s3_client.py @@ -0,0 +1,148 @@ +"""S3 client for file storage operations.""" + +from datetime import datetime +from typing import Optional + +import boto3 +from botocore.config import Config +from botocore.exceptions import ClientError + +from app.infra.config import get_settings + +settings = get_settings() + + +class S3Client: + """Client for S3 storage operations.""" + + def __init__(self): + """Initialize S3 client.""" + self.client = boto3.client( + "s3", + endpoint_url=settings.s3_endpoint_url, + region_name=settings.s3_region, + aws_access_key_id=settings.s3_access_key_id, + aws_secret_access_key=settings.s3_secret_access_key, + config=Config(signature_version="s3v4"), + ) + self.bucket = settings.media_bucket + + def generate_storage_key( + self, user_id: str, asset_id: str, prefix: str, extension: str + ) -> str: + """ + Generate S3 storage key for an asset. 
+ + Args: + user_id: User ID + asset_id: Asset ID + prefix: Key prefix (o for original, t for thumbnail, p for poster) + extension: File extension + + Returns: + Storage key + """ + now = datetime.utcnow() + year = now.strftime("%Y") + month = now.strftime("%m") + return f"u/{user_id}/{prefix}/{year}/{month}/{asset_id}{extension}" + + def generate_presigned_post( + self, storage_key: str, content_type: str, max_size: int + ) -> dict: + """ + Generate pre-signed POST data for direct upload. + + Args: + storage_key: S3 object key + content_type: File content type + max_size: Maximum file size in bytes + + Returns: + Dictionary with 'url' and 'fields' for POST request + """ + conditions = [ + {"Content-Type": content_type}, + ["content-length-range", 1, max_size], + ] + + presigned_post = self.client.generate_presigned_post( + Bucket=self.bucket, + Key=storage_key, + Fields={"Content-Type": content_type}, + Conditions=conditions, + ExpiresIn=settings.signed_url_ttl_seconds, + ) + return presigned_post + + def generate_presigned_url( + self, storage_key: str, expires_in: Optional[int] = None + ) -> str: + """ + Generate pre-signed URL for download. + + Args: + storage_key: S3 object key + expires_in: Expiration time in seconds + + Returns: + Pre-signed URL + """ + if expires_in is None: + expires_in = settings.signed_url_ttl_seconds + + url = self.client.generate_presigned_url( + "get_object", + Params={"Bucket": self.bucket, "Key": storage_key}, + ExpiresIn=expires_in, + ) + return url + + def upload_file(self, file_path: str, storage_key: str, content_type: str) -> None: + """ + Upload a file to S3. + + Args: + file_path: Local file path + storage_key: S3 object key + content_type: File content type + """ + self.client.upload_file( + file_path, + self.bucket, + storage_key, + ExtraArgs={"ContentType": content_type}, + ) + + def delete_object(self, storage_key: str) -> None: + """ + Delete an object from S3. 
+ + Args: + storage_key: S3 object key + """ + try: + self.client.delete_object(Bucket=self.bucket, Key=storage_key) + except ClientError: + pass + + def object_exists(self, storage_key: str) -> bool: + """ + Check if an object exists in S3. + + Args: + storage_key: S3 object key + + Returns: + True if object exists, False otherwise + """ + try: + self.client.head_object(Bucket=self.bucket, Key=storage_key) + return True + except ClientError: + return False + + +def get_s3_client() -> S3Client: + """Get S3 client instance.""" + return S3Client() diff --git a/backend/src/app/infra/security.py b/backend/src/app/infra/security.py new file mode 100644 index 0000000..d354271 --- /dev/null +++ b/backend/src/app/infra/security.py @@ -0,0 +1,97 @@ +"""Security utilities for authentication and authorization.""" + +from datetime import datetime, timedelta +from typing import Optional + +from jose import JWTError, jwt +from passlib.context import CryptContext + +from app.infra.config import get_settings + +settings = get_settings() + +# Password hashing context +pwd_context = CryptContext(schemes=["bcrypt"], deprecated="auto") + + +def hash_password(password: str) -> str: + """ + Hash a password using bcrypt. + + Args: + password: Plain text password + + Returns: + Hashed password + """ + return pwd_context.hash(password) + + +def verify_password(plain_password: str, hashed_password: str) -> bool: + """ + Verify a password against its hash. + + Args: + plain_password: Plain text password + hashed_password: Hashed password to verify against + + Returns: + True if password matches, False otherwise + """ + return pwd_context.verify(plain_password, hashed_password) + + +def create_access_token(data: dict, expires_delta: Optional[timedelta] = None) -> str: + """ + Create a JWT access token. 
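
To demystify what `jwt.encode(..., algorithm="HS256")` produces, here is a stdlib-only sketch of the token format (header`.`payload`.`signature, each base64url-encoded, signed with HMAC-SHA256). This is for illustration only — it is not a replacement for python-jose and performs no validation:

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    """Base64url without padding, as used by JWTs."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def encode_hs256(payload: dict, secret: str) -> str:
    # header.payload.signature, each segment base64url-encoded
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"},
                               separators=(",", ":")).encode())
    body = b64url(json.dumps(payload, separators=(",", ":")).encode())
    signing_input = f"{header}.{body}".encode()
    sig = hmac.new(secret.encode(), signing_input, hashlib.sha256).digest()
    return f"{header}.{body}.{b64url(sig)}"

token = encode_hs256({"sub": "user-1", "type": "access", "exp": 1700000000}, "secret")
```

The `type` claim (`"access"` vs `"refresh"`) set by the functions above is what lets `decode_token` callers reject a refresh token presented where an access token is expected.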
+ + Args: + data: Data to encode in the token + expires_delta: Optional expiration time delta + + Returns: + Encoded JWT token + """ + to_encode = data.copy() + if expires_delta: + expire = datetime.utcnow() + expires_delta + else: + expire = datetime.utcnow() + timedelta(seconds=settings.jwt_access_ttl_seconds) + + to_encode.update({"exp": expire, "type": "access"}) + encoded_jwt = jwt.encode(to_encode, settings.jwt_secret, algorithm=settings.jwt_algorithm) + return encoded_jwt + + +def create_refresh_token(data: dict) -> str: + """ + Create a JWT refresh token. + + Args: + data: Data to encode in the token + + Returns: + Encoded JWT token + """ + to_encode = data.copy() + expire = datetime.utcnow() + timedelta(seconds=settings.jwt_refresh_ttl_seconds) + to_encode.update({"exp": expire, "type": "refresh"}) + encoded_jwt = jwt.encode(to_encode, settings.jwt_secret, algorithm=settings.jwt_algorithm) + return encoded_jwt + + +def decode_token(token: str) -> Optional[dict]: + """ + Decode and verify a JWT token. 
+ + Args: + token: JWT token to decode + + Returns: + Decoded token payload or None if invalid + """ + try: + payload = jwt.decode(token, settings.jwt_secret, algorithms=[settings.jwt_algorithm]) + return payload + except JWTError: + return None diff --git a/backend/src/app/main.py b/backend/src/app/main.py new file mode 100644 index 0000000..5ce30e8 --- /dev/null +++ b/backend/src/app/main.py @@ -0,0 +1,57 @@ +"""Main FastAPI application.""" + +from contextlib import asynccontextmanager + +from fastapi import FastAPI +from fastapi.middleware.cors import CORSMiddleware + +from app.api.v1 import assets, auth, shares, uploads +from app.infra.config import get_settings +from app.infra.database import init_db + +settings = get_settings() + + +@asynccontextmanager +async def lifespan(app: FastAPI): + """Application lifespan handler.""" + # Startup + await init_db() + yield + # Shutdown + pass + + +app = FastAPI( + title="ITCloud API", + description="Cloud photo and video storage API", + version="0.1.0", + lifespan=lifespan, +) + +# CORS middleware +app.add_middleware( + CORSMiddleware, + allow_origins=settings.cors_origins_list, + allow_credentials=True, + allow_methods=["*"], + allow_headers=["*"], +) + +# Include routers +app.include_router(auth.router, prefix="/api/v1") +app.include_router(uploads.router, prefix="/api/v1") +app.include_router(assets.router, prefix="/api/v1") +app.include_router(shares.router, prefix="/api/v1") + + +@app.get("/health") +async def health_check(): + """Health check endpoint.""" + return {"status": "ok"} + + +@app.get("/") +async def root(): + """Root endpoint.""" + return {"message": "ITCloud API", "version": "0.1.0"} diff --git a/backend/src/app/repositories/__init__.py b/backend/src/app/repositories/__init__.py new file mode 100644 index 0000000..89858b9 --- /dev/null +++ b/backend/src/app/repositories/__init__.py @@ -0,0 +1 @@ +"""Repositories layer.""" diff --git a/backend/src/app/repositories/asset_repository.py 
b/backend/src/app/repositories/asset_repository.py new file mode 100644 index 0000000..29885b0 --- /dev/null +++ b/backend/src/app/repositories/asset_repository.py @@ -0,0 +1,163 @@ +"""Asset repository for database operations.""" + +from datetime import datetime +from typing import Optional + +from sqlalchemy import desc, select +from sqlalchemy.ext.asyncio import AsyncSession + +from app.domain.models import Asset, AssetStatus, AssetType + + +class AssetRepository: + """Repository for asset database operations.""" + + def __init__(self, session: AsyncSession): + """ + Initialize asset repository. + + Args: + session: Database session + """ + self.session = session + + async def create( + self, + user_id: str, + asset_type: AssetType, + original_filename: str, + content_type: str, + size_bytes: int, + storage_key_original: str, + ) -> Asset: + """ + Create a new asset. + + Args: + user_id: Owner user ID + asset_type: Type of asset (photo/video) + original_filename: Original filename + content_type: MIME type + size_bytes: File size in bytes + storage_key_original: S3 storage key + + Returns: + Created asset instance + """ + asset = Asset( + user_id=user_id, + type=asset_type, + status=AssetStatus.UPLOADING, + original_filename=original_filename, + content_type=content_type, + size_bytes=size_bytes, + storage_key_original=storage_key_original, + ) + self.session.add(asset) + await self.session.flush() + await self.session.refresh(asset) + return asset + + async def get_by_id(self, asset_id: str) -> Optional[Asset]: + """ + Get asset by ID. 
+ + Args: + asset_id: Asset ID + + Returns: + Asset instance or None if not found + """ + result = await self.session.execute(select(Asset).where(Asset.id == asset_id)) + return result.scalar_one_or_none() + + async def list_by_user( + self, + user_id: str, + limit: int = 50, + cursor: Optional[str] = None, + asset_type: Optional[AssetType] = None, + include_deleted: bool = False, + ) -> list[Asset]: + """ + List assets for a user. + + Args: + user_id: User ID + limit: Maximum number of results + cursor: Pagination cursor (asset_id) + asset_type: Filter by asset type + include_deleted: Include soft-deleted assets + + Returns: + List of assets + """ + query = select(Asset).where(Asset.user_id == user_id) + + if not include_deleted: + query = query.where(Asset.deleted_at.is_(None)) + + if asset_type: + query = query.where(Asset.type == asset_type) + + if cursor: + cursor_asset = await self.get_by_id(cursor) + if cursor_asset: + query = query.where(Asset.created_at < cursor_asset.created_at) + + query = query.order_by(desc(Asset.created_at)).limit(limit) + + result = await self.session.execute(query) + return list(result.scalars().all()) + + async def update(self, asset: Asset) -> Asset: + """ + Update asset. + + Args: + asset: Asset instance to update + + Returns: + Updated asset instance + """ + await self.session.flush() + await self.session.refresh(asset) + return asset + + async def soft_delete(self, asset: Asset) -> Asset: + """ + Soft delete an asset. + + Args: + asset: Asset to delete + + Returns: + Updated asset + """ + asset.deleted_at = datetime.utcnow() + asset.status = AssetStatus.DELETED + return await self.update(asset) + + async def restore(self, asset: Asset) -> Asset: + """ + Restore a soft-deleted asset. 
+ + Args: + asset: Asset to restore + + Returns: + Updated asset + """ + asset.deleted_at = None + asset.status = AssetStatus.READY + return await self.update(asset) + + async def delete(self, asset: Asset) -> None: + """ + Permanently delete an asset. + + Args: + asset: Asset to delete + """ + await self.session.delete(asset) + await self.session.flush() diff --git a/backend/src/app/repositories/share_repository.py b/backend/src/app/repositories/share_repository.py new file mode 100644 index 0000000..05d8d1b --- /dev/null +++ b/backend/src/app/repositories/share_repository.py @@ -0,0 +1,123 @@ +"""Share repository for database operations.""" + +import secrets +from datetime import datetime, timedelta +from typing import Optional + +from sqlalchemy import select +from sqlalchemy.ext.asyncio import AsyncSession + +from app.domain.models import Share + + +class ShareRepository: + """Repository for share database operations.""" + + def __init__(self, session: AsyncSession): + """ + Initialize share repository. + + Args: + session: Database session + """ + self.session = session + + def _generate_token(self) -> str: + """Generate a secure random token.""" + return secrets.token_urlsafe(32) + + async def create( + self, + owner_user_id: str, + asset_id: Optional[str] = None, + album_id: Optional[str] = None, + expires_in_seconds: Optional[int] = None, + password_hash: Optional[str] = None, + ) -> Share: + """ + Create a new share link. 
+ + Args: + owner_user_id: Owner user ID + asset_id: Optional asset ID + album_id: Optional album ID + expires_in_seconds: Optional expiration time in seconds + password_hash: Optional password hash + + Returns: + Created share instance + """ + token = self._generate_token() + expires_at = None + if expires_in_seconds: + expires_at = datetime.utcnow() + timedelta(seconds=expires_in_seconds) + + share = Share( + owner_user_id=owner_user_id, + asset_id=asset_id, + album_id=album_id, + token=token, + expires_at=expires_at, + password_hash=password_hash, + ) + self.session.add(share) + await self.session.flush() + await self.session.refresh(share) + return share + + async def get_by_id(self, share_id: str) -> Optional[Share]: + """ + Get share by ID. + + Args: + share_id: Share ID + + Returns: + Share instance or None if not found + """ + result = await self.session.execute(select(Share).where(Share.id == share_id)) + return result.scalar_one_or_none() + + async def get_by_token(self, token: str) -> Optional[Share]: + """ + Get share by token. + + Args: + token: Share token + + Returns: + Share instance or None if not found + """ + result = await self.session.execute(select(Share).where(Share.token == token)) + return result.scalar_one_or_none() + + async def revoke(self, share: Share) -> Share: + """ + Revoke a share link. + + Args: + share: Share to revoke + + Returns: + Updated share + """ + share.revoked_at = datetime.utcnow() + await self.session.flush() + await self.session.refresh(share) + return share + + def is_valid(self, share: Share) -> bool: + """ + Check if a share is valid (not revoked or expired). 
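
The two conditions checked by `is_valid` (revoked, then expired) restated as a standalone function over a plain dict — written timezone-aware here, while the repository itself compares naive UTC datetimes:

```python
from datetime import datetime, timedelta, timezone

def share_is_valid(share: dict, now: datetime) -> bool:
    """A share is usable iff it is neither revoked nor past its expiry."""
    if share.get("revoked_at") is not None:
        return False
    expires_at = share.get("expires_at")
    if expires_at is not None and expires_at < now:
        return False
    return True  # expires_at=None means the link never expires

now = datetime(2025, 1, 1, tzinfo=timezone.utc)
live = {"revoked_at": None, "expires_at": now + timedelta(hours=1)}
expired = {"revoked_at": None, "expires_at": now - timedelta(seconds=1)}
revoked = {"revoked_at": now, "expires_at": None}
```

Checking revocation before expiry means a revoked link stays dead even if its expiry is later extended or cleared.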
+ + Args: + share: Share to check + + Returns: + True if valid, False otherwise + """ + if share.revoked_at: + return False + if share.expires_at and share.expires_at < datetime.utcnow(): + return False + return True diff --git a/backend/src/app/repositories/user_repository.py b/backend/src/app/repositories/user_repository.py new file mode 100644 index 0000000..d9e9f96 --- /dev/null +++ b/backend/src/app/repositories/user_repository.py @@ -0,0 +1,78 @@ +"""User repository for database operations.""" + +from typing import Optional + +from sqlalchemy import select +from sqlalchemy.ext.asyncio import AsyncSession + +from app.domain.models import User + + +class UserRepository: + """Repository for user database operations.""" + + def __init__(self, session: AsyncSession): + """ + Initialize user repository. + + Args: + session: Database session + """ + self.session = session + + async def create(self, email: str, password_hash: str) -> User: + """ + Create a new user. + + Args: + email: User email + password_hash: Hashed password + + Returns: + Created user instance + """ + user = User(email=email, password_hash=password_hash) + self.session.add(user) + await self.session.flush() + await self.session.refresh(user) + return user + + async def get_by_id(self, user_id: str) -> Optional[User]: + """ + Get user by ID. + + Args: + user_id: User ID + + Returns: + User instance or None if not found + """ + result = await self.session.execute(select(User).where(User.id == user_id)) + return result.scalar_one_or_none() + + async def get_by_email(self, email: str) -> Optional[User]: + """ + Get user by email. + + Args: + email: User email + + Returns: + User instance or None if not found + """ + result = await self.session.execute(select(User).where(User.email == email)) + return result.scalar_one_or_none() + + async def update(self, user: User) -> User: + """ + Update user. 
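The validity predicate applies two rules in order: an explicit revocation always wins, then a past `expires_at` invalidates the share; absent both, the link is live. A self-contained sketch with a dataclass standing in for the SQLAlchemy `Share` model (the stub's fields are the only two the check reads):

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional


@dataclass
class ShareStub:
    """Stand-in for the ORM Share model — only the fields is_valid reads."""
    revoked_at: Optional[datetime] = None
    expires_at: Optional[datetime] = None


def is_valid(share: ShareStub, now: Optional[datetime] = None) -> bool:
    """Same rules as ShareRepository.is_valid: revocation, then expiry."""
    now = now or datetime.utcnow()
    if share.revoked_at:
        return False
    if share.expires_at and share.expires_at < now:
        return False
    return True
```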
+ + Args: + user: User instance to update + + Returns: + Updated user instance + """ + await self.session.flush() + await self.session.refresh(user) + return user diff --git a/backend/src/app/services/__init__.py b/backend/src/app/services/__init__.py new file mode 100644 index 0000000..d1f42e0 --- /dev/null +++ b/backend/src/app/services/__init__.py @@ -0,0 +1 @@ +"""Services layer.""" diff --git a/backend/src/app/services/asset_service.py b/backend/src/app/services/asset_service.py new file mode 100644 index 0000000..ced2b7a --- /dev/null +++ b/backend/src/app/services/asset_service.py @@ -0,0 +1,282 @@ +"""Asset management service.""" + +import os +from typing import Optional + +from fastapi import HTTPException, status +from sqlalchemy.ext.asyncio import AsyncSession + +from app.domain.models import Asset, AssetStatus, AssetType +from app.infra.s3_client import S3Client +from app.repositories.asset_repository import AssetRepository + + +class AssetService: + """Service for asset management operations.""" + + def __init__(self, session: AsyncSession, s3_client: S3Client): + """ + Initialize asset service. + + Args: + session: Database session + s3_client: S3 client instance + """ + self.asset_repo = AssetRepository(session) + self.s3_client = s3_client + + def _get_asset_type(self, content_type: str) -> AssetType: + """Determine asset type from content type.""" + if content_type.startswith("image/"): + return AssetType.PHOTO + elif content_type.startswith("video/"): + return AssetType.VIDEO + else: + raise HTTPException( + status_code=status.HTTP_400_BAD_REQUEST, + detail="Unsupported content type", + ) + + async def create_upload( + self, + user_id: str, + original_filename: str, + content_type: str, + size_bytes: int, + ) -> tuple[Asset, dict]: + """ + Create an asset and generate pre-signed upload URL. 
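`_get_asset_type` classifies uploads purely by MIME prefix: `image/*` is a photo, `video/*` is a video, and anything else is rejected with HTTP 400. The same decision as a standalone function, with plain strings standing in for the `AssetType` enum and `ValueError` for the `HTTPException`:

```python
def classify_content_type(content_type: str) -> str:
    """Mirror of AssetService._get_asset_type.

    Plain strings stand in for AssetType.PHOTO / AssetType.VIDEO here;
    in the service an unsupported type raises HTTP 400 instead of
    ValueError.
    """
    if content_type.startswith("image/"):
        return "photo"
    if content_type.startswith("video/"):
        return "video"
    raise ValueError(f"Unsupported content type: {content_type}")
```

Matching on the declared `Content-Type` alone is a coarse filter; the pre-signed POST conditions (and, later, the thumbnail worker) are where the actual bytes get validated.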
+ + Args: + user_id: Owner user ID + original_filename: Original filename + content_type: MIME type + size_bytes: File size in bytes + + Returns: + Tuple of (asset, presigned_post_data) + """ + asset_type = self._get_asset_type(content_type) + _, ext = os.path.splitext(original_filename) + + # Create asset record + asset = await self.asset_repo.create( + user_id=user_id, + asset_type=asset_type, + original_filename=original_filename, + content_type=content_type, + size_bytes=size_bytes, + storage_key_original="", # Will be set after upload + ) + + # Generate storage key + storage_key = self.s3_client.generate_storage_key( + user_id=user_id, + asset_id=asset.id, + prefix="o", + extension=ext, + ) + + # Update asset with storage key + asset.storage_key_original = storage_key + await self.asset_repo.update(asset) + + # Generate pre-signed POST + presigned_post = self.s3_client.generate_presigned_post( + storage_key=storage_key, + content_type=content_type, + max_size=size_bytes, + ) + + return asset, presigned_post + + async def finalize_upload( + self, + user_id: str, + asset_id: str, + etag: Optional[str] = None, + sha256: Optional[str] = None, + ) -> Asset: + """ + Finalize upload and mark asset as ready. 
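`create_upload` is a two-phase dance: the asset row is inserted first with an empty `storage_key_original`, because the storage key is derived from the asset's own id, which only exists after the flush; the key is then written back before the pre-signed POST is issued. The real `S3Client.generate_storage_key` is not shown in this diff — the layout below is a plausible sketch, not the actual implementation (scoping keys under the user id keeps per-user listing and cleanup cheap, and the `"o"` / `"t"` prefix separates originals from thumbnails):

```python
def generate_storage_key(
    user_id: str, asset_id: str, prefix: str, extension: str
) -> str:
    """Hypothetical key layout — the real S3Client method is not in this hunk.

    `extension` arrives from os.path.splitext, so it already includes the
    leading dot (or is empty for extension-less filenames).
    """
    return f"{user_id}/{prefix}/{asset_id}{extension}"
```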
+ + Args: + user_id: User ID + asset_id: Asset ID + etag: Optional S3 ETag + sha256: Optional file SHA256 hash + + Returns: + Updated asset + + Raises: + HTTPException: If asset not found or not authorized + """ + asset = await self.asset_repo.get_by_id(asset_id) + if not asset or asset.user_id != user_id: + raise HTTPException( + status_code=status.HTTP_404_NOT_FOUND, + detail="Asset not found", + ) + + # Verify file was uploaded + if not self.s3_client.object_exists(asset.storage_key_original): + raise HTTPException( + status_code=status.HTTP_400_BAD_REQUEST, + detail="File not found in storage", + ) + + asset.status = AssetStatus.READY + if sha256: + asset.sha256 = sha256 + + await self.asset_repo.update(asset) + return asset + + async def list_assets( + self, + user_id: str, + limit: int = 50, + cursor: Optional[str] = None, + asset_type: Optional[AssetType] = None, + ) -> tuple[list[Asset], Optional[str], bool]: + """ + List user's assets. + + Args: + user_id: User ID + limit: Maximum number of results + cursor: Pagination cursor + asset_type: Filter by asset type + + Returns: + Tuple of (assets, next_cursor, has_more) + """ + assets = await self.asset_repo.list_by_user( + user_id=user_id, + limit=limit + 1, # Fetch one more to check if there are more + cursor=cursor, + asset_type=asset_type, + ) + + has_more = len(assets) > limit + if has_more: + assets = assets[:limit] + + next_cursor = assets[-1].id if has_more and assets else None + return assets, next_cursor, has_more + + async def get_asset(self, user_id: str, asset_id: str) -> Asset: + """ + Get asset by ID. 
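`list_assets` uses the standard limit+1 trick for cursor pagination: the repository is asked for one row more than the page size; receiving that extra row proves another page exists, the surplus is trimmed, and the last surviving row's id becomes the opaque cursor for the next request. The trick in isolation (string ids stand in for asset rows):

```python
from typing import Optional, Sequence


def paginate(
    rows: Sequence[str], limit: int
) -> tuple[list[str], Optional[str], bool]:
    """The limit+1 trick used by AssetService.list_assets."""
    page = list(rows[: limit + 1])  # simulates the repository fetch of limit+1
    has_more = len(page) > limit
    if has_more:
        page = page[:limit]  # trim the sentinel row
    next_cursor = page[-1] if has_more and page else None
    return page, next_cursor, has_more
```

Unlike offset pagination, a cursor on the last-seen id stays stable while rows are inserted or soft-deleted between page fetches.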
+ + Args: + user_id: User ID + asset_id: Asset ID + + Returns: + Asset instance + + Raises: + HTTPException: If asset not found or not authorized + """ + asset = await self.asset_repo.get_by_id(asset_id) + if not asset or asset.user_id != user_id: + raise HTTPException( + status_code=status.HTTP_404_NOT_FOUND, + detail="Asset not found", + ) + return asset + + async def get_download_url( + self, user_id: str, asset_id: str, kind: str = "original" + ) -> str: + """ + Get pre-signed download URL for an asset. + + Args: + user_id: User ID + asset_id: Asset ID + kind: 'original' or 'thumb' + + Returns: + Pre-signed download URL + + Raises: + HTTPException: If asset not found or not authorized + """ + asset = await self.get_asset(user_id, asset_id) + + if kind == "thumb": + storage_key = asset.storage_key_thumb + if not storage_key: + raise HTTPException( + status_code=status.HTTP_404_NOT_FOUND, + detail="Thumbnail not available", + ) + else: + storage_key = asset.storage_key_original + + return self.s3_client.generate_presigned_url(storage_key) + + async def delete_asset(self, user_id: str, asset_id: str) -> Asset: + """ + Soft delete an asset. + + Args: + user_id: User ID + asset_id: Asset ID + + Returns: + Updated asset + """ + asset = await self.get_asset(user_id, asset_id) + return await self.asset_repo.soft_delete(asset) + + async def restore_asset(self, user_id: str, asset_id: str) -> Asset: + """ + Restore a soft-deleted asset. + + Args: + user_id: User ID + asset_id: Asset ID + + Returns: + Updated asset + """ + asset = await self.asset_repo.get_by_id(asset_id) + if not asset or asset.user_id != user_id: + raise HTTPException( + status_code=status.HTTP_404_NOT_FOUND, + detail="Asset not found", + ) + return await self.asset_repo.restore(asset) + + async def purge_asset(self, user_id: str, asset_id: str) -> None: + """ + Permanently delete an asset. 
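Deletion is a three-step lifecycle: soft delete stamps `deleted_at` (trash), restore clears it and returns the asset to `READY`, and purge is only legal from the trashed state — `purge_asset` rejects assets that were never soft-deleted. A sketch of the state machine with a dataclass stub (the trashed status value is a guess; `soft_delete`'s body lives in the asset repository, outside this hunk — only `restore` setting `READY` is shown):

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional


@dataclass
class AssetStub:
    """Stand-in for the ORM Asset model — lifecycle fields only."""
    status: str = "ready"
    deleted_at: Optional[datetime] = None


def soft_delete(asset: AssetStub) -> None:
    asset.deleted_at = datetime.utcnow()
    asset.status = "trashed"  # assumed status name; not shown in this hunk


def restore(asset: AssetStub) -> None:
    # Matches AssetRepository.restore: clear the tombstone, back to READY.
    asset.deleted_at = None
    asset.status = "ready"


def can_purge(asset: AssetStub) -> bool:
    # purge_asset returns HTTP 400 unless the asset is already trashed.
    return asset.deleted_at is not None
```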
+ + Args: + user_id: User ID + asset_id: Asset ID + """ + asset = await self.asset_repo.get_by_id(asset_id) + if not asset or asset.user_id != user_id: + raise HTTPException( + status_code=status.HTTP_404_NOT_FOUND, + detail="Asset not found", + ) + + if not asset.deleted_at: + raise HTTPException( + status_code=status.HTTP_400_BAD_REQUEST, + detail="Asset must be deleted before purging", + ) + + # Delete from S3 + self.s3_client.delete_object(asset.storage_key_original) + if asset.storage_key_thumb: + self.s3_client.delete_object(asset.storage_key_thumb) + + # Delete from database + await self.asset_repo.delete(asset) diff --git a/backend/src/app/services/auth_service.py b/backend/src/app/services/auth_service.py new file mode 100644 index 0000000..206a8f5 --- /dev/null +++ b/backend/src/app/services/auth_service.py @@ -0,0 +1,96 @@ +"""Authentication service.""" + +from typing import Optional + +from fastapi import HTTPException, status +from sqlalchemy.ext.asyncio import AsyncSession + +from app.domain.models import User +from app.infra.security import ( + create_access_token, + create_refresh_token, + hash_password, + verify_password, +) +from app.repositories.user_repository import UserRepository + + +class AuthService: + """Service for authentication operations.""" + + def __init__(self, session: AsyncSession): + """ + Initialize auth service. + + Args: + session: Database session + """ + self.user_repo = UserRepository(session) + + async def register(self, email: str, password: str) -> User: + """ + Register a new user. 
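`hash_password` / `verify_password` come from `app.infra.security`, which is not part of this hunk; per CLAUDE.md they should be backed by argon2id or bcrypt (third-party packages). As a stdlib-only illustration of the same shape — random per-user salt, a memory-hard KDF, constant-time comparison — here is a sketch using `hashlib.scrypt`. This is not the project's implementation:

```python
import base64
import hashlib
import hmac
import os


def hash_password(password: str) -> str:
    """Salted scrypt hash encoded as 'salt$digest' (illustrative only)."""
    salt = os.urandom(16)
    digest = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return base64.b64encode(salt).decode() + "$" + base64.b64encode(digest).decode()


def verify_password(password: str, stored: str) -> bool:
    """Recompute with the stored salt and compare in constant time."""
    salt_b64, digest_b64 = stored.split("$", 1)
    salt = base64.b64decode(salt_b64)
    expected = base64.b64decode(digest_b64)
    candidate = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return hmac.compare_digest(candidate, expected)
```

Because the salt is random, hashing the same password twice yields different strings — which is exactly why `login` must verify rather than re-hash and compare.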
+ + Args: + email: User email + password: Plain text password + + Returns: + Created user instance + + Raises: + HTTPException: If email already exists + """ + existing_user = await self.user_repo.get_by_email(email) + if existing_user: + raise HTTPException( + status_code=status.HTTP_400_BAD_REQUEST, + detail="Email already registered", + ) + + password_hash = hash_password(password) + user = await self.user_repo.create(email=email, password_hash=password_hash) + return user + + async def login(self, email: str, password: str) -> tuple[str, str]: + """ + Authenticate user and generate tokens. + + Args: + email: User email + password: Plain text password + + Returns: + Tuple of (access_token, refresh_token) + + Raises: + HTTPException: If credentials are invalid + """ + user = await self.user_repo.get_by_email(email) + if not user or not verify_password(password, user.password_hash): + raise HTTPException( + status_code=status.HTTP_401_UNAUTHORIZED, + detail="Incorrect email or password", + ) + + if not user.is_active: + raise HTTPException( + status_code=status.HTTP_403_FORBIDDEN, + detail="User account is inactive", + ) + + access_token = create_access_token({"sub": user.id}) + refresh_token = create_refresh_token({"sub": user.id}) + return access_token, refresh_token + + async def get_user_by_id(self, user_id: str) -> Optional[User]: + """ + Get user by ID. 
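`create_access_token` / `create_refresh_token` also live in the unshown `app.infra.security`; in a FastAPI stack they are typically thin wrappers over a JWT library such as PyJWT. To make concrete what `{"sub": user.id}` turns into, here is a minimal stdlib HS256 JWT sketch (header.payload.signature, base64url without padding) — an illustration of the format, not the project's code. The secret mirrors the `JWT_SECRET` dev value from docker-compose:

```python
import base64
import hashlib
import hmac
import json
import time

SECRET = b"dev-secret-key-change-in-production"  # dev value from docker-compose


def _b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()


def create_token(claims: dict, ttl_seconds: int) -> str:
    """Minimal HS256 JWT: what a library produces for {'sub': user.id}."""
    header = _b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = _b64url(
        json.dumps({**claims, "exp": int(time.time()) + ttl_seconds}).encode()
    )
    signing_input = f"{header}.{payload}".encode()
    sig = _b64url(hmac.new(SECRET, signing_input, hashlib.sha256).digest())
    return f"{header}.{payload}.{sig}"


def decode_token(token: str) -> dict:
    """Verify signature and expiry, then return the claims."""
    header, payload, sig = token.split(".")
    signing_input = f"{header}.{payload}".encode()
    expected = _b64url(hmac.new(SECRET, signing_input, hashlib.sha256).digest())
    if not hmac.compare_digest(sig, expected):
        raise ValueError("bad signature")
    padded = payload + "=" * (-len(payload) % 4)  # restore stripped padding
    claims = json.loads(base64.urlsafe_b64decode(padded))
    if claims["exp"] < time.time():
        raise ValueError("expired")
    return claims
```

The access/refresh split then reduces to calling `create_token` with two TTLs and, for the refresh token, an extra claim marking its type.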
+ + Args: + user_id: User ID + + Returns: + User instance or None + """ + return await self.user_repo.get_by_id(user_id) diff --git a/backend/src/app/services/share_service.py b/backend/src/app/services/share_service.py new file mode 100644 index 0000000..f59f49f --- /dev/null +++ b/backend/src/app/services/share_service.py @@ -0,0 +1,188 @@ +"""Share management service.""" + +from typing import Optional + +from fastapi import HTTPException, status +from sqlalchemy.ext.asyncio import AsyncSession + +from app.domain.models import Share +from app.infra.s3_client import S3Client +from app.infra.security import hash_password, verify_password +from app.repositories.asset_repository import AssetRepository +from app.repositories.share_repository import ShareRepository + + +class ShareService: + """Service for share management operations.""" + + def __init__(self, session: AsyncSession, s3_client: S3Client): + """ + Initialize share service. + + Args: + session: Database session + s3_client: S3 client instance + """ + self.share_repo = ShareRepository(session) + self.asset_repo = AssetRepository(session) + self.s3_client = s3_client + + async def create_share( + self, + user_id: str, + asset_id: Optional[str] = None, + album_id: Optional[str] = None, + expires_in_seconds: Optional[int] = None, + password: Optional[str] = None, + ) -> Share: + """ + Create a share link. 
+ + Args: + user_id: Owner user ID + asset_id: Optional asset ID + album_id: Optional album ID + expires_in_seconds: Optional expiration time + password: Optional password + + Returns: + Created share instance + + Raises: + HTTPException: If validation fails + """ + if not asset_id and not album_id: + raise HTTPException( + status_code=status.HTTP_400_BAD_REQUEST, + detail="Either asset_id or album_id must be provided", + ) + + # Verify asset ownership if provided + if asset_id: + asset = await self.asset_repo.get_by_id(asset_id) + if not asset or asset.user_id != user_id: + raise HTTPException( + status_code=status.HTTP_404_NOT_FOUND, + detail="Asset not found", + ) + + password_hash = hash_password(password) if password else None + + share = await self.share_repo.create( + owner_user_id=user_id, + asset_id=asset_id, + album_id=album_id, + expires_in_seconds=expires_in_seconds, + password_hash=password_hash, + ) + return share + + async def get_share(self, token: str, password: Optional[str] = None) -> Share: + """ + Get share by token. + + Args: + token: Share token + password: Optional password for protected shares + + Returns: + Share instance + + Raises: + HTTPException: If share not found or invalid + """ + share = await self.share_repo.get_by_token(token) + if not share: + raise HTTPException( + status_code=status.HTTP_404_NOT_FOUND, + detail="Share not found", + ) + + if not self.share_repo.is_valid(share): + raise HTTPException( + status_code=status.HTTP_410_GONE, + detail="Share has expired or been revoked", + ) + + # Check password if required + if share.password_hash: + if not password or not verify_password(password, share.password_hash): + raise HTTPException( + status_code=status.HTTP_401_UNAUTHORIZED, + detail="Invalid password", + ) + + return share + + async def get_share_download_url( + self, + token: str, + asset_id: str, + kind: str = "original", + password: Optional[str] = None, + ) -> str: + """ + Get download URL for a shared asset. 
+ + Args: + token: Share token + asset_id: Asset ID + kind: 'original' or 'thumb' + password: Optional password + + Returns: + Pre-signed download URL + + Raises: + HTTPException: If share invalid or asset not shared + """ + share = await self.get_share(token, password) + + # Verify asset is part of the share + if share.asset_id and share.asset_id != asset_id: + raise HTTPException( + status_code=status.HTTP_403_FORBIDDEN, + detail="Asset not part of this share", + ) + + asset = await self.asset_repo.get_by_id(asset_id) + if not asset: + raise HTTPException( + status_code=status.HTTP_404_NOT_FOUND, + detail="Asset not found", + ) + + if kind == "thumb": + storage_key = asset.storage_key_thumb + if not storage_key: + raise HTTPException( + status_code=status.HTTP_404_NOT_FOUND, + detail="Thumbnail not available", + ) + else: + storage_key = asset.storage_key_original + + return self.s3_client.generate_presigned_url(storage_key) + + async def revoke_share(self, user_id: str, share_id: str) -> Share: + """ + Revoke a share link. 
+ + Args: + user_id: User ID + share_id: Share ID + + Returns: + Updated share + + Raises: + HTTPException: If share not found or not authorized + """ + share = await self.share_repo.get_by_id(share_id) + if not share or share.owner_user_id != user_id: + raise HTTPException( + status_code=status.HTTP_404_NOT_FOUND, + detail="Share not found", + ) + + return await self.share_repo.revoke(share) diff --git a/docker-compose.yml b/docker-compose.yml new file mode 100644 index 0000000..f093b07 --- /dev/null +++ b/docker-compose.yml @@ -0,0 +1,83 @@ +version: '3.8' + +services: + backend: + build: ./backend + ports: + - "8000:8000" + environment: + - APP_ENV=dev + - DATABASE_URL=sqlite+aiosqlite:////app/data/app.db + - S3_ENDPOINT_URL=http://minio:9000 + - S3_REGION=us-east-1 + - S3_ACCESS_KEY_ID=minioadmin + - S3_SECRET_ACCESS_KEY=minioadmin + - MEDIA_BUCKET=itcloud-media + - JWT_SECRET=dev-secret-key-change-in-production + - CORS_ORIGINS=http://localhost:5173,http://localhost:3000 + - REDIS_URL=redis://redis:6379/0 + volumes: + - ./backend/src:/app/src + - backend-data:/app/data + depends_on: + - minio + - redis + command: uvicorn app.main:app --host 0.0.0.0 --port 8000 --reload + + minio: + image: minio/minio:latest + ports: + - "9000:9000" + - "9001:9001" + environment: + - MINIO_ROOT_USER=minioadmin + - MINIO_ROOT_PASSWORD=minioadmin + volumes: + - minio-data:/data + command: server /data --console-address ":9001" + healthcheck: + test: ["CMD", "curl", "-f", "http://localhost:9000/minio/health/live"] + interval: 30s + timeout: 20s + retries: 3 + + minio-setup: + image: minio/mc:latest + depends_on: + - minio + entrypoint: > + /bin/sh -c " + sleep 5; + /usr/bin/mc alias set myminio http://minio:9000 minioadmin minioadmin; + /usr/bin/mc mb myminio/itcloud-media --ignore-existing; + /usr/bin/mc anonymous set none myminio/itcloud-media; + exit 0; + " + + redis: + image: redis:7-alpine + ports: + - "6379:6379" + volumes: + - redis-data:/data + healthcheck: + test: ["CMD", 
"redis-cli", "ping"] + interval: 30s + timeout: 10s + retries: 3 + + frontend: + build: ./frontend + ports: + - "5173:5173" + environment: + - VITE_API_URL=http://localhost:8000 + volumes: + - ./frontend/src:/app/src + - ./frontend/public:/app/public + command: npm run dev -- --host 0.0.0.0 + +volumes: + backend-data: + minio-data: + redis-data: diff --git a/frontend/.gitignore b/frontend/.gitignore new file mode 100644 index 0000000..a547bf3 --- /dev/null +++ b/frontend/.gitignore @@ -0,0 +1,24 @@ +# Logs +logs +*.log +npm-debug.log* +yarn-debug.log* +yarn-error.log* +pnpm-debug.log* +lerna-debug.log* + +node_modules +dist +dist-ssr +*.local + +# Editor directories and files +.vscode/* +!.vscode/extensions.json +.idea +.DS_Store +*.suo +*.ntvs* +*.njsproj +*.sln +*.sw? diff --git a/frontend/Dockerfile b/frontend/Dockerfile new file mode 100644 index 0000000..0985103 --- /dev/null +++ b/frontend/Dockerfile @@ -0,0 +1,17 @@ +FROM node:20-alpine + +WORKDIR /app + +# Copy package files +COPY package*.json ./ + +# Install dependencies +RUN npm install + +# Copy application code +COPY . . + +# Expose port +EXPOSE 5173 + +CMD ["npm", "run", "dev", "--", "--host", "0.0.0.0"] diff --git a/frontend/index.html b/frontend/index.html new file mode 100644 index 0000000..e0cc966 --- /dev/null +++ b/frontend/index.html @@ -0,0 +1,14 @@ + + + + + + + + ITCloud - Облачное хранилище + + +
+ + + diff --git a/frontend/package.json b/frontend/package.json new file mode 100644 index 0000000..9114c95 --- /dev/null +++ b/frontend/package.json @@ -0,0 +1,36 @@ +{ + "name": "itcloud-frontend", + "private": true, + "version": "0.1.0", + "type": "module", + "scripts": { + "dev": "vite", + "build": "tsc && vite build", + "preview": "vite preview", + "lint": "eslint . --ext ts,tsx --report-unused-disable-directives --max-warnings 0" + }, + "dependencies": { + "@emotion/react": "^11.11.3", + "@emotion/styled": "^11.11.0", + "@mui/icons-material": "^5.15.6", + "@mui/material": "^5.15.6", + "axios": "^1.6.5", + "react": "^18.2.0", + "react-dom": "^18.2.0", + "react-dropzone": "^14.2.3", + "react-router-dom": "^6.21.3", + "react-virtuoso": "^4.6.2" + }, + "devDependencies": { + "@types/react": "^18.2.48", + "@types/react-dom": "^18.2.18", + "@typescript-eslint/eslint-plugin": "^6.19.0", + "@typescript-eslint/parser": "^6.19.0", + "@vitejs/plugin-react": "^4.2.1", + "eslint": "^8.56.0", + "eslint-plugin-react-hooks": "^4.6.0", + "eslint-plugin-react-refresh": "^0.4.5", + "typescript": "^5.3.3", + "vite": "^5.0.12" + } +} diff --git a/frontend/public/vite.svg b/frontend/public/vite.svg new file mode 100644 index 0000000..ee9fada --- /dev/null +++ b/frontend/public/vite.svg @@ -0,0 +1 @@ + diff --git a/frontend/src/App.tsx b/frontend/src/App.tsx new file mode 100644 index 0000000..8fcbca2 --- /dev/null +++ b/frontend/src/App.tsx @@ -0,0 +1,47 @@ +import { Routes, Route, Navigate } from 'react-router-dom'; +import { Box } from '@mui/material'; +import LoginPage from './pages/LoginPage'; +import RegisterPage from './pages/RegisterPage'; +import LibraryPage from './pages/LibraryPage'; +import TrashPage from './pages/TrashPage'; +import ShareViewPage from './pages/ShareViewPage'; +import { useAuth } from './hooks/useAuth'; + +function PrivateRoute({ children }: { children: React.ReactNode }) { + const { isAuthenticated, loading } = useAuth(); + + if (loading) { + return 
Loading...; + } + + return isAuthenticated ? <>{children} : ; +} + +function App() { + return ( + + } /> + } /> + } /> + + + + } + /> + + + + } + /> + } /> + + ); +} + +export default App; diff --git a/frontend/src/components/Layout.tsx b/frontend/src/components/Layout.tsx new file mode 100644 index 0000000..bb0a978 --- /dev/null +++ b/frontend/src/components/Layout.tsx @@ -0,0 +1,160 @@ +import { useState } from 'react'; +import { useNavigate, useLocation } from 'react-router-dom'; +import { + AppBar, + Box, + Drawer, + IconButton, + List, + ListItem, + ListItemButton, + ListItemIcon, + ListItemText, + Toolbar, + Typography, + useMediaQuery, + useTheme, +} from '@mui/material'; +import { + Menu as MenuIcon, + CloudUpload as CloudIcon, + PhotoLibrary as LibraryIcon, + Delete as TrashIcon, + Logout as LogoutIcon, +} from '@mui/icons-material'; +import { useAuth } from '../hooks/useAuth'; + +const drawerWidth = 240; + +interface LayoutProps { + children: React.ReactNode; +} + +export default function Layout({ children }: LayoutProps) { + const theme = useTheme(); + const isMobile = useMediaQuery(theme.breakpoints.down('sm')); + const [mobileOpen, setMobileOpen] = useState(false); + const navigate = useNavigate(); + const location = useLocation(); + const { logout } = useAuth(); + + const handleDrawerToggle = () => { + setMobileOpen(!mobileOpen); + }; + + const handleNavigation = (path: string) => { + navigate(path); + if (isMobile) { + setMobileOpen(false); + } + }; + + const handleLogout = () => { + logout(); + navigate('/login'); + }; + + const menuItems = [ + { text: 'Библиотека', icon: , path: '/library' }, + { text: 'Корзина', icon: , path: '/trash' }, + ]; + + const drawer = ( + + + + + ITCloud + + + + {menuItems.map((item) => ( + + handleNavigation(item.path)} + > + {item.icon} + + + + ))} + + + + + + + + + + + ); + + return ( + + + + + + + + {menuItems.find((item) => item.path === location.pathname)?.text || 'ITCloud'} + + + + + + + {drawer} + + + {drawer} + 
+ + + + + {children} + + + ); +} diff --git a/frontend/src/components/MediaCard.tsx b/frontend/src/components/MediaCard.tsx new file mode 100644 index 0000000..f979af9 --- /dev/null +++ b/frontend/src/components/MediaCard.tsx @@ -0,0 +1,193 @@ +import { useState, useEffect } from 'react'; +import { + Card, + CardMedia, + CardActionArea, + Box, + IconButton, + CircularProgress, + Typography, + Checkbox, +} from '@mui/material'; +import { + PlayCircleOutline as VideoIcon, + CheckCircle as CheckedIcon, +} from '@mui/icons-material'; +import type { Asset } from '../types'; +import api from '../services/api'; + +interface MediaCardProps { + asset: Asset; + selected?: boolean; + onSelect?: (assetId: string, selected: boolean) => void; + onClick?: () => void; +} + +export default function MediaCard({ asset, selected, onSelect, onClick }: MediaCardProps) { + const [thumbnailUrl, setThumbnailUrl] = useState(''); + const [loading, setLoading] = useState(true); + const [error, setError] = useState(false); + + useEffect(() => { + loadThumbnail(); + }, [asset.id]); + + const loadThumbnail = async () => { + try { + setLoading(true); + setError(false); + // Try to get thumbnail first, fallback to original for photos + const url = asset.storage_key_thumb + ? await api.getDownloadUrl(asset.id, 'thumb') + : asset.type === 'photo' + ? 
await api.getDownloadUrl(asset.id, 'original') + : ''; + setThumbnailUrl(url); + } catch (err) { + console.error('Failed to load thumbnail:', err); + setError(true); + } finally { + setLoading(false); + } + }; + + const formatFileSize = (bytes: number): string => { + if (bytes < 1024) return bytes + ' B'; + if (bytes < 1024 * 1024) return (bytes / 1024).toFixed(1) + ' KB'; + if (bytes < 1024 * 1024 * 1024) return (bytes / (1024 * 1024)).toFixed(1) + ' MB'; + return (bytes / (1024 * 1024 * 1024)).toFixed(1) + ' GB'; + }; + + const formatDate = (dateString: string): string => { + const date = new Date(dateString); + return date.toLocaleDateString('ru-RU', { + day: 'numeric', + month: 'short', + year: 'numeric', + }); + }; + + const handleSelect = (e: React.MouseEvent) => { + e.stopPropagation(); + if (onSelect) { + onSelect(asset.id, !selected); + } + }; + + return ( + + + {loading && ( + + + + )} + + {error && ( + + Ошибка загрузки + + )} + + {!loading && !error && thumbnailUrl && ( + + )} + + {asset.type === 'video' && ( + + + + )} + + + + {asset.original_filename} + + + {formatFileSize(asset.size_bytes)} • {formatDate(asset.created_at)} + + + + {onSelect && ( + + } + checkedIcon={} + sx={{ + position: 'absolute', + top: 8, + left: 8, + }} + /> + )} + + + ); +} diff --git a/frontend/src/components/UploadDialog.tsx b/frontend/src/components/UploadDialog.tsx new file mode 100644 index 0000000..3244880 --- /dev/null +++ b/frontend/src/components/UploadDialog.tsx @@ -0,0 +1,257 @@ +import { useState, useCallback } from 'react'; +import { useDropzone } from 'react-dropzone'; +import { + Dialog, + DialogTitle, + DialogContent, + DialogActions, + Button, + Box, + Typography, + LinearProgress, + List, + ListItem, + ListItemText, + IconButton, + Alert, +} from '@mui/material'; +import { + CloudUpload as UploadIcon, + Close as CloseIcon, + CheckCircle as SuccessIcon, + Error as ErrorIcon, +} from '@mui/icons-material'; +import api from '../services/api'; + +interface 
UploadFile { + file: File; + progress: number; + status: 'pending' | 'uploading' | 'success' | 'error'; + error?: string; + assetId?: string; +} + +interface UploadDialogProps { + open: boolean; + onClose: () => void; + onComplete?: () => void; +} + +export default function UploadDialog({ open, onClose, onComplete }: UploadDialogProps) { + const [files, setFiles] = useState([]); + const [uploading, setUploading] = useState(false); + + const onDrop = useCallback((acceptedFiles: File[]) => { + const newFiles: UploadFile[] = acceptedFiles.map((file) => ({ + file, + progress: 0, + status: 'pending', + })); + setFiles((prev) => [...prev, ...newFiles]); + }, []); + + const { getRootProps, getInputProps, isDragActive } = useDropzone({ + onDrop, + accept: { + 'image/*': ['.jpg', '.jpeg', '.png', '.gif', '.webp'], + 'video/*': ['.mp4', '.mov', '.avi', '.mkv', '.webm'], + }, + }); + + const updateFileProgress = (index: number, progress: number, status: UploadFile['status'], error?: string, assetId?: string) => { + setFiles((prev) => + prev.map((f, i) => + i === index ? 
{ ...f, progress, status, error, assetId } : f + ) + ); + }; + + const uploadFile = async (file: File, index: number) => { + try { + updateFileProgress(index, 0, 'uploading'); + + // Step 1: Create upload + const uploadData = await api.createUpload({ + original_filename: file.name, + content_type: file.type, + size_bytes: file.size, + }); + + updateFileProgress(index, 33, 'uploading', undefined, uploadData.asset_id); + + // Step 2: Upload to S3 + await api.uploadToS3(uploadData.upload_url, file, uploadData.fields); + + updateFileProgress(index, 66, 'uploading', undefined, uploadData.asset_id); + + // Step 3: Finalize upload + await api.finalizeUpload(uploadData.asset_id); + + updateFileProgress(index, 100, 'success', undefined, uploadData.asset_id); + } catch (error: any) { + console.error('Upload failed:', error); + updateFileProgress( + index, + 0, + 'error', + error.response?.data?.detail || 'Ошибка загрузки' + ); + } + }; + + const handleUpload = async () => { + setUploading(true); + + // Upload files in parallel (max 3 at a time) + const batchSize = 3; + for (let i = 0; i < files.length; i += batchSize) { + const batch = files.slice(i, i + batchSize); + await Promise.all( + batch.map((file, batchIndex) => { + const fileIndex = i + batchIndex; + if (files[fileIndex].status === 'pending') { + return uploadFile(files[fileIndex].file, fileIndex); + } + return Promise.resolve(); + }) + ); + } + + setUploading(false); + if (onComplete) { + onComplete(); + } + }; + + const handleClose = () => { + if (!uploading) { + setFiles([]); + onClose(); + } + }; + + const canUpload = files.length > 0 && files.some((f) => f.status === 'pending'); + const allComplete = files.length > 0 && files.every((f) => f.status === 'success' || f.status === 'error'); + + return ( + + + Загрузка файлов + + + + + + + {files.length === 0 && ( + + + + + {isDragActive + ? 
'Отпустите файлы для загрузки' + : 'Перетащите файлы сюда'} + + + или нажмите для выбора файлов + + + Поддерживаются фото (JPG, PNG, GIF, WebP) и видео (MP4, MOV, AVI, MKV, WebM) + + + )} + + {files.length > 0 && ( + + {files.map((uploadFile, index) => ( + + + + {uploadFile.status === 'success' && ( + + )} + {uploadFile.status === 'error' && ( + + )} + + + {uploadFile.status === 'uploading' && ( + + )} + + {uploadFile.error && ( + + {uploadFile.error} + + )} + + ))} + + )} + + {files.length > 0 && !allComplete && ( + + + + Добавить еще файлы + + + )} + + + + + {canUpload && ( + + )} + + + ); +} diff --git a/frontend/src/components/ViewerModal.tsx b/frontend/src/components/ViewerModal.tsx new file mode 100644 index 0000000..e4bc42d --- /dev/null +++ b/frontend/src/components/ViewerModal.tsx @@ -0,0 +1,273 @@ +import { useState, useEffect } from 'react'; +import { + Dialog, + Box, + IconButton, + CircularProgress, + Typography, +} from '@mui/material'; +import { + Close as CloseIcon, + NavigateBefore as PrevIcon, + NavigateNext as NextIcon, + Download as DownloadIcon, + Share as ShareIcon, + Delete as DeleteIcon, +} from '@mui/icons-material'; +import type { Asset } from '../types'; +import api from '../services/api'; + +interface ViewerModalProps { + asset: Asset | null; + assets: Asset[]; + onClose: () => void; + onDelete?: (assetId: string) => void; + onShare?: (assetId: string) => void; +} + +export default function ViewerModal({ + asset, + assets, + onClose, + onDelete, + onShare, +}: ViewerModalProps) { + const [currentUrl, setCurrentUrl] = useState(''); + const [loading, setLoading] = useState(true); + const [currentIndex, setCurrentIndex] = useState(-1); + + useEffect(() => { + if (asset) { + const index = assets.findIndex((a) => a.id === asset.id); + setCurrentIndex(index); + loadMedia(asset); + } + }, [asset]); + + useEffect(() => { + const handleKeyPress = (e: KeyboardEvent) => { + if (!asset) return; + + if (e.key === 'Escape') { + onClose(); + } else if 
(e.key === 'ArrowLeft') { + handlePrev(); + } else if (e.key === 'ArrowRight') { + handleNext(); + } + }; + + window.addEventListener('keydown', handleKeyPress); + return () => window.removeEventListener('keydown', handleKeyPress); + }, [asset, currentIndex]); + + const loadMedia = async (asset: Asset) => { + try { + setLoading(true); + const url = await api.getDownloadUrl(asset.id, 'original'); + setCurrentUrl(url); + } catch (error) { + console.error('Failed to load media:', error); + } finally { + setLoading(false); + } + }; + + const handlePrev = () => { + if (currentIndex > 0) { + const prevAsset = assets[currentIndex - 1]; + loadMedia(prevAsset); + setCurrentIndex(currentIndex - 1); + } + }; + + const handleNext = () => { + if (currentIndex < assets.length - 1) { + const nextAsset = assets[currentIndex + 1]; + loadMedia(nextAsset); + setCurrentIndex(currentIndex + 1); + } + }; + + const handleDownload = () => { + if (currentUrl && asset) { + const link = document.createElement('a'); + link.href = currentUrl; + link.download = asset.original_filename; + link.click(); + } + }; + + const handleDelete = () => { + if (asset && onDelete) { + onDelete(asset.id); + onClose(); + } + }; + + const handleShare = () => { + if (asset && onShare) { + onShare(asset.id); + } + }; + + if (!asset) return null; + + return ( + + + {/* Top bar */} + + + {asset.original_filename} + + + + + + + {onShare && ( + + + + )} + {onDelete && ( + + + + )} + + + + + + + {/* Navigation buttons */} + {currentIndex > 0 && ( + + + + )} + + {currentIndex < assets.length - 1 && ( + + + + )} + + {/* Content */} + {loading && ( + + )} + + {!loading && asset.type === 'photo' && ( + + )} + + {!loading && asset.type === 'video' && ( + + )} + + {/* Bottom info */} + + + {currentIndex + 1} / {assets.length} + + + {(asset.size_bytes / 1024 / 1024).toFixed(2)} MB + + {asset.width && asset.height && ( + + {asset.width} × {asset.height} + + )} + + + + ); +} diff --git a/frontend/src/hooks/useAuth.ts 
b/frontend/src/hooks/useAuth.ts new file mode 100644 index 0000000..0c1f22f --- /dev/null +++ b/frontend/src/hooks/useAuth.ts @@ -0,0 +1,57 @@ +import { useState, useEffect } from 'react'; +import api from '../services/api'; +import type { User } from '../types'; + +export function useAuth() { + const [user, setUser] = useState(null); + const [loading, setLoading] = useState(true); + const [isAuthenticated, setIsAuthenticated] = useState(false); + + useEffect(() => { + checkAuth(); + }, []); + + const checkAuth = async () => { + const token = localStorage.getItem('access_token'); + if (!token) { + setLoading(false); + return; + } + + try { + const userData = await api.getMe(); + setUser(userData); + setIsAuthenticated(true); + } catch (error) { + console.error('Auth check failed:', error); + setIsAuthenticated(false); + setUser(null); + } finally { + setLoading(false); + } + }; + + const login = async (email: string, password: string) => { + await api.login(email, password); + await checkAuth(); + }; + + const register = async (email: string, password: string) => { + await api.register(email, password); + }; + + const logout = () => { + api.logout(); + setUser(null); + setIsAuthenticated(false); + }; + + return { + user, + loading, + isAuthenticated, + login, + register, + logout, + }; +} diff --git a/frontend/src/index.css b/frontend/src/index.css new file mode 100644 index 0000000..3040b6c --- /dev/null +++ b/frontend/src/index.css @@ -0,0 +1,17 @@ +* { + margin: 0; + padding: 0; + box-sizing: border-box; +} + +html, body, #root { + width: 100%; + height: 100%; + overflow: hidden; +} + +body { + font-family: 'Roboto', sans-serif; + -webkit-font-smoothing: antialiased; + -moz-osx-font-smoothing: grayscale; +} diff --git a/frontend/src/main.tsx b/frontend/src/main.tsx new file mode 100644 index 0000000..8bad995 --- /dev/null +++ b/frontend/src/main.tsx @@ -0,0 +1,18 @@ +import React from 'react' +import ReactDOM from 'react-dom/client' +import { BrowserRouter } 
from 'react-router-dom' +import { CssBaseline, ThemeProvider } from '@mui/material' +import App from './App' +import theme from './theme/theme' +import './index.css' + +ReactDOM.createRoot(document.getElementById('root')!).render( + + + + + + + + , +) diff --git a/frontend/src/pages/LibraryPage.tsx b/frontend/src/pages/LibraryPage.tsx new file mode 100644 index 0000000..4fa5036 --- /dev/null +++ b/frontend/src/pages/LibraryPage.tsx @@ -0,0 +1,251 @@ +import { useState, useEffect } from 'react'; +import { + Box, + Grid, + Fab, + Typography, + CircularProgress, + Button, + Select, + MenuItem, + FormControl, + InputLabel, + Dialog, + DialogTitle, + DialogContent, + DialogActions, + TextField, + Alert, + Snackbar, +} from '@mui/material'; +import { Add as AddIcon, FilterList as FilterIcon } from '@mui/icons-material'; +import Layout from '../components/Layout'; +import MediaCard from '../components/MediaCard'; +import UploadDialog from '../components/UploadDialog'; +import ViewerModal from '../components/ViewerModal'; +import type { Asset, AssetType } from '../types'; +import api from '../services/api'; + +export default function LibraryPage() { + const [assets, setAssets] = useState([]); + const [loading, setLoading] = useState(true); + const [hasMore, setHasMore] = useState(false); + const [cursor, setCursor] = useState(); + const [filter, setFilter] = useState('all'); + + const [uploadOpen, setUploadOpen] = useState(false); + const [viewerAsset, setViewerAsset] = useState(null); + const [shareDialogOpen, setShareDialogOpen] = useState(false); + const [shareAssetId, setShareAssetId] = useState(''); + const [shareLink, setShareLink] = useState(''); + const [snackbarOpen, setSnackbarOpen] = useState(false); + const [snackbarMessage, setSnackbarMessage] = useState(''); + + useEffect(() => { + loadAssets(true); + }, [filter]); + + const loadAssets = async (reset: boolean = false) => { + try { + setLoading(true); + const response = await api.listAssets({ + cursor: reset ? 
undefined : cursor, + limit: 50, + type: filter === 'all' ? undefined : filter, + }); + + setAssets(reset ? response.items : [...assets, ...response.items]); + setHasMore(response.has_more); + setCursor(response.next_cursor); + } catch (error) { + console.error('Failed to load assets:', error); + } finally { + setLoading(false); + } + }; + + const handleUploadComplete = () => { + setUploadOpen(false); + loadAssets(true); + showSnackbar('Файлы успешно загружены'); + }; + + const handleDelete = async (assetId: string) => { + try { + await api.deleteAsset(assetId); + setAssets(assets.filter((a) => a.id !== assetId)); + showSnackbar('Файл перемещен в корзину'); + } catch (error) { + console.error('Failed to delete asset:', error); + showSnackbar('Ошибка при удалении файла'); + } + }; + + const handleShare = (assetId: string) => { + setShareAssetId(assetId); + setShareDialogOpen(true); + }; + + const handleCreateShare = async () => { + try { + const share = await api.createShare({ + asset_id: shareAssetId, + expires_in_seconds: 86400 * 7, // 7 days + }); + const link = `${window.location.origin}/share/${share.token}`; + setShareLink(link); + showSnackbar('Ссылка создана'); + } catch (error) { + console.error('Failed to create share:', error); + showSnackbar('Ошибка создания ссылки'); + } + }; + + const handleCopyShareLink = () => { + navigator.clipboard.writeText(shareLink); + showSnackbar('Ссылка скопирована'); + setShareDialogOpen(false); + setShareLink(''); + }; + + const showSnackbar = (message: string) => { + setSnackbarMessage(message); + setSnackbarOpen(true); + }; + + return ( + + + {/* Filters */} + + + Тип файлов + + + + + {/* Content */} + + {loading && assets.length === 0 && ( + + + + )} + + {!loading && assets.length === 0 && ( + + + Нет файлов + + + Нажмите кнопку + чтобы загрузить файлы + + + )} + + {assets.length > 0 && ( + <> + + {assets.map((asset) => ( + + setViewerAsset(asset)} + /> + + ))} + + + {hasMore && ( + + + + )} + + )} + + + {/* FAB */} + 
setUploadOpen(true)} + > + + + + {/* Upload Dialog */} + setUploadOpen(false)} + onComplete={handleUploadComplete} + /> + + {/* Viewer Modal */} + setViewerAsset(null)} + onDelete={handleDelete} + onShare={handleShare} + /> + + {/* Share Dialog */} + setShareDialogOpen(false)}> + Поделиться файлом + + {!shareLink ? ( + + Создать публичную ссылку на файл? Ссылка будет действительна 7 дней. + + ) : ( + + )} + + + + {!shareLink ? ( + + ) : ( + + )} + + + + {/* Snackbar */} + setSnackbarOpen(false)} + message={snackbarMessage} + /> + + + ); +} diff --git a/frontend/src/pages/LoginPage.tsx b/frontend/src/pages/LoginPage.tsx new file mode 100644 index 0000000..f7197f6 --- /dev/null +++ b/frontend/src/pages/LoginPage.tsx @@ -0,0 +1,117 @@ +import { useState } from 'react'; +import { useNavigate, Link as RouterLink } from 'react-router-dom'; +import { + Box, + Container, + Paper, + TextField, + Button, + Typography, + Link, + Alert, +} from '@mui/material'; +import { CloudUpload as CloudIcon } from '@mui/icons-material'; +import { useAuth } from '../hooks/useAuth'; + +export default function LoginPage() { + const navigate = useNavigate(); + const { login } = useAuth(); + const [email, setEmail] = useState(''); + const [password, setPassword] = useState(''); + const [error, setError] = useState(''); + const [loading, setLoading] = useState(false); + + const handleSubmit = async (e: React.FormEvent) => { + e.preventDefault(); + setError(''); + setLoading(true); + + try { + await login(email, password); + navigate('/library'); + } catch (err: any) { + setError(err.response?.data?.detail || 'Ошибка входа. Проверьте данные.'); + } finally { + setLoading(false); + } + }; + + return ( + + + + + + + ITCloud + + + Облачное хранилище фото и видео + + + +
+ setEmail(e.target.value)} + margin="normal" + required + autoFocus + /> + setPassword(e.target.value)} + margin="normal" + required + /> + + {error && ( + + {error} + + )} + + + + + + Нет аккаунта?{' '} + + Зарегистрироваться + + + + +
+
+
+ ); +} diff --git a/frontend/src/pages/RegisterPage.tsx b/frontend/src/pages/RegisterPage.tsx new file mode 100644 index 0000000..77d9f2e --- /dev/null +++ b/frontend/src/pages/RegisterPage.tsx @@ -0,0 +1,139 @@ +import { useState } from 'react'; +import { useNavigate, Link as RouterLink } from 'react-router-dom'; +import { + Box, + Container, + Paper, + TextField, + Button, + Typography, + Link, + Alert, +} from '@mui/material'; +import { CloudUpload as CloudIcon } from '@mui/icons-material'; +import { useAuth } from '../hooks/useAuth'; + +export default function RegisterPage() { + const navigate = useNavigate(); + const { register } = useAuth(); + const [email, setEmail] = useState(''); + const [password, setPassword] = useState(''); + const [confirmPassword, setConfirmPassword] = useState(''); + const [error, setError] = useState(''); + const [loading, setLoading] = useState(false); + + const handleSubmit = async (e: React.FormEvent) => { + e.preventDefault(); + setError(''); + + if (password !== confirmPassword) { + setError('Пароли не совпадают'); + return; + } + + if (password.length < 8) { + setError('Пароль должен быть не менее 8 символов'); + return; + } + + setLoading(true); + + try { + await register(email, password); + navigate('/login'); + } catch (err: any) { + setError(err.response?.data?.detail || 'Ошибка регистрации'); + } finally { + setLoading(false); + } + }; + + return ( + + + + + + + Регистрация + + + Создайте аккаунт для доступа к облачному хранилищу + + + +
+ setEmail(e.target.value)} + margin="normal" + required + autoFocus + /> + setPassword(e.target.value)} + margin="normal" + required + helperText="Минимум 8 символов" + /> + setConfirmPassword(e.target.value)} + margin="normal" + required + /> + + {error && ( + + {error} + + )} + + + + + + Уже есть аккаунт?{' '} + + Войти + + + + +
+
+
+ ); +} diff --git a/frontend/src/pages/ShareViewPage.tsx b/frontend/src/pages/ShareViewPage.tsx new file mode 100644 index 0000000..8c79178 --- /dev/null +++ b/frontend/src/pages/ShareViewPage.tsx @@ -0,0 +1,172 @@ +import { useState, useEffect } from 'react'; +import { useParams } from 'react-router-dom'; +import { + Box, + Container, + Paper, + Typography, + CircularProgress, + Alert, + TextField, + Button, +} from '@mui/material'; +import { CloudUpload as CloudIcon } from '@mui/icons-material'; +import ViewerModal from '../components/ViewerModal'; +import type { Share, Asset } from '../types'; +import api from '../services/api'; + +export default function ShareViewPage() { + const { token } = useParams<{ token: string }>(); + const [share, setShare] = useState(null); + const [asset, setAsset] = useState(null); + const [loading, setLoading] = useState(true); + const [error, setError] = useState(''); + const [password, setPassword] = useState(''); + const [needsPassword, setNeedsPassword] = useState(false); + const [viewerOpen, setViewerOpen] = useState(false); + + useEffect(() => { + if (token) { + loadShare(); + } + }, [token]); + + const loadShare = async (pwd?: string) => { + if (!token) return; + + try { + setLoading(true); + setError(''); + const shareData = await api.getShare(token, pwd); + setShare(shareData); + + if (shareData.asset_id) { + const assetData = await api.getAsset(shareData.asset_id); + setAsset(assetData); + } + } catch (err: any) { + if (err.response?.status === 401) { + setNeedsPassword(true); + setError('Требуется пароль'); + } else { + setError(err.response?.data?.detail || 'Ссылка недействительна или истекла'); + } + } finally { + setLoading(false); + } + }; + + const handlePasswordSubmit = (e: React.FormEvent) => { + e.preventDefault(); + loadShare(password); + }; + + const handleView = () => { + setViewerOpen(true); + }; + + return ( + + + + + + + Общий доступ к файлу + + + + {loading && ( + + + + )} + + {error && !needsPassword && ( 
+ + {error} + + )} + + {needsPassword && !share && ( +
+ + Этот файл защищен паролем + + setPassword(e.target.value)} + margin="normal" + required + autoFocus + /> + {error && ( + + {error} + + )} + + + )} + + {!loading && share && asset && ( + + + {asset.original_filename} + + + Размер: {(asset.size_bytes / 1024 / 1024).toFixed(2)} MB + + + Тип: {asset.type === 'photo' ? 'Фото' : 'Видео'} + + + + + )} +
+ + {asset && ( + setViewerOpen(false)} + /> + )} +
+
+ ); +} diff --git a/frontend/src/pages/TrashPage.tsx b/frontend/src/pages/TrashPage.tsx new file mode 100644 index 0000000..6fae661 --- /dev/null +++ b/frontend/src/pages/TrashPage.tsx @@ -0,0 +1,154 @@ +import { useState, useEffect } from 'react'; +import { + Box, + Grid, + Typography, + CircularProgress, + Button, + Snackbar, +} from '@mui/material'; +import Layout from '../components/Layout'; +import MediaCard from '../components/MediaCard'; +import type { Asset } from '../types'; +import api from '../services/api'; + +export default function TrashPage() { + const [assets, setAssets] = useState([]); + const [loading, setLoading] = useState(true); + const [selectedAssets, setSelectedAssets] = useState>(new Set()); + const [snackbarOpen, setSnackbarOpen] = useState(false); + const [snackbarMessage, setSnackbarMessage] = useState(''); + + useEffect(() => { + loadDeletedAssets(); + }, []); + + const loadDeletedAssets = async () => { + try { + setLoading(true); + // We need to get all assets and filter deleted ones + // TODO: Add deleted filter to API + const response = await api.listAssets({ limit: 200 }); + const deletedAssets = response.items.filter((asset) => asset.deleted_at); + setAssets(deletedAssets); + } catch (error) { + console.error('Failed to load deleted assets:', error); + } finally { + setLoading(false); + } + }; + + const handleSelect = (assetId: string, selected: boolean) => { + const newSelected = new Set(selectedAssets); + if (selected) { + newSelected.add(assetId); + } else { + newSelected.delete(assetId); + } + setSelectedAssets(newSelected); + }; + + const handleRestore = async () => { + if (selectedAssets.size === 0) return; + + try { + await Promise.all( + Array.from(selectedAssets).map((assetId) => api.restoreAsset(assetId)) + ); + setAssets(assets.filter((a) => !selectedAssets.has(a.id))); + setSelectedAssets(new Set()); + showSnackbar('Файлы восстановлены'); + } catch (error) { + console.error('Failed to restore assets:', error); + 
showSnackbar('Ошибка при восстановлении файлов'); + } + }; + + const handlePurge = async () => { + if (selectedAssets.size === 0) return; + + if (!confirm('Вы уверены? Файлы будут удалены навсегда.')) { + return; + } + + try { + await Promise.all( + Array.from(selectedAssets).map((assetId) => api.purgeAsset(assetId)) + ); + setAssets(assets.filter((a) => !selectedAssets.has(a.id))); + setSelectedAssets(new Set()); + showSnackbar('Файлы удалены навсегда'); + } catch (error) { + console.error('Failed to purge assets:', error); + showSnackbar('Ошибка при удалении файлов'); + } + }; + + const showSnackbar = (message: string) => { + setSnackbarMessage(message); + setSnackbarOpen(true); + }; + + return ( + + + {/* Actions */} + {selectedAssets.size > 0 && ( + + + Выбрано: {selectedAssets.size} + + + + + )} + + {/* Content */} + + {loading && ( + + + + )} + + {!loading && assets.length === 0 && ( + + + Корзина пуста + + + Удаленные файлы будут отображаться здесь + + + )} + + {assets.length > 0 && ( + + {assets.map((asset) => ( + + + + ))} + + )} + + + {/* Snackbar */} + setSnackbarOpen(false)} + message={snackbarMessage} + /> + + + ); +} diff --git a/frontend/src/services/api.ts b/frontend/src/services/api.ts new file mode 100644 index 0000000..2e171d5 --- /dev/null +++ b/frontend/src/services/api.ts @@ -0,0 +1,175 @@ +import axios, { AxiosInstance } from 'axios'; +import type { + User, + AuthTokens, + Asset, + AssetListResponse, + CreateUploadRequest, + CreateUploadResponse, + DownloadUrlResponse, + Share, + CreateShareRequest, +} from '../types'; + +const API_URL = import.meta.env.VITE_API_URL || 'http://localhost:8000'; + +class ApiClient { + private client: AxiosInstance; + + constructor() { + this.client = axios.create({ + baseURL: `${API_URL}/api/v1`, + headers: { + 'Content-Type': 'application/json', + }, + }); + + // Add auth token to requests + this.client.interceptors.request.use((config) => { + const token = localStorage.getItem('access_token'); + if (token) { 
+ config.headers.Authorization = `Bearer ${token}`; + } + return config; + }); + + // Handle 401 errors + this.client.interceptors.response.use( + (response) => response, + (error) => { + if (error.response?.status === 401) { + localStorage.removeItem('access_token'); + localStorage.removeItem('refresh_token'); + window.location.href = '/login'; + } + return Promise.reject(error); + } + ); + } + + // Auth + async register(email: string, password: string): Promise { + const { data } = await this.client.post('/auth/register', { email, password }); + return data; + } + + async login(email: string, password: string): Promise { + const { data } = await this.client.post('/auth/login', { email, password }); + localStorage.setItem('access_token', data.access_token); + localStorage.setItem('refresh_token', data.refresh_token); + return data; + } + + async getMe(): Promise { + const { data } = await this.client.get('/auth/me'); + return data; + } + + logout(): void { + localStorage.removeItem('access_token'); + localStorage.removeItem('refresh_token'); + } + + // Assets + async listAssets(params?: { + cursor?: string; + limit?: number; + type?: string; + }): Promise { + const { data } = await this.client.get('/assets', { params }); + return data; + } + + async getAsset(assetId: string): Promise { + const { data } = await this.client.get(`/assets/${assetId}`); + return data; + } + + async getDownloadUrl(assetId: string, kind: 'original' | 'thumb' = 'original'): Promise { + const { data } = await this.client.get( + `/assets/${assetId}/download-url`, + { params: { kind } } + ); + return data.url; + } + + async deleteAsset(assetId: string): Promise { + const { data } = await this.client.delete(`/assets/${assetId}`); + return data; + } + + async restoreAsset(assetId: string): Promise { + const { data } = await this.client.post(`/assets/${assetId}/restore`); + return data; + } + + async purgeAsset(assetId: string): Promise { + await this.client.delete(`/assets/${assetId}/purge`); 
+ } + + // Upload + async createUpload(request: CreateUploadRequest): Promise { + const { data } = await this.client.post('/uploads/create', request); + return data; + } + + async uploadToS3(url: string, file: File, fields?: Record): Promise { + const formData = new FormData(); + + // Add fields first (for pre-signed POST) + if (fields) { + Object.entries(fields).forEach(([key, value]) => { + formData.append(key, value); + }); + } + + // Add file last + formData.append('file', file); + + await axios.post(url, formData, { + headers: { + 'Content-Type': 'multipart/form-data', + }, + }); + } + + async finalizeUpload(assetId: string, etag?: string, sha256?: string): Promise { + const { data } = await this.client.post(`/uploads/${assetId}/finalize`, { etag, sha256 }); + return data; + } + + // Shares + async createShare(request: CreateShareRequest): Promise { + const { data } = await this.client.post('/shares', request); + return data; + } + + async getShare(token: string, password?: string): Promise { + const { data } = await this.client.get(`/shares/${token}`, { + params: password ? 
{ password } : undefined, + }); + return data; + } + + async getShareDownloadUrl( + token: string, + assetId: string, + kind: 'original' | 'thumb' = 'original', + password?: string + ): Promise { + const { data } = await this.client.get( + `/shares/${token}/download-url`, + { + params: { asset_id: assetId, kind, password }, + } + ); + return data.url; + } + + async revokeShare(token: string): Promise { + const { data } = await this.client.post(`/shares/${token}/revoke`); + return data; + } +} + +export default new ApiClient(); diff --git a/frontend/src/theme/theme.ts b/frontend/src/theme/theme.ts new file mode 100644 index 0000000..44fbb01 --- /dev/null +++ b/frontend/src/theme/theme.ts @@ -0,0 +1,53 @@ +import { createTheme } from '@mui/material/styles'; + +const theme = createTheme({ + palette: { + mode: 'light', + primary: { + main: '#1976d2', + light: '#42a5f5', + dark: '#1565c0', + }, + secondary: { + main: '#9c27b0', + light: '#ba68c8', + dark: '#7b1fa2', + }, + background: { + default: '#f5f5f5', + paper: '#ffffff', + }, + }, + typography: { + fontFamily: [ + 'Roboto', + '-apple-system', + 'BlinkMacSystemFont', + '"Segoe UI"', + 'Arial', + 'sans-serif', + ].join(','), + }, + shape: { + borderRadius: 8, + }, + components: { + MuiButton: { + styleOverrides: { + root: { + textTransform: 'none', + fontWeight: 500, + }, + }, + }, + MuiCard: { + styleOverrides: { + root: { + boxShadow: '0 2px 8px rgba(0,0,0,0.1)', + }, + }, + }, + }, +}); + +export default theme; diff --git a/frontend/src/types/index.ts b/frontend/src/types/index.ts new file mode 100644 index 0000000..0ae2a25 --- /dev/null +++ b/frontend/src/types/index.ts @@ -0,0 +1,76 @@ +export interface User { + id: string; + email: string; + is_active: boolean; + created_at: string; +} + +export interface AuthTokens { + access_token: string; + refresh_token: string; + token_type: string; +} + +export type AssetType = 'photo' | 'video'; +export type AssetStatus = 'uploading' | 'ready' | 'failed' | 'deleted'; + 
+export interface Asset { + id: string; + user_id: string; + type: AssetType; + status: AssetStatus; + original_filename: string; + content_type: string; + size_bytes: number; + sha256?: string; + captured_at?: string; + width?: number; + height?: number; + duration_sec?: number; + storage_key_original: string; + storage_key_thumb?: string; + created_at: string; + deleted_at?: string; +} + +export interface AssetListResponse { + items: Asset[]; + next_cursor?: string; + has_more: boolean; +} + +export interface CreateUploadRequest { + original_filename: string; + content_type: string; + size_bytes: number; +} + +export interface CreateUploadResponse { + asset_id: string; + upload_url: string; + upload_method: string; + fields?: Record; +} + +export interface DownloadUrlResponse { + url: string; + expires_in: number; +} + +export interface Share { + id: string; + owner_user_id: string; + asset_id?: string; + album_id?: string; + token: string; + expires_at?: string; + created_at: string; + revoked_at?: string; +} + +export interface CreateShareRequest { + asset_id?: string; + album_id?: string; + expires_in_seconds?: number; + password?: string; +} diff --git a/frontend/tsconfig.json b/frontend/tsconfig.json new file mode 100644 index 0000000..a7fc6fb --- /dev/null +++ b/frontend/tsconfig.json @@ -0,0 +1,25 @@ +{ + "compilerOptions": { + "target": "ES2020", + "useDefineForClassFields": true, + "lib": ["ES2020", "DOM", "DOM.Iterable"], + "module": "ESNext", + "skipLibCheck": true, + + /* Bundler mode */ + "moduleResolution": "bundler", + "allowImportingTsExtensions": true, + "resolveJsonModule": true, + "isolatedModules": true, + "noEmit": true, + "jsx": "react-jsx", + + /* Linting */ + "strict": true, + "noUnusedLocals": true, + "noUnusedParameters": true, + "noFallthroughCasesInSwitch": true + }, + "include": ["src"], + "references": [{ "path": "./tsconfig.node.json" }] +} diff --git a/frontend/tsconfig.node.json b/frontend/tsconfig.node.json new file mode 100644 
index 0000000..42872c5 --- /dev/null +++ b/frontend/tsconfig.node.json @@ -0,0 +1,10 @@ +{ + "compilerOptions": { + "composite": true, + "skipLibCheck": true, + "module": "ESNext", + "moduleResolution": "bundler", + "allowSyntheticDefaultImports": true + }, + "include": ["vite.config.ts"] +} diff --git a/frontend/vite.config.ts b/frontend/vite.config.ts new file mode 100644 index 0000000..0351b36 --- /dev/null +++ b/frontend/vite.config.ts @@ -0,0 +1,15 @@ +import { defineConfig } from 'vite' +import react from '@vitejs/plugin-react' + +// https://vitejs.dev/config/ +export default defineConfig({ + plugins: [react()], + server: { + host: true, + port: 5173, + }, + build: { + outDir: '../static', + emptyOutDir: true, + }, +}) diff --git a/tech_spec_cloud_media_storage.md b/tech_spec_cloud_media_storage.md new file mode 100644 index 0000000..74c1225 --- /dev/null +++ b/tech_spec_cloud_media_storage.md @@ -0,0 +1,506 @@ +# Техническое задание (ТЗ) +## Проект: облачное хранилище фото и видео (S3 + Python backend + SQLite → PostgreSQL) +**Формат фронтенда:** статический сайт (SPA) в папке `static/`, хостинг в S3 +**UI:** Material UI (MUI) +**Backend:** Python (FastAPI) +**Хранилище файлов:** S3 / S3-compatible (MinIO и т.п.) +**База данных:** SQLite на старте, в будущем — PostgreSQL (миграции через Alembic) + +--- + +## 1. Цели и принципы + +### 1.1 Цель продукта +Создать удобное, быстрое и безопасное облачное хранилище для фото и видео с фокусом на: +- максимально простой и быстрый загрузчик (drag&drop, пакетные загрузки, большие файлы); +- удобную библиотеку (лента/сеткой), быстрый просмотр, поиск/фильтры; +- безопасный доступ и шаринг ссылками; +- готовность к росту: миграция SQLite → PostgreSQL без переписывания бизнес-логики. + +### 1.2 Принципы реализации +- **SOLID / DRY / Clean Architecture**: разделение слоёв (API → Service → Repository → Infrastructure). +- **Докстринги обязательны** для публичных методов/классов и основных сервисов. 
+- **Минимум комментариев в коде**: предпочтение самодокументирующемуся коду и докстрингам. +- **Стабильный API-контракт**: OpenAPI/Swagger (FastAPI). +- **Сразу закладываем расширяемость** (теги, альбомы, шаринг, фоновые задачи). + +--- + +## 2. Область охвата (Scope) + +### 2.1 MVP (минимально жизнеспособная версия) +1) Регистрация/логин (минимум: email+пароль) +2) Загрузка фото/видео в S3 (через pre-signed URLs) +3) Библиотека медиа: + - список/сетка, + - просмотр (лайтбокс) фото, + - просмотр/проигрывание видео, + - базовые сортировки (по дате добавления, по дате съёмки). +4) Удаление (корзина/soft-delete) и восстановление. +5) Простой шаринг публичной ссылкой (срок действия). +6) Генерация превью для изображений (thumbnails). +7) Хранение метаданных (SQLite), подготовлено для PostgreSQL (ORM + миграции). + +### 2.2 Версия v1 (после MVP) +- Альбомы/папки (логическая группировка). +- Теги и поиск по тегам/дате/типу/размеру. +- Извлечение EXIF (камера, дата съёмки, гео — опционально). +- Превью/постер для видео. +- Множественный выбор, пакетные операции (переместить/удалить/добавить теги). +- Пользовательские квоты и статистика хранения. +- Логи действий (аудит: загрузка/удаление/шаринг). + +### 2.3 Вне рамок (Non-goals) на старте +- Полноценное видеотранскодирование в HLS/DASH. +- Распознавание лиц/умные альбомы. +- Коллаборация “семейные библиотеки” (несколько владельцев одного альбома). +- End-to-end шифрование на клиенте. + +--- + +## 3. Роли и сценарии + +### 3.1 Роли +- **User**: загружает и управляет своей медиатекой. +- **Admin** (опционально в v1): администрирование пользователей/квот и доступ ко всем облакам пользователей. + +### 3.2 Ключевые user stories (MVP) +1) Как пользователь, я хочу быстро перетащить папку/файлы и загрузить их в облако. +2) Как пользователь, я хочу просматривать фото в полноэкранном режиме и листать стрелками. +3) Как пользователь, я хочу смотреть видео прямо в браузере. 
+4) Как пользователь, я хочу удалить файлы и при необходимости восстановить из корзины. +5) Как пользователь, я хочу поделиться ссылкой на один файл/альбом на ограниченное время. + +--- + +## 4. UX/UI требования (фронтенд) + +### 4.1 Технологическое решение фронтенда +**Рекомендуемый компромисс “JS чистый + MUI”:** +- Реализовать SPA на **React + MUI**, +- Сборка (Vite) выдаёт статические артефакты в `static/`, +- На проде в S3 лежит **только** `static/` (HTML/CSS/JS). + +> Важно: Material UI (MUI) — React-библиотека. “Чистый JS без сборки” возможен через CDN + UMD, но это хуже по производительности и DX. В ТЗ закладываем правильную схему: разработка с Vite, деплой артефактов в `static/`. + +### 4.2 Страницы/экраны (MVP) +1) **Login / Register** +2) **Library** (главная) + - переключатель “Grid / List”, + - фильтр по типу (photo/video), + - сортировка, + - строка поиска (в MVP можно скрыть за feature-flag). +3) **Viewer** + - фото: зум, листание, скачать, + - видео: плеер (HTML5), скачать. +4) **Upload** + - drag&drop зона, + - прогресс по файлу и общий прогресс, + - повтор при ошибке. +5) **Trash** + - список удалённых, restore/purge. +6) **Shared view** (страница по публичной ссылке) + +### 4.3 Компоненты UI +- AppBar + боковое меню (Drawer) +- MediaGrid (виртуализация списка, infinite scroll) +- MediaCard (thumbnail, иконка видео, размер/дата) +- UploadDialog (dropzone + очередь + прогресс) +- ViewerModal (keyboard nav: ← → Esc) +- FiltersBar (chips/select) + +### 4.4 Нефункциональные требования к UI +- **Responsive**: desktop-first, корректно на мобилках. +- **Оптимизация**: + - thumbnails грузятся лениво, + - viewer тянет оригинал только по запросу, + - infinite scroll. +- **Удобство**: + - drag&drop в любом месте Library, + - горячие клавиши в Viewer, + - понятные ошибки и “повторить загрузку”. + +--- + +## 5. 
Архитектура (общая) + +### 5.1 Компоненты +1) **Static SPA** (S3 hosting) +2) **Backend API** (Python) +3) **Database** (SQLite → PostgreSQL) +4) **Object storage** (S3) +5) **Background worker** (опционально сразу, обязательно для v1) — генерация превью/постеров + +### 5.2 Потоки +**Upload (рекомендуемый):** +1) SPA запрашивает у backend `create_upload` (получает pre-signed URL / multipart init). +2) SPA грузит файл напрямую в S3. +3) SPA вызывает `finalize_upload` (backend фиксирует метаданные, ставит задачу на превью). +4) Backend возвращает Asset ID. + +**Download/View:** +- SPA запрашивает `get_asset_download_url` → получает краткоживущую signed-ссылку на S3 (оригинал/thumbnail). + +--- + +## 6. Backend требования + +### 6.1 Стек (рекомендуется) +- **FastAPI** (ASGI) +- **Pydantic** (схемы) +- **SQLAlchemy 2.x (async)** + **Alembic** (миграции) +- **SQLite** (MVP), конфигурация диалекта для лёгкого перехода на PostgreSQL +- S3 SDK: boto3 (sync) или aioboto3 (async) / или минимальный клиент через presigned +- Auth: JWT access + refresh (HTTP-only cookies) или Bearer tokens +- Logging: structlog/loguru + стандартный logging +- Tests: pytest + httpx + +### 6.2 Слоистая архитектура (Clean) +- `api/` — роуты, схемы, зависимости, auth middleware +- `services/` — бизнес-логика (upload, library, share) +- `repositories/` — доступ к данным (CRUD) +- `infra/` — S3 client, db session factory, config, background tasks +- `domain/` — модели домена/интерфейсы (по необходимости) + +### 6.3 Обязательные нефункциональные требования +- Валидация входных данных (Pydantic). +- Единый формат ошибок (problem+json либо собственный стандарт). +- Rate limiting (минимально на login/share) — можно отложить до v1. +- Безопасность: + - пароли хранить только в виде хеша (argon2/bcrypt), + - все ссылки на S3 — только signed, + - CORS настроен строго, + - CSRF защита при cookie-based auth. +- Наблюдаемость: + - структурные логи, + - correlation id (request id), + - healthcheck endpoint. 
+ +--- + +## 7. Модель данных (SQLite → PostgreSQL) + +### 7.1 Основные сущности +**users** +- `id` (UUID) +- `email` (unique) +- `password_hash` +- `created_at`, `updated_at` +- `is_active` + +**assets** +- `id` (UUID) +- `user_id` (FK users) +- `type` enum: `photo|video` +- `status` enum: `uploading|ready|failed|deleted` +- `original_filename` +- `content_type` +- `size_bytes` +- `sha256` (optional, for dedup) +- `captured_at` (datetime, optional) +- `created_at` +- `deleted_at` (nullable) +- `storage_key_original` (S3 object key) +- `storage_key_thumb` (nullable) +- `width`, `height` (nullable) +- `duration_sec` (nullable) + +**albums** (v1) +- `id` UUID +- `user_id` +- `title` +- `created_at` + +**album_items** (v1) +- `album_id` +- `asset_id` +- `position` (int) + +**tags** (v1) +- `id`, `user_id`, `name` + +**asset_tags** (v1) +- `asset_id`, `tag_id` + +**shares** +- `id` UUID +- `owner_user_id` +- `asset_id` (nullable) — если шаринг одного файла +- `album_id` (nullable) — если шаринг альбома +- `token` (unique, random) +- `expires_at` (nullable) +- `password_hash` (nullable) +- `created_at` +- `revoked_at` (nullable) + +**auth_sessions** (опционально) +- `id` +- `user_id` +- `refresh_token_hash` +- `created_at`, `expires_at`, `revoked_at` + +### 7.2 Требования к миграциям +- Все изменения БД — только через Alembic. +- Никаких raw SQL “внутри приложения”, кроме миграций. +- Схема должна работать в SQLite и PostgreSQL: + - UUID хранить как TEXT (SQLite) и UUID (PostgreSQL) через тип-переопределения, + - JSON поля избегать в MVP (или хранить TEXT). + +--- + +## 8. Object Storage (S3) + +### 8.1 Bucket и ключи +- Bucket: `MEDIA_BUCKET` +- Ключ оригинала: `u/{user_id}/o/{yyyy}/{mm}/{asset_id}{ext}` +- Ключ превью: `u/{user_id}/t/{yyyy}/{mm}/{asset_id}.jpg` +- (v1) ключ постера для видео: `u/{user_id}/p/{yyyy}/{mm}/{asset_id}.jpg` + +### 8.2 Политики доступа +- Bucket **private**. 
+- Access only via pre-signed URLs with a short TTL:
+  - thumbnails: 5–30 minutes,
+  - originals: 1–10 minutes (configurable).
+
+---
+
+## 9. API (contract)
+
+### 9.1 General rules
+- Versioning: `/api/v1`
+- Authorization: `Authorization: Bearer <token>` (or cookie-based)
+- Responses: JSON
+- Errors: unified format `{ "error": { "code": "...", "message": "...", "details": ... } }`
+
+### 9.2 Endpoints (MVP)
+
+#### Auth
+- `POST /api/v1/auth/register`
+- `POST /api/v1/auth/login`
+- `POST /api/v1/auth/logout`
+- `GET /api/v1/auth/me`
+- (optional) `POST /api/v1/auth/refresh`
+
+#### Assets (library)
+- `GET /api/v1/assets?cursor=&limit=&type=&sort=`
+- `GET /api/v1/assets/{asset_id}`
+- `DELETE /api/v1/assets/{asset_id}` (soft-delete)
+- `POST /api/v1/assets/{asset_id}/restore`
+- `DELETE /api/v1/assets/{asset_id}/purge` (hard-delete, only from Trash)
+
+#### Upload (direct-to-S3)
+- `POST /api/v1/uploads/create`
+  - input: `original_filename, content_type, size_bytes`
+  - output: `asset_id, upload_method, presigned_url|presigned_post|multipart`
+- `POST /api/v1/uploads/{asset_id}/finalize`
+  - input: `etag` (or parts list), `sha256` (optional)
+  - output: `asset`
+
+#### URLs (signed access)
+- `GET /api/v1/assets/{asset_id}/download-url?kind=original|thumb`
+- `GET /api/v1/assets/{asset_id}/stream-url` (for video; kind=original initially)
+
+#### Shares
+- `POST /api/v1/shares` (create a link)
+- `GET /api/v1/shares/{token}` (fetch metadata)
+- `GET /api/v1/shares/{token}/download-url?asset_id=&kind=`
+- `POST /api/v1/shares/{token}/revoke`
+
+### 9.3 Pagination
+- Cursor-based (recommended): `cursor` = base64(last_created_at, last_id)
+- Response: `{ items: [...], next_cursor: "...", has_more: true }`
+
+---
+
+## 10. Thumbnails and processing
+
+### 10.1 MVP: images only
+- After `finalize_upload`, the backend enqueues a `generate_thumbnail(asset_id)` task.
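Before Redis + RQ is wired up, this enqueue step can be prototyped with the "simplest built-in worker" option the spec allows; the sketch below is a stdlib stand-in only (the `generate_thumbnail` body is a placeholder, and the sentinel-based shutdown is an assumption for tests/dev):

```python
import queue
import threading

# In-process task queue: a development stand-in for Redis + RQ.
tasks = queue.Queue()
done = []  # records which asset ids were processed


def generate_thumbnail(asset_id):
    """Placeholder for the real job (resize + upload under storage_key_thumb)."""
    done.append(asset_id)


def worker():
    """Drain the queue until a None sentinel arrives."""
    while True:
        asset_id = tasks.get()
        if asset_id is None:
            break
        generate_thumbnail(asset_id)
        tasks.task_done()


t = threading.Thread(target=worker, daemon=True)
t.start()

# finalize_upload would enqueue here instead of processing inline:
tasks.put("asset-123")
tasks.put(None)  # sentinel: stop the worker (shutdown/tests only)
t.join()
```

Swapping this for RQ later only changes the enqueue call site; the `generate_thumbnail(asset_id)` contract stays the same.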
+
+**Execution mode (pick one):**
+1) **Inline** (acceptable for the MVP): generate the thumbnail synchronously during finalize if the file is small.
+2) **Background worker** (preferred): Celery/RQ + Redis, or a minimal built-in worker.
+
+Recommended minimum:
+- Redis + RQ.
+
+### 10.2 Technical requirements for thumbnails
+- Format: JPEG
+- Longest side: 512px or 1024px (configurable)
+- Progressive JPEG (preferred)
+- Stored in S3 under `storage_key_thumb`
+- On failure: `assets.status = failed`, or a dedicated `thumb_status` field
+
+### 10.3 v1: video posters
+- Generate a poster (a frame at second 1–3) via ffmpeg.
+- Store it under a separate key.
+
+---
+
+## 11. Security
+
+### 11.1 Authentication and passwords
+- Hash with argon2id (preferred) or bcrypt.
+- Throttle login attempts (v1).
+- Sessions/refresh tokens must be revocable.
+
+### 11.2 File access
+- All S3 access goes through signed URLs only.
+- Asset ownership is checked in every endpoint.
+- URL signatures are short-lived and scoped to a single object.
+
+### 11.3 Public links
+- The token must be cryptographically strong (>= 128 bits of entropy).
+- Expiry must be configurable.
+- (v1) Optional password on the link.
+
+---
+
+## 12. Performance and limits
+
+### 12.1 MVP limits
+- Maximum video size: `MAX_UPLOAD_SIZE_BYTES` config (e.g. 5–20 GB).
+- Large files use multipart upload (S3 multipart).
+- List page size: 50–200 items, infinite scroll.
+
+### 12.2 Caching
+- Thumbnails: CDN-friendly, but served via signed URLs (TTL).
+- Frontend static files: cache-control immutable for hashed assets.
+
+---
+
+## 13. 
Configuration (env)
+
+### 13.1 Backend env variables (example)
+- `APP_ENV=dev|prod`
+- `DATABASE_URL=sqlite+aiosqlite:///./app.db` (later: `postgresql+asyncpg://...`)
+- `S3_ENDPOINT_URL=` (when using MinIO)
+- `S3_REGION=`
+- `S3_ACCESS_KEY_ID=`
+- `S3_SECRET_ACCESS_KEY=`
+- `MEDIA_BUCKET=`
+- `SIGNED_URL_TTL_SECONDS=600`
+- `CORS_ORIGINS=https://your-domain`
+- `JWT_SECRET=`
+- `JWT_ACCESS_TTL_SECONDS=900`
+- `JWT_REFRESH_TTL_SECONDS=1209600`
+- `MAX_UPLOAD_SIZE_BYTES=`
+
+---
+
+## 14. Deployment and environments
+
+### 14.1 Static (S3)
+- The `static/` folder is uploaded to the website S3 bucket.
+
+### 14.2 Backend
+- Docker image, run as a container (uvicorn/gunicorn).
+- Recommended dev `docker-compose`: backend + minio + sqlite (local file) + redis.
+
+### 14.3 Database
+- MVP: SQLite file in a volume.
+- v1: PostgreSQL as a separate service; switch via `DATABASE_URL` plus Alembic migrations.
+
+---
+
+## 15. Repository layout (recommendation)
+
+```
+repo/
+  backend/
+    src/
+      app/
+        api/
+          v1/
+        services/
+        repositories/
+        infra/
+        domain/
+        main.py
+    alembic/
+    tests/
+    pyproject.toml
+    Dockerfile
+  frontend/
+    src/
+    public/
+    vite.config.js
+    package.json
+  static/            # build output (deploy to S3)
+  docker-compose.yml
+  README.md
+```
+
+---
+
+## 16. Acceptance Criteria
+
+### 16.1 MVP
+- A user can register and log in.
+- A user can upload:
+  - at least 1 photo and 1 video,
+  - a batch of 100+ photos,
+  - a large file (within the limit), without crashing the server.
+- The library shows thumbnails for photos.
+- Photos/videos can be viewed in the browser.
+- Delete moves items to Trash; restore brings them back.
+- A public share link opens without login (while active) and grants access only to the shared object.
+- The database is SQLite, migrations work, and a migration path to PostgreSQL exists (without changing service-layer code).
+
+### 16.2 Code quality
+- Linter/formatter configured (ruff + black, or ruff format).
+- Tests cover the key scenarios (auth, create/finalize upload, list assets, share).
+- Docstrings are present in the services/repositories/infra layers.
+
+---
+
+## 17. Work plan (high level)
+
+### Stage 1 - foundation (1)
+- Backend skeleton (FastAPI), config, DB wiring, migrations.
+- User + Auth (register/login/me).
+- S3 integration: pre-signed upload + signed download.
+
+### Stage 2 - assets (2)
+- CRUD for asset metadata.
+- Library list with pagination.
+- Trash (soft-delete/restore/purge).
+
+### Stage 3 - frontend MVP (3)
+- Login/Register.
+- Library grid + infinite scroll.
+- Upload dialog + progress.
+- Viewer modal.
+- Trash page.
+
+### Stage 4 - thumbnails (4)
+- Thumbnail generation (inline or background).
+- Thumbnails shown in the Library.
+
+### Stage 5 - shares (5)
+- Creating/opening share links.
+- Shared view UI.
+
+---
+
+## 18. Risks and mitigations
+
+- **Large videos**: plan for S3 multipart upload from the start, otherwise we hit size limits/timeouts.
+- **MUI and "plain JS"**: the realistic path is React + MUI built into `static/`.
+- **SQLite limitations**: use repositories/ORM and Alembic so that moving to PostgreSQL is just a `DATABASE_URL` change.
+- **S3 cost**: thumbnails reduce traffic; originals are fetched only on demand.
+
+---
+
+## 19. Additional software (if queues/workers are needed)
+- **Redis** (recommended)
+- **RQ** (simpler)
+- For video in v1: **ffmpeg** (inside the worker container)
+
+---
+
+## 20. Notes on licensing and privacy
+- User files and metadata are private, although an admin can view everything.
+- Logs must not contain file contents or sensitive data (passwords/tokens).
+- When sharing, expose the minimum of metadata.
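The rule above that logs must never contain passwords or tokens can be enforced mechanically with a logging filter; the sketch below is one possible approach using the stdlib `logging` module (the key set in `SENSITIVE_KEYS` is illustrative and should be extended for your payloads):

```python
import logging

SENSITIVE_KEYS = {"password", "token", "authorization", "refresh_token"}


class RedactFilter(logging.Filter):
    """Mask values of sensitive keys when a log call passes a dict of fields."""

    def filter(self, record):
        if isinstance(record.args, dict):
            record.args = {
                k: ("***" if k.lower() in SENSITIVE_KEYS else v)
                for k, v in record.args.items()
            }
        return True  # never drop the record, only redact it


logger = logging.getLogger("app")
logger.addHandler(logging.StreamHandler())
logger.addFilter(RedactFilter())
logger.setLevel(logging.INFO)

# The token value is masked before the message is formatted:
logger.info("login user=%(email)s token=%(token)s",
            {"email": "user@example.com", "token": "secret"})
```

With structlog, the same idea maps onto a processor in the processor chain rather than a `logging.Filter`.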