Compare commits

...

7 Commits

Author SHA1 Message Date
JSC
1388ede1dc Merge branch 'tasks3'
Some checks failed
Backend CI / lint (push) Successful in 9m32s
Backend CI / test (push) Failing after 4m47s
2025-08-29 23:12:45 +02:00
JSC
75569a60b5 fix: Improve logging for invalid player mode by using logger.exception 2025-08-29 15:44:11 +02:00
JSC
2bdd109492 Refactor code structure for improved readability and maintainability 2025-08-29 15:27:12 +02:00
JSC
dc89e45675 Refactor scheduled task repository and schemas for improved type hints and consistency
- Updated type hints from List/Optional to list/None for better readability and consistency across the codebase.
- Refactored import statements for better organization and clarity.
- Enhanced the ScheduledTaskBase schema to use modern type hints.
- Cleaned up unnecessary comments and whitespace in various files.
- Improved error handling and logging in task execution handlers.
- Updated test cases to reflect changes in type hints and ensure compatibility with the new structure.
2025-08-28 23:38:47 +02:00
JSC
96801dc4d6 feat: Refactor TaskHandlerRegistry to include db_session_factory and enhance sound playback handling for user tasks 2025-08-28 23:36:30 +02:00
JSC
6e74d9b940 feat: Add load_playlist method to PlayerService and update task handlers for playlist management 2025-08-28 22:50:57 +02:00
JSC
03abed6d39 Add comprehensive tests for scheduled task repository, scheduler service, and task handlers
- Implemented tests for ScheduledTaskRepository covering task creation, retrieval, filtering, and status updates.
- Developed tests for SchedulerService including task creation, cancellation, user task retrieval, and maintenance jobs.
- Created tests for TaskHandlerRegistry to validate task execution for various types, including credit recharge and sound playback.
- Ensured proper error handling and edge cases in task execution scenarios.
- Added fixtures and mocks to facilitate isolated testing of services and repositories.
2025-08-28 22:37:43 +02:00
24 changed files with 3561 additions and 272 deletions

SCHEDULER_EXAMPLE.md (new file, 232 lines)

@@ -0,0 +1,232 @@
# Enhanced Scheduler System - Usage Examples
This document demonstrates how to use the new comprehensive scheduled task system.
## Features Overview
### ✨ **Task Types**
- **Credit Recharge**: Automatic or scheduled credit replenishment
- **Play Sound**: Schedule individual sound playback
- **Play Playlist**: Schedule playlist playback with modes
### 🌍 **Timezone Support**
- Full timezone support with automatic UTC conversion
- Specify any IANA timezone (e.g., "America/New_York", "Europe/Paris")
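Assuming the conversion uses Python's standard `zoneinfo` module (the project's exact mechanism isn't shown in this document), resolving a local wall-clock time in an IANA timezone to UTC looks like this sketch:

```python
from datetime import datetime
from zoneinfo import ZoneInfo


def to_utc(local_dt: datetime, tz_name: str) -> datetime:
    """Interpret a naive datetime in the given IANA timezone, then convert to UTC."""
    aware = local_dt.replace(tzinfo=ZoneInfo(tz_name))
    return aware.astimezone(ZoneInfo("UTC"))


# 10:00 in New York on Jan 1 (EST, UTC-5) corresponds to 15:00 UTC
utc_dt = to_utc(datetime(2024, 1, 1, 10, 0), "America/New_York")
```

Storing only the UTC instant (as the `scheduled_at (UTC)` field below does) keeps comparisons and ordering unambiguous regardless of the submitter's timezone.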
### 🔄 **Scheduling Options**
- **One-shot**: Execute once at specific date/time
- **Recurring**: Hourly, daily, weekly, monthly, yearly intervals
- **Cron**: Custom cron expressions for complex scheduling
## API Usage Examples
### Create a One-Shot Task
```bash
# Schedule a sound to play in 2 hours
curl -X POST "http://localhost:8000/api/v1/scheduler/tasks" \
-H "Content-Type: application/json" \
-H "Cookie: access_token=YOUR_JWT_TOKEN" \
-d '{
"name": "Play Morning Alarm",
"task_type": "play_sound",
"scheduled_at": "2024-01-01T10:00:00",
"timezone": "America/New_York",
"parameters": {
"sound_id": "sound-uuid-here"
}
}'
```
### Create a Recurring Task
```bash
# Daily credit recharge at midnight UTC
curl -X POST "http://localhost:8000/api/v1/scheduler/admin/system-tasks" \
-H "Content-Type: application/json" \
-H "Cookie: access_token=ADMIN_JWT_TOKEN" \
-d '{
"name": "Daily Credit Recharge",
"task_type": "credit_recharge",
"scheduled_at": "2024-01-01T00:00:00",
"timezone": "UTC",
"recurrence_type": "daily",
"parameters": {}
}'
```
### Create a Cron-Based Task
```bash
# Play playlist every weekday at 9 AM
curl -X POST "http://localhost:8000/api/v1/scheduler/tasks" \
-H "Content-Type: application/json" \
-H "Cookie: access_token=YOUR_JWT_TOKEN" \
-d '{
"name": "Workday Playlist",
"task_type": "play_playlist",
"scheduled_at": "2024-01-01T09:00:00",
"timezone": "America/New_York",
"recurrence_type": "cron",
"cron_expression": "0 9 * * 1-5",
"parameters": {
"playlist_id": "playlist-uuid-here",
"play_mode": "loop",
"shuffle": true
}
}'
```
## Python Service Usage
```python
from datetime import UTC, datetime, timedelta

from app.models.scheduled_task import RecurrenceType, TaskStatus, TaskType
from app.services.scheduler import SchedulerService

# Initialize scheduler service
scheduler_service = SchedulerService(db_session_factory, player_service)

# Create a one-shot task
task = await scheduler_service.create_task(
    name="Test Sound",
    task_type=TaskType.PLAY_SOUND,
    scheduled_at=datetime.now(tz=UTC) + timedelta(hours=2),
    timezone="America/New_York",
    parameters={"sound_id": "sound-uuid-here"},
    user_id=user.id,
)

# Create a recurring task
recurring_task = await scheduler_service.create_task(
    name="Weekly Playlist",
    task_type=TaskType.PLAY_PLAYLIST,
    scheduled_at=datetime.now(tz=UTC) + timedelta(days=1),
    recurrence_type=RecurrenceType.WEEKLY,
    recurrence_count=10,  # Run 10 times then stop
    parameters={
        "playlist_id": "playlist-uuid",
        "play_mode": "continuous",
        "shuffle": False,
    },
    user_id=user.id,
)

# Cancel a task
success = await scheduler_service.cancel_task(task.id)

# Get user's tasks
user_tasks = await scheduler_service.get_user_tasks(
    user_id=user.id,
    status=TaskStatus.PENDING,
    limit=20,
)
```
## Task Parameters
### Credit Recharge Parameters
```json
{
"user_id": "uuid-string-or-null" // null for all users (system task)
}
```
### Play Sound Parameters
```json
{
"sound_id": "uuid-string" // Required: sound to play
}
```
### Play Playlist Parameters
```json
{
"playlist_id": "uuid-string", // Required: playlist to play
"play_mode": "continuous", // Optional: continuous, loop, loop_one, random, single
"shuffle": false // Optional: shuffle playlist
}
```
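The real validation of these parameter payloads lives in the Pydantic parameter schemas; purely as an illustration of the defaults and constraints listed above, a hypothetical standalone check might look like:

```python
ALLOWED_PLAY_MODES = {"continuous", "loop", "loop_one", "random", "single"}


def validate_play_playlist_params(params: dict) -> dict:
    """Return normalized play-playlist parameters, raising ValueError on bad input."""
    if "playlist_id" not in params:
        raise ValueError("playlist_id is required")
    mode = params.get("play_mode", "continuous")
    if mode not in ALLOWED_PLAY_MODES:
        raise ValueError(f"invalid play_mode: {mode}")
    return {
        "playlist_id": params["playlist_id"],
        "play_mode": mode,
        "shuffle": bool(params.get("shuffle", False)),
    }


# Only playlist_id is required; the other fields fall back to their defaults
ok = validate_play_playlist_params({"playlist_id": "abc"})
```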
## Recurrence Types
| Type | Description | Example |
|------|-------------|---------|
| `none` | One-shot execution | Single alarm |
| `hourly` | Every hour | Hourly reminders |
| `daily` | Every day | Daily credit recharge |
| `weekly` | Every week | Weekly reports |
| `monthly` | Every month | Monthly maintenance |
| `yearly` | Every year | Annual renewals |
| `cron` | Custom cron expression | Complex schedules |
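For the simple interval types, computing the next execution from the last one is plain `timedelta` arithmetic; a minimal sketch (monthly/yearly need calendar-aware math, e.g. `dateutil.relativedelta`, and are omitted here):

```python
from datetime import datetime, timedelta


def next_interval_run(last: datetime, recurrence: str) -> datetime:
    """Compute the next run for the fixed-interval recurrence types."""
    deltas = {
        "hourly": timedelta(hours=1),
        "daily": timedelta(days=1),
        "weekly": timedelta(weeks=1),
    }
    if recurrence not in deltas:
        raise ValueError("monthly/yearly/cron need calendar-aware handling")
    return last + deltas[recurrence]


nxt = next_interval_run(datetime(2024, 1, 1, 9, 0), "daily")
```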
## Cron Expression Examples
| Expression | Description |
|------------|-------------|
| `0 9 * * *` | Daily at 9 AM |
| `0 9 * * 1-5` | Weekdays at 9 AM |
| `30 14 1 * *` | 1st of month at 2:30 PM |
| `0 0 * * 0` | Every Sunday at midnight |
| `*/15 * * * *` | Every 15 minutes |
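A production scheduler would normally delegate cron parsing to a dedicated library (such as croniter); purely to illustrate the semantics of the weekday row above, here is a hand-rolled equivalent of `0 9 * * 1-5`:

```python
from datetime import datetime, timedelta


def next_weekday_9am(after: datetime) -> datetime:
    """Next occurrence matching the cron expression `0 9 * * 1-5` (weekdays, 9 AM)."""
    candidate = after.replace(hour=9, minute=0, second=0, microsecond=0)
    if candidate <= after:
        candidate += timedelta(days=1)
    while candidate.weekday() > 4:  # 5 = Saturday, 6 = Sunday: skip the weekend
        candidate += timedelta(days=1)
    return candidate


# From Friday 10:00 the next weekday-9AM slot is the following Monday
nxt = next_weekday_9am(datetime(2024, 1, 5, 10, 0))
```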
## System Tasks vs User Tasks
### System Tasks
- Created by administrators
- No user association (`user_id` is null)
- Typically for maintenance operations
- Accessible via admin endpoints
### User Tasks
- Created by regular users
- Associated with specific user
- User can only manage their own tasks
- Accessible via regular user endpoints
## Error Handling
The system provides comprehensive error handling:
- **Invalid Parameters**: Validation errors for missing or invalid task parameters
- **Scheduling Conflicts**: Prevention of resource conflicts
- **Timezone Errors**: Invalid timezone specifications handled gracefully
- **Execution Failures**: Failed tasks marked with error messages and retry logic
- **Expired Tasks**: Automatic cleanup of expired tasks
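The retry policy itself isn't specified in this document; one common choice for "retry logic" on failed executions is capped exponential backoff, sketched here as an assumption rather than the system's actual behavior:

```python
def retry_delay(attempt: int, base: float = 30.0, cap: float = 3600.0) -> float:
    """Exponential backoff delay in seconds: base * 2^attempt, capped at one hour."""
    return min(cap, base * (2 ** attempt))


# attempts 0..4 give 30, 60, 120, 240, 480 seconds; large attempts hit the cap
delays = [retry_delay(n) for n in range(5)]
```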
## Monitoring and Management
### Get Task Status
```bash
curl "http://localhost:8000/api/v1/scheduler/tasks/{task-id}" \
-H "Cookie: access_token=YOUR_JWT_TOKEN"
```
### List User Tasks
```bash
curl "http://localhost:8000/api/v1/scheduler/tasks?status=pending&limit=10" \
-H "Cookie: access_token=YOUR_JWT_TOKEN"
```
### Admin: View All Tasks
```bash
curl "http://localhost:8000/api/v1/scheduler/admin/tasks?limit=50" \
-H "Cookie: access_token=ADMIN_JWT_TOKEN"
```
### Cancel Task
```bash
curl -X DELETE "http://localhost:8000/api/v1/scheduler/tasks/{task-id}" \
-H "Cookie: access_token=YOUR_JWT_TOKEN"
```
## Migration from Old Scheduler
The new system automatically:
1. **Creates system tasks**: Daily credit recharge task created on startup
2. **Maintains compatibility**: Existing credit recharge functionality preserved
3. **Enhances functionality**: Adds user tasks and new task types
4. **Improves reliability**: Better error handling and timezone support
The old scheduler is completely replaced - no migration needed for existing functionality.


@@ -12,6 +12,7 @@ from app.api.v1 import (
     main,
     player,
     playlists,
+    scheduler,
     socket,
     sounds,
 )
@@ -28,6 +29,7 @@ api_router.include_router(files.router, tags=["files"])
 api_router.include_router(main.router, tags=["main"])
 api_router.include_router(player.router, tags=["player"])
 api_router.include_router(playlists.router, tags=["playlists"])
+api_router.include_router(scheduler.router, tags=["scheduler"])
 api_router.include_router(socket.router, tags=["socket"])
 api_router.include_router(sounds.router, tags=["sounds"])
 api_router.include_router(admin.router)

app/api/v1/scheduler.py (new file, 230 lines)

@@ -0,0 +1,230 @@
"""API endpoints for scheduled task management."""
from typing import Annotated
from fastapi import APIRouter, Depends, HTTPException, Query
from sqlmodel import select
from sqlmodel.ext.asyncio.session import AsyncSession
from app.core.database import get_db
from app.core.dependencies import (
get_admin_user,
get_current_active_user,
)
from app.core.services import get_global_scheduler_service
from app.models.scheduled_task import ScheduledTask, TaskStatus, TaskType
from app.models.user import User
from app.repositories.scheduled_task import ScheduledTaskRepository
from app.schemas.scheduler import (
ScheduledTaskCreate,
ScheduledTaskResponse,
ScheduledTaskUpdate,
TaskFilterParams,
)
from app.services.scheduler import SchedulerService
router = APIRouter(prefix="/scheduler")
def get_scheduler_service() -> SchedulerService:
"""Get the global scheduler service instance."""
return get_global_scheduler_service()
def get_task_filters(
status: Annotated[
TaskStatus | None, Query(description="Filter by task status"),
] = None,
task_type: Annotated[
TaskType | None, Query(description="Filter by task type"),
] = None,
limit: Annotated[int, Query(description="Maximum number of tasks to return")] = 50,
offset: Annotated[int, Query(description="Number of tasks to skip")] = 0,
) -> TaskFilterParams:
"""Create task filter parameters from query parameters."""
return TaskFilterParams(
status=status,
task_type=task_type,
limit=limit,
offset=offset,
)
@router.post("/tasks", response_model=ScheduledTaskResponse)
async def create_task(
task_data: ScheduledTaskCreate,
current_user: Annotated[User, Depends(get_current_active_user)],
scheduler_service: Annotated[SchedulerService, Depends(get_scheduler_service)],
) -> ScheduledTask:
"""Create a new scheduled task."""
try:
return await scheduler_service.create_task(
task_data=task_data,
user_id=current_user.id,
)
except Exception as e:
raise HTTPException(status_code=400, detail=str(e)) from e
@router.get("/tasks", response_model=list[ScheduledTaskResponse])
async def get_user_tasks(
filters: Annotated[TaskFilterParams, Depends(get_task_filters)],
current_user: Annotated[User, Depends(get_current_active_user)],
scheduler_service: Annotated[SchedulerService, Depends(get_scheduler_service)],
) -> list[ScheduledTask]:
"""Get user's scheduled tasks."""
return await scheduler_service.get_user_tasks(
user_id=current_user.id,
status=filters.status,
task_type=filters.task_type,
limit=filters.limit,
offset=filters.offset,
)
@router.get("/tasks/{task_id}", response_model=ScheduledTaskResponse)
async def get_task(
task_id: int,
current_user: Annotated[User, Depends(get_current_active_user)] = ...,
db_session: Annotated[AsyncSession, Depends(get_db)] = ...,
) -> ScheduledTask:
"""Get a specific scheduled task."""
repo = ScheduledTaskRepository(db_session)
task = await repo.get_by_id(task_id)
if not task:
raise HTTPException(status_code=404, detail="Task not found")
# Check if user owns the task or is admin
if task.user_id != current_user.id and not current_user.is_admin:
raise HTTPException(status_code=403, detail="Access denied")
return task
@router.patch("/tasks/{task_id}", response_model=ScheduledTaskResponse)
async def update_task(
task_id: int,
task_update: ScheduledTaskUpdate,
current_user: Annotated[User, Depends(get_current_active_user)] = ...,
db_session: Annotated[AsyncSession, Depends(get_db)] = ...,
) -> ScheduledTask:
"""Update a scheduled task."""
repo = ScheduledTaskRepository(db_session)
task = await repo.get_by_id(task_id)
if not task:
raise HTTPException(status_code=404, detail="Task not found")
# Check if user owns the task or is admin
if task.user_id != current_user.id and not current_user.is_admin:
raise HTTPException(status_code=403, detail="Access denied")
# Update task fields
update_data = task_update.model_dump(exclude_unset=True)
for field, value in update_data.items():
setattr(task, field, value)
return await repo.update(task)
@router.delete("/tasks/{task_id}")
async def cancel_task(
task_id: int,
current_user: Annotated[User, Depends(get_current_active_user)] = ...,
scheduler_service: Annotated[
SchedulerService, Depends(get_scheduler_service),
] = ...,
db_session: Annotated[AsyncSession, Depends(get_db)] = ...,
) -> dict:
"""Cancel a scheduled task."""
repo = ScheduledTaskRepository(db_session)
task = await repo.get_by_id(task_id)
if not task:
raise HTTPException(status_code=404, detail="Task not found")
# Check if user owns the task or is admin
if task.user_id != current_user.id and not current_user.is_admin:
raise HTTPException(status_code=403, detail="Access denied")
success = await scheduler_service.cancel_task(task_id)
if not success:
raise HTTPException(status_code=400, detail="Failed to cancel task")
return {"message": "Task cancelled successfully"}
# Admin-only endpoints
@router.get("/admin/tasks", response_model=list[ScheduledTaskResponse])
async def get_all_tasks(
status: Annotated[
TaskStatus | None, Query(description="Filter by task status"),
] = None,
task_type: Annotated[
TaskType | None, Query(description="Filter by task type"),
] = None,
limit: Annotated[
int | None, Query(description="Maximum number of tasks to return"),
] = 100,
offset: Annotated[
int | None, Query(description="Number of tasks to skip"),
] = 0,
_: Annotated[User, Depends(get_admin_user)] = ...,
db_session: Annotated[AsyncSession, Depends(get_db)] = ...,
) -> list[ScheduledTask]:
"""Get all scheduled tasks (admin only)."""
# Build query with pagination and filtering
statement = select(ScheduledTask)
if status:
statement = statement.where(ScheduledTask.status == status)
if task_type:
statement = statement.where(ScheduledTask.task_type == task_type)
statement = statement.order_by(ScheduledTask.scheduled_at.desc())
if offset:
statement = statement.offset(offset)
if limit:
statement = statement.limit(limit)
result = await db_session.exec(statement)
return list(result.all())
@router.get("/admin/system-tasks", response_model=list[ScheduledTaskResponse])
async def get_system_tasks(
status: Annotated[
TaskStatus | None, Query(description="Filter by task status"),
] = None,
task_type: Annotated[
TaskType | None, Query(description="Filter by task type"),
] = None,
_: Annotated[User, Depends(get_admin_user)] = ...,
db_session: Annotated[AsyncSession, Depends(get_db)] = ...,
) -> list[ScheduledTask]:
"""Get system tasks (admin only)."""
repo = ScheduledTaskRepository(db_session)
return await repo.get_system_tasks(status=status, task_type=task_type)
@router.post("/admin/system-tasks", response_model=ScheduledTaskResponse)
async def create_system_task(
task_data: ScheduledTaskCreate,
_: Annotated[User, Depends(get_admin_user)] = ...,
scheduler_service: Annotated[
SchedulerService, Depends(get_scheduler_service),
] = ...,
) -> ScheduledTask:
"""Create a system task (admin only)."""
try:
return await scheduler_service.create_task(
task_data=task_data,
user_id=None, # System task
)
except Exception as e:
raise HTTPException(status_code=400, detail=str(e)) from e


@@ -4,20 +4,11 @@ from sqlalchemy.ext.asyncio import AsyncEngine, create_async_engine
 from sqlmodel import SQLModel
 from sqlmodel.ext.asyncio.session import AsyncSession

+# Import all models to ensure SQLModel metadata discovery
+import app.models  # noqa: F401
+
 from app.core.config import settings
 from app.core.logging import get_logger
 from app.core.seeds import seed_all_data
-from app.models import (  # noqa: F401
-    extraction,
-    favorite,
-    plan,
-    playlist,
-    playlist_sound,
-    sound,
-    sound_played,
-    user,
-    user_oauth,
-)

 engine: AsyncEngine = create_async_engine(
     settings.DATABASE_URL,
app/core/services.py (new file, 23 lines)

@@ -0,0 +1,23 @@
"""Global services container to avoid circular imports."""
from app.services.scheduler import SchedulerService
class AppServices:
"""Container for application services."""
def __init__(self) -> None:
"""Initialize the application services container."""
self.scheduler_service: SchedulerService | None = None
# Global service container
app_services = AppServices()
def get_global_scheduler_service() -> SchedulerService:
"""Get the global scheduler service instance."""
if app_services.scheduler_service is None:
msg = "Scheduler service not initialized"
raise RuntimeError(msg)
return app_services.scheduler_service
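The point of this module-level container is that callers import `app.core.services` instead of `app.main`, breaking the import cycle while still failing loudly if the service is accessed before startup wiring. A minimal standalone sketch of the pattern (names here are illustrative, not the project's):

```python
class Services:
    """Minimal stand-in for the global AppServices container."""

    def __init__(self) -> None:
        self.scheduler = None  # wired up during application startup


services = Services()


def get_scheduler():
    """Fail loudly if accessed before startup has wired the service."""
    if services.scheduler is None:
        raise RuntimeError("Scheduler service not initialized")
    return services.scheduler


# Before startup: accessing the service raises instead of silently returning None
try:
    get_scheduler()
except RuntimeError as exc:
    error_text = str(exc)

# Startup assigns the real service instance; afterwards lookups succeed
services.scheduler = object()
```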


@@ -9,9 +9,14 @@ from app.api import api_router
 from app.core.config import settings
 from app.core.database import get_session_factory, init_db
 from app.core.logging import get_logger, setup_logging
+from app.core.services import app_services
 from app.middleware.logging import LoggingMiddleware
 from app.services.extraction_processor import extraction_processor
-from app.services.player import initialize_player_service, shutdown_player_service
+from app.services.player import (
+    get_player_service,
+    initialize_player_service,
+    shutdown_player_service,
+)
 from app.services.scheduler import SchedulerService
 from app.services.socket import socket_manager
@@ -35,17 +40,25 @@ async def lifespan(_app: FastAPI) -> AsyncGenerator[None]:
     logger.info("Player service started")

     # Start the scheduler service
-    scheduler_service = SchedulerService(get_session_factory())
-    await scheduler_service.start()
-    logger.info("Scheduler service started")
+    try:
+        player_service = get_player_service()  # Get the initialized player service
+        app_services.scheduler_service = SchedulerService(
+            get_session_factory(), player_service,
+        )
+        await app_services.scheduler_service.start()
+        logger.info("Enhanced scheduler service started")
+    except Exception:
+        logger.exception("Failed to start scheduler service - continuing without it")
+        app_services.scheduler_service = None

     yield

     logger.info("Shutting down application")

     # Stop the scheduler service
-    await scheduler_service.stop()
-    logger.info("Scheduler service stopped")
+    if app_services.scheduler_service:
+        await app_services.scheduler_service.stop()
+        logger.info("Scheduler service stopped")

     # Stop the player service
     await shutdown_player_service()


@@ -1 +1,32 @@
"""Models package.""" """Models package."""
# Import all models for SQLAlchemy metadata discovery
from .base import BaseModel
from .credit_action import CreditAction
from .credit_transaction import CreditTransaction
from .extraction import Extraction
from .favorite import Favorite
from .plan import Plan
from .playlist import Playlist
from .playlist_sound import PlaylistSound
from .scheduled_task import ScheduledTask
from .sound import Sound
from .sound_played import SoundPlayed
from .user import User
from .user_oauth import UserOauth
__all__ = [
"BaseModel",
"CreditAction",
"CreditTransaction",
"Extraction",
"Favorite",
"Plan",
"Playlist",
"PlaylistSound",
"ScheduledTask",
"Sound",
"SoundPlayed",
"User",
"UserOauth",
]


@@ -0,0 +1,124 @@
"""Scheduled task model for flexible task scheduling with timezone support."""
from datetime import UTC, datetime
from enum import Enum
from typing import Any
from sqlmodel import JSON, Column, Field
from app.models.base import BaseModel
class TaskType(str, Enum):
"""Available task types."""
CREDIT_RECHARGE = "credit_recharge"
PLAY_SOUND = "play_sound"
PLAY_PLAYLIST = "play_playlist"
class TaskStatus(str, Enum):
"""Task execution status."""
PENDING = "pending"
RUNNING = "running"
COMPLETED = "completed"
FAILED = "failed"
CANCELLED = "cancelled"
class RecurrenceType(str, Enum):
"""Recurrence patterns."""
NONE = "none" # One-shot task
HOURLY = "hourly"
DAILY = "daily"
WEEKLY = "weekly"
MONTHLY = "monthly"
YEARLY = "yearly"
CRON = "cron" # Custom cron expression
class ScheduledTask(BaseModel, table=True):
"""Model for scheduled tasks with timezone support."""
__tablename__ = "scheduled_task"
id: int | None = Field(primary_key=True, default=None)
name: str = Field(max_length=255, description="Human-readable task name")
task_type: TaskType = Field(description="Type of task to execute")
status: TaskStatus = Field(default=TaskStatus.PENDING)
# Scheduling fields with timezone support
scheduled_at: datetime = Field(description="When the task should be executed (UTC)")
timezone: str = Field(
default="UTC",
description="Timezone for scheduling (e.g., 'America/New_York')",
)
recurrence_type: RecurrenceType = Field(default=RecurrenceType.NONE)
cron_expression: str | None = Field(
default=None,
description="Cron expression for custom recurrence",
)
recurrence_count: int | None = Field(
default=None,
description="Number of times to repeat (None for infinite)",
)
executions_count: int = Field(default=0, description="Number of times executed")
# Task parameters
parameters: dict[str, Any] = Field(
default_factory=dict,
sa_column=Column(JSON),
description="Task-specific parameters",
)
# User association (None for system tasks)
user_id: int | None = Field(
default=None,
foreign_key="user.id",
description="User who created the task (None for system tasks)",
)
# Execution tracking
last_executed_at: datetime | None = Field(
default=None,
description="When the task was last executed (UTC)",
)
next_execution_at: datetime | None = Field(
default=None,
description="When the task should be executed next (UTC, for recurring tasks)",
)
error_message: str | None = Field(
default=None,
description="Error message if execution failed",
)
# Task lifecycle
is_active: bool = Field(default=True, description="Whether the task is active")
expires_at: datetime | None = Field(
default=None,
description="When the task expires (UTC, optional)",
)
def is_expired(self) -> bool:
"""Check if the task has expired."""
if self.expires_at is None:
return False
return datetime.now(tz=UTC).replace(tzinfo=None) > self.expires_at
def is_recurring(self) -> bool:
"""Check if the task is recurring."""
return self.recurrence_type != RecurrenceType.NONE
def should_repeat(self) -> bool:
"""Check if the task should be repeated."""
if not self.is_recurring():
return False
if self.recurrence_count is None:
return True
return self.executions_count < self.recurrence_count
def is_system_task(self) -> bool:
"""Check if this is a system task (no user association)."""
return self.user_id is None


@@ -72,18 +72,22 @@ class BaseRepository[ModelType]:
             logger.exception("Failed to get all %s", self.model.__name__)
             raise

-    async def create(self, entity_data: dict[str, Any]) -> ModelType:
+    async def create(self, entity_data: dict[str, Any] | ModelType) -> ModelType:
         """Create a new entity.

         Args:
-            entity_data: Dictionary of entity data
+            entity_data: Dictionary of entity data or model instance

         Returns:
             The created entity

         """
         try:
-            entity = self.model(**entity_data)
+            if isinstance(entity_data, dict):
+                entity = self.model(**entity_data)
+            else:
+                # Already a model instance
+                entity = entity_data
             self.session.add(entity)
             await self.session.commit()
             await self.session.refresh(entity)

@@ -0,0 +1,181 @@
"""Repository for scheduled task operations."""
from datetime import UTC, datetime
from sqlmodel import select
from sqlmodel.ext.asyncio.session import AsyncSession
from app.models.scheduled_task import (
RecurrenceType,
ScheduledTask,
TaskStatus,
TaskType,
)
from app.repositories.base import BaseRepository
class ScheduledTaskRepository(BaseRepository[ScheduledTask]):
"""Repository for scheduled task database operations."""
def __init__(self, session: AsyncSession) -> None:
"""Initialize the repository."""
super().__init__(ScheduledTask, session)
async def get_pending_tasks(self) -> list[ScheduledTask]:
"""Get all pending tasks that are ready to be executed."""
now = datetime.now(tz=UTC)
statement = select(ScheduledTask).where(
ScheduledTask.status == TaskStatus.PENDING,
ScheduledTask.is_active.is_(True),
ScheduledTask.scheduled_at <= now,
)
result = await self.session.exec(statement)
return list(result.all())
async def get_active_tasks(self) -> list[ScheduledTask]:
"""Get all active tasks."""
statement = select(ScheduledTask).where(
ScheduledTask.is_active.is_(True),
ScheduledTask.status.in_([TaskStatus.PENDING, TaskStatus.RUNNING]),
)
result = await self.session.exec(statement)
return list(result.all())
async def get_user_tasks(
self,
user_id: int,
status: TaskStatus | None = None,
task_type: TaskType | None = None,
limit: int | None = None,
offset: int | None = None,
) -> list[ScheduledTask]:
"""Get tasks for a specific user."""
statement = select(ScheduledTask).where(ScheduledTask.user_id == user_id)
if status:
statement = statement.where(ScheduledTask.status == status)
if task_type:
statement = statement.where(ScheduledTask.task_type == task_type)
statement = statement.order_by(ScheduledTask.scheduled_at.desc())
if offset:
statement = statement.offset(offset)
if limit:
statement = statement.limit(limit)
result = await self.session.exec(statement)
return list(result.all())
async def get_system_tasks(
self,
status: TaskStatus | None = None,
task_type: TaskType | None = None,
) -> list[ScheduledTask]:
"""Get system tasks (tasks with no user association)."""
statement = select(ScheduledTask).where(ScheduledTask.user_id.is_(None))
if status:
statement = statement.where(ScheduledTask.status == status)
if task_type:
statement = statement.where(ScheduledTask.task_type == task_type)
statement = statement.order_by(ScheduledTask.scheduled_at.desc())
result = await self.session.exec(statement)
return list(result.all())
async def get_recurring_tasks_due_for_next_execution(self) -> list[ScheduledTask]:
"""Get recurring tasks that need their next execution scheduled."""
now = datetime.now(tz=UTC)
statement = select(ScheduledTask).where(
ScheduledTask.recurrence_type != RecurrenceType.NONE,
ScheduledTask.is_active.is_(True),
ScheduledTask.status == TaskStatus.COMPLETED,
ScheduledTask.next_execution_at <= now,
)
result = await self.session.exec(statement)
return list(result.all())
async def get_expired_tasks(self) -> list[ScheduledTask]:
"""Get expired tasks that should be cleaned up."""
now = datetime.now(tz=UTC)
statement = select(ScheduledTask).where(
ScheduledTask.expires_at.is_not(None),
ScheduledTask.expires_at <= now,
ScheduledTask.is_active.is_(True),
)
result = await self.session.exec(statement)
return list(result.all())
async def cancel_user_tasks(
self,
user_id: int,
task_type: TaskType | None = None,
) -> int:
"""Cancel all pending/running tasks for a user."""
statement = select(ScheduledTask).where(
ScheduledTask.user_id == user_id,
ScheduledTask.status.in_([TaskStatus.PENDING, TaskStatus.RUNNING]),
)
if task_type:
statement = statement.where(ScheduledTask.task_type == task_type)
result = await self.session.exec(statement)
tasks = list(result.all())
count = 0
for task in tasks:
task.status = TaskStatus.CANCELLED
task.is_active = False
self.session.add(task)
count += 1
await self.session.commit()
return count
async def mark_as_running(self, task: ScheduledTask) -> None:
"""Mark a task as running."""
task.status = TaskStatus.RUNNING
self.session.add(task)
await self.session.commit()
await self.session.refresh(task)
async def mark_as_completed(
self,
task: ScheduledTask,
next_execution_at: datetime | None = None,
) -> None:
"""Mark a task as completed and set next execution if recurring."""
task.status = TaskStatus.COMPLETED
task.last_executed_at = datetime.now(tz=UTC)
task.executions_count += 1
task.error_message = None
if next_execution_at and task.should_repeat():
task.next_execution_at = next_execution_at
task.status = TaskStatus.PENDING
elif not task.should_repeat():
task.is_active = False
self.session.add(task)
await self.session.commit()
await self.session.refresh(task)
async def mark_as_failed(self, task: ScheduledTask, error_message: str) -> None:
"""Mark a task as failed with error message."""
task.status = TaskStatus.FAILED
task.error_message = error_message
task.last_executed_at = datetime.now(tz=UTC)
# For non-recurring tasks, deactivate on failure
if not task.is_recurring():
task.is_active = False
self.session.add(task)
await self.session.commit()
await self.session.refresh(task)
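The completion transition can be exercised without a database by restating it over a plain dict (an illustrative sketch of `mark_as_completed`'s state machine, not the repository code):

```python
def mark_completed(task: dict, next_execution_at=None) -> dict:
    """Pure-function sketch of the mark_as_completed state transition."""
    task["status"] = "completed"
    task["executions_count"] += 1
    task["error_message"] = None
    # Repeat while recurring and under the configured count (None = infinite)
    repeats = (
        task["recurrence_type"] != "none"
        and (
            task["recurrence_count"] is None
            or task["executions_count"] < task["recurrence_count"]
        )
    )
    if next_execution_at and repeats:
        task["next_execution_at"] = next_execution_at
        task["status"] = "pending"  # re-arm the task for its next run
    elif not repeats:
        task["is_active"] = False  # exhausted or one-shot: deactivate
    return task
```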

app/schemas/scheduler.py (new file, 197 lines)

@@ -0,0 +1,197 @@
"""Schemas for scheduled task API."""
from datetime import datetime
from typing import Any

from pydantic import BaseModel, Field

from app.models.scheduled_task import RecurrenceType, TaskStatus, TaskType


class TaskFilterParams(BaseModel):
    """Query parameters for filtering tasks."""

    status: TaskStatus | None = Field(default=None, description="Filter by task status")
    task_type: TaskType | None = Field(default=None, description="Filter by task type")
    limit: int = Field(default=50, description="Maximum number of tasks to return")
    offset: int = Field(default=0, description="Number of tasks to skip")


class ScheduledTaskBase(BaseModel):
    """Base schema for scheduled tasks."""

    name: str = Field(description="Human-readable task name")
    task_type: TaskType = Field(description="Type of task to execute")
    scheduled_at: datetime = Field(description="When the task should be executed")
    timezone: str = Field(default="UTC", description="Timezone for scheduling")
    parameters: dict[str, Any] = Field(
        default_factory=dict,
        description="Task-specific parameters",
    )
    recurrence_type: RecurrenceType = Field(
        default=RecurrenceType.NONE,
        description="Recurrence pattern",
    )
    cron_expression: str | None = Field(
        default=None,
        description="Cron expression for custom recurrence",
    )
    recurrence_count: int | None = Field(
        default=None,
        description="Number of times to repeat (None for infinite)",
    )
    expires_at: datetime | None = Field(
        default=None,
        description="When the task expires (optional)",
    )


class ScheduledTaskCreate(ScheduledTaskBase):
    """Schema for creating a scheduled task."""


class ScheduledTaskUpdate(BaseModel):
    """Schema for updating a scheduled task."""

    name: str | None = None
    scheduled_at: datetime | None = None
    timezone: str | None = None
    parameters: dict[str, Any] | None = None
    is_active: bool | None = None
    expires_at: datetime | None = None


class ScheduledTaskResponse(ScheduledTaskBase):
    """Schema for scheduled task responses."""

    id: int
    status: TaskStatus
    user_id: int | None = None
    executions_count: int
    last_executed_at: datetime | None = None
    next_execution_at: datetime | None = None
    error_message: str | None = None
    is_active: bool
    created_at: datetime
    updated_at: datetime

    class Config:
        """Pydantic configuration."""

        from_attributes = True


# Task-specific parameter schemas
class CreditRechargeParameters(BaseModel):
    """Parameters for credit recharge tasks."""

    user_id: int | None = Field(
        default=None,
        description="Specific user ID to recharge (None for all users)",
    )


class PlaySoundParameters(BaseModel):
    """Parameters for play sound tasks."""

    sound_id: int = Field(description="ID of the sound to play")


class PlayPlaylistParameters(BaseModel):
    """Parameters for play playlist tasks."""

    playlist_id: int = Field(description="ID of the playlist to play")
    play_mode: str = Field(
        default="continuous",
        description="Play mode (continuous, loop, loop_one, random, single)",
    )
    shuffle: bool = Field(default=False, description="Whether to shuffle the playlist")


# Convenience schemas for creating specific task types
class CreateCreditRechargeTask(BaseModel):
    """Schema for creating credit recharge tasks."""

    name: str = "Credit Recharge"
    scheduled_at: datetime
    timezone: str = "UTC"
    recurrence_type: RecurrenceType = RecurrenceType.NONE
    cron_expression: str | None = None
    recurrence_count: int | None = None
    expires_at: datetime | None = None
    user_id: int | None = None

    def to_task_create(self) -> ScheduledTaskCreate:
        """Convert to generic task creation schema."""
        return ScheduledTaskCreate(
            name=self.name,
            task_type=TaskType.CREDIT_RECHARGE,
            scheduled_at=self.scheduled_at,
            timezone=self.timezone,
            parameters={"user_id": self.user_id},
            recurrence_type=self.recurrence_type,
            cron_expression=self.cron_expression,
            recurrence_count=self.recurrence_count,
            expires_at=self.expires_at,
        )


class CreatePlaySoundTask(BaseModel):
    """Schema for creating play sound tasks."""

    name: str
    scheduled_at: datetime
    sound_id: int
    timezone: str = "UTC"
    recurrence_type: RecurrenceType = RecurrenceType.NONE
    cron_expression: str | None = None
    recurrence_count: int | None = None
    expires_at: datetime | None = None

    def to_task_create(self) -> ScheduledTaskCreate:
        """Convert to generic task creation schema."""
        return ScheduledTaskCreate(
            name=self.name,
            task_type=TaskType.PLAY_SOUND,
            scheduled_at=self.scheduled_at,
            timezone=self.timezone,
            parameters={"sound_id": self.sound_id},
            recurrence_type=self.recurrence_type,
            cron_expression=self.cron_expression,
            recurrence_count=self.recurrence_count,
            expires_at=self.expires_at,
        )


class CreatePlayPlaylistTask(BaseModel):
    """Schema for creating play playlist tasks."""

    name: str
    scheduled_at: datetime
    playlist_id: int
    play_mode: str = "continuous"
    shuffle: bool = False
    timezone: str = "UTC"
    recurrence_type: RecurrenceType = RecurrenceType.NONE
    cron_expression: str | None = None
    recurrence_count: int | None = None
    expires_at: datetime | None = None

    def to_task_create(self) -> ScheduledTaskCreate:
        """Convert to generic task creation schema."""
        return ScheduledTaskCreate(
            name=self.name,
            task_type=TaskType.PLAY_PLAYLIST,
            scheduled_at=self.scheduled_at,
            timezone=self.timezone,
            parameters={
                "playlist_id": self.playlist_id,
                "play_mode": self.play_mode,
                "shuffle": self.shuffle,
            },
            recurrence_type=self.recurrence_type,
            cron_expression=self.cron_expression,
            recurrence_count=self.recurrence_count,
            expires_at=self.expires_at,
        )
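The convenience schemas above each end in a `to_task_create` call that flattens their type-specific fields into the generic `parameters` dict. A dependency-free sketch of that shape, with plain dicts standing in for the Pydantic models:

```python
def play_sound_payload(name: str, scheduled_at: str, sound_id: int) -> dict:
    """Flatten sound-specific fields into the generic task payload."""
    return {
        "name": name,
        "task_type": "play_sound",             # TaskType.PLAY_SOUND in the real schema
        "scheduled_at": scheduled_at,
        "parameters": {"sound_id": sound_id},  # type-specific fields land here
    }

payload = play_sound_payload("Morning chime", "2025-09-01T08:00:00", 42)
print(payload["parameters"])  # {'sound_id': 42}
```

The API thus only ever persists one generic task shape; the per-type schemas exist purely for input validation.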

app/services/player.py

@@ -63,7 +63,7 @@ class PlayerState:
         """Convert player state to dictionary for serialization."""
         return {
             "status": self.status.value,
-            "mode": self.mode.value,
+            "mode": self.mode.value if isinstance(self.mode, PlayerMode) else self.mode,
             "volume": self.volume,
             "previous_volume": self.previous_volume,
             "position": self.current_sound_position or 0,
@@ -401,8 +401,16 @@ class PlayerService:
         if self.state.volume == 0 and self.state.previous_volume > 0:
             await self.set_volume(self.state.previous_volume)

-    async def set_mode(self, mode: PlayerMode) -> None:
+    async def set_mode(self, mode: PlayerMode | str) -> None:
         """Set playback mode."""
+        if isinstance(mode, str):
+            # Convert string to PlayerMode enum
+            try:
+                mode = PlayerMode(mode)
+            except ValueError:
+                logger.exception("Invalid player mode: %s", mode)
+                return
         self.state.mode = mode
         await self._broadcast_state()
         logger.info("Playback mode set to: %s", mode.value)
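The `set_mode` change boils down to "accept either the enum or its string value, and bail out on junk input". A standalone sketch of that coercion, with a stand-in enum in place of the app's `PlayerMode`:

```python
from enum import Enum

class PlayerMode(Enum):  # stand-in for the app's PlayerMode enum
    CONTINUOUS = "continuous"
    LOOP = "loop"

def coerce_mode(mode):
    """Return a PlayerMode, or None for an unknown string (the service logs and returns)."""
    if isinstance(mode, str):
        try:
            mode = PlayerMode(mode)
        except ValueError:
            return None
    return mode

print(coerce_mode("loop"))        # PlayerMode.LOOP
print(coerce_mode("warp-speed"))  # None
```

Accepting strings here is what lets the task handlers pass the raw `play_mode` parameter straight through without converting it first.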
@@ -429,6 +437,26 @@ class PlayerService:
         await self._broadcast_state()

+    async def load_playlist(self, playlist_id: int) -> None:
+        """Load a specific playlist by ID."""
+        session = self.db_session_factory()
+        try:
+            playlist_repo = PlaylistRepository(session)
+            playlist = await playlist_repo.get_by_id(playlist_id)
+            if playlist and playlist.id:
+                sounds = await playlist_repo.get_playlist_sounds(playlist.id)
+                await self._handle_playlist_reload(playlist, sounds)
+                logger.info(
+                    "Loaded playlist: %s (%s sounds)",
+                    playlist.name,
+                    len(sounds),
+                )
+            else:
+                logger.warning("Playlist not found: %s", playlist_id)
+        finally:
+            await session.close()
+        await self._broadcast_state()
+
     async def _handle_playlist_reload(
         self,
         current_playlist: Playlist,

app/services/scheduler.py

@@ -1,63 +1,425 @@
"""Enhanced scheduler service for flexible task scheduling with timezone support."""
from collections.abc import Callable
from contextlib import suppress
from datetime import UTC, datetime, timedelta

import pytz
from apscheduler.schedulers.asyncio import AsyncIOScheduler
from apscheduler.triggers.cron import CronTrigger
from apscheduler.triggers.date import DateTrigger
from apscheduler.triggers.interval import IntervalTrigger
from sqlmodel.ext.asyncio.session import AsyncSession

from app.core.logging import get_logger
from app.models.scheduled_task import (
    RecurrenceType,
    ScheduledTask,
    TaskStatus,
    TaskType,
)
from app.repositories.scheduled_task import ScheduledTaskRepository
from app.schemas.scheduler import ScheduledTaskCreate
from app.services.credit import CreditService
from app.services.player import PlayerService
from app.services.task_handlers import TaskHandlerRegistry

logger = get_logger(__name__)


class SchedulerService:
    """Enhanced service for managing scheduled tasks with timezone support."""

    def __init__(
        self,
        db_session_factory: Callable[[], AsyncSession],
        player_service: PlayerService,
    ) -> None:
        """Initialize the scheduler service.

        Args:
            db_session_factory: Factory function to create database sessions
            player_service: Player service for audio playback tasks
        """
        self.db_session_factory = db_session_factory
        self.scheduler = AsyncIOScheduler(timezone=pytz.UTC)
        self.credit_service = CreditService(db_session_factory)
        self.player_service = player_service
        self._running_tasks: set[str] = set()

    async def start(self) -> None:
        """Start the scheduler and load all active tasks."""
        logger.info("Starting enhanced scheduler service...")
        self.scheduler.start()
        # Schedule system tasks initialization for after startup
        self.scheduler.add_job(
            self._initialize_system_tasks,
            "date",
            run_date=datetime.now(tz=UTC) + timedelta(seconds=2),
            id="initialize_system_tasks",
            name="Initialize System Tasks",
            replace_existing=True,
        )
        # Schedule periodic cleanup and maintenance
        self.scheduler.add_job(
            self._maintenance_job,
            "interval",
            minutes=5,
            id="scheduler_maintenance",
            name="Scheduler Maintenance",
            replace_existing=True,
        )
        logger.info("Enhanced scheduler service started successfully")

    async def stop(self) -> None:
        """Stop the scheduler."""
        logger.info("Stopping scheduler service...")
        self.scheduler.shutdown(wait=True)
        logger.info("Scheduler service stopped")

    async def create_task(
        self,
        task_data: ScheduledTaskCreate,
        user_id: int | None = None,
    ) -> ScheduledTask:
        """Create a new scheduled task from schema data."""
        async with self.db_session_factory() as session:
            repo = ScheduledTaskRepository(session)
            # Convert scheduled_at to UTC if it's in a different timezone
            scheduled_at = task_data.scheduled_at
            if task_data.timezone != "UTC":
                tz = pytz.timezone(task_data.timezone)
                if scheduled_at.tzinfo is None:
                    # Assume the datetime is in the specified timezone
                    scheduled_at = tz.localize(scheduled_at)
                scheduled_at = scheduled_at.astimezone(pytz.UTC).replace(tzinfo=None)
            db_task_data = {
                "name": task_data.name,
                "task_type": task_data.task_type,
                "scheduled_at": scheduled_at,
                "timezone": task_data.timezone,
                "parameters": task_data.parameters,
                "user_id": user_id,
                "recurrence_type": task_data.recurrence_type,
                "cron_expression": task_data.cron_expression,
                "recurrence_count": task_data.recurrence_count,
                "expires_at": task_data.expires_at,
            }
            created_task = await repo.create(db_task_data)
            await self._schedule_apscheduler_job(created_task)
            logger.info(
                "Created scheduled task: %s (%s)",
                created_task.name,
                created_task.id,
            )
            return created_task

    async def cancel_task(self, task_id: int) -> bool:
        """Cancel a scheduled task."""
        async with self.db_session_factory() as session:
            repo = ScheduledTaskRepository(session)
            task = await repo.get_by_id(task_id)
            if not task:
                return False
            await repo.update(task, {
                "status": TaskStatus.CANCELLED,
                "is_active": False,
            })
            # Remove from APScheduler (job might not exist in scheduler)
            with suppress(Exception):
                self.scheduler.remove_job(str(task_id))
            logger.info("Cancelled task: %s (%s)", task.name, task_id)
            return True

    async def get_user_tasks(
        self,
        user_id: int,
        status: TaskStatus | None = None,
        task_type: TaskType | None = None,
        limit: int | None = None,
        offset: int | None = None,
    ) -> list[ScheduledTask]:
        """Get tasks for a specific user."""
        async with self.db_session_factory() as session:
            repo = ScheduledTaskRepository(session)
            return await repo.get_user_tasks(user_id, status, task_type, limit, offset)

    async def _initialize_system_tasks(self) -> None:
        """Initialize system tasks and load active tasks from database."""
        logger.info("Initializing system tasks...")
        try:
            # Create system tasks if they don't exist
            await self._ensure_system_tasks()
            # Load all active tasks from database
            await self._load_active_tasks()
            logger.info("System tasks initialized successfully")
        except Exception:
            logger.exception("Failed to initialize system tasks")
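`create_task` stores naive UTC datetimes: a naive input is first localized to the task's timezone, then converted and stripped of tzinfo. An equivalent stdlib sketch using `zoneinfo` (the service itself uses `pytz.localize`):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

def to_naive_utc(scheduled_at: datetime, tz_name: str) -> datetime:
    """Interpret a naive datetime in tz_name, then store it as naive UTC."""
    if scheduled_at.tzinfo is None:
        scheduled_at = scheduled_at.replace(tzinfo=ZoneInfo(tz_name))
    return scheduled_at.astimezone(timezone.utc).replace(tzinfo=None)

# 09:00 in New York on a January date (UTC-5) is stored as 14:00 UTC
print(to_naive_utc(datetime(2025, 1, 15, 9, 0), "America/New_York"))  # 2025-01-15 14:00:00
```

Already-aware inputs skip the localize step and are simply converted, which matches the branch structure above.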
    async def _ensure_system_tasks(self) -> None:
        """Ensure required system tasks exist."""
        async with self.db_session_factory() as session:
            repo = ScheduledTaskRepository(session)
            # Check if daily credit recharge task exists
            system_tasks = await repo.get_system_tasks(
                task_type=TaskType.CREDIT_RECHARGE,
            )
            daily_recharge_exists = any(
                task.recurrence_type == RecurrenceType.DAILY
                and task.is_active
                for task in system_tasks
            )
            if not daily_recharge_exists:
                # Create daily credit recharge task
                tomorrow_midnight = datetime.now(tz=UTC).replace(
                    hour=0, minute=0, second=0, microsecond=0,
                ) + timedelta(days=1)
                task_data = {
                    "name": "Daily Credit Recharge",
                    "task_type": TaskType.CREDIT_RECHARGE,
                    "scheduled_at": tomorrow_midnight,
                    "recurrence_type": RecurrenceType.DAILY,
                    "parameters": {},
                }
                await repo.create(task_data)
                logger.info("Created system daily credit recharge task")

    async def _load_active_tasks(self) -> None:
        """Load all active tasks from database into scheduler."""
        async with self.db_session_factory() as session:
            repo = ScheduledTaskRepository(session)
            active_tasks = await repo.get_active_tasks()
            for task in active_tasks:
                await self._schedule_apscheduler_job(task)
            logger.info("Loaded %s active tasks into scheduler", len(active_tasks))

    async def _schedule_apscheduler_job(self, task: ScheduledTask) -> None:
        """Schedule a task in APScheduler."""
        job_id = str(task.id)
        # Remove existing job if it exists
        with suppress(Exception):
            self.scheduler.remove_job(job_id)
        # Don't schedule if task is not active or already completed/failed
        inactive_statuses = [
            TaskStatus.COMPLETED,
            TaskStatus.FAILED,
            TaskStatus.CANCELLED,
        ]
        if not task.is_active or task.status in inactive_statuses:
            return
        # Create trigger based on recurrence type
        trigger = self._create_trigger(task)
        if not trigger:
            logger.warning("Could not create trigger for task %s", task.id)
            return
        # Schedule the job
        self.scheduler.add_job(
            self._execute_task,
            trigger=trigger,
            args=[task.id],
            id=job_id,
            name=task.name,
            replace_existing=True,
        )
        logger.debug("Scheduled APScheduler job for task %s", task.id)

    def _create_trigger(
        self, task: ScheduledTask,
    ) -> DateTrigger | IntervalTrigger | CronTrigger | None:
        """Create APScheduler trigger based on task configuration."""
        tz = pytz.timezone(task.timezone)
        scheduled_time = task.scheduled_at
        # Handle special cases first
        if task.recurrence_type == RecurrenceType.NONE:
            return DateTrigger(run_date=scheduled_time, timezone=tz)
        if task.recurrence_type == RecurrenceType.CRON and task.cron_expression:
            return CronTrigger.from_crontab(task.cron_expression, timezone=tz)
        # Handle interval-based recurrence types
        interval_configs = {
            RecurrenceType.HOURLY: {"hours": 1},
            RecurrenceType.DAILY: {"days": 1},
            RecurrenceType.WEEKLY: {"weeks": 1},
        }
        if task.recurrence_type in interval_configs:
            config = interval_configs[task.recurrence_type]
            return IntervalTrigger(start_date=scheduled_time, timezone=tz, **config)
        # Handle cron-based recurrence types
        cron_configs = {
            RecurrenceType.MONTHLY: {
                "day": scheduled_time.day,
                "hour": scheduled_time.hour,
                "minute": scheduled_time.minute,
            },
            RecurrenceType.YEARLY: {
                "month": scheduled_time.month,
                "day": scheduled_time.day,
                "hour": scheduled_time.hour,
                "minute": scheduled_time.minute,
            },
        }
        if task.recurrence_type in cron_configs:
            config = cron_configs[task.recurrence_type]
            return CronTrigger(timezone=tz, **config)
        return None
    async def _execute_task(self, task_id: int) -> None:
        """Execute a scheduled task."""
        task_id_str = str(task_id)
        # Prevent concurrent execution of the same task
        if task_id_str in self._running_tasks:
            logger.warning("Task %s is already running, skipping execution", task_id)
            return
        self._running_tasks.add(task_id_str)
        try:
            async with self.db_session_factory() as session:
                repo = ScheduledTaskRepository(session)
                # Get fresh task data
                task = await repo.get_by_id(task_id)
                if not task:
                    logger.warning("Task %s not found", task_id)
                    return
                # Check if task is still active and pending
                if not task.is_active or task.status != TaskStatus.PENDING:
                    logger.info("Task %s not active or not pending, skipping", task_id)
                    return
                # Check if task has expired
                if task.is_expired():
                    logger.info("Task %s has expired, marking as cancelled", task_id)
                    await repo.update(task, {
                        "status": TaskStatus.CANCELLED,
                        "is_active": False,
                    })
                    return
                # Mark task as running
                await repo.mark_as_running(task)
                # Execute the task
                try:
                    handler_registry = TaskHandlerRegistry(
                        session,
                        self.db_session_factory,
                        self.credit_service,
                        self.player_service,
                    )
                    await handler_registry.execute_task(task)
                    # Calculate next execution time for recurring tasks
                    next_execution_at = None
                    if task.should_repeat():
                        next_execution_at = self._calculate_next_execution(task)
                    # Mark as completed
                    await repo.mark_as_completed(task, next_execution_at)
                    # Reschedule if recurring
                    if next_execution_at and task.should_repeat():
                        # Refresh task to get updated data
                        await session.refresh(task)
                        await self._schedule_apscheduler_job(task)
                except Exception as e:
                    await repo.mark_as_failed(task, str(e))
                    logger.exception("Task %s execution failed", task_id)
        finally:
            self._running_tasks.discard(task_id_str)

    def _calculate_next_execution(self, task: ScheduledTask) -> datetime | None:
        """Calculate the next execution time for a recurring task."""
        now = datetime.now(tz=UTC)
        if task.recurrence_type == RecurrenceType.HOURLY:
            return now + timedelta(hours=1)
        if task.recurrence_type == RecurrenceType.DAILY:
            return now + timedelta(days=1)
        if task.recurrence_type == RecurrenceType.WEEKLY:
            return now + timedelta(weeks=1)
        if task.recurrence_type == RecurrenceType.MONTHLY:
            # Add approximately one month
            return now + timedelta(days=30)
        if task.recurrence_type == RecurrenceType.YEARLY:
            return now + timedelta(days=365)
        return None
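`_calculate_next_execution` is a fixed offset per recurrence type, with 30-day and 365-day approximations for monthly and yearly. A table-driven sketch of the same logic, using plain strings in place of the `RecurrenceType` enum:

```python
from datetime import datetime, timedelta, timezone

NEXT_OFFSETS = {
    "hourly": timedelta(hours=1),
    "daily": timedelta(days=1),
    "weekly": timedelta(weeks=1),
    "monthly": timedelta(days=30),   # approximation, as in the service
    "yearly": timedelta(days=365),   # approximation, as in the service
}

def next_execution(recurrence: str, now: datetime):
    """Return the next run time, or None for non-recurring tasks."""
    offset = NEXT_OFFSETS.get(recurrence)
    return now + offset if offset is not None else None

now = datetime(2025, 1, 1, tzinfo=timezone.utc)
print(next_execution("weekly", now))  # 2025-01-08 00:00:00+00:00
print(next_execution("none", now))    # None
```

Note the approximations mean a monthly task scheduled for the 31st will drift; the APScheduler `CronTrigger` path above is what actually pins monthly/yearly jobs to a calendar day.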
    async def _maintenance_job(self) -> None:
        """Periodic maintenance job to clean up expired tasks and handle scheduling."""
        try:
            async with self.db_session_factory() as session:
                repo = ScheduledTaskRepository(session)
                # Handle expired tasks
                expired_tasks = await repo.get_expired_tasks()
                for task in expired_tasks:
                    await repo.update(task, {
                        "status": TaskStatus.CANCELLED,
                        "is_active": False,
                    })
                    # Remove from scheduler
                    with suppress(Exception):
                        self.scheduler.remove_job(str(task.id))
                if expired_tasks:
                    logger.info("Cleaned up %s expired tasks", len(expired_tasks))
                # Handle any missed recurring tasks
                due_recurring = await repo.get_recurring_tasks_due_for_next_execution()
                for task in due_recurring:
                    if task.should_repeat():
                        next_scheduled_at = (
                            task.next_execution_at or datetime.now(tz=UTC)
                        )
                        await repo.update(task, {
                            "status": TaskStatus.PENDING,
                            "scheduled_at": next_scheduled_at,
                        })
                        await self._schedule_apscheduler_job(task)
                if due_recurring:
                    logger.info("Rescheduled %s recurring tasks", len(due_recurring))
        except Exception:
            logger.exception("Maintenance job failed")

app/services/task_handlers.py (new file, 183 lines)

@@ -0,0 +1,183 @@
"""Task execution handlers for different task types."""
from collections.abc import Callable

from sqlmodel.ext.asyncio.session import AsyncSession

from app.core.logging import get_logger
from app.models.scheduled_task import ScheduledTask, TaskType
from app.repositories.playlist import PlaylistRepository
from app.repositories.sound import SoundRepository
from app.services.credit import CreditService
from app.services.player import PlayerService
from app.services.vlc_player import VLCPlayerService

logger = get_logger(__name__)


class TaskExecutionError(Exception):
    """Exception raised when task execution fails."""


class TaskHandlerRegistry:
    """Registry for task execution handlers."""

    def __init__(
        self,
        db_session: AsyncSession,
        db_session_factory: Callable[[], AsyncSession],
        credit_service: CreditService,
        player_service: PlayerService,
    ) -> None:
        """Initialize the task handler registry."""
        self.db_session = db_session
        self.db_session_factory = db_session_factory
        self.credit_service = credit_service
        self.player_service = player_service
        self.sound_repository = SoundRepository(db_session)
        self.playlist_repository = PlaylistRepository(db_session)
        # Register handlers
        self._handlers = {
            TaskType.CREDIT_RECHARGE: self._handle_credit_recharge,
            TaskType.PLAY_SOUND: self._handle_play_sound,
            TaskType.PLAY_PLAYLIST: self._handle_play_playlist,
        }

    async def execute_task(self, task: ScheduledTask) -> None:
        """Execute a task based on its type."""
        handler = self._handlers.get(task.task_type)
        if not handler:
            msg = f"No handler registered for task type: {task.task_type}"
            raise TaskExecutionError(msg)
        logger.info(
            "Executing task %s (%s): %s",
            task.id,
            task.task_type.value,
            task.name,
        )
        try:
            await handler(task)
            logger.info("Task %s executed successfully", task.id)
        except Exception as e:
            logger.exception("Task %s execution failed", task.id)
            msg = f"Task execution failed: {e!s}"
            raise TaskExecutionError(msg) from e

    async def _handle_credit_recharge(self, task: ScheduledTask) -> None:
        """Handle credit recharge task."""
        parameters = task.parameters
        user_id = parameters.get("user_id")
        if user_id:
            # Recharge specific user
            try:
                user_id_int = int(user_id)
            except (ValueError, TypeError) as e:
                msg = f"Invalid user_id format: {user_id}"
                raise TaskExecutionError(msg) from e
            stats = await self.credit_service.recharge_user_credits(user_id_int)
            logger.info("Recharged credits for user %s: %s", user_id, stats)
        else:
            # Recharge all users (system task)
            stats = await self.credit_service.recharge_all_users_credits()
            logger.info("Recharged credits for all users: %s", stats)

    async def _handle_play_sound(self, task: ScheduledTask) -> None:
        """Handle play sound task."""
        parameters = task.parameters
        sound_id = parameters.get("sound_id")
        if not sound_id:
            msg = "sound_id parameter is required for PLAY_SOUND tasks"
            raise TaskExecutionError(msg)
        try:
            # Handle both integer and string sound IDs
            sound_id_int = int(sound_id)
        except (ValueError, TypeError) as e:
            msg = f"Invalid sound_id format: {sound_id}"
            raise TaskExecutionError(msg) from e
        # Check if this is a user task (has user_id)
        if task.user_id:
            # User task: use credit-aware playback
            vlc_service = VLCPlayerService(self.db_session_factory)
            try:
                result = await vlc_service.play_sound_with_credits(
                    sound_id_int, task.user_id,
                )
                logger.info(
                    (
                        "Played sound %s via scheduled task for user %s "
                        "(credits deducted: %s)"
                    ),
                    result.get("sound_name", sound_id),
                    task.user_id,
                    result.get("credits_deducted", 0),
                )
            except Exception as e:
                # Convert HTTP exceptions or credit errors to task execution errors
                msg = f"Failed to play sound with credits: {e!s}"
                raise TaskExecutionError(msg) from e
        else:
            # System task: play without credit deduction
            sound = await self.sound_repository.get_by_id(sound_id_int)
            if not sound:
                msg = f"Sound not found: {sound_id}"
                raise TaskExecutionError(msg)
            vlc_service = VLCPlayerService(self.db_session_factory)
            success = await vlc_service.play_sound(sound)
            if not success:
                msg = f"Failed to play sound {sound.filename}"
                raise TaskExecutionError(msg)
            logger.info("Played sound %s via scheduled system task", sound.filename)

    async def _handle_play_playlist(self, task: ScheduledTask) -> None:
        """Handle play playlist task."""
        parameters = task.parameters
        playlist_id = parameters.get("playlist_id")
        play_mode = parameters.get("play_mode", "continuous")
        shuffle = parameters.get("shuffle", False)
        if not playlist_id:
            msg = "playlist_id parameter is required for PLAY_PLAYLIST tasks"
            raise TaskExecutionError(msg)
        try:
            # Handle both integer and string playlist IDs
            playlist_id_int = int(playlist_id)
        except (ValueError, TypeError) as e:
            msg = f"Invalid playlist_id format: {playlist_id}"
            raise TaskExecutionError(msg) from e
        # Get the playlist from database
        playlist = await self.playlist_repository.get_by_id(playlist_id_int)
        if not playlist:
            msg = f"Playlist not found: {playlist_id}"
            raise TaskExecutionError(msg)
        # Load playlist in player
        await self.player_service.load_playlist(playlist_id_int)
        # Set play mode if specified
        if play_mode in ["continuous", "loop", "loop_one", "random", "single"]:
            await self.player_service.set_mode(play_mode)
        # Enable shuffle if requested
        if shuffle:
            await self.player_service.set_shuffle(shuffle=True)
        # Start playing
        await self.player_service.play()
        logger.info("Started playing playlist %s via scheduled task", playlist.name)
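The registry above is a dict from task type to handler coroutine, with a loud failure for unknown types. A self-contained miniature of the same dispatch pattern, with hypothetical handlers standing in for the real ones:

```python
import asyncio

class TaskExecutionError(Exception):
    """Raised when no handler exists or a handler fails."""

class MiniRegistry:
    def __init__(self) -> None:
        self.log: list[str] = []
        # Map task types to handler coroutines, as TaskHandlerRegistry does
        self._handlers = {
            "credit_recharge": self._handle_credit_recharge,
            "play_sound": self._handle_play_sound,
        }

    async def execute(self, task_type: str, parameters: dict) -> None:
        handler = self._handlers.get(task_type)
        if handler is None:
            msg = f"No handler registered for task type: {task_type}"
            raise TaskExecutionError(msg)
        await handler(parameters)

    async def _handle_credit_recharge(self, parameters: dict) -> None:
        self.log.append(f"recharge:{parameters.get('user_id', 'all')}")

    async def _handle_play_sound(self, parameters: dict) -> None:
        if "sound_id" not in parameters:
            raise TaskExecutionError("sound_id parameter is required")
        self.log.append(f"play:{parameters['sound_id']}")

registry = MiniRegistry()
asyncio.run(registry.execute("play_sound", {"sound_id": 7}))
print(registry.log)  # ['play:7']
```

Keeping dispatch in a dict rather than an if/elif chain means adding a task type is one entry plus one method, which is why the real registry builds `_handlers` in `__init__`.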

app/services/vlc_player.py

@@ -238,75 +238,76 @@ class VLCPlayerService:
            return
        logger.info("Recording play count for sound %s", sound_id)
        # Initialize variables for WebSocket event
        old_count = 0
        sound = None
        admin_user_id = None
        admin_user_name = None
        try:
            async with self.db_session_factory() as session:
                sound_repo = SoundRepository(session)
                user_repo = UserRepository(session)
                # Update sound play count
                sound = await sound_repo.get_by_id(sound_id)
                if sound:
                    old_count = sound.play_count
                    # Update the sound's play count using direct attribute modification
                    sound.play_count = sound.play_count + 1
                    session.add(sound)
                    await session.commit()
                    await session.refresh(sound)
                    logger.info(
                        "Updated sound %s play_count: %s -> %s",
                        sound_id,
                        old_count,
                        old_count + 1,
                    )
                else:
                    logger.warning("Sound %s not found for play count update", sound_id)
                # Record play history for admin user (ID 1) as placeholder
                # This could be refined to track per-user play history
                admin_user = await user_repo.get_by_id(1)
                if admin_user:
                    admin_user_id = admin_user.id
                    admin_user_name = admin_user.name
                # Always create a new SoundPlayed record for each play event
                sound_played = SoundPlayed(
                    user_id=admin_user_id,  # Can be None for player-based plays
                    sound_id=sound_id,
                )
                session.add(sound_played)
                logger.info(
                    "Created SoundPlayed record for user %s, sound %s",
                    admin_user_id,
                    sound_id,
                )
                await session.commit()
                logger.info("Successfully recorded play count for sound %s", sound_id)
        except Exception:
            logger.exception("Error recording play count for sound %s", sound_id)
        # Emit sound_played event via WebSocket (outside session context)
        try:
            event_data = {
                "sound_id": sound_id,
                "sound_name": sound_name,
                "user_id": admin_user_id,
                "user_name": admin_user_name,
                "play_count": (old_count + 1) if sound else None,
            }
            await socket_manager.broadcast_to_all("sound_played", event_data)
            logger.info("Broadcasted sound_played event for sound %s", sound_id)
        except Exception:
            logger.exception(
                "Failed to broadcast sound_played event for sound %s",
                sound_id,
            )

    async def play_sound_with_credits(
        self,

pyproject.toml

@@ -8,28 +8,29 @@ dependencies = [
     "aiosqlite==0.21.0",
     "apscheduler==3.11.0",
     "bcrypt==4.3.0",
-    "email-validator==2.2.0",
+    "email-validator==2.3.0",
     "fastapi[standard]==0.116.1",
     "ffmpeg-python==0.2.0",
     "httpx==0.28.1",
     "pydantic-settings==2.10.1",
     "pyjwt==2.10.1",
     "python-socketio==5.13.0",
+    "pytz==2025.2",
     "python-vlc==3.0.21203",
     "sqlmodel==0.0.24",
     "uvicorn[standard]==0.35.0",
-    "yt-dlp==2025.8.20",
+    "yt-dlp==2025.8.27",
 ]

 [tool.uv]
 dev-dependencies = [
-    "coverage==7.10.4",
-    "faker==37.5.3",
+    "coverage==7.10.5",
+    "faker==37.6.0",
     "httpx==0.28.1",
     "mypy==1.17.1",
     "pytest==8.4.1",
     "pytest-asyncio==1.1.0",
-    "ruff==0.12.10",
+    "ruff==0.12.11",
 ]

 [tool.mypy]
@@ -68,6 +69,7 @@ ignore = ["D100", "D103", "TRY301"]
 ]

 [tool.pytest.ini_options]
+asyncio_mode = "auto"
 filterwarnings = [
     "ignore:transaction already deassociated from connection:sqlalchemy.exc.SAWarning",
 ]

conftest.py

@@ -25,6 +25,7 @@ from app.models.favorite import Favorite # noqa: F401
 from app.models.plan import Plan
 from app.models.playlist import Playlist  # noqa: F401
 from app.models.playlist_sound import PlaylistSound  # noqa: F401
+from app.models.scheduled_task import ScheduledTask  # noqa: F401
 from app.models.sound import Sound  # noqa: F401
 from app.models.sound_played import SoundPlayed  # noqa: F401
 from app.models.user import User
@@ -346,3 +347,27 @@ async def admin_cookies(admin_user: User) -> dict[str, str]:
     access_token = JWTUtils.create_access_token(token_data)
     return {"access_token": access_token}
+
+
+@pytest.fixture
+def test_user_id(test_user: User):
+    """Get test user ID."""
+    return test_user.id
+
+
+@pytest.fixture
+def test_sound_id():
+    """Create a test sound ID."""
+    return 1
+
+
+@pytest.fixture
+def test_playlist_id():
+    """Create a test playlist ID."""
+    return 1
+
+
+@pytest.fixture
+def db_session(test_session: AsyncSession) -> AsyncSession:
+    """Alias for test_session to match test expectations."""
+    return test_session


@@ -20,7 +20,9 @@ class TestSchedulerService:
@pytest.fixture @pytest.fixture
def scheduler_service(self, mock_db_session_factory): def scheduler_service(self, mock_db_session_factory):
"""Create a scheduler service instance for testing.""" """Create a scheduler service instance for testing."""
return SchedulerService(mock_db_session_factory) from unittest.mock import MagicMock
mock_player_service = MagicMock()
return SchedulerService(mock_db_session_factory, mock_player_service)
@pytest.mark.asyncio @pytest.mark.asyncio
async def test_start_scheduler(self, scheduler_service) -> None: async def test_start_scheduler(self, scheduler_service) -> None:
@@ -31,20 +33,18 @@
         ):
             await scheduler_service.start()
 
-            # Verify job was added
-            mock_add_job.assert_called_once_with(
-                scheduler_service._daily_credit_recharge,
-                "cron",
-                hour=0,
-                minute=0,
-                id="daily_credit_recharge",
-                name="Daily Credit Recharge",
-                replace_existing=True,
-            )
-
-            # Verify scheduler was started
+            # Verify scheduler start was called
             mock_start.assert_called_once()
 
+            # Verify jobs were added (2 calls: initialize_system_tasks and scheduler_maintenance)
+            assert mock_add_job.call_count == 2
+
+            # Check that the jobs are the expected ones
+            calls = mock_add_job.call_args_list
+            job_ids = [call[1]["id"] for call in calls]
+            assert "initialize_system_tasks" in job_ids
+            assert "scheduler_maintenance" in job_ids
+
     @pytest.mark.asyncio
     async def test_stop_scheduler(self, scheduler_service) -> None:
         """Test stopping the scheduler service."""
@@ -52,36 +52,3 @@
             await scheduler_service.stop()
 
             mock_shutdown.assert_called_once()
-
-    @pytest.mark.asyncio
-    async def test_daily_credit_recharge_success(self, scheduler_service) -> None:
-        """Test successful daily credit recharge task."""
-        mock_stats = {
-            "total_users": 10,
-            "recharged_users": 8,
-            "skipped_users": 2,
-            "total_credits_added": 500,
-        }
-
-        with patch.object(
-            scheduler_service.credit_service,
-            "recharge_all_users_credits",
-        ) as mock_recharge:
-            mock_recharge.return_value = mock_stats
-
-            await scheduler_service._daily_credit_recharge()
-
-            mock_recharge.assert_called_once()
-
-    @pytest.mark.asyncio
-    async def test_daily_credit_recharge_failure(self, scheduler_service) -> None:
-        """Test daily credit recharge task with failure."""
-        with patch.object(
-            scheduler_service.credit_service,
-            "recharge_all_users_credits",
-        ) as mock_recharge:
-            mock_recharge.side_effect = Exception("Database error")
-
-            # Should not raise exception, just log it
-            await scheduler_service._daily_credit_recharge()
-
-            mock_recharge.assert_called_once()


@@ -2,7 +2,7 @@
 import asyncio
 from pathlib import Path
-from unittest.mock import AsyncMock, Mock, patch
+from unittest.mock import AsyncMock, MagicMock, Mock, patch
 
 import pytest
@@ -405,8 +405,17 @@
     async def test_record_play_count_success(self, vlc_service_with_db) -> None:
         """Test successful play count recording."""
         # Mock session and repositories
-        mock_session = AsyncMock()
-        vlc_service_with_db.db_session_factory.return_value = mock_session
+        mock_session = MagicMock()
+        # Make async methods async mocks but keep sync methods as regular mocks
+        mock_session.commit = AsyncMock()
+        mock_session.refresh = AsyncMock()
+        mock_session.close = AsyncMock()
+
+        # Mock the context manager behavior
+        mock_context_manager = AsyncMock()
+        mock_context_manager.__aenter__ = AsyncMock(return_value=mock_session)
+        mock_context_manager.__aexit__ = AsyncMock(return_value=None)
+        vlc_service_with_db.db_session_factory.return_value = mock_context_manager
 
         mock_sound_repo = AsyncMock()
         mock_user_repo = AsyncMock()
@@ -449,18 +458,18 @@
         # Verify sound repository calls
         mock_sound_repo.get_by_id.assert_called_once_with(1)
-        mock_sound_repo.update.assert_called_once_with(
-            test_sound,
-            {"play_count": 1},
-        )
 
         # Verify user repository calls
         mock_user_repo.get_by_id.assert_called_once_with(1)
 
-        # Verify session operations
-        mock_session.add.assert_called_once()
-        mock_session.commit.assert_called_once()
-        mock_session.close.assert_called_once()
+        # Verify session operations (called twice: once for sound, once for sound_played)
+        assert mock_session.add.call_count == 2
+        # Commit is called twice: once after updating sound, once after adding sound_played
+        assert mock_session.commit.call_count == 2
+        # Context manager handles session cleanup, so no explicit close() call
+
+        # Verify the sound's play count was incremented
+        assert test_sound.play_count == 1
 
         # Verify socket broadcast
         mock_socket.broadcast_to_all.assert_called_once_with(
@@ -488,8 +497,17 @@
     ) -> None:
         """Test play count recording always creates a new SoundPlayed record."""
         # Mock session and repositories
-        mock_session = AsyncMock()
-        vlc_service_with_db.db_session_factory.return_value = mock_session
+        mock_session = MagicMock()
+        # Make async methods async mocks but keep sync methods as regular mocks
+        mock_session.commit = AsyncMock()
+        mock_session.refresh = AsyncMock()
+        mock_session.close = AsyncMock()
+
+        # Mock the context manager behavior
+        mock_context_manager = AsyncMock()
+        mock_context_manager.__aenter__ = AsyncMock(return_value=mock_session)
+        mock_context_manager.__aexit__ = AsyncMock(return_value=None)
+        vlc_service_with_db.db_session_factory.return_value = mock_context_manager
 
         mock_sound_repo = AsyncMock()
         mock_user_repo = AsyncMock()
@@ -530,17 +548,19 @@
         await vlc_service_with_db._record_play_count(1, "Test Sound")
 
-        # Verify sound play count was updated
-        mock_sound_repo.update.assert_called_once_with(
-            test_sound,
-            {"play_count": 6},
-        )
+        # Verify sound repository calls
+        mock_sound_repo.get_by_id.assert_called_once_with(1)
 
-        # Verify new SoundPlayed record was always added
-        mock_session.add.assert_called_once()
+        # Verify user repository calls
+        mock_user_repo.get_by_id.assert_called_once_with(1)
 
-        # Verify commit happened
-        mock_session.commit.assert_called_once()
+        # Verify session operations (called twice: once for sound, once for sound_played)
+        assert mock_session.add.call_count == 2
+        # Commit is called twice: once after updating sound, once after adding sound_played
+        assert mock_session.commit.call_count == 2
+
+        # Verify the sound's play count was incremented from 5 to 6
+        assert test_sound.play_count == 6
 
     def test_uses_shared_sound_path_utility(self, vlc_service, sample_sound) -> None:
         """Test that VLC service uses the shared sound path utility."""

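The session mocking introduced in these hunks follows a reusable pattern: a `MagicMock` session whose awaited methods are explicitly made `AsyncMock`s, wrapped in an object that implements the async context-manager protocol so `async with factory() as session:` yields the mock. A minimal standalone sketch of that pattern (`factory` and `code_under_test` are illustrative names, not the project's code):

```python
import asyncio
from unittest.mock import AsyncMock, MagicMock

# Sync methods (add) stay regular MagicMock attributes; awaited
# methods (commit) are replaced with AsyncMocks.
mock_session = MagicMock()
mock_session.commit = AsyncMock()

# The factory returns an object usable as an async context manager.
mock_cm = AsyncMock()
mock_cm.__aenter__ = AsyncMock(return_value=mock_session)
mock_cm.__aexit__ = AsyncMock(return_value=None)
factory = MagicMock(return_value=mock_cm)

async def code_under_test() -> None:
    async with factory() as session:
        session.add("row")      # synchronous call on the MagicMock
        await session.commit()  # awaited call on the AsyncMock

asyncio.run(code_under_test())
assert mock_session.add.call_count == 1
assert mock_session.commit.await_count == 1
```

Because the context manager handles cleanup, tests written against this pattern assert on `add.call_count` and `commit.call_count` rather than an explicit `close()` call, matching the assertions above.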

@@ -0,0 +1,218 @@
"""Tests for scheduled task model."""
import uuid
from datetime import UTC, datetime, timedelta
from app.models.scheduled_task import (
RecurrenceType,
ScheduledTask,
TaskStatus,
TaskType,
)
class TestScheduledTaskModel:
"""Test cases for scheduled task model."""
def test_task_creation(self):
"""Test basic task creation."""
task = ScheduledTask(
name="Test Task",
task_type=TaskType.CREDIT_RECHARGE,
scheduled_at=datetime.now(tz=UTC) + timedelta(hours=1),
)
assert task.name == "Test Task"
assert task.task_type == TaskType.CREDIT_RECHARGE
assert task.status == TaskStatus.PENDING
assert task.timezone == "UTC"
assert task.recurrence_type == RecurrenceType.NONE
assert task.parameters == {}
assert task.user_id is None
assert task.executions_count == 0
assert task.is_active is True
def test_task_with_user(self):
"""Test task creation with user association."""
user_id = uuid.uuid4()
task = ScheduledTask(
name="User Task",
task_type=TaskType.PLAY_SOUND,
scheduled_at=datetime.now(tz=UTC) + timedelta(hours=1),
user_id=user_id,
)
assert task.user_id == user_id
assert not task.is_system_task()
def test_system_task(self):
"""Test system task (no user association)."""
task = ScheduledTask(
name="System Task",
task_type=TaskType.CREDIT_RECHARGE,
scheduled_at=datetime.now(tz=UTC) + timedelta(hours=1),
)
assert task.user_id is None
assert task.is_system_task()
def test_recurring_task(self):
"""Test recurring task properties."""
task = ScheduledTask(
name="Recurring Task",
task_type=TaskType.CREDIT_RECHARGE,
scheduled_at=datetime.now(tz=UTC) + timedelta(hours=1),
recurrence_type=RecurrenceType.DAILY,
recurrence_count=5,
)
assert task.is_recurring()
assert task.should_repeat()
def test_non_recurring_task(self):
"""Test non-recurring task properties."""
task = ScheduledTask(
name="One-shot Task",
task_type=TaskType.PLAY_SOUND,
scheduled_at=datetime.now(tz=UTC) + timedelta(hours=1),
recurrence_type=RecurrenceType.NONE,
)
assert not task.is_recurring()
assert not task.should_repeat()
def test_infinite_recurring_task(self):
"""Test infinitely recurring task."""
task = ScheduledTask(
name="Infinite Task",
task_type=TaskType.CREDIT_RECHARGE,
scheduled_at=datetime.now(tz=UTC) + timedelta(hours=1),
recurrence_type=RecurrenceType.DAILY,
recurrence_count=None, # Infinite
)
assert task.is_recurring()
assert task.should_repeat()
# Even after many executions
task.executions_count = 100
assert task.should_repeat()
def test_recurring_task_execution_limit(self):
"""Test recurring task with execution limit."""
task = ScheduledTask(
name="Limited Task",
task_type=TaskType.CREDIT_RECHARGE,
scheduled_at=datetime.now(tz=UTC) + timedelta(hours=1),
recurrence_type=RecurrenceType.DAILY,
recurrence_count=3,
)
assert task.should_repeat()
# After 3 executions, should not repeat
task.executions_count = 3
assert not task.should_repeat()
# After more than limit, still should not repeat
task.executions_count = 5
assert not task.should_repeat()
def test_task_expiration(self):
"""Test task expiration."""
# Non-expired task (using naive UTC datetimes)
task = ScheduledTask(
name="Valid Task",
task_type=TaskType.CREDIT_RECHARGE,
scheduled_at=datetime.now(tz=UTC).replace(tzinfo=None) + timedelta(hours=1),
expires_at=datetime.now(tz=UTC).replace(tzinfo=None) + timedelta(hours=2),
)
assert not task.is_expired()
# Expired task (using naive UTC datetimes)
expired_task = ScheduledTask(
name="Expired Task",
task_type=TaskType.CREDIT_RECHARGE,
scheduled_at=datetime.now(tz=UTC).replace(tzinfo=None) + timedelta(hours=1),
expires_at=datetime.now(tz=UTC).replace(tzinfo=None) - timedelta(hours=1),
)
assert expired_task.is_expired()
# Task with no expiration
no_expiry_task = ScheduledTask(
name="No Expiry Task",
task_type=TaskType.CREDIT_RECHARGE,
scheduled_at=datetime.now(tz=UTC) + timedelta(hours=1),
)
assert not no_expiry_task.is_expired()
def test_task_with_parameters(self):
"""Test task with custom parameters."""
parameters = {
"sound_id": str(uuid.uuid4()),
"volume": 80,
"repeat": True,
}
task = ScheduledTask(
name="Parametrized Task",
task_type=TaskType.PLAY_SOUND,
scheduled_at=datetime.now(tz=UTC) + timedelta(hours=1),
parameters=parameters,
)
assert task.parameters == parameters
assert task.parameters["sound_id"] == parameters["sound_id"]
assert task.parameters["volume"] == 80
assert task.parameters["repeat"] is True
def test_task_with_timezone(self):
"""Test task with custom timezone."""
task = ScheduledTask(
name="NY Task",
task_type=TaskType.CREDIT_RECHARGE,
scheduled_at=datetime.now(tz=UTC) + timedelta(hours=1),
timezone="America/New_York",
)
assert task.timezone == "America/New_York"
def test_task_with_cron_expression(self):
"""Test task with cron expression."""
cron_expr = "0 9 * * 1-5" # 9 AM on weekdays
task = ScheduledTask(
name="Cron Task",
task_type=TaskType.CREDIT_RECHARGE,
scheduled_at=datetime.now(tz=UTC) + timedelta(hours=1),
recurrence_type=RecurrenceType.CRON,
cron_expression=cron_expr,
)
assert task.recurrence_type == RecurrenceType.CRON
assert task.cron_expression == cron_expr
assert task.is_recurring()
def test_task_status_enum_values(self):
"""Test all task status enum values."""
assert TaskStatus.PENDING == "pending"
assert TaskStatus.RUNNING == "running"
assert TaskStatus.COMPLETED == "completed"
assert TaskStatus.FAILED == "failed"
assert TaskStatus.CANCELLED == "cancelled"
def test_task_type_enum_values(self):
"""Test all task type enum values."""
assert TaskType.CREDIT_RECHARGE == "credit_recharge"
assert TaskType.PLAY_SOUND == "play_sound"
assert TaskType.PLAY_PLAYLIST == "play_playlist"
def test_recurrence_type_enum_values(self):
"""Test all recurrence type enum values."""
assert RecurrenceType.NONE == "none"
assert RecurrenceType.HOURLY == "hourly"
assert RecurrenceType.DAILY == "daily"
assert RecurrenceType.WEEKLY == "weekly"
assert RecurrenceType.MONTHLY == "monthly"
assert RecurrenceType.YEARLY == "yearly"
assert RecurrenceType.CRON == "cron"


@@ -0,0 +1,492 @@
"""Tests for scheduled task repository."""
from datetime import UTC, datetime, timedelta
import pytest
from sqlmodel.ext.asyncio.session import AsyncSession
from app.models.scheduled_task import (
RecurrenceType,
ScheduledTask,
TaskStatus,
TaskType,
)
from app.repositories.scheduled_task import ScheduledTaskRepository
class TestScheduledTaskRepository:
"""Test cases for scheduled task repository."""
@pytest.fixture
def repository(self, db_session: AsyncSession) -> ScheduledTaskRepository:
"""Create repository fixture."""
return ScheduledTaskRepository(db_session)
@pytest.fixture
async def sample_task(
self,
repository: ScheduledTaskRepository,
) -> ScheduledTask:
"""Create a sample scheduled task."""
task_data = {
"name": "Test Task",
"task_type": TaskType.CREDIT_RECHARGE,
"scheduled_at": datetime.now(tz=UTC) + timedelta(hours=1),
"parameters": {"test": "value"},
}
return await repository.create(task_data)
@pytest.fixture
async def user_task(
self,
repository: ScheduledTaskRepository,
test_user_id: int,
) -> ScheduledTask:
"""Create a user task."""
task_data = {
"name": "User Task",
"task_type": TaskType.PLAY_SOUND,
"scheduled_at": datetime.now(tz=UTC) + timedelta(hours=2),
"user_id": test_user_id,
"parameters": {"sound_id": "1"},
}
return await repository.create(task_data)
async def test_create_task(self, repository: ScheduledTaskRepository):
"""Test creating a scheduled task."""
task_data = {
"name": "Test Task",
"task_type": TaskType.CREDIT_RECHARGE,
"scheduled_at": datetime.now(tz=UTC) + timedelta(hours=1),
"timezone": "America/New_York",
"recurrence_type": RecurrenceType.DAILY,
"parameters": {"test": "value"},
}
created_task = await repository.create(task_data)
assert created_task.id is not None
assert created_task.name == "Test Task"
assert created_task.task_type == TaskType.CREDIT_RECHARGE
assert created_task.status == TaskStatus.PENDING
assert created_task.timezone == "America/New_York"
assert created_task.recurrence_type == RecurrenceType.DAILY
assert created_task.parameters == {"test": "value"}
assert created_task.is_active is True
assert created_task.executions_count == 0
async def test_get_pending_tasks(
self,
repository: ScheduledTaskRepository,
):
"""Test getting pending tasks."""
# Create tasks with different statuses and times
past_pending = ScheduledTask(
name="Past Pending",
task_type=TaskType.CREDIT_RECHARGE,
scheduled_at=datetime.now(tz=UTC) - timedelta(hours=1),
status=TaskStatus.PENDING,
)
await repository.create(past_pending)
future_pending = ScheduledTask(
name="Future Pending",
task_type=TaskType.CREDIT_RECHARGE,
scheduled_at=datetime.now(tz=UTC) + timedelta(hours=1),
status=TaskStatus.PENDING,
)
await repository.create(future_pending)
completed_task = ScheduledTask(
name="Completed",
task_type=TaskType.CREDIT_RECHARGE,
scheduled_at=datetime.now(tz=UTC) - timedelta(hours=1),
status=TaskStatus.COMPLETED,
)
await repository.create(completed_task)
inactive_task = ScheduledTask(
name="Inactive",
task_type=TaskType.CREDIT_RECHARGE,
scheduled_at=datetime.now(tz=UTC) - timedelta(hours=1),
status=TaskStatus.PENDING,
is_active=False,
)
await repository.create(inactive_task)
pending_tasks = await repository.get_pending_tasks()
task_names = [task.name for task in pending_tasks]
# Only the past pending task should be returned
assert len(pending_tasks) == 1
assert "Past Pending" in task_names
async def test_get_user_tasks(
self,
repository: ScheduledTaskRepository,
user_task: ScheduledTask,
test_user_id: int,
):
"""Test getting tasks for a specific user."""
# Create another user's task
other_user_id = 999
other_task = ScheduledTask(
name="Other User Task",
task_type=TaskType.CREDIT_RECHARGE,
scheduled_at=datetime.now(tz=UTC) + timedelta(hours=1),
user_id=other_user_id,
)
await repository.create(other_task)
# Create system task (no user)
system_task = ScheduledTask(
name="System Task",
task_type=TaskType.CREDIT_RECHARGE,
scheduled_at=datetime.now(tz=UTC) + timedelta(hours=1),
)
await repository.create(system_task)
user_tasks = await repository.get_user_tasks(test_user_id)
assert len(user_tasks) == 1
assert user_tasks[0].name == "User Task"
assert user_tasks[0].user_id == test_user_id
async def test_get_user_tasks_with_filters(
self,
repository: ScheduledTaskRepository,
test_user_id: int,
):
"""Test getting user tasks with status and type filters."""
# Create tasks with different statuses and types
tasks_data = [
("Task 1", TaskStatus.PENDING, TaskType.CREDIT_RECHARGE),
("Task 2", TaskStatus.COMPLETED, TaskType.CREDIT_RECHARGE),
("Task 3", TaskStatus.PENDING, TaskType.PLAY_SOUND),
("Task 4", TaskStatus.FAILED, TaskType.PLAY_PLAYLIST),
]
for name, status, task_type in tasks_data:
task = ScheduledTask(
name=name,
task_type=task_type,
status=status,
scheduled_at=datetime.now(tz=UTC) + timedelta(hours=1),
user_id=test_user_id,
)
await repository.create(task)
# Test status filter
pending_tasks = await repository.get_user_tasks(
test_user_id,
status=TaskStatus.PENDING,
)
assert len(pending_tasks) == 2
assert all(task.status == TaskStatus.PENDING for task in pending_tasks)
# Test type filter
credit_tasks = await repository.get_user_tasks(
test_user_id,
task_type=TaskType.CREDIT_RECHARGE,
)
assert len(credit_tasks) == 2
assert all(task.task_type == TaskType.CREDIT_RECHARGE for task in credit_tasks)
# Test combined filters
pending_credit_tasks = await repository.get_user_tasks(
test_user_id,
status=TaskStatus.PENDING,
task_type=TaskType.CREDIT_RECHARGE,
)
assert len(pending_credit_tasks) == 1
assert pending_credit_tasks[0].name == "Task 1"
async def test_get_system_tasks(
self,
repository: ScheduledTaskRepository,
sample_task: ScheduledTask,
user_task: ScheduledTask,
):
"""Test getting system tasks."""
system_tasks = await repository.get_system_tasks()
assert len(system_tasks) == 1
assert system_tasks[0].name == "Test Task"
assert system_tasks[0].user_id is None
async def test_get_recurring_tasks_due_for_next_execution(
self,
repository: ScheduledTaskRepository,
):
"""Test getting recurring tasks due for next execution."""
# Create completed recurring task that should be re-executed
due_task = ScheduledTask(
name="Due Recurring",
task_type=TaskType.CREDIT_RECHARGE,
scheduled_at=datetime.now(tz=UTC) - timedelta(hours=1),
recurrence_type=RecurrenceType.DAILY,
status=TaskStatus.COMPLETED,
next_execution_at=datetime.now(tz=UTC) - timedelta(minutes=5),
)
await repository.create(due_task)
# Create completed recurring task not due yet
not_due_task = ScheduledTask(
name="Not Due Recurring",
task_type=TaskType.CREDIT_RECHARGE,
scheduled_at=datetime.now(tz=UTC) - timedelta(hours=1),
recurrence_type=RecurrenceType.DAILY,
status=TaskStatus.COMPLETED,
next_execution_at=datetime.now(tz=UTC) + timedelta(hours=1),
)
await repository.create(not_due_task)
# Create non-recurring completed task
non_recurring = ScheduledTask(
name="Non-recurring",
task_type=TaskType.CREDIT_RECHARGE,
scheduled_at=datetime.now(tz=UTC) - timedelta(hours=1),
recurrence_type=RecurrenceType.NONE,
status=TaskStatus.COMPLETED,
)
await repository.create(non_recurring)
due_tasks = await repository.get_recurring_tasks_due_for_next_execution()
assert len(due_tasks) == 1
assert due_tasks[0].name == "Due Recurring"
async def test_get_expired_tasks(
self,
repository: ScheduledTaskRepository,
):
"""Test getting expired tasks."""
# Create expired task
expired_task = ScheduledTask(
name="Expired Task",
task_type=TaskType.CREDIT_RECHARGE,
scheduled_at=datetime.now(tz=UTC) + timedelta(hours=1),
expires_at=datetime.now(tz=UTC) - timedelta(hours=1),
)
await repository.create(expired_task)
# Create non-expired task
valid_task = ScheduledTask(
name="Valid Task",
task_type=TaskType.CREDIT_RECHARGE,
scheduled_at=datetime.now(tz=UTC) + timedelta(hours=1),
expires_at=datetime.now(tz=UTC) + timedelta(hours=2),
)
await repository.create(valid_task)
# Create task with no expiry
no_expiry_task = ScheduledTask(
name="No Expiry",
task_type=TaskType.CREDIT_RECHARGE,
scheduled_at=datetime.now(tz=UTC) + timedelta(hours=1),
)
await repository.create(no_expiry_task)
expired_tasks = await repository.get_expired_tasks()
assert len(expired_tasks) == 1
assert expired_tasks[0].name == "Expired Task"
async def test_cancel_user_tasks(
self,
repository: ScheduledTaskRepository,
test_user_id: int,
):
"""Test cancelling user tasks."""
# Create multiple user tasks
tasks_data = [
("Pending Task 1", TaskStatus.PENDING, TaskType.CREDIT_RECHARGE),
("Running Task", TaskStatus.RUNNING, TaskType.PLAY_SOUND),
("Completed Task", TaskStatus.COMPLETED, TaskType.CREDIT_RECHARGE),
]
for name, status, task_type in tasks_data:
task = ScheduledTask(
name=name,
task_type=task_type,
status=status,
scheduled_at=datetime.now(tz=UTC) + timedelta(hours=1),
user_id=test_user_id,
)
await repository.create(task)
# Cancel all user tasks
cancelled_count = await repository.cancel_user_tasks(test_user_id)
assert cancelled_count == 2 # Only pending and running tasks
# Verify tasks are cancelled
user_tasks = await repository.get_user_tasks(test_user_id)
pending_or_running = [
task for task in user_tasks
if task.status in [TaskStatus.PENDING, TaskStatus.RUNNING]
]
cancelled_tasks = [
task for task in user_tasks
if task.status == TaskStatus.CANCELLED
]
assert len(pending_or_running) == 0
assert len(cancelled_tasks) == 2
async def test_cancel_user_tasks_by_type(
self,
repository: ScheduledTaskRepository,
test_user_id: int,
):
"""Test cancelling user tasks by type."""
# Create tasks of different types
credit_task = ScheduledTask(
name="Credit Task",
task_type=TaskType.CREDIT_RECHARGE,
scheduled_at=datetime.now(tz=UTC) + timedelta(hours=1),
user_id=test_user_id,
)
await repository.create(credit_task)
sound_task = ScheduledTask(
name="Sound Task",
task_type=TaskType.PLAY_SOUND,
scheduled_at=datetime.now(tz=UTC) + timedelta(hours=1),
user_id=test_user_id,
)
await repository.create(sound_task)
# Cancel only credit tasks
cancelled_count = await repository.cancel_user_tasks(
test_user_id,
TaskType.CREDIT_RECHARGE,
)
assert cancelled_count == 1
# Verify only credit task is cancelled
user_tasks = await repository.get_user_tasks(test_user_id)
credit_tasks = [
task for task in user_tasks
if task.task_type == TaskType.CREDIT_RECHARGE
]
sound_tasks = [
task for task in user_tasks
if task.task_type == TaskType.PLAY_SOUND
]
assert len(credit_tasks) == 1
assert credit_tasks[0].status == TaskStatus.CANCELLED
assert len(sound_tasks) == 1
assert sound_tasks[0].status == TaskStatus.PENDING
async def test_mark_as_running(
self,
repository: ScheduledTaskRepository,
sample_task: ScheduledTask,
):
"""Test marking task as running."""
await repository.mark_as_running(sample_task)
updated_task = await repository.get_by_id(sample_task.id)
assert updated_task.status == TaskStatus.RUNNING
async def test_mark_as_completed(
self,
repository: ScheduledTaskRepository,
sample_task: ScheduledTask,
):
"""Test marking task as completed."""
initial_count = sample_task.executions_count
next_execution = datetime.now(tz=UTC) + timedelta(days=1)
await repository.mark_as_completed(sample_task, next_execution)
updated_task = await repository.get_by_id(sample_task.id)
assert updated_task.status == TaskStatus.COMPLETED
assert updated_task.executions_count == initial_count + 1
assert updated_task.last_executed_at is not None
assert updated_task.error_message is None
async def test_mark_as_completed_recurring_task(
self,
repository: ScheduledTaskRepository,
):
"""Test marking recurring task as completed."""
task = ScheduledTask(
name="Recurring Task",
task_type=TaskType.CREDIT_RECHARGE,
scheduled_at=datetime.now(tz=UTC),
recurrence_type=RecurrenceType.DAILY,
)
created_task = await repository.create(task)
next_execution = datetime.now(tz=UTC).replace(tzinfo=None) + timedelta(days=1)
await repository.mark_as_completed(created_task, next_execution)
updated_task = await repository.get_by_id(created_task.id)
# Should be set back to pending for next execution
assert updated_task.status == TaskStatus.PENDING
assert updated_task.next_execution_at == next_execution
assert updated_task.is_active is True
async def test_mark_as_completed_non_recurring_task(
self,
repository: ScheduledTaskRepository,
sample_task: ScheduledTask,
):
"""Test marking non-recurring task as completed."""
await repository.mark_as_completed(sample_task, None)
updated_task = await repository.get_by_id(sample_task.id)
assert updated_task.status == TaskStatus.COMPLETED
assert updated_task.is_active is False
async def test_mark_as_failed(
self,
repository: ScheduledTaskRepository,
sample_task: ScheduledTask,
):
"""Test marking task as failed."""
error_message = "Task execution failed"
await repository.mark_as_failed(sample_task, error_message)
updated_task = await repository.get_by_id(sample_task.id)
assert updated_task.status == TaskStatus.FAILED
assert updated_task.error_message == error_message
assert updated_task.last_executed_at is not None
async def test_mark_as_failed_recurring_task(
self,
repository: ScheduledTaskRepository,
):
"""Test marking recurring task as failed."""
task = ScheduledTask(
name="Recurring Task",
task_type=TaskType.CREDIT_RECHARGE,
scheduled_at=datetime.now(tz=UTC),
recurrence_type=RecurrenceType.DAILY,
)
created_task = await repository.create(task)
await repository.mark_as_failed(created_task, "Failed")
updated_task = await repository.get_by_id(created_task.id)
assert updated_task.status == TaskStatus.FAILED
# Recurring tasks should remain active even after failure
assert updated_task.is_active is True
async def test_mark_as_failed_non_recurring_task(
self,
repository: ScheduledTaskRepository,
sample_task: ScheduledTask,
):
"""Test marking non-recurring task as failed."""
await repository.mark_as_failed(sample_task, "Failed")
updated_task = await repository.get_by_id(sample_task.id)
assert updated_task.status == TaskStatus.FAILED
# Non-recurring tasks should be deactivated on failure
assert updated_task.is_active is False


@@ -0,0 +1,519 @@
"""Tests for scheduler service."""
import uuid
from datetime import UTC, datetime, timedelta
from unittest.mock import AsyncMock, MagicMock, patch
import pytest
from sqlmodel.ext.asyncio.session import AsyncSession
from app.models.scheduled_task import (
RecurrenceType,
ScheduledTask,
TaskStatus,
TaskType,
)
from app.schemas.scheduler import ScheduledTaskCreate
from app.services.scheduler import SchedulerService
class TestSchedulerService:
"""Test cases for scheduler service."""
@pytest.fixture
def mock_player_service(self):
"""Create mock player service."""
return MagicMock()
@pytest.fixture
def scheduler_service(
self,
db_session: AsyncSession,
mock_player_service,
) -> SchedulerService:
"""Create scheduler service fixture."""
def session_factory():
return db_session
return SchedulerService(session_factory, mock_player_service)
@pytest.fixture
def sample_task_data(self) -> dict:
"""Sample task data for testing."""
return {
"name": "Test Task",
"task_type": TaskType.CREDIT_RECHARGE,
"scheduled_at": datetime.now(tz=UTC) + timedelta(hours=1),
"parameters": {"test": "value"},
"timezone": "UTC",
}
def _create_task_schema(self, task_data: dict, **overrides) -> ScheduledTaskCreate:
"""Create ScheduledTaskCreate schema from dict."""
data = {**task_data, **overrides}
return ScheduledTaskCreate(**data)
async def test_create_task(
self,
scheduler_service: SchedulerService,
sample_task_data: dict,
):
"""Test creating a scheduled task."""
with patch.object(scheduler_service, "_schedule_apscheduler_job") as mock_schedule:
schema = self._create_task_schema(sample_task_data)
task = await scheduler_service.create_task(task_data=schema)
assert task.id is not None
assert task.name == sample_task_data["name"]
assert task.task_type == sample_task_data["task_type"]
assert task.status == TaskStatus.PENDING
assert task.parameters == sample_task_data["parameters"]
mock_schedule.assert_called_once_with(task)
async def test_create_user_task(
self,
scheduler_service: SchedulerService,
sample_task_data: dict,
test_user_id: uuid.UUID,
):
"""Test creating a user task."""
with patch.object(scheduler_service, "_schedule_apscheduler_job"):
schema = self._create_task_schema(sample_task_data)
task = await scheduler_service.create_task(
task_data=schema,
user_id=test_user_id,
)
assert task.user_id == test_user_id
assert not task.is_system_task()
async def test_create_system_task(
self,
scheduler_service: SchedulerService,
sample_task_data: dict,
):
"""Test creating a system task."""
with patch.object(scheduler_service, "_schedule_apscheduler_job"):
schema = self._create_task_schema(sample_task_data)
task = await scheduler_service.create_task(task_data=schema)
assert task.user_id is None
assert task.is_system_task()
async def test_create_recurring_task(
self,
scheduler_service: SchedulerService,
sample_task_data: dict,
):
"""Test creating a recurring task."""
with patch.object(scheduler_service, "_schedule_apscheduler_job"):
schema = self._create_task_schema(
sample_task_data,
recurrence_type=RecurrenceType.DAILY,
recurrence_count=5,
)
task = await scheduler_service.create_task(task_data=schema)
assert task.recurrence_type == RecurrenceType.DAILY
assert task.recurrence_count == 5
assert task.is_recurring()
async def test_create_task_with_timezone_conversion(
self,
scheduler_service: SchedulerService,
sample_task_data: dict,
):
"""Test creating task with timezone conversion."""
# Use a specific datetime for testing
ny_time = datetime(2024, 1, 1, 12, 0, 0) # Noon in NY # noqa: DTZ001
with patch.object(scheduler_service, "_schedule_apscheduler_job"):
schema = self._create_task_schema(
sample_task_data,
scheduled_at=ny_time,
timezone="America/New_York",
)
task = await scheduler_service.create_task(task_data=schema)
# The scheduled_at should be converted to UTC
assert task.timezone == "America/New_York"
# In winter, EST is UTC-5, so noon EST becomes 5 PM UTC
# Note: This test might need adjustment based on DST
assert task.scheduled_at.hour in [16, 17] # Account for DST
async def test_cancel_task(
self,
scheduler_service: SchedulerService,
sample_task_data: dict,
):
"""Test cancelling a task."""
# Create a task first
with patch.object(scheduler_service, "_schedule_apscheduler_job"):
schema = self._create_task_schema(sample_task_data)
task = await scheduler_service.create_task(task_data=schema)
# Mock the scheduler remove_job method
with patch.object(scheduler_service.scheduler, "remove_job") as mock_remove:
result = await scheduler_service.cancel_task(task.id)
assert result is True
mock_remove.assert_called_once_with(str(task.id))
# Check task is cancelled in database
from app.repositories.scheduled_task import ScheduledTaskRepository
async with scheduler_service.db_session_factory() as session:
repo = ScheduledTaskRepository(session)
updated_task = await repo.get_by_id(task.id)
assert updated_task.status == TaskStatus.CANCELLED
assert updated_task.is_active is False
async def test_cancel_nonexistent_task(
self,
scheduler_service: SchedulerService,
):
"""Test cancelling a non-existent task."""
result = await scheduler_service.cancel_task(uuid.uuid4())
assert result is False
async def test_get_user_tasks(
self,
scheduler_service: SchedulerService,
sample_task_data: dict,
test_user_id: uuid.UUID,
):
"""Test getting user tasks."""
with patch.object(scheduler_service, "_schedule_apscheduler_job"):
# Create user task
schema = self._create_task_schema(sample_task_data)
await scheduler_service.create_task(
task_data=schema,
user_id=test_user_id,
)
# Create system task
system_schema = self._create_task_schema(sample_task_data)
await scheduler_service.create_task(task_data=system_schema)
user_tasks = await scheduler_service.get_user_tasks(test_user_id)
assert len(user_tasks) == 1
assert user_tasks[0].user_id == test_user_id
async def test_ensure_system_tasks(
self,
scheduler_service: SchedulerService,
):
"""Test ensuring system tasks exist."""
# Mock the repository to return no existing tasks
with patch("app.repositories.scheduled_task.ScheduledTaskRepository.get_system_tasks") as mock_get:
with patch("app.repositories.scheduled_task.ScheduledTaskRepository.create") as mock_create:
mock_get.return_value = []
await scheduler_service._ensure_system_tasks()
# Should create daily credit recharge task
mock_create.assert_called_once()
created_task_data = mock_create.call_args[0][0]
assert created_task_data["name"] == "Daily Credit Recharge"
assert created_task_data["task_type"] == TaskType.CREDIT_RECHARGE
assert created_task_data["recurrence_type"] == RecurrenceType.DAILY
async def test_ensure_system_tasks_already_exist(
self,
scheduler_service: SchedulerService,
):
"""Test ensuring system tasks when they already exist."""
existing_task = ScheduledTask(
name="Existing Daily Credit Recharge",
task_type=TaskType.CREDIT_RECHARGE,
scheduled_at=datetime.now(tz=UTC),
recurrence_type=RecurrenceType.DAILY,
is_active=True,
)
with patch("app.repositories.scheduled_task.ScheduledTaskRepository.get_system_tasks") as mock_get:
with patch("app.repositories.scheduled_task.ScheduledTaskRepository.create") as mock_create:
mock_get.return_value = [existing_task]
await scheduler_service._ensure_system_tasks()
# Should not create new task
mock_create.assert_not_called()
def test_create_trigger_one_shot(
self,
scheduler_service: SchedulerService,
):
"""Test creating one-shot trigger."""
task = ScheduledTask(
name="One Shot",
task_type=TaskType.CREDIT_RECHARGE,
scheduled_at=datetime.now(tz=UTC) + timedelta(hours=1),
recurrence_type=RecurrenceType.NONE,
)
trigger = scheduler_service._create_trigger(task)
assert trigger is not None
assert trigger.__class__.__name__ == "DateTrigger"
def test_create_trigger_daily(
self,
scheduler_service: SchedulerService,
):
"""Test creating daily interval trigger."""
task = ScheduledTask(
name="Daily",
task_type=TaskType.CREDIT_RECHARGE,
scheduled_at=datetime.now(tz=UTC) + timedelta(hours=1),
recurrence_type=RecurrenceType.DAILY,
)
trigger = scheduler_service._create_trigger(task)
assert trigger is not None
assert trigger.__class__.__name__ == "IntervalTrigger"
def test_create_trigger_cron(
self,
scheduler_service: SchedulerService,
):
"""Test creating cron trigger."""
task = ScheduledTask(
name="Cron",
task_type=TaskType.CREDIT_RECHARGE,
scheduled_at=datetime.now(tz=UTC) + timedelta(hours=1),
recurrence_type=RecurrenceType.CRON,
cron_expression="0 9 * * *", # 9 AM daily
)
trigger = scheduler_service._create_trigger(task)
assert trigger is not None
assert trigger.__class__.__name__ == "CronTrigger"
def test_create_trigger_monthly(
self,
scheduler_service: SchedulerService,
):
"""Test creating monthly cron trigger."""
task = ScheduledTask(
name="Monthly",
task_type=TaskType.CREDIT_RECHARGE,
scheduled_at=datetime(2024, 1, 15, 10, 30, 0, tzinfo=UTC), # 15th at 10:30 AM
recurrence_type=RecurrenceType.MONTHLY,
)
trigger = scheduler_service._create_trigger(task)
assert trigger is not None
assert trigger.__class__.__name__ == "CronTrigger"
def test_calculate_next_execution(
self,
scheduler_service: SchedulerService,
):
"""Test calculating next execution time."""
now = datetime.now(tz=UTC)
# Test different recurrence types
test_cases = [
(RecurrenceType.HOURLY, timedelta(hours=1)),
(RecurrenceType.DAILY, timedelta(days=1)),
(RecurrenceType.WEEKLY, timedelta(weeks=1)),
(RecurrenceType.MONTHLY, timedelta(days=30)),
(RecurrenceType.YEARLY, timedelta(days=365)),
]
for recurrence_type, expected_delta in test_cases:
task = ScheduledTask(
name="Test",
task_type=TaskType.CREDIT_RECHARGE,
scheduled_at=now,
recurrence_type=recurrence_type,
)
with patch("app.services.scheduler.datetime") as mock_datetime:
mock_datetime.now.return_value = now
next_execution = scheduler_service._calculate_next_execution(task)
assert next_execution is not None
# Allow some tolerance for execution time
assert abs((next_execution - now) - expected_delta) < timedelta(seconds=1)
def test_calculate_next_execution_none_recurrence(
self,
scheduler_service: SchedulerService,
):
"""Test calculating next execution for non-recurring task."""
task = ScheduledTask(
name="One Shot",
task_type=TaskType.CREDIT_RECHARGE,
scheduled_at=datetime.now(tz=UTC),
recurrence_type=RecurrenceType.NONE,
)
next_execution = scheduler_service._calculate_next_execution(task)
assert next_execution is None
@patch("app.services.scheduler.TaskHandlerRegistry")
async def test_execute_task_success(
self,
mock_handler_class,
scheduler_service: SchedulerService,
sample_task_data: dict,
):
"""Test successful task execution."""
# Create task ready for immediate execution
with patch.object(scheduler_service, "_schedule_apscheduler_job"):
ready_data = {**sample_task_data, "scheduled_at": datetime.now(tz=UTC) - timedelta(minutes=1)}
schema = self._create_task_schema(ready_data)
task = await scheduler_service.create_task(task_data=schema)
# Mock handler registry
mock_handler = AsyncMock()
mock_handler_class.return_value = mock_handler
# Execute task
await scheduler_service._execute_task(task.id)
# Verify handler was called
mock_handler.execute_task.assert_called_once()
# Check task is marked as completed
from app.repositories.scheduled_task import ScheduledTaskRepository
async with scheduler_service.db_session_factory() as session:
repo = ScheduledTaskRepository(session)
updated_task = await repo.get_by_id(task.id)
assert updated_task.status == TaskStatus.COMPLETED
assert updated_task.executions_count == 1
@patch("app.services.scheduler.TaskHandlerRegistry")
async def test_execute_task_failure(
self,
mock_handler_class,
scheduler_service: SchedulerService,
sample_task_data: dict,
):
"""Test task execution failure."""
# Create task ready for immediate execution
with patch.object(scheduler_service, "_schedule_apscheduler_job"):
ready_data = {**sample_task_data, "scheduled_at": datetime.now(tz=UTC) - timedelta(minutes=1)}
schema = self._create_task_schema(ready_data)
task = await scheduler_service.create_task(task_data=schema)
# Mock handler to raise exception
mock_handler = AsyncMock()
mock_handler.execute_task.side_effect = Exception("Task failed")
mock_handler_class.return_value = mock_handler
# Execute task
await scheduler_service._execute_task(task.id)
# Check task is marked as failed
from app.repositories.scheduled_task import ScheduledTaskRepository
async with scheduler_service.db_session_factory() as session:
repo = ScheduledTaskRepository(session)
updated_task = await repo.get_by_id(task.id)
assert updated_task.status == TaskStatus.FAILED
assert "Task failed" in updated_task.error_message
async def test_execute_nonexistent_task(
self,
scheduler_service: SchedulerService,
):
"""Test executing non-existent task."""
# Should handle gracefully
await scheduler_service._execute_task(uuid.uuid4())
async def test_execute_expired_task(
self,
scheduler_service: SchedulerService,
sample_task_data: dict,
):
"""Test executing expired task."""
# Create expired task (stored as naive UTC datetime)
expires_at = datetime.now(tz=UTC).replace(tzinfo=None) - timedelta(hours=1)
with patch.object(scheduler_service, "_schedule_apscheduler_job"):
schema = self._create_task_schema(sample_task_data, expires_at=expires_at)
task = await scheduler_service.create_task(task_data=schema)
# Execute task
await scheduler_service._execute_task(task.id)
# Check task is cancelled
from app.repositories.scheduled_task import ScheduledTaskRepository
async with scheduler_service.db_session_factory() as session:
repo = ScheduledTaskRepository(session)
updated_task = await repo.get_by_id(task.id)
assert updated_task.status == TaskStatus.CANCELLED
assert updated_task.is_active is False
async def test_concurrent_task_execution_prevention(
self,
scheduler_service: SchedulerService,
sample_task_data: dict,
):
"""Test prevention of concurrent task execution."""
with patch.object(scheduler_service, "_schedule_apscheduler_job"):
schema = self._create_task_schema(sample_task_data)
task = await scheduler_service.create_task(task_data=schema)
# Add task to running set
scheduler_service._running_tasks.add(str(task.id))
# Try to execute - should return without doing anything
with patch("app.services.task_handlers.TaskHandlerRegistry") as mock_handler_class:
await scheduler_service._execute_task(task.id)
# Handler should not be called
mock_handler_class.assert_not_called()
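The concurrency test above reduces to a small guard pattern: record in-flight task ids in a set so a second trigger for the same id becomes a no-op. A hypothetical distillation (class and method names are not from the codebase):

```python
class InFlightGuard:
    """Skip execution when the same task id is already running."""

    def __init__(self) -> None:
        self._running: set[str] = set()

    async def execute(self, task_id: str, work):
        if task_id in self._running:
            return None  # concurrent trigger for a running task: silently skip
        self._running.add(task_id)
        try:
            return await work()
        finally:
            self._running.discard(task_id)  # always release, even on failure
```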
@patch("app.services.scheduler.ScheduledTaskRepository")
async def test_maintenance_job_expired_tasks(
self,
mock_repo_class,
scheduler_service: SchedulerService,
):
"""Test maintenance job handling expired tasks."""
# Mock expired task
expired_task = MagicMock()
expired_task.id = uuid.uuid4()
mock_repo = AsyncMock()
mock_repo.get_expired_tasks.return_value = [expired_task]
mock_repo.get_recurring_tasks_due_for_next_execution.return_value = []
mock_repo_class.return_value = mock_repo
with patch.object(scheduler_service.scheduler, "remove_job") as mock_remove:
await scheduler_service._maintenance_job()
# Should mark as cancelled and remove from scheduler
mock_repo.update.assert_called_with(expired_task, {
"status": TaskStatus.CANCELLED,
"is_active": False,
})
mock_remove.assert_called_once_with(str(expired_task.id))
@patch("app.services.scheduler.ScheduledTaskRepository")
async def test_maintenance_job_due_recurring_tasks(
self,
mock_repo_class,
scheduler_service: SchedulerService,
):
"""Test maintenance job handling due recurring tasks."""
# Mock due recurring task
due_task = MagicMock()
due_task.should_repeat.return_value = True
due_task.next_execution_at = datetime.now(tz=UTC) - timedelta(minutes=5)
mock_repo = AsyncMock()
mock_repo.get_expired_tasks.return_value = []
mock_repo.get_recurring_tasks_due_for_next_execution.return_value = [due_task]
mock_repo_class.return_value = mock_repo
with patch.object(scheduler_service, "_schedule_apscheduler_job") as mock_schedule:
await scheduler_service._maintenance_job()
# Should reset to pending and reschedule
mock_repo.update.assert_called_with(due_task, {
"status": TaskStatus.PENDING,
"scheduled_at": due_task.next_execution_at,
})
mock_schedule.assert_called_once_with(due_task)

tests/test_task_handlers.py (new file, 433 lines)
"""Tests for task handlers."""
import uuid
from datetime import UTC, datetime
from unittest.mock import AsyncMock, MagicMock, patch
import pytest
from sqlmodel.ext.asyncio.session import AsyncSession
from app.models.scheduled_task import ScheduledTask, TaskType
from app.services.task_handlers import TaskExecutionError, TaskHandlerRegistry
class TestTaskHandlerRegistry:
"""Test cases for task handler registry."""
@pytest.fixture
def mock_credit_service(self):
"""Create mock credit service."""
return AsyncMock()
@pytest.fixture
def mock_player_service(self):
"""Create mock player service."""
return AsyncMock()
@pytest.fixture
def task_registry(
self,
db_session: AsyncSession,
mock_credit_service,
mock_player_service,
) -> TaskHandlerRegistry:
"""Create task handler registry fixture."""
def mock_db_session_factory():
return db_session
return TaskHandlerRegistry(
db_session,
mock_db_session_factory,
mock_credit_service,
mock_player_service,
)
async def test_execute_task_unknown_type(
self,
task_registry: TaskHandlerRegistry,
):
"""Test executing task with unknown type."""
# Create task with invalid type
task = ScheduledTask(
name="Unknown Task",
task_type="UNKNOWN_TYPE", # Invalid type
scheduled_at=datetime.now(tz=UTC),
)
with pytest.raises(TaskExecutionError, match="No handler registered"):
await task_registry.execute_task(task)
async def test_handle_credit_recharge_all_users(
self,
task_registry: TaskHandlerRegistry,
mock_credit_service,
):
"""Test handling credit recharge for all users."""
task = ScheduledTask(
name="Daily Credit Recharge",
task_type=TaskType.CREDIT_RECHARGE,
scheduled_at=datetime.now(tz=UTC),
parameters={},
)
mock_credit_service.recharge_all_users_credits.return_value = {
"users_recharged": 10,
"total_credits": 1000,
}
await task_registry.execute_task(task)
mock_credit_service.recharge_all_users_credits.assert_called_once()
async def test_handle_credit_recharge_specific_user(
self,
task_registry: TaskHandlerRegistry,
mock_credit_service,
test_user_id: uuid.UUID,
):
"""Test handling credit recharge for specific user."""
task = ScheduledTask(
name="User Credit Recharge",
task_type=TaskType.CREDIT_RECHARGE,
scheduled_at=datetime.now(tz=UTC),
parameters={"user_id": str(test_user_id)},
)
mock_credit_service.recharge_user_credits.return_value = {
"user_id": str(test_user_id),
"credits_added": 100,
}
await task_registry.execute_task(task)
mock_credit_service.recharge_user_credits.assert_called_once_with(test_user_id)
async def test_handle_credit_recharge_uuid_user_id(
self,
task_registry: TaskHandlerRegistry,
mock_credit_service,
test_user_id: uuid.UUID,
):
"""Test handling credit recharge with UUID user_id parameter."""
task = ScheduledTask(
name="User Credit Recharge",
task_type=TaskType.CREDIT_RECHARGE,
scheduled_at=datetime.now(tz=UTC),
parameters={"user_id": test_user_id}, # UUID object instead of string
)
await task_registry.execute_task(task)
mock_credit_service.recharge_user_credits.assert_called_once_with(test_user_id)
async def test_handle_play_sound_success(
self,
task_registry: TaskHandlerRegistry,
test_sound_id: int,
):
"""Test successful play sound task handling."""
task = ScheduledTask(
name="Play Sound",
task_type=TaskType.PLAY_SOUND,
scheduled_at=datetime.now(tz=UTC),
parameters={"sound_id": str(test_sound_id)},
)
# Mock sound repository
mock_sound = MagicMock()
mock_sound.id = test_sound_id
mock_sound.filename = "test_sound.mp3"
with patch.object(task_registry.sound_repository, "get_by_id", return_value=mock_sound):
with patch("app.services.task_handlers.VLCPlayerService") as mock_vlc_class:
mock_vlc_service = AsyncMock()
mock_vlc_service.play_sound.return_value = True
mock_vlc_class.return_value = mock_vlc_service
await task_registry.execute_task(task)
task_registry.sound_repository.get_by_id.assert_called_once_with(test_sound_id)
mock_vlc_service.play_sound.assert_called_once_with(mock_sound)
async def test_handle_play_sound_missing_sound_id(
self,
task_registry: TaskHandlerRegistry,
):
"""Test play sound task with missing sound_id parameter."""
task = ScheduledTask(
name="Play Sound",
task_type=TaskType.PLAY_SOUND,
scheduled_at=datetime.now(tz=UTC),
parameters={}, # Missing sound_id
)
with pytest.raises(TaskExecutionError, match="sound_id parameter is required"):
await task_registry.execute_task(task)
async def test_handle_play_sound_invalid_sound_id(
self,
task_registry: TaskHandlerRegistry,
):
"""Test play sound task with invalid sound_id parameter."""
task = ScheduledTask(
name="Play Sound",
task_type=TaskType.PLAY_SOUND,
scheduled_at=datetime.now(tz=UTC),
parameters={"sound_id": "invalid-uuid"},
)
with pytest.raises(TaskExecutionError, match="Invalid sound_id format"):
await task_registry.execute_task(task)
async def test_handle_play_sound_not_found(
self,
task_registry: TaskHandlerRegistry,
test_sound_id: int,
):
"""Test play sound task with non-existent sound."""
task = ScheduledTask(
name="Play Sound",
task_type=TaskType.PLAY_SOUND,
scheduled_at=datetime.now(tz=UTC),
parameters={"sound_id": str(test_sound_id)},
)
with patch.object(task_registry.sound_repository, "get_by_id", return_value=None):
with pytest.raises(TaskExecutionError, match="Sound not found"):
await task_registry.execute_task(task)
async def test_handle_play_sound_uuid_parameter(
self,
task_registry: TaskHandlerRegistry,
test_sound_id: int,
):
"""Test play sound task with UUID parameter (not string)."""
task = ScheduledTask(
name="Play Sound",
task_type=TaskType.PLAY_SOUND,
scheduled_at=datetime.now(tz=UTC),
parameters={"sound_id": test_sound_id}, # UUID object
)
mock_sound = MagicMock()
mock_sound.filename = "test_sound.mp3"
with patch.object(task_registry.sound_repository, "get_by_id", return_value=mock_sound):
with patch("app.services.task_handlers.VLCPlayerService") as mock_vlc_class:
mock_vlc_service = AsyncMock()
mock_vlc_class.return_value = mock_vlc_service
await task_registry.execute_task(task)
task_registry.sound_repository.get_by_id.assert_called_once_with(test_sound_id)
async def test_handle_play_playlist_success(
self,
task_registry: TaskHandlerRegistry,
mock_player_service,
test_playlist_id: int,
):
"""Test successful play playlist task handling."""
task = ScheduledTask(
name="Play Playlist",
task_type=TaskType.PLAY_PLAYLIST,
scheduled_at=datetime.now(tz=UTC),
parameters={
"playlist_id": str(test_playlist_id),
"play_mode": "loop",
"shuffle": True,
},
)
# Mock playlist repository
mock_playlist = MagicMock()
mock_playlist.id = test_playlist_id
mock_playlist.name = "Test Playlist"
with patch.object(task_registry.playlist_repository, "get_by_id", return_value=mock_playlist):
await task_registry.execute_task(task)
task_registry.playlist_repository.get_by_id.assert_called_once_with(test_playlist_id)
mock_player_service.load_playlist.assert_called_once_with(test_playlist_id)
mock_player_service.set_mode.assert_called_once_with("loop")
mock_player_service.set_shuffle.assert_called_once_with(shuffle=True)
mock_player_service.play.assert_called_once()
async def test_handle_play_playlist_minimal_parameters(
self,
task_registry: TaskHandlerRegistry,
mock_player_service,
test_playlist_id: int,
):
"""Test play playlist task with minimal parameters."""
task = ScheduledTask(
name="Play Playlist",
task_type=TaskType.PLAY_PLAYLIST,
scheduled_at=datetime.now(tz=UTC),
parameters={"playlist_id": str(test_playlist_id)},
)
mock_playlist = MagicMock()
mock_playlist.name = "Test Playlist"
with patch.object(task_registry.playlist_repository, "get_by_id", return_value=mock_playlist):
await task_registry.execute_task(task)
# Should use default values
mock_player_service.set_mode.assert_called_once_with("continuous")
mock_player_service.set_shuffle.assert_not_called()
async def test_handle_play_playlist_missing_playlist_id(
self,
task_registry: TaskHandlerRegistry,
):
"""Test play playlist task with missing playlist_id parameter."""
task = ScheduledTask(
name="Play Playlist",
task_type=TaskType.PLAY_PLAYLIST,
scheduled_at=datetime.now(tz=UTC),
parameters={}, # Missing playlist_id
)
with pytest.raises(TaskExecutionError, match="playlist_id parameter is required"):
await task_registry.execute_task(task)
async def test_handle_play_playlist_invalid_playlist_id(
self,
task_registry: TaskHandlerRegistry,
):
"""Test play playlist task with invalid playlist_id parameter."""
task = ScheduledTask(
name="Play Playlist",
task_type=TaskType.PLAY_PLAYLIST,
scheduled_at=datetime.now(tz=UTC),
parameters={"playlist_id": "invalid-uuid"},
)
with pytest.raises(TaskExecutionError, match="Invalid playlist_id format"):
await task_registry.execute_task(task)
async def test_handle_play_playlist_not_found(
self,
task_registry: TaskHandlerRegistry,
test_playlist_id: int,
):
"""Test play playlist task with non-existent playlist."""
task = ScheduledTask(
name="Play Playlist",
task_type=TaskType.PLAY_PLAYLIST,
scheduled_at=datetime.now(tz=UTC),
parameters={"playlist_id": str(test_playlist_id)},
)
with patch.object(task_registry.playlist_repository, "get_by_id", return_value=None):
with pytest.raises(TaskExecutionError, match="Playlist not found"):
await task_registry.execute_task(task)
async def test_handle_play_playlist_valid_play_modes(
self,
task_registry: TaskHandlerRegistry,
mock_player_service,
test_playlist_id: int,
):
"""Test play playlist task with various valid play modes."""
mock_playlist = MagicMock()
mock_playlist.name = "Test Playlist"
valid_modes = ["continuous", "loop", "loop_one", "random", "single"]
for mode in valid_modes:
task = ScheduledTask(
name="Play Playlist",
task_type=TaskType.PLAY_PLAYLIST,
scheduled_at=datetime.now(tz=UTC),
parameters={
"playlist_id": str(test_playlist_id),
"play_mode": mode,
},
)
with patch.object(task_registry.playlist_repository, "get_by_id", return_value=mock_playlist):
await task_registry.execute_task(task)
mock_player_service.set_mode.assert_called_with(mode)
# Reset mock for next iteration
mock_player_service.reset_mock()
async def test_handle_play_playlist_invalid_play_mode(
self,
task_registry: TaskHandlerRegistry,
mock_player_service,
test_playlist_id: int,
):
"""Test play playlist task with invalid play mode."""
task = ScheduledTask(
name="Play Playlist",
task_type=TaskType.PLAY_PLAYLIST,
scheduled_at=datetime.now(tz=UTC),
parameters={
"playlist_id": str(test_playlist_id),
"play_mode": "invalid_mode",
},
)
mock_playlist = MagicMock()
mock_playlist.name = "Test Playlist"
with patch.object(task_registry.playlist_repository, "get_by_id", return_value=mock_playlist):
await task_registry.execute_task(task)
# Should not call set_mode for invalid mode
mock_player_service.set_mode.assert_not_called()
# But should still load playlist and play
mock_player_service.load_playlist.assert_called_once()
mock_player_service.play.assert_called_once()
async def test_task_execution_exception_handling(
self,
task_registry: TaskHandlerRegistry,
mock_credit_service,
):
"""Test exception handling during task execution."""
task = ScheduledTask(
name="Failing Task",
task_type=TaskType.CREDIT_RECHARGE,
scheduled_at=datetime.now(tz=UTC),
parameters={},
)
# Make credit service raise an exception
mock_credit_service.recharge_all_users_credits.side_effect = Exception("Service error")
with pytest.raises(TaskExecutionError, match="Task execution failed: Service error"):
await task_registry.execute_task(task)
async def test_task_registry_initialization(
self,
db_session: AsyncSession,
mock_credit_service,
mock_player_service,
):
"""Test task registry initialization."""
def mock_db_session_factory():
return db_session
registry = TaskHandlerRegistry(
db_session,
mock_db_session_factory,
mock_credit_service,
mock_player_service,
)
assert registry.db_session == db_session
assert registry.db_session_factory == mock_db_session_factory
assert registry.credit_service == mock_credit_service
assert registry.player_service == mock_player_service
assert registry.sound_repository is not None
assert registry.playlist_repository is not None
# Check all handlers are registered
expected_handlers = {
TaskType.CREDIT_RECHARGE,
TaskType.PLAY_SOUND,
TaskType.PLAY_PLAYLIST,
}
assert set(registry._handlers.keys()) == expected_handlers
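Taken together, these registry tests assume a simple dispatch contract: unknown task types fail fast with "No handler registered", and any handler exception is wrapped in `TaskExecutionError`. A minimal sketch of that contract (a distillation for illustration, not the actual `TaskHandlerRegistry` implementation):

```python
class TaskExecutionError(Exception):
    """Raised when a task cannot be dispatched or its handler fails."""

class TaskHandlerRegistry:
    """Map task types to async handlers and wrap handler failures uniformly."""

    def __init__(self) -> None:
        self._handlers = {}

    def register(self, task_type: str, handler) -> None:
        self._handlers[task_type] = handler

    async def execute_task(self, task: dict):
        handler = self._handlers.get(task["task_type"])
        if handler is None:
            raise TaskExecutionError(f"No handler registered for {task['task_type']}")
        try:
            return await handler(task)
        except TaskExecutionError:
            raise
        except Exception as exc:  # wrap all other handler errors
            raise TaskExecutionError(f"Task execution failed: {exc}") from exc
```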

uv.lock (generated, 195 lines changed): bumps coverage from 7.10.4 to 7.10.5, updating the sdist and wheel URLs and hashes accordingly.
{ url = "https://files.pythonhosted.org/packages/ce/42/722e0cdbf6c19e7235c2020837d4e00f3b07820fd012201a983238cc3a30/coverage-7.10.4-cp313-cp313-win32.whl", hash = "sha256:8630f8af2ca84b5c367c3df907b1706621abe06d6929f5045fd628968d421e6e", size = 219173 }, { url = "https://files.pythonhosted.org/packages/84/56/fb3aba936addb4c9e5ea14f5979393f1c2466b4c89d10591fd05f2d6b2aa/coverage-7.10.5-cp313-cp313-win32.whl", hash = "sha256:0d511dda38595b2b6934c2b730a1fd57a3635c6aa2a04cb74714cdfdd53846f4", size = 219536 },
{ url = "https://files.pythonhosted.org/packages/97/7e/aa70366f8275955cd51fa1ed52a521c7fcebcc0fc279f53c8c1ee6006dfe/coverage-7.10.4-cp313-cp313-win_amd64.whl", hash = "sha256:f68835d31c421736be367d32f179e14ca932978293fe1b4c7a6a49b555dff5b2", size = 219969 }, { url = "https://files.pythonhosted.org/packages/fc/54/baacb8f2f74431e3b175a9a2881feaa8feb6e2f187a0e7e3046f3c7742b2/coverage-7.10.5-cp313-cp313-win_amd64.whl", hash = "sha256:9a86281794a393513cf117177fd39c796b3f8e3759bb2764259a2abba5cce54b", size = 220330 },
{ url = "https://files.pythonhosted.org/packages/ac/96/c39d92d5aad8fec28d4606556bfc92b6fee0ab51e4a548d9b49fb15a777c/coverage-7.10.4-cp313-cp313-win_arm64.whl", hash = "sha256:6eaa61ff6724ca7ebc5326d1fae062d85e19b38dd922d50903702e6078370ae7", size = 218601 }, { url = "https://files.pythonhosted.org/packages/64/8a/82a3788f8e31dee51d350835b23d480548ea8621f3effd7c3ba3f7e5c006/coverage-7.10.5-cp313-cp313-win_arm64.whl", hash = "sha256:cebd8e906eb98bb09c10d1feed16096700b1198d482267f8bf0474e63a7b8d84", size = 218961 },
{ url = "https://files.pythonhosted.org/packages/79/13/34d549a6177bd80fa5db758cb6fd3057b7ad9296d8707d4ab7f480b0135f/coverage-7.10.4-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:702978108876bfb3d997604930b05fe769462cc3000150b0e607b7b444f2fd84", size = 217445 }, { url = "https://files.pythonhosted.org/packages/d8/a1/590154e6eae07beee3b111cc1f907c30da6fc8ce0a83ef756c72f3c7c748/coverage-7.10.5-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:0520dff502da5e09d0d20781df74d8189ab334a1e40d5bafe2efaa4158e2d9e7", size = 217819 },
{ url = "https://files.pythonhosted.org/packages/6a/c0/433da866359bf39bf595f46d134ff2d6b4293aeea7f3328b6898733b0633/coverage-7.10.4-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:e8f978e8c5521d9c8f2086ac60d931d583fab0a16f382f6eb89453fe998e2484", size = 217676 }, { url = "https://files.pythonhosted.org/packages/0d/ff/436ffa3cfc7741f0973c5c89405307fe39b78dcf201565b934e6616fc4ad/coverage-7.10.5-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:d9cd64aca68f503ed3f1f18c7c9174cbb797baba02ca8ab5112f9d1c0328cd4b", size = 218040 },
{ url = "https://files.pythonhosted.org/packages/7e/d7/2b99aa8737f7801fd95222c79a4ebc8c5dd4460d4bed7ef26b17a60c8d74/coverage-7.10.4-cp313-cp313t-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:df0ac2ccfd19351411c45e43ab60932b74472e4648b0a9edf6a3b58846e246a9", size = 259002 }, { url = "https://files.pythonhosted.org/packages/a0/ca/5787fb3d7820e66273913affe8209c534ca11241eb34ee8c4fd2aaa9dd87/coverage-7.10.5-cp313-cp313t-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:0913dd1613a33b13c4f84aa6e3f4198c1a21ee28ccb4f674985c1f22109f0aae", size = 259374 },
{ url = "https://files.pythonhosted.org/packages/08/cf/86432b69d57debaef5abf19aae661ba8f4fcd2882fa762e14added4bd334/coverage-7.10.4-cp313-cp313t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:73a0d1aaaa3796179f336448e1576a3de6fc95ff4f07c2d7251d4caf5d18cf8d", size = 261178 }, { url = "https://files.pythonhosted.org/packages/b5/89/21af956843896adc2e64fc075eae3c1cadb97ee0a6960733e65e696f32dd/coverage-7.10.5-cp313-cp313t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:1b7181c0feeb06ed8a02da02792f42f829a7b29990fef52eff257fef0885d760", size = 261551 },
{ url = "https://files.pythonhosted.org/packages/23/78/85176593f4aa6e869cbed7a8098da3448a50e3fac5cb2ecba57729a5220d/coverage-7.10.4-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:873da6d0ed6b3ffc0bc01f2c7e3ad7e2023751c0d8d86c26fe7322c314b031dc", size = 263402 }, { url = "https://files.pythonhosted.org/packages/e1/96/390a69244ab837e0ac137989277879a084c786cf036c3c4a3b9637d43a89/coverage-7.10.5-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:36d42b7396b605f774d4372dd9c49bed71cbabce4ae1ccd074d155709dd8f235", size = 263776 },
{ url = "https://files.pythonhosted.org/packages/88/1d/57a27b6789b79abcac0cc5805b31320d7a97fa20f728a6a7c562db9a3733/coverage-7.10.4-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:c6446c75b0e7dda5daa876a1c87b480b2b52affb972fedd6c22edf1aaf2e00ec", size = 260957 }, { url = "https://files.pythonhosted.org/packages/00/32/cfd6ae1da0a521723349f3129b2455832fc27d3f8882c07e5b6fefdd0da2/coverage-7.10.5-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:b4fdc777e05c4940b297bf47bf7eedd56a39a61dc23ba798e4b830d585486ca5", size = 261326 },
{ url = "https://files.pythonhosted.org/packages/fa/e5/3e5ddfd42835c6def6cd5b2bdb3348da2e34c08d9c1211e91a49e9fd709d/coverage-7.10.4-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:6e73933e296634e520390c44758d553d3b573b321608118363e52113790633b9", size = 258718 }, { url = "https://files.pythonhosted.org/packages/4c/c4/bf8d459fb4ce2201e9243ce6c015936ad283a668774430a3755f467b39d1/coverage-7.10.5-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:42144e8e346de44a6f1dbd0a56575dd8ab8dfa7e9007da02ea5b1c30ab33a7db", size = 259090 },
{ url = "https://files.pythonhosted.org/packages/1a/0b/d364f0f7ef111615dc4e05a6ed02cac7b6f2ac169884aa57faeae9eb5fa0/coverage-7.10.4-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:52073d4b08d2cb571234c8a71eb32af3c6923149cf644a51d5957ac128cf6aa4", size = 259848 }, { url = "https://files.pythonhosted.org/packages/f4/5d/a234f7409896468e5539d42234016045e4015e857488b0b5b5f3f3fa5f2b/coverage-7.10.5-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:66c644cbd7aed8fe266d5917e2c9f65458a51cfe5eeff9c05f15b335f697066e", size = 260217 },
{ url = "https://files.pythonhosted.org/packages/10/c6/bbea60a3b309621162e53faf7fac740daaf083048ea22077418e1ecaba3f/coverage-7.10.4-cp313-cp313t-win32.whl", hash = "sha256:e24afb178f21f9ceb1aefbc73eb524769aa9b504a42b26857243f881af56880c", size = 219833 }, { url = "https://files.pythonhosted.org/packages/f3/ad/87560f036099f46c2ddd235be6476dd5c1d6be6bb57569a9348d43eeecea/coverage-7.10.5-cp313-cp313t-win32.whl", hash = "sha256:2d1b73023854068c44b0c554578a4e1ef1b050ed07cf8b431549e624a29a66ee", size = 220194 },
{ url = "https://files.pythonhosted.org/packages/44/a5/f9f080d49cfb117ddffe672f21eab41bd23a46179a907820743afac7c021/coverage-7.10.4-cp313-cp313t-win_amd64.whl", hash = "sha256:be04507ff1ad206f4be3d156a674e3fb84bbb751ea1b23b142979ac9eebaa15f", size = 220897 }, { url = "https://files.pythonhosted.org/packages/36/a8/04a482594fdd83dc677d4a6c7e2d62135fff5a1573059806b8383fad9071/coverage-7.10.5-cp313-cp313t-win_amd64.whl", hash = "sha256:54a1532c8a642d8cc0bd5a9a51f5a9dcc440294fd06e9dda55e743c5ec1a8f14", size = 221258 },
{ url = "https://files.pythonhosted.org/packages/46/89/49a3fc784fa73d707f603e586d84a18c2e7796707044e9d73d13260930b7/coverage-7.10.4-cp313-cp313t-win_arm64.whl", hash = "sha256:f3e3ff3f69d02b5dad67a6eac68cc9c71ae343b6328aae96e914f9f2f23a22e2", size = 219160 }, { url = "https://files.pythonhosted.org/packages/eb/ad/7da28594ab66fe2bc720f1bc9b131e62e9b4c6e39f044d9a48d18429cc21/coverage-7.10.5-cp313-cp313t-win_arm64.whl", hash = "sha256:74d5b63fe3f5f5d372253a4ef92492c11a4305f3550631beaa432fc9df16fcff", size = 219521 },
{ url = "https://files.pythonhosted.org/packages/b5/22/525f84b4cbcff66024d29f6909d7ecde97223f998116d3677cfba0d115b5/coverage-7.10.4-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:a59fe0af7dd7211ba595cf7e2867458381f7e5d7b4cffe46274e0b2f5b9f4eb4", size = 216717 }, { url = "https://files.pythonhosted.org/packages/d3/7f/c8b6e4e664b8a95254c35a6c8dd0bf4db201ec681c169aae2f1256e05c85/coverage-7.10.5-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:68c5e0bc5f44f68053369fa0d94459c84548a77660a5f2561c5e5f1e3bed7031", size = 217090 },
{ url = "https://files.pythonhosted.org/packages/a6/58/213577f77efe44333a416d4bcb251471e7f64b19b5886bb515561b5ce389/coverage-7.10.4-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:3a6c35c5b70f569ee38dc3350cd14fdd0347a8b389a18bb37538cc43e6f730e6", size = 216994 }, { url = "https://files.pythonhosted.org/packages/44/74/3ee14ede30a6e10a94a104d1d0522d5fb909a7c7cac2643d2a79891ff3b9/coverage-7.10.5-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:cf33134ffae93865e32e1e37df043bef15a5e857d8caebc0099d225c579b0fa3", size = 217365 },
{ url = "https://files.pythonhosted.org/packages/17/85/34ac02d0985a09472f41b609a1d7babc32df87c726c7612dc93d30679b5a/coverage-7.10.4-cp314-cp314-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:acb7baf49f513554c4af6ef8e2bd6e8ac74e6ea0c7386df8b3eb586d82ccccc4", size = 248038 }, { url = "https://files.pythonhosted.org/packages/41/5f/06ac21bf87dfb7620d1f870dfa3c2cae1186ccbcdc50b8b36e27a0d52f50/coverage-7.10.5-cp314-cp314-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:ad8fa9d5193bafcf668231294241302b5e683a0518bf1e33a9a0dfb142ec3031", size = 248413 },
{ url = "https://files.pythonhosted.org/packages/47/4f/2140305ec93642fdaf988f139813629cbb6d8efa661b30a04b6f7c67c31e/coverage-7.10.4-cp314-cp314-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:a89afecec1ed12ac13ed203238b560cbfad3522bae37d91c102e690b8b1dc46c", size = 250575 }, { url = "https://files.pythonhosted.org/packages/21/bc/cc5bed6e985d3a14228539631573f3863be6a2587381e8bc5fdf786377a1/coverage-7.10.5-cp314-cp314-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:146fa1531973d38ab4b689bc764592fe6c2f913e7e80a39e7eeafd11f0ef6db2", size = 250943 },
{ url = "https://files.pythonhosted.org/packages/f2/b5/41b5784180b82a083c76aeba8f2c72ea1cb789e5382157b7dc852832aea2/coverage-7.10.4-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:480442727f464407d8ade6e677b7f21f3b96a9838ab541b9a28ce9e44123c14e", size = 251927 }, { url = "https://files.pythonhosted.org/packages/8d/43/6a9fc323c2c75cd80b18d58db4a25dc8487f86dd9070f9592e43e3967363/coverage-7.10.5-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6013a37b8a4854c478d3219ee8bc2392dea51602dd0803a12d6f6182a0061762", size = 252301 },
{ url = "https://files.pythonhosted.org/packages/78/ca/c1dd063e50b71f5aea2ebb27a1c404e7b5ecf5714c8b5301f20e4e8831ac/coverage-7.10.4-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:a89bf193707f4a17f1ed461504031074d87f035153239f16ce86dfb8f8c7ac76", size = 249930 }, { url = "https://files.pythonhosted.org/packages/69/7c/3e791b8845f4cd515275743e3775adb86273576596dc9f02dca37357b4f2/coverage-7.10.5-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:eb90fe20db9c3d930fa2ad7a308207ab5b86bf6a76f54ab6a40be4012d88fcae", size = 250302 },
{ url = "https://files.pythonhosted.org/packages/8d/66/d8907408612ffee100d731798e6090aedb3ba766ecf929df296c1a7ee4fb/coverage-7.10.4-cp314-cp314-musllinux_1_2_i686.whl", hash = "sha256:3ddd912c2fc440f0fb3229e764feec85669d5d80a988ff1b336a27d73f63c818", size = 247862 }, { url = "https://files.pythonhosted.org/packages/5c/bc/5099c1e1cb0c9ac6491b281babea6ebbf999d949bf4aa8cdf4f2b53505e8/coverage-7.10.5-cp314-cp314-musllinux_1_2_i686.whl", hash = "sha256:384b34482272e960c438703cafe63316dfbea124ac62006a455c8410bf2a2262", size = 248237 },
{ url = "https://files.pythonhosted.org/packages/29/db/53cd8ec8b1c9c52d8e22a25434785bfc2d1e70c0cfb4d278a1326c87f741/coverage-7.10.4-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:8a538944ee3a42265e61c7298aeba9ea43f31c01271cf028f437a7b4075592cf", size = 249360 }, { url = "https://files.pythonhosted.org/packages/7e/51/d346eb750a0b2f1e77f391498b753ea906fde69cc11e4b38dca28c10c88c/coverage-7.10.5-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:467dc74bd0a1a7de2bedf8deaf6811f43602cb532bd34d81ffd6038d6d8abe99", size = 249726 },
{ url = "https://files.pythonhosted.org/packages/4f/75/5ec0a28ae4a0804124ea5a5becd2b0fa3adf30967ac656711fb5cdf67c60/coverage-7.10.4-cp314-cp314-win32.whl", hash = "sha256:fd2e6002be1c62476eb862b8514b1ba7e7684c50165f2a8d389e77da6c9a2ebd", size = 219449 }, { url = "https://files.pythonhosted.org/packages/a3/85/eebcaa0edafe427e93286b94f56ea7e1280f2c49da0a776a6f37e04481f9/coverage-7.10.5-cp314-cp314-win32.whl", hash = "sha256:556d23d4e6393ca898b2e63a5bca91e9ac2d5fb13299ec286cd69a09a7187fde", size = 219825 },
{ url = "https://files.pythonhosted.org/packages/9d/ab/66e2ee085ec60672bf5250f11101ad8143b81f24989e8c0e575d16bb1e53/coverage-7.10.4-cp314-cp314-win_amd64.whl", hash = "sha256:ec113277f2b5cf188d95fb66a65c7431f2b9192ee7e6ec9b72b30bbfb53c244a", size = 220246 }, { url = "https://files.pythonhosted.org/packages/3c/f7/6d43e037820742603f1e855feb23463979bf40bd27d0cde1f761dcc66a3e/coverage-7.10.5-cp314-cp314-win_amd64.whl", hash = "sha256:f4446a9547681533c8fa3e3c6cf62121eeee616e6a92bd9201c6edd91beffe13", size = 220618 },
{ url = "https://files.pythonhosted.org/packages/37/3b/00b448d385f149143190846217797d730b973c3c0ec2045a7e0f5db3a7d0/coverage-7.10.4-cp314-cp314-win_arm64.whl", hash = "sha256:9744954bfd387796c6a091b50d55ca7cac3d08767795b5eec69ad0f7dbf12d38", size = 218825 }, { url = "https://files.pythonhosted.org/packages/4a/b0/ed9432e41424c51509d1da603b0393404b828906236fb87e2c8482a93468/coverage-7.10.5-cp314-cp314-win_arm64.whl", hash = "sha256:5e78bd9cf65da4c303bf663de0d73bf69f81e878bf72a94e9af67137c69b9fe9", size = 219199 },
{ url = "https://files.pythonhosted.org/packages/ee/2e/55e20d3d1ce00b513efb6fd35f13899e1c6d4f76c6cbcc9851c7227cd469/coverage-7.10.4-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:5af4829904dda6aabb54a23879f0f4412094ba9ef153aaa464e3c1b1c9bc98e6", size = 217462 }, { url = "https://files.pythonhosted.org/packages/2f/54/5a7ecfa77910f22b659c820f67c16fc1e149ed132ad7117f0364679a8fa9/coverage-7.10.5-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:5661bf987d91ec756a47c7e5df4fbcb949f39e32f9334ccd3f43233bbb65e508", size = 217833 },
{ url = "https://files.pythonhosted.org/packages/47/b3/aab1260df5876f5921e2c57519e73a6f6eeacc0ae451e109d44ee747563e/coverage-7.10.4-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:7bba5ed85e034831fac761ae506c0644d24fd5594727e174b5a73aff343a7508", size = 217675 }, { url = "https://files.pythonhosted.org/packages/4e/0e/25672d917cc57857d40edf38f0b867fb9627115294e4f92c8fcbbc18598d/coverage-7.10.5-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:a46473129244db42a720439a26984f8c6f834762fc4573616c1f37f13994b357", size = 218048 },
{ url = "https://files.pythonhosted.org/packages/67/23/1cfe2aa50c7026180989f0bfc242168ac7c8399ccc66eb816b171e0ab05e/coverage-7.10.4-cp314-cp314t-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:d57d555b0719834b55ad35045de6cc80fc2b28e05adb6b03c98479f9553b387f", size = 259176 }, { url = "https://files.pythonhosted.org/packages/cb/7c/0b2b4f1c6f71885d4d4b2b8608dcfc79057adb7da4143eb17d6260389e42/coverage-7.10.5-cp314-cp314t-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:1f64b8d3415d60f24b058b58d859e9512624bdfa57a2d1f8aff93c1ec45c429b", size = 259549 },
{ url = "https://files.pythonhosted.org/packages/9d/72/5882b6aeed3f9de7fc4049874fd7d24213bf1d06882f5c754c8a682606ec/coverage-7.10.4-cp314-cp314t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:ba62c51a72048bb1ea72db265e6bd8beaabf9809cd2125bbb5306c6ce105f214", size = 261341 }, { url = "https://files.pythonhosted.org/packages/94/73/abb8dab1609abec7308d83c6aec547944070526578ee6c833d2da9a0ad42/coverage-7.10.5-cp314-cp314t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:44d43de99a9d90b20e0163f9770542357f58860a26e24dc1d924643bd6aa7cb4", size = 261715 },
{ url = "https://files.pythonhosted.org/packages/1b/70/a0c76e3087596ae155f8e71a49c2c534c58b92aeacaf4d9d0cbbf2dde53b/coverage-7.10.4-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:0acf0c62a6095f07e9db4ec365cc58c0ef5babb757e54745a1aa2ea2a2564af1", size = 263600 }, { url = "https://files.pythonhosted.org/packages/0b/d1/abf31de21ec92731445606b8d5e6fa5144653c2788758fcf1f47adb7159a/coverage-7.10.5-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:a931a87e5ddb6b6404e65443b742cb1c14959622777f2a4efd81fba84f5d91ba", size = 263969 },
{ url = "https://files.pythonhosted.org/packages/cb/5f/27e4cd4505b9a3c05257fb7fc509acbc778c830c450cb4ace00bf2b7bda7/coverage-7.10.4-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:e1033bf0f763f5cf49ffe6594314b11027dcc1073ac590b415ea93463466deec", size = 261036 }, { url = "https://files.pythonhosted.org/packages/9c/b3/ef274927f4ebede96056173b620db649cc9cb746c61ffc467946b9d0bc67/coverage-7.10.5-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:f9559b906a100029274448f4c8b8b0a127daa4dade5661dfd821b8c188058842", size = 261408 },
{ url = "https://files.pythonhosted.org/packages/02/d6/cf2ae3a7f90ab226ea765a104c4e76c5126f73c93a92eaea41e1dc6a1892/coverage-7.10.4-cp314-cp314t-musllinux_1_2_i686.whl", hash = "sha256:92c29eff894832b6a40da1789b1f252305af921750b03ee4535919db9179453d", size = 258794 }, { url = "https://files.pythonhosted.org/packages/20/fc/83ca2812be616d69b4cdd4e0c62a7bc526d56875e68fd0f79d47c7923584/coverage-7.10.5-cp314-cp314t-musllinux_1_2_i686.whl", hash = "sha256:b08801e25e3b4526ef9ced1aa29344131a8f5213c60c03c18fe4c6170ffa2874", size = 259168 },
{ url = "https://files.pythonhosted.org/packages/9e/b1/39f222eab0d78aa2001cdb7852aa1140bba632db23a5cfd832218b496d6c/coverage-7.10.4-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:822c4c830989c2093527e92acd97be4638a44eb042b1bdc0e7a278d84a070bd3", size = 259946 }, { url = "https://files.pythonhosted.org/packages/fc/4f/e0779e5716f72d5c9962e709d09815d02b3b54724e38567308304c3fc9df/coverage-7.10.5-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:ed9749bb8eda35f8b636fb7632f1c62f735a236a5d4edadd8bbcc5ea0542e732", size = 260317 },
{ url = "https://files.pythonhosted.org/packages/74/b2/49d82acefe2fe7c777436a3097f928c7242a842538b190f66aac01f29321/coverage-7.10.4-cp314-cp314t-win32.whl", hash = "sha256:e694d855dac2e7cf194ba33653e4ba7aad7267a802a7b3fc4347d0517d5d65cd", size = 220226 }, { url = "https://files.pythonhosted.org/packages/2b/fe/4247e732f2234bb5eb9984a0888a70980d681f03cbf433ba7b48f08ca5d5/coverage-7.10.5-cp314-cp314t-win32.whl", hash = "sha256:609b60d123fc2cc63ccee6d17e4676699075db72d14ac3c107cc4976d516f2df", size = 220600 },
{ url = "https://files.pythonhosted.org/packages/06/b0/afb942b6b2fc30bdbc7b05b087beae11c2b0daaa08e160586cf012b6ad70/coverage-7.10.4-cp314-cp314t-win_amd64.whl", hash = "sha256:efcc54b38ef7d5bfa98050f220b415bc5bb3d432bd6350a861cf6da0ede2cdcd", size = 221346 }, { url = "https://files.pythonhosted.org/packages/a7/a0/f294cff6d1034b87839987e5b6ac7385bec599c44d08e0857ac7f164ad0c/coverage-7.10.5-cp314-cp314t-win_amd64.whl", hash = "sha256:0666cf3d2c1626b5a3463fd5b05f5e21f99e6aec40a3192eee4d07a15970b07f", size = 221714 },
{ url = "https://files.pythonhosted.org/packages/d8/66/e0531c9d1525cb6eac5b5733c76f27f3053ee92665f83f8899516fea6e76/coverage-7.10.4-cp314-cp314t-win_arm64.whl", hash = "sha256:6f3a3496c0fa26bfac4ebc458747b778cff201c8ae94fa05e1391bab0dbc473c", size = 219368 }, { url = "https://files.pythonhosted.org/packages/23/18/fa1afdc60b5528d17416df440bcbd8fd12da12bfea9da5b6ae0f7a37d0f7/coverage-7.10.5-cp314-cp314t-win_arm64.whl", hash = "sha256:bc85eb2d35e760120540afddd3044a5bf69118a91a296a8b3940dfc4fdcfe1e2", size = 219735 },
{ url = "https://files.pythonhosted.org/packages/bb/78/983efd23200921d9edb6bd40512e1aa04af553d7d5a171e50f9b2b45d109/coverage-7.10.4-py3-none-any.whl", hash = "sha256:065d75447228d05121e5c938ca8f0e91eed60a1eb2d1258d42d5084fecfc3302", size = 208365 }, { url = "https://files.pythonhosted.org/packages/08/b6/fff6609354deba9aeec466e4bcaeb9d1ed3e5d60b14b57df2a36fb2273f2/coverage-7.10.5-py3-none-any.whl", hash = "sha256:0be24d35e4db1d23d0db5c0f6a74a962e2ec83c426b5cac09f4234aadef38e4a", size = 208736 },
] ]
[[package]] [[package]]
@@ -213,27 +213,27 @@ wheels = [
[[package]]
name = "email-validator"
-version = "2.2.0"
+version = "2.3.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "dnspython" },
{ name = "idna" },
]
-sdist = { url = "https://files.pythonhosted.org/packages/48/ce/13508a1ec3f8bb981ae4ca79ea40384becc868bfae97fd1c942bb3a001b1/email_validator-2.2.0.tar.gz", hash = "sha256:cb690f344c617a714f22e66ae771445a1ceb46821152df8e165c5f9a364582b7", size = 48967 }
+sdist = { url = "https://files.pythonhosted.org/packages/f5/22/900cb125c76b7aaa450ce02fd727f452243f2e91a61af068b40adba60ea9/email_validator-2.3.0.tar.gz", hash = "sha256:9fc05c37f2f6cf439ff414f8fc46d917929974a82244c20eb10231ba60c54426", size = 51238 }
wheels = [
-{ url = "https://files.pythonhosted.org/packages/d7/ee/bf0adb559ad3c786f12bcbc9296b3f5675f529199bef03e2df281fa1fadb/email_validator-2.2.0-py3-none-any.whl", hash = "sha256:561977c2d73ce3611850a06fa56b414621e0c8faa9d66f2611407d87465da631", size = 33521 },
+{ url = "https://files.pythonhosted.org/packages/de/15/545e2b6cf2e3be84bc1ed85613edd75b8aea69807a71c26f4ca6a9258e82/email_validator-2.3.0-py3-none-any.whl", hash = "sha256:80f13f623413e6b197ae73bb10bf4eb0908faf509ad8362c5edeb0be7fd450b4", size = 35604 },
]

[[package]]
name = "faker"
-version = "37.5.3"
+version = "37.6.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "tzdata" },
]
-sdist = { url = "https://files.pythonhosted.org/packages/ce/5d/7797a74e8e31fa227f0303239802c5f09b6722bdb6638359e7b6c8f30004/faker-37.5.3.tar.gz", hash = "sha256:8315d8ff4d6f4f588bd42ffe63abd599886c785073e26a44707e10eeba5713dc", size = 1907147 }
+sdist = { url = "https://files.pythonhosted.org/packages/24/cd/f7679c20f07d9e2013123b7f7e13809a3450a18d938d58e86081a486ea15/faker-37.6.0.tar.gz", hash = "sha256:0f8cc34f30095184adf87c3c24c45b38b33ad81c35ef6eb0a3118f301143012c", size = 1907960 }
wheels = [
-{ url = "https://files.pythonhosted.org/packages/4b/bf/d06dd96e7afa72069dbdd26ed0853b5e8bd7941e2c0819a9b21d6e6fc052/faker-37.5.3-py3-none-any.whl", hash = "sha256:386fe9d5e6132a915984bf887fcebcc72d6366a25dd5952905b31b141a17016d", size = 1949261 },
+{ url = "https://files.pythonhosted.org/packages/61/7d/8b50e4ac772719777be33661f4bde320793400a706f5eb214e4de46f093c/faker-37.6.0-py3-none-any.whl", hash = "sha256:3c5209b23d7049d596a51db5d76403a0ccfea6fc294ffa2ecfef6a8843b1e6a7", size = 1949837 },
]

[[package]]
@@ -742,6 +742,15 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/5b/ee/7d76eb3b50ccb1397621f32ede0fb4d17aa55a9aa2251bc34e6b9929fdce/python_vlc-3.0.21203-py3-none-any.whl", hash = "sha256:1613451a31b692ec276296ceeae0c0ba82bfc2d094dabf9aceb70f58944a6320", size = 87651 },
]

+[[package]]
+name = "pytz"
+version = "2025.2"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/f8/bf/abbd3cdfb8fbc7fb3d4d38d320f2441b1e7cbe29be4f23797b4a2b5d8aac/pytz-2025.2.tar.gz", hash = "sha256:360b9e3dbb49a209c21ad61809c7fb453643e048b38924c765813546746e81c3", size = 320884 }
+wheels = [
+{ url = "https://files.pythonhosted.org/packages/81/c4/34e93fe5f5429d7570ec1fa436f1986fb1f00c3e0f43a589fe2bbcd22c3f/pytz-2025.2-py2.py3-none-any.whl", hash = "sha256:5ddf76296dd8c44c26eb8f4b6f35488f3ccbf6fbbd7adee0b7262d43f0ec2f00", size = 509225 },
+]

[[package]]
name = "pyyaml"
version = "6.0.2"
@@ -843,28 +852,28 @@ wheels = [
[[package]]
name = "ruff"
-version = "0.12.10"
+version = "0.12.11"
source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/3b/eb/8c073deb376e46ae767f4961390d17545e8535921d2f65101720ed8bd434/ruff-0.12.10.tar.gz", hash = "sha256:189ab65149d11ea69a2d775343adf5f49bb2426fc4780f65ee33b423ad2e47f9", size = 5310076 }
+sdist = { url = "https://files.pythonhosted.org/packages/de/55/16ab6a7d88d93001e1ae4c34cbdcfb376652d761799459ff27c1dc20f6fa/ruff-0.12.11.tar.gz", hash = "sha256:c6b09ae8426a65bbee5425b9d0b82796dbb07cb1af045743c79bfb163001165d", size = 5347103 }
wheels = [
-{ url = "https://files.pythonhosted.org/packages/24/e7/560d049d15585d6c201f9eeacd2fd130def3741323e5ccf123786e0e3c95/ruff-0.12.10-py3-none-linux_armv6l.whl", hash = "sha256:8b593cb0fb55cc8692dac7b06deb29afda78c721c7ccfed22db941201b7b8f7b", size = 11935161 },
+{ url = "https://files.pythonhosted.org/packages/d6/a2/3b3573e474de39a7a475f3fbaf36a25600bfeb238e1a90392799163b64a0/ruff-0.12.11-py3-none-linux_armv6l.whl", hash = "sha256:93fce71e1cac3a8bf9200e63a38ac5c078f3b6baebffb74ba5274fb2ab276065", size = 11979885 },
-{ url = "https://files.pythonhosted.org/packages/d1/b0/ad2464922a1113c365d12b8f80ed70fcfb39764288ac77c995156080488d/ruff-0.12.10-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:ebb7333a45d56efc7c110a46a69a1b32365d5c5161e7244aaf3aa20ce62399c1", size = 12660884 },
+{ url = "https://files.pythonhosted.org/packages/76/e4/235ad6d1785a2012d3ded2350fd9bc5c5af8c6f56820e696b0118dfe7d24/ruff-0.12.11-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:b8e33ac7b28c772440afa80cebb972ffd823621ded90404f29e5ab6d1e2d4b93", size = 12742364 },
-{ url = "https://files.pythonhosted.org/packages/d7/f1/97f509b4108d7bae16c48389f54f005b62ce86712120fd8b2d8e88a7cb49/ruff-0.12.10-py3-none-macosx_11_0_arm64.whl", hash = "sha256:d59e58586829f8e4a9920788f6efba97a13d1fa320b047814e8afede381c6839", size = 11872754 },
+{ url = "https://files.pythonhosted.org/packages/2c/0d/15b72c5fe6b1e402a543aa9d8960e0a7e19dfb079f5b0b424db48b7febab/ruff-0.12.11-py3-none-macosx_11_0_arm64.whl", hash = "sha256:d69fb9d4937aa19adb2e9f058bc4fbfe986c2040acb1a4a9747734834eaa0bfd", size = 11920111 },
-{ url = "https://files.pythonhosted.org/packages/12/ad/44f606d243f744a75adc432275217296095101f83f966842063d78eee2d3/ruff-0.12.10-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:822d9677b560f1fdeab69b89d1f444bf5459da4aa04e06e766cf0121771ab844", size = 12092276 },
+{ url = "https://files.pythonhosted.org/packages/3e/c0/f66339d7893798ad3e17fa5a1e587d6fd9806f7c1c062b63f8b09dda6702/ruff-0.12.11-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:411954eca8464595077a93e580e2918d0a01a19317af0a72132283e28ae21bee", size = 12160060 },
-{ url = "https://files.pythonhosted.org/packages/06/1f/ed6c265e199568010197909b25c896d66e4ef2c5e1c3808caf461f6f3579/ruff-0.12.10-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:37b4a64f4062a50c75019c61c7017ff598cb444984b638511f48539d3a1c98db", size = 11734700 },
+{ url = "https://files.pythonhosted.org/packages/03/69/9870368326db26f20c946205fb2d0008988aea552dbaec35fbacbb46efaa/ruff-0.12.11-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:6a2c0a2e1a450f387bf2c6237c727dd22191ae8c00e448e0672d624b2bbd7fb0", size = 11799848 },
-{ url = "https://files.pythonhosted.org/packages/63/c5/b21cde720f54a1d1db71538c0bc9b73dee4b563a7dd7d2e404914904d7f5/ruff-0.12.10-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:2c6f4064c69d2542029b2a61d39920c85240c39837599d7f2e32e80d36401d6e", size = 13468783 },
+{ url = "https://files.pythonhosted.org/packages/25/8c/dd2c7f990e9b3a8a55eee09d4e675027d31727ce33cdb29eab32d025bdc9/ruff-0.12.11-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:8ca4c3a7f937725fd2413c0e884b5248a19369ab9bdd850b5781348ba283f644", size = 13536288 },
-{ url = "https://files.pythonhosted.org/packages/02/9e/39369e6ac7f2a1848f22fb0b00b690492f20811a1ac5c1fd1d2798329263/ruff-0.12.10-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:059e863ea3a9ade41407ad71c1de2badfbe01539117f38f763ba42a1206f7559", size = 14436642 },
+{ url = "https://files.pythonhosted.org/packages/7a/30/d5496fa09aba59b5e01ea76775a4c8897b13055884f56f1c35a4194c2297/ruff-0.12.11-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:4d1df0098124006f6a66ecf3581a7f7e754c4df7644b2e6704cd7ca80ff95211", size = 14490633 },
-{ url = "https://files.pythonhosted.org/packages/e3/03/5da8cad4b0d5242a936eb203b58318016db44f5c5d351b07e3f5e211bb89/ruff-0.12.10-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1bef6161e297c68908b7218fa6e0e93e99a286e5ed9653d4be71e687dff101cf", size = 13859107 },
+{ url = "https://files.pythonhosted.org/packages/9b/2f/81f998180ad53445d403c386549d6946d0748e536d58fce5b5e173511183/ruff-0.12.11-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5a8dd5f230efc99a24ace3b77e3555d3fbc0343aeed3fc84c8d89e75ab2ff793", size = 13888430 },
-{ url = "https://files.pythonhosted.org/packages/19/19/dd7273b69bf7f93a070c9cec9494a94048325ad18fdcf50114f07e6bf417/ruff-0.12.10-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:4f1345fbf8fb0531cd722285b5f15af49b2932742fc96b633e883da8d841896b", size = 12886521 },
+{ url = "https://files.pythonhosted.org/packages/87/71/23a0d1d5892a377478c61dbbcffe82a3476b050f38b5162171942a029ef3/ruff-0.12.11-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:4dc75533039d0ed04cd33fb8ca9ac9620b99672fe7ff1533b6402206901c34ee", size = 12913133 },
-{ url = "https://files.pythonhosted.org/packages/c0/1d/b4207ec35e7babaee62c462769e77457e26eb853fbdc877af29417033333/ruff-0.12.10-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1f68433c4fbc63efbfa3ba5db31727db229fa4e61000f452c540474b03de52a9", size = 13097528 },
+{ url = "https://files.pythonhosted.org/packages/80/22/3c6cef96627f89b344c933781ed38329bfb87737aa438f15da95907cbfd5/ruff-0.12.11-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4fc58f9266d62c6eccc75261a665f26b4ef64840887fc6cbc552ce5b29f96cc8", size = 13169082 },
-{ url = "https://files.pythonhosted.org/packages/ff/00/58f7b873b21114456e880b75176af3490d7a2836033779ca42f50de3b47a/ruff-0.12.10-py3-none-manylinux_2_31_riscv64.whl", hash = "sha256:141ce3d88803c625257b8a6debf4a0473eb6eed9643a6189b68838b43e78165a", size = 13080443 },
+{ url = "https://files.pythonhosted.org/packages/05/b5/68b3ff96160d8b49e8dd10785ff3186be18fd650d356036a3770386e6c7f/ruff-0.12.11-py3-none-manylinux_2_31_riscv64.whl", hash = "sha256:5a0113bd6eafd545146440225fe60b4e9489f59eb5f5f107acd715ba5f0b3d2f", size = 13139490 },
-{ url = "https://files.pythonhosted.org/packages/12/8c/9e6660007fb10189ccb78a02b41691288038e51e4788bf49b0a60f740604/ruff-0.12.10-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:f3fc21178cd44c98142ae7590f42ddcb587b8e09a3b849cbc84edb62ee95de60", size = 11896759 },
+{ url = "https://files.pythonhosted.org/packages/59/b9/050a3278ecd558f74f7ee016fbdf10591d50119df8d5f5da45a22c6afafc/ruff-0.12.11-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:0d737b4059d66295c3ea5720e6efc152623bb83fde5444209b69cd33a53e2000", size = 11958928 },
-{ url = "https://files.pythonhosted.org/packages/67/4c/6d092bb99ea9ea6ebda817a0e7ad886f42a58b4501a7e27cd97371d0ba54/ruff-0.12.10-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:7d1a4e0bdfafcd2e3e235ecf50bf0176f74dd37902f241588ae1f6c827a36c56", size = 11701463 },
+{ url = "https://files.pythonhosted.org/packages/f9/bc/93be37347db854806904a43b0493af8d6873472dfb4b4b8cbb27786eb651/ruff-0.12.11-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:916fc5defee32dbc1fc1650b576a8fed68f5e8256e2180d4d9855aea43d6aab2", size = 11764513 },
{ url = "https://files.pythonhosted.org/packages/59/80/d982c55e91df981f3ab62559371380616c57ffd0172d96850280c2b04fa8/ruff-0.12.10-py3-none-musllinux_1_2_i686.whl", hash = "sha256:e67d96827854f50b9e3e8327b031647e7bcc090dbe7bb11101a81a3a2cbf1cc9", size = 12691603 }, { url = "https://files.pythonhosted.org/packages/7a/a1/1471751e2015a81fd8e166cd311456c11df74c7e8769d4aabfbc7584c7ac/ruff-0.12.11-py3-none-musllinux_1_2_i686.whl", hash = "sha256:c984f07d7adb42d3ded5be894fb4007f30f82c87559438b4879fe7aa08c62b39", size = 12745154 },
{ url = "https://files.pythonhosted.org/packages/ad/37/63a9c788bbe0b0850611669ec6b8589838faf2f4f959647f2d3e320383ae/ruff-0.12.10-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:ae479e1a18b439c59138f066ae79cc0f3ee250712a873d00dbafadaad9481e5b", size = 13164356 }, { url = "https://files.pythonhosted.org/packages/68/ab/2542b14890d0f4872dd81b7b2a6aed3ac1786fae1ce9b17e11e6df9e31e3/ruff-0.12.11-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:e07fbb89f2e9249f219d88331c833860489b49cdf4b032b8e4432e9b13e8a4b9", size = 13227653 },
{ url = "https://files.pythonhosted.org/packages/47/d4/1aaa7fb201a74181989970ebccd12f88c0fc074777027e2a21de5a90657e/ruff-0.12.10-py3-none-win32.whl", hash = "sha256:9de785e95dc2f09846c5e6e1d3a3d32ecd0b283a979898ad427a9be7be22b266", size = 11896089 }, { url = "https://files.pythonhosted.org/packages/22/16/2fbfc61047dbfd009c58a28369a693a1484ad15441723be1cd7fe69bb679/ruff-0.12.11-py3-none-win32.whl", hash = "sha256:c792e8f597c9c756e9bcd4d87cf407a00b60af77078c96f7b6366ea2ce9ba9d3", size = 11944270 },
{ url = "https://files.pythonhosted.org/packages/ad/14/2ad38fd4037daab9e023456a4a40ed0154e9971f8d6aed41bdea390aabd9/ruff-0.12.10-py3-none-win_amd64.whl", hash = "sha256:7837eca8787f076f67aba2ca559cefd9c5cbc3a9852fd66186f4201b87c1563e", size = 13004616 }, { url = "https://files.pythonhosted.org/packages/08/a5/34276984705bfe069cd383101c45077ee029c3fe3b28225bf67aa35f0647/ruff-0.12.11-py3-none-win_amd64.whl", hash = "sha256:a3283325960307915b6deb3576b96919ee89432ebd9c48771ca12ee8afe4a0fd", size = 13046600 },
{ url = "https://files.pythonhosted.org/packages/24/3c/21cf283d67af33a8e6ed242396863af195a8a6134ec581524fd22b9811b6/ruff-0.12.10-py3-none-win_arm64.whl", hash = "sha256:cc138cc06ed9d4bfa9d667a65af7172b47840e1a98b02ce7011c391e54635ffc", size = 12074225 }, { url = "https://files.pythonhosted.org/packages/84/a8/001d4a7c2b37623a3fd7463208267fb906df40ff31db496157549cfd6e72/ruff-0.12.11-py3-none-win_arm64.whl", hash = "sha256:bae4d6e6a2676f8fb0f98b74594a048bae1b944aab17e9f5d504062303c6dbea", size = 12135290 },
] ]
[[package]] [[package]]
@@ -883,6 +892,7 @@ dependencies = [
{ name = "pyjwt" }, { name = "pyjwt" },
{ name = "python-socketio" }, { name = "python-socketio" },
{ name = "python-vlc" }, { name = "python-vlc" },
{ name = "pytz" },
{ name = "sqlmodel" }, { name = "sqlmodel" },
{ name = "uvicorn", extra = ["standard"] }, { name = "uvicorn", extra = ["standard"] },
{ name = "yt-dlp" }, { name = "yt-dlp" },
@@ -904,7 +914,7 @@ requires-dist = [
{ name = "aiosqlite", specifier = "==0.21.0" }, { name = "aiosqlite", specifier = "==0.21.0" },
{ name = "apscheduler", specifier = "==3.11.0" }, { name = "apscheduler", specifier = "==3.11.0" },
{ name = "bcrypt", specifier = "==4.3.0" }, { name = "bcrypt", specifier = "==4.3.0" },
{ name = "email-validator", specifier = "==2.2.0" }, { name = "email-validator", specifier = "==2.3.0" },
{ name = "fastapi", extras = ["standard"], specifier = "==0.116.1" }, { name = "fastapi", extras = ["standard"], specifier = "==0.116.1" },
{ name = "ffmpeg-python", specifier = "==0.2.0" }, { name = "ffmpeg-python", specifier = "==0.2.0" },
{ name = "httpx", specifier = "==0.28.1" }, { name = "httpx", specifier = "==0.28.1" },
@@ -912,20 +922,21 @@ requires-dist = [
{ name = "pyjwt", specifier = "==2.10.1" }, { name = "pyjwt", specifier = "==2.10.1" },
{ name = "python-socketio", specifier = "==5.13.0" }, { name = "python-socketio", specifier = "==5.13.0" },
{ name = "python-vlc", specifier = "==3.0.21203" }, { name = "python-vlc", specifier = "==3.0.21203" },
{ name = "pytz", specifier = "==2025.2" },
{ name = "sqlmodel", specifier = "==0.0.24" }, { name = "sqlmodel", specifier = "==0.0.24" },
{ name = "uvicorn", extras = ["standard"], specifier = "==0.35.0" }, { name = "uvicorn", extras = ["standard"], specifier = "==0.35.0" },
{ name = "yt-dlp", specifier = "==2025.8.20" }, { name = "yt-dlp", specifier = "==2025.8.27" },
] ]
[package.metadata.requires-dev] [package.metadata.requires-dev]
dev = [ dev = [
{ name = "coverage", specifier = "==7.10.4" }, { name = "coverage", specifier = "==7.10.5" },
{ name = "faker", specifier = "==37.5.3" }, { name = "faker", specifier = "==37.6.0" },
{ name = "httpx", specifier = "==0.28.1" }, { name = "httpx", specifier = "==0.28.1" },
{ name = "mypy", specifier = "==1.17.1" }, { name = "mypy", specifier = "==1.17.1" },
{ name = "pytest", specifier = "==8.4.1" }, { name = "pytest", specifier = "==8.4.1" },
{ name = "pytest-asyncio", specifier = "==1.1.0" }, { name = "pytest-asyncio", specifier = "==1.1.0" },
{ name = "ruff", specifier = "==0.12.10" }, { name = "ruff", specifier = "==0.12.11" },
] ]
[[package]] [[package]]
@@ -1248,9 +1259,9 @@ wheels = [
[[package]] [[package]]
name = "yt-dlp" name = "yt-dlp"
version = "2025.8.20" version = "2025.8.27"
source = { registry = "https://pypi.org/simple" } source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/c8/af/d3c81af35ae2aef148d0ff78f001650ce5a7ca73fbd3b271eb9aab4c56ee/yt_dlp-2025.8.20.tar.gz", hash = "sha256:da873bcf424177ab5c3b701fa94ea4cdac17bf3aec5ef37b91f530c90def7bcf", size = 3037484 } sdist = { url = "https://files.pythonhosted.org/packages/f4/d4/d9dd231b03f09fdfb5f0fe70f30de0b5f59454aa54fa6b2b2aea49404988/yt_dlp-2025.8.27.tar.gz", hash = "sha256:ed74768d2a93b29933ab14099da19497ef571637f7aa375140dd3d882b9c1854", size = 3038374 }
wheels = [ wheels = [
{ url = "https://files.pythonhosted.org/packages/33/e8/ebd888100684c10799897296e3061c19ba5559b641f8da218bf48229a815/yt_dlp-2025.8.20-py3-none-any.whl", hash = "sha256:073c97e2a3f9cd0fa6a76142c4ef46ca62b2575c37eaf80d8c3718fd6f3277eb", size = 3266841 }, { url = "https://files.pythonhosted.org/packages/cb/9c/b69fc0c800f80b94ea2f8eff1d1f473fecee6aa337681d297ba7c7c5d3fd/yt_dlp-2025.8.27-py3-none-any.whl", hash = "sha256:0b8fd3bb7c54bc2e7ecb5cdac7d64c30e2503ea4d3dd9ae24d4f09e22aaa95f4", size = 3267059 },
] ]