Merged · 18 commits
76 changes: 76 additions & 0 deletions .github/workflows/examples.yml
@@ -0,0 +1,76 @@
```yaml
name: Examples Tests

on:
  push:
    branches: [main, develop]
    paths:
      - 'examples/**'
      - 'examples_tests/**'
      - 'pgmq_sqlalchemy/**'
      - '.github/workflows/examples.yml'
  pull_request:
    branches: [main, develop]
    paths:
      - 'examples/**'
      - 'examples_tests/**'
      - 'pgmq_sqlalchemy/**'
      - '.github/workflows/examples.yml'

jobs:
  test-examples:
    runs-on: ubuntu-latest

    permissions:
      contents: read

    strategy:
      matrix:
        python-version: ["3.9", "3.10", "3.11", "3.12"]

    name: Test Examples (Python ${{ matrix.python-version }})

    steps:
      - uses: actions/checkout@v4

      - name: Set up Python ${{ matrix.python-version }}
        uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python-version }}

      - name: Install uv
        run: |
          curl -LsSf https://astral.sh/uv/install.sh | sh
          echo "$HOME/.local/bin" >> $GITHUB_PATH

      - name: Install dependencies
        run: |
          uv sync --extra dev
          # Install additional dependencies for examples
          uv pip install fastapi uvicorn httpx

      - name: Start PostgreSQL
        run: |
          cp pgmq_postgres.template.env pgmq_postgres.env
          cp pgmq_tests.template.env pgmq_tests.env
          make start-db

      - name: Setup database for examples tests
        run: |
          docker compose exec -T pgmq_postgres psql -U postgres -c "CREATE EXTENSION IF NOT EXISTS pgmq CASCADE;"

      - name: Run examples tests
        run: |
          uv run pytest examples_tests --cov=examples --cov-report=xml:coverage-examples-py${{ matrix.python-version }}.xml -v
```
**Owner** (@copilot) suggested change:

```diff
-          uv run pytest examples_tests --cov=examples --cov-report=xml:coverage-examples-py${{ matrix.python-version }}.xml -v
+          uv run pytest examples_tests --cov=examples --cov-report=xml:coverage-examples-py${{ matrix.python-version }}.xml -vss
```

**Contributor Author:** Fixed in commit 71a5868. Updated pytest flags to `-vss` for better test output visibility in CI.


```yaml
      - name: Upload coverage artifact
        if: always()
        uses: actions/upload-artifact@v4
        with:
          name: coverage-examples-py${{ matrix.python-version }}
          path: coverage-examples-py${{ matrix.python-version }}.xml
          retention-days: 1

      - name: Cleanup
        if: always()
        run: |
          docker compose down
```
117 changes: 117 additions & 0 deletions examples/fastapi_pub_sub/README.md
@@ -0,0 +1,117 @@
# FastAPI Pub/Sub Example with PGMQ

This example demonstrates a real-world scenario of using PGMQ with FastAPI for an order management system. It shows how to:

- Use PGMQ with FastAPI and sync SQLAlchemy sessions (psycopg2)
- Publish messages using `PGMQOperation` (op) in a web API
- Consume messages asynchronously using `PGMQueue` with asyncpg

## Architecture

- **API Server (api.py)**: FastAPI application that creates orders and publishes them to PGMQ
- Uses sync database driver (psycopg2)
- Uses `PGMQOperation` (imported as `op`) for publishing messages
- Provides REST endpoints for creating and retrieving orders

- **Consumer (consumer.py)**: Async worker that processes orders from the queue
- Uses async database driver (asyncpg)
- Uses `PGMQueue` class for reading messages
- Processes messages concurrently with asyncio

## Prerequisites

- PostgreSQL with PGMQ extension installed
- Python 3.9 or higher

Quick setup:
```bash
docker run -d --name postgres -e POSTGRES_PASSWORD=postgres -p 5432:5432 quay.io/tembo/pg16-pgmq:latest
```

## Installation

Install required dependencies using uv:

```bash
uv pip install fastapi uvicorn psycopg2-binary asyncpg pgmq-sqlalchemy
```

Or install from the project root with uv:

```bash
cd /path/to/pgmq-sqlalchemy
uv pip install -e ".[psycopg2-binary,asyncpg]"
```
**Owner** (on lines +31 to +51): Use uv to manage the dependencies instead of requirements.txt

**Contributor Author:** Fixed in commit 0d2e4bf. Updated README to use `uv pip install` commands instead of pip with requirements.txt.


## Running the Example

### 1. Start the API Server

```bash
python api.py
```

The API will be available at http://localhost:8000

### 2. Start the Consumer

In a separate terminal:

```bash
python consumer.py
```

### 3. Create Orders

Create an order via the API:

```bash
curl -X POST "http://localhost:8000/orders" \
-H "Content-Type: application/json" \
-d '{
"customer_name": "John Doe",
"product_name": "Widget",
"quantity": 5,
"price": 29.99
}'
```

You should see:
- The API returns the created order with a message ID
- The consumer logs show the order being processed

### 4. View Order

Get an order by ID:

```bash
curl "http://localhost:8000/orders/1"
```

## API Endpoints

- `POST /orders` - Create a new order
- `GET /orders/{order_id}` - Get order by ID
- `GET /health` - Health check endpoint

## How It Works

1. When an order is created via the API:
- The order is saved to the database
- A message is published to PGMQ using `op.send()`
- The message contains order details

2. The consumer:
- Continuously polls the queue for new messages
- Processes messages concurrently using asyncio
- Deletes successfully processed messages
- Leaves failed messages in the queue for retry
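The order payload published in step 1 is a plain JSON-serializable dict; a minimal sketch of its shape (field names mirror `api.py`, the sample values are made up):

```python
import json
from datetime import datetime, timezone

# Example payload, shaped like the message api.py publishes with op.send()
message_data = {
    "order_id": 1,
    "customer_name": "John Doe",
    "product_name": "Widget",
    "quantity": 5,
    "price": 29.99,
    # datetimes are not JSON-serializable, so they are sent as ISO 8601 strings
    "created_at": datetime.now(timezone.utc).isoformat(),
}

# PGMQ stores the payload as JSONB, so it must survive a JSON round-trip
restored = json.loads(json.dumps(message_data))
assert restored["order_id"] == 1
assert datetime.fromisoformat(restored["created_at"]).tzinfo is not None
```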

## Configuration

You can modify the following constants in the files:

- `DATABASE_URL`: PostgreSQL connection string
- `QUEUE_NAME`: Name of the PGMQ queue (default: "order_queue")
- `batch_size`: Number of messages to process in each batch (consumer.py)
- `vt`: Visibility timeout in seconds (consumer.py)
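The consumer's read-process-delete cycle can be sketched independently of the database. `fetch` and `handle` below are hypothetical stand-ins for the queue read and the business logic; a real consumer would call the `PGMQueue` methods instead:

```python
import asyncio

async def poll_loop(fetch, handle, batch_size=5, idle_sleep=0.01, max_iters=10):
    """Fetch a batch, process messages concurrently, and collect the ones
    that succeeded (a real consumer would delete those from the queue)."""
    done = []
    for _ in range(max_iters):
        batch = fetch(batch_size)  # stand-in for reading a batch from PGMQ
        if not batch:
            await asyncio.sleep(idle_sleep)  # back off while the queue is empty
            continue
        results = await asyncio.gather(*(handle(m) for m in batch), return_exceptions=True)
        for msg, result in zip(batch, results):
            if not isinstance(result, Exception):
                done.append(msg)  # failed messages stay in the queue for retry
    return done
```

A handler that raises simply leaves its message on the queue; it becomes visible again once the visibility timeout (`vt`) expires and is retried on a later poll.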
202 changes: 202 additions & 0 deletions examples/fastapi_pub_sub/api.py
@@ -0,0 +1,202 @@
```python
"""FastAPI API server for Order management with PGMQ message publishing.

This example demonstrates:
- Using FastAPI with SQLAlchemy sync session (psycopg2)
- Publishing messages to PGMQ using PGMQOperation (op)
- Creating orders and sending them to a message queue
"""
import os
from typing import Generator
from contextlib import asynccontextmanager
from datetime import datetime

from fastapi import FastAPI, Depends, HTTPException
from pydantic import BaseModel, ConfigDict
from sqlalchemy import create_engine, Column, Integer, String, Float, DateTime
from sqlalchemy.orm import Session, sessionmaker, declarative_base

from pgmq_sqlalchemy import op

# Database configuration - can be overridden by environment variables
DATABASE_URL = os.getenv("DATABASE_URL", "postgresql+psycopg2://postgres:postgres@localhost:5432/postgres")
QUEUE_NAME = os.getenv("QUEUE_NAME", "order_queue")

# SQLAlchemy setup
engine = create_engine(DATABASE_URL)
SessionLocal = sessionmaker(bind=engine, autocommit=False, autoflush=False)
Base = declarative_base()


# Order Model (SQLAlchemy ORM)
class Order(Base):
    __tablename__ = "orders"

    id = Column(Integer, primary_key=True, index=True)
    customer_name = Column(String, nullable=False)
    product_name = Column(String, nullable=False)
    quantity = Column(Integer, nullable=False)
    price = Column(Float, nullable=False)
    created_at = Column(DateTime, default=datetime.utcnow)


# Pydantic models for request/response
class OrderCreate(BaseModel):
    # from_attributes lets GET /orders/{order_id} validate the ORM object directly
    model_config = ConfigDict(from_attributes=True)

    customer_name: str
    product_name: str
    quantity: int
    price: float


class OrderResponse(BaseModel):
    model_config = ConfigDict(from_attributes=True)

    id: int
    customer_name: str
    product_name: str
    quantity: int
    price: float
    created_at: datetime
    message_id: int


# Lifespan context manager for startup/shutdown
@asynccontextmanager
async def lifespan(app: FastAPI):
    """Initialize database tables and PGMQ queue on startup."""
    # Startup
    Base.metadata.create_all(bind=engine)

    # Initialize PGMQ queue
    with SessionLocal() as session:
        op.check_pgmq_ext(session=session, commit=True)

        # Create queue if it doesn't exist
        try:
            op.create_queue(QUEUE_NAME, session=session, commit=True)
        except Exception:
            # Queue might already exist, which is fine
            pass

    yield

    # Shutdown (if needed)


# FastAPI app with lifespan
app = FastAPI(title="Order Management with PGMQ", lifespan=lifespan)


# Database dependency
def get_db() -> Generator[Session, None, None]:
    """Database session dependency."""
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()


@app.post("/orders", response_model=OrderResponse, status_code=201)
def create_order(order_data: OrderCreate, db: Session = Depends(get_db)):
    """Create a new order and publish it to the message queue.

    Args:
        order_data: Order information
        db: Database session

    Returns:
        Created order with message ID
    """
    # Create order in database
    db_order = Order(
        customer_name=order_data.customer_name,
        product_name=order_data.product_name,
        quantity=order_data.quantity,
        price=order_data.price,
    )
    db.add(db_order)
    db.flush()  # Flush to get the ID without committing

    # Publish message to PGMQ using op in the same transaction
    message_data = {
        "order_id": db_order.id,
        "customer_name": db_order.customer_name,
        "product_name": db_order.product_name,
        "quantity": db_order.quantity,
        "price": db_order.price,
        "created_at": db_order.created_at.isoformat(),
    }

    msg_id = op.send(QUEUE_NAME, message_data, session=db, commit=False)

    # Commit both order and message in the same transaction
    db.commit()
    db.refresh(db_order)

    # Return order with message ID
    return OrderResponse(
        id=db_order.id,
        customer_name=db_order.customer_name,
        product_name=db_order.product_name,
        quantity=db_order.quantity,
        price=db_order.price,
        created_at=db_order.created_at,
        message_id=msg_id,
    )


@app.get("/orders/{order_id}", response_model=OrderCreate)
def get_order(order_id: int, db: Session = Depends(get_db)):
    """Get order by ID.

    Args:
        order_id: Order ID
        db: Database session

    Returns:
        Order information
    """
    order = db.query(Order).filter(Order.id == order_id).first()
    if not order:
        raise HTTPException(status_code=404, detail="Order not found")
    return order
```
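`create_order` above leans on one pattern: flush to obtain the order id, publish with `commit=False`, then issue a single commit so the order row and the queue message land atomically. The same pattern, sketched with stdlib `sqlite3` standing in for PostgreSQL/PGMQ (table and column names here are illustrative, not the real schema):

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_name TEXT)")
conn.execute("CREATE TABLE queue (msg_id INTEGER PRIMARY KEY, message TEXT)")

def create_order(conn, customer_name):
    cur = conn.execute("INSERT INTO orders (customer_name) VALUES (?)", (customer_name,))
    order_id = cur.lastrowid  # like db.flush(): the id is visible, nothing committed yet
    payload = json.dumps({"order_id": order_id, "customer_name": customer_name})
    msg = conn.execute("INSERT INTO queue (message) VALUES (?)", (payload,))
    conn.commit()  # one commit: order row and queue message become visible together
    return order_id, msg.lastrowid
```

If anything raises between the two inserts, neither row is committed, so an order can never exist without its queue message or vice versa.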

**Owner:** Let's add a route to query the msg from pgmq.

**Contributor Author:** Fixed in commit 0d2e4bf. Added `GET /messages?limit=10` endpoint to query messages from the PGMQ queue.


```python
@app.get("/messages")
def get_messages(limit: int = 10, db: Session = Depends(get_db)):
    """Read messages from the PGMQ queue.

    Args:
        limit: Number of messages to read (default: 10)
        db: Database session

    Returns:
        List of messages from the queue
    """
    messages = op.read_batch(QUEUE_NAME, vt=30, batch_size=limit, session=db, commit=True)

    if not messages:
        return {"messages": []}

    return {
        "messages": [
            {
                "msg_id": msg.msg_id,
                "read_ct": msg.read_ct,
                "enqueued_at": msg.enqueued_at.isoformat(),
                "vt": msg.vt.isoformat(),
                "message": msg.message,
            }
            for msg in messages
        ]
    }


@app.get("/health")
def health_check():
    """Health check endpoint."""
    return {"status": "ok"}


if __name__ == "__main__":
    import uvicorn

    uvicorn.run(app, host="0.0.0.0", port=8000)
```