FastAPI framework · ~20 mins

Connection pooling in FastAPI - Practice Problems & Coding Challenges

Challenge - 5 Problems
🎖️ Connection Pooling Mastery
Get all challenges correct to earn this badge!
Test your skills under time pressure!
⚙️ Behavior
intermediate
How does connection pooling affect FastAPI app performance?

Consider a FastAPI app using a database connection pool. What is the main effect of using connection pooling on the app's performance under high load?

A. It disables database access to improve API response speed.
B. It increases latency because it opens a new connection for every request.
C. It causes the app to crash when many requests come in simultaneously.
D. It reduces latency by reusing existing database connections instead of opening new ones for each request.
💡 Hint

Think about how opening a new connection each time affects speed.
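The hint can be made concrete with a stdlib-only sketch. The `ToyConnection`/`ToyPool` classes below are illustrative stand-ins, not SQLAlchemy's API: they show how handing back an idle connection avoids the cost of opening a new one per request.

```python
class ToyConnection:
    """Stands in for an expensive-to-open database connection."""
    opened = 0  # counts how many connections were ever opened

    def __init__(self):
        ToyConnection.opened += 1


class ToyPool:
    """Hands back an idle connection instead of opening a new one."""

    def __init__(self):
        self._idle = []

    def acquire(self):
        return self._idle.pop() if self._idle else ToyConnection()

    def release(self, conn):
        self._idle.append(conn)


pool = ToyPool()
for _ in range(100):          # 100 sequential "requests"
    conn = pool.acquire()
    pool.release(conn)

print(ToyConnection.opened)   # → 1: one connection served all 100 requests
```

Without the pool, those 100 requests would have paid the connection-setup cost 100 times; with it, only once.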

📝 Syntax
intermediate
Identify correct connection pool setup in FastAPI with async SQLAlchemy

Which code snippet correctly creates an async connection pool for a PostgreSQL database using SQLAlchemy in FastAPI?

A. engine = create_async_engine('postgresql+asyncpg://user:pass@localhost/db', pool_size=10, max_overflow=20)
B. engine = create_async_engine('postgresql+asyncpg://user:pass@localhost/db', poolclass=QueuePool, pool_size=10)
C. engine = create_async_engine('postgresql+asyncpg://user:pass@localhost/db', poolclass=NullPool)
D. engine = create_async_engine('postgresql+asyncpg://user:pass@localhost/db', pool_size='ten')
💡 Hint

Check the parameter names and types for connection pool configuration.
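For reference, a minimal sketch of a correctly configured async engine, assuming `sqlalchemy[asyncio]` and `asyncpg` are installed and with placeholder credentials:

```python
from sqlalchemy.ext.asyncio import create_async_engine

# pool_size and max_overflow must be integers. The asyncpg dialect picks an
# async-compatible pool class on its own, so no poolclass argument is needed
# (a synchronous QueuePool would not work with an async engine, and NullPool
# would disable pooling entirely).
engine = create_async_engine(
    "postgresql+asyncpg://user:pass@localhost/db",  # placeholder URL
    pool_size=10,      # connections kept open in the pool
    max_overflow=20,   # extra connections allowed under burst load
)
```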

🔧 Debug
advanced
Why does the FastAPI app exhaust connections despite using a pool?

Given this FastAPI code snippet, why might the app run out of database connections under heavy load?

from sqlalchemy import text
from sqlalchemy.ext.asyncio import create_async_engine, AsyncSession
from sqlalchemy.orm import sessionmaker

engine = create_async_engine('postgresql+asyncpg://user:pass@localhost/db', pool_size=5)
async_session = sessionmaker(engine, class_=AsyncSession, expire_on_commit=False)

async def get_db():
    session = async_session()
    yield session

@app.get('/items')
async def read_items(db: AsyncSession = Depends(get_db)):
    result = await db.execute(text('SELECT * FROM items'))
    return result.fetchall()
A. The SQL query syntax is invalid, causing connection leaks.
B. The sessions are not closed properly, causing connections to stay open and exhaust the pool.
C. The pool size is too large, causing too many connections to open.
D. The async_session is not awaited, so connections are never released.
💡 Hint

Look at how sessions are managed and closed.
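The hint points at cleanup: a generator dependency only releases its connection if the session is closed after the `yield`. A stdlib-only sketch of that pattern follows, where `ToySession` is an illustrative stand-in for `AsyncSession` and the last three lines mimic how FastAPI drives a generator dependency:

```python
class ToySession:
    """Stand-in for a DB session holding one pooled connection."""
    open_sessions = 0

    def __init__(self):
        ToySession.open_sessions += 1

    def close(self):
        ToySession.open_sessions -= 1


def get_db():
    session = ToySession()
    try:
        yield session        # FastAPI injects this into the endpoint
    finally:
        session.close()      # runs after the response, releasing the connection


# Simulate how the framework consumes the dependency for one request:
gen = get_db()
db = next(gen)               # endpoint runs with `db` here
gen.close()                  # request done: triggers the finally block

print(ToySession.open_sessions)  # → 0: nothing left holding the pool
```

Without the `try`/`finally` (as in the snippet above), each request leaves a session open, and a pool of 5 is exhausted after 5 concurrent requests.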

📊 Output
advanced
What is the maximum number of simultaneous DB connections with this pool config?

Given this connection pool configuration in FastAPI using SQLAlchemy, what is the maximum number of simultaneous database connections that can be open?

engine = create_async_engine(
    'postgresql+asyncpg://user:pass@localhost/db',
    pool_size=3,
    max_overflow=2
)
A. 5 connections maximum
B. 3 connections maximum
C. 2 connections maximum
D. Unlimited connections
💡 Hint

Remember that max_overflow adds extra connections beyond the pool size.
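The arithmetic behind the hint, written out:

```python
pool_size = 3      # connections kept persistently in the pool
max_overflow = 2   # extra connections SQLAlchemy may open beyond pool_size

# Under load, the engine can have pool_size + max_overflow connections
# open at once; overflow connections are discarded when returned.
max_simultaneous = pool_size + max_overflow
print(max_simultaneous)  # → 5
```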

🧠 Conceptual
expert
Why prefer async connection pooling in FastAPI over sync pooling?

In a FastAPI app using async endpoints, why is it better to use an async connection pool rather than a synchronous one?

A. Async pooling disables connection reuse, forcing new connections each time.
B. Sync pooling is faster because it uses blocking calls, which are better for async apps.
C. Async pooling allows non-blocking database calls, improving concurrency and throughput in async endpoints.
D. Sync pooling automatically converts sync calls to async, so no difference exists.
💡 Hint

Think about how async code handles waiting for I/O.
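The waiting behavior the hint refers to can be demonstrated with plain `asyncio`. Here `fake_query` stands in for an async database call: while one "query" awaits I/O, the event loop runs the others, so three 0.1 s waits overlap instead of adding up to 0.3 s as they would with blocking, synchronous calls.

```python
import asyncio
import time


async def fake_query(delay: float) -> float:
    await asyncio.sleep(delay)   # non-blocking wait, like an async DB driver
    return delay


async def main():
    start = time.perf_counter()
    # Three "queries" wait on I/O concurrently instead of one at a time.
    results = await asyncio.gather(*(fake_query(0.1) for _ in range(3)))
    elapsed = time.perf_counter() - start
    return results, elapsed


results, elapsed = asyncio.run(main())
print(elapsed < 0.25)  # → True: roughly 0.1 s total, not 0.3 s of serial waits
```

A synchronous pool would hold the worker hostage for each full round trip; with an async pool, the same worker serves other requests during every wait.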