You need to design a Content Delivery Network (CDN) to serve static images globally with low latency. Which architectural component is essential to reduce the load on the origin server?
Think about how to reduce repeated requests to the main server by serving content closer to users.
Edge cache servers store copies of content near users, reducing latency and origin server load. Other options do not reduce load or latency effectively.
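The load-reduction effect of an edge cache can be sketched in a few lines. This is a hypothetical minimal model (names like `EdgeCache` and `fetch_from_origin` are illustrative, not a real CDN API): repeated requests for the same object trigger only one origin fetch.

```python
class EdgeCache:
    """Toy edge cache: serves from local store, fetching from origin only on a miss."""

    def __init__(self):
        self.store = {}          # path -> cached content
        self.origin_fetches = 0  # how many requests reached the origin

    def fetch_from_origin(self, path):
        self.origin_fetches += 1
        return f"<image bytes for {path}>"  # stand-in for a real origin request

    def get(self, path):
        if path not in self.store:           # cache miss: go to origin once
            self.store[path] = self.fetch_from_origin(path)
        return self.store[path]              # cache hit: served locally

edge = EdgeCache()
for _ in range(1000):
    edge.get("/img/logo.png")  # 1,000 user requests for the same image

print(edge.origin_fetches)     # -> 1 (999 of 1,000 requests never touched the origin)
```

One origin fetch serves a thousand users; the hit ratio, not raw request volume, determines origin load.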
Your CDN experiences sudden traffic spikes during a product launch. Which scaling strategy helps maintain performance without overloading origin servers?
Consider how caching duration affects origin server load during high traffic.
Increasing the cache TTL keeps content at edge servers longer, reducing origin requests during spikes. Disabling caching or reducing the number of edge servers increases both load and latency.
In a CDN, what is the main tradeoff when setting a very short cache expiration time (TTL) for dynamic content?
Think about how often the CDN must fetch fresh content from the origin.
A short TTL causes frequent cache expiration, so the CDN must fetch fresh content from the origin more often, increasing origin requests and load; the benefit is that users receive fresher content.
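The TTL tradeoff in the two answers above can be simulated with a toy time-based cache (a sketch with a simulated clock; `TTLCache` and `origin_load` are hypothetical names): the same request stream produces many more origin fetches under a short TTL than a long one.

```python
class TTLCache:
    """Toy cache that refetches from origin once its copy is older than the TTL."""

    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self.cached_at = None    # simulated timestamp of the cached copy
        self.origin_fetches = 0

    def get(self, now):
        # Refetch when nothing is cached or the cached copy has expired.
        if self.cached_at is None or now - self.cached_at >= self.ttl:
            self.origin_fetches += 1
            self.cached_at = now
        return "content"

def origin_load(ttl, requests_per_sec=10, duration_sec=60):
    """Count origin fetches for a steady request stream over the given duration."""
    cache = TTLCache(ttl)
    for now in range(duration_sec):
        for _ in range(requests_per_sec):
            cache.get(now)
    return cache.origin_fetches

print(origin_load(ttl=5))   # -> 12 origin fetches in 60 s (fresh content, high origin load)
print(origin_load(ttl=60))  # -> 1 origin fetch (low origin load, content may go stale)
```

Raising the TTL from 5 s to 60 s cuts origin traffic by 12x in this toy run, at the cost of potentially serving stale content for up to a minute.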
When a user requests a file from a CDN edge server that does not have the file cached, what is the correct sequence of events?
Think about the logical order from cache check to serving the user.
The edge server first checks its cache (1). On a miss, it requests the file from the origin (2). The origin responds with the file (3). The edge server caches the file and serves it to the user (4).
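The four steps map directly onto a request handler. This is an illustrative sketch (a real edge server would add TTLs, headers, and error handling; `handle_request` and `fetch_from_origin` are hypothetical names), with each comment numbered to match the sequence above.

```python
cache = {}  # edge server's local store: path -> content

def fetch_from_origin(path):
    return f"bytes of {path}"  # stand-in for the real origin request

def handle_request(path):
    # 1. The edge server checks its local cache.
    if path in cache:
        return cache[path]            # hit: serve immediately, origin untouched
    # 2. On a miss, the edge server requests the file from the origin.
    content = fetch_from_origin(path)
    # 3. The origin responds with the file (modeled by the return value above).
    # 4. The edge server caches the file, then serves it to the user.
    cache[path] = content
    return content

print(handle_request("/img/a.png"))  # first request: miss, fetched from origin
print(handle_request("/img/a.png"))  # second request: hit, served from cache
```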
You expect 1 million users streaming a 5-minute video simultaneously via CDN. Each stream's bitrate is 3 Mbps. Estimate the minimum total bandwidth capacity the CDN must support to serve all users without buffering.
Calculate total bandwidth by multiplying users by bitrate, then convert units carefully.
1,000,000 users × 3 Mbps = 3,000,000 Mbps = 3,000 Gbps (1 Gbps = 1,000 Mbps). The CDN must support at least 3,000 Gbps (B).
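The arithmetic above can be checked directly:

```python
users = 1_000_000
bitrate_mbps = 3  # per-stream bitrate in Mbps

total_mbps = users * bitrate_mbps    # aggregate demand in Mbps
total_gbps = total_mbps / 1_000      # 1 Gbps = 1,000 Mbps

print(total_mbps)  # -> 3000000
print(total_gbps)  # -> 3000.0
```

Note that the 5-minute duration does not change the answer: bandwidth is a rate, so the peak requirement is set by concurrent streams × bitrate, not by how long they run.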
