You have a web application with unpredictable traffic spikes that last only a few minutes. Which deployment option is best to handle this efficiently?
Think about cost and automatic scaling for short bursts.
Serverless functions scale automatically and you pay only for actual execution time, making them ideal for unpredictable, short bursts. Containers on fixed EC2 instances either sit idle between spikes, wasting money, or cannot scale out quickly enough for bursts lasting only minutes.
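A minimal sketch of what this looks like in practice, assuming an AWS Lambda-style Python runtime (the event shape and field names are illustrative): the platform creates instances of this handler on demand during a spike and bills per invocation, so idle time between spikes costs nothing.

```python
import json

def handler(event, context):
    # 'event' carries the request payload; 'context' holds runtime metadata.
    # The platform invokes this function once per request, scaling the
    # number of concurrent instances automatically during a burst.
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}"}),
    }
```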
You are designing a microservices app where some services need long-running processes and others are event-driven and short-lived. Which architecture fits best?
Consider which compute model suits long-running versus short-lived tasks.
Containers suit long-running processes because serverless functions impose execution time limits (commonly on the order of minutes on major platforms). Serverless is ideal for short-lived, event-driven tasks thanks to automatic scaling and pay-per-invocation pricing.
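The trade-off can be sketched with a toy runtime that flags work exceeding an execution time limit, the way serverless platforms do; the one-second limit and task names here are illustrative stand-ins (real platform limits are minutes, not seconds).

```python
import time

TIME_LIMIT_SECONDS = 1.0  # illustrative; real platforms cap at minutes

def run_as_function(task, *args):
    """Run a task under a serverless-style execution time limit."""
    start = time.monotonic()
    result = task(*args)
    if time.monotonic() - start > TIME_LIMIT_SECONDS:
        # A real platform would terminate the function mid-execution.
        raise TimeoutError("task exceeded the platform's execution limit")
    return result

def short_event_task(payload):
    return f"processed {payload}"   # fits comfortably in a function

def long_running_job():
    time.sleep(1.5)                 # exceeds the limit; belongs in a container
    return "done"
```

Routing `short_event_task` through `run_as_function` succeeds, while `long_running_job` trips the limit, which is why the long-running services in this scenario belong in containers.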
Which security concern is more critical when using containers compared to serverless functions?
Think about who manages the OS in each model.
Containers require you to manage the OS and patch vulnerabilities, while serverless abstracts this away. Execution time limits and event triggers are more relevant to serverless.
What is the main impact of cold starts in serverless functions on user experience?
Consider what happens when a function is called after not being used for a while.
Cold starts cause a delay in the first invocation after inactivity, leading to longer response times. They do not cause failures or permanent downtime.
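The usual mitigation is to perform expensive setup once, at module load, so only the first (cold) invocation pays the delay. A sketch assuming a Python function runtime, with a counter standing in for the real initialization cost:

```python
# State created at module load is reused across warm invocations of
# the same instance; only a cold start re-runs the setup.
_init_count = 0
_db_connection = None  # stands in for a real client/connection object

def _initialize():
    global _init_count, _db_connection
    _init_count += 1             # happens once per cold start
    _db_connection = object()    # e.g. open a connection pool here

def handler(event, context):
    if _db_connection is None:   # only true on a cold start
        _initialize()
    return {"init_count": _init_count, "payload": event}
```

Two consecutive invocations on the same instance report the same `init_count`, showing that the slow path ran only once.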
You run a hybrid app with containers for core services and serverless for auxiliary tasks. Which practice best optimizes cost and performance?
Think about scaling and cost efficiency for different workload types.
Dynamically scaling the containers matches capacity to demand for the long-running core services, while serverless absorbs the unpredictable, short-lived auxiliary tasks. Running containers at full capacity around the clock wastes resources, and converting everything to serverless may not fit the long-running workloads.
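The container side of that practice can be sketched with the proportional rule Kubernetes' Horizontal Pod Autoscaler uses, desiredReplicas = ceil(currentReplicas × currentUtilization / targetUtilization); the utilization numbers below are illustrative.

```python
import math

def desired_replicas(current_replicas, current_utilization, target_utilization):
    """HPA-style proportional scaling: size the fleet to observed load
    instead of running at full capacity."""
    return math.ceil(current_replicas * current_utilization / target_utilization)
```

For example, 4 replicas at 80% utilization against a 50% target scale up to 7, and 10 replicas at 20% utilization scale down to 4, freeing the unused capacity.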