
Why batch operations reduce round trips in REST APIs - Challenge Your Understanding

Challenge - 5 Problems
🧠 Conceptual
intermediate
Understanding batch operations and network round trips

Why do batch operations reduce the number of network round trips in REST APIs?

ABecause batch operations combine multiple requests into one, reducing the number of separate network calls.
BBecause batch operations compress data to make each request smaller.
CBecause batch operations use a faster network protocol than normal requests.
DBecause batch operations cache responses locally to avoid sending requests.
💡 Hint

Think about how sending many small packages compares to sending one big package.
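The hint's package analogy can be sketched as a simple round-trip counter (the `send` helper and endpoint strings are illustrative, not a real HTTP client):

```python
# Sketch: count network round trips for individual vs. batched requests.
# `send` is a hypothetical stand-in for one HTTP request/response cycle.
def send(request):
    """Simulate one network round trip and record it."""
    send.round_trips += 1
    return f"response to {request}"

send.round_trips = 0

# Individual requests: one round trip per operation.
for i in range(5):
    send(f"GET /items/{i}")
individual_trips = send.round_trips  # 5 round trips

send.round_trips = 0

# Batch: the same five operations packed into a single request body.
send({"batch": [f"GET /items/{i}" for i in range(5)]})
batch_trips = send.round_trips  # 1 round trip

print(individual_trips, batch_trips)
```

The payload grows, but the number of separate network calls (and their per-call latency) drops from five to one.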

Predict Output
intermediate
Output of batch request simulation

Consider this Python code simulating batch vs single requests. What is the output?

Python
import time

def single_requests(n):
    total_time = 0
    for _ in range(n):
        time.sleep(0.1)  # simulate network delay
        total_time += 0.1
    return total_time

def batch_request(n):
    time.sleep(0.1)  # simulate one batch delay
    return 0.1

print(f"Single requests total time: {single_requests(5)} seconds")
print(f"Batch request total time: {batch_request(5)} seconds")
A
Single requests total time: 0.5 seconds
Batch request total time: 0.5 seconds
B
Single requests total time: 0.5 seconds
Batch request total time: 0.1 seconds
C
Single requests total time: 0.1 seconds
Batch request total time: 0.5 seconds
D
Single requests total time: 0.1 seconds
Batch request total time: 0.1 seconds
💡 Hint

Each single request waits 0.1 seconds; the batch waits only once.

Predict Output
advanced
What is the output of this batch API call simulation?

Given this JavaScript code simulating batch API calls, what is logged?

JavaScript
async function simulateBatch() {
  const delay = ms => new Promise(res => setTimeout(res, ms));
  async function singleCall(id) {
    await delay(100);
    return `Response ${id}`;
  }
  async function batchCall(ids) {
    await delay(100);
    return ids.map(id => `Response ${id}`);
  }

  const singleResults = [];
  for (let i = 1; i <= 3; i++) {
    singleResults.push(await singleCall(i));
  }

  const batchResults = await batchCall([1, 2, 3]);

  console.log(singleResults);
  console.log(batchResults);
}
simulateBatch();
A
["Response 1", "Response 2", "Response 3"]
["Response 1", "Response 2", "Response 3"]
B
["Response 3"]
["Response 1", "Response 2", "Response 3"]
C
["Response 1"]
["Response 1", "Response 2", "Response 3"]
D
["Response 1", "Response 2", "Response 3"]
["Response 3"]
💡 Hint

Both single and batch calls return arrays of responses for ids 1 to 3.

🧠 Conceptual
advanced
Why batch operations improve API efficiency beyond fewer round trips

Besides reducing the number of network round trips, what is another key reason batch operations improve API efficiency?

AThey automatically retry failed requests without client intervention.
BThey encrypt data more securely than single requests.
CThey reduce server processing overhead by handling multiple requests together.
DThey guarantee faster database queries by caching results.
💡 Hint

Think about how handling many requests at once can save work on the server side.
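One way to picture the server-side saving is a toy cost model in which every HTTP request carries a fixed overhead (connection handling, parsing, auth, routing) on top of the actual work. The millisecond values below are illustrative assumptions, not measurements:

```python
# Toy cost model: fixed per-request overhead vs. per-operation work.
# The numbers are made up for illustration only.
OVERHEAD_MS = 20  # paid once per HTTP request (parsing, auth, routing)
WORK_MS = 5       # paid once per operation regardless of batching

def individual_cost(n_ops):
    """Server time when each operation arrives as its own request."""
    return n_ops * (OVERHEAD_MS + WORK_MS)

def batch_cost(n_ops):
    """Server time when all operations arrive in one batch request."""
    return OVERHEAD_MS + n_ops * WORK_MS

print(individual_cost(10))  # overhead paid 10 times
print(batch_cost(10))       # overhead paid once
```

Batching pays the fixed overhead once instead of once per operation, which is the server-side analogue of saving network round trips.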

🚀 Application
expert
Calculate total network time with and without batching

You have 10 API calls. Each call takes 50ms network latency plus 100ms server processing. If you batch all calls into one request, the network latency is 50ms once and server processing is 100ms times 10 calls. What is the total time difference between batching and sending calls individually?

ABatching takes 500ms, individual calls take 1000ms, so batching saves 500ms.
BBatching takes 550ms, individual calls take 1500ms, so batching saves 950ms.
CBatching takes 1050ms, individual calls take 1050ms, so no time is saved.
DBatching takes 1050ms, individual calls take 1500ms, so batching saves 450ms.
💡 Hint

Calculate total time for individual calls: (latency + processing) * number of calls. For batch: latency once + processing for all calls.
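The hint's formula can be checked with a few lines of Python, using the values given in the question:

```python
# Apply the hint: individual = (latency + processing) * calls,
# batch = latency once + processing for every call.
LATENCY_MS = 50     # network latency per request
PROCESSING_MS = 100  # server processing per call
N_CALLS = 10

individual_total = N_CALLS * (LATENCY_MS + PROCESSING_MS)
batch_total = LATENCY_MS + N_CALLS * PROCESSING_MS
savings = individual_total - batch_total

print(individual_total, batch_total, savings)
```

Note that only the latency is saved: the server still does the same processing work for all ten calls, so batching helps most when per-call latency dominates.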