Testing Fundamentals (~15 mins)

Why performance testing prevents bottlenecks: automation benefits in action

Verify system performance under load to prevent bottlenecks
Preconditions (3)
Step 1: Start the performance test with 50 concurrent users
Step 2: Gradually increase the number of users to 200 over 10 minutes
Step 3: Monitor response times and error rates during the test
Step 4: Identify any response time spikes or errors indicating bottlenecks
✅ Expected Result: The system maintains response times under 2 seconds and error rate below 1% throughout the test, indicating no bottlenecks
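Steps 1 and 2 describe a linear ramp from 50 to 200 users over 10 minutes. As a sketch, the profile can be written as a plain function so the math is easy to check; in Locust itself this logic would typically live in a custom LoadTestShape subclass whose tick() returns a (user_count, spawn_rate) tuple.

```python
# Sketch of the ramp profile from steps 1-2: 50 users growing linearly
# to 200 over 10 minutes. The constants mirror the test steps above.

START_USERS = 50
PEAK_USERS = 200
RAMP_SECONDS = 600  # 10 minutes

def target_users(elapsed_seconds: float) -> int:
    """Return how many concurrent users should be running at a given time."""
    if elapsed_seconds >= RAMP_SECONDS:
        return PEAK_USERS  # hold at peak after the ramp completes
    fraction = elapsed_seconds / RAMP_SECONDS
    return round(START_USERS + fraction * (PEAK_USERS - START_USERS))

if __name__ == "__main__":
    for t in (0, 300, 600):
        print(t, target_users(t))
```

Halfway through the ramp (300 s) this yields 125 users, and anything past 600 s holds at the 200-user peak.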
Automation Requirements - Locust
Assertions Needed:
Average response time is less than 2 seconds
Error rate is less than 1%
Best Practices:
Use gradual ramp-up of users to simulate real load
Collect and assert metrics programmatically
Keep test scripts modular and reusable
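One way to "collect and assert metrics programmatically" is to run Locust headless with the --csv option and check the aggregated stats file afterwards. The sketch below assumes the column names produced by recent Locust releases (they may differ in yours), and inlines a sample of the "<prefix>_stats.csv" contents so it runs standalone; the numbers are illustrative only.

```python
import csv
import io

# Sample of the stats CSV Locust writes with --csv. Column names are an
# assumption based on recent Locust releases; the values are made up.
SAMPLE_STATS_CSV = """Type,Name,Request Count,Failure Count,Average Response Time
GET,/,12000,60,850.0
,Aggregated,12000,60,850.0
"""

def check_thresholds(stats_csv: str, max_avg_ms: float = 2000, max_error_pct: float = 1.0):
    """Find the Aggregated row and assert the test case's two thresholds."""
    for row in csv.DictReader(io.StringIO(stats_csv)):
        if row["Name"] != "Aggregated":
            continue
        avg_ms = float(row["Average Response Time"])
        error_pct = 100 * int(row["Failure Count"]) / int(row["Request Count"])
        assert avg_ms < max_avg_ms, f"Average response time too high: {avg_ms} ms"
        assert error_pct < max_error_pct, f"Error rate too high: {error_pct}%"
        return avg_ms, error_pct
    raise ValueError("No Aggregated row found in stats CSV")
```

In a CI pipeline, this check would run after the Locust process exits and fail the build if either threshold is breached.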
Automated Solution
from locust import HttpUser, task, between, events

class WebsiteUser(HttpUser):
    wait_time = between(1, 2)  # pause 1-2 seconds between tasks, like a real user

    @task
    def load_main_page(self):
        # catch_response=True lets us decide success/failure ourselves
        with self.client.get("/", catch_response=True) as response:
            if response.status_code != 200:
                response.failure(f"Failed with status {response.status_code}")

@events.test_stop.add_listener
def on_test_stop(environment, **kwargs):
    stats = environment.runner.stats
    avg_response_time = stats.total.avg_response_time  # in milliseconds
    error_rate = stats.total.fail_ratio * 100          # fail_ratio is a 0-1 fraction
    assert avg_response_time < 2000, f"Average response time too high: {avg_response_time:.0f} ms"
    assert error_rate < 1, f"Error rate too high: {error_rate:.2f}%"

This script uses Locust to simulate users visiting the main page.

The WebsiteUser class defines user behavior with a wait time between requests.

The load_main_page task sends a GET request to the home page and marks failure if status is not 200.

At test stop, the on_test_stop listener checks average response time and error rate.

Assertions ensure the system meets performance criteria, preventing bottlenecks.

Common Mistakes - 3 Pitfalls
Starting all users at once without ramp-up
Not checking response status codes
Hardcoding thresholds without measuring baseline
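The third pitfall above deserves a concrete remedy: derive thresholds from a measured baseline rather than picking a magic number. A minimal sketch, assuming you have response-time samples from a baseline run (the values below are invented for illustration):

```python
import statistics

# Response times (ms) from a baseline run at normal traffic.
# These sample values are made up for illustration.
baseline_ms = [120, 135, 140, 150, 155, 160, 170, 180, 210, 250,
               260, 280, 300, 320, 350, 400, 450, 500, 600, 700]

# 95th percentile of the baseline: quantiles(n=20) returns 19 cut
# points, and index 18 is the 95th percentile.
p95 = statistics.quantiles(baseline_ms, n=20)[18]

# Give the threshold some headroom (here 1.5x the baseline p95)
# instead of hardcoding a value like 2000 ms.
threshold_ms = 1.5 * p95
print(f"baseline p95 = {p95:.0f} ms, threshold = {threshold_ms:.1f} ms")
```

The headroom factor is itself a judgment call; the point is that the threshold is traceable to a measurement, so it can be re-derived when the baseline changes.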
Bonus Challenge

Now add data-driven testing with 3 different URLs to simulate varied user behavior
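One possible shape for the data-driven part, as a hint rather than a full answer: drive the task from a list of URLs so successive requests hit different pages. In Locust, the iterator below would feed self.client.get(next(self.urls)) inside a @task method; the three paths are placeholders.

```python
from itertools import cycle

# Placeholder URLs for data-driven testing; replace with real paths.
URLS = ["/", "/products", "/checkout"]
url_iter = cycle(URLS)

def next_url() -> str:
    """Return the next URL in round-robin order, simulating varied behavior."""
    return next(url_iter)
```

Round-robin keeps the traffic mix predictable; random.choice(URLS) is an alternative if you want a less regular pattern.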
