Quick Tips
5 min

JMeter Quick Start

Get started with JMeter for performance testing in minutes

Tags: jmeter, performance, load-testing

JMeter Quick Start Guide

1. Basic Test Plan Structure

Every JMeter test plan needs:

  1. Thread Group - Simulates users
  2. Sampler - Defines requests (HTTP, API, etc.)
  3. Listener - Displays results
  4. Assertions - Validate responses

2. Create Your First HTTP Test

Test Plan
└── Thread Group
    ├── HTTP Request Sampler
    ├── Response Assertion
    └── View Results Tree (Listener)

Thread Group Settings:

  • Number of Threads (users): 10
  • Ramp-Up Period: 5 seconds
  • Loop Count: 3
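With these settings, JMeter starts 10 threads over 5 seconds, and each thread runs every sampler 3 times, so one HTTP Request sampler produces threads × loops samples in total. A quick sanity check:

```shell
# Total samples for one sampler = threads * loops.
# Ramp-up only spreads the start times; it does not change the total.
threads=10
loops=3
echo $((threads * loops))    # → 30
```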

HTTP Request Configuration:

Protocol: https
Server Name: api.example.com
Path: /api/v1/products
Method: GET

3. Essential Samplers

HTTP Request:

Add → Sampler → HTTP Request
 
Common configurations:
- GET request: Just set Path
- POST request: Add Body Data in "Body Data" tab
- Headers: Add HTTP Header Manager

JSON Request Example:

Method: POST
Path: /api/v1/users
Body Data:
{
  "username": "testuser",
  "email": "test@example.com"
}
 
Headers (via HTTP Header Manager):
Content-Type: application/json
Authorization: Bearer ${token}
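Before pasting a body into the Body Data tab, it helps to confirm it is valid JSON; a quick local check (assumes python3 is on your PATH):

```shell
# Pipe the request body through Python's JSON parser; any syntax error
# (trailing comma, smart quotes) fails loudly before JMeter ever runs.
echo '{
  "username": "testuser",
  "email": "test@example.com"
}' | python3 -m json.tool
```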

4. Add Assertions

Response Assertion:

Add → Assertions → Response Assertion
 
Settings:
- Field to Test: Response Code
- Pattern Matching Rules: Equals
- Patterns to Test: 200

JSON Assertion:

Add → Assertions → JSON Assertion
 
Assert JSON Path: $.data.status
Expected Value: success

Duration Assertion:

Add → Assertions → Duration Assertion
Duration in milliseconds: 2000
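To see what the JSON Assertion above actually checks, here is the same test expressed outside JMeter (the response body is made up for illustration):

```shell
# Fabricated sample response; the assertion passes when $.data.status
# equals "success".
response='{"data":{"status":"success"}}'
status=$(echo "$response" | python3 -c 'import sys, json; print(json.load(sys.stdin)["data"]["status"])')
[ "$status" = "success" ] && echo "JSON assertion would PASS" || echo "JSON assertion would FAIL"
```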

5. Essential Listeners

Add → Listener → [Choose One]
 
Common Listeners:
1. View Results Tree - Detailed request/response (dev/debug)
2. Summary Report - Aggregate statistics
3. Aggregate Report - Detailed metrics
4. Graph Results - Visual performance graph
5. Response Time Graph - Response times over time

6. Use Variables and Parameters

User Defined Variables:

Add → Config Element → User Defined Variables
 
Name     | Value
---------|------------------
baseUrl  | api.example.com
apiKey   | your-api-key-123

CSV Data Config:

Add → Config Element → CSV Data Set Config
 
Filename: testdata.csv
Variable Names: username,password
Delimiter: ,

testdata.csv:

username,password
user1,pass123
user2,pass456
user3,pass789

Use in Request:

Path: /api/login
Body:
{
  "username": "${username}",
  "password": "${password}"
}
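A small script can generate and sanity-check the CSV before wiring it into the CSV Data Set Config (filename and credentials match the example above):

```shell
# Write the test-data file used by CSV Data Set Config.
cat > testdata.csv <<'EOF'
username,password
user1,pass123
user2,pass456
user3,pass789
EOF

# Sanity checks: every row has exactly 2 comma-separated fields,
# and there are 3 data rows after the header.
awk -F',' 'NF != 2 { bad = 1 } END { exit bad }' testdata.csv && echo "columns OK"
echo "data rows: $(( $(wc -l < testdata.csv) - 1 ))"
```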

7. Extract and Reuse Data

JSON Extractor:

Add → Post Processors → JSON Extractor
 
Names of created variables: token
JSON Path expressions: $.auth.token
Match No.: 1
Default Values: ERROR
 
Use: ${token} in subsequent requests
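The extractor's JSONPath `$.auth.token` walks into the response body and pulls out one value; here is the equivalent lookup run against a made-up login response:

```shell
# Fabricated login response; $.auth.token corresponds to ["auth"]["token"].
response='{"auth":{"token":"abc123","expires":3600}}'
token=$(echo "$response" | python3 -c 'import sys, json; print(json.load(sys.stdin)["auth"]["token"])')
echo "token=${token}"   # later requests would send: Authorization: Bearer <this value>
```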

Regular Expression Extractor:

Add → Post Processors → Regular Expression Extractor
 
Reference Name: userId
Regular Expression: "id":"([^"]+)"
Template: $1$
Match No.: 1
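You can check the regular expression against a sample response snippet with sed before trusting it in the extractor (the response text here is invented):

```shell
# Fabricated response fragment containing an id field.
response='{"user":{"id":"u-42","name":"testuser"}}'
# Same pattern as the extractor: "id":"([^"]+)" — print capture group 1.
echo "$response" | sed -E 's/.*"id":"([^"]+)".*/\1/'
```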

8. Command Line Execution

# Run test in non-GUI mode
jmeter -n -t test-plan.jmx -l results.jtl
 
# With HTML report
jmeter -n -t test-plan.jmx -l results.jtl -e -o ./report
 
# Override properties
jmeter -n -t test-plan.jmx \
  -Jusers=50 \
  -Jrampup=10 \
  -l results.jtl
 
# Use in test plan (e.g. in the Number of Threads field)
${__P(users,10)}  # falls back to 10 if -Jusers is not passed
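For repeatable parameterized runs, a small wrapper keeps the property flags consistent. This is only a sketch: it echoes the assembled command so you can review it, and the file names and defaults are examples, not anything JMeter ships with. Replace the echo with a real invocation once the paths are right.

```shell
#!/bin/sh
# Hypothetical wrapper around non-GUI execution.
USERS="${1:-50}"      # first argument, default 50
RAMPUP="${2:-10}"     # second argument, default 10
PLAN="test-plan.jmx"

CMD="jmeter -n -t $PLAN -Jusers=$USERS -Jrampup=$RAMPUP -l results.jtl -e -o ./report"
echo "$CMD"    # swap for: $CMD   (to actually run the test)
```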

9. Quick Load Test Setup

Thread Group Configuration:
 
Light Load Test:
- Threads: 10-50
- Ramp-up: 30-60s
- Duration: 300s (5 min)
 
Stress Test:
- Threads: 100-500
- Ramp-up: 60-300s
- Duration: 1800s (30 min)
 
Spike Test:
- Threads: 500
- Ramp-up: 10s
- Duration: 300s

Sample Test Plan Template

Test Plan: API Load Test
├── User Defined Variables
│   ├── baseUrl = api.example.com
│   └── apiPath = /api/v1
├── HTTP Header Manager
│   ├── Content-Type: application/json
│   └── Accept: application/json
└── Thread Group (50 users, 10s ramp-up)
    ├── Login Request
    │   ├── POST ${baseUrl}${apiPath}/login
    │   ├── JSON Extractor (token)
    │   └── Response Assertion (200)
    ├── Get Products
    │   ├── GET ${baseUrl}${apiPath}/products
    │   ├── Header: Authorization: Bearer ${token}
    │   ├── Response Assertion (200)
    │   └── JSON Assertion ($.success = true)
    ├── Summary Report
    └── View Results Tree

Quick Reference

Key Metrics to Monitor

  • Response Time: Average, Min, Max, 90th percentile
  • Throughput: Requests per second
  • Error Rate: Percentage of failed requests
  • Latency: Time to first byte

Best Practices

  • Run tests in non-GUI mode for actual load testing
  • Use Listeners only during development (remove for load tests)
  • Start with small thread counts to verify test plan
  • Ramp up gradually to avoid overwhelming the system
  • Save results to .jtl files for analysis
  • Use timers to simulate realistic user behavior
  • Disable View Results Tree in production tests

Key Takeaways

  • Start simple: Thread Group → HTTP Request → Listener
  • Use CSV files for test data parameterization
  • Extract dynamic values with JSON/RegEx extractors
  • Run in non-GUI mode for real performance tests
  • Monitor response times, throughput, and error rates
  • Always ramp up gradually for realistic load patterns
