
Commit 3d5835b

Add longevity results
1 parent fff65e7 commit 3d5835b


6 files changed: +166 -0 lines changed

Lines changed: 83 additions & 0 deletions
@@ -0,0 +1,83 @@
# Results

## Test environment

NGINX Plus: false

NGINX Gateway Fabric:

- Commit: 89aee48bf6e660a828ffd32ca35fc7f52e358e00
- Date: 2025-12-12T20:04:38Z
- Dirty: false

GKE Cluster:

- Node count: 3
- k8s version: v1.33.5-gke.1308000
- vCPUs per node: 2
- RAM per node: 4015672Ki
- Max pods per node: 110
- Zone: us-west2-a
- Instance Type: e2-medium

## Summary

- Still a large number of non-2xx or 3xx responses, many more than last time. Socket errors are almost all read errors, with no write errors and fewer timeouts than last time.
- NGINX memory usage increases continually over time, which could indicate a memory leak (see the query sketch after this list). Will bring this up with the Agent team.
- CPU usage remained consistent with past results.
- One error contacting the TokenReview API, but it may be a one-off.

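The memory trend flagged above is visible in the Containers memory graph below; for reference, a minimal sketch of a Prometheus query for tracking it, assuming standard cAdvisor/kubelet metrics are scraped and the data-plane container is named `nginx` (both are assumptions about this cluster's monitoring setup):

```text
# Hypothetical query; assumes cAdvisor metrics and a container named "nginx".
container_memory_working_set_bytes{container="nginx"}
```
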
## Traffic

HTTP:

```text
Running 5760m test @ http://cafe.example.com/coffee
  2 threads and 100 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   190.35ms  141.74ms   2.00s    83.52%
    Req/Sec   289.84    187.59     3.52k    63.68%
  195509968 requests in 5760.00m, 66.75GB read
  Socket errors: connect 0, read 315485, write 0, timeout 6584
  Non-2xx or 3xx responses: 1763516
Requests/sec:    565.71
Transfer/sec:    202.53KB
```
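
Both traffic reports in this section are wrk output; the reported Requests/sec is simply total requests over the 4-day duration (195509968 / 345600 s ≈ 565.7). A command line consistent with the reported parameters is sketched below; the exact flags used by the longevity harness are an assumption:

```text
# Hypothetical invocation: 2 threads, 100 connections, 5760-minute duration.
wrk -t2 -c100 -d5760m http://cafe.example.com/coffee
```
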
HTTPS:

```text
Running 5760m test @ https://cafe.example.com/tea
  2 threads and 100 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   180.03ms  106.92ms   1.94s    67.25%
    Req/Sec   287.34    184.95     1.73k    63.36%
  193842103 requests in 5760.00m, 65.22GB read
  Socket errors: connect 0, read 309621, write 0, timeout 1
Requests/sec:    560.89
Transfer/sec:    197.88KB
```

## Key Metrics

### Containers memory

![oss-memory.png](oss-memory.png)

### Containers CPU

![oss-cpu.png](oss-cpu.png)

## Error Logs

### nginx-gateway

error=rpc error: code = Internal desc = error creating TokenReview: context canceled
level=error
logger=agentGRPCServer
msg=error validating connection
ts=2025-12-16T17:35:17Z
stacktrace=github.com/nginx/nginx-gateway-fabric/v2/internal/controller/nginx/agent/grpc/interceptor.(*ContextSetter).Stream.ContextSetter.Stream.func1
    /opt/actions-runner/_work/nginx-gateway-fabric/nginx-gateway-fabric/internal/controller/nginx/agent/grpc/interceptor/interceptor.go:62
google.golang.org/grpc.(*Server).processStreamingRPC
    /opt/actions-runner/_work/nginx-gateway-fabric/nginx-gateway-fabric/.gocache/google.golang.org/[email protected]/server.go:1721
google.golang.org/grpc.(*Server).handleStream
    /opt/actions-runner/_work/nginx-gateway-fabric/nginx-gateway-fabric/.gocache/google.golang.org/[email protected]/server.go:1836
google.golang.org/grpc.(*Server).serveStreams.func2.1
    /opt/actions-runner/_work/nginx-gateway-fabric/nginx-gateway-fabric/.gocache/google.golang.org/[email protected]/server.go:1063

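The error above means the control plane's TokenReview call to the Kubernetes authentication API was interrupted (context canceled) while validating the agent's connection. The same API can be exercised by hand to rule out cluster-side auth problems; a sketch, with the token value as a placeholder:

```text
# Hypothetical manual TokenReview check (the token value is a placeholder):
kubectl create -o yaml -f - <<EOF
apiVersion: authentication.k8s.io/v1
kind: TokenReview
spec:
  token: <agent-service-account-token>
EOF
```

A successful call prints the TokenReview back with `status.authenticated` populated.
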
### nginx
Lines changed: 83 additions & 0 deletions
@@ -0,0 +1,83 @@
# Results

## Test environment

NGINX Plus: false

NGINX Gateway Fabric:

- Commit: 89aee48bf6e660a828ffd32ca35fc7f52e358e00
- Date: 2025-12-12T20:04:38Z
- Dirty: false

GKE Cluster:

- Node count: 3
- k8s version: v1.33.5-gke.1308000
- vCPUs per node: 2
- RAM per node: 4015672Ki
- Max pods per node: 110
- Zone: us-west2-a
- Instance Type: e2-medium

## Summary

- Traffic results are consistent with 2.2.
- NGINX memory usage increases continually over time, which could indicate a memory leak. Will bring this up with the Agent team.
- CPU usage remained consistent with past results.
- Still seeing some "no live upstreams" errors (see the note after the error logs below).

## Traffic

HTTP:

```text
Running 5760m test @ http://cafe.example.com/coffee
  2 threads and 100 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   184.82ms  102.91ms   1.45s    65.52%
    Req/Sec   284.19    179.74     1.52k    63.62%
  192198367 requests in 5760.00m, 65.91GB read
  Socket errors: connect 0, read 0, write 0, timeout 108
  Non-2xx or 3xx responses: 5
Requests/sec:    556.13
Transfer/sec:    199.96KB
```

HTTPS:

```text
Running 5760m test @ https://cafe.example.com/tea
  2 threads and 100 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   185.02ms  102.92ms   1.50s    65.52%
    Req/Sec   283.70    179.19     1.43k    63.75%
  191866398 requests in 5760.00m, 64.73GB read
  Socket errors: connect 0, read 0, write 0, timeout 114
  Non-2xx or 3xx responses: 6
Requests/sec:    555.17
Transfer/sec:    196.40KB
```

## Key Metrics

### Containers memory

![oss-memory.png](oss-memory.png)

### Containers CPU

![oss-cpu.png](oss-cpu.png)

## Error Logs

### nginx-gateway

### nginx

10.168.0.90 - - [16/Dec/2025:15:47:08 +0000] "GET /tea HTTP/1.1" 502 150 "-" "-"
2025/12/16 15:47:08 [error] 26#26: *361983622 no live upstreams while connecting to upstream, client: 10.168.0.90, server: cafe.example.com, request: "GET /tea HTTP/1.1", upstream: "http://longevity_tea_80/tea", host: "cafe.example.com"
10.168.0.90 - - [16/Dec/2025:12:49:07 +0000] "GET /coffee HTTP/1.1" 502 150 "-" "-"
2025/12/16 12:49:07 [error] 25#25: *350621339 no live upstreams while connecting to upstream, client: 10.168.0.90, server: cafe.example.com, request: "GET /coffee HTTP/1.1", upstream: "http://longevity_coffee_80/coffee", host: "cafe.example.com"
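
As noted in the summary, "no live upstreams" means every server in the named upstream group was marked unavailable at that instant, so nginx returned 502 without attempting a connection. For illustration only, these are the per-server parameters that govern this in plain nginx config; addresses and values here are placeholders, not NGF's actual generated config:

```text
# Illustrative sketch, not NGF's generated config.
upstream longevity_tea_80 {
    # After max_fails failed attempts, a server is skipped for fail_timeout;
    # if every server is in that state, nginx logs "no live upstreams".
    server 10.0.0.11:8080 max_fails=1 fail_timeout=10s;
    server 10.0.0.12:8080 max_fails=1 fail_timeout=10s;
}
```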