
Performance Metrics

GET/v1/analytics/performance

Returns API performance metrics from your perspective as a consumer — average response time, latency percentiles (p50, p95, p99), error rate, and performance trends over the selected period. Implements the SRE Golden Signals "Latency" metric and the RED Method "Duration" dimension. Percentile-based analysis reveals tail latency issues hidden by averages.

What It Does

Calculates performance statistics across all API requests for the authenticated organization in the selected period. Computes: average response time in milliseconds, p50 (median), p95, and p99 latency percentiles, overall error rate percentage, and performance timeseries for trend analysis. Period options: 7d, 30d, or current (default). Latency is measured from request receipt to response delivery at the Cloudflare Workers edge.

Why It's Useful

The Google SRE Book identifies latency as one of the four Golden Signals for monitoring. Averages alone hide tail latency that affects your worst-experience users — a 200ms average with a 2,000ms p99 means 1% of requests are 10x slower. Monitoring percentiles (especially p95 and p99) is essential for SLA compliance, integration debugging, and understanding the real user experience of your API-dependent applications.
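The gap between the average and the tail can be made concrete with a small sketch. The latencies below are synthetic, and the nearest-rank percentile here is a common textbook definition, not necessarily the exact method this API uses:

```python
import math
import statistics

# Synthetic latencies (ms): 97 fast requests and a slow 3% tail.
latencies = [120] * 97 + [2000] * 3

def percentile(samples, p):
    """Nearest-rank percentile: the smallest value covering p% of samples."""
    ordered = sorted(samples)
    rank = math.ceil(len(ordered) * p / 100)
    return ordered[rank - 1]

avg = statistics.mean(latencies)   # 176.4 — looks healthy
p50 = percentile(latencies, 50)    # 120  — typical request
p99 = percentile(latencies, 99)    # 2000 — the tail the average hides
```

The average (176.4 ms) suggests nothing is wrong, while p99 exposes the 2,000 ms tail — exactly why percentile monitoring matters.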

Use Cases

SRE / Platform Engineer

SLA Compliance Monitoring

Track p95 and p99 latency against your internal SLA targets (e.g., p95 < 500ms). Set up alerts when percentiles exceed thresholds. Use the performance timeseries to correlate latency spikes with deployment events or traffic changes.

Ensure API integration meets internal SLA commitments with percentile-based monitoring that catches tail latency issues.
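A threshold check along these lines could be sketched as follows; the SLA targets and the `sla_breaches` helper are illustrative, not part of the API:

```python
# Illustrative internal SLA targets (ms for latency, percent for errors).
SLA_TARGETS = {"p95": 500, "p99": 1500, "error_rate": 1.0}

def sla_breaches(metrics, targets=SLA_TARGETS):
    """Return each field whose current value exceeds its SLA target."""
    return {
        field: metrics[field]
        for field, limit in targets.items()
        if metrics.get(field, 0) > limit
    }

# Hypothetical payload using the documented field names:
metrics = {"avg_response_time": 210, "p50": 150, "p95": 620,
           "p99": 1400, "error_rate": 0.4}
print(sla_breaches(metrics))  # {'p95': 620} — p95 exceeds the 500 ms target
```

Feeding the endpoint's response into a check like this on a schedule is one way to wire up the alerting described above.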

Developer

Integration Performance Debugging

Investigate slow API responses reported by users or application monitoring. Compare p50 (typical experience) against p99 (worst case) to determine if latency is consistent or spiky. Cross-reference with the by-endpoint analytics to identify which specific endpoints are slow.

Quickly pinpoint whether latency issues are systemic or endpoint-specific, reducing mean time to resolution.
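One rough way to distinguish consistent from spiky latency is the p99/p50 ratio. The `latency_profile` helper and its 5x threshold below are a hypothetical heuristic, not an API-defined rule:

```python
def latency_profile(p50, p99, spike_ratio=5.0):
    """Classify latency from the p99/p50 ratio.

    A p99 many multiples of the median suggests spiky tail latency;
    a small ratio suggests uniformly slow (or uniformly fast) responses.
    """
    return "spiky" if p99 / p50 > spike_ratio else "consistent"

print(latency_profile(150, 1800))  # spiky — tail is 12x the median
print(latency_profile(150, 400))   # consistent — tail is under 3x
```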

Solutions Architect / Engineering Manager

Performance Trend Analysis

Review 30-day performance trends to assess whether API latency is stable, improving, or degrading. Use trend data to make informed decisions about architecture changes like adding caching layers, implementing request batching, or choosing different endpoints.

Data-driven architecture decisions based on real latency trends rather than point-in-time measurements.
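A trend can be reduced to a single number with a least-squares slope over the `p95_ms` values in the `timeline` field. The `p95_trend` helper below is a sketch, and the sample values are illustrative:

```python
def p95_trend(timeline):
    """Least-squares slope of p95_ms per bucket (needs >= 2 points).

    A positive slope suggests degrading latency; near zero, stable.
    """
    ys = [point["p95_ms"] for point in timeline]
    n = len(ys)
    mean_x = (n - 1) / 2
    mean_y = sum(ys) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(ys))
    den = sum((x - mean_x) ** 2 for x in range(n))
    return num / den

# Illustrative timeline buckets showing steadily rising p95:
timeline = [{"p95_ms": v} for v in (400, 420, 450, 480, 500)]
print(p95_trend(timeline))  # 26.0 ms per bucket — latency is degrading
```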

Parameters

Name | Type | Required | Description
period | string | Optional | Time range: 7d (last 7 days), 30d (last 30 days), or current (current billing period). Default: current. Example: 7d

Response Fields

Field | Type | Description
avg_response_time | number | Average response time in milliseconds
p50 | number | 50th percentile (median) latency in ms
p95 | number | 95th percentile latency in ms; 1 in 20 requests is slower than this
p99 | number | 99th percentile latency in ms; the worst 1% of requests
error_rate | number | Error rate as a percentage (4xx + 5xx responses)
timeline | array | Performance metrics over time: [{timestamp, avg_ms, p95_ms, error_rate}, ...]
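For orientation, a payload might look like the following. The field names match the table above, but all values and the single timeline bucket are illustrative:

```python
# Hypothetical response body with the documented field names.
payload = {
    "avg_response_time": 182.4,
    "p50": 140,
    "p95": 510,
    "p99": 1230,
    "error_rate": 0.8,
    "timeline": [
        {"timestamp": "2024-01-01T00:00:00Z",
         "avg_ms": 180, "p95_ms": 500, "error_rate": 0.7},
    ],
}

# Pull out the headline percentile and the most recent timeline bucket.
p95 = payload["p95"]
latest = payload["timeline"][-1]
print(p95, latest["p95_ms"])
```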

Code Examples

cURL
curl "https://api.edgedns.dev/v1/analytics/performance" \
  -H "Authorization: Bearer YOUR_API_KEY"
JavaScript
const response = await fetch(
  'https://api.edgedns.dev/v1/analytics/performance',
  {
    headers: {
      'Authorization': 'Bearer YOUR_API_KEY'
    }
  }
);

const data = await response.json();
console.log(data);
Python
import requests

response = requests.get(
    'https://api.edgedns.dev/v1/analytics/performance',
    headers={'Authorization': 'Bearer YOUR_API_KEY'},
    params={'period': '7d'}  # optional; defaults to the current billing period
)

data = response.json()
print(data)

