Annotations let you attach metadata directly to your Playwright tests. You can tag each test with a priority, feature area, owner, related ticket link, a Slack channel or user to notify on failure, and custom metrics like page load time or API latency. TestDino picks up these annotations and displays them in the UI next to each test case.

Quick Reference

| Topic | Link |
| --- | --- |
| Supported annotations | All annotation types TestDino recognizes |
| Add annotations | How to write annotations in your test code |
| Custom metrics | Track performance and business metrics per test |
| View in TestDino | Where annotations show up in the UI |
| Slack notifications | How testdino:notify-slack triggers alerts |
| Configure Slack mapping | Connect annotation targets to Slack channels or users |

Supported Annotations

Annotations use the standard Playwright annotation array. All types use the testdino: prefix.
| Annotation Type | Example Value | What it does |
| --- | --- | --- |
| testdino:priority | p0, p1, p2, p3 | Tags the test with a priority level |
| testdino:feature | Navbar, Cart, Checkout | Tags the feature area this test covers |
| testdino:link | Jira, Linear, or any URL | Links to a related ticket or document |
| testdino:owner | qa-team, @ashish | Identifies who owns or maintains this test |
| testdino:notify-slack | #e2e-alerts, @ashish | Notifies a Slack channel or user when this test fails |
| testdino:context | Free-text description | Adds context that other testers need to know |
| testdino:flaky-reason | Upload feature depends on file size | Documents a known reason the test is flaky |
| testdino:metric | JSON with name, value, unit | Tracks a custom numeric metric per test run (see Custom Metrics) |
testdino:notify-slack triggers Slack notifications when configured. testdino:metric tracks numeric values over time with charts. All other annotation types display in the TestDino UI for reference.

Add Annotations to Tests

Add the annotation array to any Playwright test. Each entry has a type (the annotation name) and a description (the value):
tests/navbar.spec.ts
import { test, expect } from '@playwright/test';

test('Verify navbar', {
  annotation: [
    { type: 'testdino:priority', description: 'p0' },
    { type: 'testdino:feature', description: 'Navbar' },
    { type: 'testdino:link', description: 'https://jira.example.com/NAVBAR-1' },
    { type: 'testdino:owner', description: 'qa-team' },
    { type: 'testdino:notify-slack', description: '@ashish' },
  ],
}, async ({ page }) => {
  await page.goto('/');
  await expect(page.locator('nav')).toBeVisible();
});
You can notify multiple channels and users from a single annotation by separating them with commas:
{ type: 'testdino:notify-slack', description: '#e2e-alerts,#qa-channel,@ashish,@vishwas' }
For better readability, use separate entries:
annotation: [
  { type: 'testdino:notify-slack', description: '#e2e-alerts' },
  { type: 'testdino:notify-slack', description: '#qa-channel' },
  { type: 'testdino:notify-slack', description: '@ashish' },
],

Slack Notification Targets

| Format | Example | What happens |
| --- | --- | --- |
| #channel-name | #e2e-alerts | Sends the failure alert to that Slack channel |
| @username | @ashish | Sends the failure alert directly to that Slack user |
| Comma-separated | #e2e-alerts,@ashish | Notifies multiple targets from one entry |
Separate entries per target are recommended for readability and easier maintenance.
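To keep targets consistent between your test code and the mapping you configure later, one option is to centralize the target strings in a small helper module. This is a sketch, not a TestDino API; the file path and the SLACK_TARGETS and notifySlack names are illustrative:
tests/helpers/slack-targets.ts
// Hypothetical helper: single source of truth for Slack targets used in
// testdino:notify-slack annotations, so specs never drift from the
// Annotation-Slack mapping configured in TestDino.
export const SLACK_TARGETS = {
  e2eAlerts: '#e2e-alerts',
  qaChannel: '#qa-channel',
  ashish: '@ashish',
} as const;

// Builds a notify-slack annotation entry for a given target.
export const notifySlack = (target: string) => ({
  type: 'testdino:notify-slack',
  description: target,
});
In a spec, annotation: [notifySlack(SLACK_TARGETS.e2eAlerts)] then guarantees every test uses the exact string the mapping expects.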

Custom Metrics

The metric annotation type tracks custom numeric values across test runs. Unlike other annotations that store text, metrics store structured data (name, value, unit, optional threshold) and render as time-series charts in TestDino. Use metrics to track anything you measure during a test: page load time, API latency, memory usage, bundle size, Lighthouse scores, or business numbers like conversion rate.

Metric Format

The testdino:metric annotation uses a JSON string as the description:
{
  type: 'testdino:metric',
  description: JSON.stringify({
    name: 'page-load-time',  // Metric name (keep consistent across runs)
    value: 1250,             // Numeric value for this run
    unit: 'ms',              // Display unit
    threshold: 2000,         // Optional: threshold line on the chart
  }),
}
| Field | Required | Description |
| --- | --- | --- |
| name | Yes | Identifier for the metric. Use the exact same name across runs to build a trend line. |
| value | Yes | Numeric value recorded in this test run |
| unit | Yes | Display unit shown on the chart and labels |
| threshold | No | Reference line drawn on the chart. Marks a performance budget or target. |

Supported Units

| Unit | Example use |
| --- | --- |
| ms | Page load time, API latency |
| s | Full test duration, timeout values |
| mb | Memory usage, bundle size |
| gb | Large asset sizes |
| % | Conversion rate, pass rate |
| count | Error count, API calls per test |
| score | Lighthouse score, accessibility score |

Static vs. Runtime Metrics

You can set metric values in two ways depending on your use case. Static values go in the annotation array at test declaration. Use this for values you know ahead of time or compute before the test:
tests/static-metric.spec.ts
import { test, expect } from '@playwright/test';

test('Homepage check', {
  annotation: [
    {
      type: 'testdino:metric',
      description: JSON.stringify({
        name: 'lighthouse-score',
        value: 94,
        unit: 'score',
        threshold: 90,
      }),
    },
  ],
}, async ({ page }) => {
  await page.goto('/');
  await expect(page.locator('h1')).toBeVisible();
});
Runtime values are measured during the test and pushed with test.info().annotations.push(). Use this for performance timings, API latency, or anything captured at execution time:
tests/performance.spec.ts
import { test, expect } from '@playwright/test';

test('Homepage loads within budget', async ({ page }) => {
  const start = Date.now();
  await page.goto('/');
  await expect(page.locator('h1')).toBeVisible();
  const loadTime = Date.now() - start;

  test.info().annotations.push({
    type: 'testdino:metric',
    description: JSON.stringify({
      name: 'page-load-time',
      value: loadTime,
      unit: 'ms',
      threshold: 2000,
    }),
  });
});
Use test.info().annotations.push() for any metric that depends on runtime measurement. The annotation array on the test declaration runs before the test body, so it cannot access runtime values.
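Wall-clock timing with Date.now() includes Playwright overhead. If you would rather record what the browser itself measured, here is a minimal sketch using the standard Navigation Timing API (the metric name below is illustrative):
tests/nav-timing.spec.ts
import { test, expect } from '@playwright/test';

test('Homepage load (browser-reported)', async ({ page }) => {
  await page.goto('/');
  await expect(page.locator('h1')).toBeVisible();

  // Read the navigation entry the browser recorded for this page load.
  const loadTime = await page.evaluate(() => {
    const [nav] = performance.getEntriesByType('navigation') as PerformanceNavigationTiming[];
    return nav.loadEventEnd; // milliseconds since navigation start
  });

  test.info().annotations.push({
    type: 'testdino:metric',
    description: JSON.stringify({
      name: 'page-load-time-browser',
      value: Math.round(loadTime),
      unit: 'ms',
      threshold: 2000,
    }),
  });
});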

Example: Track Multiple Metrics at Runtime

A single test can report multiple metrics. Push each one after you capture the value:
tests/checkout.spec.ts
import { test, expect } from '@playwright/test';

test('Checkout flow performance', async ({ page }) => {
  // Count API requests issued during the test (assumes API routes live under /api/).
  let apiCalls = 0;
  page.on('request', (request) => {
    if (request.url().includes('/api/')) apiCalls += 1;
  });

  const start = Date.now();
  await page.goto('/checkout');
  await expect(page.locator('[data-testid="order-summary"]')).toBeVisible();
  const flowTime = Date.now() - start;

  // Track the checkout flow duration
  test.info().annotations.push({
    type: 'testdino:metric',
    description: JSON.stringify({
      name: 'checkout-flow-time',
      value: flowTime,
      unit: 'ms',
      threshold: 5000,
    }),
  });

  // Track the number of API calls made during the test
  test.info().annotations.push({
    type: 'testdino:metric',
    description: JSON.stringify({
      name: 'api-calls',
      value: apiCalls,
      unit: 'count',
    }),
  });
});
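If many specs report the same duration metric, a hedged alternative is a Playwright auto fixture that times every test and pushes the annotation for you. This is a sketch, not a TestDino feature; the timing fixture name and file path are illustrative:
tests/fixtures.ts
import { test as base, expect } from '@playwright/test';

// Auto fixture: measures each test's wall-clock duration and reports it
// as a testdino:metric annotation when the test body finishes.
export const test = base.extend<{ timing: void }>({
  timing: [
    async ({}, use, testInfo) => {
      const start = Date.now();
      await use();
      testInfo.annotations.push({
        type: 'testdino:metric',
        description: JSON.stringify({
          name: 'test-duration',
          value: Date.now() - start,
          unit: 'ms',
        }),
      });
    },
    { auto: true },
  ],
});

export { expect };
Specs that import test from this file report the metric without any extra code in the test body.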

Common Metric Examples

These show the annotation format for different categories. Replace the value with your actual measurement.
// Page load time
{ type: 'testdino:metric', description: JSON.stringify({ name: 'page-load-time', value: loadTime, unit: 'ms', threshold: 2000 }) }

// API latency
{ type: 'testdino:metric', description: JSON.stringify({ name: 'api-latency', value: latency, unit: 'ms', threshold: 200 }) }

// Memory usage
{ type: 'testdino:metric', description: JSON.stringify({ name: 'memory-usage', value: memoryMb, unit: 'mb' }) }
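A hedged sketch of how those placeholder values might be captured at runtime. The /api/products endpoint is illustrative, the latency is rough wall-clock time rather than pure server latency, and the heap read relies on performance.memory, a non-standard Chromium-only API:
tests/capture-metrics.spec.ts
import { test, expect } from '@playwright/test';

test('Capture latency and memory', async ({ page }) => {
  // Rough API latency: wall-clock time until a specific response arrives.
  const start = Date.now();
  const responsePromise = page.waitForResponse('**/api/products'); // illustrative endpoint
  await page.goto('/products');
  await responsePromise;
  const latency = Date.now() - start;

  // JS heap usage in MB; performance.memory is Chromium-only and non-standard.
  const memoryMb = await page.evaluate(() => {
    const mem = (performance as any).memory;
    return mem ? mem.usedJSHeapSize / (1024 * 1024) : 0;
  });

  test.info().annotations.push({
    type: 'testdino:metric',
    description: JSON.stringify({ name: 'api-latency', value: latency, unit: 'ms', threshold: 200 }),
  });
  test.info().annotations.push({
    type: 'testdino:metric',
    description: JSON.stringify({ name: 'memory-usage', value: Math.round(memoryMb), unit: 'mb' }),
  });
});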

How Metrics Display in TestDino

Metric values appear on the test case detail page. TestDino plots a time-series chart for each metric name, with the X-axis showing test run timestamps and the Y-axis showing the metric value. If a threshold is set, a reference line is drawn on the chart. Filter by metric name to focus on a specific measurement. The chart updates as new test runs report values for that metric.
Keep metric names consistent across runs. Use the exact same name string every time (for example, always page-load-time, not sometimes pageLoadTime). This ensures all data points appear on the same trend line.
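One hedged way to enforce consistent names is a shared constants object plus a small push helper. Neither METRICS nor pushMetric is a TestDino API; both are illustrative:
tests/helpers/metrics.ts
import type { TestInfo } from '@playwright/test';

// Single source of truth for metric names, so every spec uses the exact
// same string and all data points land on one trend line.
export const METRICS = {
  pageLoadTime: 'page-load-time',
  apiLatency: 'api-latency',
} as const;

// Pushes a testdino:metric annotation onto the current test.
// JSON.stringify drops threshold when it is undefined.
export function pushMetric(
  testInfo: TestInfo,
  name: string,
  value: number,
  unit: string,
  threshold?: number,
): void {
  testInfo.annotations.push({
    type: 'testdino:metric',
    description: JSON.stringify({ name, value, unit, threshold }),
  });
}
Inside a test: pushMetric(test.info(), METRICS.pageLoadTime, loadTime, 'ms', 2000).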

View Annotations in TestDino

Once your tests run, annotations appear in two places in TestDino.

Test Case Detail

Open any test case from a test run. Below the KPI tiles, the Annotations panel lists every annotation on that test: priority, feature, link, owner, Slack targets, context, and flaky reason. Metric values also appear with their name, value, and unit.

Detailed Analysis Table

In the Test Runs > Summary > Detailed Analysis table, each test row has an Annotations badge. Click it to expand and see annotation chips (priority, feature, owner, Slack targets) inline with the test result. This makes it easy to scan annotations across all tests in a run without opening each one.

Annotation-Based Slack Notifications

When a test with a testdino:notify-slack annotation fails, TestDino sends a Slack alert to the mapped channel or user. This works independently of test run alerts, which notify on every run completion regardless of annotations. The notification flow:
  1. Your test has testdino:notify-slack set to @ashish or #e2e-alerts.
  2. The test fails during a run.
  3. TestDino looks up the Annotation-Slack mapping in your Slack App configuration.
  4. If there is a mapping for that target, the alert goes to the configured Slack destination.
Annotation-based Slack notifications require the Slack App to be connected to your project. The Slack Webhook integration does not support annotation-based alerts.

Configure Annotation-Slack Mapping

The mapping connects the testdino:notify-slack values you write in your test code to actual Slack channels and users in your workspace.

1. Connect the Slack App

Go to Project Settings > Integrations > Communication > Slack App and connect your Slack workspace. See Slack App setup if you have not connected yet.

2. Open the Annotation Alerts tab

In the Slack Notification Configuration dialog, switch to the Annotation Alerts tab. This is where you define which annotation targets map to which Slack destinations.

3. Add your mappings

For each annotation target in your test code, add a row and pick the Slack channel or user it should notify:
| Annotation Target (from your test code) | Slack Channel / User |
| --- | --- |
| @ashish | @ashi-deve |
| @vishwas | @Vishwas Tiwari |
| #e2e-alerts | #td-stage |
Type in the search box and select from the dropdown. The dropdown lists all channels and users from your connected Slack workspace.

4. Save the configuration

Click Save. From now on, when a test with a matching testdino:notify-slack annotation fails, the alert is sent to the mapped Slack destination.

Things to Know

  • Mapping is stored at the integration level, not at the project level.
  • Disconnecting Slack removes all mappings. If you disconnect the Slack App, every Annotation-Slack mapping is deleted, and you must set them up again after reconnecting.
  • One test can notify multiple targets. Add separate testdino:notify-slack entries for each channel or user you want to alert.

Example: Full Annotation Setup

This test uses all supported annotation types, including a runtime metric:
tests/order.spec.ts
import { test, expect } from '@playwright/test';

test('New user can place and cancel order', {
  annotation: [
    { type: 'testdino:priority', description: 'p0' },
    { type: 'testdino:feature', description: 'Registration to Order' },
    { type: 'testdino:link', description: 'https://jira.example.com/PROJ-123' },
    { type: 'testdino:owner', description: 'qa-team' },
    { type: 'testdino:notify-slack', description: '#ch-td-extra' },
    { type: 'testdino:notify-slack', description: '@ashish' },
    { type: 'testdino:context', description: 'Uses fixture user: [email protected]' },
    { type: 'testdino:flaky-reason', description: 'Delay in API call response' },
  ],
}, async ({ page }) => {
  const start = Date.now();

  // Test steps: navigate, place order, cancel order
  await page.goto('/orders/new');
  await expect(page.locator('[data-testid="order-confirmed"]')).toBeVisible();

  const flowTime = Date.now() - start;

  test.info().annotations.push({
    type: 'testdino:metric',
    description: JSON.stringify({
      name: 'order-flow-time',
      value: flowTime,
      unit: 'ms',
      threshold: 5000,
    }),
  });
});
When this test runs:
  • TestDino shows all annotations in the test case Annotations panel.
  • The order-flow-time metric appears on the test detail page and is plotted on a trend chart across runs.
  • If the test fails, Slack alerts go to #ch-td-extra and @ashish (if mapped in the Slack App configuration).
  • The Detailed Analysis table shows annotation chips for quick scanning across all tests in the run.