Quick Reference
| Topic | Link |
|---|---|
| Supported annotations | All annotation types TestDino recognizes |
| Add annotations | How to write annotations in your test code |
| Custom metrics | Track performance and business metrics per test |
| View in TestDino | Where annotations show up in the UI |
| Slack notifications | How testdino:notify-slack triggers alerts |
| Configure Slack mapping | Connect annotation targets to Slack channels or users |
Supported Annotations
Annotations use the standard Playwright annotation array. All types use the testdino: prefix.
| Annotation Type | Example Value | What it does |
|---|---|---|
| testdino:priority | p0, p1, p2, p4 | Tags the test with a priority level |
| testdino:feature | Navbar, Cart, Checkout | Tags the feature area this test covers |
| testdino:link | Jira, Linear, or any URL | Links to a related ticket or document |
| testdino:owner | qa-team, @ashish | Identifies who owns or maintains this test |
| testdino:notify-slack | #e2e-alerts, @ashish | Notifies a Slack channel or user when this test fails |
| testdino:context | Free-text description | Adds context that other testers need to know |
| testdino:flaky-reason | Upload feature depends on file size | Documents a known reason the test is flaky |
| testdino:metric | JSON with name, value, unit | Tracks a custom numeric metric per test run (details) |
testdino:notify-slack triggers Slack notifications when configured. testdino:metric tracks numeric values over time with charts. All other annotation types display in the TestDino UI for reference.
Add Annotations to Tests
Add the annotation array to any Playwright test. Each entry has a type (the annotation name) and a description (the value):
tests/navbar.spec.ts
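A minimal sketch, assuming a recent Playwright version that accepts a details object (with an annotation array) as the second argument to test(); the URL, selectors, and annotation values are placeholders:

```ts
import { test, expect } from '@playwright/test';

test('navbar shows the primary links', {
  annotation: [
    { type: 'testdino:priority', description: 'p1' },
    { type: 'testdino:feature', description: 'Navbar' },
    { type: 'testdino:link', description: 'https://your-tracker.example/NAV-123' },
    { type: 'testdino:owner', description: 'qa-team' },
    { type: 'testdino:notify-slack', description: '#e2e-alerts' },
  ],
}, async ({ page }) => {
  // Placeholder flow: open the app and check the navigation is rendered.
  await page.goto('https://example.com');
  await expect(page.getByRole('navigation')).toBeVisible();
});
```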
Slack Notification Targets
| Format | Example | What happens |
|---|---|---|
| #channel-name | #e2e-alerts | Sends the failure alert to that Slack channel |
| @username | @ashish | Sends the failure alert directly to that Slack user |
| Comma-separated | #e2e-alerts,@ashish | Notifies multiple targets from one entry |
Custom Metrics
The metric annotation type tracks custom numeric values across test runs. Unlike other annotations that store text, metrics store structured data (name, value, unit, optional threshold) and render as time-series charts in TestDino.
Use metrics to track anything you measure during a test: page load time, API latency, memory usage, bundle size, Lighthouse scores, or business numbers like conversion rate.
Metric Format
The testdino:metric annotation uses a JSON string as the description:
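For example, the description for a page load metric might look like this (the name, value, and threshold are illustrative):

```json
{
  "name": "page-load-time",
  "value": 1240,
  "unit": "ms",
  "threshold": 2000
}
```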
| Field | Required | Description |
|---|---|---|
| name | Yes | Identifier for the metric. Use the exact same name across runs to build a trend line. |
| value | Yes | Numeric value recorded in this test run |
| unit | Yes | Display unit shown on the chart and labels |
| threshold | No | Reference line drawn on the chart. Marks a performance budget or target. |
Supported Units
| Unit | Example use |
|---|---|
| ms | Page load time, API latency |
| s | Full test duration, timeout values |
| mb | Memory usage, bundle size |
| gb | Large asset sizes |
| % | Conversion rate, pass rate |
| count | Error count, API calls per test |
| score | Lighthouse score, accessibility score |
Static vs. Runtime Metrics
You can set metric values in two ways depending on your use case. Static values go in the annotation array at test declaration. Use this for values you know ahead of time or compute before the test:
tests/static-metric.spec.ts
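A sketch of a static metric set at declaration time; the bundle-size value is a placeholder you would compute before the test, for example in a build step:

```ts
import { test, expect } from '@playwright/test';

test('homepage bundle stays under budget', {
  annotation: [
    {
      // Static value known before the test runs.
      type: 'testdino:metric',
      description: JSON.stringify({ name: 'bundle-size', value: 4.2, unit: 'mb', threshold: 5 }),
    },
  ],
}, async ({ page }) => {
  await page.goto('https://example.com');
  await expect(page).toHaveTitle(/Example/);
});
```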
Runtime values are pushed during the test with test.info().annotations.push(). Use this for performance timings, API latency, or anything captured at execution time:
tests/performance.spec.ts
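A sketch of a runtime metric; the URL, selector, and threshold are placeholders:

```ts
import { test, expect } from '@playwright/test';

test('homepage loads within budget', async ({ page }) => {
  const start = Date.now();
  await page.goto('https://example.com');
  await expect(page.getByRole('heading', { level: 1 })).toBeVisible();
  const loadTime = Date.now() - start;

  // Push the metric only after the value has been measured.
  test.info().annotations.push({
    type: 'testdino:metric',
    description: JSON.stringify({ name: 'page-load-time', value: loadTime, unit: 'ms', threshold: 2000 }),
  });
});
```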
Example: Track Multiple Metrics at Runtime
A single test can report multiple metrics. Push each one after you capture the value:
tests/checkout.spec.ts
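A sketch with two runtime metrics; the URLs, selectors, metric names, and thresholds are placeholders:

```ts
import { test, expect } from '@playwright/test';

test('checkout completes and stays within budget', async ({ page }) => {
  const flowStart = Date.now();
  await page.goto('https://example.com/checkout');

  // Measure one API call made during the test.
  const apiStart = Date.now();
  const response = await page.request.get('https://example.com/api/cart');
  expect(response.ok()).toBeTruthy();
  const apiLatency = Date.now() - apiStart;

  await page.getByRole('button', { name: 'Place order' }).click();
  await expect(page.getByText('Order confirmed')).toBeVisible();
  const flowTime = Date.now() - flowStart;

  // One push per metric, each after its value is captured.
  test.info().annotations.push({
    type: 'testdino:metric',
    description: JSON.stringify({ name: 'cart-api-latency', value: apiLatency, unit: 'ms' }),
  });
  test.info().annotations.push({
    type: 'testdino:metric',
    description: JSON.stringify({ name: 'checkout-flow-time', value: flowTime, unit: 'ms', threshold: 5000 }),
  });
});
```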
Common Metric Examples
These show the annotation format for different categories (see the sketch after this list). Replace the value with your actual measurement.
- Performance
- Quality
- Resources
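A sketch of one entry per category; the metric names, values, and thresholds are placeholders to adapt to your own measurements:

```ts
// Illustrative testdino:metric entries, one per category.
export const exampleMetrics = [
  // Performance: latency measured during the test
  { type: 'testdino:metric', description: JSON.stringify({ name: 'api-latency', value: 320, unit: 'ms', threshold: 500 }) },
  // Quality: a score collected from an audit step
  { type: 'testdino:metric', description: JSON.stringify({ name: 'accessibility-score', value: 92, unit: 'score' }) },
  // Resources: memory or bundle size
  { type: 'testdino:metric', description: JSON.stringify({ name: 'memory-usage', value: 180, unit: 'mb' }) },
];
```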
How Metrics Display in TestDino
Metric values appear on the test case detail page. TestDino plots a time-series chart for each metric name, with the X-axis showing test run timestamps and the Y-axis showing the metric value. If a threshold is set, a reference line is drawn on the chart.
Filter by metric name to focus on a specific measurement. The chart updates as new test runs report values for that metric.
View Annotations in TestDino
Once your tests run, annotations appear in two places in TestDino.
Test Case Detail
Open any test case from a test run. Below the KPI tiles, the Annotations panel lists every annotation on that test: priority, feature, link, owner, Slack targets, context, and flaky reason. Metric values also appear with their name, value, and unit.
Detailed Analysis Table
In the Test Runs > Summary > Detailed Analysis table, each test row has an Annotations badge. Click it to expand and see annotation chips (priority, feature, owner, Slack targets) inline with the test result. This makes it easy to scan annotations across all tests in a run without opening each one.
Annotation-Based Slack Notifications
When a test with a testdino:notify-slack annotation fails, TestDino sends a Slack alert to the mapped channel or user. This works independently from test run alerts, which notify on every run completion regardless of annotations.
The notification flow:
- Your test has testdino:notify-slack set to @ashish or #e2e-alerts.
- The test fails during a run.
- TestDino looks up the Annotation-Slack mapping in your Slack App configuration.
- If there is a mapping for that target, the alert goes to the configured Slack destination.
Configure Annotation-Slack Mapping
The mapping connects the testdino:notify-slack values you write in your test code to actual Slack channels and users in your workspace.
Connect the Slack App
Go to Project Settings > Integrations > Communication > Slack App and connect your Slack workspace. See Slack App setup if you have not connected yet.
Open the Annotation Alerts tab
In the Slack Notification Configuration dialog, switch to the Annotation Alerts tab. This is where you define which annotation targets map to which Slack destinations.
Add your mappings
For each annotation target in your test code, add a row and pick the Slack channel or user it should notify:
Type in the search box and select from the dropdown. The dropdown lists all channels and users from your connected Slack workspace.
| Annotation Target (from your test code) | Slack Channel / User |
|---|---|
| @ashish | @ashi-deve |
| @vishwas | @Vishwas Tiwari |
| #e2e-alerts | #td-stage |
Things to Know
- Mapping is stored at the integration level, not at the project level.
- Disconnecting Slack removes all mappings. If you disconnect the Slack App, all Annotation-Slack mappings are deleted. You need to set them up again after reconnecting.
- One test can notify multiple targets. Add separate testdino:notify-slack entries for each channel or user you want to alert.
Example: Full Annotation Setup
This test uses all supported annotation types, including a runtime metric:
tests/order.spec.ts
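A sketch of how such a test could be written; the URL, selectors, ticket link, and free-text values are placeholders, while the Slack targets and the order-flow-time metric match the outcomes listed below:

```ts
import { test, expect } from '@playwright/test';

test('order flow completes end to end', {
  annotation: [
    { type: 'testdino:priority', description: 'p0' },
    { type: 'testdino:feature', description: 'Checkout' },
    { type: 'testdino:link', description: 'https://your-tracker.example/ORD-42' },
    { type: 'testdino:owner', description: '@ashish' },
    { type: 'testdino:notify-slack', description: '#ch-td-extra' },
    { type: 'testdino:notify-slack', description: '@ashish' },
    { type: 'testdino:context', description: 'Covers the happy path from cart to order confirmation' },
    { type: 'testdino:flaky-reason', description: 'Payment sandbox responds slowly under load' },
  ],
}, async ({ page }) => {
  const start = Date.now();

  // Placeholder flow: cart to confirmation.
  await page.goto('https://example.com/cart');
  await page.getByRole('button', { name: 'Place order' }).click();
  await expect(page.getByText('Order confirmed')).toBeVisible();

  // Runtime metric recorded once the flow has finished.
  test.info().annotations.push({
    type: 'testdino:metric',
    description: JSON.stringify({ name: 'order-flow-time', value: Date.now() - start, unit: 'ms', threshold: 8000 }),
  });
});
```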
- TestDino shows all annotations in the test case Annotations panel.
- The order-flow-time metric appears on the test detail page and is plotted on a trend chart across runs.
- If the test fails, Slack alerts go to #ch-td-extra and @ashish (if mapped in the Slack App configuration).
- The Detailed Analysis table shows annotation chips for quick scanning across all tests in the run.