
[OV][ITT] Enhance ITT MACROS to accept metadata for ID propagation#33311

Closed
tovinkere wants to merge 0 commits into openvinotoolkit:master from tovinkere:itt_markers_extn_core

Conversation

@tovinkere
Contributor

@tovinkere tovinkere commented Dec 18, 2025

# Feature enhancement - Part 1

This PR is the first in a series of PRs to standardize the ITT markers in OpenVINO so that they can be enabled by default through host-side instrumentation.

  1. This first PR addresses the enhancements required in ITT and the framework to support the creation and propagation of IDs when asynchronous execution is in play.
  2. The second PR will standardize ITT markers in the CPU and enhance support to include asynchronous execution.
  3. The third PR will enable default markers for GPU plugin to allow visibility into inference pass begin/end and operator preparation and submission within each inference.
  4. The final PR will extend the same host-side markers to NPU execution, capturing the inference span and pipeline activity.

Summary of the current PR (PR#1)

  • Enhances the base macros to also accept metadata, as a key/value pair, that will be associated with a region or task
  • Updates core markers, such as Read Model and Compile Model, to follow a standard convention and belong to the namespace ov::phases, which is also the ITT domain
  • Adds an early exit to domain and string-handle creation when no collector is initialized and actively receiving data
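To make the first bullet concrete, here is a minimal sketch of what a macro that attaches a metadata pair to a region could look like. The names `MetaPair`, `ScopedRegion`, and `OV_SCOPED_REGION_WITH_META` are illustrative assumptions, not the actual OpenVINO ITT API.

```cpp
#include <cstdint>
#include <string>
#include <utility>
#include <vector>

// Hypothetical: a (key, value) metadata pair attached to a region or task.
using MetaPair = std::pair<std::string, uint64_t>;

struct ScopedRegion {
    std::string name;
    std::vector<MetaPair> metadata;

    ScopedRegion(std::string n, MetaPair meta) : name(std::move(n)) {
        // Associate the metadata with the region at begin time, so a
        // collector can correlate stages that carry the same pair.
        metadata.push_back(std::move(meta));
    }
};

// A base macro enhanced to also accept metadata, mirroring the PR summary.
#define OV_SCOPED_REGION_WITH_META(name, key, value) \
    ScopedRegion itt_region_((name), MetaPair{(key), (value)})
```

In the real implementation the metadata would be forwarded to the ITT collector rather than stored locally; the sketch only shows the macro shape.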

Details:

For some plugins that use pipeline execution during asynchronous evaluation, an inference is initiated on one thread, the pipeline execution begins on another thread, and completes on yet another. To capture the full inference time, including host-side timestamps, the stages of execution that belong to the same inference pass MUST:

  • Share the same inference ID, which is provided by a thread-safe global counter
  • Have that ID associated with the region or task of interest
  • Have the ID propagated by the ITT subsystem; ScopedRegion() and ScopedTask() have been enhanced to support this
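The thread-safe global counter from the first requirement can be sketched as follows; `next_inference_id` is an assumed name, not the function used in this PR.

```cpp
#include <atomic>
#include <cstdint>

// Sketch: each inference pass draws a unique ID exactly once, and every
// stage (submission, pipeline begin, completion) reuses that same ID even
// when the stages run on different threads.
inline uint64_t next_inference_id() {
    static std::atomic<uint64_t> counter{0};
    // fetch_add is atomic, so concurrent callers never observe the same ID.
    return counter.fetch_add(1, std::memory_order_relaxed) + 1;  // IDs start at 1
}
```

Relaxed ordering suffices here because only the uniqueness of the returned value matters, not any ordering relative to other memory operations.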

@aobolensk Please review as this is an extension of your work with regions.

```cpp
namespace domains {
OV_ITT_DOMAIN(OV, "ov");
OV_ITT_DOMAIN(ReadTime, "ov::ReadTime");
OV_ITT_DOMAIN(LoadTime, "ov::LoadTime");
```

@aobolensk
It looks like the change of the region names is an API break; since this is a public interface, the old names should be deprecated first.
Could the old domains be restored and both be used?

@praasz praasz requested a review from aobolensk January 14, 2026 06:33
@praasz praasz added this to the 2026.0 milestone Jan 14, 2026
```cpp
#include "openvino/runtime/tensor.hpp"
#include "openvino/runtime/threading/itask_executor.hpp"

#if defined(ENABLE_PROFILING_ITT_FULL) || defined(ENABLE_PROFILING_ITT_BASE)
```
As a future improvement, this could be hidden inside the current macro or an additional ITT utility macro.
Such a macro could inject the needed members into a class or struct, and macros like OV_ITT_SCOPED_REGION_BASE could then assume those members exist. This would help when the same class instrumentation is required elsewhere: the code would not be repeated, and it would stay hidden from the main class functionality.
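A minimal sketch of the reviewer's idea: a utility macro that injects the ITT bookkeeping members into any class, so that scoped-region macros can assume they are present. `OV_ITT_CLASS_MEMBERS` and the member names are hypothetical, not existing OpenVINO macros.

```cpp
#include <cstdint>
#include <string>
#include <utility>

// Hypothetical utility macro: injects ITT state into a class so region
// macros (e.g. OV_ITT_SCOPED_REGION_BASE) could assume these members exist.
#define OV_ITT_CLASS_MEMBERS()       \
    uint64_t m_itt_inference_id = 0; \
    std::string m_itt_region_name

class InferRequestSketch {
public:
    OV_ITT_CLASS_MEMBERS();  // members injected once, not repeated per class

    void start(uint64_t id, std::string name) {
        m_itt_inference_id = id;  // a region macro could read these fields
        m_itt_region_name = std::move(name);
    }
};
```

Each instrumented class would add the one macro line instead of duplicating the member declarations by hand.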

```cpp
namespace domains {
OV_ITT_DOMAIN(Plugin)
OV_ITT_DOMAIN(PluginLoadTime)
// Domain to define Inference phase tasks
```

This can be done later, but for this file we should add a reference to a guide describing how the domains can be modified, added, removed, etc.

@praasz
Contributor

praasz commented Jan 15, 2026

build_jenkins



Labels

  • category: Core (OpenVINO Core, aka ngraph)
  • category: inference (OpenVINO Runtime library - Inference)
  • ExternalIntelPR (External contributor from Intel)


4 participants