OpenSplice C# API v6.x
OpenSplice C# Data Distribution Service Data-Centric Publish-Subscribe API
The DataReader is responsible for reading typed samples and the accompanying sample information from the middleware, and is the subscribing counterpart of the DataWriter. Between them, the DataReader and DataWriter provide the bulk of the Listeners, conditions and QoS available in DDS, as they are the intermediaries between the middleware and the application.
The DataReader is an abstract class that is always sub-classed by a typed specialisation generated by the pre-processor:
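For example, a topic type `Foo` would result in a generated `FooDataReader` offering type-safe read and take operations. The sketch below assumes an existing Subscriber (`subscriber`) and Topic (`fooTopic`) for that type; the method names and signatures (`CreateDataReader`, `Take`, `ReturnLoan`, the `DDS.Length` and state-kind constants) follow the usual OpenSplice C# conventions and should be checked against the generated API rather than taken verbatim.

```csharp
// Hypothetical topic type Foo; the pre-processor generates a typed FooDataReader for it.
DDS.IDataReader reader = subscriber.CreateDataReader(fooTopic);
FooDataReader fooReader = reader as FooDataReader;

// Take whatever samples are currently available (names and signatures indicative only).
Foo[] samples = null;
DDS.SampleInfo[] infos = null;
DDS.ReturnCode result = fooReader.Take(
    ref samples,
    ref infos,
    DDS.Length.Unlimited,
    DDS.SampleStateKind.Any,
    DDS.ViewStateKind.Any,
    DDS.InstanceStateKind.Any);

if (result == DDS.ReturnCode.Ok)
{
    for (int i = 0; i < samples.Length; i++)
    {
        if (infos[i].ValidData)
        {
            // Process samples[i] here.
        }
    }
    // The sample and info buffers are on loan from the middleware and must be returned.
    fooReader.ReturnLoan(ref samples, ref infos);
}
```

The QoS policies applicable to a DataReader, and their default values, are listed in the tables below.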
QoS | Brief |
---|---|
USER_DATA | User data |
DURABILITY | Expresses the lifetime of a sample |
DEADLINE | Sets the period in which a sample must be received |
LATENCY_BUDGET | Specifies the maximum acceptable delay for a sample in transit |
LIVELINESS | Sets the method by which an instance is considered alive (still being published) |
OWNERSHIP | Specifies whether samples should be treated as having a single or multiple owners |
TIME_BASED_FILTER | Filters samples based on a minimum time separation between them |
RELIABILITY | Sets the required reliability of a DataReader/DataWriter pair, reliable or best effort |
DESTINATION_ORDER | Controls the logical order of the samples, by reception time-stamp or source time-stamp |
HISTORY | Specifies how many generations of the same instance to keep |
RESOURCE_LIMITS | Specifies the amount of resources the DataReader can consume |
READER_DATA_LIFECYCLE | Specifies the lifecycle of the data-instances managed by the DataReader |
SUBSCRIPTION_KEYS | Allows the DataReader to define its own set of keys on the data, different from the keys defined by the topic |
READER_LIFESPAN | Automatically removes samples from the DataReader after a specified timeout |
SHARE | Used to share a DataReader between multiple processes |
The default values for these QoS policies are:
QoS | Attribute | Value |
---|---|---|
USER_DATA | value.length | 0 |
DURABILITY | kind | VOLATILE |
DEADLINE | period | DURATION_INFINITE |
LATENCY_BUDGET | duration | 0 |
LIVELINESS | kind | AUTOMATIC |
 | lease_duration | DURATION_INFINITE |
OWNERSHIP | kind | SHARED |
TIME_BASED_FILTER | minimum_separation | 0 |
RELIABILITY | kind | BEST_EFFORT |
 | max_blocking_time | 100 ms |
 | synchronous | FALSE |
DESTINATION_ORDER | kind | BY_RECEPTION_TIMESTAMP |
HISTORY | kind | KEEP_LAST |
 | depth | 1 |
RESOURCE_LIMITS | max_samples | LENGTH_UNLIMITED |
 | max_instances | LENGTH_UNLIMITED |
 | max_samples_per_instance | LENGTH_UNLIMITED |
READER_DATA_LIFECYCLE | autopurge_nowriter_samples_delay | DURATION_INFINITE |
 | autopurge_disposed_samples_delay | DURATION_INFINITE |
 | autopurge_dispose_all | FALSE |
 | enable_invalid_samples | TRUE |
 | invalid_sample_visibility.kind | MINIMUM_INVALID_SAMPLES |
SUBSCRIPTION_KEYS | use_key_list | FALSE |
 | key_list.length | 0 |
READER_LIFESPAN | use_lifespan | FALSE |
 | duration | DURATION_INFINITE |
SHARE | name | NULL |
 | enable | FALSE |
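As a rough illustration of adjusting these defaults, the sketch below copies the Subscriber's default DataReader QoS, requests reliable delivery and a deeper history, and then creates the reader with the modified QoS. The policy, enumeration and mask names (`GetDefaultDataReaderQos`, `ReliabilityQosPolicyKind.ReliableReliabilityQos`, `StatusKind.Any`, ...) are assumptions based on the usual OpenSplice C# naming, not verbatim API excerpts.

```csharp
// Sketch only: copy the default QoS, override a few policies, then create the reader.
DDS.DataReaderQos readerQos = new DDS.DataReaderQos();
subscriber.GetDefaultDataReaderQos(ref readerQos);

// Request reliable delivery and keep the last 5 samples per instance
// (policy and enum names assumed, mirroring the tables above).
readerQos.Reliability.Kind = DDS.ReliabilityQosPolicyKind.ReliableReliabilityQos;
readerQos.History.Kind = DDS.HistoryQosPolicyKind.KeepLastHistoryQos;
readerQos.History.Depth = 5;

DDS.IDataReader reader = subscriber.CreateDataReader(
    fooTopic, readerQos, null, DDS.StatusKind.Any);
```

Several of these policies (RELIABILITY, DURABILITY, DEADLINE, LATENCY_BUDGET, OWNERSHIP, LIVELINESS, DESTINATION_ORDER) are matched against what DataWriters offer; an incompatible request is reported through the REQUESTED_INCOMPATIBLE_QOS status listed below.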
Status | Brief |
---|---|
SAMPLE_REJECTED | A received sample has been rejected |
LIVELINESS_CHANGED | The liveliness of a DataWriter has changed |
REQUESTED_DEADLINE_MISSED | The deadline requested by the DataReader was not respected for an instance |
REQUESTED_INCOMPATIBLE_QOS | A requested QoS policy is incompatible with the QoS offered by a matching DataWriter |
DATA_AVAILABLE | New information available |
SAMPLE_LOST | A sample has been lost |
SUBSCRIPTION_MATCHED | Found matching DataWriter with compatible QoS and Topic |
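When no listener is installed, these statuses can be polled directly on the reader. The sketch below checks the SAMPLE_REJECTED status; the method and field names (`GetSampleRejectedStatus`, `TotalCountChange`, `LastReason`) follow the usual OpenSplice C# conventions and are assumptions rather than verbatim excerpts.

```csharp
// Sketch: poll the SAMPLE_REJECTED status on an existing reader (assumed names).
DDS.SampleRejectedStatus rejected = new DDS.SampleRejectedStatus();
DDS.ReturnCode rc = reader.GetSampleRejectedStatus(ref rejected);

if (rc == DDS.ReturnCode.Ok && rejected.TotalCountChange > 0)
{
    System.Console.WriteLine("Rejected {0} sample(s), last reason: {1}",
                             rejected.TotalCountChange, rejected.LastReason);
}
```

Each of these statuses also has a corresponding operation on the DataReaderListener: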
Listener | Brief |
---|---|
on_sample_rejected() | A received sample has been rejected |
on_liveliness_changed() | The liveliness of a DataWriter has changed |
on_requested_deadline_missed() | The deadline requested by the DataReader was not respected for an instance |
on_requested_incompatible_qos() | A requested QoS policy is incompatible with the QoS offered by a matching DataWriter |
on_data_available() | New information available |
on_sample_lost() | A sample has been lost |
on_subscription_matched() | Found matching DataWriter with compatible QoS and Topic |
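A listener implementing these callbacks can be passed to CreateDataReader together with a status mask selecting which callbacks fire, or attached later via SetListener. In the C# API the operations appear in PascalCase on `DDS.IDataReaderListener`; the sketch below reacts only to DATA_AVAILABLE and leaves the other callbacks empty. The interface name, method names and status argument types are assumptions based on the usual OpenSplice C# conventions.

```csharp
// Sketch: a DataReaderListener that only acts on DATA_AVAILABLE (assumed interface and signatures).
public class FooReaderListener : DDS.IDataReaderListener
{
    public void OnDataAvailable(DDS.IDataReader reader)
    {
        // Typically: narrow to the typed FooDataReader and Take() the newly arrived samples.
    }

    public void OnSampleRejected(DDS.IDataReader reader, DDS.SampleRejectedStatus status) { }
    public void OnLivelinessChanged(DDS.IDataReader reader, DDS.LivelinessChangedStatus status) { }
    public void OnRequestedDeadlineMissed(DDS.IDataReader reader, DDS.RequestedDeadlineMissedStatus status) { }
    public void OnRequestedIncompatibleQos(DDS.IDataReader reader, DDS.RequestedIncompatibleQosStatus status) { }
    public void OnSampleLost(DDS.IDataReader reader, DDS.SampleLostStatus status) { }
    public void OnSubscriptionMatched(DDS.IDataReader reader, DDS.SubscriptionMatchedStatus status) { }
}
```

Callbacks only fire for the statuses enabled in the mask supplied when the listener is attached, so an application interested solely in new data would enable only the DATA_AVAILABLE status.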