Force Binary encoding instead of Encodable_Object encoding
Posted: 23 Aug 2019, 10:07
Dear Support Team,
* We are using C++ SDK 1.5.5 (will upgrade to 1.7.0 soon, most likely).
* We have unit tests which run both the client and the service within the same process.
* The service exposes a node which is an array of structures (OPC Binary struct array). These data types were defined and created using UaModeler.
* The client reads these nodes and interprets them with the following steps:
-- Read the node as a UaVariant.
-- Convert the UaVariant to a UaExtensionObjectArray (using UaVariant::toExtensionObjectArray).
-- Retrieve the UaStructureDefinition for the UaExtensionObject from the service.
-- Convert each element of the extension object array to a UaGenericStructureValue, using UaGenericStructureValue::setGenericValue(item, structureDefinition).
-- This calls UaGenericStructureValue::checkExtensionObject(...), which essentially checks the encoding of the extension object.
* When the client runs within the same process as the service, the encoding is ExtensionObjectEncoding::EncodeableObject. This encoding is not supported, and the result is an empty UaGenericStructureValue.
* When we execute the same code with the client running in a separate process, the encoding is ExtensionObjectEncoding::Binary, and the deserialization works correctly.
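For clarity, the client-side steps above look roughly like this (a simplified sketch, not our exact code: variable names, the read itself, and error handling are omitted or made up, and exact signatures may differ in SDK 1.5.5):

```cpp
// value was filled by a Read of the struct-array node;
// structureDefinition was retrieved from the server beforehand.
UaExtensionObjectArray extensionObjects;
value.toExtensionObjectArray(extensionObjects);

for (OpcUa_UInt32 i = 0; i < extensionObjects.length(); i++)
{
    UaExtensionObject item(extensionObjects[i]);
    UaGenericStructureValue genericValue;
    // In-process this fails, because the extension object's encoding is
    // EncodeableObject rather than Binary, so checkExtensionObject(...)
    // rejects it and genericValue stays empty.
    genericValue.setGenericValue(item, structureDefinition);
}
```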
Can you confirm our findings, or are we doing something wrong or unintended?
How does the mechanism for selecting the encoding work (and which source files implement it)?
Are there configuration (or programmatic) settings that can force a particular encoding? If so, where can we find them and how do we set them?
Thanks in advance for your feedback.
Kind regards,