The interlok-json optional package handles JSON documents within the framework. It supports the majority of the operations that are available for XML in the base packages, and is shipped as an optional package because it introduces a significant number of additional dependencies. The key features are described here; some JSON-specific implementations are also available to support other services, such as jdbc-json-first-resultset-output for jdbc-data-query-service, routing-json-path-syntax-identifier for syntax-routing-service, and json-array-splitter / json-path-splitter as MessageSplitter implementations.
JSON Path
Similar to XPath, there is a de facto standard for extracting data from a JSON document. We use JsonPath as our implementation; its GitHub documentation is replicated in the javadocs for json-path-service as a handy reminder, but you should always treat the original JsonPath documentation as the canonical source.
Given the document shown below:
{
  "rectangle" : {
    "length" : 5,
    "breadth" : 5
  }
}
We can extract the rectangle's length and breadth into metadata by using the following service definition:
<json-path-service>
  <source class="string-payload-data-input-parameter"/>
  <json-path-execution>
    <source class="constant-data-input-parameter">
      <value>$.rectangle.length</value>
    </source>
    <target class="metadata-data-output-parameter">
      <metadata-key>length</metadata-key>
    </target>
  </json-path-execution>
  <json-path-execution>
    <source class="constant-data-input-parameter">
      <value>$.rectangle.breadth</value>
    </source>
    <target class="metadata-data-output-parameter">
      <metadata-key>breadth</metadata-key>
    </target>
  </json-path-execution>
</json-path-service>
Since 3.6.4 you have had the option of extracting all top-level fields directly as metadata using json-to-metadata (nested objects are effectively converted into strings); this is a convenience service for when you need all the data from a simple JSON document exposed as metadata.
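As a minimal sketch, running json-to-metadata with its default settings against the rectangle document above would expose its single top-level field as metadata, with the nested object rendered as a string:
<json-to-metadata/>
<!-- for the rectangle document above this yields one metadata item:
     rectangle = {"length":5,"breadth":5}
     (the nested object is converted to a string) -->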
Native JSON transformations
You can use JOLT to perform direct JSON-to-JSON mappings. This is useful if you need to structurally change the JSON document (removing elements, changing objects into arrays), but it does not have any of the advanced features available in XSLT; if you do need those, consider converting to XML, using a standard XSLT transform, and then rendering back to JSON again.
Again, the documentation from the JOLT GitHub page is replicated as part of the javadocs for json-transform-service, but you should always treat the JOLT GitHub documentation as the canonical source. The only addition we have made is support for inserting arbitrary metadata into the transform directive as constant values. For instance, if you had a standard swagger.json document generated from swagger.yml (you can use yaml-to-json to do this) and wanted to change the host field to something defined in metadata, then you could use the following mapping specification (where adapter.api.hostname is the metadata key you want to use):
[{
  "operation": "remove",
  "spec": {
    "host": ""
  }
}, {
  "operation": "shift",
  "spec": {
    "*": "&",
    "#${adapter.api.hostname}": "host"
  }
}]
This gives a resulting configuration of:
<json-transform-service>
  <mapping-spec class="file-data-input-parameter">
    <destination class="configured-destination">
      <destination>file:///path/to/my/mapping.json</destination>
    </destination>
  </mapping-spec>
  <metadata-filter class="regex-metadata-filter"/>
</json-transform-service>
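To complete the swagger example, a sketch of the wider chain might convert swagger.yml to JSON with yaml-to-json before applying the mapping; the include-pattern shown here is an illustrative way of limiting the metadata exposed to the transform to the single key we need, and the file path is a placeholder:
<!-- sketch: convert the YAML payload to JSON before applying the mapping above -->
<yaml-to-json/>
<json-transform-service>
  <mapping-spec class="file-data-input-parameter">
    <destination class="configured-destination">
      <destination>file:///path/to/my/mapping.json</destination>
    </destination>
  </mapping-spec>
  <!-- illustrative: only expose the adapter.api.hostname key to the transform -->
  <metadata-filter class="regex-metadata-filter">
    <include-pattern>adapter\.api\.hostname</include-pattern>
  </metadata-filter>
</json-transform-service>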
Custom transform operations
We also provide com.adaptris.core.transform.json.jolt.EmptyStringToNull and com.adaptris.core.transform.json.jolt.NullToEmptyString as additional operations that convert all empty string / null instances into something else; for example, {"empty": null} would be transformed to {"empty":""} by com.adaptris.core.transform.json.jolt.NullToEmptyString.
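Following the usual JOLT chainr convention, these custom operations are referenced by their fully qualified class name and, as whole-document transforms, should not need a spec entry. As a hedged sketch, a mapping that replaces nulls with empty strings before a pass-through shift might look like this:
[{
  "operation": "com.adaptris.core.transform.json.jolt.NullToEmptyString"
}, {
  "operation": "shift",
  "spec": {
    "*": "&"
  }
}]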
JSON to XML
You can convert to and from XML via json-xml-transform-service by specifying the direction and driver; there are a number of available drivers, and they are discussed more fully below. When rendering XML as JSON, it is fully expected that you will need to execute a transform to get the document into the right format for the chosen driver implementation.
Driver implementations
Implementation | Description |
---|---|
simple-transformation-driver | The simplest driver, using json.org as the implementation. It adds a root element (json or json-array depending on the JSON received), and each fieldname in the JSON document creates a new XML element. This is the preferred implementation if you are using relatively simple JSON documents and you know that the fieldnames aren't invalid XML element names; if fieldnames contain invalid XML characters, output will still be generated, but subsequent XML parsing steps will fail. This is the default driver as of 3.10. |
jsonlib-transformation-driver | The driver based on json-lib. It allows you to provide type hints for conversion purposes, and supports both JSON objects and arrays converted to and from XML; XML that should be rendered as a JSON array with a single element is possible, which would not be the case with simple-transformation-driver. It will also attempt to make XML names safe according to the XML specification; if fieldnames contain invalid XML characters, then the XML output will not accurately reflect the JSON field. This was the default prior to 3.10, but has been replaced by simple-transformation-driver since its performance characteristics aren't predictable enough for it to remain the default. |
json-array-transformation-driver | Same behaviour as jsonlib-transformation-driver but only allows JSON arrays. |
json-object-transformation-driver | Same behaviour as jsonlib-transformation-driver but only allows JSON objects. |
json-safe-transformation-driver | Since 3.6.4. Same behaviour as jsonlib-transformation-driver but strips any formatting prior to rendering XML as JSON, as jsonlib-transformation-driver can be sensitive to whitespace. Since you are very likely to be executing a stylesheet to get your data into the right format anyway, you should use <xsl:strip-space elements="*" /> appropriately in your stylesheet. |
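Putting the pieces together, a minimal sketch of the round trip suggested earlier (convert to XML, apply a stylesheet, render back to JSON) might be configured along these lines; the stylesheet URL is a placeholder and the driver choice is purely illustrative:
<!-- sketch: JSON -> XML, restructure via XSLT, then XML -> JSON -->
<json-xml-transform-service>
  <direction>JSON_TO_XML</direction>
  <driver class="simple-transformation-driver"/>
</json-xml-transform-service>
<xml-transform-service>
  <url>file:///path/to/my/stylesheet.xsl</url>
</xml-transform-service>
<json-xml-transform-service>
  <direction>XML_TO_JSON</direction>
  <driver class="simple-transformation-driver"/>
</json-xml-transform-service>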
JSON to XML via StAX events
Since 3.8.2 you can use the interlok-json-streaming optional component to perform conversions to and from XML. This package contains XMLInputFactory and XMLOutputFactory implementations for use with StAX events. Coupled with stax-streaming-service from interlok-stax, you can perform conversions to and from XML without building a DOM in memory; this is the preferred mechanism for converting huge documents.
JSON to CSV
Since 3.6.6 you can convert to and from CSV via the interlok-csv-json optional package. This adds new services that allow you to easily convert JSON to CSV and vice versa. It has a dependency on both interlok-csv and interlok-json.
JSON Schema
You can use json-schema-service to validate that a document conforms to a specific schema. You can also define specific behaviour if schema validation fails (the default is to throw an exception detailing all the schema violations).
If we revisit the JSON document from JSON Path then we could define our schema thus:
{
  "type" : "object",
  "properties" : {
    "rectangle" : { "$ref" : "#/definitions/Rectangle" }
  },
  "definitions" : {
    "size" : {
      "type" : "number",
      "minimum" : 0
    },
    "Rectangle" : {
      "type" : "object",
      "properties" : {
        "length" : { "$ref" : "#/definitions/size" },
        "breadth" : { "$ref" : "#/definitions/size" }
      }
    }
  }
}
along with the following configuration, which will throw an exception (triggering standard error handling) if the document does not conform to the schema:
<json-schema-service>
  <schema-url class="file-data-input-parameter">
    <destination class="configured-destination">
      <destination>file:///./config/schema.rectangle.json</destination>
    </destination>
  </schema-url>
</json-schema-service>
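For illustration, a document such as the one below would fail validation against the schema above, because both dimensions violate the minimum of 0 defined for size; the resulting exception would detail each violation:
{
  "rectangle" : {
    "length" : -5,
    "breadth" : -1
  }
}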