Formatters

Apply standard data formatting to outgoing data

Overview

Formatters are mainly used when writing directly to storage, so that data tools such as PySpark, Apache Presto, DuckDB, MongoDB, Postgres and MySQL can use the data directly without further overhead.

This is useful in stream processing systems where events from different sources need to be written in a consistent format for downstream consumption.

JSON

The standard JSON formatter converts processed events to a JSON string using the specified attributes.

Example

file:
  ...
  json formatter:
    date format: YYYY/MM/dd
    contentType: application/json
    indent output: false

Attributes schema

| Attribute | Description | Data Type | Required / Default |
| --- | --- | --- | --- |
| date format | Date format to apply to date fields | String | Default: yyyy/MM/dd |
| indent output | Apply indentation formatting | Boolean | Default: false |
| contentType | Type of content to inform receiving application | String | Default: application/json |
| encoding | Payload encoding method | String | Default: UTF-8 |
| ext | File extension | String | Default: json |
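
For illustration only, an event carrying hypothetical symbol, bid and eventTime attributes would be emitted by the configuration above as a single-line JSON string (indent output: false), with the date format applied to date fields:

{"symbol": "BTC/USD", "bid": 30125.5, "eventTime": "2024/03/18"}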

CSV

The standard CSV formatter converts processed events to a CSV string using the specified attributes.

Example

file:
  ...
  csv formatter:
    contentType: text/csv
    encoding: UTF_8
    delimiter: "|"

Attributes schema

| Attribute | Description | Data Type | Required / Default |
| --- | --- | --- | --- |
| date format | Date format to apply to date fields | String | Default: yyyy/MM/dd |
| delimiter | Field delimiter | Character | Default: "," |
| contentType | Type of content to inform receiving application | String | Default: text/csv |
| encoding | Payload encoding method | String | Default: UTF-8 |
| ext | File extension | String | Default: csv |
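
With the example configuration above, the same hypothetical event would be written as a "|"-delimited row (the field names are illustrative, not part of the formatter specification):

BTC/USD|30125.5|2024/03/18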

Parquet

The Parquet formatter converts a StreamEvent to an Avro object using a target schema before writing it out as a Parquet-formatted object.

Example

file:
  ...
  parquet formatter:
    schema path: /home/joule/outputschema.avro
    compression codec: SNAPPY
    temp filedir: /tmp
    contentType: binary/octet-stream
    encoding: UTF_8

Attributes schema

| Attribute | Description | Data Type | Required / Default |
| --- | --- | --- | --- |
| schema path | Path location for the Avro output schema | String | |
| compression codec | Algorithm to use to compress the file. Available types: UNCOMPRESSED, SNAPPY, GZIP, LZO, BROTLI, LZ4, ZSTD | String | Default: UNCOMPRESSED |
| contentType | Type of content to inform receiving application | String | Default: binary/octet-stream |
| encoding | Payload encoding method | String | Default: UTF_8 |
| ext | File extension | String | Default: parquet |
| temp file directory | Directory path for temp files | String | Default: ./tmp |
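
The schema path attribute points to a standard Avro schema file that must match the attributes of the processed events. A minimal sketch of such a schema, assuming the same hypothetical symbol, bid and eventTime fields (the record and namespace names are also hypothetical), might look like this:

{
  "type": "record",
  "name": "QuoteEvent",
  "namespace": "com.example.joule",
  "fields": [
    { "name": "symbol", "type": "string" },
    { "name": "bid", "type": "double" },
    { "name": "eventTime", "type": { "type": "long", "logicalType": "timestamp-millis" } }
  ]
}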
