
File processing

Joule provides utility classes to load large files efficiently.



Overview

Under the hood, Joule uses Apache Arrow to read files, enabling efficient handling of large files and out-of-the-box support for standard file formats. The classes that perform this work are surfaced to developers as Callable tasks.

Two key classes are provided: FileProcessingTask and ReferenceDataFileProcessingTask.

Supported file formats

  • PARQUET

  • ORC

  • CSV

  • JSON

  • ARROW_IPC

The provided classes can be found under the SDK package

com.fractalworks.streams.sdk.util.file
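As an illustration of how the formats above might be selected, the following sketch maps a file extension to a format constant. It is self-contained: the local FileFormat enum is an assumption that mirrors the list above (the real enum lives in the Joule SDK package), and the formatFor helper is hypothetical, not part of the SDK.

```java
import java.util.Locale;
import java.util.Map;

public class FormatSelector {
    // Local stand-in for Joule's FileFormat enum (assumed to match the list above)
    enum FileFormat { PARQUET, ORC, CSV, JSON, ARROW_IPC }

    private static final Map<String, FileFormat> BY_EXTENSION = Map.of(
            "parquet", FileFormat.PARQUET,
            "orc", FileFormat.ORC,
            "csv", FileFormat.CSV,
            "json", FileFormat.JSON,
            "arrow", FileFormat.ARROW_IPC);

    // Hypothetical helper: pick a format from the file extension
    static FileFormat formatFor(String filename) {
        int dot = filename.lastIndexOf('.');
        if (dot < 0 || dot == filename.length() - 1) {
            throw new IllegalArgumentException("No file extension: " + filename);
        }
        String ext = filename.substring(dot + 1).toLowerCase(Locale.ROOT);
        FileFormat format = BY_EXTENSION.get(ext);
        if (format == null) {
            throw new IllegalArgumentException("Unsupported format: " + ext);
        }
        return format;
    }

    public static void main(String[] args) {
        System.out.println(formatFor("celltowers.csv"));   // CSV
        System.out.println(formatFor("events.parquet"));   // PARQUET
    }
}
```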

FileProcessingTask

This processing task class reads a file's contents and automatically converts each logical row into a StreamEvent object. Processing is performed in micro-batches, which reduces memory and processing overhead while sustaining stream-processing throughput.

Example

The example below loads a file with FileProcessingTask and counts the resulting StreamEvent objects via a TransportListener callback.


FileProcessStatus consumeFile(String eventType, String filename, String absoluteFilePath, FileFormat fileFormat, AtomicLong counter) throws Exception {
    var listener = new TransportListener() {
        @Override
        public void onEvent(Collection<StreamEvent> events) {
            counter.addAndGet(events.size());
        }
    };

    File fileuri = new File(absoluteFilePath);
    FileProcessingTask task = new FileProcessingTask(eventType, filename, fileuri.getAbsolutePath(), fileFormat, listener);
    FileProcessStatus status = task.call();

    await()
            .pollInterval(100, TimeUnit.MILLISECONDS)
            .until(checkForEvents(counter));
    return status;
}

// Simple event handler to check for number of events received
private Callable<Boolean> checkForEvents(AtomicLong eventsSeen) {
    return () -> (eventsSeen.get() == NUM_EVENTS);
}

ReferenceDataFileProcessingTask

ReferenceData objects are stored within an in-memory data store to reduce retrieval latency and I/O overhead.

Example

// Create an in-memory store that implements the Store interface
CellTowerStore cellTowerStore = new CellTowerStore(250, 250);
cellTowerStore.setMaxElementsPerLevel(10000);
cellTowerStore.setTreeLevels(5);
cellTowerStore.initialize();

// Load the cell tower file contents into the store
File f = new File(CELLTOWER_FILE);
var task = new ReferenceDataFileProcessingTask<CellTower>((Store)cellTowerStore, f.getName(), f.getAbsolutePath(), FileFormat.CSV);
task.setMoveFileAfterProcessing(false);
task.setParser(new CellTowerArrowParser());
var status = task.call();

This processing task class reads a reference data file's contents and automatically converts each logical row into a ReferenceData object. Micro-batch processing keeps the memory footprint and processing overhead low, allowing large files to be read into memory.

This example can be found in the fractalworks-geospatial-processor project's CellTowerCSVParserTest class.
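The retrieval-latency benefit of holding ReferenceData in memory can be illustrated with a plain map-based lookup. This is a toy stand-in, not the SDK's Store interface; the CellTowerStore parameters above (tree levels, elements per level) suggest the real store uses a more sophisticated spatial index.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

// Toy in-memory reference data store keyed by a business identifier.
// The real SDK Store interface and CellTowerStore are more elaborate.
public class InMemoryReferenceStore<K, V> {
    private final Map<K, V> data = new HashMap<>();

    public void put(K key, V value) {
        data.put(key, value);
    }

    // O(1) average-case lookup: no file or network I/O on the hot path
    public Optional<V> lookup(K key) {
        return Optional.ofNullable(data.get(key));
    }

    public static void main(String[] args) {
        InMemoryReferenceStore<Integer, String> store = new InMemoryReferenceStore<>();
        store.put(310, "Cell tower A");
        System.out.println(store.lookup(310).orElse("unknown")); // Cell tower A
        System.out.println(store.lookup(999).orElse("unknown")); // unknown
    }
}
```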
