Model management

Deploy a retrained model directly into a running Joule instance with zero downtime.


Objective

The predictor processor can be configured to load and refresh a model from a linked reference data storage platform.

When this optional setting is applied, the logic to switch the model in place is activated and a model file change process is registered. On a model update, the predictor processor's execution is paused while the model is replaced, after which processing resumes with the new model.

A model replacement can also be triggered externally, for example by a drift analysis process.
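
In practice, publishing a retrained model is just an upload to the linked store: overwriting the watched object is enough for the running processor to pick up and swap the model. The sketch below is illustrative only, assuming the MinIO Python client and a hypothetical local file path; the bucket, object key, and credentials are taken from the examples on this page.

from minio import Minio

# Connection details match the storage definition later on this page.
client = Minio("localhost:9000",
               access_key="AKIAIOSFODNN7",
               secret_key="wJalrXUtnFEMIK7MDENGbPxRfiCY",
               secure=False)

# Overwrite the watched model object; the local path is hypothetical.
client.fput_object("churn_models",
                   "churn/customer_churn_rf.pmml",
                   "retrained/customer_churn_rf.pmml")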

How it works

Although the configuration may look confusing at first, the layout reflects the fact that every processor can reuse reference data stores, which gives a flexible approach to building capabilities.

To ensure the PMML processor operates with reliable models, it must connect to a model store, allowing for model replacement when necessary.

The key configuration items are:

  1. The stores section defines which stores to include. The PMML processor only requires the model store.

  2. The model store setting binds the processor to the production_churn_models store.

  3. On processor initialisation, the processor configures file watchers on the store to trigger model refreshes (see the sketch after this list).
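
Conceptually, the refresh in step 3 behaves like the following minimal sketch. This is illustrative Python, not Joule source: the MinIO client, polling interval, lock, and stand-in loader are all assumptions. The watcher polls the stored model object's metadata, and when its ETag changes the model is swapped in place under a lock so in-flight predictions complete against the old model.

import threading
import time

from minio import Minio

client = Minio("localhost:9000",
               access_key="AKIAIOSFODNN7",
               secret_key="wJalrXUtnFEMIK7MDENGbPxRfiCY",
               secure=False)

BUCKET, KEY = "churn_models", "churn/customer_churn_rf.pmml"
swap_lock = threading.Lock()
model, last_etag = None, None

def load_model(path):
    # Stand-in for the real PMML loader.
    with open(path, "rb") as f:
        return f.read()

def watch(poll_seconds=30):
    global model, last_etag
    while True:
        stat = client.stat_object(BUCKET, KEY)   # cheap metadata check
        if stat.etag != last_etag:               # model file has changed
            client.fget_object(BUCKET, KEY, "/tmp/model.pmml")
            fresh = load_model("/tmp/model.pmml")
            with swap_lock:                      # pause scoring briefly
                model, last_etag = fresh, stat.etag
        time.sleep(poll_seconds)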

Example

The example below demonstrates how to link the processor to a model store.

pmml predictor:
  name: customer_churn_predictor
  model: churn/customer_churn_rf.pmml
  response field: churn_prediction
  
  # Link to the store for this model
  model store: production_churn_models
  
  audit configuration:
    ...
  
  # 1. Define interested stores to bind to
  stores:
    # Model store
    production_churn_models:
      store name: customer_churn_models
    
    # Customer spend profiles
    production_customer_profiles:
      store name: customer_spend_profiles

Storage definition

For this example to work, the reference data definition below must be deployed so that the processor can successfully bind to the underlying storage platform.

reference data:
  name: predictiveModelStores
  data stores:
    - minio stores:
        name: production_models
        connection:
          endpoint: "http://localhost"   # tls is disabled below, so use http
          port: 9000
          tls: false
          credentials:
            access key: "AKIAIOSFODNN7"
            secret key: "wJalrXUtnFEMIK7MDENGbPxRfiCY"
        
        stores:
          customer_churn_models:
            bucketId: churn_models
          customer_spend_profiles:
            bucketId: customer_historical_spend          

For more details or assistance, Fractalworks is ready to support your enquiries.

See the Reference Data API documentation for further details.
