Build your first use case

Prerequisites

Revisit the use case building framework to understand how to create a use case from business context.

Revisit low code development to understand the anatomy of a use case.

Overview

These instructions cover how to define, build, and deploy a use case onto the Joule Platform using the Getting started project provided here. They should only be used to configure the Joule Platform for development or demonstration purposes.

Use case definition

Get Nasdaq major bank quotes

1. Use case objective

Provide a trading consumer application with bid, ask and spread quotes for the current business day, covering all major banks with a market cap of over $350 billion trading on the Nasdaq stock market.

The use case should only process events for a single defined market business day.

use case:
  name: nasdaq_banking_quotes
  constraints:
    valid from: '2024-10-01T09:25:00.000Z'
    valid to: '2024-10-01T16:35:00.000Z'
  sources:
    - nasdaq_quotes_stream   # must match the name in the source definition below
  stream name: nasdaq_major_banks_stream
  sinks:
    - nasdaq_major_bank_quotes
2. Processing steps

  1. Enrich events with industry and market cap context information

  2. Filter events to the 'Major Banks' industry with a market cap greater than $350 billion

  3. Send a stock record with the following attributes for every event: symbol, company_name, market_cap, bid, ask

Stream definition

stream:
  name: nasdaq_major_banks_stream
  eventTimeType: EVENT_TIME
  
  initialisation:
    - data import:
        schema: reference_data
        csv:
          - table: nasdaq_companies
            file: data/csv/nasdaq.csv
            drop table: true
            index:
              fields: [ 'Symbol' ]
              unique: true

  processing unit:
    pipeline:
      # Enrich every event with company information
      - enricher:
          fields:
            company_info:
              by query: "select * from reference_data.nasdaq_companies where Symbol = ?"
              query fields: [symbol]
              with values: [company_name,industry,market_cap]
              using: JouleDB
      
      # Filter events to major banks above the market cap threshold
      - filter:
          expression: "industry == 'Major Banks' && market_cap > 350000000000"

  emit:
    select: "symbol, company_name, market_cap, bid, ask"

  group by:
    - symbol
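
To make these steps concrete, the sketch below mimics the enrich, filter and emit stages of the pipeline in plain Python over a couple of in-memory quote events. It is not Joule code; the symbols, company names and market cap values are hypothetical and purely illustrative.

# Conceptual equivalent of the pipeline above (illustration only, not Joule code)
MARKET_CAP_THRESHOLD = 350_000_000_000  # $350 billion, per the use case objective

# Hypothetical reference data: symbol -> (company_name, industry, market_cap)
reference_data = {
    "BANK1": ("Example Bancorp", "Major Banks", 400_000_000_000),
    "TECH1": ("Example Tech Inc.", "Computer Manufacturing", 900_000_000_000),
}

# Hypothetical quote events arriving on the stream
quotes = [
    {"symbol": "BANK1", "bid": 45.12, "ask": 45.15},
    {"symbol": "TECH1", "bid": 189.10, "ask": 189.15},
]

emitted = []
for event in quotes:
    # Enricher: look up company information by symbol
    company = reference_data.get(event["symbol"])
    if company is None:
        continue
    company_name, industry, market_cap = company

    # Filter: keep 'Major Banks' above the market cap threshold
    if industry != "Major Banks" or market_cap <= MARKET_CAP_THRESHOLD:
        continue

    # Emit: project the selected fields
    emitted.append({
        "symbol": event["symbol"],
        "company_name": company_name,
        "market_cap": market_cap,
        "bid": event["bid"],
        "ask": event["ask"],
    })

print(emitted)  # only the BANK1 record survives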

3. Data sources

Subscribe to live Nasdaq quote data (note: we are using simulated data).

Source definition

kafkaConsumer:
    name: nasdaq_quotes_stream
    cluster address: joule-gs-redpanda-0:9092
    consumerGroupId: nasdaq
    topics:
      - quotes

    deserializer:
      parser: com.fractalworks.examples.banking.data.QuoteToStreamEventParser
      key deserializer: org.apache.kafka.common.serialization.IntegerDeserializer
      value deserializer: com.fractalworks.streams.transport.kafka.serializers.object.ObjectDeserializer

    properties:
      partition.assignment.strategy: org.apache.kafka.clients.consumer.StickyAssignor
      max.poll.records: 7000
      fetch.max.bytes: 10485760

4. Outcome

Send the enriched and filtered data to a Kafka topic for downstream consumers.

  1. Transform the user projection into the expected domain data type; see the AVRO schema below.

  2. Send quotes to the nasdaq_major_bank_quotes Kafka topic.

AVRO schema

{
  "type" : "record",
  "name" : "StockRecord",
  "namespace" : "com.fractalworks.examples.banking.data",
  "fields" : [
    {"name" : "symbol", "type" : "string"},
    {"name" : "company_name", "type" : "string"},
    {"name" : "market_cap", "type" : "long"},
    {"name" : "bid", "type" : "double"},
    {"name" : "ask", "type" : "double"}
    ]
}
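
As a quick sanity check on the schema, the sketch below parses it and serialises a single hypothetical StockRecord. It assumes the fastavro package is installed and that the schema file sits at the path referenced in the sink definition; the record values are invented for illustration.

import io
import json

import fastavro  # assumption: fastavro is installed (pip install fastavro)

# Parse the StockRecord schema shown above
with open("./conf/avro/stockrecord.avsc") as fh:
    schema = fastavro.parse_schema(json.load(fh))

# A hypothetical record matching the emitted projection
record = {
    "symbol": "BANK1",
    "company_name": "Example Bancorp",
    "market_cap": 400_000_000_000,
    "bid": 45.12,
    "ask": 45.15,
}

# Serialise the record without an embedded schema, as a schemaless writer would
buffer = io.BytesIO()
fastavro.schemaless_writer(buffer, schema, record)
print(f"Encoded StockRecord into {len(buffer.getvalue())} bytes")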

Sink definition

kafkaPublisher:
  cluster address: joule-gs-redpanda-0:9092
  topic: nasdaq_major_bank_quotes
  partitionKeys:
    - symbol

  serializer:
    key serializer: org.apache.kafka.common.serialization.IntegerSerializer
    avro setting:
      local schema: ./conf/avro/stockrecord.avsc
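
If your cluster does not auto-create topics, the quotes and nasdaq_major_bank_quotes topics may need to exist before the use case runs. Below is a minimal sketch using the confluent-kafka Python client (an assumption; any Kafka admin tool works), with illustrative partition and replication settings.

from confluent_kafka.admin import AdminClient, NewTopic

# Broker address taken from the source and sink definitions above
admin = AdminClient({"bootstrap.servers": "joule-gs-redpanda-0:9092"})

topics = [
    NewTopic("quotes", num_partitions=1, replication_factor=1),
    NewTopic("nasdaq_major_bank_quotes", num_partitions=1, replication_factor=1),
]

# create_topics returns a dict of topic name -> future; wait for each to complete
for topic, future in admin.create_topics(topics).items():
    try:
        future.result()
        print(f"Created topic {topic}")
    except Exception as exc:
        print(f"Topic {topic}: {exc}")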

Deploying the use case

Now that we have the use case definitions, we can deploy them to Joule.

Note that the above definitions need to be converted to JSON before they can be deployed.
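
A minimal conversion sketch using PyYAML is shown below (an assumption; any YAML-to-JSON tool will do). The file names are illustrative, so substitute the paths where you saved the definitions above.

import json
import sys

import yaml  # assumption: PyYAML is installed (pip install pyyaml)

# Usage: python yaml_to_json.py <definition>.yaml <definition>.json
src, dst = sys.argv[1], sys.argv[2]

# Load the YAML definition and write it back out as JSON, ready for deployment
with open(src) as fh:
    definition = yaml.safe_load(fh)

with open(dst, "w") as fh:
    json.dump(definition, fh, indent=2)

print(f"Wrote {dst}")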

Reviewing results
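
One simple way to review the output (a sketch, not part of the Getting started project) is to consume the nasdaq_major_bank_quotes topic and decode each value with the local AVRO schema. It assumes the confluent-kafka and fastavro packages and that records are published schema-less, as configured in the sink definition; the consumer group id is hypothetical.

import io
import json

import fastavro
from confluent_kafka import Consumer

# Parse the same local schema the sink uses to write records
with open("./conf/avro/stockrecord.avsc") as fh:
    schema = fastavro.parse_schema(json.load(fh))

consumer = Consumer({
    "bootstrap.servers": "joule-gs-redpanda-0:9092",
    "group.id": "results-review",        # hypothetical consumer group
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["nasdaq_major_bank_quotes"])

try:
    # Poll a handful of messages and decode each AVRO payload
    for _ in range(10):
        msg = consumer.poll(timeout=5.0)
        if msg is None or msg.error():
            continue
        record = fastavro.schemaless_reader(io.BytesIO(msg.value()), schema)
        print(record)
finally:
    consumer.close()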

What we have learnt
