Build your first use case
This is an introductory example of how to use a combination of out-of-the-box features to provide streaming enriched quotes for major banks with a market capitalisation of over $350 billion.
This tutorial will teach you how to use Joule's out-of-the-box (OOTB) features to filter, enrich and publish user-defined alerts to a Kafka topic and a CSV file.
As a first use case we will cover a number of key features:
Subscribe and consume events: Subscribe, consume, parse and present events ready for pipeline processing using Kafka.
Initialise Joule with contextual data: Load local CSV contextual data into JouleDB.
Filters and enrichment: Apply a filter to select a subset of events using JavaScript expressions, and enrich events with company data loaded into JouleDB.
Filter results by a constraint: Use the "having" clause with a JavaScript expression to send events only when a spread ratio breach occurs.
Publishing events: Send processed events to a CSV file and on to a Kafka topic as a defined AVRO domain data structure.
The getting started project can be found here.
Provide trading consumer applications with bid and ask quotes and company information for all major banks with a market cap of over $350 billion trading on the Nasdaq stock market, whenever the spread widens to over 1.5% during the current business day.
Change the valid from and to dates.
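For context, the contextual data file could contain rows like the hypothetical sample below; the column names, including the valid from/to date columns, and all values are illustrative rather than taken from the getting started project.

```csv
symbol,company_name,industry,market_cap,valid_from,valid_to
JPM,JPMorgan Chase,Major Banks,485000000000,2024-01-01,2024-12-31
HSBC,HSBC Holdings,Major Banks,160000000000,2024-01-01,2024-12-31
```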
The processing stream defines an initialisation step to load contextual data into memory, a processing pipeline, an event emit clause and the grouping of data.
The key processing steps, sketched below, include:
Enrich events with industry and market cap context information
Filter events to those in the 'Major Banks' industry with a market cap greater than $350 billion
Emit a stock record with the following attributes for every event: symbol, company_name, market_cap, bid, ask
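To make these steps concrete, the stream definition could look something like the following sketch. The key names, file path and enrichment query are illustrative assumptions rather than the exact Joule DSL, and the filter and having expressions are hypothetical JavaScript renderings of the requirement above; consult the Joule reference documentation for the precise syntax.

```yaml
# Illustrative sketch only: key names, paths and queries are assumptions,
# not the exact Joule DSL.
stream:
  name: nasdaq_major_bank_quotes
  initialisation:
    # Load the local company reference CSV into JouleDB as contextual data
    csv import:
      file: data/nasdaq_companies.csv   # hypothetical path
      table: nasdaq_companies
  processing pipeline:
    # Enrich each quote event with industry and market cap context
    - enrich:
        company: "select industry, company_name, market_cap from nasdaq_companies where symbol = quote.symbol"
    # Keep only major banks above the $350 billion market cap threshold (JavaScript expression)
    - filter: "company.industry == 'Major Banks' && company.market_cap > 350000000000"
  emit:
    select: "symbol, company_name, market_cap, bid, ask"
    # Send an event only when the spread widens beyond 1.5% (JavaScript expression)
    having: "((ask - bid) / bid) * 100 > 1.5"
```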
This use case example will output events to a CSV file and a Kafka topic concurrently, each of which requires its own configuration deployment.
A quick and easy way to validate your use case processing is to send the resulting events to a CSV file.
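As a rough sketch, a file publisher configuration might resemble the following; the key names and output path are illustrative assumptions rather than the exact Joule configuration schema.

```yaml
# Illustrative sketch only: key names and paths are assumptions.
publish:
  file:
    path: ./output/major_bank_quotes.csv   # hypothetical output location
    format: csv
```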
The resulting events will also be sent on to a Kafka topic, ready for downstream trading applications to consume and continue processing.
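Similarly, a hedged sketch of the Kafka publisher configuration, using the topic named later in this tutorial; the broker address, serializer keys and schema file path are illustrative assumptions.

```yaml
# Illustrative sketch only: key names and paths are assumptions.
publish:
  kafka:
    servers: localhost:9092                   # hypothetical broker address
    topic: nasdaq_major_bank_quotes           # topic used in this tutorial
    serializer:
      avro schema: schemas/stock_quote.avsc   # hypothetical schema file
```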
A quick recap of how events will be transformed to AVRO data structures:
The user emit projection is transformed to the provided domain data type using an AVRO schema, see below.
The resulting events are then published on to the nasdaq_major_bank_quotes Kafka topic.
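The project's schema is not reproduced here, but a minimal AVRO schema matching the emitted attributes (symbol, company_name, market_cap, bid, ask) could look like the following; the record name, namespace and numeric field types are illustrative assumptions.

```json
{
  "type": "record",
  "name": "StockQuote",
  "namespace": "com.example.banking",
  "doc": "Illustrative schema sketch; names and numeric types are assumptions",
  "fields": [
    {"name": "symbol", "type": "string"},
    {"name": "company_name", "type": "string"},
    {"name": "market_cap", "type": "double"},
    {"name": "bid", "type": "double"},
    {"name": "ask", "type": "double"}
  ]
}
```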
Now that we have all the use case definitions, we can deploy to Joule via the REST API using Postman, following the same getting started deployment steps for this project.
Go to the "Build your first use case" folder under the Joule - Banking demo / Tutorials Postman examples within the getting started project.
That's it! You should now have an understanding of how the components fit together to form a single use case.
To recap, this example covers a number of key features:
Filter
JavaScript expression filter.
Enrichment
Add contextual data to streaming events.
Output projection
Define an output projection that matches an AVRO schema's attribute requirements.
Having clause
Define a JavaScript analytic expression that sends alerts only when a specified condition is met.
File validation
Publish events to a CSV file to validate results.