Title
Benchmarking Financial Data Feed Systems
Abstract
Data-driven solutions for the investment industry require event-based backend systems to process high-volume financial data feeds with low latency, high throughput, and guaranteed delivery modes. At vwd we process an average of 18 billion incoming event notifications from 500+ data sources for 30 million symbols per day, with peak rates of more than 1 million notifications per second, using custom-built platforms that keep audit logs of every event. We are currently assessing modern open source event-processing platforms such as Kafka, NATS, Redis, Flink, and Storm for use in our ticker plant, both to reduce the maintenance effort for cross-cutting concerns and to leverage hybrid deployment models. For comparability and repeatability we benchmark candidates with a standardized workload derived from our real data feeds. We have enhanced the processing, logging, and reporting capabilities of an existing lightweight open source benchmarking tool to cope with our workloads. The resulting tool, wrench, can simulate workloads or replay snapshots whose volume and dynamics match those we process in our ticker plant. We provide the tool as open source. As part of ongoing work we contribute details on (a) our workload and requirements for benchmarking candidate platforms for financial feed processing, and (b) the current state of the tool wrench.
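To illustrate the kind of workload driver the abstract describes, the following minimal Python sketch generates a synthetic tick feed with Poisson arrivals and Zipf-like symbol popularity and reports achieved throughput and publish-latency percentiles against a pluggable sink. This is not the wrench tool itself; all names used here (TickEvent, run_workload, publish) are hypothetical placeholders for whatever client the platform under test provides.

```python
# Illustrative sketch of a feed-benchmark workload driver (not the actual wrench tool).
# TickEvent, run_workload, and the publish callback are hypothetical placeholders.
import random
import time
from dataclasses import dataclass

@dataclass
class TickEvent:
    symbol: str    # instrument identifier
    price: float   # simulated quote
    ts_ns: int     # event creation timestamp (nanoseconds)

def run_workload(publish, symbols, target_rate, duration_s, zipf_a=1.3):
    """Drive publish(event) with Poisson arrivals at target_rate events/s.

    Symbol popularity follows a Zipf-like distribution so a few instruments
    dominate the feed, roughly mimicking real market-data dynamics.
    Returns achieved throughput and simple publish-latency percentiles.
    """
    latencies = []
    sent = 0
    start = time.perf_counter()
    next_send = start
    while time.perf_counter() - start < duration_s:
        # Exponential inter-arrival times model a Poisson event stream.
        next_send += random.expovariate(target_rate)
        while time.perf_counter() < next_send:
            pass  # busy-wait keeps timing tight for a short benchmark run
        idx = min(int(random.paretovariate(zipf_a)) - 1, len(symbols) - 1)
        event = TickEvent(symbols[idx], random.uniform(1.0, 500.0), time.time_ns())
        t0 = time.perf_counter()
        publish(event)  # sink under test, e.g. a broker producer client
        latencies.append(time.perf_counter() - t0)
        sent += 1
    elapsed = time.perf_counter() - start
    latencies.sort()
    return {
        "events_sent": sent,
        "throughput_eps": sent / elapsed,
        "p50_ms": latencies[len(latencies) // 2] * 1e3,
        "p99_ms": latencies[int(len(latencies) * 0.99)] * 1e3,
    }

if __name__ == "__main__":
    # Stand-in sink; a real benchmark would publish to Kafka, NATS, Redis, etc.
    stats = run_workload(publish=lambda e: None,
                         symbols=[f"SYM{i:05d}" for i in range(10_000)],
                         target_rate=50_000, duration_s=5)
    print(stats)
```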
Year
2019
DOI
10.1145/3328905.3332506
Venue
Proceedings of the 13th ACM International Conference on Distributed and Event-based Systems
Keywords
Event-processing, benchmarking, big data, event bus, financial data, publish/subscribe, requirements, stream-processing, workload
Field
Workload, Computer science, Complex event processing, Wrench, Throughput, Finance, Stream processing, Big data, Benchmarking, Data feed
DocType
Conference
ISBN
978-1-4503-6794-3
Citations
0
PageRank
0.34
References
0
Authors
4
Name                  Order  Citations  PageRank
Manuel Coenen         1      0          0.34
Christoph Wagner      2      0          0.34
Alexander Echler      3      0          0.68
Sebastian Frischbier  4      0          0.68