To meet its Big Data processing requirements, a client set out to build a custom data integration platform providing near real-time access to mainframe and streaming data for processing and analytics. However, the traditional data warehousing approach of ETL and staging areas limited the client's ability to deliver results to the business. An innovative solution was needed, which is where Alumnus was brought in.
We developed an end-to-end pipeline to transfer static data from varied mainframe sources to non-mainframe target technologies. We also custom-built a second pipeline for real-time transfer and visualization of streaming data from varied source technologies into a Kafka-based target system.
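As a minimal sketch of the streaming leg (illustrative only, not the client's delivered code), a Java producer might publish ingested telemetry events to a Kafka topic as follows; the broker address, topic name, and record format here are assumptions:

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class StreamIngestProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Broker address is a placeholder, not the client's configuration.
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Each record could be, for example, a JSON-encoded sFlow or NetFlow sample
            // keyed by the originating device; both are hypothetical here.
            ProducerRecord<String, String> record =
                new ProducerRecord<>("network-telemetry", "device-42", "{\"bytes\": 1024}");
            producer.send(record);
        }
    }
}
```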
This streaming analytics capability was delivered on AWS using Java and Node.js, avoiding the cost and delay of traditional ETL and staging processes.
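On the consuming side, a sketch under the same assumptions: a Java consumer could read the topic above and hand each event to the analytics and visualization layer. The group id and topic name are again placeholders:

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class TelemetryConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Placeholder broker address and consumer group, not the client's configuration.
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "analytics");
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("network-telemetry"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> r : records) {
                    // In the real system this would feed dashboards or downstream analytics;
                    // printing stands in for that step here.
                    System.out.printf("%s -> %s%n", r.key(), r.value());
                }
            }
        }
    }
}
```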
Technology Stack:
Cassandra, Teradata, Redis, JDBC, HDFS, RabbitMQ, SNMP, sFlow, NetFlow, Kafka, AWS, Java, Node.js