Building a Near Real-Time Analytics Platform

Each stage of the solution, i.e. the near real-time analytics platform, is described below:

  • When a user buys a product, the product id, the order status, and the timestamp are published to a Kafka topic.
  • A Spark Streaming job reads the data from the Kafka topic in windows of a few seconds and counts the occurrences of each distinct order status within the open window.
  • As soon as the window closes and the count for each unique order status has been computed, the aggregated result is pushed to a new Kafka topic.
  • A Node.js server starts consuming messages as soon as new ones arrive on the one-minute Kafka topic, and each consumed message is then emitted to the browser via Socket.io.
  • When the socket.io-client in the browser receives a new “message” event, the data in that event is processed.
  • If the order status in the received data is “shipped”, the data point is appended to a Highcharts series and rendered in the buyer's browser.
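The first stage can be sketched as a small helper that builds the order event published to Kafka. The field names (productId, status, time) and the topic name are assumptions for illustration, not a schema taken from this platform:

```javascript
// Build the JSON payload published to the Kafka topic when a user
// places an order. Field names here are illustrative assumptions.
function makeOrderEvent(productId, status, time = Date.now()) {
  return JSON.stringify({ productId, status, time });
}

// A real producer (e.g. with the kafkajs library) would send this
// string as the message value, along the lines of:
// producer.send({ topic: "order-events",
//                 messages: [{ value: makeOrderEvent("p-42", "shipped") }] });
```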
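The aggregation the Spark Streaming job performs per window can be expressed in plain JavaScript for illustration. This is only the counting logic, not actual Spark code, and it assumes each event is an object carrying a status field:

```javascript
// Count the occurrences of each distinct order status among the
// events that fell into one window (a few seconds of orders).
function countStatuses(events) {
  const counts = {};
  for (const { status } of events) {
    counts[status] = (counts[status] || 0) + 1;
  }
  return counts;
}
```

In the real job, Spark would compute these same per-window counts with a windowed group-by and count, then write the result to the output Kafka topic.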
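On the browser side, the last two stages amount to turning a consumed message into a chart point whenever it carries a “shipped” count. The message shape used here ({ time, counts }) is an assumed format for the aggregated records, not the platform's documented one:

```javascript
// Convert an aggregated message into a Highcharts series point
// ([x, y] pair), or null if it carries no "shipped" count.
function toShippedPoint(message) {
  const shipped = message.counts["shipped"];
  if (shipped === undefined) return null;
  return [message.time, shipped];
}

// In the browser, the socket.io-client handler would do roughly:
// socket.on("message", (msg) => {
//   const point = toShippedPoint(JSON.parse(msg));
//   if (point) chart.series[0].addPoint(point);
// });
```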