THE CHALLENGE

An important client with a presence in several European and other international markets needed to export information in real time from its main software platform to different applications and consumers (Business Intelligence, Risk Management, Partners…).

THE SOLUTION

We added export points to the base code of the main application that write the information to a database table, which serves as a message queue. A second application (the Processor) reads this queue and sends the notifications to Kafka, a message broker that provides real-time processing, high performance, and high availability.

The Processor is a backend application whose main task is to read the queue table and produce messages for Kafka.
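In simplified form, the Processor's core loop can be sketched as follows. This is our own minimal illustration, not the project's code: an in-memory deque stands in for the queue table, and a plain `MessageSender` interface stands in for the real Kafka producer; the names `QueueRow`, `MessageSender`, and `ProcessorSketch` are hypothetical.

```java
import java.util.Deque;

// Hypothetical stand-ins: a row of the queue table, and an abstraction
// over the (synchronous) Kafka producer.
record QueueRow(long id, String payloadJson) {}

interface MessageSender {
    void send(String topic, String json); // throws on failure
}

// Minimal sketch of the Processor's loop: read pending rows in insertion
// order and produce each one to the broker before moving to the next.
class ProcessorSketch {
    private final Deque<QueueRow> queueTable; // stands in for SELECT ... ORDER BY id
    private final MessageSender sender;

    ProcessorSketch(Deque<QueueRow> queueTable, MessageSender sender) {
        this.queueTable = queueTable;
        this.sender = sender;
    }

    /** Drains the queue, returning how many rows were produced. */
    int drain(String topic) {
        int produced = 0;
        while (!queueTable.isEmpty()) {
            QueueRow row = queueTable.peekFirst();
            sender.send(topic, row.payloadJson()); // synchronous: throws if not acknowledged
            queueTable.pollFirst();                // remove the row only after a successful send
            produced++;
        }
        return produced;
    }
}
```

The key design point carried over from the real system is that a row is removed from the queue only after the broker has acknowledged the message, so a crash between send and delete can cause a redelivery but never a loss.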

Among the most notable features are:

  • Management of Processing Order and Errors: Preserving the order of the produced messages was a fundamental project requirement, together with a retry mechanism that guarantees delivery without breaking that order. To make the system fault-tolerant, an Active/Active cluster of instances was deployed.
  • High Performance and Metrics: High throughput was also an indispensable requirement, so a module was included to publish metrics to well-known monitoring systems and verify correct operation.
  • Code Quality and Test Coverage: Given the critical nature of the application, test coverage of over 98% was achieved. The acceptance tests were written using a methodology based on non-technical language, so the application could be tested with the direct participation of the end user.
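The combination of strict ordering and retries described above can be illustrated with a short sketch, assuming one policy among several possible ones: retry the head message a bounded number of times, and never skip ahead past a failing message, since skipping would break the ordering guarantee. The class and method names are ours, for illustration only.

```java
import java.util.ArrayList;
import java.util.Deque;
import java.util.List;
import java.util.function.Predicate;

// Illustrative sketch of in-order delivery with bounded retries.
// A message that keeps failing blocks the queue rather than being
// skipped, because skipping would break the required ordering.
class OrderedRetrySketch {
    static List<String> deliverInOrder(Deque<String> pending,
                                       Predicate<String> trySend,
                                       int maxRetries) {
        List<String> delivered = new ArrayList<>();
        while (!pending.isEmpty()) {
            String msg = pending.peekFirst();
            boolean ok = false;
            for (int attempt = 0; attempt <= maxRetries && !ok; attempt++) {
                ok = trySend.test(msg); // e.g. a synchronous producer send
            }
            if (!ok) {
                break; // give up for now; later messages must wait to keep order
            }
            pending.pollFirst();
            delivered.add(msg);
        }
        return delivered;
    }
}
```

In an Active/Active deployment, two or more instances would run this kind of loop concurrently, which is why the real system also needs coordination (for example, row locking on the queue table) that this sketch omits.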

TECHNOLOGICAL SOLUTION

Solution features:

  • The “Processor” is a backend application based on Spring Boot and Quartz whose main task is to read the queue table and produce messages to Kafka in JSON format.
  • Use of a synchronous Kafka producer, which waits for confirmation that each message has been sent correctly before continuing.
  • Test coverage of over 98% (based on Sonar statistics), including unit, integration, and acceptance tests. TDD was applied for the most part, so the tests act as “living documentation”.
  • The Cucumber framework was used for the acceptance tests, following the Gherkin syntax, so the application could be tested together with the end user.
  • Code adapted for both Oracle DB and PostgreSQL.
  • Integration and acceptance tests run against Kafka, ZooKeeper, and PostgreSQL Docker containers.
  • Micrometer integration for exporting metrics to Graphite and visualizing them in Grafana.
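As an illustration of the Gherkin style used in the acceptance tests, a scenario might read like the following. The feature, wording, and step names here are hypothetical, not taken from the actual suite; the point is that the syntax is plain language a non-technical end user can read and validate.

```gherkin
Feature: Export notifications to Kafka
  Scenario: Pending notifications are delivered in order
    Given the queue table contains 2 pending notifications
    When the Processor runs
    Then 2 messages are produced to Kafka in insertion order
    And the queue table is empty
```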