Secure data transactions have become a necessity of our time. Medical records, financial records, legal information, and payment gateways all require a secure data transaction process. Several methods have been proposed to perform secure, fast, and scalable data transactions in web services. Because web servers deal with huge volumes of queries, it becomes difficult to process every query within the available time; failing to do so can crash the web service or cause transaction failures, leading to significant financial losses for organizations. Batched stream processing is a distributed data processing paradigm that models recurring batch computations on incrementally bulk-appended data streams. The model is inspired by our empirical study of a trace from large-scale production data-processing clusters at the web server end; it enables a set of effective query optimizations that are not possible in a traditional batch processing model. By applying the concept of Bayesian networks, we stream the queries so that similar queries are batched into clusters, which we call jumbo queries. These batched queries are then committed as transactions, so that the complete process runs without placing excessive load on the servers and the servers can handle heavy transaction volumes without failures.
Keywords: Batched Stream Processing, Bayesian Networks, Jumbo Query, Query Series, Web Server
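To make the batching idea concrete, the following is a minimal illustrative sketch in Python, not the authors' implementation: it groups similar incoming queries into a jumbo query and commits each group as one unit. The similarity test here is a simple (table, operation) key used in place of the paper's Bayesian-network clustering, and all names (Query, JumboQuery, batch_queries, commit_batch) are hypothetical.

```python
from collections import defaultdict
from dataclasses import dataclass, field
from typing import List


@dataclass
class Query:
    """An incoming web-server query (hypothetical representation)."""
    table: str
    operation: str   # e.g. "read" or "write"
    payload: dict


@dataclass
class JumboQuery:
    """A cluster of similar queries batched for a single commit."""
    key: tuple
    queries: List[Query] = field(default_factory=list)


def batch_queries(stream: List[Query]) -> List[JumboQuery]:
    """Group similar queries into jumbo queries.

    'Similarity' is simplified here to sharing the same (table, operation)
    key; the paper instead infers similarity with a Bayesian network.
    """
    groups = defaultdict(list)
    for q in stream:
        groups[(q.table, q.operation)].append(q)
    return [JumboQuery(key=k, queries=v) for k, v in groups.items()]


def commit_batch(jumbo: JumboQuery) -> None:
    """Commit all queries in a jumbo query as one transaction (stub)."""
    print(f"Committing {len(jumbo.queries)} queries for {jumbo.key}")


if __name__ == "__main__":
    incoming = [
        Query("payments", "write", {"amount": 120}),
        Query("payments", "write", {"amount": 75}),
        Query("records", "read", {"patient_id": 42}),
    ]
    for jq in batch_queries(incoming):
        commit_batch(jq)
```

In this sketch the two payment writes are committed together as one jumbo query, so the server performs fewer transaction commits for the same workload, which is the load-reduction effect the abstract describes.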