European Journal of Computer Science and Information Technology (EJCSIT)


Journey to Scalability and Efficiency with Microservices and Serverless Computing (Published)

This article examines Netflix’s transformation from a monolithic architecture to a modern, distributed system built on microservices and serverless computing. It analyzes the challenges posed by the original monolith and explores how the adoption of cloud-native architectures revolutionized Netflix’s ability to deliver content globally. Through a detailed examination of performance metrics, system reliability, and operational efficiency, the article demonstrates how this architectural evolution enabled Netflix to achieve unprecedented levels of scalability, resilience, and service quality. The analysis covers key aspects of the transformation, including service isolation, data architecture evolution, API gateway implementation, and the adoption of serverless computing for specific workloads, offering valuable insights for organizations undertaking similar digital transformation initiatives.

Keywords: cloud-native architecture, digital transformation, system scalability, microservices, serverless computing
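To make the API gateway pattern referenced in the abstract concrete, the sketch below shows a single edge component routing client requests to independent microservices by URL prefix. It is a minimal illustration only; the service names and internal URLs are hypothetical and are not drawn from Netflix’s actual architecture.

```python
# Minimal sketch of the API gateway pattern: one edge component maps request
# paths onto independent microservices. Service names and URLs are hypothetical.

# Hypothetical routing table: URL prefix -> backing microservice base URL.
ROUTES = {
    "/catalog": "http://catalog-service.internal",
    "/playback": "http://playback-service.internal",
    "/profiles": "http://profile-service.internal",
}

def route(path: str) -> str:
    """Resolve a client request path to the microservice that should handle it."""
    for prefix, base_url in ROUTES.items():
        if path.startswith(prefix):
            return base_url + path
    raise LookupError(f"no service registered for path {path!r}")

if __name__ == "__main__":
    # e.g. a request for /catalog/titles/42 is forwarded to the catalog service
    print(route("/catalog/titles/42"))
```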

The Rise of Serverless AI: Transforming Machine Learning Deployment (Published)

Serverless computing has revolutionized artificial intelligence deployment by introducing a paradigm shift in infrastructure management and resource utilization. The technology enables organizations to deploy AI solutions without managing the underlying infrastructure, offering automatic scaling and pay-per-use pricing models. Function-as-a-Service holds the largest market share, particularly in the Banking, Financial Services and Insurance sector, while Backend-as-a-Service gains traction in AI applications. Organizations achieve significant reductions in total cost of ownership while maintaining high service availability. Geographically, North America leads adoption, while the Asia Pacific region demonstrates substantial growth potential. Technical advances in serverless AI platforms support diverse ML frameworks and model architectures, enabling efficient resource utilization and rapid deployment. While cold start latency and resource constraints remain challenges, continuous platform optimization and framework development address these issues. Integrating edge computing with serverless principles enhances distributed AI applications, reducing data transfer requirements and improving overall system performance.

Keywords: AWS Lambda, artificial intelligence deployment, cloud functions, cloud infrastructure management, cold start, cost optimization, edge AI, function-as-a-service, scalability, serverless computing
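As a rough illustration of the Function-as-a-Service model and the cold start issue mentioned above, the sketch below follows the common AWS Lambda handler shape: a placeholder model is loaded once at module scope, so only the first invocation in a fresh execution environment pays the load cost, while warm invocations reuse it. The model loader and request format are assumptions for illustration, not details taken from the article.

```python
import json

# Placeholder for an ML model. Loading it at module scope means the cost is
# paid once per execution environment (the "cold start") and reused while warm.
MODEL = None

def load_model():
    # Hypothetical loader; a real function might fetch weights from object storage.
    global MODEL
    if MODEL is None:
        MODEL = lambda features: sum(features)  # stand-in for real inference
    return MODEL

def handler(event, context):
    """Entry point the FaaS platform invokes for each request (AWS Lambda style)."""
    model = load_model()
    features = json.loads(event.get("body", "{}")).get("features", [])
    return {
        "statusCode": 200,
        "body": json.dumps({"prediction": model(features)}),
    }
```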

