European Journal of Computer Science and Information Technology (EJCSIT)

Dissecting Serverless Computing for AI-Driven Network Functions: Concepts, Challenges, and Opportunities (Published)

Serverless computing represents a transformative paradigm in cloud architecture that is fundamentally changing how network functions are deployed and managed. This article examines the intersection of serverless computing and artificial intelligence in the context of network functions, highlighting how this convergence enables more efficient, scalable, and intelligent network operations. The serverless model abstracts infrastructure management while offering automatic scaling and consumption-based pricing, creating an ideal environment for deploying AI-driven network capabilities. The architectural components of serverless platforms are explored, including event sources, function runtimes, scaling mechanisms, state management systems, and integration layers, with particular attention to how these components support AI workloads. Despite compelling advantages, several challenges must be addressed, including cold start latency, state management in stateless environments, and resource limitations for complex AI models. Mitigation strategies such as provisioned concurrency, external state stores, and model optimization have proven effective in overcoming these obstacles. Integration with complementary cloud-native technologies like Kubernetes, Knative, and service meshes further enhances the capabilities of serverless network functions. Practical applications in intelligent DDoS mitigation, network configuration management, predictive maintenance, and dynamic traffic optimization demonstrate the real-world value of this approach, while economic and security assessments reveal significant benefits in cost reduction, operational efficiency, and security posture.

Keywords: Artificial Intelligence, cloud-native networking, event-driven architecture, network functions virtualization, serverless computing
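The abstract above names external state stores as a mitigation for statelessness in serverless network functions, with intelligent DDoS mitigation as one application. The sketch below illustrates that pattern under stated assumptions: the store class is a stand-in for a real external service (e.g., Redis or DynamoDB), and the handler name, event shape, and flagging threshold are hypothetical.

```python
class ExternalStateStore:
    """Stand-in for an external key-value store (e.g., Redis/DynamoDB).
    Serverless functions are stateless, so cross-invocation state must
    live outside the function in a store like this."""

    def __init__(self):
        self._data = {}

    def get(self, key, default=None):
        return self._data.get(key, default)

    def put(self, key, value):
        self._data[key] = value


# Module-level client: created once per warm container and reused across
# invocations, a common pattern for amortizing connection/setup cost.
STORE = ExternalStateStore()


def handle_traffic_event(event, context=None):
    """Event-driven network function: accumulate per-source packet counts
    in the external store and flag a source once it crosses a
    (purely illustrative) DDoS threshold."""
    src = event["source_ip"]
    count = STORE.get(src, 0) + event.get("packet_count", 1)
    STORE.put(src, count)
    return {
        "source_ip": src,
        "total_packets": count,
        "flagged": count > 1000,  # illustrative threshold, not from the article
    }
```

Because the counter lives in the store rather than in function memory, any concurrently scaled instance of the function observes the same running total, which is what makes the stateless scaling model workable for stateful network logic.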

Journey to Scalability and Efficiency with Microservices and Serverless Computing (Published)

This article examines Netflix’s transformative journey from a monolithic architecture to a modern, distributed system leveraging microservices and serverless computing. It analyzes the challenges the original monolithic system faced and explores how the adoption of cloud-native architectures revolutionized Netflix’s ability to deliver content globally. Through a detailed examination of performance metrics, system reliability, and operational efficiency, the article demonstrates how architectural evolution enabled Netflix to achieve unprecedented levels of scalability, resilience, and service quality. The analysis encompasses service isolation, data architecture evolution, API gateway implementation, and the integration of serverless computing for specific workloads, providing valuable insights for organizations undertaking similar digital transformation initiatives.

Keywords: Cloud-Native Architecture, Digital Transformation, System Scalability, microservices, serverless computing
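The abstract above highlights service isolation behind an API gateway. The following is a minimal sketch of that pattern, not a description of Netflix’s actual implementation: a single entry point routes requests to independently deployable services, so clients never couple to individual service endpoints. All route paths, service names, and payloads are hypothetical.

```python
class ApiGateway:
    """Minimal API gateway: a single routing layer in front of
    isolated microservice handlers."""

    def __init__(self):
        self._routes = {}

    def register(self, path, handler):
        self._routes[path] = handler

    def handle(self, path, request):
        handler = self._routes.get(path)
        if handler is None:
            return {"status": 404, "body": "no such route"}
        return {"status": 200, "body": handler(request)}


# Hypothetical, independently deployable services behind the gateway.
def catalog_service(request):
    limit = request.get("limit", 3)
    return {"titles": ["title-" + str(i) for i in range(limit)]}


def playback_service(request):
    return {"stream_url": "https://cdn.example/" + request["title_id"] + ".m3u8"}


gateway = ApiGateway()
gateway.register("/catalog", catalog_service)
gateway.register("/playback", playback_service)
```

The isolation benefit is that either service can be redeployed, rescaled, or replaced without clients noticing, since they only ever talk to the gateway’s stable routes.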

The Rise of Serverless AI: Transforming Machine Learning Deployment (Published)

Serverless computing has revolutionized artificial intelligence deployment by introducing a paradigm shift in infrastructure management and resource utilization. The technology enables organizations to deploy AI solutions without managing underlying infrastructure, offering automatic scaling and pay-per-use pricing models. Function-as-a-Service dominates market share, particularly in the Banking, Financial Services and Insurance sector, while Backend-as-a-Service gains traction in AI applications. Organizations achieve significant reductions in total cost of ownership while maintaining high service availability. Geographically, North America leads adoption, with Asia Pacific regions demonstrating substantial growth potential. Technical advancements in serverless AI platforms support diverse ML frameworks and model architectures, enabling efficient resource utilization and rapid deployment capabilities. While cold start latency and resource constraints present challenges, continuous platform optimization and framework development address these issues. The integration of edge computing with serverless principles enhances distributed AI applications, reducing data transfer requirements and improving overall system performance.

Keywords: AWS lambda, artificial intelligence deployment, cloud functions, cloud infrastructure management, cold start, cost optimization, edge AI, function-as-a-service, scalability, serverless computing
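The cold start latency mentioned above is dominated by per-container initialization, such as loading model weights. A common mitigation, sketched below under stated assumptions, is to load the model at module scope so warm invocations reuse it; provisioned concurrency extends the same idea by keeping pre-initialized containers on standby. The loader, handler, and trivial mean-predictor "model" here are all illustrative, not from any specific platform.

```python
import time


def load_model():
    """Stand-in for loading an ML model artifact. In practice, fetching
    and deserializing weights from object storage is what makes cold
    starts expensive for serverless AI functions."""
    time.sleep(0.05)  # simulate a (small) load cost
    return lambda features: sum(features) / len(features)  # trivial mean predictor


# Loaded once at module import (container start), not on every invocation.
# Warm invocations skip load_model() entirely, which is the behavior that
# provisioned concurrency guarantees by keeping containers initialized.
MODEL = load_model()


def predict_handler(event, context=None):
    """FaaS-style inference handler: warm calls pay only inference cost."""
    features = event["features"]
    return {"prediction": MODEL(features)}
```

The first invocation of a fresh container pays the load cost; every subsequent call on that container reuses `MODEL`, so the steady-state latency is inference only.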
