Overview

The emergence of the computing continuum, as cloud boundaries expand into fog, edge, and IoT paradigms, has created unprecedented opportunities for transformative and intelligent digital solutions. New application domains, such as real-time large language model (LLM) inference, federated learning, and multimodal media processing, demand scalable, low-latency, and adaptive processing across distributed, heterogeneous infrastructures. While the geo-distributed and federated nature of continuum computing improves agility, responsiveness, and service quality, it also introduces significant challenges in scalability, resilience, energy efficiency, and carbon awareness. Dynamic service deployment and rapidly evolving operational requirements call for a fundamental rethinking of infrastructure and software stacks across edge-cloud layers. Managing such large-scale, heterogeneous, and performance-critical environments through manual intervention is no longer feasible. Instead, new paradigms, including serverless computing, edge-native orchestration frameworks, and AI-driven decision logic, are essential for unlocking the potential of intelligent, resilient, and sustainable continuum systems.

Recent advances in modular runtimes such as WebAssembly and microVMs, carbon-aware scheduling leveraging real-time telemetry, and learning-based orchestration strategies promise to accelerate this transformation. In parallel, the rise of hybrid and distributed LLM inference pipelines pushes the frontier further, requiring novel runtime systems, efficient model partitioning and distillation techniques, and fine-grained orchestration across compute tiers to meet strict latency, accuracy, and energy objectives. Building on this momentum, ScaleSys 2025 aims to bring together researchers, developers, and practitioners from academia and industry to present their latest research and experiences at the intersection of systems and AI. The workshop aspires to provide a forum for exchanging ideas and advancing the state of the art in scalable, intelligent, and sustainable computing across the edge–cloud continuum. By fostering this dialogue, ScaleSys 2025 seeks to help define open standards, reusable benchmarks, and reproducible experimental frameworks that propel forward both “systems for AI” and “AI for systems” research in this space.

Topics

ScaleSys 2025 welcomes original contributions that address system-level challenges in enabling intelligent, resilient, and/or sustainable computing across the continuum. We seek high-quality papers presenting new ideas, architectures, methods, algorithms, benchmarks, or practical deployments. Topics of interest include, but are not limited to:

  • Interoperable and scalable architectures for distributed and IoT-aware computing
  • Edge and cloud-based multimedia streaming, networking, and processing
  • Serverless and microservice execution models for modular and distributed computation
  • Serverless function/workflow management on the computing continuum
  • Backend-as-a-Service (BaaS) integration and abstraction layers for decentralized deployments
  • Carbon-aware scheduling and telemetry-driven function/workflow placement
  • Edge-native AI systems for constrained, low-latency inference and on-device adaptation
  • LLM inference across the continuum using partitioning, scheduling, and distillation
  • Orchestration of AI Agents across the computing continuum
  • WebAssembly and microVM runtimes for cross-platform deployment
  • Learning-based and control-theoretic computing continuum orchestration techniques
  • Cross-tier model partitioning and collaborative execution
  • Secure and isolated multi-tenant execution environments
  • Caching, data sharing, and tiered storage architectures between edge and cloud
  • Real-time observability, monitoring, and telemetry integration
  • Experiment reproducibility, benchmarking, and evaluation on open continuum testbeds and simulators 

For full submission details, please see the Call for Papers.

Contact

For any questions regarding the workshop or submissions, please contact the organizers.