IOTA 5101: Fog/Edge/Cloud Computing for IoT - Spring 2021

Instructor: Prof. Songze Li

Course objectives:

The goal of this course is to familiarize students with the architectures of Fog/Edge/Cloud computing systems that support Internet-of-Things (IoT) applications, to understand the challenges and open problems that researchers and developers currently face in building these systems, and to provide the essential background for tackling those challenges. The course serves as an introductory postgraduate course on state-of-the-art research problems in improving the efficiency, robustness, privacy, and scalability of modern IoT computing systems, and on novel techniques from information/coding theory, optimization, and cryptography for solving them.

Course materials:

Topics covered include coded storage/caching, coded computing for reducing tail latency and bandwidth consumption, security and privacy in distributed machine learning and federated learning, mobile edge computing and computation offloading, and blockchain systems.

Prerequisites:

Prior knowledge of probability is needed (or can be acquired during the course). Background in information/coding theory, learning theory, and optimization is preferred.

Grading:

Class schedule:

Lec. # Date Topics References
1 Feb. 1
  • Introduction to IoT infrastructures and applications
  • IoT computing paradigms
  1. IoT market
  2. IoT spending
  3. Foundations and evolution of modern computing paradigms: Cloud, IoT, edge, and fog
  4. Fog and IoT: An overview of research opportunities
2 Feb. 8
  • Information measures
  • Introduction to coding theory
  1. Elements of Information Theory, 2nd edition
  2. Error Control Coding, 2nd edition
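As a minimal taste of the information measures in these readings (not part of the assigned material), the sketch below computes the Shannon entropy of a discrete distribution in bits:

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum_x p(x) log2 p(x), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of uncertainty; a biased coin carries less.
print(entropy([0.5, 0.5]))  # 1.0
print(entropy([0.9, 0.1]))  # about 0.469
```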
3 Feb. 22
  • Distributed file systems
  • Coded storage
  1. The Google file system
  2. HDFS Architecture
  3. Saving capacity with HDFS RAID
  4. Network coding for distributed storage systems
  5. Optimal Exact-Regenerating Codes for Distributed Storage at the MSR and MBR Points via a Product-Matrix Construction
  6. A Family of Optimal Locally Recoverable Codes
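A minimal sketch of the coded-storage idea behind single-parity schemes such as HDFS RAID (illustrative only, not the actual HDFS implementation): store k data blocks plus one XOR parity block, so that any single lost block can be rebuilt from the survivors.

```python
from functools import reduce

def xor_parity(blocks):
    """XOR of equal-length data blocks, as in single-parity RAID / HDFS RAID."""
    return reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), blocks)

def recover(surviving_blocks, parity):
    """Rebuild the one missing data block: parity XOR (all surviving blocks)."""
    return xor_parity(surviving_blocks + [parity])

data = [b"ab", b"cd", b"ef"]          # three data blocks
p = xor_parity(data)
# Lose data[1]; recover it from the other data blocks plus the parity.
assert recover([data[0], data[2]], p) == data[1]
```

Regenerating codes and locally recoverable codes (references 4-6) generalize this to tolerate multiple failures with less repair traffic.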
4 Mar. 1
  • Coded computing for straggler mitigation
  1. Speeding up distributed machine learning using codes
  2. Polynomial Codes: an Optimal Design for High-Dimensional Coded Matrix Multiplication
  3. Lagrange coded computing: Optimal design for resiliency, security, and privacy
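The straggler-mitigation idea in these readings can be sketched in a few lines (a toy example, not any paper's exact construction): a (3, 2) MDS-coded matrix-vector multiply assigns a redundant parity task so that any 2 of 3 worker results suffice, letting the master ignore one slow worker.

```python
import numpy as np

# Split A row-wise into A1, A2 and add a parity task (A1 + A2) @ x.
rng = np.random.default_rng(0)
A = rng.integers(0, 10, size=(4, 3))
x = rng.integers(0, 10, size=3)

A1, A2 = A[:2], A[2:]
tasks = {"w1": A1 @ x, "w2": A2 @ x, "w3": (A1 + A2) @ x}

# Suppose worker w2 straggles: decode its share from w1 and w3 alone.
decoded_A2x = tasks["w3"] - tasks["w1"]
assert np.array_equal(np.concatenate([tasks["w1"], decoded_A2x]), A @ x)
```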
5 Mar. 8
  • MapReduce framework
  • Coded computing for bandwidth reduction
  1. Network information flow
  2. An Algebraic Approach to Network Coding
  3. Fundamental limits of caching
  4. MapReduce: simplified data processing on large clusters
  5. A fundamental tradeoff between computation and communication in distributed computing
  6. Coded TeraSort
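The bandwidth-reduction idea behind coded caching and coded multicasting (a toy two-user illustration, not the general scheme): when each user already caches the file the other one wants, a single XOR-ed broadcast serves both requests, halving the delivery load.

```python
# User 1 wants file A and has B cached; user 2 wants B and has A cached.
A, B = b"filedataA", b"filedataB"
broadcast = bytes(a ^ b for a, b in zip(A, B))  # one transmission serves both

# Each user XORs the broadcast with its cached file to decode its request.
assert bytes(x ^ b for x, b in zip(broadcast, B)) == A  # user 1 recovers A
assert bytes(x ^ a for x, a in zip(broadcast, A)) == B  # user 2 recovers B
```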
6 Mar. 15
  • Distributed machine learning
  1. Understanding Machine Learning: From Theory to Algorithms, 1st edition
  2. Convex Optimization: Algorithms and Complexity
  3. Optimization Methods for Large-Scale Machine Learning
  4. Gradient Coding: Avoiding Stragglers in Distributed Learning
  5. Hogwild!: A Lock-Free Approach to Parallelizing Stochastic Gradient Descent
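The core trick of gradient coding (following the 3-worker example from the Tandon et al. reading; the coefficients below are that paper's small illustrative scheme): each worker holds 2 of 3 data partitions and sends one coded combination of partial gradients, so any 2 of the 3 messages decode the full gradient, tolerating one straggler.

```python
import numpy as np

g1, g2, g3 = np.array([1., 2.]), np.array([3., 4.]), np.array([5., 6.])
full = g1 + g2 + g3

w1 = g1 / 2 + g2   # worker 1 holds partitions 1 and 2
w2 = g2 - g3       # worker 2 holds partitions 2 and 3
w3 = g1 / 2 + g3   # worker 3 holds partitions 1 and 3

# Any two messages decode the full gradient with fixed coefficients:
assert np.allclose(w1 + w3, full)       # worker 2 straggles
assert np.allclose(2 * w1 - w2, full)   # worker 3 straggles
assert np.allclose(w2 + 2 * w3, full)   # worker 1 straggles
```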
7 Mar. 22
  • Mobile edge computing
  1. A Survey on Mobile Edge Computing: The Communication Perspective
  2. Communicating while computing: Distributed mobile cloud computing over 5G heterogeneous networks
  3. BottleNet++: An End-to-End Approach for Feature Compression in Device-Edge Co-Inference Systems
  4. Learning Task-Oriented Communication for Edge Inference: An Information Bottleneck Approach
  5. Communication-Computation Trade-off in Resource-Constrained Edge Inference
8 Apr. 12
  • Secure and private computing
  1. Communication Theory of Secrecy Systems
  2. A Graduate Course in Applied Cryptography
  3. How to share a secret
  4. Completeness theorems for non-cryptographic fault-tolerant distributed computation
  5. Lagrange coded computing: Optimal design for resiliency, security, and privacy
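A compact sketch of Shamir's scheme from "How to share a secret" (the field size and parameters below are illustrative, not cryptographically sized): the secret is the constant term of a random degree-(t-1) polynomial over a prime field, and any t shares recover it by Lagrange interpolation at zero.

```python
import random

P = 2**13 - 1  # small prime field (8191); real deployments use a large prime

def share(secret, n, t):
    """Evaluate a random degree-(t-1) polynomial with constant term `secret`."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    return [(i, sum(c * pow(i, k, P) for k, c in enumerate(coeffs)) % P)
            for i in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over GF(P)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P  # Fermat inverse
    return secret

shares = share(1234, n=5, t=3)
assert reconstruct(shares[:3]) == 1234   # any 3 of 5 shares suffice
assert reconstruct(shares[1:4]) == 1234
```

Fewer than t shares reveal nothing about the secret, which is the basis of the BGW-style secure computation covered in reference 4.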
9 Apr. 19
  • Federated learning
  1. Communication-efficient learning of deep networks from decentralized data
  2. Advances and open problems in federated learning
  3. Federated learning: Challenges, methods, and future directions
  4. The algorithmic foundations of differential privacy
  5. Differentially Private Federated Learning: A Client Level Perspective
  6. Deep leakage from gradients
  7. Practical secure aggregation for privacy-preserving machine learning
  8. FedML: A research library and benchmark for federated machine learning
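The FedAvg algorithm from the first reading can be sketched on a toy least-squares problem (a minimal illustration with made-up data, not the paper's experimental setup): each client runs local gradient steps on its own data, then the server averages the client models weighted by dataset size.

```python
import numpy as np

def local_sgd(w, X, y, lr=0.1, epochs=5):
    """A client's local update: full-batch gradient steps on its own data."""
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # least-squares gradient
        w = w - lr * grad
    return w

def fedavg_round(w_global, clients):
    """Server step: weighted average of locally updated models."""
    total = sum(len(y) for _, y in clients)
    return sum(len(y) / total * local_sgd(w_global.copy(), X, y)
               for X, y in clients)

rng = np.random.default_rng(0)
w_true = np.array([1.0, -2.0])
clients = []
for _ in range(4):
    X = rng.normal(size=(50, 2))
    clients.append((X, X @ w_true))           # noiseless labels per client

w = np.zeros(2)
for _ in range(30):
    w = fedavg_round(w, clients)
assert np.allclose(w, w_true, atol=1e-2)      # global model recovers w_true
```

The privacy readings (5-7) then ask what the server can infer from these client updates, and how secure aggregation or differential privacy limits that leakage.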
10 Apr. 26
  • Distributed consensus
  1. The Byzantine Generals Problem
  2. Impossibility of distributed consensus with one faulty process
  3. Consensus in the presence of partial synchrony
  4. Paxos made simple
  5. Practical Byzantine fault tolerance
  6. Streamlet: Textbook Streamlined Blockchains
11 May 3
  • Blockchains
  1. Foundations of Distributed Consensus and Blockchains
  2. Bitcoin: A peer-to-peer electronic cash system
  3. Analysis of Nakamoto Consensus
  4. Secure high-rate transaction processing in bitcoin
  5. Hybrid consensus: Efficient consensus in the permissionless model