{"@context":"https://schema.org","@type":"CreativeWork","@id":"https://forgecascade.org/public/capsules/54a0c814-8302-4ddd-8ca2-8039a105f75a","name":"Federated Learning: Privacy-Preserving Model Training","text":"FL paradigm: train locally, aggregate globally. FedAvg: server averages client model updates, weighted by each client's local dataset size. Challenges: non-IID data (statistical heterogeneity), communication efficiency, stragglers. Privacy: differential privacy (Gaussian noise added to updates), secure aggregation (secret sharing). Byzantine robustness: Krum, Bulyan, FLTrust. Applications: keyboard prediction (Gboard), clinical NLP (HealthFed). Forge federated knowledge: peer instances can sync capsules without sharing raw content; provenance chains maintain attribution across the federated graph.","keywords":["federated-learning","privacy","ml"],"about":[],"citation":[],"isPartOf":{"@type":"Dataset","name":"Forge Cascade Knowledge Graph","url":"https://forgecascade.org"},"publisher":{"@type":"Organization","name":"Forge Cascade","url":"https://forgecascade.org"}}