

Introductory Federated Learning Experiments, Compiled from "A Field Guide to Federated Optimization"

Published: 2023/12/20

A Field Guide to Federated Optimization

This post collects the introductory federated learning experiments that I compiled from the paper "A Field Guide to Federated Optimization".

Author Information

Jianyu Wang, Carnegie Mellon University. The paper was completed during an internship at Google, so it arguably represents Google's most recent thinking on federated learning.

Abstract

Federated learning and analytics are distributed approaches for collaboratively learning models (or simple statistics) from decentralized data, designed with privacy protection in mind. This distributed learning process can be formulated as solving a federated optimization problem, which emphasizes communication efficiency, data heterogeneity, compatibility with privacy and system requirements, and other constraints that are not primary considerations in other problem settings (I take this to refer to the issues that arise when moving from centralized training to a distributed environment). Through concrete examples and practical implementations, the paper offers recommendations and guidelines for formulating, designing, evaluating, and analyzing federated optimization algorithms, with an emphasis on running effective simulations to infer real-world performance. The goal of the work is not to survey the current literature, but to inspire researchers and practitioners to design federated learning algorithms that can be used in a wide range of practical applications.
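As a quick reference (my own summary, using the notation standard in the federated optimization literature, not copied verbatim from the paper), the problem the abstract describes is usually written as:

```latex
\min_{x \in \mathbb{R}^d} \; F(x) = \sum_{i=1}^{M} p_i \, F_i(x),
\qquad
F_i(x) = \mathbb{E}_{\xi \sim \mathcal{D}_i}\!\left[ f_i(x; \xi) \right],
```

where $M$ is the number of clients, the weights $p_i \ge 0$ sum to one (often proportional to client dataset sizes), and $\mathcal{D}_i$ is client $i$'s local data distribution, which may differ across clients (data heterogeneity).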

My Experiments

This section records the experiments I found in the paper that are suitable for the introductory stage of federated learning, so that one can get into hands-on study as quickly as possible.

Experiment 1: Client Update Rule

Jianyu Wang, Qinghua Liu, Hao Liang, Gauri Joshi, and H Vincent Poor. Tackling the objective inconsistency problem in heterogeneous federated optimization. In Advances in Neural Information Processing Systems (NeurIPS), 2020.

Jianyu Wang, Zheng Xu, Zachary Garrett, Zachary Charles, Luyang Liu, and Gauri Joshi. Local adaptivity in federated learning: Convergence and consistency. arXiv preprint arXiv:2106.02305, 2021.

Honglin Yuan and Tengyu Ma. Federated accelerated stochastic gradient descent. In Advances in Neural Information Processing Systems (NeurIPS), 2020.
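To make the theme of these papers concrete, here is a minimal sketch of the baseline client update rule they all start from: a client runs a few local SGD steps from the current global model and returns a model delta. This is my own toy example on a least-squares objective, not any of the papers' actual algorithms (FedNova, for instance, additionally normalizes the delta by the number of local steps).

```python
import numpy as np

def local_gradient(w, X, y):
    # Gradient of the mean-squared error (1/2n)||Xw - y||^2 on this client's data.
    return X.T @ (X @ w - y) / len(y)

def client_update(w_global, X, y, lr=0.1, local_steps=5):
    # FedAvg-style client update: a few plain SGD steps starting from
    # the global model, returning the accumulated model delta.
    w = w_global.copy()
    for _ in range(local_steps):
        w -= lr * local_gradient(w, X, y)
    return w - w_global

# Toy client data (hypothetical, for illustration only).
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true
delta = client_update(np.zeros(3), X, y)
```

With a small enough learning rate, each local step decreases the client's loss, so the delta points the global model toward this client's optimum.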

Experiment 2: Global Update Rule

Tzu-Ming Harry Hsu, Hang Qi, and Matthew Brown. Measuring the effects of non-identical data distribution for federated visual classification. arXiv preprint arXiv:1909.06335, 2019.

Sashank J. Reddi, Zachary Charles, Manzil Zaheer, Zachary Garrett, Keith Rush, Jakub Konečný, Sanjiv Kumar, and Hugh Brendan McMahan. Adaptive federated optimization. In International Conference on Learning Representations, 2021. URL https://openreview.net/forum?id=LkFG3lB13U5.

Jianyu Wang, Vinayak Tantia, Nicolas Ballas, and Michael Rabbat. SlowMo: Improving communication-efficient distributed SGD with slow momentum. In International Conference on Learning Representations, 2020. URL https://openreview.net/forum?id=SkxJ8REYPH.
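The common thread in these papers is to treat the averaged client delta as a "pseudo-gradient" and apply a richer server-side optimizer to it. The sketch below shows plain server momentum (in the spirit of FedAvgM / SlowMo); it is my illustration under that framing, not code from the papers. The adaptive variants of Reddi et al. would replace the momentum buffer with Adam-style first and second moment estimates.

```python
import numpy as np

def server_update(w, avg_delta, state, server_lr=1.0, beta=0.9):
    # Treat the averaged client delta as a pseudo-gradient and apply
    # server-side momentum: m <- beta * m + avg_delta, w <- w + lr * m.
    state["m"] = beta * state["m"] + avg_delta
    return w + server_lr * state["m"], state

state = {"m": np.zeros(2)}
w = np.zeros(2)
w, state = server_update(w, np.ones(2), state)   # first round
w, state = server_update(w, np.ones(2), state)   # second round
```

Setting `beta=0` recovers vanilla FedAvg (the server simply adds the averaged delta).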

Experiment 3: Aggregation Method

Chaoyang He, Murali Annavaram, and Salman Avestimehr. Group knowledge transfer: Federated learning of large CNNs at the edge. Advances in Neural Information Processing Systems, 33, 2020.

Tao Lin, Lingjing Kong, Sebastian U Stich, and Martin Jaggi. Ensemble distillation for robust model fusion in federated learning. Advances in Neural Information Processing Systems, 33, 2020.
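For context, the baseline these papers improve on is plain example-weighted averaging of client updates. A minimal sketch (my own, for illustration): the distillation-based methods above replace this parameter-space average with aggregation of model outputs on a transfer set.

```python
import numpy as np

def aggregate(deltas, num_examples):
    # Standard FedAvg aggregation: average client deltas weighted by
    # each client's number of training examples.
    weights = np.asarray(num_examples, dtype=float)
    weights /= weights.sum()
    return sum(w * d for w, d in zip(weights, deltas))

# Two hypothetical client deltas, with client 2 holding 3x the data.
agg = aggregate([np.array([1.0, 1.0]), np.array([3.0, 3.0])], [1, 3])
```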

Experiment 4: Personalized Model

Fei Chen, Mi Luo, Zhenhua Dong, Zhenguo Li, and Xiuqiang He. Federated meta-learning with fast convergence and efficient communication. arXiv preprint arXiv:1802.07876, 2018.

Alireza Fallah, Aryan Mokhtari, and Asuman Ozdaglar. Personalized federated learning: A meta-learning approach. In Advances in Neural Information Processing Systems, 2020.

Yihan Jiang, Jakub Koneˇcn′y, Keith Rush, and Sreeram Kannan. Improving federated learning personalization via model agnostic meta learning. arXiv preprint arXiv:1909.12488, 2019.

Jeffrey Li, Mikhail Khodak, Sebastian Caldas, and Ameet Talwalkar. Differentially private meta-learning. In International Conference on Learning Representations, 2020.
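The simplest personalization baseline these meta-learning papers compare against is local fine-tuning: start from the converged global model and run a few gradient steps on the client's own data (the "FedAvg + local adaptation" view that Jiang et al. relate to MAML). A toy sketch on a least-squares objective, mine rather than the papers':

```python
import numpy as np

def personalize(w_global, X, y, lr=0.05, steps=10):
    # Fine-tune the global model on one client's local data to get a
    # personalized model; only this client keeps the result.
    w = w_global.copy()
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)  # local MSE gradient
        w -= lr * grad
    return w

# Hypothetical client whose optimum differs from the global model.
rng = np.random.default_rng(1)
X = rng.normal(size=(20, 3))
y = X @ np.array([2.0, 0.0, -1.0])
w_personal = personalize(np.zeros(3), X, y)
```

The meta-learning methods above go further by training the global model so that this fine-tuning step works well after very few gradient steps.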

Experiment 5: Multi-Task Learning

Canh T Dinh, Nguyen H Tran, and Tuan Dung Nguyen. Personalized federated learning with moreau envelopes. In Advances in Neural Information Processing Systems, 2020.

Theodoros Evgeniou and Massimiliano Pontil. Regularized multi-task learning. In International Conference on Knowledge Discovery and Data Mining, 2004.

Filip Hanzely and Peter Richtárik. Federated learning of a mixture of global and local models. arXiv preprint arXiv:2002.05516, 2020.

Tian Li, Shengyuan Hu, Ahmad Beirami, and Virginia Smith. Ditto: Fair and robust federated learning through personalization. In International Conference on Machine Learning, 2021.

Virginia Smith, Chao-Kai Chiang, Maziar Sanjabi, and Ameet S Talwalkar. Federated multi-task learning. In Advances in Neural Information Processing Systems, 2017.
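A recurring idea in this line of work (Ditto, pFedMe, Hanzely and Richtárik) is to keep each client's personalized model close to the global one via an L2 proximal term. The single update step below is my own simplified sketch of that idea, not the exact update from any of these papers:

```python
import numpy as np

def proximal_step(w_local, w_global, grad_local, lr=0.1, lam=0.5):
    # One personalized-model step: follow the local gradient, but pull the
    # personalized model toward the global model via the gradient of the
    # proximal penalty (lam / 2) * ||w_local - w_global||^2.
    return w_local - lr * (grad_local + lam * (w_local - w_global))

w_new = proximal_step(np.array([1.0]), np.array([0.0]), np.array([2.0]))
```

With `lam=0` this reduces to purely local training; large `lam` forces all clients back onto the shared global model, so `lam` interpolates between the two extremes.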

Other Notes

To be honest, research is really hard, emmm.
