

Today's arXiv Picks | 15 Latest ICCV 2021 Papers

Published: 2024/10/8

About #Today's arXiv Picks

This is a column under "AI Academic Frontier". Each day the editor selects high-quality papers from arXiv and shares them with readers.

Image Inpainting Applied to Art: Completing Escher's Print Gallery

Comment: Abstract submitted to the LatinX workshop at ICML 2021

Link: http://arxiv.org/abs/2109.02536

Abstract

This extended abstract presents the first stages of research on inpainting suited for art reconstruction. We introduce M.C. Escher's Print Gallery lithography as a use-case example. This artwork presents a void at its center and, additionally, follows a challenging mathematical structure that needs to be preserved by the inpainting method. We present our work so far and our future line of research.

Statistical Privacy Guarantees of Machine Learning Preprocessing Techniques

Comment: Accepted to the ICML 2021 Theory and Practice of Differential Privacy Workshop

Link: http://arxiv.org/abs/2109.02496

Abstract

Differential privacy provides strong privacy guarantees for machine learning applications. Much recent work has focused on developing differentially private models; however, there has been a gap in other stages of the machine learning pipeline, in particular during the preprocessing phase. Our contributions are twofold: we adapt a privacy-violation detection framework based on statistical methods to empirically measure the privacy levels of machine learning pipelines, and we apply the newly created framework to show that resampling techniques used when dealing with imbalanced datasets cause the resultant model to leak more privacy. These results highlight the need for developing private preprocessing techniques.
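The resampling issue the abstract describes can be made concrete with a minimal sketch (not the paper's actual pipeline; the function name and toy data are assumptions for illustration): naive random oversampling balances classes by duplicating real minority-class records verbatim, and those repeated copies give a trained model more opportunities to memorize, and therefore leak, individual training points.

```python
import random
from collections import Counter

def random_oversample(samples, labels, seed=0):
    """Naive random oversampling: duplicate minority-class records until
    every class matches the majority-class count. Each duplicate is an
    exact copy of a real record -- the mechanism by which resampling can
    amplify what a trained model leaks about individuals."""
    rng = random.Random(seed)
    counts = Counter(labels)
    target = max(counts.values())
    out_x, out_y = list(samples), list(labels)
    for cls, n in counts.items():
        pool = [x for x, y in zip(samples, labels) if y == cls]
        for _ in range(target - n):
            out_x.append(rng.choice(pool))  # verbatim duplicated record
            out_y.append(cls)
    return out_x, out_y

# Toy imbalanced dataset: three class-0 records, one class-1 record.
X = [0.1, 0.2, 0.3, 0.9]
y = [0, 0, 0, 1]
Xr, yr = random_oversample(X, y)
# The single class-1 record 0.9 now appears three times in Xr.
```

A differentially private preprocessing step would instead need to bound how much any single record can influence the resampled dataset.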

On Second-order Optimization Methods for Federated Learning

Comment: ICML 2021 Workshop "Beyond first-order methods in ML systems"

Link: http://arxiv.org/abs/2109.02388

Abstract

We consider federated learning (FL), where the training data is distributed across a large number of clients. The standard optimization method in this setting is Federated Averaging (FedAvg), which performs multiple local first-order optimization steps between communication rounds. In this work, we evaluate the performance of several second-order distributed methods with local steps in the FL setting, which promise to have favorable convergence properties. We (i) show that FedAvg performs surprisingly well against its second-order competitors when evaluated under fair metrics (an equal amount of local computation), in contrast to the results of previous work. Based on our numerical study, we propose (ii) a novel variant that uses second-order local information for updates and a global line search to counteract the resulting local specificity.
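The baseline the abstract compares against, FedAvg, is simple to sketch: each round, every client runs a few local first-order (SGD) steps from the current global model, then the server averages the client models. The sketch below is an illustrative toy on one-dimensional linear regression, not the paper's experimental setup; the function name and hyperparameters are assumptions.

```python
import numpy as np

def fedavg(client_data, rounds=10, local_steps=5, lr=0.1):
    """Minimal FedAvg sketch for 1-D linear regression y ~ w * x.

    Per round: each client takes `local_steps` local SGD steps on its own
    MSE loss starting from the global weight; the server then averages
    the resulting client weights (the FedAvg aggregation step)."""
    w = 0.0  # global model: a single scalar weight
    for _ in range(rounds):
        local_ws = []
        for X, y in client_data:
            w_local = w
            for _ in range(local_steps):
                grad = 2 * np.mean((w_local * X - y) * X)  # d/dw of mean((wx - y)^2)
                w_local -= lr * grad
            local_ws.append(w_local)
        w = float(np.mean(local_ws))  # server averages client models
    return w

# Toy data: every client's data satisfies y = 3x, so FedAvg should recover w near 3.
rng = np.random.default_rng(0)
clients = [(X := rng.normal(size=20), 3.0 * X) for _ in range(4)]
w = fedavg(clients)
```

Second-order variants such as those the paper evaluates would replace the local SGD step with an update that also uses curvature information, at a higher per-step cost, which is why the paper insists on comparing under an equal local-computation budget.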

