[1] T.-H. Chang, M. Hong, H.-T. Wai, X. Zhang, and S. Lu, "Distributed learning in the nonconvex world: From batch data to streaming and beyond," IEEE Signal Processing Magazine, vol. 37, no. 3, pp. 26-38, 2020. [2] Y. Wang, Y. Xu, Q. Shi, and T.-H. Chang, "Quantized federated learning under transmission delay and outage constraints," IEEE Journal on Selected Areas in Communications, vol. 40, no. 1, pp. 323-341, 2021. [3] S. Wang and T.-H. Chang, "Federated matrix factorization: Algorithm design and application to data clustering," IEEE Trans. Signal Processing, vol. 70, pp. 1625-1640, 2022. [4] H. B. McMahan, E. Moore, D. Ramage, and B. A. y Arcas, "Federated learning of deep networks using model averaging," arXiv preprint arXiv:1602.05629, 2016. [5] X. Li, K. Huang, W. Yang, S. Wang, and Z. Zhang, "On the convergence of FedAvg on non-IID data," in Proc. International Conference on Learning Representations (ICLR), 2020, pp. 1-26. [6] S. Wang, Y. Xu, Y. Yuan, and T. Q. Quek, "Towards fast personalized semi-supervised federated learning in edge networks: Algorithm design and theoretical guarantee," IEEE Trans. Wireless Communications, pp. 1-14, 2023. [7] M. Fredrikson, S. Jha, and T. Ristenpart, "Model inversion attacks that exploit confidence information and basic countermeasures," in Proc. ACM SIGSAC Conference on Computer and Communications Security, 2015, pp. 1322-1333. [8] J. Geiping, H. Bauermeister, H. Dröge, and M. Moeller, "Inverting gradients - how easy is it to break privacy in federated learning?" in Proc. Neural Information Processing Systems (NIPS), 2020, pp. 937-947. [9] C. Dwork and A. Roth, "The algorithmic foundations of differential privacy," Foundations and Trends in Theoretical Computer Science, vol. 9, no. 3-4, pp. 211-407, 2014. [10] E. Bagdasaryan, A. Veit, Y. Hua, D. Estrin, and V. Shmatikov, "How to backdoor federated learning," in Proc. International Conference on Artificial Intelligence and Statistics, 2020, pp. 2938-2948. [11] Z. Wang, M. Song, Z.
Zhang, Y. Song, Q. Wang, and H. Qi, "Beyond inferring class representatives: User-level privacy leakage from federated learning," in Proc. IEEE International Conference on Computer Communications (INFOCOM), 2019, pp. 2512-2520. [12] C. Ma, J. Li, M. Ding, H. H. Yang, F. Shu, T. Q. Quek, and H. V. Poor, "On safeguarding privacy and security in the framework of federated learning," IEEE Network, vol. 34, pp. 242-248, 2020. [13] H. B. McMahan, E. Moore, D. Ramage, S. Hampson, and B. A. y Arcas, "Communication-efficient learning of deep networks from decentralized data," in Proc. International Conference on Machine Learning (ICML), 2017, pp. 1-10. [14] H. Yu, S. Yang, and S. Zhu, "Parallel restarted SGD with faster convergence and less communication: Demystifying why model averaging works," in Proc. AAAI Conference on Artificial Intelligence, 2019, pp. 5693-5700. [15] B. McMahan, E. Moore, D. Ramage, and S. Hampson, "Communication-efficient learning of deep networks from decentralized data," in Proc. International Conference on Artificial Intelligence and Statistics, 2017, pp. 1273-1282. [16] M. Hong, Z.-Q. Luo, and M. Razaviyayn, "Convergence analysis of alternating direction method of multipliers for a family of nonconvex problems," SIAM Journal on Optimization, vol. 26, no. 1, pp. 337-364, 2016. [17] D. Hajinezhad, M. Hong, T. Zhao, and Z. Wang, "NESTT: A nonconvex primal-dual splitting method for distributed and stochastic optimization," in Proc. Advances in Neural Information Processing Systems (NIPS), 2016, pp. 3207-3215. [18] S. Zhou and G. Y. Li, "Federated learning via inexact ADMM," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 45, no. 8, pp. 9699-9708, 2023. [19] Y. Li, C.-W. Huang, S. Wang, C.-Y. Chi, and T. Q. S. Quek, "Privacy-preserving federated primal-dual learning for non-convex problems with non-smooth regularization," in Proc. IEEE International Workshop on Machine Learning for Signal Processing (MLSP), 2023, pp. 1-6. [20] Y. Li, S.
Wang, T.-H. Chang, and C.-Y. Chi, "Federated stochastic primal-dual learning with differential privacy," arXiv preprint arXiv:2204.12284, 2022. [21] S. P. Karimireddy, S. Kale, M. Mohri, S. Reddi, S. Stich, and A. T. Suresh, "SCAFFOLD: Stochastic controlled averaging for federated learning," in Proc. International Conference on Machine Learning (ICML), 2020, pp. 5132-5143. [22] X. Zhang, M. Hong, S. Dhople, W. Yin, and Y. Liu, "FedPD: A federated learning framework with adaptivity to non-IID data," IEEE Trans. Signal Processing, vol. 69, pp. 6055-6070, 2021. [23] A. K. Sahu, T. Li, M. Sanjabi, M. Zaheer, A. Talwalkar, and V. Smith, "On the convergence of federated optimization in heterogeneous networks," arXiv preprint arXiv:1812.06127, 2018. [24] W. Shi, Q. Ling, K. Yuan, G. Wu, and W. Yin, "On the linear convergence of the ADMM in decentralized consensus optimization," IEEE Trans. Signal Processing, vol. 62, pp. 1750-1761, 2014. [25] A. Makhdoumi and A. Ozdaglar, "Convergence rate of distributed ADMM over networks," IEEE Trans. Automatic Control, vol. 62, pp. 5082-5095, 2017. [26] M. Hong and T.-H. Chang, "Stochastic proximal gradient consensus over random networks," IEEE Trans. Signal Processing, vol. 65, pp. 2933-2948, 2017. [27] C. Dwork, K. Kenthapadi, F. McSherry, I. Mironov, and M. Naor, "Our data, ourselves: Privacy via distributed noise generation," in Proc. Annual International Conference on the Theory and Applications of Cryptographic Techniques, 2006, pp. 486-503. [28] P. Mohassel and Y. Zhang, "SecureML: A system for scalable privacy-preserving machine learning," in Proc. IEEE Symposium on Security and Privacy, 2017, pp. 19-38. [29] I. Giacomelli, S. Jha, M. Joye, C. D. Page, and K. Yoon, "Privacy-preserving ridge regression with only linearly homomorphic encryption," in Proc. International Conference on Applied Cryptography and Network Security, 2018, pp. 243-261. [30] K. Bonawitz, V. Ivanov, B. Kreuter, A. Marcedone, H. B. McMahan, S. Patel, D.
Ramage, A. Segal, and K. Seth, "Practical secure aggregation for privacy-preserving machine learning," in Proc. ACM SIGSAC Conference on Computer and Communications Security, 2017, pp. 1175-1191. [31] P. Vepakomma, T. Swedish, R. Raskar, O. Gupta, and A. Dubey, "No peek: A survey of private distributed deep learning," arXiv preprint arXiv:1812.03288, 2018. [32] Y. Li, S. Wang, C.-Y. Chi, and T. Q. Quek, "Differentially private federated learning in edge networks: The perspective of noise reduction," IEEE Network, vol. 36, no. 5, pp. 167-172, 2022. [33] P. Kairouz, H. B. McMahan, B. Avent, A. Bellet et al., "Advances and open problems in federated learning," Foundations and Trends in Machine Learning, vol. 14, pp. 1-210, 2021. [34] A. Triastcyn and B. Faltings, "Federated learning with Bayesian differential privacy," in Proc. IEEE International Conference on Big Data, 2019, pp. 2587-2596. [35] S. Truex, L. Liu, K.-H. Chow, M. E. Gursoy, and W. Wei, "LDP-Fed: Federated learning with local differential privacy," in Proc. ACM International Workshop on Edge Systems, Analytics and Networking, 2020, pp. 61-66. [36] Ú. Erlingsson, V. Feldman, I. Mironov, A. Raghunathan, K. Talwar, and A. Thakurta, "Amplification by shuffling: From local to central differential privacy via anonymity," in Proc. ACM-SIAM Symposium on Discrete Algorithms, 2019, pp. 2468-2479. [37] B. Balle, G. Barthe, and M. Gaboardi, "Privacy amplification by subsampling: Tight analyses via couplings and divergences," in Proc. Neural Information Processing Systems (NIPS), 2018, pp. 6277-6287. [38] S. Li, S. Hou, B. Buyukates, and S. Avestimehr, "Secure federated clustering," arXiv preprint arXiv:2205.15564, 2022. [39] X. Shen, Y. Liu, and Z. Zhang, "Performance-enhanced federated learning with differential privacy for Internet of Things," IEEE Internet of Things Journal, vol. 9, no. 23, pp. 24079-24094, 2022. [40] S. Wang, T.-H. Chang, Y. Cui, and J.-S.
Pang, "Clustering by orthogonal NMF model and non-convex penalty optimization," IEEE Trans. Signal Processing, vol. 69, pp. 5273-5288, 2021. [41] J. Ma, G. Long, T. Zhou, J. Jiang, and C. Zhang, "On the convergence of clustered federated learning," arXiv preprint arXiv:2202.06187, 2022. [42] H. Ding, Y. Liu, L. Huang, and J. Li, "K-means clustering with distributed dimensions," in Proc. International Conference on Machine Learning (ICML), 2016, pp. 1339-1348. [43] A. Ghosh, J. Hong, D. Yin, and K. Ramchandran, "Robust federated learning in a heterogeneous environment," arXiv preprint arXiv:1906.06629, 2019. [44] F. Sattler, K.-R. Müller, and W. Samek, "Clustered federated learning: Model-agnostic distributed multitask optimization under privacy constraints," IEEE Trans. Neural Networks and Learning Systems, vol. 32, no. 8, pp. 3710-3722, 2021. [45] Y. Fraboni, R. Vidal, L. Kameni, and M. Lorenzi, "Clustered sampling: Low-variance and improved representativity for clients selection in federated learning," in Proc. International Conference on Machine Learning (ICML), 2021, pp. 3407-3416. [46] D. K. Dennis, T. Li, and V. Smith, "Heterogeneity for the win: One-shot federated clustering," in Proc. International Conference on Machine Learning (ICML), 2021, pp. 2611-2620. [47] B. McMahan, E. Moore, D. Ramage, S. Hampson, and B. A. y Arcas, "Communication-efficient learning of deep networks from decentralized data," in Proc. International Conference on Artificial Intelligence and Statistics, 2017, pp. 1273-1282. [48] M. Abadi, A. Chu, I. Goodfellow, H. B. McMahan, I. Mironov, K. Talwar, and L. Zhang, "Deep learning with differential privacy," in Proc. ACM SIGSAC Conference on Computer and Communications Security, 2016, pp. 308-318. [49] Y. Li, T.-H. Chang, and C.-Y. Chi, "Secure federated averaging algorithm with differential privacy," in Proc. IEEE International Workshop on Machine Learning for Signal Processing (MLSP), 2020, pp. 1-6. [50] N. Wang, X. Xiao, Y. Yang, J. Zhao, S. C. Hui, H. Shin, J.
Shin, and G. Yu, "Collecting and analyzing multidimensional data with local differential privacy," in Proc. IEEE 35th International Conference on Data Engineering (ICDE), 2019, pp. 638-649. [51] X. Jiang, X. Zhou, and J. Grossklags, "SignDS-FL: Local differentially private federated learning with sign-based dimension selection," ACM Transactions on Intelligent Systems and Technology (TIST), vol. 13, no. 5, pp. 1-22, 2022. [52] K. Wei, J. Li, M. Ding, C. Ma, H. H. Yang et al., "Performance analysis on federated learning with differential privacy," arXiv preprint arXiv:1911.00222, 2019. [53] T. Zhang and Q. Zhu, "Dynamic differential privacy for ADMM-based distributed classification learning," IEEE Trans. Information Forensics and Security, vol. 12, no. 1, pp. 172-187, 2016. [54] X. Zhang, M. M. Khalili, and M. Liu, "Improving the privacy and accuracy of ADMM-based distributed algorithms," in Proc. International Conference on Machine Learning (ICML), 2018, pp. 5796-5805. [55] X. Zhang, M. M. Khalili, and M. Liu, "Recycled ADMM: Improve privacy and accuracy with less computation in distributed algorithms," in Proc. Annual Allerton Conference on Communication, Control, and Computing (Allerton), 2018, pp. 959-965. [56] Y. Guo and Y. Gong, "Practical collaborative learning for crowdsensing in the Internet of Things with differential privacy," in Proc. IEEE Conference on Communications and Network Security, 2018, pp. 1-9. [57] Z. Huang, R. Hu, Y. Guo, E. Chan-Tin, and Y. Gong, "DP-ADMM: ADMM-based distributed learning with differential privacy," IEEE Trans. Information Forensics and Security, vol. 15, pp. 1002-1012, 2019. [58] J. Ding, S. M. Errapotu, H. Zhang, Y. Gong, M. Pan, and Z. Han, "Stochastic ADMM based distributed machine learning with differential privacy," in Proc. Security and Privacy in Communication Networks, 2019, pp. 257-277. [59] X. Wang, H. Ishii, L. Du, P. Cheng, and J.
Chen, "Privacy-preserving distributed machine learning via local randomization and ADMM perturbation," IEEE Trans. Signal Processing, vol. 68, pp. 4226-4241, 2020. [60] B. Bahmani, B. Moseley, A. Vattani, R. Kumar, and S. Vassilvitskii, "Scalable k-means++," in Proc. VLDB Endowment, 2012, pp. 622-633. [61] T. Kucukyilmaz, "Parallel k-means algorithm for shared memory multiprocessors," Journal of Computer and Communications, vol. 2, pp. 15-23, 2014. [62] M. Ester, H.-P. Kriegel, J. Sander, and X. Xu, "A density-based algorithm for discovering clusters in large spatial databases with noise," in Proc. Knowledge Discovery and Data Mining (KDD), 1996, pp. 226-231. [63] M.-F. F. Balcan, S. Ehrlich, and Y. Liang, "Distributed k-means and k-median clustering on general topologies," in Proc. Neural Information Processing Systems (NIPS), 2013, pp. 1995-2003. [64] H. Ding, Y. Liu, L. Huang, and J. Li, "K-means clustering with distributed dimensions," in Proc. International Conference on Machine Learning (ICML), 2016, pp. 1339-1348. [65] M. Stallmann and A. Wilbik, "Towards federated clustering: A federated fuzzy c-means algorithm (FFCM)," arXiv preprint arXiv:2201.07316, 2022. [66] W. Pedrycz, "Federated FCM: Clustering under privacy requirements," IEEE Trans. Fuzzy Systems, vol. 30, no. 8, pp. 3384-3388, 2022. [67] E. Hernández-Pereira, O. Fontenla-Romero, B. Guijarro-Berdiñas, and B. Pérez-Sánchez, "Federated learning approach for spectral clustering," in Proc. European Symposium on Artificial Neural Networks, 2021, pp. 423-428. [68] C. Li, G. Li, and P. K. Varshney, "Federated learning with soft clustering," IEEE Internet of Things Journal, vol. 9, no. 10, pp. 7773-7782, 2021. [69] C. Xu, Y. Qu, Y. Xiang, and L. Gao, "Asynchronous federated learning on heterogeneous devices: A survey," arXiv preprint arXiv:2109.04269, 2021. [70] S. Wang and T.-H. Chang, "Federated matrix factorization: Algorithm design and application to data clustering," IEEE Trans. Signal Processing, vol.
70, pp. 1625-1640, 2022. [71] J. Chung, K. Lee, and K. Ramchandran, "Federated unsupervised clustering with generative models," in Proc. AAAI International Workshop on Trustable, Verifiable and Auditable Federated Learning, 2022, pp. 1-9. [72] N. Parikh and S. Boyd, "Proximal algorithms," Foundations and Trends in Optimization, vol. 1, no. 3, pp. 127-239, 2014. [73] M. Hong, D. Hajinezhad, and M.-M. Zhao, "Prox-PDA: The proximal primal-dual algorithm for fast distributed nonconvex optimization and learning over networks," in Proc. ICML, 2017, pp. 1529-1538. [74] C.-Y. Chi, W.-C. Li, and C.-H. Lin, Convex Optimization for Signal Processing and Communications: From Fundamentals to Applications. CRC Press, Boca Raton, FL, Feb. 2017. [75] H. Ouyang, N. He, L. Tran, and A. Gray, "Stochastic alternating direction method of multipliers," in Proc. International Conference on Machine Learning (ICML), 2013, pp. 80-88. [76] H. Yang, M. Fang, and J. Liu, "Achieving linear speedup with partial worker participation in non-IID federated learning," in Proc. International Conference on Learning Representations (ICLR), 2021, pp. 1-23. [77] C. L. Blake and C. J. Merz, "UCI repository of machine learning databases," 1998, Irvine, CA: University of California, Department of Information and Computer Science. [Online]. Available: http://www.ics.uci.edu/~mlearn/MLRepository.html. [78] Y. LeCun, C. Cortes, and C. Burges. The MNIST database. [Online]. Available: http://yann.lecun.com/exdb/mnist. [79] T. Li, A. K. Sahu, M. Zaheer, M. Sanjabi, A. Talwalkar, and V. Smith, "Federated optimization in heterogeneous networks," in Proc. Machine Learning and Systems, 2020, pp. 429-450. [80] B. Yang, X. Fu, and N. D. Sidiropoulos, "Learning from hidden traits: Joint factor analysis and latent clustering," IEEE Trans. Signal Processing, vol. 65, pp. 256-269, 2016. [81] D. Arthur and S. Vassilvitskii, "K-means++: The advantages of careful seeding," in Proc.
Symposium on Discrete Algorithms (SODA), 2007, pp. 1027-1035. [82] T. Li, A. K. Sahu, M. Sanjabi, M. Zaheer, A. Talwalkar, and V. Smith, "Federated optimization in heterogeneous networks," in Proc. Machine Learning and Systems, 2020, pp. 1-12. [83] S. Wang, T.-H. Chang, Y. Cui, and J.-S. Pang, "Clustering by orthogonal non-negative matrix factorization: A sequential non-convex penalty approach," in Proc. IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), 2019, pp. 5576-5580. [84] H. Kim and H. Park, "Sparse non-negative matrix factorization via alternating nonnegative-constrained least squares for microarray data analysis," Bioinformatics, vol. 23, no. 12, pp. 1495-1502, 2007. [85] W. E. Zhang, M. Tan, Q. Z. Sheng, L. Yao, and Q. Shi, "Efficient orthogonal non-negative matrix factorization over Stiefel manifold," in Proc. ACM International Conference on Information and Knowledge Management (CIKM), 2016, pp. 1743-1752. [86] K. Yu, S. Yu, and V. Tresp, "Soft clustering on graphs," in Proc. Neural Information Processing Systems (NIPS), 2005, pp. 1-8. [87] J. Konečný, H. B. McMahan, and D. Ramage, "Federated optimization: Distributed optimization beyond the datacenter," in Proc. NIPS Optimization for Machine Learning Workshop, 2015, pp. 1-5. [88] J. Konečný, H. B. McMahan, D. Ramage, and P. Richtárik, "Federated optimization: Distributed machine learning for on-device intelligence," arXiv preprint arXiv:1610.02527, 2016. [89] D. Chai, L. Wang, K. Chen, and Q. Yang, "Secure federated matrix factorization," IEEE Intelligent Systems, vol. 36, no. 5, pp. 11-20, 2020. [90] P. Tseng, "Convergence of a block coordinate descent method for nondifferentiable minimization," Journal of Optimization Theory and Applications, vol. 109, pp. 475-494, 2001. [91] R. McLendon, A. Friedman, D. Bigner et al., "Comprehensive genomic characterization defines human glioblastoma genes and core pathways," Nature, vol. 455, pp. 1061-1068, 2008. [92] J.
Bolte, S. Sabach, and M. Teboulle, "Proximal alternating linearized minimization for nonconvex and nonsmooth problems," Mathematical Programming, vol. 146, pp. 459-494, 2014.