
Detailed Record

Author (Chinese): 黃有德
Author (English): Huang, You-De
Title (Chinese): 應用本地自適應字典於異質性用戶端之分散式字典學習
Title (English): Distributed Dictionary Learning over Heterogeneous Clients using Local Adaptive Dictionaries
Advisor (Chinese): 洪樂文
Advisor (English): Hong, Yao-Win Peter
Committee Members (Chinese): 李祈均、林嘉文、方士豪
Committee Members (English): Lee, Chi-Chun; Lin, Chia-Wen; Fang, Shih-Hau
Degree: Master's
Institution: National Tsing Hua University
Department: Institute of Communications Engineering
Student ID: 106064542
Year of Publication (ROC calendar): 110 (2021)
Academic Year of Graduation: 109
Language: English
Number of Pages: 34
Keywords (Chinese): 字典學習、分散式學習、多任務學習
Keywords (English): Dictionary Learning, Distributed Learning, Multitask Learning
Abstract: This work examines the use of dictionary learning among distributed clients with heterogeneous tasks. We propose a distributed dictionary learning algorithm that enables collaborative training of a shared global dictionary among clients while adaptively constructing local dictionary elements to address the heterogeneity of the local tasks. The proposed distributed dictionary learning with local adaptive dictionaries (DDL-LAD) algorithm consists of two parts: a distributed optimization procedure that enables joint training of the dictionaries without sharing the local datasets with the server, and a splitting and elimination procedure that adaptively constructs local dictionary elements. The splitting process extracts basis elements from the global dictionary that are discriminative and specific to a particular client. These elements are split off and appended to the local dictionary in one of two ways. In the first, the total size of the global and local dictionaries is kept fixed during optimization, so splitting only shifts the ratio between the global and local dictionaries; this preserves the client-specific features while avoiding any increase in computational complexity. In the second, each basis element split into the local dictionary is replaced by a corresponding element stored at the server, which preserves the client-specific features during training and, since the global dictionary does not shrink, maintains a certain degree of information exchange. To keep the local dictionaries from growing without bound, an elimination procedure prunes elements with low usage. Experiments on a distributed EMNIST dataset demonstrate the effectiveness of the proposed DDL-LAD algorithm compared with existing schemes that adopt only a globally shared dictionary.
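The splitting and elimination procedure summarized above can be illustrated with a short sketch. The following is a minimal NumPy rendition of one client-side split-and-eliminate step under simplified, assumed criteria: atom usage is scored by the total magnitude of its sparse coefficients on the client's data, the split set is chosen with a quantile cutoff, and pruning uses a fixed relative-usage threshold. The function name and the split_quantile and elim_thresh parameters are illustrative assumptions, not the exact discriminability criterion used in the thesis.

import numpy as np

def split_and_eliminate(D_g, D_l, codes_g, codes_l,
                        split_quantile=0.9, elim_thresh=1e-3):
    """One illustrative client-side split-and-eliminate step.

    D_g     : (d, K_g) global dictionary received from the server
    D_l     : (d, K_l) client-specific local dictionary (K_l may be 0)
    codes_g : (K_g, N) sparse codes of the local data over D_g
    codes_l : (K_l, N) sparse codes of the local data over D_l
    """
    # Score each atom by how heavily this client's data uses it
    # (a stand-in for the thesis's discriminability criterion).
    use_g = np.abs(codes_g).sum(axis=1)
    use_l = np.abs(codes_l).sum(axis=1)

    # Splitting: global atoms this client relies on far more than most
    # are treated as client-specific and appended to the local dictionary.
    split_idx = np.where(use_g > np.quantile(use_g, split_quantile))[0]
    D_l = np.concatenate([D_l, D_g[:, split_idx]], axis=1)

    # Variant 1 (fixed total size): remove the split atoms from the local
    # copy of the global dictionary so that K_g + K_l stays constant.
    D_g = np.delete(D_g, split_idx, axis=1)
    # Variant 2 (server-side replacement) would instead keep K_g unchanged
    # and let the server substitute stored atoms for split_idx; that step
    # happens on the server and is omitted here.

    # Elimination: prune pre-existing local atoms whose relative usage has
    # dropped below elim_thresh, so the local dictionary cannot overgrow.
    keep_old = use_l / (use_l.sum() + 1e-12) > elim_thresh
    keep_new = np.ones(split_idx.size, dtype=bool)   # keep newly split atoms
    D_l = D_l[:, np.concatenate([keep_old, keep_new])]

    return D_g, D_l

In such a scheme, a client would run this step after its local sparse-coding and dictionary-update rounds and then report the split indices (or the reduced global dictionary) back to the server, corresponding to the two variants noted in the comments.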
Table of Contents:
1 Introduction 1
2 Related Work 4
2.1 Dictionary Learning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
2.2 Distributed Dictionary Learning . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
2.3 Multitask Dictionary Learning . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
2.4 Dynamic Size of Dictionary Learning . . . . . . . . . . . . . . . . . . . . . . . . 8
3 Problem Formulation 11
4 Distributed Dictionary Learning with Local Adaptive Dictionaries 15
4.1 Distributed Optimization of the Dictionaries . . . . . . . . . . . . . . . . . . . . . 15
4.2 Splitting and Elimination of Dictionary Elements . . . . . . . . . . . . . . . . . . 19
5 Experiment Result 24
5.1 Experimental Settings . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24
5.2 Comparison Methods . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24
5.3 Datasets . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25
5.4 Experiment Results and Discussion . . . . . . . . . . . . . . . . . . . . . . . . . . 26
6 Conclusion 30