[1] Cao, C., Zhang, Y., Wu, Y., Lu, H. & Cheng, J. Egocentric gesture recognition using recurrent 3D convolutional neural networks with spatiotemporal transformer modules. Proceedings Of The IEEE International Conference On Computer Vision. pp. 3763-3771 (2017)
[2] Frankle, J. & Carbin, M. The lottery ticket hypothesis: Finding sparse, trainable neural networks. ArXiv Preprint ArXiv:1803.03635. (2018)
[3] Frankle, J., Dziugaite, G., Roy, D. & Carbin, M. Linear mode connectivity and the lottery ticket hypothesis. International Conference On Machine Learning. pp. 3259-3269 (2020)
[4] Fu, C. PyTorch-VGG-CIFAR10. (https://github.com/chengyangfu/pytorch-vgg-cifar10), Accessed: 2023-06-23
[5] Ghodasara, K. Overview of Decision Tree Pruning in Machine Learning. International Research Journal Of Engineering And Technology (IRJET). 8 (2021)
[6] Han, S., Pool, J., Tran, J. & Dally, W. Learning both weights and connections for efficient neural network. Advances In Neural Information Processing Systems. 28 (2015)
[7] He, K., Zhang, X., Ren, S. & Sun, J. Deep residual learning for image recognition. Proceedings Of The IEEE Conference On Computer Vision And Pattern Recognition. pp. 770-778 (2016)
[8] He, Y., Kang, G., Dong, X., Fu, Y. & Yang, Y. Soft filter pruning for accelerating deep convolutional neural networks. ArXiv Preprint ArXiv:1808.06866. (2018)
[9] Idelbayev, Y. Proper ResNet Implementation for CIFAR10/CIFAR100 in PyTorch. (https://github.com/akamaster/pytorch_resnet_cifar10), Accessed: 2023-05-14
[10] Krizhevsky, A., Hinton, G. & Others Learning multiple layers of features from tiny images. (Toronto, ON, Canada, 2009)
[11] Le, D. & Hua, B. Network pruning that matters: A case study on retraining variants. ArXiv Preprint ArXiv:2105.03193. (2021)
[12] LeCun, Y., Denker, J. & Solla, S. Optimal brain damage. Advances In Neural Information Processing Systems. 2 (1989)
[13] Li, H., Kadav, A., Durdanovic, I., Samet, H. & Graf, H. Pruning filters for efficient convnets. ArXiv Preprint ArXiv:1608.08710. (2016)
[14] Liu, Z., Sun, M., Zhou, T., Huang, G. & Darrell, T. Rethinking the value of network pruning. ArXiv Preprint ArXiv:1810.05270. (2018)
[15] Neyshabur, B., Li, Z., Bhojanapalli, S., LeCun, Y. & Srebro, N. Towards understanding the role of over-parametrization in generalization of neural networks. ArXiv Preprint ArXiv:1805.12076. (2018)
[16] Paszke, A., Gross, S., Massa, F., Lerer, A., Bradbury, J., Chanan, G., Killeen, T., Lin, Z., Gimelshein, N., Antiga, L. & Others PyTorch: An imperative style, high-performance deep learning library. Advances In Neural Information Processing Systems. 32 (2019)
[17] Renda, A., Frankle, J. & Carbin, M. Comparing rewinding and fine-tuning in neural network pruning. ArXiv Preprint ArXiv:2003.02389. (2020)
[18] Simonyan, K. & Zisserman, A. Very deep convolutional networks for large-scale image recognition. ArXiv Preprint ArXiv:1409.1556. (2014)
[19] Smith, L. & Topin, N. Super-convergence: Very fast training of neural networks using large learning rates. Artificial Intelligence And Machine Learning For Multi-domain Operations Applications. 11006 pp. 369-386 (2019)
[20] Tran, D., Wang, H., Torresani, L., Ray, J., LeCun, Y. & Paluri, M. A closer look at spatiotemporal convolutions for action recognition. Proceedings Of The IEEE Conference On Computer Vision And Pattern Recognition. pp. 6450-6459 (2018)
[21] Zhang, Y., Cao, C., Cheng, J. & Lu, H. EgoGesture: A new dataset and benchmark for egocentric hand gesture recognition. IEEE Transactions On Multimedia. 20, 1038-1050 (2018)