Mobile Edge Computing Enabled Efficient Communication Based on Federated Learning in Internet of Medical Things



Bibliographic Details
Main Authors: Xiao Zheng, Syed Bilal Hussain Shah, Xiaojun Ren, Fengqi Li, Liqaa Nawaf, Chinmay Chakraborty, Muhammad Fayaz
Format: article
Language: EN
Published: Hindawi-Wiley 2021
Subjects: T
Online Access: https://doaj.org/article/6a6250d046be4736a452c19e6c3cc59c
Description
Summary: The rapid growth of the Internet of Medical Things (IoMT) has made home health diagnostic networks ubiquitous. Excessive demand from patients leads to high cost, high latency, and communication overload. In particular, during parameter updating the communication cost of the system or network becomes very large because of the many iterations and participants. Although edge computing can reduce latency to some extent, further reducing system latency remains a significant challenge. Federated learning is an emerging paradigm that has recently attracted great interest in academia and industry; its basic idea is to train a globally optimal machine learning model across all participating collaborators. In this paper, a federated stochastic variance-reduced gradient algorithm is proposed to reduce the number of communication rounds between the participants and the server while preserving accuracy, and the corresponding convergence analysis is given. Finally, the method is verified on linear regression and logistic regression. Experimental results show that the proposed method significantly reduces the communication cost compared with general stochastic-gradient-descent federated learning.
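The core idea the abstract describes, variance-reduced local updates so that each client exchanges its model with the server once per round rather than once per stochastic step, can be sketched as follows. This is a minimal illustration on synthetic linear-regression data, not the paper's implementation; the function names, the two-client split, and all hyperparameters are assumptions for the sketch.

```python
import numpy as np

def local_svrg_update(w_global, X, y, lr=0.1, inner_steps=20):
    """One client's SVRG-style update for least-squares linear regression.

    The client computes one full local gradient at the global snapshot,
    then runs cheap variance-reduced stochastic steps entirely locally,
    so only a single model exchange per round is needed instead of one
    communication per stochastic gradient step.
    """
    rng = np.random.default_rng(0)
    n = len(y)
    grad_i = lambda w, i: (X[i] @ w - y[i]) * X[i]   # per-sample gradient
    full_grad = (X @ w_global - y) @ X / n           # snapshot gradient
    w = w_global.copy()
    for _ in range(inner_steps):
        i = rng.integers(n)
        # variance-reduced stochastic gradient (SVRG correction term)
        g = grad_i(w, i) - grad_i(w_global, i) + full_grad
        w -= lr * g
    return w

def federated_round(w_global, clients):
    """Server averages the clients' locally updated models (FedAvg-style)."""
    updates = [local_svrg_update(w_global, X, y) for X, y in clients]
    return np.mean(updates, axis=0)

# Toy problem: two clients holding halves of a noiseless dataset y = 2x.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 1))
y = 2.0 * X[:, 0]
clients = [(X[:50], y[:50]), (X[50:], y[50:])]

w = np.zeros(1)
for _ in range(30):          # 30 communication rounds
    w = federated_round(w, clients)
```

After the loop, `w` approaches the true coefficient 2.0; the same accuracy with plain federated SGD would require a server exchange for every inner step, which is the communication overhead the paper targets.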