On Analog Distributed Approximate Newton with Determinantal Averaging
Maynooth Univ, Hamilton Inst, Maynooth, Kildare, Ireland.
Uppsala University, Disciplinary Domain of Science and Technology, Technology, Department of Electrical Engineering, Signals and Systems. ORCID iD: 0000-0003-0762-5743
2022 (English). In: 2022 IEEE 33rd Annual International Symposium on Personal, Indoor and Mobile Radio Communications (PIMRC), Institute of Electrical and Electronics Engineers (IEEE), 2022. Conference paper, Published paper (Refereed)
Abstract [en]

This paper considers the problem of communication- and computation-efficient distributed learning over a wireless fading Multiple Access Channel (MAC). The distributed learning task is performed over a large network of nodes holding local data, with the help of an edge server coordinating between the nodes. The information from each distributed node is transmitted as an analog signal through a noisy fading wireless MAC, using a common shaping waveform. The edge server receives a superposition of the analog signals, computes a new parameter estimate, and communicates it back to the nodes, a process which continues until an appropriate convergence criterion is met. Unlike typical federated learning approaches based on communication of local gradients and averaging at the edge server, in this paper we investigate a scenario where the local nodes implement a second-order optimization technique known as Determinantal Averaging. The per-node communication complexity at each iteration of this method is the same as for any gradient-based method, i.e., O(d), where d is the number of parameters. To reduce the computational load at each node, we also employ an approximate Newton method to compute the local Hessians. Under the usual assumptions of convexity and double differentiability of the local objective functions, we propose an algorithm titled Distributed Approximate Newton with Determinantal Averaging (DANDA). State-of-the-art first- and second-order distributed optimization algorithms are numerically compared with DANDA on a standard dataset with least-squares based local objective functions (linear regression). Simulation results illustrate that DANDA not only displays faster convergence compared to gradient-based methods, but also compares favourably with exact distributed Newton methods, such as LocalNewton.
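To make the aggregation step concrete, the following is a minimal NumPy sketch of determinantal averaging for a distributed least-squares problem: each node computes a local Newton step from its own data shard, and the server combines the steps weighted by the determinants of the local Hessians. This is an illustrative toy (synthetic data, noiseless channel, exact local Hessians), not the paper's DANDA implementation — the analog MAC superposition, fading, and approximate-Newton components are omitted.

```python
import numpy as np

# Hypothetical toy setup: m nodes each hold a shard of linear-regression data.
rng = np.random.default_rng(0)
d, n_nodes, n_local = 3, 8, 50
w_true = rng.standard_normal(d)

shards = []
for _ in range(n_nodes):
    X = rng.standard_normal((n_local, d))
    y = X @ w_true + 0.01 * rng.standard_normal(n_local)
    shards.append((X, y))

w = np.zeros(d)  # current global iterate

# One round: each node sends its det-weighted Newton step (O(d) numbers);
# the server forms the determinant-weighted average of the local steps.
num = np.zeros(d)
den = 0.0
for X, y in shards:
    H = X.T @ X / n_local            # local Hessian of 0.5*||Xw - y||^2 / n
    g = X.T @ (X @ w - y) / n_local  # local gradient at w
    det = np.linalg.det(H)           # determinantal weight
    num += det * np.linalg.solve(H, g)
    den += det
w = w - num / den  # determinant-weighted averaged Newton update

print(np.linalg.norm(w - w_true))  # small: close to the true parameters
```

For least squares the local Newton step lands each node on its own least-squares solution, so one round of the det-weighted average already sits near the global optimum; the determinant weights are what correct the bias that a plain average of inverse-Hessian steps would incur.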

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2022.
Series
IEEE International Symposium on Personal Indoor and Mobile Radio Communications Workshops, ISSN 2166-9570, E-ISSN 2166-9589
Keywords [en]
Distributed Learning (DL), Analog Transmission, Fading Multiple Access Channel (MAC), Approximate Newton methods
National Category
Communication Systems; Signal Processing
Identifiers
URN: urn:nbn:se:uu:diva-498512
DOI: 10.1109/PIMRC54779.2022.9977466
ISI: 000930733200001
ISBN: 978-1-6654-8053-6 (electronic)
ISBN: 978-1-6654-8054-3 (print)
OAI: oai:DiVA.org:uu-498512
DiVA, id: diva2:1744328
Conference
33rd IEEE Annual International Symposium on Personal, Indoor and Mobile Radio Communications (PIMRC), SEP 12-15, 2022, Online
Available from: 2023-03-17 Created: 2023-03-17 Last updated: 2023-03-17. Bibliographically approved

Open Access in DiVA

No full text in DiVA

Authority records

Dey, Subhrakanti
