Subject
Deep neural networks (DNNs) have revolutionized many domains, particularly vision-based classification tasks. Training such models, including foundation models, has become crucial across numerous applications, from medical imaging and autonomous driving to industrial inspection and security. However, training these models efficiently remains a significant challenge due to their high computational and memory demands. To address this, distributed learning and optimization techniques are essential, enabling collaborative training across multiple devices or computing nodes. A major bottleneck in distributed learning, in turn, is the communication overhead between processing units. Learned compression and coding strategies play a vital role in mitigating this overhead, enabling efficient information exchange without compromising model performance.
Kind of work
The thesis will focus on distributed learning frameworks for training deep learning models efficiently. It will also involve the exploration and implementation of deep learning architectures tailored for vision-based classification tasks. Additionally, compression and coding techniques will be studied and applied to optimize communication overhead in distributed learning. The student will engage with these topics through theoretical study and practical implementation, gaining a comprehensive understanding of their interplay in distributed deep learning.
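As a concrete illustration of the kind of compression technique that could be studied, the sketch below implements top-k gradient sparsification in PyTorch: only the k largest-magnitude gradient entries are transmitted, and a dense gradient is reconstructed on the receiving side. The function names, tensor sizes, and compression ratio are illustrative assumptions, not choices fixed by the thesis.

```python
# A minimal sketch (not part of the thesis specification) of top-k gradient
# sparsification, one example of a compression strategy for distributed
# training. Function names and the compression ratio are illustrative.
import math
import torch

def topk_compress(grad: torch.Tensor, ratio: float = 0.01):
    """Keep only the k largest-magnitude entries of a gradient tensor."""
    flat = grad.flatten()
    k = max(1, int(flat.numel() * ratio))
    _, indices = torch.topk(flat.abs(), k)
    # Transmit (indices, values, shape) instead of the dense gradient.
    return indices, flat[indices], grad.shape

def topk_decompress(indices, values, shape):
    """Rebuild a dense gradient with zeros at the dropped positions."""
    flat = torch.zeros(math.prod(shape), dtype=values.dtype)
    flat[indices] = values
    return flat.reshape(shape)

# Example: compress a synthetic gradient to roughly 1% of its entries.
g = torch.randn(256, 256)
idx, vals, shape = topk_compress(g, ratio=0.01)
g_hat = topk_decompress(idx, vals, shape)
print(f"kept {idx.numel()} of {g.numel()} entries")
```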
Framework of the Thesis
The thesis will begin with a thorough literature review, covering existing research on distributed learning, deep learning for classification, and compression techniques. The student will then develop and evaluate a deep learning pipeline for a vision classification task, establishing baseline performance in a centralized setting. A suitable distributed learning method will be implemented, incorporating efficient compression strategies to reduce communication overhead. The next phase will involve integrating the distributed learning framework with the classification pipeline to assess its effectiveness in a real-world scenario. Finally, the research findings, results, and contributions will be documented in a thesis manuscript.
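As an illustrative starting point for the implementation phase, the sketch below shows how a distributed data-parallel training loop with gradient compression might look in PyTorch, assuming DistributedDataParallel together with its built-in PowerSGD communication hook. The model choice (ResNet-18), the hyperparameters, and the make_dataloader helper are placeholders; the actual thesis may adopt a different framework or compression scheme.

```python
# A hypothetical sketch, assuming PyTorch DistributedDataParallel (DDP) with
# the built-in PowerSGD gradient-compression hook; one process per GPU,
# typically launched via torchrun. Model, data, and hyperparameters are
# placeholders for the thesis's own choices.
import torch
import torch.distributed as dist
import torchvision
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.distributed.algorithms.ddp_comm_hooks import powerSGD_hook as powerSGD

def train_worker(make_dataloader):
    # make_dataloader is a hypothetical helper returning a rank-aware DataLoader.
    dist.init_process_group(backend="nccl")
    rank = dist.get_rank()
    device = torch.device(f"cuda:{rank}")

    # Baseline vision classifier; ResNet-18 is only an illustrative choice.
    model = torchvision.models.resnet18(num_classes=10).to(device)
    model = DDP(model, device_ids=[rank])

    # Compress gradients before the all-reduce to reduce communication overhead.
    state = powerSGD.PowerSGDState(process_group=None,
                                   matrix_approximation_rank=2,
                                   start_powerSGD_iter=10)
    model.register_comm_hook(state, powerSGD.powerSGD_hook)

    optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
    criterion = torch.nn.CrossEntropyLoss()

    for images, labels in make_dataloader(rank):
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()   # gradients are compressed and synchronized here
        optimizer.step()

    dist.destroy_process_group()
```

The centralized baseline from the first phase would correspond to the same training loop without the DDP wrapper and communication hook, which keeps the comparison of accuracy and communication cost straightforward.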
Number of Students
1
Expected Student Profile
The ideal candidate for this master thesis should have a strong background in deep learning and solid Python programming skills. An interest in distributed learning frameworks and information theory is essential.
This thesis will provide the student with hands-on experience in cutting-edge distributed learning techniques and vision-based classification methods, preparing them for both academic research and industry applications in AI and deep learning.