
Title:
Edge AI를 위한 Pure-C 모델을 활용한 TRU-Net 기반 실시간 음성 개선. (Korean)
Alternate Title:
Real-time Speech Enhancement Based on TRU-Net Using a Pure-C Model for Edge AI. (English)
Authors:
Source:
Journal of the Korea Institute of Information & Communication Engineering; Nov 2025, Vol. 29 Issue 11, p1470-1480, 11p
Database:
Complementary Index

*Further Information*

Developing AI models using machine learning frameworks such as TensorFlow or PyTorch often introduces limitations in memory usage, power efficiency, and real-time control. For edge AI deployment, optimizing computational performance alone is insufficient; architectures must also be designed to maximize memory efficiency and resource utilization. In this paper, we implement a TRU-Net-based real-time ambient noise reduction AI model entirely in pure C to enable fine-grained control of system resources, including memory access, buffer size configuration, computational strategies, and parallel processing. The proposed implementation incorporates several optimization techniques, including an optimized NPU architecture, minimal memory usage for speech enhancement tasks, efficient buffer and intermediate tensor handling, and improved parallelism via loop unrolling. Experimental results show that the proposed implementation reduces CNN execution time by 40% while minimizing memory usage, demonstrating its effectiveness for real-time, low-power edge AI applications. [ABSTRACT FROM AUTHOR]
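The abstract names two pure-C techniques in particular: statically managed intermediate buffers and loop unrolling for better parallelism. The sketch below is only an illustration of those general ideas, not the authors' implementation; the buffer sizes, tap count, and function name are hypothetical placeholders, and the actual TRU-Net CNN kernels in the paper are not reproduced here.

```c
/*
 * Illustrative sketch (not the paper's code): a 1-D convolution inner loop
 * in plain C combining two techniques mentioned in the abstract --
 * statically allocated, reusable intermediate buffers (no heap allocation
 * on the real-time audio path) and manual loop unrolling.
 * FRAME_LEN, NUM_TAPS, and conv1d_unrolled are hypothetical names.
 */
#include <stddef.h>

#define FRAME_LEN  256   /* hypothetical frame size     */
#define NUM_TAPS   8     /* hypothetical filter length  */

/* Intermediate tensors kept in static storage so buffer size and
 * memory layout are fixed at build time. */
static float g_in [FRAME_LEN + NUM_TAPS];
static float g_out[FRAME_LEN];

/* 1-D convolution with the accumulation loop unrolled by 4.
 * NUM_TAPS is assumed to be a multiple of 4 for simplicity. */
static void conv1d_unrolled(const float *taps)
{
    for (size_t n = 0; n < FRAME_LEN; ++n) {
        /* Separate accumulators expose independent operations
         * to the compiler and hardware pipeline. */
        float acc0 = 0.0f, acc1 = 0.0f, acc2 = 0.0f, acc3 = 0.0f;

        for (size_t k = 0; k < NUM_TAPS; k += 4) {
            acc0 += taps[k]     * g_in[n + k];
            acc1 += taps[k + 1] * g_in[n + k + 1];
            acc2 += taps[k + 2] * g_in[n + k + 2];
            acc3 += taps[k + 3] * g_in[n + k + 3];
        }
        g_out[n] = acc0 + acc1 + acc2 + acc3;
    }
}
```

Whether an unroll factor of 4 (or any other value) actually yields the reported 40% reduction in CNN execution time depends on the target NPU/CPU and the full set of optimizations described in the paper.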

Copyright of Journal of the Korea Institute of Information & Communication Engineering is the property of Korea Institute of Information & Communication Engineering and its content may not be copied or emailed to multiple sites without the copyright holder's express written permission. Additionally, content may not be used with any artificial intelligence tools or machine learning technologies. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)