Understanding and Estimating Error Propagation in Neural Networks for Scientific Data Analysis

Publication Type
Conference Paper
Book Title
2025 IEEE 41st International Conference on Data Engineering (ICDE)
Page Numbers
1869–1881
Publisher Location
New Jersey, United States of America
Conference Name
2025 IEEE International Conference on Data Engineering (ICDE)
Conference Location
Hong Kong, Hong Kong
Conference Sponsor
Hong Kong Tourism Board, Huawei, Alibaba Cloud, Baidu, ByteDance, The Hong Kong Polytechnic University, The Hong Kong University of Science and Technology

Neural networks are increasingly integrated into scientific discovery, where input data reduction and model quantization play a key role in accelerating inference. However, understanding and mitigating the impact of these techniques on output error is critical for ensuring reliable results, particularly in tasks demanding high numerical precision. This paper introduces a comprehensive framework for optimizing neural network inference in scientific computing by combining data reduction and weight quantization while maintaining error-controlled outcomes. We develop theoretical analyses to bound error propagation under these reductions and propose a framework that balances computational performance with error constraints. Evaluation on real-world learning-based combustion simulations and satellite image classification demonstrates that our derived error bounds accurately predict observed errors while enabling significant computational speedup under our framework. This work highlights the potential for further leveraging advancements in modern lossy compression algorithms and hardware accelerators that support lower-precision formats.
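The paper's full framework is not reproduced here, but the core idea of bounding output error under weight quantization can be sketched with a simple operator-norm inequality. The following minimal example (assumptions: a single linear layer as a stand-in for a network, and illustrative uniform quantization rather than the paper's actual scheme) compares the observed output error against the worst-case bound ||Wx − W_q x||₂ ≤ ||W − W_q||₂ · ||x||₂.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical single linear layer y = W x, standing in for one network layer.
W = rng.standard_normal((8, 16))
x = rng.standard_normal(16)

def quantize(w, step=0.05):
    """Uniform weight quantization to a fixed step size (illustrative only)."""
    return np.round(w / step) * step

Wq = quantize(W)

# Observed output error vs. a worst-case propagation bound:
#   ||W x - Wq x||_2  <=  ||W - Wq||_2 * ||x||_2   (operator-norm inequality)
observed = np.linalg.norm(W @ x - Wq @ x)
bound = np.linalg.norm(W - Wq, ord=2) * np.linalg.norm(x)

print(f"observed error: {observed:.4f}, bound: {bound:.4f}")
```

For a multi-layer network, bounds of this kind compose layer by layer (scaled by each layer's Lipschitz constant), which is the sort of propagation analysis the paper develops in full.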