Deep learning is a powerful tool not only in computer science and data science but also in scientific computing, where it has led to numerous breakthroughs in science and engineering. This talk introduces recently developed reproducing activation functions, a simple yet highly effective technique for boosting the performance of deep learning. Both theoretical insights and numerical evidence are provided to demonstrate the effectiveness of reproducing activation functions from the perspectives of approximation theory and optimization analysis.
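To make the idea concrete, one common way to realize such an activation is as a learnable combination of simple basis activations (e.g., polynomial and sinusoidal terms), so that a single neuron can reproduce classical approximation tools. The sketch below is a minimal, hypothetical illustration of this pattern with fixed coefficients; the names `reproducing_activation`, `alphas`, and `betas` are assumptions for exposition, not the authors' implementation, and in practice the coefficients would be trained along with the network weights.

```python
import math

# Hypothetical basis activations: a linear term, a quadratic term
# (for reproducing polynomials), and a sinusoidal term (for
# reproducing Fourier modes). A practical choice may differ.
BASIS = [
    lambda x: x,        # identity
    lambda x: x * x,    # quadratic
    math.sin,           # oscillatory
]

def reproducing_activation(x, alphas, betas):
    """Evaluate sigma(x) = sum_i alpha_i * phi_i(beta_i * x),
    a learnable combination of the basis activations above."""
    return sum(a * phi(b * x) for a, phi, b in zip(alphas, BASIS, betas))

# With alphas = (1, 0, 0) and unit scalings, the combination
# degenerates to the plain identity activation.
print(reproducing_activation(2.0, (1.0, 0.0, 0.0), (1.0, 1.0, 1.0)))
```

In a network, each neuron (or layer) would carry its own trainable `alphas` and `betas`, letting the activation adapt its shape during optimization rather than being fixed a priori.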