"Singular Value Perturbation and Deep Network Optimization", Rudolf H. Riedi, Randall Balestriero, and Richard G. Baraniuk, Constructive Approximation, 27 November 2022 (also arXiv preprint 2203.03099, 7 March 2022)

Deep learning practitioners know that ResNets and DenseNets are strongly preferred over ConvNets because, empirically, gradient descent training converges faster and more stably to a better solution. In other words, it is not what a deep network can approximate that matters, but rather how it learns to approximate. Empirical studies have indicated that this is because the so-called loss landscape of the objective function navigated by gradient descent as it optimizes the deep network parameters is much smoother for ResNets and DenseNets than for ConvNets (see Figure 1 from Tom Goldstein's group below). However, to date there has been no analytical work in this direction.

Building on our earlier work connecting deep networks with continuous piecewise-affine splines, we develop an exact local linear representation of a deep network layer for a family of modern deep networks that includes ConvNets at one end of a spectrum and networks with skip connections, such as ResNets and DenseNets, at the other. For tasks that optimize the squared-error loss, we prove that the optimization loss surface of a modern deep network is piecewise quadratic in the parameters, with local shape governed by the singular values of a matrix that is a function of the local linear representation. We develop new perturbation results for how the singular values of matrices of this sort behave as we add a fraction of the identity and multiply by certain diagonal matrices. A direct application of our perturbation results explains analytically why a network with skip connections (e.g., ResNet or DenseNet) is easier to optimize than a ConvNet: thanks to its more stable singular values and smaller condition number, the local loss surface of a network with skip connections is less erratic, less eccentric, and features local minima that are more accommodating to gradient-based optimization. Our results also shed new light on the impact of different nonlinear activation functions on a deep network's singular values, regardless of its architecture.
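The intuition behind the skip-connection result can be illustrated numerically. The following sketch (an illustration of the general idea, not the paper's actual construction) compares the singular values of a random linear layer W with those of the same layer plus a skip connection, modeled as I + W: adding a fraction of the identity lifts the smallest singular value away from zero, which dramatically shrinks the condition number.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical illustration: a "layer" as a random linear map W,
# scaled so its spectral norm is well below 1, versus the same
# layer with a ResNet-style skip connection, I + W.
n = 64
W = 0.25 * rng.standard_normal((n, n)) / np.sqrt(n)

s_plain = np.linalg.svd(W, compute_uv=False)          # singular values of W
s_skip = np.linalg.svd(np.eye(n) + W, compute_uv=False)  # of I + W

# Condition number = largest / smallest singular value.
cond_plain = s_plain[0] / s_plain[-1]
cond_skip = s_skip[0] / s_skip[-1]

print(f"condition number without skip: {cond_plain:.1f}")
print(f"condition number with skip:    {cond_skip:.1f}")
```

Since the singular values of I + W lie within ||W|| of 1 when ||W|| < 1, the skip-connected map is guaranteed to be well conditioned, whereas the plain random layer typically has a near-zero smallest singular value and a correspondingly erratic local loss surface.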

Rice DSP alum AmirAli Aghazadeh (PhD, 2017) has accepted an assistant professor position at Georgia Tech in the Department of Electrical and Computer Engineering. He has spent the past few years as a postdoc at Stanford University and UC-Berkeley. AmirAli joins Rice DSP PhD alums James McClellan, Douglas Williams, Justin Romberg, Christopher Rozell, Mark Davenport, and Eva Dyer and Rice ECE PhD alum Robert Butera.

Richard G. Baraniuk, the C. Sidney Burrus Professor of Electrical and Computer Engineering (ECE) and founding director of OpenStax, Rice’s educational technology initiative, has received the Harold W. McGraw, Jr. Prize in Education. The award is given annually by the Harold W. McGraw, Jr. Family Foundation and the University of Pennsylvania Graduate School of Education and goes to “outstanding individuals whose accomplishments are making a difference in the lives of students.” Baraniuk is one of the founders of the Open Education movement, which promotes the use of free and open-source-licensed Open Educational Resources. He is founder and director of OpenStax (formerly Connexions), a non-profit educational and scholarly publishing project launched in 1999 to bring textbooks and other learning materials into the digital age.

DSP alum Justin Romberg (PhD, 2003), Schlumberger Professor of Electrical and Computer Engineering at Georgia Tech, has been awarded the 2021 IEEE Jack S. Kilby Signal Processing Medal. He and his co-awardees Emmanuel Candès of Stanford University and Terence Tao of UCLA will receive the highest honor in the field of signal processing for "groundbreaking contributions to compressed sensing."

Justin joins Rice DSP alum Jim McClellan (PhD, 1973), John and Marilu McCarty Chair of Electrical Engineering at Georgia Tech, and Rice DSP emeritus faculty member C. Sidney Burrus as recipients of this honor.
Open educational resources publisher OpenStax has received $12.5 million in funding to develop dozens of new free and open-licensed textbook titles as part of a program that will double its current catalog of 42 textbooks.

“Nine years ago, we dreamed about solving the textbook affordability and access crisis for students,” said Richard Baraniuk, the Victor E. Cameron Professor of Electrical and Computer Engineering at Rice and founder and director of OpenStax. “Now, with this tremendous investment in open education, we will be able to not only accelerate educational access for tens of millions of students but also drive innovation in high-quality digital learning, which has become commonplace due to Covid-19.”

Read more in the press release and Inside Higher Education.