An article on OpenStax by reporter Rebecca Koenig appears in the Oct 24, 2019 edition of EdSurge.
The Implicit Regularization of Ordinary Least Squares Ensembles
D. LeJeune, H. Javadi, R. G. Baraniuk, "The Implicit Regularization of Ordinary Least Squares Ensembles," arxiv.org/abs/1910.04743, 10 October 2019.
Ensemble methods that average over a collection of independent predictors, each limited to a subsample of both the examples and the features of the training data, command a significant presence in machine learning (the ever-popular random forest is one example), yet the nature of the subsampling effect, particularly on the features, is not well understood. We study the case of an ensemble of linear predictors, where each individual predictor is fit using ordinary least squares on a random submatrix of the data matrix. We show that, under standard Gaussianity assumptions, when the number of features selected for each predictor is optimally tuned, the asymptotic risk of a large ensemble equals the asymptotic risk of ridge regression, which is known to be optimal among linear predictors in this setting. In addition to eliciting this implicit regularization that results from subsampling, we also connect this ensemble to the dropout technique used in training deep (neural) networks, another strategy shown to have a ridge-like regularizing effect.
Above: Example (rows) and feature (columns) subsampling of the training data X used in the ordinary least squares fit for one member of the ensemble. The i-th member of the ensemble is only allowed to predict using its subset of the features (green). It must learn its parameters by performing ordinary least squares using the subsampled examples of the response y (red) and the subsampled examples (rows) and features (columns) of X (blue, crosshatched).
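To make the construction concrete, here is a minimal numerical sketch (ours, not code from the paper): each ensemble member is fit by ordinary least squares on a random subset of rows and columns of X, and the members' predictions are averaged. The data dimensions, ensemble size, subsampling rates, and ridge penalty below are illustrative placeholders, not the paper's optimally tuned values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic Gaussian data: n examples, p features, linear ground truth plus noise.
n, p = 500, 100
X = rng.standard_normal((n, p))
beta = rng.standard_normal(p) / np.sqrt(p)
y = X @ beta + 0.5 * rng.standard_normal(n)
X_test = rng.standard_normal((1000, p))
y_test = X_test @ beta

# Ensemble of OLS predictors, each fit on a random subset of rows and columns.
k, n_sub, p_sub = 100, 300, 40  # ensemble size and subsample sizes (illustrative)
preds = np.zeros(X_test.shape[0])
for _ in range(k):
    rows = rng.choice(n, size=n_sub, replace=False)
    cols = rng.choice(p, size=p_sub, replace=False)
    # Least-squares fit on the subsampled submatrix of X and entries of y.
    w = np.linalg.lstsq(X[np.ix_(rows, cols)], y[rows], rcond=None)[0]
    preds += X_test[:, cols] @ w
preds /= k
print("ensemble test MSE:", np.mean((preds - y_test) ** 2))

# Ridge regression baseline for comparison (penalty not tuned here).
lam = 1.0
w_ridge = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)
print("ridge test MSE:", np.mean((X_test @ w_ridge - y_test) ** 2))
```

This toy script only illustrates the construction; the paper's result is the asymptotic statement that, with the feature-subsampling rate optimally tuned, the large-ensemble risk matches the optimal ridge risk.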
More than Half of All US Colleges using OpenStax Textbooks
From an article in Campus Technology: This year, 56% of all colleges and universities in the United States are using free textbooks from OpenStax in at least one course. That equates to 5,900-plus institutions and nearly 3 million students.
OpenStax provides textbooks for 36 college and Advanced Placement courses. Students can access the materials for free digitally (via browser, downloadable PDF, or the recently introduced OpenStax + SE mobile app), or pay for a low-cost print version. Overall, students are saving more than $200 million on their textbooks in 2019, and have saved a total of $830 million since OpenStax launched in 2012.
Future plans for the publisher include the rollout of Rover by OpenStax, an online math homework tool designed to give students step-by-step feedback on their work. OpenStax also plans to continue its research initiatives on digital learning, using cognitive science-based approaches and the power of machine learning to improve how students learn.
OpenStax Cutting College Textbook Costs
In "Moneysaving 101: Four Ways to Cut College Textbook Costs," Chris Taylor of Reuters writes, "While sky-high U.S. college tuition might be the headline number, here is a sneaky little figure that might surprise you: the cost of textbooks." See what OpenStax is doing about the crisis here.
Wall Street Journal Discusses the Disruptive Impact of OpenStax Texts
An article in the 28 July 2019 Wall Street Journal, "A Key Reason the Fed Struggles to Hit 2% Inflation: Uncooperative Prices," discusses the disruptive impact of the free and open-source textbooks provided by OpenStax on the college textbook market. Read it online at Morningstar.com.
Spline Theory of Deep Networks Talk at Simons Institute
“Mad Max: Affine Spline Insights into Deep Learning”
Frontiers of Deep Learning Workshop, Simons Institute
16 July 2019
References:
- “A Spline Theory of Deep Networks,” ICML 2018
- “Mad Max: Affine Spline Insights into Deep Learning,” arxiv.org/abs/1805.06576, 2018
- “From Hard to Soft: Understanding Deep Network Nonlinearities…,” ICLR 2019
- “A Max-Affine Spline Perspective of RNNs,” ICLR 2019
- “A Hessian Based Complexity Measure for Deep Networks,” arxiv.org/abs/1905.11639, 2019
Co-authors: Randall Balestriero, Jack Wang, Hamid Javadi
An alternative presentation was given at the Alan Turing Institute, May 2019.
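The starting point of these works is that a deep network with piecewise-linear activations (e.g., ReLU) is a continuous piecewise-affine map, i.e., a max-affine spline. A toy sketch of that fact (our illustration, not code from the talk or papers): for any input, the ReLU activation pattern fixes a local affine map that reproduces the network's output on that input's partition region. The tiny two-layer network and the helper `local_affine` below are hypothetical, chosen only to make the structure easy to verify.

```python
import numpy as np

rng = np.random.default_rng(0)

# A tiny two-layer ReLU network: f(x) = W2 @ relu(W1 @ x + b1) + b2.
W1, b1 = rng.standard_normal((8, 3)), rng.standard_normal(8)
W2, b2 = rng.standard_normal((1, 8)), rng.standard_normal(1)

def local_affine(x):
    """Return (A, c) with f(z) = A @ z + c for all z sharing x's ReLU pattern."""
    pre = W1 @ x + b1
    D = np.diag((pre > 0).astype(float))  # active ReLU pattern at x
    A = W2 @ D @ W1
    c = W2 @ D @ b1 + b2
    return A, c

x = rng.standard_normal(3)
A, c = local_affine(x)
f = W2 @ np.maximum(W1 @ x + b1, 0) + b2
assert np.allclose(A @ x + c, f)  # network output equals the local affine map
```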
Academic Family Tree
Thanks to Shashank Sonkar, CJ Barberan, and Pavan Kota of the DSP group for producing the RichB Academic Family Tree ca. 2019. The code is available here.
Four Papers at ICLR 2019
DSP group members will be traveling en masse to New Orleans in May 2019 to present four regular papers at the International Conference on Learning Representations (ICLR):
- R. Balestriero and R. G. Baraniuk, “From Hard to Soft: Understanding Deep Network Nonlinearities via Vector Quantization and Statistical Inference”
- J. Wang, R. Balestriero, and R. G. Baraniuk, “A Max-Affine Spline Perspective of Recurrent Neural Networks”
- A. Mousavi, G. Dasarathy, and R. G. Baraniuk, “A Data-Driven and Distributed Approach to Sparse Signal Representation and Recovery”
- J. J. Michalenko, A. Shah, A. Verma, R. G. Baraniuk, S. Chaudhuri, and A. B. Patel, “Representing Formal Languages: A Comparison between Finite Automata and Recurrent Neural Networks”
Two Workshops at NIPS 2018
Two workshops have been accepted for NIPS in December 2018; more details soon on how to contribute:
- Integration of Deep Learning Theories (R. G. Baraniuk, S. Mallat, A. Anandkumar, A. Patel, and N. Ho)
- Machine Learning for Geophysical & Geochemical Signals (L. Pyrak-Nolte, J. Morris, J. Rustad, R. G. Baraniuk)
48% of US Colleges, 2.2 Million Students using Free OpenStax Textbooks This Year
This year, over 2.2 million students are saving an estimated $177 million by using free textbooks from OpenStax, the Rice University-based publisher of open educational resource materials. Since 2012, OpenStax's 29 free, peer-reviewed, openly licensed textbooks for the highest-enrolled high school and college courses have been used by more than 6 million students. This year, OpenStax added several new books to its library, including Biology for AP Courses, Introductory Business Statistics and second editions of its economics titles.
OpenStax books are having a tangible, marketwide impact, according to a 2017 Babson Survey that found that “the rate of adoption of OpenStax textbooks among faculty teaching large-enrollment courses is now at 16.5%, a rate which rivals that of most commercial textbooks.”

“We're excited about the rapidly growing number of instructors making the leap to open textbooks,” said OpenStax founder Richard Baraniuk, the Victor E. Cameron Professor of Electrical and Computer Engineering at Rice. “Our community is creating a movement that will make a big impact on college affordability. The success of open textbooks like OpenStax has ignited competition in the textbook market, and textbook prices are actually falling for the first time in 50 years.”
Read the full press release