Convex set, Operations research, Mathematics, Optimization, Convex function, Calculus

Going beyond least-squares – I : self-concordant analysis of Newton method

On Feb 1, 2021
@lawrennd shared
RT @BachFrancis: Least squares are great! They are even greater owing to self-concordance. See why in this month's blog post: https://t.co/xcmZMVs0rB https://t.co/JYhlsIU6wp

Classical examples are all linear and quadratic functions, the negative logarithm, the negative log-determinant, or the negative logarithm of quadratic functions. Classical analysis of Newton's method: given a function f, Newton's method is an iterative optimization algorithm consisting in ...

francisbach.com
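As a rough illustration of the iteration the post analyzes, here is a minimal Newton-method sketch in Python; the quadratic test function, tolerance, and iteration cap are illustrative assumptions, not taken from the post.

import numpy as np

def newton_method(grad, hess, x0, tol=1e-10, max_iter=50):
    # Newton iteration: x_{k+1} = x_k - H(x_k)^{-1} grad(x_k).
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        # Solve H dx = g rather than forming the inverse explicitly.
        x = x - np.linalg.solve(hess(x), g)
    return x

# Illustrative strongly convex quadratic f(x) = 0.5 x^T A x - b^T x,
# on which Newton's method converges in a single step.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
print(newton_method(lambda x: A @ x - b, lambda x: A, np.zeros(2)))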

Effortless optimization through gradient flows

From gradient descent to gradient flows: gradient descent is the most classical iterative algorithm to minimize differentiable functions. I chose two starting points famous to cyclists, Col ...
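For intuition, gradient descent can be read as a forward-Euler discretization of the gradient flow x'(t) = -grad f(x(t)). A minimal sketch follows; the objective and step size are illustrative assumptions.

import numpy as np

def gradient_descent(grad, x0, step=0.05, n_steps=200):
    # Forward-Euler discretization of the gradient flow x'(t) = -grad f(x(t)):
    # each iteration takes a small step against the gradient.
    x = np.asarray(x0, dtype=float)
    for _ in range(n_steps):
        x = x - step * grad(x)
    return x

# Illustrative anisotropic quadratic f(x) = 0.5 * (x1^2 + 10 * x2^2).
scales = np.array([1.0, 10.0])
print(gradient_descent(lambda x: scales * x, np.array([2.0, 2.0])))
# As step -> 0, the iterates trace the continuous-time gradient flow.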

Demystifying the Math of Support Vector Machines (SVM)

This article was written by Krishna Kumar Mahto. So, three days into SVM, I was 40% frustrated, 30% restless, 20% irritated and 100% inefficient in terms of g…

Pushing the boundaries of convex optimization

Convex optimization has many applications ranging from operations research and machine learning to quantum information theory.

Incremental and Parallel Machine Learning Algorithms With Automated Learning Rate Adjustments

The existing machine learning algorithms for minimizing a convex function over a closed convex set suffer from slow convergence because their learning rates must be determined before ...
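The setting described in the abstract is projected gradient descent with a learning rate fixed in advance. Here is a minimal sketch of that baseline; the toy objective and the unit ball as the closed convex set are assumptions for illustration.

import numpy as np

def projected_gradient(grad, project, x0, step, n_steps=500):
    # Projected gradient descent with a learning rate fixed before running,
    # the regime whose slow convergence the paper targets.
    x = project(np.asarray(x0, dtype=float))
    for _ in range(n_steps):
        x = project(x - step * grad(x))
    return x

# Illustrative problem: minimize ||x - c||^2 over the unit Euclidean ball.
c = np.array([2.0, 0.0])
project_ball = lambda x: x / max(1.0, np.linalg.norm(x))
print(projected_gradient(lambda x: 2.0 * (x - c), project_ball, np.zeros(2), step=0.1))
# -> roughly [1, 0], the point of the ball closest to c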

Machine Learning Glossary

Compilation of key machine-learning and TensorFlow terms, with beginner-friendly definitions.

Adaptive Algorithms: L* bounds and AdaGrad

In this lecture, we will explore a bit more under which conditions we can get better regret upper bounds than $O(D L \sqrt{T})$. Also, we will obtain this improved ...
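A minimal sketch of diagonal AdaGrad, the per-coordinate adaptive step rule this topic builds toward; the quadratic test objective and the value of eta are illustrative assumptions.

import numpy as np

def adagrad(grad, x0, eta=1.0, eps=1e-8, n_steps=1000):
    # Diagonal AdaGrad: per-coordinate step eta / sqrt(sum of past squared gradients).
    x = np.asarray(x0, dtype=float)
    sq_sum = np.zeros_like(x)
    for _ in range(n_steps):
        g = grad(x)
        sq_sum += g ** 2
        x -= eta * g / (np.sqrt(sq_sum) + eps)
    return x

# Illustrative badly scaled quadratic; the adaptive denominator equalizes
# the effective step size across coordinates without manual tuning.
scales = np.array([100.0, 1.0])
print(adagrad(lambda x: scales * x, np.array([1.0, 1.0])))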

Nevergrad: An open source tool for derivative-free optimization

We are open-sourcing Nevergrad, a Python3 library that makes it easier to perform gradient-free optimizations used in many machine learning tasks.
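A minimal usage sketch following the library's documented ask-nothing-of-gradients pattern; the choice of the OnePlusOne optimizer, the budget, and the toy objective are illustrative.

import nevergrad as ng

def loss(x):
    # Illustrative black-box objective: no gradient information is used.
    return (x[0] - 0.5) ** 2 + (x[1] + 0.25) ** 2

# Derivative-free optimization under a fixed evaluation budget.
optimizer = ng.optimizers.OnePlusOne(parametrization=2, budget=200)
recommendation = optimizer.minimize(loss)
print(recommendation.value)  # should land near [0.5, -0.25]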

Essential Math for Data Science: ‘Why’ and ‘How’

Data summaries and descriptive statistics: central tendency, variance, covariance, correlation; basic probability: basic idea, expectation, probability calculus, Bayes' theorem, conditional ...

Essential Math for Data Science

Almost all the techniques of modern data science, including machine learning, have a deep mathematical underpinning. A solid understanding of a few key topics will give you an edge in the ...

Convex geometry of quantum resource quantification

We present an explicit application of the results to the resource theories of multi-level coherence, entanglement of Schmidt number k, multipartite entanglement, as well as magic states, ...