Here’s a little problem I’ve been wondering about for a while. Suppose you’re trying to find the solution to the saddle-point optimization

$$\min_x \max_y \; f(x) + g(y) + y^\top A x,$$

where $x$ and $y$ are vectors, the functions $f$ and $g$ map from their respective vector spaces to scalar outputs, and $A$ is a matrix. Assume that $f$ is convex and $g$ is concave. Let’s call the objective of this problem $J(x, y) = f(x) + g(y) + y^\top A x$.
Suppose that some oracle gives you a functional

$$y^\star(x) = \arg\max_y \; J(x, y),$$

i.e., the solution to the inner maximization of the original saddle-point problem. We can then consider the optimization

$$\min_x \; \bar{J}(x), \qquad \text{where } \bar{J}(x) = J(x, y^\star(x)) = f(x) + g(y^\star(x)) + y^\star(x)^\top A x.$$
There’s a surprising (to me) result that the gradient of the function $\bar{J}$ is

$$\nabla \bar{J}(x) = \nabla f(x) + A^\top y^\star(x).$$

This gradient somehow ignores the gradient of $y^\star(x)$, which is clearly a function that depends on $x$. This gradient also happens to be the partial gradient of $J(x, y)$ with respect to $x$, evaluated at $y = y^\star(x)$. Why does the dependence on $\nabla_x y^\star(x)$ disappear? Let’s try to see why this is true.
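Before working through the derivation, here’s a quick finite-difference sanity check of the claim. The instance below is a toy of my own choosing (quadratic $f$, $g(y) = -\tfrac{1}{2}\|y\|^2$, random $A$; none of it comes from the general setup), picked so that the inner maximizer has the closed form $y^\star(x) = Ax$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 4, 3
A = rng.normal(size=(m, n))
c = rng.normal(size=n)

f = lambda x: 0.5 * x @ x + c @ x   # convex f
grad_f = lambda x: x + c
g = lambda y: -0.5 * y @ y          # concave g
y_star = lambda x: A @ x            # argmax_y g(y) + y^T A x, in closed form for this g

def J_bar(x):
    """The outer objective after the inner maximization is solved."""
    y = y_star(x)
    return f(x) + g(y) + y @ (A @ x)

x = rng.normal(size=n)

# the claimed gradient, which pretends y*(x) is a constant
claimed = grad_f(x) + A.T @ y_star(x)

# central finite differences of J_bar, which do account for y*'s dependence on x
eps = 1e-6
fd = np.array([(J_bar(x + eps * e) - J_bar(x - eps * e)) / (2 * eps)
               for e in np.eye(n)])

print(np.max(np.abs(claimed - fd)))  # should be tiny (finite-difference noise only)
```

The finite-difference gradient goes through $y^\star(x)$ and still agrees with the formula, which is exactly the puzzle: the Jacobian of $y^\star$ contributes nothing.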
Before we do that, let me give an example of where this form of optimization arises. The most prominent example is Lagrangian relaxation. When you’re trying to minimize some function $f(x)$ subject to a constraint $Ax = b$, you can form the Lagrangian problem

$$\min_x \max_y \; f(x) + y^\top (A x - b),$$

which takes the general form we started with if we keep $f$, keep $A$, and set $g(y) = -b^\top y$. This general form also arises in structured prediction, for example when the inner maximization is the separation oracle of a structured support vector machine or the variational form of inference in a Markov random field.
Back to the general form, let’s try taking the gradient the traditional way, starting with

$$\nabla \bar{J}(x) = \nabla f(x) + \nabla_x \left[ g(y^\star(x)) \right] + \nabla_x \left[ y^\star(x)^\top A x \right].$$
The second term can be expanded with some chain rule action:

$$\nabla_x \left[ g(y^\star(x)) \right] = \left( \nabla_x y^\star(x) \right)^\top \nabla g(y^\star(x)),$$

where $\nabla_x y^\star(x)$ denotes the Jacobian of $y^\star$ at $x$. (I’m probably botching the transposes here.)
The third term can be expanded with the product rule:

$$\nabla_x \left[ y^\star(x)^\top A x \right] = A^\top y^\star(x) + \left( \nabla_x y^\star(x) \right)^\top A x.$$
We also know something about $y^\star(x)$. Since it comes from maximizing $J(x, y)$ over $y$, we know that the gradient of $J$ wrt $y$ is zero there, i.e.,

$$\nabla_y J(x, y) \Big|_{y = y^\star(x)} = \nabla g(y^\star(x)) + A x = 0,$$

which means $\nabla g(y^\star(x)) = -A x$, and therefore $\left( \nabla_x y^\star(x) \right)^\top \nabla g(y^\star(x)) = -\left( \nabla_x y^\star(x) \right)^\top A x$.
The second term can then be replaced with

$$\nabla_x \left[ g(y^\star(x)) \right] = -\left( \nabla_x y^\star(x) \right)^\top A x.$$

This replacement directly cancels out a term in the product-rule expansion of the third term, leaving us with

$$\nabla \bar{J}(x) = \nabla f(x) + A^\top y^\star(x).$$
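One practical payoff of this result: you can minimize $\bar{J}$ with plain gradient descent, querying the oracle for $y^\star(x)$ at each step and never differentiating through it. Here is a sketch on a toy quadratic instance of my own choosing (the specific $f$, $g$, and $A$ below are assumptions for illustration), where the minimizer of $\bar{J}$ has a closed form we can compare against:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 4, 3
A = rng.normal(size=(m, n))
c = rng.normal(size=n)

grad_f = lambda x: x + c   # f(x) = 0.5 ||x||^2 + c^T x, so grad f(x) = x + c
y_star = lambda x: A @ x   # oracle for g(y) = -0.5 ||y||^2: argmax_y g(y) + y^T A x

# gradient descent on J_bar using only grad f and the oracle
x = np.zeros(n)
for _ in range(3000):
    x -= 0.02 * (grad_f(x) + A.T @ y_star(x))

# for this instance, J_bar(x) = 0.5 ||x||^2 + c^T x + 0.5 ||A x||^2,
# whose minimizer solves (I + A^T A) x = -c
x_true = np.linalg.solve(np.eye(n) + A.T @ A, -c)
print(np.max(np.abs(x - x_true)))  # descent iterate matches the analytic solution
```

The step size 0.02 is an ad hoc choice that happens to be stable for this instance; the point is only that the update direction never touches the Jacobian of $y^\star$.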
I suspect there’s an even more general form of this property, perhaps generalizing the bilinear coupling $y^\top A x$ to some other kind of convex-concave relationship between $x$ and $y$. Let me know if you know of anything along those lines.