Abstract
We consider an online optimization problem on a compact subset $S \subset \mathbb{R}^n$ (not necessarily convex), in which a decision maker chooses, at each iteration $t$, a probability distribution $x_t$ over $S$, and seeks to minimize the cumulative expected loss $\sum_{t=1}^{T} \mathbb{E}_{s \sim x_t}[\ell_t(s)]$, where $\ell_t : S \to \mathbb{R}$ is a Lipschitz loss function revealed at the end of iteration $t$. Building on previous work, we propose a generalized Hedge algorithm and show a bound on the regret when the losses are uniformly Lipschitz and $S$ is uniformly fat (a weaker condition than convexity). Finally, we propose a generalization of the dual averaging method to the set of Lebesgue-continuous distributions over $S$.
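As an illustration of the setting, the following is a minimal discretized sketch of a Hedge-style (exponential-weights) update: the decision maker maintains a distribution over a finite grid of points of $S$ with weights proportional to $\exp(-\eta \cdot \text{cumulative loss})$. This is an assumption-laden toy, not the paper's continuous algorithm; the function name, the grid discretization, and the fixed learning rate `eta` are all choices made for this sketch.

```python
import numpy as np

def hedge_on_grid(losses, eta):
    """Exponential-weights (Hedge) update over a finite grid of S.

    Discretized illustration only: the continuous setting plays a
    distribution over all of S, not over grid points.

    losses: array of shape (T, n) -- loss of each of n grid points
            at each of T rounds, revealed after the round is played.
    Returns the per-round expected losses under the played distributions.
    """
    T, n = losses.shape
    cum = np.zeros(n)                         # cumulative losses per point
    expected = []
    for t in range(T):
        w = np.exp(-eta * (cum - cum.min()))  # shift exponent for stability
        x = w / w.sum()                       # played distribution x_t
        expected.append(float(x @ losses[t])) # E_{s ~ x_t}[loss_t(s)]
        cum += losses[t]                      # loss revealed, weights updated
    return np.array(expected)
```

With bounded losses in $[0,1]$, this classical finite-action Hedge satisfies the standard regret bound $\ln(n)/\eta + \eta T/8$ against the best fixed grid point.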