## Speaker:

## Time:

## Venue:

- A-212 (STCS Seminar Room)

(i) We propose and analyze a new scheme for stabilizing stochastic approximation iterates, viz., an adaptation of step sizes that controls the growth of the iterates without affecting their asymptotic behavior. This amounts to scaling the step sizes appropriately whenever the iterates are sufficiently far from the origin. Only a finite random number of steps differ from the original scheme, and unlike the projection method, this scheme does not introduce spurious equilibria. Further, instead of requiring the o.d.e. to descend the Lyapunov function everywhere the function is not at its minimum, we only require it to do so outside a sphere of arbitrarily large radius. [http://arxiv.org/abs/1007.4689]
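The step-size adaptation described above can be sketched in a few lines. The following is a minimal illustration of the general idea only, not the paper's exact algorithm: the threshold `radius`, the scaling factor `scale`, and the tapering schedule `a0 / n` are all assumptions chosen for demonstration.

```python
import numpy as np

def stabilized_sa(h, x0, steps=5000, a0=1.0, radius=10.0, scale=0.5, rng=None):
    """Sketch of stochastic approximation with adaptive step sizes.

    h: mean field of the limiting o.d.e. dx/dt = h(x).
    When the iterate lies outside a ball of the given radius, the step
    size is scaled down to control growth; the idea in the abstract is
    that only a finite random number of steps are so affected, leaving
    the asymptotic behavior unchanged. All constants here are
    illustrative, not taken from the paper.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    x = np.asarray(x0, dtype=float)
    for n in range(1, steps + 1):
        a_n = a0 / n                        # standard tapering step size
        if np.linalg.norm(x) > radius:      # iterate far from the origin:
            a_n *= scale                    # shrink the step (illustrative rule)
        noise = rng.normal(size=x.shape)    # martingale difference noise
        x = x + a_n * (h(x) + noise)        # SA update x_{n+1} = x_n + a_n(h(x_n) + M_{n+1})
    return x

# Example: h(x) = -x, for which the origin is the globally stable equilibrium.
x_final = stabilized_sa(lambda x: -x, x0=[50.0, -50.0])
```

Unlike projecting the iterates back onto a bounded set, shrinking the step size never moves the iterate to the boundary of a constraint region, which is why no spurious equilibria are created.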

(ii) Sample complexity refers to the number of samples needed for the iterates to be within a specified distance of the attracting set of the o.d.e. with a given probability, provided that the iterates enter its domain of attraction and the step size is sufficiently small at the time of entry. In previous work by Borkar, sample complexity estimates were obtained under the assumption that the suitably re-scaled martingale difference noise remains bounded. We recover these results under a much weaker condition that only requires the re-scaled martingale differences to have an exponentially decaying conditional tail probability. [http://arxiv.org/abs/1007.4684]
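To see why the new condition is weaker, it may help to write both hypotheses side by side. The following is an illustrative formalization of the flavor of conditions described, not necessarily the exact hypotheses of the paper; $M_{n+1}$ denotes the (re-scaled) martingale difference and $\mathcal{F}_n$ the natural filtration.

$$
\text{boundedness:}\quad \|M_{n+1}\| \le K_0 \ \text{a.s.}
\qquad\text{vs.}\qquad
\text{exponential tails:}\quad \mathbb{P}\big(\|M_{n+1}\| \ge u \,\big|\, \mathcal{F}_n\big) \le K e^{-\lambda u}, \quad u \ge 0,\ \text{a.s.}
$$

Bounded noise trivially has exponentially decaying conditional tails (the tail probability is zero beyond $K_0$), but the converse fails, e.g. for Gaussian noise, so the tail condition strictly enlarges the class of admissible noise.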