This presentation is not revolutionary, but evolutionary. Our aim is to provide very real, concrete measures attendees can take home and apply to their invitations to online surveys. Using a monthly tracking study for a client, Corona has conducted A/B testing for the past year to determine what helps, and hurts, response rates. Items tested included the name in the "from" field, subject lines, the location of the web link, the format of the link, overall formatting, and more. The differences are often minor (a couple of percentage points), but on a large study they can still be significant. Also tested was the impact of incentives (e.g., none, contest drawing, gift cards at varying levels), both on response rates and on the results themselves (e.g., by reducing non-response bias, how do incentives change the findings?). While the tracking research was conducted with a B2B audience, we will also convey findings from ad hoc studies with other audiences.
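To illustrate the point that a difference of a couple of percentage points can be statistically significant on a large study, here is a minimal sketch of a two-proportion z-test. The sample sizes and response counts below are hypothetical, chosen only for illustration; they are not figures from the tracking study.

```python
import math

def two_proportion_z(responses_a, n_a, responses_b, n_b):
    """Two-proportion z-test: is the gap between two response rates
    larger than chance alone would explain?"""
    p_a = responses_a / n_a
    p_b = responses_b / n_b
    # Pooled response rate under the null hypothesis of no difference
    p = (responses_a + responses_b) / (n_a + n_b)
    se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical example: a 2-point lift (22% vs. 20% response)
# with 5,000 invitations sent per arm of the A/B test.
z = two_proportion_z(1100, 5000, 1000, 5000)
print(round(z, 2))  # exceeds 1.96, so significant at p < .05
```

With large invitation lists, even a two-point gap between subject lines or incentive conditions clears the conventional significance threshold; with a few hundred invitations per arm, the same gap would not.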