So while bored at work I've been reading around about mouse optimization and stumbled across the "raw" mouse input available in games (in_mouse 2 in QL, m_raw in Wsw and that new .exe for q3). This led me to try it out and I like the results. I also read quite a bit about stuff like interpolation and how altering Windows sensitivity can give weird effects (like skipping pixels). I also messed around a bit with the DPI values for my Ikari laser and found this on their website:

"To avoid any type of interpolation, remember to set your Windows sensitivity to default, remove acceleration and set your ingame sensitivity to one" ( http://www.steelseries.com/int/products/mice/...nformation )

This led me to go from 400 DPI @ sens 5 to 2000 DPI @ sens 1. Atm I'm using 1500 DPI with sensitivity one, but I started thinking about DPI values: a higher value (higher "resolution"?) means more counts per cm of mouse movement, so it should give more precision for the same cm/360. By that logic it would be better to use 3000 DPI and a sensitivity of 0.5. But then Steelseries claims that a sensitivity below one gives you software interpolation. Is this true?
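For reference, here's a rough sketch of the cm/360 math behind those combinations, assuming a Quake-style engine where degrees turned per mouse count = sensitivity * m_yaw (0.022 is the Q3/QL default):

# Rough sketch: effective sensitivity (cm per 360-degree turn) in a
# Quake-style engine, assuming degrees per count = sensitivity * m_yaw.
def cm_per_360(dpi, sens, m_yaw=0.022):
    counts_per_360 = 360.0 / (sens * m_yaw)   # mouse counts needed for a full turn
    inches = counts_per_360 / dpi             # physical mousepad distance in inches
    return inches * 2.54                      # convert to cm

print(cm_per_360(400, 5))     # ~20.8 cm/360
print(cm_per_360(2000, 1))    # ~20.8 cm/360 -> same turn speed, more counts per cm
print(cm_per_360(1500, 1))    # ~27.7 cm/360 (current setup)
print(cm_per_360(3000, 0.5))  # ~27.7 cm/360 -> same as 1500 @ 1, but with sens below one

So 400 @ 5 and 2000 @ 1 give the same cm/360, as do 1500 @ 1 and 3000 @ 0.5; the only difference is how many counts the mouse reports per cm and what the game multiplies them by.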

I also read in some thread that your DPI can be too high in relation to your desired cm/360 and your in-game resolution. Is this true, or is higher DPI always better? And if it is true, which is the better trade-off: higher DPI with some software interpolation, or lower DPI with a sensitivity of one?