Like many serious gamers, I use a Logitech G Pro mouse (wired). I originally bought it based on hardware reviews claiming it was capable of perfect 1-to-1 input. When I first got it, that did indeed seem to be the case. However, some time in the last six months, that seems to have changed. Lately, when I perform a flick shot, I frequently undershoot, or am off in either direction by a small amount. At first I thought it was my imagination, or that I was simply in some kind of slump.

Still, I wasn't 100% satisfied with that explanation, because I have been using the same settings for close to 8 years now, and I can literally hit flick shots on stationary targets with my eyes closed. Coincidentally, this has proven to be one of the best ways to test, since it prevents me from compensating or correcting mid-aim, but we'll get to that later. At some point I began to dig around in the driver configuration files to see if I could find anything that would explain this.

The settings for the LGS driver are located under C:\Users\(your user name)\AppData\Local\Logitech\Logitech Gaming Software\

The drive letter is based on whatever your default OS drive is.

The settings for G HUB are in the same local AppData folder, but since I couldn't find a way to tweak those, I'll skip them for now.

Inside there is a folder called profiles, which contains your default profile plus any extras you may have created. The profiles are stored as XML, which any decent text editor can handle. I personally use Notepad++, since I find it the most helpful with formatting.

If you scroll down near the end, you will find a section that lists your pointers. If you have a G Pro, it should look something like this:

<pointer devicemodel="Logitech.Gaming.Mouse.Pro.C08C">
  <mode shiftstate="1">
    <reportrate rate="1000"/>
    <powermode value="2"/>
    <dpitable shiftindex="1" defaultindex="1" syncxy="1">
      <dpi x="400" y="400" enabled="1"/>
      <dpi x="800" y="800" enabled="1"/>
      <dpi x="1600" y="1600" enabled="1"/>
      <dpi x="3200" y="3200" enabled="1"/>
    </dpitable>
    <movement speed="-1" acceleration="0"/>

Your settings may be different; don't worry about that for now. Most of it is not important. The thing you want to focus your attention on is this line:

<movement speed="-1" acceleration="0"/>

Curious, isn't it? Why would speed be set to -1 by default? In the process of trying to figure out what exactly this meant, I started experimenting with different values, and quickly discovered that this drastically changes the way my mouse moves.

OK, good start, we're on to something here. However, setting it to 0 or 1, the obvious alternatives, did not work the way I wanted. So I began digging to see if I could find any official documentation from Logitech.

Sadly, there was none. But in the process I may have discovered something very important. You see, it seems as though the Logitech firmware uses an almost identical input curve to a Linux library called libinput.

You can read more about it here -

How do I know that the Logitech drivers are using the same framework?

Take a look at this graph.

Do you see how it defines linear acceleration as a function of speed values ranging from -1 to 1? That's the first clue. Now, since I am quite sure my mouse does not ACTUALLY have acceleration, something else must be going on.

What I decided was that, for some reason, the mouse defaults to an almost identical acceleration curve, but if acceleration is disabled, it tries to convert those values back to flat 1-to-1 input.

Since the default speed of -1 produces (if this graph is equivalent) a constant factor of 0.5 at all speeds, the driver would then have to multiply your input by 2 to flatten it back out. Which means you are effectively getting filtered input at this setting!
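As a toy illustration of that claim (this is my own sketch of a hypothetical pipeline, not Logitech's actual code), here is what happens if integer mouse counts are scaled by a constant 0.5 factor, quantized, and then doubled back to "flat" input:

```python
def roundtrip(delta, gain=0.5):
    # Hypothetical pipeline: apply a constant accel gain, quantize to
    # integer counts (drivers report whole counts), then rescale back.
    scaled = round(delta * gain)   # fractional counts are lost here
    return round(scaled / gain)

print([roundtrip(d) for d in range(1, 6)])  # → [0, 2, 4, 4, 4]
```

Notice that odd-numbered deltas can never be reproduced: exactly the kind of "filtered input" described above.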

Here is the second clue, and it doesn't look like the Logitech setup is exactly the same as wayland/libinput: the mouse seems to have some form of constant deceleration.

To figure this out I had to go searching through more advanced acceleration setups, namely xinput/Xorg. I'll be honest: at this point you may be skeptical. I understand that completely.

These are all Linux-based drivers. What on earth do they have to do with Logitech firmware/drivers in a Windows environment?

I have been racking my brain trying to come up with a way to PROVE any of this, but since it seems to happen at the firmware level, I can't figure out how to demonstrate it empirically. Just bear with me for now and try the settings yourself to see if they work for you.

This blog in particular proved to be very helpful in my attempts to deduce what exactly was taking place at the driver level.

Anyway, the xinput/Xorg convention is to apply ConstantDeceleration at the end of the input pipeline, with a deceleration of 1 being the default (non-decelerated). Which means that if a device were, hypothetically, using the same convention, but treated the setting as 0 when it was absent, it would be forcing deceleration.
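The Xorg behavior being described can be sketched like this (a minimal model of the documented ConstantDeceleration semantics, not anything taken from Logitech's code):

```python
def apply_constant_deceleration(delta, decel=1.0):
    # Xorg-style ConstantDeceleration divides motion by the factor;
    # decel=1.0 is the documented default and leaves input unchanged.
    if decel <= 0:
        raise ValueError("deceleration must be a positive value")
    return delta / decel

print(apply_constant_deceleration(10, 1.0))  # → 10.0 (no deceleration)
print(apply_constant_deceleration(10, 2.0))  # → 5.0  (motion halved)
```

Under this model, a driver that treats a missing setting as anything other than 1 would be silently slowing or distorting every delta.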

Deceleration is one of those gimmicks that make a mouse FEEL more precise without actually increasing precision. In reality, it considerably reduces precision once you have memorized your chosen sensitivity.

It seemed perfectly reasonable to me that Logitech engineers would add it at some point as a marketing gimmick, so that people who were not serious gamers and were using the mouse for the first time (reviewers, for example) would believe the mouse "felt very precise" without doing any actual testing to discover whether that was true.

Still, it was something of a wild guess at this point, but I decided to try adding a deceleration of 0 to my driver profile.

So my profile .xml read:

<movement speed="-1" acceleration="0" deceleration="0"/>

Almost immediately I could tell a difference. This was a very important clue: if there was no deceleration and the mouse did not recognize the attribute, why did it change the way it moved? However, setting it to 0 did not produce the 1-to-1 behavior I was hoping for.

So I tried setting it to 1, following the Xorg convention. This turned out to be the correct setting. The difference was immediate, and flick shots no longer felt like they were falling short of where I intended.

An unexpected consequence was that TRACKING improved considerably, which makes sense: to properly track very small targets, like heads, you need a smooth, constant stream of input that is NOT being mathematically altered.

My profile .xml now read:

<movement speed="-1" acceleration="0" deceleration="1"/>

This was already a big improvement, but I wanted to go back and figure out the correct value for the speed setting. To do this, I took the values from the libinput graph and plugged them into a slope/line equation.

Graphing the known values gave me a slope of 0.66666667, and the value at x = 1 (basically your post-accel input) was also 0.66666667. Thinking I had solved a great puzzle, I tried that instead of speed="-1".
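That calculation can be reproduced in a few lines. The two points (0, 0) and (1.5, 1.0) are my assumptions about what was read off the libinput graph, chosen because they yield the numbers above:

```python
# Two points assumed from the libinput linear-profile graph.
x1, y1 = 0.0, 0.0
x2, y2 = 1.5, 1.0

slope = (y2 - y1) / (x2 - x1)           # ≈ 0.66666667
intercept = y1 - slope * x1             # 0.0
value_at_x1 = slope * 1.0 + intercept   # y at x = 1, also ≈ 0.66666667

print(round(slope, 8), round(value_at_x1, 8))
```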

<movement speed="-0.666" acceleration="0" deceleration="1"/>

This turned out to be a slight improvement, but something still felt off. To be honest, I was stumped at this point. Everything so far had suggested that the Logitech drivers use the same conventions and values as the Linux ones.

That's when I went back to libinput and noticed this page.

I, like many other people, play at 800 DPI. According to this, DPI values below 1000 are normalized to an equivalent by post-accel multiplication. So in the case of 800 DPI, all inputs would be multiplied by 1.25.

Again, I want to stress that the problem is not that my inputs are ACTUALLY 1.25x as fast as they should be. My suspicion is that the driver first multiplies the values as though acceleration were enabled, and then flattens them back out to remove the acceleration.

In the process, you lose your floats (the fractional counts), and you end up with pixel skipping or filtered input.
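To make that suspicion concrete (again, a hypothetical model of my own, not verified driver behavior): normalize 800 DPI counts up by 1000/800 = 1.25, then truncate back to whole counts.

```python
DPI = 800
factor = 1000 / DPI                     # libinput-style normalization: 1.25

deltas = [1, 2, 3, 4]
scaled = [d * factor for d in deltas]   # [1.25, 2.5, 3.75, 5.0]
reported = [int(s) for s in scaled]     # whole counts only: the floats are lost

print(reported)  # → [1, 2, 3, 5]
```

A delta of 4 comes out as 5, while the fractional parts of the smaller deltas simply disappear; over a fast flick, those errors accumulate.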

So, going off this idea, I went back to my profile XML and divided the speed by 1.25, which gave me -0.533:

<movement speed="-0.533" acceleration="0" deceleration="1"/>

This was much better! In fact, it was getting very close to perfect 1-to-1 input again. Something was still slightly off, though, and I reworked the curve a number of times trying to come up with an explanation.

My theory eventually became that the curve Logitech uses is NOT identical to libinput's (it shares the same shape, but the scale is different), and that the -0.666 at x = 1 was actually -0.333. So, dividing that by 1.25 again, this gave me:

<movement speed="-0.266" acceleration="0" deceleration="1"/>
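For clarity, here is that arithmetic as a worked example (the halving is the scale theory above, and 1000/800 is the assumed normalization factor):

```python
curve_value = -0.666                  # value read off the (assumed) libinput curve
rescaled = round(curve_value / 2, 3)  # scale theory: Logitech uses half -> -0.333
dpi_factor = 1000 / 800               # DPI normalization at 800 DPI: 1.25
final = round(rescaled / dpi_factor, 3)

print(rescaled, final)  # → -0.333 -0.266
```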

And finally, I had perfect 1-to-1 mouse input again! The change was dramatic and extremely noticeable once I had narrowed down the settings correctly: I could once again hit flick shots with my eyes closed, and I could hit very long-distance flick shots aiming from the shoulder.

If you have read this far, I'll jump straight into the solution and what I recommend you try.

If, for any reason, you feel as though your Logitech mouse is not moving exactly 1-to-1 like you expect, I strongly suggest you at least give this a try. It does not change anything permanently, is easy to undo, and may make an immediate and considerable difference for you (I know it did for me).

Keep in mind that the Logitech drivers are extremely obnoxious about resetting your configs to whatever they think the default should be. They will remove the deceleration attribute and set the speed back to either 0 or -1 (seemingly at random).

So to use this tweak correctly, you will need to set your profile's XML file to read-only once you have it configured.

I'll summarize like this:

If you use 1000 DPI or above, set it to:

<movement speed="-0.333" acceleration="0" deceleration="1"/>

If you use 800 DPI, set it to:

<movement speed="-0.266" acceleration="0" deceleration="1"/>

And finally, if you use 400 DPI, set it to:

<movement speed="-0.133" acceleration="0" deceleration="1"/>
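The whole table reduces to a single rule, sketched below. This is my generalization of the values above; only 400, 800, and 1000+ DPI were actually tested, so treat other DPI values as untested extrapolation:

```python
def recommended_speed(dpi):
    # -0.333 baseline; below 1000 DPI, scale it down by the same
    # normalization ratio derived in the text (dpi / 1000).
    base = -0.333
    if dpi >= 1000:
        return base
    return round(base * (dpi / 1000), 3)

for dpi in (400, 800, 1000, 1600):
    print(dpi, recommended_speed(dpi))
```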

I want to close by listing a couple of other undocumented settings that I have found in the Linux libraries and tried, all of which seem to produce some effect.

The problem is that while each seems to change SOMETHING, the effect is subtle enough that I can't rule out placebo. And since the drivers appear to copy concepts from the Linux libraries inconsistently, I can't claim they will necessarily recognize these settings.


This one comes from the Xorg convention that lists different acceleration profiles by a device setting.

0 is the default, 1 is device-dependent (which means simple acceleration, in this case), values above 1 change the curve, and -1 disables it.

I have had a hard time telling whether there is any actual difference between accel=-1 and accel=0, but since so many of these settings seem to have been "borrowed" from the Linux libraries, I have started using -1.


This also comes from an Xorg convention. From the documentation: "Softening: Tweaks motion deltas from device before applying acceleration to smooth out rather constant moves. Input in an axis would change from 1 2 3 2 3 1 to 1 1.5 2.5 2.5 2.5 1."

Sounds nice, but this definitely messes with precision if you are trying to move exactly 3 pixels, for example.

Adding this to my logitech profile does seem to do SOMETHING, but I cannot really verify that it is either beneficial or detrimental.
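For what it's worth, the quoted 1 2 3 2 3 1 softening example can be reproduced by one simple rule: keep the endpoints, and average each middle delta with its predecessor. This is my reconstruction from the example alone; the real Xorg implementation may well differ:

```python
def soften(deltas):
    # Endpoints pass through; every middle delta is averaged
    # with the delta immediately before it.
    if len(deltas) < 3:
        return list(deltas)
    middle = [(deltas[i - 1] + deltas[i]) / 2 for i in range(1, len(deltas) - 1)]
    return [deltas[0]] + middle + [deltas[-1]]

print(soften([1, 2, 3, 2, 3, 1]))  # → [1, 1.5, 2.5, 2.5, 2.5, 1]
```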


This is more of a guess on my part. What smoothing does should be obvious: similar to softening, it filters the last two inputs to make start/stop motions feel more fluid.
Adding this and setting it to 0 seems to change the way the mouse feels slightly, but I cannot verify that it actually does what I think it does.

I have tested this with both my G Pro (wired) and my G403, and it has an identical effect on both. My strong suspicion is that whatever foolishness the firmware or drivers run on the raw hardware input is applied universally to any mouse using a modern sensor (PMW3366, HERO, etc.), and that for some reason this has become the foundation of Logitech's driver design.

I find it extremely distressing that a number of "undocumented" settings alter the way mouse input works, taking what is in reality FLAWLESS HARDWARE and turning it into a filtered, unreliable mess.

I also find it highly amusing to consider that Logitech engineers very likely borrowed significant portions of open-source Linux input libraries in the design of their driver software. It would be even more amusing if the messy filtering were actually accidental, the result of the engineers not using the "borrowed" libraries correctly.

Something that also occurred to me while testing is that this may have something to do with why most professional gamers use either 400 or 800 DPI.

If you read the documentation about DPI-based motion normalization, values above 1000 are "normalized" to accelerate the same, which suggests that certain values (ones that don't relate cleanly to 1000, for example) may introduce unpredictable filtering/pixel skipping, whereas 400/800 produce occasional filtering of a far more predictable variety.

I know a lot of this probably looks sketchy and borderline illogical. I really wish I had a way to offer proof that could be quantified and measured. This is the result of many, many hours of me personally doing trial-and-error adjustments based on theoretical acceleration curves, and I have not figured out a way to produce any hard data that will verify my claims.

Right now the best I can do is suggest you give it a try and see how it feels for you. I would caution that if you have played with your Logitech mouse for a while using the default "imprecise" settings, any of these changes is going to feel "strange" at first. That was definitely the case for me. However, once I had nailed down the correct adjustments, it took no more than a couple of minutes before "strange" felt "100% natural", and it removed a lot of the "thinking" from aiming, making it feel like well-honed reflex again.

Finally, let me say that I sincerely hope that this helps you game at your best again. It is a terrible shame that a mouse designed with flawless optical hardware is subject to a series of messy mathematical "adjustments" that more or less ruin the precision of the device.

If this tweak helps you, I would love to hear about it!
If you discover any other hidden settings or anything else that should be added to this, please let me know!

If you can think of a way to actually test and prove any of this empirically so it isn't just proof-by-tweaking, I'd love to hear that too!

Thanks for reading. Hope it helps.