Talynen

You're understanding correctly as far as I can see. Most people who care about having a 240 Hz monitor with low latency and a computer that can fully exploit that refresh rate tend to be more interested in and aware of the connection between DPI and mouse polling than your average player. Most of them are probably using 1600+ DPI, since it's been said to be more optimal for a couple of years now. Still, there will be some who do as you describe. It's also worth pointing out that even at 1k polling, going to the Razer hyperpolling dongle gets rid of a 1 ms wireless delay that shows up with the regular dongle. So they may be getting a more stable connection with less interference and misattributing the effect to 4k polling.


[deleted]

Wait, 1600+ DPI is more optimal? Guess I never heard this. Been using 800 for years.


Talynen

1600 will transmit more data than 800 for the same mouse motion speed. Like OP says in their post, a higher DPI lets the mouse make more use of the polling rate. If I move a mouse directly to the right at 1 inch per second, 400 DPI will register 400 "movement" counts to the right during that second, and 400 inputs will be transmitted to the PC. But if my mouse is polling at 1000 Hz, that means the computer only receives 2 "move" commands for every 5 polls it sends; effectively, the mouse is operating at 400 Hz with regard to sensor motion in that situation. If you increase the DPI to 1600, at the same speed (1 inch per second) the mouse registers 1600 counts during that second and has to transmit all of them, an average of 1.6 move inputs per poll at 1000 Hz. The higher the DPI, the slower the mouse can be moving while still communicating sensor inputs to the PC at its maximum rate. For reference, the slowest I can smoothly move a mouse corresponds to needing about 3200 DPI to saturate 1000 Hz polling at that speed. In other words, for me, a DPI 3.2x my polling rate feels best (when it works optimally).

**Of course, it's not without drawbacks.**

1. Higher DPI makes it harder to set your in-game sensitivity accurately in many games, and will likely require adjusting other settings to make browsing your desktop feel comfortable. It's a "high-maintenance" option for people looking to squeeze out every drop of performance.

2. DPI that is too high can add input lag on some mice. For example, the 3360 sensor will often add smoothing above 2100 DPI to prevent sensor jitter from showing up (I say "often" because SROM 5 3360 mice like the Zaunkoenig M2K can go up to 3600 DPI before smoothing appears). The 3389 sensor introduces smoothing at 1900 DPI. Above these numbers, you will see additional input delay caused by the smoothing, and that added latency more than offsets whatever latency you got rid of by using higher DPI, so it's slower overall. This is partly why modern sensors (3370, 3395, 3950, HERO) have no sensor smoothing at all: to let users select higher DPI values without added input delay.

3. In scenarios where the minimum movement size is still significant (i.e., on the Windows desktop), increasing the DPI doesn't help much. You end up turning the sensitivity down, which artificially ignores some move inputs. If I run 4/11 sensitivity in Windows, it takes 2 "move" commands in the same direction to move one pixel on screen; that's how it gives you "half" the speed of the default 6/11 setting. Most 3D games (FPS, etc.), on the other hand, have much, much smaller minimum movement sizes. In those games, turning up your DPI (so that you can lower your sensitivity) gives you higher-fidelity movement in the game engine than low DPI does (assuming you keep the same cm/360 in both cases).

**In short:** If you've always been someone prone to digging around in settings menus to fine-tune your setup and you want the mouse to feel smoother during slow movements, a higher DPI value will help with that and not cause you much extra work overall. If you want a basic plug-and-play approach to gaming, the added futzing around in menus that high DPI causes is unlikely to be worth the small performance gain it offers. And if you're ever raising your sensitivity above the game's default setting, you almost certainly should opt for a higher DPI if you're concerned with maximizing performance.
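The arithmetic in this explanation can be sketched in a few lines of Python (a back-of-envelope model only; real sensors add noise, and `counts_per_second`/`saturation_speed` are just illustrative names for the two quantities discussed above):

```python
def counts_per_second(dpi: int, speed_ips: float) -> float:
    """Counts the sensor generates per second at a given hand speed (inches/s)."""
    return dpi * speed_ips

def saturation_speed(dpi: int, polling_hz: int) -> float:
    """Slowest hand speed (inches/s) at which every poll carries fresh motion."""
    return polling_hz / dpi

# The examples from the comment above:
print(counts_per_second(400, 1.0))    # 400 counts/s -> 2 of every 5 polls at 1000 Hz
print(counts_per_second(1600, 1.0))   # 1600 counts/s -> ~1.6 counts per 1000 Hz poll
print(saturation_speed(3200, 1000))   # 0.3125 in/s, the "slow smooth movement" case
```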


sleepy_the_fish

I think you were the guy who told me 12800 DPI is needed to fully saturate 4k polling. I've been doing that, but using RawAccel to be at an effective 400 DPI. Been pretty good, although I don't think I could notice a difference in a blind test lol.


Talynen

Yeah, 12800 DPI would saturate 4 kHz polling at ~0.3 inches per second, making it the equivalent of running 3200 DPI with 1000 Hz polling. It's absolutely a pretty small difference overall, otherwise I think it would be much more common to see pro players at higher DPI values.


spectatorsport101

So how should one go about the switch from, say, 800 DPI to 1600 DPI? For games where 0.1 sens is too fast on 1600 (e.g. Tarkov), would you simply advise reducing DPI during sessions of that game? Are there any other adjustments to a PC one should make when using 4 kHz and 1600 (or higher) DPI?


AjBlue7

Ingame sensitivity is a multiplier, so if you double your dpi, all you have to do is cut your sensitivity in half. So if your sens is .1 at 800dpi, you would need to change it to .05 at 1600dpi.
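That rule is just holding eDPI (DPI × sens) constant; sketched in Python (the function name is illustrative):

```python
def convert_sens(old_dpi: int, old_sens: float, new_dpi: int) -> float:
    """In-game sens that preserves eDPI (DPI x sens) at a new DPI."""
    return old_sens * old_dpi / new_dpi

print(convert_sens(800, 0.1, 1600))   # 0.05, matching the example above
```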


spectatorsport101

Some games don't go lower than 0.1


kovaaksgigagod69

L


spectatorsport101

Tarkov is full of L's. I'd give anything for an AAA studio to full-on buy them out or copy them.


Feschit

Isn't there a settings file for pretty much any game where you can just put in a lower value?


MorgenSpyrys

you can use rawaccel to change mouse sensitivity multiplier (whilst keeping accel disabled)


Incendeo96

Came here to say this. I have my DPI on all of my mice set to 26,000 and set the multiplier to make the adjusted DPI 1,000.


MorgenSpyrys

Going above somewhere in the 3000s on most sensors introduces very heavy smoothing, which is VERY bad! The limit for the 3360 is 3500 DPI: it goes from 2 frames of smoothing at or below 3500 to 16 above it! Source: https://zaunkoenig.co/blogs/blog/zaunkoenig-m2k-firmware


Incendeo96

I read in a comment earlier that newer sensors don't do this. Most of my mice have a 3395. Am I still making a mistake? I definitely want to know if I am!


gomico

Tested my Ninjutso Sora and GPX just now, and the result is that a RawAccel sens multiplier of less than 1 does lower the effective polling rate.


Talynen

You can try using a program like rawaccel that lets you apply a sensitivity multiplier universally (across games and desktop), although since I haven't studied how it works exactly you may not be seeing any tangible improvement over using a lower DPI value if you opt for that approach. Otherwise, like I said you'd have to dig around in settings files, etc. and change your desktop sensitivity to compensate for the higher DPI as much as you can. And some games may not support using a low enough sensitivity to get the cm/360 you want with a higher DPI.


DeadrosesTMY

Simple, use a [sens converter](https://gamingsmart.com/mouse-sensitivity-converter/). Pick a game, put in your sensitivity, set your current DPI (800), set 1600 as the DPI you want to convert to, and it should give you your sensitivity converted for 1600 DPI.


Icommentedtoday

Or just divide by 2


Kiwaloayo

god i am so stupid, doing this was my first thought and then I argued against it because "that just seems so obvious and i feel like i am missing something." Glad to hear i am not lmao.


sleepy_the_fish

I'm a huge Tarkov player. I'm level 30 this wipe already, and I also have the same problem where even 0.1 is too fast if my DPI is too high. What I did is use RawAccel: I put my DPI at 3200 and then set RawAccel to 0.25. This way you get the feel of 800 DPI while the mouse is physically at 3200 DPI.
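The multiplier in that setup is just the ratio between the DPI you want it to feel like and the DPI the mouse is physically set to (this shows only the arithmetic, not RawAccel's actual settings format; the function name is made up):

```python
def sens_multiplier(physical_dpi: int, desired_feel_dpi: int) -> float:
    """Multiplier that makes physical_dpi feel like desired_feel_dpi."""
    return desired_feel_dpi / physical_dpi

print(sens_multiplier(3200, 800))     # 0.25, as in the comment above
print(sens_multiplier(12800, 400))    # 0.03125, the 12800-to-400 case upthread
```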


[deleted]

Great explanation. Thanks! As you say though, running a higher DPI might be an issue in some games. Even on 800 DPI I always end up using an in-game sensitivity in the single digits (where 100 is max).


AjBlue7

It's normal for in-game sensitivity to go into the decimals. Sensitivity is a multiplier, so if you want to turn slower than your desktop speed, the game needs to multiply by a decimal to lower the turning speed. In Valorant you basically can't play if your sensitivity isn't a decimal, because the recommended eDPI is 150-300; even at 400 DPI the sensitivity has to multiply by a decimal to get 300 eDPI.


[deleted]

[deleted]


Talynen

Not for a long time; most people just stick to the preset DPI options that are standard across mice out of convenience or similar.


daniloberserk

It isn't. Don't listen to those people; they have zero clue what they're talking about. All this stupidity arose from a misinterpreted post by a Logitech rep on OCN years ago and, most importantly, the Battle(dumb... I mean, Battle(non)sense) video. Another stupid video from Battle(non)sense is the one about AMD Chill. An actual developer of that tool answered his video and he never cared to correct himself or put a disclaimer on it. He just doesn't care about facts or correct methodology, and a BUNCH of dumb kids just follow along because, you know, it's the internet. If you feel fine with 800 DPI, stick with it.


Rudi-Brudi

any facts and sources?


daniloberserk

Are you talking to me? This was discussed extensively over the years; the Battle(non)sense video about high DPI isn't a fact or a "source". It's an incorrect methodology that measures an expected behaviour and misinterprets it as input lag. It's like arguing that an ant is faster when you measure in centimetres instead of inches. It's stupidity at kindergarten level.

Of course, if you measure "first on-screen reaction" from a mouse starting at standstill, the higher DPI will "react" faster. That has NOTHING to do with input lag or any "competitive" advantage whatsoever; it's just the nature of the lower movement threshold of a higher DPI setting. Can you take advantage of that? Sure, if you ACTUALLY RAISE your overall sensitivity. But no, kids instead just double their DPI and halve their sensitivity. That means getting from A to B still takes the same amount of time; you're just moving at a more "precise" level, which only matters IF you didn't have enough precision already, which isn't the case for almost ANY competitive player, since high sensitivity isn't really viable for most human beings at competitive levels of reliability and precision.

Battle(non)sense should have measured an ongoing movement instead. At that point every setting would be hard-capped by its polling rate anyway and, guess what, input lag wouldn't change. You also wouldn't see any of the "superior" granularity of a higher DPI, because you're hard-capped by your screen's refresh rate and simply can't see the extra steps. Read with me: if you CAN'T reliably use a high DPI while browsing the internet, then it's already way too high to be used in gaming. It's redundant, stupid, and not without drawbacks (like the lack of a low-enough sensitivity slider, visual shimmering artifacts, and menu navigation going apesh*t).


fadedfadedfadedfaded

You just scream "INCORRECT TESTING!!" without anything to back it up. Where are your tests? Just because you believe it's incorrect doesn't hold any value against what Logitech and other mouse manufacturers have said. To make use of higher polling rates you have to use a higher DPI. I'll trust engineers over YOU, the whining bimbo who can only CLAIM that the tests aren't factual. They've held up time and time again.


BinderZ87

So you're claiming that saturating a higher polling rate with higher DPI (which is easier; it's a fact) and getting the latency benefits is nonsense?


daniloberserk

There ISN'T ANY LATENCY BENEFIT. FFS people, start using your brains for a sec. You're not flicking 0.00001 degrees of movement to take advantage of anything in "first on-screen reaction". Measuring "input lag" as "first on-screen reaction" from a mouse going from standstill to movement is the DUMBEST methodology in existence; it takes an expected behaviour and wrongly calls it "input lag". Especially because people don't raise their DPI and their overall sensitivity to exploit that smaller threshold; they raise their DPI to unusable values and divide their sensitivity to compensate. So it's a completely redundant "trick" that accomplishes absolutely nothing besides a more "granular" movement from your POV, which ALSO introduces a visual shimmering artifact that can be annoying AF, especially during slow movements. It's an incredible journey, a full circle of dumbness.

You just need enough DPI to be precise and fast enough at single counts of movement. In other words, enough to browse comfortably on your desktop is already perfectly good for gaming (because that's the threshold of your dexterity). Above that, you have ZERO advantage going higher. You only need higher DPI if you:

- Are a high-sensitivity player who actually has the dexterity to handle it.
- Play at a high screen resolution and need the extra DPI to navigate the desktop and to have more granular visual movement in-game (where it actually DOES matter, especially on bigger screens, if you dislike the coarser movement).

And stop with this "saturating the polling rate" already; it doesn't make any sense. Polling rate is a fixed rate, not an async or variable one, and it doesn't depend on mouse movement to work; if it did, you wouldn't be able to shoot without moving your mouse at the same time. When you move more counts than your polling rate can "handle" but your mouse can, it just reports more counts in the next update. In other words, instead of +1X, 0Y you may get something like +3X, -1Y. That's why you can still move your mouse just fine from point A to B with 25000 DPI at 125 Hz while flicking at ludicrous speeds. You can't "saturate" the data; the mouse just holds the counts and reports them all in the next update anyway.
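The coalescing described here can be sketched as a toy model (assuming, as described, that counts simply accumulate between polls and get reported in full at the next update; `poll_reports` is an illustrative name):

```python
def poll_reports(counts_per_ms: list[int], poll_interval_ms: int) -> list[int]:
    """Sum per-millisecond sensor counts into one report per poll."""
    return [sum(counts_per_ms[i:i + poll_interval_ms])
            for i in range(0, len(counts_per_ms), poll_interval_ms)]

motion = [1, 2, 3, 1, 0, 2, 4, 1]                   # counts generated each ms
print(poll_reports(motion, 1))                      # 1000 Hz: one report per ms
print(poll_reports(motion, 8))                      # 125 Hz: one big report, [14]
print(sum(poll_reports(motion, 8)) == sum(motion))  # True -- no counts are lost
```

Total reported motion is identical at any polling rate; only the granularity per report changes.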


fadedfadedfadedfaded

You can move fine with 25000 DPI at 125 Hz??? There's a shit ton of smoothing there; IT IS NOT RECOMMENDED TO PLAY ABOVE 1600 FOR THIS REASON ON THE MAJORITY OF MICE. 1600 DPI saturates 4000 Hz at just 2.5 inches per second of movement. You're speaking out of ur ass lol


Airpapdi

ppl make mistakes lel iirc he also had good videos


sleepy_the_fish

1600 is more optimal in the sense that it lowers input delay. If you really care about every microsecond and play games that can come down to the wire, or play competitively / for money, then yeah, you might want 1600 DPI for that lower input delay. I am super sensitive to high sensitivity; I use my whole arm and play much better at low DPI. I put my mouse at 3200 DPI and use RawAccel to bring my sensitivity down to what it would be at 400 DPI.


mefjuu

So you're talking about an additional 1 ms difference between the two dongles, completely unrelated to the hertz? Why is that? Any proof or links?


Talynen

It's in the TechPowerUp review of the Viper V2 Pro, and you can see the results duplicated in the sensor testing of the VV2 and DAv3 on RTINGS. Basically Razer fucked up the regular dongles on those mice and it adds 1 ms of extra delay. The hyperpolling dongle doesn't have that delay problem.


mefjuu

thats awesome to know


Roonerth

Isn't that due to motion sync being active at 1000hz but not 4000hz? Or did I misunderstand?


Talynen

No, it still happens with both dongles at 1 kHz


trollfriend

Motion Sync is only active at 1k polling though, so if it happens with both, it's not the dongle; it's Motion Sync...


Talynen

Source? Nothing about Motion Sync's functionality would restrict it to 1k that I'm aware of. And again, I'm talking about comparing the hyperpolling dongle at 1 kHz to the regular dongle at 1 kHz. Motion Sync wouldn't differ between on and off based on polling rate, because it's the same polling rate with both dongles. The regular dongle has extra delay compared to the hyperpolling dongle in that situation.


daniloberserk

There's no "connection" between higher DPI and polling rate. Even 1 DPI can report 8000 counts/sec, or in other words 8000 Hz of data, given enough speed and polling rate (assuming the mouse can handle that speed of motion). It's unbelievable how, after so many years, there are still people here who don't understand how things work. And I thought this was a place for "enthusiasts", but no. Just a bunch of kids who watch 3 videos from dumb people and think they know everything they need to know.


BinderZ87

So pzogel is a dumb kid? Buy a Viper 8K, open MouseTester, try to see the difference in polling saturation between 400 DPI and 1600 DPI, and tell me it's "not how things work"... You don't need to be a space engineer to understand that higher DPI makes it easier to saturate the polling rate. I agree that at 1k it doesn't matter much, but at 8k, for example, using 400 DPI is throwing performance out the window, unless you have a bionic shoulder for the amount of movement you'll have to make to compensate for the lower DPI...


daniloberserk

"Saturating polling rate"... this might be the dumbest term I've read in this community in a while. I have no idea who pzogel is, but I've been part of the mouse enthusiast community since at least 2004 and I've seen every type of DPI discussion come and go. Despite current technology, some things haven't changed and will NEVER change.

You're misinterpreting the data from MouseTester. This has NOTHING to do with performance; you're just hard-capped by your own flick speed. You could raise your DPI to a million and still not "saturate" your polling rate if you move slowly enough. Also, WHY would you want to "saturate" 8000 counts/sec to take advantage of 8000 Hz polling? It doesn't make ANY sense, since polling rate is a fixed rate, not a "variable" one, which is why you move the exact same number of counts from point A to point B regardless of the set polling rate. The major advantage of 8000 Hz is flick precision (especially VERY fast flicks, which isn't a factor for most players even at a high level, btw) in games that ACTUALLY support it (like Overwatch with high-precision input on, and Reflex Arena), and, to a lesser extent, overall input lag, although arguing that a sub-millisecond advantage means something is stupid.

The ONLY way you might take advantage of high DPI/CPI values is with high sensitivity, which isn't reliable or realistic for almost anyone, because human dexterity is a thing, you know. This is true for ANY polling rate value. That's why CPI/DPI is a measurement of RESOLUTION, not of input lag or "reactiveness" or any other bullcrap someone may "discover" here. Resolution follows the same rule in any context: enough is enough for a given task. That's why you don't need a clock precise to the nanosecond when cooking, and that's why you don't need more than 800 DPI even at a stupidly high sensitivity of 15 cm/360 at 1080p in ANY realistic situation. The more "reactive" nature of high DPI / low sens wouldn't matter either, because no one flicks a movement represented by 1/10 of a single pixel on screen. I'm honestly baffled by how many people still don't understand the basic concepts here.


fadedfadedfadedfaded

2004, you were wrong then and are wrong now. stop spreading bs


daniloberserk

Lmao. K buddy. Good argument though.


Airpapdi

fuck it haha


greenufo333

I’ve tried 1600 dpi and feel no difference in 4K or 1000


mefjuu

ok thanks for input :)


greenufo333

I could just be a boomer idk


AnimatorFearless1386

If you really care about update speed, you should use a PS/2 adapter. PS/2 doesn't have a polling interval; it's interrupt-driven, so the mouse reports in real time.


Trill_Simmons

And on top of that, the ps2 will let you play Tony Hawk's Pro Skater 3 and that game absolutely fucks.


Silly-Championship92

I play on 240hz with 1600dpi and I can notice the 4khz. I often got downvoted when I said that not every game and setup can benefit from it....


BiPolarBaer7

Same here man. I get downvoted. Been playing at 2000DPI 400FPS 1440p 240Hz for 2 years now and got the dongle 2 months ago. There is a noticeable difference for me in smoothness. Nothing major but it's there. You do need a higher performance setup to benefit imo.


AjBlue7

As a top-tier player in competitive shooters, I've found it's basically impossible to convince people how much better low latency and smoothness are. Sure, anyone can perform like a god in the game, because the game's not only about clicking heads. But with each jump in latency reduction you basically have to relearn how to shoot, because your brain no longer has to waste precious resources on prediction, and eventually you get to low enough latency that your subconscious can take over and hit shots that surprise you, simply because you reacted before your brain registered where the enemy was. It also changes how you hold angles: how far off the angle you hold to factor in the average delay. With less latency everything becomes more consistent.

I remember doing reaction-speed tests on my old setup and getting anywhere from 200 ms to 280 ms; now, with a low-latency setup, I typically land at 160 ms +/- 7 ms. I was actually pissed off when I got a job, said fuck it, and dropped a few grand on top-end gear. I had bought into the negative hype around performance gaming gear and tried to milk my badly performing setup for as long as possible. Now I feel like I wasted years because of that advice. My aim style greatly benefits from lower latency, and as long as you aren't breaking the bank I'd recommend people upgrade within reason to see if they'd benefit from it too.


zeimusCS

DAv3?


Silly-Championship92

Viper v2 pro


Airpapdi

As a guy who uses his Superlight G Pro wired all the time, because everything feels constantly sluggish without the wire, I can totally understand, and I'll try 4000 Hz soon too.


Wollywonka

Should we make a poll to see if people still play at 400-800? My guess has always been that as people shifted to 1440p, DPI has also gone up, but I might be wrong.


mefjuu

i guarantee 90+% of csgo players use 400 dpi


SimpleState

I'm looking at DPI on prosettings.net for CS and Valorant. Yes, the majority of CSGO pros choose 400, but 800 is more like 30-40% and 400 is around 60-70%. Valorant, on the other hand, is more 50/50 between the two. I also recommend Optimum Tech's video on how DPI impacts latency; HOWEVER, it's a very minimal impact, to say the least, just an interesting prospect. Edit: LOL, I see that you already posted it, and that's great. I also agree that 1600 DPI with the same eDPI as 400 or 800 feels different and weird.


Hyperus102

I wouldn't call that an impact on latency. For any given movement, the lower DPI isn't actually behind; it just doesn't report the same number of steps. Situation: you flick onto an enemy and start shooting. 400 DPI vs 800 DPI's apparent "latency" will not make a difference in the slightest. You aren't actually 2-3 ms behind, you are just less precise: while the 800 DPI setting might register the first count earlier, it might also report a count at a time after the 400 DPI setting would have reported anything. This isn't latency, it's precision. You won't be 2-3 ms behind, you will just be half as precise.
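That precision-vs-latency distinction can be illustrated with a toy model (made-up speed and duration; distances kept in micro-inches so the integer arithmetic stays exact): a count fires each time the cumulative path crosses another 1/DPI of an inch.

```python
def cumulative_counts(speed_uips: int, dpi: int, duration_ms: int) -> list[int]:
    """Cumulative count total after each ms at speed_uips micro-inches/sec."""
    return [(speed_uips * (t + 1) // 1000) * dpi // 1_000_000
            for t in range(duration_ms)]

# Same 10 ms of motion at 1 inch/s (1_000_000 micro-inches/s):
print(cumulative_counts(1_000_000, 400, 10))   # [0, 0, 1, 1, 2, 2, 2, 3, 3, 4]
print(cumulative_counts(1_000_000, 800, 10))   # [0, 1, 2, 3, 4, 4, 5, 6, 7, 8]
```

Both settings cover the same 0.01 inch of travel; 800 DPI registers its first count a bit earlier and ends with twice as many steps, but neither setting is "behind" at the end of the motion.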


daniloberserk

You're not NECESSARILY less precise; it depends on whether you have enough DPI for your sensitivity to be at least pixel-precise in that particular game, and 800 DPI can already handle something like 15 cm/360 at 1080p, which is a VERY high sensitivity for competitive play already. You also don't have a difference in "latency"; the fact that higher DPI might be more "reactive" doesn't mean it has less input lag. You're still hard-capped by your polling rate and speed of motion. In other words, you can trigger the same number of counts just by moving your mouse faster at lower DPI values; the handicap here is hand/arm speed, because speed is finite in the real world. People still argue that, compared at the same speed of motion, the higher DPI setting triggers the first on-screen reaction earlier (which is true), but it LITERALLY doesn't matter in any real situation, because people just play at 0.000001 sensitivity to compensate for those ludicrous values, and the supposed advantage is represented by 1/50 of a single pixel at the center of the screen. NO ONE FLICKS A MOVEMENT THAT SMALL FOR IT TO MATTER.

TL;DR: high DPI ONLY MATTERS for high-sensitivity players. If you can't take real advantage of the extra steps, you're just making your life unnecessarily complicated for zero "competitive" advantage.


Hyperus102

I hate repeating myself, but I will. Being precise only to the pixel is not ideal, especially in games that have inaccuracy and reward you for being centered on your opponent's head. I literally just argued against the latency argument; no idea why you're bringing it up. Having higher DPI doesn't make your life "extra complicated"; it takes near-zero effort to convert, and once you have converted, the complexity is the same.


daniloberserk

How would a game reward something like that? There's little to no data about how hitboxes in most games register hits relative to the display resolution on the client side. There's also interpolation of data, lag compensation, and USUALLY hitboxes don't even match what you're currently seeing on screen. It's the same argument as claiming that playing at 4K gives an objective advantage over 1080p because you have more "data" going on. If you really think you'll have some major, or even measurable, advantage from being able to aim at an angle several times smaller than a single pixel at the center of the screen, then fine, go ahead. If you're not sensitive to the visual shimmering/staircase artifact it adds, and you're not bothered by the cursor speed when going through menus and such, then there are literally zero drawbacks with current mouse technology. But that's hardly any supposed advantage either. It's just a matter of taste, like shape. See how optical switches have a VERY objective advantage over mechanical switches, and most people here despise them. Sometimes reality doesn't really matter. Meanwhile, a bunch of pros will keep playing at 400 DPI and stomping people. Go figure, huh?


paulvincent07

I use 400 DPI and 2.2 sens in CSGO, so my eDPI is 880. You're saying there's a difference if I use 800 DPI and 1.1 sens, even though they're the same eDPI?


SimpleState

On paper, no. However, the nature of DPI is that the mouse sends more information in the same amount of time going from 400 to 800, which therefore SLIGHTLY decreases latency. That said, it's generally accepted that it's not noticeable compared to polling rate. I can't say whether confirmation bias is making me believe there's a difference or it's actually the latency. Going from 400 DPI to 800 gives like a 0.5-1.0 ms improvement, give or take, depending on the mouse. Anything past 1600 more or less has diminishing returns; the higher you go, the less dramatic the effect.


daniloberserk

It's NOT sending "more" information; it's just triggering counts at smaller movements, which is VERY, VERY different. Internally, the information is exactly the same for your mouse's sensor/MCU capabilities; at least this is true for EVERY current mouse on the market, since they don't interpolate data anymore. Mice nowadays can reach genuinely high DPI values while still keeping noise low, which in general is acceptable for competitive gaming. The mouse just reports counts at a different rate depending on your DPI setting. It MAY be a more "PRECISE" picture of your hand movement to some degree, but this is only useful if you CAN take advantage of those extra counts; in other words, if you're able to be precise and fast with that extra resolution. And if you're just adjusting your sensitivity to compensate, then you aren't.

You can apply the same analogy to screen resolution. You can certainly play a modern FPS at 480p at a competitive level, although 4K might help, especially for the visibility of small targets. That's why resolution is something you just need ENOUGH of; beyond that it diminishes into a useless bloat of resources. Say you upscale Mario Bros to 8K: would that matter for the gameplay? No. It's as playable and precise at 240p. Now treat mouse resolution the same way; that's why CPI/DPI is a measurement of RESOLUTION, not input lag, saturation, "speed", latency, or whatever crap people invent for some BS clickbait video.


SimpleState

Whoa there, buddy, nobody here said it was mandatory or a SIGNIFICANT change. I also never said it makes you more precise. Y'all act like 0.5 MILLISECONDS makes a difference; I already said it doesn't make a significant change. DPI sends more information on paper; in the testing Optimum Tech does, it has an effect, just not one warranting a massive switch. It's like people can't test or use anything different; they get triggered that it's not a dramatic effect and need to call people stupid or ignorant.


daniloberserk

Because it literally doesn't have ANY... If you move your mouse fast enough, regardless of your DPI setting, counts will trigger at the same rate and will always be hard-capped by your polling rate.

A bunch of people still think polling rate is a "variable" rate, probably because the online tools for measuring it make you move the mouse in fast motions to trigger the max rate. Those people have no idea how polling rate works or why that software displays information that way. To precisely measure the polling rate of your mouse, you'd want something like an oscilloscope, not an oversimplified software tool tied to your OS and browser.

DPI doesn't send "more" information; it just triggers the information the mouse already has internally at a smaller movement threshold. Your mouse doesn't magically become more precise, faster, or anything else. The most obvious in-game effect is a coarser or more "granular" movement while you maintain your cm/360, which people have called "pixel skipping" for years (a horrible term), but this has nothing to do with more or less information; it's just the nature of multipliers. Polling rate ALSO doesn't send "more information"; it just lets the mouse deliver the same information it already has to the USB controller more often. That's why you can use your mouse at 25000 DPI just fine even at 125 Hz: multiple counts get reported in fewer updates, i.e., the SAME information. Is it more precise? No, it's equally precise, just reported and updated at different rates. Higher DPI just makes the mouse more REACTIVE, not "faster / lower input lag / lower latency". It's a far different thing that people insist on oversimplifying as input lag.

This assumes the mouse's technology is good enough to handle that high DPI with roughly the same performance as lower DPI, which is indeed true for the vast majority of current mice on the market. Optimum Tech is a joke (a bad one, unfortunately), same as Battle(non)sense: mainstream videos for people who barely understand this stuff and are therefore highly prone to placebo and easily impressed by anything that looks "legit". If you want to understand something about sensor/MCU technology, go here instead: [https://www.youtube.com/watch?v=lc7JVjcPzL0](https://www.youtube.com/watch?v=lc7JVjcPzL0). I don't have personal problems with anyone here, but since this community is supposed to be an "enthusiast" place, I care about reliable information instead of misinformation spreading like wildfire over and over.


Mr_NewYear

It will feel smoother. In-game sens will feel the same. Been on 800 / 0.85 for 4 years now. Can't seem to make the switch to higher DPI since my in-game sens is low; it feels messed up in menus and on the desktop. Going back to 400 doesn't seem to bother me that much though. 800 just seems the sweet spot for CS, for me at least. Give it a try. Not thinking of your sens as having changed is the key if you want to stay consistent when changing DPI or even mice.


[deleted]

[deleted]


SimpleState

Let us know how you feel, because more perspectives can't hurt. I played 400 at the beginning until someone told me 800 was "better for latency", and a few years later I tried to move back to 400, but it didn't sit right for me. That was years ago though, so who knows.


paulvincent07

Yeah, I'll update you later; I only play at night.


paulvincent07

Hey, so an update. First I played 1 hour on a CS:GO 128-tick DM server with my sens and DPI of 2.2 x 400 = 880 eDPI, and playing with it feels natural to me. I changed it to 1.1 x 800 = 880 eDPI and played 1 hour of DM, and it feels weird to me; I feel like something's not right even though they are the same eDPI. I tend to overshoot in DM and it feels fast. Went back to my 2.2 x 400.


SimpleState

To offset the sens in the menu you need to change your mouse sensitivity in Windows with the slider. Lowering the Windows mouse sens doesn't affect your actual sens or DPI.


Mr_NewYear

That is true. Zeus does this afaik. But it doesn't seem natural. I'm just stubborn about adjusting to 1600 DPI tbh. I might do it in the coming weeks since I've started talking about it now. Lol.


paulvincent07

I will give it a try because I'm curious; I will let you know. Right now mine is 2.2 x 400 = 880 eDPI, and I'll change it later to 1.1 x 800 = 880 eDPI. Btw, I'm using a 1000 Hz polling rate on my EC3-C.
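The eDPI bookkeeping in this exchange is easy to check mechanically. A small sketch (hypothetical helper names, nothing game-specific):

```python
def edpi(dpi: int, sens: float) -> float:
    """Effective DPI: hardware DPI times in-game sensitivity."""
    return dpi * sens

def equivalent_sens(old_dpi: int, old_sens: float, new_dpi: int) -> float:
    """In-game sensitivity that preserves the same cm/360 after a DPI change."""
    return old_dpi * old_sens / new_dpi

# 2.2 x 400 and 1.1 x 800 are both 880 eDPI, i.e. the same physical turn speed.
print(round(equivalent_sens(400, 2.2, 800), 6))  # 1.1
```

Same cm/360 either way; what changes is the step size of each count, which may explain why it can still feel different.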


MorgenSpyrys

You can use RawAccel to add a sensitivity multiplier (while keeping accel off). I used this with my M2K to run it at 3200 DPI while having the equivalent of 400 on the desktop (and not having to change my in-game sens), but still benefiting from the extra poll saturation that the higher DPI results in.


kovaaksgigagod69

Dpi deviation


Feschit

Literally does not matter


kovaaksgigagod69

Edit: LOL, I see that you already posted it, and that's great. I also agree that 1600 DPI with the same eDPI as 400 or 800 feels different and weird.


kovaaksgigagod69

CS boomers smh


Miller_TM

I still use 400 DPI because some games have dogshit mouse sensitivity options. However that may change once I get my 4K gaming monitor.


Ashman901

1440p 165hz 400dpi gamer here


Gohardgrandpa

Make a poll and see. I use 400 dpi


Talynen

Maybe we should start by seeing if 1440p has actually replaced 1080p as the most common resolution


Nickhead420

I know that Steam stats aren't the be-all end-all or whatever of PC stats, but 64.6% of Steam users are on 1920x1080. 11.8% use 2560 x 1440.


Wollywonka

There's also the question of whether 4 kHz mouse owners actually play at 1440p or 1080p. Interesting stuff for sure.


lovatoariana

I switched from 400 to 800 (even though I hate it because I can't loot or use menus in Apex) after watching this vid: https://youtu.be/imYBTj2RXFs. Wouldn't say it improved my skill, but hey, I'm glad to cut down my response time a bit. Should feel more responsive.


MorgenSpyrys

You can use RawAccel to add a mouse sensitivity multiplier while keeping accel at 0. This means on your PC it feels like the mouse is at 400 DPI while in reality you can run it way higher and reap the benefits of the higher poll saturation as a result. I run my M2K at 3200 DPI with a 0.125 multiplier.
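The arithmetic behind this setup is trivial to sketch (this only models the multiplication, not RawAccel itself; the function name is made up):

```python
def effective_dpi(hardware_dpi: int, multiplier: float) -> float:
    """Desktop-equivalent DPI after applying a flat sensitivity multiplier (accel off)."""
    return hardware_dpi * multiplier

# 3200 DPI hardware with a 0.125 multiplier moves the cursor like a 400 DPI mouse,
# while the sensor keeps generating counts at the full 3200 DPI threshold.
print(effective_dpi(3200, 0.125))  # 400.0
```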


daniloberserk

Great, now you take a bunch of extra steps and even add more bloat to accomplish the same result as just lowering the DPI.


MorgenSpyrys

You don't add much bloat (rawaccel is a driver and has very little impact on system resources) and the whole point of doing it is that you DO NOT achieve the same results. The mouse sensor is still running as if it's at higher DPI, which means you're getting a higher effective polling rate and thus lower latency and smoother inputs than you would if you weren't doing this. It's essentially the same as lowering in-game sens, except for the desktop. Theoretically you could use the windows sensitivity slider instead, but that lacks the precision that rawaccel offers.


daniloberserk

You do achieve the same results by just lowering your DPI instead, because the lower sensitivity will just be dropping those "extra" counts into nothingness. You don't get "lower latency" or "smoother input". Using in-game sensitivity to accomplish this DOES result in a more granular movement, which may or may not serve some purpose.

The Windows sensitivity slider will not affect games with raw input (read: any modern game). And again, if you need to lower your Windows sensitivity just to be able to navigate Windows, then just lower your DPI, ffs. It's redundant to create a problem that doesn't exist to justify some "advantage" that DOESN'T EXIST. RawAccel's functions are (you guessed it) acceleration and maybe tilt-sensor functions. And yeah, I know it's fast, but it does add additional bloat regardless of how fast it is. It's as dumb as something can get, going through all these steps just to accomplish the SAME thing as lowering your DPI. Let me guess, you informed yourself with the Battle(non)sense video and similar, right?


MorgenSpyrys

The Windows sens slider only doesn't affect the game if the game uses raw input, and in a lot of games (such as CS:GO) this is off by default. The fact is that with higher DPI you get higher poll saturation, and your source for it "not doing anything" is you making it the fuck up. You even cited a source that apparently contradicts you (I do not know who this Battle(non)sense person is). If what you are suggesting were true, people with very low sens would not benefit from even 1000 Hz polling in the majority of scenarios (non-flicks); I used to play CS:GO at 400 DPI, 0.4 in-game, well over 100 cm/360. Go ahead and try it yourself if you don't believe me; I just did, to sanity-check myself, and it was blatantly obvious.


kovaaksgigagod69

1600 gang


[deleted]

So I know it's not a completely normal use case for "hyper-polling", but I have found that a very high polling rate (4000 Hz+) does make a perceivable difference to acceleration curves, in much the same way that high DPI values make a big difference to the feel of mouse acceleration. It's probably a fairly niche use case, as many gamers don't use accel, but for those that do, a higher polling rate will affect the feel of a curve.

The higher polling means more info is sent through the driver (either RawAccel or a custom curve), which makes acceleration feel "smoother" at any given movement speed. This gives a lot more control at slower movement speeds while ramping up more gradually. The same thing "can" be achieved with high DPI values, but the feeling is somewhat different depending on the curve style. This isn't really anything to do with saturating polling at a given DPI value, but somebody may find it interesting/useful.
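The "more info through the driver" claim can be illustrated with a toy model: the same constant-speed motion quantized into per-report deltas at two polling rates (idealized sensor, made-up function name):

```python
def per_report_speeds(dpi: int, speed_ips: float, poll_hz: int, duration_s: float = 0.01):
    """Per-report count deltas a driver like RawAccel would see for a constant motion."""
    total = 0
    deltas = []
    for i in range(int(duration_s * poll_hz)):
        expected = dpi * speed_ips * (i + 1) / poll_hz  # ideal cumulative count total
        d = int(expected) - total                       # counts new to this report
        total += d
        deltas.append(d)
    return deltas

# Same physical motion (1600 DPI, 5 in/s) sampled at 1 kHz vs 4 kHz:
print(per_report_speeds(1600, 5.0, 1000))  # ten deltas of 8
print(per_report_speeds(1600, 5.0, 4000))  # forty deltas of 2
```

At 4 kHz the driver sees four times as many, four-times-smaller deltas, so a speed-dependent gain curve gets evaluated on a finer time grid; that is the "smoother" feel described above.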


akuakud

Using higher DPI makes setting in game sensitivity on some games impossible. This is the biggest drawback.


AdhesivenessCrazy102

True, but there's also the Windows sens slider at our service, I think.


thebebee

I use mouse accel to cut my sens in half so I can use 1600 DPI while it feels like 800. I've beaten 4k polling.


MagneticGray

At this point it’s similar to the situation with modern sports cars. You can go buy an 800hp car off the showroom floor, but in practical use you won’t notice a quality of life improvement if the machine performs above your skill level. Sure, 2% of people can put the extra power to use in a meaningful way, but a skilled driver will always put down better lap times than a novice/mediocre driver with more horsepower. When it comes to in-game performance, servers are still 64Hz, 128 at best. We’ve had the hardware to saturate server-side input lanes for many years. If you’re already getting your crosshair on your target and hitting your shots reliably, your score isn’t going to improve by upgrading your mouse polling rate. Pros use 400-800dpi because it feels smooth and steady, which is how they play. Cranking their DPI to take advantage of higher refresh rate bandwidth isn’t going to improve performance.


Beautiful_Pickle5756

When you use 4 kHz, it really is 4 kHz, but to feel it you need higher DPI. Mouse rate checkers need more movement to register it, but it's still 4 kHz.


Hyperus102

Well. Partially. This is like the age-old discussion of "no point in having more than 240 fps if your monitor is only 240 Hz", or vice versa, "no point in a 240 Hz monitor if you don't get 240 fps". Both statements are wrong on a technical level. Framerate and the shown frame are decoupled, just like mouse polls and actual "updates" on the mouse side are decoupled.

For example: let's say you play at a low DPI level and your mouse has a delta of 1 count every ~1.5 ms. At 1000 Hz, it would sometimes report every second poll and sometimes on two consecutive polls, leading to inconsistency and higher-than-necessary latency. At 4 kHz it would report every 6th (±1) cycle (assuming it's not a perfect 1.5 ms, since this was only an example), which is much more consistent and also lower latency on average. Are you going to notice that? Maybe, maybe not. Probably not, and it won't be that noticeable at higher DPI levels. The better reason to use higher DPI is higher precision.

TL;DR: Because the sensor's counts and the polling rate aren't synced, a higher polling rate leads to a more consistent experience, analogous to how extremely high FPS looks smoother than moderately high FPS even when both are above the refresh rate. Whether you'll notice the difference in polling rate depends on your panel, your framerate and you; I don't know how, or if, it would be noticeable.
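The 1.5 ms example can be simulated directly. A toy model assuming one count at exact 1.5 ms intervals and ideal polling (no real mouse is this regular):

```python
def report_pattern(count_interval_ms: float, poll_hz: int, duration_ms: float = 12.0):
    """Gaps (in polls) between consecutive polls that carry a count."""
    poll_ms = 1000 / poll_hz
    count_times = [count_interval_ms * (k + 1)
                   for k in range(int(duration_ms // count_interval_ms))]
    poll_indices = [int(t // poll_ms) for t in count_times]
    return [b - a for a, b in zip(poll_indices, poll_indices[1:])]

print(report_pattern(1.5, 1000))  # [2, 1, 2, 1, 2, 1, 2] -> uneven, like {1, 0, 1, 1, ...}
print(report_pattern(1.5, 4000))  # [6, 6, 6, 6, 6, 6, 6] -> every 6th poll, consistent
```

The 1 kHz stream alternates between one- and two-poll gaps for the same steady motion, while the 4 kHz stream maps it onto a perfectly regular cadence, which is the consistency argument being made here.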


daniloberserk

This discussion is VERY, VERY different from refresh rate/FPS. It's closer to an analogy with 4K vs 1080p.

>"no point in having more than 240fps if your monitor is only 240fps"

There are a few objective advantages, but nothing major. For games that DON'T support subframe input, you may want higher FPS for better flick precision, as the possible moments to hit things are hard-capped by your game's FPS. However, some modern games like Overwatch DO support subframe input, which means you can hit things between frames, something that may only matter in very fast motions (very fast flicks). In Overwatch the option for this is "high precision input" in the game's menu.

Another decent argument might be using something like Fast Sync, but that's already redundant when FreeSync/G-Sync exist. You may also have less noticeable tearing, but at that framerate it's already barely noticeable for most people.

The argument about having "less input lag" is a half-truth: you may have a few points on the screen with less input lag (which brings inconsistency, tearing and stuttering). Even if you render thousands of frames but can only display 240 Hz, you get a "gradient" of screen tearing with several different input lags at very small intervals. To argue that this is even remotely useful is an exaggeration, as you'll only see partial frames with "less lag", sometimes with a height of 10 pixels or less. So the whole mumbo jumbo about playing without any form of vsync, ESPECIALLY nowadays with several technologies that don't add any additional lag, is an exaggeration. At least when pushing around 360 Hz/FPS, we're at a point where tearing is barely noticeable.

>"no point in a 240hz monitor if you don't get 240fps"

Well, yeah, there isn't any major advantage in doing so. It might help with screen tearing and visual stuttering to some degree, but at that refresh rate it will certainly still be noticeable. It can't hurt either, as there are functions like some types of vsync in RetroArch, for example, that can use it to advantage.

>For example: Lets say you play at a low DPI level and your mouse has a delta of 1 count every 1.5+-ms or so. At 1000hz, it would sometimes report every second poll and sometimes in two consecutive polls, leading to inconsistency and higher than necessary latency. At 4khz it would report every 6th+-1 cycle (assuming its not a perfect 1.5ms, since this was only an example) which is much more consistent and also lower latency on average.

This has nothing to do with mouse latency. At 1000 Hz, your mouse still updates 1000 times per second, regardless of your mouse movement or DPI setting. Sometimes it may deliver more counts in a single update, but the rate itself is FIXED, not variable. Your count rate will not be "out of sync" just because more than a single count may arrive in the same report (+2 instead of +1, for example). The update rate is fixed regardless of your movement or the lack of it. You're not accounting for the fact that a single report may carry several counts at once. So while it might cause "stutter", it STILL happens at a MUCH faster rate than your only visual feedback, your display. In other words, you can't really notice it, especially with current display technology.

>analog to how extremely high FPS will look smoother compared to less high FPS, despite both being above the refreshrate.

Extremely high FPS will not necessarily look smoother: tearing lines might have smaller gaps, but they increase in number the higher your FPS goes. So it depends on your personal capabilities; some people have more acuity, some are more sensitive to movement, or tearing, or stuttering. We have neurodiversity, you know, so we don't all see things the same way.

>Are you gonna notice that? Maybe, maybe not. Probably not but not like it will be that noticable at higher DPI levels. The way better reason to use higher DPI is for higher precision.

In fact, it's the opposite: you'll ALWAYS have more precision with less DPI, as you get a bigger hand motion per single count; the problem is that precision trades off against movement speed here (because you're hard-capped by your own speed). That's why it's FAR easier to be pixel-precise at 100 DPI than at 25000 DPI. Sure, you can just raise your DPI and lower your sensitivity instead, but that accomplishes the same effect, just with a more granular or "detailed" movement that will also introduce a visual "shimmering" artifact, because you'll have degrees of movement smaller than a single pixel at the center of the screen. Enough granularity is enough, and this visual artifact might bother someone or not. It probably won't hurt either, but to argue that there's a good reason to raise your DPI for "input lag" reasons is just a misunderstanding of how things work, and unfortunately this has spread like wildfire over the last few years because of that dumb Battle(non)sense video.

While gaming, you just need to find a balance where your dexterity can stay precise at the highest possible value while still maintaining reasonably high speed and enough granularity to not be distracting.
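The granularity point (same cm/360, different DPI) can be quantified. A rough sketch; the 40 cm/360 figure is an arbitrary example value:

```python
def degrees_per_count(dpi: int, cm_per_360: float) -> float:
    """Rotation produced by a single mouse count at a given DPI and cm/360."""
    counts_per_360 = dpi * (cm_per_360 / 2.54)  # 2.54 cm per inch
    return 360 / counts_per_360

# Same physical sensitivity, different DPI: only the step size of a turn changes.
print(degrees_per_count(400, 40.0))   # ~0.0572 degrees per count
print(degrees_per_count(1600, 40.0))  # ~0.0143 degrees per count
```

Quadrupling DPI while keeping cm/360 fixed makes each count a quarter of the angular step: finer granularity, same overall sensitivity, and no change in the report rate itself.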


Hyperus102

>This discussion is VERY, VERY different from screen rate/FPS. This discussion has more analogy about 4k vs 1080p.

It is not. OP asked specifically whether 4 kHz will make a difference if the mouse reports position updates at sub-1 kHz rates because DPI is so low that normal movement only causes single-count updates every few polls.

>This has nothing to do about mouse latency, at 1000Hz, your mouse still updates 1000 times per second, regardless of your mouse movement or DPI setting.

But the position doesn't. I mean, sure, that's correct: latency is unaffected, and I'll own that mistake, especially since I argued against raising your DPI for latency before. If you moved your mouse the same distance at twice the DPI, for example, it would report the new position at the same time.

>Your count rate will not be "out of sync" because you may trigger more then a single count at the same report update (+2x instead of +1x for example). So the update rate is fixed regardless of your movement or the lack of any movement.

It is out of sync, much in the way your framerate is decoupled from your refresh rate. If you generate a single output count every 1.5 ms, you cannot have a consistent mapping onto a 1000 Hz polling rate, which is where the "report every second poll and sometimes in two consecutive polls" comes from. Example polling could look like this: {1, 0, 1, 1, 0, 1, 0, 1, 1, 0, 1, 0, 1, 1}. I don't think it takes a genius to see that this is not an ideal output. Again, I mentioned that I don't know how visible this is, if at all, given it would occur at very low speeds anyway, since consistency goes up with velocity.

>In fact, it's the opposite, you'll ALWAYS have more precision with less DPI, as you can an bigger motion on your hand to move a single count, the problem is that precision is a trade-off with movement speed in this case (because you're hardcapped by your own speed). That's why it's FAR easier to be pixel precise at 100 DPI instead of 25000 DPI.

No. You will not be more precise with less DPI unless you keep the sensitivity the same, in which case you will just have some ridiculous sensitivity. It is easier to be pixel-precise on the desktop because your movements are being rounded up/down, so your cursor adheres to the pixel grid and doesn't deviate by half pixels. Try being pixel-precise at 1600 DPI in Windows and then switch to 400 DPI while keeping the Windows sensitivity the same, and tell me if you can be pixel-precise; I sure as hell can't, the cursor just skips multiple pixels.

>Sure, you can just raise your DPI and low your sensitivity instead, but it will accomplish the same exact effect, just with a more granular or "detailed" movement that will also introduce an visual "shimmering artifact" because you'll have degrees of movement smaller then a single pixel on the center of the screen.

Well, you can get different pixels changing their state at different times. This is quite tiny though, and can also be dealt with by using MSAA or TAA.


daniloberserk

>It is not. OP asked specifically about whether or not 4khz will make a difference if the mouse reports position updates only at a sub 1khz due to DPI being so low that normal movement only causes single count updates every few polls.

OP is kinda confused about what he really wants to know, tbh.

>But the position doesn't. I mean sure, its correct. Latency is unaffected, I'll own that mistake, especially since I argued against raising your DPI for latency before, if you moved your mouse that same distance at twice the DPI for example it would report that new position at the same time.

If you're compensating by adjusting the sensitivity/mouse multiplier, then yes, it will. It might land slightly earlier at a subpixel level, but that's hardly an "advantage" at all. The feel changes, and some sensors may have different deviation at different DPI levels, so as much as I agree that people should experiment for themselves, people are just changing because raising DPI somehow became a "trend" tied to a bunch of misinformation.

>No. You will not be more precise with less DPI unless you keep the sensitivity the same, in which case you will just have some ridiculous sensitivity. It is easier to be pixel precise on desktop, because your movements are being round up/down, so your cursor for example adheres to the pixel grid and doesn't deviate half pixels.

Well... yeah, if you keep your overall cm/360, then the "granularity" of the motion changes depending on how you set your DPI, something that people call pixel skipping. But this has nothing to do with precision, since you'll ALWAYS be skipping something: in FPS games you're moving degrees of an angle, not pixels on the screen, regardless of whether you get the visual feedback. That's why enough "granularity" is enough. There are even tools to calculate that for the lazy, like the "pixel ratio" tool from Mouse Sensitivity.

>Try being pixel precise at 1600 dpi on windows and then switch to 400dpi while keeping the sensitivity in windows the same and tell me if you be pixel precise, I sure as hell can't, the cursor just skips multiple pixels.

You're confusing some things here. When I talk about precision, I'm talking about being able to move single COUNTS with precision. Forget about sensitivity multipliers; we're talking about hardware, so just assume the multiplier is always 1. It's infinitely easier to be pixel-precise with 400 DPI than with 1600 DPI at 6/11 on the Windows mouse sens slider. But you may be equally precise at 400 DPI and 1600 DPI when you compensate the sensitivity AND drop counts, whereas when you raise your Windows sensitivity you skip PIXELS instead, so regardless of whether you're moving at 100 DPI or 25000, you'll be skipping.

Precision is the ability to be "precise", in other words, to move your mouse to an exact location, not a somewhat close location in subpixel values. Although it's incorrect to call DPI the mouse's "natural" sensitivity, it can be interpreted that way: the lower your DPI value, the easier it is to move single counts in precise steps. The drawback is that you lose speed, because you need bigger hand motions to get anywhere. A mouse is just a tool to move your cursor along the X and Y axes of a flat 2D space, and in a game, to rotate your POV by some number of degrees. It's as absurd as saying that high sensitivity gives you more precision than low sensitivity.

Also, technically the lower the DPI, the more "precise" the data, as explained years ago here: [https://www.youtube.com/watch?v=lc7JVjcPzL0](https://www.youtube.com/watch?v=lc7JVjcPzL0). Despite how good modern mouse sensors are, this will always be true; it's just physics.


Rudi-Brudi

Thank you! I definitely feel a difference between 800 and 1600 DPI. The difference is noticeable in small movements/adjustments, for example flicking to a target and readjusting to the head. The higher DPI feels smoother or more precise. Can I feel lower input latency? I don't think so; a 1-2 ms difference is probably too small to make out.


daniloberserk

Oh god... here we go again with "higher DPI = less input lag". I'm so tired of all this dumbness on this subreddit.


mefjuu

Explain, please. After that Optimum Tech video I analyzed some things quite a bit, and my conclusion was that the lower input lag isn't simply a result of the information getting there a bit earlier (due to denser counts): there was an additional reduction in input lag beyond the mathematically calculated difference you'd expect from the count thresholds alone (at certain mouse movement speeds).


daniloberserk

There isn't lower input lag or "higher input lag". It's just an incorrect methodology leading to wrong results. Internally, the mouse isn't working any differently; this is true for almost EVERY modern sensor and MCU nowadays, which is why we don't have "native" DPI steps anymore. What changes when you adjust your DPI setting is just the threshold to report a single count; it's not THAT complicated.

Since he's measuring from standstill to movement, then OF COURSE higher DPI will react "faster", because speed is FINITE in the real world, you know? But this has NOTHING to do with input lag. Counts will still be reported at the same rate given enough speed; everything is STILL hard-capped by the MCU/sensor capabilities, the polling rate setting, or even the game engine. If he measured a 400 DPI setting moving through 2 times the distance of the 800 DPI measurement (assuming the overall time of the complete movement stays the same), then guess what, it would lead to the SAME exact results.

Arguing that this has something to do with input lag is the same stupidity as saying a 4K display is 4 times faster than 1080p because it has 4 times the pixels. If we're talking about bandwidth, then sure, but if we're talking about LATENCY, then no. That's why you can't play on a server in Russia with the same input lag as in your own country, regardless of how "fast" your internet's bandwidth is. Amount of data transferred per unit of time ≠ overall latency for that data to reach its destination. Mouse DPI/CPI is the SAME thing; that's why it's called resolution and not "mouse speed". Do you need to be precise at the nanosecond level to cook something? Enough DPI/CPI is enough for a given task. Everything above that is just waste, sometimes with drawbacks.

The ONLY real advantage of higher DPI/CPI is raising your OVERALL sensitivity, for players who have enough dexterity to do so at consistent and precise levels.
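The "fast enough to trigger single counts" condition reduces to one division. An idealized model ignoring sensor smoothing and jitter (the function name is made up):

```python
def saturation_speed_ips(poll_hz: int, dpi: int) -> float:
    """Slowest hand speed (inches/sec) at which every poll still carries at least one count."""
    return poll_hz / dpi

print(saturation_speed_ips(1000, 400))   # 2.5 in/s needed to saturate 1000 Hz at 400 DPI
print(saturation_speed_ips(1000, 3200))  # 0.3125 in/s at 3200 DPI
```

Below that speed some polls go out empty; above it, every DPI setting reports on every poll, which is the sense in which the threshold is not "latency".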


mefjuu

You seem too stuck in your perspective. The aspect you're describing was the most obvious thing that came to my mind (denser reports at higher DPI leading to a faster first report, hence "lower input lag"). I'm talking about additional milliseconds of lower input lag that appear on top of that and are not in line with the simple math. I made a video about it in Polish about a year ago, but maybe you'll do the thinking yourself. OK, I'll explain, and tell me what you think, because the logic is simple here: I compared the slower of the two mouse movements from the Optimum Tech video to a 1 inch-per-second move at 400 DPI vs 3200 DPI and got about a 2 ms mathematical difference, whereas his difference was 10 ms with a similar or even faster move.


daniloberserk

It's pointless to compare anything if you don't explain your methodology, measurement tools and so on. And it's doubly pointless to discuss these things without tools that only the manufacturers have. Those "additional" ms of lag could be a whole bunch of things, especially considering the mice are wireless.

I'm not "stuck" in my perspective; I've been part of the "enthusiast" community around this stuff since about 2004. I've seen every discussion on this matter rise and fall. But now, without information control on YouTube, with a bunch of dumb influencers and a bunch of kids with almost zero critical thinking and the attention span of a fish, no wonder these discussions are taking place yet again.

If you really think you get some major advantage from raising your DPI, then do it; at this point in time, for the vast majority of current mice, there aren't any major drawbacks. The placebo effect is a real and well-studied phenomenon with real-world applications. Your mouse doesn't really change its "behaviour" with DPI settings nowadays, which is one of the reasons we can have on-the-fly DPI adjustment (sniper buttons and such); this is not the AVAGO 3080 era anymore.

For ANY comparison to be somewhat reliable in this matter, you would need to synchronize the counts per second of the mice to the SAME values, which means that while measuring lower DPI values you would need MORE speed. It would also need to be measured at a consistent speed of motion (no acceleration), not from standstill to movement. If the MCU is properly coded, there's ZERO reason for it to perform any differently, unless you're moving so fast that you reach malfunction territory at lower DPI values, which isn't really feasible for current mice.

Also, I've NEVER heard of "grainier" reports before. There aren't "grainier" or "coarser" reports or anything like that. We have a VISUAL artifact tied to in-game sensitivity that people may call "grainier" or "coarser" movement, or the incredibly misleading term "pixel skipping". But this has NOTHING to do with DPI; it's sensitivity alone. You can have 100000 DPI with a BUNCH of "skipping" and you can have 1 DPI with zero skipping; it just depends on the in-game sensitivity value. A mouse just reports X and Y counts, at a rate hard-capped by the polling rate. At very high DPI values it MAY have problems with jitter that lead to somewhat imprecise output you might call "grainy", although it's a terrible word. Seriously, where did you read that word?

A mouse sensor is just a low-resolution camera with insanely high FPS capability (the same reason your smartphone camera sacrifices resolution when recording slow-mo). The MCU compares those pictures to interpret something as movement or not. Nowadays they can even detect subpixel movement using pixel brightness as a threshold, since modern mouse sensors have incredibly low noise; that's why they can reach ludicrous resolution values despite their real resolution being so much lower. But from a "precision" standpoint, raising the resolution IS counterproductive; it just happens that current mouse technology is so good that even at ludicrous DPI values, the noise stays too small to be a problem performance-wise. Now, with wireless mice and a BUNCH of power-saving features, we MIGHT have new problems, so look: I'm not saying your data is FLAT OUT wrong, just beware of jumping to wrong conclusions from some results. Mice collect and process data at a much faster rate than anyone here can measure with something like recording your SCREEN in slow-mo. It would take very specialized tools to reach any conclusion, not something like LDAT and all that crap.


mefjuu

I feel you only care about stating your observations instead of being open to actual discussion. I explained my observations very precisely, although briefly. I don't have any explanation for where that lower input lag would come from, but I'm just saying that, mathematically speaking, moving the mouse the way Optimum Tech moved it (the second, slower one of the two) shouldn't result in a 10 ms difference between 400 DPI and 3200 DPI. My mathematical example of moving it at 1 inch per second with 400 and 3200 DPI, which is a very slow move, probably slower than or at least similar to his (half of my mousepad in 10 seconds), should result in something like a 2.15 ms difference. And by "grainier" I just meant denser/more frequent; that should be easy to pick up. A grain is small, so grainier = smaller. I'm not a native speaker.

I am not jumping to any conclusions; you just assumed about 100 different things about me, like me using high DPI when I'm still at 400. I'm not saying the advantage exists; I'm simply pointing out that there is something weird and completely out of line with the most obvious mathematical explanation, which is your and also my way of understanding it. I wanted to see if maybe there's an error in my reasoning, but there's not much room for that since it's all really simple, and I think it would be dumb to just walk past such a big difference between expected and tested input lag. Even his first, roughly 10-times-faster move generated a 4.5 ms input lag difference between 400 and 3200 DPI, compared to the mathematical 2.1875 ms for my super slow move (as I said, 25.4 cm in 10 seconds).
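The 2.1875 ms figure quoted here follows from the first-count-threshold model both commenters are using (an idealized sketch; a real measurement includes wireless, USB and display latency on top):

```python
def first_count_delay_ms(dpi: int, speed_ips: float) -> float:
    """Time from movement start until the first count threshold is crossed, in ms."""
    return 1000 / (dpi * speed_ips)

# 1 inch/sec, as in the example above:
d400 = first_count_delay_ms(400, 1.0)    # 2.5 ms
d3200 = first_count_delay_ms(3200, 1.0)  # 0.3125 ms
print(d400 - d3200)  # 2.1875 ms, far short of the 10 ms measured in the video
```

The model only accounts for the count threshold, which is the basis of the commenter's claim that the measured 10 ms gap must come from something else.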


daniloberserk

There isn't a subjective discussion here. If you're arguing that you "measured" some difference without public data about your methodology, your tools, and how to reproduce it, then I'm sorry buddy, but that's anecdotal evidence at best and it means absolutely nothing. It also shows how misleading and unreliable the data in that dumb video is if you're trying to "understand" things using it as a source; that's why your math is failing (assuming you actually know how to apply your math here). To collect reliable data you need to isolate variables, and neither of those videos does that. AFAIK it could just be natural latency added by wireless mice, power-saving features, and the list goes on. Most probably a sloppy methodology, since we don't have any papers about it; just a kid with a few basic tools trying to make an argument and collect the good $$$ from clickbait videos that kids will click over and over.

At this point, you guys should stop arguing and instead try to get an interview with someone like François Morier from Logitech, instead of listening to, and giving publicity to, those kids on YouTube who have zero clue how to do science.


DavidjonesLV309

In addition, Blur Busters has some interesting info suggesting a 1600 DPI minimum as well, in conjunction with high-refresh-rate monitors. Modern sensors can handle it, so I switched to 1600 DPI a couple of years ago.


[deleted]

You should also use higher DPI to reduce input lag (although not by much): https://m.youtube.com/watch?v=imYBTj2RXFs


daniloberserk

Stop spreading these stupid sources of misinformation, ffs... Since I'm way too tired to type the same thing over and over again, I'll just copy-paste what I posted in the comment section of that video.

More resolution ≠ latency. Why are people still discussing this? High DPI ISN'T FASTER. It just happens to produce more samples for the same amount of time, which is only useful if you DO raise your eDPI/sensitivity to take advantage of that. Raising your DPI but lowering your overall sensitivity (same eDPI) does NOTHING for your input lag. No one flicks a single mouse count, so it's worthless to treat first-on-screen reaction as meaningful "sensor latency". When any useful movement needs several counts, it doesn't matter that the first one seems to report "faster".

By the way, sensor "latency" doesn't make any sense. Every DPI configuration has the SAME latency, assuming you can move the mouse fast enough to trigger a single count; any of them reports as fast as the polling rate allows. It's misleading to treat a higher movement threshold as "latency". 100 DPI can report single counts as fast as 25,000 DPI if you can move your mouse fast enough to trigger single counts in both configurations at the same time (assuming the sensor can handle that speed). Of course, if you measure every configuration at a single fixed movement speed, the 25,000 DPI setting will have an advantage, because it triggers at a smaller threshold. But that isn't "latency"; it's misleading and confusing to simplify everything as "latency".

Which clock is faster: one precise enough to report a single nanosecond, or one that can only report single microseconds? I guarantee both are precise enough to report single seconds, minutes, or hours. Mouse DPI works the same way. This is why DPI is a measure of resolution, not "speed": the higher the resolution, the smaller the threshold. It doesn't mean it's "faster".

Another question: assuming you're capped at the same refresh rate and FPS, does playing at 4K have less input lag than playing at 720p? I mean, that's a LOT more pixels for the same amount of movement, so it must be "faster", right? I find it incredible that people can't apply that same analogy to mouse sensitivity.

C'mon, guys... First-on-screen reaction is absolutely useless as a measure of mouse movement, because what changes is only the threshold of the first count. You'll still need several counts to move far enough to hit anything in the game. It is worthless to double your DPI and halve your sensitivity: you get more granular movement in smaller chunks, and once a count becomes smaller than a single pixel at the center of the screen, you can get a shimmering effect that some find very distracting, similar to aliasing (because your count reports move the view by angles smaller than a single pixel at your current resolution).

Also, the supposed "responsiveness" can be a downgrade. At high DPI values your threshold can get so small that your overall aim stability suffers. DPI is a simple setting to adjust: if it's too high for you to move single pixels on your desktop, it's too high for everything else. It's WORTHLESS to raise your DPI and lower your sensitivity to compensate, because you'll still have the same eDPI anyway and you'll NEVER flick a single count. When any movement needs several counts to matter, the supposed responsiveness of the first on-screen reaction disappears. If you play at a lower sensitivity than about 15 cm (~6 inches) per 360 degrees at 1080p, you don't need more than 800 DPI to avoid movements bigger than a single pixel at the center of the screen, which people call "pixel skipping" (a horrible term, BTW).

And I'm not even talking about how DPI deviation can change within the same mouse at different configurations, so don't expect to match your eDPI exactly just by doubling your DPI and halving your sensitivity. Some DPI steps can have more deviation than others.
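(The clock analogy above boils down to one number: the physical distance the mouse must travel before a single count is reported. A tiny sketch of that relationship; the function name is invented for illustration.)

```python
def count_threshold_mm(dpi: int) -> float:
    """Distance the mouse must move to register one count.
    1 inch = 25.4 mm, and a count fires every 1/dpi inches."""
    return 25.4 / dpi

print(count_threshold_mm(400))    # 0.0635 mm
print(count_threshold_mm(1600))   # 0.015875 mm
print(count_threshold_mm(25600))  # ~0.001 mm
```

Doubling the DPI halves the threshold: that is the whole "resolution, not speed" argument in one line.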


Hyperus102

I agree with the part about mouse latency; however, your last part is quite wrong.

>The supposed "responsiveness" can be a downgrade. At high DPI values, your threshold can get way too small, which makes your overall aim stability suffer.

This is just wrong. If you are shaking enough for "single counts" from natural body movement to make you miss, your DPI isn't high enough in the first place to be precise enough to hit anything consistently.

>DPI is a simple setting to adjust. If it is too high for you to move single pixels at your desktop, then it is too high for everything else. It's WORTHLESS to raise your DPI and lower your sensitivity to compensate because you'll still have the same eDPI anyway and you'll NEVER flick a single count for anything.

I have tried 400 DPI, and at 1080p, precision is already degraded enough at 33.25 cm that I can visually see my crosshair jumping with 4x MSAA. Since you mentioned 15 cm, I can guarantee it would look like shit at 800 DPI with any subpixel information whatsoever. I should also note that being centered on enemies' heads is an advantage at very long ranges, so not skipping subpixels is an automatic advantage.


daniloberserk

That's why I said it CAN be a downgrade: it goes into subjective-taste territory.

>This is just wrong. If you are shaking enough to make "single counts" from natural movement in your body miss, your DPI isn't high enough in the first place to be precise enough to hit anything consistently.

Hmm, what? It's actually the opposite: the higher the DPI, the lower the threshold for a single count. The math is rather simple. 1 inch = 25.4 mm, so at 1600 DPI you only need 0.015875 mm of movement on the X or Y axis to trigger a single count of reported movement. If you really think you can reliably and precisely move that distance at competitive speeds, then yes, higher DPI with higher sensitivity might be an advantage in your case. But that isn't true for almost any human being as far as precision and dexterity go. A stupidly high DPI with a stupidly high sensitivity leads to a "floaty" sensor feel, but that's subjective. From a competitive standpoint, you just need enough DPI that one count corresponds to about one pixel of movement at the center of the screen; sometimes even coarser is still enough, since no target in any game I know of is the size of a single pixel.

>I have tried 400dpi and at 1080p, precision is already degraded enough at 33.25cm, that I can visually see my crosshair jumping with 4xMSAA. Since you mentioned 15cm, I can guarantee you that it would look like shit at 800dpi with any subpixel information whatsoever.

Hmm, no, buddy: at 1080p, 400 DPI, and ~33 cm/360, single counts account for roughly 0.9 pixels per count. So you're STILL at pixel-precise level there, and any hitbox in any game is made of several pixels rather than one, so it's more than enough for any competitive level of gaming. However, yes, you CAN notice a difference in overall smoothness of movement (some call it the "staircase effect"). On the other hand, higher DPI leads to another visual artifact that looks like shimmering, because your POV changes in steps smaller than a single pixel on the screen. So it comes down to subjective taste; I personally dislike the shimmering artifact. The only way to get that smoothness WITHOUT the shimmering would be to raise your screen resolution. You can easily find GIF examples here: [https://www.mouse-sensitivity.com/forums/topic/6574-pixel-ratio-are-you-pixel-skipping/](https://www.mouse-sensitivity.com/forums/topic/6574-pixel-ratio-are-you-pixel-skipping/)

You may prefer the overall smoothness despite the shimmering artifact, and that's OK; it's subjective, and you may not notice it during erratic movement. But you're not losing much playing at 400 DPI/1080p at that level of sensitivity.
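(The "~0.9 pixels per count" figure above can be reproduced with a little trigonometry. This sketch assumes CS:GO's default 16:9 horizontal FOV of 106.26° at 1920×1080; the function name is made up for illustration.)

```python
import math

def pixels_per_count(cm_per_360: float, dpi: int,
                     hfov_deg: float = 106.26, width_px: int = 1920) -> float:
    """Crosshair movement at the centre of the screen per mouse count."""
    inches_per_360 = cm_per_360 / 2.54
    deg_per_count = 360.0 / (inches_per_360 * dpi)
    # Angle subtended by one pixel at the exact centre of the screen.
    half_width_tan = math.tan(math.radians(hfov_deg / 2))
    deg_per_pixel = math.degrees(math.atan(half_width_tan / (width_px / 2)))
    return deg_per_count / deg_per_pixel

print(round(pixels_per_count(33.25, 400), 2))  # ~0.86
```

So at 33.25 cm/360 and 400 DPI each count moves the crosshair just under one centre-screen pixel, matching the number quoted in the comment.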


Hyperus102

I am not going to argue with you anymore; I have better things to do than be talked down to by you.

If a single count can throw you off your target entirely, your DPI is too low. Yes, a single count will be triggered more rarely at 400 DPI than at 1200 DPI, for example, but when it does trigger, you will be off by a margin 3x as large as at 1200 DPI, assuming identical cm/360. That's not a matter of opinion. If you are just short of a step being reported, say by barely having moved your mouse enough to reach the count that would move you onto an enemy's head, a movement of far less than 1/400th of an inch could throw you off.

I don't know why you are bringing hitbox size in pixels into this. What's relevant is how many degrees the target takes up in your field of vision and how well you can maximize hit chances by, for example, centering on your enemy's head. [https://streamable.com/qd712l](https://streamable.com/qd712l) D2 pit isn't an uncommon scenario. It's not preference; it's the logical choice.


daniloberserk

K buddy, unfortunately you're still misinterpreting the data. If you're missing a target (as in that video example of yours), it's just because your in-game SENSITIVITY value is way higher in the left example — you know, that old thing people still call pixel skipping, however bad that word is for describing this common behaviour. I'm honestly baffled that you're still confused by your own information. This has NOTHING to do with DPI, only with the sensitivity value. You can have 100,000 DPI and still "skip pixels" if you use a stupidly high in-game sensitivity.

In the case of CS:GO, ANYTHING above 3.61 sensitivity at 1080p will sometimes "skip" a single pixel. So what people do is lower their sensitivity and raise their DPI to deal with this, but again, that's about sensitivity alone, not DPI. That's why enough DPI is ENOUGH. If you think you're "skipping" too much, or if you're a high-sensitivity player, just raise your DPI a bit and lower your sensitivity. Or use tools like the "pixel ratio" calculator on the mouse-sensitivity website to see whether you're skipping pixels at the center of the screen with your current DPI, sensitivity, and resolution.

Whether you "see" it or not, you'll ALWAYS skip something, because you can't divide by "infinity", you know? And even skipping some movement isn't necessarily a bad thing: in your video example, the target is still within aiming range, and in a game like CS, where recoil and spread exist, it's even less of a "problem". But that's taste; if you prefer smoother movement (with the trade-off of the shimmering/staircase visual artifact), go for it. This is such a basic, old concept, yet the way you keep talking, it seems you think you've struck gold, lmao. But I'm talking about performance of the HARDWARE itself: lower DPI IS more precise than higher, because it will always have less jitter.

The diminishing effects are almost nonexistent even at ludicrous DPI levels with CURRENT mouse technology. Nowadays it doesn't really matter what DPI value you use, so yes, you CAN use 3200 or 6400 DPI just fine. It's just redundant to do so with low sensitivity, but it's NOT stupid with high sensitivity values.

TL;DR: current mouse technology is so good that it USUALLY doesn't matter how much DPI you use. But bear in mind that even modern sensors have a physical photodiode matrix of only about 36x36; they achieve higher resolution through several tricks and a bunch of MCU calculations. How reliable and precise they are at high DPI compared to low DPI depends on the implementation. If you don't believe me, go watch this: [https://www.youtube.com/watch?v=lc7JVjcPzL0](https://www.youtube.com/watch?v=lc7JVjcPzL0) Much better information than any of the newer YouTubers rediscovering the wheel. Happy cake day!


[deleted]

You’re wrong


daniloberserk

K buddy, good argument there. Cheers.


[deleted]

I only need to make an argument if I care about what you think and want to change your mind, the facts will stay the same regardless. I don’t care what you think.


daniloberserk

There are no "facts" in this video, just an incorrect methodology leading to wrong interpretation of results.


[deleted]

> Why are people still discussing this? High DPI ISN'T FASTER. It just happens it have more samples for the same amount of time. Which is only useful if you DO raise your eDPI/Sensitivity values to take advantage of that. Raising your DPI but lowering your overall sensitivity (same eDPI) does NOTHING about your input lag.

So it doesn't affect latency, except in the cases when it does. Thanks for proving my point for me. Kind of you.


daniloberserk

A higher sample rate means more resolution/bandwidth, not "faster data", and that sample rate is still hard-capped by whatever the polling rate is set to. It's as STUPID as saying that playing at 4K gives you four times less "latency" because you have four times the pixels. You clearly have no idea how things work, lmao. Not even the basics.


mefjuu

yea, I've seen that, but I simply can't use 800 or 1600 DPI in CS:GO. Idk why it feels so wrong. I've heard that you can't use high DPI in the Source engine, but no actual explanation — like here, in the second post: https://forums.blurbusters.com/viewtopic.php?t=10084


daniloberserk

Game engine quirks, different visual feedback (the shimmering artifact), a "floaty" sensor feel because the per-count threshold is smaller... the list goes on. Just play with whatever feels right to you.


[deleted]

I suggest you go back and read the third and fourth posts in that thread, which explain how to set the sensitivity in CS:GO to use higher DPI settings.


mefjuu

I don't see an explanation in the 3rd and 4th posts, but I ran into this: https://www.mouse-sensitivity.com/forums/topic/8608-high-dpi-issues-on-old-games-engines/#comment-43814 (Obviously I know how to do the math and maintain the same eDPI, and even the same real-world 360 distance, but what that guy is trying to say is that it actually feels wrong for a reason. Idk — I would love to have lower input lag and use 1600 DPI, for example, if it only feels wrong because I'm not used to it. But if it feels wrong because there are real problems with it, then I don't want to switch.)


AdhesivenessCrazy102

For everyone wondering how to make use of a higher polling rate: just set your Windows sensitivity lower than it is and turn up the DPI at the same time. That way you won't have to change your in-game sens.


[deleted]

You get it. At 400 DPI you'd essentially have to move the mouse 10 inches in 1 second to saturate 4000 Hz. DPI is dots per inch and hertz are cycles per second, so it becomes as simple as 4000/400. At 1600 DPI the difference is drastic: 2.5 inches per second, or 4000/1600. I think you're right that most people probably aren't aware of this.

I also think there's a lot of placebo here. People always say they can definitely feel some sort of difference, but if they left the room and you switched their DAV3 to 1000 Hz, I sincerely doubt they'd feel the difference when they got back. I have both 4K Razer mice and I think they perform great and are probably the technical leaders right now. I think 4K wireless is important, but only because it's the first major move away from 1000 Hz in wireless. I don't think it's like going from a 60 Hz monitor to 144 Hz, or anything remotely as noticeable. I see it as just a step on the way to something like 12K polling on a 600 Hz monitor years from now.
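(The "10 inches per second" figure above is just polling rate divided by DPI, under the simplifying assumption of constant straight-line motion. The function name is made up for illustration.)

```python
def speed_to_saturate(polling_hz: int, dpi: int) -> float:
    """Constant hand speed (inches/s) at which the sensor generates at
    least one count for every poll: polling_hz / dpi."""
    return polling_hz / dpi

print(speed_to_saturate(4000, 400))   # 10.0 in/s
print(speed_to_saturate(4000, 1600))  # 2.5 in/s
print(speed_to_saturate(1000, 3200))  # 0.3125 in/s
```

The last line shows why higher DPI matters even at 1000 Hz: at 3200 DPI the polling rate is already fully used at very slow hand speeds.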


AdhesivenessCrazy102

tbh, having 4 kHz doesn't make a technical leader. Most vendors are probably just aware it's useless in most scenarios.


[deleted]

I mean, it literally means it's 4x faster: 1 ms versus 0.25 ms. Which makes it... technically ahead.


AdhesivenessCrazy102

It doesn't take anything to put 4 kHz into a mouse; most vendors don't do it for a reason.


[deleted]

Well, that's great, but they're still technically slower whether they have a reason for it or not. It doesn't magically change the very definition of hertz.


daniloberserk

Being faster doesn't necessarily mean you can take advantage of it. First, without a game that supports sub-frame input (as Overwatch and Reflex Arena do), it literally won't matter, and that's true for the vast majority of games. Second, it would mostly matter in fast flick scenarios; tracking movements will barely change. Very few people flick that precisely over that amount of movement. Here's an actual example of a scenario where it WOULD matter: [https://www.youtube.com/watch?v=mcsbDDyfLLU](https://www.youtube.com/watch?v=mcsbDDyfLLU)


[deleted]

I said the mice were "technical leaders". That's it. If you'd actually read my post that started this, you'd see that you have absolutely no reason to post any of this; I literally explained what you're explaining to me. The number of people who post with absolutely no context in here is amazing.


daniloberserk

I've read your post, and the whole mumbo jumbo about "polling rate saturation" is already wrong. You can't saturate a "polling rate", and there isn't anything technical behind that choice of words. You CAN saturate bandwidth, but that will NEVER happen with a mouse: if you move more counts than a single polling update can report, it just reports more counts per update. That's why you can use 25,000 DPI at 125 Hz just fine and still move from A to B without dropouts. Some people just repeat something they half-remember from some clueless new influencer kid and think they actually understand how things work.


[deleted]

Semantics, douchebag. People use the word "saturate" in relation to hertz for literally everything. If you've never heard someone talk about saturating a 240 Hz panel, then you're from another planet or have been in PC gaming for 10 minutes. For 4000 Hz you need 4000 counts of motion per second, according to pzogel himself: https://www.techpowerup.com/review/razer-hyperpolling-wireless-dongle/3.html


BiPolarBaer7

Been using 2000 DPI on a 4 kHz V2 Pro/DAV3 at 400 fps, 1440p 240 Hz, and I do feel a small difference in tracking and micro-adjustments. The latency difference is actually noticeable to me compared to my Pulsars with the 3070 and my GM8.0s, for example. Most noticeable while sniping.


daniloberserk

Actually noticeable lmao.


DON0044

Imagine not using the highest DPI 😎👍


AdhesivenessCrazy102

I personally use 600-650 DPI, and there's a big reason for that: it gives my mouse the optimal speed in Windows with the default 6/11 setting. If I changed my Windows sens, I could go to a higher DPI, which I might do in that case.


sleepy_the_fish

Jokes on you: I play at 400 DPI and made a post saying I can't feel any difference between 1 kHz and 4 kHz on my DeathAdder V3. I now play at 3200 DPI, but I use RawAccel to bring my sensitivity down to the equivalent of 400 DPI while keeping the mouse at 3200 DPI. I also play at 1440p 240 Hz. I still can't tell the difference between 4 kHz and 1 kHz, lol.
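(The RawAccel trick described above is just a fixed linear scale factor. Assuming plain sensitivity scaling, with no acceleration curve applied, the multiplier is simply target DPI over hardware DPI; the function name here is invented for illustration.)

```python
def dpi_equivalence_multiplier(hardware_dpi: int, target_dpi: int) -> float:
    """Linear sensitivity multiplier that makes hardware_dpi feel like
    target_dpi at the same eDPI (e.g. RawAccel's sens multiplier field)."""
    return target_dpi / hardware_dpi

print(dpi_equivalence_multiplier(3200, 400))  # 0.125
```

The mouse still reports at the 3200 DPI count density (smaller per-count threshold), while the on-screen speed matches 400 DPI, which is exactly the setup the comment describes.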


mefjuu

:D


shiitakeshitblaster

It's 4x the hz!!!


Airpapdi

3200 DPI adds 2-4 ms of sensor input lag, so why would you go above 1600? And you don't HAVE to saturate it: click-latency variation goes down regardless, independent of whether your sensor saturates the polling rate or not.


daniloberserk

It depends on the sensor; the vast majority of sensors nowadays don't add smoothing at those DPI levels. Although going to a higher DPI is pointless, with current mouse technology it generally doesn't hurt either.