PDA

View Full Version : Why mouse 5000 DPI better than 1000 DPI?


nerio.ru
03-10-2011, 02:55 PM
Why? What do I need high DPI for? Thx

Sagenth
03-10-2011, 03:08 PM
A high dpi means that the hardware is more accurate. It can tell the difference between point A and point B much much better.

Having the hardware determine what is going on is better than pretending with software.

with 5000 dpi, or something ridiculously high like that, you can set the software lower and your cursor will follow the path your mouse takes more accurately than if you had 40 dpi

Anything above 400 dpi is acceptable though.
But for instance, my mouse has two settings. If I wanted aiming to be the same SPEED, this is what the settings would have to look like (approximately):

DPI 3200 ; sensitivity 0.465
DPI 400 ; sensitivity 3.72
The first one will be more true. The difference you feel will not be very noticeable though, not for most people at least.

I've played with this high of a DPI for long enough I would notice the difference going back, but not going forward. I didn't even notice the difference between 2000 and 3200 other than the fact I had to slow everything way down because it was too fast for me.
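
A quick way to sanity-check that two DPI/sensitivity pairs give the same speed is to multiply them together (the product is often called "eDPI"). A minimal Python sketch, using the example numbers from this post:

```python
# Effective sensitivity: hardware DPI x in-game sensitivity.
# Two settings move the view at the same speed when the products match.

def edpi(dpi, sens):
    return dpi * sens

print(round(edpi(3200, 0.465), 3))  # 1488.0
print(round(edpi(400, 3.72), 3))    # 1488.0 -> same speed, 8x finer steps at 3200
```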

nerio.ru
03-10-2011, 03:45 PM
much much better - what does it mean? Is it some technical expression?

DPI 3200 ; sensitivity 0.465 - this is for zoom, how I understand.
DPI 400 ; sensitivity 3.72 - what is this for? Why can't you play with 3200 all the time?

nocebo
03-10-2011, 04:04 PM
if you have high resolution on your monitor you will need a mouse with high dpi. if you have low dpi on a high resolution screen it will feel like you "skip" pixels.

on a 640x480 resolution 400 dpi is ok.

there is a formula you can use to see how much dpi you need. it's something like this:


R = ( pi * W ) / ( I * tan[ F / 2 ] )

where
W = screen resolution width
I = real sensitivity (distance per 360 turn)
F = horizontal fov

R = mouse resolution required

if you dont want to calculate you can always just try to lower your dpi and see if the pointer jumps over pixels.
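
The formula above is easy to evaluate directly. A small Python sketch (F is taken in degrees and converted for the tangent; the numbers at the bottom are example values, not recommendations):

```python
import math

def required_dpi(width_px, inches_per_360, hfov_deg):
    # R = (pi * W) / (I * tan(F / 2)): the mouse resolution at which one
    # count rotates the view by roughly one centre pixel or less.
    f = math.radians(hfov_deg)
    return (math.pi * width_px) / (inches_per_360 * math.tan(f / 2))

# e.g. a 1680-pixel-wide screen, 90-degree horizontal fov, 10 inches per 360:
print(round(required_dpi(1680, 10, 90)))  # 528
```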

here's another good explanation:

Resolution

The resolution of your screen tells you the number of pixels of a 2D plane. The fov tells you what portion of the view sphere is projected onto that plane. Due to this projection, the view is compressed in the center and dilated at the boundary (which means that pixels in the center express wider angles than those at the boundary; if you think it should be the inverse, make a drawing, or think twice).

To put some numbers down, let's make an example in only one dimension: if you have a horizontal fov of 90 and play at 1680, a pixel will span (on average) ~0.05 degrees.

Mouse DPI

The mouse DPI determines how many counts the mouse reports when you move it one inch. By fixing a certain sensitivity (x inches per 360 degrees), you can compute the smallest angle you can rotate by. I made a graph to show this:
http://n.ethz.ch/student/gnoris/download/mouse_precision.png

With a 400DPI mouse, most people will have a precision below 0.2 degrees.
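
That smallest-angle claim is easy to reproduce: if your sensitivity is fixed at x inches per 360 degrees, one mouse count rotates the view by 360 / (x * DPI) degrees. A short sketch (the 10 inches per 360 is an example assumption, not a figure taken from the graph):

```python
def degrees_per_count(dpi, inches_per_360):
    # counts per full turn = inches_per_360 * dpi, so one count is the
    # smallest rotation the game can receive from the hardware.
    return 360.0 / (inches_per_360 * dpi)

for dpi in (400, 800, 1600, 3200):
    print(dpi, round(degrees_per_count(dpi, 10), 5))
# 400 DPI at 10 in/360 gives 0.09 degrees, comfortably below the 0.2 quoted above.
```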

High res, low DPI, is this a problem?

No. With low DPI and high res, there is no influence of the resolution over the aiming capabilities.

Consider yourself moving your mouse. Your mouse input gets converted to the game's internal representation of where your character is aiming, and then, based on it, a representation of the world is rendered. How accurate this representation is depends on the resolution, but the resolution will not influence the internal representation (you will always hit that spot, no matter how many pixels are used to represent that spot).

So the real question is whether 400DPI is enough or not, and I think that with all the IE3.0s used among pros, this question should not be a concern.

What *might* be a problem is the inverse, high DPI and low resolution. In this scenario your mouse is capable of expressing rotations that on your screen look all the same, as it is simply too low res for representing them. This however is science fiction for quake, as nobody really cares about high DPI mice, and nobody plays on 200x150 screens.

Sagenth
03-10-2011, 04:37 PM
much much better - what does it mean? Is it some technical expression?

DPI 3200 ; sensitivity 0.465 - this is for zoom, how I understand.
DPI 400 ; sensitivity 3.72 - what is this for? Why can't you play with 3200 all the time?

Lol you need a translator.

It means it can tell the difference between point A and point B to a far greater degree. Instead of tracking movement per centimetre, for instance, it might track per millimetre or nanometre instead. This is what I meant by "much much better". Sorry for not accounting for your poor english; my apologies.

I can play at whatever DPI I want from 1 to 3200, and I can play it all the time if I want. Or I can switch between two dpi settings.

DPI 3200 ; sensitivity 0.465 = DPI 400 ; sensitivity 3.72
These are the same speed. I must not have made that clear enough.
The difference between them is how accurate or precise they are. I am pretty sure that the 3200 dpi is just more precise than 400, not necessarily more accurate.

This doesn't belong in the CSS Beta forum btw, this needs to be moved to general discussion or something like that. As this has nothing to do with css specifically, it affects all games.

Bob910
03-10-2011, 04:44 PM
A high dpi means that the hardware is more accurate. It can tell the difference between point A and point B much much better.

Having the hardware determine what is going on is better than pretending with software.



Sadly DPI is really just a marketing gimmick with no practical functionality at all. Like putting cool-looking stripes or spoilers on your car.

There's no such thing as a "more accurate" mouse due to DPI. All you need is the right minimum DPI for your monitor resolution and sensitivity. Beyond that you get no benefit, and you can actually create negative acceleration or multiply any other accel that might exist (even with it off you can have some). Mice tend to have sweet spots where they work best, and it's usually 400 DPI, but it depends on the mouse. The lower the DPI, the faster you can move the mouse before it starts getting less accurate (or skips), but whether you would ever hit its max speed depends on the player. There's a formula to work out how much DPI you need; it's like (resolution width x 4) / (360 degree rotation in inches) = usable DPI, so you don't skip pixels.
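
That rule of thumb can be written out directly; note it is a rougher cousin of the tan-based formula quoted earlier in the thread (the example numbers below are assumptions):

```python
def usable_dpi(width_px, inches_per_360):
    # (resolution width x 4) / (360-degree rotation in inches)
    return width_px * 4 / inches_per_360

# e.g. a 1680-pixel-wide screen and 10 inches per full turn:
print(usable_dpi(1680, 10))  # 672.0
```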

Some more info:
http://www.esreality.com/?a=longpost&id=1265679&page=2
DPI checker:
http://www.mikofoto.net/ae/calculator2.php

Realistically though, all you really need is a mouse with a shape that feels good and a sensor that is accurate (I much prefer old optical mice like the MX518 to laser). Then pick DPI based on comfort, and maybe check you don't skip pixels with a DPI checker.

Sagenth
03-10-2011, 05:27 PM
Ya I think I corrected myself on the accuracy thing in my last post, changing it to precise. I always get the two mixed up, I should probably get out my google and define:

Cool to know that it doesn't provide any benefit beyond a certain point. I still like the higher DPI, I just lower my windows cursor speed and do the same for my games. It is probably like you say though, just a perceived difference.

At least I am ready for an upgrade from my two 1440*900 monitors.

Corrupt^
03-10-2011, 09:04 PM
Stop posting those silly calculations, they're pointless. In the end there's only 1 reason why more dpi could be useful:

"Decreasing your ingame sens"

At sensitivity 1.0, for each count the PC receives from the mouse, the game will move 1 pixel in the corresponding direction. At sensitivity 2.0 it will skip a pixel and move 2 pixels in the corresponding direction. At 3.0 it will move 3, etc. Numbers in between, like 2.75, are interpolated.

So in order to get maximum precision, one would need a sensitivity of 1.0, going lower is pointless as a pixel is the smallest increment visible on a monitor.

Where does dpi come in?

DPI simply gets you more counts per inch, thus making the mouse more sensitive. Doubling your dpi simply means your mouse will move twice as fast.

So ideally (with raw input as otherwise you'll get negative acceleration), you'd have to adjust your sensitivity through the DPI settings of your mouse and get as close as possible to 1.0 ingame.

2 examples:


Ingame sens = 2.0 / DPI = 800 --> change sens to 1.0 and dpi to 1600
sens = 3.4 ingame / DPI = 800 --> change dpi to 2720 and set ingame sens to 1.0 (if possible, otherwise get as close as possible to 1.0)


If you're using a sens as in the second example you should be shot. No point in going that high.

So more dpi is useful, but we hardly need 5000 dpi. The extra precision is gained not by having more dpi, but by decreasing your ingame sens.
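
The adjustment described here (push the in-game sensitivity toward 1.0 and compensate in the mouse's DPI setting) keeps the product sens x DPI, i.e. the overall speed, unchanged. A minimal sketch:

```python
def dpi_for_target_sens(sens, dpi, target_sens=1.0):
    # Keep sens * dpi constant while moving the in-game sensitivity
    # to target_sens; returns the DPI the mouse should be set to.
    return sens * dpi / target_sens

print(round(dpi_for_target_sens(2.0, 800)))  # 1600
print(round(dpi_for_target_sens(3.4, 800)))  # 2720
```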

boin
03-10-2011, 09:59 PM
Except the mouse input should not have anything to do with PIXELs... And in fact it does not any more, since CS:S has mouse raw input.

DPI is the precision of the measurement the mouse is capable of returning to the driver. Unfortunately the Windows input driver API seems unable to provide the right information. If it were working properly, whatever mouse you use, when you move the mouse, say, 1 inch, the mouse should send X counts (X being its DPI), and the driver, knowing this is an X-DPI mouse, would know you moved 1 inch. That way you could change the mouse DPI... in fact you could even change the mouse itself, and it should always move the same (except for precision).

And your statement about floating point is just false. The game engine converts mouse input to an angle, and obviously the value is floating point.
Example (with totally imaginary numbers):
current-angle = 56.021
mouse-input = 32 (this is integer)
mouse-sens = 1.26
inc-angle = 32*1.26 = 40.32
new-angle = 56.021+40.32 = 96.341
Of course it DOES work with non-integer values!

So the higher the sensitivity (mouse-sens), the less precision you have, as the smallest amount you can increase the angle by is 1*mouse-sens. This is what limits your precision.

I agree that insanely high DPI is not needed; probably no human being is able to control a movement of 1/5000th of an inch anyway. It may even measure involuntary movement, but my guess is that mouse hardware/software has filters to avoid that. So I would say it does not harm your ingame sens, but it won't help it either.

Sagenth
03-10-2011, 10:11 PM
So more dpi is useful, but we hardly need 5000 dpi. The extra precision is gained not by having more dpi, but by decreasing your ingame sens.

Right, you just need a minimum dpi, as has been explained by bob (? can't remember). Game-wise the precision is gained by doing both. The hardware grants the precision, but it is squandered if you keep your sensitivity high. This is because it will be all blocky and skip all over the place with a dpi of 1 and just about any sensitivity. Theoretically, that is; I am interested in whether the theory holds true though. I am far, far away from my gaming pc, so somebody else should give it a try.

Btw the game can't really move the camera per pixel when the mouse moves; rather it rotates in radians or degrees (dear god I hope they use radians or something more efficient which I know not of; radians: 2pi = 360 degrees). As a result new pixels are generated, which is where all that filtering (bilinear, anisotropic) stuff comes into play, as well as anti-aliasing.

Not that it matters.

edit:
lol I was trying to say pretty much the same thing as boin albeit nowhere near as articulately

y3sak
03-11-2011, 06:26 AM
I have the G500, i have different DPI settings in the software, but I don't use anything higher than 400dpi. Just cause when I started PC games I had a 400dpi mouse. I think its mostly preference. Played 3.7 sensitivity in all FPS' mouse accell disabled for years.

LinkinMcOwnage
03-11-2011, 06:55 AM
High DPI a good gaming mouse does not make.

What matters is the ability to set the polling rate to 1000hz, and to have no acceleration issues, tracking issues, or most importantly, sensor issues (looks at the philips twin eye, the sensor that makes the cursor drop to the bottom right if you lift up your mouse)

nerio.ru
03-12-2011, 12:45 PM
Lol you need a translator.

It means it can tell the difference between point A and point B to a far greater degree. Instead of tracking movement per centimetre, for instance, it might track per millimetre or nanometre instead. This is what I meant by "much much better". Sorry for not accounting for your poor english; my apologies.
...

Your definitions (great, good, worth, much better) come from morality and have nothing to do with technical aspects. When you try to explain something about equipment you can't use words like "good" and "bad".
Maybe you don't know what a definition is. I sent you a link; read it: http://en.wikipedia.org/wiki/Definition


DPI 3200 ; sensitivity 0.465 = DPI 400 ; sensitivity 3.72
These are the same speed. I must not have made that clear enough.
The difference between them is how accurate or precise they are. I am pretty sure that the 3200 dpi is just more precise than 400, not necessarily more accurate.

I have to ask you one more time: why, or when, do you need to be not precise (DPI 400)?

Sagenth
03-12-2011, 09:43 PM
Did you realize that languages have evolved from simple grunts at first to do all sorts of things. English, like most languages I am sure, changes and has changed over time.

I am sure the interpretation of all sorts of crap has changed drastically from back in the old times. I suppose I could grab a dictionary to make sure I use a word that translates well... but I would rather just learn the language, and that is hardly realistic just to communicate with one person more efficiently.

So basically I can use good and bad to communicate however I want. I am not drafting you a technical specification. You asked a question, and that was my answer. Part of the answer anyways.

Btw, while good and bad have to do with morality, great, worth, and much better are not "strictly" morality related. They are quantitative words, they just have no absolute value. That is why in programming, when I write something like if i > 10, it will be read "if i is greater than 10". Let's just agree to disagree, or agree that I use words that fit their intended purpose. Whether they accomplish the purpose depends on the reader's interpretation, and in far fewer cases it will be comprehension.

None of that really matters generally. I should maybe take the time to learn the exact definitions of conversational english... but what words mean in conversation and in certain contexts will not necessarily match. Anyways, I think I've made my point, which is that it is not my fault if you misinterpreted my post, or failed to comprehend it. I was giving a relative explanation, not a definitive one. Next time just ask for numbers, but you had asked why 5000 was better than 1000.

I am sorry if I didn't explain it in a way that suited you; I will try harder next time.

Corrupt^
03-14-2011, 10:45 PM
Except the mouse input should not have anything t do with PIXELs ... And in fact it does not any more since CS:S has mouse raw-input.

In theory and in code, no, I agree with you. But in reality, when you're actually playing, that's what it comes down to, at least if the scaling is implemented correctly (Quake, 1.6, CSS, ...). Unreal engine games usually have a scaling that makes no sense at all.

When using a whole number like 2.0, it does move 2 pixels for every count it receives from the mouse.

So this brings me back to the point that ideally, you should go for 1.0, since an increment smaller than a pixel (even if read by the code) would be rendered pointless, since your monitor consists of pixels.

You can easily test it if you have a LCD monitor with a high enough dot pitch. Stare really close to your monitor, eventually you'll start seeing the pixels. Have fun trying to move pixel per pixel at sens 2.0...

With raw input, your sensitivity is the same as before, provided you used a windows sensitivity of 6/11. Though it might feel slightly faster since you're no longer having negative acceleration and it feels more responsive in general.

Sagenth
03-15-2011, 05:43 AM
It can't simply move two pixels. It moves and renders the new pixels... it is just luck that it counts as 2 new pixels. There could be a rather large number of pixels in between, depending on your resolution.

888
03-15-2011, 09:25 AM
There is an accuracy difference in CSS between 400 and 800 dpi. I have tested both extensively, and hitting that tiny pixel headshot is easier at 800 dpi in long- to mid-range combat. I didn't, however, notice any difference accuracy-wise going above 800 dpi.

Corrupt^
03-15-2011, 10:15 AM
It can't move a simple two pixels. It moves and renders the new pixels.. it is just luck that it counts as 2 new pixels. There could be a rather large number of pixels in between depending on your resolution.

Perhaps you're right when we'd increase the resolution above 1680x1050, but as far as I've been testing, at 2.0 it moves in steps of 2 pixels on my monitor, and 3 pixels at 3.0. But like I said, it's not that I think it's coded this way; the precision just ends up being this way, as presented on my monitor.

Either way, games allow a hell of a lot more accurate aiming at 1.0.

Vedi
03-15-2011, 11:52 AM
In theory and in code, no, I agree with you. But in reality, when you're actually playing, that's what it comes down to, at least if the scaling is implemented correctly (Quake, 1.6, CSS, ...). Unreal engine games usually have a scaling that makes no sense at all.

When using a non decimal number, like 2.0, it does move 2 pixels for every count it receives from the mouse.


Your mouse reports its movement in terms of counts. Like "5 to the left, 2 up". It never says "4.5 to the left, 2.5 up". Counts are whole numbers. If you have raw input enabled, like you should, the result of "5 to the left" will be turning your screen 5*m_yaw*sens degrees. Moving one count, with the default m_yaw of 0.022, and your suggested sens 1, will turn you 0.022 degrees, not one pixel on your screen. How to translate this angle to pixels on screen needs a short calculation. Just look at this page http://www.phoon.us/mouse/
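
The comparison is quick to check numerically. Below, one count turns the view by m_yaw * sens degrees, while one pixel near the crosshair spans roughly 2*tan(F/2)/W radians under a rectilinear projection; the 1680-pixel width and 90-degree fov are example assumptions:

```python
import math

M_YAW = 0.022  # default m_yaw, degrees per count

def degrees_per_count(sens):
    return M_YAW * sens

def center_pixel_degrees(width_px, hfov_deg):
    # The whole screen is 2*tan(F/2) wide in view space, so near the
    # crosshair one pixel spans about this angle.
    f = math.radians(hfov_deg)
    return math.degrees(2 * math.tan(f / 2) / width_px)

print(degrees_per_count(1.0))                    # 0.022
print(round(center_pixel_degrees(1680, 90), 4))  # 0.0682
```

With these example numbers one count at sens 1.0 is roughly a third of a centre pixel, which is the point being made above.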

h0lm
03-15-2011, 01:03 PM
*forget this post*
thought there was just one page, so hadn't read page 2!

Corrupt^
03-15-2011, 07:53 PM
The mouse resolution determines the smallest angle you can rotate your view by in game, for a given sensitivity. If you want this smallest angle to be small enough so that you can turn your view by 1 pixel (to the pixel next to where your crosshair is), you need to know what angle that distance of 1 pixel represents on your screen. The projection of the 3D world onto the 2D plane of your screen means the pixels located near the crosshair represent much larger angles than those pixels located at the edges of your screen. If the mouse resolution calculated above is bigger than your current dpi, then your smallest rotation will be larger than 1-pixel's worth of rotation.

Moving one count, with the default m_yaw of 0.022, and your suggested sens 1, will turn you 0.022 degrees, not one pixel on your screen.

What I'm trying to point out Vedi, is that imo, on an average monitor, at sens 1.0, turning 0.022 degrees corresponds with moving 1 pixel (near the crosshair). If that's really the case, it then seems quite logical that turning 0.044 degrees (sens 2.0) corresponds with moving 2 pixels.

Either it's sheer luck or developers intended it to be this way, but that is how it visually shows on my monitor. Though my theory of "sens 1.0" only works in games that happen to have the typical 0.022 yaw/pitch values. Quake, CSS, 1.6, Half-Life, ... tons of games seem to be using 0.022.

boin
03-15-2011, 08:33 PM
What I'm trying to point out Vedi, is that imo, on an average monitor, at sens 1.0, turning 0.022 degrees corresponds with moving 1 pixel (near the crosshair). If that's really the case, it then seems quite logical that turning 0.044 degrees (sens 2.0) corresponds with moving 2 pixels.

Is that the case? Maybe I'm wrong somewhere, but as the FOV is 90 there should be 90/screen-width degrees per pixel, hence 90/0.022 = 4090.9 pixels of width on your average monitor? Or maybe it is not linear?
But it does not matter really.
It seems pretty obvious to me that the limiting factor is one unit (whatever it is) of mouse input, as it is an integer. This unit represents your smallest possible move. As it is smaller on higher-DPI mice, they are more precise. But there is a limit (that I don't know) where a human being won't be able to tell the difference.
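
The "is it not linear?" question can be answered with a few lines: under a rectilinear projection the angle covered by a pixel shrinks toward the screen edge, so dividing fov by width only gives an average. A sketch assuming a 90-degree fov and a 1680-pixel width:

```python
import math

def pixel_angle(x, width_px=1680, hfov_deg=90):
    # Degrees spanned by the pixel starting x pixels from the screen
    # centre, under a rectilinear projection.
    k = 2 * math.tan(math.radians(hfov_deg) / 2) / width_px
    return math.degrees(math.atan((x + 1) * k) - math.atan(x * k))

print(round(pixel_angle(0), 4))    # centre pixel:         0.0682
print(round(pixel_angle(839), 4))  # outermost pixel:      0.0341
print(round(90 / 1680, 4))         # naive linear average: 0.0536
```

So it is not linear; with these numbers a centre pixel covers about twice the angle of an edge pixel.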

Corrupt^
03-15-2011, 08:57 PM
If I were to take the calculations from that website, I'd only need 443 dpi to move my crosshair 1 pixel further. I would need a sensitivity of 2.844 (to get the same speed as I would at sens 1.0 with my current dpi).

But it sure as hell doesn't work that way because it's impossible for me to move my mouse so that I would only jump by 1 pixel worth of movement in game, it'll always jump and do at least 2.

Sagenth
03-15-2011, 10:40 PM
STOP THIS PIXEL CRAP!

You can't move any number of "pixels" inside a 3d game. It does not work like that; the pixels are RENDERED AFTER everything is said and done. You give the input, it rotates, then it determines what pixels need to be changed. Perhaps there is a delta or something of the sort, but literally speaking it is impossible to move your screen x pixels. You cannot define a game as resolution width * 4 and call that 360 degrees! Or 2pi, if you want to get into the way most games deal with this stuff.

As for that m_yaw, 0.022 degrees != 1 pixel


If you are going to talk about Pixels stick to OS cursor movements... Your CURSOR does move by pixels.. your in game camera does frikking not!

EUAHG I am going bald because of you.
Seriously you can count as many pixels as you perceive different.. you can keep on doing it, but it doesn't make it any more true of a statement. It is true for you, okay perhaps.. it is not a universal truth however.

I hope I've said this enough times now; your in game camera does not move by pixels!

It reads how many pixels your cursor has moved, and translates that using m_yaw and sensitivity, as well as the m_customaccel stuff. It takes the resulting number and rotates your camera using (I hope) a quaternion, which I think uses radians, not degrees; that much I know for sure. So if 0.022 is in degrees then it must get converted to radians first, or whatever quaternions DO use. Somebody should look it up; not it.

Once they have their quaternion, or a rotation matrix if they do it that way instead, they apply that to the camera; the camera then rotates. The game engine then says, oh hey, it is time for a new frame, let's get to the rendering, woot woot! It then renders the new frame, thus creating all the pixels you see. Since the game is 3-dimensional it is impossible to rotate by pixels.

Crystal Clear or clear as mud?
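
The loop described above can be sketched without any rendering. This is a toy illustration, not Source engine code; the quaternion is just a rotation about the up axis, and the count/sens numbers reuse the example from earlier in the thread:

```python
import math

def yaw_to_quaternion(angle_deg):
    # Unit quaternion (w, x, y, z) for a rotation of angle_deg about +Z.
    half = math.radians(angle_deg) / 2.0
    return (math.cos(half), 0.0, 0.0, math.sin(half))

def update_yaw(yaw_deg, counts, m_yaw=0.022, sens=1.26):
    # Integer mouse counts become an angle increment; no pixels involved.
    return yaw_deg + counts * m_yaw * sens

yaw = update_yaw(0.0, 32)   # 32 counts at sens 1.26
print(round(yaw, 3))        # 0.887 degrees
q = yaw_to_quaternion(yaw)  # what the renderer would consume for the next frame
```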

Corrupt^
03-16-2011, 08:29 AM
It has always been clear. I never talked about how the camera changes in a 3D environment, but about the END RESULT on my monitor, which still presents the end result on a flat surface which isn't 3D and consists of 1764000 pixels (in my case), not degrees.

How would you explain losing precision if going beyond 1.0? (Not that it's that important, 2.0, 2.3, 2.8, 3.3, ... are still playable, but out of sheer interest I would like to know).

You give the input, it rotates, then it determines what pixels need to be changed. Perhaps there is a delta or something of the sort, but literally speaking it is impossible to move your screen x pixels. You cannot define a game as resolution width * 4 and call that 360 degrees! Or 2pi, if you want to get into the way most games deal with this stuff.

So to put clear from my standpoint again:


On my desktop, if the Windows slider is on 6/11 or lower, I can move in a perfect square of 4 pixels, not having a single pixel in between
Ingame, even though it does not move in pixels, I "reproduce" this fine square on my monitor


Now my game obviously moved in degrees, but on my monitor I have moved the game's view by 1 pixel worth of movement to either direction.

At sens 1.0 or lower I can make this very fine movement, yet as soon as I increase my ingame sens beyond 1.0, I can't make this fine square (At values like 1.25 it occasionally works, but it's inconsistent). Below 1.0 I no longer seem to gain any VISIBLE precision.

It would make sense if the game was still using WM_MOUSEMOVE, but it even happens when using raw input.

Explain this to me and I will stop trying to make you bald :p

boin
03-16-2011, 12:10 PM
Nonetheless, for historical reasons the default m_yaw value could have been defined so that N units of mouse input (at the time it was pixels) would make a U-turn rotation (whether in radians or degrees). Don't get me wrong, I don't say it has been defined like that. Just that it could have been.

How would you explain losing precision if going beyond 1.0? (Not that it's that important, 2.0, 2.3, 2.8, 3.3, ... are still playable, but out of sheer interest I would like to know).

Do you have some kind of robot arm to measure that ?

Corrupt^
03-16-2011, 02:57 PM
Do you have some kind of robot arm to measure that ?

Ehm nope, just make very very small movements. It's easier to test if you set your dpi really low.

Sagenth
03-16-2011, 03:05 PM
I won't answer your questions directly because they seem to be sprouting from not knowing exactly what precision means. Correct me if I am wrong, that is just how I interpreted the post.

http://en.wikipedia.org/wiki/Precision_(arithmetic)

Precision is gained by dealing with very small numbers, at least in this situation. If you have 5 million DPI and a really slow cursor then the tracking and your control over the cursor will be very precise.

That is to say, having to move your mouse 5 inches to move your cursor one pixel is very precise. There is obviously a problem with that, however: you wouldn't be able to physically move the mouse fast enough to get anything done.

As for sensitivity, in game stuff that is, it would work in the same fashion. The higher the number the less precise it becomes.

That isn't to say you yourself can't be precise with your control of it all if you don't use small numbers, far from it. You can use whatever values for your DPI, speed, or sensitivity you want. Up until now I have meant precision arithmetically; right now, however, that isn't the case.


Google.ca: Define:Precision

"preciseness: the quality of being reproducible in amount or performance" - First Entry

"The precision of a value describes the number of digits that are used to express that value. In a scientific setting this would be the total number of digits (sometimes called the significant figures or significant digits) or, less commonly, the number of fractional digits or decimal places (the ..." - Fourth Entry
So in order to achieve that second definition of precision you need a high DPI, a slow windows cursor and a low sensitivity. To be accurate you need to speed things up.

On the other hand to achieve the first definition of precision you just need to take the time and find out what works best for you. The second definition doesn't mean crap unless you can achieve the first definition.

Corrupt^
03-16-2011, 03:29 PM
As for sensitivity, in game stuff that is, it would work in the same fashion. The higher the number the less precise it becomes.

Obviously, but I'm still wondering why moving (ingame) with such precision only seems to work with sensitivity 1.0 or lower.

Perhaps it's just coincidence. Also, I think the whole idea got a bit misunderstood, since I never meant my theory to be mathematically right, just purely empirical: something I can see and feel when moving my mouse and looking at my monitor.

Sagenth
03-16-2011, 03:47 PM
I will think about it for a little bit and see if I can figure out the answer to that question.

"I never meant my theory to be mathematically right, just purely empirical, something I can see and feel when moving my mouse and looking at my monitor."

That is fine and dandy, but what is correct for you won't be correct for everyone. When the game renders, it takes into account the screen size which is not always going to be the same; period. You CAN talk about this in terms of delta rendered pixels like you were anyways, but it is inaccurate information and won't hold true over all hardware configurations.

Edit:
DPI + OS Cursor Speed + Game Settings = Minimum Rotation(delta pixel, after render)

Lower your DPI or OS Cursor Speed and your game settings should be able to go about 1.0 without moving "more than one pixel"

Just balance that 3 variable equation to your own needs and desires. I myself like using 3200 DPI (highest my mouse goes; Ikari Laser) I like having my Windows Cursor on 3-6 I think I usually stick to 5. I never go above 6 though because I don't know if they still have that acceleration problem where the cursor skips pixels. Which means I need to keep my sensitivity under 0.7

Corrupt^
03-16-2011, 05:25 PM
Well it's something that a lot of people around me have tried and it seems to work, though we all have average monitor sizes (19" to 22").

Sagenth
03-16-2011, 06:10 PM
Try it with an old crt and run it at 800x600. Then, conversely, try it on an outrageously high-definition TV with a crazy high resolution... Then tell me that it is always the same. If you successfully prove that it is consistent on both ends of the spectrum with all the same mouse settings... then I will concede.

If it could hold true then clearly my explanations have been flawed; however we haven't tested yet. Theory dictates the two will have different numbers, even if it is a difference of one single pixel there should be a difference. If there isn't a difference I will be racking my brain for weeks to figure out why; surely.

nerio.ru
03-16-2011, 06:43 PM
DPI is dots per inch; it relates directly to the mouse READING, along with the movement on screen. That said, if you use a 1024x768 screen rez and you have a 1000DPI mouse, it would 'technically' take 1 inch to move from one side to the other.
But it also relates to how many 'screen shots' the mouse takes per second. So higher dpi = more screen shots per 'movement cycle', which translates into a smoother movement of your mouse, and yields a higher speed.
You can also reduce your movement speed (most commonly done in the game you are playing) to adjust for the higher speed.
So if you go from a 400 dpi mouse with 4 sensitivity, a 1600 DPI mouse and a 1 sensitivity would give you the same speed, but higher precision.
You have to watch out for which mouse you buy and do research. SOME MICE labelled as high-end gaming mice react strangely to high dpi / low sensitivity and develop the situation of 'negative accel'. So please do research into the mice you are buying.
The mouse DPI means the number of points in a 1 cm square which are read by the mouse sensor. The more DPI, the better the mouse reaction.
I found this on the Internet.

Corrupt^
03-16-2011, 07:40 PM
I might have another idea as to why sensitivity 1.0 ingame gives you more precision and why below that sensitivity you don't seem to gain any (at least visible) precision.

This was posted earlier

current-angle = 56.021
mouse-input = 32 (this is integer)
mouse-sens = 1.26
inc-angle = 32*1.26 = 40.32
new-angle = 56.021+40.32 = 96.341

What if multiplying the increase in angle by a number greater than 1 makes you lose a portion of degrees in precision?

So basically in this case, by multiplying 32 degrees by 1.26, do we lose 40.32 - 32 = 8.32 degrees?

We don't lose them in terms of travelled distance, but we lose the 1-to-1 ratio with our mouse, causing our view to basically skip over those 8.32 degrees, similar to what would happen if we multiplied the windows sensitivity by 2.0 (it would probably travel in steps of 2 pixels instead of 1).

In Windows on the desktop, we wouldn't lose the travelled distance either, but we would lose half of our precision (ability to point at a single pixel on the desktop).

So until, we get monitors that have enough pixels to visually show an increment smaller then 0.022 (css yaw/pitch value) degrees, a sensitivity below 1.0 is pointless.

This doesn't mean the game can't move (let's say) 0.011 degrees (sens 0.5), but the increment is simply to small to be visually shown on a current monitor.

Since alot of games seem to be using the same mouse values of the old Quake games, perhaps some genius at ID Software thought up this value and figured it would be enough for years to come on monitors.

boin
03-16-2011, 11:11 PM
I might have another idea as to why sensitivity 1.0 ingame gives you more precision and why below that sensitivity you don't seem to gain any (at least visible) precision. [...]

NO !!!! You don't get it. I don't know how to explain it differently than I already did. Everything in this system is floating-point precision except the mouse input. Therefore this is the weakest link, because there is nothing between a 0-move and a 1-move (in mouse units, which have *NOTHING* to do with pixels !!!). With a higher DPI mouse this 1-unit move corresponds to a smaller movement of the mouse (on the order of 1/DPI inch), so it is more precise (as the numerical value is lower). A 400 DPI mouse won't detect a move of 1/2000th of an inch that a 2000 DPI (or higher) mouse will. Is that clear?
You can plug a mouse into a screenless computer; it is an input device. It measures movements and reports them to the OS. The OS converts the input to whatever it was programmed for. It can convert 1 mouse input to 1 pixel, or apply any kind of transformation, including ones that rely on screen resolution and/or DPI (the X server can do that).

The screen is only what you see, and it is limited in precision by pixels, but even that is not relevant, as a modern 3D renderer will use sub-pixel rendering, anti-aliasing and other techniques that make the output slightly different even for a fraction-of-a-degree move. And even if you don't see your screen move, that does not mean the internal variables (e.g. the current yaw angle) do not change. At some point you are limited by what you see, but the program is not. It is limited by the precision of the input, which is the precision of your mouse in the first place.

I hope this is clear.
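boin's argument (the smallest reportable move is 1/DPI of an inch, so at equal real sensitivity a higher-DPI mouse turns a smaller angle per count) can be sketched like this; the 26 cm-per-360 figure and the helper name `min_step` are assumptions for illustration:

```python
def min_step(dpi, cm_per_360=26.0):
    """Smallest physical move the sensor can report (inches), and the angle
    (degrees) that one count turns when the real sensitivity (mouse travel
    per full turn) is held fixed by lowering in-game sensitivity."""
    counts_per_360 = dpi * (cm_per_360 / 2.54)  # counts emitted over a full turn
    return 1.0 / dpi, 360.0 / counts_per_360

for dpi in (400, 2000, 4000):
    move_in, angle_deg = min_step(dpi)
    print(f"{dpi} DPI: 1 count = {move_in:.5f} in = {angle_deg:.5f} deg")
```

A 2000 DPI mouse reports moves five times smaller than a 400 DPI one, and each count turns a five-times-smaller angle, which is exactly the "weakest link" boin describes.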

Corrupt^
03-17-2011, 12:05 AM
Well, then explain to me why sens 2.0 with m_yaw/m_pitch 0.011 does not give me the same precision as 1.0 with m_yaw/m_pitch 0.022. Either our logic is incomplete, or Valve/HPE's implementation is flawed.

(This is with raw input btw)

boin
03-17-2011, 12:35 AM
Assuming you don't use mouse acceleration either (it would change everything) ... maybe we don't have the exact formula, but it does not really matter. It does not change the overall logic I explained (about the mouse precision being the ``limiting factor'' in this equation). I can't tell if the implementation is flawed, but I really doubt it. I can tell the in-game mouse acceleration was totally screwed up back in the day (probably because it relied on a fixed FPS). I don't know how it is since the raw-input update; I haven't tested it in years.

Edit: an example of a possible formula that would explain the behavior you reported:
angle-inc = (m_yaw+0.07)*sensitivity*mouse-input

Once again, those are random numbers, just to show the m_yaw factor may not be directly proportional for whatever reason.

Sagenth
03-18-2011, 05:36 PM
I actually use m_yaw 0.03; I find it just feels more natural.

I agree completely with boin; you don't seem to understand what we are trying to get across here. He brought up an excellent point which I feel illustrates the problem you are having perfectly.

The internal values change whether or not you see the difference displayed on screen; the fact of the matter is you just moved 0.00001 degrees to the left, right or whatever.

As for your sensitivity problem, I think you are approaching the issue from the wrong angle altogether. It is just another variable in the equation.

Thanks for that info boin, I didn't know exactly how a mouse reported to the system before now. So say it is reporting 20, and the OS is set to a slower setting, 4 let's say. The notches are defined arbitrarily and there is no quick way to determine what threshold each represents, so let's arbitrarily define notch 4 as a step of 5.

Input: 20
Step: 5
Delta Pixels = 20/5 = 4
The game (without raw input) receives 4
(4*sensitivity)*m_yaw
Presumably that is how the equation would go

((Input / Step) * Sensitivity) * m_yaw = delta rotation
*Render* -> insert difference
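Sagenth's guessed chain could be sketched as follows; the `step` division standing in for the OS cursor-speed notch is his hypothesis, not the documented Windows behaviour:

```python
def delta_rotation(raw_counts, step, sensitivity, m_yaw):
    """Hypothetical pipeline: the OS divides raw counts by an arbitrary
    'step' (the cursor-speed notch), then the game scales what is left by
    sensitivity and m_yaw. The integer division is where resolution is lost."""
    game_counts = raw_counts // step
    return game_counts * sensitivity * m_yaw

print(delta_rotation(20, 5, 1.0, 0.022))  # 0.088 (degrees)
print(delta_rotation(20, 1, 1.0, 0.022))  # raw input: no counts eaten by the OS
```

With raw input the OS stage drops out (step = 1), which is one reason the thread keeps recommending it.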

In other words... years ago, when I had a crappy mouse and used a sensitivity of 20, I did not have the problems you are reporting. Change some other variables and see how it affects the end result; I don't think you'd make a good scientist, not that that is at all relevant to the discussion.

shamoke
03-18-2011, 08:26 PM
ok i need a summary here

What is the optimal dpi for a mouse assuming everything else is good?

Corrupt^
03-19-2011, 06:43 AM
ok i need a summary here

What is the optimal dpi for a mouse assuming everything else is good?

Following Sagenth's and boin's logic, probably set your ingame sens as low as possible until you can't see a difference in precision (counting all limitations such as your monitor, your eyes, etc.), then set your DPI.


Though here is my logic (which they don't seem to grasp):

In the end, wouldn't it simply be because you multiply the number of counts received from the mouse in the equation, thus "artificially" bumping up these counts?

1.000000... would be neutral, giving you a perfect 1-to-1 ratio with your mouse, giving you FULL control over how many times you perform the increase of the yaw/pitch values.

In other words, the feeling of losing precision above 1.0 is not really a loss of precision, but a feeling of losing control over what I'm seeing. Occasionally the view will move a bit more in a certain direction than I've told it to (with my hand and mouse).

And on top of this I think (though this is not necessarily true) that an m_yaw/pitch value of 0.022 is still small enough for today's monitors combined with the average FOV people/games use.

So basically, if all this is correct, going above 1.0 will cause you to lose your 1-to-1 ratio with your mouse, and going below it is rather pointless.

OK, I did my best not to use the words PIXEL or PRECISION this time, so if you still don't get what I'm trying to point out, I give up. It's explained using your own logic, just with something added.

boin
03-19-2011, 06:25 PM
My position is that higher DPI is more precise, probably past the point where a human being can feel the difference (but I haven't tested mice >= 2000 DPI so I can't really tell). I haven't stated that more precise is better! All you need is precise enough; anything more is useless. Furthermore there are many other factors that make a good mouse: the quality of the CCD, how the mouse responds to acceleration, how you like it (size, weight, shape, buttons...), poll rate and the CPU it uses, driver quality... All these factors (and possibly others) are as important as DPI, IMO.

shamoke
03-19-2011, 07:35 PM
im using 1920x1200 24" monitor, what dpi should i be using here?

boin
03-19-2011, 11:29 PM
im using 1920x1200 24" monitor, what dpi should i be using here?

According to my logic you should use the highest DPI unless you have a good reason not to. More importantly, you should use the one that suits you. As for the game, it has nothing to do with your screen resolution whatsoever as long as you have checked ``raw mouse input''.

Bob910
03-20-2011, 07:41 AM
Something people are overlooking is "perfect control". Basically you want the mouse to respond the same whether you move it fast or slowly. Some mice also get positive or negative acceleration if you use high DPI.

Each DPI setting on a mouse has a speed beyond which the mouse simply stops functioning, and before that speed it starts producing errors. So if you make a quick swipe to turn around or to make a shot, you will end up pointing somewhere different depending on how fast you actually swipe. It depends how fast you move the mouse, of course. For me, I want the best accuracy up to ~2-3 m/s. I was disappointed with my G9 because it was pretty much useless to me due to poor perfect control and malfunction speed, even when I put it at 400 DPI (this was still better than the other DPI settings, but still not good enough), so I ended up getting an optical mouse and I use that at 400 DPI now.

You can do your own tests with this program, but focus on the first test rather than the "accuracy" one; I don't think that one is very accurate, as it doesn't take into account how fast you move the mouse: if you move it slowly you will get ~100% every time. If you can move the mouse just below the malfunction speed you might be able to test its accuracy. But look at the max speed and try different DPI settings. I get ~3 m/s with my MX518 @ 400 DPI (without mouse drivers; they cause problems for the new MX518). Put it at 1800 and it comes down to 1.8. With my G9 it's like 1 m/s at 400 DPI, but the G9 also has tons of negative accel. The max speed doesn't necessarily show perfect control; you can have a high malfunction speed but get errors well before that limit:

http://enotus.at.tut.by/Articles/MouseTest/index.html

This is an old mouse comparison, but it mentions the important things about mice:

http://www.esreality.com/?a=longpost&id=1265679&page=21

So 400-800 DPI usually offers the best "perfect control" and a higher malfunction speed. It depends on the mouse, of course, and on how fast you aim/turn, whether it will affect you or not. Personally I find high DPI offers nothing in exchange for reduced performance. I don't really see any benefit; 400 offers pin-point accuracy and I can move the mouse as fast as I want.

hl3_exe
03-20-2011, 09:43 AM
Something people are overlooking is "perfect control". Basically you want the mouse to respond the same whether you move it fast or slowly. [...]

We're talking about two different things. Malfunction speed is something low-sensitivity players are interested in, whereas DPI is relevant for high sens.

Back to the DPI talk, the simplest way to look at it is to consider an 800x600 monitor and an 800 DPI mouse, so that each monitor pixel has a one-to-one relation with one mouse count.

Given "m_rawinput 1" and ingame "sensitivity 4.5454" (yaw/pitch 0.22 x 4.5454 = 1), a one-inch mouse movement will move the view one full screen width (90 degrees). Theoretically, a 4-inch movement will be a 360-degree turn with no pixel skipping at these settings.

The equation is then: if

[(Mouse DPI)] / [(Horizontal Monitor Resolution) x (yaw/pitch) x (sensitivity)] >= 1

then no pixel skipping will occur.
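The inequality above can be sketched directly; this is a minimal transcription of hl3_exe's rule of thumb as stated, not a validated model of the engine:

```python
def no_pixel_skipping(dpi, width_px, yaw, sensitivity):
    """hl3_exe's rule of thumb: no pixel skipping when the mouse delivers
    at least one count per on-screen pixel the crosshair can move to."""
    return dpi / (width_px * yaw * sensitivity) >= 1.0

# hl3_exe's 800x600 example, where yaw 0.22 x sens 4.5454 = 1:
print(no_pixel_skipping(800, 800, 0.22, 4.5454))  # True
print(no_pixel_skipping(400, 800, 0.22, 4.5454))  # False: a 400 DPI mouse skips
```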

Corrupt^
03-20-2011, 10:33 AM
Yep, but then other problems seem to occur. Like I said, I think the main reason why 1.0 feels so bloody accurate compared to anything above it is that you get a 1-to-1 ratio with your mouse.

Boin mentioned that the system simply captures the input of the mouse and can then turn it into whatever it wants. A similar thing happens in Windows: if the slider is on 6/11, the Windows sensitivity is 1.0 and we get a 1-to-1 ratio with our mouse.

To clarify (before you start dissing me about pixels again):


Windows desktop, moving in pixels: sens 1.0 = for each mouse count, move 1 pixel
Ingame sensitivity, moving in degrees: sens 1.0 = for each mouse count, move 1 m_yaw/pitch value's worth of degrees


Thus, using a sensitivity higher than 1.0 results in losing our minimum possible movement in a game (0.022 degrees in CSS).

On sensitivity 2.0 you won't be able to move 0.022 degrees; it will always move at least 0.044, because we multiply the mouse input by 2.0, so the game always receives at least 2 counts' worth of movement (one of them created by the multiplication).

So even though my theory on page 1 was completely flawed, I'm sticking to this: if m_yaw/m_pitch are configured correctly and there's no negative acceleration (raw input is being used), the best possible DPI settings are the ones that give you a sensitivity of 1.0 or below, if you do not want to lose control over some of your movement.
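Corrupt^'s point, that every camera increment is a whole number of counts times sensitivity times m_yaw, can be illustrated with a quick sketch (0.022 is the CSS value quoted in the thread; `achievable_increments` is a hypothetical helper):

```python
M_YAW = 0.022  # CSS yaw step in degrees, per the thread

def achievable_increments(sensitivity, max_counts=4):
    """Rotations reachable from whole mouse counts: every increment is a
    multiple of sensitivity * m_yaw, so sens 2.0 can never turn just 0.022."""
    return [round(n * sensitivity * M_YAW, 4) for n in range(1, max_counts + 1)]

print(achievable_increments(1.0))  # [0.022, 0.044, 0.066, 0.088]
print(achievable_increments(2.0))  # [0.044, 0.088, 0.132, 0.176]
```

At sens 2.0 the smallest possible turn doubles, which is the "lost control" half of the argument.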

Bob910
03-20-2011, 11:20 AM
Not if increasing your DPI reduces your perfect control. It depends on the mouse and the sensor it uses. Increasing DPI may mean you can cover more pixels (but again, we aren't talking about thousands of DPI here), but by doing so you reduce the mouse's ability to follow your movements at speed.

Corrupt^
03-20-2011, 11:22 AM
Yes, eventually you need to take everything else into account as well, but if your mouse is capable of it, you might as well give it a try.

boin
03-20-2011, 03:01 PM
I give up :( People are still talking about this pixel scrap; I really don't know how, it just does not make any sense to me ...

Just two things before I really give up on this thread:

-1- ``... if the slider is on 6/11, this means the windows sensitivity is 1.0 and we get a 1-to-1 ratio with our mouse.'' I'm not so sure this is totally true. I'm no Windows expert; I barely use it except to play games like CS:S, but according to all the information I have read on the internet, there is always smoothing involved in Windows mouse pointer movement. To get rid of it you have to edit SmoothMouseXCurve and SmoothMouseYCurve in the registry. This may well depend on the Windows version. Notice that these registry entries are not editable directly through the Windows GUI (control panel).

-2- Another source of error could be a badly programmed driver. Like I wrote in a previous post, some manufacturers' drivers add supplemental sensitivity controls: ones you can't control through Windows, where you have to run the manufacturer's program to modify them. They may be another source of error/approximation/smoothing.

Also, I'm no expert in mouse sensors either; what Bob910 says about them reacting differently to speed/acceleration depending on DPI is probably true, as is the point about polling rate. I had this part covered when I wrote that there are many parameters that make a good mouse.

ESReality had such a great mouse test; too bad it has been discontinued :(

Sagenth
03-20-2011, 05:47 PM
This discussion really went way off topic.

All that really matters is FOUR things.
1. Precision is king.
2. Higher DPI means higher precision.
3. Lower cursor speed, and in game sensitivity means greater precision.
4. Choose what works for you.

It does not matter how many pixels change on screen, it does not matter what 5000 DPI actually means, and it does not even matter what the POSSIBLE equations are in any part of the process.

All that matters is that you have a high enough (for you) level of precision. Everything else needs to be set so that you are still able to point the mouse at what you need to interact with. That is to say, you need to set the Windows cursor speed so that you can still quickly click buttons, icons, and anything else you may need to click.

In game, you have to be able to aim accurately, so you just adjust the sensitivity until you can do so quickly and easily.


THAT IS ALL THAT IS IMPORTANT!!


PS: Sure, different mice will have differing levels of quality, but that doesn't make a lick of difference in the end. You still just need to set everything so that you have as much precision as you can get while still being able to do all the tasks that need to be done.

So this thread should have ended once somebody said 5000 DPI is one dot per 1/5000th of an inch.

Corrupt^
03-20-2011, 06:48 PM
@Boin, the Windows sensitivity and slider have been completely explained before by MarktheC.

I wasn't even talking about pixel crap anymore; I just wanted to know why sens 1.0 was some sort of threshold where everything felt really neat and precise, and I basically found it: a 1-to-1 relation with your mouse, similar to what happens when you put the Windows slider on 6/11.

Anything below it could also be an increase in precision, but 1.0 is a nice start.

And Sagenth, some people just want to know why things are the way they are.

I was wrong about the pixels, but not about sens 1.0.

boin
03-21-2011, 01:19 AM
@Boin, the Windows sensitivity and slider have been completely explained before by MarktheC.
Is that the article I read on Cadred? I don't remember it mentioning those registry entries. Furthermore, I do feel a difference between Windows sensitivity 6 and raw input mode. Placebo? Maybe. Like I said, I'm not sure.

``I wasn't even talking about pixel crap anymore, I just wanted to know why sens 1.0 was some sort of threshold where everything felt really neat and precise, a 1-to-1 relation with your mouse, similar to what happens when you put the windows slider on 6/11.'' Not talking about you anymore. I still don't understand your 1:1 ratio as long as no mouse input is lost. int(1) * float(R) == int(4) * 0.25*float(R), but in the first case you have reached the maximum precision, while in the second you could have made smaller moves:
int(3)*0.25*float(R)
int(2)*0.25*float(R)
int(1)*0.25*float(R)
Whatever the unit, one thing is for sure: it is more precise.
What I don't get is this: what is the second term in your 1:1 ratio? The first is obviously one mouse input, but what is the second? And how does putting this thing I can't figure out in a 1:1 ratio with the input count change anything? Really a great mystery to me.

Assuming:
- floating-point numbers are precise enough (and I really think they are, even 32-bit floats)
- no input count is lost in the process (this assumption is trickier, as it mostly relies on manufacturer drivers and the Windows kernel).

Damn, I said that was my last post here. I'm so weak :o

Corrupt^
03-21-2011, 08:27 AM
Is that the article I read on Cadred? I don't remember it mentioning those registry entries. Furthermore, I do feel a difference between Windows sensitivity 6 and raw input mode. Placebo? Maybe. Like I said, I'm not sure.

Windows 7 is slightly different than XP/Vista, and MarktheC wrote the mouse fix for Windows 7.

http://www.esreality.com/index.php?a=post&id=1846538

What I mean by my 1-to-1 ratio is quite simple:

If I move my mouse 1 count in any direction with sens 1.0, the game converts the count into whatever is necessary to move the camera, and does it only once. This matches perfectly what I did with my mouse: I moved it 1 count, and the camera does its thing only once as well.

Now with a higher sensitivity, let's say 2.3, every time I send 1 count to the system, the camera ingame performs the action it's supposed to do for 1 count's worth of movement 2.3 times.

This, as you mentioned, is because a mouse works with integer values; it cannot send half a count.

Even if you don't lose on-screen precision, you do lose a portion of control. Simply put (with an easier number), at sens 2.0 we lose control over half of the movement ingame. At 2.0 only half of the camera movement is done by US; the other half is caused by the multiplication.

That being said, I'm starting to wonder why they even implemented a sensitivity slider in the first place. Sure, it's more convenient for the average person who doesn't know what he's doing, but after all this discussion it would make more sense to me to have 2 sliders, for both yaw and pitch.

Then we control the precision through the m_yaw/pitch values and keep the sensitivity locked at 1.

This doesn't necessarily give us full precision, as a very big yaw/pitch value could clearly cause visible loss of precision on screen, but it does give us perfect control: a 1-to-1 ratio between our mouse and how many times we "perform" these values.

Though, if I remember correctly, either yaw or pitch (one of them) was locked, since it was used a lot for no-recoil scripts. Silly IMO; they should handle it as with cl_interp(_ratio), letting you edit them but not during a game.

``Furthermore I do feel a difference between Windows sensitivity 6 and raw input mode. Placebo? Maybe. Like I said, I'm not sure.''

So do I, but it is the same sensitivity. Raw input feels a bit faster because it doesn't have the negative acceleration you get with WM_MOUSEMOVE/GetCursorPos.

To give a simple answer to the thread creator, then:

Set your ingame sensitivity to 1.0 or lower and then adjust your DPI accordingly.

boin
03-21-2011, 11:24 AM
``Windows 7 is slightly different than XP/Vista, and MarktheC wrote the mouse fix for Windows 7.''
OK, so basically it is what I said: by default, even with enhanced mouse movement off, the Windows mouse pointer is not linear. Anyway, we don't need to consider this since the raw input mode.

``If I move my mouse 1 count in any direction with sens 1.0, it'll convert the count to whatever is necessary to move the camera and only do it once. This goes perfectly in line with what I did with my mouse: I moved it 1 count and the camera does its thing only once as well.'' This does not make sense to me.

``Now with a higher sensitivity, let's say 2.3, every time I send 1 count to the system, the camera ingame will perform the action it's supposed to do for 1 count's worth of movement 2.3 times.

This, as you mentioned, is because a mouse works with integer values; it cannot send half a count.

Even if you don't lose on-screen precision, you do lose a portion of control. Simply put, at sens 2.0 we lose control over half of the movement ingame; only half of the camera movement is done by US, the other half is caused by the multiplication.'' Exactly, and it serves my point. For -1-: if you move 1/2000th of an inch, with a 4000 DPI mouse the input is a 2-step move, but you won't get anything with a 400 DPI mouse. For -2-: if you use a higher DPI mouse you have to lower your sensitivity (hence gaining precision, as you've just agreed) to rotate exactly the same angle.

``That being said, I'm starting to wonder why they even implemented a sensitivity slider in the first place. [...] it would make more sense to me to have 2 sliders, for both yaw and pitch.

Then we control the precision through the m_yaw/pitch values and keep the sensitivity locked at 1.'' Yes, assuming the formula is sensitivity*yaw (resp. pitch), which is my best guess. Like you wrote, it is probably easier for the average user to have only one parameter to control the speed of the mouse.

``So do I, but it is the same sensitivity. Raw input feels a bit faster because it doesn't have the negative acceleration you get with WM_MOUSEMOVE/GetCursorPos.'' It is very unlikely, as I play with quite a low sens. It was more like a filter (smoothing) feeling. Emphasis on ``feeling''.

``To give a simple answer to the thread creator then: set your ingame sensitivity to 1.0 or lower and then adjust your DPI accordingly.''

TL;DR: I don't agree with Corrupt.

Corrupt^
03-21-2011, 12:01 PM
``Exactly, and it serves my point. For -1-: if you move 1/2000th of an inch, with a 4000 DPI mouse the input is a 2-step move, but you won't get anything with a 400 DPI mouse. For -2-: if you use a higher DPI mouse you have to lower your sensitivity (hence gaining precision, as you've just agreed) to rotate exactly the same angle.''

Yes, but in order to follow me, you need to step away from whatever you're trying to convert the mouse input into.

It doesn't matter whether you convert a mouse count into a travelled pixel, a dot, or an angle of rotation.

If you multiply the input from a mouse by a number greater than 1, it adds "nonexistent" counts. Example:

If I move my 400 DPI mouse 1 inch, it has moved 400 counts. Now if you multiply this by 1.5, it has STILL only moved 400 counts, yet it will count as 600.

The game uses these 600 counts to rotate the camera (600 * the m_yaw value), but 200 of those counts were NEVER made by my hand and mouse; they were ADDED by the multiplication.

This would also explain why sens 2.0 with m_yaw 0.011 doesn't feel as precise as 1.0 with 0.022. Though "precise" is the wrong word here: the precision of both is equal, but we lost control over half of the counts; half of the counts the game uses to move the camera are added.

boin
03-21-2011, 04:31 PM
The whole point of this thread is to compare mouse DPI !!! That's what I'm doing. All you are saying is that it is better to have a lower sensitivity, which I agree with. Also, with a higher DPI you'll have to set the sensitivity lower to get the same inch-to-angle conversion (which is better according to your own logic !!!). This is the whole point, since that's how you calibrate a mouse. For example, I use my mousepad length to do a U-turn; you can give me any mouse with whatever DPI and I will ALWAYS use this setting so that the game feels almost the same. To achieve that you can tweak many parameters: in-game sensitivity, driver sensitivity, Windows settings (if you don't use raw input)... Really, that's it. It is so obvious I really don't know what else to tell you.

``This would also explain why sens 2.0 with m_yaw 0.011 doesn't feel as precise as 1.0 with 0.022. [...]'' I've already done that.

Sagenth
03-22-2011, 06:23 AM
@Boin, the Windows sensitivity and slider have been completely explained before by MarktheC. [...]

Arbitrary reasons are why most things in life are the way they are, especially with computers.

Like I have already made perfectly clear, it all depends on the other two variables whether or not it will feel precise at 1.0.

If you change all your settings drastically you will find other numbers. For instance, use 200 DPI and speed 6, then try a sensitivity of 1.0 and tell me it feels good. You will be able to find HIGHER values if you lower the other two variables... THAT is what I have said in nearly every post. Pay attention this time?


Btw, maybe things have changed, but most servers don't allow you to lower m_yaw below 0.02, I think it was. I might be thinking of pitch, but then there would be no sense in lowering yaw. Unless you are fighting in elevator shafts.

edit 2:
Corrupt, if you are going to state the theorized equation as fact, then you've already got your answer and should stfu.

Vedi
03-23-2011, 02:46 PM
What I'm trying to point out, Vedi, is that IMO, on an average monitor at sens 1.0, turning 0.022 degrees corresponds to moving 1 pixel (near the crosshair). If that's really the case, it then seems quite logical that turning 0.044 degrees (sens 2.0) corresponds to moving 2 pixels.


Can't be assed to read the whole thread, so I don't know if you were already straightened out on this issue. There IS, by consensus of smart people, a "minimum useful DPI", which can be explained as follows.

Consider the effects of one mouse count. As explained, the screen turns count*sens*m_yaw degrees. sens and m_yaw are floating-point numbers, and the product is calculated in floating-point precision, i.e. to an accuracy of 10 decimals or so. Now ask yourself: "can I see the screen move when I move the mouse by one count?" There are several ways to answer this question.

1) The usual explanation for minimum useful DPI: postulate that to see the screen move, the view angle has to change by the equivalent of one pixel at the center of the screen. One can calculate the value of "sensitivity" for which 1 count = 1 pixel on screen. For example, for width 1680 in CSS the answer is around sens 3.7. That is the value for which 1 count = 1 pixel, not sens 1. The idea is that we want "sensitivity" to be below this estimate, since if it were above it, aim becomes less than pixel-perfect: in theory, we can't aim at every pixel.

2) My explanation, which gives a factor-2 enhancement: when I thought about this question myself, I also considered that one degree covers many more pixels at the side of the screen than at the center. This is simple geometry. Therefore, if instead of requirement 1) you require the sensitivity to be below the value that makes 1 pixel = 1 count at the EDGE of the screen, you get about a factor of 2, making the maximum sens you'd want on 1680, for example, around 2.2 (can't remember the numbers exactly). This translates into an increased minimum useful DPI.

I really can't explain this issue better, since you really have to draw a few pictures yourself to understand the geometry. But believe me, your ideas are not correct.

By the way - these calculations are further complicated by antialiasing - which basically increases your resolution before it is downsized to fit your screen again. So in the end, the only "fair statement" I think can be made is that for large resolutions, use at least 800dpi, and for smaller resolutions at least 400.

Sagenth
03-23-2011, 03:00 PM
Yes, it was addressed rather early on, if my memory serves. A point you do have, but precision is precision regardless of whether it is "useful" on screen. People tend to say that you should set the DPI to match your resolution, but I don't believe that crap. It doesn't really make any difference whether or not you set it to correspond with a particular resolution. What matters is whether you need the HARDWARE's precision and whether or not you can manage your mouse/cursor after you set the DPI. I consider all else moot, myself.

I just set my mouse to whatever the highest setting is; that way the mouse is stepping according to its smallest measurable unit.

Vedi
03-24-2011, 05:59 AM
Yes, it was addressed rather early on, if my memory serves. A point you do have, but precision is precision regardless of whether it is "useful" on screen. People tend to say that you should set the DPI to match your resolution, but I don't believe that crap. It doesn't really make any difference whether or not you set it to correspond with a particular resolution. What matters is whether you need the HARDWARE's precision and whether or not you can manage your mouse/cursor after you set the DPI. I consider all else moot, myself.

I just set my mouse to whatever the highest setting is; that way the mouse is stepping according to its smallest measurable unit.

Well, I would tend to agree that greater dpi is always better, but negative effects of higher dpi are still a logical possibility, which I can illuminate with the following. Imagine modeling the screen by discretizing the possible view directions, so that all angles are effectively not floating-point precision but multiples of this "smallest viewable angle difference". In this model, if your dpi is just right, each count will move the screen exactly one unit of this minimum possible view change. If you increase dpi from this (keeping real sensitivity constant), obviously for some counts the view will not change although you moved the mouse, while for other counts it will change. This is obviously a bad effect, and therefore the extra dpi makes it worse. If it exists in real life, it should be noticeable when moving the mouse very slowly.
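A toy simulation of this stepping effect (my own sketch; the "smallest viewable angle difference" of 0.08 degrees and the 22 degrees-per-inch real sensitivity are arbitrary illustrative values):

```python
def visible_motion_per_count(dpi, deg_per_inch=22.0, min_step_deg=0.08, n_counts=60):
    """For each count, report whether the discretized view (quantized to
    multiples of min_step_deg) actually changed on screen."""
    deg_per_count = deg_per_inch / dpi
    angle, shown, moved = 0.0, 0, []
    for _ in range(n_counts):
        angle += deg_per_count
        quantized = int(angle / min_step_deg)
        moved.append(quantized != shown)
        shown = quantized
    return moved
```

At a dpi well above the matched value, some counts produce no visible change while others do; at or below it, every count moves the view.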

boin
03-24-2011, 06:41 AM
Well I would tend to agree, that greater dpi is always better, but negative effects of higher dpi are still a logical possibility, that I can illuminate with the following. Imagine modeling the screen by discretizing the possible view directions, so that all angles are effectively not floating point precision but multiples of this "smallest viewable angle difference". In this model, if your dpi is just right, each count will move the screen exactly one unit of this minimum possible view change. If you increase dpi from this (keeping real sensitivity constant), obviously for some counts the view will not change although you moved the mouse, while for some counts it will change. This is obviously a bad effect and therefore the extra dpi makes it worse. If it exists in real life it should be noticeable when moving the mouse very slowly.
Right, maybe... But at that point it would be far beyond human capability to notice such a difference.

[edit] In fact it can't really happen... Roughly, mouse movements are 31-bit encoded; floating point is (in the worst case) IEEE-754 32-bit (24-bit mantissa); default DPI is, say, 400 (9 bits to encode). A mouse would have to send a value higher than 2^31 to overflow the floating point, and that would overflow the mouse protocol, so basically no manufacturer would ever build such a mouse!

Vedi
03-24-2011, 08:59 AM
Right, maybe... But at that point it would be far beyond human capability to notice such a difference.

[edit] In fact it can't really happen... Roughly, mouse movements are 31-bit encoded; floating point is (in the worst case) IEEE-754 32-bit (24-bit mantissa); default DPI is, say, 400 (9 bits to encode). A mouse would have to send a value higher than 2^31 to overflow the floating point, and that would overflow the mouse protocol, so basically no manufacturer would ever build such a mouse!

Hmm, that's not what I meant. I meant this: imagine turning 0.00001 degrees. I bet the screen looks EXACTLY the same. Not one pixel is different. Now increase that number to the smallest value where you can see a difference on screen. Say it's t degrees. Now tune sensitivity so that one count of your mouse turns you t degrees. This is the optimal dpi I was referring to, since if you have a higher dpi (but the same real sens), you won't see a difference for every count but only for some counts.
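As a back-of-envelope helper (the function and both example numbers are made up for illustration):

```python
def optimal_dpi(inches_per_360, t_deg):
    """DPI at which one count turns the view by exactly t degrees,
    given a fixed real sensitivity of inches_per_360 per full turn."""
    deg_per_inch = 360.0 / inches_per_360
    return deg_per_inch / t_deg  # counts per inch

# e.g. 10 inches per 360-degree turn, smallest visible step t = 0.08 deg:
# optimal_dpi(10, 0.08) -> about 450 counts per inch
```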

Anyways, just saying that if smaller dpi feels good, from a theoretical perspective there might be real reasons for it, however in general high dpi should be good.

boin
03-24-2011, 09:59 AM
Hmm thats not what I meant. I meant this: imagine turning 0.00001 degrees. I bet the screen looks EXACTLY the same. Not one pixel is different. Now change that number until the smallest number where you can see a difference on screen. Say its t degrees. Now tune sensitivity so that one count of your mouse turns you t degrees. This is the optimal dpi I was referring to, since if you have higher dpi (but same real sens), you won't see a difference for every count but only some counts.

Anyways, just saying that if smaller dpi feels good, from a theoretical perspective there might be real reasons for it, however in general high dpi should be good.
This is exactly the same... It is numerical precision, and they are about the same. So for the mouse to make such a precise move would require it to overflow the low-level input system.
So at some point very high DPI would have a bad effect. But that's obvious, as any computer system has finite precision. It's all about determining which end of the system reaches its limit first. Empirically I would say in this case they are about the same order. Furthermore, I'm pretty sure that before it reaches such a limit, a human being would be unable to feel any difference anyway.

HayzeMT
03-24-2011, 11:39 AM
Stop posting those silly calculations, they're pointless. In the end there's only 1 reason why more dpi could be useful:

"Decreasing your ingame sens"

At sensitivity 1.0, for each count the PC receives from the mouse, the game will move 1 pixel in the corresponding direction. At sensitivity 2.0 it will skip a pixel and move 2 pixels in the corresponding direction. At 3.0 it will move 3, etc. Numbers in between, like 2.75, are interpolated.

So in order to get maximum precision, one would need a sensitivity of 1.0, going lower is pointless as a pixel is the smallest increment visible on a monitor.

Where does dpi come in?

DPI simply gets you more counts per inch, thus making the mouse more sensitive. Doubling your dpi simply means your mouse will move twice as fast.

So ideally (with raw input as otherwise you'll get negative acceleration), you'd have to adjust your sensitivity through the DPI settings of your mouse and get as close as possible to 1.0 ingame.

2 examples:


Ingame sens = 2.0 / DPI = 800 --> change sens to 1.0 and dpi to 1600
sens = 3.4 ingame / DPI = 800 --> (if possible, otherwise get as close to 1.0 as you can) change dpi to 2720 and set ingame sens to 1.0


If you're using a sens as in the second example you should be shot. No point in going that high.

So more dpi is useful, but we hardly need 5000 dpi. The extra precision is gained not by having more dpi per se, but by decreasing your ingame sens.
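The conversion in the two examples is just keeping the product sens * dpi constant; a minimal sketch (the helper name is mine):

```python
def move_sens_into_dpi(sens, dpi, target_sens=1.0):
    """Rescale so the in-game sens becomes target_sens while the real
    sensitivity (proportional to sens * dpi) stays unchanged."""
    return target_sens, sens * dpi / target_sens

# sens 2.0 @ 800 dpi -> sens 1.0 @ 1600 dpi
# sens 3.4 @ 800 dpi -> sens 1.0 @ 2720 dpi
```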

I'm still deciding between 0.6 and 0.5 in-game sens (1800 dpi), and if I choose to stick with 0.5, do you think I should use 900 dpi and sens 1 in game?

boin
03-24-2011, 11:56 AM
I'm still deciding between 0.6 and 0.5 in game sens (1800dpi) and if I chose to stick with 0.5, you think I should use 900dpi and 1 sens in game?

Yes, that's what he thinks. And I don't. You should just try and see for yourself.

HayzeMT
03-24-2011, 12:33 PM
Yes, that's what he thinks. And I don't. You should just try and see for yourself.

It may be a placebo, but I've never liked playing with a dpi of 900 or below, although this may be because I've never used such a low in-game sens before.

Vedi
03-24-2011, 12:34 PM
This is exactly the same... It is numerical precision, and they are about the same. So for the mouse to make such a precise move would require it to overflow the low-level input system.
So at some point very high DPI would have a bad effect. But that's obvious, as any computer system has finite precision. It's all about determining which end of the system reaches its limit first. Empirically I would say in this case they are about the same order. Furthermore, I'm pretty sure that before it reaches such a limit, a human being would be unable to feel any difference anyway.

You are way off here. The angular resolution of the screen is much closer to the naive estimate, 1/res, than to numerical accuracy. It is within reach of, say, a 5000 dpi mouse.

boin
03-24-2011, 03:54 PM
You are way off here. The angular resolution of the screen is much closer to the naive estimate, 1/res, than numerical accuracy. It is within reach of say 5000dpi mouse.

No it's not, because with subpixel rendering and antialiasing and such, you'll probably have a pixel changing even for a *VERY* small angle step.

And anyway, even if it were true, you wouldn't loose anything except your extra precision. So in the worst-case scenario it does not change anything compared to lower DPI.

Sagenth
03-24-2011, 04:34 PM
Right, maybe... But at that point it would be far beyond human capability to notice such a difference.

[edit] In fact it can't really happen... Roughly, mouse movements are 31-bit encoded; floating point is (in the worst case) IEEE-754 32-bit (24-bit mantissa); default DPI is, say, 400 (9 bits to encode). A mouse would have to send a value higher than 2^31 to overflow the floating point, and that would overflow the mouse protocol, so basically no manufacturer would ever build such a mouse!

31 bits... are you saying that it just drops off a bit? 3.875 bytes? If anything it would have an extra bit for parity checking, but I have never heard of dropping off a bit. Please cite.

boin
03-25-2011, 01:08 AM
31 bits... are you saying that it just drops off a bit? 3.875 bytes? If anything it would have an extra bit for parity checking, but I have never heard of dropping off a bit. Please cite.

Because I was comparing with the floating-point mantissa, which is unsigned, so I removed the sign bit. Anyway, it could have been 32-bit; as my calculations were pretty rough, it wouldn't change much. Notice that 2^32/5000 = 858993 inches is the width of the largest area a 5000 DPI mouse could cover in absolute mode, while 2^31 is the largest step it can make in relative mode (the input device reports steps (which need to be signed, as a step can be left or right) instead of absolute positions, giving an unlimited area of action, limited only by speed, as it can't report a larger step between two consecutive polls).

All this gibberish is pretty much useless, as it is a simple fact that a better (more precise) sensor is more precise, whatever any of you says... I have tried to explain it in many ways; really, it is so obvious. Just think of a camera CCD, it is the same. You can always filter a picture if you think you have too much information/pixels, but it does not work in the other direction.

Sagenth
03-25-2011, 07:09 AM
Because I was comparing with the floating-point mantissa, which is unsigned, so I removed the sign bit. Anyway, it could have been 32-bit; as my calculations were pretty rough, it wouldn't change much. Notice that 2^32/5000 = 858993 inches is the width of the largest area a 5000 DPI mouse could cover in absolute mode, while 2^31 is the largest step it can make in relative mode (the input device reports steps (which need to be signed, as a step can be left or right) instead of absolute positions, giving an unlimited area of action, limited only by speed, as it can't report a larger step between two consecutive polls).

All this gibberish is pretty much useless, as it is a simple fact that a better (more precise) sensor is more precise, whatever any of you says... I have tried to explain it in many ways; really, it is so obvious. Just think of a camera CCD, it is the same. You can always filter a picture if you think you have too much information/pixels, but it does not work in the other direction.

Okay, I think I have a good understanding of what you meant. That last bit, or probably the first bit rather, would be the negative/positive bit.

So you were saying that using more than 4 bytes for mouse movement information, per axis presumably, would create tracking errors such as the ones people have mentioned?

boin
03-25-2011, 08:52 AM
Okay, I think I have a good understanding of what you meant. That last bit, or probably the first bit rather, would be the negative/positive bit. Yes, that's it. Bit #31 (MSB) holds the sign (the absolute value is two's complement: -X = NOT(X) + 1).

So you were saying that using more than 4 bytes for mouse movement information, per axis presumably, would create tracking errors such as the ones people have mentioned? No, what I was saying is that the level of precision needed for the mouse to be more precise than the internal rotation angles (which set the in-game precision, and probably the viewport precision too, as I have already explained) would require the mouse input to be wider than 32 bits, which is not possible since it is an industry standard (well, it could be possible with some tricks, but really that's another story). What I intended to show is that with current mice we are very far from that. With 16 bits, a 5000 DPI mouse can address 13.1 inches of absolute coordinates, about the size of a mouse pad.
Even if the mouse input could be wide enough, it would mean the mouse is insanely precise, hence the bits you would lose in the floating-point conversion would have values on the order of a 2^24th of an inch (about 6E-8 inch)!!! So really, it's very unlikely that anyone would notice!
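Those numbers follow straight from the bit widths (a trivial check, nothing protocol-specific):

```python
def absolute_span_inches(report_bits, dpi):
    """Width of the area an unsigned report_bits-wide absolute
    coordinate can address at the given counts per inch."""
    return (1 << report_bits) / dpi

# 16-bit report @ 5000 dpi: about 13.1 inches (a mouse-pad-sized area)
# 32-bit report @ 5000 dpi: about 858993 inches
```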

Vedi
03-25-2011, 04:06 PM
No it's not, because with subpixel rendering and antialiasing and such, you'll probably have a pixel changing even for a *VERY* small angle step.

And anyway, even if it were true, you wouldn't loose anything except your extra precision. So in the worst-case scenario it does not change anything compared to lower DPI.

Of course, with arbitrarily high antialiasing you can actually fit 2^(color depth in bits)*width*height information on the screen. Although a wild overestimate, it is still clear that with high antialiasing the screen can resolve the detailed movement of even the highest-dpi mouse, basically by changing the colors of pixels slightly.

This is exactly the theoretical reason why I said earlier that this discussion applies only with antialiasing off. In that case the color of an object never changes, objects can only move, making resolution a good estimate of the actual accuracy of the screen.

The negative issue WOULD BE that for a slow constant movement, the screen would turn by the minimal step at uneven intervals in time. This could feel awkward. If someone notices this when using a high-dpi mouse, maybe they ought to lower dpi. The ballpark for "possibly too high dpi" depends on resolution and runs from 1000+ dpi at low resolutions to 2000+ dpi at high resolutions. So all I wanted to say was: if you have a high dpi, and feel a lower dpi is better when moving the mouse slowly, this might be the reason.

Sagenth
03-25-2011, 07:28 PM
I think much of this is beyond the realm of understanding for most readers, me included, without a great deal of carefully worded explanation. I am an IT guy and I am 80% lost and 20% pretending I know where I am.

"Yes, that's it. Bit #31 (MSB) holds the sign (the absolute value is two's complement: -X = NOT(X) + 1)." You got me confused again; you are back to the 3.875 bytes again.

Vedi
03-26-2011, 03:58 AM
I think much of this is beyond the realm of understanding for most readers, me included, without a great deal of carefully worded explanation. I am an IT guy and I am 80% lost and 20% pretending I know where I am.

"Yes, that's it. Bit #31 (MSB) holds the sign (the absolute value is two's complement: -X = NOT(X) + 1)." You got me confused again; you are back to the 3.875 bytes again.

The question here has been

"Can dpi be too high?"

Essential information
- mouse reports movements in counts. Dpi gives the number of counts per inch
- with m_raw 1, each count turns the screen sensitivity*0.022 degrees

I have argued as follows. Imagine you are looking at a corner, with one wall fully black and the other fully white. Antialiasing is off. In this situation, the only information the screen gives you about your view angle is the location of the vertex. Your brain interprets the picture by assuming the vertex is located between the pixels where white pixels turn into black pixels. Now imagine turning your screen slowly. This line (the line dividing white and black pixels) will move one pixel at a time. Thus I have stated that, as an estimate, the width of the resolution (and also the fov) gives how accurately the screen can display your view angle. I.e., two view angles very close to each other might produce EXACTLY the same visual screen. Then I argued that for high dpi, for example 5000, when turning slowly the screen might not move for every count but, for example, 2 times in 3 counts. This would mean that the slow turning does not happen at a smooth pace but in jumps. Say, if you were turning 1 count per 0.01 seconds, then the screen would move on frames 1 2 4 5 7 8 etc., maybe giving a bad feeling to the mouse. Antialiasing changes things, because then there are three pixels containing information. The one in the middle is some shade between white and black. With 2x antialiasing it can only be gray, white or black. With infinite antialiasing it could be any shade between white and black, thus giving more accurate information on the view angle.
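The "frames 1 2 4 5 7 8" pattern is exactly what you get if only 2 of every 3 counts cross a pixel boundary; a quick toy model (one count per frame, my own sketch):

```python
import math

def frames_with_visible_motion(n_counts, visible=2, per=3):
    """By frame f the view has crossed ceil(f * visible / per) pixel
    boundaries; list the frames where the screen actually moved."""
    frames, shown = [], 0
    for f in range(1, n_counts + 1):
        crossed = math.ceil(f * visible / per)
        if crossed != shown:
            frames.append(f)
            shown = crossed
    return frames

# frames_with_visible_motion(9) -> [1, 2, 4, 5, 7, 8]
```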

The view angle is assumed to be in floating-point precision, and as such is represented with very high accuracy. Boin is concerned that if you have a high enough dpi mouse, you have to lower sensitivity to a value so low that the internal representation of the angle becomes faulty. Something like the computer writing the angle 0.000000000000 and then just chopping the end off, in essence making slow movement bad. I'm not even sure it would work like that, since floating point keeps the significant digits. But that discussion is anyway completely irrelevant given the limitations of the screen in displaying small differences in angles.

boin
03-26-2011, 05:22 AM
Your brain interprets the picture by assuming the vertex is located between the pixels where white pixels turn into black pixels. [...] Now imagine turning your screen slowly. This line (the line dividing white and black pixels) will move one pixel at a time. Agreed, and I had already understood what you meant the first time. But I do not agree that what you see is the only thing that matters. For example, we all move the mouse while blinded by a flash, when we don't see anything but white pixels.

Then I argued that for high dpi, for example 5000, when turning slowly the screen might not move for every count but, for example, 2 times in 3 counts. [...]
Yes, but it seems you forgot that those counts represent smaller movements of your hand! And that's all that matters. So for the same mouse movement, you'll get 2 counts from a 5000 DPI mouse where you got only 1 from a 2500 DPI mouse. So, like I already stated, in the worst-case scenario you only loose what you gained with the extra DPI. And that would be only for the visual aspect, as for every other aspect of the game (like aiming, moving...) the internal engine will use floating-point precision, so the difference will exist while you are moving, unless there is some kind of angle snapping.

Now here is another thing: imagine a screen very few pixels wide. I'm sure you can figure out quiet easily where your mouse is ``inside a pixel'' by moving it, because there is another dimension: time.

Boin is concerned that if you have a high enough dpi mouse, you have to lower sensitivity to a value so low that the internal representation of the angle becomes faulty.
Exactly, that could hypothetically happen. But I'm not really concerned, as such a mouse does not exist. And as I have explained, there is quiet a margin before it could happen.
But that discussion is anyway completely irrelevant given the limitations of the screen in displaying small differences in angles. That is exactly the point we do not agree on, because even if I admitted that the screen can't show the difference (which I don't), the difference is still there in your aim and in the angles you can reach theoretically (it changes where you are able to move when pressing WASD). Just to be clear, I don't honestly think anyone can notice that, but it is there... So I guess in fact we mostly agree that unnecessarily high DPI mice are probably useless. But I'm not able to tell you what this limit is.

Orange Ninja
03-26-2011, 06:57 AM
To OP:

Not sure if this has been pointed out already in the previous 6 pages, but here (http://www.youtube.com/watch?v=VVjPsqAJf1U)'s a video demonstrating why higher DPI mice give smoother results.

Not sure if 5000 dpi is absolutely necessary though!

fr0stik
03-26-2011, 09:19 AM
FFS, people, this is a pitifully unimportant discussion. Most competitive players use 400 or 800 dpi on mice that do 1000+; does it matter? No. Use whatever the hell dpi you want, because in the end it honestly won't make a difference. If you want to use 400 dpi, fine; if you want to use 5000 dpi, fine. It'll be a ♥♥♥♥♥ to adjust the sens, and "higher dpi = better" is rubbish, but go ahead, it'll be no different.

Lots of people use the Intellimouse, which is an amazing mouse, and it's only 400 dpi. DPI doesn't matter; focus on your play instead. Do some productive deathmatch, and that is what will influence your aim, not some stupid measure called dpi.

Sagenth
03-26-2011, 09:50 AM
FFS, people, this is a pitifully unimportant discussion. Most competitive players use 400 or 800 dpi on mice that do 1000+; does it matter? No. Use whatever the hell dpi you want, because in the end it honestly won't make a difference. If you want to use 400 dpi, fine; if you want to use 5000 dpi, fine. It'll be a ♥♥♥♥♥ to adjust the sens, and "higher dpi = better" is rubbish, but go ahead, it'll be no different.

Lots of people use the Intellimouse, which is an amazing mouse, and it's only 400 dpi. DPI doesn't matter; focus on your play instead. Do some productive deathmatch, and that is what will influence your aim, not some stupid measure called dpi.

We are fully aware of that. We actually stated, if you had read the early posts, that higher dpi doesn't mean a mouse is better. If you care to look up precision or preciseness in the dictionary, that is basically what the entire discussion has been about; good job.

Higher DPI means higher precision, not that the mouse is easier or better to use. Some pro gamers still use optical mice, for instance; we are not pro gamers though. I like my LASER mouse cause it has lazor beamz! No but really, I don't have any problem using 3200 dpi. I am not the best player, but a lot of this relies on a player's skill (i.e. dexterity).

Boin, you need to stay in school; your English grammar/spelling is terrible.

Lose*
Quite*
Theoretically* (not hypothetically)

boin
03-26-2011, 10:25 AM
Boin you need to stay in school your english grammar/spelling is terrible.

Lose*
Quite*
Theoretically* (not hypothetically)
Thanks for the advice. I think I'll pass. It's way too late for that, I've quit for 15 years. For my defense it was not really simple things I'll try to explain. I hope people have understood than more DPI is just more precise, everything else was pointless discussion on how it could or not affect our game feeling.

FFS people this is a pitifully unimportant discussion. Most competitive players's use 400 or 800 dpi on mice that do 1000+, does it matter? No use whatever the hell dpi you want because in the end it honestly wont make a difference. If you want to use 400 dpi fine, if you want to use 5000 dpi fine, itll be a ♥♥♥♥♥ to adjust the sens and higher dpi = better is rubbish, but go ahead itll be no different.

Lots of people use the Intellimouse, which is an amazing mouse, and it's only 400 dpi. DPI doesn't matter; focus on your play instead. Do some productive deathmatch, and that is what will influence your aim, not some stupid measure called dpi. That's mostly what we said. Can you show us an active and important discussion here?

Sagenth
03-26-2011, 10:37 AM
That's mostly what we said. Can you show us an active and important discussion here?

LMAO so true!!

"Thanks for the advice. I think I'll pass. It's way too late for that, I quit 15 years ago. In my defense they were not very simple things I was trying to explain. I hope people have understood that more DPI is simply more precise, everything else was pointless discussion on how it could or could not affect our "game-feeling."" -Boin*

:D

Exenter
03-27-2011, 06:17 AM
When using a non-decimal number, like 2.0, it does move 2 pixels for every count it receives from the mouse.

Have fun trying to move pixel by pixel at sens 2.0...


What are you on about? You CAN move the mouse cursor by 1 pixel with ALL mouse sensitivity settings and DPI settings on the mouse.

Sagenth
03-27-2011, 08:41 AM
"What are you on about? You CAN move the mouse cursor 1 pixel with ALL mouse sensitivity settings and DPI settings on the mouse."

ya?

shamoke
04-01-2011, 08:42 AM
Good god, I need another summary.

Does setting sens closer to 1 = more precision? That's all I want to know.

Vedi
04-01-2011, 10:49 AM
good god i need another summary

does setting sens closer to 1 = more precision? That's all I want to know.

No. Lower sensitivity is more precision, but you need higher dpi to achieve the same "actual sensitivity" with it. Therefore we have reached the surprising conclusion:

"Higher dpi allows higher accuracy."

(Use m_rawinput 1)

Sagenth
04-01-2011, 01:49 PM
No. Lower sensitivity is more precision, but you need higher dpi to achieve the same "actual sensitivity" with it. Therefore we have reached the surprising conclusion:

"Higher dpi allows higher accuracy."

(Use m_rawinput 1)

Precision != Accuracy
There is a distinct difference, so no!!! Higher DPI doesn't allow higher ACCURACY; it allows higher PRECISION. It does not hinder accuracy, but it does not explicitly enable better accuracy either.

Vedi
04-01-2011, 03:46 PM
Precision != Accuracy
There is a distinct difference, so no!!! Higher DPI doesn't allow higher ACCURACY; it allows higher PRECISION. It does not hinder accuracy, but it does not explicitly enable better accuracy either.

You make no sense.

Accuracy (mathematics, def. 3): The degree of correctness of a quantity, expression, etc. Compare precision (def. 5).

Precision (mathematics, def. 5): The degree to which the correctness of a quantity is expressed. Compare accuracy (def. 3).

Almost feel like I'm being trolled.

Please, go compare 50dpi to 500dpi and then rethink your statement.

Sagenth
04-01-2011, 04:12 PM
Increasing the DPI means that the mouse is measuring smaller distances. That is the living definition of precision.

preciseness: the quality of being reproducible in amount or performance; "he handled it with the preciseness of an automaton"; "note the meticulous precision of his measurements"

In computer science, precision of a numerical quantity is a measure of the detail in which the quantity is expressed. This is usually measured in bits, but sometimes in decimal digits. It is related to precision in mathematics, which describes the number of digits that are used to express a value.

Computer science or computing science (sometimes abbreviated CS) is the study of the theoretical foundations of information and computation, and of practical techniques for their implementation and application in computer systems. (This applies to mice)



the quality of being near to the true value; "he was beginning to doubt the accuracy of his compass"; "the lawyer questioned the truth of my account"
In the fields of engineering, industry and statistics, the accuracy of a measurement system is the degree of closeness of measurements of a quantity to its actual (true) value

That last bit means that the mouse needs a minimum DPI in order to be more accurate. Not that it needs to be ever higher, an indefinite escalation that could keep going till you are at 5 trillion DPI.


Higher DPI does not mean you are going to be able to shoot that guy with the AWP in the head with your AK or USP... or whatever. It means that your mouse is measuring smaller distances than if the DPI were lower. THIS DOESN'T even mean you can expect your accuracy to be consistent (i.e. precise).

I do make sense; perhaps you have poor reading comprehension. If you mean != makes no sense, it does.
"Several computer languages use "!" for various meanings, most importantly for logical negation; e.g. A != (http://en.wikipedia.org/wiki/%21%3D) B means "A is not equal (http://en.wikipedia.org/wiki/Inequation) to B", and !A means "the logical negation (http://en.wikipedia.org/wiki/Negation) of A" (also called "not A"). In this context, the exclamation is named the bang character; other programmers call it a shriek or screech. Invented in the US, it is claimed that bang is from Unix and shriek from Stanford or MIT; however, shriek is found in the Oxford English Dictionary dating from the 1860s." -Wikipedia under exclamation point.

Vedi
04-01-2011, 04:41 PM
Increasing the DPI means that it is measuring smaller distances. That is the living definition of precision.

preciseness: the quality of being reproducible in amount or performance; "he handled it with the preciseness of an automaton"; "note the meticulous precision of his measurements"

In computer science, precision of a numerical quantity is a measure of the detail in which the quantity is expressed. This is usually measured in bits, but sometimes in decimal digits. It is related to precision in mathematics, which describes the number of digits that are used to express a value.

Computer science or computing science (sometimes abbreviated CS) is the study of the theoretical foundations of information and computation, and of practical techniques for their implementation and application in computer systems. (This applies to mice)



the quality of being near to the true value; "he was beginning to doubt the accuracy of his compass"; "the lawyer questioned the truth of my account"
In the fields of engineering, industry and statistics, the accuracy of a measurement system is the degree of closeness of measurements of a quantity to its actual (true) value

That last bit means that the mouse needs a minimum DPI in order to be more accurate. Not that it needs to be higher, an indefinite term that could keep going til you are at 5 trillion DPI.


Higher DPI does not mean you are going to be able to shoot that guy with the awp in the head with your AK or usp.. or whatever.. It means that your mouse is measuring smaller distances than if the DPI was lower. THIS DOESN'T even equate to you being able to expect your accuracy to be consistent (ie precise).

I do make sense; perhaps you have poor reading comprehension. If you mean that != makes no sense, it does.

Aha. Ok, so you're saying a high dpi mouse won't make you pro. You could've just said that.

Sagenth
04-01-2011, 04:57 PM
Aha. Ok, so you're saying a high dpi mouse won't make you pro. You could've just said that.
Yeah, well, you essentially said it did.

LinkinMcOwnage
04-01-2011, 06:27 PM
Someone end this horrible thread.

flea8me
04-10-2011, 11:15 PM
Is it weird that I play with a 400 DPI mouse with mouse acceleration on and 9.1 sensitivity in-game, and totally feel that I have awesome pinpoint accuracy? When I turn off mouse acceleration and use raw input, it's really fast (not smooth) and inaccurate when pinpointing, and really slow when doing a 180 turn. I'm using this mouse. http://www.newegg.com/Product/Product.aspx?Item=N82E16826105185

Sagenth
04-11-2011, 03:22 PM
Is it weird that I play with a 400 DPI mouse with mouse acceleration on and 9.1 sensitivity in-game, and totally feel that I have awesome pinpoint accuracy? When I turn off mouse acceleration and use raw input, it's really fast (not smooth) and inaccurate when pinpointing, and really slow when doing a 180 turn. I'm using this mouse. http://www.newegg.com/Product/Product.aspx?Item=N82E16826105185


No. Not at all.

flea8me
04-11-2011, 11:01 PM
No. Not at all.

Oh... oh good.

Spylander
04-17-2011, 03:49 PM
I think he should just get a trackball.

sneax
05-29-2011, 06:15 PM
A high-DPI mouse will prevent your pointer from skipping pixels on-screen when you move the mouse at a very high sensitivity. In at least the past 5 years there has been a move toward low sensitivity, and mouse sensors have advanced a lot; now even a 15 EUR mouse has a laser sensor, so this is hardly an issue anymore.
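The formula quoted earlier in the thread, R = (pi * W) / (I * tan[F / 2]), lets you check this. A quick sketch of it in Python, assuming F is the horizontal FOV in degrees and I is the distance (in inches) your mouse travels for a full 360 turn (the example numbers are mine, not from the thread):

```python
import math

def required_dpi(width_px, inches_per_360, fov_deg):
    """Rough minimum mouse DPI so the crosshair never skips a pixel
    at screen centre: R = (pi * W) / (I * tan(F / 2))."""
    fov_rad = math.radians(fov_deg)
    return math.pi * width_px / (inches_per_360 * math.tan(fov_rad / 2))

# Example: 1920-wide screen, 10 inches per 360 turn, 90 degree horizontal FOV
print(round(required_dpi(1920, 10, 90)))  # -> 603
```

With those numbers, anything above roughly 600 DPI already covers every pixel; that is why any decent modern sensor is enough at low sensitivity.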

HayzeMT
05-30-2011, 02:53 AM
3 pages and not once was that covered. From a half-dead topic, thanks.

Orange Ninja
06-03-2011, 12:54 PM
3 pages and not once was that covered. From a half-dead topic, thanks.

Yes, it was. See post #77 (http://forums.steampowered.com/forums/showpost.php?p=21467112&postcount=77).