2 Oct 02:53 2011

### Re: Native PyGame method for automatically scaling inputs to a surface resolution?

Christopher Night wrote: > If I have a 100x100 pixel window, > and I want to put a dot at a position (x,x), it seems to me like the dot > should appear in the window if 0 <= x < 100. You're saying it should > appear in the window if -0.5 <= x < 99.5. You need to be more precise about what you mean by "a dot". On a display surface made of pixels, to make anything appear at all, you need to paint at least one pixel. So I'll take it that you want to paint a 1x1 rectangle. There are also a couple of other things we need to be clear about. One is the precise relationship between coordinates and pixels. Two obvious choices come to mind: we could take the coordinates as labelling the centres of pixels, or the boundaries between pixels. I prefer to take the boundary approach, because it avoids a lot of potential confusion. To draw a 1x1 rect centred at (x, y), we need to paint the area between (x-0.5, y-0.5) and (x+0.5, y+0.5). In order to cover exactly one pixel, the centre coordinates need to be an integer plus 0.5, so that the boundaries are integers. So if x = 0.5, the rect covers the range 0.0 to 1.0. Now, suppose our arithmetic is a little inaccurate, and we actually get x = 0.499. The boundaries then come out as -0.001 and 0.999. If we round these, we get 0.0 and 1.0 as before. But if we floor, we get -1.0 and -0.0, and the pixel goes off the screen.(Continue reading)