The CPU of a typical contemporary x86 laptop can do around 17,253,000,000 square roots per second (with the SQRTSD instruction). For comparison, the display has 4,096,000 pixels. Given that, we can sqrt as much as we want; it isn't dramatically more costly than other basic arithmetic operations.
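If anyone wants to sanity-check a number like that on their own machine, here's a rough throughput probe (not the original poster's benchmark, just a sketch). Compile with -O3 -march=native -lm so sqrt() lowers to SQRTSD/VSQRTPD; the result varies a lot by CPU.

#include <math.h>
#include <stdio.h>
#include <time.h>

#define N (1 << 20)
static double data[N];

int main(void) {
    for (int i = 0; i < N; ++i)
        data[i] = 1.0 + i * 1e-6;
    const int reps = 2048;
    clock_t t0 = clock();
    for (int r = 0; r < reps; ++r)
        for (int i = 0; i < N; ++i)
            data[i] = sqrt(data[i]) + 1.0;   /* independent across i, so it can vectorize */
    double secs = (double)(clock() - t0) / CLOCKS_PER_SEC;
    double checksum = 0.0;                   /* keep the work observable */
    for (int i = 0; i < N; ++i)
        checksum += data[i];
    printf("%.2e sqrts/sec (checksum %f)\n", (double)N * reps / secs, checksum);
    return 0;
}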
Name:
Anonymous2019-08-20 8:44
>>4 If you're doing sqrts at this rate, something is wrong with your code.
Name:
Anonymous2019-08-20 8:47
>>5 Sqrts are used for lighting calculations and for distances in general. Also, sqrt can be used as a rough sRGB gamma compression if you don't have a fast pow(x, 1/2.2)
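For what it's worth, the sqrt-as-gamma trick looks roughly like this (a sketch, not the exact piecewise sRGB curve; sqrt gives gamma 2.0 instead of ~2.2):

#include <math.h>
#include <stdint.h>

/* Approximate gamma compression: sqrt(x) stands in for pow(x, 1/2.2).
   Input is linear light in [0,1], output an 8-bit "gamma" value. */
static uint8_t encode_gamma_approx(float linear) {
    if (linear < 0.0f) linear = 0.0f;
    if (linear > 1.0f) linear = 1.0f;
    return (uint8_t)(sqrtf(linear) * 255.0f + 0.5f);
}

/* Decoding is just squaring, which undoes the sqrt. */
static float decode_gamma_approx(uint8_t v) {
    float g = v / 255.0f;
    return g * g;
}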
Name:
Anonymous2019-08-20 9:02
>>6 It sounds like you're deliberately avoiding using the GPU for some reason
>>9 Of course, if someone calculates 17 billion sqrts per second, a GPU will be faster, but in an indie game it's unlikely that anywhere near that many occur (maybe a few thousand sqrts?), so the CPU will be faster because of lower latency.
Name:
Anonymous2019-08-20 21:30
>>9 The trick is that you don't need sqrts in abstract space, you need sqrts in the context of other computations.
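One way to read that: in context you can often skip the sqrt entirely, e.g. when you only compare distances, since squared distances order the same way. A sketch:

#include <math.h>

typedef struct { float x, y; } vec2;

/* Squared distance: same ordering as distance, no sqrt. */
static float dist_sq(vec2 a, vec2 b) {
    float dx = a.x - b.x, dy = a.y - b.y;
    return dx * dx + dy * dy;
}

/* Is p closer to a than to b? No sqrt needed. */
static int closer_to_a(vec2 p, vec2 a, vec2 b) {
    return dist_sq(p, a) < dist_sq(p, b);
}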
Name:
Anonymous2019-08-21 3:33
>>11 8- to 16-bit floats can have a sqrt lookup table that is faster. 16 bits x 32k entries = 64 kbytes.
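A sketch of what such a table could look like, assuming the inputs are non-negative IEEE-754 half floats and a compiler with _Float16 support (e.g. recent GCC/Clang on x86-64); dropping the sign bit leaves 32k entries of 2 bytes, covering every input:

#include <math.h>
#include <stdint.h>
#include <string.h>

static _Float16 sqrt_lut[1 << 15];           /* 32k entries * 2 bytes = 64 KB */

static void init_sqrt_lut(void) {
    for (uint32_t bits = 0; bits < (1u << 15); ++bits) {
        uint16_t b = (uint16_t)bits;
        _Float16 h;
        memcpy(&h, &b, sizeof h);            /* reinterpret the bits as a half float */
        sqrt_lut[bits] = (_Float16)sqrtf((float)h);
    }
}

static _Float16 fast_sqrt_h(_Float16 x) {    /* x assumed >= 0 */
    uint16_t b;
    memcpy(&b, &x, sizeof b);
    return sqrt_lut[b & 0x7FFF];             /* drop the sign bit, index the table */
}

Whether this actually beats the hardware sqrt depends on the table staying hot in L1/L2 cache.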
>>18 That is what Norman J. Wildberger proposes in his rational trigonometry lectures, which is similar to Wolfram's ideas about physics. But you do need sqrt for non-theoretical work, e.g. to normalize a vector.
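Normalization is the case where the sqrt really can't be dropped, since the actual length is needed rather than just a comparison. A minimal sketch:

#include <math.h>

typedef struct { float x, y, z; } vec3;

/* The length itself is needed here, so the sqrt is unavoidable. */
static vec3 normalize3(vec3 v) {
    float len = sqrtf(v.x * v.x + v.y * v.y + v.z * v.z);
    if (len > 0.0f) { v.x /= len; v.y /= len; v.z /= len; }
    return v;
}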