I guess the question is about the moment when i is incremented? C# is not assembler, and in 99.9% of the cases where i++ or ++i appears in code, what goes on in the background does not matter. The way a for loop is processed is as follows: 1. first, the initialization is performed (i = 0); 2. the condition is checked (i < n); 3. the code in the loop body is executed; 4. the increment is performed (i++ or ++i, with identical effect here, since the expression's value is discarded), and control returns to step 2.
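The four steps above can be sketched by rewriting the for loop as the equivalent while loop. This is a minimal C++ illustration (the helper name `runLoop` is mine, not from the thread):

```cpp
#include <vector>

// Illustrative helper: records each value of i that the loop body observes.
std::vector<int> runLoop(int n) {
    std::vector<int> seen;
    // `for (int i = 0; i < n; i++) { body; }` is processed as:
    int i = 0;              // 1. initialization, performed once
    while (i < n) {         // 2. the condition is checked
        seen.push_back(i);  // 3. the loop body is executed
        i++;                // 4. the increment runs; its result is discarded,
                            //    so i++ and ++i behave identically here,
                            //    then control returns to step 2
    }
    return seen;
}
```

Because step 4's expression value is thrown away, nothing in the loop can tell the two forms apart.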
Jay Schilter (@jayschilter) on Threads:
There's absolutely no reason not to, and if your software ever passes through a toolchain that doesn't optimize the difference away, it will be slightly more efficient.
Considering it is just as easy to type ++i as it is to type i++, there is.
Is this a general rule of thumb, or is it PHP-specific? Is there a performance difference between i++ and ++i in C++? Is there a reason some programmers write ++i in a normal for loop instead of i++? In JavaScript I have seen i++ used in many cases, and I understand that it adds one to the preceding value:
i++ evaluates to the last value of i, and i is incremented at any time between the last and the next sequence point; the program is not allowed to try to observe when. I must say that for the really curious this is good knowledge, but for the average C# application, the difference between the wording in the other answers and what actually goes on is so far below the abstraction level of the language that it really makes no difference.
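What a program *is* allowed to observe is the value each expression yields: postfix yields the old value, prefix yields the new one. A small C++ sketch (the helper names `postfix` and `prefix` are mine, for illustration; each returns the pair of the expression's value and the final value of i):

```cpp
#include <utility>

// Illustrative helpers: return {value of the expression, value of i afterwards}.
std::pair<int, int> postfix(int i) {
    int v = i++;  // v gets the old value of i; i is then 1 greater
    return {v, i};
}

std::pair<int, int> prefix(int i) {
    int v = ++i;  // i is incremented first; v gets the new value
    return {v, i};
}
```

Both leave i with the same final value; only the value of the expression itself differs, which is why the choice is invisible in a plain for loop.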