What is the big-$\Omega$ complexity of Fermat's Little Theorem?
Fermat's Little Theorem states that if an integer $n$ is prime, then
$$
a^n \equiv a \pmod n \hspace{10mm} (*)
$$
for any $a \in \mathbb{N}$.
My question is, is it correct to say that testing $(*)$ for some given $a$ has complexity $\Omega (n)$, since we would have to calculate $a^n$ which would require $n$ (multiplication) operations?
Additionally, what is the big-$O$ complexity of testing this congruence?
Modular exponents can be computed efficiently using repeated squaring. This is a recursive method (which can also be implemented iteratively) that relies on the following two identities:
$$
a^{2k} = (a^k)^2, \qquad a^{2k+1} = a(a^k)^2.
$$
This allows us to give a recurrence for $f(a,b,n) = a^b \bmod n$:
$$
f(a,b,n) =
\begin{cases}
1 & \text{if } b = 0, \\
f(a,b/2,n)^2 \bmod n & \text{if $b$ is even}, \\
a \cdot f(a,(b-1)/2,n)^2 \bmod n & \text{if $b$ is odd}.
\end{cases}
$$
This shows that $a^b \bmod n$ can be computed using $O(\log b)$ many multiplications. Since we keep reducing modulo $n$ in each step, each of these multiplications takes time polynomial in $\log n$ (naively, $O(\log^2 n)$, but this can be significantly improved using fast integer multiplication methods). In total, we obtain an algorithm running in polynomial time.
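As a concrete sketch, the recurrence translates almost line for line into Python (the function name `modexp` and the test values are my own, not from the answer; Python's built-in `pow(a, b, n)` does the same thing):

```python
def modexp(a, b, n):
    """Compute a^b mod n by repeated squaring, following the recurrence for f(a, b, n)."""
    if b == 0:
        return 1 % n
    half = modexp(a, b // 2, n)   # f(a, floor(b/2), n)
    sq = (half * half) % n        # squaring step, reduced modulo n
    if b % 2 == 1:
        sq = (a * sq) % n         # extra factor of a when b is odd
    return sq

# Fermat test with a = 2 on n = 341 = 11 * 31, the smallest base-2 pseudoprime:
print(modexp(2, 341, 341))  # prints 2, so 341 passes the test despite being composite
```

The recursion depth is only about $\log_2 b$, since $b$ is halved at every step, which is exactly where the $O(\log b)$ multiplication count comes from.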
2018-03-13 17:42:30
Who says calculating $a^n$ takes $n$ multiplications? It is easily done in $2 \log n$ multiplications or fewer. Each multiplication modulo $n$ can be done in $O(\log^2 n)$ easily, and faster with some hard work.
2018-03-13 18:04:20
Here is a list on Wikipedia of the computational complexity of different operations, one of which happens to be the modular exponentiation you are asking about:
https://en.wikipedia.org/wiki/Computational_complexity_of_mathematical_operations
The complexity with exponentiation by squaring is:
#n = number of digits of n (= O(log(n)))
M(#n) = complexity of multiplying two #n-digit numbers
=> O(M(#n) * #n)
And M(#n) ~ O(#n^2) with schoolbook multiplication (for more information, see Wikipedia).
So, putting this together, we get
O(M(#n) * #n) = O(#n^3) = O(log^3(n))
However, this is only true using that specific algorithm, exponentiation by squaring.
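To make the multiplication count in this bound tangible, here is a small counting sketch (my own illustration, not from the answer): an iterative square-and-multiply loop instrumented to count modular multiplications, showing the count grows like log(b) rather than b.

```python
def modexp_count(a, b, n):
    """Return (a^b mod n, number of modular multiplications performed)."""
    result, mults = 1, 0
    base = a % n
    while b > 0:
        if b & 1:                        # current bit of the exponent is set
            result = (result * base) % n
            mults += 1
        base = (base * base) % n         # squaring for the next bit
        mults += 1
        b >>= 1
    return result, mults

value, mults = modexp_count(7, 10**6, 10**9 + 7)
# mults is at most 2 * log2(10^6), i.e. about 40, nowhere near 10^6
```

Each of those roughly `2 * log2(b)` multiplications acts on numbers reduced modulo n, i.e. with #n digits, which is where the extra factor of M(#n) in the bound comes from.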
2018-03-13 18:41:09
[I]s it correct to say that testing $(*)$ for some given $a$ has complexity $\Omega (n)$, since we would have to calculate $a^n$ which would require $n$ (multiplication) operations?
No. An algorithm provides an upper bound on the complexity of a problem. By exhibiting an algorithm that runs in time $f$, you're saying that the complexity of the problem can't possibly be worse than $f$. But there might be a better algorithm, so the complexity might be better than $f$. In other words, by demonstrating an algorithm with running time $f$, you've proven that the complexity of the problem is $O(f)$. To prove that the complexity is $\Omega(f)$, you would need to prove that no possible algorithm has running time $o(f)$.
Always be wary of the argument "This obvious algorithm has running time $f$, so the complexity can't be any better than that, right?" The whole point of studying algorithms is to come up with smart techniques to solve problems faster than the obvious approach.
2018-03-13 19:04:27