# What is the big-$\Omega$ complexity of Fermat's Little Theorem?

2018-03-13 17:10:00

Fermat's Little Theorem states that if an integer $n$ is prime then

$$a^n \equiv a \pmod n \hspace{10mm} (*)$$

for any $a \in \mathbb{N}$.

My question is, is it correct to say that testing $(*)$ for some given $a$ has complexity $\Omega (n)$, since we would have to calculate $a^n$ which would require $n$ (multiplication) operations?

Additionally, what is the big-O complexity of testing this congruence?

• Modular exponents can be computed efficiently using repeated squaring. This is a recursive method (which can also be implemented iteratively) that relies on the following two identities:

$$a^{2k} = (a^k)^2, \qquad a^{2k+1} = a(a^k)^2.$$

This allows us to give a recurrence for $f(a,b,n) = a^b \bmod n$:

$$f(a,b,n) = \begin{cases} 1 & \text{if } b = 0, \\ f(a,b/2,n)^2 \bmod n & \text{if } b \text{ is even}, \\ a f(a,(b-1)/2,n)^2 \bmod n & \text{if } b \text{ is odd}. \end{cases}$$

This shows that $a^b \bmod n$ can be computed using $O(\log b)$ many multiplications. Since we keep reducing modulo $n$ in each step, each of these multiplications takes time polynomial in $O(\log n)$ (naively, $O(\log^2 n)$, but this can be significantly improved using fast integer multiplication methods). In total, we obtain an algorithm running in polynomial time.
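The recurrence above translates directly into code. Here is a minimal Python sketch (Python's built-in three-argument `pow(a, b, n)` already implements fast modular exponentiation, so this is purely illustrative):

```python
def modexp(a, b, n):
    """Compute a**b mod n by repeated squaring, following the recurrence for f(a, b, n)."""
    if b == 0:
        return 1 % n                 # base case: f(a, 0, n) = 1
    half = modexp(a, b // 2, n)      # f(a, floor(b/2), n)
    if b % 2 == 0:
        return (half * half) % n     # even case: f(a, b/2, n)^2 mod n
    return (a * half * half) % n     # odd case: a * f(a, (b-1)/2, n)^2 mod n

print(modexp(2, 10, 1000))  # → 24, i.e. 1024 mod 1000
```

Because every intermediate result is reduced modulo $n$, the numbers being multiplied never grow beyond $n^2$, which is what keeps each step cheap.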

2018-03-13 17:42:30
• Who says calculating $a^n$ takes $n$ multiplications? It is easily done in $2 \log n$ multiplications or fewer. Each multiplication modulo $n$ can be done in $O(\log^2 n)$ time easily, and faster with some hard work.
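The "$2 \log n$ multiplications or fewer" bound is easy to check empirically by instrumenting an iterative version of repeated squaring with a counter (a sketch; the names are illustrative):

```python
def modexp_count(a, b, n):
    """Iterative repeated squaring; returns (a**b mod n, number of multiplications used)."""
    result, base, mults = 1, a % n, 0
    while b > 0:
        if b & 1:                        # current bit of the exponent is set
            result = (result * base) % n
            mults += 1
        base = (base * base) % n         # one squaring per bit of b
        mults += 1
        b >>= 1
    return result, mults

value, mults = modexp_count(3, 10**6, 10**9 + 7)
print(value == pow(3, 10**6, 10**9 + 7))  # sanity check against the built-in
print(mults, 2 * (10**6).bit_length())    # multiplications used vs. the 2*log2(b) bound
```

The loop runs once per bit of the exponent, doing one squaring each time and one extra multiplication when the bit is set, so the total is at most twice the bit length of $b$.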

2018-03-13 18:04:20
• Here is a list on Wikipedia of the computational complexities of various operations, one of which happens to be modular exponentiation, which is what you are asking about.

https://en.wikipedia.org/wiki/Computational_complexity_of_mathematical_operations

The complexity with exponentiation by squaring is as follows. Write

$\#n$ = number of digits of $n$ (so $\#n = O(\log n)$)

$M(\#n)$ = complexity of multiplying two $\#n$-digit numbers

$\Rightarrow O(M(\#n) \cdot \#n)$

Putting this together with schoolbook multiplication, which has $M(\#n) = O(\#n^2)$, we get

$O(M(\#n) \cdot \#n) = O(\#n^3) = O(\log^3 n)$

However, this is only true using that specific algorithm, exponentiation by squaring.
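Putting this together: testing the congruence $(*)$ for a single base $a$ takes time polynomial in the number of digits of $n$. A minimal sketch using Python's built-in three-argument `pow`, which implements exponentiation by squaring:

```python
def fermat_test(n, a):
    """Test the congruence a**n ≡ a (mod n); True means n may be prime."""
    return pow(a, n, n) == a % n

print(fermat_test(97, 2))  # 97 is prime, so the congruence holds
print(fermat_test(91, 2))  # 91 = 7 * 13 is composite, and base 2 exposes it
```

Note that the converse of Fermat's Little Theorem fails: some composites pass the test for certain bases, and Carmichael numbers such as 561 pass for every base coprime to $n$, so this is a compositeness test rather than a primality proof.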

2018-03-13 18:41:09
• > [I]s it correct to say that testing $(*)$ for some given $a$ has complexity $\Omega (n)$, since we would have to calculate $a^n$ which would require $n$ (multiplication) operations?

No. An algorithm provides an upper bound on the complexity of a problem. By exhibiting an algorithm that runs in time $f$, you're saying that the complexity of the problem can't possibly be worse than $f$. But there might be a better algorithm, so the complexity might be better than $f$. In other words, by demonstrating an algorithm with running time $f$, you've proven that the complexity of the problem is $O(f)$. To prove that the complexity is $\Omega(f)$, you would need to prove that no possible algorithm has running time $o(f)$.

Always be wary of the argument "This obvious algorithm has running time $f$, so the complexity can't be any better than that, right?" The whole point of studying algorithms is to come up with smart techniques to solve problems faster than the obvious approach.

2018-03-13 19:04:27