I think it's partly a problem of definitions. So-called perfect numbers like six are defined as the sum of all their factors. The factors of six are 1, 2 and 3, which sum to 6. But what about 6 itself? Isn't 6 a factor of 6? Why, when we exclude 6 as a factor of 6, do we include 7 as a factor of 7 when defining it as prime?

Alex wrote: Sat Mar 07, 2026 6:33 pm
The main rule of prime numbers is that a prime can only be divided by 1 and itself.
Therefore in a straight logical sense, 1 has to be the first prime.
And 1 was treated as the first prime for most of human history, until the mathematical community changed the convention so that 2 is the first prime, for convenience's sake. Not for logical reasons, but in order to make neat statements like this:
"A prime number is a natural number greater than 1 that has exactly two distinct positive divisors."
But there are published mathematical papers treating 1 as the first prime as late as 1956.
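Coming back to the opening point about 6: whether a number is "perfect" hinges on the same definitional choice, because perfection is defined over the proper divisors (the divisors strictly smaller than the number itself). A quick sketch in Python, my own illustration rather than anything from the thread:

```python
# A perfect number equals the sum of its *proper* divisors,
# i.e. the divisors strictly less than the number itself.
def proper_divisors(n):
    return [d for d in range(1, n) if n % d == 0]

print(proper_divisors(6))       # [1, 2, 3]
print(sum(proper_divisors(6)))  # 6, so 6 is perfect
```

If 6 itself were counted among its own factors, the sum would be 12 and no number could ever be perfect, which is exactly why the definition quietly excludes it.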
The "Unique Building Block" Rule
The main reason for excluding 1 was to protect a cornerstone of mathematics: The Fundamental Theorem of Arithmetic.[5][6][7][8] This theorem states that every integer greater than 1 is either a prime number itself or can be expressed as a unique product of prime numbers.[5][8]
Think of primes as the fundamental building blocks of all other numbers.
The unique prime factorization of 12 is 2 x 2 x 3.
Now, let's see what happens if we allow 1 to be a prime number:
12 could be 1 x 2 x 2 x 3
Or 1 x 1 x 2 x 2 x 3
Or 1 x 1 x 1 x 2 x 2 x 3
Suddenly, the factorization is no longer unique; you could tack on any number of 1s, giving infinitely many different factorizations of the same number. This breaks the elegance and power of the theorem. To avoid this, mathematicians decided it was much simpler to refine the definition of "prime" to specifically exclude 1.
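The uniqueness argument above is easy to make concrete. Here is a minimal factorization routine in Python (my own sketch, not from the quoted text): with 1 excluded from the primes, every integer greater than 1 comes out as exactly one multiset of prime factors.

```python
# Trial-division prime factorization: returns the unique multiset
# of prime factors of n, e.g. 12 -> [2, 2, 3].
def prime_factors(n):
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:          # whatever remains is itself prime
        factors.append(n)
    return factors

print(prime_factors(12))  # [2, 2, 3] -- unique, with no 1s
# Were 1 admitted as a prime, [1, 2, 2, 3], [1, 1, 2, 2, 3], ...
# would all be valid "prime factorizations" of 12.
```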
Why is 1 the first triangular number? Shouldn't it be 3? Why does the OEIS define 0 as the first triangular number? https://oeis.org/A000217
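For what it's worth, the OEIS convention follows directly from the closed form T(n) = n(n+1)/2: the sequence simply starts the index at n = 0, and a triangle with zero rows has zero dots. A quick check (my own snippet):

```python
# OEIS A000217: T(n) = n*(n+1)/2, indexed from n = 0,
# so the sequence begins 0, 1, 3, 6, 10, ...
def triangular(n):
    return n * (n + 1) // 2

print([triangular(n) for n in range(7)])  # [0, 1, 3, 6, 10, 15, 21]
```

So "first" here is again a matter of definition: 0, 1 or 3 are all defensible answers depending on where you start counting.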
Why is pi defined as the ratio of the circumference to the diameter and not to the radius (which gives tau = 2pi)? That change would make sense, because physical quantities such as the reduced Planck constant, h/2pi, would simplify to h/tau.
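The tau simplification is purely notational, which a couple of lines make obvious (my own sketch, using the exact SI value of h):

```python
import math

# Convention assumed here: tau = 2*pi = circumference / radius.
tau = 2 * math.pi

h = 6.62607015e-34            # Planck's constant in J*s (exact SI value)
hbar_from_pi = h / (2 * math.pi)
hbar_from_tau = h / tau

print(hbar_from_pi == hbar_from_tau)  # True: same number, tidier formula
```

Nothing physical changes, only which constant carries the factor of 2, which is the whole tau-versus-pi argument in miniature.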
Numbers just are what they are and have the properties they have. The definitions are simply useful ways of describing their properties, but ultimately they aren't absolute.