- prime-numbers security-definition probability negligible
- Updated Sun, 31 Jul 2022 15:20:38 GMT

I have this textbook definition, which I include below.

> GenModulus denotes a ppt algorithm that, on input $1^n$, outputs $(N, p, q)$ where $N = p\,q$ and (except with negligible probability) $p$ and $q$ are $n$-bit primes.

A few pages back in the book, it did say $1^n$ was the security parameter, which I assume applies above.

My first question is, what exactly is an $n$-bit number? I assume, to use an example with decimals, that $3$-bit would be a number no greater than $111$ in binary which is $7$ in decimal.

My next question is, what do they mean by negligible probability within the context of this question? I assume it means the two prime numbers $p$ and $q$ are guessable by an adversary with negligible probability.

> What exactly is an $n$-bit number?

In this context, *number* designates an element $m$ of the set $\mathbb N$ of non-negative integers. Such an $m$ is $n$-bit if and only if $n\in\mathbb N$ and $\lfloor2^{n-1}\rfloor\le m<2^n$, or equivalently $n=\left\lceil\log_2(m+1)\right\rceil$. When $n\ne0$, that simplifies to $2^{n-1}\le m<2^n$, or equivalently $n=\left\lfloor\log_2 m\right\rfloor+1$.
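As an illustrative check (my own sketch, not from the textbook): Python's built-in `int.bit_length()` computes exactly this $n$, matching both formulas above.

```python
import math

# For m >= 1, the bit length n satisfies 2**(n-1) <= m < 2**n,
# i.e. n = floor(log2(m)) + 1 = ceil(log2(m + 1)).
for m in [1, 2, 7, 8, 255, 256]:
    n = m.bit_length()
    assert 2 ** (n - 1) <= m < 2 ** n
    assert n == math.floor(math.log2(m)) + 1
    assert n == math.ceil(math.log2(m + 1))

# Edge case covered by the floor in the definition: m = 0 is the unique 0-bit number.
assert (0).bit_length() == 0
```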

Any $n$-bit $m\in\mathbb N$ has a unique $n$-bit bitstring representation $x$ in binary per the big-endian convention, and when $n>0$ the leading bit of $x$ is $1$. In the notation of Katz & Lindell's *Introduction to Modern Cryptography*, it holds that $\|m\|=|x|=n$.

Any $n$-bit bitstring $x$ represents an at-most-$n$-bit $m\in\mathbb N$ per that convention. It holds that $\|m\|\le|x|=n$ and $0\le m<2^n$. In some contexts it's (slightly incorrectly) stated that such an integer $m$ is $n$-bit, when really $x$ is $n$-bit and $m$ is *at most* $n$-bit. The question makes just this slip when it states

> $3$-bit would be a number no greater than $111$ in binary, which is $7$ in decimal.
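To make the distinction concrete (my own sketch, not part of the question): the $3$-bit *strings* represent $0$ through $7$, but only $4$ through $7$ are $3$-bit *numbers* in the strict sense.

```python
# All eight 3-bit strings cover 0..7, but only those with leading bit 1
# (i.e. 4..7) represent 3-bit numbers in the strict sense.
strings = [format(m, "03b") for m in range(8)]
strict_3bit = [m for m in range(8) if m.bit_length() == 3]
print(strings)       # ['000', '001', '010', '011', '100', '101', '110', '111']
print(strict_3bit)   # [4, 5, 6, 7]
```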

> What do they mean by negligible probability within the context of this question?

It means that for any polynomial $r$ (with positive leading coefficient), for $n$ sufficiently large, the probability that the $p$ and $q$ output by the algorithm are *not* both $n$-bit primes is less than $1/r(n)$.

Equivalently: for all $k>0$ there exists $n_k$ such that $n>n_k$ implies that said probability is less than $1/n^k$.

Still equivalently: it is allowed that $p$ and/or $q$ are not prime, and/or are more or less than $n$-bit, but only with a probability that vanishes as $n$ grows, faster than the inverse of any polynomial in $n$.

Said probability is computed over all the inputs of the deterministic version of the Probabilistic Polynomial Time algorithm considered, where the randomness is explicitly introduced as an extra input rather than kept implicit.
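As a numeric sanity check (my own sketch, not from the text): a probability such as $2^{-n}$ is negligible, since for any fixed $k$ it drops, and stays, below $1/n^k$ once $n$ passes some threshold $n_k$.

```python
def threshold(k, limit=200):
    """Smallest n_k such that 2**-n < 1/n**k (i.e. n**k < 2**n)
    holds for every n from n_k up to limit (integer arithmetic, no rounding)."""
    last_bad = 0  # last n where 2**-n >= 1/n**k
    for n in range(1, limit):
        if n ** k >= 2 ** n:
            last_bad = n
    return last_bad + 1

for k in (1, 2, 5):
    print(k, threshold(k))  # k=1 -> 1, k=2 -> 5, k=5 -> 23
```

So $2^{-n}$ eventually beats $1/n^k$ for every fixed $k$, which is exactly the pattern the definition demands.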

> I assume it means the two prime numbers $p$ and $q$ are guessable by an adversary with negligible probability.

No. Nothing is stated *in the definition quoted in the question* about the probability that $p$ or $q$ can be guessed. That comes later in the text.

The probability in the definition is there to formally accommodate a probabilistic primality test, which can let a composite $p$ or $q$ creep out on rare occasions. In practice, that probability is made so low that it's neglected.
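For concreteness, here is a minimal sketch of such an algorithm (my own illustration; the names `gen_modulus` and `is_probable_prime` are not from the book), built on the Miller-Rabin test, a probabilistic test of exactly the kind the definition accommodates.

```python
import random

def is_probable_prime(m, rounds=40):
    """Miller-Rabin: declares a composite m prime with probability at most 4**-rounds."""
    if m < 2:
        return False
    for small in (2, 3, 5, 7, 11, 13):
        if m % small == 0:
            return m == small
    # write m - 1 = d * 2**s with d odd
    d, s = m - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, m - 1)
        x = pow(a, d, m)
        if x in (1, m - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, m)
            if x == m - 1:
                break
        else:
            return False  # a witnesses that m is composite
    return True

def gen_modulus(n):
    """Sketch of GenModulus for n >= 2: output (N, p, q) with p, q probable n-bit primes."""
    def random_n_bit_prime():
        while True:
            m = random.randrange(2 ** (n - 1), 2 ** n)  # uniform n-bit candidate
            if is_probable_prime(m):
                return m
    p = random_n_bit_prime()
    q = random_n_bit_prime()
    return p * q, p, q
```

The "negligible probability" in the definition is the (tiny) chance that `is_probable_prime` lets a composite through for one of the candidates.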

(This includes textbook RSA when dealing with the message rather than key generation.)

That's left implicit in Katz & Lindell's *Introduction to Modern Cryptography*, 3rd edition, definition 3.4.

It additionally allows $p$ or $q$ to occasionally not be $n$-bit, as could be the case for $n<2$ (there's no prime less than $2$-bit); or for primes generated as the smallest prime no less than an $m$ drawn uniformly at random in $[2^{n-1},2^n)$, a method which outputs a prime of at least $2^n$, hence more than $n$-bit, with probability $O(n/2^n)$.
