
Consider a block cipher algorithm with the following properties:

  - The input/output block length is 64 bits and the key size is 56 bits.
  - Given a key K, the key scheduling requires 2 microseconds.
  - After the key scheduling produces all subkeys, encrypting or decrypting a single 64-bit block requires 0.5 microseconds.

  1. Compute the total time (in microseconds) required to encrypt 1 megabyte of data.
  2. Given two known values C and M such that C = E_K(M) under an unknown key K, how many years (at most) are required to crack the cipher on a single computer running at 3 GHz?

Notes: 1 microsecond is 10^-6 seconds. 1 megabyte is 2^20 bytes.

My answer: one 64-bit block requires 0.5 microseconds (2^-1), so one bit requires 2^-1 / 2^6 = 2^-7 microseconds.

Since each bit takes 2^-7 microseconds, the time for 1 megabyte of data is:

1 megabyte = 2^23 bits, so the encryption takes 2^23 * 2^-7 = 2^16 microseconds, plus the 2 microseconds of key scheduling: 2^16 + 2 microseconds in total.
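
As a sanity check (just my own sketch, counting whole 64-bit blocks instead of per-bit time, and assuming only the timings given in the problem), a few lines of Python give the same number:

```python
# Sanity check for part 1: 1 MB of data under one key.
# Assumes only the timings stated in the problem: 2 us for key scheduling
# (done once) and 0.5 us per 64-bit block.
KEY_SCHEDULE_US = 2.0
BLOCK_TIME_US = 0.5
BLOCK_BITS = 64

data_bits = (2**20) * 8                 # 1 megabyte = 2^20 bytes = 2^23 bits
num_blocks = data_bits // BLOCK_BITS    # 2^17 = 131072 blocks

total_us = KEY_SCHEDULE_US + num_blocks * BLOCK_TIME_US
print(total_us)                         # 65538.0 = 2^16 + 2 microseconds
```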

For the second part, with the key unknown, a brute-force search needs on average 2^56 / 2 = 2^55 key trials.
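
To turn that key count into years, the rough sketch I had in mind is below, assuming (my assumption, not stated in the problem) that each candidate key costs one key schedule plus one block encryption, i.e. 2.5 microseconds per trial:

```python
# Rough sketch for part 2: time for a brute-force key search.
# ASSUMPTION (not stated in the problem): each candidate key costs one key
# schedule (2 us) plus one block encryption (0.5 us) = 2.5 us per trial.
TRIAL_US = 2.5
SECONDS_PER_YEAR = 365 * 24 * 3600

keys_avg = 2**55                         # average case; "at most" would be 2**56
total_seconds = keys_avg * TRIAL_US * 1e-6
print(total_seconds / SECONDS_PER_YEAR)  # about 2856 years on average
```

The question says "at most", so the worst case would double this, and I'm not sure where the 3 GHz clock rate is supposed to enter, since the per-operation timings are already given in microseconds.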


I'm wondering if my answers are correct. Could anybody please advise?

Thanks - Alaa
