Due: Friday, Sep. 16.
For simplicity, I will use the approximation that log2(3) is about 1.59.
You are given a number in binary notation. The goal is to convert it to decimal notation. For example, if the input is "10100" then the output is "20".
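To make the expected input/output concrete, the example can be checked against Python's built-in conversion. This is only an illustration of the specification; the assignment, of course, asks you to implement the conversion yourself rather than call a library routine.

```python
# int(s, 2) parses a binary string; str() renders the value in decimal.
# Used here only to illustrate the expected behavior, not as a solution.
binary_input = "10100"
decimal_output = str(int(binary_input, 2))
print(decimal_output)  # prints "20"
```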
Suppose that the input is n bits long. Devise an algorithm that solves this problem in time O(n^1.59).
Note. It is possible to divide two n-bit integers, getting an integer quotient and remainder, in the same time that it takes to multiply two integers, to within a constant factor. So you can divide two integers in time O(n^1.59).
The following is a quadratic-time algorithm for the problem that writes the decimal representation of the number in reverse order (from low-order to high-order digits). I leave it to you to figure out how the order might be reversed. The input is a number I, represented in binary. I assume that we can do arithmetic on binary numbers.
if (I == 0)
    print(0);
else {
    v = I;
    while (v > 0) {
        print(v mod 10);  // low-order decimal digit
        v = v / 10;       // integer quotient
    }
}
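The pseudocode above can be transcribed directly into Python as a sketch (the function name `decimal_reversed` and the choice to collect the digits into a string are mine; the handout's version just prints each digit as it is produced):

```python
def decimal_reversed(i: int) -> str:
    """Quadratic-time conversion: emit decimal digits from low to high order."""
    if i == 0:
        return "0"
    digits = []
    v = i
    while v > 0:
        digits.append(str(v % 10))  # low-order decimal digit
        v = v // 10                 # integer quotient
    return "".join(digits)

print(decimal_reversed(0b10100))  # the digits of 20 in reverse: "02"
```

Each loop iteration divides an n-bit number by 10, which costs time proportional to n with schoolbook division, and there are about n/log2(10) iterations, which is where the overall quadratic bound comes from.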