SC700 -- Internet Information Protocols
Assigned: Tuesday, September 10
DUE: October 1 at the beginning of class.
1. [20 points] Finite field/number theory sanity check:
a. Compute [φ(24)]^(-1) (mod 51).
b. Compute x^(-1) over F_16 [irreducible polynomial: x^4 + x + 1].
c. Compute .
d. Consider Euler's totient function φ(n), and note that φ(p) = p - 1 if p is prime. What are the largest and smallest possible values for φ(n)?
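Parts (a) and (b) can be sanity-checked mechanically: a modular inverse via the extended Euclidean algorithm, and an inverse in F_16 via a^14 = a^(-1) (the multiplicative group has order 15). A minimal Python sketch, with function names of my own choosing:

```python
def egcd(a, b):
    """Extended Euclid: returns (g, x, y) with a*x + b*y == g == gcd(a, b)."""
    if b == 0:
        return a, 1, 0
    g, x, y = egcd(b, a % b)
    return g, y, x - (a // b) * y

def modinv(a, m):
    """Inverse of a modulo m; exists only when gcd(a, m) == 1."""
    g, x, _ = egcd(a % m, m)
    assert g == 1, "inverse does not exist"
    return x % m

def gf16_mul(a, b):
    """Multiply in F_16 = F_2[x]/(x^4 + x + 1); elements are 4-bit ints."""
    r = 0
    for _ in range(4):
        if b & 1:
            r ^= a
        b >>= 1
        a <<= 1
        if a & 0x10:        # degree reached 4: reduce by x^4 + x + 1 (0b10011)
            a ^= 0b10011
    return r

def gf16_inv(a):
    """a^14 in F_16*, which equals a^(-1) since the group has order 15."""
    r = 1
    for _ in range(14):
        r = gf16_mul(r, a)
    return r
```

These are checking tools only; the problem still asks you to show the computation by hand.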
2. [20 points] Use xv (available on acs.bu.edu) or gimp (available for most Linux machines) to manipulate the following photos (available on the web) into the most compact form possible without significantly affecting their quality. For each photo, please provide: the final image size (in bytes), the manipulations performed with explanations, and a link to the web page where your transformed images may be found.
3. [60 points] You are implementing three different encoding schemes for your internet startup according to the following design criteria:
I. Huffman encoding: You encode in blocks of length 3 according to the following a priori probabilities:
p(000) = 0.5;  p(001) = p(010) = p(100) = 0.1;  p(011) = p(101) = p(110) = 0.01;  p(111) = 0.17
II. Lempel-Ziv 77: You encode data with a look-back buffer of 1024 characters (10-bit offset) and a look-ahead buffer of 8 characters (3-bit match length). Each character is represented with 8 bits, giving a total of 10 + 3 + 8 = 21 bits to encode each phrase.
III. Lempel-Ziv 78: You allow a dictionary of 1024 entries (10-bit index) and represent each character with 8 bits; thus you use 10 + 8 = 18 bits to encode each phrase. Each time the dictionary reaches size 1024, you throw it away and start anew.
a. What is the best asymptotic compression ratio you can get for each of these schemes? (compression ratio = uncompressed length / compressed length)
b. What is the worst asymptotic compression ratio you can get for each of these schemes?
c. Give an example where Huffman encoding beats the two other schemes.
d. Generate a 100k file for which LZ77 attains the best compression ratio, and similarly generate a file for which LZ77 attains the worst compression ratio. Compress each of these files with gzip and record the final file sizes together with a link to your web page where these files may be found. How close were your ratios to those attained by gzip?
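Files at the two extremes of part (d) can be generated programmatically. One possible sketch in Python; the repeated-byte and random-byte choices are just illustrations of the two extremes, not the only valid constructions:

```python
import gzip
import random

SIZE = 100 * 1024  # 100k, as in the problem statement

# Highly repetitive data: long back-references, so LZ-style coders do very well.
best = b'a' * SIZE

# Uniformly random bytes: essentially no repeated phrases, so they do poorly.
rng = random.Random(0)  # seeded so the file is reproducible
worst = bytes(rng.randrange(256) for _ in range(SIZE))

for name, data in [('best', best), ('worst', worst)]:
    packed = gzip.compress(data)
    print('%s: %d -> %d bytes (ratio %.2f)'
          % (name, len(data), len(packed), len(data) / len(packed)))
```

Note that gzip's DEFLATE differs from the textbook LZ77 of scheme II (it Huffman-codes the offsets and lengths), so some gap between your computed ratio and gzip's is expected.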