AWS Certified Big Data - Specialty (#27)

A large grocery distributor receives daily depletion reports from the field in the form of gzip archives of CSV files uploaded to Amazon S3. The files range from 500 MB to 5 GB and are processed daily by an EMR job. Recently, the file sizes have been observed to vary, and the EMR jobs are taking too long to complete. The distributor needs to tune and optimize the data processing workflow with this limited information to improve the performance of the EMR job. Which recommendation should an administrator provide?

Reduce the HDFS block size to increase the number of task processors.
Use bzip2 or Snappy rather than gzip for the archives.
Decompress the gzip archives and store the data as CSV files.
Use Avro rather than gzip for the archives.
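The trade-off behind the codec options is splittability: a gzip archive cannot be split, so EMR must feed each multi-gigabyte file to a single mapper, while a splittable codec such as bzip2 lets Hadoop divide one archive across many mappers. A minimal sketch of recompressing an archive from gzip to bzip2 (all file names here are hypothetical stand-ins for the S3 objects):

```shell
set -eu

# Local demo file standing in for a depletion report:
printf 'sku,qty\nA123,4\nB456,7\n' > depletion.csv
gzip -k depletion.csv                        # produces depletion.csv.gz, keeps the original

# Stream-recompress to bzip2 without writing a full decompressed intermediate.
# bzip2 output can be split at block boundaries, so EMR can run mappers in parallel.
gunzip -c depletion.csv.gz | bzip2 -c > depletion.csv.bz2

# Verify the round trip preserved the data:
bzip2 -dc depletion.csv.bz2 | cmp - depletion.csv && echo OK
```

Against S3, the same pipeline can stream through the AWS CLI, e.g. `aws s3 cp s3://bucket/report.csv.gz - | gunzip -c | bzip2 -c | aws s3 cp - s3://bucket/report.csv.bz2` (bucket and key names are placeholders).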