Decompressing a .gz file from HDFS

I want to read a .gz file from HDFS and decompress it to an HDFS location. How do I do it?
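A minimal sketch using the standard Hadoop Java API: let CompressionCodecFactory pick the gzip codec from the file extension, then stream the decompressed bytes straight back to HDFS. The class name and input path below are placeholders; everything is streamed, so nothing is staged on the local filesystem.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IOUtils;
    import org.apache.hadoop.io.compress.CompressionCodec;
    import org.apache.hadoop.io.compress.CompressionCodecFactory;

    import java.io.InputStream;
    import java.io.OutputStream;

    public class HdfsGzDecompress {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Hypothetical input path; substitute your own .gz file.
            Path input = new Path("hdfs:///data/input/file.gz");
            FileSystem fs = input.getFileSystem(conf);

            // Resolve the codec from the extension (.gz -> GzipCodec).
            CompressionCodecFactory factory = new CompressionCodecFactory(conf);
            CompressionCodec codec = factory.getCodec(input);

            // Write the decompressed copy next to the original, minus the .gz suffix.
            Path output = new Path(CompressionCodecFactory.removeSuffix(
                    input.toString(), codec.getDefaultExtension()));

            try (InputStream in = codec.createInputStream(fs.open(input));
                 OutputStream out = fs.create(output)) {
                // Copies 4 KB at a time; the file is never fully held in memory.
                IOUtils.copyBytes(in, out, 4096);
            }
        }
    }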


I am using IOUtils.copyBytes(inputStream, outputStream, 4096) to decompress a 50 GB .gz file on S3, but the Glue worker nodes fail with "No space left on device". The same code works fine with smaller files (around 5 GB compressed).

import org.apache.hadoop.io.IOUtils;

Any idea why I am running out of disk space? Is there a way to avoid filling up the worker node's local disk?
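One possible cause, assuming the write goes through Hadoop's s3a connector (this is an assumption; Glue also ships its own S3 client): s3a's output stream buffers each multipart block on the worker's local disk by default (fs.s3a.fast.upload.buffer=disk, staged under fs.s3a.buffer.dir) before uploading it, so a large decompressed stream can pile up staged blocks and exhaust the small local volume even though the copy itself is streaming. A sketch that switches the upload buffer to off-heap memory and caps how many blocks a stream can hold at once:

    import org.apache.hadoop.conf.Configuration;

    public class S3aBufferConfig {
        public static Configuration inMemoryUploadConf() {
            Configuration conf = new Configuration();
            // Buffer multipart blocks in off-heap memory instead of local disk.
            // (The default "disk" stages each block under fs.s3a.buffer.dir.)
            conf.set("fs.s3a.fast.upload.buffer", "bytebuffer");
            // Memory ceiling per stream is roughly multipart.size * active.blocks.
            conf.set("fs.s3a.multipart.size", "64M");
            conf.setInt("fs.s3a.fast.upload.active.blocks", 4);
            return conf;
        }
    }

These are standard s3a properties, but whether they apply depends on which S3 connector your Glue version actually uses; if it is Glue's native connector, the equivalent knobs live in the job configuration instead.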