How to overwrite existing files using the hadoop fs -copyToLocal command

Is there any way to overwrite existing local files while copying from HDFS using:

hadoop fs -copyToLocal <HDFS PATH> <local path>

hadoop fs -copyFromLocal -f $LOCAL_MOUNT_SRC_PATH/yourfilename.txt your_hdfs_file-path

So the -f option does the trick for you.

It works for -copyToLocal as well.
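
For example, assuming a file at /user/hadoop/data/yourfilename.txt in HDFS and /tmp/local-dir as the local target (both paths are placeholders), this overwrites any existing local copy:

hadoop fs -copyToLocal -f /user/hadoop/data/yourfilename.txt /tmp/local-dir/

Note that -f for -copyToLocal/-get is only available in more recent Hadoop releases; run hadoop fs -help copyToLocal on your cluster to check whether your version supports it.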


Alternatively, you can delete the target first and then copy.

hadoop fs -rmr <path> removes everything under the given path in HDFS, including the path itself (-rmr is deprecated in newer releases in favour of hadoop fs -rm -r <path>).

rm -rf <path> removes the path in the local file system.

Before deleting a directory, make sure it contains no other files you still need.
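
As a minimal sketch of the delete-then-copy approach, again with placeholder paths:

# remove the stale local copy, then pull the file down from HDFS
rm -rf /tmp/local-dir/yourfilename.txt
hadoop fs -copyToLocal /user/hadoop/data/yourfilename.txt /tmp/local-dir/

and in the other direction, replacing a file in HDFS before copying up:

# remove the existing HDFS file, then push the local file up
hadoop fs -rm -r /user/hadoop/data/yourfilename.txt
hadoop fs -copyFromLocal $LOCAL_MOUNT_SRC_PATH/yourfilename.txt /user/hadoop/data/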