How to create large files with unique content in Windows?

I need to create a large file without repeated content, so that even after zipping, its size is not reduced.


Solution 1:

On Windows, you can create a file filled with random, effectively unique content using PowerShell. No third-party software needed. No admin rights needed.

Unlike fsutil, the output will not consist of zeros or spaces. Each time you run the command, the output will be different. On compression, the file will retain roughly its original size.

The output will look something like this:

䫲藌㦖鐆쩘ꌓ⮅픻씙방醒ম擾酰₵廙⊌弓똌硣浍鐟ઘ⃭佐怱쎜鼋ꄻ윤訟ᦟ맂㱆�㔭ර槗旌閣㏏⯉來殰鲑奣鑍翴㑈Ꮪ嚠昲︇퇇뻕耣珁犒晥䒋酉懒䊿䱵漝벻玓⃸啈�펕戶臓헬珎宇ꄌꖓ萿 etc....

The command

$out = new-object byte[] 100000; (new-object Random).NextBytes($out); [IO.File]::WriteAllBytes('C:\100K.txt', $out)

Replace 100000 with whatever size you want, in bytes. You can use an online converter if you need help converting a size to bytes.
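If it helps, common sizes can also be worked out with plain integer arithmetic instead of a converter (shown here in shell; the numbers are the same anywhere):

```shell
# Sizes in bytes, computed with integer arithmetic
echo $((100 * 1000))          # 100 KB = 100000 bytes
echo $((1024 * 1024))         # 1 MiB = 1048576 bytes
echo $((1024 * 1024 * 1024))  # 1 GiB = 1073741824 bytes
```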

Replace C:\100K.txt with the file path you want. If you provide a name without a path, the file is saved relative to the process's current directory (often your user profile folder, depending on how PowerShell was started), because [IO.File] resolves relative paths against the .NET current directory rather than the prompt's location. The file extension has no impact on the process.

Breakdown

This is a series of three commands. The first creates a zero-filled byte array called $out, of the size you specify. The second fills the array with random bytes. The third writes the array to a file.

Here's the Microsoft documentation for New-Object.

This appears to be supported at least as far back as PowerShell 3.0 (PowerShell is at version 7.x as of this writing).

Taken from: https://www.digitalcitizen.life/3-ways-create-random-dummy-files-windows-given-size

Limit

I can output files up to 2 GB. Anything larger hits a limit, because .NET array lengths are Int32 values, whose maximum is 2,147,483,647 (about 2 GB). For example, asking for 3 GB gives:

"Cannot convert value "3000000000" to type "System.Int32". Error: "Value was either too large or too small for an Int32.""

This answer shows a way to write large files in chunks: wrap a do..while loop around the commands that fill the array and write to the file, and make the writes append to the file.
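The same chunked-append idea can be sketched with dd (covered in Solution 2 below): append fixed-size random chunks in a loop until the target size is reached, so no single write has to hold the whole file in memory. The file name and chunk count here are illustrative:

```shell
# Illustrative: build a large random file by appending 1 MiB chunks.
# Scale target_chunks up (e.g. 3072) for a multi-GB file.
rm -f big_random.file
target_chunks=3
i=0
while [ "$i" -lt "$target_chunks" ]; do
    dd if=/dev/urandom bs=1M count=1 >> big_random.file 2>/dev/null
    i=$((i + 1))
done
```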

Regardless of limits on array-size, I think you'll want to consider your available RAM.

Solution 2:

On Linux the easiest way to do this is the dd command. There is a Windows version at http://www.chrysocome.net/dd

To create a random file with a size of about 1 GB you can run the command

dd if=/dev/random of=random.file bs=1M count=1000

This means: use a block size of 1 MB and read/write 1000 blocks.

By the way: on Linux you should use /dev/urandom (it doesn't block waiting for entropy), but this Windows version only provides /dev/random.
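To confirm that the result resists compression, you can gzip the file and compare sizes; random data typically compresses to about the same size or slightly larger. A quick Linux check (file names here are illustrative):

```shell
# Create 1 MiB of random data, then verify gzip cannot shrink it
dd if=/dev/urandom of=random.file bs=1M count=1 2>/dev/null
gzip -kf random.file          # keeps random.file, writes random.file.gz
orig=$(stat -c%s random.file)
comp=$(stat -c%s random.file.gz)
echo "original: $orig bytes, compressed: $comp bytes"
```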