numpy.fromfile seems to be unable to read large files
I wanted to write a very simple Python helper tool for my project that reads binary data from an ECG record. I found somewhere that numpy.fromfile is the most appropriate tool for this, so I wrote:
#!/usr/bin/env python3
import sys
import numpy as np
arrayOfNums = np.fromfile(sys.argv[1], 'short')   # read the whole file as 16-bit ("short") integers
print("Converting " + sys.argv[1] + "...")
conversionOutput = open("output", "x")            # "x" mode fails if "output" already exists
conversionOutput.write(np.array2string(arrayOfNums, separator=' '))
conversionOutput.close()
print("Conversion done.")
I did this to convert the data, which consists of unseparated 2-byte records. The input file is somewhat large for a simple text file (over 7 MB), but I don't think it is large enough to cause numpy any trouble.
The output I got in the file: [-32243 -32141 -32666 ... -32580 -32635 -32690]
Why the dots in the middle? It seems to convert the data fine, but it omits almost everything it is supposed to save. Any help would be appreciated.
Solution 1:
NumPy reads your file correctly. To keep the display short, it replaces the middle of a large array with dots (an ellipsis):
import numpy as np
a = np.random.random(10000)   # an example array of 10000 random values
Output:
>>> a
array([0.20902653, 0.80097215, 0.06909818, ..., 0.5963183 , 0.94024005,
0.31870234])
>>> a.shape
(10000,)
a contains 10000 values, not just the 6 that are displayed.
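Since the question's script already goes through np.array2string, the truncation can also be disabled in that call by passing a threshold at least as large as the array. A minimal sketch, assuming a placeholder input path record.bin and the same short (int16) layout as in the question:
import numpy as np

arrayOfNums = np.fromfile("record.bin", dtype="short")   # "record.bin" is a placeholder path

# A threshold >= the number of elements disables the "..." summarization
fullText = np.array2string(arrayOfNums, separator=' ', threshold=arrayOfNums.size)

with open("output", "x") as conversionOutput:
    conversionOutput.write(fullText)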
Update
To display the full output:
import sys
np.set_printoptions(threshold=sys.maxsize)   # print arrays in full, however large
print(a)
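As a side note, if the goal is only to dump every value to a text file (as in the question), np.savetxt writes the whole array without involving the print options at all. A minimal sketch, assuming a 1-D array of short (int16) values, a placeholder input path record.bin, and the hypothetical output name output.txt:
import numpy as np

arrayOfNums = np.fromfile("record.bin", dtype="short")   # placeholder input path

# fmt='%d' writes plain integers; newline=' ' separates them with spaces on one line
np.savetxt("output.txt", arrayOfNums, fmt='%d', newline=' ')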