How to build a fast local area network
I work in film production and need VERY fast access to very large raw footage files. I am presently using eSATA 6Gbps docks for internal hard drives, which run as fast as the drive itself can go.
If I were to use a server and connect to my workstations via network, what (relatively affordable) technology would let me get anywhere near, or exceed, that drive speed?
Here in early 2017, judging from information readily available online, it looks like the fastest SATA HDDs max out at around 220 MegaBytes/sec, which is 1.76 Gigabits/sec.
So if you're just trying to beat the speed of a single drive, and you're limited to HDDs for cost-per-terabyte reasons given your huge video files, then 10 Gigabit Ethernet is plenty.
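If you want to sanity-check that math yourself, the conversion is trivial; here's a quick Python sketch using the figures above (the 220 MB/s number is just the estimate quoted, not a measured value):

```python
# Convert a drive's sequential throughput (MB/s) to line-rate Gbit/s
# and compare it against common Ethernet speeds.
def to_gbit(mb_per_sec):
    return mb_per_sec * 8 / 1000  # 8 bits per byte, 1000 Mbit per Gbit

hdd = 220  # MB/s, the fast-SATA-HDD estimate quoted above
print(f"{hdd} MB/s = {to_gbit(hdd):.2f} Gbit/s")  # -> 1.76 Gbit/s
for link_gbit in (1, 10):
    verdict = "enough" if link_gbit > to_gbit(hdd) else "too slow"
    print(f"{link_gbit} Gigabit Ethernet: {verdict}")
```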
As an aside, note that Thunderbolt networking also runs at 10 Gigabit/sec, so if you already have Thunderbolt ports, you can experiment with that. It could conceivably beat your 6 Gigabit eSATA ports, although I'm not sure about that: eSATA is a storage-specific protocol, whereas doing storage over a network protocol carries more overhead. Also note that Thunderbolt is a desktop bus; it only reaches a few meters, not the 100m that 10 Gigabit Ethernet can handle. So while Thunderbolt may be interesting for experimenting and prototyping while you weigh your options, it's probably not the right long-term solution unless you want to keep all your workstations and disks connected back-to-back around one large table.
So that was for single HDDs. But if you RAID drives together so that every read and write is spread across multiple spindles, you can do much better than single-drive performance. And depending on your budget, you could put PCIe/M.2 NVMe SSDs into a PC acting as a server/NAS and get blazing-fast storage performance: around 3.4 GigaBytes/sec (about 27 Gigabits/sec) per drive.
In that case you might want something faster than 10 Gigabit Ethernet, but glancing around online, prices jump quite dramatically beyond 10GbE. One option is link aggregation across multiple 10 Gigabit links, though note that with standard link aggregation a single transfer usually stays on one link, so it helps most when several workstations hit the server at once. I've also seen anecdotes online that used network gear, such as used 40Gbps InfiniBand equipment, can be had dirt cheap on eBay, if you don't mind the hassles that come with buying used gear there.
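To get a feel for when 10GbE stops being enough, here's a back-of-the-envelope sketch; it assumes ideal linear RAID0 scaling (real arrays only approach that) and the per-drive figures above:

```python
import math

def to_gbit(mb_per_sec):
    return mb_per_sec * 8 / 1000

# Idealized striped-array throughput: assumes reads/writes scale
# linearly with drive count, which real arrays only approach.
setups = [
    ("4x 220 MB/s SATA HDDs, striped", 220, 4),
    ("2x 3400 MB/s NVMe SSDs, striped", 3400, 2),
]
for label, per_drive, n in setups:
    total = to_gbit(per_drive) * n
    links = math.ceil(total / 10)  # 10GbE links needed at line rate
    print(f"{label}: ~{total:.1f} Gbit/s -> {links} x 10GbE link(s)")
```

The striped HDDs (~7 Gbit/s) still fit inside one 10GbE link; the NVMe pair is where you'd start shopping for aggregated links or faster gear.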
If you stick to SATA disks, implementing 10Gb Ethernet and building a reasonably sized RAID10 on the server will net you a noticeable performance increase over a single SATA disk. This is a worthy investment because you can share the server among as many workstations as you need, and expand later by adding switches. You'll need to run Cat-6 Ethernet cable, as Cat-5e won't cut it; don't forget to add this expense to your calculations. You could also add SSDs as a cache to speed the system up even further; since you're working with video footage, I assume you need vast amounts of storage space, which would be extremely expensive to build purely with SSDs.
You could buy a pre-made rackmount server from Dell or HP and use its hardware RAID card, or, if you're more of a hardware person, buy a cheaper chassis from Supermicro and build the storage machine yourself using software RAID on Windows or Linux. Hardware RAID is often faster when a RAID1 is involved: software must write to each disk in turn and wait for each write to complete before moving on to the next operation, whereas a RAID card can generally write to both disks in parallel and cache the write operation, returning control to the OS immediately. Do note, however, that although a RAID0 would be even faster, it has no redundancy, and a single drive failure causes complete data loss; never use RAID0 for data you want to keep. I recommend contacting Dell, HP, or another big name to see if they can help you spec out a system to meet your needs.
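To put a rough number on that RAID0 warning, here's a crude annual-risk estimate; the 3% per-drive failure rate is purely illustrative, and the model assumes independent failures (and, for RAID10, ignores rebuild windows):

```python
# Rough annual data-loss probability for RAID0 vs RAID10.
afr = 0.03  # assumed annual failure rate per drive (illustrative only)

def raid0_loss(n):
    # RAID0: any single drive failure destroys the whole array.
    return 1 - (1 - afr) ** n

def raid10_loss(pairs):
    # RAID10: data is lost only if both drives of a mirrored pair
    # fail in the same year (crude; ignores rebuild-window effects).
    return 1 - (1 - afr ** 2) ** pairs

for n in (2, 4, 8):
    print(f"{n} drives: RAID0 {raid0_loss(n):.1%} vs "
          f"RAID10 {raid10_loss(n // 2):.2%} annual loss risk")
```

Even with generous assumptions, RAID0's loss risk grows with every drive you add, while RAID10's stays tiny.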
At the high end of the scale you have Storage Area Networks (SANs), which are designed to serve vast numbers of separate clients in parallel; a side benefit is very high throughput even for a small number of connected machines, but a SAN is likely overkill for your needs, and very expensive. At the low end, you have the Network Attached Storage (NAS) devices others have mentioned, but although much simpler than a full server, I don't recommend them: a NAS is frequently a black box, designed to be plug-and-play for most users, and as a result you have little control over the OS. I have just had to send back a small NAS I bought for a client's office because it became unstable after a day's use.
Another advantage of building a server is that all your footage is concentrated in one place, which makes regular backups practical and relatively easy. Never neglect your backup strategy; one day you'll need to depend on it!
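If the server ends up running Linux, even a simple scheduled mirror to a second box goes a long way; here's a minimal sketch (the host, paths, and log location are placeholders, and it's meant to be run nightly from cron or a systemd timer):

```python
#!/usr/bin/env python3
# Minimal nightly backup sketch: mirror the footage share to a
# backup host with rsync and keep a log of the run.
import datetime
import subprocess
import sys

SRC = "/srv/footage/"              # trailing slash: copy contents
DST = "backup-host:/backup/footage/"  # placeholder destination

log_path = f"/var/log/backup-{datetime.date.today()}.log"
result = subprocess.run(
    ["rsync", "-a", "--delete", SRC, DST],
    capture_output=True, text=True)
with open(log_path, "w") as f:
    f.write(result.stdout + result.stderr)
sys.exit(result.returncode)  # nonzero exit lets cron flag failures
```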
Nothing beats 10GbE's flexibility and ease of configuration, but SAS is surprisingly networkable all on its own:
For small numbers of workstations (n < 8) where volumes do not need to be writable from multiple computers at once, SAS works great. With a Tyan JBOD ($1,500) and an LSI HBA ($400), we get 3,400 MB/s (27 Gbps) sustained transfers to SSD. The JBOD has an internal switch with three uplinks to HBAs, but SAS switches are available for higher port counts.
Internally, we use this solution with clustered Windows Server running Storage Spaces, distributed to clients over 10GbE.
First up, an M.2 SSD, which, given the right motherboard, can reach 2-4 GB/s, possibly higher (for example: https://www.newegg.com/Product/Product.aspx?Item=N82E16820147593&cm_re=samsung_m.2--20-147-593--Product). RAID a few of these together, and your speeds are even higher.
Teaming multiple 10Gb NICs could get you close to native drive speeds.
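As a sanity check on the teaming idea, here's some napkin math; the 3.5 GB/s per drive is an assumption within the 2-4 GB/s range above, and it assumes ideal scaling (real teams rarely sum perfectly, and a single transfer usually sticks to one NIC):

```python
import math

# Napkin math: teamed 10Gb NICs vs a striped pair of M.2 drives.
per_drive_gbit = 3.5 * 8       # ~28 Gbit/s per M.2 drive (assumed)
striped = per_drive_gbit * 2   # two drives, ideal RAID0 scaling
nics = math.ceil(striped / 10) # 10 Gbit/s per teamed NIC
print(f"Striped M.2 pair: ~{striped:.0f} Gbit/s, "
      f"~{nics} teamed 10Gb NICs to match")
```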
Anything faster than 10Gb costs a lot more money.
Here's a 40 Gigabit NIC, for example: https://www.serversupply.com/products/part_search/pid_lookup.asp?pid=263133&gclid=Cj0KEQiA9P7FBRCtoO33_LGUtPQBEiQAU_tBgIU2jZrJKf0kXFy96roOllcRkp7j-VoubG_n7xb0_pgaAnaT8P8HAQ