Saving any file to the database, just convert it to a byte array?

Solution 1:

Since it's not mentioned which database you mean, I'm assuming SQL Server. The solution below works for both SQL Server 2005 and 2008.

You have to create a table with a VARBINARY(MAX) column. In my example I've created a table called Raporty in which the RaportPlik column is the VARBINARY(MAX) column that holds the file.
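
For reference, here is a minimal sketch of creating such a table from C#; the RaportID column and its type are assumptions added to match the ID-based lookups further down, and only the VARBINARY(MAX) column is strictly required:

// Minimal sketch: creates the table used in the examples below.
// RaportID (INT IDENTITY) is an assumption; only RaportPlik VARBINARY(MAX) is required.
using (var varConnection = Locale.sqlConnectOneTime(Locale.sqlDataConnectionDetails))
using (var sqlCreate = new SqlCommand(@"
    CREATE TABLE [dbo].[Raporty] (
        [RaportID]   INT IDENTITY(1,1) PRIMARY KEY,
        [RaportPlik] VARBINARY(MAX) NOT NULL
    )", varConnection)) {
    sqlCreate.ExecuteNonQuery();
}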

Method to put a file into the database from disk:

public static void databaseFilePut(string varFilePath) {
    byte[] file;
    using (var stream = new FileStream(varFilePath, FileMode.Open, FileAccess.Read)) {
        using (var reader = new BinaryReader(stream)) {
            file = reader.ReadBytes((int) stream.Length);       
        }          
    }
    using (var varConnection = Locale.sqlConnectOneTime(Locale.sqlDataConnectionDetails))
    using (var sqlWrite = new SqlCommand("INSERT INTO Raporty (RaportPlik) Values(@File)", varConnection)) {
        sqlWrite.Parameters.Add("@File", SqlDbType.VarBinary, file.Length).Value = file;
        sqlWrite.ExecuteNonQuery();
    }
}

This method gets a file from the database and saves it to disk:

public static void databaseFileRead(string varID, string varPathToNewLocation) {
    using (var varConnection = Locale.sqlConnectOneTime(Locale.sqlDataConnectionDetails))
    using (var sqlQuery = new SqlCommand(@"SELECT [RaportPlik] FROM [dbo].[Raporty] WHERE [RaportID] = @varID", varConnection)) {
        sqlQuery.Parameters.AddWithValue("@varID", varID);
        using (var sqlQueryResult = sqlQuery.ExecuteReader())
            if (sqlQueryResult.Read()) { // ExecuteReader never returns null, so check for a row instead
                // The first GetBytes call with a null buffer returns the total length of the column
                var blob = new Byte[sqlQueryResult.GetBytes(0, 0, null, 0, int.MaxValue)];
                sqlQueryResult.GetBytes(0, 0, blob, 0, blob.Length);
                using (var fs = new FileStream(varPathToNewLocation, FileMode.Create, FileAccess.Write))
                    fs.Write(blob, 0, blob.Length);
            }
    }
}

This method gets a file from the database and returns it as a MemoryStream:

public static MemoryStream databaseFileRead(string varID) {
    MemoryStream memoryStream = new MemoryStream();
    using (var varConnection = Locale.sqlConnectOneTime(Locale.sqlDataConnectionDetails))
    using (var sqlQuery = new SqlCommand(@"SELECT [RaportPlik] FROM [dbo].[Raporty] WHERE [RaportID] = @varID", varConnection)) {
        sqlQuery.Parameters.AddWithValue("@varID", varID);
        using (var sqlQueryResult = sqlQuery.ExecuteReader())
            if (sqlQueryResult.Read()) { // ExecuteReader never returns null, so check for a row instead
                var blob = new Byte[sqlQueryResult.GetBytes(0, 0, null, 0, int.MaxValue)];
                sqlQueryResult.GetBytes(0, 0, blob, 0, blob.Length);
                memoryStream.Write(blob, 0, blob.Length);
            }
    }
    return memoryStream;
}

This method puts a MemoryStream into the database and returns the ID of the inserted row:

public static int databaseFilePut(MemoryStream fileToPut) {
    int varID = 0;
    byte[] file = fileToPut.ToArray();
    const string preparedCommand = @"
        INSERT INTO [dbo].[Raporty] ([RaportPlik])
        VALUES (@File)

        SELECT [RaportID] FROM [dbo].[Raporty]
        WHERE [RaportID] = SCOPE_IDENTITY()";
    using (var varConnection = Locale.sqlConnectOneTime(Locale.sqlDataConnectionDetails))
    using (var sqlWrite = new SqlCommand(preparedCommand, varConnection)) {
        sqlWrite.Parameters.Add("@File", SqlDbType.VarBinary, file.Length).Value = file;

        using (var sqlWriteQuery = sqlWrite.ExecuteReader())
            while (sqlWriteQuery.Read()) {
                varID = sqlWriteQuery["RaportID"] is int ? (int) sqlWriteQuery["RaportID"] : 0;
            }
    }
    return varID;
}
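
For completeness, a quick usage sketch of the methods above (the file paths are placeholders):

// Usage sketch for the methods above (file paths are placeholders).
databaseFilePut(@"C:\temp\raport.pdf");                            // store a file straight from disk

int raportId;
using (var ms = new MemoryStream(File.ReadAllBytes(@"C:\temp\raport.pdf")))
    raportId = databaseFilePut(ms);                                // store from memory and get the new RaportID

databaseFileRead(raportId.ToString(), @"C:\temp\raport_copy.pdf"); // write the stored file back to disk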

Solution 2:

While you can store files in this fashion, it has significant tradeoffs:

  • Most DBs are not optimized for giant quantities of binary data, and query performance often degrades dramatically as the table bloats, even with indexes. (SQL Server 2008, with its FILESTREAM attribute for varbinary(max) columns, is the exception to the rule.)
  • DB backup/replication becomes extremely slow.
  • It's a lot easier to handle a corrupted drive with 2 million images -- just replace the disk on the RAID -- than a DB table that becomes corrupted.
  • If you accidentally delete a dozen images on a filesystem, your operations guys can replace them pretty easily from a backup, and since the table index is tiny by comparison, it can be restored quickly. If you accidentally delete a dozen images in a giant database table, you have a long and painful wait to restore the DB from backup, paralyzing your entire system in the meantime.

These are just some of the drawbacks I can come up with off the top of my head. For tiny projects it may be worth storing files in this fashion, but if you're designing enterprise-grade software I would strongly recommend against it.
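
If you take that advice, the usual alternative is to keep the file on the filesystem and store only its path and metadata in the database. A minimal sketch, with hypothetical table, column, and share names:

// Sketch of the filesystem alternative: the file lives on disk,
// the database only stores its location and metadata.
// Table name, column names, and the storage root are illustrative assumptions.
public static void FileSystemPut(string sourcePath, SqlConnection connection) {
    string storageRoot = @"\\fileserver\attachments";                    // example file share
    string storedPath = Path.Combine(storageRoot,
        Guid.NewGuid().ToString() + Path.GetExtension(sourcePath));

    File.Copy(sourcePath, storedPath);                                   // the bytes go to disk...

    using (var cmd = new SqlCommand(
        "INSERT INTO Files (FileName, FilePath) VALUES (@Name, @Path)", connection)) {
        cmd.Parameters.AddWithValue("@Name", Path.GetFileName(sourcePath));
        cmd.Parameters.AddWithValue("@Path", storedPath);                // ...only the path goes to the DB
        cmd.ExecuteNonQuery();
    }
}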

Solution 3:

It really depends on the database server.

For example, SQL Server 2008 supports FILESTREAM storage (the FILESTREAM attribute on a varbinary(max) column) for exactly this situation.

Other than that, if you have the file in a MemoryStream, its ToArray() method will give you a byte[] that can be used to populate a varbinary column.
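
For example, a minimal sketch (the table and column names are placeholders, and connection is assumed to be an open SqlConnection):

// Minimal sketch: MemoryStream -> byte[] -> varbinary(max) parameter.
// Table and column names are placeholders; 'connection' is an open SqlConnection.
byte[] bytes = memoryStream.ToArray();

using (var cmd = new SqlCommand(
    "INSERT INTO Documents (Contents) VALUES (@Contents)", connection)) {
    cmd.Parameters.Add("@Contents", SqlDbType.VarBinary, bytes.Length).Value = bytes;
    cmd.ExecuteNonQuery();
}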

Solution 4:

I'll describe how I've stored files in both SQL Server and Oracle. How you get the file's contents depends largely on where the file comes from in the first place, and how you store them depends on which database you are using. Below are two separate database examples, each with a different way of getting the file.

SQL Server

Short answer: I took a base64 string, converted it to a byte[], and stored it in a varbinary(max) field.

Long answer:

Say you're uploading via a website, so you're using an <input id="myFileControl" type="file" /> control, or React DropZone. To get the file, you're doing something like var myFile = document.getElementById("myFileControl").files[0]; or myFile = this.state.files[0];.

From there, I'd get the base64 string using code here: Convert input=file to byte array (use function UploadFile2).

Then I'd put that string, the file name (myFile.name), and the type (myFile.type) into a JSON object:

var myJSONObj = {
    file: base64string,
    name: myFile.name,
    type: myFile.type,
}

and post the file to an MVC server backend using XMLHttpRequest, specifying a Content-Type of application/json: xhr.send(JSON.stringify(myJSONObj));. You have to build a ViewModel to bind it to:

public class MyModel
{
    public string file { get; set; }
    public string name { get; set; }
    public string type { get; set; }
}

and specify [FromBody]MyModel myModelObj as the passed in parameter:

[System.Web.Http.HttpPost]  // required to spell it out like this if using ApiController, or it will default to System.Web.Mvc.HttpPost
public virtual ActionResult Post([FromBody]MyModel myModelObj)

Then you can add this into that function and save it using Entity Framework:

MY_ATTACHMENT_TABLE_MODEL tblAtchm = new MY_ATTACHMENT_TABLE_MODEL();
tblAtchm.Name = myModelObj.name;
tblAtchm.Type = myModelObj.type;
tblAtchm.File = System.Convert.FromBase64String(myModelObj.file);
EntityFrameworkContextName ef = new EntityFrameworkContextName();
ef.MY_ATTACHMENT_TABLE_MODEL.Add(tblAtchm);
ef.SaveChanges();

tblAtchm.File = System.Convert.FromBase64String(myModelObj.file); is the operative line.

You would need a model to represent the database table:

public class MY_ATTACHMENT_TABLE_MODEL 
{
    [Key]
    public byte[] File { get; set; }  // the varbinary(max) column holding the file contents
    public string Name { get; set; }
    public string Type { get; set; }
}

This will save the data into a varbinary(max) column as a byte[]. Name and Type were nvarchar(250) and nvarchar(10), respectively. You could also record the file's size by adding an int column to your table, a public int Size { get; set; } property to MY_ATTACHMENT_TABLE_MODEL, and the line tblAtchm.Size = System.Convert.FromBase64String(myModelObj.file).Length; above, as sketched below.
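
Spelled out, that optional Size addition looks roughly like this (a sketch; it assumes the matching int column has already been added to the table):

// Sketch of the optional Size addition; assumes an int Size column exists in the table.
public class MY_ATTACHMENT_TABLE_MODEL
{
    [Key]
    public byte[] File { get; set; }  // varbinary(max) column
    public string Name { get; set; }
    public string Type { get; set; }
    public int Size { get; set; }     // new int column for the file size
}

// When saving, decode the base64 string once and reuse it:
byte[] fileBytes = System.Convert.FromBase64String(myModelObj.file);
tblAtchm.File = fileBytes;
tblAtchm.Size = fileBytes.Length;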

Oracle

Short answer: Convert it to a byte[], assign it to an OracleParameter, add it to your OracleCommand, and update your table's BLOB field using a reference to the parameter's ParameterName value: :BlobParameter

Long answer: When I did this for Oracle, I was using an OpenFileDialog and I retrieved and sent the bytes/file information this way:

byte[] array;
OracleParameter param = new OracleParameter();
Microsoft.Win32.OpenFileDialog dlg = new Microsoft.Win32.OpenFileDialog();
dlg.Filter = "Image Files (*.jpg, *.jpeg, *.jpe)|*.jpg;*.jpeg;*.jpe|Document Files (*.doc, *.docx, *.pdf)|*.doc;*.docx;*.pdf";
if (dlg.ShowDialog().Value == true)
{
    string fileName = dlg.FileName;
    using (FileStream fs = File.OpenRead(fileName))
    {
        using (BinaryReader binReader = new BinaryReader(fs))
        {
            array = binReader.ReadBytes((int)fs.Length);  // reads the whole file into the byte array
        }

        // Create an OracleParameter to transmit the Blob
        param.OracleDbType = OracleDbType.Blob;
        param.ParameterName = "BlobParameter";
        param.Value = array;  // <-- file bytes are here
    }
    fileName = fileName.Split('\\')[fileName.Split('\\').Length-1]; // gets last segment of the whole path to just get the name

    string fileType = fileName.Split('.')[fileName.Split('.').Length - 1]; // use the last segment as the extension
    if (fileType == "doc" || fileType == "docx" || fileType == "pdf")
        fileType = "application/" + fileType;
    else
        fileType = "image/" + fileType;

    // SQL string containing reference to BlobParameter named above
    string sql = String.Format("INSERT INTO YOUR_TABLE (FILE_NAME, FILE_TYPE, FILE_SIZE, FILE_CONTENTS, LAST_MODIFIED) VALUES ('{0}','{1}',{2},:BlobParameter, SYSDATE)", fileName, fileType, array.Length);

    // Do Oracle Update
    RunCommand(sql, param);
}

And inside the Oracle update, done with ADO.NET:

public void RunCommand(string strSQL, OracleParameter param)
{
    try
    {
        string connString = GetConnString();
        using (OracleConnection oraConn = new OracleConnection(connString))
        using (OracleCommand oraCmd = new OracleCommand(strSQL, oraConn))
        {
            oraConn.Open();

            // Add your OracleParameter
            if (param != null)
                oraCmd.Parameters.Add(param);

            // Execute the command
            oraCmd.ExecuteNonQuery();

            // the using blocks close and dispose the connection and command
        }
    }
    catch (OracleException err)
    {
        // handle exception
    }
}

private string GetConnString()
{
    string host = System.Configuration.ConfigurationManager.AppSettings["host"].ToString();
    string port = System.Configuration.ConfigurationManager.AppSettings["port"].ToString();
    string serviceName = System.Configuration.ConfigurationManager.AppSettings["svcName"].ToString();
    string schemaName = System.Configuration.ConfigurationManager.AppSettings["schemaName"].ToString();
    string pword = System.Configuration.ConfigurationManager.AppSettings["pword"].ToString(); // hopefully encrypted

    if (String.IsNullOrEmpty(host) || String.IsNullOrEmpty(port) || String.IsNullOrEmpty(serviceName) || String.IsNullOrEmpty(schemaName) || String.IsNullOrEmpty(pword))
    {
        return "Missing Param";
    }
    else
    {
        pword = decodePassword(pword);  // decrypt here
        return String.Format(
           "Data Source=(DESCRIPTION =(ADDRESS = ( PROTOCOL = TCP)(HOST = {2})(PORT = {3}))(CONNECT_DATA =(SID = {4})));User Id={0};Password={1};",
           schemaName,
           pword,
           host,
           port,
           serviceName
           );
    }
}

And the datatype for the FILE_CONTENTS column was BLOB, the FILE_SIZE was NUMBER(10,0), LAST_MODIFIED was DATE, and the rest were NVARCHAR2(250).

Solution 5:

Yes, generally the best way to store a file in a database is to save the byte array in a BLOB column. You will probably also want a few extra columns for the file's metadata, such as its name, extension, and so on.

It is not always a good idea to store files in the database, though - for instance, the database will grow quickly if you store files in it. It all depends on your usage scenario.
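
A minimal sketch of that approach, using SQL Server syntax with hypothetical table and column names:

// Minimal sketch: the file's bytes go into a varbinary(max)/BLOB column,
// with extra columns for metadata. Table and column names are hypothetical;
// 'connection' is an open SqlConnection.
public static void SaveFile(string path, SqlConnection connection) {
    byte[] contents = File.ReadAllBytes(path);      // the whole file as a byte array

    using (var cmd = new SqlCommand(
        "INSERT INTO StoredFiles (FileName, Extension, Contents) VALUES (@Name, @Ext, @Contents)",
        connection)) {
        cmd.Parameters.AddWithValue("@Name", Path.GetFileName(path));
        cmd.Parameters.AddWithValue("@Ext", Path.GetExtension(path));
        cmd.Parameters.Add("@Contents", SqlDbType.VarBinary, contents.Length).Value = contents;
        cmd.ExecuteNonQuery();
    }
}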