Errors when using OFF_HEAP Storage with Spark 1.4.0 and Tachyon 0.6.4

There seems to be a related bug report: https://issues.apache.org/jira/browse/SPARK-10314

Since there is already a pull request attached to that issue, a fix may be available fairly soon.

According to this thread, https://groups.google.com/forum/#!topic/tachyon-users/xb8zwqIjIa4, Spark writes to Tachyon in TRY_CACHE mode, so the data is silently lost once it gets evicted from the cache.
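
For context, here is a minimal sketch of how that code path gets exercised. It assumes Spark 1.4 with Tachyon configured via the `spark.tachyonStore.url` key, a Tachyon master at `localhost:19998`, and the Tachyon client on the classpath; adjust the host/port and key names to your setup. The file-based write at the end is only one possible mitigation, not a confirmed fix:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.storage.StorageLevel

object OffHeapTachyonExample {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("OffHeapTachyonExample")
      // Tachyon master URL (hypothetical host/port -- adjust for your cluster).
      .set("spark.tachyonStore.url", "tachyon://localhost:19998")

    val sc = new SparkContext(conf)
    val rdd = sc.parallelize(1 to 1000000)

    // OFF_HEAP persists blocks to Tachyon. Because Spark writes them in
    // TRY_CACHE mode, blocks evicted from Tachyon's cache are dropped and
    // later reads of those partitions can fail.
    rdd.persist(StorageLevel.OFF_HEAP)
    println(rdd.count())

    // Possible mitigation until a fix lands: write the data to Tachyon as
    // ordinary files instead of relying on OFF_HEAP caching (requires the
    // Tachyon Hadoop filesystem client on the classpath).
    rdd.saveAsTextFile("tachyon://localhost:19998/data/numbers")

    sc.stop()
  }
}
```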