Entity Framework: There is already an open DataReader associated with this Command
I am using Entity Framework, and occasionally I get this error.
EntityCommandExecutionException
{"There is already an open DataReader associated with this Command which must be closed first."}
at System.Data.EntityClient.EntityCommandDefinition.ExecuteStoreCommands...
Even though I am not doing any manual connection management, this error happens intermittently.
Code that triggers the error (shortened for readability):
if (critera.FromDate > x)
{
    t = _tEntitites.T.Where(predicate).ToList();
}
else
{
    t = new List<T>(_tEntitites.TA.Where(historicPredicate).ToList());
}
I am using the Dispose pattern in order to open a new connection every time:
using (_tEntitites = new TEntities(GetEntityConnection()))
{
    if (critera.FromDate > x)
    {
        t = _tEntitites.T.Where(predicate).ToList();
    }
    else
    {
        t = new List<T>(_tEntitites.TA.Where(historicPredicate).ToList());
    }
}
This is still problematic. Why wouldn't EF reuse the connection if it is already open?
Solution 1:
It is not about closing the connection; EF manages the connection correctly. My understanding of this problem is that multiple data-retrieval commands are executed on a single connection (or a single command with multiple selects), and the next DataReader is opened before the first one has finished reading. The only way to avoid the exception is to allow multiple nested DataReaders, i.e. to turn on MultipleActiveResultSets (MARS). Another scenario where this always happens is when you iterate through the result of a query (an IQueryable) and trigger lazy loading for a loaded entity inside the iteration.
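For example, here is a minimal sketch of enabling MARS on a SQL Server connection string (the server and database names are placeholders; if you use an EDMX-style entity connection string, the flag belongs in its inner provider connection string):

using System.Data.SqlClient;

var builder = new SqlConnectionStringBuilder
{
    DataSource = ".",                 // placeholder server
    InitialCatalog = "MyDatabase",    // placeholder database
    IntegratedSecurity = true,
    MultipleActiveResultSets = true   // allow nested DataReaders
};

// builder.ConnectionString now contains "MultipleActiveResultSets=True";
// the same key/value pair can also be added directly to a connection
// string in App.config or Web.config.
var connection = new SqlConnection(builder.ConnectionString);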
Solution 2:
As an alternative to using MARS (MultipleActiveResultSets), you can write your code so that you don't open multiple result sets. Retrieve the data into memory instead; that way the reader does not stay open. The error is often caused by iterating through one result set while trying to open another.
Sample Code:
public class MyContext : DbContext
{
    public DbSet<Blog> Blogs { get; set; }
    public DbSet<Post> Posts { get; set; }
}

public class Blog
{
    public int BlogID { get; set; }
    public virtual ICollection<Post> Posts { get; set; }
}

public class Post
{
    public int PostID { get; set; }
    public virtual Blog Blog { get; set; }
    public string Text { get; set; }
}
Let's say you are doing a lookup in a database containing these entities:
var context = new MyContext();
// here we have one result set
var largeBlogs = context.Blogs.Where(b => b.Posts.Count > 5);
foreach (var blog in largeBlogs) // we use the result set here
{
    // here we try to open another result set while we are still reading the one above
    var postsWithImportantText = blog.Posts.Where(p => p.Text.Contains("Important Text"));
}
A simple solution is to add .ToList(), like this:
var largeBlogs = context.Blogs.Where(b => b.Posts.Count > 5).ToList();
This forces Entity Framework to load the list into memory, so when we iterate through it in the foreach loop it no longer uses the data reader; the data is already in memory.
I realize this might not be desired if, for example, you want to lazy-load some properties. This is mostly an example that hopefully explains how and why you might get this problem, so you can make decisions accordingly.
Solution 3:
There's another way to overcome this problem. Whether it's a better way depends on your situation.
The problem results from lazy loading, so one way to avoid it is to eliminate the lazy loads through the use of Include:
var results = myContext.Customers
    .Include(x => x.Orders)
    .Include(x => x.Addresses)
    .Include(x => x.PaymentMethods);
If you use the appropriate Includes, you can avoid enabling MARS. But if you miss one, you'll get the error (as sketched below), so enabling MARS is probably the easiest way to fix it.
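For instance, here is a sketch of how a missed Include brings the error back. It assumes the Customers model above, with lazy loading enabled and MARS turned off:

var results = myContext.Customers
    .Include(x => x.Orders); // PaymentMethods is NOT included

foreach (var customer in results) // a DataReader is open for this query
{
    // Lazy loading PaymentMethods tries to open a second DataReader
    // on the same connection and throws unless MARS is enabled.
    var count = customer.PaymentMethods.Count;
}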
Solution 4:
You get this error when the collection you are trying to iterate over is lazily loaded (an IQueryable).
foreach (var user in _dbContext.Users)
{
    // the reader for the Users query is still open here; executing
    // another query on the same connection will throw
}
Converting the IQueryable into another enumerable collection will solve the problem. Example:
_dbContext.Users.ToList()
Note: .ToList() materializes the full result set every time it is called, which can cause performance issues if you are dealing with large amounts of data.
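If you need to run further queries while iterating, a minimal sketch (reusing the _dbContext.Users set from above) is to materialize once and then loop over the in-memory list:

var users = _dbContext.Users.ToList(); // the DataReader is closed once this completes
foreach (var user in users)
{
    // it is now safe to execute other queries on _dbContext here,
    // because no DataReader is open on the connection
}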