Should I have a separate assembly for interfaces?
Solution 1:
The usual (expected?) practice is to place them in their own assembly, because a project consuming those interfaces then doesn't need a hard reference to their implementation. In theory, that means you can swap out the implementation with little or no pain.
That said, I can't remember when I last did this. To @David_001's point, it isn't necessarily "usual": we tend to keep our interfaces in-line with an implementation, and our most common use for the interfaces is testing.
I think there are different stances to take depending on what you are producing. I tend to produce LOB (line-of-business) applications that need to interoperate internally with other applications and teams, so any given app's public API has some stakeholders. This is still less extreme than producing a library or framework for many unknown clients, where the public API suddenly becomes much more important.
In a deployment scenario, if you changed the implementation you could in theory just deploy that single DLL - thus leaving, say, the UI and interface DLLs alone. If you compiled your interfaces and implementation together, you might then need to redeploy the UI DLL...
Another benefit is a clean segregation of your code - having an interfaces (or shared-library) DLL explicitly tells anyone on the development team where to place new types. I no longer count this as a benefit, though: we haven't had any issues without it, and the public contract is still easily found regardless of where the interfaces are placed.
I don't know if there are best practices for or against; the important thing, arguably, is that in code you are always consuming the interfaces and never letting anything leak into using the implementation.
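As a minimal sketch of what "consuming only the interface" looks like (the IOrderRepository/SqlOrderRepository names are invented for illustration):

```csharp
// Interfaces assembly (or namespace): the only thing consumers reference.
public interface IOrderRepository
{
    Order Find(int id);
}

// Implementation assembly: consumers never reference this directly.
public class SqlOrderRepository : IOrderRepository
{
    public Order Find(int id) { /* ... data access ... */ return new Order(); }
}

// Consumer: depends on the interface only, so the implementation
// can be swapped (or mocked in tests) without recompiling this code.
public class OrderService
{
    private readonly IOrderRepository _repository;

    public OrderService(IOrderRepository repository)
    {
        _repository = repository;
    }

    public Order GetOrder(int id) => _repository.Find(id);
}

public class Order { }
```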
Solution 2:
The answers so far seem to say that putting the interfaces in their own assembly is the "usual" practice. I don't agree with putting unrelated interfaces into one "shared" common assembly, so this would imply one interface assembly for each "implementation" assembly.
However, thinking about it further, I can't think of many real-world examples of this practice (e.g. do log4net or NUnit provide public interface assemblies so that consumers can decide on different implementations? If so, what other implementation of NUnit could I use?). After spending ages searching Google, I found a number of resources.
- Does having separate assemblies imply loose coupling? The following suggest not:
http://www.theserverside.net/tt/articles/showarticle.tss?id=ControllingDependencies
http://codebetter.com/blogs/jeremy.miller/archive/2008/09/30/separate-assemblies-loose-coupling.aspx
- The general consensus I could find from googling was that fewer assemblies is better, unless there's a really good reason to add new ones. See also:
http://www.cauldwell.net/patrick/blog/ThisIBelieveTheDeveloperEdition.aspx
As I am not producing public APIs, and I'm already putting interfaces into their own namespaces, it makes sense not to blindly create new assemblies. The benefits of fewer assemblies seem to outweigh those of adding more - benefits I'm unlikely ever to actually reap.
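For example (all names invented for illustration), keeping the contract in its own namespace inside the same assembly still gives a clearly discoverable public contract:

```csharp
// One assembly, e.g. MyCompany.Billing.dll.

namespace MyCompany.Billing.Abstractions
{
    // The contract lives in its own namespace...
    public interface IInvoiceCalculator
    {
        decimal Total(int invoiceId);
    }
}

namespace MyCompany.Billing
{
    using MyCompany.Billing.Abstractions;

    // ...while the implementation sits alongside it in the same DLL.
    public class InvoiceCalculator : IInvoiceCalculator
    {
        public decimal Total(int invoiceId) => 0m; // real calculation omitted
    }
}
```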
Solution 3:
The pattern I follow for what I call shared types (I too use DI) is to have a separate assembly which contains the following for application-level concepts (rather than common concepts, which go into common assemblies) - a sketch follows the list:
- Shared interfaces.
- DTOs.
- Exceptions.
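A minimal sketch of such a "SharedTypes" assembly (member names invented for illustration):

```csharp
using System;

// SharedTypes.dll - the only assembly client modules reference.

// Shared interface: the contract clients program against.
public interface ICustomerService
{
    CustomerDto GetCustomer(int id);
}

// DTO: a plain data carrier, no behaviour, no implementation dependencies.
public class CustomerDto
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// Exception: part of the public contract, so it lives here too.
public class CustomerNotFoundException : Exception
{
    public CustomerNotFoundException(int id)
        : base($"Customer {id} was not found.") { }
}
```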
In this way, dependencies between clients and core application libraries can be managed: clients cannot take a dependency on a concrete implementation, either directly or as an unintended consequence of adding a direct assembly reference and then reaching for any old public type.
I then have a runtime type design where I set up my DI container at application start, or at the start of a suite of unit tests. In this way there is a clear separation between implementations and how I vary them via DI. My client modules never reference the actual core libraries directly, only the "SharedTypes" library.
The key to my design is having a common runtime concept for clients (be it a WPF application or NUnit) that sets up the required dependencies, i.e. concrete implementations or some sort of mocks/stubs.
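As a rough sketch of that idea using classic Unity (ICustomerService and CustomerDto come from the SharedTypes sketch above; CustomerService and StubCustomerService are invented here - your real registrations will differ):

```csharp
using Microsoft.Practices.Unity;

// Invented "real" implementation (would live in a Core assembly).
public class CustomerService : ICustomerService
{
    public CustomerDto GetCustomer(int id) => new CustomerDto { Id = id };
}

// Invented test double (would live alongside the test suite).
public class StubCustomerService : ICustomerService
{
    public CustomerDto GetCustomer(int id) =>
        new CustomerDto { Id = id, Name = "Stub" };
}

public static class CompositionRoot
{
    // Application start: wire shared interfaces to the real implementations.
    public static IUnityContainer ForApplication()
    {
        var container = new UnityContainer();
        container.RegisterType<ICustomerService, CustomerService>(
            new ContainerControlledLifetimeManager());
        return container;
    }

    // Start of a unit-test suite: same interfaces, stub implementations.
    public static IUnityContainer ForTests()
    {
        var container = new UnityContainer();
        container.RegisterType<ICustomerService, StubCustomerService>();
        return container;
    }
}
```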
If the above shared types are not factored out, and clients instead reference the assembly with the concrete implementations, it is very easy for clients to use those implementations rather than the interfaces, in both obvious and non-obvious ways. You can gradually end up with over-coupling that is near impossible to undo without a great deal of effort and, more importantly, time.
Update
To clarify, here is an example of how the dependencies end up in the target application.
In my situation I have a WPF client application. I use Prism and Unity (for DI) where importantly, Prism is used for application composition.
With Prism, your application assembly is just a Shell; actual implementations of functionality reside in "Module" assemblies (you can have a separate assembly for each conceptual Module, but this is not a requirement - I have one Modules assembly at the moment). It is the responsibility of the Shell to load the Modules; the composition of these Modules is the application. The Modules use the SharedTypes assembly, but the Shell references the concrete assemblies. The runtime type design I discussed is responsible for initializing dependencies, and this is done in the Shell.
In this way, the module assemblies, which contain all the functionality, never depend on concrete implementations. They are loaded by the Shell, which sorts the dependencies out; the Shell references the concrete assemblies, and that is how those assemblies end up in the bin directory.
Dependency Sketch:
Shell.dll <-- Application
--ModuleA.dll
--ModuleB.dll
--SharedTypes.dll
--Core.dll
--Common.dll + Unity.dll <-- RuntimeDI
ModuleA.dll
--SharedTypes.dll
--Common.dll + Unity.dll <-- RuntimeDI
ModuleB.dll
--SharedTypes.dll
--Common.dll + Unity.dll <-- RuntimeDI
SharedTypes.dll
--...
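To illustrate the direction of the references (again with invented names): a module class depends only on types from SharedTypes.dll, while the Shell is the only place that knows the concrete type:

```csharp
// ModuleA.dll - references SharedTypes.dll only, never Core.dll.
public class CustomerViewModel
{
    private readonly ICustomerService _customers; // type from SharedTypes.dll

    public CustomerViewModel(ICustomerService customers)
    {
        _customers = customers;
    }
}

// Shell.dll - the one place that references Core.dll, mapping
// SharedTypes interfaces to Core implementations at startup:
//
//   container.RegisterType<ICustomerService, Core.CustomerService>();
```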
Solution 4:
I agree with the ticked answer. Good for you, David - in fact, I was relieved to see the answer; I thought I was going mad.
I see this "pens in a pen pot" pattern in enterprise C# freelance jobs all the time, where people follow the convention of the crowd, the team must conform, and not conforming is seen as making trouble.
The other craziness is the one-namespace-per-assembly nonsense, so you get a SomeBank.SomeApp.Interfaces namespace and everything is in it.
For me, it means types are scattered across namespaces, and assemblies containing a whole slew of stuff I don't care about have to be referenced all over the place.
As for interfaces, I don't even use interfaces in my private apps; DI works on types - concrete classes with virtuals, base classes, or interfaces. I choose accordingly and place types in DLLs according to what they do.
I have never had a problem with DI or swapping logic later.
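For example (a minimal sketch with invented names), a concrete class with virtual members can be injected and still swapped or mocked later, without an interface:

```csharp
// A concrete dependency - no interface. Virtual members keep it
// substitutable (mocking frameworks can override them too).
public class PriceCalculator
{
    public virtual decimal Price(int productId) => 9.99m;
}

// The consumer takes the concrete type; DI containers resolve
// concrete types just as happily as interfaces.
public class Checkout
{
    private readonly PriceCalculator _calculator;

    public Checkout(PriceCalculator calculator)
    {
        _calculator = calculator;
    }
}

// Swapping logic later: derive and override, then register the
// subclass with the container instead of the original.
public class DiscountedPriceCalculator : PriceCalculator
{
    public override decimal Price(int productId) => 4.99m;
}
```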
• .NET assemblies are a unit of security, API scope and deployment, and are independent of namespaces.
• If two assemblies depend on each other, then they cannot be deployed and versioned separately and should be merged.
• Having many DLLs often means making lots of stuff public, such that it's hard to tell the actual public API from the type members that only had to be made public because they were arbitrarily put in their own assembly (see the sketch after this list).
• Does code outside of my DLL ever need to use my type?
• Start conservative; I can usually move a type out a layer quite easily, but it's a bit harder the other way.
• Could I neatly package up my feature area or framework into a NuGet package such that it is completely optional and versionable, like any other package?
• Do my types align to the delivery of a feature and could they be placed in a feature namespace?
• Many real libraries and frameworks are branded, which makes them easy to discuss, and they don't burn up namespace names that imply a use or are ambiguous. Could I brand the components of my app using 'code names' like Steelcore, instead of generic, clichéd, and confusing terms like, errm, 'Services'?
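A sketch of the visibility point above (invented names): a helper that could stay internal in a single assembly is forced to become public the moment it is split into its own DLL:

```csharp
using System;

// Single assembly: the helper is an implementation detail.
internal static class RetryPolicy
{
    internal static void Run(Action action) { /* retry loop */ action(); }
}

// Split RetryPolicy into its own Helpers.dll and it must become
// public for the original assembly to use it - and now every
// consumer sees it as part of your "API".
public static class RetryPolicyInItsOwnDll
{
    public static void Run(Action action) { action(); }
}
```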
Edit
This is one of the most misunderstood things I see in development today. It's so bad.
You have an API, so put all its types within the single API project. Move them out only when you have a need to share or reuse them. When you do move them out, move them straight into a NuGet package with a clear name that carries the intent and focus of the package. If you're struggling for a name and considering "Common", it's probably because you're creating a dumping ground.
You should factor your NuGet packages into a family of related packages. Your "core" package should have minimal dependencies on other packages; the types inside it are related by usage and depend on each other.
You then create a new package for the more specialised types and subtypes that require additional sets of dependencies. More plainly: you split a library by its external dependencies, not by the kind of type, and not by whether it's an interface or an exception.
So you might stick all your types in a single big library, but some of the more specialised types depend on certain external libs, so now your library needs to pull in all those dependencies. That's unnecessary: you should instead break those types out into further specialised libraries that do take the dependencies they need.
Types in packages A and B can belong to the same namespace. Referencing A brings in one set of types, and then optionally referencing B supplements the namespace with a bunch more.
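A sketch of that (package and type names invented): both packages contribute to one namespace, and only the specialised package carries the extra external dependency:

```csharp
// Package Acme.Imaging - the "core" package, minimal dependencies.
namespace Acme.Imaging
{
    public class PngDecoder { /* pure managed code, no external deps */ }
}

// Package Acme.Imaging.Webp - optional; this is the package that
// takes the dependency on an external WebP library, not the core one.
namespace Acme.Imaging
{
    public class WebpDecoder { /* wraps the external WebP dependency */ }
}
```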
That's it.
Luke
Solution 5:
Looking at System.Data.dll (4.0) in the Object Browser, I can see that it is self-contained, holding not just interfaces but all the instrumental classes like DataSet, DataTable, DataRow, DataColumn, etc. Moreover, skimming the list of namespaces it spans - System.Data, System.Data.Common, System.Configuration and System.Xml - suggests, first, keeping interfaces in the same assembly as all the relevant and required code, and second, and more importantly, reusing the same namespaces across the overall application (or framework) to segregate classes virtually as well.
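That pattern is visible in the BCL itself: interfaces such as IDbConnection ship in System.Data.dll right alongside concrete providers like SqlConnection, yet callers can still code against the interface alone (the connection string below is a placeholder):

```csharp
using System.Data;
using System.Data.SqlClient;

class Program
{
    static void Main()
    {
        // Interface and implementation live in the same assembly
        // (System.Data.dll), but this code depends only on the
        // interface, so the provider can be swapped.
        IDbConnection connection = new SqlConnection(
            "Server=.;Database=Example;Integrated Security=true");

        using (connection)
        {
            connection.Open();
            // ... run commands via IDbCommand ...
        }
    }
}
```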