Fernando Machado Píriz's Blog

Posts about digital transformation, enterprise architecture and related topics

Archive for April 2010

A Simple Introduction to the Managed Extensibility Framework

with 6 comments

It is very common for us developers to spend more time modifying existing applications than building new ones from scratch. New business requirements usually mean new features in software applications. Nowadays, the way those requirements are added usually ends when we build, test, and deploy a new version of the application. Those activities very often affect the entire application, not only the new features added.

When requirements change, we developers must change the corresponding features in the application. After years and years of evolution, the source code implementing those features typically has undesired dependencies, and modifications can make the application unstable, or demand extensive regression testing to make sure new bugs have not been introduced. In this case too, the story ends when we build and deploy the entire application again.

These days businesses change more, and more often, to survive in a globalized and competitive world. And the software applications that make those businesses work must also change, and change faster.

Let’s suppose for a moment that every single piece of functionality of an application can be decomposed into a “part”. What we need is the ability to develop weakly coupled parts, independent not only in their structure, but also in their testing and deployment, and the ability to easily compose those parts into an application.

The software engineering principles required to solve this situation have been known for a long time, and many organizations apply them successfully. But those solutions are usually built on a case-by-case basis, and are not available to the general public like us.

Recently, some software development frameworks have appeared that support developing applications in parts. The Managed Extensibility Framework (MEF) is one of them.

There are the following roles in MEF:

  • Exported parts. They declare that they can be used to compose an application, and the contract they implement. They are independent development, compilation, and deployment units. They are weakly coupled, not only with the other parts in the application they will compose, but also with the application itself, i.e., a part does not know the other parts, and does not necessarily know in which application it will be used.
  • Import points. They are variables that contain parts or collections of imported parts that must implement a specific contract. Parts are automatically created from the information contained in parts catalogs.
  • Parts catalogs. They contain parts definitions: where they are and what contract they implement.
  • Parts containers. They contain parts instances and they perform composition of parts.

I will show you how to develop a MEF application step by step. To make MEF easy to understand the application must be simple, yet offer multiple features that can be decomposed into parts.

The application in this example allows writing some text and then transforming it by applying different algorithms. Each algorithm offers different functionality and that is what I will transform into parts. The application looks like this:


The drop down list shows available transformations:


The goal is to develop the user interface and each transformation independently of each other. Here we go.

We declare the contract that parts must implement with the IFilter interface as follows:

public interface IFilter
{
    string Filter(string input);
}

Then we create a part (the transformation ToUnicodeFilter, for example) implementing the IFilter interface, and we declare it as exportable with the MEF Export attribute:

[Export(typeof(IFilter))]
public class ToUnicodeFilter : IFilter
{
    public string Filter(string input)
    {
        StringBuilder output = new StringBuilder();
        foreach (char item in input)
        {
            output.AppendFormat(" U+{0:x4}", (int)item);
        }
        return output.ToString();
    }
}

The transformation in this case converts each character of the input text into its corresponding Unicode representation.

We can have as many parts as we want, but all of them must implement the IFilter interface and be decorated with the Export attribute.
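For example, a second part (a hypothetical one; the downloadable sample ships its own extra parts) could reverse the input text. It can live in its own assembly; it only needs to implement the same IFilter contract (repeated here so the sketch is self-contained) and carry the same Export attribute:

```csharp
using System;
using System.ComponentModel.Composition;

public interface IFilter
{
    string Filter(string input);
}

// Hypothetical second part: reverses the input text.
// Like ToUnicodeFilter, it only needs to implement IFilter
// and be decorated with Export.
[Export(typeof(IFilter))]
public class ReverseFilter : IFilter
{
    public string Filter(string input)
    {
        char[] characters = input.ToCharArray();
        Array.Reverse(characters);
        return new string(characters);
    }
}
```

Dropping the compiled assembly into the Extensions folder is all it takes for the application to pick the new part up.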

Let’s start coding the composition of parts into the application. We need a catalog to describe our parts. In this example all assemblies containing parts will be in a folder called Extensions, so we use a DirectoryCatalog:

DirectoryCatalog catalog = new DirectoryCatalog("Extensions");

We also need a container for the part instances, which will perform the composition. The container receives the catalog as a parameter, so it knows where to find the parts:

CompositionContainer container = new CompositionContainer(catalog);

We are almost there. What we need now is a point where the parts are imported. In this case there can be many transformations, so we declare an IList<IFilter> that we decorate with the ImportMany attribute:

[ImportMany]
private IList<IFilter> filters = new List<IFilter>();

The last thing we need to do is to compose the parts, indicating to the container where the import points are defined:

container.ComposeParts(this);
Then we can iterate over the list to populate the drop down list with the filters:

private void Window_Loaded(object sender, RoutedEventArgs e)
{
    foreach (IFilter filter in filters)
    {
        comboBoxFilters.Items.Add(filter.GetType().Name);
    }
}

When the user clicks the Apply button, the filter to execute is found in the list of filters by using the index of the element selected in the drop down list:

private void buttonApply_Click(object sender, RoutedEventArgs e)
{
    if (comboBoxFilters.SelectedIndex != -1)
    {
        int index = comboBoxFilters.SelectedIndex;
        IFilter filter = filters.ElementAt(index);
        textBoxOutput.Text = filter.Filter(textBoxInput.Text);
    }
}

Three assemblies are in play here. The first one is where the IFilter interface is declared. The second is where the class ToUnicodeFilter is declared. The third is the application itself. The interesting thing is that in order to add another transformation, or to modify an existing one, the only thing we need to do is deploy the corresponding assembly to the Extensions folder. The other assemblies are not affected, including the application assembly.

How does this magic work? When the ComposeParts method of the container is called, passing the application itself as a parameter, the container looks for all variables decorated with the ImportMany (or Import) attribute. In each case, the container also finds the interface required by the import point.

The container knows the catalog, because we passed it as a parameter to the constructor. Since the catalog has the information about all parts, including which parts implement which interfaces, it can find which part or parts implement the interface of each import point. As the catalog also knows where the assemblies implementing the parts are, the container can create instances of the appropriate part or parts, and assign them to the variables corresponding to the import points.
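Conceptually (this is a simplified sketch of the idea, not MEF’s actual implementation), the container’s job amounts to a reflection loop like this one, with a toy attribute standing in for Export so the sketch compiles on its own:

```csharp
using System;
using System.Collections.Generic;
using System.Reflection;

// Toy stand-in for MEF's Export attribute, defined here only
// so this sketch is self-contained.
[AttributeUsage(AttributeTargets.Class)]
class ToyExportAttribute : Attribute { }

interface IFilter
{
    string Filter(string input);
}

[ToyExport]
class UpperFilter : IFilter
{
    public string Filter(string input) { return input.ToUpper(); }
}

static class ToyComposer
{
    // Roughly what a container does: scan an assembly's types for
    // the export attribute, keep those implementing the requested
    // contract, and instantiate them for the import point.
    public static IList<IFilter> ComposeFilters(Assembly assembly)
    {
        var result = new List<IFilter>();
        foreach (Type type in assembly.GetTypes())
        {
            if (type.GetCustomAttribute<ToyExportAttribute>() != null
                && typeof(IFilter).IsAssignableFrom(type))
            {
                result.Add((IFilter)Activator.CreateInstance(type));
            }
        }
        return result;
    }
}
```

The real container does much more (lifetime management, lazy instantiation, metadata, composition across catalogs), but the scan-match-instantiate-assign cycle is the essence.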

Finally, the composition “builds itself”: the work is done by MEF, not by the developer. The developer only decorates types with Export and variables with Import or ImportMany, creates one or more catalogs and the container, and finally composes the parts. It is simple, isn’t it?

Remember that the challenge was to develop weakly coupled parts, independent not only in their structure, but also in their deployment, that could be easily composed into an application. Goal achieved? I think so.

You can download the code for this sample application from here. The sample contains even more parts than the ones I am showing here.

See also MEF Home and MEF Overview.

In an upcoming post I will write about how to associate metadata with the parts, for example, to show a friendly name instead of the type name in the drop down list. See you later.

Written by fmachadopiriz

April 19, 2010 at 1:55 am

A poster for your box’s wall: Visual Studio 2010 keybindings

leave a comment »

One of the things you find out after using Visual Studio for a while is that there are more features and commands than menu entries, i.e., some features do not appear as commands in the menu.

Let me give you an example. When we write an identifier and the compiler cannot resolve it, Visual Studio underlines it with a red squiggly line:


When we move the mouse pointer over the underlined identifier, a small blue rectangle appears, and when we move the pointer over the rectangle, a small button appears that, when clicked, shows us a series of options to help fix the error:


You have already seen that, haven’t you?

Unfortunately, this feature is not accessible from the menu. I do not hesitate to admit that sometimes my lack of fine motor skills prevents me from doing this as fast as I would like. But did you know there is a key combination to do this without using the mouse? You just need to press Ctrl+. (Ctrl and the period key) at the same time.

This not only works around my lack of fine motor skills, but also keeps my hands on the keyboard. I always point out the productivity gained by saving a second or a click on tasks we perform many times per day, because at the end of the day we save many seconds and many clicks.

Visual Studio is full of examples like this one; the problem is that since many of these useful keyboard combinations do not have a corresponding menu item, we never see them (this problem has a name: discoverability). Microsoft has recently published a poster listing all these keyboard combinations:


You can download the file from here.

There is more than one poster available: one for each keyboard mapping scheme associated with a language (the one you chose the first time you ran Visual Studio).

Download the file, print it, and pin it to your box’s wall. Give it a chance and then tell me.

PS: Did you already know the Ctrl+. keybinding? Okay, I am pretty sure you will still find many unknown key combinations in the poster 🙂

Written by fmachadopiriz

April 18, 2010 at 12:25 pm

Visual C# 2010 Express available for download

leave a comment »

Today Microsoft made the Express version of Visual C# 2010 available for download. There are also Express versions available for download of Visual Studio 2010, Visual Basic 2010, Visual C++ 2010, and Visual Web Developer 2010. In addition, there is a CTP (Community Technology Preview) of Visual Studio Express for Windows Phone.

Written by fmachadopiriz

April 16, 2010 at 8:52 pm

CoolX: boost your Visual Studio 2010 productivity

leave a comment »

You can greatly improve your software development productivity by using frameworks like the Managed Extensibility Framework, or the Model View Controller (MVC) framework for ASP.NET 4.0, just to name a few. Tools like these can save you hours of development and testing time, and can save your customers thousands of dollars.

There are other tools, like Windows 7’s Aero Peek feature, that just allow you to save a couple of clicks, or a couple of seconds. But that’s nothing, isn’t it? Well, it depends. If switching between windows is something you do hundreds of times per day, as you surely do, saving a couple of seconds each time can add up to hours saved at the end of the month.

If you spend most of your day editing code in Visual Studio, there are many chances to waste clicks and seconds on tasks like opening Windows Explorer instances to work with a project’s files, opening Command Prompts to execute commands in a project’s folders, resolving missing references, etc.

Until now. There is a new tool (in fact, a new version of an old tool) called CoolX that’s really cool. It does a very good job of saving your valuable seconds and clicks on everyday tasks by adding many useful commands to context menus in Visual Studio:


These images show just some of the commands added. The full list includes (most names are self-descriptive):

  • Copy Project
  • Paste Project
  • Explorer Context Menu
  • Collapse All Projects
  • Open Container Folder
  • Visual Studio Prompt Here
  • Demo Font
  • Locate in Solution Explorer
  • Copy Reference
  • Paste Reference
  • Resolve Project References
  • Reference Manager
  • Build (added to editor’s context menu)
  • Float In Other Screen

You can download the CoolX Visual Studio add-in from this page in the Visual Studio Gallery.

Written by fmachadopiriz

April 16, 2010 at 8:21 pm

Covariance and contravariance made easy

leave a comment »

One of the most important new features in C# 4.0 and .NET Framework 4.0 is the introduction of covariance and contravariance into the language. I have spoken about covariance and contravariance several times while presenting what is new in C# and .NET Framework 4.0, and also in my old blog, but I am not sure all attendees end up grasping these concepts easily. Until now, I have started from the formal definitions of covariance and contravariance, and then shown how to implement them in C#. That is probably not the best approach, so now I will try to explain those concepts in a different way. Here we go.

Take a look at the following classes, Animal and Cat, with Cat inheriting from Animal:

class Animal { }
class Cat : Animal { }

The following declarations are valid in any C# version:

Cat kitty = new Cat();
Animal animal = kitty;

Every Cat is an Animal (that is what “Cat inherits from Animal” means), so I can assign the variable kitty to the variable animal. There is nothing new so far.

Look now at what happens when I try to do something similar with an enumerable of Animal and an enumerable of Cat:

IEnumerable<Cat> cats = new List<Cat>();
IEnumerable<Animal> animals = cats;

Every Cat is an Animal, so intuitively any enumerable of Cat is an enumerable of Animal, right? Wrong, at least for all compilers before C# 4.0; they say they cannot convert IEnumerable<Cat> into IEnumerable<Animal> and ask me if I am missing a cast.


I can add the cast, but it will be an unsafe cast. I mean, the program will compile, but I will get an InvalidCastException at runtime when the assignment executes.

Okay, you and I will agree that even if the C# compiler does not accept that every enumerable of Cat is also an enumerable of Animal, intuitively we accept that sentence, in the same way we accept that every Cat is an Animal.

C# 4.0 solves the conflict. The code fragment above happily compiles and does not generate any exception at runtime, matching our intuition.

Why does the same code that compiles in C# 4.0 not compile in previous versions?

Before C# 4.0, the IEnumerable interface was declared as:

public interface IEnumerable<T> : IEnumerable

While in C# 4.0 it is declared as:

public interface IEnumerable<out T> : IEnumerable

Note the out keyword beside the T type parameter: it is used to indicate that IEnumerable is covariant with respect to T. Generally speaking, given S<T>, S being covariant with respect to T implies that, if the assignment Y ← X is valid when X inherits from Y, then the assignment S<Y> ← S<X> is also valid.
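The same out modifier works on your own interfaces. As an illustration (these producer types are hypothetical, not part of the downloadable sample), a minimal covariant interface and the assignment it enables:

```csharp
using System;

class Animal { }
class Cat : Animal { }

// A hypothetical producer interface: T appears only in output
// position (return type), so it can be declared covariant with out.
interface IProducer<out T>
{
    T Produce();
}

class CatProducer : IProducer<Cat>
{
    public Cat Produce() { return new Cat(); }
}

class CovarianceDemo
{
    static void Main()
    {
        // Covariance: an IProducer<Cat> can be used where an
        // IProducer<Animal> is expected, because every Cat is an Animal.
        IProducer<Animal> producer = new CatProducer();
        Animal animal = producer.Produce();
        Console.WriteLine(animal.GetType().Name); // prints "Cat"
    }
}
```

If T also appeared as a parameter type anywhere in the interface, the compiler would reject the out modifier.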

Let’s now take a look at another code fragment involving actions (actions are delegates to functions with the form void Action<T>(T)) on Animal and Cat.

Action<Animal> doToAnimal = target => { Console.WriteLine(target.GetType()); };
Action<Cat> doToCat = doToAnimal;
doToCat(new Cat());

Once again, since every Cat is an Animal, an Action<Animal> that I can apply to an Animal should also be an Action<Cat> that I can apply to a Cat. Observe that even though the sentence is just as reasonable, the parameter types go the other way around compared to the previous case: there I was assigning an enumerable defined in terms of Cat to an enumerable defined in terms of Animal, while here I am assigning an action defined in terms of Animal to an action defined in terms of Cat.

Even though the sentence is reasonable, compilers before C# 4.0 do not like the assignment and fail with a message similar to the one in the previous case: cannot convert an Action<Animal> into an Action<Cat>; in this case, with no question about a cast.


Once again C# 4.0 solves the problem, and the code fragment above does compile. Let’s see how actions are declared:

Before C# 4.0 the Action delegate was declared as follows:

public delegate void Action<T>(T obj);

While in C# 4.0 it is now declared as:

public delegate void Action<in T>(T obj);

Observe now the keyword in beside the type parameter T: it is used to indicate that the Action delegate is contravariant with respect to T. Generally speaking, given S<T>, S being contravariant with respect to T implies that, if the assignment Y ← X is valid when X inherits from Y, then the assignment S<X> ← S<Y> is also valid.
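And just like out, the in modifier works on your own types. A minimal sketch (these consumer types are hypothetical, not part of the downloadable sample) of a contravariant interface and the assignment it enables:

```csharp
using System;

class Animal { }
class Cat : Animal { }

// A hypothetical consumer interface: T appears only in input
// position (parameter type), so it can be declared contravariant with in.
interface IConsumer<in T>
{
    void Consume(T item);
}

class AnimalConsumer : IConsumer<Animal>
{
    public void Consume(Animal item)
    {
        Console.WriteLine("Consumed a " + item.GetType().Name);
    }
}

class ContravarianceDemo
{
    static void Main()
    {
        // Contravariance: an IConsumer<Animal> can be used where an
        // IConsumer<Cat> is expected; anything that can handle any
        // Animal can certainly handle a Cat.
        IConsumer<Cat> consumer = new AnimalConsumer();
        consumer.Consume(new Cat()); // prints "Consumed a Cat"
    }
}
```

Symmetrically to the covariant case, if T also appeared as a return type anywhere in the interface, the compiler would reject the in modifier.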

Summarizing, covariance in C# 4.0 allows a method to have a result of a type derived from the type parameter defined in the interface. In the same way, it allows assignments of generic types that, while intuitive, were not allowed before, such as IEnumerable<Animal> ← IEnumerable<Cat>, when Cat inherits from Animal.

Note that IEnumerable can only return instances; it does not receive instances as parameters. That is why the keyword to declare IEnumerable covariant with respect to T is out.

Analogously, contravariance allows a method to have parameters of a type that is an ancestor of the type specified as the type parameter in the delegate. This way, it also allows assignments that, while intuitive, were not possible before, like Action<Cat> ← Action<Animal>. Observe that in this case Action can only receive instances as parameters; it does not return instances. That is why the keyword to declare Action contravariant with respect to T is in.

In this post I have shown an example of covariance with a generic interface and an example of contravariance with a generic delegate; in an upcoming post I will show different examples and will list which interfaces and delegates in .NET Framework 4.0 are covariant and which are contravariant.

Sample code can be downloaded from here. To test it with compilers earlier than C# 4.0 and see the difference, change the target framework in the project properties:


Hope you have enjoyed this post. See you.

Written by fmachadopiriz

April 16, 2010 at 3:31 am

Montevideo’s Run 2.0 live!

leave a comment »

RUN 2.0 in Montevideo, the major Microsoft event of this year, is being broadcast in high definition, using Silverlight 4 Live Smooth Streaming: http://www.dominiodigitalhd.com.ar/dd/bbb/default.html

Written by fmachadopiriz

April 14, 2010 at 11:53 am

Posted in Announcements

Run 2.0

leave a comment »

Next week, on Wednesday April 14, starting at 9 AM, in Antel’s Communications Tower, we will hold Run 2.0. This event has become the most important of the year related to Microsoft technologies. The most important product launches will take place at Run 2.0: Visual Studio 2010, .NET Framework 4.0, etc.

I will be participating in the morning keynote, showing how Visual Studio 2010 and the .NET Framework 4.0 help increase productivity.

In the afternoon, together with Gaston Milano, Gustavo Quintana, and David Gorena, at the plenary session for developers, we will speak about what is new in Visual Studio 2010 and .NET Framework 4.0. An hour and a half of few slides and many demos.

You can register here. Looking forward to meeting you there!

Written by fmachadopiriz

April 11, 2010 at 2:57 pm

Posted in Announcements