Attributes – The New Macros?

Do you remember the times when C++ was the mainstream programming language in enterprise-level projects?
A popular approach was to simplify the code by defining a lot of macros.
Even Microsoft did that extensively, for instance in the ubiquitous message maps of their MFC implementation.
The benefit was an easy programming model – just add a few macros and be happy.
On the other hand, it was very difficult to understand what really happened in the background.
This was especially true for user-defined macros which could only be used in the context of a certain project.

Later, languages like Java and C# took over and macros became history (C# has a preprocessor but no macros). The nightmare was over.
Today's software systems aren't any simpler than those of the past. Therefore the language designers considered it a good idea to support some meta-information facility in the form of attributed programming.

In .NET you have attributes:

[WebMethod]
public string HelloWorld()
{
    return "Hello, World!";
}

Java introduces the language metadata facility (JSR 175), which is currently a community draft. This approach is based on javadoc comments:

/**
 * @common:operation
 */
public String HelloWorld()
{
    return "Hello, World!";
}

Looks somewhat similar, right?

But it’s not.

In .NET the attributes are stored in the assembly's metadata and can therefore be accessed at runtime.
They can be discovered via reflection, as shown in the following example:

using System;
using System.Reflection;

class Test {
    static void Main(string[] args) {
        Type t = typeof(Test);
        MemberInfo[] members = t.GetMember("DoIt");
        // returns the [Cool] attribute instance applied to DoIt
        object[] attrs = members[0].GetCustomAttributes(false);
    }

    [Cool]
    public static void DoIt() { }
}

public class CoolAttribute : Attribute { }

In Java they merely serve to generate code at compile time.
Java supports reflection as well. But because the attributes are not stored in the class file's metadata, they can't be discovered by reflection at runtime.
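A small sketch illustrates the difference (the class name AttrDemo and the tag are made up for illustration): reflection finds the method itself, but the javadoc-style "attribute" lives only in the source comment and is stripped by the compiler, so nothing of it is left at runtime.

```java
import java.lang.reflect.Method;

public class AttrDemo {
    /**
     * @common:operation  (this tag exists only in the source file)
     */
    public String helloWorld() {
        return "Hello, World!";
    }

    public static void main(String[] args) throws Exception {
        Method m = AttrDemo.class.getMethod("helloWorld");
        // The method is visible to reflection, but the javadoc tag
        // never made it into the class file, so no attribute
        // information can be queried here.
        System.out.println(m.getName()); // prints "helloWorld"
    }
}
```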

What's the conclusion?

On the one hand, both approaches are very useful to simplify the programming model for the developer.

On the other hand, they are like C++ macros: a developer can define his/her own attributes which nobody outside the project might know.
This leads to applications which are hard to maintain.
In order to create maintainable software systems, I think developers should define their own attributes/annotations as sparingly as possible.
This will ease the job, especially for new team members who join a project at a later time.

Crippled Objects In The Web Services World

When you read the various forums, there is a lot of talk about serialisation of platform-dependent types.
People are asking how to get a .NET type (e.g. DataSet) from Java, or vice versa a Java type (e.g. ResultSet) from .NET.
If you are developing end-to-end .NET or J2EE applications you don't have to worry about that: the runtime does most of the work for you, and if you ask for a DataSet you will get one.
But when it comes to cross-platform calls, what will you get when you ask for a DataSet from a J2EE application?
To answer this question we have to ask another one first.

What makes up an object? Answer: its state and its behaviour.

When you request a DataSet you will only get the state and type information. But what makes a DataSet so powerful? Its behaviour.
If you don't have the concept of a DataSet on the J2EE platform, you get only half of the object representation – a crippled object.
The same is true when you request a Java ResultSet from .NET.

In order to develop interoperable Web Services you have to rely on XML Schema types only. First define the schema and the WSDL contract, and then generate the service from it.
It's the same idea you might know from the CORBA or COM Interface Definition Language (IDL).
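A minimal sketch of such a schema-first type (the Customer type and its fields are purely illustrative): instead of exposing a DataSet or ResultSet, the contract defines a plain XML Schema type which every platform can map to its own native classes.

```xml
<xsd:complexType name="Customer">
  <xsd:sequence>
    <xsd:element name="id"    type="xsd:int"/>
    <xsd:element name="name"  type="xsd:string"/>
    <xsd:element name="since" type="xsd:date"/>
  </xsd:sequence>
</xsd:complexType>
```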

Tool support is available:

.NET -> wsdl.exe /server …
JWSDP -> wscompile -gen:server …

Web Service Interop Impressions

As we know, Web Services are about interoperability.
Let's prove it, I thought, and tried to figure out how far we can get when writing interoperable Web Services.
On the .NET side I used WSE 2.0. On the J2EE side it's a little bit more complicated:
you have to choose amongst Axis, IBM WSDK, Sun's JWSDP, BEA WLS, Glue, etc.
Although I know that it's not the most powerful SDK, I chose Sun's JWSDP 1.2 because it is what we will see in all the other products sooner or later.

In general interop works if you follow some best practices.

Here are my first insights:

1. Document/literal with SOAPAction routing does not work for cross-platform interoperability.

In the first attempt we used VS.NET to generate the WSDL contract. It produced a document/literal style contract.
The document contained the soapAction attribute on the soap:operation element of the binding. JWSDP didn't recognise this attribute when generating client proxies.
Therefore the service was called with an empty HTTP SOAPAction header (""), which caused the service to raise an error.
When we used JWSDP to generate the WSDL, it produced an RPC/encoded version. Using this as the basis for the service generation worked on both platforms.
But using RPC/encoded is not recommended, because document/literal will be the standard of the future thanks to its self-describing nature.
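For illustration, the relevant part of such a document/literal binding looks roughly like this (the binding and operation names, and the tempuri.org action URI, are made-up placeholders in the style of a VS.NET-generated contract); it is the soapAction value that the JWSDP proxy generator did not pick up:

```xml
<binding name="HelloBinding" type="tns:HelloPortType">
  <soap:binding style="document"
                transport="http://schemas.xmlsoap.org/soap/http"/>
  <operation name="HelloWorld">
    <!-- ignored by the JWSDP 1.2 client proxy generation -->
    <soap:operation soapAction="http://tempuri.org/HelloWorld"/>
    <input><soap:body use="literal"/></input>
    <output><soap:body use="literal"/></output>
  </operation>
</binding>
```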

2. SOAP attachments

JWSDP does not support DIME for SOAP attachments. The application/dime MIME type is missing from the HTTP Accept header when sending a request.
On the other hand, WSE does not support MIME attachments, while Axis supports DIME. Therefore it's not possible to write interoperable Web Services with DIME or MIME attachments across all these stacks. In order to be interoperable, avoid attachments until a standard has been established.
DIME support in WSE itself is pretty good.
It's not possible to write a J2EE service with (MIME) attachments with the help of JAX-RPC. One has to use the lower-level SAAJ API and create a servlet to process the message on the HTTP level.

3. JWSDP does not support WS-Security. Only HTTP-based authentication is possible.


WSE 2.0 supports a lot of features and opportunities, but for interoperability you need two parties. The most important promise of Web Services is that you can connect different platforms in a secure and reliable way. My impression is that there is still a long way to go to make this a reality. Let's see…

Where are we going?

Today I would like to share some thoughts about the future of information technology.

If you look back at the history of information technology you will see one obvious trend: abstraction.
The first important step was the object-oriented paradigm, which allows us to model the real world in code.
With this paradigm it was possible to implement persons instead of simple functions.

The next step was component technology, which sits on top of OO. This model is even more aligned with the real world, as it focuses on interfaces instead of internal implementations.
With that, you just have to meet the person and know the language for communication. It's not important whether the person has a pacemaker or not.

But we still had to develop on a very low technical level. This changed with the introduction of application platforms like J2EE and .NET, which brought abstract APIs for development.
With these application platforms we got rid of error-prone system-level coding. These new programming models are easier to handle than predecessors like MFC, OWL, etc. (though they are still very complex).
Because of this ease it's possible to build more sophisticated distributed applications. But architecture and system design are much more important these days.
Nowadays you are no longer limited to developing single persons, but worldwide distributed communities.

And the choice of the application platform is losing importance (except for the vendor).
Most complex enterprise-level applications have the same non-functional requirements, like availability, failover and security. Therefore powerful runtime environments like EJB or Enterprise Services are a logical consequence.
And this finalises the good old separation-of-concerns paradigm.
In terms of the community metaphor, that is the environment, e.g. the air, the accommodation, etc., in which the community lives.

That’s where we are today.

What comes next?

If you open your eyes you can identify the upcoming levels of abstraction. One is the shift from coding to modelling. This trend has a popular three-letter acronym:
MDA, or Model Driven Architecture, which promises to generate most of the code from a model.
The other sign of abstraction is business orchestration. We see that in products like Microsoft's BizTalk Server or BEA's WebLogic Integration.
The outcome could be that the applications which are generated from the models can interact with each other by connecting them in a drag-and-drop fashion.
The foundation for this interoperability is the emerging set of Web Service standards.

How does that sound? Do you like coding? What will you do in the future?

In my opinion coding will lose importance.
But without experienced people who are able to look behind the scenes and to cope with the complexity of these highly distributed systems, it would be very hard to maintain them
– at the latest when a system does not work as expected. Moreover, Enterprise Application Integration will rise to a new level.
That means there will still be much work to do for developers, even if the way of application development changes. But that is business as usual, isn't it?


The time has come: I've set up a weblog, as many others have done before me. You may ask for the reason (I hope you will).
The reason is to keep in touch with people.
During my life as a professional coach and trainer I have met many people in projects, at trainings and at conferences.
Wouldn't it be nice to share something with these people even after leaving the location? I think so.

Why English?

Almost everyone in the IT industry speaks English, but if you work in an international environment only a few speak German (what a pity).
Though my English is far from perfect, I think it's good enough to share my experiences with others. I hope you will excuse the mistakes (joking, especially, is very hard in a foreign language).
Some issues might be interesting only for German readers; in those cases I'm going to write in German.

What can you expect?

As my focus is on enterprise integration and on J2EE and .NET technology, you will find information about these topics in the blog.
Because I'm standing with one leg in the .NET camp and with the other in the J2EE camp, the content will sometimes be more related to one side or the other (which one is the dark side?).
I'll try my best to write only about things which might be interesting to other people as well.
If you are concerned with cross-platform and interoperability issues, this blog is for you (what are the most all-purpose words in the IT industry? Platform, component and object, I guess).
Sometimes the content will be prose and sometimes in-depth technical information.

In real life it's boring if one person speaks and the others just listen. The same is true for a weblog (the difference is that I can't see you yawning).
So feel free to participate in this blog by posting comments. I look forward to hearing from you.