Wednesday, September 19, 2007

Common Information Model, Bane for Service Re-engineering!

In this brilliant article titled "Classic SOA", Dan North discusses a technology-agnostic way of designing services. He is spot on in his remark that vendors are making SOA look a lot more complex than it actually is in order to sell their products and solutions.

He writes

"Naturally it is in the vendors’ interest to emphasize the complexity of SOA and then provide a timely and profitable solution. This leads many systems architects into a technology-centric view of SOA, when, in fact, the most important criteria for a service-oriented architect — before tackling the technology — should be a keen understanding of the business.

I am also quite impressed by his explanation of how a single domain model makes little sense for both the consumer and the provider of the services, and how re-engineering becomes difficult because of the tight domain model coupling.

He emphasises the use of "business concepts", effectively a higher-level, ubiquitous language that ties together all of the finer-grained domain models behind each service.

He goes on to add

"The service contract is then expressed in terms of enterprise-level business concepts, such as a vacation or a dispatch or a sales order, which again decouples the service consumer from the service provider and allows them to evolve independently, while still able to communicate in a common language. The mistake that enterprise information architects (or people with similarly named roles) make is trying to define what the business concept means to each of the people using it"

Thursday, September 06, 2007

Sun Java CAPS and OpenESB

As expected, future versions of Java CAPS will include the "OpenESB framework and components". I am quite interested to see what the Java CAPS 5.2 release will look like. I have been a strong supporter of JBI and have been using Apache ServiceMix extensively. I have to admit that I have not used OpenESB to the same extent. It would be nice to see how the OpenESB JBI container and other JBI components will be integrated with the existing components such as eGate Integrator and eInsight (from the SeeBeyond acquisition). This will also help the existing SeeBeyond customers, who are quite anxious about the future roadmap of Java CAPS.

In this post, Sun's Fred Aabedi writes about the value this will add for new and existing customers.

In his words..

"The merger of the OpenESB framework and some of JBI components into CAPS 5.2 brings exciting new possibilities to both our existing and new customers in the integration, composite applications, and SOA domains. A consolidated runtime environment based on the world class Glassfish platform allows the interoperability of our classic Java EE based components and new JBI based components. This combination is quite powerful and provides a lot of new options to our customers to solve their integration problems and build killer composite applications. Customers realize this ability to leverage existing proven solutions along with leading edge technologies by taking advantage of the Bi-directional Re-use features (JBI Bridge) that allow interoperability between the Java EE and JBI based components. In addition, standardization on the NetBeans 6.0 platform for all of the Java CAPS tooling gives developers a proven and effective platform on which to develop enterprise solutions."

Thursday, August 16, 2007

What's new in SOA Testing?

SOA Testing.. How different is this from the Testing we have been doing all along?

SOA testing is understood to be testing the building blocks of SOA, the services. This is evolving day by day as SOA gets accepted as the mantra for business agility. There are many products in the market that help automate the testing of services. It would be fair to say that these products/tools are more or less web service centric.

In an article titled Adjusting Testing for SOA, David Linthicum talks about the changes in testing approach that SOA brings to the table.

He concentrates on testing services (Testing the Core) and mentions, without elaborating, the complexities that service security and governance can bring to testing.

In his words...

"Considering that, when testing services (for example, Web services, Java EE, etc.) you have to think about a few things, including autonomy, integration, granularity, stability and performance, in the particular order of your requirements"

Miko Matsumura, in his article SOA Testing Hubub, extends the concept to SOA governance.

He adds..

"As such, the testing group is a party concerned with the concept of quality. Therefore thier ability to create policy assertions that define their concerns and expectations around quality creates their participation in governance. Now the “enforcement point” for testing may be a quality system such as the Mindreefs, iTKO, Parasoft, PushToTest, or Solstice type system"

Wednesday, August 08, 2007

Mashups and EAI

In an interesting article, Gregor Hohpe (Google architect, of EIP fame) describes how EAI patterns and concepts can be used while building mashups.

This to some extent explains why major vendors in the EAI space (e.g. TIBCO with its GI acquisition) are looking for offerings in this space.

It is not simple to differentiate mashups from composite apps. Conceptually they are the same; the only difference is in their scope. Mashups are often built ad hoc and then integrated using simple protocols like RSS and Atom. Mashups are used in the context of Web 2.0. They pull data from different sources, then aggregate and transform the data to be used in different contexts.

Mashups: REST/XML, JSON, ad-hoc, bottom-up, easy to change, low expectations, built by user

Composite Apps: SOA/WS-*, planned, top-down, more static, (too) high expectations, built by
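The pull-aggregate-transform pattern above can be sketched in a few lines of Java. This is a toy illustration with feed fetching and RSS/Atom parsing stubbed out as in-memory lists; all names are my own, not from Hohpe's article.

```java
import java.util.*;

// Minimal sketch of the mashup pattern described above: pull items from
// several sources, aggregate them, and transform them for a new context.
// Feed fetching/parsing is stubbed out with in-memory lists.
public class FeedMashup {

    static class Entry {
        String source, title; long published;
        Entry(String s, String t, long p){ source = s; title = t; published = p; }
    }

    // Aggregate + transform: merge all entries, newest first,
    // rendered as simple display strings for the new context.
    static List<String> aggregate(List<List<Entry>> feeds) {
        List<Entry> all = new ArrayList<>();
        feeds.forEach(all::addAll);
        all.sort((a, b) -> Long.compare(b.published, a.published));
        List<String> out = new ArrayList<>();
        for (Entry e : all) out.add("[" + e.source + "] " + e.title);
        return out;
    }

    public static void main(String[] args) {
        List<Entry> rss  = List.of(new Entry("rss",  "EAI patterns", 2));
        List<Entry> atom = List.of(new Entry("atom", "Yahoo Pipes", 3));
        System.out.println(aggregate(List.of(rss, atom)));
    }
}
```

This is exactly the Aggregator plus Message Translator combination from the EAI patterns, applied at the presentation tier.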

If you want to know more about Mashups, have a look at this tutorial from the same author.

Heard of Yahoo Pipes? Experiment with it and have fun.

Thursday, May 03, 2007

Service Harvesting

SOA is expected to give corporates that have a love-hate relationship with their monolithic legacy applications a new lease of life. On one hand, maintenance of these legacy applications, mainly on mainframes, has become the single largest outlay of IT funds; on the other, the amoebic growth (in functionality, business processes and data) of these applications over time has made it nearly impossible to replace them. To complicate matters, organizations are also losing the people who are the so-called "knowledge centers" for these applications. Organizations also face numerous challenges in business agility, which can be traced back to the inflexible nature of these applications and their embedded business logic.

The best possible way out for these organizations lies in creating services out of these applications. These services can then be reused to develop new business processes or modify existing business processes to cater to changing business needs.
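As a rough illustration of what such a harvested service might look like, here is a hypothetical Java sketch (the legacy module and its methods are invented stand-ins): a coarse-grained, business-meaningful service operation composed from several fine-grained legacy calls.

```java
// Hypothetical sketch of "harvesting" a service from legacy routines:
// the fine-grained legacy calls stay behind a coarse-grained service
// facade, so new business processes depend on the service contract,
// not on the legacy API.
public class HarvestedServiceSketch {

    // Stand-in for an existing legacy module (e.g. wrapped mainframe calls).
    static class LegacyAccountModule {
        double fetchBalance(String accountId)       { return "A-1".equals(accountId) ? 250.0 : 0.0; }
        double fetchPendingDebits(String accountId) { return "A-1".equals(accountId) ? 50.0  : 0.0; }
    }

    // Harvested service operation: one business-meaningful result
    // composed from several legacy routines.
    static double availableBalance(LegacyAccountModule legacy, String accountId) {
        return legacy.fetchBalance(accountId) - legacy.fetchPendingDebits(accountId);
    }

    public static void main(String[] args) {
        System.out.println(availableBalance(new LegacyAccountModule(), "A-1")); // 200.0
    }
}
```

The facade is what gets published and reused; the legacy routines behind it can later be replaced without breaking consumers.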

But how does one go about this? Is there a standard methodology or approach? I have discussed the top-down and bottom-up approaches in one of my earlier posts. Obviously both have their limitations and may make the whole initiative bite the dust. No effort to identify and create services will succeed without understanding the existing applications and their routines.

In his article Finding Services in the Mainframe, Mike Oara, CTO of Relativity Technologies, discusses a "meet-in-the-middle" approach to service identification in mainframe applications. I am sure this can be applied to any kind of legacy application (mainframe or non-mainframe). He also talks about identifying potential services and harvesting them.

According to him...

"In this approach the service modeling team and the mainframe application experts work together to identify potential services that are both useful and feasible, given the existing legacy constraints"

He defines potential services as ...

"Application artifacts which alone or combined have all the characteristics of services"

He also goes on to suggest a few ways to dig up functionality from these applications and delves into debatable topics like "service composition" and "service granularity". Finally, he talks about the importance of interactions and negotiations between the "mainframe" and "distributed applications" communities.

IMHO, there are still quite a few gray areas around service design. There are two schools of thought advocating different approaches to service design:

- Business Transaction Approach
- Logical Data View Approach

In the logical data view approach, a service would be defined with more CRUD-type operations. I am not convinced that this is the right approach for designing services.

My preference would be to design services based on business transactions rather than logical data views. But, within the constraints of legacy applications, this option may prove to be a road-block.

Whereas, when developing new applications, it would be appropriate to go for the business transactions approach and keep the data for the services close to the services. Talking about "service and data", I am reminded of the blog post "SOA Question: should we carve service independence into the database?" by Nick Malik. He mentions that...

"If the services are designed well, there should be no cause for a single transaction that adds data under two different services."

This is only possible if we provide for data redundancy across services and synchronize them.
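A toy sketch of that redundancy-plus-synchronization idea (all names are invented, and a real system would use messaging middleware rather than an in-memory outbox): each service owns its data, and a second service keeps a local copy that is refreshed by events instead of a shared cross-service transaction.

```java
import java.util.*;

// Sketch of redundancy plus synchronization: each service owns its data;
// a consuming service keeps a local, read-only copy refreshed by events
// rather than by a single transaction spanning two services.
public class ServiceDataSync {

    static class CustomerEvent { String id, name; CustomerEvent(String i, String n){ id = i; name = n; } }

    // Owning service: the system of record for customer data.
    static class CustomerService {
        final Map<String, String> customers = new HashMap<>();
        final List<CustomerEvent> outbox = new ArrayList<>();
        void register(String id, String name) {
            customers.put(id, name);
            outbox.add(new CustomerEvent(id, name)); // published, not shared
        }
    }

    // Consuming service: holds the redundant copy, synchronized by events.
    static class OrderService {
        final Map<String, String> customerReplica = new HashMap<>();
        void apply(CustomerEvent e) { customerReplica.put(e.id, e.name); }
    }

    public static void main(String[] args) {
        CustomerService cs = new CustomerService();
        OrderService os = new OrderService();
        cs.register("C-1", "Acme");
        cs.outbox.forEach(os::apply); // stands in for message delivery
        System.out.println(os.customerReplica.get("C-1")); // Acme
    }
}
```

The trade-off is eventual consistency between the copies, which is usually acceptable once no single transaction needs to add data under two services.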

Friday, April 27, 2007

Mashups and Unified Desktops

A few days back, Todd Biske wrote this nice blog post on composite applications, mashups, widgets and gadgets. I think the industry is now somewhat in agreement that composite apps and mashups mean the same thing, though "mashup" is used more in the context of web-based presentation of composite applications. A lot is being said about enterprise mashups at present. I think the simplicity of the technology (it's all about JavaScript, DHTML and XML) will slowly eat away at the portal technology market.

There seems to be a growing demand in organizations for a "unified desktop" for employees: a single user interface giving access to the basic features of all kinds of applications in the enterprise. Some time back I was dealing with such a requirement from a leading online trading firm. The requirement was to build a "Unified Agent Desktop" for the customer care representatives, with case management workflows alongside access to the multiple applications they use to manage the cases.

Dashboard gadgets and widgets should hopefully make this simpler in the future. Application vendors should provide such application widgets/gadgets to their customers along with the applications, or, given their simplicity, organizations can develop them themselves.

Wednesday, April 25, 2007

Mule jBPM Connector

Mule 1.4 comes with a BPM connector, which can integrate with BPM engines that provide a Java API. If there is a requirement to integrate with a BPEL engine, standard web services over the SOAP transport (Axis/XFire) can be used.

jBPM is the first BPM engine that comes out of the box with Mule 1.4.

It looks pretty cool. You can easily write your long-running integration processes in JPDL.
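For illustration, a minimal JPDL process definition looks roughly like this (element names as in jBPM 3; the process and state names below are made up, so check the schema of the jBPM version you are running):

```xml
<process-definition name="order-fulfilment">
  <start-state name="start">
    <transition to="await-payment"/>
  </start-state>
  <!-- A wait state: the process sleeps here until the "paid"
       transition is signalled, which is what makes long-running
       integration processes possible. -->
  <state name="await-payment">
    <transition name="paid" to="end"/>
  </state>
  <end-state name="end"/>
</process-definition>
```

The wait states are the interesting part for integration: Mule can signal the transition when a message arrives, resuming the process where it left off.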

I hope to see integration support for JBossRules (Drools) engine in the future.

Tuesday, April 24, 2007

Are you looking at implementing SOA as part of your project?

Recently while going through a set of questions asked by my fellow colleagues to a customer regarding an RFP, I saw this question. This was an RFP for a packaged software implementation and the question was asked in the context of integration requirements with the other applications in the enterprise.

I bring this up to point out how product vendors and system integrators confuse customers with questions like this. Understanding of the concept is of paramount importance for its adoption. Dave Linthicum has clearly emphasized this in his blog, where he talks about the need for vendors (I would add system integrators) to go to SOA school.

There are some other industry leaders who have rightly pointed out how SOA has become a "goal" rather than a "means" to achieve business agility. There are certain organizations who talk about the ROI of their SOA initiative etc... IMHO, rather than starting SOA initiatives, which has become a new name for implementing web services and a service registry (or maybe an ESB), organizations should try and inculcate the style in the organization. Microsoft enterprise architect Nick Malik has written a very good post in which he points out where the maximum benefits of SOA lie in an enterprise.

A few lines from his post..

"IT projects provide tools for business processes. They automate parts of a business process or collect information or reduce errors. The point is… which processes? In the past, traditional IT only succeeded with the processes that changed rarely"

"The problem is that there is a long list of business processes that occur frequently but that are more difficult to automate because they change frequently"

"That is the SOA sweet spot"

In most cases we do not try to understand what the customer requires and what his pain points are before suggesting a solution. Therefore, "SOA" has become a "magic wand" for the sales guys and consultants.

Tuesday, April 10, 2007

Business Activity Monitoring (BAM) .. Key components

With the amount of industry buzz around SOA and technologies like MDM, BPM and BAM, it would be difficult to find an IT leader who does not want the benefits of BAM in his organization. There are quite a few eye-catching products in the market in this segment. In this post I will discuss the key components of a BAM tool/framework.

The key for any enterprise is to understand business events (external and internal) and treat them accordingly. Certain business events help you gauge the health of the business; certain others, taken together, can tell you about performance, trends etc. This is nothing but business intelligence; in fact, real-time business intelligence.

Ideally, a BAM tool would have three components:

1. Event Absorption Layer, also known as the Data Collector Layer
2. Event Processing Layer
3. Delivery Layer

The event absorption layer is the one that collects business events from the enterprise. If you look at the application landscape of any organization, you will find a number of applications (legacy, custom-developed, packaged apps etc.). In many cases, business processes and business logic lie embedded in these applications. Data collection can use either a push or a pull model: applications pushing data, or the BAM framework pulling data from these sources. As it is difficult to collect/absorb data from such diverse sources, integration platforms and technologies play a vital role here. Due care should be taken to keep event collection/absorption as non-invasive as possible, so as not to affect the performance of the business applications.

The event processing layer has the job of analyzing and correlating these events based on rules and assumptions. Key Performance Indicators (KPIs) are nothing but event data filtered and correlated according to some rules. This layer consists of a rules engine, an analytics engine and a predictive engine (fingerprinting engine). The predictive engine uses the analyzed and correlated event data to make predictions.

Finally, the delivery layer delivers the results to the end users. There should be a notification/alert engine to notify users, if required, over different channels, and a portal (preferably web based) where users can look at the filtered, correlated and analyzed data.
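The three layers above can be sketched as a toy pipeline in Java (all names are invented, and a real BAM product would of course sit on event-processing infrastructure rather than in-memory lists): events are absorbed, the processing layer reduces them to a KPI via a simple rule, and the delivery layer raises an alert when a threshold is crossed.

```java
import java.util.*;
import java.util.function.Consumer;

// Toy sketch of the three BAM layers: absorb events, process them into
// a KPI with a simple rule, and deliver an alert over a channel.
public class BamPipelineSketch {

    static class BusinessEvent {
        String type; double amount;
        BusinessEvent(String t, double a){ type = t; amount = a; }
    }

    // 1. Event absorption layer: applications push events in.
    final List<BusinessEvent> collected = new ArrayList<>();
    void absorb(BusinessEvent e) { collected.add(e); }

    // 2. Event processing layer: filter + aggregate into a KPI.
    double kpiTotalSales() {
        return collected.stream()
                .filter(e -> e.type.equals("SALE"))
                .mapToDouble(e -> e.amount)
                .sum();
    }

    // 3. Delivery layer: notify when a rule fires.
    void deliver(double threshold, Consumer<String> channel) {
        double kpi = kpiTotalSales();
        if (kpi > threshold) channel.accept("ALERT: sales KPI " + kpi + " above " + threshold);
    }

    public static void main(String[] args) {
        BamPipelineSketch bam = new BamPipelineSketch();
        bam.absorb(new BusinessEvent("SALE", 700));
        bam.absorb(new BusinessEvent("REFUND", 100));
        bam.absorb(new BusinessEvent("SALE", 400));
        bam.deliver(1000, System.out::println);
    }
}
```

Swapping the `Consumer` lets the same rule feed an email alert, a dashboard, or a portal widget, which is the point of keeping delivery a separate layer.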

Friday, April 06, 2007

Who acquires what and why?

With so many acquisitions and mergers happening of late, it has become very difficult to keep pace with them. The new one on the block is "Software AG to acquire webMethods". There is a clear overlap in the product offerings of these companies. For example, Software AG has a service registry/repository product, CentraSite, that clearly overlaps with the webMethods registry and repository products X Registry and X Broker from its earlier Infravio acquisition. There is a similar overlap in the BPM space: webMethods has BPM tools as part of its integration platform (PRT, Modeller, Workflow etc.) and Software AG has a similar product, Crossvision. Which of these parallel products (or both) will survive the merger and be positioned to customers, only the future will tell. But this will definitely create confusion and apprehension among the existing customers of both vendors.

Now the news... webMethods sold itself for less than half of what it paid back in 1999 for Active Software ($550 million, compared to $1.3 billion for Active), the EAI company that was supposed to be its future. The reason: webMethods was sinking back into the red, with CEO David Mitchell blaming poor sales execution and gaps in its SOA offerings.

"Miko Matsumura" the face of "SOA" for webMethods (remember he is originally from Infravio) gave a very calculated answer to the future of the Registry Segment. In his opinion, JAXR ( compliance of these products would help customers to save their investments irrespective of the future road map.

Software AG relies on a heavily promoted strategic partnership with Fujitsu in the BPM segment for Crossvision, which now faces competition from webMethods. But according to Software AG CEO Karl-Heinz Streibich, "There is a significant sales pipeline with the Fujitsu product, and Fujitsu BPM will continue to be the company’s strategy in the short- and mid-term". But in the next breath he also added, "[Our own] IP always has a preference wherever they fit into same [product] segment."

So, it may not be "end of the road" for existing webMethods products.....

Thursday, April 05, 2007

XPDL, BPEL, JPDL, BPMN, BPDM et al.. Standards and More Standards

BPM (Business Process Management) is getting more and more visibility in the industry, in spite of the fact that some difficult questions about its justification remain unanswered, or at least divide opinion in the industry. One of them: should business logic be separated from the components/services or not? Separation arguably goes against the basic OO concept of encapsulation, i.e. the data and business logic of an object should remain inside the object, shielded from other objects. Let me leave it here and get to the real theme of this post..

There are innumerable standards, often from non-profit consortia or from groups of vendors that are always interested in getting their ideas standardized and widely accepted in the industry, sometimes driven by what their products support or plan to support. You can clearly see the divide between these vendors on some standards, with competing/overlapping standards popping up now and then to confuse users and keep the standards' proponents pulling each other's legs.

There are quite a few advantages to using standards, the primary ones being portability and interoperability. Standards help your code/artifact/entity to be ported from one tool/application/platform to another, and to interoperate with others without much difficulty.

Let us look at the standards in the BPM space. Any BPM tool/framework provides for

A. Process Definition/Modeling
There are two distinct areas as part of this
- Creating process diagrams (e.g. how to represent activities, joins, forks etc.). There can be standard diagramming elements which tools can support.
- Storing/serializing these process diagrams in a common format for other tools to understand.

B. Process Execution
- A standard interchange format for process semantics, which can be understood by the execution engines and is portable across all process engines.

Now comes the question: where do standards like BPMN, XPDL and BPEL stand today?

BPMN is a modeling notation — more than just a diagram, since each element has defined process semantics, abstracted from implementation details — but BPMN has no official XML schema, i.e. no interchange format. The BPMN proponents suggest that you can use any interchange format, like XMI, for portability.

XPDL captures all the elements of BPMN for interchange, but from a diagram portability perspective, not a process semantics portability perspective.

BPEL captures the process semantics and not the diagrams, but assumes the process is a sequence of calls to web services. Everything in BPEL is a web service operation, not "an activity", i.e. a unit of work.

JPDL also captures the process semantics, but it is very specific to jBPM (now with Red Hat). It assumes that every process consists of "states" and "actions", somewhat like a state machine.

On portability, I would completely agree with what Bruce Silver has to say in one of his blog posts.

"The argument over whether BPEL or XPDL is more "portable" is based on different interpretations of what portable means. If you mean the same process semantics can be executed on two different engines, then BPEL is more portable. If you mean that the same diagram can be created in two different tools, then XPDL — especially if you allow the target tool to ignore the graphical details that don't carry over"

Nowadays every product vendor claims to support BPMN, XPDL or BPEL, or all of them. But what does this support mean? Many of them have their own proprietary interchange formats, with the capability to export/import to/from the standard formats. Some tools can store these interchanges in any standard/non-standard format, but add a layer of translation before the process engine creates an instance of the process and executes it.

Is this what we are looking for in BPMS tools? IMHO, we should look for the ability to move the process semantics and the diagram repository from one tool/engine to another as-is. Also, we should look at whether the process engines can be hosted in any container (e.g. JEE server, ESB etc.).
To put it straight, we are looking for process tools and engines that are natively BPEL or XPDL based, not ones that support these standards by way of export/import or internal translation.

The question is.. while evaluating tools/engines, which of these standards should we look for? Given the current state of these standards, I would definitely say there is no concrete answer. The choice is yours.. but the decision should not be based on what the product vendors support, rather on the pros and cons of these standards and how you foresee their evolution.

Wednesday, March 28, 2007

SOA Approach .. Top Down or Bottom Up..

In his blog, Grady Booch (of UML and Rational fame) talks about how a bottom-up approach to SOA can be a disaster.

"Going back to the A part of SOA, the issue then is one of abstraction, separation of concerns, and all the usual fundamentals of architecture. I've seen some folks suggest creating an SOA from the bottom up: look at a silo, identify the potential services, and publish them, then weave a system together from them. This is in essence technology first. In my experience, this is a recipe for disaster and/or serious over-engineering. You've got to start with the scenarios/business needs, play those out against the existing/new systems, zero in on the points of tangency, and there plan a flag for harvesting a meaningful service. These styles, and their resulting costs/benefits, are rarely discussed. "

The following things should be kept in mind while deciding on an approach for SOA:

1. It is business first and technology afterwards.
2. Do not think of SOA as the panacea for all business and IT alignment issues.
3. Keep it simple... Do not over-engineer.
4. Understand your business and existing IT assets/systems before going for something like SOA.

Wednesday, March 14, 2007

Taxonomy and Ontology..

Recently I attended a session on "Architectural Knowledge Management" at a TOGAF conference, where I learnt about KM concepts like topic maps, concept maps, ontology et al. Of course I was aware of mind mapping as a technique... but I was quite impressed by the amount of research going on in this space...

And here it is again... while doing some research on service registries/repositories, I bumped into ontology once more...

Do you want to store the relationships between your services and their metadata? To store service properties such as service qualities (say, minimum and maximum service delivery times) or payment obligations (say, advance credit card payment), rather than only classifying services into categories and sub-categories?

Of course, that would be of great help for the consumers in finding services based on what they exactly want. They would also know the service dependencies (how the services are related).

If the service registry is expected to be the management backbone of an enterprise SOA, then it should support these features. Do any of the service registries in the market, or in the open source space, support this? Maybe... I could only find one of them clearly talking about it.
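A toy sketch of the difference (all names invented): a registry entry that carries typed properties (service qualities) and explicit dependency relationships, so consumers can query on what they actually want rather than browse categories.

```java
import java.util.*;

// Sketch of a registry that goes beyond category/sub-category: each entry
// carries typed properties (service qualities) and explicit relationships
// (dependencies), and both are queryable.
public class RichRegistrySketch {

    static class ServiceEntry {
        final String name;
        final Map<String, String> properties = new HashMap<>(); // service qualities
        final Set<String> dependsOn = new HashSet<>();          // relationships
        ServiceEntry(String name) { this.name = name; }
    }

    final Map<String, ServiceEntry> entries = new HashMap<>();

    ServiceEntry publish(String name) {
        return entries.computeIfAbsent(name, ServiceEntry::new);
    }

    // Find services whose declared property matches what the consumer wants.
    List<String> findByProperty(String key, String value) {
        List<String> hits = new ArrayList<>();
        for (ServiceEntry e : entries.values())
            if (value.equals(e.properties.get(key))) hits.add(e.name);
        return hits;
    }

    public static void main(String[] args) {
        RichRegistrySketch reg = new RichRegistrySketch();
        ServiceEntry order = reg.publish("OrderService");
        order.properties.put("payment", "advance-card");
        order.dependsOn.add("CustomerService");
        System.out.println(reg.findByProperty("payment", "advance-card")); // [OrderService]
    }
}
```

An ontology-backed registry would generalize this further, with typed relationships and inference rather than flat maps, but the consumer-side win is the same: discovery by quality and dependency, not just by category.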

Tuesday, January 09, 2007

SOA and EDA, Do they complement?

There has been plenty of debate on "SOA vs EDA". Vendors give the flavour of the difference that suits their product :) ... everyone with the intention of somehow cashing in on these industry jargons...

Of course both of them are architectural styles and each can live with or without the other... But what needs to be understood is the underlying concept of these styles.

In SOA, the core building blocks are services.. some provide services for others to consume..
Whereas in EDA, the core building blocks are events/business events (internal/external) and how the business treats them...

So, if you have services that receive and treat business events... what do you have.. a SOA or an EDA or both??