A few days back Todd Biske wrote this nice blog post on Composite Applications, Mashups, Widgets and Gadgets. I think the industry is now somewhat in agreement that composite apps and mashups mean the same thing, though "mashup" is used more in the context of web-based presentation of composite applications. A lot is being said about Enterprise Mashups at present. I think the simplicity of the technology (it's all about JavaScript, DHTML and XML) will slowly eat away at the Portal technology market.
There seems to be a growing demand in organizations for creating a "Unified Desktop" for employees. The requirement is to create a single user interface to access the basic features of all kinds of applications in the enterprise. Some time back I was dealing with such a requirement from a leading online trading firm. The requirement was to build a "Unified Agent Desktop" for the customer care representatives, which would have case management workflows along with access to the multiple applications they use to manage the cases.
Dashboard Gadgets and Widgets should hopefully make this simpler in the future. Application vendors should provide these application widgets/gadgets to their customers along with the applications, or, given their simplicity, organizations can develop them in-house.
Friday, April 27, 2007
Wednesday, April 25, 2007
Mule jBPM Connector
Mule 1.4 comes with a BPM connector. This connector can integrate with BPM engines that provide a Java API. If there is a requirement to integrate with a BPEL engine, standard web services over the SOAP transport (Axis/XFire) can be used.
jBPM is the first BPM engine that comes out of the box with Mule 1.4.
It looks pretty cool. You can easily write your long-running integration processes in JPDL.
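As a rough illustration, a minimal JPDL process definition for jBPM 3 (the generation that ships with Mule 1.4) might look like the following sketch; the process, node and handler names here are hypothetical, not from any real project:

```xml
<process-definition xmlns="urn:jbpm.org:jpdl-3.1" name="orderProcess">

  <start-state name="start">
    <transition to="awaitPayment"/>
  </start-state>

  <!-- A wait state: the process instance persists here until an
       external signal (e.g. a Mule message) moves it forward -->
  <state name="awaitPayment">
    <transition name="paid" to="shipOrder"/>
  </state>

  <!-- A node that fires a custom (hypothetical) action handler -->
  <node name="shipOrder">
    <action class="com.example.ShipOrderAction"/>
    <transition to="end"/>
  </node>

  <end-state name="end"/>

</process-definition>
```

The wait states are what make JPDL attractive for long-running integration: the engine persists the process instance at each state and resumes it when the next event arrives.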
I hope to see integration support for JBossRules (Drools) engine in the future.
Tuesday, April 24, 2007
Are you looking at implementing SOA as part of your project?
Recently, while going through a set of questions asked by my fellow colleagues to a customer regarding an RFP, I saw this question. It was an RFP for a packaged software implementation, and the question was asked in the context of integration requirements with the other applications in the enterprise.
I bring this up to point out how product vendors and system integrators confuse customers with questions like this. Understanding of the concept is of paramount importance for its adoption. Dave Linthicum has clearly emphasized this in his blog, where he talks about the need for the vendors (I would add the system integrators) to go to SOA School.
http://weblog.infoworld.com/realworldsoa/archives/2007/03/soa_vendors_nee.html
There are some other industry leaders who have rightly pointed out how SOA has become a "goal" rather than a "means" to achieve business agility. There are certain organizations that talk about the ROI of their SOA initiative, etc. IMHO, rather than starting SOA initiatives, which have become a new name for implementing web services and a service registry (or maybe an ESB), organizations should try to inculcate the style throughout the organization. Microsoft Enterprise Architect Nick Malik has written a very good post (http://blogs.msdn.com/nickmalik/archive/2007/01/16/your-soa-is-jabows-just-a-bunch-of-web-services-and-i-can-prove-it.aspx) in which he points out where the maximum benefits of SOA lie in an enterprise.
A few lines from his post..
"IT projects provide tools for business processes. They automate parts of a business process or collect information or reduce errors. The point is… which processes? In the past, traditional IT only succeeded with the processes that changed rarely"
"The problem is that there is a long list of business processes that occur frequently but that are more difficult to automate because they change frequently"
"That is the SOA sweet spot"
In most cases we do not try to understand what the customer requires and what his pain points are before suggesting a solution. Therefore, "SOA" has become a "magic wand" for the sales guys and consultants.
Tuesday, April 10, 2007
Business Activity Monitoring (BAM): Key Components
With the amount of industry buzz around SOA and technologies like MDM, BPM and BAM, it would be difficult to find an IT leader who does not want the benefits of BAM in his organization. There are quite a few eye-catching products in this market segment. I will discuss the key components of a BAM tool/framework in this post.
The key for any enterprise is to understand its business events (external and internal) and treat them accordingly. Certain business events help you know the health of the business. Certain business events, taken together, can tell you about performance, trends, etc. This is nothing but business intelligence; in fact, real-time business intelligence.
Ideally, a BAM tool would have three components:
1. Event Absorption Layer also known as the Data Collector Layer
2. Event Processing Layer
3. Delivery Layer
The event absorption layer is the one that collects business events from across the enterprise. If you look at the application landscape of any organization, you will find a number of applications (legacy, custom-developed, packaged apps, etc.). In many cases, business processes and business logic lie embedded in these applications. Data collection can use either a push or a pull model: applications pushing data, or the BAM framework pulling data from these sources. As it is difficult to collect/absorb data from such diverse sources, integration platforms and technologies play a vital role here. Due care should be taken so that event collection/absorption is as non-invasive as possible, so as not to affect the performance of these business applications.
The event processing layer has the job of analyzing and correlating these events based on rules and assumptions. Key Performance Indicators (KPIs) are nothing but event data filtered and correlated according to rules. This layer consists of a rules engine, an analytics engine and a predictive engine (fingerprinting engine). The predictive engine uses the analyzed and correlated event data to predict future behavior.
Finally, the delivery layer delivers the results to the end users. There should be a notification/alert engine to notify users, if required, over different channels. There should be a portal (preferably web-based) for users to view the filtered, correlated and analyzed data.
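To make the three layers concrete, here is a minimal sketch in Java. All class names, event types and thresholds are my own illustrative inventions; a real BAM product would of course back this with a rules engine, an analytics store and proper delivery channels:

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative three-layer BAM pipeline: absorb -> process -> deliver.
public class MiniBam {

    // A business event absorbed from some application (via push or pull).
    record Event(String type, double amount) {}

    // Event processing layer: one hard-coded "rule" computes a KPI,
    // the fraction of failed orders among all order events.
    static double failedOrderRate(List<Event> events) {
        long orders = events.stream()
                .filter(e -> e.type().startsWith("ORDER")).count();
        long failed = events.stream()
                .filter(e -> e.type().equals("ORDER_FAILED")).count();
        return orders == 0 ? 0.0 : (double) failed / orders;
    }

    // Delivery layer: raise an alert when the KPI breaches a threshold.
    static String deliver(double kpi, double threshold) {
        return kpi > threshold
                ? "ALERT: failed-order rate " + kpi
                : "OK: failed-order rate " + kpi;
    }

    public static void main(String[] args) {
        // Absorption layer stand-in: events pushed into an in-memory list.
        List<Event> events = new ArrayList<>();
        events.add(new Event("ORDER_PLACED", 120.0));
        events.add(new Event("ORDER_PLACED", 80.0));
        events.add(new Event("ORDER_FAILED", 200.0));
        events.add(new Event("PAYMENT_RECEIVED", 120.0));

        double kpi = failedOrderRate(events);
        System.out.println(deliver(kpi, 0.25));
    }
}
```

The point of the sketch is only the separation of concerns: collectors feed raw events in, rules turn them into KPIs, and the delivery layer decides who hears about them and how.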
Friday, April 06, 2007
Who acquires what and why?
With so many acquisitions and mergers happening of late, it has become very difficult to keep pace with them. The newest on the block is "Software AG to acquire webMethods". There is a clear overlap in the product offerings of these companies. For example, Software AG's service registry/repository product CentraSite clearly overlaps with webMethods' registry and repository products, X Registry and X Broker, from its earlier Infravio acquisition. There is a similar overlap in the BPM space. webMethods has BPM tools as part of its integration platform (PRT, Modeller, Workflow, etc.) and Software AG has a similar product, Crossvision. Which of these parallel products (or both) will survive the merger and be positioned to customers, only the future will tell. But this will definitely create confusion and apprehension among the existing customers of both vendors.
Now the news: webMethods sold itself for less than half of what it paid back in 1999 for Active Software, the EAI company that was supposed to be its future ($550 million, compared to $1.3 billion for Active). The reasons? webMethods was sinking back into the red, with CEO David Mitchell blaming it on poor sales execution and gaps in its SOA offerings.
Miko Matsumura, the face of SOA for webMethods (remember, he is originally from Infravio), gave a very calculated answer about the future of the registry segment. In his opinion, JAXR (http://java.sun.com/webservices/jaxr/index.jsp) compliance of these products will help customers protect their investments irrespective of the future road map.
Software AG relies on a heavily promoted strategic partnership with Fujitsu in the BPM segment for Crossvision, which now faces competition from webMethods. According to Software AG CEO Karl-Heinz, "There is a significant sales pipeline with the Fujitsu product, and Fujitsu BPM will continue to be the company's strategy in the short- and mid-term". But in the next breath he also added, "[Our own] IP always has a preference wherever they fit into same [product] segment."
So, it may not be "the end of the road" for existing webMethods products.
Thursday, April 05, 2007
XPDL, BPEL, JPDL, BPMN, BPDM et al.: Standards and More Standards
BPM (Business Process Management) is getting more and more visibility in the industry, in spite of the fact that some difficult questions about its justification remain unanswered, or at least have divided opinion in the industry. One of them is: should business logic be separated from the components/services or not? Separating it goes against the basic OO concept of "encapsulation", i.e. the data and business logic of an object should remain inside the object, shielded from other objects. Let me leave it there and get to the real theme of this post.
There are innumerable standards, often from non-profit consortiums or from groups of vendors that are always interested in getting their ideas standardized and widely accepted by the industry, sometimes driven by what their products support or plan to support. You can clearly see the divide between these vendors on some standards, with competing/overlapping standards popping up now and then to confuse users and keep the proponents of these standards pulling each other's legs.
There are quite a few advantages to using standards, the primary ones being portability and interoperability. Standards help your code/artifact/entity to be ported from one tool/application/platform to another, and also to interoperate without much difficulty.
Let us look at the standards in the BPM space. Any BPM tool/framework provides for:
A. Process Definition/Modeling
There are two distinct areas here:
- Creating process diagrams (e.g. how to represent activities, joins, forks, etc.). There can be standard diagramming elements which tools support.
- Storing/serializing these process diagrams in a common format for other tools to understand.
B. Process Execution
- A standard interchange format for process semantics, one that execution engines can understand and that is portable across all process engines.
Now comes the question: where do standards like BPMN, XPDL and BPEL et al stand today?
BPMN is a modeling notation. It is more than just a diagram, since each element has defined process semantics abstracted from implementation details, but BPMN has no official XML schema, i.e. no interchange format. The BPMN proponents suggest that you can use any interchange format, like XMI, for portability.
XPDL captures all the elements of BPMN for interchange, but from a diagram-portability perspective, not a process-semantics-portability perspective.
BPEL captures the process semantics but not the diagrams, and assumes a process is a sequence of calls to web services. Everything in BPEL is a web service operation, not "an activity", i.e. a unit of work.
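To make that contrast concrete, here is a skeletal BPEL process (hand-written for illustration; the names are hypothetical, and the partner link and variable declarations that a complete process requires are omitted). Notice that every step is a web service operation on a partner link rather than a generic activity:

```xml
<process name="OrderProcess"
         targetNamespace="http://example.com/order"
         xmlns="http://docs.oasis-open.org/wsbpel/2.0/process/executable">

  <sequence>
    <!-- Receive the inbound request (itself a web service operation) -->
    <receive partnerLink="client" operation="placeOrder"
             variable="orderRequest" createInstance="yes"/>

    <!-- Call another web service to check stock -->
    <invoke partnerLink="inventory" operation="checkStock"
            inputVariable="orderRequest" outputVariable="stockStatus"/>

    <!-- Reply to the original caller -->
    <reply partnerLink="client" operation="placeOrder"
           variable="orderReply"/>
  </sequence>

</process>
```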
JPDL also captures process semantics, but it is very specific to jBPM (now with RedHat). It assumes that every process consists of "states" and "actions", somewhat like a state machine.
On portability, I completely agree with what Bruce Silver has to say in one of his blog posts:
"The argument over whether BPEL or XPDL is more "portable" is based on different interpretations of what portable means. If you mean the same process semantics can be executed on two different engines, then BPEL is more portable. If you mean that the same diagram can be created in two different tools, then XPDL — especially if you allow the target tool to ignore the graphical details that don't carry over"
Nowadays every product vendor claims to support BPMN, XPDL or BPEL, or all of them. But what does this support mean? Many of them have their own proprietary interchange formats, with the capability to export/import to/from the standard formats. Some tools can store these interchanges in any standard or non-standard format, but add a layer of translation before the process engine creates an instance of the process and executes it.
Is this what we are looking for in BPMS tools? IMHO, we should be able to move the process semantics and the diagram repository from one tool/engine to another as-is. We should also look at whether the process engines can be hosted in any container (e.g. JEE server, ESB, etc.).
To put it straight, we are looking for process tools and engines that are natively BPEL- or XPDL-based, not ones that merely support these standards by way of export/import or internal translation.
The question is: while evaluating tools/engines, which of these standards should we look for? Given the current state of these standards, I would say there is no concrete answer. The choice is yours. But the decision should not be based on what the product vendors support; it should rest on the pros and cons of these standards and how you foresee their evolution.