
Tuesday, December 21, 2010

Processing XML input payload without namespace

Sometimes BPEL receives an XML payload that has no target namespace defined and must transform it into a different format. Assume the XML is generated by a third-party application outside our control. In this scenario we cannot change the source XML, so we have to change the BPEL code to work with a payload that has no namespace associated with it. By default, BPEL works with payloads that have a namespace.

Brief steps to achieve this
  • Remove the targetNamespace attribute from the XSD that is modelled to validate the incoming XML
  • Keep the default namespace attribute in the XSD
  • Remove any namespace prefixes pointing to the XSD default namespace from the XSL file
Example:

Assume that BPEL receives the below XML as input:
<invoices>
  <invoice>
    <invoiceNumber>1234</invoiceNumber>
    <partNumber>AELWF</partNumber>
  </invoice>
</invoices>

 
By default BPEL generates the below XSD to represent the above XML:

<?xml version="1.0" encoding="windows-1252" ?>
<xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema"
xmlns="http://www.example.org"
targetNamespace="http://www.example.org"
elementFormDefault="qualified">

We need to remove the targetNamespace attribute from the above XSD:

<?xml version="1.0" encoding="windows-1252" ?>
<xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema"
xmlns="http://www.example.org"
elementFormDefault="qualified">

Remove any namespace prefixes in the XSL files.
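
For illustration, a minimal XSLT sketch (the target element names are assumptions for this example, not from the original post) showing that once the namespace is gone, the source elements are matched and selected without any prefix:

<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical sketch: the source elements carry no namespace, so no prefixes are used in match/select -->
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:template match="/invoices">
    <InvoiceList>
      <xsl:for-each select="invoice">
        <Invoice>
          <Number><xsl:value-of select="invoiceNumber"/></Number>
          <Part><xsl:value-of select="partNumber"/></Part>
        </Invoice>
      </xsl:for-each>
    </InvoiceList>
  </xsl:template>
</xsl:stylesheet>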

Now we can deploy the BPEL process.

There might be many other ways to achieve this, but I feel this approach is simple.

Sunday, December 5, 2010

ORABPEL-10101-Error while updating the task.ORA-12899: value too large for column "ORABPEL"."WFTASK"."PUSHBACKSEQUENCE" (actual: 201, maximum: 200)

This error relates to a human task being updated by the task service. Although it can occur in many ways, I will explain one use case where I received this error.

Pushback sequence: this is an internal workflow mechanism that keeps track of the number of times a task has been updated. It is also used when pushing an assigned task back to the previous approvers or original assignees it came from. It is defined in

Schema-ORABPEL
Table-WFTASK
Column-PUSHBACKSEQUENCE

Our environment description

We have a polling BPEL process that polls a staging table and closes certain human tasks when specific criteria are met.

We set the polling frequency of this BPEL process to 2 minutes. Because of this short polling interval the process executed very frequently, which led to new instances being created before old ones completed, all of them trying to do the same work, i.e. closing human tasks when the criteria were met. At a certain point in time there were a few hundred instances running, all trying to access and close the same task. In our environment this was essentially a timing issue.
Ultimately this led to database-level locks and violation of database constraints, as follows.

Whenever a task is modified, an extra sequence number is appended to PUSHBACKSEQUENCE.

Let's say that when the task is created we initially have the sequence 1-3.
Then, when any operation is performed on it, it becomes 1-3-5-7; for each operation another number is appended to the sequence.

After a number of such updates the value reached a length of 201 characters (1-3-5-7........127), while the database column is defined to hold only 200 characters, hence the error.
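
A hedged SQL sketch (run against the ORABPEL schema; the 180-character threshold is an arbitrary example) to spot tasks whose pushback sequence is approaching the 200-character column limit:

select taskid, length(pushbacksequence) as seq_length
from   wftask
where  length(pushbacksequence) > 180   -- arbitrary warning threshold
order by seq_length desc;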

The exact error we saw is

10/12/02 11:06:19 java.sql.SQLException: ORA-20002: Error while updating the task


ORA-06512: at "ORABPEL.WFTASKPKG", line 1034

ORA-12899: value too large for column "ORABPEL"."WFTASK"."PUSHBACKSEQUENCE" (actual: 201, maximum: 200)

ORA-06512: at line 1

Tuesday, August 24, 2010

Using Email to initiate a BPEL Process

The notification service in Oracle BPEL Process Manager allows you to send a notification by email (as well as voice message, fax, pager, or SMS) from a BPEL process.

However, another requirement is to be able to use the receipt of an email to initiate a BPEL process. This is the subject of this post; many thanks to Muruga Chinnananchi, on whose original example this is based. Essentially we want to create a simple process, EMailActivation, which receives an email sent to a particular email address. To achieve this there are two basic steps:
Configure the BPEL Server to be able to connect to the mail account.
Define the BPEL Process to be initiated on receipt of an email.
Configure Email Account:
To configure the email account which the BPEL Server should connect to, we need to place a MailAccount xml configuration file (in our example BpelMailAccount.xml) into the following directory:

    SOA_DOMAIN\bpel\domains\default\metadata\MailService
Note: You will need to create the metadata and MailService directories.
The file itself can have any name (though it must end with .xml), as you can define multiple accounts. However, make a note of the name as you will need it to link your BPEL process to the actual mail account.

Here’s one sample file:
<mailAccount xmlns="http://services.oracle.com/bpel/mail/account">
  <userInfo>
    <displayName>any a/c display name</displayName>
    <organization>My Org</organization>
    <replyTo>replyToAddress@localhost</replyTo>
  </userInfo>
  <outgoingServer>
    <protocol>smtp</protocol>
    <host>localhost</host>
    <authenticationRequired>false</authenticationRequired>
  </outgoingServer>
  <incomingServer>
    <protocol>pop3</protocol>
    <host>localhost</host>
    <email>bpel</email>
    <password>CRYPT{xdfg+Gs=}</password>
  </incomingServer>
</mailAccount>


The outgoing SMTP service doesn’t need to be configured (as we use the notification service to send outgoing emails). However the incoming account is defined by the following tags:
    <incomingServer>
        <protocol>[protocol pop3 or imap]</protocol>
        <host>[imap or pop3 server]</host>
        <email>[imap or pop3 account]</email>
        <password>[imap or pop3 password]</password>
        <folderName>[imap only, inbox folder ]</folderName>
    </incomingServer>
Note: When defining the account name, be careful to use the actual account name not the email address as they are not always the same.


Creating the BPEL Process:

The first step is to use JDeveloper to create an Asynchronous process initiated by a request message with a payload containing an element of type mailMessage (defined in Mail.xsd installed as part of BPEL PM).
To do this use the BPEL Project Creation wizard to create a BPEL Process in the normal way. After entering the process name and specifying the process template to be asynchronous, select "Next".
This will take you to the next step in the wizard where you specify the Input and Output Schema Elements. Click the flashlight icon for the input schema and select Mail.xsd (located in <SOA_HOME>\bpel\system\xmllib).

Specifying the input/output schema elements opens the type chooser window, where you select the element to use from the imported schema. Select the mailMessage element.

Once the process has been created you can remove the callBackClient activity as we won’t need this.
Import Common Schema:

If you now try to compile your process, you will find it fails with an error message. This is because Mail.xsd itself imports a schema (common.xsd), so you need to import this schema as well.

To import the Mail Schema into your BPEL Process, ensure the diagram view for your BPEL process is open and selected in JDeveloper. Then within the BPEL Structure window, right click on the Project Schemas node and select "Import Schemas"

Note: Once imported, manually update the WSDL file to ensure the import statements for both the Mail.xsd and common.xsd are contained within the same element or it will still fail to compile. See previous blog - Using Nested Schemas with BPEL for details.

Define Mail Activation Agent

The process itself is now ready for deployment. However we need to complete one final activity, which is to tie the BPEL Process to a mail activation agent for the Email account that we defined earlier.

The Activation Agent will poll the defined mail box for emails and then for each email it receives invoke an instance of the process to handle it.

To do this you need to add the following definition to the bpel.xml file, after the <partnerLinkBindings> element:

<activationAgents>
  <activationAgent className="com.collaxa.cube.activation.mail.MailActivationAgent"
                   heartBeatInterval="60">
    <property name="accountName">BpelMailAccount</property>
  </activationAgent>
</activationAgents>

 
Where heartBeatInterval is how often we want to poll the email account for new emails, and the accountName corresponds to the name of the account configuration file we defined earlier.
Finally deploy the process and send an email to the appropriate account.
Note: If you modify the BPEL process in JDeveloper, the bpel.xml file may lose its changes (i.e. the activationAgent definition), and as a result the process will never get initiated - so always check the bpel.xml file is correctly defined just before deploying the process.

Email Server:

Install and configure Email server according to product instructions

Passing BPEL Variable contents into XSLT as Parameters

XSLT is executed using the XPath extension function ora:processXSLT. The two well-known arguments for this extension function are as follows.

1) The XSL File Name
2) The source variable to be transformed [bpws:getVariableData(...)]

But there is one more argument that this XPath function can accept - 'properties'.
Note the signature of this function specified in xpath-functions.xml
Signature: ora:processXSLT('xsl template','input variable','properties'?).
These properties translate to XSL Parameters that can be accessed within the XSL map using the construct

<xsl:param name="paramName"/>

You can retrieve the value of this parameter within your XSLT in a way similar to the way used to extract data from XSL variables.

For example: <xsl:value-of select="$paramName"/>

The "properties" argument of the XPath function is expected to be an XML Element that has the following structure.
Illustrated below is an example of such a properties XML.

<parameters xmlns:ns2="http://schemas.oracle.com/service/bpel/common" xmlns="http://schemas.oracle.com/service/bpel/common">
  <ns2:item>
    <ns2:name>userName</ns2:name>
    <ns2:value>ramkmeno</ns2:value>
  </ns2:item>
  <ns2:item>
    <ns2:name>location</ns2:name>
    <ns2:value>CA</ns2:value>
  </ns2:item>
</parameters>

XSLTParameters.xsd

<?xml version="1.0" encoding="windows-1252" ?>
<xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema"
            xmlns="http://schemas.oracle.com/service/bpel/common"
            targetNamespace="http://schemas.oracle.com/service/bpel/common"
            elementFormDefault="qualified">
  <xsd:element name="parameters">
    <xsd:annotation>
      <xsd:documentation>A sample element</xsd:documentation>
    </xsd:annotation>
    <xsd:complexType>
      <xsd:sequence>
        <xsd:element name="item" maxOccurs="unbounded">
          <xsd:complexType>
            <xsd:sequence>
              <xsd:element name="name" type="xsd:string"/>
              <xsd:element name="value" type="xsd:string"/>
            </xsd:sequence>
          </xsd:complexType>
        </xsd:element>
      </xsd:sequence>
    </xsd:complexType>
  </xsd:element>
</xsd:schema>
Within the XSLT, the parameters are accessible through their names (in this case the parameter names are "userName" and "location", and their values are "ramkmeno" and "CA" respectively).

Approach in a nutshell

1) Add XSLTParameters.xsd to the BPEL project
2) Declare a variable (xslt_Variable) of the above-mentioned data type, parameters (see the XML above)
3) Populate the variable with the contents of the BPEL variable you wish to pass into the XSLT
4) Invoke processXSLT() with the XSL file, the source variable, and the parameters variable
5) Access the parameter contents within the XSLT

Example BPEL Snippet

<!--Step 1: initialize the parameters variable from whatever BPEL variable whose information you need to access from within XSLT -->
<assign name="Assign_PassProcessIterator">
  <copy>
    <from expression='"userName"'/> <!-- Name of the xslt parameter -->
    <to variable="xslt_Variable"
        query="/ns7:parameters/ns7:item/ns7:name"/>
  </copy>
  <copy>
    <from variable="Praveen"/> <!-- Value of the xslt parameter -->
    <to variable="xslt_Variable"
        query="/ns7:parameters/ns7:item/ns7:value"/>
  </copy>
</assign>
<!--Step 2: Invoke the XSLT with the parameters as the third argument -->
<assign name="executeXSLT">
<bpelx:annotation>
<bpelx:pattern>transformation</bpelx:pattern>
</bpelx:annotation>
<copy>
<from expression="ora:processXSLT('TestXSLParams.xsl',
bpws:getVariableData('inputVariable','payload'),
bpws:getVariableData('xslt_Variable'))"/>
<to variable="outputVariable" part="payload"/>
</copy>
</assign>

XSLT Snippet

<xsl:stylesheet version="1.0" ....>
<xsl:param name="userName"/>
<xsl:param name="location"/>
<xsl:template match="/">
  <ns1:TestXSLParamsProcessResponse>
    <ns1:result>
      <xsl:value-of select="concat('User : ', $userName, ' Location : ', $location)"/>
    </ns1:result>
  </ns1:TestXSLParamsProcessResponse>
</xsl:template>
</xsl:stylesheet>

Tuesday, August 3, 2010

Automatic recovery program for pending BPEL call back messages

The BPEL engine maintains all async callback messages in a database table called dlv_message. You can see all such messages in the BPEL console callback manual recovery area. The query used by the BPEL console joins the dlv_message and work_item tables; it simply picks up all callback messages which are undelivered and have not been modified within a certain threshold time.

Callback messages are processed in the following steps
  • The BPEL engine assigns the callback message to the delivery service
  • The delivery service saves the message into the dlv_message table with state 0 (UNDELIVERED)
  • The delivery service schedules a dispatcher thread to process the message asynchronously
  • The dispatcher thread enqueues the message into a JMS queue
  • The message is picked up by an MDB
  • The MDB delivers the message to the actual BPEL process waiting for the callback and changes the state to 2 (HANDLED)
Given the above steps, there is always a possibility that a message is present in the dlv_message table but the MDB failed to deliver it to the BPEL process, which leaves the message stuck in state 0.
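
Before (or alongside) running the recovery program, a hedged SQL sketch like the one below can show which callback messages are stuck in state 0 (column names are assumed from the ORABPEL schema described elsewhere on this blog; the one-hour threshold is arbitrary):

select message_guid, dlv_type, receive_date
from   dlv_message
where  state = 0
and    receive_date < sysdate - 1/24;   -- older than one hour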

The following program can be tailored to your own requirements to recover such state-0 messages.

Note: this program contains logic to recover both invocation and callback messages. Please comment out the part you do not need.

package bpelrecovery;

import com.oracle.bpel.client.*;
import com.oracle.bpel.client.util.SQLDefs;
import com.oracle.bpel.client.util.WhereCondition;
import java.util.ArrayList;
import java.util.Hashtable;
import java.util.List;
import javax.naming.Context;

public class bpelrecovery {

    public bpelrecovery() {
    }

    public static void main(String[] args) {
        bpelrecovery recover = new bpelrecovery();
        String rtc = "";
        try {
            rtc = recover.doRecover();
        }
        catch (Exception e) {
            e.printStackTrace();
        }
    }

    private void recoverCallbackMessages(List messages) throws Exception {
        String messageGuids[] = new String[messages.size()];
        for (int i = 0; i < messages.size(); i++) {
            ICallbackMetaData callbackMetadata = (ICallbackMetaData) messages.get(i);
            String messageGuid = callbackMetadata.getMessageGUID();
            messageGuids[i] = messageGuid;
            System.err.println((new StringBuilder()).append("recovering callback message = ")
                .append(messageGuids[i]).append(" process [")
                .append(callbackMetadata.getProcessId()).append("(")
                .append(callbackMetadata.getRevisionTag()).append(")] domain [")
                .append(callbackMetadata.getDomainId()).append("]").toString());
        }
        Locator locator = getLocator();
        IBPELDomainHandle domainHandle = locator.lookupDomain();
        domainHandle.recoverCallbackMessages(messageGuids);
    }

    public String doRecover() throws Exception {
        // Connect to domain "default"
        try {
            System.out.println("doRecover() instantiating locator...");
            Locator locator = getLocator();
            System.out.println("doRecover() instantiated locator for domain " +
                locator.lookupDomain().getDomainId());

            // look for Invoke messages in need of recovery
            StringBuffer buf1 = new StringBuffer();
            WhereCondition where = new WhereCondition(buf1.append(SQLDefs.IM_state)
                .append(" = ").append(IDeliveryConstants.STATE_UNRESOLVED).toString());
            System.out.println("doRecover() instantiating IInvokeMetaData... with where = " + where.getClause());
            IInvokeMetaData imd1[] = locator.listInvokeMessages(where);
            System.out.println("doRecover() instantiated IInvokeMetaData");

            // iterate thru the list
            List l1 = new ArrayList();
            for (Object o : imd1) {
                l1.add(o);
            }
            // See how many INVOKES are in the recovery zone
            System.out.println("doRecover() instantiated IInvokeMetaData size = " + l1.size());

            // look for Callback messages in need of recovery
            StringBuffer buf = new StringBuffer();
            where = new WhereCondition(buf.append(SQLDefs.DM_state)
                .append(" = ").append(IDeliveryConstants.TYPE_callback_soap).toString());
            System.out.println("doRecover() instantiating ICallbackMetaData... with where = " + where.getClause());
            ICallbackMetaData imd[] = locator.listCallbackMessages(where);
            System.out.println("doRecover() instantiated ICallbackMetaData");

            // recover
            List l = new ArrayList();
            for (Object o : imd) {
                l.add(o);
            }
            recoverCallbackMessages(l);
        }
        catch (Exception e) {
            e.printStackTrace();
        }
        return "done";
    }

    public Locator getLocator() {
        System.out.println("getLocator() start");
        Locator locator = null;
        // set JNDI properties for BPEL lookup
        String jndiProviderUrl = "opmn:ormi://localhost:6003:oc4j_soa/orabpel";
        String jndiFactory = "com.evermind.server.rmi.RMIInitialContextFactory";
        String jndiUsername = "oc4jadmin";
        String jndiPassword = "welcome1";
        Hashtable jndi = new Hashtable();
        jndi.put(Context.PROVIDER_URL, jndiProviderUrl);
        jndi.put(Context.INITIAL_CONTEXT_FACTORY, jndiFactory);
        jndi.put(Context.SECURITY_PRINCIPAL, jndiUsername);
        jndi.put(Context.SECURITY_CREDENTIALS, jndiPassword);
        jndi.put("dedicated.connection", "true");
        try {
            System.out.println("getLocator() instantiating locator...");
            locator = new Locator("default", "welcome1", jndi);
            System.out.println("getLocator() instantiated locator");
        }
        catch (Exception e) {
            System.out.println("getLocator() error");
            e.printStackTrace();
        }
        return locator;
    }
}

Saturday, July 24, 2010

BPEL dehydration data store tables and their significance.

The dehydration store is the database where the BPEL engine stores all BPEL process metadata and runtime instance data. This data store is installed under the DB schema ORABPEL.
Metadata includes the BPEL process descriptor (bpel.xml), human task modelling data, etc.
Runtime instance data includes process instance records, process activity execution data, invoke and callback XML messages, etc.

Table name - Description

CUBE_INSTANCE - Contains one entry for each BPEL instance created. It stores instance metadata such as creation date, last modified date, current state, process id, etc.
The process state codes and their meanings are:
  0 - Initiated
  1 - Open and Running
  2 - Open and Suspended
  3 - Open and Faulted
  4 - Closed and (Pending or Cancel)
  5 - Closed and Completed
  6 - Closed and Faulted
  7 - Closed and Cancelled
  8 - Closed and Aborted
  9 - Closed and Stale

CUBE_SCOPE - Stores the scope data for an instance, i.e. the BPEL scope variable values.

INVOKE_MESSAGE - Stores incoming (invocation) messages (messages that result in the creation of an instance). This table only stores the metadata for a message (for example, current state, process identifier, and receive date). The message states and their meanings are:
  0 - UNRESOLVED - Message has not yet been given to BPEL PM
  1 - RESOLVED - Message has been given to BPEL PM but not yet processed
  2 - HANDLED - Message has been processed
  3 - CANCELLED - Message processing was cancelled

DLV_MESSAGE - Callback messages are stored here.

WORK_ITEM - Stores activities created by an instance. All activities in a BPEL flow have a work_item row. This table includes the metadata for the activity (current state, label, and expiration date, used by wait activities).

SCOPE_ACTIVATION - Scopes that need to be routed/closed/compensated are inserted into this table. In case of system failure, we can pick up and re-perform any scopes that should have been done before the failure.

DLV_SUBSCRIPTION - Stores delivery subscriptions for an instance. Whenever an instance expects a message from a partner (for example, the receive or onMessage activity), a subscription is written out for that specific receive activity. Once a delivery message is received, the delivery layer attempts to correlate the message with the intended subscription.

AUDIT_TRAIL - Stores a record of actions taken on an instance. As an instance is processed, each activity writes events to the audit trail as XML.

AUDIT_DETAILS - Stores details for audit trail events that are large in size. Audit details are separated from the audit_trail table due to their large size. The auditDetailThreshold property in Oracle BPEL Control under Manage BPEL Domain > Configuration is used by this table: if the size of a detail is larger than the value specified for this property, it is placed in this table; otherwise, it is placed in the audit_trail table.

XML_DOCUMENT - Stores process input and output XML documents. Separating the document storage from the metadata enables the metadata to change frequently without being impacted by the size of the documents.

WI_EXCEPTION - Stores exception messages generated by failed attempts to perform, manage or complete a work item. Each failed attempt is logged as an exception message.

PROCESS_DESCRIPTOR - Stores the BPEL process deployment descriptor (bpel.xml).
PROCESS_LOG - Record of events (informational, debug, error) encountered while interacting with a process.

INVOKE_MESSAGE_BIN - Stores the invoke payload of a process. This table has a foreign key relationship with the INVOKE_MESSAGE table.

DLV_MESSAGE_BIN - Stores the received payload of a callback. This table has a foreign key relationship with DLV_MESSAGE.

WFTASK - Stores human workflow task runtime metadata such as task id, title, state, assigned user or group, and created and updated dates.

WFTASKMETADATA - Stores task metadata. Content in this table comes from the '.task' file of the BPEL project.

WFASSIGNEE - Stores task assignee information.

WFMESSAGEATTRIBUTE - Stores task input payload parameters.

WFATTACHMENT - Stores task attachments.

WFCOMMENTS - Stores task comments.
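
As a quick illustration of using this data, a hedged SQL sketch (run against the ORABPEL schema; interpret the codes with the CUBE_INSTANCE state list above) that counts instances per state:

select state, count(*) as instance_count
from   cube_instance
group by state
order by state;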

Tuesday, July 20, 2010

Configuring JDev and SOA server 10134 for MS SQL server database connections

Jdeveloper configuration:
By default JDeveloper works with Oracle databases. This article explains how to configure JDeveloper to work with MS SQL Server.

  1. Download MS SQL Server 2005 JDBC driver or suitable JDBC driver for your MS SQL server
  2. Extract it onto your machine (for example, under Program Files). It should create a folder called "Microsoft SQL Server 2005 JDBC Driver"; the folder name may differ depending on your driver version.
  3. Search for sqljdbc.jar in the above folder and copy this file into the JDeveloper\JDBC\lib folder
  4. Close your Jdeveloper if it is opened
  5. Open the JDeveloper/jdev/bin/jdev.conf file and add the following entry:
    AddJavaLibPath C:/Program files/Microsoft SQL Server 2005 JDBC Driver/lib
  6. Here the path points to the folder containing sqljdbc.jar. If sqljdbc.jar resides in a folder other than the one mentioned above, use that folder path instead
  7. Open command prompt, go to JDEV_HOME/JDev/bin and execute following command

    jdev -verbose (This will open JDeveloper)
  8. Now go to JDeveloper > Connections > Database Connections > New Database Connection
  9. Select Third Party JDBC
  10. Specify MS Sql Server User Name, password and optional  Role parameter
  11. In connection page specify following


    - Driver Class: com.microsoft.sqlserver.jdbc.SQLServerDriver

    - For the class path, browse to the C:/Program files/Microsoft SQL Server 2005 JDBC Driver/lib folder, select sqljdbc.jar and add it as a library.
  12. Specify URL as following.


    jdbc:sqlserver://DBHOSTNAME:PORT;databaseName=MSSQLDBNAME
  13. Now test the connection
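
Optionally, a minimal standalone check (outside JDeveloper) that the driver jar and URL work. This is a hedged sketch: the host, port, database name and credentials are placeholders you must replace, and sqljdbc.jar must be on the classpath.

import java.sql.Connection;
import java.sql.DriverManager;

// Hypothetical connectivity test for the SQL Server JDBC driver; all connection values are placeholders.
public class MsSqlConnTest {
    public static void main(String[] args) throws Exception {
        // Driver class from step 11 above
        Class.forName("com.microsoft.sqlserver.jdbc.SQLServerDriver");
        Connection conn = DriverManager.getConnection(
                "jdbc:sqlserver://DBHOSTNAME:1433;databaseName=MSSQLDBNAME",
                "dbUser", "dbPassword");
        System.out.println("Connected: " + !conn.isClosed());
        conn.close();
    }
}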
SOA server configuration:
We need to set the server CLASSPATH to include the database driver jar file. This can be achieved in several ways (a server.xml sketch follows the list below).
  • For standalone installation type, please make an entry into Oracle_Home/bpel/system/appserver/oc4j/j2ee/home/config/server.xml
  • For middle tier installation type,please make an entry into Oracle_Home/j2ee/OC4J_BPEL/config/server.xml
    Or
  • For standalone installation type, please drop jar file into Oracle_Home/bpel/system/appserver/oc4j/j2ee/home/applib
  • For middle tier installation type,please drop jar file into
    Oracle_Home/j2ee/OC4J_BPEL/applib
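
For the server.xml route, one possible shape is an OC4J shared-library entry like the sketch below. This is only an assumption based on OC4J 10.1.3 shared-library syntax (the library name, version and jar path are placeholders), so verify it against your own server.xml; the applib option above is the simpler one.

<!-- Hypothetical sketch only: a shared-library entry in server.xml.
     Name, version and jar path are placeholders; adjust to your driver location. -->
<shared-library name="mssql.jdbc.driver" version="1.0">
  <code-source path="C:/Program files/Microsoft SQL Server 2005 JDBC Driver/lib/sqljdbc.jar"/>
</shared-library>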

Sunday, July 18, 2010

Migrating in-flight BPEL instances from one environment to another

This article applies to Oracle BPEL Process Manager versions 10.1.3.3 to 10.1.3.4 and describes how to migrate BPEL processes, including in-flight instances, from one environment to a completely new environment.

Let's take the below use case to understand how the migration can take place.

Use-case:
BPEL is running on a single instance mid-tier against a single instance Oracle database 10.2.0.3.
This will be moved to a new mid-tier server running against a new RAC Oracle database 10.2.0.3.

Please follow the steps below to achieve the goal.

1. Install SOA schemas on RAC Database by running IRCA scripts.

2. Install mid-tier and patch it to 10.1.3.3 or above.

3. Shutdown the mid-tier and database.

4. Replace the new ORABPEL Schema (on RAC) with the original ORABPEL schema where BPEL Process instances are Dehydrated.

This can be done by database export and import.

5. Start the mid-tier and database.

You should now be able to see all BPEL processes in the new BPEL console, and the temporary files for the BPEL processes are regenerated.

6. All BPEL processes (including one with Human Tasks running with OracleAS Single Sign-On) should have been successfully migrated.

All opened instances should be able to finish and new instances could be started from the same BPEL processes.

- Before step (5) you have to edit the BPEL table PROCESS_DESCRIPTOR, field DESCRIPTOR, and change the wsdlLocation property to the new server:port values.
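
A hedged sketch of how such descriptors could be located before the edit; it assumes the DESCRIPTOR column can be searched as character data (e.g. a CLOB), and the column names and 'oldhost:7777' are placeholders/assumptions to adjust to your environment:

-- Hypothetical query: list processes whose descriptor still references the old server:port
select process_id, revision_tag
from   process_descriptor
where  dbms_lob.instr(descriptor, 'oldhost:7777') > 0;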

- The whole procedure works just fine for new instances of deployed BPEL processes. However, for old instances we had to keep the HTTP port where the WSDLs were published open on the old BPEL server, because migrated open instances kept trying to reach that server:port when the main BPEL process called other (inner) BPEL processes defined on it.

eg: Main BPEL Process "A" is invoked in new server:port WSDL Location (this instance was migrated from the previous server; new instances work just fine).

When the process "A" is dehydrated and calls a second BPEL Process, say process "B", it does the call over the old server:port WSDL location.

We could not figure out how to resolve this issue, so we set up a redirection (iptables) on the old server that forwards all the traffic to the new server:port. Invocations of old WSDL locations are thus redirected to the new server:port values. This configuration has to stay in place until the old instances migrated from the previous server have finished. New instances of BPEL processes do not have this problem.

So the statement in step (6), "All opened instances should be able to finish and new instances could be started from the same BPEL processes", is not completely true. Is there any other way to get this working without using redirection from the old server to the new one?


Tuesday, June 29, 2010

How to change dehydration store DB for SOA Suite

This article explains how to point the SOA Suite 10g dehydration data store to another database.
  • Install another database instance and ensure that the new target database is running.
  • Run IRCA script to create schemas for BPEL, ESB and OWSM in the new Database. example: irca all "localhost 1521 v102" welcome -overwrite
  • Backup the file- ${OC4J_HOME}\config\data-sources.xml
  • Change the following ESB and BPEL connection pools to point to new DB resource
  1. BPELPM_CONNECTION_POOL
  2. ESBAQJMSPool
  3. ESBPool
You can use either Enterprise Manager Console to perform these changes or edit the file: ${OC4J_HOME}\config\data-sources.xml


Below is a sample configuration for BPELPM_CONNECTION_POOL in data-sources.xml:



<connection-pool name="BPELPM_CONNECTION_POOL">
<connection-factory factory-class="oracle.jdbc.pool.OracleDataSource"
user="orabpel_OracleAS_1" password="orabpel" url="jdbc:oracle:thin:@localhost:1521:xe"/>
</connection-pool>
  • If you are changing from Olite DB to Oracle EE, look in the project's build.properties file for DB_URL and change DB_VENDOR from "olite" to "oracle":
    DB_VENDOR=oracle
    Otherwise no change is necessary.
  • Update the ESB metadata: connect to the ESB schema (ORAESB) and change the values of DT_OC4J_HOST and DT_OC4J_HTTP_PORT to match your environment's hostname and port by running the following SQL script:
insert into esb_parameter values('ACT_ID_RANGE', '400');
insert into esb_parameter values('DT_OC4J_HOST', 'soainternal.mycompany.com');
insert into esb_parameter values('DT_OC4J_HTTP_PORT', '7777');
insert into esb_parameter values('PROP_NAME_INITIAL_CONTEXT_FACTORY', 'com.evermind.server.rmi.RMIInitialContextFactory');
insert into esb_parameter values('PROP_NAME_DEFERRED_TCF_JNDI', 'OracleASjms/MyTCF');
insert into esb_parameter values('PROP_NAME_DEFERRED_XATCF_JNDI', 'OracleASjms/MyXATCF');
insert into esb_parameter values('PROP_NAME_DEFERRED_TOPIC_JNDI','OracleASjms/ESBDeferredTopic');
insert into esb_parameter values('PROP_NAME_ERROR_TCF_JNDI', 'OracleASjms/MyTCF');
insert into esb_parameter values('PROP_NAME_ERROR_XATCF_JNDI', 'OracleASjms/MyXATCF');
insert into esb_parameter values('PROP_NAME_ERROR_TOPIC_JNDI', 'OracleASjms/ESBErrorTopic');
insert into esb_parameter values('PROP_NAME_ERROR_RETRY_JNDI', 'OracleASjms/ESBErrorRetryTopic');
insert into esb_parameter values('PROP_NAME_ERROR_RETRY_TCF_JNDI', 'OracleASjms/MyXATCF');
insert into esb_parameter values('PROP_NAME_MONITOR_TCF_JNDI', 'OracleASjms/MyTCF');
insert into esb_parameter values('PROP_NAME_MONITOR_TOPIC_JNDI', 'OracleASjms/ESBMonitorTopic');
insert into esb_parameter values('PROP_NAME_CONTROL_TCF_JNDI', 'OracleASjms/MyXATCF');
insert into esb_parameter values('PROP_NAME_CONTROL_TOPIC_JNDI', 'OracleASjms/ESBControlTopic');
commit;
  • Review the inserted values using- select * from esb_parameter
  • Restart the SOA container and verify the opmn and J2EE container logs for any errors.

Sunday, June 20, 2010

Characteristics and Challenges of SOA

Loosely coupled interacting software agents

Loose coupling has been a buzzword in the IT industry for years, one that many software technologies and architectures have tried to address. Coupling refers to the degree of direct knowledge that one software component has about another in order to achieve the required behavior; in simple words, how dependent a component is on other components (in Java, a component can be anything from a Java class to an EJB) to achieve its intended functionality. Dependencies are a common phenomenon in designing applications; you cannot really design even a single application without them. What matters is how you design and code those dependencies, how they impact your application when a component's behavior changes, and how soon you can incorporate the changes when it is affected. Given all this, you cannot avoid coupling completely, but you have to minimize it. SOA minimizes this dependency by defining services at a level of granularity where each service is independent and self-contained. Achieving this is difficult and challenging, and comes only with experience.

Easy integration among disparate software components

The first characteristic discussed above leads to this goal. It all depends on how thorough your service definition exercise is. If a service is defined with sufficient granularity and self-existence, then integration is simply a matter of picking the services you want and invoking them.

Reusing software components

Reuse has been around for decades, and most traditional technologies achieved sufficient results. The point here is that SOA does not ignore this important software design principle either.

Technology neutral

SOA design principles say application components should talk to each other without regard to the technologies in which they are implemented, so technology is no longer a barrier. The current generation of SOA technologies and applications leverage web services heavily to realize this: code your service in a technology of your choice and wrap it up as a web service (essentially providing a WSDL interface to it). Any other component can then call this service without knowing the underlying technology. Nowadays many tools exist in the market to expose legacy applications as WSDL-interfaced services.

Rapid application development(RAD)

Reusability always promotes rapid application development. RAD refers to how quickly you can build new applications. This can be achieved because we start by defining services, and over time we end up with a set of well-defined in-house services where each service is intended to solve a particular business problem. When the business is looking for a solution to a problem, all you may need to do is pick up the on-the-shelf services you have been refining for years and compose a new application.

Now let's look at the challenges we face in defining and realizing SOA benefits.

Identification and granularity of service

There are two approaches available to perform this activity, listed below. Exercising this activity is very important and challenging too; it is probably the first thing you start doing once you have made up your mind to go with SOA. The business might be running a bunch of applications where a few are legacy and the rest run on the latest technologies, but this does not mean that you are going to define and expose all of them as services.

  1. Top-down realization
  2. Bottom-up realization

Top-down realization:

This involves identifying all the eligible business functionality that we are pulling into the SOA layer. We take an existing application and decompose it into smaller, manageable units called services.

Bottom-up realization:

This involves defining all the services you plan to implement in the first phase. The next step is composing these smaller units into bigger ones. Continue this exercise until you feel that a sufficient level of service granularity and exposure has been achieved.

Performance overhead

This is the cost we pay, and we pay it because of the speed of current technologies and hardware. In a typical SOA environment we use web services to stay technology neutral and expose a service over the Internet, so web services serve as the underlying service platform. Calling a web service is always expensive because the invocation involves wrapping the original message in a SOAP envelope, which adds complexity and overhead to the underlying middleware infrastructure. Considering the many improvements in middleware servers and hardware, this should no longer be a bottleneck.

Service meta data maintenance

This is one of the additional responsibilities that comes with SOA. Over time we end up maintaining a lot of services, many versions of the same service, and their metadata. Of course, this responsibility goes to the middleware server.

Governing services

This involves the administrative part of your SOA environment. It is more about politics and business decisions than about technology. It covers coming up with policies for the service life cycle (identifying, defining, deploying, revisioning and retiring services), who should do what, and the people involved in SOA and their privileges. Many middleware servers provide applications to cover this part.

Monday, June 7, 2010

SOA all about

There has always been much hype and many misconceptions about service-oriented architecture (SOA). To be frank, there is no unanimous agreement among SOA experts and authors on what SOA is, how it should be defined, or where its boundaries lie.
However, let me define it in my own way: SOA is an architectural style for designing computer applications to achieve certain design goals.
Now let us see what those goals are.

Design goals
  • Loosely coupled interacting software agents
  • Easy integration among disparate software components
  • Reusing software components
  • Composability
  • Technology neutral
  • Rapid application development
The basic building block of SOA is the service. In simple words, a service can be any business function designed to do a unit of work. For example, a service can be a pure C-language function, a C++ method, a PL/SQL procedure or a Java application with certain characteristics.
A service is a function that is well-defined, self-contained, and does not depend on the context or state of other services.

Consider a scenario where a business has multiple applications implemented in various technologies. If it wants to integrate all of them, the option at hand is to recruit multiple consultants who understand all these technologies, which forces the business to shell out a lot of money.
In contrast, if all these applications had been designed following SOA, then all it needs is one consultant who knows SOA.
Many more goals could be added, but I want to keep the list short, precise and meaningful. All the traditional application development approaches, such as OOP and distributed components like EJB, DCOM and CORBA, struggled towards the same goals but succeeded in achieving only a few of them, not all. That makes SOA an architectural evolution rather than a revolution: it captures many of the best practices of previous software architectures.
In a nutshell, it aims at a software design in which all business functionality is implemented as a set of reusable, technology-neutral software units with decoupled communication among them.

How SOA changes next generation of application development

It impacts many phases of a typical SDLC; we need to revise the way we used to design pre-SOA applications. Some enterprise architects believe that applications developed using SOA give a lot of business agility, which helps in reacting to business changes as quickly as possible, incorporating those changes in services, and staying at the edge of today's competition. If a business wants to develop a new application, all it needs to do is choose already-built in-house or third-party software services and compose and orchestrate them. This paradigm greatly reduces the 'develop to deploy' cycle time.
The business also has the option of choosing its own technology to implement the services, which allows coding the logic in the technology that fits best.
When you start adopting SOA it takes more time to realize the benefits, because initially you need to spend time identifying what functionality should be defined as a separate service, i.e. the granularity level. Once the business is done with this activity, the next step is to pick those services and build new ad-hoc applications.


How web services and SOA are related
There is a myth that web services and SOA are one and the same; in fact they are different but work closely together. Web services are about exposing existing application functionality over the network so that clients can call it to perform a unit of business work. Web services never talk about software architectural styles. In contrast, SOA is about designing an application with the characteristics mentioned above. Neither of them replaces the other; they simply complement each other. Web services pave the way to realize and implement SOA, and in turn SOA gives best practices and design principles for implementing a web service.

SOA Challenges
  • Identification and granularity of service
  • Performance overhead
  • Service meta data maintenance
  • Governing services
Given the many advancements in XML standards, process and data standardization, and processing speeds, the above challenges may no longer remain obstacles.
In the next post we will talk about SOA characteristics and challenges in detail.