HCL Workload Automation, Version 9.4

Salesforce jobs

HCL Workload Automation integrates with Salesforce to provide the capability of automating, monitoring, and controlling workflows that contain batch Apex jobs. You gain greater control of your Salesforce jobs with both calendar-based and event-based workload automation, as well as a single point of control to handle exceptions and automate recovery processes.

Prerequisites

Before you start to create a Salesforce job definition with HCL Workload Automation, consider the following limitations:
  • The batch Apex classes (and related methods) that you want to run with the Salesforce plug-in must be defined with the global access level, so that they are accessible by all Apex code everywhere (the public access level is not sufficient). A minimal sketch of such a class is shown after this list.
  • At job definition time, only Salesforce batch Apex classes can be run. If you select a non-batch Apex class, the job fails.
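The following is a minimal sketch of a batch Apex class defined with the global access level, as required by the first limitation above. The class name MyBatchClass, the SOQL query, and the empty processing logic are hypothetical placeholders, not part of the plug-in; replace them with your own batch logic.
    global class MyBatchClass implements Database.Batchable<sObject>{
    //A hypothetical batch Apex class with global access level,
    //so that it can be listed and submitted by the Salesforce plug-in.
        global Database.QueryLocator start(Database.BatchableContext bc) {
            //Select the records to be processed in batches.
            return Database.getQueryLocator('SELECT Id FROM Account');
        }
        global void execute(Database.BatchableContext bc, List<sObject> scope) {
            //Process one batch of records: add your business logic here.
        }
        global void finish(Database.BatchableContext bc) {
            //Optional post-processing that runs after all batches complete.
        }
    }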
To create a Salesforce job definition, you must complete the prerequisite steps that are listed in the following procedure.
  1. Register with the Salesforce server and request a user ID and password.
  2. Log in to the Salesforce server.
  3. Create the following Apex classes, which are needed for the communication between HCL Workload Automation and the Salesforce server. The HCL Workload Automation Apex classes must be defined outside any package.
    Class TWSListApexClass
    @RestResource(urlMapping='/TWSListApexClass/*')
    global with sharing class TWSListApexClass{
    //This Apex class exposes the TWSListApexClass REST service 
    //which returns a list of all known Batchable Apex classes.    
        @HttpGet
        global static List<ApexClass> doGet() {
            RestRequest req = RestContext.request;
            RestResponse res = RestContext.response;
            String fullName='';
            List<ApexClass> tempList = 
            [SELECT NamespacePrefix,Name FROM ApexClass ORDER BY Name];        
            List<ApexClass> result = new List<ApexClass>();
            for (ApexClass a: tempList){
                if (a.NamespacePrefix==null || a.NamespacePrefix.equals('')){
                   fullName=a.Name;
                } else {
                   fullName=a.NamespacePrefix+'.'+a.Name;
                }
                System.debug(LoggingLevel.Info, 'ApexClass: '+fullName);            
                result.add(a);
            }   
            return result;
        }
    }
    Class TWSSubmitApexJob
    @RestResource(urlMapping='/TWSSubmitApexJob/*')
    global with sharing class TWSSubmitApexJob{
    //This Apex class exposes the TWSSubmitApexJob REST service 
    //which submits an Apex class to the Salesforce server.
        @HttpGet
        global static ID doGet() {
            RestRequest req = RestContext.request;
            RestResponse res = RestContext.response;
            String apexClass = req.params.get('className');
            System.debug(LoggingLevel.Info, 'Execute Batch:'+apexClass);
            Type t = Type.forName(apexClass);
            if (t == null){
               throw new TWSException (apexClass + ' not found');
            }         
            Object s = t.newInstance();
            ID batchprocessid = 
            Database.executeBatch((Database.Batchable<sObject>)s);
            System.debug(LoggingLevel.Info, 'Job ID: '+batchprocessid);
            return batchprocessid;
        }    
    global class TWSException extends Exception{}
    } 
    Class TWSMonitorApexJob
    @RestResource(urlMapping='/TWSMonitorApexJob/*')
    global with sharing class TWSMonitorApexJob{
    //This Apex class exposes the TWSMonitorApexJob REST service 
    //which will monitor the progress of the backend Apex job.
        @HttpGet
        global static AsyncApexJob doGet() {
            RestRequest req = RestContext.request;
            RestResponse res = RestContext.response;
            ID i = (ID) req.params.get('jobID');
            AsyncApexJob a = [SELECT Id, Status, ExtendedStatus, NumberOfErrors, 
            JobItemsProcessed, TotalJobItems FROM AsyncApexJob WHERE Id = :i];
            return a;
        }
    }
    Class TWSAbortApexJob
    @RestResource(urlMapping='/TWSAbortApexJob/*')
    global with sharing class TWSAbortApexJob{
    //This Apex class exposes the TWSAbortApexJob REST service 
    //which will abort the Apex job on the Salesforce server.    
        @HttpGet
        global static void doGet() {
            RestRequest req = RestContext.request;
            RestResponse res = RestContext.response;
            String jobID = req.params.get('jobID');
            System.abortJob(jobID);
        }
    }
  4. Verify the content of the Salesforce plug-in properties file:

    <TWA_HOME>\TWS\javaExt\cfg\<plug-in_name>.properties

    This file contains the plug-in properties that were set at installation time and that you can choose to override later. The plug-in properties are the following:
    ProxyServer
    ProxyServerPort
    pollingPeriod
    pollingTimeout
    where
    ProxyServer
    The IP address or the server name for the proxy server. Specify this property if you connect to the Salesforce server through a proxy server.
    ProxyServerPort
    The listening port of the proxy server.
    pollingPeriod
    The monitoring frequency. It determines how often the job is monitored during its execution. It is expressed in seconds.
    pollingTimeout
    The monitoring time. It determines how long the job is monitored during its execution. If the job has not completed when the timeout interval expires, the job fails. It is expressed in seconds.
    The values that you specify in the properties file are used as the defaults at job definition time.
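    For example, assuming the standard key=value properties format, a customized file might contain entries like the following (the proxy host name, port, and polling values are illustrative only):

    ProxyServer=proxy.mycompany.com
    ProxyServerPort=3128
    pollingPeriod=30
    pollingTimeout=7200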

Business Scenario

WWMail4U.Inc offers mail and e-commerce products and services worldwide. As an organization, WWMail4U.Inc manages large amounts of complex data traffic.

WWMail4U.Inc operates in a very competitive market, and to maintain a leading role, it recently implemented cloud solutions to provide business applications as a service to its customers. WWMail4U.Inc's top priority is to keep its SAP source servers aligned with the Salesforce server within the cloud environment. The company's SAP workload is already controlled by HCL Workload Automation, and the plan is to extend this control to all of its batch business processes.

Thanks to the integration between HCL Workload Automation and Salesforce, WWMail4U.Inc has its entire business process chain in a single job stream controlled by HCL Workload Automation.

Salesforce job definition

The job properties and their valid values are detailed in the context-sensitive help in the Dynamic Workload Console, which you display by clicking the question mark (?) icon in the top-right corner of the properties pane.

For more information about creating jobs using the various supported product interfaces, see Defining a job.

The following table lists the required and optional attributes for Salesforce jobs:
Table 1. Required and optional attributes for the definition of a Salesforce job
Attribute Description and value Required
Server The Salesforce server that Salesforce provides you after your registration.
User name The name of the user authorized to access the Salesforce server.
Password The password that is associated with the user that is authorized to access the Salesforce server.
APEX Class The APEX batch class that is supported for HCL Workload Automation. You can execute only Salesforce Apex batch classes. If you specify a non-batch class, the job fails.

Scheduling and stopping the job in HCL Workload Automation

You schedule HCL Workload Automation Salesforce jobs by defining them in job streams. Add the job to a job stream with all the necessary scheduling arguments and submit the job stream.

You can submit jobs by using the Dynamic Workload Console, Application Lab, or the conman command line. See Scheduling and submitting jobs and job streams for information about how to schedule and submit jobs and job streams using the various interfaces.
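For example, a minimal job stream definition for the Salesforce job shown later in this topic might look like the following sketch. The job stream name JS_SALESFORCE and the daily run cycle are illustrative assumptions; the workstation and job names are taken from the job definition example in this topic.

SCHEDULE NEWYORK-01#JS_SALESFORCE
ON RUNCYCLE RC1 "FREQ=DAILY;INTERVAL=1"
:
NEWYORK-01#JOB-SF-0000
END

You can then submit the job stream, for example, from the conman command line:

conman sbs NEWYORK-01#JS_SALESFORCE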

After submission, when the job is running and is reported in EXEC status in HCL Workload Automation, you can stop it, if necessary, by using the kill command from the Dynamic Workload Console. However, this action is effective only in the Request/Response scenario; after the kill, the HCL Workload Automation processes do not wait to receive a response from the Salesforce job.
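As a sketch, you might also issue the kill from the conman command line, mirroring the job selection syntax used elsewhere in this topic (the job name is a placeholder):

conman kill <Salesforce_job_name>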

Monitoring the job

If the HCL Workload Automation agent stops when you submit the Salesforce job, or while the job is running, then as soon as the agent restarts in the Request/Response scenario, HCL Workload Automation resumes monitoring the job from where it stopped and waits for the Response phase.

For information about how to monitor jobs using the different product interfaces available, see Monitoring HCL Workload Automation jobs.

Job properties

While the job is running, you can track the status of the job and analyze its properties. In particular, in the Extra Information section, if the job contains variables, you can verify the value that was passed to the variable from the remote system. Some job streams use the variable passing feature: for example, the value of a variable that is specified in job 1 of job stream A is required by job 2 of the same job stream in order to run.

For more information about passing variables between jobs, see Passing job properties from one job to another in the same job stream instance.

For information about how to display the job properties from the various supported interfaces, see Analyzing the job log.

For example, from the conman command line, you can see the job properties by running:
conman sj <Salesforce_job_name>;props
where <Salesforce_job_name> is the Salesforce job name.
For a Salesforce job, in the Extra Information section of the command output, you see the following properties:
Extra Information
  Apex batch class = TWSBatchTest
  Apex job ID = 7072000000eLnYOAA0
  Job item processed = 1
  Number of errors= 0
  Server address = regionA.salesforce.com
  Batch status = Completed
  Total Job items = 1
  User name = userabc@xyz.com
where
Apex batch class
The APEX batch class that is supported for HCL Workload Automation.
Apex job ID
The Salesforce job ID.
Job item processed
The number of processed job items.
Number of errors
The number of errors.
Server address
The Salesforce server that you specify in the Server field.
Batch status
The status of the batch job.
Total Job items
The total number of job items.
User name
The name of the user authorized to access the Salesforce server that you specify in the User name field.

You can export the Salesforce job properties that you see in the Extra Information section to a successive job in the same job stream instance. For more information about the list of job properties that you can export, see Properties for Salesforce jobs.
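For example, assuming that the Apex job ID is among the exported properties and that it is exposed under a property name such as JobID (see Properties for Salesforce jobs for the actual property names), a successive job in the same job stream instance could reference it with the variable passing syntax, where JOB_SF is a hypothetical name for the Salesforce job:

${job:JOB_SF.JobID}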

The properties file is automatically generated either when you perform a "Test Connection" from the Dynamic Workload Console in the job definition panels, or when you submit the job to run the first time. After the file has been created, you can customize it. This is especially useful when you need to schedule several jobs of the same type: you can specify the values in the properties file and avoid having to provide information, such as credentials, for each job. You can override the values in the properties file by defining different values at job definition time.

The following example shows the job definition for a Salesforce job:
NEWYORK-01#JOB-SF-0000
TASK
<?xml version="1.0" encoding="UTF-8"?>
<jsdl:jobDefinition xmlns:jsdl=
"http://www.abc.com/xmlns/prod/scheduling/1.0/jsdl" xmlns:jsdlsalesforce=
"http://www.abc.com/xmlns/prod/scheduling/1.0/jsdlsalesforce" 
name="SALESFORCE">
<jsdl:application name="salesforce">
<jsdlsalesforce:salesforce>
<jsdlsalesforce:SalesforceParameters>
<jsdlsalesforce:SalesforceParms>
<jsdlsalesforce:ServerConnection>
<jsdlsalesforce:Server>regionA.salesforce.com</jsdlsalesforce:Server>
<jsdlsalesforce:UserID>userabc@xyz.com</jsdlsalesforce:UserID>
<jsdlsalesforce:password>{aes}+D2UAAxhxtYf8ENfb7LNr0DLRt0hwKPHlDiA2/PO1e4=
</jsdlsalesforce:password>
</jsdlsalesforce:ServerConnection>
<jsdlsalesforce:APEXJobDetails>
<jsdlsalesforce:APEXClass>TWSBatchTest</jsdlsalesforce:APEXClass>
</jsdlsalesforce:APEXJobDetails>
</jsdlsalesforce:SalesforceParms>
</jsdlsalesforce:SalesforceParameters>
</jsdlsalesforce:salesforce>
</jsdl:application>
</jsdl:jobDefinition>
RECOVERY STOP 

Job log content

For information about how to display the job log from the various supported interfaces, see Analyzing the job log.

For example, you can see the job log content by running the command conman sj <Salesforce_job_name>;stdlist, where <Salesforce_job_name> is the Salesforce job name.

For a Salesforce job log, you see the following information:
===============================================================
= JOB : NY0000000000#JOBS[(0000 05/08/14),(JOBS)].SF_MAR0318376017
= TASK : <?xml version="1.0" encoding="UTF-8"?>
<jsdl:jobDefinition xmlns:jsdl=
"http://www.abc.com/xmlns/prod/scheduling/1.0/jsdl" xmlns:jsdlsalesforce=
"http://www.abc.com/xmlns/prod/scheduling/1.0/jsdlsalesforce" name="SALESFORCE">
<jsdl:variables>
<jsdl:stringVariable name=
        "tws.jobstream.name">JOBS</jsdl:stringVariable>
<jsdl:stringVariable name=
        "tws.jobstream.id">JOBS</jsdl:stringVariable>
<jsdl:stringVariable name="tws.job.name">
           SF_MAR0318376017</jsdl:stringVariable>
<jsdl:stringVariable name=
        "tws.job.workstation">NY0000000000</jsdl:stringVariable>
<jsdl:stringVariable name=
         "tws.job.iawstz">201405080000</jsdl:stringVariable>
<jsdl:stringVariable name=
         "tws.job.promoted">NO</jsdl:stringVariable>
<jsdl:stringVariable name=
         "tws.job.resourcesForPromoted">10</jsdl:stringVariable>
<jsdl:stringVariable name=
         "tws.job.num">607245960</jsdl:stringVariable>
</jsdl:variables>
<jsdl:application name="salesforce">
<jsdlsalesforce:salesforce>
<jsdlsalesforce:SalesforceParameters>
<jsdlsalesforce:SalesforceParms>
<jsdlsalesforce:ServerConnection>
<jsdlsalesforce:Server>
        regionA.salesforce.com</jsdlsalesforce:Server>
<jsdlsalesforce:UserID>userabc@xyz.com</jsdlsalesforce:UserID>
<jsdlsalesforce:password>
{aes}+D2UAAxhxtYf8ENfb7LNr0DLRt0hwKPHlDiA2/PO1e4=
</jsdlsalesforce:password>
</jsdlsalesforce:ServerConnection>
<jsdlsalesforce:APEXJobDetails>
<jsdlsalesforce:APEXClass>TWSBatchTest</jsdlsalesforce:APEXClass>
</jsdlsalesforce:APEXJobDetails>
</jsdlsalesforce:SalesforceParms>
</jsdlsalesforce:SalesforceParameters>
</jsdlsalesforce:salesforce>
</jsdl:application>
<jsdl:resources>
<jsdl:orderedCandidatedWorkstations>
<jsdl:workstation>
690830601B8D4681AF38D3529BC5199E</jsdl:workstation>
</jsdl:orderedCandidatedWorkstations>
</jsdl:resources>
</jsdl:jobDefinition>
= TWSRCMAP : 
= AGENT : NC125181_1
= Job Number: 607245960
= Thu May 22 17:18:49 CEST 2014
===============================================================
Apex Batch job ID: 7072000000eLnYOAA0
Apex job completed with success
Apex Job ID: 7072000000eLnYOAA0
Status: Completed
Total Batches: 1
Batches Processed: 1
Failures: 0
===============================================================
= Exit Status : 0
= Elapsed Time (Minutes) : 1
= Thu May 22 17:18:53 CEST 2014
=============================================================== 

See also

From the Dynamic Workload Console you can perform the same task as described in Creating job definitions.

For more information about how to create and edit scheduling objects, see Designing your Workload.