Tuesday, April 30, 2019

How to expose a Salesforce Apex class as a REST API

1. Create a connected app and generate the consumer key and consumer secret
2. Give full access for this project under the connected app permissions
3. Create your Apex class and test it in Workbench; use this URL to call it from Workbench: /services/apexrest/GetCase2/5009000001NxD87
Apex code

@RestResource(urlMapping='/GetCase2/*')
global with sharing class RestCaseCreatorurl {

    @HttpGet
    global static Case getCaseById() {
        RestRequest request = RestContext.request;
        // grab the caseId from the end of the URL
        String caseId = request.requestURI.substring(request.requestURI.lastIndexOf('/') + 1);
        Case result = [SELECT CaseNumber, Subject, Status, Origin, Priority FROM Case WHERE Id = :caseId];
        return result;
    }

    @HttpPost
    global static String createcase1(String status, String origin) {
        Case cs = new Case();
        cs.Status = status;
        cs.Origin = origin;
        insert cs;
        return cs.Id;
    }
}

4. Java code

/* Java code to access the Apex REST service (Apache HttpClient 4.x and org.json) */
package sfdc_rest;

import org.apache.http.HttpResponse;
import org.apache.http.HttpStatus;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.client.methods.HttpPost;
import org.apache.http.client.utils.URIBuilder;
import org.apache.http.impl.client.DefaultHttpClient;
import org.apache.http.util.EntityUtils;
import org.json.JSONObject;
import org.json.JSONTokener;

public class Main {

    static final String USERNAME = "narenrr@yahoo.com";
    static final String PASSWORD = "pwdWHVAUUcVWbxXMmKFQDEl1sgK0";
    static final String LOGINURL = "https://login.salesforce.com";
    static final String GRANTSERVICE = "/services/oauth2/token?grant_type=password";
    static final String CLIENTID = "3MVG9Y6d_Btp4xp7zpH04V6xgnCGIjiRvaGyUe3CIaIwVd5KmnEfStMNlYeql3EN3UugSvGlQ2pUh0QfkK4lj";
    static final String CLIENTSECRET = "2948595789758136914";

    public static void main(String[] args) throws Exception {
        DefaultHttpClient httpclient = new DefaultHttpClient();

        // Assemble the login request URL; login requests must be POSTs
        String loginURL = LOGINURL + GRANTSERVICE
                + "&client_id=" + CLIENTID
                + "&client_secret=" + CLIENTSECRET
                + "&username=" + USERNAME
                + "&password=" + PASSWORD;
        HttpPost httpPost = new HttpPost(loginURL);

        // Execute the login POST request and verify the response is HTTP OK
        HttpResponse response = httpclient.execute(httpPost);
        final int statusCode = response.getStatusLine().getStatusCode();
        if (statusCode != HttpStatus.SC_OK) {
            System.out.println("Error authenticating to Force.com: " + statusCode);
            // The error details are in EntityUtils.toString(response.getEntity())
            return;
        }

        // Parse the access token and instance URL out of the JSON response
        String getResult = EntityUtils.toString(response.getEntity());
        JSONObject jsonObject = (JSONObject) new JSONTokener(getResult).nextValue();
        String loginAccessToken = jsonObject.getString("access_token");
        String loginInstanceUrl = jsonObject.getString("instance_url");
        System.out.println("Successful login");
        System.out.println(" instance URL: " + loginInstanceUrl);
        System.out.println(" access token/session ID: " + loginAccessToken);

        // Release the login connection before reusing the client
        httpPost.releaseConnection();

        // Call the Apex REST service, passing the session ID in the Authorization header
        URIBuilder builder = new URIBuilder(loginInstanceUrl
                + "/services/apexrest/devlight1973/GetCase/5009000001NxD81");
        HttpGet httpget = new HttpGet(builder.build());
        httpget.addHeader("Authorization", "OAuth " + loginAccessToken);

        HttpResponse response1 = httpclient.execute(httpget);
        System.out.println(response1.getStatusLine().getStatusCode());
        if (response1.getStatusLine().getStatusCode() == HttpStatus.SC_OK) {
            String getResult1 = EntityUtils.toString(response1.getEntity());
            jsonObject = (JSONObject) new JSONTokener(getResult1).nextValue();
            System.out.println("Query response: " + jsonObject);
        }
    }
}

How to schedule batch class in salesforce

Benefits of batch apex
The execution logic of the batch class is called once for each batch of records. The default batch size is 200 records. You can also specify
a custom batch size. Furthermore, each batch execution is considered a discrete transaction. With each new batch of records, a new set
of governor limits is in effect. In this way, it’s easier to ensure that your code stays within the governor execution limits. Another benefit of discrete batch transactions is partial processing: if one batch fails to process successfully, the other batch transactions aren’t affected, and batches that were processed successfully aren’t rolled back.
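To make the batch-size arithmetic above concrete, here is a plain-Java sketch (not Salesforce code; the `chunk` helper is a hypothetical stand-in) of how a result set is split into discrete scopes of at most 200 records, each of which Batch Apex would process in its own transaction with a fresh set of governor limits:

```java
import java.util.ArrayList;
import java.util.List;

public class BatchChunks {

    // Split a list of records into scopes of at most batchSize elements,
    // mirroring how Batch Apex hands execute() up to 200 records at a time
    static <T> List<List<T>> chunk(List<T> records, int batchSize) {
        List<List<T>> scopes = new ArrayList<>();
        for (int i = 0; i < records.size(); i += batchSize) {
            scopes.add(records.subList(i, Math.min(i + batchSize, records.size())));
        }
        return scopes;
    }

    public static void main(String[] args) {
        List<Integer> ids = new ArrayList<>();
        for (int i = 0; i < 450; i++) ids.add(i);

        // 450 records with the default scope size of 200 -> 3 discrete transactions
        List<List<Integer>> scopes = chunk(ids, 200);
        System.out.println(scopes.size() + " scopes: " + scopes.get(0).size()
                + ", " + scopes.get(1).size() + ", " + scopes.get(2).size());
    }
}
```

If any one of those three transactions fails, the other two keep their committed work, which is exactly the partial-processing behavior described above.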

This is a use case we all come across very often: schedule a batch class every hour to clean up data or to send out batch emails to case team members (which I’ll blog about later).
There are three main steps involved:
1.         Write a Batch class with the required logic
2.         Write a Scheduled Apex which calls the above Batch Class
3.         Schedule the class from the developer console by executing anonymous apex
Step 1: Write the batch class
Batch Apex gives us the advantage of running jobs that require much more than the usual governor limit context; for example, batch jobs are well suited to performing a common UPSERT operation on a schedule. Batch Apex can conveniently handle recurring tasks as well as complex jobs, ranging from data cleansing and archiving to other data-quality improvements.

Learn more about Batch Apex here.
Example of a batch class
global class ExampleBatchClass implements Database.Batchable<sObject>{

    global String query; // SOQL query string; set it in the constructor

    global ExampleBatchClass(){
        // Batch constructor
    }

    // Start method
    global Database.QueryLocator start(Database.BatchableContext BC){
        return Database.getQueryLocator(query);
    }

    // Execute logic
    global void execute(Database.BatchableContext BC, List<sObject> scope){
        // Logic to be executed batch-wise
    }

    global void finish(Database.BatchableContext BC){
        // Post-processing logic, e.g. send a notification email
    }
}

Step 2: Write the Schedulable class

Now we have the batch class ready, and it has to be in a schedulable context in order to schedule the batch.
You can learn more about schedulable apex here

Example of a Scheduled Apex
  global class scheduledBatchable implements Schedulable{
     global void execute(SchedulableContext sc) {
        // Implement any logic to be scheduled

        // We now call the batch class to be scheduled
        ExampleBatchClass b = new ExampleBatchClass();
        // Parameters of executeBatch(context, batchSize)
        Database.executeBatch(b, 200);
     }
  }

Step 3: Schedule the class by executing anonymous

Finally, we can now schedule the batch class. There are two ways to do it:

1.         From Setup --> Apex Classes --> Schedule Apex: but here the minimum interval is one day / 24 hours
2.         By executing anonymous code from the Developer Console or Apex; here the minimum interval is 1 hour

Code to be executed
// Cron expression for an hourly schedule
   String CRON_EXP = '0 0 * * * ?';
   scheduledBatchable sch = new scheduledBatchable();
   System.schedule('Hourly Example Batch Schedule job', CRON_EXP, sch);

If you want to run it as frequently as every 15, 30, or N minutes, schedule multiple copies of the job with staggered cron expressions:
System.schedule('Job1', '0 * * * * ?', new SchJob());
System.schedule('Job2', '0 15 * * * ?', new SchJob());
System.schedule('Job3', '0 30 * * * ?', new SchJob());
System.schedule('Job4', '0 45 * * * ?', new SchJob());
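For reference, these cron expressions use the field order Seconds Minutes Hours Day_of_month Month Day_of_week (with an optional Year). The small Java helper below (illustrative only; `describe` is not a Salesforce API) splits an expression into labelled fields, which makes it easy to read why '0 0 * * * ?' fires at the top of every hour and '0 15 * * * ?' at quarter past:

```java
public class CronFields {

    static final String[] NAMES = {
        "Seconds", "Minutes", "Hours", "Day_of_month", "Month", "Day_of_week", "Year"
    };

    // Split a cron expression on whitespace and label each field by position
    static String describe(String cron) {
        String[] parts = cron.trim().split("\\s+");
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < parts.length; i++) {
            if (i > 0) sb.append(", ");
            sb.append(NAMES[i]).append('=').append(parts[i]);
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        // Hourly schedule: second 0, minute 0, every hour of every day
        System.out.println(describe("0 0 * * * ?"));
        // Staggered copy: second 0, minute 15, every hour of every day
        System.out.println(describe("0 15 * * * ?"));
    }
}
```

Since each scheduled job fires once per hour at its fixed minute, four jobs at minutes 0, 15, 30, and 45 together give an effective 15-minute frequency.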

Custom settings to enable disable salesforce validation rules


How to disable/enable all validation rules for data loading

While working on a recruiting application, I found a solution for loading data into a Salesforce application without being blocked by validation rules.
Validation rules are usually intended to apply only when a user creates a record, not when data is imported from an external database. In this recruiting application, candidate records go through several stages in sequence (1-lead, 2-phone, 3-applicant, 4-interview, 5-contract negotiation, etc.), and a validation rule prevented the import process from loading candidate records in a stage higher than lead.

So the solution was to create a Custom Setting of type Hierarchy with a flag/checkbox in it that disables validation rules for a given user or profile. That is, all the validation rules will include this flag and only apply when the value of this flag is enabled.
To implement it:
1) click Setup, then on the left side, click App Setup/Develop/Custom Settings.
2) click New and create your settings as hierarchy/public

3) now create a custom field of type Checkbox:  click New, select Checkbox, click Next, type the name of the field as “Disable Validation Rules”, default to Unchecked, click Next, then click Save.
4) in the Custom Setting Definition/Your App Settings screen, click Manage to start configuring the setting you just created.
5) click the New button above “Default Organization Level Value”, leave the checkbox “Disable Validation Rules” unchecked and then click Save.
6) click “Back to List”, click the New button at the bottom, select Profile or User then choose the user id or profile that will perform the data loading, then click “Disable Validation Rules” and click Save.
7) now edit the validation rule, appending the expression below to its formula
&& NOT( $Setup.Your_App_Settings__c.Disable_Validation_Rules__c )
8) click Save
9) now the validation rule will only apply if the “Disable Validation Rules” checkbox is unchecked for your profile/user
10) you can now load data freely and then, later, re-enable all validation rules for your profile/user by changing the custom setting back
11) you can use this way of implementing Custom Settings in triggers too; just use the syntax below:
 Your_App_Settings__c s =
 Your_App_Settings__c.getInstance( UserInfo.getUserId() ); // or Profile
 if( s.Disable_Validation_Rules__c ) return; // skip trigger...
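The resolution that getInstance() performs for a Hierarchy custom setting (a user-level value wins over a profile-level value, which wins over the org default) can be sketched in plain Java; the maps and method below are illustrative stand-ins, not Salesforce APIs:

```java
import java.util.HashMap;
import java.util.Map;

public class HierarchySetting {

    // Hypothetical stand-ins for the three levels of a Hierarchy custom setting
    static Map<String, Boolean> userLevel = new HashMap<>();
    static Map<String, Boolean> profileLevel = new HashMap<>();
    static boolean orgDefault = false;

    // Resolve like getInstance(): user overrides profile, profile overrides org default
    static boolean disableValidationRules(String userId, String profileId) {
        if (userLevel.containsKey(userId)) return userLevel.get(userId);
        if (profileLevel.containsKey(profileId)) return profileLevel.get(profileId);
        return orgDefault;
    }

    public static void main(String[] args) {
        // Flag enabled only for the hypothetical data-loading profile
        profileLevel.put("DataLoaderProfile", true);
        System.out.println(disableValidationRules("someUser", "DataLoaderProfile")); // true
        System.out.println(disableValidationRules("someUser", "StandardUser"));      // false
    }
}
```

This is why step 6 above targets only the user or profile that performs the data load: everyone else falls through to the unchecked org default, so their validation rules stay active.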

What is implicit sharing in salesforce

Implicit Sharing
The sharing capabilities of the Force.com platform include a wide variety of features that administrators can use to explicitly grant access to data for individuals and groups. In addition to these more familiar functions, there are a number of sharing behaviors that are built into Salesforce applications. This kind of sharing is called implicit because it is not configured by administrators; it is defined and maintained by the system to support collaboration among members of sales teams, customer service representatives, and clients or customers.
The list below describes the different kinds of implicit sharing built into Salesforce applications and the record access that each kind provides.

Parent
Read-only access to the parent account for a user with access to a child record
·       Not used when sharing on the child is controlled by its parent
·       Expensive to maintain with many account children
·       When a user loses access to a child, Salesforce needs to check all other children to see if it can delete the implicit parent.

Child
Access to child records for the owner of the parent account
·       Not used when sharing on the child is controlled by its parent
·       Controlled by child access settings for the account owner’s role
·       Supports account sharing rules that grant child record access
·       Supports account team access based on team settings
·       When a user loses access to the parent, Salesforce needs to remove all the implicit children for that user.

Portal
Access to records owned by or shared with portal users, for internal users
·       Shared to the role of the account owner
·       Also supports inheritance within portal roles

Portal Account
Access to the portal account and all associated contacts for all portal users under that account
·       Shared to the lowest role under the portal account

Community¹
Access to data owned by Community users under a portal, for internal users who are members of the portal share group
·       All members of the share group gain access to every record owned by every Community portal user.

Community Parent
Access to the parent accounts of child records shared through the Community portal share group, for internal users who are members of that group
·       Maintains the ability to see the parent account when internal users are given access to account children owned by Community portal users

¹ To allow portal users to scale into the millions, Community users have a streamlined sharing model that does not rely on roles or groups and functions similarly to calendar events and activities. Community users are provisioned with the Service Cloud Portal or Authenticated Website licenses.

How to call future (asynchronous) methods from Salesforce Apex triggers without running into governor limits

 Use @future Appropriately
     As articulated throughout this article, it is critical to write your Apex code to efficiently handle bulk or many records at a time. This is also true for asynchronous Apex methods (those annotated with the @future keyword).
          Even though Apex written within an asynchronous method gets its own independent set of higher governor limits, it still has governor limits. Additionally, no more than ten @future methods can be invoked within a single Apex transaction.

Here is a list of governor limits specific to the @future annotation:
·    No more than 10 method calls per Apex invocation
·    No more than 200 method calls per Salesforce license per 24 hours
·    The parameters specified must be primitive datatypes, arrays of primitive datatypes, or collections of primitive datatypes
·    Methods with the future annotation cannot take sObjects or objects as arguments
·    Methods with the future annotation cannot be used in Visualforce controllers in either getMethodName or setMethodName methods, nor in the constructor
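The first limit is easy to reason about with a toy model. The Java sketch below (the process* methods are hypothetical stand-ins, not Salesforce APIs) counts invocations for the per-record pattern versus the bulkified pattern shown in the trigger examples that follow:

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class FutureBatching {

    static int invocations = 0;

    // Stand-in for an @future method that handles one record
    static void processOne(String id) { invocations++; }

    // Stand-in for a bulkified @future method that handles many records at once
    static void processMany(Set<String> ids) { invocations++; }

    public static void main(String[] args) {
        List<String> accountIds = new ArrayList<>();
        for (int i = 0; i < 200; i++) accountIds.add("001" + i);

        // Per-record pattern: 200 invocations, far over the limit of 10
        invocations = 0;
        for (String id : accountIds) processOne(id);
        System.out.println("per-record invocations = " + invocations);

        // Bulkified pattern: a single invocation for all 200 records
        invocations = 0;
        processMany(new HashSet<>(accountIds));
        System.out.println("bulkified invocations = " + invocations);
    }
}
```

The per-record loop would blow the 10-invocation limit on the eleventh record; passing the whole set consumes only one invocation, no matter how many records are in the transaction.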
Here, the Apex trigger inefficiently invokes an asynchronous method for each Account record it wants to process:
EX :
trigger accountAsyncTrigger on Account (after insert, after update) {
  for(Account a: Trigger.new){
    // Invoke the @future method for each Account
    // This is inefficient and will easily exceed the governor limit of
    // at most 10 @future invocations per Apex transaction
    asyncApex.processAccount(a.Id);
  }
}
Here is the Apex class that defines the @future method:
EX :
global class asyncApex {

  @future
  public static void processAccount(Id accountId) {
       List<Contact> contacts = [select id, salutation, firstname, lastname, email
                from Contact where accountId = :accountId];
       for(Contact c: contacts){
           System.debug('Contact Id[' + c.Id + '], FirstName[' + c.firstname + '], LastName[' + c.lastname + ']');
           c.Description = c.salutation + ' ' + c.firstname + ' ' + c.lastname;
       }
       update contacts;
  }
}

Since the @future method is invoked within the for loop, it will be called N-times (depending on the number of accounts being processed). So if there are more than ten accounts, this code will throw an exception for exceeding a governor limit of only ten @future invocations per Apex transaction.

Instead, the @future method should be invoked with a batch of records so that it is only invoked once for all records it needs to process:
trigger accountAsyncTrigger on Account (after insert, after update) {
    //By passing the @future method a set of Ids, it only needs to be
    //invoked once to handle all of the data.
    asyncApex.processAccount(Trigger.newMap.keySet());
}

And now the @future method is designed to receive a set of records:
EX :
global class asyncApex {

  @future
  public static void processAccount(Set<Id> accountIds) {
       List<Contact> contacts = [select id, salutation, firstname, lastname, email
                from Contact where accountId IN :accountIds];
       for(Contact c: contacts){
           System.debug('Contact Id[' + c.Id + '], FirstName[' + c.firstname + '], LastName[' + c.lastname + ']');
           c.Description = c.salutation + ' ' + c.firstname + ' ' + c.lastname;
       }
       update contacts;
  }
}
Writing Test Methods to Verify Large Datasets :
Here is the poorly written contact trigger. For each contact, the trigger performs a SOQL query to retrieve the related account. The flaw is that the SOQL query sits inside the for loop, so the trigger will throw a governor limit exception if more than 100 contacts are inserted/updated.
EX :
trigger contactTest on Contact (before insert, before update) {
    for(Contact ct: Trigger.new){
        Account acct = [select id, name, billingState from Account where Id = :ct.AccountId];
        if(acct.BillingState == 'CA'){
            System.debug('found a contact related to an account in california...');
            ct.email = 'test_email@testing.com';
            //Apply more logic here....
        }
    }
}
Here is the test method that tests if this trigger properly handles volume datasets:
EX :
public class sampleTestMethodCls {
    static testMethod void testAccountTrigger(){
        //First, prepare 200 contacts for the test data
        Account acct = new Account(name='test account');
        insert acct;
        Contact[] contactsToCreate = new Contact[]{};
        for(Integer x=0; x<200; x++){
            Contact ct = new Contact(AccountId=acct.Id, lastname='test');
            contactsToCreate.add(ct);
        }
        //Now insert data, causing the contact trigger to fire.
        Test.startTest();
        insert contactsToCreate;
        Test.stopTest();
    }
}
Note the use of Test.startTest and Test.stopTest. When executing tests, code called before Test.startTest and after Test.stopTest receives a separate set of governor limits from the code called between Test.startTest and Test.stopTest. This allows any data setup to happen without affecting the governor limits available to the actual code being tested.

Now let's correct the trigger to properly handle bulk operations. The key to fixing this trigger is to get the SOQL query outside the for loop and only do one SOQL Query:
EX :
trigger contactTest on Contact (before insert, before update) {
    Set<Id> accountIds = new Set<Id>();
    for(Contact ct: Trigger.new){
        accountIds.add(ct.AccountId);
    }
    //Do the SOQL query once, outside the loop
    Map<Id, Account> accounts = new Map<Id, Account>(
        [select id, name, billingState from Account where id in :accountIds]);
    for(Contact ct: Trigger.new){
        if(accounts.get(ct.AccountId).BillingState == 'CA'){
            System.debug('found a contact related to an account in california...');
            ct.email = 'test_email@testing.com';
            //Apply more logic here....
        }
    }
}
Note how the SOQL query retrieving the accounts is now done once only. If you re-run the test method shown above, it will now execute successfully with no errors and 100% code coverage.
Avoiding Hardcoding :
Here is a sample that hardcodes the record type IDs used in a conditional statement. This will work fine in the specific environment in which the code was developed, but if this code were installed in a separate org (i.e., as part of an AppExchange package), there is no guarantee that the record type identifiers would be the same.
EX :
for(Account a: Trigger.new){
    //Error - hardcoded record type Id (the first Id below is an illustrative placeholder)
    if(a.RecordTypeId=='012300000000Abc'){
        //do some logic here.....
    }else if(a.RecordTypeId=='0123000000095Km'){
        //do some logic here for a different record type...
    }
}
           Now, to properly handle the dynamic nature of the record type IDs, the following example queries for the record types in the code, stores the dataset in a map collection for easy retrieval, and ultimately avoids any hardcoding.
EX :
//Query for the Account record types
List<RecordType> rtypes = [Select Name, Id From RecordType
             where sObjectType='Account' and isActive=true];
//Create a map between the Record Type Name and Id for easy retrieval
Map<String,String> accountRecordTypes = new Map<String,String>{};
for(RecordType rt: rtypes){
    accountRecordTypes.put(rt.Name, rt.Id);
}
for(Account a: Trigger.new){
    //Use the Map collection to dynamically retrieve the Record Type Id
    //Avoid hardcoding Ids in the Apex code
    if(a.RecordTypeId==accountRecordTypes.get('Healthcare')){ // example record type name
        //do some logic here.....
    }else if(a.RecordTypeId==accountRecordTypes.get('High Tech')){
        //do some logic here for a different record type...
    }
}