05 January 2015

Service Cloud Certification Tips

I took the Salesforce Service Cloud certification exam last October and, luckily, passed. It was a difficult exam even though I have experience using the module, so I want to share tips on the topics I remember from the exam which I think everyone should review before taking it.


  • Call Center KPI reports. There were scenario-based questions that require you to determine which metrics/reports are best to use. I used this link as a starting point to get familiar with the common reports.
  • Visual Workflow. There were questions on the use cases for Visual Workflow. Interestingly, I didn't encounter any questions about its component types, which all of the blogs I read about the exam suggested reviewing.
  • Data migration of Knowledge articles. There were questions that specifically asked about the steps to follow and the fields to watch out for when migrating an on-premise knowledge base to Salesforce Knowledge.
  • Data categories. There were questions on the use cases for data categories and on how best to grant users access to articles through data categories using roles or profiles.
  • Implementation improvements for contact centers. There were scenario-based questions on how to lower the maintenance cost of a contact center and how to enforce adherence to company goals, such as allowing customers to get support through every channel Salesforce supports, using any Service Cloud functionality.
  • Entitlement management. There were questions on entitlement management and the models it supports (e.g. entitlements only, service contracts with entitlements, service contracts with line items and entitlements).
  • Salesforce Console for Service. There were tricky questions about the old agent console and the new Salesforce Console.
  • Implementation scenarios for calls using a CTI adapter versus Open CTI.
  • Benefits of using KCS around the case management module.
  • Email-to-Case versus On-Demand Email-to-Case.
In my opinion, the best way to pass the exam is to read through the topics listed in the exam study guide. Working through sample configurations from the Service Cloud workbook will also greatly help, since you will get familiar with the process and configuration needed to enable and use the Service Cloud features.

19 December 2014

Your runAllTests request is using too many DB resources

A couple of months ago, I encountered a very strange error when deploying components to Production: Salesforce threw an error message saying "Your runAllTests request is using too many DB resources". Digging deeper into the error message, I came upon this knowledge article that mentions a limit of 350,000 DML rows across all test classes. What's fascinating is that the limit is not in the documentation. From what I understood, the limit is associated with the Oracle database rollback used in the backend.

This is a really big issue, especially if there are already a lot of applications on your instance and you plan to add more. Also, regardless of whether you optimize your code to limit the data created in test classes, it is only a matter of time before you reach the limit. If your environment is not yet enabled for FAST Deploy, as discussed in my previous blog post, you will not be able to do a regular deployment.

As of right now there is no fool-proof solution for this, but there are a couple of things I recommend you check if you are getting this error message.


  1. Check if your environment is loading bulk data from static resources. I'm not a big fan of the Salesforce documentation because some of the information is vague (at least to me), and in my experience getting support to clarify how Salesforce computes a given limit is slow and takes weeks to confirm. Basically, data loaded from a static resource using Test.loadData is counted as part of this overall DML row limit.
  2. Are you using custom settings to hold your environment variables? One of the things I learned from previous Dreamforce sessions is that you can use custom settings to hold environment variables so that you don't need to maintain static values in a class, since it is hard to redeploy components to a Production environment (custom labels are not a good fit for storing such values, since they should be used for multilingual content). This is a handy pattern, similar to ones used in Java and .NET. However, you may want to review it, because the records it creates in test classes count toward the limit and will become a liability in the end.
  3. Consider a multi-org setup. I'll be honest: as of right now I haven't implemented this for a client, but having multiple instances definitely means you have separate limits for each of your unrelated applications. If some of your applications are connected, you can take a look at the Salesforce to Salesforce feature.
  4. Use SeeAllData=true. I don't really recommend this, since it slows down test class execution and is not a best practice, but it can be a temporary workaround, especially if you are creating a lot of test records in your test classes, whether they come from custom settings or are coded in the class.
  5. Check if a managed package will help. This is something I'm not sure will work and you may want to check with Salesforce, but from what I understand, a managed package has limits separate from unmanaged packages. (I would appreciate it if someone could confirm whether this works.)
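To illustrate item 1, here is a minimal sketch of a test class that loads records from a static resource. The static resource name 'testAccounts' is hypothetical; the point is that every record created through Test.loadData counts toward the same overall DML row limit as records inserted directly in test code.

```apex
@isTest
private class AccountLoadTest {

    @isTest
    static void loadFromStaticResource() {
        // 'testAccounts' is a hypothetical CSV static resource containing
        // Account records. Every row inserted here counts toward the
        // org-wide 350,000 DML row limit across all test classes.
        List<sObject> accounts = Test.loadData(Account.sObjectType, 'testAccounts');
        System.assert(!accounts.isEmpty());
    }
}
```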

I hope you find the list above helpful. Let me know if you encounter the same issue and if you have any thoughts on how to fix it. In addition, please vote for my idea on removing the DML row limit for data loaded from static resources.

16 November 2014

Install Android Lollipop OTA manually on a MAC

After days of frustration waiting for the Android Lollipop OTA update, I decided to manually install the firmware on my Nexus 5. Searching the internet, I stumbled upon a YouTube video by an awesome guy on how to install the update, and I have tried to list down the steps for newbies like me to follow.

This will NOT delete any files from your device! 
Follow the steps at your own risk!

PREREQUISITES

  • The Android SDK, downloaded for your Mac
  • The Lollipop OTA zip file for your Nexus 5

STEPS TO FOLLOW

1. Extract the Android SDK to any location you want on your Mac. (I put it in my Documents folder.)

2. Copy the OTA file into the sdk/platform-tools folder of the extracted Android SDK. Optionally, rename the zip file to whatever you want (I renamed it lollipop.zip, as in the YouTube video).



3. Plug your Nexus 5 to your MAC.

4. Open Terminal and go to the platform-tools folder by executing the command below without quotes (note that the location changes depending on where you extracted the SDK):

"cd Documents/android/sdk/platform-tools" 

5. Execute "./adb devices" to verify that your device is connected.


NOTE

  • If no device is listed, make sure that your Nexus 5 is in debugging mode by going to Settings --> Developer options. If you cannot see this option, check this site.
  • If you need instructions on how to enable debugging mode, see this link.

6. Go to recovery mode by pressing the power button and volume down, or by executing the command below without quotes in Terminal:

"./adb reboot-bootloader"




7. Select the "Apply update from ADB" option.



8. Execute the following in Terminal without quotes (replace the file name with the actual file name from step 2):

"./adb sideload lollipop.zip"

NOTE

  • Do not disconnect your phone from your laptop.
9. Wait until the phone completes the installation.



Congratulations, you are now on Lollipop!


03 August 2014

REVIEW: FAST Deploy using Force.com Ant Migration Toolkit

One of the pain points when deploying components in Salesforce, regardless of whether you use change sets, the Force.com IDE or ANT, is that it runs all test classes in your Production environment if any of the components being deployed is one of the following metadata types (as per the Metadata API Developer Guide v31):

  • ApexClass
  • ApexComponent
  • ApexPage
  • ApexTrigger
  • ArticleType
  • BaseSharingRule
  • CriteriaBasedSharingRule
  • CustomDataType
  • CustomField
  • CustomObject
  • DataCategoryGroup
  • Flow
  • InstalledPackage
  • NamedFilter
  • OwnerSharingRule
  • PermissionSet
  • Profile
  • Queue
  • RecordType
  • RemoteSiteSetting
  • Role
  • SharingReason
  • Territory
  • ValidationRule
  • Workflow
Now imagine if your environment contains a huge amount of customization. This could mean hundreds to thousands of Apex test classes being run every time you deploy. The time it takes to run all of these test classes depends on how optimized the code and test classes are; I have experienced waiting four hours for a couple hundred Apex components to be deployed to Production.

Introducing FAST Deploy (Pilot)


FAST Deploy allows admins to specify which test classes to run during deployment. The instance I'm currently supporting was fortunate enough to be part of the program, and I can say that it provides real value, especially to administrators: the duration of our deployments has been cut from hours to minutes.

How to use FAST Deploy (Pilot)


The following are the prerequisites:
  1. The instance should be a part of the pilot program (contact your account manager).
  2. Currently this is only possible when using the ANT Migration Toolkit.
If you are familiar with ANT migration scripting then this is fairly easy; otherwise you can learn more at this link. You just need to add <runTest> nodes inside the <sf:deploy> task in the build.xml of your ANT script. See the example below:


<target name="deployCode">
    <sf:deploy username="${sf.username}"
               password="${sf.password}"
               serverurl="${sf.serverurl}"
               maxPoll="${sf.maxPoll}"
               deployRoot="codepkg">
        <runTest>TestClassName1</runTest>
        <runTest>NameSpace.TestClassName2</runTest>
    </sf:deploy>
</target>

Here's a screenshot of a sample deployment from the Setup -> Deploy -> Deployment Status section. Most of the components here are Apex code and custom fields, yet the deployment ran only 12 test methods and took only 5 minutes.



There are a few caveats, though, to using this feature:

  • The specified test classes must provide at least 75% code coverage for each Apex trigger/class in the package.xml.
  • Aggregate code coverage from different test classes is not counted. For example, if one test class covers 50% of an Apex trigger and another covers 25% of the same trigger, running both is not considered 75% coverage.
  • As mentioned above, all of this is only possible using the Force.com ANT Migration Toolkit.
Since this runs only a subset of the test classes in the environment, I recommend that administrators ensure all test classes are still run from time to time to get the overall code coverage of the environment.



19 June 2014

Salesforce Login Types

For some reason I could not find any public Salesforce documentation or KB articles that discuss what each of the login types in the Login History page means. So I asked support, and they provided me with the table below, which I'm sharing to help anyone who may need it.

  • Application: UI login
  • Chatter Communities External User: Community user login
  • Other Apex API: Other Apex API
  • Partner Product: Partner integration login
  • Remote Access 2.0: OAuth 2.0 login (for example, a connected app login)
  • SAML Chatter Communities External User SSO: Community user login via Single Sign-On
  • SAML Idp Initiated SSO: SAML Single Sign-On with login initiated from the IdP (Identity Provider) side
  • SAML Sfdc Initiated SSO: SAML Single Sign-On with login initiated from the Salesforce side; Salesforce redirects to the IdP, then logs the user in after receiving the IdP's SAML response
  • SYNC: Login from a SYNC client

16 June 2014

Unlocking the power of custom settings for data loading

A common use case when doing data loads is to not execute anything in the environment, such as workflows, validation rules or Apex triggers. This can only be achieved if (1) you have migrated only the objects you are targeting for the data load while the workflows and triggers are not yet deployed, (2) the workflows and triggers are deactivated, which is somewhat of a hassle for admins and developers as they have to manually deactivate the workflows and, worse, deploy the deactivated triggers, or (3) you explicitly exclude the data load user in the workflow criteria or the Apex code of the triggers. Custom settings can be used to hold per-user configuration for skipping such automations.

Below is a sample custom setting I created with 3 fields.


I also created a small Apex class that interfaces with this custom setting and will be used in Apex triggers.

public class AutoSkipSettings {

    public static Boolean skipTrigger() {
        Boolean skipTrigger = false;

        // Retrieve the user-level custom setting record for the running user
        // (null if no record exists for this user).
        Automation_Skip_Settings__c userSkipSetting =
            Automation_Skip_Settings__c.getValues(UserInfo.getUserId());

        if (userSkipSetting != null && userSkipSetting.Skip_Trigger__c) {
            skipTrigger = true;
        }

        return skipTrigger;
    }
}

The class simply retrieves the custom setting record for the running user and returns the value of the checkbox; in this case, I'm returning the value of the Skip_Trigger__c field. Now we are ready to use the custom setting in validation rules, workflow rule criteria and Apex triggers.

In a validation or workflow rule, the custom setting fields are available as global variables. Note that for a workflow rule, you need to select 'formula evaluates to true' in the rule criteria section.
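For example, a validation rule can be made bypassable by wrapping its error condition with a check on the setting, referenced through the $Setup global variable. This is a sketch: Skip_Validation_Rule__c is a hypothetical field on the custom setting, and ISBLANK(Phone) stands in for your real error condition.

```
AND(
    NOT($Setup.Automation_Skip_Settings__c.Skip_Validation_Rule__c),
    ISBLANK(Phone)
)
```

With this formula, the rule only fires when the running user's (or the org default) Skip_Validation_Rule__c checkbox is unchecked.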


On any apex trigger, use the apex class created above.

trigger TestAccountTrigger on Account (before update) {

    if (!AutoSkipSettings.skipTrigger()) {
        // do anything here
    }
}

Additional Notes:
  • The reason we used the hierarchy type for the custom setting is so that we can have a default organization-level value.
  • A custom setting is much better than a custom object here: we save one SOQL query in the code and can reference the fields as global variables in formulas.





17 March 2013

Testing SFDC Web Service using SoapUI

It's not often that you need to test your custom SFDC web service code using raw XML request/response. Usually when you create your own custom web service you just generate the WSDL file, give it to your integrator, and their middleware tool does all of the conversions.

Recently I was challenged by one of my colleagues on how I test my web services (just because they could not figure out how to consume the WSDL file). I told him that I use Eclipse or the Force.com IDE to check that the services run correctly by executing an anonymous call. This is OK, but you are not really calling your web service using the standard XML request/response.

So how can you test your web service?

First, you need SoapUI. It's a free tool you can use on PC and Mac.

Second, you need to download your Salesforce WSDL file. You can use either the Enterprise or the Partner WSDL from Setup -> Develop -> API.


Third, create your custom web service logic. This is optional, as you can use the standard web service methods from the API. In the example below I just created a simple service that returns the string 'Hello World!' and downloaded its WSDL file.


/*
    Description: This class is called from SoapUI
*/
global class SimpleService {
    webservice static String sampleService() {
        return 'Hello World!';
    }
}

Fourth, create a new project in SoapUI and use the WSDL downloaded in the second step. For this example I used the Enterprise WSDL.



After the project is created, import the custom service WSDL file by right-clicking on the project and selecting the 'Add WSDL' context menu item. Select the custom web service WSDL.


You should now have two nodes in the project: one for the Enterprise WSDL and one for the custom web service you coded, as below.



The fifth step is to create a login request. Delete all of the values from the <soapenv:Header> section, then add your actual username and password to the appropriate tags as in the screenshot below, and click the submit icon. It should return a SOAP response envelope.
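For reference, a minimal Enterprise WSDL login request body looks roughly like the sketch below. The exact prefixes SoapUI generates may differ, and the username and password values are placeholders (if your IP address is not whitelisted, append your security token to the password):

```xml
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
                  xmlns:urn="urn:enterprise.soap.sforce.com">
   <soapenv:Body>
      <urn:login>
         <urn:username>your.username@example.com</urn:username>
         <urn:password>yourPasswordPlusSecurityToken</urn:password>
      </urn:login>
   </soapenv:Body>
</soapenv:Envelope>
```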



Sixth, take the session Id from the SOAP response in step 5 and use it in your custom service call.
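The header of the custom service request ends up looking roughly like this sketch (the namespace prefix depends on what SoapUI generates from the class WSDL, and the session Id value is a placeholder copied from the login response):

```xml
<soapenv:Header>
   <urn:SessionHeader>
      <urn:sessionId>PASTE_SESSION_ID_FROM_LOGIN_RESPONSE</urn:sessionId>
   </urn:SessionHeader>
</soapenv:Header>
```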



Click the submit icon and it should come back with the response envelope, and that's it. Congratulations!


Please note that I have removed all of the header information except for the session Id, which is required.

Additional notes:
- If you have a custom web service with parameters and you need to pass a null value, Salesforce requires you to remove the parameter node from the request instead of passing a node without a value. This behavior is the same as for outbound messages, where Salesforce removes the field from the message if it is null.