Einstein Analytics and Discovery Consultant

I am excited to announce that I am now a Certified Einstein Analytics and Discovery Consultant. It took me a while, since the exam was released last year, and I failed my first two attempts.

Here are the things I did wrong before.

I now recall one of the introduction webinars for the certification where the presenter clearly said you will not pass if you don't finish the learning path, and both superbadges were in that path. Follow the links posted in the Analytics chatter group, and if you have access to the Partner Community, follow the partner chatter group as well. Lastly, follow the Einstein Analytics & Discovery Cert FP partner chatter group.

From my experience of failing twice, most of the questions change every release, but if you understand the concepts you will breeze through; my final attempt certainly felt that way. Several blogs and the official learning path content helped me get over the finish line.

As a path forward, I am excited about the future of analytics with Tableau and am going to do some more learning. The new Tableau for Partners chatter group sounds interesting. I will keep posting on analytics in the future.

Tip: Manage multiple JVMs in Mac with Homebrew

I started my journey to learn MuleSoft Development Fundamentals using Mule 4, which currently supports only OpenJDK 1.8, while the current version of Java is 12. I thought it might be useful to share how to manage multiple JDK versions on a Mac. You don't want to downgrade your default JDK to 8, as that might impact other applications.

The simplest way to manage them is with a package manager, and my choice on Mac is Homebrew. Quoting Homebrew: “Homebrew installs the stuff you need that Apple (or your Linux system) didn’t.” Run this from the terminal to install Homebrew,

/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install.sh)"

Next, we need Homebrew Cask, which extends Homebrew to install GUI applications and other binaries, each packaged as a "cask" (analogous to a formula). The homebrew/cask-versions tap adds alternate versions of those casks, which is what we need for older JDKs. If you already have Homebrew installed but have not used it for some time, it is always better to update it first. Run this from the terminal to add the tap and update brew if needed,

brew update && brew upgrade && brew tap homebrew/cask-versions && brew cleanup

To install OpenJDK 8, run this command from the terminal,

brew cask install adoptopenjdk/openjdk/adoptopenjdk8

List all installed Java versions,

ls /Library/Java/JavaVirtualMachines/

In my case this is the output,

adoptopenjdk-8.jdk	jdk-12.0.1.jdk		jdk-9.0.4.jdk		jdk1.8.0_151.jdk	jdk1.8.0_181.jdk

Install jenv to manage Java environments. Run this from the terminal,

brew install jenv
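
One step that is easy to miss: jenv must be initialized in your shell before it can manage versions. Per the jenv docs, for bash add these two lines to your profile (use ~/.zshrc instead if you are on zsh), then restart the terminal,

echo 'export PATH="$HOME/.jenv/bin:$PATH"' >> ~/.bash_profile
echo 'eval "$(jenv init -)"' >> ~/.bash_profile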

Add JVMs to jenv,

jenv add /Library/Java/JavaVirtualMachines/adoptopenjdk-8.jdk/Contents/Home/

jenv add /Library/Java/JavaVirtualMachines/jdk-12.0.1.jdk/Contents/Home/

jenv add /Library/Java/JavaVirtualMachines/jdk-9.0.4.jdk/Contents/Home/

jenv add /Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/

Initially, jenv will use the system-installed version of Java. Let's set the one we need. Before we proceed, you can check all the versions registered with jenv using this command,

jenv versions

It should show all the installed versions, with the current global version highlighted with a *,

* system (set by /Users/username/.jenv/version)
  1.8
  1.8.0.181
  1.8.0.242
  12.0
  12.0.1
  9.0
  9.0.4
  openjdk64-1.8.0.242
  oracle64-1.8.0.181
  oracle64-12.0.1
  oracle64-9.0.4

Run this from the terminal to make Java 8 the default for your user,

jenv global openjdk64-1.8.0.242

To set Java 8 for only the current directory (your Mule project folder, for example), use jenv local instead,

jenv local openjdk64-1.8.0.242

Now jenv versions should return this,

  system
  1.8
  1.8.0.181
  1.8.0.242
  12.0
  12.0.1
  9.0
  9.0.4
* openjdk64-1.8.0.242 (set by /Users/username/.jenv/version)
  oracle64-1.8.0.181
  oracle64-12.0.1
  oracle64-9.0.4

Check the Java version,

java -version

It should return this,

openjdk version "1.8.0_242"
OpenJDK Runtime Environment (AdoptOpenJDK)(build 1.8.0_242-b08)
OpenJDK 64-Bit Server VM (AdoptOpenJDK)(build 25.242-b08, mixed mode)

Hope this tip helps anyone who needs to manage multiple JVM versions.

Platform Developer II

This week I became a Salesforce Certified Platform Developer II, my 14th active Salesforce certification. This is one I have wanted ever since it was called Advanced Developer, and it took me a while to finally nail it. There are plenty of study guides available for this exam, so I will skip to my exam experience and tips.

The format of the exam has changed since the old days of the Advanced Developer certification. You no longer need to clear the multiple-choice exam and then wait for the assignment and the post-assignment essay. Instead, you have to earn the Platform Developer I certificate (a multiple-choice exam), complete four superbadges, and finally clear the Platform Developer II multiple-choice exam.

In my opinion, finishing the superbadges (which have a lot of prerequisite badges and trails) and then appearing for Platform Developer I and II back to back is the fastest and most efficient approach. That was not the case for me, though: I cleared Platform Developer I a few years back, lost the edge over time, and had to start over. Working on multiple projects involving complex customization and configuration helped, I guess.

If you prefer, you may take the Platform Developer II multiple-choice exam first and then attempt the superbadges. The multiple-choice exam is much simpler than the superbadges, and as I said, if you finish the superbadges and pay attention to all the prerequisite materials, you will breeze through the multiple-choice exam. Most questions present scenario-based examples where you have to choose right from wrong or predict the output. You rarely have to memorize specific coding syntax; rather, you have to apply what you would easily remember after using the customization tools of the Salesforce platform in your day-to-day work for a while.

The Platform Developer II multiple-choice exam focuses on the following,

  • Apex
  • Visualforce
  • Data Modeling
  • Performance Tuning
  • Integration
  • Testing
  • A tiny bit of Aura
  • Fraction of a tiny bit of LWC

Overall, the exam objectives are well balanced against what a Salesforce advanced developer would use in day-to-day work. I hope this post helps those who are preparing for this exam, and best of luck with yours.


Einstein Analytics Single Page Overview

The objective of this post is to put together the overall architecture and basic building blocks of Einstein Analytics. This post excludes Einstein Discovery Insights.

Einstein Analytics The Big Picture

[Diagram: Einstein Analytics - Architecture Big Picture]

Basic Building Blocks

  1. Data Sources: the source systems of the data that we will analyze using EA
    • Local Salesforce Environment
    • External Salesforce Environment
    • External System(s)
    • CSV extracts
  2. Connection: Connection becomes available once Data Sync is enabled. Without Connection, only the Local Salesforce Environment is available as a data source. Each connection represents one source system. Even if the local Salesforce environment is the only data source, enabling Data Sync decouples data extraction and transformation and improves performance.
  3. Data
    • Connected Data: each data table/object is represented as one connected data object. Connected data objects are not available directly in a lens or dashboard; they must first be transformed using a dataflow or a recipe.
    • Datasets: datasets are the end result of a transformation performed by a dataflow or a recipe.
    • Salesforce Direct: objects from the local Salesforce org are available for direct use in a dashboard via Salesforce Direct.
  4. Transformation
    • Dataflow: a dataflow can consume one or more connected data objects and datasets, perform various transformation operations, and generate one or more datasets. There are limits on the total number of dataflows and on the execution time of long-running dataflows, so it is recommended to optimize the transformations and minimize the total number of dataflows. (A minimal example of a dataflow definition appears after this list.)
      [Diagram: Einstein Analytics - Dataflow vs Recipe (1)]
    • Recipe: a recipe can consume one or more connected data objects and datasets, perform various transformation operations, and generate one dataset. As with dataflows, optimizing the transformations and minimizing the number of long-running recipes is key. One benefit of a recipe over a dataflow is the UI: it is more visual about what is happening to the data while the transformation is being built, and a few transformations are available only in recipes.
      [Diagram: Einstein Analytics - Dataflow vs Recipe (2)]
  5. Analyze
    • Lens: explore a single dataset to gain insight.
    • Dashboard: perform advanced exploration of one or more datasets, or of Salesforce objects from the local org directly (via Salesforce Direct).
    • Application: Combine Dashboards, Lenses, and Datasets in an application and share with specific users, groups, roles, and roles & subordinates.
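
For reference, a dataflow is defined as JSON behind the scenes. Here is a minimal sketch of what a dataflow definition can look like, assuming the standard sfdcDigest (extract) and sfdcRegister (create dataset) nodes; the node names, object, and fields are illustrative only,

{
  "Extract_Opportunities": {
    "action": "sfdcDigest",
    "parameters": {
      "object": "Opportunity",
      "fields": [
        { "name": "Id" },
        { "name": "Name" },
        { "name": "Amount" },
        { "name": "StageName" }
      ]
    }
  },
  "Register_Opportunity_Dataset": {
    "action": "sfdcRegister",
    "parameters": {
      "alias": "OpportunityDataset",
      "name": "Opportunity Dataset",
      "source": "Extract_Opportunities"
    }
  }
}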

JSON Web Token

What is a JSON Web Token?

A JSON Web Token, referred to as JWT henceforth, is an open standard method for secure information exchange between two parties. The information can be anything; in this article we are going to explore the authorization use case. Actual implementation details of JWT and a working example of an authorization flow are coming soon.

When to use it?

  • Authorization: for secure authorization of a client application by an identity provider, where the authorization server trusts the client application.
  • Information exchange: Securely transmit information between two parties.

What does it look like?

It consists of three parts separated by dots (.)

  • Header
  • Payload
  • Signature

It looks like this,

xxx.yyyyy.zzzzz

Header

It consists of two attributes: the type and the signing algorithm. The attribute names may differ between implementations, but these values are the most widely used. It looks like this,

{
    "typ": "JWT",
    "alg": "RS256"
}

This JSON is base64url encoded to form the first part of the token. I will explain the algorithm and the encoding with an example in my next post.

Payload

This contains the actual information being exchanged. For authorization, it identifies the client application making the request. The exact claims depend on the implementation, but for our example it looks like this,

{
    "iss": "details about the issuer",
    "sub": "subject of the request",
    "aud": "audience (url of the identity provider)",
    "exp": <expiry date & time of the request>,
    "iat": <date & time of the request>
}

In an authorization flow with Salesforce as the authorization server, these are sample values,

iss: the issuer is always the client id (consumer key) generated by creating a connected app

sub: the username of an active user in the identity provider org who has access to the connected app and the API

aud: the login URL of the Salesforce org

exp: this depends on the implementation; for a Salesforce org, it has to be within 5 minutes of the request

iat: the request date and time. If this is more than a few minutes off from the server time, the request will be rejected. I will update the actual number soon.

This JSON is base64url encoded to form the second part of the token.

Note: base64url encoding makes the token URL safe, and the third part (the signature) protects it against tampering; however, the payload can be read by anyone who intercepts the token. Please be sure to communicate over an encrypted channel if there is confidential information. For authorization with Salesforce, this is ensured through TLS 1.1 and above.

Signature

This is the last part of the token, and it protects the token from tampering by any interceptor. The base64url encoded first two parts are concatenated with a dot (.) separator and signed using the algorithm defined in the header (RS256 for this example) and the private key of the client. It looks like this,

RS256 (
     base64URLEncode (header) + '.' +
     base64URLEncode (payload),
     privateKey
)
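
To make the construction concrete, here is a minimal sketch in Node.js; it assumes the client's private key sits in a local private.pem file, and the claim values are placeholders,

// Minimal JWT construction sketch -- illustrative, not production code
const crypto = require('crypto');
const fs = require('fs');

// base64url: standard base64 with URL-unsafe characters replaced and padding removed
const base64url = (input) => Buffer.from(input).toString('base64')
    .replace(/\+/g, '-').replace(/\//g, '_').replace(/=+$/, '');

const header = base64url(JSON.stringify({ typ: 'JWT', alg: 'RS256' }));
const payload = base64url(JSON.stringify({
    iss: '<connected app client id>',           // placeholder
    sub: '<username of the integration user>',  // placeholder
    aud: 'https://login.salesforce.com',
    iat: Math.floor(Date.now() / 1000),
    exp: Math.floor(Date.now() / 1000) + 180    // 3 minutes from now
}));

// sign "header.payload" with the private key (RS256 = RSA signature over SHA-256)
const signature = base64url(
    crypto.createSign('RSA-SHA256')
        .update(header + '.' + payload)
        .sign(fs.readFileSync('private.pem'))   // assumes key at ./private.pem
);

console.log(header + '.' + payload + '.' + signature);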

How it really looks

Here is a sample JWT with a base64url-encoded header, a base64url-encoded payload, and an RS256 signature.

eyJ0eXAiOiJKV1QiLCJhbGciOiJSUzI1NiJ9.eyJpc3MiOiIzTVZHOUNFbl9PM2p2djB3Q08yYkJBcVp3czZJa1lBR3IuYThKX2xHNzVZeEpkdXAuVms3NGhtNF9XSlZNTXFuUTJhREIxbDRTdkFCVGppbi5QRDcyIiwic3ViIjoic2FyZmFyYWpleUBwbGF5ZnVsLW1vb3NlLTM5MTUwNC5jb20iLCJhdWQiOiJodHRwczovL2xvZ2luLnNhbGVzZm9yY2UuY29tIiwiZXhwIjoxNTMzODI2Mzk3LCJpYXQiOjE1MzM4MjYyMTd9.NGSPTJqW5qMZVnuhGHLkOTAlefn0N12fxhm9PZnlh-kBW-ZTRyo40RxMFnUMzCi4aqcYjV986rboGPv7k0u1-ZzeoCubK0MfqPcu7Qi5OsYQwKmVvXTXxe8_VhzcLqNKB1VGkhl7EjCDK2TIqWI2CxOAqWbKtAtNpBjP2w_-viA0T88Q9VhR2D7lsDb_dEXuVpFBbJMGackDB2lNHh5UzJF-t-V2Gv-3lF6ywhAxlT6xmKwqltQdF9i2j5a_mGQIkPTOhnDF67P-OsPjUZrOFt0PDEeTQDNwcJACHsAjsO9IeqG6MMmdTqEDmv-9Ei5j56wp6tHMCpwyPR3xIoRSCQ

Please note that this is not encrypted, so anyone who gets hold of the token can decode it. There are nice online debuggers that decode it into a readable output,

[Screenshot: a JWT decoded in an online debugger]

Please note that the signature cannot be forged by an interceptor who does not have access to the private key of the requestor. The authorization server (or any recipient) can verify the signature using the public key of the requestor, confirming that the payload has not been altered.
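
For illustration, here is the matching verification sketch in Node.js, assuming the token is passed as a command-line argument and the requestor's public key sits in public.pem,

// Minimal JWT signature verification sketch
const crypto = require('crypto');
const fs = require('fs');

const [headerB64, payloadB64, signatureB64] = process.argv[2].split('.');

// undo the base64url substitutions before decoding
const fromBase64url = (s) =>
    Buffer.from(s.replace(/-/g, '+').replace(/_/g, '/'), 'base64');

const valid = crypto.createVerify('RSA-SHA256')
    .update(headerB64 + '.' + payloadB64)
    .verify(fs.readFileSync('public.pem'), fromBase64url(signatureB64));

console.log(valid ? 'signature verified' : 'signature tampered');
console.log(JSON.parse(fromBase64url(payloadB64).toString()));  // readable payload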

I will discuss this topic in detail with a working code sample in my next post. Thanks for reading.

Easy Deployment Using Unmanaged Package & SFDX

I have found Salesforce packages to be very useful in resolving dependencies during metadata migration. This comes in handy when we are deploying between orgs where we cannot use change sets, or when the complexity of the project forces us to migrate metadata manually.

Please note that this is not the use case for continuous integration, where the package.xml file is maintained in a more controlled manner.

The following approach will generate the package.xml and all metadata files with all dependencies in minutes, and you will be all set for deployment to the target org. Obviously, if there are conflicts between the metadata and the target org, those have to be resolved manually, on a case-by-case basis.

Step 1: Create an unmanaged package with all the components in your source org.

Go to Setup > Package Manager > New

Give a name to your package (testpackage in my case) and save

[Screenshot: creating the unmanaged package]

Click Add to add components

Please note that you only need to add the key components; all related components will be added to the package automatically.

For example, if you add an app, all objects included in that app, along with related custom fields, Apex triggers, Apex classes, Visualforce pages (if used as button overrides or inline pages in page layouts), page layouts, custom settings, etc. will automatically be added.

[Screenshot: adding components to the package]

For the purpose of this activity you don’t need to upload the package.

Step 2: Install SFDX (the Salesforce CLI) on your computer, following the Salesforce CLI installation instructions.

Step 3: Open Terminal if you are using a Mac, or a Command Window on Windows. Please be sure to add the SFDX location to your PATH. Go to your workspace directory, make a directory named deployment, and cd into that directory

[Screenshot: terminal in the deployment directory]

Step 4: Connect your source org. Change login.salesforce.com to test.salesforce.com if you are using a sandbox, or to your custom domain if you have one enabled and login is restricted outside of the domain.

sfdx force:auth:web:login -r https://login.salesforce.com -a sourceOrg1

This will open a window in your default browser with the login page. Log in, authorize, and close the browser window once successful. This is what you will see in your terminal,

[Screenshot: successful source org authorization]

Step 5: Similarly, connect your target org,

sfdx force:auth:web:login -r https://login.salesforce.com -a targetOrg1

[Screenshot: successful target org authorization]

Step 6: Extract the metadata from the source org,

sfdx force:mdapi:retrieve -s -r ./ -u sourceOrg1 -p testpackage

[Screenshot: metadata retrieve output]

This will generate a package.xml with details of all the components and download all the metadata. Please note that it arrives as a zip file. Unzip and modify it if you need to,

[Screenshot: unzipped package contents]
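
If you prefer the terminal, the zip can be unpacked in place with,

unzip ./unpackaged.zip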

Step 7: Deploy the metadata to the target org,

Command for deployment if you have unzipped it and the target is a directory,

sfdx force:mdapi:deploy -d ./unpackaged -u targetOrg1 -w 10

[Screenshot: deploy command output]

Command for deployment if you are deploying the zip file,

sfdx force:mdapi:deploy -f ./unpackaged.zip -u targetOrg1 -w 10

You may use the -c option to validate only, instead of performing an actual deployment; for example,
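
sfdx force:mdapi:deploy -c -d ./unpackaged -u targetOrg1 -w 10
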
Here is the deployment status in the target org,

[Screenshot: deployment status in the target org]

Provide FLS permission to Profile

Here is a simple tip to check all the checkboxes on a page. I found this particularly useful when granting FLS permissions to a profile.

Go to your profile > Object Settings > the object you want to add FLS to, and click the Edit button,

[Screenshot: object settings edit page]

Open your browser's JavaScript console. If you are using Google Chrome, it is available under Menu > More Tools > Developer Tools, in the Console tab

[Screenshot: opening the browser console]

Enter the following code snippet in the Console and hit enter,

var all = document.getElementsByTagName("input");
for (var a = 0; a < all.length; a++)
    if (all[a].type == 'checkbox') all[a].checked = true;

This will check all the checkboxes on the page,

[Screenshot: all checkboxes checked]

Save your page.

Lock out users during maintenance / deployment

I used to think that locking users out of the Salesforce platform for a deployment was never needed. Recently I realized that sometimes you have to. There are some straightforward ways to do this, like deactivating or freezing users with Data Loader (the UserLogin entity) or locking out all login hours on every profile. I prefer a different approach using a login flow, a powerful tool introduced in Winter ’15. The following properties make it a great fit for locking out users,

  • It gets invoked the moment someone tries to log in, and there is no way to bypass the flow; the user has to click the Finish button before proceeding to the application.
  • Usually our end users don’t have API access, so we don’t need to prevent API logins (which login flows do not cover). If this is not true for you, go with deactivating/freezing users instead.
  • Your users will be able to log in and will see a user-friendly message explaining why they cannot use the application right now and when they will be able to again. This avoids multiple calls from end users.
  • You as an admin also have more control. In this example, I have added a passcode that I can share with a limited set of end users if I need them to log in.

Two steps to set it up,

  • Create a flow with the message that you want to show to your users.
  • Assign this flow as login flow to all user profiles.

Create a flow:

  • Go to Setup > Flows and click the New Flow button. This opens the Flow Designer page
  • Drag and drop a Screen element. This will open a new Screen wizard. Enter a name for your screen.

  • Go to the Add a Field tab and add a Display Text element

  • Click on the added display text element to update its properties. Add the message you want your users to see.

  • Go back to the Add a Field tab and add a Password field

  • Click on the password field to update its properties: give it a name and make it required

  • Check the “Validate” checkbox under the Input Validation section to add the passcode validation logic. I love a simplistic approach, so I will just compare against one complex passcode (a sample formula appears after this list). CAUTION: Please write down this passcode in a safe place; if possible, email it to the other admins in your org. If you lose it and you have assigned this flow to the System Administrator profile, you will be locked out.

  • Click OK to return to the Flow Designer. You will see one element added to the canvas. Hover over the screen element and use the green icon at its top right to make it the start element.

  • Save the flow and give it a name. I am calling it Scheduled Maintenance In Progress.
  • From the detail page of the flow activate the flow version
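
For illustration, the input validation formula can be as simple as an exact match against your chosen passcode; the field name below is hypothetical and must match the password field you created,

{!Maintenance_Passcode} = 'S0me-L0ng-C0mplex-Passc0de'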

Assign the flow to each of the end user profiles:

  • Connect the flow with your profiles as a login flow. You have to repeat this step for each of the profiles in your org.

Now you are ready to start working on your org; users will not be able to log in. Once you are done, delete the login flow records, but keep the flow itself for future use.

Let me know your thoughts. I would also love to hear about other innovative ways you might be locking users out.

Bring your existing HTML project into Salesforce with near zero effort

Sometimes we need to bring existing static HTML content into Salesforce. It may be a help document, a product catalogue, or something else entirely. It may even be a full-fledged HTML application with tons of JavaScript, jQuery, Backbone, Marionette, AngularJS, Underscore, or many other things. Converting the entire pack to Visualforce pages is a pain. Here is a small, lazy solution that might be of help. In brief: don’t convert anything, just reference it. Here is how,

If your entire HTML source is below 10 MB (the static resource size limit), here is one near-zero-effort solution for you 🙂 Upload the whole project as a zipped static resource and create a Visualforce page that simply loads the zip’s index page in an iframe (for example via URLFOR($Resource.yourApp, 'index.html')). Here is the result,

[Screenshot: the HTML app rendered inside Salesforce]

This VF page loads the HTML in full screen. In case you need it as a normal tab, use a VF page that embeds the same content in an iframe instead,

The small JavaScript trick we need here is the adjustment of the iframe height: whenever the embedded content changes, the script automatically adjusts the height so there is no inner scrollbar.
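
Here is a minimal sketch of that resize trick, assuming the static HTML is loaded into an iframe with id contentFrame on the Visualforce page (the id and structure are illustrative),

// auto-adjust the iframe height to its content; assumes the static
// resource is served same-origin with the Visualforce page
var frame = document.getElementById('contentFrame');

function resizeFrame() {
    frame.style.height = frame.contentWindow.document.body.scrollHeight + 'px';
}

frame.addEventListener('load', function () {
    resizeFrame();
    // watch for DOM changes inside the embedded page and re-adjust
    new MutationObserver(resizeFrame).observe(
        frame.contentWindow.document.body,
        { childList: true, subtree: true, attributes: true }
    );
});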

If your project is larger than 10 MB, you will have to split the folder into two or more parts so that each stays within the 10 MB limit, and you will need to update the references in your project to handle the split.

Salesforce Certified Sales Cloud Consultant

Continuing my quest for certifications, I became a Certified Sales Cloud Consultant last Monday. To be honest, this was the most difficult of my Salesforce certification experiences so far; there were a lot of use cases that I had to read and digest to be able to answer.
This is a 60-question exam with a duration of 105 minutes and a passing score of 68 percent. Compared to the Developer or Administrator certification exams, this one tests your analytical and problem-solving skills. You have to be prepared to provide solutions to the scenarios, and in some cases more than one possible solution is presented for a given scenario; you have to choose the best one by weighing the pros and cons of all the applicable options.
I would recommend at least 1-2 years of hands-on consulting experience on the Salesforce platform before attempting this. As the name suggests, Sales Cloud is the focus area here. Here are some key areas to focus on,

  • SDLC models and interrelation between various stages 
  • Campaign, Lead, Opportunity. You have to know every bit of the Salesforce application in these areas
  • Territory management considerations
  • Customizable and Collaborative forecasts. You have to know the different options and visibility settings available under the forecast tab.
  • Person Account considerations
  • Community
  • Salesforce security model and sharing architecture. You should have in-depth knowledge in this area.

For preparation, the official Study Guide was my bible; I covered every item from section 5 of the guide. If you have a Premier Success plan, I recommend completing all of the online courses outlined in section 4 of the study guide. You should also read all the implementation guides mentioned under section 4. If you complete these and have the right experience, the exam will be an easy walk.