Integrate IBM Watson™ Tone Analyzer API with Salesforce Without Any Apex!

Welcome to the third installment in the External Services series. This time we have a new API joining the party – IBM Watson™ Tone Analyzer. Now what does this service do?!? Right from their docs –

The IBM Watson™ Tone Analyzer service uses linguistic analysis to detect emotional and language tones in written text. The service can analyze tone at both the document and sentence levels. You can use the service to understand how your written communications are perceived and then to improve the tone of your communications. Businesses can use the service to learn the tone of their customers’ communications and to respond to each customer appropriately, or to understand and improve their customer conversations.

So what’s different about this blog post? We are integrating the popular AI platform – IBM Watson – with Salesforce without any Apex! I know that is a little hard to digest, but read on and find out for yourself.

IBM Watson™ Tone Analyzer Service

Now how do I get started with IBM Watson?!? First sign up for an IBM Cloud account and then grab a Username and Password (aka the Service Credentials).

We will be using the GET verb of the Tone Analyzer API’s tone endpoint. Let’s look at what the Swagger schema for this API looks like –


  "swagger" : "2.0",

  "info" : {

    "description" : "Tone Analyzer API",

    "version" : "1.0.0",

    "title" : "Tone Analyzer API"


  "host" : "",

  "schemes" : [ "https" ],

  "paths" : {

    "/tone-analyzer/api/v3/tone" : {

      "get" : {

        "summary" : "Analyse the tone of the text",

        "produces" : [ "text/plain" ],

        "parameters" : [ {

          "in" : "query",

          "name" : "text",

          "description" : "The text to be analysd in url encoded format",

          "required" : true,

          "type" : "string"

        }, {

          "in" : "query",

          "name" : "version",

          "description" : "The version of the API",

          "required" : true,

          "type" : "string"

        }, {

          "in" : "query",

          "name" : "sentences",

          "description" : "The Analysis of Sentence",

          "required" : true,

          "type" : "string"

        } ],

        "responses" : {

          "200" : {

            "description" : "Result of the Analyzis",

            "schema" : {

              "type" : "string"








Creating the Named Credential

A Named Credential is used to specify the endpoint and the authentication mechanism used by the API. Here the IBM Watson API requires username and password based authentication:


Registering the External Service

Now let’s see how we can register the External Service. In case you want to know how to set up External Services, feel free to refer to the earlier blog posts:


Here is what you would see after you successfully register the External Service:


Creating a Flow

Now let’s see how we can build the Flow.

Step 1: Here is what your Flow will look like at the end.


Step 2: Drag in a Screen and add a Text Area element to the screen.



Step 3: Add the new Apex element (that calls the External Service) to the Flow and fill in the fields as shown below:



Step 4: Add the ShowEmoticons Lightning Component to the Flow. Wait?!? What did you just say? A Lightning Component? Hang on. Keep reading for the surprise!




We now have a Flow in place that invokes the Tone Analyzer API, but we cannot really make use of the data that is returned, since External Services cannot parse the JSON the API returns and instead hands it to us as a plain piece of text.


  "document_tone" : {

    "tones" : [ {

      "score" : 0.6165,

      "tone_id" : "sadness",

      "tone_name" : "Sadness"

    }, {

      "score" : 0.829888,

      "tone_id" : "analytical",

      "tone_name" : "Analytical"

    } ]



We could certainly add some logic to the Flow (using Decisions and Assignments) that checks for the various emotions (anger, fear, joy and sadness) present and identifies the tone – but how about adding some Lightning Components to the party? Now that Flow Screens have started to support Lightning Components, let’s get creative!

Adding More Fun – Using GIPHY

The whole idea is to add a Lightning Component that looks at the response from the IBM Watson – Tone Analyzer API and then invokes the GIPHY APIs to show a GIF image corresponding to the identified emotion.

Lightning Component – Markup

Here is the Component Markup:

<aura:component implements="lightning:availableForFlowScreens" access="global">
    <aura:attribute name="toneAnalyzeResponse" type="string" access="global"></aura:attribute>
    <aura:handler name="init" value="{!this}" action="{!c.showEmotion}"></aura:handler>
    <aura:attribute name="gifUrl" type="string" access="private"></aura:attribute>
    <aura:attribute name="detectedEmotion" type="string" access="private"></aura:attribute>
    Detected Emotion: {!v.detectedEmotion}
    <img src="{!v.gifUrl}" />
</aura:component>

Here we have an attribute called toneAnalyzeResponse which receives the IBM Watson – Tone Analyzer API response from the Flow (the API is invoked through the External Service). The “init” handler triggers the whole process of invoking the GIPHY API to retrieve a GIF corresponding to the identified emotion.

The two other attributes: gifUrl and detectedEmotion are used internally within the component.

Lightning Component – Controller

({
    showEmotion : function( component, event, helper ) {
        var toneAnalyzeResponse = component.get( "v.toneAnalyzeResponse" );
        var standardTones = [ "joy", "sadness", "anger", "fear" ];
        var emotion;
        // Treat the response as plain text and scan it for each tone_id
        for( var i = 0; i < standardTones.length; i++ ) {
            if( toneAnalyzeResponse.indexOf( standardTones[i] ) !== -1 ) {
                emotion = standardTones[i];
            }
        }
        // Ask the helper for a matching GIF and store the results
        helper.getGiphyImage( emotion, function( gifUrl ) {
            component.set( "v.gifUrl", gifUrl );
            component.set( "v.detectedEmotion", emotion );
        } );
    }
})

First, we read the value of the v.toneAnalyzeResponse attribute via the value provider. The standard emotions identified by the IBM Watson – Tone Analyzer API are joy, sadness, anger and fear. Thus, we can treat the API response as a simple block of text and use the indexOf method to identify which emotion is present in the response.

After identifying the emotion, we will then make an API call to GIPHY to retrieve a GIF corresponding to the identified emotion.

Lightning Component – Helper

({
    getGiphyImage : function( emotion, callback ) {
        var xhr = new XMLHttpRequest();
        // GIPHY Search endpoint – substitute your own API key
        xhr.open( "GET", "https://api.giphy.com/v1/gifs/search?api_key=YOUR_API_KEY&q=" + emotion + "&limit=1", false );
        xhr.onreadystatechange = function() {
            if( this.readyState === 4 ) {
                var jsonResp = JSON.parse( this.response );
                var gifUrl = jsonResp.data[0].images.downsized.url;
                callback( gifUrl );
            }
        };
        xhr.send();
    }
})

Making an API call to GIPHY is child’s play! All you have to do is make a GET call with the API Key and the identified emotion, and you are good! Note that we are using a plain XMLHttpRequest to make the API call from vanilla JavaScript.

Adding Component to the Flow



Don’t forget to pass the API Response back into the Flow!

Let’s see it in Action!


Guess what?!? I presented this at Jaipur Dev Fest ’18 too. Check out the slide deck and video to know more –

Please note that the version presented at #JDF’18 was a simplified one, without the Lightning Component.
