on 01-05-2016 05:32 AM
Task Name | This is a collection of tasks for interacting with Cisco Spark. |
Description | These will enable UCS Director to add people to Spark rooms and also post messages into specific Spark rooms. Updated: 3rd Feb 2016 based on feedback |
Prerequisites | Tasks have been updated and tested on version 5.4.0.1 |
Category | Spark (Tasks) |
Components |
Instructions for Regular Workflow Use:
NOTE: This post has been updated to reflect new tasks and changes that ensure they function with UCS Director 5.4.0.1.
I have been looking at how UCSD can integrate with various tools and their APIs. I had an idea for Cisco Spark (a collaboration tool) a while back and had been waiting for the publication of the REST API. Well, in December the API was released.
I decided to create some workflows that would add people to Spark rooms and post messages (including pictures) to the room. If you look at the following screenshot, you will see in the Spark chat client that a user 'dsmith@miggins.com' has been added to the room and then a message (including an image) posted straight after.
I have also created a video showing some of the integration from an end-user perspective, as well as a recorded customer session.
If you want more information about the Spark API I would highly recommend the documentation at https://developer.ciscospark.com/index.html
In the process of ensuring these tasks work with 5.4, I decided to rewrite them a little and make the code more reusable (I plan to create a script module for these). I have also added some functionality that I didn't have the first time around.
I have also added rollback functionality to the applicable tasks (typically as an option, since in some of our demos we don't want posted messages deleted when the workflow is rolled back).
The tasks are:
Type | Action | Description |
---|---|---|
Room | Create | Creates a room. The authenticated user is automatically added as a member of the room. |
Room | Delete | Deletes a room, by ID. |
Room | Details | Shows details for a room, by ID. |
Message | Create | Posts a plain-text message, and optionally a media content attachment, to a room. |
Message | Delete | Deletes a message, by message ID. |
Message | Details | Shows details for a message, by message ID. |
Membership | Create | Adds someone to a room by person ID or email address, optionally making them a moderator. |
Membership | Delete | Deletes a membership, by ID. |
Membership | Details | Shows details for a membership, by ID. |
People | Details | Shows details for a person, by ID. |
To use the tasks you will need to provide a number of inputs; some are common to all the tasks and some are specific to the task being called. The common inputs are:
Input Name | Description |
---|---|
Token | This is your API token (or that of a generic account, as I have used) to authenticate access against Spark. It can be obtained from https://developer.ciscospark.com/getting-started.html. |
Proxy Host | If your UCS Director instance sits in a lab that has no direct access to the Internet (like mine), you will need to enter your proxy server address in this input. |
Proxy Port | As above, you will also need to specify the proxy port (for example 80). |
The optional inputs often require you to know the specific IDs of things (roomId, for example); these can be obtained from the useful inline help pages - https://developer.ciscospark.com/endpoint-rooms-get.html.
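Alternatively, a room ID can be pulled out of the JSON returned by a GET to /v1/rooms. The response shape below (an `items` array of room objects with `id` and `title` fields) follows the Spark API documentation; the helper function and the sample data are my own, for illustration only:

```javascript
// Find a room's ID in a GET /v1/rooms response body by its title.
// The sample payload mirrors the shape documented by the Spark API;
// the data itself is made up for illustration.
function findRoomId(responseBody, title) {
    var rooms = JSON.parse(responseBody).items;
    for (var i = 0; i < rooms.length; i++) {
        if (rooms[i].title === title) {
            return rooms[i].id;
        }
    }
    return null;
}

var sampleResponse = JSON.stringify({
    items: [
        { id: "Y2lzY29zcGFyazovL3VzL1JPT00vMQ", title: "Lab Alerts" },
        { id: "Y2lzY29zcGFyazovL3VzL1JPT00vMg", title: "UCSD Demos" }
    ]
});

var roomId = findRoomId(sampleResponse, "UCSD Demos");
// roomId → "Y2lzY29zcGFyazovL3VzL1JPT00vMg"
```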
I do plan to look at how I can improve these tasks to make obtaining this information much easier. I have a few ideas but need to test them out, so watch this space.
The CloupiaScripts are also stored in my GitHub repo (probably updated more often than this post will be) - Cloupiascripts/Spark at master · clijockey/Cloupiascripts · GitHub
The next Spark project will be to create a 'chatbot' so that I can make use of the UCSD API as well and request items. This is a bigger piece of work and will require some thought, although we do have it working as a proof of concept.
The original version 5.3 example is as follows. You can make use of the scripts below to create your own custom tasks. The code was put together quickly for a PoC, so I'm sure there is plenty of room for improvement.
The example workflow used two custom tasks:
The Spark_Membership_Create task will add a user to a specific Spark room. The inputs required are:
Input | Description |
---|---|
token | Spark API token, used for authentication |
member email | The email address of the person you want to add to the Spark room |
spark_room_id | The ID of the Spark room |
proxy server [optional] | The address of your proxy server if UCSD sits behind a proxy |
proxy server port [optional] | The port of the proxy |
//=================================================================
// Title: Spark_Membership_Create
// Description: This will add a user to a specific spark group
//
// Author: Rob Edwards (@clijockey/robedwa@cisco.com)
// Date: 18/12/2015
// Version: 0.1
// Dependencies:
// Limitations/issues:
//=================================================================
importPackage(java.util);
importPackage(java.lang);
importPackage(java.io);
importPackage(com.cloupia.lib.util);
importPackage(com.cloupia.model.cIM);
importPackage(com.cloupia.service.cIM.inframgr);
importPackage(org.apache.commons.httpclient);
importPackage(org.apache.commons.httpclient.cookie);
importPackage(org.apache.commons.httpclient.methods);
importPackage(org.apache.commons.httpclient.auth);
// Inputs
var token = input.token;
var fqdn = "api.ciscospark.com";
var member = input.member;
var roomId = input.roomId;
var proxy_host = input.proxy;
var proxy_port = input.proxy_port;
// Build up the URI
var primaryTaskPort = "443";
var primaryTaskUri = "/v1/memberships";
// Request Parameters to be passed
var primaryTaskData = "{\"roomId\" : \""+roomId+"\",\
\"personEmail\" : \""+member+"\", \
\"isModerator\" : false}";
// Main code start
// Perform primary task
logger.addInfo("Request to https://"+fqdn+":"+primaryTaskPort+primaryTaskUri);
logger.addInfo("Sending payload: "+primaryTaskData);
var taskClient = new HttpClient();
if (proxy_host != null) {
taskClient.getHostConfiguration().setProxy(proxy_host, proxy_port);
}
taskClient.getHostConfiguration().setHost(fqdn, primaryTaskPort, "https");
taskClient.getParams().setCookiePolicy("default");
var taskMethod = new PostMethod(primaryTaskUri);
taskMethod.setRequestEntity(new StringRequestEntity(primaryTaskData));
taskMethod.addRequestHeader("Content-Type", "application/json");
taskMethod.addRequestHeader("Accept", "application/json");
// The token input must include the "Bearer " prefix, e.g. "Bearer <your token>".
taskMethod.addRequestHeader("Authorization", token);
taskClient.executeMethod(taskMethod);
// Check status code once again and fail task if necessary.
var statuscode = taskMethod.getStatusCode();
var resp = taskMethod.getResponseBodyAsString();
logger.addInfo("Response received: "+resp);
if (statuscode == 400) {
logger.addError("Failed to configure Spark. HTTP response code: "+statuscode);
logger.addInfo("Return code "+statuscode+": The request was invalid or cannot be otherwise served. An accompanying error message will explain further.");
logger.addInfo("Response received: "+resp);
// Set this task as failed.
ctxt.setFailed("Request failed.");
} else if (statuscode == 401) {
logger.addError("Failed to configure Spark. HTTP response code: "+statuscode);
logger.addInfo("Return code "+statuscode+": Authentication credentials were missing or incorrect.");
logger.addInfo("Response received: "+resp);
// Set this task as failed.
ctxt.setFailed("Request failed.");
} else if (statuscode == 403) {
logger.addError("Failed to configure Spark. HTTP response code: "+statuscode);
logger.addInfo("Return code "+statuscode+": The request is understood, but it has been refused or access is not allowed.");
logger.addInfo("Response received: "+resp);
// Set this task as failed.
ctxt.setFailed("Request failed.");
} else if (statuscode == 404) {
logger.addError("Failed to configure Spark. HTTP response code: "+statuscode);
logger.addInfo("Return code "+statuscode+": The URI requested is invalid or the resource requested, such as a user, does not exist. Also returned when the requested format is not supported by the requested method.");
logger.addInfo("Response received: "+resp);
// Set this task as failed.
ctxt.setFailed("Request failed.");
} else if (statuscode == 409) {
logger.addWarn("Failed to configure Spark. HTTP response code: "+statuscode);
logger.addInfo("Return code "+statuscode+": The request could not be processed because it conflicts with some established rule of the system. For example, a person may not be added to a room more than once.");
logger.addInfo("Response received: "+resp);
} else if (statuscode == 500) {
logger.addError("Failed to configure Spark. HTTP response code: "+statuscode);
logger.addInfo("Return code "+statuscode+": Something went wrong on the server.");
logger.addInfo("Response received: "+resp);
// Set this task as failed.
ctxt.setFailed("Request failed.");
} else if (statuscode == 501) {
logger.addError("Failed to configure Spark. HTTP response code: "+statuscode);
logger.addInfo("Return code "+statuscode+": The server does not support the functionality required to fulfil the request.");
logger.addInfo("Response received: "+resp);
// Set this task as failed.
ctxt.setFailed("Request failed.");
} else {
logger.addInfo("All looks good. HTTP response code: "+statuscode);
//logger.addError("Return code "+statuscode+": Something unknown happened!");
// Set this task as failed.
//ctxt.setFailed("Request failed.");
}
taskMethod.releaseConnection();
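One weakness of building the JSON body by string concatenation, as the task above does, is that a quote or backslash in any input produces invalid JSON. CloupiaScript runs on Rhino, and Rhino 1.7 and later provide JSON.stringify natively (if your UCS Director release ships an older engine this won't be available), so a safer sketch of the membership payload would be:

```javascript
// Build the membership payload with JSON.stringify so that any
// special characters in the inputs are escaped correctly.
function buildMembershipPayload(roomId, personEmail, isModerator) {
    return JSON.stringify({
        roomId: roomId,
        personEmail: personEmail,
        isModerator: isModerator
    });
}

var payload = buildMembershipPayload("ROOM123", "dsmith@miggins.com", false);
// payload → '{"roomId":"ROOM123","personEmail":"dsmith@miggins.com","isModerator":false}'
```

The same approach works for the message payload in the second task, where free-text input makes the escaping problem more likely to bite.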
The Spark_Message_Post task will post a message to a specific Spark room. The inputs required are:
Input | Description |
---|---|
token | Spark API token, used for authentication |
message | Message to post to Spark |
spark_room_id | The ID of the Spark room you want to post into |
file | The file you want to post with the message |
proxy [optional] | The address of your proxy server; as my UCSD instance sits behind a proxy, it needs to be specified in the CloupiaScript. |
proxy port [optional] | The port of the proxy |
//=================================================================
// Title: Spark_messages_post
// Description: This will post a message to a specific spark group
//
// Author: Rob Edwards (@clijockey/robedwa@cisco.com)
// Date: 18/12/2015
// Version: 0.1
// Dependencies:
// Limitations/issues:
//=================================================================
importPackage(java.util);
importPackage(java.lang);
importPackage(java.io);
importPackage(com.cloupia.lib.util);
importPackage(com.cloupia.model.cIM);
importPackage(com.cloupia.service.cIM.inframgr);
importPackage(org.apache.commons.httpclient);
importPackage(org.apache.commons.httpclient.cookie);
importPackage(org.apache.commons.httpclient.methods);
importPackage(org.apache.commons.httpclient.auth);
// Inputs
var token = input.token;
var fqdn = "api.ciscospark.com";
var message = input.message;
var file = input.file;
var roomId = input.roomId;
var proxy_host = input.proxy;
var proxy_port = input.proxy_port;
// Build up the URI
var primaryTaskPort = "443";
var primaryTaskUri = "/v1/messages";
// Data to be passed
var primaryTaskData = "{\"roomId\" : \""+roomId+"\",\
\"file\" : \""+file+"\", \
\"text\" : \""+message+"\"}";
// Main code start
// Perform primary task
logger.addInfo("Request to https://"+fqdn+":"+primaryTaskPort+primaryTaskUri);
logger.addInfo("Sending payload: "+primaryTaskData);
var taskClient = new HttpClient();
if (proxy_host != null) {
taskClient.getHostConfiguration().setProxy(proxy_host, proxy_port);
}
taskClient.getHostConfiguration().setHost(fqdn, primaryTaskPort, "https");
taskClient.getParams().setCookiePolicy("default");
var taskMethod = new PostMethod(primaryTaskUri);
taskMethod.setRequestEntity(new StringRequestEntity(primaryTaskData));
taskMethod.addRequestHeader("Content-Type", "application/json");
taskMethod.addRequestHeader("Accept", "application/json");
// The token input must include the "Bearer " prefix, e.g. "Bearer <your token>".
taskMethod.addRequestHeader("Authorization", token);
taskClient.executeMethod(taskMethod);
// Check status code once again and fail task if necessary.
var statuscode = taskMethod.getStatusCode();
var resp = taskMethod.getResponseBodyAsString();
logger.addInfo("Response received: "+resp);
if (statuscode == 400) {
logger.addError("Failed to configure Spark. HTTP response code: "+statuscode);
logger.addInfo("Return code "+statuscode+": The request was invalid or cannot be otherwise served. An accompanying error message will explain further.");
logger.addInfo("Response received: "+resp);
// Set this task as failed.
ctxt.setFailed("Request failed.");
} else if (statuscode == 401) {
logger.addError("Failed to configure Spark. HTTP response code: "+statuscode);
logger.addInfo("Return code "+statuscode+": Authentication credentials were missing or incorrect.");
logger.addInfo("Response received: "+resp);
// Set this task as failed.
ctxt.setFailed("Request failed.");
} else if (statuscode == 403) {
logger.addError("Failed to configure Spark. HTTP response code: "+statuscode);
logger.addInfo("Return code "+statuscode+": The request is understood, but it has been refused or access is not allowed.");
logger.addInfo("Response received: "+resp);
// Set this task as failed.
ctxt.setFailed("Request failed.");
} else if (statuscode == 404) {
logger.addError("Failed to configure Spark. HTTP response code: "+statuscode);
logger.addInfo("Return code "+statuscode+": The URI requested is invalid or the resource requested, such as a user, does not exist. Also returned when the requested format is not supported by the requested method.");
logger.addInfo("Response received: "+resp);
// Set this task as failed.
ctxt.setFailed("Request failed.");
} else if (statuscode == 409) {
logger.addWarn("Failed to configure Spark. HTTP response code: "+statuscode);
logger.addInfo("Return code "+statuscode+": The request could not be processed because it conflicts with some established rule of the system. For example, a person may not be added to a room more than once.");
logger.addInfo("Response received: "+resp);
} else if (statuscode == 500) {
logger.addError("Failed to configure Spark. HTTP response code: "+statuscode);
logger.addInfo("Return code "+statuscode+": Something went wrong on the server.");
logger.addInfo("Response received: "+resp);
// Set this task as failed.
ctxt.setFailed("Request failed.");
} else if (statuscode == 501) {
logger.addError("Failed to configure Spark. HTTP response code: "+statuscode);
logger.addInfo("Return code "+statuscode+": The server does not support the functionality required to fulfil the request.");
logger.addInfo("Response received: "+resp);
// Set this task as failed.
ctxt.setFailed("Request failed.");
} else {
logger.addInfo("All looks good. HTTP response code: "+statuscode);
//logger.addError("Return code "+statuscode+": Something unknown happened!");
// Set this task as failed.
//ctxt.setFailed("Request failed.");
}
taskMethod.releaseConnection();
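The long if/else status-check chain is duplicated in both tasks. One way to make it more reusable, in line with the script-module plan mentioned earlier, is a lookup table keyed by status code. This is a sketch of the idea rather than code from the original tasks; the message texts follow the ones logged above:

```javascript
// Map Spark HTTP status codes to a description and whether the task should fail.
var sparkStatusCodes = {
    400: { fail: true,  text: "The request was invalid or cannot be otherwise served." },
    401: { fail: true,  text: "Authentication credentials were missing or incorrect." },
    403: { fail: true,  text: "The request is understood, but it has been refused or access is not allowed." },
    404: { fail: true,  text: "The URI requested is invalid or the resource requested does not exist." },
    409: { fail: false, text: "The request conflicts with an established rule, e.g. a person may not be added to a room more than once." },
    500: { fail: true,  text: "Something went wrong on the server." },
    501: { fail: true,  text: "The server does not support the functionality required to fulfil the request." }
};

// Returns a human-readable message and a flag saying whether to fail the task.
function checkSparkStatus(statuscode) {
    var entry = sparkStatusCodes[statuscode];
    if (entry) {
        return { fail: entry.fail, message: "Return code " + statuscode + ": " + entry.text };
    }
    // 2xx responses (and anything unlisted) are treated as success here.
    return { fail: false, message: "All looks good. HTTP response code: " + statuscode };
}
```

Each task's status-check block then collapses to a single call to checkSparkStatus, followed by the logger calls and ctxt.setFailed when the fail flag is set.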
The example workflow will first add the user and then post a message:
An example input (you will need to obtain your own API key by following the instructions):
This is very cool. Have you tried this in UCSD 5.4 already?
Thanks Orf, I got a bit too excited about the Spark API
I am upgrading our labs to 5.4 this week and will be testing these tasks then. I will update with any tweaks when tested.