
Running Scala Code In ColdFusion For Extensibility.

ColdFusion already has support for Java code, and starting with CF10 a dynamic class loader is included, so new classes can be loaded without restarting the CF Application Service. So what about other languages that run on the JVM? We'll start by exploring how to call Scala code from ColdFusion. I could not find any articles related to this task on the Internet, so I decided to write one myself. This won't be a post on why Scala is great, as there are plenty of those out there; it's aimed more at developers who are interested in bringing new functionality to an existing CF code base. The end goal is to use CF as the front-end UI and to utilize Scala and Akka to create a more powerful backend.

If you haven't already looked at Akka, you're missing out. It provides an abstraction for writing concurrent code that mitigates multi-threading errors, offers a great model for creating distributed systems, and much more. I also found a great technical document on how the GearPump team built a fault-tolerant video streaming platform that can scale up and out in just 5,000 lines of Scala.
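To give a taste of what that model looks like, here's a minimal actor sketch in Akka 2.3-era Scala. The akka-actor dependency and the Greeter/Demo names are illustrative assumptions, not part of this post's project:

import akka.actor.{Actor, ActorSystem, Props}

// Each actor processes its mailbox one message at a time,
// so there is no shared mutable state and no locking to manage.
class Greeter extends Actor {
  def receive = {
    case name: String => println(s"Hello, $name")
  }
}

object Demo extends App {
  val system = ActorSystem("demo")
  val greeter = system.actorOf(Props[Greeter], "greeter")
  greeter ! "ColdFusion" // fire-and-forget message send
  system.shutdown()      // Akka 2.3-style shutdown
}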

The Goal

The goal is to create a simple Scala class that can upload a file to Azure using SAS authentication.
We'll be calling that code from ColdFusion and dealing with some problems such as:

  1. Setting up the work environment to work with Scala
  2. Instantiating the Class in CF
  3. Including Dependencies in the Build File
  4. Packaging the Scala solution and importing it into our CF Instance

Setting Up The Work Environment

Install Scala from the official download page. That should be pretty straightforward.

Install SBT, which stands for Simple Build Tool and is used for managing your project's dependencies, compiling and running your code, and generating documentation. It resolves libraries from the Ivy and Maven repositories, which hold artifacts for the Java ecosystem. Coming from a .NET background, this is the equivalent of NuGet. Ever since switching to stuff that runs on the JVM, though, I've been a bit overwhelmed by the amazing number of options available. We'll be using SBT since it's the go-to build tool for Scala.

Using Scala IDE and sbteclipse

If you like an editor with code completion or "intellisense", the only option I know of is Scala IDE, which is built on Eclipse. If you choose to go this route, you can install a nifty SBT plugin that generates the Eclipse project files for you and includes all the library references. If you don't like Eclipse, skip this section.

Install sbteclipse. From their GitHub repo:

  1. Add sbteclipse to your plugin definition file. You can use either the global file (for sbt 0.13 and up) at ~/.sbt/0.13/plugins/plugins.sbt or the project-specific file at PROJECT_DIR/project/plugins.sbt:

     addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "2.5.0")

  2. In sbt, use the eclipse command to create the Eclipse project files.

  3. In Eclipse, use the Import Wizard to import "Existing Projects into Workspace".

If the plugin directory doesn't exist, just create it. We'll be needing this plugins.sbt for the next plugin anyway.

Packaging With sbt-assembly

sbt-assembly will unjar all of our program's dependencies into their respective class files and then jar everything back up, together with our own Scala class files, into one big jar. This gives me a nice self-contained jar file that I can just drop into ColdFusion at a later date.

There are other alternatives out there that leave the dependency jars the way they are, which I'll have to follow up on to see how they play with CF.

This plugin goes into the same plugins.sbt file as the sbteclipse plugin above. Make sure that you have a blank line after each plugin, otherwise sbt will throw a fit. So your plugins.sbt file should look like this:

addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "2.5.0")

addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.12.0")

Preparing the Directory Structure

Let's create a new folder for our Scala code project. According to the sbt documentation, the layout should look like this:

src/  
  main/
    resources/
       <files to include in main jar here>
    scala/
       <main Scala sources>
    java/
       <main Java sources>
  test/
    resources/
       <files to include in test jar here>
    scala/
       <test Scala sources>
    java/
       <test Java sources>

Sbt will know to automagically look into the src/main/scala folder and compile those classes. Additionally, if we had any Java files under src/main/java, it'll compile those too.

Add Project Dependencies

Since the Scala code will depend on Apache's HttpClient, I'll add that as a dependency. In my [ProjectFolder]/build.sbt file, which sits one level above this directory tree, I have this:

name := """ScalaBlockBlobUploader"""

version := "1.0"

scalaVersion := "2.11.1"

libraryDependencies += "org.apache.httpcomponents" % "httpclient" % "4.3.6"

libraryDependencies += "org.apache.httpcomponents" % "httpmime" % "4.3.6"

That sets the project name, its version and the Scala version I'm using.

Adding dependencies to the project is as simple as the final two lines.
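One note on the operator: a single % pulls in a plain Java artifact, as above. For Scala libraries you'd normally use %%, which appends the Scala binary version to the artifact name so it matches the scalaVersion above. For example, if we later wire in Akka (purely illustrative, not needed for this project):

libraryDependencies += "com.typesafe.akka" %% "akka-actor" % "2.3.8"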

Preparing Our Code

Okay, so now let's drop into our project folder and run the command:

sbt

If you're running this for the first time, it'll download what it needs to run and install the plugins we specified earlier.

You should be greeted with a command prompt afterwards. To compile our project we can run:

compile

To ready our Eclipse project we just run:

eclipse

which will call the sbt-eclipse plugin we installed earlier to generate a .project file for us.

Now we can use the Import feature in Eclipse to import the project. You'll notice that all dependencies get pulled in and you don't have to fiddle around with the IDE to do that.

This is also great in that we don't have to worry about dependencies when another developer pulls the code from source control; everything gets pulled in automatically.

Scala Code To Upload to Azure Block Blob Using SAS Authentication

Now to actually build that Scala class!
Here's what I've got in BlockBlobUploader.scala:

import java.io.File
import org.apache.http.client.methods.HttpPut
import org.apache.http.entity.mime.MultipartEntityBuilder
import org.apache.http.impl.client.HttpClients

/**
 * @author tony.truong
 * @constructor Creates a new portal uploader
 * @param blobEndPoint The endpoint given by the data connection as a verified partner.
 * @param signature The authentication signature of the SAS key
 * @param container The block blob container name
 * @param file The physical file path. This implies that this process runs on the same machine calling it or a valid UNC path
 * @param filename The final filename at the destination
 *
 */
class BlockBlobUploader(blobEndPoint: String, signature: String, container: String, file: String, filename: String) {

  /**
   * @return true on successful upload and false on failure.
   */
  def upload(): Boolean = {
    val f = new File(file)
    val mpEntity = MultipartEntityBuilder.create().addBinaryBody("file", f).build()
    // HttpClients.createDefault() replaces the deprecated DefaultHttpClient in 4.3+
    val httpClient = HttpClients.createDefault()
    val uri = blobEndPoint + container + "/" + filename + signature

    val put = new HttpPut(uri)
    put.setHeader("x-ms-blob-type", "BlockBlob")
    put.setEntity(mpEntity)

    try {
      // Azure answers 201 Created when the blob was written successfully
      val response = httpClient.execute(put)
      response.getStatusLine.getStatusCode == 201
    } catch {
      case _: Exception => false
    } finally {
      // Release the connection whether the request succeeded or threw
      httpClient.close()
    }
  }
}

Pretty self-explanatory code.
Note: if your file exceeds 64MB, you'll need to implement uploading through blocks. At that point you'll want to drop into java.io and stream the file in 1MB-4MB blocks (the block sizes supported in the Azure documentation), since reading the whole file into memory at once requires a lot of memory if your file is big.
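For reference, here's a rough sketch of that chunked read loop. readInBlocks is a hypothetical helper name, and the Put Block / Put Block List REST calls that would consume each chunk are omitted:

import java.io.{File, FileInputStream}

// Hypothetical helper, not part of BlockBlobUploader above. Each chunk would
// become a Put Block request, with a final Put Block List call to commit the
// block ids; those REST calls are omitted here.
def readInBlocks(path: String, blockSize: Int = 4 * 1024 * 1024)(send: (Array[Byte], Int) => Unit): Unit = {
  val in = new FileInputStream(new File(path))
  try {
    val buffer = new Array[Byte](blockSize)
    var blockIndex = 0
    var read = in.read(buffer)
    while (read != -1) {
      send(buffer.take(read), blockIndex) // pass along only the bytes actually read
      blockIndex += 1
      read = in.read(buffer)
    }
  } finally {
    in.close()
  }
}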

Compile And Package the code

Let's drop back into sbt in the project folder. We can test to see that all is well with the compile command again.
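If you want a quick sanity check before packaging, you can also exercise the class from the sbt console. The values below are the same placeholders used in the ColdFusion example later; substitute your own endpoint and SAS details:

// From the project folder, run `sbt console`, then paste this in.
val uploader = new BlockBlobUploader(
  "https://myendpoint/",  // blob endpoint
  "?sr=myKey",            // SAS query string
  "myContainerName",      // container name
  "/path/filename.gz",    // local file to upload
  "filename.gz")          // destination filename

println(uploader.upload()) // prints true on HTTP 201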

This time we're going to package the whole thing into a jar file for ColdFusion by using the assembly command from our sbt-assembly plugin. You'll notice that in the output it puts the final file into:
[ProjectFolder]\target\scala-2.11\ScalaBlockBlobUploader-assembly-1.0.jar

This is the final jar that we'll drop into our ColdFusion server. If you have the dynamic class loader, you can load the classes that way (in CF10, this.javaSettings in Application.cfc lets you point loadPaths at a folder of jars). Otherwise we can drop it into the lib folder of the CF instance, such as:

C:\ColdFusion10\cfusion\wwwroot\WEB-INF\lib

assuming your install is in C:\ColdFusion10. Without the dynamic class loader, though, you'll need to restart the CF Application Service.

Instantiating the Scala Class in ColdFusion

Since Scala compiles down to Java classes and bytecode that run on the JVM, we can instantiate the object in ColdFusion using the createObject method as if it were a Java object:

<cfset vm=createObject("java", "BlockBlobUploader").init("https://myendpoint/",  
    "?sr=myKey",
    "myContainerName",
    "/path/filename.gz","filename.gz")/>

<cfset returnStatus=vm.upload()/>  
<cfoutput>  
    #returnStatus#
</cfoutput>  

Phew! That should work.

Although the post was a bit long, we managed to get our development environment set up and establish a dependency workflow that holds up when developing in a team.

This setup may not be for everyone, but coming from a .NET background and facing this many choices, it works well for me without losing any flexibility.

Another great option is to have the directory structure created for you with Typesafe's Activator, which is also a great place to start learning about the Actor model for creating distributed programs.

It generates the directory structure required for sbt and provides tutorials on how to get started with many things, including Akka.

Thanks for reading and have a great holiday!
