How to deploy a Spark Java web app?

1) Clone this repo: https://github.com/simplesteph/ec2-masterclass-sampleapp

2) Navigate to the directory containing the project's pom.xml

3) mvn clean install

4) Go to the target folder

5) java -jar ec2-masterclass-sample-app-1.0-jar-with-dependencies.jar

6) In your browser, navigate to http://localhost:4567
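For reference, here is the whole sequence as a shell session (assuming git and Maven are installed, and that the clone directory matches the repo name):

git clone https://github.com/simplesteph/ec2-masterclass-sampleapp
cd ec2-masterclass-sampleapp
mvn clean install
cd target
java -jar ec2-masterclass-sample-app-1.0-jar-with-dependencies.jar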


For the standalone scenario, you can just use Gradle (or Maven) to create a fat, executable jar file ("fat" meaning it bundles all dependencies, including the embedded Jetty server). Here is a simple build.gradle file that does just that (note it uses the legacy compile configuration, so it targets older Gradle versions):

apply plugin: 'java'
apply plugin: 'application'

// TODO Change this to your class with your main method
mainClassName = "my.app.Main"

defaultTasks 'run'

repositories {
    mavenCentral()
}

dependencies {
    compile group: 'com.sparkjava', name: 'spark-core', version: '2.5.5'
    // TODO add more dependencies here...
}

// Create a fat executable jar
jar {
    manifest {
        attributes "Main-Class": "$mainClassName"
    }

    // Unpack every dependency jar into this jar so it is self-contained
    from {
        configurations.compile.collect { it.isDirectory() ? it : zipTree(it) }
    }

    archiveName "app.jar"
}
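
The class referenced by mainClassName has to declare your routes in its main method. A minimal sketch (my.app.Main is just the placeholder name from the build file above):

package my.app;

import static spark.Spark.*;

public class Main {
    public static void main(String[] args) {
        // Spark's embedded Jetty listens on port 4567 by default
        get("/", (request, response) -> "Hello World");
    }
}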

Build your application on the command line via gradle build. This will create an app.jar file in your build/libs folder. Then just run:

java -jar build/libs/app.jar

If you want to be really up to date :) then you can use Docker to package your JRE and application jar, so you are not dependent on the software stack installed on the server. To do this we can use a Dockerfile:

FROM java:8

# Copy the fat jar built above into the image root
ADD build/libs/app.jar /

# Spark's embedded Jetty listens on 4567 by default
EXPOSE 4567
ENTRYPOINT ["java", "-jar", "app.jar"]

Build the Docker image and run it, e.g.:

docker build -t myapp:v1 .
docker run --rm --name myapp -p 4567:4567 myapp:v1

Of course, if you want to use the Docker image on a remote web server, you need to push it to Docker Hub or a private Docker registry and docker pull it on your server before running it.
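
For example (youruser is a placeholder for your Docker Hub account, and you need to docker login first):

docker tag myapp:v1 youruser/myapp:v1
docker push youruser/myapp:v1

# then, on the server:
docker pull youruser/myapp:v1
docker run --rm --name myapp -p 4567:4567 youruser/myapp:v1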


Here you can find information about deployment: http://sparkjava.com/documentation.html#embedded-web-server

First of all, set the filter options in your web.xml config:

<web-app>
  <!-- some options -->
  <filter>
    <filter-name>SparkFilter</filter-name>
    <filter-class>spark.servlet.SparkFilter</filter-class>
    <init-param>
      <param-name>applicationClass</param-name>
      <param-value>your.app.Application</param-value>
    </init-param>
  </filter>

  <filter-mapping>
    <filter-name>SparkFilter</filter-name>
    <url-pattern>/*</url-pattern>
  </filter-mapping>
</web-app>

The Application class should implement the spark.servlet.SparkApplication interface and initialize the routes in its init() method.

It looks like this (in Java SE 8 you can use lambda expressions for the routes):

// package must match the applicationClass init-param in web.xml
package your.app;

import spark.servlet.SparkApplication;

import static spark.Spark.*;

public class Application implements SparkApplication {
    @Override
    public void init() {
        get("/", (request, response) -> "Hello World");

        get("/hello/:name", (request, response) -> {
            return "Hello: " + request.params(":name");
        });
    }
}

An app with this configuration works fine on Tomcat and GlassFish servers.
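
On Tomcat, for example, deploying can be as simple as dropping the war into the webapps directory (the install path and war name below are assumptions):

# copy the war into a local Tomcat installation and start it
cp target/myapp.war $CATALINA_HOME/webapps/
$CATALINA_HOME/bin/startup.sh

# routes are then served under the war's context path, e.g.
# http://localhost:8080/myapp/hello/world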


You will first need to create a regular Java project that can be built into a .war file (in Eclipse this would be a Dynamic Web Project).

The Spark documentation at this link describes what needs to be added to your project's web.xml file: http://sparkjava.com/documentation.html#other-webserver

The param-value listed in the documentation within the filter needs to point to the class where you have defined your routes.

Additionally, all the code that was previously in main() needs to be moved to init().

@Override
public void init() {
    // Spark 1.x style: Route is an abstract class that takes the path
    get(new Route("/test") {
        @Override
        public Object handle(Request request, Response response) {
            return "response goes here";
        }
    });
}

Also, in order to deploy it to JBoss, I had to include only the Spark libraries and not the Jetty libraries. Once this was done, you should be able to build the war and deploy it to your server the same way you would any other Java project.
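
With Maven, that exclusion might look like the following sketch (the spark-core version and the wildcard exclusions are assumptions; Jetty's websocket modules live under a separate groupId, and wildcard exclusions require Maven 3.2.1+):

<dependency>
  <groupId>com.sparkjava</groupId>
  <artifactId>spark-core</artifactId>
  <version>2.5.5</version>
  <exclusions>
    <exclusion>
      <groupId>org.eclipse.jetty</groupId>
      <artifactId>*</artifactId>
    </exclusion>
    <exclusion>
      <groupId>org.eclipse.jetty.websocket</groupId>
      <artifactId>*</artifactId>
    </exclusion>
  </exclusions>
</dependency>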