Using Undertow as an embedded web server

October 2014 · 5 minute read

Since I was tinkering with Node.js and found out how easy it is to spin up an embedded HTTP server there, I started looking for embedded HTTP server options in Java, and ended up at JBoss's Undertow after looking at the TechEmpower benchmarks.

I wanted to create a RESTful service using Undertow + RESTEasy. There is already a project, Hammock, built on the same stack.

Yes, code:

 
public class App
{
    private static Undertow server;
    public static void main( String[] args ) throws ServletException, IllegalArgumentException, IOException
    {
        // Gathering RESTEasy classes via classpath scanning
        Reflections reflections = new Reflections("");
        // Resource and provider classes from the application
        Set<Class<?>> resourceClassSet = reflections.getTypesAnnotatedWith(Path.class);
        Set<Class<?>> providerClassSet = reflections.getTypesAnnotatedWith(Provider.class);

        Reflections reflectionsForProviders = new Reflections(ClassLoader.getSystemClassLoader());
        Set<Class<?>> globalProviderClassSet = reflectionsForProviders.getTypesAnnotatedWith(Provider.class, false);

        /*
         * Create RestEasyDeployment
         */
        ResteasyDeployment restEasyDeployment = new ResteasyDeployment();

        /*
         * Adding all resource classes
         */
        restEasyDeployment.getActualResourceClasses().addAll(resourceClassSet);
        /*
         * Adding all provider classes
         */
        restEasyDeployment.getActualProviderClasses().addAll(providerClassSet);
        //restEasyDeployment.getActualProviderClasses().addAll(globalProviderClassSet);
        for (Class<?> cls : globalProviderClassSet) {
            // Only add providers that expose a public no-arg constructor
            for (Constructor<?> constructor : cls.getConstructors()) {
                if (constructor.getParameterTypes().length == 0
                        && Modifier.isPublic(constructor.getModifiers())) {
                    System.out.println("Adding : " + cls.getCanonicalName());
                    restEasyDeployment.getActualProviderClasses().add(cls);
                    break;
                }
            }

        }

        ResteasyHttpHandler handler = new ResteasyHttpHandler();
        handler.setDispatcher(restEasyDeployment.getDispatcher());

        /*
         * Creating the RESTEasy servlet info (same as declaring the servlet in web.xml)
         *
         * HttpServletDispatcher (the superclass of HttpServlet30Dispatcher) also works.
         * I could not find javadoc for it, but HttpServlet30Dispatcher is annotated with
         * @WebServlet(asyncSupported = true), which means it supports async IO, so I am using it.
         */
        ServletInfo servletInfo = Servlets.servlet("RestEasyServlet", HttpServlet30Dispatcher.class)
                .setAsyncSupported(true)
                .setLoadOnStartup(1)
                .addMapping("/*");

        /*
         * Creating deployment info for context path as / (root)
         */
        DeploymentInfo deploymentInfo = new DeploymentInfo()
                .setContextPath("/")
                .addServletContextAttribute(ResteasyDeployment.class.getName(), restEasyDeployment)
                .addServlet(servletInfo)
                .setDeploymentName("KiriyardPetika")
                .setClassLoader(ClassLoader.getSystemClassLoader());

        DeploymentManager deploymentManager = Servlets.defaultContainer().addDeployment(deploymentInfo);
        deploymentManager.deploy();

        /*
         * Starting Undertow ftw
         */

        server = Undertow.builder()
                .addHttpListener(1198, "localhost")
                .setHandler(deploymentManager.start())
                .build();


        System.out.println("Starting server ... ");

        server.start();

        System.out.println("Server started ... ");

    }
}
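The server keeps running until the JVM exits. As a small addition of my own (not in the original code), a shutdown hook placed at the end of main would stop Undertow cleanly, using the static server field:

        // Optional: stop Undertow cleanly when the JVM exits (e.g. on Ctrl+C).
        Runtime.getRuntime().addShutdownHook(new Thread(new Runnable() {
            @Override
            public void run() {
                System.out.println("Stopping server ... ");
                server.stop();
            }
        }));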

I have used the Reflections library for now to avoid manually registering the provider and @Path resource classes.
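For comparison, manual registration is just a couple of explicit calls on the deployment. A minimal sketch, using the Bench resource shown further below (SomeJsonProvider in the comment is a placeholder name, not a real class):

        // Manual alternative to classpath scanning: register classes explicitly.
        ResteasyDeployment restEasyDeployment = new ResteasyDeployment();
        restEasyDeployment.getActualResourceClasses().add(Bench.class);
        // Providers would be added the same way, e.g.:
        // restEasyDeployment.getActualProviderClasses().add(SomeJsonProvider.class);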

Benchmarking against Node.js

I am using my similar implementation for Node.js, called Khokhu-Node.

Code for the Khokhu-Node plugin:


var Bench = function (context) {
    this.context = context;
};

Bench.prototype.processRequest = function (request, onComplete) {
    var resp = {
        responseCode: this.context.HTTP_OK,
        headers: {}
    };
    onComplete(resp, JSON.stringify({ Hi: "There" }));
};

module.exports = Bench;

Code for the Java resource:

@Path("/bench")
public class Bench {

    @GET
    @Produces("application/json")
    public HiThere hello() {
        return new HiThere();
    }


    public static class HiThere {
        private String hi;

        public HiThere() {
            hi = "There";
        }

        public String getHi() {
            return hi;
        }

        public void setHi(String hi) {
            this.hi = hi;
        }
    }
}
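A quick way to sanity-check the endpoint before running ab: the small client below is my addition (BenchClient is not part of the project) and assumes App from above is running on port 1198 with a JSON provider such as resteasy-jackson-provider on the classpath, in which case the body should look like {"hi":"There"}.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class BenchClient {
    public static void main(String[] args) throws Exception {
        // Call the resource started by App; port 1198 matches the listener above.
        URL url = new URL("http://localhost:1198/bench");
        HttpURLConnection connection = (HttpURLConnection) url.openConnection();
        connection.setRequestProperty("Accept", "application/json");

        BufferedReader reader = new BufferedReader(
                new InputStreamReader(connection.getInputStream()));
        // With a JSON provider on the classpath this prints something like: {"hi":"There"}
        System.out.println(reader.readLine());
        reader.close();
        connection.disconnect();
    }
}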

Results

Node.js
Server Software:        
Server Hostname:        127.0.0.1
Server Port:            8090

Document Path:          /bench
Document Length:        14 bytes

Concurrency Level:      10
Time taken for tests:   1.764 seconds
Complete requests:      10000
Failed requests:        0
Total transferred:      1210000 bytes
HTML transferred:       140000 bytes
Requests per second:    5669.14 [#/sec] (mean)
Time per request:       1.764 [ms] (mean)
Time per request:       0.176 [ms] (mean, across all concurrent requests)
Transfer rate:          669.89 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    0   0.2      0       1
Processing:     1    1   1.0      1      18
Waiting:        0    1   1.0      1      18
Total:          1    2   1.0      1      19

Percentage of the requests served within a certain time (ms)
  50%      1
  66%      2
  75%      2
  80%      2
  90%      3
  95%      3
  98%      4
  99%      5
 100%     19 (longest request)
Java

Using

MAVEN_OPTS="-XX:+UseParNewGC -XX:ParallelGCThreads=8 -XX:+UseConcMarkSweepGC -XX:NewRatio=3 -Xms1024m -Xmx10240m"
Server Software:
Server Hostname:        127.0.0.1
Server Port:            1198

Document Path:          /bench
Document Length:        14 bytes

Concurrency Level:      10
Time taken for tests:   1.311 seconds
Complete requests:      10000
Failed requests:        0
Total transferred:      1210000 bytes
HTML transferred:       140000 bytes
Requests per second:    7625.05 [#/sec] (mean)
Time per request:       1.311 [ms] (mean)
Time per request:       0.131 [ms] (mean, across all concurrent requests)
Transfer rate:          901.01 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    0   0.3      0      10
Processing:     0    1   0.6      1      10
Waiting:        0    1   0.6      1      10
Total:          0    1   0.6      1      11

Percentage of the requests served within a certain time (ms)
  50%      1
  66%      1
  75%      1
  80%      2
  90%      2
  95%      2
  98%      2
  99%      3
 100%     11 (longest request)

Now, the not-so-real-world benchmarks

Node.js w/o Context overhead
Server Software:        
Server Hostname:        127.0.0.1
Server Port:            1199

Document Path:          /
Document Length:        14 bytes

Concurrency Level:      10
Time taken for tests:   0.753 seconds
Complete requests:      10000
Failed requests:        0
Total transferred:      1210000 bytes
HTML transferred:       140000 bytes
Requests per second:    13279.30 [#/sec] (mean)
Time per request:       0.753 [ms] (mean)
Time per request:       0.075 [ms] (mean, across all concurrent requests)
Transfer rate:          1569.14 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    0   0.1      0       1
Processing:     0    0   0.5      0      13
Waiting:        0    0   0.4      0      11
Total:          0    1   0.5      1      13

Percentage of the requests served within a certain time (ms)
  50%      1
  66%      1
  75%      1
  80%      1
  90%      1
  95%      1
  98%      1
  99%      2
 100%     13 (longest request)
Java w/o POJO serialization overhead
Server Software:        
Server Hostname:        127.0.0.1
Server Port:            1198

Document Path:          /bench
Document Length:        17 bytes

Concurrency Level:      10
Time taken for tests:   1.118 seconds
Complete requests:      10000
Failed requests:        0
Total transferred:      1440000 bytes
HTML transferred:       170000 bytes
Requests per second:    8944.17 [#/sec] (mean)
Time per request:       1.118 [ms] (mean)
Time per request:       0.112 [ms] (mean, across all concurrent requests)
Transfer rate:          1257.77 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    0   0.4      0      14
Processing:     0    1   0.7      1      15
Waiting:        0    1   0.6      1      15
Total:          0    1   0.8      1      15

Percentage of the requests served within a certain time (ms)
  50%      1
  66%      1
  75%      1
  80%      1
  90%      2
  95%      2
  98%      2
  99%      3
 100%     15 (longest request)
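For reference, one way to skip POJO serialization in the "Java w/o POJO serialization overhead" run above is to return the JSON body as a prebuilt String so the JAX-RS layer writes it through as-is. The exact variant benchmarked isn't shown in this post, so treat the class below as a sketch of the idea rather than the measured code (its 17-character body happens to be consistent with the 17-byte document length):

@Path("/bench")
public class BenchRaw {

    // Returning a prebuilt String avoids the POJO-to-JSON mapping step;
    // the body is written to the response as-is. This would replace the
    // POJO-based Bench resource for such a run.
    @GET
    @Produces("application/json")
    public String hello() {
        return "{ \"Hi\": \"There\" }";
    }
}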

These benchmarks only show the raw capacity to deliver a minimal amount of data with a minimal amount of computation. Nothing near real-world.

The Java code is part of an upcoming project similar to the Khokhu-Node project mentioned earlier.


contact - social@apurv.me