SparkJava: Dependency injection in SparkApplication using Spring

We've been looking at Spark for a while now, and we really like its clean syntax and simplicity. Unfortunately, up until now we have been unable to find a way to implement dependency injection with Spark. Our application is Spring-based, so making Spark work with Spring was an absolute requirement for us.

We have now found a solution that works for us, and since we haven't seen anyone else suggest one online, we thought we'd share it.

Consider the following SparkApplication:

public class HelloWorldApp implements SparkApplication {

    private final ResourceToInject resourceToInject;

    public HelloWorldApp(ResourceToInject resourceToInject) {
        this.resourceToInject = resourceToInject;
    }

    @Override
    public void init() {
        get("/", (req, res) -> resourceToInject.sayHello());
    }
}
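For completeness, here's a minimal sketch of what ResourceToInject could look like. The class name comes from the snippet above, but the method body and the greeting text are our own assumptions:

```java
// Hypothetical implementation of ResourceToInject — a plain class that
// Spring will manage as a bean and inject into HelloWorldApp.
public class ResourceToInject {

    // Returns the greeting served by HelloWorldApp's "/" route.
    public String sayHello() {
        return "Hello World!";
    }
}
```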

Now consider how we could inject our dependencies into HelloWorldApp.

The first thing we have to do is make SparkFilter use the HelloWorldApp defined in the application context instead of simply instantiating one itself. We solved this by extending SparkFilter, which gives us the ability to override the getApplication method and fetch the HelloWorldApp bean instead.

public class SpringifiedSparkFilter extends SparkFilter {

    @Override
    protected SparkApplication getApplication(FilterConfig filterConfig) throws ServletException {

        // Load the Spring context and look up the bean named in web.xml
        ApplicationContext context = new ClassPathXmlApplicationContext("applicationContext.xml");
        String beanName = filterConfig.getInitParameter("applicationClass");
        return (SparkApplication) context.getBean(beanName);
    }
}

Next, we need to create the HelloWorldApp bean in our applicationContext.xml.

<bean id="helloWorldApp" class="com.deadcoderising.HelloWorldApp">  
    <constructor-arg ref="resourceToInject"/>
</bean>  
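The constructor-arg above references a resourceToInject bean, so that bean also needs to be defined in applicationContext.xml. Assuming ResourceToInject lives in the same package and has a no-arg constructor, the definition could look like this:

```xml
<bean id="resourceToInject" class="com.deadcoderising.ResourceToInject"/>
```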

We can now use our SpringifiedSparkFilter instead of SparkFilter when creating the filter in web.xml.

<filter>  
    <filter-name>SparkFilter</filter-name>
    <filter-class>com.deadcoderising.SpringifiedSparkFilter</filter-class>
    <init-param>
        <param-name>applicationClass</param-name>
        <param-value>helloWorldApp</param-value>
    </init-param>
</filter>  
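For the servlet container to actually route requests through the filter, web.xml also needs a matching filter-mapping. Here we map every request to our filter:

```xml
<filter-mapping>
    <filter-name>SparkFilter</filter-name>
    <url-pattern>/*</url-pattern>
</filter-mapping>
```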

And there you go. A Spark application that is handled by Spring.

We're still learning, so if you have found other ways of solving this problem, don't hesitate to add a comment.

Co-author Adam Mscisz
