Spark integration fails due to web sockets not being supported #40
Comments
API Gateway and Lambda do not support web sockets, so you wouldn't be able to receive requests from clients. What are you trying to accomplish with web sockets?
I'm not using web sockets. But the Spark service always calls
Can you show me the code for your LambdaHandler class?
The one catch with Spark is that you need to define the resources after you have initialized the handler: https://github.com/awslabs/aws-serverless-java-container/blob/master/samples/spark/pet-store/src/main/java/com/amazonaws/serverless/sample/spark/LambdaHandler.java#L45
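As a hedged sketch of that ordering (the static-initializer pattern, class name, and `/pets` route are illustrative, not from this thread): the container handler is created first, and only then are Spark routes declared, so they register against the Lambda-embedded server rather than a default Jetty instance.

```java
// Hypothetical handler illustrating the required ordering.
// AWS Lambda and aws-serverless-java-container imports omitted.
public class OrderedLambdaHandler
        implements RequestHandler<AwsProxyRequest, AwsProxyResponse> {

    private static SparkLambdaContainerHandler<AwsProxyRequest, AwsProxyResponse> handler;

    static {
        try {
            handler = SparkLambdaContainerHandler.getAwsProxyHandler(); // 1. init container first
            defineRoutes();                                             // 2. then declare routes
        } catch (ContainerInitializationException e) {
            throw new RuntimeException("Could not initialize Spark container", e);
        }
    }

    private static void defineRoutes() {
        Spark.get("/pets", (req, res) -> "[]"); // illustrative route
    }

    @Override
    public AwsProxyResponse handleRequest(AwsProxyRequest input, Context context) {
        return handler.proxy(input, context);
    }
}
```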
Sure. Thanks for the quick reply, btw.

```java
// AWS Lambda and aws-serverless-java-container imports omitted in the original comment.
public class LambdaHandler
        implements RequestHandler<AwsProxyRequest, AwsProxyResponse> {

    private SparkLambdaContainerHandler<AwsProxyRequest, AwsProxyResponse> handler;

    @Override
    public AwsProxyResponse handleRequest(AwsProxyRequest input, Context context) {
        // Lazily initialize the container on the first invocation.
        if (handler == null) {
            try {
                handler = SparkLambdaContainerHandler.getAwsProxyHandler();
                HelloSparkServer.start();
            } catch (ContainerInitializationException e) {
                throw new RuntimeException("Failed to initialize server container", e);
            }
        }
        return handler.proxy(input, context);
    }
}
```

```java
import static java.lang.String.format;
import static spark.Spark.*;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class HelloSparkServer {

    private static final Logger LOG = LoggerFactory.getLogger(HelloSparkServer.class);

    // not used on AWS Lambda
    public static void main(String[] args) {
        int port = getPort();
        if (port >= 0) {
            port(port);
        }
        start();
        LOG.info(format("Listening on port %d", port()));
    }

    public static void start() {
        LOG.info("Initializing routes");
        path("/api", () -> {
            path("/v1", () -> {
                before("/*", (req, res) -> {
                    LOG.info(format("%s %s %s",
                            req.ip(),
                            req.requestMethod(),
                            req.url()));
                });
                get("/things/:id/prop",
                        (req, res) -> format("Hello thing %s prop", req.params(":id")));
                post("/things/:id/prop/update",
                        (req, res) -> format("Hello thing %s prop update", req.params(":id")));
                notFound((req, res) -> "Not found");
                internalServerError((req, res) -> "Internal server error");
                after("/*", (req, res) -> res.type("application/json"));
            });
        });
    }

    private static int getPort() {
        String portFromEnv = System.getenv("FM_API_PORT");
        return portFromEnv == null ? -1 : Integer.parseInt(portFromEnv);
    }
}
```
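The `getPort()` fallback above can be isolated as a tiny runnable sketch (the `PortConfig` class name is mine): a missing `FM_API_PORT` yields `-1`, which `main` treats as "keep Spark's default port".

```java
public class PortConfig {
    // Mirrors HelloSparkServer.getPort(): a null env value maps to the -1 sentinel.
    static int parsePort(String raw) {
        return raw == null ? -1 : Integer.parseInt(raw);
    }

    public static void main(String[] args) {
        System.out.println(parsePort(null));   // -1: keep Spark's default port
        System.out.println(parsePort("8080")); // 8080: explicit local port
    }
}
```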
Thanks, I will test this. Which version of Spark are you using?
2.6.0. I'm looking right now to see if that's the issue. Their changelog is here: http://sparkjava.com/news#spark-kotlin-released
Thanks. I'll test this and publish a fix in the next few days. Flying around right now. |
Meanwhile, could you try with Spark 2.5.3? The library is tested with this version. If you want to join the Gitter chat room, I'll probably post updates there as I work on this.
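For reference, pinning the tested version in a Maven build would look like this (coordinates are Spark's published `com.sparkjava` artifacts):

```xml
<dependency>
    <groupId>com.sparkjava</groupId>
    <artifactId>spark-core</artifactId>
    <version>2.5.3</version>
</dependency>
```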
Getting a strange NPE with Spark 2.5.3:
It's hard to tell, but it looks like maybe
Yes, it looks like it's a bug with the latest release. I think FilterChainManager was introduced in 0.5. Try and use 0.4 of the library - I'll include this fix in the TT. Sorry about the churn on this. |
I've found the issue and have a fix - I will create some unit tests for this over the weekend and push it. I'll let you know when it's there and you can start using 0.6-SNAPSHOT. The fix is very simple: I needed to call `super.handleRequest(httpServletRequest, httpServletResponse, lambdaContext);` in the Spark handler. If you want, you can add it yourself here.
I've pushed a fix for the bugs in the 2.5.3 support to the servlet-improvements branch. I noticed that Spark made some breaking changes in 2.6.0 that require bigger changes to the library, particularly if we want to support both 2.5.x and 2.6.x versions. I have created a separate issue to track this: #42
Thanks a bunch! Would it be possible to publish the 2.5.3-related bug fixes to the 0.5 version on Maven Central? I agree that supporting 2.6 would be appropriate for the 0.6 release. Perhaps it's a good idea to employ semver with patch updates?
Yes, it may make sense to push out a fixed 0.5.1.
+1 for a 0.5.1 to get us running with Spark 2.5.3 |
Will do. Do you expect it to work against Spark 2.6 or just 2.5.3? |
@dmcg only 2.5.3 - I have support for 2.6 in a separate issue for the next major release |
0.6-SNAPSHOT from the servlet-improvements branch works for me, thanks |
Good to know, thanks. If all goes well I will push out 0.5.1 today. |
Resolving now that the fix is verified and about to go out |
I have a working hello-world-ish local Spark server, but when trying to run it on Lambda it throws an `UnsupportedOperationException` when the embedded server tries to configure web sockets. Here's the exact error I'm getting:

I can't see a current workaround other than some crazy reflection magic. I guess there are some potential fixes like:

- changing `LambdaEmbeddedServer.configureWebSockets()` to not throw an exception
- setting an `initExceptionHandler` on the `Spark.Service` instance to ignore the exception

Thoughts?