
No Enhanced Context in AwsProxyRequest (using Spark) #89

Closed
metaphyze opened this issue Jan 6, 2018 · 15 comments

@metaphyze

I'm passing some enhanced context from my custom authorizer lambda. I can see the context when the request is received by a Python lambda. However, it's missing from the AwsProxyRequest in my Java lambda running Spark: awsProxyRequest.getRequestContext().getAuthorizer().getContextValue("customkey") is null. Also, awsProxyRequest.getRequestContext().getAuthorizer().getClaims() is null.

@sapessi
Collaborator

sapessi commented Jan 7, 2018

Are you using a POJO handler or a stream-based one with Lambda? The serializer built into the Lambda runtime does not support annotations. The model classes use annotations to read arbitrary fields from the context. My recommendation would be to switch to a stream-based Lambda handler.
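To illustrate the point (a minimal sketch, not the framework's actual model class — the class and field names here are invented): Jackson routes unrecognized JSON properties through a @JsonAnySetter method into a map, which is exactly the behavior the built-in Lambda POJO serializer lacks.

```java
import com.fasterxml.jackson.annotation.JsonAnySetter;
import com.fasterxml.jackson.databind.ObjectMapper;

import java.util.HashMap;
import java.util.Map;

public class AnySetterDemo {

    // Simplified stand-in for an authorizer context model.
    public static class Authorizer {
        public String principalId;
        private final Map<String, String> context = new HashMap<>();

        // Jackson calls this for every property with no matching field/setter.
        @JsonAnySetter
        public void setContextValue(String key, String value) {
            context.put(key, value);
        }

        public String getContextValue(String key) {
            return context.get(key);
        }
    }

    public static void main(String[] args) throws Exception {
        // "customkey" has no declared field, so it lands in the context map.
        String json = "{\"principalId\":\"12345\",\"customkey\":\"customvalue\"}";
        Authorizer auth = new ObjectMapper().readValue(json, Authorizer.class);
        System.out.println(auth.getContextValue("customkey"));
    }
}
```

When the Lambda runtime's own serializer builds the POJO instead, it ignores the annotation and the arbitrary keys are silently dropped, which is why the values come back null.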

@metaphyze
Author

metaphyze commented Jan 7, 2018

I'm using the lambda (the exact lambda) from the Spark sample project. Here's the signature:

@Override
public AwsProxyResponse handleRequest(AwsProxyRequest awsProxyRequest, Context context) {

This sounds like a framework bug to me. It looks like that's how I'm supposed to use the framework with Spark.

@sapessi
Collaborator

sapessi commented Jan 7, 2018

I think that's the issue then. I will add a stream sample in the next release of the framework. Unfortunately we cannot rely on Lambda's serialization to read those values, and because the keys for the context values are arbitrary I cannot define a model for them - I need to rely on the @JsonAny... annotations.

The only way to make this work is to change the handler class to implement the RequestStreamHandler and do the serialization yourself, just like I do in the Spring sample here before passing requests/responses in and out of the framework.

I have brought this up with the Lambda team; they may look into it, but it's low priority at the moment.

@metaphyze
Author

Thanks. I'll take a look at the Spring example. Someone should note this limitation in the "Enhanced Context" documentation, though:

https://aws.amazon.com/about-aws/whats-new/2016/12/enhanced-context-for-custom-authorizers-in-amazon-api-gateway/

and the custom authorizer documentation:

https://docs.aws.amazon.com/apigateway/latest/developerguide/use-custom-authorizer.html

I did a lot of debugging trying to figure out where my values were (because they were showing up in the Python lambda). Thanks, anyway.

@metaphyze
Author

Also, while you're talking to the Lambda team, can you ask them to document how you get the Authorizer out of the Spark Request object in the request handlers? This is how:

((ApiGatewayRequestContext)req.attribute("com.amazonaws.apigateway.request.context")).getAuthorizer().getPrincipalId();

Definitely need to write that down somewhere.

@metaphyze
Author

Here's one possible solution. Since the framework jar isn't sealed, in my own project, I can replace ApiGatewayAuthorizerContext with an identical version in the same package but with my custom attributes added to the class. For example, if I wanted to add the JWT info to the enhanced context in the authorizer lambda (because I've already extracted it there and don't want to do it again in the Spark Java lambda), I could create the com.amazonaws.serverless.proxy.internal.model package in my own project and add:

package com.amazonaws.serverless.proxy.internal.model;

import com.fasterxml.jackson.annotation.JsonAnyGetter;
import com.fasterxml.jackson.annotation.JsonAnySetter;
import com.fasterxml.jackson.annotation.JsonIgnoreProperties;

import java.util.HashMap;
import java.util.Map;

@JsonIgnoreProperties(ignoreUnknown = true)
public class ApiGatewayAuthorizerContext {
    private Map<String, String> contextProperties = new HashMap<>();
    private String principalId;
    private String email;
    private String family_name;
    private String given_name;
    private String locale;
    private String name;
    private String picture;
    private CognitoAuthorizerClaims claims;

    public ApiGatewayAuthorizerContext() {
    }

    @JsonAnyGetter
    public String getContextValue(String key) {
        return contextProperties.get(key);
    }

    @JsonAnySetter
    public void setContextValue(String key, String value) {
        contextProperties.put(key, value);
    }

    public String getPrincipalId() { return principalId; }
    public void setPrincipalId(String principalId) { this.principalId = principalId; }

    public String getEmail() { return email; }
    public void setEmail(String email) { this.email = email; }

    public String getFamily_name() { return family_name; }
    public void setFamily_name(String family_name) { this.family_name = family_name; }

    public String getGiven_name() { return given_name; }
    public void setGiven_name(String given_name) { this.given_name = given_name; }

    public String getLocale() { return locale; }
    public void setLocale(String locale) { this.locale = locale; }

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }

    public String getPicture() { return picture; }
    public void setPicture(String picture) { this.picture = picture; }

    public CognitoAuthorizerClaims getClaims() { return claims; }
    public void setClaims(CognitoAuthorizerClaims claims) { this.claims = claims; }
}

I've tested this and it works. The main thing I don't like about it is that I might need to update this file if the "real" version of ApiGatewayAuthorizerContext changes in the next framework update.

@sapessi
Collaborator

sapessi commented Jan 7, 2018

The context properties are saved as request attributes. That is a possible solution, but the cleanest (and recommended) way is to switch to the stream handler. What's preventing you from using the stream handler? Is there something else the library should do there?

@metaphyze
Author

It looks like the enhanced context properties are coming in as custom json properties on the authorizer object. I wrote this to see exactly what the stream contained:

package com.amazonaws.serverless.sample.spark;

import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

import com.amazonaws.serverless.proxy.internal.model.AwsProxyResponse;
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestStreamHandler;
import com.google.gson.Gson;
import org.apache.commons.io.IOUtils;

public class LambdaRawHandler implements RequestStreamHandler {

    @Override
    public void handleRequest(InputStream inputStream, OutputStream outputStream, Context context) throws IOException {
        String input = IOUtils.toString(inputStream);
        AwsProxyResponse response = new AwsProxyResponse();
        response.setBody(input);
        response.setStatusCode(200);
        outputStream.write(new Gson().toJson(response).getBytes());
    }
}

and buried in the json returned from the lambda, I see this:

    "authorizer": {
        "email": "[email protected]",
        "family_name": "XXXX",
        "given_name": "XXXX",
        "locale": "en",
        "name": "XXXX XXXX",
        "picture": "XXXXXX",
        "principalId": "12345"
    },

Those attributes were added by me to the context by my authorizer lambda:

context = {
    'email' : idinfo['email'],
    'name' : idinfo['name'],
    'picture' : idinfo['picture'],
    'given_name' : idinfo['given_name'],
    'family_name' : idinfo['family_name'],
    'locale' : idinfo['locale']
}
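For context, that dictionary is returned as the context field of the custom authorizer's response, alongside principalId and the IAM policy. A sketch of the overall response shape (field values hypothetical, policy abbreviated):

```json
{
    "principalId": "12345",
    "policyDocument": {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Action": "execute-api:Invoke",
                "Effect": "Allow",
                "Resource": "arn:aws:execute-api:us-east-1:123456789012:abcdef123/*"
            }
        ]
    },
    "context": {
        "email": "[email protected]",
        "name": "XXXX XXXX"
    }
}
```

API Gateway then copies the context keys onto the requestContext.authorizer object of the proxy event, which is where they appear in the JSON above.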

I know I can get this by parsing the JWT token again in the Spark lambda; I just didn't want to because I already had the information (and I could have added anything else I wanted, not just the JWT claims). My point is that the information is in the Authorizer object, not in the body of my request, and I'm not sure how I'd use the InputStream to capture it other than dumping everything into a raw JSON object, which is inconvenient to access. Perhaps the RequestHandler (public class LambdaHandler implements RequestHandler<AwsProxyRequest, AwsProxyResponse>) could take a parameter of some kind so that framework users could add additional enhanced context information without replacing ApiGatewayAuthorizerContext. Anyway, basically, I don't know how to get at the custom authorizer info using the InputStream other than (like I said) dumping it all into a raw JSON object.

@sapessi
Collaborator

sapessi commented Jan 7, 2018

So long as you use the Stream Handler like the example, you should be able to access all those parameters.

First, in your Spark method, use the getAttribute method of the HttpServletRequest object to extract the API_GATEWAY_CONTEXT_PROPERTY attribute.

Once you have the ApiGatewayRequestContext object, use the getAuthorizer method to receive an instance of the ApiGatewayAuthorizerContext object.

The authorizer context object contains all of the values, including the custom ones you returned to API Gateway - this is because the object uses the @JsonAnySetter annotation to tell Jackson to store any unrecognized value using a custom method, a Map behind the scenes. You can then use the getContextValue method to retrieve any custom authorizer context value by its key.

Writing this code off the top of my head here, don't expect it to compile or be 100% correct:

get("/pets", (req, res) -> {
    ApiGatewayRequestContext ctx = (ApiGatewayRequestContext) req.raw().getAttribute(API_GATEWAY_CONTEXT_PROPERTY);
    ApiGatewayAuthorizerContext authCtx = ctx.getAuthorizer();
    String picture = authCtx.getContextValue("picture");
    return picture; // Spark routes must return a response body
});

@metaphyze
Author

Thanks. I just tried that. It actually does compile, but it looks like those values aren't there. I get null for all the custom values. This is similar to the first thing I tried:

((ApiGatewayRequestContext)req.attribute("com.amazonaws.apigateway.request.context")).getAuthorizer().getContextValue("picture")

That also returns null.

@metaphyze
Author

Oh, but yeah, I'm still using the POJO example, not the stream handler. I guess I need to work out the stream handler equivalent of this:

@Override
public AwsProxyResponse handleRequest(AwsProxyRequest awsProxyRequest, Context context) {
    if (!isInitialized) {
        isInitialized = true;
        try {
            handler = SparkLambdaContainerHandler.getAwsProxyHandler();
            defineResources();
            Spark.awaitInitialization();
        } catch (ContainerInitializationException e) {
            log.error("Cannot initialize Spark application", e);
            return null;
        }
    }
    return handler.proxy(awsProxyRequest, context);
}

Mainly, I want to regard that as boilerplate code and just concentrate on the Spark handlers, but I still want the enhanced context info within the Spark handler request.

@sapessi
Collaborator

sapessi commented Jan 7, 2018 via email

@metaphyze
Author

That did it. I removed my replacement version of ApiGatewayAuthorizerContext. Then I changed the code to be:

public class LambdaHandler implements RequestStreamHandler { // rather than RequestHandler<AwsProxyRequest, AwsProxyResponse>

and the lambda function to be:

@Override
public void handleRequest(InputStream inputStream, OutputStream outputStream, Context context)
        throws IOException {
    if (handler == null) {
        try {
            handler = SparkLambdaContainerHandler.getAwsProxyHandler();
            defineResources();
            Spark.awaitInitialization();
        } catch (ContainerInitializationException e) {
            log.error("Cannot initialize Spark container", e);
            outputStream.close();
            throw new RuntimeException(e);
        }
    }

    AwsProxyRequest request = objectMapper.readValue(inputStream, AwsProxyRequest.class);
    AwsProxyResponse resp = handler.proxy(request, context);
    objectMapper.writeValue(outputStream, resp);
    // just in case it wasn't closed by the mapper
    outputStream.close();
}

Now, I am indeed getting the enhanced context using:

ApiGatewayRequestContext ctx =
(ApiGatewayRequestContext)req.raw().getAttribute(API_GATEWAY_CONTEXT_PROPERTY);
ApiGatewayAuthorizerContext authCtx = ctx.getAuthorizer();
String picture = authCtx.getContextValue("picture");

Thanks for your help. This is great. Amazon really does have the best support in the industry. Could you put this in the sample project or document it somewhere? Thanks again.

@sapessi
Collaborator

sapessi commented Jan 7, 2018

I'm going to keep this open to track the updated samples. Action items:

  • Improve documentation on the enhanced custom authorizer context
  • Add a stream handler to all samples
  • Explore the possibility of creating a stream handler abstract class that you can extend, to make it easier to create stream handlers with this library instead of relying on Lambda's raw interface.
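One possible shape for that abstract class (a hypothetical sketch, not the library's actual API — all class and method names here are invented): deserialize the request with Jackson, delegate to a typed handler, and serialize the response, so subclasses never touch the raw streams.

```java
import com.fasterxml.jackson.databind.ObjectMapper;

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public abstract class AbstractStreamHandler<ReqT, RespT> {
    private static final ObjectMapper MAPPER = new ObjectMapper();
    private final Class<ReqT> requestType;

    protected AbstractStreamHandler(Class<ReqT> requestType) {
        this.requestType = requestType;
    }

    // Subclasses implement only the typed handler; the stream plumbing is shared.
    protected abstract RespT handle(ReqT request) throws IOException;

    public void handleRequest(InputStream input, OutputStream output) throws IOException {
        // Jackson (unlike the Lambda POJO serializer) honors @JsonAnySetter and friends.
        ReqT request = MAPPER.readValue(input, requestType);
        MAPPER.writeValue(output, handle(request));
    }

    // --- tiny demo types and a concrete subclass, for illustration only ---
    public static class Ping { public String message; }
    public static class Pong { public String echo; }

    static class EchoHandler extends AbstractStreamHandler<Ping, Pong> {
        EchoHandler() { super(Ping.class); }

        @Override
        protected Pong handle(Ping req) {
            Pong p = new Pong();
            p.echo = req.message;
            return p;
        }
    }

    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        new EchoHandler().handleRequest(
                new ByteArrayInputStream("{\"message\":\"hello\"}".getBytes()), out);
        System.out.println(out.toString());
    }
}
```

A real version would take Lambda's Context as an extra parameter and use AwsProxyRequest/AwsProxyResponse as the type arguments, but the serialization pattern is the same as in the working handler shown earlier in this thread.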

@sapessi sapessi self-assigned this Jan 7, 2018
@sapessi sapessi added this to the Release 0.9 milestone Jan 7, 2018
sapessi added a commit that referenced this issue Jan 11, 2018
…plates to address #82. Updated wiki quick start documentation with all samples to close #89. Added a new Spring Boot sample.
@sapessi
Collaborator

sapessi commented Jan 11, 2018

Closing this since I pushed all pending updates in the last commit.

@sapessi sapessi closed this as completed Jan 11, 2018