
Was the first lambda waiting on the second's result before returning? If so, you really shouldn't rely on synchronously invoking a lambda from within another lambda.

Also, things stored in the global scope of a Node.js function are kept between invocations of the same container. In case that helps...



Not explicitly; this was orchestrated through the API Gateway authentication support.

And yes, different invocations of the same container helped significantly (most importantly, the interface was already deployed in the private network); that is why I said the problem was concurrent access. You could serve a pretty good number of users per second, but each user who happened to send a request while the already deployed containers were all busy would have to wait up to 3s before we could give them any data from the backend. And of course, if two new users came in parallel, they would both have to wait, etc.


Ok, I see. And the VPC explains the high cold-start latency. The only "solution" I see would be to keep your functions warm (there's a feature for that now), but to me it's more of an inconvenient, if useful, anti-pattern. Quite annoying.

Sorry I can't be of any more help :/


I am still wondering if there is any way to get good lambda performance with a DB that is neither managed by AWS nor publicly accessible.



