
The bottleneck for us (on that instance) is not Postgres but transformations. Transformations are tiny snippets of JavaScript that convert an event from Rudder JSON into whatever structure (JSON, key-value, etc.) the destination expects. We also support user-defined transformations: functions written by the user to transform or enrich the event.
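As an illustration, a transformation of this kind might look like the sketch below. This is a hypothetical example, not RudderStack's actual schema or API: the function name, event fields, and flattened key names are all made up to show the idea of converting a nested event into a flat key/value shape a destination might expect.

```javascript
// Hypothetical transformation: flatten a Rudder-style track event into
// a flat key/value object. Field names here are illustrative only.
function transformEvent(event) {
  const out = {
    name: event.event,
    ts: event.timestamp,
  };
  // Prefix each property key so the flattened keys don't collide
  // with the top-level fields.
  for (const [key, value] of Object.entries(event.properties || {})) {
    out[`prop_${key}`] = value;
  }
  return out;
}

// transformEvent({ event: "Order Completed",
//                  timestamp: "2020-01-01T00:00:00Z",
//                  properties: { revenue: 42.5 } })
// → { name: "Order Completed", ts: "2020-01-01T00:00:00Z", prop_revenue: 42.5 }
```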

Currently, transformations run in Node.js, so for every batch of events there is a call from Go into Node.js, and that is slow. We do batching and parallel calls, but it is still the bottleneck.

I think Postgres itself gets us >15K events/sec of throughput.



What happens if I pass a 64-bit integer and the Rudder pipeline, being in JavaScript, silently down-casts it to a 53-bit integer?

Segment's pipeline involves JS at some point. We had an issue where our 64-bit integers were silently down-cast, and we found out the hard way. We use strings now (perhaps we should have used strings from the start; I am not necessarily the sharpest tool in the shed).
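The failure mode is easy to reproduce: JavaScript numbers are IEEE-754 doubles, so integers are only exact up to 2^53 - 1 (`Number.MAX_SAFE_INTEGER`), and a 64-bit ID that passes through `JSON.parse` as a number is silently rounded. The field name below is made up for illustration.

```javascript
// A 64-bit integer ID just past the safe-integer range: 2^53 + 1.
const raw = '{"user_id": 9007199254740993}';

const parsed = JSON.parse(raw);
console.log(parsed.user_id);                       // 9007199254740992 (off by one)
console.log(Number.isSafeInteger(parsed.user_id)); // false

// Carrying the ID as a string, as described above, preserves it exactly.
const safe = JSON.parse('{"user_id": "9007199254740993"}');
console.log(safe.user_id);                         // "9007199254740993"

// On modern runtimes, BigInt is another option for 64-bit arithmetic,
// though plain JSON.parse still won't produce BigInts for you.
console.log(BigInt("9007199254740993") + 1n);      // 9007199254740994n
```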


Hmm, never thought about that. I need to think about how to handle it. Great point!



