
Gremlin driver setup for Amazon Neptune behind a load balancer

gdotv 11/15/2023
Hi folks, I've been running into issues connecting to Amazon Neptune behind an HAProxy as detailed on https://aws-samples.github.io/aws-dbs-refarch-graph/src/connecting-using-a-load-balancer/ (2nd option). The issue has to do with sending subsequent queries via a graph traversal source, which frequently causes NoHostAvailableExceptions. The interesting part is that this only happens when a graph traversal source is reused. I'm using Gremlin driver 3.7.0 in Java. Are there any specific configurations that should be used when creating the Cluster/Client/GraphTraversalSource when connecting to a load balancer?
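For reference, this is roughly how I'm building the connection today; it's a minimal sketch rather than my exact code, and the host name and tuning values below are placeholders, not my actual settings:

```java
import org.apache.tinkerpop.gremlin.driver.Cluster;
import org.apache.tinkerpop.gremlin.driver.remote.DriverRemoteConnection;
import org.apache.tinkerpop.gremlin.process.traversal.dsl.graph.GraphTraversalSource;

import static org.apache.tinkerpop.gremlin.process.traversal.AnonymousTraversalSource.traversal;

public class NeptuneBehindHaproxy {
    public static void main(String[] args) throws Exception {
        // Placeholder endpoint for the HAProxy fronting Neptune.
        Cluster cluster = Cluster.build()
                .addContactPoint("haproxy.example.com")
                .port(8182)
                .enableSsl(true)              // TLS to the proxy; adjust to match the HAProxy listener
                .keepAliveInterval(30000)     // periodic keep-alive so idle LB timeouts don't drop the socket
                .maxWaitForConnection(16000)  // how long a request waits for a connection from the pool
                .reconnectInterval(1000)      // how often an unavailable host is retried
                .create();

        // Built once and reused for all subsequent queries -- this is the "g" that
        // eventually starts throwing NoHostAvailableException.
        GraphTraversalSource g = traversal().withRemote(DriverRemoteConnection.using(cluster, "g"));

        try {
            System.out.println(g.V().limit(10).count().next());
        } finally {
            g.close();
            cluster.close();
        }
    }
}
```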
spmallette 11/17/2023
Technically, 3.7.0 shouldn't have any problems with Neptune, but we still advise 3.6.x. I don't like to hear that you are getting NHA exceptions. I was only recently celebrating that we'd not heard a single report of that problem in about 6 months. Anyway, there are very few ways left to get a NHA, and all of them essentially relate to initialization of the Client instance. You only see them if the Client can't create the connection pool, meaning that either the host isn't there or it is unreachable for some reason. It's a bit odd that you only see this on reuse of "g", because if you had a successful query, it means initialization would have already happened. Future requests would never initialize again. A failure to connect would put the driver into a state where it would retry the connection until it could find a place to send the request, so you would more likely experience a pause while it tried to do that rather than an exception. cc/ @Kennh before we dig too much further into this, @triggan is there anything special about using the Java driver with HAProxy that you know about?
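To make the initialization point concrete, here's a rough sketch (the endpoint is a placeholder and there's nothing Neptune-specific in it): the connection pool is created once, and you can force that to happen eagerly so a failure surfaces at startup rather than on the first traversal.

```java
import org.apache.tinkerpop.gremlin.driver.Client;
import org.apache.tinkerpop.gremlin.driver.Cluster;

public class EagerInit {
    public static void main(String[] args) {
        Cluster cluster = Cluster.build()
                .addContactPoint("haproxy.example.com")  // placeholder host
                .port(8182)
                .create();

        Client client = cluster.connect();

        // init() builds the connection pool eagerly. If the host is missing or unreachable,
        // the failure shows up here or on the first submit as the NoHostAvailableException
        // described above -- not on later requests, because once a query has succeeded the
        // pool already exists and is simply reused.
        client.init();

        System.out.println(client.submit("g.V().limit(1).count()").one().getLong());

        client.close();
        cluster.close();
    }
}
```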
gdotv 11/17/2023
Let me keep you posted on that; it seems to have been resolved, and I'm wondering if it was happening on an older version of gdotv with an older version of the Gremlin driver. I'll dig into this more next week and let you know.
