Seeking Advice on Encoding/Decoding Data from gRPC Endpoints with `connect-es`/`schema`/`effect`
Hello! For anyone that has spent time integrating gRPC/`connect-es` with `effect`: what approach did you take for encoding/decoding data from gRPC endpoints?
A bit of context: I already have an internal package, generated via codegen, with the services exposing gRPC endpoints. The package is not generated with the `json_types=true` flag though, so the `toJson` method from `@bufbuild/protobuf` is not strongly typed.
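(For reference, turning the flag on is a one-line option in the codegen config. This sketch assumes buf's v2 config format and `protoc-gen-es` v2; the output path is invented:)

```yaml
# buf.gen.yaml: enable typed JSON output (assumes protoc-gen-es v2)
version: v2
plugins:
  - local: protoc-gen-es
    out: src/gen          # hypothetical output directory
    opt: json_types=true  # makes toJson/fromJson strongly typed
```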
Do you codegen with the flag on, transform the data with `toJson`, and then validate/"narrow" the types with `Schema`? Or do you have a lot of ad hoc `Schema` primitives for dealing with all the "complex" gRPC types? For the first approach, I was picturing something like the sketch below.
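A minimal sketch of the `toJson` + `Schema` route, assuming a hypothetical `User` message generated by `protoc-gen-es` v2 (the `./gen/user_pb` module and its fields are invented for illustration):

```ts
import { Schema } from "effect"
import { toJson } from "@bufbuild/protobuf"
// Hypothetical generated module: protoc-gen-es v2 emits both the message
// type (User) and its runtime schema (UserSchema).
import { type User, UserSchema } from "./gen/user_pb"

// Hand-written Schema that narrows the untyped JsonValue that toJson
// produces when json_types=true is off.
const DomainUser = Schema.Struct({
  id: Schema.String,
  // proto-JSON serializes google.protobuf.Timestamp as an RFC 3339 string
  createdAt: Schema.DateFromString
})

// proto message -> JsonValue -> validated, strongly typed domain value
const decodeUser = (msg: User) =>
  Schema.decodeUnknown(DomainUser)(toJson(UserSchema, msg))
```

The appeal is that the hand-written `Schema` doubles as the single source of truth for the domain type, while `toJson` deals with the proto-specific representations (64-bit ints as strings, `Timestamp` as RFC 3339 strings, and so on).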
When encoding, I guess you don't have much choice but to use `Schema` to validate the data coming into the server and then painstakingly use `@bufbuild/protobuf`'s `create(proto_schema)` method to fit the data into whatever is required by the gRPC endpoint (sketched below).
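A sketch of that encode side, with the same hypothetical `User` message and an invented input shape:

```ts
import { Effect, Schema } from "effect"
import { create } from "@bufbuild/protobuf"
import { UserSchema } from "./gen/user_pb" // hypothetical generated module

// Invented input shape for a create-user endpoint.
const CreateUserInput = Schema.Struct({
  id: Schema.String,
  name: Schema.String
})

// unknown payload -> Schema-validated input -> proto message for the endpoint
const toProtoUser = (input: unknown) =>
  Schema.decodeUnknown(CreateUserInput)(input).pipe(
    Effect.map((valid) => create(UserSchema, { id: valid.id, name: valid.name }))
  )
```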
Just trying to gather feedback around gRPC/`connect-es` and `effect` integration.
