Validating Large JSON Schemas without high time/memory costs
I'm building a low-code tool for generating user interfaces. The UI structure is defined using JSON. The schema for the JSON structure is quite large.
I have been using Zod for validation, but it is very slow, taking minutes to validate a single UI definition. I've tried switching to ArkType, but it consumes too much memory, often causing the browser to crash. When it does not crash (and on the server, where I can set a high memory limit), the call to `type.module` takes approximately 15 seconds, after which validation itself takes mere milliseconds.

Given these performance issues, how can I efficiently validate large JSON structures against a schema without excessive time or memory consumption? Are there optimizations, alternative libraries, or hybrid approaches that could help?
Would love to hear any insights or best practices!
Here is a small section of the ArkType schema to give you an idea of how it works. As you can see, it is inherently cyclic (take a look at `component`).
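A minimal sketch of what such a cyclic module looks like, assuming a `component` alias that references itself through a `children` array (the alias and property names here are illustrative, not the actual schema):

```typescript
import { type } from "arktype"

// Hypothetical fragment: a UI component that may contain child
// components, making the module self-referential (cyclic).
const ui = type.module({
  component: {
    kind: "string",
    "props?": "object",
    // The cycle: children are themselves components.
    "children?": "component[]"
  }
})
```

Because every alias in the module can reference every other alias (including itself), the whole graph has to be resolved up front, which is where the one-time `type.module` cost comes from.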