VCF store with ~2000 gVCF files, can't export

Before turning the keys over to our researchers, who I know are going to try to export data to a multi-sample gVCF, I thought I'd try something simple, so I imported a subset of our gVCF data into a bucket on S3.

If I run tiledbvcf list on that bucket, I eventually (after about 5 minutes) get a list of sample names back. But if I run tiledbvcf export on that bucket without specifying samples, even with a tiny 1 kbp range (NC_000001.11:28000-29000), I run out of memory. I've allocated 64 GB and still get errors. I can export 10 samples, but that's not going to cut it for a GWAS experiment.
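The failing invocation was along these lines (bucket path as in the create command further down; exact flag spellings from memory, so check `tiledbvcf export --help` for your version):

```shell
# Sketch of the export that runs out of memory; region as quoted above.
tiledbvcf export \
  --uri s3://the-path \
  --regions NC_000001.11:28000-29000 \
  --output-format v
```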

[2025-11-10 16:31:28.310] [tiledb-vcf] [Process: 3709885] [Thread: 3709885] [critical] Exception: SparseIndexReaderBase: Cannot set array memory budget (5153960755.200001) because it is smaller than the current memory usage (28763909118).
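For scale, converting the two byte counts in that exception to GiB (my own arithmetic, not from the logs):

```python
# Numbers copied verbatim from the exception message above.
budget_bytes = 5_153_960_755.200001   # "array memory budget"
usage_bytes = 28_763_909_118          # "current memory usage"

GIB = 2**30
print(f"budget: {budget_bytes / GIB:.1f} GiB")  # ~4.8 GiB
print(f"usage:  {usage_bytes / GIB:.1f} GiB")   # ~26.8 GiB
```

So the reader already holds roughly 27 GiB while the budget it is handed is only about 4.8 GiB, even though the process has 64 GB available.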

The dataset was created using

tiledbvcf create --uri s3://the-path --enable-allele-count --enable-variant-stats --enable-sample-stats --attributes 'info_*','fmt_*'

Am I missing something in the configuration here?