Hi everyone, I'm not sure whether this is the correct forum for this, because it relates more to TileDB Embedded and GitHub Copilot than to TileDB Cloud proper. Has anyone else noticed that with:
VS Code Jupyter notebooks
GitHub Copilot enabled
there is an I/O error when trying to write a large dense array (10000 x 10000)?
TileDBError: TileDB internal: [OrderedWriter::dowork] ([TileDB::IO] Error: Cannot write to file '....\__fragments\__1734663652875_1734663652875_717d568ee2dcf5de4f362ef743681cd2_22\a0.tdb'; File opening error CreateFile GetLastError 32 (0x00000020): The process cannot access the file because it is being used by another process.
The issue goes away once I disable GitHub Copilot. I suspect that Copilot is somehow reading the fragment files and not closing them, so TileDB can't open them for writing.
How are you running the code? I tried some variations in VS Code with Copilot enabled and was not able to reproduce (running the file in the terminal and in the interactive window, and running a selection in the terminal and in the interactive window).
Thanks Isaiah for looking into this. I am running the code:
within a Jupyter notebook
in VSCode
with Copilot turned on in the background with an enterprise license
using Windows
on a company laptop
and I just double-checked with a colleague that Copilot is doing something odd for us, because the same code works fine on my personal Linux desktop with my own free Copilot license.
So I think it might be some enterprise security software that our IT department runs in the background. I'm not certain, but this may just be an edge case for enterprise users not on TileDB Cloud.
Shout Out: We’ve been using TileDB with Azure blob and it’s been absolutely fantastic so far. You guys really have some magic going on under the hood.
The VS Code workspace is running with the Python, Jupyter, and Copilot extensions enabled, in a virtual environment (venv + Python 3.11) that has tiledb 0.33.0 installed from pip.