FP file writer plugin errors in multi-file mode with fixed number of frames for block-mode datasets #2

Open · bug: Something isn't working
timcnicholls (Contributor) opened this issue on Oct 4, 2024 · 0 comments

In situations where datasets are defined with an outer chunk dimension greater than 1 and the file writer plugin is configured to write a fixed number of frames into multiple HDF5 files (so-called "block mode"), the file writer plugin generates an error indicating that it is attempting to write beyond the dimensions of the dataset.
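
For context, this is the frame-to-file mapping that block mode implies for the config in the example below. It is an illustrative sketch only, not odin-data source: it assumes each received frame carries one full outer chunk of 1000 images (consistent with the 512000-byte image size in the log), and all variable names here are mine.

    #include <cstdint>
    #include <cstdio>

    int main()
    {
        // Values taken from the example FP config below
        const uint64_t chunk_frames     = 1000;  // outer chunk dimension
        const uint64_t frames_per_block = 1000;
        const uint64_t blocks_per_file  = 2;
        const uint64_t frames_per_file  = frames_per_block * blocks_per_file;  // 2000

        // Each received frame carries one full chunk of 1000 images, so
        // frame n starts at global image offset n * chunk_frames.
        for (uint64_t frame = 0; frame < 4; ++frame) {
            const uint64_t global_offset = frame * chunk_frames;
            std::printf("frame %llu -> file %llu, offset within file %llu\n",
                        (unsigned long long) frame,
                        (unsigned long long) (global_offset / frames_per_file),
                        (unsigned long long) (global_offset % frames_per_file));
        }
        return 0;
    }

This prints frames 0 and 1 mapping to file 0 at offsets 0 and 1000, and frames 2 and 3 to file 1 at offsets 0 and 1000. One plausible (unconfirmed) reading of the error below is that the global offsets 2000 and 3000 are instead being applied unmapped against a per-file extent of 2000 frames.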

For instance, with the following FP config:

{
    "hdf": {
        "dataset": {
            "raw": {
                "datatype": "uint16",
                "dims": [16, 16],
                "chunks": [1000, 16, 16],
                "compression": "none"
            }
        },
        "file": {
            "path": "/tmp/"
        },
        "frames": 4000,
        "acquisition_id": "test_1",
        "write": true,
        "timeout_timer_period": 10000,
        "process": {
            "number": 1,
            "rank": 0,
            "frames_per_block": 1000,
            "blocks_per_file": 2
        }
    }
}

results in the following error:

13:48:33,859  FP.BabydFrameBuilderCore INFO  - BabydFrameBuilderCore : 0 Got frame: 0
13:48:33,859  FP.BabydFrameWrapperCore INFO  - Wrapped frame:  | Dataset name: raw | frame_number: 0 | Bitdepth: 2 | image size: 512000 | Compression: false
13:48:33,868  FP.BabydFrameBuilderCore INFO  - BabydFrameBuilderCore : 0 Got frame: 1
13:48:33,868  FP.BabydFrameWrapperCore INFO  - Wrapped frame:  | Dataset name: raw | frame_number: 1 | Bitdepth: 2 | image size: 512000 | Compression: false
13:48:33,872  FP.BabydFrameBuilderCore INFO  - BabydFrameBuilderCore : 0 Got frame: 2
13:48:33,872  FP.BabydFrameWrapperCore INFO  - Wrapped frame:  | Dataset name: raw | frame_number: 2 | Bitdepth: 2 | image size: 512000 | Compression: false
13:48:33,872  FP.BabydFrameBuilderCore INFO  - BabydFrameBuilderCore : 0 Got frame: 3
13:48:33,872  FP.BabydFrameWrapperCore INFO  - Wrapped frame:  | Dataset name: raw | frame_number: 3 | Bitdepth: 2 | image size: 512000 | Compression: false
HDF5-DIAG: Error detected in HDF5 (1.10.4) thread 139831270803200:
  #000: ../../../src/H5Dio.c line 400 in H5Dwrite_chunk(): failure to copy offset array
    major: Dataset
    minor: Can't allocate space
  #001: ../../../src/H5Dio.c line 116 in H5D__get_offset_copy(): offset exceeds dimensions of dataset
    major: Dataspace
    minor: Inappropriate type
13:48:33,872  FP.HDF5File    ERROR - HDF5 Function Error: (H5DOwrite_chunk failed) in /aeg_sw/work/users/mrl93834/develop/projects/odin-data-dpdk/odin-data/cpp/frameProcessor/src/HDF5File.cpp:253: void FrameProcessor::HDF5File::write_frame(const FrameProcessor::Frame&, hsize_t, uint64_t, FrameProcessor::HDF5CallDurations_t&)
HDF5 Stack Trace:
  [0]: ../../../src/H5Dio.c:400 in H5Dwrite_chunk: "failure to copy offset array"
  [1]: ../../../src/H5Dio.c:116 in H5D__get_offset_copy: "offset exceeds dimensions of dataset"
13:48:33,872  FP.FrameProcessorPlugin ERROR - Frame invalid
13:48:33,872  FP.FrameProcessorPlugin ERROR - Unexpected exception: HDF5 Function Error: (H5DOwrite_chunk failed) in /aeg_sw/work/users/mrl93834/develop/projects/odin-data-dpdk/odin-data/cpp/frameProcessor/src/HDF5File.cpp:253: void FrameProcessor::HDF5File::write_frame(const FrameProcessor::Frame&, hsize_t, uint64_t, FrameProcessor::HDF5CallDurations_t&)

The error does not occur if the dataset is being written in continuous mode, i.e. with frames=0.
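
For reference, the HDF5-level failure can be reproduced standalone: a direct chunk write at an offset beyond a fixed dataset extent raises exactly the diagnostic stack above. The snippet below is a minimal sketch, not the odin-data code path; the 2000-frame extent and 3000-frame offset are assumptions chosen to match the block-mode arithmetic, and it uses H5Dwrite_chunk, the successor to the H5DOwrite_chunk call named in the error (available in the main library since HDF5 1.10.3).

    #include <hdf5.h>
    #include <cstdint>
    #include <vector>

    int main()
    {
        hsize_t dims[3]  = {2000, 16, 16};   // assumed per-file extent: 2 blocks x 1000 frames
        hsize_t chunk[3] = {1000, 16, 16};   // outer chunk dimension > 1, as in the config

        hid_t file  = H5Fcreate("repro.h5", H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT);
        hid_t space = H5Screate_simple(3, dims, NULL);
        hid_t dcpl  = H5Pcreate(H5P_DATASET_CREATE);
        H5Pset_chunk(dcpl, 3, chunk);
        hid_t dset  = H5Dcreate2(file, "raw", H5T_NATIVE_UINT16, space,
                                 H5P_DEFAULT, dcpl, H5P_DEFAULT);

        // One chunk of data: 1000 x 16 x 16 x 2 bytes = 512000, matching the log
        std::vector<uint16_t> buf(1000 * 16 * 16, 0);

        // Writing a chunk at a global frame offset (3000, an assumed value)
        // that lies beyond this file's 2000-frame extent reproduces the
        // "offset exceeds dimensions of dataset" stack reported above.
        hsize_t offset[3] = {3000, 0, 0};
        herr_t status = H5Dwrite_chunk(dset, H5P_DEFAULT, 0, offset,
                                       buf.size() * sizeof(uint16_t), buf.data());

        H5Dclose(dset);
        H5Pclose(dcpl);
        H5Sclose(space);
        H5Fclose(file);
        return status < 0 ? 1 : 0;  // status is negative here
    }

Built against HDF5 1.10.x (e.g. h5c++ repro.cpp), this fails with the same "failure to copy offset array" / "offset exceeds dimensions of dataset" pair shown in the log.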

timcnicholls added the bug label on Oct 4, 2024
timcnicholls self-assigned this on Oct 4, 2024