Commit 63bb0b1

Merge pull request #140 from mulimoen/feature/dcpl-2021
The DCPL branch
2 parents: 9035bb6 + 8fdf133 (commit 63bb0b1)

46 files changed: +7628, -1888 lines

.github/workflows/ci.yml (+16, -3)

```diff
@@ -162,9 +162,6 @@ jobs:
         run: |
           [ "${{matrix.mpi}}" != "serial" ] && FEATURES=mpio
           cargo test -vv --features="$FEATURES"
-      - name: Test const generics
-        if: matrix.rust == 'nightly'
-        run: cargo test -p hdf5-types --features hdf5-types/const_generics
 
   msi:
     name: msi
@@ -209,6 +206,22 @@ jobs:
       - name: Build and test all crates
         run: cargo test -vv
 
+  msrv:
+    name: Minimal Supported Rust Version
+    runs-on: ubuntu-18.04
+    strategy:
+      fail-fast: false
+    steps:
+      - name: Checkout repository
+        uses: actions/checkout@v2
+        with: {submodules: true}
+      - name: Install Rust
+        uses: actions-rs/toolchain@v1
+        with: {toolchain: 1.51, profile: minimal, override: true}
+      - name: Build and test all crates
+        run:
+          cargo test --workspace -vv --features=hdf5-sys/static --exclude=hdf5-derive
+
   wine:
     name: wine
     runs-on: ubuntu-latest
```
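
The new MSRV job pins Rust 1.51 because this branch (see the changelog below) switches `hdf5-types` over to const generics, which stabilized in that release. As a standalone illustration of the language feature (not code from this commit), a single const-generic item covers every array length that previously had to be enumerated by a macro:

```rust
// Illustration only: const generics (stable since Rust 1.51) let one generic
// item work for every `[T; N]` instead of macro-enumerated lengths.
fn array_len<T, const N: usize>(_arr: &[T; N]) -> usize {
    N
}

fn main() {
    assert_eq!(array_len(&[0u8; 3]), 3);
    assert_eq!(array_len(&[0.0f64; 97]), 97); // any length works, no macro needed
}
```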

CHANGELOG.md (+59)

```diff
@@ -2,6 +2,65 @@
 
 ## Unreleased
 
+### Added
+
+- Complete rewrite of `DatasetBuilder`; the dataset creation API is now different and not
+  backwards-compatible (however, it integrates all of the new features and is more
+  flexible and powerful). It is now possible to create and write datasets in one step.
+  Refer to the API docs for the full reference.
+- New `Extents` type matching HDF5 extents types: null (no elements), scalar, simple (fixed
+  dimensionality); it is used to query and specify the shapes of datasets. Extents objects
+  are convertible from numbers and also tuples, slices and vectors of indices - all of
+  which can be used whenever extents are required (e.g., when creating a new dataset).
+- New `Selection` type and an API around it closely matching the HDF5 selection API - this
+  includes 'all' selection, point-wise selection and hyperslab selection (only 'regular'
+  hyperslabs are supported - that is, hyperslabs that can be represented as a single
+  multi-dimensional box, some of whose dimensions may be infinite). Selection objects
+  are convertible from numbers and ranges and from tuples and arrays of numbers and ranges;
+  one can also use the `s!` macro from the `ndarray` crate. Selections can be provided when
+  reading and writing slices.
+- LZF / Blosc filters have been added; they have to be enabled via the `"lzf"` / `"blosc"`
+  cargo features and depend on the `lzf-sys` / `blosc-sys` crates respectively. The Blosc
+  filter is a meta-filter providing multi-threaded access to best-in-class compression
+  codecs like Zstd and LZ4, and is recommended as a default when compression is critical.
+- New `Filter` type to unify all of the filters API; if the LZF / Blosc filters are enabled,
+  this enum also contains the corresponding variants. It is now also possible to provide
+  user-defined filters with custom filter IDs and parameters.
+- The dataset creation property list (DCPL) API is now supported; this provides access to
+  all of the properties that can be specified at dataset creation time (e.g., layout,
+  chunking, fill values, external file linking, virtual maps, object time tracking,
+  attribute creation order, and a few other low-level settings).
+- As part of the DCPL change, virtual dataset maps (the VDS API in HDF5 1.10+) are now supported.
+- The link creation property list (LCPL) API is now also wrapped.
+- The file creation property list (FCPL) API has been extended to include a few previously
+  missing properties (object time tracking, attribute creation order and a few other
+  low-level settings).
+- Added an `h5-alloc` feature to the `hdf5-types` crate - it uses the HDF5 allocator for
+  varlen types and dynamic values. This may be necessary on platforms where different
+  allocators may be used in different libraries (e.g. dynamic libraries on Windows),
+  or if `libhdf5` is compiled with the memchecker option enabled. This option is
+  force-enabled by default when using a DLL version of the library on Windows.
+- New `DynValue` type which represents a dynamic, self-describing HDF5 object that
+  also knows how to deallocate itself; it supports all of the HDF5 types including
+  compound types, strings and arrays.
+- Added methods to `Dataset`: `layout`, `dapl`, `access_plist`, `dcpl`, `create_plist`.
+
+### Changed
+
+- The `Dataspace` type has been reworked and can now be constructed from an extents object
+  and sliced with a selection object.
+- `Dataset::fill_value` now returns an object of the newly added `DynValue` type; this
+  object is self-describing and knows how to free itself.
+- Automatic chunking now uses a fill-from-back approach instead of the method
+  previously used in `h5py`.
+- Removed the `Filters` type (there is now `Filter`, which represents a single filter).
+- `write_slice`, `read_slice`, `read_slice_1d` and `read_slice_2d` now take any object
+  convertible to `Selection` (instead of `SliceInfo`).
+- `Dataset::chunks` has been renamed to `Dataset::chunk`.
+- Const generics support (MSRV 1.51): `hdf5-types` now uses const generics for array types,
+  allowing fixed-size arrays of arbitrary sizes.
+- The `ndarray` dependency has been updated to `0.15`.
+
 ## 0.7.1
 
 ### Added
```
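
To make the "Added" entries concrete, here is a hedged sketch of how the new builder, extents and selection pieces fit together. It relies only on calls visible elsewhere in this commit (`shape`, `create`, `write` in the README/example diffs) plus the changelog statement that the slice methods accept anything convertible to a `Selection`, including `ndarray`'s `s!` macro; treat the exact signatures as assumptions rather than reference documentation.

```rust
use ndarray::{arr2, s, Array2};

fn main() -> hdf5::Result<()> {
    let file = hdf5::File::create("demo.h5")?;

    // Shape now comes from the builder (`.shape(...)` before `.create(name)`),
    // matching the README and examples/simple.rs changes in this commit.
    let ds = file.new_dataset::<f64>().shape((2, 3)).create("data")?;
    ds.write(&arr2(&[[1., 2., 3.], [4., 5., 6.]]))?;

    // Per the changelog, read_slice_2d takes anything convertible to a
    // `Selection`; here an `s!` expression picks columns 1..3 of every row.
    let cols: Array2<f64> = ds.read_slice_2d(s![.., 1..3])?;
    assert_eq!(cols.shape(), &[2, 2]);

    Ok(())
}
```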

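The "Changed" entries pair with the new `Dataset` accessors listed above (`layout`, `dapl`, `access_plist`, `dcpl`, `create_plist`, plus `chunk` and the `DynValue`-returning `fill_value`). A minimal sketch of querying a dataset follows; the method names are taken from the changelog, while the assumption that their return values are `Debug`-printable is mine:

```rust
// Sketch: inspecting a dataset created with the new API. `layout` and `chunk`
// (renamed from `chunks`) are named in the changelog; their concrete return
// types are assumed here to implement `Debug`.
fn inspect(ds: &hdf5::Dataset) {
    println!("layout = {:?}", ds.layout());
    println!("chunk  = {:?}", ds.chunk());
}
```
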
Cargo.toml (+9, -3)

```diff
@@ -14,6 +14,8 @@ edition = "2018"
 [features]
 default = []
 mpio = ["mpi-sys", "hdf5-sys/mpio"]
+lzf = ["lzf-sys", "errno"]
+blosc = ["blosc-sys"]
 
 [workspace]
 members = [".", "hdf5-types", "hdf5-derive", "hdf5-sys", "hdf5-src"]
@@ -24,21 +26,25 @@ bitflags = "1.2"
 lazy_static = "1.4"
 libc = "0.2"
 parking_lot = "0.11"
-ndarray = ">=0.13,<0.15"
+ndarray = "0.15"
 num-integer = "0.1"
 num-traits = "0.2"
 mpi-sys = { version = "0.1", optional = true }
+errno = { version = "0.2", optional = true }
 hdf5-sys = { path = "hdf5-sys", version = "0.7.1" } # !V
 hdf5-types = { path = "hdf5-types", version = "0.7.1" } # !V
 hdf5-derive = { path = "hdf5-derive", version = "0.7.1" } # !V
+blosc-sys = { version = "0.1", package = "blosc-src", optional = true }
+lzf-sys = { version = "0.1", optional = true }
+cfg-if = "1.0"
 
 [dev-dependencies]
 paste = "1.0"
-pretty_assertions = "0.6"
+pretty_assertions = "0.7"
 rand = { version = "0.8", features = ["small_rng"] }
 regex = "1.3"
 scopeguard = "1.0"
 tempfile = "3.2"
 
 [package.metadata.docs.rs]
-features = ["hdf5-sys/static", "hdf5-sys/zlib"]
+features = ["hdf5-sys/static", "hdf5-sys/zlib", "blosc", "lzf"]
```

README.md (+2, -2)

```diff
@@ -49,10 +49,10 @@ fn main() -> hdf5::Result<()> {
     {
         // write
         let file = hdf5::File::create("pixels.h5")?;
-        let colors = file.new_dataset::<Color>().create("colors", 2)?;
+        let colors = file.new_dataset::<Color>().shape(2).create("colors")?;
         colors.write(&[RED, BLUE])?;
         let group = file.create_group("dir")?;
-        let pixels = group.new_dataset::<Pixel>().create("pixels", (2, 2))?;
+        let pixels = group.new_dataset::<Pixel>().shape((2, 2)).create("pixels")?;
         pixels.write(&arr2(&[
             [Pixel { xy: (1, 2), color: RED }, Pixel { xy: (3, 4), color: BLUE }],
             [Pixel { xy: (5, 6), color: GREEN }, Pixel { xy: (7, 8), color: RED }],
```

examples/simple.rs (+2, -2)

```diff
@@ -23,10 +23,10 @@ fn main() -> hdf5::Result<()> {
     {
         // write
         let file = hdf5::File::create("pixels.h5")?;
-        let colors = file.new_dataset::<Color>().create("colors", 2)?;
+        let colors = file.new_dataset::<Color>().shape(2).create("colors")?;
         colors.write(&[RED, BLUE])?;
         let group = file.create_group("dir")?;
-        let pixels = group.new_dataset::<Pixel>().create("pixels", (2, 2))?;
+        let pixels = group.new_dataset::<Pixel>().shape((2, 2)).create("pixels")?;
         pixels.write(&arr2(&[
             [Pixel { xy: (1, 2), color: RED }, Pixel { xy: (3, 4), color: BLUE }],
             [Pixel { xy: (5, 6), color: GREEN }, Pixel { xy: (7, 8), color: RED }],
```
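
The README and example above still create the dataset first and write it in a second step; the changelog also promises a one-step create-and-write path. The entry point and method names below (`new_dataset_builder`, `with_data`) are hinted at by the changelog but are not shown anywhere in this diff, so treat them as assumptions:

```rust
use ndarray::arr2;

fn main() -> hdf5::Result<()> {
    let file = hdf5::File::create("pixels2.h5")?;
    // Hypothetical one-step variant of the example above, per the changelog's
    // "create and write the datasets in one step" (names are assumptions).
    let _ds = file
        .new_dataset_builder()
        .with_data(&arr2(&[[1i32, 2], [3, 4]]))
        .create("ints")?;
    Ok(())
}
```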

hdf5-sys/Cargo.toml (+1, -1)

```diff
@@ -29,7 +29,7 @@ static = ["hdf5-src"]
 deprecated = ["hdf5-src/deprecated"]
 
 [build-dependencies]
-libloading = "0.6"
+libloading = "0.7"
 regex = { version = "1.3", features = ["std"] }
 
 [target.'cfg(all(unix, not(target_os = "macos")))'.build-dependencies]
```

hdf5-sys/build.rs (+20, -22)

```diff
@@ -85,7 +85,7 @@ impl Display for RuntimeError {
 
 #[allow(non_snake_case, non_camel_case_types)]
 fn get_runtime_version_single<P: AsRef<Path>>(path: P) -> Result<Version, Box<dyn Error>> {
-    let lib = libloading::Library::new(path.as_ref())?;
+    let lib = unsafe { libloading::Library::new(path.as_ref()) }?;
 
     type H5open_t = unsafe extern "C" fn() -> c_int;
     let H5open = unsafe { lib.get::<H5open_t>(b"H5open")? };
@@ -124,28 +124,26 @@ fn validate_runtime_version(config: &Config) {
     }
     for link_path in &link_paths {
         if let Ok(paths) = fs::read_dir(link_path) {
-            for path in paths {
-                if let Ok(path) = path {
-                    let path = path.path();
-                    if let Some(filename) = path.file_name() {
-                        let filename = filename.to_str().unwrap_or("");
-                        if path.is_file() && libfiles.contains(&filename) {
-                            println!("Attempting to load: {:?}", path);
-                            match get_runtime_version_single(&path) {
-                                Ok(version) => {
-                                    println!(" => runtime version = {:?}", version);
-                                    if version == config.header.version {
-                                        println!("HDF5 library runtime version matches headers.");
-                                        return;
-                                    }
-                                    panic!(
-                                        "Invalid HDF5 runtime version (expected: {:?}).",
-                                        config.header.version
-                                    );
-                                }
-                                Err(err) => {
-                                    println!(" => {}", err);
+            for path in paths.flatten() {
+                let path = path.path();
+                if let Some(filename) = path.file_name() {
+                    let filename = filename.to_str().unwrap_or("");
+                    if path.is_file() && libfiles.contains(&filename) {
+                        println!("Attempting to load: {:?}", path);
+                        match get_runtime_version_single(&path) {
+                            Ok(version) => {
+                                println!(" => runtime version = {:?}", version);
+                                if version == config.header.version {
+                                    println!("HDF5 library runtime version matches headers.");
+                                    return;
                                 }
+                                panic!(
+                                    "Invalid HDF5 runtime version (expected: {:?}).",
+                                    config.header.version
+                                );
+                            }
+                            Err(err) => {
+                                println!(" => {}", err);
                             }
                         }
                     }
```
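
Two changes are visible here: the nested `if let Ok(path) = path` is replaced by iterating over `paths.flatten()`, and `libloading::Library::new` is now wrapped in `unsafe`, as required by the libloading 0.6 -> 0.7 bump in `hdf5-sys/Cargo.toml` (loading a foreign library can run arbitrary initialization code). A standalone sketch of the 0.7 loading pattern, with a placeholder library path:

```rust
use std::error::Error;

// Sketch of the libloading 0.7 pattern used above; both `Library::new` and
// `Library::get` are unsafe. The library path is a placeholder.
fn h5open_status(libpath: &str) -> Result<i32, Box<dyn Error>> {
    let lib = unsafe { libloading::Library::new(libpath) }?;
    let h5open = unsafe { lib.get::<unsafe extern "C" fn() -> i32>(b"H5open")? };
    Ok(unsafe { h5open() })
}
```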

hdf5-sys/src/lib.rs (+1)

```diff
@@ -2,6 +2,7 @@
 #![cfg_attr(feature = "cargo-clippy", allow(clippy::unreadable_literal))]
 #![cfg_attr(feature = "cargo-clippy", allow(clippy::missing_safety_doc))]
 #![cfg_attr(feature = "cargo-clippy", allow(clippy::cognitive_complexity))]
+#![cfg_attr(feature = "cargo-clippy", allow(clippy::upper_case_acronyms))]
 
 macro_rules! extern_static {
     ($dest:ident, $src:ident) => {
```

hdf5-types/Cargo.toml (+5, -1)

```diff
@@ -8,13 +8,17 @@ description = "Native Rust equivalents of HDF5 types."
 repository = "https://github.com/aldanor/hdf5-rust"
 homepage = "https://github.com/aldanor/hdf5-rust"
 edition = "2018"
+build = "build.rs"
 
 [features]
-const_generics = []
+h5-alloc = []
 
 [dependencies]
 ascii = "1.0"
 libc = "0.2"
+hdf5-sys = { version = "0.7.1", path = "../hdf5-sys" } # !V
+cfg-if = "1.0.0"
 
 [dev-dependencies]
 quickcheck = { version = "1.0", default-features = false }
+unindent = "0.1"
```

hdf5-types/build.rs (+6)

```diff
@@ -0,0 +1,6 @@
+fn main() {
+    println!("cargo:rerun-if-changed=build.rs");
+    if std::env::var_os("DEP_HDF5_MSVC_DLL_INDIRECTION").is_some() {
+        println!("cargo:rustc-cfg=windows_dll");
+    }
+}
```
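
This new build script turns a metadata flag exported by `hdf5-sys` for MSVC DLL builds into a `windows_dll` cfg. How `hdf5-types` consumes that cfg is not part of this commit; the sketch below only illustrates the build-script-to-cfg pattern, and the function is hypothetical (per the changelog, the HDF5 allocator is force-enabled for DLL builds on Windows):

```rust
// Hypothetical consumer of the cfg emitted by the build script above; the
// real allocator-selection logic in hdf5-types is not shown in this commit.
#[cfg(windows_dll)]
fn uses_hdf5_allocator() -> bool {
    true // DLL build on Windows: the HDF5 allocator is force-enabled
}

#[cfg(not(windows_dll))]
fn uses_hdf5_allocator() -> bool {
    false // otherwise governed by the `h5-alloc` feature
}

fn main() {
    println!("HDF5 allocator in use: {}", uses_hdf5_allocator());
}
```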

hdf5-types/src/array.rs (+17, -71)

```diff
@@ -14,76 +14,22 @@ pub unsafe trait Array: 'static {
     fn capacity() -> usize;
 }
 
-#[cfg(not(feature = "const_generics"))]
-mod impl_array {
-    use super::*;
-
-    macro_rules! impl_array {
-        () => ();
-
-        ($n:expr, $($ns:expr,)*) => (
-            unsafe impl<T: 'static> Array for [T; $n] {
-                type Item = T;
-
-                #[inline(always)]
-                fn as_ptr(&self) -> *const T {
-                    self as *const _ as *const _
-                }
-
-                #[inline(always)]
-                fn as_mut_ptr(&mut self) -> *mut T {
-                    self as *mut _ as *mut _
-                }
-
-                #[inline(always)]
-                fn capacity() -> usize {
-                    $n
-                }
-            }
+unsafe impl<T: 'static, const N: usize> Array for [T; N] {
+    type Item = T;
 
-            impl_array!($($ns,)*);
-        );
+    #[inline(always)]
+    fn as_ptr(&self) -> *const T {
+        self as *const _
     }
 
-    impl_array!(
-        0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24,
-        25, 26, 27, 28, 29, 30, 31,
-    );
-    impl_array!(
-        32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54,
-        55, 56, 57, 58, 59, 60, 61, 62, 63,
-    );
-    impl_array!(
-        64, 70, 72, 80, 90, 96, 100, 110, 120, 128, 130, 140, 150, 160, 170, 180, 190, 192, 200,
-        210, 220, 224, 230, 240, 250,
-    );
-    impl_array!(
-        256, 300, 365, 366, 384, 400, 500, 512, 600, 700, 768, 800, 900, 1000, 1024, 2048, 4096,
-        8192, 16384, 32768,
-    );
-}
-
-#[cfg(feature = "const_generics")]
-mod impl_array {
-    use super::*;
-
-    unsafe impl<T: 'static, const N: usize> Array for [T; N] {
-        type Item = T;
-
-        #[inline(always)]
-        fn as_ptr(&self) -> *const T {
-            self as *const _
-        }
-
-        #[inline(always)]
-        fn as_mut_ptr(&mut self) -> *mut T {
-            self as *mut _ as *mut _
-        }
+    #[inline(always)]
+    fn as_mut_ptr(&mut self) -> *mut T {
+        self as *mut _ as *mut _
+    }
 
-        #[inline(always)]
-        fn capacity() -> usize {
-            N
-        }
+    #[inline(always)]
+    fn capacity() -> usize {
+        N
     }
 }
 
@@ -97,7 +43,7 @@ pub struct VarLenArray<T: Copy> {
 impl<T: Copy> VarLenArray<T> {
     pub unsafe fn from_parts(p: *const T, len: usize) -> VarLenArray<T> {
         let (len, ptr) = if !p.is_null() && len != 0 {
-            let dst = libc::malloc(len * mem::size_of::<T>());
+            let dst = crate::malloc(len * mem::size_of::<T>());
             ptr::copy_nonoverlapping(p, dst as *mut _, len);
             (len, dst)
         } else {
@@ -136,7 +82,7 @@ impl<T: Copy> Drop for VarLenArray<T> {
     fn drop(&mut self) {
        if !self.ptr.is_null() {
             unsafe {
-                libc::free(self.ptr as *mut _);
+                crate::free(self.ptr as *mut _);
             }
             self.ptr = ptr::null();
             if self.len != 0 {
@@ -173,10 +119,10 @@ impl<'a, T: Copy> From<&'a [T]> for VarLenArray<T> {
     }
 }
 
-impl<T: Copy> Into<Vec<T>> for VarLenArray<T> {
+impl<T: Copy> From<VarLenArray<T>> for Vec<T> {
     #[inline]
-    fn into(self) -> Vec<T> {
-        self.iter().cloned().collect()
+    fn from(v: VarLenArray<T>) -> Self {
+        v.iter().cloned().collect()
     }
 }
 
```
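
A small usage sketch grounded in the impls visible in this hunk: any array length now satisfies `Array` (previously only the lengths enumerated by the macro), and `VarLenArray` converts to `Vec` through the new `From` impl. The trait and type names are taken from this diff; it assumes both are re-exported at the `hdf5_types` crate root as in prior releases.

```rust
use hdf5_types::{Array, VarLenArray};

// `capacity()` comes from the `Array` trait shown above.
fn capacity_of<A: Array>() -> usize {
    A::capacity()
}

fn main() {
    // 97 was not in the old macro's list of supported lengths; with the
    // const-generic impl, every `[T; N]` works.
    assert_eq!(capacity_of::<[u8; 97]>(), 97);

    // `From<&[T]>` (already present) and the new `From<VarLenArray<T>> for Vec<T>`.
    let varlen = VarLenArray::from(&[1i32, 2, 3][..]);
    let back: Vec<i32> = varlen.into();
    assert_eq!(back, vec![1, 2, 3]);
}
```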