Comparing changes

base repository: echemdata/galvani, base: 0.2.0
head repository: echemdata/galvani, compare: master

Commits on Nov 7, 2020

  1. e11419e, Verified: this commit was created on GitHub.com and signed with GitHub’s verified signature (the key has expired).
  2. a60caa4 by chatcannon (Chris Kerr), Verified: this commit was signed with the committer’s verified signature (the key has expired).
  3. 8d0e2a4 by chatcannon (Chris Kerr), Verified
  4. Add all test data files to the repo

     Store the files with git-lfs to avoid making the git history excessively large.

     chatcannon committed Nov 7, 2020 (093cde0, Verified)
  5. Remove get_testdata.sh

     This file is no longer needed, because the test data are saved in the repo with git-lfs.

     chatcannon committed Nov 7, 2020 (8d31743, Verified)

Commits on Mar 20, 2021

  1. Merge pull request #57 from echemdata/add-codeql-analysis

     Enable CodeQL security analysis

     chatcannon authored Mar 20, 2021 (ce011f2, Verified)
  2. dd9cf01 by chatcannon (Chris Kerr), Verified
  3. 4b20425 by chatcannon (Chris Kerr), Verified
  4. 635655e by chatcannon (Chris Kerr), Verified
  5. Remove .flake8 file which is no longer used

     The flake8 configuration is in tox.ini instead.

     chatcannon committed Mar 20, 2021 (a78b711, Verified)
  6. b8742bf by chatcannon (Chris Kerr), Verified
  7. Add SPDX-FileCopyrightText to BioLogic.py

     There are additional committers who have made changes to this file, but only adding new colIDs etc., which is not copyrightable.

     Here is the corresponding git-shortlog output:

     Dennis (1):
           Improved compatibility with .mpr files

     Peter Attia (1):
           Update BioLogic.py

     Tim (3):
           improved parsing for PEIS files
           new column types
           new column types

     dennissheberla (2):
           Improved compatibility with .mpt files
           Improved compatibility with new .mpr files

     nhshetty-99 (3):
           Added colIDs 74 and 462 to VMPdata_colID_dtype_map
           Changed colID 74 and 462 order from original addition
           Added column 469 to BioLogic.py

     chatcannon committed Mar 20, 2021 (4257a29, Verified)
  8. c57cd52 by chatcannon (Chris Kerr), Verified
  9. d00319f by chatcannon (Chris Kerr), Verified
  10. 0560643 by chatcannon (Chris Kerr), Verified
  11. Wheel package actually only needs GPLv3 licence file

      The files with other licences are not included in the wheel package.

      chatcannon committed Mar 20, 2021 (741b17d, Verified)
  12. Add MANIFEST.in to include licence files in the source tarball

      Exclude the MIT licence since the GitHub CodeQL file is not packaged.

      chatcannon committed Mar 20, 2021 (def2bba, Verified)

Commits on Mar 21, 2021

  1. 352fc43 by chatcannon (Chris Kerr), Verified

Commits on Apr 25, 2021

  1. Merge pull request #61 from chatcannon/fsfe-reuse-metadata

     Add REUSE metadata

     chatcannon authored Apr 25, 2021 (dd605e8, Verified)
  2. dcd4315, Unverified: this commit is not signed, but one or more authors require that any commit attributed to them is signed.

Commits on May 23, 2021

  1. cec14e6 by chatcannon (Chris Kerr), Verified

Commits on Jul 2, 2021

  1. 4ebdc66 by chatcannon (Chris Kerr), Verified

Commits on Jul 3, 2021

  1. bcd7c5a by chatcannon (Chris Kerr), Verified

Commits on Aug 30, 2021

  1. 9bbff69 by chatcannon (Chris Kerr), Verified
  2. de182bd by chatcannon (Chris Kerr), Verified

Commits on Aug 31, 2021

  1. Merge pull request #66 from chatcannon/add-column-R-Ohm

     Add column IDs for 'R/Ohm' and 'Rapp/Ohm'

     chatcannon authored Aug 31, 2021 (7ef5be1, Verified)
  2. a3c742e by chatcannon (Chris Kerr), Verified
  3. b9a8afa by chatcannon (Chris Kerr), Verified
  4. 4aea136 by chatcannon (Chris Kerr), Verified
  5. f1fbcbe by chatcannon (Chris Kerr), Verified
  6. ad39747 by chatcannon (Chris Kerr), Verified

Commits on Sep 1, 2021

  1. Merge pull request #63 from chatcannon/time-format-dotted

     Add '%m.%d.%y' date format for .mpr file timestamps

     chatcannon authored Sep 1, 2021 (3b68a30, Verified)
  2. 54c3813 by chatcannon (Chris Kerr), Verified
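The '%m.%d.%y' commit above is an instance of the classic multi-format timestamp problem: .mpr headers from different EC-Lab versions write dates differently, so the parser has to try formats in turn. A minimal sketch of that pattern follows; the candidate list here is illustrative, not galvani's actual list.

```python
from datetime import datetime

# Candidate timestamp formats for .mpr headers. '%m.%d.%y' is the format
# added by the commit above; the other entries are illustrative guesses.
CANDIDATE_FORMATS = ("%m/%d/%y", "%m-%d-%y", "%m.%d.%y")

def parse_mpr_date(text):
    """Return the first datetime that any candidate format can parse."""
    for fmt in CANDIDATE_FORMATS:
        try:
            return datetime.strptime(text, fmt)
        except ValueError:
            continue
    raise ValueError(f"unrecognised date: {text!r}")

print(parse_mpr_date("09.01.21").date())  # 2021-09-01
```

Trying formats in a fixed order is ambiguous for day/month swaps, which is why each newly observed vendor format tends to arrive as its own small commit.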

Commits on Jan 15, 2022

  1. Merge pull request #58 from chatcannon/testdata-lfs

     Store the test data with git-lfs

     chatcannon authored Jan 15, 2022 (c02a871, Verified)
  2. b63abc4 by chatcannon (Chris Kerr), Verified
  3. Add column 174 'Ewe/V' to parser

     Suggested by @Etruria89 to fix #67

     chatcannon committed Jan 15, 2022 (c1e5d92, Verified)

Commits on Jan 18, 2022

  1. Merge pull request #68 from chatcannon/add-ewe-column

     Add Column 174 'Ewe/V'

     chatcannon authored Jan 18, 2022 (d6d2125, Verified)

Commits on May 30, 2022

  1. Update BioLogic.py

     Added "control/mA", "Q charge/discharge/mA.h", "step time/s", "Q charge/mA.h", "Q discharge/mA.h", "Efficiency/%", and "Capacity/mA.h" to the possible fieldnames in fieldname_to_dtype(fieldname), and also to VMPdata_colID_dtype_map.

     GhostDeini authored May 30, 2022 (32ea152, Verified)

Commits on Sep 10, 2022

  1. Add Column 438 'Unknown' to parser

     陳致諭 (Chihyu Chen#5570) authored and chatcannon committed Sep 10, 2022 (2e7437c, Verified)
  2. 54e5765 by chatcannon (Chris Kerr), Verified
  3. 0ffdd26 by chatcannon (Chris Kerr), Verified
  4. e1ff99a by chatcannon (Chris Kerr), Verified

Commits on Nov 17, 2022

  1. add support for column 27: E_we-E_ce/V (fix #74)

     Ilka Schulz committed Nov 17, 2022 (fec3a22)

Commits on Nov 30, 2022

  1. Merge pull request #71 from GhostDeini/patch-1

     Add more column types to BioLogic.py

     chatcannon authored Nov 30, 2022 (e5a1b84, Verified)
  2. 1025923, Verified (created on GitHub.com and signed with GitHub’s verified signature)
  3. Merge pull request #75 from chatcannon/yuyu-step-time

     Add "step time/s" column data type

     chatcannon authored Nov 30, 2022 (9f51925, Verified)

Commits on Dec 30, 2022

  1. fixed colon error

     whs92 committed Dec 30, 2022 (ab65d28)

Commits on Dec 31, 2022

  1. Merge pull request #86 from whs92/master

     Fixed syntax error typo

     chatcannon authored Dec 31, 2022 (4bca2ac, Verified)

Commits on Jul 21, 2023

  1. Added ID 505 and 509 from EC-Lab, according to the export to Text Dialog, assuming they are ordered by ID

     Paulemeister committed Jul 21, 2023 (8ce4eb0)

Commits on Aug 18, 2023

  1. Add tox-gh based CI

     ml-evs committed Aug 18, 2023 (0f0c281, Verified)
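Many of the commits above (columns 27, 174, 438, 505, 509, the R/Ohm pair) add numeric column IDs to BioLogic.py's VMPdata_colID_dtype_map, which maps each .mpr data-column ID to a field name and binary type so rows can be decoded. A minimal sketch of that pattern using only the standard library follows; the IDs 27 and 174 come from the commits above, while the type codes and helper are illustrative assumptions, not galvani's actual code.

```python
import struct

# Illustrative colID -> (field name, struct format code) map, in the spirit
# of VMPdata_colID_dtype_map. 'd' = float64, 'f' = float32; the type
# assignments here are assumptions for the sketch.
VMP_COLID_MAP = {
    4: ("time/s", "d"),
    27: ("E_we-E_ce/V", "f"),
    174: ("Ewe/V", "f"),
}

def row_struct(col_ids):
    """Build column names and a struct.Struct that unpacks one data row."""
    names, fmt = [], "<"  # little-endian, as binary instrument files usually are
    for cid in col_ids:
        if cid not in VMP_COLID_MAP:
            # Unknown IDs are exactly what the commits above keep adding
            raise NotImplementedError(f"column ID {cid} is not supported")
        name, code = VMP_COLID_MAP[cid]
        names.append(name)
        fmt += code
    return names, struct.Struct(fmt)

names, s = row_struct([4, 174])
print(names, s.unpack(s.pack(1.5, 3.25)))  # ['time/s', 'Ewe/V'] (1.5, 3.25)
```

This structure explains why unsupported files fail loudly until someone contributes the missing ID, and why so many small single-column commits accumulate in the log.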
Showing with 1,774 additions and 444 deletions.
  1. +0 −2 .flake8
  2. +8 −0 .gitattributes
  3. +73 −0 .github/workflows/ci.yml
  4. +71 −0 .github/workflows/codeql-analysis.yml
  5. +11 −2 .gitignore
  6. +8 −0 .reuse/dep5
  7. +2 −2 .travis.yml
  8. +156 −0 LICENSES/CC-BY-4.0.txt
  9. +121 −0 LICENSES/CC0-1.0.txt
  10. +232 −0 LICENSES/GPL-3.0-or-later.txt
  11. +9 −0 LICENSES/MIT.txt
  12. +9 −0 MANIFEST.in
  13. +68 −7 README.md
  14. +429 −221 galvani/BioLogic.py
  15. +5 −1 galvani/__init__.py
  16. +119 −70 galvani/res2sqlite.py
  17. +0 −28 get_testdata.sh
  18. +2 −0 requirements.txt
  19. +8 −0 setup.cfg
  20. +24 −21 setup.py
  21. +6 −2 tests/conftest.py
  22. +39 −18 tests/test_Arbin.py
  23. +238 −69 tests/test_BioLogic.py
  24. +3 −0 tests/testdata/020-formation_CB5.mpr
  25. +2 −0 tests/testdata/020-formation_CB5.mpr.license
  26. +3 −0 tests/testdata/020-formation_CB5.mpt
  27. +2 −0 tests/testdata/020-formation_CB5.mpt.license
  28. +3 −0 tests/testdata/121_CA_455nm_6V_30min_C01.mpr
  29. +3 −0 tests/testdata/121_CA_455nm_6V_30min_C01.mpt
  30. +3 −0 tests/testdata/C019P-0ppb-A_C01.mpr
  31. +3 −0 tests/testdata/CV_C01.mpr
  32. +3 −0 tests/testdata/CV_C01.mpt
  33. +3 −0 tests/testdata/EIS_latin1.mpt
  34. +3 −0 tests/testdata/Ewe_Error.mpr
  35. +2 −0 tests/testdata/Ewe_Error.mpr.license
  36. +3 −0 tests/testdata/Rapp_Error.mpr
  37. +2 −0 tests/testdata/Rapp_Error.mpr.license
  38. +3 −0 tests/testdata/UM34_Test005E.res
  39. +2 −0 tests/testdata/UM34_Test005E.res.license
  40. +3 −0 tests/testdata/arbin1.res
  41. +3 −0 tests/testdata/bio_logic1.mpr
  42. +3 −0 tests/testdata/bio_logic1.mpt
  43. +3 −0 tests/testdata/bio_logic2.mpr
  44. +3 −0 tests/testdata/bio_logic2.mpt
  45. +3 −0 tests/testdata/bio_logic3.mpr
  46. +3 −0 tests/testdata/bio_logic4.mpr
  47. +3 −0 tests/testdata/bio_logic4.mpt
  48. +3 −0 tests/testdata/bio_logic5.mpr
  49. +3 −0 tests/testdata/bio_logic5.mpt
  50. +3 −0 tests/testdata/bio_logic6.mpr
  51. +3 −0 tests/testdata/bio_logic6.mpt
  52. +3 −0 tests/testdata/col_27_issue_74.mpr
  53. +3 −0 tests/testdata/v1150/v1150_CA.mpr
  54. +3 −0 tests/testdata/v1150/v1150_CA.mpt
  55. +3 −0 tests/testdata/v1150/v1150_CP.mpr
  56. +3 −0 tests/testdata/v1150/v1150_CP.mpt
  57. +3 −0 tests/testdata/v1150/v1150_GCPL.mpr
  58. +3 −0 tests/testdata/v1150/v1150_GCPL.mpt
  59. +3 −0 tests/testdata/v1150/v1150_GEIS.mpr
  60. +3 −0 tests/testdata/v1150/v1150_GEIS.mpt
  61. +3 −0 tests/testdata/v1150/v1150_MB.mpr
  62. +3 −0 tests/testdata/v1150/v1150_MB.mpt
  63. +3 −0 tests/testdata/v1150/v1150_OCV.mpr
  64. +3 −0 tests/testdata/v1150/v1150_OCV.mpt
  65. +3 −0 tests/testdata/v1150/v1150_PEIS.mpr
  66. +3 −0 tests/testdata/v1150/v1150_PEIS.mpt
  67. +12 −1 tox.ini
2 changes: 0 additions & 2 deletions .flake8

This file was deleted.

8 changes: 8 additions & 0 deletions .gitattributes
Original file line number Diff line number Diff line change
@@ -0,0 +1,8 @@
# SPDX-FileCopyrightText: 2021 Christopher Kerr <chris.kerr@mykolab.ch>
# SPDX-License-Identifier: CC0-1.0

# Arbin data files
*.res filter=lfs diff=lfs merge=lfs -text
# Bio-Logic data files
*.mpr filter=lfs diff=lfs merge=lfs -text
*.mpt filter=lfs diff=lfs merge=lfs -text
73 changes: 73 additions & 0 deletions .github/workflows/ci.yml
@@ -0,0 +1,73 @@
# SPDX-FileCopyrightText: 2013-2020 Christopher Kerr, "bcolsen"
# SPDX-License-Identifier: GPL-3.0-or-later

name: CI tests
on:
  pull_request:
  push:
    branches:
      - master

concurrency:
  # cancels running checks on new pushes
  group: check-${{ github.ref }}
  cancel-in-progress: true

jobs:

  pytest:
    name: Run Python unit tests
    runs-on: ubuntu-22.04

    strategy:
      fail-fast: false
      max-parallel: 6
      matrix:
        python-version: ['3.8', '3.9', '3.10', '3.11']

    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0
          lfs: false

      # Due to limited LFS bandwidth, it is preferable to download
      # test files from the last release.
      #
      # This does mean that testing new LFS files in the CI is tricky;
      # care should be taken to also test new files locally first.
      # Tests missing these files in the CI should still fail.
      - name: Download static files from last release for testing
        uses: robinraju/release-downloader@v1
        with:
          latest: true
          tarBall: false
          fileName: "galvani-*.gz"
          zipBall: false
          out-file-path: /home/runner/work/last-release
          extract: true

      - name: Copy test files from static downloaded release
        run: |
          cp -r /home/runner/work/last-release/*/tests/testdata tests

      - name: Set up Python ${{ matrix.python-version }}
        uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python-version }}

      - name: Install MDBTools OS dependency
        run: |
          sudo apt install -y mdbtools

      # tox-gh workflow following instructions at https://github.com/tox-dev/tox-gh
      - name: Install tox
        run: python -m pip install tox-gh

      - name: Setup tests
        run: |
          tox -vv --notest

      - name: Run all tests
        run: |-
          tox --skip-pkg-install
71 changes: 71 additions & 0 deletions .github/workflows/codeql-analysis.yml
@@ -0,0 +1,71 @@
# SPDX-FileCopyrightText: 2006-2020 GitHub, Inc.
# SPDX-License-Identifier: MIT

# For most projects, this workflow file will not need changing; you simply need
# to commit it to your repository.
#
# You may wish to alter this file to override the set of languages analyzed,
# or to provide custom queries or build logic.
#
# ******** NOTE ********
# We have attempted to detect the languages in your repository. Please check
# the `language` matrix defined below to confirm you have the correct set of
# supported CodeQL languages.
# ******** NOTE ********

name: "CodeQL"

on:
  push:
    branches: [ master ]
  pull_request:
    # The branches below must be a subset of the branches above
    branches: [ master ]
  schedule:
    - cron: '22 23 * * 1'

jobs:
  analyze:
    name: Analyze
    runs-on: ubuntu-latest

    strategy:
      fail-fast: false
      matrix:
        language: [ 'python' ]
        # CodeQL supports [ 'cpp', 'csharp', 'go', 'java', 'javascript', 'python' ]
        # Learn more...
        # https://docs.github.com/en/github/finding-security-vulnerabilities-and-errors-in-your-code/configuring-code-scanning#overriding-automatic-language-detection

    steps:
      - name: Checkout repository
        uses: actions/checkout@v2

      # Initializes the CodeQL tools for scanning.
      - name: Initialize CodeQL
        uses: github/codeql-action/init@v1
        with:
          languages: ${{ matrix.language }}
          # If you wish to specify custom queries, you can do so here or in a config file.
          # By default, queries listed here will override any specified in a config file.
          # Prefix the list here with "+" to use these queries and those in the config file.
          # queries: ./path/to/local/query, your-org/your-repo/queries@main

      # Autobuild attempts to build any compiled languages (C/C++, C#, or Java).
      # If this step fails, then you should remove it and run the build manually (see below)
      - name: Autobuild
        uses: github/codeql-action/autobuild@v1

      # ℹ️ Command-line programs to run using the OS shell.
      # 📚 https://git.io/JvXDl

      # ✏️ If the Autobuild fails above, remove it and uncomment the following three lines
      # and modify them (or add more) to build your code if your project
      # uses a compiled language

      #- run: |
      #   make bootstrap
      #   make release

      - name: Perform CodeQL Analysis
        uses: github/codeql-action/analyze@v1
13 changes: 11 additions & 2 deletions .gitignore
@@ -1,3 +1,6 @@
# SPDX-FileCopyrightText: 2013-2017 Christopher Kerr <chris.kerr@mykolab.ch>
# SPDX-License-Identifier: CC0-1.0

*.py[cod]

# C extensions
@@ -36,5 +39,11 @@ nosetests.xml
.project
.pydevproject

# Data for testing
testdata
# Compressed files used to transfer test data
*.gz
*.bz2
*.xz
*.zip
*.tar
*.tgz
*.tbz2
8 changes: 8 additions & 0 deletions .reuse/dep5
@@ -0,0 +1,8 @@
Format: https://www.debian.org/doc/packaging-manuals/copyright-format/1.0/
Upstream-Name: Galvani
Upstream-Contact: Christopher Kerr <chris.kerr@mykolab.ch>
Source: https://github.com/echemdata/galvani

Files: tests/testdata/*
Copyright: 2010-2014 Christopher Kerr <chris.kerr@mykolab.ch>
License: CC-BY-4.0
4 changes: 2 additions & 2 deletions .travis.yml
@@ -1,16 +1,16 @@
# SPDX-FileCopyrightText: 2017-2020 Christopher Kerr <chris.kerr@mykolab.ch>
# SPDX-License-Identifier: GPL-3.0-or-later
sudo: false
language: python
cache:
  directories:
    - .tox
    - .pytest_cache
    - tests/testdata
python:
  - "3.6"
  - "3.7"
  - "3.8"
  - "3.9"
install:
  - pip install tox-travis
  - sh get_testdata.sh
script: tox
156 changes: 156 additions & 0 deletions LICENSES/CC-BY-4.0.txt

Large diffs are not rendered by default.

121 changes: 121 additions & 0 deletions LICENSES/CC0-1.0.txt
@@ -0,0 +1,121 @@
Creative Commons Legal Code

CC0 1.0 Universal

CREATIVE COMMONS CORPORATION IS NOT A LAW FIRM AND DOES NOT PROVIDE
LEGAL SERVICES. DISTRIBUTION OF THIS DOCUMENT DOES NOT CREATE AN
ATTORNEY-CLIENT RELATIONSHIP. CREATIVE COMMONS PROVIDES THIS
INFORMATION ON AN "AS-IS" BASIS. CREATIVE COMMONS MAKES NO WARRANTIES
REGARDING THE USE OF THIS DOCUMENT OR THE INFORMATION OR WORKS
PROVIDED HEREUNDER, AND DISCLAIMS LIABILITY FOR DAMAGES RESULTING FROM
THE USE OF THIS DOCUMENT OR THE INFORMATION OR WORKS PROVIDED
HEREUNDER.

Statement of Purpose

The laws of most jurisdictions throughout the world automatically confer
exclusive Copyright and Related Rights (defined below) upon the creator
and subsequent owner(s) (each and all, an "owner") of an original work of
authorship and/or a database (each, a "Work").

Certain owners wish to permanently relinquish those rights to a Work for
the purpose of contributing to a commons of creative, cultural and
scientific works ("Commons") that the public can reliably and without fear
of later claims of infringement build upon, modify, incorporate in other
works, reuse and redistribute as freely as possible in any form whatsoever
and for any purposes, including without limitation commercial purposes.
These owners may contribute to the Commons to promote the ideal of a free
culture and the further production of creative, cultural and scientific
works, or to gain reputation or greater distribution for their Work in
part through the use and efforts of others.

For these and/or other purposes and motivations, and without any
expectation of additional consideration or compensation, the person
associating CC0 with a Work (the "Affirmer"), to the extent that he or she
is an owner of Copyright and Related Rights in the Work, voluntarily
elects to apply CC0 to the Work and publicly distribute the Work under its
terms, with knowledge of his or her Copyright and Related Rights in the
Work and the meaning and intended legal effect of CC0 on those rights.

1. Copyright and Related Rights. A Work made available under CC0 may be
protected by copyright and related or neighboring rights ("Copyright and
Related Rights"). Copyright and Related Rights include, but are not
limited to, the following:

i. the right to reproduce, adapt, distribute, perform, display,
communicate, and translate a Work;
ii. moral rights retained by the original author(s) and/or performer(s);
iii. publicity and privacy rights pertaining to a person's image or
likeness depicted in a Work;
iv. rights protecting against unfair competition in regards to a Work,
subject to the limitations in paragraph 4(a), below;
v. rights protecting the extraction, dissemination, use and reuse of data
in a Work;
vi. database rights (such as those arising under Directive 96/9/EC of the
European Parliament and of the Council of 11 March 1996 on the legal
protection of databases, and under any national implementation
thereof, including any amended or successor version of such
directive); and
vii. other similar, equivalent or corresponding rights throughout the
world based on applicable law or treaty, and any national
implementations thereof.

2. Waiver. To the greatest extent permitted by, but not in contravention
of, applicable law, Affirmer hereby overtly, fully, permanently,
irrevocably and unconditionally waives, abandons, and surrenders all of
Affirmer's Copyright and Related Rights and associated claims and causes
of action, whether now known or unknown (including existing as well as
future claims and causes of action), in the Work (i) in all territories
worldwide, (ii) for the maximum duration provided by applicable law or
treaty (including future time extensions), (iii) in any current or future
medium and for any number of copies, and (iv) for any purpose whatsoever,
including without limitation commercial, advertising or promotional
purposes (the "Waiver"). Affirmer makes the Waiver for the benefit of each
member of the public at large and to the detriment of Affirmer's heirs and
successors, fully intending that such Waiver shall not be subject to
revocation, rescission, cancellation, termination, or any other legal or
equitable action to disrupt the quiet enjoyment of the Work by the public
as contemplated by Affirmer's express Statement of Purpose.

3. Public License Fallback. Should any part of the Waiver for any reason
be judged legally invalid or ineffective under applicable law, then the
Waiver shall be preserved to the maximum extent permitted taking into
account Affirmer's express Statement of Purpose. In addition, to the
extent the Waiver is so judged Affirmer hereby grants to each affected
person a royalty-free, non transferable, non sublicensable, non exclusive,
irrevocable and unconditional license to exercise Affirmer's Copyright and
Related Rights in the Work (i) in all territories worldwide, (ii) for the
maximum duration provided by applicable law or treaty (including future
time extensions), (iii) in any current or future medium and for any number
of copies, and (iv) for any purpose whatsoever, including without
limitation commercial, advertising or promotional purposes (the
"License"). The License shall be deemed effective as of the date CC0 was
applied by Affirmer to the Work. Should any part of the License for any
reason be judged legally invalid or ineffective under applicable law, such
partial invalidity or ineffectiveness shall not invalidate the remainder
of the License, and in such case Affirmer hereby affirms that he or she
will not (i) exercise any of his or her remaining Copyright and Related
Rights in the Work or (ii) assert any associated claims and causes of
action with respect to the Work, in either case contrary to Affirmer's
express Statement of Purpose.

4. Limitations and Disclaimers.

a. No trademark or patent rights held by Affirmer are waived, abandoned,
surrendered, licensed or otherwise affected by this document.
b. Affirmer offers the Work as-is and makes no representations or
warranties of any kind concerning the Work, express, implied,
statutory or otherwise, including without limitation warranties of
title, merchantability, fitness for a particular purpose, non
infringement, or the absence of latent or other defects, accuracy, or
the present or absence of errors, whether or not discoverable, all to
the greatest extent permissible under applicable law.
c. Affirmer disclaims responsibility for clearing rights of other persons
that may apply to the Work or any use thereof, including without
limitation any person's Copyright and Related Rights in the Work.
Further, Affirmer disclaims responsibility for obtaining any necessary
consents, permissions or other rights required for any use of the
Work.
d. Affirmer understands and acknowledges that Creative Commons is not a
party to this document and has no duty or obligation with respect to
this CC0 or use of the Work.
232 changes: 232 additions & 0 deletions LICENSES/GPL-3.0-or-later.txt

Large diffs are not rendered by default.

9 changes: 9 additions & 0 deletions LICENSES/MIT.txt
@@ -0,0 +1,9 @@
MIT License

Copyright (c) <year> <copyright holders>

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
9 changes: 9 additions & 0 deletions MANIFEST.in
@@ -0,0 +1,9 @@
# SPDX-FileCopyrightText: 2021 Christopher Kerr
# SPDX-License-Identifier: CC0-1.0

recursive-include LICENSES *.txt
include README.md

# The GitHub CodeQL file is not included in the tarball,
# so its licence does not need to be included either
exclude LICENSES/MIT.txt
75 changes: 68 additions & 7 deletions README.md
@@ -1,21 +1,82 @@
galvani
=======

<!---
SPDX-FileCopyrightText: 2013-2020 Christopher Kerr, Peter Attia
SPDX-License-Identifier: GPL-3.0-or-later
-->

Read proprietary file formats from electrochemical test stations.

# Usage

## Bio-Logic .mpr files

Use the `MPRfile` class from BioLogic.py (exported in the main package)

```python
from galvani import BioLogic
import pandas as pd

mpr_file = BioLogic.MPRfile('test.mpr')
df = pd.DataFrame(mpr_file.data)
```

## Arbin .res files

Use the `./galvani/res2sqlite.py` script to convert the .res file to a sqlite3 database with the same schema, which can then be interrogated with external tools or directly in Python.
For example, to extract the data into a pandas DataFrame (pandas must be installed separately):

```python
import sqlite3
import pandas as pd
from galvani.res2sqlite import convert_arbin_to_sqlite
convert_arbin_to_sqlite("input.res", "output.sqlite")
with sqlite3.connect("output.sqlite") as db:
df = pd.read_sql(sql="select * from Channel_Normal_Table", con=db)
```

This functionality requires [MDBTools](https://github.com/mdbtools/mdbtools) to be installed on the local system.
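Because the converted database keeps the Arbin schema, standard SQL works on it directly. A self-contained sketch against a hand-built miniature of `Channel_Normal_Table` (only four of its columns, chosen for illustration — the real converted table has many more):

```python
import sqlite3

# Miniature stand-in for the Channel_Normal_Table produced by
# res2sqlite, populated with a few fake data points.
db = sqlite3.connect(":memory:")
db.execute(
    "CREATE TABLE Channel_Normal_Table "
    "(Test_ID INTEGER, Data_Point INTEGER, Test_Time REAL, Voltage REAL)"
)
rows = [(1, 1, 0.0, 3.20), (1, 2, 10.0, 3.25), (1, 3, 20.0, 3.31)]
db.executemany("INSERT INTO Channel_Normal_Table VALUES (?, ?, ?, ?)", rows)

# e.g. the maximum voltage reached in a given test:
max_v, = db.execute(
    "SELECT MAX(Voltage) FROM Channel_Normal_Table WHERE Test_ID = 1"
).fetchone()
print(max_v)  # 3.31
```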

# Installation

The latest galvani releases can be installed from [PyPI](https://pypi.org/project/galvani/) via

```shell
pip install galvani
```

The latest development version can be installed with `pip` directly from GitHub (see note about git-lfs below):

```shell
GIT_LFS_SKIP_SMUDGE=1 pip install git+https://github.com/echemdata/galvani
```

## Development installation and contributing

> [!WARNING]
>
> This project uses Git Large File Storage (LFS) to store its test files;
> however, the LFS quota provided by GitHub is frequently exceeded.
> This means that anyone cloning the repository with LFS installed will get
> failures unless they set the `GIT_LFS_SKIP_SMUDGE=1` environment variable when
> cloning.
> The full test data from the last release can always be obtained by
> downloading the GitHub release archives (tar or zip) from
> https://github.com/echemdata/galvani/releases/latest.
>
> If you wish to add test files, please ensure they are as small as possible,
> and take care that your tests work locally without the need for the LFS files.
> Ideally, you could commit them to your fork when making a PR, and then they
> can be converted to LFS files as part of the review.
If you wish to contribute to galvani, please clone the repository and install the testing dependencies:

```shell
git clone git@github.com:echemdata/galvani
cd galvani
pip install -e .\[tests\]
```

Code can be contributed back via [GitHub pull requests](https://github.com/echemdata/galvani/pulls) and new features or bugs can be discussed in the [issue tracker](https://github.com/echemdata/galvani/issues).
650 changes: 429 additions & 221 deletions galvani/BioLogic.py

Large diffs are not rendered by default.

6 changes: 5 additions & 1 deletion galvani/__init__.py
@@ -1,3 +1,7 @@
# SPDX-FileCopyrightText: 2014-2019 Christopher Kerr <chris.kerr@mykolab.ch>
#
# SPDX-License-Identifier: GPL-3.0-or-later

from .BioLogic import MPRfile, MPTfile

__all__ = ['MPRfile', 'MPTfile']
__all__ = ["MPRfile", "MPTfile"]
189 changes: 119 additions & 70 deletions galvani/res2sqlite.py
@@ -1,5 +1,9 @@
#!/usr/bin/python

# SPDX-FileCopyrightText: 2013-2020 Christopher Kerr <chris.kerr@mykolab.ch>
#
# SPDX-License-Identifier: GPL-3.0-or-later

import subprocess as sp
import sqlite3
import re
@@ -12,43 +16,43 @@
# $ mdb-schema <result.res> oracle

mdb_tables = [
'Version_Table',
'Global_Table',
'Resume_Table',
'Channel_Normal_Table',
'Channel_Statistic_Table',
'Auxiliary_Table',
'Event_Table',
'Smart_Battery_Info_Table',
'Smart_Battery_Data_Table',
"Version_Table",
"Global_Table",
"Resume_Table",
"Channel_Normal_Table",
"Channel_Statistic_Table",
"Auxiliary_Table",
"Event_Table",
"Smart_Battery_Info_Table",
"Smart_Battery_Data_Table",
]
mdb_5_23_tables = [
'MCell_Aci_Data_Table',
'Aux_Global_Data_Table',
'Smart_Battery_Clock_Stretch_Table',
"MCell_Aci_Data_Table",
"Aux_Global_Data_Table",
"Smart_Battery_Clock_Stretch_Table",
]
mdb_5_26_tables = [
'Can_BMS_Info_Table',
'Can_BMS_Data_Table',
"Can_BMS_Info_Table",
"Can_BMS_Data_Table",
]

mdb_tables_text = {
'Version_Table',
'Global_Table',
'Event_Table',
'Smart_Battery_Info_Table',
'Can_BMS_Info_Table',
"Version_Table",
"Global_Table",
"Event_Table",
"Smart_Battery_Info_Table",
"Can_BMS_Info_Table",
}
mdb_tables_numeric = {
'Resume_Table',
'Channel_Normal_Table',
'Channel_Statistic_Table',
'Auxiliary_Table',
'Smart_Battery_Data_Table',
'MCell_Aci_Data_Table',
'Aux_Global_Data_Table',
'Smart_Battery_Clock_Stretch_Table',
'Can_BMS_Data_Table',
"Resume_Table",
"Channel_Normal_Table",
"Channel_Statistic_Table",
"Auxiliary_Table",
"Smart_Battery_Data_Table",
"MCell_Aci_Data_Table",
"Aux_Global_Data_Table",
"Smart_Battery_Clock_Stretch_Table",
"Can_BMS_Data_Table",
}

mdb_create_scripts = {
@@ -187,7 +191,7 @@
Event_Type INTEGER,
Event_Describe TEXT
); """,
"Smart_Battery_Info_Table": """
"Smart_Battery_Info_Table": """
CREATE TABLE Smart_Battery_Info_Table
(
Test_ID INTEGER PRIMARY KEY REFERENCES Global_Table(Test_ID),
@@ -267,7 +271,7 @@
REFERENCES Channel_Normal_Table (Test_ID, Data_Point)
); """,
# The following tables are not present in version 1.14, but are in 5.23
'MCell_Aci_Data_Table': """
"MCell_Aci_Data_Table": """
CREATE TABLE MCell_Aci_Data_Table
(
Test_ID INTEGER,
@@ -281,7 +285,7 @@
FOREIGN KEY (Test_ID, Data_Point)
REFERENCES Channel_Normal_Table (Test_ID, Data_Point)
);""",
'Aux_Global_Data_Table': """
"Aux_Global_Data_Table": """
CREATE TABLE Aux_Global_Data_Table
(
Channel_Index INTEGER,
@@ -291,7 +295,7 @@
Unit TEXT,
PRIMARY KEY (Channel_Index, Auxiliary_Index, Data_Type)
);""",
'Smart_Battery_Clock_Stretch_Table': """
"Smart_Battery_Clock_Stretch_Table": """
CREATE TABLE Smart_Battery_Clock_Stretch_Table
(
Test_ID INTEGER,
@@ -340,15 +344,15 @@
REFERENCES Channel_Normal_Table (Test_ID, Data_Point)
);""",
# The following tables are not present in version 5.23, but are in 5.26
'Can_BMS_Info_Table': """
"Can_BMS_Info_Table": """
CREATE TABLE "Can_BMS_Info_Table"
(
Channel_Index INTEGER PRIMARY KEY,
CAN_Cfg_File_Name TEXT,
CAN_Configuration TEXT
);
""",
'Can_BMS_Data_Table': """
"Can_BMS_Data_Table": """
CREATE TABLE "Can_BMS_Data_Table"
(
Test_ID INTEGER,
@@ -367,7 +371,8 @@
CREATE UNIQUE INDEX data_point_index ON Channel_Normal_Table (Test_ID, Data_Point);
CREATE INDEX voltage_index ON Channel_Normal_Table (Test_ID, Voltage);
CREATE INDEX test_time_index ON Channel_Normal_Table (Test_ID, Test_Time);
"""}
"""
}

helper_table_script = """
CREATE TEMPORARY TABLE capacity_helper(
@@ -434,17 +439,20 @@
def mdb_get_data_text(s3db, filename, table):
print("Reading %s..." % table)
insert_pattern = re.compile(
r'INSERT INTO "\w+" \([^)]+?\) VALUES \(("[^"]*"|[^")])+?\);\n',
re.IGNORECASE
r"""INSERT INTO "\w+" \([^)]+?\) VALUES (\((('[^']*')|"[^"]*"|[^')])+?\),?\s*)+;\n""",
re.IGNORECASE,
)
try:
# Initialize values to avoid NameError in except clause
mdb_output = ''
mdb_output = ""
insert_match = None
with sp.Popen(['mdb-export', '-I', 'postgres', filename, table],
bufsize=-1, stdin=sp.DEVNULL, stdout=sp.PIPE,
universal_newlines=True) as mdb_sql:

with sp.Popen(
["mdb-export", "-I", "postgres", filename, table],
bufsize=-1,
stdin=sp.DEVNULL,
stdout=sp.PIPE,
universal_newlines=True,
) as mdb_sql:
mdb_output = mdb_sql.stdout.read()
while len(mdb_output) > 0:
insert_match = insert_pattern.match(mdb_output)
@@ -455,8 +463,10 @@ def mdb_get_data_text(s3db, filename, table):

except OSError as e:
if e.errno == 2:
raise RuntimeError('Could not locate the `mdb-export` executable. '
'Check that mdbtools is properly installed.')
raise RuntimeError(
"Could not locate the `mdb-export` executable. "
"Check that mdbtools is properly installed."
)
else:
raise
except BaseException:
@@ -471,14 +481,18 @@ def mdb_get_data_text(s3db, filename, table):
def mdb_get_data_numeric(s3db, filename, table):
print("Reading %s..." % table)
try:
with sp.Popen(['mdb-export', filename, table],
bufsize=-1, stdin=sp.DEVNULL, stdout=sp.PIPE,
universal_newlines=True) as mdb_sql:
with sp.Popen(
["mdb-export", filename, table],
bufsize=-1,
stdin=sp.DEVNULL,
stdout=sp.PIPE,
universal_newlines=True,
) as mdb_sql:
mdb_csv = csv.reader(mdb_sql.stdout)
mdb_headers = next(mdb_csv)
quoted_headers = ['"%s"' % h for h in mdb_headers]
joined_headers = ', '.join(quoted_headers)
joined_placemarks = ', '.join(['?' for h in mdb_headers])
joined_headers = ", ".join(quoted_headers)
joined_placemarks = ", ".join(["?" for h in mdb_headers])
insert_stmt = 'INSERT INTO "{0}" ({1}) VALUES ({2});'.format(
table,
joined_headers,
@@ -488,8 +502,10 @@ def mdb_get_data_numeric(s3db, filename, table):
s3db.commit()
except OSError as e:
if e.errno == 2:
raise RuntimeError('Could not locate the `mdb-export` executable. '
'Check that mdbtools is properly installed.')
raise RuntimeError(
"Could not locate the `mdb-export` executable. "
"Check that mdbtools is properly installed."
)
else:
raise

@@ -500,7 +516,9 @@ def mdb_get_data(s3db, filename, table):
elif table in mdb_tables_numeric:
mdb_get_data_numeric(s3db, filename, table)
else:
raise ValueError("'%s' is in neither mdb_tables_text nor mdb_tables_numeric" % table)
raise ValueError(
"'%s' is in neither mdb_tables_text nor mdb_tables_numeric" % table
)


def mdb_get_version(filename):
@@ -510,9 +528,13 @@ def mdb_get_version(filename):
"""
print("Reading version number...")
try:
with sp.Popen(['mdb-export', filename, 'Version_Table'],
bufsize=-1, stdin=sp.DEVNULL, stdout=sp.PIPE,
universal_newlines=True) as mdb_sql:
with sp.Popen(
["mdb-export", filename, "Version_Table"],
bufsize=-1,
stdin=sp.DEVNULL,
stdout=sp.PIPE,
universal_newlines=True,
) as mdb_sql:
mdb_csv = csv.reader(mdb_sql.stdout)
mdb_headers = next(mdb_csv)
mdb_values = next(mdb_csv)
@@ -521,33 +543,53 @@ def mdb_get_version(filename):
except StopIteration:
pass
else:
raise ValueError('Version_Table of %s lists multiple versions' % filename)
raise ValueError(
"Version_Table of %s lists multiple versions" % filename
)
except OSError as e:
if e.errno == 2:
raise RuntimeError('Could not locate the `mdb-export` executable. '
'Check that mdbtools is properly installed.')
raise RuntimeError(
"Could not locate the `mdb-export` executable. "
"Check that mdbtools is properly installed."
)
else:
raise
if 'Version_Schema_Field' not in mdb_headers:
raise ValueError('Version_Table of %s does not contain a Version_Schema_Field column'
% filename)
if "Version_Schema_Field" not in mdb_headers:
raise ValueError(
"Version_Table of %s does not contain a Version_Schema_Field column"
% filename
)
version_fields = dict(zip(mdb_headers, mdb_values))
version_text = version_fields['Version_Schema_Field']
version_match = re.fullmatch('Results File ([.0-9]+)', version_text)
version_text = version_fields["Version_Schema_Field"]
version_match = re.fullmatch("Results File ([.0-9]+)", version_text)
if not version_match:
raise ValueError('File version "%s" did not match expected format' % version_text)
raise ValueError(
'File version "%s" did not match expected format' % version_text
)
version_string = version_match.group(1)
version_tuple = tuple(map(int, version_string.split('.')))
version_tuple = tuple(map(int, version_string.split(".")))
return version_tuple


def convert_arbin_to_sqlite(input_file, output_file):
def convert_arbin_to_sqlite(input_file, output_file=None):
"""Read data from an Arbin .res data file and write to a sqlite file.
Any data currently in an sqlite file at `output_file` will be erased!
Parameters:
input_file (str): The path to the Arbin .res file to read from.
output_file (str or None): The path to the sqlite file to write to; if None,
return a `sqlite3.Connection` into an in-memory database.
Returns:
None or sqlite3.Connection
"""
arbin_version = mdb_get_version(input_file)

if output_file is None:
output_file = ":memory:"

s3db = sqlite3.connect(output_file)

tables_to_convert = copy(mdb_tables)
@@ -572,17 +614,24 @@ def convert_arbin_to_sqlite(input_file, output_file):
print("Vacuuming database...")
s3db.executescript("VACUUM; ANALYZE;")

if output_file == ":memory:":
return s3db

s3db.close()


def main(argv=None):
parser = argparse.ArgumentParser(
description="Convert Arbin .res files to sqlite3 databases using mdb-export",
)
parser.add_argument('input_file', type=str) # need file name to pass to sp.Popen
parser.add_argument('output_file', type=str) # need file name to pass to sqlite3.connect
parser.add_argument("input_file", type=str) # need file name to pass to sp.Popen
parser.add_argument(
"output_file", type=str
) # need file name to pass to sqlite3.connect

args = parser.parse_args(argv)
convert_arbin_to_sqlite(args.input_file, args.output_file)


if __name__ == '__main__':
if __name__ == "__main__":
main()
28 changes: 0 additions & 28 deletions get_testdata.sh

This file was deleted.

2 changes: 2 additions & 0 deletions requirements.txt
@@ -1 +1,3 @@
# SPDX-FileCopyrightText: 2017 Christopher Kerr <chris.kerr@mykolab.ch>
# SPDX-License-Identifier: CC0-1.0
numpy
8 changes: 8 additions & 0 deletions setup.cfg
@@ -0,0 +1,8 @@
# SPDX-FileCopyrightText: 2021 Christopher Kerr
# SPDX-License-Identifier: CC0-1.0
[metadata]
# N.B. The MIT-licensed CodeQL file and the CC0-licensed
# config files are not included in the .whl package so
# their licenses do not need to be packaged either.
license_files =
LICENSES/GPL-3.0-or-later.txt
45 changes: 24 additions & 21 deletions setup.py
@@ -1,38 +1,41 @@
# -*- coding: utf-8 -*-
# SPDX-FileCopyrightText: 2014-2020 Christopher Kerr <chris.kerr@mykolab.ch>
#
# SPDX-License-Identifier: GPL-3.0-or-later

import os.path

from setuptools import setup

with open(os.path.join(os.path.dirname(__file__), 'README.md')) as f:
with open(os.path.join(os.path.dirname(__file__), "README.md")) as f:
readme = f.read()

setup(
name='galvani',
version='0.2.0',
description='Open and process battery charger log data files',
name="galvani",
version="0.4.1",
description="Open and process battery charger log data files",
long_description=readme,
long_description_content_type="text/markdown",
url='https://github.com/echemdata/galvani',
author='Chris Kerr',
author_email='chris.kerr@mykolab.ch',
license='GPLv3+',
url="https://github.com/echemdata/galvani",
author="Chris Kerr",
author_email="chris.kerr@mykolab.ch",
license="GPLv3+",
classifiers=[
'Development Status :: 4 - Beta',
'Intended Audience :: Developers',
'Intended Audience :: Science/Research',
'License :: OSI Approved :: GNU General Public License v3 or later (GPLv3+)',
'Natural Language :: English',
'Programming Language :: Python :: 3 :: Only',
'Topic :: Scientific/Engineering :: Chemistry',
"Development Status :: 4 - Beta",
"Intended Audience :: Developers",
"Intended Audience :: Science/Research",
"License :: OSI Approved :: GNU General Public License v3 or later (GPLv3+)",
"Natural Language :: English",
"Programming Language :: Python :: 3 :: Only",
"Topic :: Scientific/Engineering :: Chemistry",
],
packages=['galvani'],
packages=["galvani"],
entry_points={
'console_scripts': [
'res2sqlite = galvani.res2sqlite:main',
"console_scripts": [
"res2sqlite = galvani.res2sqlite:main",
],
},
python_requires='>=3.6',
install_requires=['numpy'],
tests_require=['pytest'],
python_requires=">=3.6",
install_requires=["numpy"],
tests_require=["pytest"],
)
8 changes: 6 additions & 2 deletions tests/conftest.py
@@ -1,11 +1,15 @@
"""Helpers for pytest tests."""

# SPDX-FileCopyrightText: 2019 Christopher Kerr <chris.kerr@mykolab.ch>
#
# SPDX-License-Identifier: GPL-3.0-or-later

import os

import pytest


@pytest.fixture(scope='session')
@pytest.fixture(scope="session")
def testdata_dir():
"""Path to the testdata directory."""
return os.path.join(os.path.dirname(__file__), 'testdata')
return os.path.join(os.path.dirname(__file__), "testdata")
57 changes: 39 additions & 18 deletions tests/test_Arbin.py
@@ -1,5 +1,9 @@
"""Tests for loading Arbin .res files."""

# SPDX-FileCopyrightText: 2019-2020 Christopher Kerr <chris.kerr@mykolab.ch>
#
# SPDX-License-Identifier: GPL-3.0-or-later

import os
import sqlite3
import subprocess
@@ -9,48 +13,65 @@
from galvani import res2sqlite


have_mdbtools = (subprocess.call(['which', 'mdb-export'],
stdout=subprocess.DEVNULL) == 0)
have_mdbtools = subprocess.call(["which", "mdb-export"], stdout=subprocess.DEVNULL) == 0


def test_res2sqlite_help():
"""Test running `res2sqlite --help`.
This should work even when mdbtools is not installed.
"""
help_output = subprocess.check_output(['res2sqlite', '--help'])
assert b'Convert Arbin .res files to sqlite3 databases' in help_output
help_output = subprocess.check_output(["res2sqlite", "--help"])
assert b"Convert Arbin .res files to sqlite3 databases" in help_output


@pytest.mark.skipif(have_mdbtools, reason='This tests the failure when mdbtools is not installed')
@pytest.mark.skipif(
have_mdbtools, reason="This tests the failure when mdbtools is not installed"
)
def test_convert_Arbin_no_mdbtools(testdata_dir, tmpdir):
"""Checks that the conversion fails with an appropriate error message."""
res_file = os.path.join(testdata_dir, 'arbin1.res')
sqlite_file = os.path.join(str(tmpdir), 'arbin1.s3db')
with pytest.raises(RuntimeError, match="Could not locate the `mdb-export` executable."):
res_file = os.path.join(testdata_dir, "arbin1.res")
sqlite_file = os.path.join(str(tmpdir), "arbin1.s3db")
with pytest.raises(
RuntimeError, match="Could not locate the `mdb-export` executable."
):
res2sqlite.convert_arbin_to_sqlite(res_file, sqlite_file)


@pytest.mark.skipif(not have_mdbtools, reason='Reading the Arbin file requires MDBTools')
@pytest.mark.parametrize('basename', ['arbin1', 'UM34_Test005E'])
@pytest.mark.skipif(
not have_mdbtools, reason="Reading the Arbin file requires MDBTools"
)
@pytest.mark.parametrize("basename", ["arbin1", "UM34_Test005E"])
def test_convert_Arbin_to_sqlite_function(testdata_dir, tmpdir, basename):
"""Convert an Arbin file to SQLite using the functional interface."""
res_file = os.path.join(testdata_dir, basename + '.res')
sqlite_file = os.path.join(str(tmpdir), basename + '.s3db')
res_file = os.path.join(testdata_dir, basename + ".res")
sqlite_file = os.path.join(str(tmpdir), basename + ".s3db")
res2sqlite.convert_arbin_to_sqlite(res_file, sqlite_file)
assert os.path.isfile(sqlite_file)
with sqlite3.connect(sqlite_file) as conn:
csr = conn.execute('SELECT * FROM Channel_Normal_Table;')
csr = conn.execute("SELECT * FROM Channel_Normal_Table;")
csr.fetchone()


@pytest.mark.skipif(
not have_mdbtools, reason="Reading the Arbin file requires MDBTools"
)
@pytest.mark.parametrize("basename", ["arbin1", "UM34_Test005E"])
def test_convert_Arbin_to_sqlite_function_in_memory(testdata_dir, tmpdir, basename):
"""Convert an Arbin file to an in-memory SQLite database."""
res_file = os.path.join(testdata_dir, basename + ".res")
with res2sqlite.convert_arbin_to_sqlite(res_file) as conn:
csr = conn.execute("SELECT * FROM Channel_Normal_Table;")
csr.fetchone()


@pytest.mark.skipif(not have_mdbtools, reason='Reading the Arbin file requires MDBTools')
@pytest.mark.skipif(
not have_mdbtools, reason="Reading the Arbin file requires MDBTools"
)
def test_convert_cmdline(testdata_dir, tmpdir):
"""Checks that the conversion fails with an appropriate error message."""
res_file = os.path.join(testdata_dir, 'arbin1.res')
sqlite_file = os.path.join(str(tmpdir), 'arbin1.s3db')
subprocess.check_call(['res2sqlite', res_file, sqlite_file])
res_file = os.path.join(testdata_dir, "arbin1.res")
sqlite_file = os.path.join(str(tmpdir), "arbin1.s3db")
subprocess.check_call(["res2sqlite", res_file, sqlite_file])
assert os.path.isfile(sqlite_file)
with sqlite3.connect(sqlite_file) as conn:
csr = conn.execute('SELECT * FROM Channel_Normal_Table;')
csr = conn.execute("SELECT * FROM Channel_Normal_Table;")
csr.fetchone()
307 changes: 238 additions & 69 deletions tests/test_BioLogic.py

Large diffs are not rendered by default.

3 changes: 3 additions & 0 deletions tests/testdata/020-formation_CB5.mpr
Git LFS file not shown
2 changes: 2 additions & 0 deletions tests/testdata/020-formation_CB5.mpr.license
@@ -0,0 +1,2 @@
SPDX-FileCopyrightText: Chihyu Chen <chihyu.chen@molicel.com>
SPDX-License-Identifier: CC-BY-4.0
3 changes: 3 additions & 0 deletions tests/testdata/020-formation_CB5.mpt
Git LFS file not shown
2 changes: 2 additions & 0 deletions tests/testdata/020-formation_CB5.mpt.license
@@ -0,0 +1,2 @@
SPDX-FileCopyrightText: Chihyu Chen <chihyu.chen@molicel.com>
SPDX-License-Identifier: CC-BY-4.0
3 changes: 3 additions & 0 deletions tests/testdata/121_CA_455nm_6V_30min_C01.mpr
Git LFS file not shown
3 changes: 3 additions & 0 deletions tests/testdata/121_CA_455nm_6V_30min_C01.mpt
Git LFS file not shown
3 changes: 3 additions & 0 deletions tests/testdata/C019P-0ppb-A_C01.mpr
Git LFS file not shown
3 changes: 3 additions & 0 deletions tests/testdata/CV_C01.mpr
Git LFS file not shown
3 changes: 3 additions & 0 deletions tests/testdata/CV_C01.mpt
Git LFS file not shown
3 changes: 3 additions & 0 deletions tests/testdata/EIS_latin1.mpt
Git LFS file not shown
3 changes: 3 additions & 0 deletions tests/testdata/Ewe_Error.mpr
Git LFS file not shown
2 changes: 2 additions & 0 deletions tests/testdata/Ewe_Error.mpr.license
@@ -0,0 +1,2 @@
SPDX-FileCopyrightText: Danzi Federico
SPDX-License-Identifier: CC-BY-4.0
3 changes: 3 additions & 0 deletions tests/testdata/Rapp_Error.mpr
Git LFS file not shown
2 changes: 2 additions & 0 deletions tests/testdata/Rapp_Error.mpr.license
@@ -0,0 +1,2 @@
SPDX-FileCopyrightText: Danzi Federico
SPDX-License-Identifier: CC-BY-4.0
3 changes: 3 additions & 0 deletions tests/testdata/UM34_Test005E.res
Git LFS file not shown
2 changes: 2 additions & 0 deletions tests/testdata/UM34_Test005E.res.license
@@ -0,0 +1,2 @@
SPDX-FileCopyrightText: Nikhil Shetty
SPDX-License-Identifier: CC-BY-4.0
3 changes: 3 additions & 0 deletions tests/testdata/arbin1.res
Git LFS file not shown
3 changes: 3 additions & 0 deletions tests/testdata/bio_logic1.mpr
Git LFS file not shown
3 changes: 3 additions & 0 deletions tests/testdata/bio_logic1.mpt
Git LFS file not shown
3 changes: 3 additions & 0 deletions tests/testdata/bio_logic2.mpr
Git LFS file not shown
3 changes: 3 additions & 0 deletions tests/testdata/bio_logic2.mpt
Git LFS file not shown
3 changes: 3 additions & 0 deletions tests/testdata/bio_logic3.mpr
Git LFS file not shown
3 changes: 3 additions & 0 deletions tests/testdata/bio_logic4.mpr
Git LFS file not shown
3 changes: 3 additions & 0 deletions tests/testdata/bio_logic4.mpt
Git LFS file not shown
3 changes: 3 additions & 0 deletions tests/testdata/bio_logic5.mpr
Git LFS file not shown
3 changes: 3 additions & 0 deletions tests/testdata/bio_logic5.mpt
Git LFS file not shown
3 changes: 3 additions & 0 deletions tests/testdata/bio_logic6.mpr
Git LFS file not shown
3 changes: 3 additions & 0 deletions tests/testdata/bio_logic6.mpt
Git LFS file not shown
3 changes: 3 additions & 0 deletions tests/testdata/col_27_issue_74.mpr
Git LFS file not shown
3 changes: 3 additions & 0 deletions tests/testdata/v1150/v1150_CA.mpr
Git LFS file not shown
3 changes: 3 additions & 0 deletions tests/testdata/v1150/v1150_CA.mpt
Git LFS file not shown
3 changes: 3 additions & 0 deletions tests/testdata/v1150/v1150_CP.mpr
Git LFS file not shown
3 changes: 3 additions & 0 deletions tests/testdata/v1150/v1150_CP.mpt
Git LFS file not shown
3 changes: 3 additions & 0 deletions tests/testdata/v1150/v1150_GCPL.mpr
Git LFS file not shown
3 changes: 3 additions & 0 deletions tests/testdata/v1150/v1150_GCPL.mpt
Git LFS file not shown
3 changes: 3 additions & 0 deletions tests/testdata/v1150/v1150_GEIS.mpr
Git LFS file not shown
3 changes: 3 additions & 0 deletions tests/testdata/v1150/v1150_GEIS.mpt
Git LFS file not shown
3 changes: 3 additions & 0 deletions tests/testdata/v1150/v1150_MB.mpr
Git LFS file not shown
3 changes: 3 additions & 0 deletions tests/testdata/v1150/v1150_MB.mpt
Git LFS file not shown
3 changes: 3 additions & 0 deletions tests/testdata/v1150/v1150_OCV.mpr
Git LFS file not shown
3 changes: 3 additions & 0 deletions tests/testdata/v1150/v1150_OCV.mpt
Git LFS file not shown
3 changes: 3 additions & 0 deletions tests/testdata/v1150/v1150_PEIS.mpr
Git LFS file not shown
3 changes: 3 additions & 0 deletions tests/testdata/v1150/v1150_PEIS.mpt
Git LFS file not shown
13 changes: 12 additions & 1 deletion tox.ini
@@ -1,13 +1,24 @@
# SPDX-FileCopyrightText: 2017-2021 Christopher Kerr <chris.kerr@mykolab.ch>
# SPDX-License-Identifier: GPL-3.0-or-later
[tox]
envlist = py38,py39,py310,py311
[testenv]
deps =
flake8
reuse
pytest
commands =
flake8
reuse lint
pytest

[flake8]
exclude = build,dist,*.egg-info,.cache,.git,.tox,__pycache__
max-line-length = 100

[gh]
python =
3.11 = py311
3.10 = py310
3.9 = py39
3.8 = py38