Merge pull request #28 from yarikoptic/enh-codespell
codespell: add config and action to codespell the code to avoid known typos
yarikoptic authored Sep 5, 2023
2 parents be3a8b3 + 673df9f commit 6d9e380
Showing 11 changed files with 43 additions and 15 deletions.
6 changes: 6 additions & 0 deletions .codespellrc
@@ -0,0 +1,6 @@
[codespell]
skip = .git,*.pdf,*.svg
# all images embedded in .ipynb jsons
ignore-regex = "image/png": ".*
# nd - people just like it
ignore-words-list = nd
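The `ignore-regex` entry above makes codespell skip any line matching the pattern, so the base64 PNG payloads embedded in `.ipynb` JSON are not spellchecked. A minimal sketch of what that pattern matches, using plain Python `re` with hypothetical notebook lines:

```python
import re

# The pattern from .codespellrc: once an embedded PNG key appears,
# the rest of the line is base64 data, not prose worth spellchecking.
ignore = re.compile(r'"image/png": ".*')

payload = '"image/png": "iVBORw0KGgoAAAANSUhEUg..."'   # hypothetical image line
prose = '"source": ["## Read Data and Meta-data"]'      # hypothetical prose line

print(bool(ignore.search(payload)))  # image payload line is ignored
print(bool(ignore.search(prose)))    # prose line is still spellchecked
```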
22 changes: 22 additions & 0 deletions .github/workflows/codespell.yml
@@ -0,0 +1,22 @@
---
name: Codespell

on:
push:
branches: [master]
pull_request:
branches: [master]

permissions:
contents: read

jobs:
codespell:
name: Check for spelling errors
runs-on: ubuntu-latest

steps:
- name: Checkout
uses: actions/checkout@v3
- name: Codespell
uses: codespell-project/actions-codespell@v2
4 changes: 2 additions & 2 deletions 000004/RutishauserLab/000004_demo_analysis.ipynb
@@ -86,7 +86,7 @@
"## Read Data and Meta-data\n",
"Here, we read the data and meta-data from the specified NWB file using the NWB read utility. \n",
"\n",
-"The NWB file is composed of various Groups, Datasets, and Attributes. The data and cooresponding meta-data are encapsulated within these Groups. The data are thus organized according to these Groups. We can also read the data and meta-data within these Groups, and visualize the components within NWB file via the *nwb2widget* utility -- the following illustrates this process:"
+"The NWB file is composed of various Groups, Datasets, and Attributes. The data and corresponding meta-data are encapsulated within these Groups. The data are thus organized according to these Groups. We can also read the data and meta-data within these Groups, and visualize the components within NWB file via the *nwb2widget* utility -- the following illustrates this process:"
]
},
{
@@ -140,7 +140,7 @@
"source": [
"## Extracting and Plotting the Mean Waveform(s)\n",
"\n",
-"To extract the mean waveform, we simply call waveform_mean_encoding from the \\units table -- *nwb.units['waveform_mean_encoding']*. The brain area of each of the electrodes is located within the \\electrodes table -- *nwb.electrodes['location']*. To see the relationship bewteen the \\units and \\electrodes table, see **Figure 2b** in our data descriptor. "
+"To extract the mean waveform, we simply call waveform_mean_encoding from the \\units table -- *nwb.units['waveform_mean_encoding']*. The brain area of each of the electrodes is located within the \\electrodes table -- *nwb.electrodes['location']*. To see the relationship between the \\units and \\electrodes table, see **Figure 2b** in our data descriptor. "
]
},
{
2 changes: 1 addition & 1 deletion 000055/BruntonLab/peterson21/plot_utils.py
@@ -543,7 +543,7 @@ def _plot_electrodes(
5,
6,
5,
-] # different sized subplot to make saggital view similar size to other two slices
+] # different sized subplot to make sagittal view similar size to other two slices
current_col = 0
total_colspans = int(np.sum(np.asarray(colspans)))
for ind, colspan in enumerate(colspans):
6 changes: 3 additions & 3 deletions 000055/BruntonLab/peterson21/spec_utils.py
@@ -11,7 +11,7 @@ def _calc_dens_norm_factor(elec_locs, headGrid, projectionParameter):
"""Calculate the factors (scalar values, each for a electrode) that normalize the elecrode
projected density inside brain volume (makes its sum to be equal to one)."""

-# create a parameter set but withough normalization
+# create a parameter set but without normalization
newProjectionParemeter = projectionParameter.copy()
newProjectionParemeter["normalizeInBrainDipoleDenisty"] = False

@@ -99,7 +99,7 @@ def _getProjectionMatrix(
-dist_elec_gridlocs[dipoleNumber, :] ** 2 / (2 * sd_est_err_pow2)
)

-# truncate the dipole denisty Gaussian at ~3 standard deviation
+# truncate the dipole density Gaussian at ~3 standard deviation
gaussianWeightMatrix[
dipoleNumber,
dist_elec_gridlocs[dipoleNumber, :]
@@ -109,7 +109,7 @@
),
] = 0

-# normalize the dipole in-brain denisty (make it sum up to one)
+# normalize the dipole in-brain density (make it sum up to one)
if projectionParameter["normalizeInBrainDipoleDenisty"]:
gaussianWeightMatrix[dipoleNumber, :] = (
gaussianWeightMatrix[dipoleNumber, :]
6 changes: 3 additions & 3 deletions 000402/MICrONS/demo/000402_microns_demo.ipynb
@@ -923,7 +923,7 @@
"\n",
"The movie data is stored in an `ImageSeries` object.\n",
"\n",
-"The frames are stored in a 4D array where the first dimension is time (frame), the second and third dimenion represents the size of the image and the last dimension are the RGB channels."
+"The frames are stored in a 4D array where the first dimension is time (frame), the second and third dimension represents the size of the image and the last dimension are the RGB channels."
]
},
{
@@ -1207,7 +1207,7 @@
}
},
"source": [
-"The traces are stored in a 2D array where the first dimension is time, the second dimenion is the number of ROIs for this field."
+"The traces are stored in a 2D array where the first dimension is time, the second dimension is the number of ROIs for this field."
]
},
{
@@ -2097,7 +2097,7 @@
"source": [
"# Access the treadmill velocity\n",
"treadmill_velocity = nwbfile.acquisition[\"treadmill_velocity\"]\n",
-"# Acess the timestamps for the treadmill velocity\n",
+"# Access the timestamps for the treadmill velocity\n",
"treadmill_timestamps = treadmill_velocity.timestamps[:]\n",
" \n",
"treadmill_velocity"
2 changes: 1 addition & 1 deletion dandi/DANDI User Guide, Part I.ipynb
@@ -173,7 +173,7 @@
"source": [
"# Uploading to DANDI\n",
"\n",
-"When you register a dandiset, it creates a permenant ID. For instructional purposes, we will be using a staging version of DANDI, so that we do not create real IDs for pretend datasets.\n",
+"When you register a dandiset, it creates a permanent ID. For instructional purposes, we will be using a staging version of DANDI, so that we do not create real IDs for pretend datasets.\n",
"\n",
"We are going to use a staging version of DANDI. \n",
"\n",
4 changes: 2 additions & 2 deletions dandi/DANDI User Guide, Part II.ipynb
@@ -313,7 +313,7 @@
"\n",
"3. (\"optional\"- bonus point) What is different between draft and published (0.210831.2033) version of the dandiset?\n",
"\n",
-    *Hint:* `diff -Naur folder1/ folder2/` could be used in the Terminal to find an anwser.\n",
+    *Hint:* `diff -Naur folder1/ folder2/` could be used in the Terminal to find an answer.\n",
"\n",
"\n",
"4. Download the `sub-anm369962/` folder from `000006` dandiset. "
@@ -386,7 +386,7 @@
"source": [
"## dandi Python library\n",
"\n",
-"The `dandi` command line interface we have practiced with above is a part ofthe `dandi` Python package, which also provides Python interfaces to interact with any instance of the DANDI archive (*hint*: the `dandi instances` command will list known instances of the archive).\n",
+"The `dandi` command line interface we have practiced with above is a part of the `dandi` Python package, which also provides Python interfaces to interact with any instance of the DANDI archive (*hint*: the `dandi instances` command will list known instances of the archive).\n",
"\n",
"In the previous section you already used the library in the following Python code snippet:\n",
"\n",
2 changes: 1 addition & 1 deletion tutorials/cosyne_2020/NWB_tutorial_2019.ipynb
@@ -141,7 +141,7 @@
"metadata": {},
"source": [
"## Electrodes table\n",
-"Extracellular electrodes are stored in a `electrodes`, which is a `DynamicTable`. `electrodes` has several required fields: x, y, z, impedence, location, filtering, and electrode_group. Here, we also demonstate how to add optional columns to a table by adding the `'label'` column.<img src=\"images/electrodes_table.png\" width=\"300\">"
+"Extracellular electrodes are stored in a `electrodes`, which is a `DynamicTable`. `electrodes` has several required fields: x, y, z, impedance, location, filtering, and electrode_group. Here, we also demonstate how to add optional columns to a table by adding the `'label'` column.<img src=\"images/electrodes_table.png\" width=\"300\">"
]
},
{
2 changes: 1 addition & 1 deletion tutorials/cosyne_2020/NWB_tutorial_2019.m
@@ -68,7 +68,7 @@
%% Electrode table
% Extracellular |electrodes| are stored in a electrodes, which is a
% |DynamicTable|. |electrodes| has several required fields: x, y, z,
-% impedence, location, filtering, and electrode_group. Here, we also
+% impedance, location, filtering, and electrode_group. Here, we also
% demonstate how to add optional columns to a table by adding the |'label'|
% column.

2 changes: 1 addition & 1 deletion tutorials/cosyne_2023/advanced_asset_search.ipynb
@@ -209,7 +209,7 @@
"outputs": [],
"source": [
"from warnings import simplefilter\n",
-"simplefilter(\"ignore\") # Supress namespace warnings from reading older NWB files\n",
+"simplefilter(\"ignore\") # Suppress namespace warnings from reading older NWB files\n",
"\n",
"from nwbinspector.tools import get_s3_urls_and_dandi_paths\n",
"from pynwb import NWBHDF5IO"
