From b41129d0821b5aa2cb6851564678bdac79a761ab Mon Sep 17 00:00:00 2001
From: harshilgajera-crest <69803385+harshilgajera-crest@users.noreply.github.com>
Date: Fri, 26 Apr 2024 16:25:47 +0530
Subject: [PATCH 1/3] docs: migrating to GitHub Pages (#825)

Migrating docs from readthedocs to GitHub Pages. The existing readthedocs
website will be deleted after a couple of weeks.

---------

Co-authored-by: Artem Rys
---
 .github/workflows/build-test-release.yml | 17 -
 .github/workflows/docs.yml | 29 +
 .licenserc.yaml | 1 -
 .readthedocs.yml | 17 -
 README.md | 12 +
 README.rst | 205 ------
 docs/Makefile | 192 -------
 docs/api_reference/addon_basic.rst | 5 -
 docs/api_reference/addon_parser.md | 41 ++
 docs/api_reference/addon_parser.rst | 42 --
 docs/api_reference/api_reference.md | 9 +
 docs/api_reference/api_reference.rst | 26 -
 docs/api_reference/app_test_generator.md | 4 +
 docs/api_reference/app_test_generator.rst | 5 -
 docs/api_reference/cim_tests.md | 57 ++
 docs/api_reference/cim_tests.rst | 70 ---
 docs/api_reference/event_ingestion.md | 26 +
 docs/api_reference/event_ingestion.rst | 26 -
 docs/api_reference/fields_tests.md | 23 +
 docs/api_reference/fields_tests.rst | 23 -
 docs/api_reference/index_time_tests.md | 17 +
 docs/api_reference/index_time_tests.rst | 17 -
 docs/api_reference/sample_generation.md | 23 +
 docs/api_reference/sample_generation.rst | 26 -
 docs/cim_compliance.md | 60 +++
 docs/cim_compliance.rst | 66 ---
 docs/cim_tests.md | 142 +++++
 docs/cim_tests.rst | 151 ------
 docs/common_tests.md | 12 +
 docs/common_tests.rst | 14 -
 docs/conf.py | 182 -------
 docs/field_tests.md | 157 ++++++
 docs/field_tests.rst | 165 ------
 docs/generate_conf.md | 58 ++
 docs/generate_conf.rst | 62 ---
 docs/how_to_use.md | 444 +++++++++++++++
 docs/how_to_use.rst | 339 ------------
 docs/index.md | 28 +
 docs/index.rst | 26 -
 docs/index_time_tests.md | 249 +++++++++
 docs/index_time_tests.rst | 243 ---------
 docs/make.bat | 263 ---------
 docs/overview.rst | 43 --
 docs/requirements.txt | 3 -
 docs/sample_generator.md | 355 ++++++++++++
 docs/sample_generator.rst | 350 ------------
 docs/troubleshoot.md | 44 ++
 docs/troubleshoot.rst | 43 ---
 mkdocs.yml | 76 +++
 poetry.lock | 630 ++++++----------------
 pyproject.toml | 9 -
 requirements.txt | 4 -
 tests/e2e/test_splunk_addon.py | 37 --
 53 files changed, 2023 insertions(+), 3145 deletions(-)
 create mode 100644 .github/workflows/docs.yml
 delete mode 100644 .readthedocs.yml
 create mode 100644 README.md
 delete mode 100644 README.rst
 delete mode 100644 docs/Makefile
 delete mode 100644 docs/api_reference/addon_basic.rst
 create mode 100644 docs/api_reference/addon_parser.md
 delete mode 100644 docs/api_reference/addon_parser.rst
 create mode 100644 docs/api_reference/api_reference.md
 delete mode 100644 docs/api_reference/api_reference.rst
 create mode 100644 docs/api_reference/app_test_generator.md
 delete mode 100644 docs/api_reference/app_test_generator.rst
 create mode 100644 docs/api_reference/cim_tests.md
 delete mode 100644 docs/api_reference/cim_tests.rst
 create mode 100644 docs/api_reference/event_ingestion.md
 delete mode 100644 docs/api_reference/event_ingestion.rst
 create mode 100644 docs/api_reference/fields_tests.md
 delete mode 100644 docs/api_reference/fields_tests.rst
 create mode 100644 docs/api_reference/index_time_tests.md
 delete mode 100644 docs/api_reference/index_time_tests.rst
 create mode 100644 docs/api_reference/sample_generation.md
 delete mode 100644 docs/api_reference/sample_generation.rst
 create mode 100644 docs/cim_compliance.md
 delete mode 100644
docs/cim_compliance.rst create mode 100644 docs/cim_tests.md delete mode 100644 docs/cim_tests.rst create mode 100644 docs/common_tests.md delete mode 100644 docs/common_tests.rst delete mode 100644 docs/conf.py create mode 100644 docs/field_tests.md delete mode 100644 docs/field_tests.rst create mode 100644 docs/generate_conf.md delete mode 100644 docs/generate_conf.rst create mode 100644 docs/how_to_use.md delete mode 100644 docs/how_to_use.rst create mode 100644 docs/index.md delete mode 100644 docs/index.rst create mode 100644 docs/index_time_tests.md delete mode 100644 docs/index_time_tests.rst delete mode 100644 docs/make.bat delete mode 100644 docs/overview.rst delete mode 100644 docs/requirements.txt create mode 100644 docs/sample_generator.md delete mode 100644 docs/sample_generator.rst create mode 100644 docs/troubleshoot.md delete mode 100644 docs/troubleshoot.rst create mode 100644 mkdocs.yml delete mode 100644 requirements.txt diff --git a/.github/workflows/build-test-release.yml b/.github/workflows/build-test-release.yml index e10049e33..802e0bda7 100644 --- a/.github/workflows/build-test-release.yml +++ b/.github/workflows/build-test-release.yml @@ -83,21 +83,6 @@ jobs: poetry install poetry run pytest -v tests/unit - test-splunk-doc: - name: Test Docs - runs-on: ubuntu-latest - steps: - - uses: actions/checkout@v4 - with: - submodules: true - - uses: actions/setup-python@v5 - with: - python-version: 3.7 - - name: Install and run tests - run: | - curl -sSL https://install.python-poetry.org | python3 - --version 1.5.1 - poetry install --with docs - poetry run pytest -v -m doc tests/e2e test-splunk-external: runs-on: ubuntu-latest @@ -106,7 +91,6 @@ jobs: - pre-commit - fossa-scan - compliance-copyrights - - test-splunk-doc - test-splunk-unit strategy: fail-fast: false @@ -154,7 +138,6 @@ jobs: - pre-commit - fossa-scan - compliance-copyrights - - test-splunk-doc - test-splunk-unit runs-on: ubuntu-latest strategy: diff --git a/.github/workflows/docs.yml b/.github/workflows/docs.yml new file mode 100644 index 000000000..4fb0da166 --- /dev/null +++ b/.github/workflows/docs.yml @@ -0,0 +1,29 @@ +name: docs +on: + push: + branches: + - main + pull_request: + branches: + - main + +jobs: + docs: + runs-on: ubuntu-latest + permissions: + contents: write + pages: write + steps: + - uses: actions/checkout@v4 + - uses: actions/setup-python@v5 + with: + python-version: 3.8 + - run: | + curl -sSL https://install.python-poetry.org | python3 - --version 1.5.1 + pip install mkdocs mkdocs-material mkdocstrings-python + - name: Deploy to GitHub Pages + if: github.ref_name == 'main' + run: mkdocs gh-deploy --force --strict + - name: Build Docs + if: github.ref_name != 'main' + run: mkdocs build diff --git a/.licenserc.yaml b/.licenserc.yaml index f001b7259..0b48976ba 100644 --- a/.licenserc.yaml +++ b/.licenserc.yaml @@ -37,4 +37,3 @@ header: - "renovate.json" - "pytest_splunk_addon/.ignore_splunk_internal_errors" - "pytest_splunk_addon/docker_class.py" - - "requirements.txt" diff --git a/.readthedocs.yml b/.readthedocs.yml deleted file mode 100644 index 513b627d1..000000000 --- a/.readthedocs.yml +++ /dev/null @@ -1,17 +0,0 @@ -# .readthedocs.yml -# Read the Docs configuration file -# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details - -# Required -version: 2 - -submodules: - exclude: all - -# Set the version of Python and requirements required to build your docs -python: - version: 3.7 - install: - - requirements: docs/requirements.txt - - method: pip - path: . 
diff --git a/README.md b/README.md new file mode 100644 index 000000000..8d2496aca --- /dev/null +++ b/README.md @@ -0,0 +1,12 @@ +# pytest-splunk-addon + +![PyPI](https://img.shields.io/pypi/v/pytest-splunk-addon) +![Python](https://img.shields.io/pypi/pyversions/pytest-splunk-addon.svg) + +## What is PSA + +A Dynamic test tool for Splunk Apps and Add-ons. + +## Usage + +For full usage instructions, please visit the [documentation](https://splunk.github.io/pytest-splunk-addon). diff --git a/README.rst b/README.rst deleted file mode 100644 index 94157e522..000000000 --- a/README.rst +++ /dev/null @@ -1,205 +0,0 @@ -=================== -pytest-splunk-addon -=================== - -.. image:: https://img.shields.io/pypi/v/pytest-splunk-addon.svg - :target: https://pypi.org/project/pytest-splunk-addon - :alt: PyPI version - -.. image:: https://img.shields.io/pypi/pyversions/pytest-splunk-addon.svg - :target: https://pypi.org/project/pytest-splunk-addon - :alt: Python versions - - -A Dynamic test tool for Splunk Apps and Add-ons - -Documentation ---------------- -The detailed documentation for pytest-splunk-addon can be found here : ``_ - -Features --------- - -* Generate tests for Splunk Knowledge objects in your Splunk Technology Add-ons -* Validate your add-ons using Splunk + Docker and this test tool - - -Requirements ------------- - -* Docker or an external single instance Splunk deployment - - -Installation ------------- - -You can install "pytest-splunk-addon" via `pip`_ from `PyPI`_:: - - $ pip install pytest-splunk-addon - -Developing ------------- - -Note: Must install docker desktop. - -.. code:: bash - - $ git clone --recurse-submodules -j8 git@github.com:splunk/pytest-splunk-addon.git - $ cd pytest-splunk-addon - $ poetry install - $ ... (change something) - # run unit tests - $ poetry run pytest tests/unit - # run some of the docker-based tests to verify end-to-end behaviour, example: - $ poetry run pytest -v --splunk-version=8.2 -m docker tests/test_splunk_addon.py::test_splunk_app_requirements_modinput - - -Usage ------ - -Installation for external Splunk - -.. code:: bash - - pip install pytest-splunk-addon - -Installation with built in docker orchestration - -.. code:: bash - - pip install pytest-splunk-addon[docker] - - -Basic project structure - -The tool assumes the Splunk Add-on is located in a folder "package" in the project root - -Triggering the tool: - -Create a test file in the tests folder - -.. code:: python3 - - from pytest_splunk_addon.standard_lib.addon_basic import Basic - - - class Test_App(Basic): - def empty_method(): - pass - -Create a Dockerfile-splunk file - -.. code:: Dockerfile - - ARG SPLUNK_VERSION=latest - FROM splunk/splunk:$SPLUNK_VERSION - ARG SPLUNK_APP=TA_UNKNOWN - ARG SPLUNK_APP_PACKAGE=package - COPY deps/apps /opt/splunk/etc/apps/ - - COPY $SPLUNK_APP_PACKAGE /opt/splunk/etc/apps/$SPLUNK_APP - - -Create a docker-compose.yml update the value of SPLUNK_APP - -.. code:: yaml - - version: "3.7" - services: - splunk: - build: - context: . - dockerfile: Dockerfile-splunk - args: - - SPLUNK_APP=xxxxxxx - ports: - - "8000" - - "8089" - environment: - - SPLUNK_PASSWORD=Changed@11 - - SPLUNK_START_ARGS=--accept-license - -Run pytest with the add-on and SA-eventgen installed and enabled in an external Splunk deployment - -.. 
code::: bash - - pytest \ - --splunk-type=external \ - --splunk-type=external \ - --splunk-host=splunk \ - --splunk-port=8089 \ - --splunk-password=Changed@11 \ - -v - -Run pytest with the add-on and SA-eventgen installed and enabled in docker - -.. code::: bash - - pytest \ - --splunk-password=Changed@11 \ - -v - -For full usage instructions, please visit the `pytest-splunk-addon documentation pages over at readthedocs`_. - -Run e2e tests locally ---------------------- - -* For e2e tests we are using a functionality of pytest which creates a temp dir and copies all the required file to that dir and then runs the pytest cmd from the tests. -* e2e tests can be found under /tests/e2e - -Prerequisites: - -* Docker version: 25.0.3 -* Docker Compose version: v2.24.6-desktop.1 - -.. code:: bash - - $ git clone --recurse-submodules -j8 git@github.com:splunk/pytest-splunk-addon.git - $ cd pytest-splunk-addon - $ poetry install - $ poetry run pytest -v --splunk-version=${splunk-version} -m docker -m ${test-marker} tests/e2e - -Troubleshooting: - -1. If you face an error like this: - - argparse.ArgumentError: argument -K/--keepalive: conflicting option strings: -K, --keepalive - - * This is likely to happen if you have older version of PSA requirements installed, to solve this try to uninstall lovely-pytest-docker and pull the latest main branch and then do `poetry install` - -2. If while running the tests you face an exception like this: - - `Exception: Command ['docker', 'compose', '-f', '/docker-compose.yml', '-p', '', 'down', '-v'] returned 125: """unknown shorthand flag: 'f' in -f` - - * This happens due to misconfigurations in docker, try to follow below steps: - * sudo mkdir -p /usr/local/lib/docker - * sudo ln -s /Applications/Docker.app/Contents/Resources/cli-plugins /usr/local/lib/docker/cli-plugins - -3. If you face error like this: - - ERROR: no match for platform in manifest: not found - - * Try adding platform: `linux/amd64` to docker-compose.yml file - -Contributing ------------- -Contributions are very welcome. Tests can be run with `pytest`_, please ensure -the coverage at least stays the same before you submit a pull request. - -License -------- - -Distributed under the terms of the `Apache Software License 2.0`_ license, "pytest-splunk-addon" is free and open source software - - -Issues ------- - -If you encounter any problems, please `file an issue`_ along with a detailed description. - -.. _`pytest-splunk-addon documentation pages over at readthedocs`: https://pytest-splunk-addon.readthedocs.io/en/latest/ -.. _`Apache Software License 2.0`: http://www.apache.org/licenses/LICENSE-2.0 -.. _`file an issue`: https://github.com/splunk/pytest-splunk-addon/issues -.. _`pytest`: https://github.com/pytest-dev/pytest -.. _`pip`: https://pypi.org/project/pip/ -.. _`PyPI`: https://pypi.org/project diff --git a/docs/Makefile b/docs/Makefile deleted file mode 100644 index 2d87ff4fe..000000000 --- a/docs/Makefile +++ /dev/null @@ -1,192 +0,0 @@ -# Makefile for Sphinx documentation -# - -# You can set these variables from the command line. -SPHINXOPTS = -SPHINXBUILD = sphinx-build -PAPER = -BUILDDIR = _build - -# User-friendly check for sphinx-build -ifeq ($(shell which $(SPHINXBUILD) >/dev/null 2>&1; echo $$?), 1) -$(error The '$(SPHINXBUILD)' command was not found. Make sure you have Sphinx installed, then set the SPHINXBUILD environment variable to point to the full path of the '$(SPHINXBUILD)' executable. Alternatively you can add the directory with the executable to your PATH. 
If you don't have Sphinx installed, grab it from http://sphinx-doc.org/) -endif - -# Internal variables. -PAPEROPT_a4 = -D latex_paper_size=a4 -PAPEROPT_letter = -D latex_paper_size=letter -ALLSPHINXOPTS = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) . -# the i18n builder cannot share the environment and doctrees with the others -I18NSPHINXOPTS = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) . - -.PHONY: help clean html dirhtml singlehtml pickle json htmlhelp qthelp devhelp epub latex latexpdf text man changes linkcheck doctest coverage gettext - -help: - @echo "Please use \`make ' where is one of" - @echo " html to make standalone HTML files" - @echo " dirhtml to make HTML files named index.html in directories" - @echo " singlehtml to make a single large HTML file" - @echo " pickle to make pickle files" - @echo " json to make JSON files" - @echo " htmlhelp to make HTML files and a HTML help project" - @echo " qthelp to make HTML files and a qthelp project" - @echo " applehelp to make an Apple Help Book" - @echo " devhelp to make HTML files and a Devhelp project" - @echo " epub to make an epub" - @echo " latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter" - @echo " latexpdf to make LaTeX files and run them through pdflatex" - @echo " latexpdfja to make LaTeX files and run them through platex/dvipdfmx" - @echo " text to make text files" - @echo " man to make manual pages" - @echo " texinfo to make Texinfo files" - @echo " info to make Texinfo files and run them through makeinfo" - @echo " gettext to make PO message catalogs" - @echo " changes to make an overview of all changed/added/deprecated items" - @echo " xml to make Docutils-native XML files" - @echo " pseudoxml to make pseudoxml-XML files for display purposes" - @echo " linkcheck to check all external links for integrity" - @echo " doctest to run all doctests embedded in the documentation (if enabled)" - @echo " coverage to run coverage check of the documentation (if enabled)" - -clean: - rm -rf $(BUILDDIR)/* - -html: - $(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html - @echo - @echo "Build finished. The HTML pages are in $(BUILDDIR)/html." - -dirhtml: - $(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml - @echo - @echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml." - -singlehtml: - $(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml - @echo - @echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml." - -pickle: - $(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle - @echo - @echo "Build finished; now you can process the pickle files." - -json: - $(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json - @echo - @echo "Build finished; now you can process the JSON files." - -htmlhelp: - $(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp - @echo - @echo "Build finished; now you can run HTML Help Workshop with the" \ - ".hhp project file in $(BUILDDIR)/htmlhelp." - -qthelp: - $(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp - @echo - @echo "Build finished; now you can run "qcollectiongenerator" with the" \ - ".qhcp project file in $(BUILDDIR)/qthelp, like this:" - @echo "# qcollectiongenerator $(BUILDDIR)/qthelp/pytest-cookiecutterplugin_name.qhcp" - @echo "To view the help file:" - @echo "# assistant -collectionFile $(BUILDDIR)/qthelp/pytest-cookiecutterplugin_name.qhc" - -applehelp: - $(SPHINXBUILD) -b applehelp $(ALLSPHINXOPTS) $(BUILDDIR)/applehelp - @echo - @echo "Build finished. The help book is in $(BUILDDIR)/applehelp." 
- @echo "N.B. You won't be able to view it unless you put it in" \ - "~/Library/Documentation/Help or install it in your application" \ - "bundle." - -devhelp: - $(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp - @echo - @echo "Build finished." - @echo "To view the help file:" - @echo "# mkdir -p $$HOME/.local/share/devhelp/pytest-cookiecutterplugin_name" - @echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/pytest-cookiecutterplugin_name" - @echo "# devhelp" - -epub: - $(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub - @echo - @echo "Build finished. The epub file is in $(BUILDDIR)/epub." - -latex: - $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex - @echo - @echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex." - @echo "Run \`make' in that directory to run these through (pdf)latex" \ - "(use \`make latexpdf' here to do that automatically)." - -latexpdf: - $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex - @echo "Running LaTeX files through pdflatex..." - $(MAKE) -C $(BUILDDIR)/latex all-pdf - @echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex." - -latexpdfja: - $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex - @echo "Running LaTeX files through platex and dvipdfmx..." - $(MAKE) -C $(BUILDDIR)/latex all-pdf-ja - @echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex." - -text: - $(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text - @echo - @echo "Build finished. The text files are in $(BUILDDIR)/text." - -man: - $(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man - @echo - @echo "Build finished. The manual pages are in $(BUILDDIR)/man." - -texinfo: - $(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo - @echo - @echo "Build finished. The Texinfo files are in $(BUILDDIR)/texinfo." - @echo "Run \`make' in that directory to run these through makeinfo" \ - "(use \`make info' here to do that automatically)." - -info: - $(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo - @echo "Running Texinfo files through makeinfo..." - make -C $(BUILDDIR)/texinfo info - @echo "makeinfo finished; the Info files are in $(BUILDDIR)/texinfo." - -gettext: - $(SPHINXBUILD) -b gettext $(I18NSPHINXOPTS) $(BUILDDIR)/locale - @echo - @echo "Build finished. The message catalogs are in $(BUILDDIR)/locale." - -changes: - $(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes - @echo - @echo "The overview file is in $(BUILDDIR)/changes." - -linkcheck: - $(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck - @echo - @echo "Link check complete; look for any errors in the above output " \ - "or in $(BUILDDIR)/linkcheck/output.txt." - -doctest: - $(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest - @echo "Testing of doctests in the sources finished, look at the " \ - "results in $(BUILDDIR)/doctest/output.txt." - -coverage: - $(SPHINXBUILD) -b coverage $(ALLSPHINXOPTS) $(BUILDDIR)/coverage - @echo "Testing of coverage in the sources finished, look at the " \ - "results in $(BUILDDIR)/coverage/python.txt." - -xml: - $(SPHINXBUILD) -b xml $(ALLSPHINXOPTS) $(BUILDDIR)/xml - @echo - @echo "Build finished. The XML files are in $(BUILDDIR)/xml." - -pseudoxml: - $(SPHINXBUILD) -b pseudoxml $(ALLSPHINXOPTS) $(BUILDDIR)/pseudoxml - @echo - @echo "Build finished. The pseudo-XML files are in $(BUILDDIR)/pseudoxml." 
diff --git a/docs/api_reference/addon_basic.rst b/docs/api_reference/addon_basic.rst deleted file mode 100644 index 5877bf6c9..000000000 --- a/docs/api_reference/addon_basic.rst +++ /dev/null @@ -1,5 +0,0 @@ -AddonBasic -------------- -.. automodule:: standard_lib.addon_basic - :members: - :show-inheritance: diff --git a/docs/api_reference/addon_parser.md b/docs/api_reference/addon_parser.md new file mode 100644 index 000000000..e6651b8c8 --- /dev/null +++ b/docs/api_reference/addon_parser.md @@ -0,0 +1,41 @@ +# AddonParser + +::: pytest_splunk_addon.standard_lib.addon_parser + handler: python + + +## PropsParser + +::: pytest_splunk_addon.standard_lib.addon_parser.props_parser + handler: python + + +## EventtypeParser + +::: pytest_splunk_addon.standard_lib.addon_parser.eventtype_parser + handler: python + + +## Field + +::: pytest_splunk_addon.standard_lib.addon_parser.fields + handler: python + + +## TagsParser + +::: pytest_splunk_addon.standard_lib.addon_parser.tags_parser + handler: python + + +## TransformsParser + +::: pytest_splunk_addon.standard_lib.addon_parser.transforms_parser + handler: python + + +## SavedsearchesParser + +::: pytest_splunk_addon.standard_lib.addon_parser.savedsearches_parser + handler: python + diff --git a/docs/api_reference/addon_parser.rst b/docs/api_reference/addon_parser.rst deleted file mode 100644 index 44679e882..000000000 --- a/docs/api_reference/addon_parser.rst +++ /dev/null @@ -1,42 +0,0 @@ -AddonParser -------------- -.. automodule:: standard_lib.addon_parser - :members: - :show-inheritance: - -PropsParser -~~~~~~~~~~~~ -.. automodule:: standard_lib.addon_parser.props_parser - :members: - :show-inheritance: - -EventtypeParser -~~~~~~~~~~~~~~~~ -.. automodule:: standard_lib.addon_parser.eventtype_parser - :members: - :show-inheritance: - -Field -~~~~~~ - -.. automodule:: standard_lib.addon_parser.fields - :members: - :show-inheritance: - -TagsParser -~~~~~~~~~~~ -.. automodule:: standard_lib.addon_parser.tags_parser - :members: - :show-inheritance: - -TransformsParser -~~~~~~~~~~~~~~~~~ -.. automodule:: standard_lib.addon_parser.transforms_parser - :members: - :show-inheritance: - -SavedsearchesParser -~~~~~~~~~~~~~~~~~~~ -.. automodule:: standard_lib.addon_parser.savedsearches_parser - :members: - :show-inheritance: diff --git a/docs/api_reference/api_reference.md b/docs/api_reference/api_reference.md new file mode 100644 index 000000000..e592364bb --- /dev/null +++ b/docs/api_reference/api_reference.md @@ -0,0 +1,9 @@ +# API Documentation + +::: pytest_splunk_addon.standard_lib + handler: python + + +The workflow of the pytest-splunk-addon is as follows: + +![Architecture diagram](architecture.jpeg) diff --git a/docs/api_reference/api_reference.rst b/docs/api_reference/api_reference.rst deleted file mode 100644 index 099322973..000000000 --- a/docs/api_reference/api_reference.rst +++ /dev/null @@ -1,26 +0,0 @@ -API Documentation -================== - -.. automodule:: standard_lib - :members: - :show-inheritance: - -The workflow of the pytest-splunk-addon is as follows: - -.. image:: architecture.jpeg - :align: center - :alt: Flow diagram - -The API is divided into the following packages & modules. - -.. 
toctree:: - :maxdepth: 1 - - addon_parser - cim_tests - fields_tests - index_time_tests - addon_basic - app_test_generator - sample_generation - event_ingestion diff --git a/docs/api_reference/app_test_generator.md b/docs/api_reference/app_test_generator.md new file mode 100644 index 000000000..f8bb70174 --- /dev/null +++ b/docs/api_reference/app_test_generator.md @@ -0,0 +1,4 @@ +# AppTestGenerator + +::: pytest_splunk_addon.standard_lib.app_test_generator + handler: python diff --git a/docs/api_reference/app_test_generator.rst b/docs/api_reference/app_test_generator.rst deleted file mode 100644 index 8f49c8ffd..000000000 --- a/docs/api_reference/app_test_generator.rst +++ /dev/null @@ -1,5 +0,0 @@ -AppTestGenerator ------------------- -.. automodule:: standard_lib.app_test_generator - :members: - :show-inheritance: diff --git a/docs/api_reference/cim_tests.md b/docs/api_reference/cim_tests.md new file mode 100644 index 000000000..953532651 --- /dev/null +++ b/docs/api_reference/cim_tests.md @@ -0,0 +1,57 @@ +# CimTests + +::: pytest_splunk_addon.standard_lib.cim_tests + handler: python + +## TestTemplates + +::: pytest_splunk_addon.standard_lib.cim_tests.test_templates + handler: python + + +## TestGenerator + +::: pytest_splunk_addon.standard_lib.cim_tests.test_generator + handler: python + + +## DataModelHandler + +::: pytest_splunk_addon.standard_lib.cim_tests.data_model_handler + handler: python + + +## DataModel + +::: pytest_splunk_addon.standard_lib.cim_tests.data_model + handler: python + + +## DataSet + +::: pytest_splunk_addon.standard_lib.cim_tests.data_set + handler: python + + +## FieldTestAdapter + +::: pytest_splunk_addon.standard_lib.cim_tests.field_test_adapter + handler: python + + +## FieldTestHelper + +::: pytest_splunk_addon.standard_lib.cim_tests.field_test_helper + handler: python + + +## JsonSchema + +::: pytest_splunk_addon.standard_lib.cim_tests.json_schema + handler: python + + +## BaseSchema + +::: pytest_splunk_addon.standard_lib.cim_tests.base_schema + handler: python diff --git a/docs/api_reference/cim_tests.rst b/docs/api_reference/cim_tests.rst deleted file mode 100644 index f7cd78839..000000000 --- a/docs/api_reference/cim_tests.rst +++ /dev/null @@ -1,70 +0,0 @@ -CimTests ----------- -.. automodule:: standard_lib.cim_tests - :members: - :show-inheritance: - -TestTemplates -~~~~~~~~~~~~~~ - -.. automodule:: standard_lib.cim_tests.test_templates - :members: - :show-inheritance: - -TestGenerator -~~~~~~~~~~~~~~ - -.. automodule:: standard_lib.cim_tests.test_generator - :members: - :show-inheritance: - -DataModelHandler -~~~~~~~~~~~~~~~~~~ - -.. automodule:: standard_lib.cim_tests.data_model_handler - :members: - :show-inheritance: - - -DataModel -~~~~~~~~~~ - -.. automodule:: standard_lib.cim_tests.data_model - :members: - :show-inheritance: - - -DataSet -~~~~~~~~ - -.. automodule:: standard_lib.cim_tests.data_set - :members: - :show-inheritance: - -FieldTestAdapter -~~~~~~~~~~~~~~~~~~ - -.. automodule:: standard_lib.cim_tests.field_test_adapter - :members: - :show-inheritance: - -FieldTestHelper -~~~~~~~~~~~~~~~~~ - -.. automodule:: standard_lib.cim_tests.field_test_helper - :members: - :show-inheritance: - -JsonSchema -~~~~~~~~~~~ - -.. automodule:: standard_lib.cim_tests.json_schema - :members: - :show-inheritance: - -BaseSchema -~~~~~~~~~~~ - -.. 
automodule:: standard_lib.cim_tests.base_schema - :members: - :show-inheritance: diff --git a/docs/api_reference/event_ingestion.md b/docs/api_reference/event_ingestion.md new file mode 100644 index 000000000..2d7d2a69d --- /dev/null +++ b/docs/api_reference/event_ingestion.md @@ -0,0 +1,26 @@ +# EventIngestor + +## HEC Event Ingestor + +::: pytest_splunk_addon.standard_lib.event_ingestors.hec_event_ingestor + handler: python + + +## HEC Raw Ingestor + +::: pytest_splunk_addon.standard_lib.event_ingestors.hec_raw_ingestor + handler: python + + +## SC4S Event Ingestor + +::: pytest_splunk_addon.standard_lib.event_ingestors.sc4s_event_ingestor + handler: python + + +## File Monitor Ingestor + +::: pytest_splunk_addon.standard_lib.event_ingestors.file_monitor_ingestor + handler: python + + diff --git a/docs/api_reference/event_ingestion.rst b/docs/api_reference/event_ingestion.rst deleted file mode 100644 index d7da62500..000000000 --- a/docs/api_reference/event_ingestion.rst +++ /dev/null @@ -1,26 +0,0 @@ -EventIngestor ------------------- - -HEC Event Ingestor -~~~~~~~~~~~~~~~~~~ -.. automodule:: standard_lib.event_ingestors.hec_event_ingestor - :members: - :show-inheritance: - -HEC Raw Ingestor -~~~~~~~~~~~~~~~~~~ -.. automodule:: standard_lib.event_ingestors.hec_raw_ingestor - :members: - :show-inheritance: - -SC4S Event Ingestor -~~~~~~~~~~~~~~~~~~~~ -.. automodule:: standard_lib.event_ingestors.sc4s_event_ingestor - :members: - :show-inheritance: - -File Monitor Ingestor -~~~~~~~~~~~~~~~~~~~~~ -.. automodule:: standard_lib.event_ingestors.file_monitor_ingestor - :members: - :show-inheritance: \ No newline at end of file diff --git a/docs/api_reference/fields_tests.md b/docs/api_reference/fields_tests.md new file mode 100644 index 000000000..63d2816b1 --- /dev/null +++ b/docs/api_reference/fields_tests.md @@ -0,0 +1,23 @@ +# FieldsTests + +::: pytest_splunk_addon.standard_lib.fields_tests + handler: python + + +## TestTemplates + +::: pytest_splunk_addon.standard_lib.fields_tests.test_templates + handler: python + + +## TestGenerator + +::: pytest_splunk_addon.standard_lib.fields_tests.test_generator + handler: python + + +## FieldBank + +::: pytest_splunk_addon.standard_lib.fields_tests.field_bank + handler: python + diff --git a/docs/api_reference/fields_tests.rst b/docs/api_reference/fields_tests.rst deleted file mode 100644 index 31e77387b..000000000 --- a/docs/api_reference/fields_tests.rst +++ /dev/null @@ -1,23 +0,0 @@ -FieldsTests -------------- -.. automodule:: standard_lib.fields_tests - :members: - :show-inheritance: - -TestTemplates -~~~~~~~~~~~~~~ -.. automodule:: standard_lib.fields_tests.test_templates - :members: - :show-inheritance: - -TestGenerator -~~~~~~~~~~~~~~ -.. automodule:: standard_lib.fields_tests.test_generator - :members: - :show-inheritance: - -FieldBank -~~~~~~~~~~ -.. 
automodule:: standard_lib.fields_tests.field_bank
-   :members:
-   :show-inheritance:
diff --git a/docs/api_reference/index_time_tests.md b/docs/api_reference/index_time_tests.md
new file mode 100644
index 000000000..6acffdfa3
--- /dev/null
+++ b/docs/api_reference/index_time_tests.md
@@ -0,0 +1,17 @@
+# IndexTimeTests
+
+::: pytest_splunk_addon.standard_lib.index_tests
+    handler: python
+
+
+## TestTemplates
+
+::: pytest_splunk_addon.standard_lib.index_tests.test_templates
+    handler: python
+
+
+## TestGenerator
+
+::: pytest_splunk_addon.standard_lib.index_tests.test_generator
+    handler: python
+
diff --git a/docs/api_reference/index_time_tests.rst b/docs/api_reference/index_time_tests.rst
deleted file mode 100644
index 0bd0db9f6..000000000
--- a/docs/api_reference/index_time_tests.rst
+++ /dev/null
@@ -1,17 +0,0 @@
-IndexTimeTests
----------------
-.. automodule:: standard_lib.index_tests
-   :members:
-   :show-inheritance:
-
-TestTemplates
-~~~~~~~~~~~~~~
-.. automodule:: standard_lib.index_tests.test_templates
-   :members:
-   :show-inheritance:
-
-TestGenerator
-~~~~~~~~~~~~~~
-.. automodule:: standard_lib.index_tests.test_generator
-   :members:
-   :show-inheritance:
\ No newline at end of file
diff --git a/docs/api_reference/sample_generation.md b/docs/api_reference/sample_generation.md
new file mode 100644
index 000000000..67c52705b
--- /dev/null
+++ b/docs/api_reference/sample_generation.md
@@ -0,0 +1,23 @@
+# DataGenerator
+
+## PytestSplunkAddonDataParser
+
+::: pytest_splunk_addon.standard_lib.sample_generation.pytest_splunk_addon_data_parser
+    handler: python
+
+## SampleStanza
+
+::: pytest_splunk_addon.standard_lib.sample_generation.sample_stanza
+    handler: python
+
+
+## SampleEvent
+
+::: pytest_splunk_addon.standard_lib.sample_generation.sample_event
+    handler: python
+
+
+## Rule
+
+::: pytest_splunk_addon.standard_lib.sample_generation.rule
+    handler: python
diff --git a/docs/api_reference/sample_generation.rst b/docs/api_reference/sample_generation.rst
deleted file mode 100644
index 99a155752..000000000
--- a/docs/api_reference/sample_generation.rst
+++ /dev/null
@@ -1,26 +0,0 @@
-DataGenerator
-------------------
-
-PytestSplunkAddonDataParser
-~~~~~~~~~~~~~~~~~~~~~~~~~~~
-.. automodule:: standard_lib.sample_generation.pytest_splunk_addon_data_parser
-   :members:
-   :show-inheritance:
-
-SampleStanza
-~~~~~~~~~~~~~~~~~~
-.. automodule:: standard_lib.sample_generation.sample_stanza
-   :members:
-   :show-inheritance:
-
-SampleEvent
-~~~~~~~~~~~~~~~~~~
-.. automodule:: standard_lib.sample_generation.sample_event
-   :members:
-   :show-inheritance:
-
-Rule
-~~~~~~~~~~~~~~~~~~
-.. automodule:: standard_lib.sample_generation.rule
-   :members:
-   :show-inheritance:
diff --git a/docs/cim_compliance.md b/docs/cim_compliance.md
new file mode 100644
index 000000000..a7db94a32
--- /dev/null
+++ b/docs/cim_compliance.md
@@ -0,0 +1,60 @@
+# CIM Compliance Report
+
+## Overview
+
+- The CIM compliance report provides insights to a user about the compatibility of the add-on with the supported CIM Data Models, which helps in identifying CIM coverage gaps and in understanding which of those gaps can be fixed in the add-on.
+
+## What does the report contain?
+
+The report is divided into the following sections:
+
+1. Summary of the number of test cases that failed versus the total number of test cases executed for all the supported Data Models, along with the status of the data model. A list of all possible 'status' values is given below:
+
+    - Passed: All the test cases for the data model are executed successfully.
+    - Failed: At least one of the test cases for the data model resulted in failure.
+    - N/A: The data model is not mapped with the add-on.
+
+2. Summary of the number of test cases that failed versus the total number of test cases executed for stanzas in tags.conf and the data model mapped with it.
+
+3. Summary of test case results (passed/failed/skipped) for all the fields in the dataset for the tag-stanza it is mapped with.
+
+4. A list of data models which are not supported by the plugin.
+
+## How to generate the report?
+
+There are two ways to generate the CIM Compliance report:
+
+**1. Generating the report while executing the test cases**
+
+- Append the following to [any one of the commands](how_to_use.md#test-execution) used for executing the test cases:
+
+    ```console
+    --cim-report <report-file-path>
+    ```
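+
+    For instance, assuming an external Splunk instance is already configured, a complete invocation might look like the following sketch (the host, port, password, and report path are placeholders taken from this repository's examples; the connection options are described in [How To Use](how_to_use.md)):
+
+    ```console
+    pytest -m splunk_searchtime_cim \
+        --splunk-type=external \
+        --splunk-host=splunk \
+        --splunk-port=8089 \
+        --splunk-password=Changed@11 \
+        --cim-report cim_compliance_report.md
+    ```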
+
+**2. Generating the report using the test results stored in the junit-xml file**
+
+- Execute the following command:
+
+    ```console
+    cim-report <junit-xml-path> <report-file-path>
+    ```
+
+## Report Generation Troubleshooting
+
+If the CIM Compliance report is not generated, check for the following:
+
+1. If the report was to be generated during test case execution, check whether the CIM compatibility test cases were executed. If they were not executed, the report will not be generated.
+
+2. If the report was generated using a JUnit report, check if the provided JUnit report:
+
+    - Exists at the given location.
+    - Has a valid format.
+    - Has all the required properties, which include the following:
+
+        1. Tag_stanza
+        2. Data Model
+        3. Data Set
+        4. Field name
+        5. Field type
diff --git a/docs/cim_compliance.rst b/docs/cim_compliance.rst
deleted file mode 100644
--- a/docs/cim_compliance.rst
+++ /dev/null
@@ -1,66 +0,0 @@
-CIM Compliance Report
-=====================
-
-Overview
---------
-
-* CIM compliance report provides insights to an user about the compatibility of the add-on with the supported CIM Data Models, which helps in identifying CIM coverage gaps and helps in understanding those gaps which can be fixed in the add-on.
-
-What does the report contain?
------------------------------
-
-The report is divided into the following sections:
-
-1. Summary of the number of test cases that failed versus the total number of test cases executed for all the supported Data Models, along with the status of the data model. A list of all possible 'status' values is given below:
-
-   * Passed: All the test cases for the data model are executed successfully.
-   * Failed: At least one of the test cases for the data model resulted into failure.
-   * N/A: The data model is not mapped with the add-on.
-
-2. Summary of the number of test cases that failed versus the total number of test cases executed for stanzas in tags.conf and the data model mapped with it.
-
-3. Summary of test case results (passed/failed/skipped) for all the fields in the dataset for the tag-stanza it is mapped with.
-
-4. A list of data models which are not supported by the plugin.
-
-How to generate the report?
----------------------------
-
-There are two ways to generate the CIM Compliance report:
-
-**1. Generating the report while executing the test cases**
-
-  * Append the following to `any one of the commands <how_to_use.html#test-execution>`_ used for executing the test cases:
-
-  .. code-block:: console
-
-     --cim-report <cim-report-path>
-
-**2. Generating the report using the test results stored in the junit-xml file**
-
-  * Execute the following command:
-
-  .. code-block:: console
-
-     cim-report <junit-xml-path> <cim-report-path>
-
-
-Report Generation Troubleshooting
-""""""""""""""""""""""""""""""""""""""
-
-If the CIM Compliance report is not generated, check for the following:
-
-  1. If the report was to generated during test case execution, check if CIM compatibility test cases were executed or not. If they were not executed, the report will not be generated.
-
-  2. If the report was generated using a JUnit report, check if the provided JUnit report:
-
-     * Exists at the given location.
-     * Has a valid format.
-     * Has all the required properties which includes the following:
-
-       1. Tag_stanza
-       2. Data Model
-       3. Data Set
-       4. Field name
-       5. Field type
\ No newline at end of file
diff --git a/docs/cim_tests.md b/docs/cim_tests.md
new file mode 100644
index 000000000..f0b23f63b
--- /dev/null
+++ b/docs/cim_tests.md
@@ -0,0 +1,142 @@
+# CIM Compatibility Tests
+
+## Overview
+
+The CIM tests are written with the purpose of testing the compatibility of the add-on with CIM Data Models (Based on Splunk_SA_CIM 4.15.0).
+An add-on is said to be CIM compatible if it fulfils the following two criteria:
+
+1. The add-on extracts all the fields with valid values, which are marked as required by the [Data Model Definitions](https://github.com/splunk/pytest-splunk-addon/tree/main/pytest_splunk_addon/standard_lib/data_models).
+2. No event for the add-on is mapped with more than one data model.
+
+______________________________________________________________________
+
+To generate test cases only for CIM compatibility, append the following marker to the pytest command:
+
+    ```console
+    -m splunk_searchtime_cim
+    ```
+
+## Test Scenarios
+
+**1. Testcase for each eventtype mapped with a dataset.**
+
+    ```python
+    test_cim_required_fields[<eventtype>::<dataset>]
+    ```
+
+    Testcase verifies that if an eventtype is mapped with a dataset, its events follow the search constraints of the dataset.
+
+    **Workflow:**
+
+    - Plugin parses tags.conf to get a list of tags for each eventtype.
+    - Plugin parses all the [supported datamodels](https://github.com/splunk/pytest-splunk-addon/tree/main/pytest_splunk_addon/standard_lib/data_models).
+    - Then it gets a list of the datasets mapped with an eventtype.
+    - Generates a test case for each eventtype.
+
+**2. Testcases for all required, conditional and cluster fields in a dataset.**
+
+    ```python
+    test_cim_required_fields[<eventtype>::<dataset>::<fieldname>]
+    ```
+
+    #### Testcase Assertions:
+
+    - There should be at least 1 event mapped with the dataset.
+    - Each required field should be extracted in all the events mapped with the datasets.
+    - Each conditional field should be extracted in all the events filtered by the condition.
+    - If there are interdependent fields, either all of them should be extracted or none of them should be extracted, e.g. \["bytes","bytes_in","bytes_out"\].
+    - Fields should not have values other than the expected values defined in field properties.
+    - Fields must not have invalid values \[" ", "-", "null", "(null)", "unknown"\].
+
+    **Workflow:**
+
+    - For an eventtype, the mapped dataset will be identified as mentioned in [scenario #1](cim_tests.md#test-scenarios).
+
+    - A test case will be generated for each required field of a dataset.
+
+    - To generate the test case, the following properties of the fields will be considered:
+
+        - A filtering condition to filter only those events for which the field should be verified.
+        - Expected values.
+        - Validation to check that the values follow a proper type.
+        - List of correlated fields.
+
+    - Generate the query according to the properties of the field mentioned above.
+
+    - Run the query on the Splunk instance.
+
+    - Check the assertions mentioned in [Testcase Assertions](cim_tests.md#testcase-assertions).
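+
+    Conceptually, the generated query combines the dataset's search constraints with the field assertions above. A simplified SPL sketch is shown below (the tag and field name are illustrative, not the plugin's exact query):
+
+    ```text
+    search tag=authentication user=*
+        | search NOT user IN (" ", "-", "null", "(null)", "unknown")
+        | stats count
+    ```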
+
+**3. Testcase for all not_allowed_in_search fields**
+
+    ```python
+    test_cim_fields_not_allowed_in_search[<eventtype>::<fieldname>]
+    ```
+
+    These fields are not allowed to be extracted for the eventtype.
+
+    **Workflow:**
+
+    - Plugin collects the list of not_allowed_in_search fields from mapped datasets and [CommonFields.json](https://github.com/splunk/pytest-splunk-addon/blob/main/pytest_splunk_addon/standard_lib/cim_tests/CommonFields.json).
+    - Using a search query, the test case verifies whether not_allowed_in_search fields are populated in search or not.
+
+> **_NOTE:_**
+    [CommonFields.json](https://github.com/splunk/pytest-splunk-addon/blob/main/pytest_splunk_addon/standard_lib/cim_tests/CommonFields.json) contains fields which are automatically provided by asset and identity correlation features of applications like Splunk Enterprise Security.
+
+
+**4. Testcase for all not_allowed_in_props fields**
+
+    ```python
+    test_cim_fields_not_allowed_in_props[searchtime_cim_fields]
+    ```
+
+    Defining extractions in the configuration files is not allowed for these fields. It is fine, however, if these fields are automatically extracted by Splunk, e.g. tag.
+
+    **Workflow:**
+
+    - Plugin gets a list of fields of type not_allowed_in_props from CommonFields.json and mapped datasets.
+    - Plugin gets a list of fields whose extractions are defined in props using addon_parser.
+    - By comparing the two, we obtain a list of fields whose extractions are not allowed but defined.
+
+**5. Testcase to check that an eventtype is not mapped with multiple datamodels.**
+
+    ```python
+    test_eventtype_mapped_multiple_cim_datamodel
+    ```
+
+    **Workflow:**
+
+    - While parsing tags.conf, the plugin already obtains the list of eventtypes mapped with the datasets.
+    - Using SPL, we check that each eventtype is not mapped with multiple datamodels.
+
+## Testcase Troubleshooting
+
+In case of a test case failure, check that:
+
+- The add-on to be tested is installed on the Splunk instance.
+- Data is generated sufficiently for the add-on being tested.
+- The Splunk licence has not expired.
+- The Splunk instance is up and running.
+- The Splunk instance's management port is accessible from the test machine.
+
+If all the above conditions are satisfied, further analysis of the test is required.
+For every CIM validation test case there is a defined structure for the stack trace (the stack trace is the text displayed in the Exception block when the test fails).
+
+    ```text
+    AssertionError: <<Assertion Error Message>>
+    Source | Sourcetype | Field | Event Count | Field Count | Invalid Field Count | Invalid Values
+    -------- | --------------- | ------| ----------- | ----------- | ------------------- | --------------
+    str | str | str | int | int | int | str
+
+    Search = <Query>
+
+    Properties for the field :: <field_name>
+    type= Required/Conditional
+    condition= Condition for field
+    validity= EVAL conditions
+    expected_values=[list of expected values]
+    negative_values=[list of negative values]
+    ```
+
+Get the search query from the stack trace and execute it on the Splunk instance to verify which specific type of events are causing the failure.
+If a field-validating test case is failing, check the field's properties in the table above for the reason of the failure.
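+
+When digging into such a failure, it can also help to re-run only the affected tests with verbose output; a hypothetical invocation (the connection options are placeholders, as in the earlier examples):
+
+    ```console
+    pytest -v -m splunk_searchtime_cim -k "test_cim_required_fields" --splunk-type=external --splunk-host=splunk --splunk-port=8089 --splunk-password=Changed@11
+    ```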
diff --git a/docs/cim_tests.rst b/docs/cim_tests.rst
deleted file mode 100644
--- a/docs/cim_tests.rst
+++ /dev/null
@@ -1,151 +0,0 @@
-CIM Compatibility Tests
-========================
-
-Overview
---------
-The CIM tests are written with a purpose of testing the compatibility of the add-on with CIM Data Models (Based on Splunk_SA_CIM 4.15.0).
-An add-on is said to be CIM compatible if it fulfils the following two criteria:
-
-1. The add-on extracts all the fields with valid values, which are marked as required by the `Data Model Definitions <https://github.com/splunk/pytest-splunk-addon/tree/main/pytest_splunk_addon/standard_lib/data_models>`_.
-2. Any event for the add-on is not mapped with more than one data model.
-
----------------------
-
-To generate test cases only for CIM compatibility, append the following marker to pytest command:
-
-  .. code-block:: console
-
-    -m splunk_searchtime_cim
-
-Test Scenarios
---------------
-
-.. _mapped_datasets:
-
-**1. Testcase for each eventtype mapped with a dataset.**
-
-  .. code-block:: python
-
-    test_cim_required_fields[<eventtype>::<dataset>]
-
-  Testcase verifies if an eventtype is mapped with the dataset, events must follow the search constraints of the dataset.
-
-  **Workflow:**
-
-  * Plugin parses tags.conf to get a list of tags for each eventtype.
-  * Plugin parses all the `supported datamodels <https://github.com/splunk/pytest-splunk-addon/tree/main/pytest_splunk_addon/standard_lib/data_models>`_.
-  * Then it gets a list of the datasets mapped with an eventtype.
-  * Generates test case for each eventtype.
-
-**2. Testcases for all required, conditional and cluster fields in dataset.**
-
-  .. code-block:: python
-
-    test_cim_required_fields[<eventtype>::<dataset>::<fieldname>]
-
-  .. _test_assertions:
-
-  Testcase assertions:
-
-  * There should be at least 1 event mapped with the dataset.
-  * Each required field should be extracted in all the events mapped with the datasets.
-  * Each conditional fields should be extracted in all the events filtered by the condition.
-  * If there are interdependent fields, either all fields should be extracted or none of them should be extracted *i.e ["bytes","bytes_in","bytes_out"].*
-  * Fields should not have values other than the expected values defined in field properties.
-  * Fields must not have invalid values [" ", "-", "null", "(null)", "unknown"].
-
-  **Workflow:**
-
-  * For an eventtype, mapped dataset will be identified as mentioned in :ref:`#2 scenario<mapped_datasets>`.
-  * Test case will be generated for each required fields of a dataset.
-  * To generate the test case the following properties of fields will be considered :
-
-    * An filtering condition to filter the events, only for which the field should be verified.
-    * Expected values
-    * Validation to check the values follows a proper type.
-    * List of co-related fields.
-  * Generate the query according to the properties of the field mentioned above.
-  * Search the query to the Splunk instance.
-  * Assert the assertions mentioned in :ref:`Testcase assertions<test_assertions>`.
-
-
-**3. Testcase for all not_allowed_in_search fields**
-
-  .. code-block:: python
-
-    test_cim_fields_not_allowed_in_search[<eventtype>::<fieldname>]
-
-  These fields are not allowed to be extracted for the eventtype
-
-  **Workflow:**
-
-  * Plugin collects the list of not_allowed_in_search fields from mapped datasets and `CommonFields.json <https://github.com/splunk/pytest-splunk-addon/blob/main/pytest_splunk_addon/standard_lib/cim_tests/CommonFields.json>`_.
-  * Using search query the test case verifies if not_allowed_in_search fields are populated in search or not.
-
-  .. note::
-    `CommonFields.json <https://github.com/splunk/pytest-splunk-addon/blob/main/pytest_splunk_addon/standard_lib/cim_tests/CommonFields.json>`_ contains fields which are automatically provided by asset and identity correlation features of applications like Splunk Enterprise Security.
-
-**4. Testcase for all not_allowed_in_props fields**
-
-  .. code-block:: python
-
-    test_cim_fields_not_allowed_in_props[searchtime_cim_fields]
-
-  Defining extractions in the configuration files is not allowed for these fields. But if these fields are automatically extracted by Splunk, that's fine *i.e tag*
-
-  **Workflow:**
-
-  * Plugin gets a list of fields of type not_allowed_in_props from CommonFields.json and mapped datasets.
-  * Plugin gets a list of fields whose extractions are defined in props using addon_parser.
-  * By comparing we obtain a list of fields whose extractions are not allowed but defined.
-
-**5. Testcase to check that eventtype is not be mapped with multiple datamodels.**
-
-  .. code-block:: python
-
-    test_eventtype_mapped_multiple_cim_datamodel
-
-  **Workflow:**
-
-  * Parsing tags.conf it already has a list of eventtype mapped with the datasets.
-  * Using SPL we check that each eventtype is not be mapped with multiple datamodels.
-
-Testcase Troubleshooting
-------------------------
-
-In case of test case failure check if:
-
-  - The add-on to be tested is installed on the Splunk instance.
-  - Data is generated sufficiently for the addon being tested.
-  - Splunk licence has not expired.
-  - Splunk instance is up and running.
-  - Splunk instance's management port is accessible from the test machine.
-
-If all the above conditions are satisfied, further analysis of the test is required.
-For every CIM validation test case there is a defined structure for the stack trace [1]_.
-
-  .. code-block:: text
-
-    AssertionError: <<Assertion Error Message>>
-    Source | Sourcetype | Field | Event Count | Field Count | Invalid Field Count | Invalid Values
-    -------- | --------------- | ------| ----------- | ----------- | ------------------- | --------------
-    str | str | str | int | int | int | str
-
-    Search = <Query>
-
-    Properties for the field :: <field_name>
-    type= Required/Conditional
-    condition= Condition for field
-    validity= EVAL conditions
-    expected_values=[list of expected values]
-    negative_values=[list of negative values]
-
-  Get the search query from the stack trace and execute it on the Splunk instance and verify which specific type of events are causing failure.
-
-  If a field validating test case is failing, check the field's properties from the table provided for the reason of failure.
-
------------
-
-.. [1] Stacktrace is the text displayed in the Exception block when the Test fails.
-
diff --git a/docs/common_tests.md b/docs/common_tests.md
new file mode 100644
index 000000000..60646f501
--- /dev/null
+++ b/docs/common_tests.md
@@ -0,0 +1,12 @@
+# Common Tests
+
+## Test Scenarios
+
+**1. Events ingested are properly tokenised or not.**
+
+    ```python
+    test_events_with_untokenised_values
+    ```
+
+    Testcase verifies that all the events have been properly tokenised.
+    That is, an event does not contain any token from the conf file in its raw form, i.e. enclosed within ##.
diff --git a/docs/common_tests.rst b/docs/common_tests.rst
deleted file mode 100644
index 09660f374..000000000
--- a/docs/common_tests.rst
+++ /dev/null
@@ -1,14 +0,0 @@
-Common Tests
-=======================
-
-Test Scenarios
---------------
-
-**1. Events ingested are properly tokenised or not.**
-
-  .. code-block:: python
-
-    test_events_with_untokenised_values
-
-  Testcase verifies that all the events have been properly tokenised.
-  That is event does not contain any token from the conf file in its raw form i.e enclosed within ##.
\ No newline at end of file
diff --git a/docs/conf.py b/docs/conf.py
deleted file mode 100644
index 8c9682677..000000000
--- a/docs/conf.py
+++ /dev/null
@@ -1,182 +0,0 @@
-# -*- coding: utf-8 -*-
-#
-# Configuration file for the Sphinx documentation builder.
-#
-# This file only contains a selection of the most common options.
For a full -# list see the documentation: -# https://www.sphinx-doc.org/en/master/usage/configuration.html - -# -- Path setup -------------------------------------------------------------- - -# If extensions (or modules to document with autodoc) are in another directory, -# add these directories to sys.path here. If the directory is relative to the -# documentation root, use os.path.abspath to make it absolute, like shown here. -# -import os -import sys -import sphinx_rtd_theme - -sys.path.insert(0, os.path.abspath("../pytest_splunk_addon")) - - -# -- Project information ----------------------------------------------------- - -project = "pytest-splunk-addon" -copyright = "2021, Splunk, Inc." -author = "Splunk, Inc." - -# The short X.Y version -version = "" -# The full version, including alpha/beta/rc tags -release = "" - - -# -- General configuration --------------------------------------------------- - -# Add any Sphinx extension module names here, as strings. They can be -# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom -# ones. -extensions = [ - "sphinx.ext.autodoc", - "sphinx.ext.todo", - "sphinx.ext.coverage", - "sphinx.ext.viewcode", - "sphinx.ext.napoleon", - "sphinx_panels", -] - -# Add any paths that contain templates here, relative to this directory. -# templates_path = ['_templates'] -templates_path = [] - -# The suffix(es) of source filenames. -# You can specify multiple suffix as a list of string: -# -# source_suffix = ['.rst', '.md'] -source_suffix = ".rst" - -# The master toctree document. -master_doc = "index" - -# The language for content autogenerated by Sphinx. Refer to documentation -# for a list of supported languages. -# -# This is also used if you do content translation via gettext catalogs. -# Usually you set "language" from the command line for these cases. -language = None - -# List of patterns, relative to source directory, that match files and -# directories to ignore when looking for source files. -# This pattern also affects html_static_path and html_extra_path . -exclude_patterns = [] - -# The name of the Pygments (syntax highlighting) style to use. -pygments_style = "sphinx" - - -# -- Options for HTML output ------------------------------------------------- - -# The theme to use for HTML and HTML Help pages. See the documentation for -# a list of builtin themes. -# -html_theme = "sphinx_rtd_theme" - -# Theme options are theme-specific and customize the look and feel of a theme -# further. For a list of options available for each theme, see the -# documentation. -# -# html_theme_options = {} - -# Add any paths that contain custom static files (such as style sheets) here, -# relative to this directory. They are copied after the builtin static files, -# so a file named "default.css" will overwrite the builtin "default.css". -# html_static_path = ['_static'] -html_static_path = [] - -# Custom sidebar templates, must be a dictionary that maps document names -# to template names. -# -# The default sidebars (for documents that don't match any pattern) are -# defined by theme itself. Builtin themes are using these templates by -# default: ``['localtoc.html', 'relations.html', 'sourcelink.html', -# 'searchbox.html']``. -# -# html_sidebars = {} - - -# -- Options for HTMLHelp output --------------------------------------------- - -# Output file base name for HTML help builder. 
-htmlhelp_basename = "pytest-splunk-addondoc"
-
-
-# -- Options for LaTeX output ------------------------------------------------
-
-latex_elements = {
-    # The paper size ('letterpaper' or 'a4paper').
-    #
-    # 'papersize': 'letterpaper',
-    # The font size ('10pt', '11pt' or '12pt').
-    #
-    # 'pointsize': '10pt',
-    # Additional stuff for the LaTeX preamble.
-    #
-    # 'preamble': '',
-    # Latex figure (float) alignment
-    #
-    # 'figure_align': 'htbp',
-}
-
-# Grouping the document tree into LaTeX files. List of tuples
-# (source start file, target name, title,
-#  author, documentclass [howto, manual, or own class]).
-latex_documents = [
-    (
-        master_doc,
-        "pytest-splunk-addon.tex",
-        "pytest-splunk-addon Documentation",
-        "Splunk, Inc.",
-        "manual",
-    ),
-]
-
-
-# -- Options for manual page output ------------------------------------------
-
-# One entry per manual page. List of tuples
-# (source start file, name, description, authors, manual section).
-man_pages = [
-    (
-        master_doc,
-        "pytest-splunk-addon",
-        "pytest-splunk-addon Documentation",
-        [author],
-        1,
-    )
-]
-
-
-# -- Options for Texinfo output ----------------------------------------------
-
-# Grouping the document tree into Texinfo files. List of tuples
-# (source start file, target name, title, author,
-#  dir menu entry, description, category)
-texinfo_documents = [
-    (
-        master_doc,
-        "pytest-splunk-addon",
-        "pytest-splunk-addon Documentation",
-        author,
-        "pytest-splunk-addon",
-        "One line description of project.",
-        "Miscellaneous",
-    ),
-]
-
-
-# -- Extension configuration -------------------------------------------------
-
-# -- Options for todo extension ----------------------------------------------
-
-# If true, `todo` and `todoList` produce output, else they produce nothing.
-todo_include_todos = True
diff --git a/docs/field_tests.md b/docs/field_tests.md
new file mode 100644
index 000000000..6de550d58
--- /dev/null
+++ b/docs/field_tests.md
@@ -0,0 +1,157 @@
+# Knowledge Object Tests
+
+## Overview
+
+- The tests are written with the purpose of testing the proper functioning of the search-time knowledge objects of the add-on.
+
+- Search-time knowledge objects are extracted/generated when a search query is executed on a Splunk Instance.
+
+- The search-time knowledge objects include the following:
+    1. Extract
+    2. Report
+    3. Lookups
+    4. Fieldalias
+    5. Eval
+    6. Eventtypes
+    7. Tags
+    8. Savedsearches
+
+______________________________________________________________________
+
+To generate test cases only for knowledge objects, append the following marker to the pytest command:
+
+    ```console
+    -m splunk_searchtime_fields
+    ```
+
+## Test Scenarios
+
+**1. Events should be present in source/sourcetype defined in props.conf stanza.**
+
+    ```python
+    test_props_fields[<stanza>]
+    ```
+
+    Testcase verifies that there are events mapped with the source/sourcetype.
+    Here `<stanza>` is the source/sourcetype that is defined in the stanza.
+
+    **Workflow:**
+
+    - Plugin gets the list of defined sourcetypes by parsing props.conf.
+    - For each sourcetype, the plugin generates an SPL search query and asserts event_count > 0.
+
+**2. Fields mentioned under source/sourcetype should be extracted**
+
+    ```python
+    test_props_fields[<stanza>::field::<fieldname>]
+    ```
+
+    Testcase verifies that the field should be extracted in the source/sourcetype.
+    Here `<stanza>` is the source/sourcetype that is defined in the stanza and `<fieldname>` is the name of a field which is extracted under the source/sourcetype.
+
+    **Workflow:**
+
+    - Plugin generates a list of fields extracted under the source/sourcetype by parsing the knowledge objects like Extract, Eval, Fieldalias etc.
+    - For each field, the plugin generates an SPL search query and asserts event_count > 0.
+
+**3. Negative scenarios for field values**
+
+    ```python
+    test_props_fields_no_dash_not_empty[<stanza>::field::<fieldname>]
+    ```
+
+    Testcase verifies that the field should not have "-" (dash) or "" (empty) as a value.
+    Here `<stanza>` is the source/sourcetype that is defined in the stanza and `<fieldname>` is the name of a field which is extracted under the source/sourcetype.
+
+    **Workflow:**
+
+    - Plugin generates a list of fields extracted under the source/sourcetype.
+    - For each field, the plugin generates a search query to check if the field has invalid values like [" ", "-"].
+    - For each field, the event count should be 0.
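+
+To make these scenarios concrete, consider a hypothetical props.conf stanza (the sourcetype, class names, and regex below are made up for illustration):
+
+    ```
+    [my:sourcetype]
+    EXTRACT-network = src=(?<src>\S+)\s+dest=(?<dest>\S+)
+    EVAL-vendor_product = "Example Product"
+    FIELDALIAS-user = user_name AS user
+    ```
+
+For such a stanza, the plugin would generate field tests for src, dest, vendor_product, and user, plus a grouped test for the EXTRACT-network class as described in the next scenario.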
+
+**4. All the fields mentioned in an EXTRACT, REPORT, LOOKUP should be extracted in a single event.**
+
+    ```python
+    test_props_fields[<stanza>::EXTRACT-<classname>]
+    test_props_fields[<stanza>::LOOKUP-<classname>]
+    test_props_fields[<stanza>::REPORT-<classname>::<stanza_name>]
+    ```
+
+    All the fields mentioned in an EXTRACT, REPORT or LOOKUP can be interdependent.
+    The test case verifies that the extractions are working fine and all the fields are
+    being extracted in a single event.
+    The reason for keeping the test is to identify the corner cases where the fields are being
+    extracted in several events but the extractions mentioned in EXTRACT, REPORT or LOOKUP are not
+    working due to an invalid regex/lookup configuration.
+
+    **Workflow:**
+
+    - While parsing the conf file, when the plugin finds one of EXTRACT, REPORT, LOOKUP,
+      it gets the list of fields extracted and generates a test case.
+    - For all the fields in the test case it generates a single SPL search query including the stanza and asserts event_count > 0.
+    - This verifies that all the fields are extracted in the same event.
+
+**5. Events should be present in each eventtype**
+
+    ```python
+    test_eventtype[eventtype=<eventtype>]
+    ```
+
+    Test case verifies that there are events mapped with the eventtype.
+    Here `<eventtype>` is an eventtype mentioned in eventtypes.conf.
+
+    **Workflow:**
+
+    - For each eventtype mentioned in eventtypes.conf, the plugin generates an SPL search query and asserts event_count > 0 for the eventtype.
+
+**6. Tags defined in tags.conf should be applied to the events.**
+
+    ```python
+    test_tags[<stanza>::tag::<tag>]
+    ```
+
+    Test case verifies that there are events mapped with the tag.
+    Here `<stanza>` is a stanza mentioned in tags.conf and `<tag>` is an individual tag applied to that stanza.
+
+    **Workflow:**
+
+    - In tags.conf, for each tag defined in a stanza, the plugin generates a test case.
+    - For each tag, the plugin generates a search query including the stanza and the tag and asserts event_count > 0.
+
+**7. A search query should be present in each savedsearch.**
+
+    ```python
+    test_savedsearches[<savedsearch_stanza>]
+    ```
+
+    Test case verifies that the search mentioned in savedsearches.conf generates valid search results.
+    Here `<savedsearch_stanza>` is a stanza mentioned in the savedsearches.conf file.
+
+    **Workflow:**
+
+    - In savedsearches.conf, for each stanza the plugin generates a test case.
+    - For each stanza mentioned in savedsearches.conf, the plugin generates an SPL search query and asserts event_count > 0 for the savedsearch.
+
+## Testcase Troubleshooting
+
+In the case of a test-case failure, check that:
+
+- The add-on to be tested is installed on the Splunk instance.
+- Data is generated sufficiently for the add-on being tested.
+- Data is generated sufficiently in the specific index being tested.
+- The Splunk licence has not expired.
+- The Splunk instance is up and running.
+- The Splunk instance's management port is accessible from the test machine.
+
+If all the above conditions are satisfied, further analysis of the test is required.
+For every test case failure, there is a defined structure for the stack trace.
+
+    ```text
+    AssertionError: <<Assertion Error Message>>
+    Search = <Query>
+    ```
+
+Get the search query from the stack trace and execute it on the Splunk instance to verify which specific type of events are causing the failure.
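+
+To run only these knowledge-object tests against an external Splunk instance, the marker can be combined with the standard connection options (the values below are placeholders, matching the examples used elsewhere in these docs):
+
+    ```console
+    pytest -v -m splunk_searchtime_fields --splunk-type=external --splunk-host=splunk --splunk-port=8089 --splunk-password=Changed@11
+    ```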
- - - **Workflow:** - - * Plugin generates a list of fields extracted under the source/sourcetype. - * For each field, plugin generates a search query to check if the field has invalid values like [" ", "-"]. - * For each field, the event count should be 0. - -**4. All the fields mentioned in an EXTRACT, REPORT, LOOKUP should be extracted in a single event.** - - .. code-block:: python - - test_props_fields[::EXTRACT-] - test_props_fields[::LOOKUP-] - test_props_fields[::REPORT-::] - - All the fields mentioned in EXTRACT, REPORT or LOOKUP can be interdependent. - The test case verifies that the extractions are working fine and all the fields are - being extracted in a single event. - The reason for keeping the test is to identify the corner cases where the fields are being - extracted in several events but the extractions mentioned in EXTRACT, REPORT or LOOKUP is not - working due to invalid regex/lookup configuration. - - **Workflow:** - - * While parsing the conf file when the plugin finds one of EXTRACT, REPORT, LOOKUP - the plugin gets the list of fields extracted and generates a test case. - * For all the fields in the test case it generates a single SPL search query including the stanza and asserts event_count > 0. - * This verifies that all the fields are extracted in the same event. - -**5. Events should be present in each eventtype** - - .. code-block:: python - - test_eventtype[eventtype=] - - Test case verifies that there are events mapped with the eventtype. - Here is an eventtype mentioned in eventtypes.conf. - - **Workflow:** - - * For each eventtype mentioned in eventtypes.conf plugin generates an SPL search query and asserts event_count > 0 for the eventtype. - -**6. Tags defined in tags.conf should be applied to the events.** - - .. code-block:: python - - test_tags[::tag::] - - Test case verifies that the there are events mapped with the tag. - Here is a stanza mentioned in tags.conf and is an individual tag - applied to that stanza. - - **Workflow:** - - * In tags.conf for each tag defined in the stanza, the plugin generates a test case. - * For each tag, the plugin generates a search query including the stanza and the tag and asserts event_count > 0. - -**7. Search query should be present in each savedsearches.** - - .. code-block:: python - - test_savedsearches[] - - Test case verifies that the search mentioned in savedsearch.conf generates valid search results. - Here is a stanza mentioned in savedsearches.conf file. - - **Workflow:** - - * In savedsearches.conf for each stanza, the plugin generates a test case. - * For each stanza mentioned in savedsearches.conf plugin generates an SPL search query and asserts event_count > 0 for the savedsearch. - -Testcase Troubleshooting ------------------------- - -In the case of test-case failure check if: - - - The add-on to be tested is installed on the Splunk instance. - - Data is generated sufficiently for the add-on being tested. - - Data is generated sufficiently in the specific index, it is being tested. - - Splunk licence has not expired. - - Splunk instance is up and running. - - Splunk instance's management port is accessible from the test machine. - -If all the above conditions are satisfied, further analysis of the test is required. -For every test case failure, there is a defined structure for the stack trace [1]_. - - .. code-block:: text - - AssertionError: <> - Search = - -Get the search query from the stack trace and execute it on the Splunk instance and verify which specific type of events are causing failure. 
- -------------
-
-.. [1] Stacktrace is the text displayed in the Exception block when the Test fails.
diff --git a/docs/generate_conf.md b/docs/generate_conf.md
new file mode 100644
index 000000000..78da4777b
--- /dev/null
+++ b/docs/generate_conf.md
@@ -0,0 +1,58 @@
+# Generate Conf Utility
+
+(generate-conf)=
+
+## Overview
+
+> **_NOTE:_** This utility is deprecated since `pytest-splunk-addon` v1.12.0; the latest version that still provides it is v1.11.4.
+
+- The utility helps in creating the `pytest-splunk-addon-data.conf` from the existing `eventgen.conf` of the add-on.
+
+- The utility adds the following metadata required for the index-time tests in the new conf file:
+
+  - input_type
+  - host_type
+  - timestamp_type
+  - sample_count
+  - sourcetype_to_search
+
+- All of the above metadata will be appended with the keyword 'REVIEW', indicating that these values will have to be filled in by the
+  user, depending on the requirements of the add-on.
+
+- The utility reads the tokens in the existing `eventgen.conf` and identifies if they can be replaced with any of
+  the new token replacement settings. The new token replacement settings are as follows:
+
+  - src
+  - dest
+  - src_port
+  - dest_port
+  - dvc
+  - host
+  - url
+
+- As all the above mentioned new replacement settings are for key_fields, a new parameter will also be added, i.e. `token.n.field`
+
+- The new token replacement settings and field parameter will be appended with the keyword 'REVIEW', indicating the user will have to check
+  for the following:
+
+  1. If the new token replacement settings are applicable for the addon. If yes, then the user will have to fill the appropriate values as mentioned in the spec file.
+  2. If the field provided in 'token.n.field' is extracted in the add-on or not. If the field is not extracted,
+     the parameter 'token.n.field' should be removed.
+
+## How to generate the new conf file?
+
+  - Execute the following command:
+
+    ```console
+    generate-indextime-conf <addon_path> [<new_conf_path>]
+    ```
+
+    For example:
+
+    ```console
+    generate-indextime-conf SampleTA SampleTA/default/pytest-splunk-addon-data.conf
+    ```
+
+> **_NOTE:_** Add-on must contain a samples folder for the utility to work properly.
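+For illustration, a stanza emitted by the utility might look roughly like this
+(hypothetical sample file and token; the exact output format may differ, and every
+value marked REVIEW must be filled in by the user):
+
+```console
+[sample_file.samples]
+input_type = REVIEW
+host_type = REVIEW
+timestamp_type = REVIEW
+sample_count = REVIEW
+sourcetype_to_search = REVIEW
+
+token.0.token = ##src_ip##
+token.0.replacementType = random
+token.0.replacement = src["ipv4"]
+token.0.field = REVIEW
+```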
diff --git a/docs/generate_conf.rst b/docs/generate_conf.rst
deleted file mode 100644
index b3c568533..000000000
--- a/docs/generate_conf.rst
+++ /dev/null
@@ -1,62 +0,0 @@
-Generate Conf Utility
-======================
-
-.. _generate_conf:
-
-Overview
-""""""""
-
-.. note::
-
-    This is deprecated since `pytest-splunk-addon` v1.12.0 and latest available version is v1.11.4.
-
-* The utility helps in creating the `pytest-splunk-addon-data.conf` from the existing `eventgen.conf` of the add-on.
-* The utility adds the following metadata required for the index-times tests in the new conf file:
-
-    * input_type
-    * host_type
-    * timestamp_type
-    * sample_count
-    * sourcetype_to_search
-
-* All of these above metadata will be appended with keyword 'REVIEW', indicating that these values will have to be filled by the
-  user, depending on the requirements of the add-on.
-
-* The utility reads the tokens in the existing `eventgen.conf` and identifies if it can be replaced with any of
-  the new token replacement settings. The new token replacement settings are as follows:
-
-    * src
-    * dest
-    * src_port
-    * dest_port
-    * dvc
-    * host
-    * url
-
-* As all the above mentioned new replacement settings are for key_fields, a new parameter will also be added i.e `token.n.field`
-* The new token replacement settings and field parameter will be appended with keyword 'REVIEW', indicating the user will have to check
-  for the following:
-
-    1. If the new token replacement settings are applicable for the addon. If yes, then the user will have to fill the appropriate values as mentioned in the spec file.
-    2. If the field provided in the 'token.n.field', is extracted in the add-on or not. If the field is not extracted,
-       the parameter 'token.n.field' should be removed.
-
-
-How to generate the new conf file?
-"""""""""""""""""""""""""""""""""""
-
-    * Execute the following command:
-
-    .. code-block:: console
-
-        generate-indextime-conf <addon_path> [<new_conf_path>]
-
-    For example:
-
-    .. code-block:: console
-
-        generate-indextime-conf SampleTA SampleTA/default/pytest-splunk-addon-data.conf
-
-
-    .. note::
-        Add-on must contain samples folder, for the utility to work properly.
diff --git a/docs/how_to_use.md b/docs/how_to_use.md
new file mode 100644
index 000000000..245569e2d
--- /dev/null
+++ b/docs/how_to_use.md
@@ -0,0 +1,444 @@
+# How To Use
+
+
+Create a test file in the tests folder
+
+```python
+from pytest_splunk_addon.standard_lib.addon_basic import Basic
+class Test_App(Basic):
+    def empty_method():
+        pass
+```
+
+## Test Execution
+
+There are three ways to execute the tests:
+
+**1. Running tests with an external Splunk instance**
+
+    ```bash
+    pip3 install pytest-splunk-addon
+    ```
+
+    Run pytest with the add-on, in an external Splunk deployment
+
+    ```bash
+    pytest --splunk-type=external --splunk-app=<path-to-addon-package> --splunk-data-generator=<path to pytest-splunk-addon-data.conf file> --splunk-host=<hostname> --splunk-port=<splunk_management_port> --splunk-user=<username> --splunk-password=<password> --splunk-hec-token=<splunk_hec_token>
+    ```
+
+**2. Running tests with docker splunk**
+
+    ```bash
+    git clone git@github.com:splunk/pytest-splunk-addon.git
+    cd pytest-splunk-addon
+    pip install poetry
+    poetry install
+    ```
+
+Create a Dockerfile.splunk file
+
+```Dockerfile
+ARG SPLUNK_VERSION=latest
+FROM splunk/splunk:$SPLUNK_VERSION
+ARG SPLUNK_VERSION=latest
+ARG SPLUNK_APP_ID=TA_UNKNOWN
+ARG SPLUNK_APP_PACKAGE=$SPLUNK_APP_PACKAGE
+RUN echo Splunk VERSION=$SPLUNK_VERSION
+COPY deps/apps /opt/splunk/etc/apps/
+COPY $SPLUNK_APP_PACKAGE /opt/splunk/etc/apps/$SPLUNK_APP_ID
+```
+
+Create a Dockerfile.uf file
+
+```Dockerfile
+ARG SPLUNK_VERSION=latest
+FROM splunk/universalforwarder:$SPLUNK_VERSION
+ARG SPLUNK_VERSION=latest
+ARG SPLUNK_APP_ID=TA_UNKNOWN
+ARG SPLUNK_APP_PACKAGE=$SPLUNK_APP_PACKAGE
+COPY $SPLUNK_APP_PACKAGE /opt/splunkforwarder/etc/apps/$SPLUNK_APP_ID
+```
+
+Create docker-compose.yml
+
+```yaml
+version: "3.7"
+services:
+
+  sc4s:
+    image: ghcr.io/splunk/splunk-connect-for-syslog/container2:latest
+    hostname: sc4s
+    #When this is enabled test_common will fail
+    # command: -det
+    ports:
+      - "514"
+      - "601"
+      - "514/udp"
+      - "5000-5050"
+      - "5000-5050/udp"
+      - "6514"
+    stdin_open: true
+    tty: true
+    links:
+      - splunk
+    environment:
+      - SPLUNK_HEC_URL=https://splunk:8088
+      - SPLUNK_HEC_TOKEN=${SPLUNK_HEC_TOKEN}
+      - SC4S_SOURCE_TLS_ENABLE=no
+      - SC4S_DEST_SPLUNK_HEC_TLS_VERIFY=no
+      - SC4S_LISTEN_JUNIPER_NETSCREEN_TCP_PORT=5000
+      - SC4S_LISTEN_CISCO_ASA_TCP_PORT=5001
+      - SC4S_LISTEN_CISCO_IOS_TCP_PORT=5002
+      - SC4S_LISTEN_CISCO_MERAKI_TCP_PORT=5003
+      - SC4S_LISTEN_JUNIPER_IDP_TCP_PORT=5004
+      - SC4S_LISTEN_PALOALTO_PANOS_TCP_PORT=5005
+      - SC4S_LISTEN_PFSENSE_TCP_PORT=5006
+      - SC4S_LISTEN_CISCO_ASA_UDP_PORT=5001
+      - SC4S_LISTEN_CISCO_IOS_UDP_PORT=5002
+      - SC4S_LISTEN_CISCO_MERAKI_UDP_PORT=5003
+      - SC4S_LISTEN_JUNIPER_IDP_UDP_PORT=5004
+      - SC4S_LISTEN_PALOALTO_PANOS_UDP_PORT=5005
+      - SC4S_LISTEN_PFSENSE_UDP_PORT=5006
+      - SC4S_ARCHIVE_GLOBAL=no
+      - SC4S_LISTEN_CHECKPOINT_SPLUNK_NOISE_CONTROL=yes
+
+  splunk:
+    build:
+      context: .
+      dockerfile: Dockerfile.splunk
+      args:
+        SPLUNK_APP_ID: ${SPLUNK_APP_ID}
+        SPLUNK_APP_PACKAGE: ${SPLUNK_APP_PACKAGE}
+        SPLUNK_VERSION: ${SPLUNK_VERSION}
+    ports:
+      - "8000"
+      - "8088"
+      - "8089"
+      - "9997"
+    environment:
+      - SPLUNK_PASSWORD=${SPLUNK_PASSWORD}
+      - SPLUNK_START_ARGS=--accept-license
+      - SPLUNK_HEC_TOKEN=${SPLUNK_HEC_TOKEN}
+      - TEST_SC4S_ACTIVATE_EXAMPLES=yes
+
+  uf:
+    build:
+      context: .
+      dockerfile: Dockerfile.uf
+      args:
+        SPLUNK_APP_ID: ${SPLUNK_APP_ID}
+        SPLUNK_APP_PACKAGE: ${SPLUNK_APP_PACKAGE}
+        SPLUNK_VERSION: ${SPLUNK_VERSION}
+    hostname: uf
+    ports:
+      - "9997"
+      - "8089"
+    links:
+      - splunk
+    environment:
+      - SPLUNK_PASSWORD=Chang3d!
+      - SPLUNK_START_ARGS=--accept-license
+    volumes:
+      - ${CURRENT_DIR}/uf_files:${CURRENT_DIR}/uf_files
+
+volumes:
+  splunk-sc4s-var:
+    external: false
+```
+
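+The compose file reads its settings from environment variables. A minimal `.env`
+alongside it might look like this (illustrative values only; adjust to your setup):
+
+```console
+SPLUNK_VERSION=latest
+SPLUNK_APP_ID=TA_UNKNOWN
+SPLUNK_APP_PACKAGE=package
+SPLUNK_HEC_TOKEN=9b741d03-43e9-4164-908b-e09102327d22
+SPLUNK_PASSWORD=Chang3d!
+CURRENT_DIR=/path/to/project
+```
+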
+Create conftest.py file + +``` +import os +import pytest + +pytest_plugins = "pytester" + + +def pytest_configure(config): + config.addinivalue_line("markers", "external: Test search time only") + config.addinivalue_line("markers", "docker: Test search time only") + config.addinivalue_line("markers", "doc: Test Sphinx docs") + + +@pytest.fixture(scope="session") +def docker_compose_files(request): + """ + Get an absolute path to the `docker-compose.yml` file. Override this + fixture in your tests if you need a custom location. + + Returns: + string: the path of the `docker-compose.yml` file + + """ + docker_compose_path = os.path.join( + str(request.config.invocation_dir), "docker-compose.yml" + ) + # LOGGER.info("docker-compose path: %s", docker_compose_path) + + return [docker_compose_path] + + +@pytest.fixture(scope="session") +def docker_services_project_name(pytestconfig): + rootdir = str(pytestconfig.rootdir) + docker_compose_v2_rootdir = rootdir.lower().replace("/", "") + return f"pytest{docker_compose_v2_rootdir}" + +``` +
+
+    Run pytest with the add-on, using the following command:
+
+    ```bash
+    pytest --splunk-type=docker --splunk-data-generator=<path to pytest-splunk-addon-data.conf file>
+    ```
+
+The tool assumes the Splunk Add-on is located in a folder "package" in the project root.
+
+> **_NOTE:_**
+> - If live events are available in external Splunk instance or docker splunk, then SA-Eventgen is not required. This is applicable only till v1.2.0 of pytest-splunk-addon.
+> - From v1.3.0 pytest-splunk-addon ingests data independently which is used for execution of all the test cases.
+
+
+**3. Running tests with an external forwarder and Splunk instance**
+
+  - Run pytest with the add-on, using an external forwarder sending events to another Splunk deployment where a user can search for received events.
+  - Forwarding & receiving configuration in --splunk-forwarder-host and --splunk-host must be done before executing the tests.
+  - User can validate the forwarding using makeresults command.
+
+    ```bash
+    | makeresults | eval _raw="sample event" | collect index=main, source=test_source, sourcetype=test_src_type
+    ```
+
+  - Sample pytest command with the required params
+
+    ```bash
+    pytest --splunk-type=external                             # Whether you want to run the addon with docker or an external Splunk instance
+    --splunk-app=<addon package path>                         # Path to Splunk app package. The package should have the configuration files in the default folder.
+    --splunk-host=<hostname>                                  # Receiver Splunk instance where events are searchable.
+    --splunk-port=<splunk_management_port>                    # default 8089
+    --splunk-user=<username>                                  # default admin
+    --splunk-password=<password>                              # default Chang3d!
+    --splunk-forwarder-host=<hostname>                        # Splunk instance where forwarding to receiver instance is configured.
+    --splunk-hec-port=<splunk_hec_port>                       # HEC port of the forwarder instance.
+    --splunk-hec-token=<splunk_hec_token>                     # HEC token configured in forwarder instance.
+    --splunk-data-generator=<pytest_splunk_addon_conf_path>   # Path to pytest-splunk-addon-data.conf
+    ```
+
+> **_NOTE:_**
+> - Forwarder params are supported only for external splunk-type.
+> - If Forwarder params are not provided, it will ingest and search in the same Splunk deployment provided in the --splunk-host param.
+
+______________________________________________________________________
+
+The 3 types of tests included in pytest-splunk-addon are:
+
+  1. To generate test cases only for knowledge objects, append the following marker to pytest command:
+
+      ```console
+      -m splunk_searchtime_fields
+      ```
+
+  2. To generate test cases only for CIM compatibility, append the following marker to pytest command:
+
+      ```console
+      -m splunk_searchtime_cim
+      ```
+
+  3. To generate test cases only for index time properties, append the following marker to pytest command:
+
+      ```console
+      -m splunk_indextime --splunk-data-generator=<Path to pytest-splunk-addon-data.conf>
+      ```
+
+      For detailed information on index time test execution, please refer to the Index Time Tests page.
+
+  - To execute all the searchtime tests together, i.e. both Knowledge objects and CIM compatibility tests,
+    append the following marker to the pytest command:
+
+      ```console
+      -m "splunk_searchtime_fields or splunk_searchtime_cim"
+      ```
+
+______________________________________________________________________
+
+The following optional arguments are available to modify the default settings in the test cases:
+
+1. To search for events in a specific index, user can provide following additional arguments:
+
+    ```console
+    --search-index=<index>
+    ```
+
+    Splunk index of which the events will be searched while testing. Default value: "*, _internal".
+
+2. To increase/decrease time interval and retries for flaky tests, user can provide following additional arguments:
+
+    ```console
+    --search-retry=<retry>
+    ```
+
+    Number of retries to make if there are no events found while searching in the Splunk instance. Default value: 0.
+
+    ```console
+    --search-interval=<interval>
+    ```
+
+    Time interval to wait before retrying the search query. Default value: 0.
+
+3. To discard the eventlog generation in the working directory, user can provide following additional argument along with pytest command:
+
+    ```console
+    --discard-eventlogs
+    ```
+
+4. To enable the Splunk Index cleanup performed before the test run, user can provide argument along with pytest command:
+
+    ```console
+    --splunk-cleanup
+    ```
+
+5. A new functionality is introduced in pytest-splunk-addon to suppress unwanted errors in **test_splunk_internal_errors**.
+
+    - **Splunk related errors**: There is a file maintained in pytest-splunk-addon, ".ignore_splunk_internal_errors"; user can add strings to the file and events containing these strings will be suppressed by the search query.
+    - **Addon related errors:** To suppress these, user can create a file with the list of strings and provide the file in the **--ignore-addon-errors** param while test execution.
+
+      ```console
+      --ignore-addon-errors=<path-to-file>
+      ```
+
+    - Sample strings in the file.
+
+      ```console
+      SearchMessages - orig_component="SearchStatusEnforcer"
+      message_key="" message=NOT requires an argument
+      ```
+
+> **_NOTE:_** *Each line in the file will be considered a separate string to be ignored in the events.*
+
+    - Sample Event which will be ignored by the search query.
+
+      ```console
+      11-04-2020 13:26:01.026 +0000 ERROR SearchMessages - orig_component="SearchStatusEnforcer" app="search" sid="ta_1604496283.232" peer_name="" message_key="" message=NOT requires an argument
+      ```
+
+6. Options to separate event generation, event ingestion and test execution stages:
+
+    ```console
+    --tokenized-event-source=new|store_new|pregenerated
+    ```
+
+    - new: Generate new events
+    - store_new: Generate new events and store it in file
+    - pregenerated: Use pregenerated events from file
+    - Default value for this parameter is *store_new*
+
+    ```console
+    --event-file-path=<path-to-file>
+    ```
+
+    - Path to tokenized events file
+    - If --tokenized-event-source=store_new, then it will store the tokenized event file on the given path
+    - If --tokenized-event-source=pregenerated, then it will fetch tokenized events from the given path
+
+    ```console
+    --ingest-events=true|false
+    ```
+
+    - Select false to disable event ingestion on splunk instance, default value is true
+
+    ```console
+    --execute-test=true|false
+    ```
+
+    - Select false to disable test execution, default value is true
+
+## Extending pytest-splunk-addon
+
+**1. Test cases taking too long to execute**
+
+  Use [pytest-xdist](https://pypi.org/project/pytest-xdist/) to execute test cases across multiple processes.
+
+  How to use pytest-xdist:
+
+  - pip install pytest-xdist
+  - add `-n {number-of-processes}` to the pytest command
+
+  This will create the mentioned amount of processes and divide the test cases amongst them.
+
+> **_NOTE:_** Make sure there is enough data on the Splunk instance before running tests with pytest-xdist, because the faster the execution, the less time there is to generate enough data.
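+
+  For example, a run split across 4 worker processes might look like this (illustrative
+  command; combine `-n` with whatever options your setup already uses):
+
+  ```console
+  pytest -n 4 -m splunk_searchtime_fields --splunk-type=external --splunk-app=<addon package path>
+  ```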
+
+**2. Want flaky/known failures to not fail the execution**
+
+  Use [pytest-expect](https://pypi.org/project/pytest-expect/) to mark a list of test cases as flaky/known failures which will not affect the final result of testing.
+
+  How to use pytest-expect:
+
+  - pip install pytest-expect
+  - Add `--update-xfail` to the pytest command to generate a `.pytest.expect` file, which is a list of failures while execution.
+  - Make sure that the `.pytest.expect` file is in the root directory from where the test cases are executed.
+  - When the test cases are executed the next time, all the tests in the `.pytest.expect` file will be marked as `xfail`.
+  - If there is a custom file containing the list of failed test cases, it can be used by adding `--xfail-file custom_file` to the pytest command.
+
+> **_NOTE:_** Test cases should be added to .pytest.expect only after proper validation.
+
+**3. Setup test environment before executing the test cases**
+
+  If any setup is required in the Splunk/test environment before executing the test cases, implement a fixture in the `conftest.py` file created above.
+
+  ```python
+  @pytest.fixture(scope="session")
+  def splunk_setup(splunk):
+      # Will be executed before test execution starts
+      ...
+  ```
+
+  The setup fixture opens many possibilities to setup the testing environment / to configure Splunk. For example,
+
+  - Enable Saved-searches
+  - Configure the inputs of an Add-on.
+  - Wait for a lookup to be populated.
+  - Restart Splunk.
+
+  The following snippet shows an example in which the setup fixture is used to enable a saved search.
+
+Example conftest file
+
+```python
+import time
+
+import pytest
+from splunklib import binding, client, results
+
+
+class TASetup(object):
+    def __init__(self, splunk):
+        self.splunk = splunk
+
+    def wait_for_lookup(self, lookup):
+        splunk_client = client.connect(**self.splunk)
+        for _ in range(60):
+            job_result = splunk_client.jobs.oneshot(f" | inputlookup {lookup}")
+            for _ in results.ResultsReader(job_result):
+                return
+            time.sleep(1)
+
+    def enable_savedsearch(self, addon_name, savedsearch):
+        splunk_binding = binding.connect(**self.splunk)
+        splunk_binding.post(
+            f"/servicesNS/nobody/{addon_name}/saved/searches/{savedsearch}/enable",
+            data="",
+        )
+
+
+@pytest.fixture(scope="session")
+def splunk_setup(splunk):
+    ta_setup = TASetup(splunk)
+    ta_setup.enable_savedsearch("TA_SavedSearch", "ta_saved_search_one")
+    ta_setup.wait_for_lookup("ta_saved_search_lookup")
+```
+
+**4. Check mapping of an add-on with custom data models**
+
+  pytest-splunk-addon is capable of testing mapping of an add-on with custom data models.
+
+  How can this be achieved:
+
+  - Make json representation of the data models, which satisfies this [DataModelSchema](https://github.com/splunk/pytest-splunk-addon/blob/main/pytest_splunk_addon/standard_lib/cim_tests/DatamodelSchema.json).
+  - Provide the path to the directory having all the data models by adding `--splunk_dm_path path_to_dir` to the pytest command
+  - The test cases will now be generated for the data models provided to the plugin and not for the [default data models](https://github.com/splunk/pytest-splunk-addon/tree/main/pytest_splunk_addon/standard_lib/data_models).
diff --git a/docs/how_to_use.rst b/docs/how_to_use.rst
deleted file mode 100644
index 485317529..000000000
--- a/docs/how_to_use.rst
+++ /dev/null
@@ -1,339 +0,0 @@
-
-How To Use
-----------
-
-.. _test_file:
-
-Create a test file in the tests folder
-
-.. dropdown:: Example Test File
-
-    .. code:: python3
-
-        from pytest_splunk_addon.standard_lib.addon_basic import Basic
-        class Test_App(Basic):
-            def empty_method():
-                pass
-
-
-.. _test_execution:
-
-There are three ways to execute the tests:
-
-**1. Running tests with an external Splunk instance**
-
-    .. code:: python3
-
-        pip3 install pytest-splunk-addon
-
-    Run pytest with the add-on, in an external Splunk deployment
-
-    .. code:: bash
-
-        pytest --splunk-type=external --splunk-app=<path-to-addon-package> --splunk-data-generator=<path to pytest-splunk-addon-data.conf file> --splunk-host=<hostname> --splunk-port=<splunk_management_port> --splunk-user=<username> --splunk-password=<password> --splunk-hec-token=<splunk_hec_token>
-
-
-**2. Running tests with docker splunk**
-
-    .. code:: bash
-
-        git clone git@github.com:splunk/pytest-splunk-addon.git
-        cd pytest-splunk-addon
-        pip install poetry
-        poetry install
-
-    Create a Dockerfile.splunk file
-
-    .. dropdown:: Example Dockerfile
-
-        ..
code:: Dockerfile - - ARG SPLUNK_VERSION=latest - FROM splunk/splunk:$SPLUNK_VERSION - ARG SPLUNK_VERSION=latest - ARG SPLUNK_APP_ID=TA_UNKNOWN - ARG SPLUNK_APP_PACKAGE=$SPLUNK_APP_PACKAGE - RUN echo Splunk VERSION=$SPLUNK_VERSION - COPY deps/apps /opt/splunk/etc/apps/ - COPY $SPLUNK_APP_PACKAGE /opt/splunk/etc/apps/$SPLUNK_APP_ID - - Create a Dockerfile.uf file - - .. dropdown:: Example Dockerfile - - .. code:: Dockerfile - - ARG SPLUNK_VERSION=latest - FROM splunk/universalforwarder:$SPLUNK_VERSION - ARG SPLUNK_VERSION=latest - ARG SPLUNK_APP_ID=TA_UNKNOWN - ARG SPLUNK_APP_PACKAGE=$SPLUNK_APP_PACKAGE - COPY $SPLUNK_APP_PACKAGE /opt/splunkforwarder/etc/apps/$SPLUNK_APP_ID - - Create docker-compose.yml - - .. dropdown:: Example docker-compose file - - .. literalinclude:: ../docker-compose.yml - :language: YAML - :lines: 9- - -.. _conftest_file: - - Create conftest.py in the test folder along with :ref:`the test file ` - - .. dropdown:: Example conftest file - - .. literalinclude:: ../tests/e2e/conftest.py - :language: python - :lines: 1-2,12- - - Run pytest with the add-on, using the following command: - - .. code:: bash - - pytest --splunk-type=docker --splunk-data-generator= - -The tool assumes the Splunk Add-on is located in a folder "package" in the project root. - -.. note:: - * If live events are available in external Splunk instance or docker splunk, then SA-Eventgen is not required. This is applicable only till v1.2.0 of pytest-splunk-addon. - * From v1.3.0 pytest-splunk-addon ingests data independently which is used for execution of all the test cases. - - - -**3. Running tests with an external forwarder and Splunk instance** - - * Run pytest with the add-on, using an external forwarder sending events to another Splunk deployment where a user can search for received events. - * Forwarding & receiving configuration in --splunk-forwarder-host and --splunk-host must be done before executing the tests. - * User can validate the forwarding using makeresults command. - - .. code:: bash - - | makeresults | eval _raw="sample event" | collect index=main, source=test_source, sourcetype=test_src_type - - * Sample pytest command with the required params - - .. code:: bash - - pytest --splunk-type=external # Whether you want to run the addon with docker or an external Splunk instance - --splunk-app= # Path to Splunk app package. The package should have the configuration files in the default folder. - --splunk-host= # Receiver Splunk instance where events are searchable. - --splunk-port= # default 8089 - --splunk-user= # default admin - --splunk-password= # default Chang3d! - --splunk-forwarder-host= # Splunk instance where forwarding to receiver instance is configured. - --splunk-hec-port= # HEC port of the forwarder instance. - --splunk-hec-token= # HEC token configured in forwarder instance. - --splunk-data-generator= # Path to pytest-splunk-addon-data.conf - -.. note:: - * Forwarder params are supported only for external splunk-type. - * If Forwarder params are not provided It will ingest and search in the same Splunk deployment provided in --splunk-host param. - - ----------------------- - -There are 3 types of tests included in pytest-splunk-addon are: - - 1. To generate test cases only for knowledge objects, append the following marker to pytest command: - - .. code-block:: console - - -m splunk_searchtime_fields - - 2. To generate test cases only for CIM compatibility, append the following marker to pytest command: - - .. code-block:: console - - -m splunk_searchtime_cim - - 3. 
To generate test cases only for index time properties, append the following marker to pytest command: - - .. code-block:: console - - -m splunk_indextime --splunk-data-generator= - - For detailed information on index time test execution, please refer :ref:`here `. - - * To execute all the searchtime tests together, i.e both Knowledge objects and CIM compatibility tests, - append the following marker to the pytest command: - - .. code-block:: console - - -m "splunk_searchtime_fields or splunk_searchtime_cim" - ----------------------- - -The following optional arguments are available to modify the default settings in the test cases: - - 1. To search for events in a specific index, user can provide following additional arguments: - - .. code-block:: console - - --search-index= - - Splunk index of which the events will be searched while testing. Default value: "*, _internal". - - - 2. To increase/decrease time interval and retries for flaky tests, user can provide following additional arguments: - - .. code-block:: console - - --search-retry= - - Number of retries to make if there are no events found while searching in the Splunk instance. Default value: 0. - - --search-interval= - - Time interval to wait before retrying the search query.Default value: 0. - - 3. To discard the eventlog generation in the working directory, user can provide following additional argument along with pytest command: - - .. code-block:: console - - --discard-eventlogs - - 4. To enable the Splunk Index cleanup performed before the test run, user can provide argument along with pytest command: - - .. code-block:: console - - --splunk-cleanup - - 5. A new functionality is introduced in pytest-splunk-addon to suppress unwanted errors in **test_splunk_internal_errors**. - - - **Splunk related errors**: There is a file maintained in pytest-splunk-addon `".ignore_splunk_internal_errors" `_ , user can add the string in the file and events containing these strings will be suppressed by the search query. - - **Addon related errors:** To suppress these user can create a file with the list of strings and provide the file in the **--ignore-addon-errors** param while test execution. - - .. code-block:: console - - --ignore-addon-errors= - - - Sample strings in the file. - - .. code-block:: console - - SearchMessages - orig_component="SearchStatusEnforcer" - message_key="" message=NOT requires an argument - - .. Note :: - *Each line in the file will be considered a separate string to be ignored in the events.* - - - Sample Event which will be ignored by the search query. - - .. code-block:: console - - 11-04-2020 13:26:01.026 +0000 ERROR SearchMessages - orig_component="SearchStatusEnforcer" app="search" sid="ta_1604496283.232" peer_name="" message_key="" message=NOT requires an argument - - 6. Options to separate event generation, event ingestion and test execution stage - .. code-block:: console - - --tokenized-event-source=new|store_new|pregenerated - - - new: Generate new events - - store_new: Generate new events and store it in file - - pregenerated: Use pregenerated events from file - - Default value for this parameter is *store_new* - - | - - .. code-block:: console - - --event-file-path= - - - Path to tokenized events file - - If –tokenized-event-source=store_new, then it will store tokenized event file on given path - - If –tokenized-event-source=pregenerated, then it will fetch tokenized events from given path - - | - - .. 
code-block:: console - - --ingest-events=true|false - - - Select false to disable event ingestion on splunk instance, default value is true - - | - - .. code-block:: console - - --execute-test=true|false - - - Select false to disable test execution, default value is true - - - -Extending pytest-splunk-addon -~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ - -**1. Test cases taking too long to execute** - - Use `pytest-xdist `_ to execute test cases across multiple processes. - - How to use pytest-xdist : - - - pip install pytest-xdist - - add ``-n {number-of-processes}`` to the pytest command - - This will create the mentioned amount of processes and divide the test cases amongst them. - - .. Note :: - Make sure there is enough data on the Splunk instance before running tests with pytest-xdist because faster the execution, lesser the time to generate enough data. - -**2. Want flaky/known failures to not fail the execution** - - Use `pytest-expect `_ to mark a list of test cases as flaky/known failures which will not affect the final result of testing. - - How to use pytest-expect: - - - pip install pytest-expect - - Add ``--update-xfail`` to the pytest command to generate a `.pytest.expect` file, which is a list of failures while execution. - - Make sure that the `.pytest.expect` file is in the root directory from where the test cases are executed. - - When the test cases are executed the next time, all the tests in the `.pytest.expect` file will be marked as `xfail` [#]_ - - If there is a custom file containing the list of failed test cases, it can be used by adding ``--xfail-file custom_file`` to the pytest command. - - .. Note :: - Test cases should be added to .pytest.expect only after proper validation. - -**3. Setup test environment before executing the test cases** - - If any setup is required in the Splunk/test environment before executing the test cases, implement a fixture in :ref:`conftest.py `. - - .. code-block:: python - - @pytest.fixture(scope="session") - def splunk_setup(splunk): - # Will be executed before test execution starts - . . . - - The setup fixture opens many possibilities to setup the testing environment / to configure Splunk. For example, - - - Enable Saved-searches - - Configure the inputs of an Add-on. - - Wait for an lookup to be populated. - - Restart Splunk. - - The following snippet shows an example in which the setup fixture is used to enable a saved search. - - .. dropdown:: enable_saved_search_conftest.py - - .. literalinclude:: ../tests/e2e/enable_saved_search_conftest.py - :language: python - :lines: 2,31- - - -**4. Check mapping of an add-on with custom data models** - - pytest-splunk-addon is capable of testing mapping of an add-on with custom data models. - - How can this be achieved : - - - Make json representation of the data models, which satisfies this `DataModelSchema `_. - - Provide the path to the directory having all the data models by adding ``--splunk_dm_path path_to_dir`` to the pytest command - - The test cases will now be generated for the data models provided to the plugin and not for the `default data models `_. - -.. raw:: html - -
-
-.. [#] xfail indicates that you expect a test to fail for some reason. A common example is a test for a feature not yet implemented, or a bug not yet fixed. When a test passes despite being expected to fail, it's an xpass and will be reported in the test summary.
\ No newline at end of file
diff --git a/docs/index.md b/docs/index.md
new file mode 100644
index 000000000..8f488a50b
--- /dev/null
+++ b/docs/index.md
@@ -0,0 +1,28 @@
+# Overview
+
+pytest-splunk-addon is an open-source dynamic test plugin for Splunk Apps and Add-ons
+which allows the user to test knowledge objects, CIM compatibility and index time properties.
+
+## Support
+
+- **Python**: 3.7
+- **Platforms**: Linux, Windows and MacOS
+
+## Features
+
+- Generate tests for Splunk Knowledge objects in Splunk Technology Add-ons.
+- Generate tests for checking CIM compatibility in Splunk Technology Add-ons.
+- Generate tests for checking Splunk index-time properties in Splunk Technology Add-ons.
+- Validate your add-ons using Splunk + Docker.
+
+## Release notes
+
+Find details about all the releases [here](https://github.com/splunk/pytest-splunk-addon/releases).
+
+## Installation
+
+pytest-splunk-addon can be installed via pip from PyPI:
+
+```console
+pip3 install pytest-splunk-addon
+```
diff --git a/docs/index.rst b/docs/index.rst
deleted file mode 100644
index 907fb7ed1..000000000
--- a/docs/index.rst
+++ /dev/null
@@ -1,26 +0,0 @@
-.. test documentation master file, created by
-   sphinx-quickstart on Tue May 5 11:30:29 2020.
-   You can adapt this file completely to your liking, but it should at least
-   contain the root `toctree` directive.
-
-pytest-splunk-addon documentation
-=================================
-
-.. toctree::
-   :caption: Table of Contents
-   :maxdepth: 3
-
-   overview
-   how_to_use
-   common_tests
-   cim_tests
-   cim_compliance
-   field_tests
-   index_time_tests
-   sample_generator
-   generate_conf
-   api_reference/api_reference
-   troubleshoot
diff --git a/docs/index_time_tests.md b/docs/index_time_tests.md
new file mode 100644
index 000000000..26f5c5f2e
--- /dev/null
+++ b/docs/index_time_tests.md
@@ -0,0 +1,249 @@
+# Index Time Tests
+
+## Overview
+
+- The tests are written with the purpose of testing the proper functioning of the index-time properties of the add-on.
+
+- Index time properties are applied on the events when they are ingested into the Splunk instance.
+
+- The index time properties which are tested are as follows:
+  1. Key Fields Extraction
+  2. Timestamp (_time)
+  3. LINE_BREAKER
+
+### Prerequisites
+
+- `pytest-splunk-addon-data.conf` file which contains all the required data for
+  executing the tests. The conf file should follow the specifications as mentioned
+  in [Data Generator](sample_generator.md).
+
+______________________________________________________________________
+
+To generate test cases only for index time properties, append the following marker to pytest command:
+
+  ```console
+  -m splunk_indextime --splunk-data-generator=<Path to pytest-splunk-addon-data.conf>
+  ```
+
+> **_NOTE:_** --splunk-data-generator should contain the path to *pytest-splunk-addon-data.conf*,
+> as the test cases will not execute on *eventgen.conf* file.
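+
+As an example, tying a token to a key field in `pytest-splunk-addon-data.conf` looks
+like this (hypothetical sample file, token and values):
+
+```console
+[sample_file.samples]
+sourcetype = test:sourcetype
+token.0.token = ##token_src_ipv4##
+token.0.replacementType = random
+token.0.replacement = src["ipv4"]
+token.0.field = src
+```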
+
+## Test Scenarios
+
+**1. Test case for key fields extraction:**
+
+  ```python
+  test_indextime_key_fields[<sourcetype>::<host>]
+  ```
+
+  - Test case verifies if all the key fields are extracted properly,
+    as mentioned in the `pytest-splunk-addon-data.conf` file.
+
+  - The key fields which are checked are as follows:
+
+    - src
+    - src_port
+    - dest
+    - dest_port
+    - dvc
+    - host
+    - user
+    - url
+
+  - This test case will not be generated if there are no key fields specified for the event.
+  - Key field can be assigned to a token using the field property, i.e. `token.n.field = <key_field_name>`
+
+  Testcase assertions:
+
+  - There should be at least 1 event with the sourcetype and host.
+  - The values of the key fields obtained from the event
+    must match with the values of the key fields which were used in generating and ingesting the event.
+
+  **Workflow:**
+
+  - To generate the test case, the following properties will be required:
+
+    - sourcetype and host in the event.
+    - Key fields in the event for which the test case is executed.
+
+  - Generates an SPL query according to the properties mentioned above.
+  - Execute the SPL query in a Splunk instance.
+  - Assert the test case results as mentioned in the testcase assertions above.
+
+**2. Test case for _time property:**
+
+  ```python
+  test_indextime_time[<sourcetype>::<host>]
+  ```
+
+  - Test case verifies if the timestamp for the event is assigned properly.
+  - The timestamp is assigned to the _time field which is validated by the test case.
+  - This test case will be generated if timestamp_type = event in stanza.
+  - _time field can be assigned to a token using the field property, i.e. `token.n.field = _time`
+
+  Testcase assertions:
+
+  - There should be at least 1 event with the sourcetype and host.
+  - There should be at least 1 token with field _time in stanza.
+  - One event should have only one token with token.n.field = _time.
+  - Every event should have a token with token.n.field = _time.
+  - The values of the _time fields obtained from the event
+    must match with the time values which were used in generating and ingesting the event.
+
+  **Workflow:**
+
+  - Generates an SPL query using sourcetype and host from the event.
+  - Execute the SPL query in a Splunk instance.
+  - The value of _time obtained from the search query is matched
+    with the _time value assigned to the event before ingesting it.
+
+> **_NOTE:_** The test case for _time field will not be generated if `timestamp_type = plugin` in
+> pytest-splunk-addon-data.conf
+
+**3. Test case for line-breaker property:**
+
+  ```python
+  test_indextime_line_breaker[<sourcetype>::<host>]
+  ```
+
+  - Test case verifies if the LINE_BREAKER property used in props.conf works properly.
+  - If sample_count is not provided in pytest-splunk-addon-data.conf, it will take
+    sample_count = 1.
+
+  Testcase assertions:
+
+  - Number of events for particular sourcetype and host should match with value of
+    `expected_event_count` which is calculated by pytest-splunk-addon from the `sample_count`
+    parameter provided in the pytest-splunk-addon-data.conf.
+
+  **Workflow:**
+
+  - Generates an SPL query using sourcetype and host from the event.
+  - Execute the SPL query in a Splunk instance.
+  - The number of results obtained from the search query is matched with the
+    *expected_event_count* value, which is calculated by the plugin.
+
+## Testcase Troubleshooting
+
+In the case of test-case failure check if:
+
+  - The add-on to be tested is installed on the Splunk instance.
+  - Data is generated for the addon being tested.
+  - Splunk licence has not expired.
+  - Splunk instance is up and running.
+  - Splunk instance's management port is accessible from the test machine.
+
+If all the above conditions are satisfied, further analysis of the test is required.
+For every test case failure, there is a defined structure for the stack trace.
+
+  ```text
+  AssertionError: <<error_message>>
+  Search = <search_query>
+  ```
+
+Get the search query from the stack trace, execute it on the Splunk instance and verify which specific type of events are causing the failure.
diff --git a/docs/index_time_tests.rst b/docs/index_time_tests.rst
deleted file mode 100644
--- a/docs/index_time_tests.rst
+++ /dev/null
-Index Time Tests
-================
-
-Overview
-""""""""
-
-* The tests are written with the purpose of testing the proper functioning of the index-time properties of the add-on.
-* Index time properties are applied on the events when they are ingested into the Splunk instance.
-* The index time properties which are tested are as follows:
-
-    1. Key Fields Extraction
-    2. Timestamp (_time)
-    3. LINE_BREAKER
-
-Prerequisites
-"""""""""""""
-
-* `pytest-splunk-addon-data.conf` file which contains all the required data for
-  executing the tests. The conf file should follow the specifications as mentioned :ref:`here <conf_spec>`.
-
---------------------------------
-
-.. _index_time_tests:
-
-To generate test cases only for index time properties, append the following marker to pytest command:
-
-    .. code-block:: console
-
-        -m splunk_indextime --splunk-data-generator=<Path to pytest-splunk-addon-data.conf>
-
-    .. note::
-        --splunk-data-generator should contain the path to *pytest-splunk-addon-data.conf*,
-        as the test cases will not execute on *eventgen.conf* file.
- - -Test Scenarios --------------- - -.. _key_fields: - -**1. Test case for key fields extraction:** - - .. code-block:: python - - test_indextime_key_fields[::] - - * Test case verifies if all the key fields are extracted properly, - as mentioned in the `pytest-splunk-addon-data.conf` file. - * The key fields which are checked are as follows: - - * src - * src_port - * dest - * dest_port - * dvc - * host - * user - * url - - .. _test_assertions_key_field: - - * This test case will not be generated if there are no key fields specified for the event. - * Key field can be assign to token using field property. `i.e token.n.field = ` - - Testcase assertions: - - * There should be at least 1 event with the sourcetype and host. - * The values of the key fields obtained from the event - must match with the values of the key fields which was used in generating and ingesting the event. - - **Workflow:** - - * To generate the test case, the following properties will be required: - - * sourcetype and host in the event. - * Key fields in the event for which the test case is executed. - - * Generates an SPL query according to the properties mentioned above. - * Execute the SPL query in a Splunk instance. - * Assert the test case results as mentioned in :ref:`testcase assertions`. - -**2. Test case for _time property:** - - .. code-block:: python - - test_indextime_time[::] - - * Test case verifies if the timestamp for the event is assigned properly. - * The timestamp is assigned to the _time field which is validated by the test case. - * This test case will be generated if timestamp_type = event in stanza. - * _time field can be assign to token using field property. i.e `token.n.field = _time` - - Testcase assertions: - - * There should be at least 1 event with the sourcetype and host. - * There should be at least 1 token with field _time in stanza. - * One event should have only one token with token.n.field = _time. - * Every event should have token with token.n.field = _time. - * The values of the _time fields obtained from the event - must match with the values of the time values which was used in generating and ingesting the event. - - **Workflow:** - - * Generates an SPL query using sourcetype and host from the event. - * Execute the SPL query in a Splunk instance. - * The value of _time obtained from the search query is matched - with the _time value assigned to the event before ingesting it. - - .. note:: - The test case for _time field will not be generated if `timestamp_type = plugin` in - pytest-splunk-addon-data.conf - -**3. Test case for line-breaker property:** - - .. code-block:: python - - test_indextime_line_breaker[::] - - * Test case verifies if the LINE_BREAKER property used in props.conf works properly. - * If sample_count is not provided in pytest-splunk-addon-data.conf, it will take - sample_count = 1. - - Testcase assertions: - - * Number of events for particular sourcetype and host should match with value of - `expected_event_count` which is calculated by pytest-splunk-addon from the `sample_count` - parameter provided in the pytest-splunk-addon-data.conf. - - **Workflow:** - - * Generates an SPL query using sourcetype and host from the event. - * Execute the SPL query in a Splunk instance. - * The number of results obtained from the search query is matched with the - *expected_event_count* value, which is calculated by the plugin. 
- -Testcase Troubleshooting ------------------------- - -In the case test-case failure check if: - - - The add-on to be tested is installed on the Splunk instance. - - Data is generated for the addon being tested. - - Splunk licence has not expired. - - Splunk instance is up and running. - - Splunk instance's management port is accessible from the test machine. - -If all the above conditions are satisfied, further analysis of the test is required. -For every test case failure, there is a defined structure for the stack trace [1]_. - - .. code-block:: text - - AssertionError: <> - Search = - -Get the search query from the stack trace and execute it on the Splunk instance and verify which specific type of events are causing failure. - - -FAQ ----- - -1. What is the source of data used while testing with pytest-splunk-addon 1.3.0 and above? - * pytest-splunk-addon relies on samples available in addon folder under path provided ``--splunk-app`` or ``--splunk-data-generator`` options. -2. When do I assign timestamp_type = event to test the time extraction (_time) for a stanza? - * When the Splunk assigns _time value from a timestamp present in event based on props configurations, you should assign ``timestamp_type=event`` for that sample stanza. - * Example: - For this sample, Splunk assigns the value ``2020-06-23T00:00:00.000Z`` to ``_time``. - - .. code-block:: text - - 2020-06-23T00:00:00.000Z test_sample_1 test_static=##token_static_field## . . . - - In this scenario the value ``2020-06-23T00:00:00.000Z`` should be tokenized, stanza should have ``timestamp_type=event`` and the token should also have ``token.0.field = _time`` as shown below: - - .. code-block:: text - - token.0.token = (\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}\.\d+) - token.0.replacementType = timestamp - token.0.replacement = %Y-%m-%dT%H:%M:%S - token.0.field = _time -3. When do I assign timestamp_type = plugin to test the time extraction (_time) for a stanza? - * When there is no timestamp available in event or the props configurations are written to have the Splunk default timestamp assigned instead timestamp present in event, you should assign ``timestamp_type=plugin`` for that sample stanza. - * No _time test generates for the sample stanza when ``timestamp_type = plugin``. - * Example: - For this sample, Splunk assigns the default time value to ``_time``. - - .. code-block:: text - - test_sample_1 test_static=##token_static_field## src=##token_src_ipv4## . . . - - In this scenario, the stanza should have ``timestamp_type=plugin``. -4. When do I assign host_type = plugin for a sample stanza? - * When there are no configurations written in props to override the host value in event and Splunk default host value is assigned for host field instead of a value present in event, you should assign ``host_type=plugin`` for that sample stanza. -5. When do I assign host_type = event for a sample stanza? - * When there are some configurations written in props to override the host value for an event you should assign ``host_type=event`` for that sample stanza. - * Example: - For this sample, Splunk assigns the value sample_host to host based on the props configurations present in addon - - .. code-block:: text - - test_modinput_1 host=sample_host static_value_2=##static_value_2## . . . - - In this scenario the value "sample_host" should be tokenized, stanza should have ``host_type=event`` and the token should also have ``token.0.field = host`` as shown below: - - .. 
code-block:: text - - token.0.token = ##host_value## - token.0.replacementType = random - token.0.replacement = host["host"] - token.0.field = host -6. Can I test any field present in my event as Key Field in Key Fields tests? - * No, Key Fields are defined in plugin and only below fields can be validated as part of Key Field tests. - - * src - * src_port - * dest - * dest_port - * dvc - * host - * user - * url -7. What if I don't assign any field as key_field in a particular stanza even if its present in props? - * No test would generate to test Key Fields for that particular stanza and thus won't be correctly tested. -8. When do I assign token..field = to test the Key Fields for an event? - * When there props configurations written in props to extract any of the field present in Key Fields list, you should add ``token..field = `` to the token for that field value. - * Example: - For this sample, there is report written in props that extracts ``127.0.0.1`` as ``src``, - - .. code-block:: text - - 2020-06-23T00:00:00.000Z test_sample_1 127.0.0.1 - - In this scenario the value ``127.0.0.1`` should be tokenized and the token should also have ``token.0.field = src`` as shown below: - - .. code-block:: text - - token.0.token = ##src_value## - token.0.replacementType = random - token.0.replacement = src["ipv4"] - token.0.field = src - ------------- - -.. [1] Stacktrace is the text displayed in the Exception block when the Test fails. diff --git a/docs/make.bat b/docs/make.bat deleted file mode 100644 index 28beda17a..000000000 --- a/docs/make.bat +++ /dev/null @@ -1,263 +0,0 @@ -@ECHO OFF - -REM Command file for Sphinx documentation - -if "%SPHINXBUILD%" == "" ( - set SPHINXBUILD=sphinx-build -) -set BUILDDIR=_build -set ALLSPHINXOPTS=-d %BUILDDIR%/doctrees %SPHINXOPTS% . -set I18NSPHINXOPTS=%SPHINXOPTS% . -if NOT "%PAPER%" == "" ( - set ALLSPHINXOPTS=-D latex_paper_size=%PAPER% %ALLSPHINXOPTS% - set I18NSPHINXOPTS=-D latex_paper_size=%PAPER% %I18NSPHINXOPTS% -) - -if "%1" == "" goto help - -if "%1" == "help" ( - :help - echo.Please use `make ^` where ^ is one of - echo. html to make standalone HTML files - echo. dirhtml to make HTML files named index.html in directories - echo. singlehtml to make a single large HTML file - echo. pickle to make pickle files - echo. json to make JSON files - echo. htmlhelp to make HTML files and a HTML help project - echo. qthelp to make HTML files and a qthelp project - echo. devhelp to make HTML files and a Devhelp project - echo. epub to make an epub - echo. latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter - echo. text to make text files - echo. man to make manual pages - echo. texinfo to make Texinfo files - echo. gettext to make PO message catalogs - echo. changes to make an overview over all changed/added/deprecated items - echo. xml to make Docutils-native XML files - echo. pseudoxml to make pseudoxml-XML files for display purposes - echo. linkcheck to check all external links for integrity - echo. doctest to run all doctests embedded in the documentation if enabled - echo. coverage to run coverage check of the documentation if enabled - goto end -) - -if "%1" == "clean" ( - for /d %%i in (%BUILDDIR%\*) do rmdir /q /s %%i - del /q /s %BUILDDIR%\* - goto end -) - - -REM Check if sphinx-build is available and fallback to Python version if any -%SPHINXBUILD% 2> nul -if errorlevel 9009 goto sphinx_python -goto sphinx_ok - -:sphinx_python - -set SPHINXBUILD=python -m sphinx.__init__ -%SPHINXBUILD% 2> nul -if errorlevel 9009 ( - echo. 
- echo.The 'sphinx-build' command was not found. Make sure you have Sphinx - echo.installed, then set the SPHINXBUILD environment variable to point - echo.to the full path of the 'sphinx-build' executable. Alternatively you - echo.may add the Sphinx directory to PATH. - echo. - echo.If you don't have Sphinx installed, grab it from - echo.http://sphinx-doc.org/ - exit /b 1 -) - -:sphinx_ok - - -if "%1" == "html" ( - %SPHINXBUILD% -b html %ALLSPHINXOPTS% %BUILDDIR%/html - if errorlevel 1 exit /b 1 - echo. - echo.Build finished. The HTML pages are in %BUILDDIR%/html. - goto end -) - -if "%1" == "dirhtml" ( - %SPHINXBUILD% -b dirhtml %ALLSPHINXOPTS% %BUILDDIR%/dirhtml - if errorlevel 1 exit /b 1 - echo. - echo.Build finished. The HTML pages are in %BUILDDIR%/dirhtml. - goto end -) - -if "%1" == "singlehtml" ( - %SPHINXBUILD% -b singlehtml %ALLSPHINXOPTS% %BUILDDIR%/singlehtml - if errorlevel 1 exit /b 1 - echo. - echo.Build finished. The HTML pages are in %BUILDDIR%/singlehtml. - goto end -) - -if "%1" == "pickle" ( - %SPHINXBUILD% -b pickle %ALLSPHINXOPTS% %BUILDDIR%/pickle - if errorlevel 1 exit /b 1 - echo. - echo.Build finished; now you can process the pickle files. - goto end -) - -if "%1" == "json" ( - %SPHINXBUILD% -b json %ALLSPHINXOPTS% %BUILDDIR%/json - if errorlevel 1 exit /b 1 - echo. - echo.Build finished; now you can process the JSON files. - goto end -) - -if "%1" == "htmlhelp" ( - %SPHINXBUILD% -b htmlhelp %ALLSPHINXOPTS% %BUILDDIR%/htmlhelp - if errorlevel 1 exit /b 1 - echo. - echo.Build finished; now you can run HTML Help Workshop with the ^ -.hhp project file in %BUILDDIR%/htmlhelp. - goto end -) - -if "%1" == "qthelp" ( - %SPHINXBUILD% -b qthelp %ALLSPHINXOPTS% %BUILDDIR%/qthelp - if errorlevel 1 exit /b 1 - echo. - echo.Build finished; now you can run "qcollectiongenerator" with the ^ -.qhcp project file in %BUILDDIR%/qthelp, like this: - echo.^> qcollectiongenerator %BUILDDIR%\qthelp\pytest-cookiecutterplugin_name.qhcp - echo.To view the help file: - echo.^> assistant -collectionFile %BUILDDIR%\qthelp\pytest-cookiecutterplugin_name.ghc - goto end -) - -if "%1" == "devhelp" ( - %SPHINXBUILD% -b devhelp %ALLSPHINXOPTS% %BUILDDIR%/devhelp - if errorlevel 1 exit /b 1 - echo. - echo.Build finished. - goto end -) - -if "%1" == "epub" ( - %SPHINXBUILD% -b epub %ALLSPHINXOPTS% %BUILDDIR%/epub - if errorlevel 1 exit /b 1 - echo. - echo.Build finished. The epub file is in %BUILDDIR%/epub. - goto end -) - -if "%1" == "latex" ( - %SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex - if errorlevel 1 exit /b 1 - echo. - echo.Build finished; the LaTeX files are in %BUILDDIR%/latex. - goto end -) - -if "%1" == "latexpdf" ( - %SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex - cd %BUILDDIR%/latex - make all-pdf - cd %~dp0 - echo. - echo.Build finished; the PDF files are in %BUILDDIR%/latex. - goto end -) - -if "%1" == "latexpdfja" ( - %SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex - cd %BUILDDIR%/latex - make all-pdf-ja - cd %~dp0 - echo. - echo.Build finished; the PDF files are in %BUILDDIR%/latex. - goto end -) - -if "%1" == "text" ( - %SPHINXBUILD% -b text %ALLSPHINXOPTS% %BUILDDIR%/text - if errorlevel 1 exit /b 1 - echo. - echo.Build finished. The text files are in %BUILDDIR%/text. - goto end -) - -if "%1" == "man" ( - %SPHINXBUILD% -b man %ALLSPHINXOPTS% %BUILDDIR%/man - if errorlevel 1 exit /b 1 - echo. - echo.Build finished. The manual pages are in %BUILDDIR%/man. 
- goto end -) - -if "%1" == "texinfo" ( - %SPHINXBUILD% -b texinfo %ALLSPHINXOPTS% %BUILDDIR%/texinfo - if errorlevel 1 exit /b 1 - echo. - echo.Build finished. The Texinfo files are in %BUILDDIR%/texinfo. - goto end -) - -if "%1" == "gettext" ( - %SPHINXBUILD% -b gettext %I18NSPHINXOPTS% %BUILDDIR%/locale - if errorlevel 1 exit /b 1 - echo. - echo.Build finished. The message catalogs are in %BUILDDIR%/locale. - goto end -) - -if "%1" == "changes" ( - %SPHINXBUILD% -b changes %ALLSPHINXOPTS% %BUILDDIR%/changes - if errorlevel 1 exit /b 1 - echo. - echo.The overview file is in %BUILDDIR%/changes. - goto end -) - -if "%1" == "linkcheck" ( - %SPHINXBUILD% -b linkcheck %ALLSPHINXOPTS% %BUILDDIR%/linkcheck - if errorlevel 1 exit /b 1 - echo. - echo.Link check complete; look for any errors in the above output ^ -or in %BUILDDIR%/linkcheck/output.txt. - goto end -) - -if "%1" == "doctest" ( - %SPHINXBUILD% -b doctest %ALLSPHINXOPTS% %BUILDDIR%/doctest - if errorlevel 1 exit /b 1 - echo. - echo.Testing of doctests in the sources finished, look at the ^ -results in %BUILDDIR%/doctest/output.txt. - goto end -) - -if "%1" == "coverage" ( - %SPHINXBUILD% -b coverage %ALLSPHINXOPTS% %BUILDDIR%/coverage - if errorlevel 1 exit /b 1 - echo. - echo.Testing of coverage in the sources finished, look at the ^ -results in %BUILDDIR%/coverage/python.txt. - goto end -) - -if "%1" == "xml" ( - %SPHINXBUILD% -b xml %ALLSPHINXOPTS% %BUILDDIR%/xml - if errorlevel 1 exit /b 1 - echo. - echo.Build finished. The XML files are in %BUILDDIR%/xml. - goto end -) - -if "%1" == "pseudoxml" ( - %SPHINXBUILD% -b pseudoxml %ALLSPHINXOPTS% %BUILDDIR%/pseudoxml - if errorlevel 1 exit /b 1 - echo. - echo.Build finished. The pseudo-XML files are in %BUILDDIR%/pseudoxml. - goto end -) - -:end diff --git a/docs/overview.rst b/docs/overview.rst deleted file mode 100644 index e215cb131..000000000 --- a/docs/overview.rst +++ /dev/null @@ -1,43 +0,0 @@ - -Overview -============= -pytest-splunk-addon is an open-source dynamic test plugin for Splunk Apps and Add-ons -which allows the user to test knowledge objects, CIM compatibility and index time properties. - -Support -------- - -* **Python**: 3.7 -* **Platforms**: Linux, Windows and MacOS - -Features --------- -* Generate tests for Splunk Knowledge objects in Splunk Technology Add-ons. - -* Generate tests for checking CIM compatibility in Splunk Technology Add-ons. - -* Generate tests for checking Splunk index-time properties in Splunk Technology Add-ons. - -* Validate your add-ons using Splunk + Docker. - -Release notes -------------- - -Find details about all the releases `here `_. - -Installation ------------- -pytest-splunk-addon can be installed via pip from PyPI: - -.. code-block:: console - - pip3 install pytest-splunk-addon - -Or, in any case, if pip is unavailable: - -.. code-block:: console - - 1. git clone https://github.com/splunk/pytest-splunk-addon.git - 2. cd pytest-splunk-addon - 3. pip install poetry - 4. poetry install diff --git a/docs/requirements.txt b/docs/requirements.txt deleted file mode 100644 index 5489a22f5..000000000 --- a/docs/requirements.txt +++ /dev/null @@ -1,3 +0,0 @@ -sphinx_rtd_theme -sphinx_panels --r ../requirements.txt \ No newline at end of file diff --git a/docs/sample_generator.md b/docs/sample_generator.md new file mode 100644 index 000000000..985383200 --- /dev/null +++ b/docs/sample_generator.md @@ -0,0 +1,355 @@ +# Data Generator + +To ingest samples into Splunk, plugin takes `pytest-splunk-addon-data.conf` as input. 
+The sample generation & ingestion takes place before executing the testcases.
+For index-time test cases, there are multiple metadata required about the sample file for which `pytest-splunk-addon-data.conf` must be created and provided to the pytest command.
+
+
+## pytest-splunk-addon-data.conf.spec
+
+**Default Values**:
+
+```
+[default]
+host_type = plugin
+input_type = default
+index = main
+sourcetype = pytest-splunk-addon
+source = pytest-splunk-addon:{{input_type}}
+sourcetype_to_search = {{sourcetype}}
+sample_count = 1
+timestamp_type = event
+count = 0
+earliest = now
+latest = now
+timezone = 0000
+breaker = {{regex}}
+host_prefix = {{host_prefix}}
+```
+
+**[<sample file name>]**
+
+- The stanza can contain the sample File Name or Regex to match multiple sample files.
+- The sample file should be located in the samples folder under the Add-on package.
+- Example 1: \[sample_file.samples\] would collect samples from the file sample_file.samples.
+- Example 2: \[sample\_\*.samples\] would collect samples from both sample_file.samples and sample_sample.samples.
+
+**sourcetype = <sourcetype>**
+
+- sourcetype to be assigned to the sample events
+
+**source = <source>**
+
+- source to be assigned to the sample events
+  - default value: pytest-splunk-addon:{{input_type}}
+
+**sourcetype_to_search = <sourcetype>**
+
+- The sourcetype used to search events
+  - This would be different than the sourcetype= param in cases where TRANSFORMS is used to update the sourcetype at index time.
+
+**host_type = plugin | event**
+
+- This key determines if host is assigned from event or default host should be assigned by plugin.
+- If the value is plugin, the plugin will generate host with format of "stanza_{count}" to uniquely identify the events.
+- If the value is event, the host field should be provided for a token using "token.<n>.field = host".
+
+**input_type = modinput | scripted_input | syslog_tcp | file_monitor | windows_input | uf_file_monitor | default**
+
+- The input_type used in addon to ingest data of a sourcetype used in stanza.
+- The way with which the sample data is ingested in Splunk depends on the input_type. The most similar ingesting approach is used for each input_type to get accurate index-time testing.
+- In input_type=uf_file_monitor, universal forwarder will use file monitor to read the event and then it will send data to the indexer.
+- For example, if in an Add-on a sourcetype "alert" is ingested through syslog in a live environment, provide input_type=syslog_tcp.
+
+> **_WARNING:_** uf_file_monitor input_type will only work with splunk-type=docker.
+
+**index = <index>**
+
+- The index used to ingest the data.
+- The index must be configured beforehand.
+- If the index is not available then the data will not get ingested into Splunk and a warning message will be printed.
+- Custom index is not supported for syslog_tcp or syslog_udp
+
+**sample_count = <count>**
+
+- The no. of events present in the sample file.
+- This parameter will be used to calculate the total number of events which will be generated from the sample file.
+- If `input_type = modinput`, do not provide this parameter.
+
+**expected_event_count = <count>**
+
+- The no. of events this sample stanza should generate.
+- The parameter will be used to test the line breaking in index-time tests.
+- To calculate expected_event_count, 2 parameters can be used: 1) Number of events in the sample file. 2) Number of values of replacementType=all tokens in the sample file. Both the parameters can be multiplied to get expected_event_count.
+- For example, if sample contains 3 lines & a token has replacement_type=all and replacement has list of 2 values, then 6 events will be generated.
+- This parameter is optional; if it is not provided by the user, it will be calculated automatically by the pytest-splunk-addon.
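+A small worked example (hypothetical sample file and token): if the sample file
+contains 3 events and one token uses replacementType = all with a 2-value list,
+the plugin expects 3 x 2 = 6 events after ingestion:
+
+```console
+[sample_file.samples]
+sample_count = 3
+token.0.token = ##status##
+token.0.replacementType = all
+token.0.replacement = list["success","failure"]
+# optional; would be computed as 3 * 2 = 6 if omitted
+expected_event_count = 6
+```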
+**timestamp_type = plugin | event**
+
+- This key determines if _time is assigned from event or default _time should be assigned by plugin.
+- The parameter will be used to test the time extraction in index-time tests.
+- If value is plugin, the plugin will assign the time while ingesting the event.
+- If value is event, that means the time will be extracted from the event and therefore, there should be a token provided with token.<n>.field = _time.
+
+**breaker = <regex>**
+
+- The breaker is used to break down the sample file into multiple events, based on the regex provided.
+- This parameter is optional. If it is not provided by the user, the events will be ingested into Splunk,
+  as per the *input_type* provided.
+
+**host_prefix = <host_prefix>**
+
+- This param is used as an identification for the **host** field, for the events which are ingested using SC4S.
+
+## Token replacement settings
+
+The following replacementType -> replacement values are supported
+
+| ReplacementType | Replacement                                                  |
+| --------------- | ------------------------------------------------------------ |
+| static          | <static_value>                                               |
+| timestamp       | <strptime format>                                            |
+| random          | ipv4                                                         |
+| random          | ipv6                                                         |
+| random          | mac                                                          |
+| random          | guid                                                         |
+| random          | integer\[<start>:<end>\]                                     |
+| random          | float\[<start>:<end>\]                                       |
+| random          | list\[<"," separated list>\]                                 |
+| random          | hex(\[integer\])                                             |
+| random          | file\[<file_path>:<column_number>\]                          |
+| random          | dest\["host", "ipv4", "ipv6", "fqdn"\]                       |
+| random          | src\["host", "ipv4", "ipv6", "fqdn"\]                        |
+| random          | host\["host", "ipv4", "ipv6", "fqdn"\]                       |
+| random          | dvc\["host", "ipv4", "ipv6", "fqdn"\]                        |
+| random          | user\["name", "email", "domain_user", "distinquised_name"\]  |
+| random          | url\["ip_host", "fqdn_host", "path", "query", "protocol"\]   |
+| random          | email                                                        |
+| random          | src_port                                                     |
+| random          | dest_port                                                    |
+| file            | <file_path>:<column_number>                                  |
+| all             | integer\[<start>:<end>\]                                     |
+| all             | list\[<"," separated list>\]                                 |
+| all             | file\[<file_path>:<column_number>\]                          |
+
+**token.<n>.token = <regular expression>**
+
+- "n" is a number starting at 0, and increasing by 1.
+- PCRE expression used to identify the segment for replacement.
+- If one or more capture groups are present, the replacement will be performed on group 1.
+
+**token.<n>.replacementType = static | timestamp | random | all | file**
+
+- "n" is a number starting at 0, and increasing by 1.
+- For static, the token will be replaced with the value specified in the replacement setting.
+- For timestamp, the token will be replaced with the strptime specified in the replacement setting. Strptime directive: <https://docs.python.org/3/library/datetime.html#strftime-and-strptime-behavior>
+- For random, the token will be replaced with a randomly picked type-aware value.
+- For all, for each possible replacement value a new event will be generated and the token will be replaced with it. The configuration can be used where a token replacement contains multiple templates/values and all of the values are important and should be ingested at least once. The number of events will be multiplied by the number of values in the replacement. For example, if sample contains 3 lines & a token replacement has a list of 2 values, then 6 events will be generated. If replacementType='all' is not supported for a replacement, then by default the plugin will consider replacementType="random".
+- For file, the token will be replaced with a random value retrieved from a file specified in the replacement setting.
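+For illustration, the three most common replacement types side by side (hypothetical
+token names and values):
+
+```console
+token.0.token = ##timestamp##
+token.0.replacementType = timestamp
+token.0.replacement = %Y-%m-%dT%H:%M:%S
+
+token.1.token = ##user_name##
+token.1.replacementType = random
+token.1.replacement = user["name"]
+
+token.2.token = ##vendor_action##
+token.2.replacementType = static
+token.2.replacement = allowed
+```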
+
+**token.\<n\>.replacement = \<static_value\> | \<strptime_format\> | \["list","of","values"\] | guid | ipv4 | ipv6 | mac | integer\[\<start\>:\<end\>\] | float\[\<start\>:\<end\>\] | hex(\<i\>) | \<file_path\> | \<file_path\>:\<column_number\> | host | src | dest | dvc | user | url | email | src_port | dest_port**
+
+- "n" is a number starting at 0, and increasing by 1.
+
+- For \<static_value\>, the token will be replaced with the value specified.
+
+- For \<strptime_format\>, a strptime-formatted string to replace the timestamp with.
+
+- For guid, the token will be replaced with a random GUID value.
+
+- For ipv4, the token will be replaced with a random valid IPv4 Address (e.g. 10.10.200.1).
+
+- For ipv6, the token will be replaced with a random valid IPv6 Address (e.g. c436:4a57:5dea:1035:7194:eebb:a210:6361).
+
+- For mac, the token will be replaced with a random valid MAC Address (e.g. 6e:0c:51:c6:c6:3a).
+
+- For integer\[\<start\>:\<end\>\], the token will be replaced with a random integer between the start and end values, where \<start\> is a number greater than 0 and \<end\> is a number greater than 0 and greater than or equal to \<start\>. For replacement=all, one event will be generated for each integer value within the range \<start\> and \<end\>.
+
+- For float\[\<start\>:\<end\>\], the token will be replaced with a random float between the start and end values, where \<end\> is a number greater than or equal to \<start\>. For floating point numbers, precision will be based on the precision specified in \<start\>. For example, if we specify 1.0, precision will be one digit; if we specify 1.0000, precision will be four digits.
+
+- For hex(\<i\>), the token will be replaced with \<i\> hexadecimal characters \[0-9A-F\], where "i" is a number greater than 0.
+
+- For list, the token will be replaced with a random member of the JSON list provided. For replacement=all, one event will be generated for each value within the list.
+
+- For \<file_path\>, the token will be replaced with a random line in the replacement file.
+
+  - The replacement file name should be a fully qualified path (e.g. \$SPLUNK_HOME/etc/apps/windows/samples/users.list).
+  - Windows separators should contain double backslashes "\\" (e.g. \$SPLUNK_HOME\\etc\\apps\\windows\\samples\\users.list).
+  - Unix separators will work on Windows and vice-versa.
+  - Column numbers in mvfile references are indexed at 1, meaning the first column is column 1, not 0.
+
+- For host\["host", "ipv4", "ipv6", "fqdn"\], 4 types of host replacement are supported. Either one or multiple values from the list can be provided to randomly replace the token.
+
+  - For host\["host"\], the token will be replaced with a sequential host value with pattern "host_sample_host\_".
+  - For host\["ipv4"\], the token will be replaced with a random valid IPv4 Address.
+  - For host\["ipv6"\], the token will be replaced with a random valid IPv6 Address from the fdee:1fe4:2b8c:3264:0:0:0:0 range.
+  - For host\["fqdn"\], the token will be replaced with a sequential fqdn value with pattern "host_sample_host.sample_domain.com".
+
+- For src\["host", "ipv4", "ipv6", "fqdn"\], 4 types of src replacement are supported. Either one or multiple values from the list can be provided to randomly replace the token.
+
+  - For src\["host"\], the token will be replaced with a sequential host value with pattern "src_sample_host\_".
+  - For src\["ipv4"\], the token will be replaced with a random valid IPv4 Address from the 10.1.0.0 range.
+  - For src\["ipv6"\], the token will be replaced with a random valid IPv6 Address from the fdee:1fe4:2b8c:3261:0:0:0:0 range.
+  - For src\["fqdn"\], the token will be replaced with a sequential fqdn value with pattern "src_sample_host.sample_domain.com".
+
+- For dest\["host", "ipv4", "ipv6", "fqdn"\], 4 types of dest replacement are supported. Either one or multiple values from the list can be provided to randomly replace the token.
+
+  - For dest\["host"\], the token will be replaced with a sequential host value with pattern "dest_sample_host\_".
+  - For dest\["ipv4"\], the token will be replaced with a random valid IPv4 Address from the 10.100.0.0 range.
+  - For dest\["ipv6"\], the token will be replaced with a random valid IPv6 Address from the fdee:1fe4:2b8c:3262:0:0:0:0 range.
+  - For dest\["fqdn"\], the token will be replaced with a sequential fqdn value with pattern "dest_sample_host.sample_domain.com".
+
+- For dvc\["host", "ipv4", "ipv6", "fqdn"\], 4 types of dvc replacement are supported. Either one or multiple values from the list can be provided to randomly replace the token.
+
+  - For dvc\["host"\], the token will be replaced with a sequential host value with pattern "dvc_sample_host\_".
+  - For dvc\["ipv4"\], the token will be replaced with a random valid IPv4 Address from the 172.16.0-50.0 range.
+  - For dvc\["ipv6"\], the token will be replaced with a random valid IPv6 Address from the fdee:1fe4:2b8c:3263:0:0:0:0 range.
+  - For dvc\["fqdn"\], the token will be replaced with a sequential fqdn value with pattern "dvc_sample_host.sample_domain.com".
+
+- For user\["name", "email", "domain_user", "distinquised_name"\], 4 types of user replacement are supported. Either one or multiple values from the list can be provided to randomly replace the token.
+
+  - For user\["name"\], the token will be replaced with a random name with pattern "user".
+  - For user\["email"\], the token will be replaced with a random email with pattern "user@email.com".
+  - For user\["domain_user"\], the token will be replaced with a random domain user with pattern sample_domain.com\\user.
+  - For user\["distinquised_name"\], the token will be replaced with a distinguished user with pattern CN=user.
+
+- For url\["full", "ip_host", "fqdn_host", "path", "query", "protocol"\], 6 types of url replacement are supported. Either one or multiple values from the list can be provided to randomly replace the token.
+
+  - For url\["ip_host"\], the url to be replaced will contain an ip-based address.
+  - For url\["fqdn_host"\], the url to be replaced will contain an fqdn address.
+  - For url\["path"\], the url to be replaced will contain a path with pattern "/".
+  - For url\["query"\], the url to be replaced will contain a query with pattern "?=".
+  - For url\["protocol"\], the url to be replaced will contain a protocol with pattern "://".
+  - For url\["full"\], the url will contain all the parts mentioned above, i.e. ip_host, fqdn_host, path, query, protocol.
+  - Example 1: url\["ip_host", "path", "query"\] will be replaced with pattern /?=
+  - Example 2: url\["fqdn_host", "path", "protocol"\] will be replaced with pattern :///
+  - Example 3: url\["ip_host", "fqdn_host", "path", "query", "protocol"\] will be replaced with pattern :///?=
+  - Example 4: url\["full"\] will be replaced the same as example 3.
+
+- For email, the token will be replaced with a random email. If the same sample has a user token as well, the email and user tokens will be replaced with co-related values.
+
+- For src_port, the token will be replaced with a random source port value between 4000 and 5000.
+
+- For dest_port, the token will be replaced with a random dest port value from (80, 443, 25, 22, 21).
+
+**token.\<n\>.field = \<field_name\>**
+
+- "n" is a number starting at 0, and increasing by 1.
+- Assigns the field_name for which the tokenized value will be extracted.
+- For this [key field](index_time_tests.md#test-scenarios), the index-time test cases will be generated.
+- Make sure props.conf contains the extractions to extract the value from the field.
+- If this parameter is not provided, the default value will be the same as the token name.
+
+> **_NOTE:_** Make sure the token name is not the same as any of the [key field](index_time_tests.md#test-scenarios) values.
+
+
+## Example
+
+```console
+[sample_file.samples]
+
+sourcetype = juniper:junos:secintel:structured
+sourcetype_to_search = juniper:junos:secintel:structured
+source = pytest-splunk-addon:syslog_tcp
+host_type = plugin
+input_type = syslog_tcp
+index = main
+timestamp_type = event
+sample_count = 10
+
+token.0.token = (\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}\.\d+Z)
+token.0.replacementType = timestamp
+token.0.replacement = %Y-%m-%dT%H:%M:%S
+
+token.1.token = ##token1##
+token.1.replacementType = static
+token.1.replacement = sample_value
+
+token.2.token = ##Src_Addr##
+token.2.replacementType = random
+token.2.replacement = src["ipv4"]
+token.2.field = src
+
+token.3.token = ##Dest_Addr##
+token.3.replacementType = random
+token.3.replacement = dest["ipv4"]
+
+token.4.token = ##Src_Port##
+token.4.replacementType = random
+token.4.replacement = src_port
+token.4.field = src_port
+
+token.5.token = ##Dest_Port##
+token.5.replacementType = random
+token.5.replacement = dest_port
+
+token.6.token = ##dvc##
+token.6.replacementType = random
+token.6.replacement = dvc["fqdn","host"]
+token.6.field = dvc
+
+token.7.token = ##User##
+token.7.replacementType = random
+token.7.replacement = user["name"]
+
+token.8.token = ##HTTP_Host##
+token.8.replacementType = random
+token.8.replacement = host["fqdn"]
+
+token.9.token = ##ReferenceIDhex##
+token.9.replacementType = random
+token.9.replacement = hex(8)
+
+token.10.token = ##Ip##
+token.10.replacementType = random
+token.10.replacement = ipv4
+
+token.11.token = ##Ipv6##
+token.11.replacementType = random
+token.11.replacement = ipv6
+
+token.12.token = ##Name##
+token.12.replacementType = random
+token.12.replacement = list["abc.exe","def.exe","efg.exe"]
+
+token.13.token = ##Name##
+token.13.replacementType = all
+token.13.replacement = list["abc.exe","def.exe","efg.exe"]
+
+token.14.token = ##email##
+token.14.replacementType = random
+token.14.replacement = email
+
+token.15.token = ##mac##
+token.15.replacementType = random
+token.15.replacement = mac
+
+token.16.token = ##memUsedPct##
+token.16.replacementType = random
+token.16.replacement = float[1.0:99.0]
+
+token.17.token = ##guid##
+token.17.replacementType = random
+token.17.replacement = guid
+
+token.18.token = ##size##
+token.18.replacementType = random
+token.18.replacement = integer[1:10]
+
+token.19.token = ##integer_all##
+token.19.replacementType = all
+token.19.replacement = integer[1:5]
+
+token.20.token = ##url##
+token.20.replacementType = random
+token.20.replacement = url["ip_host", "fqdn_host", "path", "query", "protocol"]
+
+token.21.token = ##DHCP_HOST##
+token.21.replacementType = random
+token.21.replacement = file[/path/linux.host.sample]
+
+token.22.token = ##DHCP_HOST_all##
+token.22.replacementType = all
+token.22.replacement = file[/path/linux.host.sample]
+```
diff --git a/docs/sample_generator.rst b/docs/sample_generator.rst
deleted file mode 100644
index 0019a72be..000000000
--- a/docs/sample_generator.rst
+++ /dev/null
@@ -1,350 +0,0 @@
-Data Generator
-===============
-
-To ingest samples into Splunk, plugin takes `pytest-splunk-addon-data.conf` as input.
-The sample generation & ingestion takes place before executing the testcases. -For index-time test cases, there are multiple metadata required about the sample file for which `pytest-splunk-addon-data.conf` must be created and provided to the pytest command. - -To create the `pytest-splunk-addon-data.conf` file, a utility can be used. -Detailed steps on how to create the conf using utility can be found :ref:`here `. - -.. _conf_spec: - -pytest-splunk-addon-data.conf.spec ------------------------------------------------- -**Default Values**:: - - [default] - host_type = plugin - input_type = default - index = main - sourcetype = pytest-splunk-addon - source = pytest-splunk-addon:{{input_type}} - sourcetype_to_search = {{sourcetype}} - sample_count = 1 - timestamp_type = event - count = 0 - earliest = now - latest = now - timezone = 0000 - breaker = {{regex}} - host_prefix = {{host_prefix}} - -[] - * The stanza can contain the sample File Name or Regex to match multiple sample files. - * The sample file should be located in samples folder under the Add-on package. - * Example1: [sample_file.samples] would collect samples from file sample_file.samples - * Example2: [sample_*.samples] would collect samples from both sample_file.samples and sample_sample.samples. - -sourcetype = - * sourcetype to be assigned to the sample events - -source = - * source to be assigned to the sample events - * default value: pytest-splunk-addon:{{input_type}} - -sourcetype_to_search = - * The sourcetype used to search events - * This would be different then sourcetype= param in cases where TRANSFORMS is used to update the sourcetype index time. - -host_type = plugin | event - * This key determines if host is assigned from event or default host should be assigned by plugin. - * If the value is plugin, the plugin will generate host with format of "stanza_{count}" to uniquely identify the events. - * If the value is event, the host field should be provided for a token using "token..field = host". - -input_type = modinput | scripted_input | syslog_tcp | file_monitor | windows_input | uf_file_monitor | default - * The input_type used in addon to ingest data of a sourcetype used in stanza. - * The way with which the sample data is ingested in Splunk depends on Splunk. The most similar ingesting approach is used for each input_type to get accurate index-time testing. - * In input_type=uf_file_monitor, universal forwarder will use file monitor to read event and then it will send data to indexer. - * For example, in an Add-on, a sourcetype "alert" is ingested through syslog in live environment, provide input_type=syslog_tcp. - - .. warning:: - uf_file_monitor input_type will only work with splunk-type=docker. - -index = - * The index used to ingest the data. - * The index must be configured beforehand. - * If the index is not available then the data will not get ingested into Splunk and a warning message will be printed. - * Custom index is not supported for syslog_tcp or syslog_udp - -sample_count = - * The no. of events present in the sample file. - * This parameter will be used to calculate the total number of events which will be generated from the sample file. - * If `input_type = modinput`, do not provide this parameter. - -expected_event_count = - * The no. of events this sample stanza should generate. - * The parameter will be used to test the line breaking in index-time tests. - * To calculate expected_event_count 2 parameters can be used. 1) Number of events in the sample file. 
2) Number of values of replacementType=all tokens in the sample file. Both the parameters can be multiplied to get expected_event_count. - * For example, if sample contains 3 lines & a token has replacement_type=all and replacement has list of 2 values, then 6 events will be generated. - * This parameter is optional, if it is not provided by the user, it will be calculated automatically by the pytest-splunk-addon. - -timestamp_type = plugin | event - * This key determines if _time is assigned from event or default _time should be assigned by plugin. - * The parameter will be used to test the time extraction in index-time tests. - * If value is plugin, the plugin will assign the time while ingesting the event. - * If value is event, that means the time will be extracted from event and therfore, there should be a token provided with token..field = _time. - -breaker = - * The breaker is used to breakdown the sample file into multiple events, based on the regex provided. - * This parameter is optional. If it is not provided by the user, the events will be ingested into Splunk, - as per the *input_type* provided. - -host_prefix = - * This param is used as an identification for the **host** field, for the events which are ingested using SC4S. - -Token replacement settings ------------------------------ -The following replacementType -> replacement values are supported - -+-----------------+-------------------------------------------------------------------------------+ -| ReplacementType | Replacement | -+=================+===============================================================================+ -| static | | -+-----------------+-------------------------------------------------------------------------------+ -| timestamp | | -+-----------------+-------------------------------------------------------------------------------+ -| random | ipv4 | -+-----------------+-------------------------------------------------------------------------------+ -| random | ipv6 | -+-----------------+-------------------------------------------------------------------------------+ -| random | mac | -+-----------------+-------------------------------------------------------------------------------+ -| random | guid | -+-----------------+-------------------------------------------------------------------------------+ -| random | integer[:] | -+-----------------+-------------------------------------------------------------------------------+ -| random | float[:] | -+-----------------+-------------------------------------------------------------------------------+ -| random | list[< "," separated list>] | -+-----------------+-------------------------------------------------------------------------------+ -| random | hex([integer]) | -+-----------------+-------------------------------------------------------------------------------+ -| random | file[:] | -+-----------------+-------------------------------------------------------------------------------+ -| random | dest["host", "ipv4", "ipv6", "fqdn"] | -+-----------------+-------------------------------------------------------------------------------+ -| random | src["host", "ipv4", "ipv6", "fqdn"] | -+-----------------+-------------------------------------------------------------------------------+ -| random | host["host", "ipv4", "ipv6", "fqdn"] | -+-----------------+-------------------------------------------------------------------------------+ -| random | dvc["host", "ipv4", "ipv6", "fqdn"] | 
-+-----------------+-------------------------------------------------------------------------------+ -| random | user["name", "email", "domain_user", "distinquised_name"] | -+-----------------+-------------------------------------------------------------------------------+ -| random | url["ip_host", "fqdn_host", "path", "query", "protocol"] | -+-----------------+-------------------------------------------------------------------------------+ -| random | email | -+-----------------+-------------------------------------------------------------------------------+ -| random | src_port | -+-----------------+-------------------------------------------------------------------------------+ -| random | dest_port | -+-----------------+-------------------------------------------------------------------------------+ -| file | : | -+-----------------+-------------------------------------------------------------------------------+ -| all | integer[:] | -+-----------------+-------------------------------------------------------------------------------+ -| all | list[< , separated list>] | -+-----------------+-------------------------------------------------------------------------------+ -| all | file[:] | -+-----------------+-------------------------------------------------------------------------------+ - -token..token = - * "n" is a number starting at 0, and increasing by 1. - * PCRE expression used to identify segment for replacement. - * If one or more capture groups are present the replacement will be performed on group 1. - - -token..replacementType = static | timestamp | random | all | file - * "n" is a number starting at 0, and increasing by 1. - * For static, the token will be replaced with the value specified in the replacement setting. - * For timestamp, the token will be replaced with the strptime specified in the replacement setting. Strptime directive: https://docs.python.org/2/library/datetime.html#strftime-and-strptime-behavior - * For random, the token will be replaced with a randomly picked type-aware value - * For all, For each possible replacement value, a new event will be generated and the token will be replaced with it. The configuration can be used where a token replacement contains multiple templates/values and all of the values are important and should be ingested at least once. The number of events will be multiplied by the number of values in the replacement. For example, if sample contains 3 lines & a token replacement has list of 2 values, then 6 events will be generated. For a replacement if replacementType='all' is not supported, then be default plugin will consider replacementType="random". - * For file, the token will be replaced with a random value retrieved from a file specified in the replacement setting. - - -token..replacement = | | ["list","of","values"] | guid | ipv4 | ipv6 | mac | integer[:] | float[:] | hex() | | : | host | src | dest | dvc | user | url | email | src_port | dest_port - * "n" is a number starting at 0, and increasing by 1. - * For , the token will be replaced with the value specified. - * For , a strptime formatted string to replace the timestamp with - * For guid, the token will be replaced with a random GUID value. - * For ipv4, the token will be replaced with a random valid IPv4 Address (i.e. 10.10.200.1). - * For ipv6, the token will be replaced with a random valid IPv6 Address (i.e. c436:4a57:5dea:1035:7194:eebb:a210:6361). - * For mac, the token will be replaced with a random valid MAC Address (i.e. 6e:0c:51:c6:c6:3a). 
- * For integer[:], the token will be replaced with a random integer between start and end values where is a number greater than 0 and is a number greater than 0 and greater than or equal to . For replacement=all, one event will be generated for each value of integer within range and . - * For float[:], the token will be replaced with a random float between start and end values where is a number greater than or equal to . For floating point numbers, precision will be based off the precision specified in . For example, if we specify 1.0, precision will be one digit, if we specify 1.0000, precision will be four digits. - * For hex(), the token will be replaced with i number of Hexadecimal characters [0-9A-F] where "i" is a number greater than 0. - * For list, the token will be replaced with a random member of the JSON list provided. For replacement=all, one event will be generated for each value within the list - * For , the token will be replaced with a random line in the replacement file. - - * Replacement file name should be a fully qualified path (i.e. $SPLUNK_HOME/etc/apps/windows/samples/users.list). - * Windows separators should contain double forward slashes "\\" (i.e. $SPLUNK_HOME\\etc\\apps\\windows\\samples\\users.list). - * Unix separators will work on Windows and vice-versa. - * Column numbers in mvfile references are indexed at 1, meaning the first column is column 1, not 0. - * For host["host", "ipv4", "ipv6", "fqdn"], 4 types of host replacement are supported. Either one or multiple from the list can be provided to randomly replace the token. - - * For host["host"], the token will be replaced with a sequential host value with pattern "host_sample_host_". - * For host["ipv4"], the token will be replaced with a random valid IPv4 Address. - * For host["ipv6"], the token will be replaced with a random valid IPv6 Address from fdee:1fe4:2b8c:3264:0:0:0:0 range. - * For host["fqdn"], the token will be replaced with a sequential fqdn value with pattern "host_sample_host.sample_domain.com". - * For src["host", "ipv4", "ipv6", "fqdn"], 4 types of src replacement are supported. Either one or multiple from the list can be provided to randomly replace the token. - - * For src["host"], the token will be replaced with a sequential host value with pattern "src_sample_host_". - * For src["ipv4"], the token will be replaced with a random valid IPv4 Address from 10.1.0.0 range. - * For src["ipv6"], the token will be replaced with a random valid IPv6 Address from fdee:1fe4:2b8c:3261:0:0:0:0 range. - * For src["fqdn"], the token will be replaced with a sequential fqdn value with pattern "src_sample_host.sample_domain.com". - * For dest["host", "ipv4", "ipv6", "fqdn"], 4 types of dest replacement are supported. Either one or multiple from the list can be provided to randomly replace the token. - - * For dest["host"], the token will be replaced with a sequential host value with pattern "dest_sample_host_". - * For dest["ipv4"], the token will be replaced with a random valid IPv4 Address from 10.100.0.0 range. - * For dest["ipv6"], the token will be replaced with a random valid IPv6 Address from fdee:1fe4:2b8c:3262:0:0:0:0 range. - * For dest["fqdn"], the token will be replaced with a sequential fqdn value with pattern "dest_sample_host.sample_domain.com". - * For dvc["host", "ipv4", "ipv6", "fqdn"], 4 types of dvc replacement are supported. Either one or multiple from the list can be provided to randomly replace the token. 
- - * For dvc["host"], the token will be replaced with a sequential host value with pattern "dvc_sample_host_". - * For dvc["ipv4"], the token will be replaced with a random valid IPv4 Address from 172.16.0-50.0 range. - * For dvc["ipv6"], the token will be replaced with a random valid IPv6 Address from fdee:1fe4:2b8c:3263:0:0:0:0 range. - * For dvc["fqdn"], the token will be replaced with a sequential fqdn value with pattern "dvc_sample_host.sample_domain.com". - * For user["name", "email", "domain_user", "distinquised_name"], 4 types of user replacement are supported. Either one or multiple from the list can be provided to randomly replace the token. - - * For user["name"], the token will be replaced with a random name with pattern "user". - * For user["email"], the token will be replaced with a random email with pattern "user@email.com". - * For user["domain_user"], the token will be replaced with a random domain user pattern sample_domain.com\user. - * For user["distinquised_name"], the token will be replaced with a distinquised user with pattern CN=user. - * For url["full", "ip_host", "fqdn_host", "path", "query", "protocol"], 6 types of url replacement are supported. Either one or multiple from the list can be provided to randomly replace the token. - - * For url["ip_host"], the url to be replaced will contain ip based address. - * For url["fqdn_host"], the url to be replaced will contain fqdn address. - * For path["path"], the url to be replaced will contain path with pattern "/". - * For url["query"], the url to be replaced will contain query with pattern "?=". - * For url["protocol"], the url to be replaced will contain protocol with pattern "://". - * For url["full"], the url contain all the parts mentioned above i.e. ip_host, fqdn_host, path, query, protocol. - * Example 1: url["ip_host", "path", "query"], will be replaced with pattern /?= - * Example 2: url["fqdn_host", "path", "protocol"], will be replaced with pattern :/// - * Example 3: url["ip_host", "fqdn_host", "path", "query", "protocol"], will be replaced with pattern :///?= - * Example 4: url["full"], will be replaced same as example 3. - * For email, the token will be replaced with a random email. If the same sample has a user token as well, the email and user tokens will be replaced with co-related values. - * For src_port, the token will be replaced with a random source port value between 4000 and 5000 - * For dest_port, the token will be replaced with a random dest port value from (80,443,25,22,21) - -token..field = - * "n" is a number starting at 0, and increasing by 1. - * Assign the field_name for which the tokenized value will be extracted. - * For this :ref:`key fields `, the index time test cases will be generated. - * Make sure props.conf contains extractions to extract the value from the field. - * If this parameter is not provided, the default value will be same as the token name. - -.. note:: - Make sure token name is not same as that any of :ref:`key field ` values. - - -Example ---------- -.. 
code-block:: console
-
-    [sample_file.samples]
-
-    sourcetype = juniper:junos:secintel:structured
-    sourcetype_to_search = juniper:junos:secintel:structured
-    source = pytest-splunk-addon:syslog_tcp
-    host_type = plugin
-    input_type = syslog_tcp
-    index = main
-    timestamp_type = event
-    sample_count = 10
-
-    token.0.token = (\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}\.\d+Z)
-    token.0.replacementType = timestamp
-    token.0.replacement = %Y-%m-%dT%H:%M:%S
-
-    token.1.token = ##token1##
-    token.1.replacementType = static
-    token.1.replacement = sample_value
-
-    token.2.token = ##Src_Addr##
-    token.2.replacementType = random
-    token.2.replacement = src["ipv4"]
-    token.2.field = src
-
-    token.3.token = ##Dest_Addr##
-    token.3.replacementType = random
-    token.3.replacement = dest["ipv4"]
-
-    token.4.token = ##Src_Port##
-    token.4.replacementType = random
-    token.4.replacement = src_port
-    token.4.field = src_port
-
-    token.5.token = ##Dest_Port##
-    token.5.replacementType = random
-    token.5.replacement = dest_port
-
-    token.6.token = ##dvc##
-    token.6.replacementType = random
-    token.6.replacement = dvc["fqdn","host"]
-    token.6.field = dvc
-
-    token.7.token = ##User##
-    token.7.replacementType = random
-    token.7.replacement = user["name"]
-
-    token.8.token = ##HTTP_Host##
-    token.8.replacementType = random
-    token.8.replacement = host["fqdn"]
-
-    token.9.token = ##ReferenceIDhex##
-    token.9.replacementType = random
-    token.9.replacement = hex(8)
-
-    token.10.token = ##Ip##
-    token.10.replacementType = random
-    token.10.replacement = ipv4
-
-    token.11.token = ##Ipv6##
-    token.11.replacementType = random
-    token.11.replacement = ipv6
-
-    token.12.token = ##Name##
-    token.12.replacementType = random
-    token.12.replacement = list["abc.exe","def.exe","efg.exe"]
-
-    token.13.token = ##Name##
-    token.13.replacementType = all
-    token.13.replacement = list["abc.exe","def.exe","efg.exe"]
-
-    token.14.token = ##email##
-    token.14.replacementType = random
-    token.14.replacement = email
-
-    token.15.token = ##mac##
-    token.15.replacementType = random
-    token.15.replacement = mac
-
-    token.16.token = ##memUsedPct##
-    token.16.replacementType = random
-    token.16.replacement = float[1.0:99.0]
-
-    token.17.token = ##guid##
-    token.17.replacementType = random
-    token.17.replacement = guid
-
-    token.18.token = ##size##
-    token.18.replacementType = random
-    token.18.replacement = integer[1:10]
-
-    token.19.token = ##integer_all##
-    token.19.replacementType = all
-    token.19.replacement = integer[1:5]
-
-    token.20.token = ##url##
-    token.20.replacementType = random
-    token.20.replacement = url["ip_host", "fqdn_host", "path", "query", "protocol"]
-
-    token.21.token = ##DHCP_HOST##
-    token.21.replacementType = random
-    token.21.replacement = file[/path/linux.host.sample]
-
-    token.22.token = ##DHCP_HOST_all##
-    token.22.replacementType = all
-    token.22.replacement = file[/path/linux.host.sample]
diff --git a/docs/troubleshoot.md b/docs/troubleshoot.md
new file mode 100644
index 000000000..68b515003
--- /dev/null
+++ b/docs/troubleshoot.md
@@ -0,0 +1,39 @@
+# Troubleshooting
+
+**1. Test Case takes forever to run when using splunk-type=external**
+
+  - Check that all the Splunk instance information provided in the arguments is present and correct.
+  - `--splunk-host`, `--splunk-port`, `--splunk-user` and `--splunk-password` are required. If not provided, default values are used, which may not work for every setup.
+  - Make sure the Splunk instance is up and running, and that the Splunk server's management port is accessible from the test machine.
+
+**2. Getting No such file or directory during test collection.**
+
+  - Check if the path provided in `--splunk-app` exists.
+
+**3. No tests generated for any fields.**
+
+  - Make sure the directory mentioned in `--splunk-app` follows the Splunk app structure and contains all the required configuration files in the default directory.
+
+**4. Getting Couldn't find a version that satisfies the requirement when installing pytest-splunk-addon using pip.**
+
+  - Use `pip3 install pytest-splunk-addon` and make sure you are using Python 3.7.
+
+**5. While executing test cases on Docker, all the test cases abort with the following setup failure:**
+
+  - Docker-compose.yml not found.
+
+    - Provide a docker-compose.yml in the root of the current working directory.
+
+  - Couldn't connect to Docker daemon.
+
+    - Restart the Docker daemon.
+
+  - Service 'splunk' failed to build.
+
+    - Check your internet connection.
+    - Try `docker pull splunk/splunk`; once this succeeds, the test cases can be executed.
+
+**6. Only no-dash-no-empty test cases are passing**
+
+  - If splunk-type is docker, make sure the add-on gets installed in the Docker container; if external, install the add-on on the Splunk instance.
+  - Make sure to configure the inputs to collect data, or check whether sufficient data required for testing is generated by the pytest-splunk-addon.
diff --git a/docs/troubleshoot.rst b/docs/troubleshoot.rst
deleted file mode 100644
index df461a307..000000000
--- a/docs/troubleshoot.rst
+++ /dev/null
@@ -1,43 +0,0 @@
-Troubleshooting
-===================
-
-**1. Test Case takes forever to run when using splunk-type=external**
-
- - Check all the Splunk instance information provided in arguments are present and correct.
- - ``--splunk-host``, ``--splunk-port``, ``--splunk-user`` and ``--splunk-password`` are required. If not provided, default values are considered which may not work for every setup.
- - Make sure the Splunk instance is up and running, also the Splunk server's management port should be accessible to the test machine.
-
-**2. Getting No such file or directory while test collection.**
-
- - Check if the path provided in ``--splunk-app`` exists.
-
-**3. No tests generated for any fields.**
-
- - Make sure the directory mentioned in ``--splunk-app`` has a format of Splunk app and has files like all the required configuration files in the default directory.
-
-**4. Getting Couldn't find a version that satisfies the requirement when installing pytest-splunk-addon using pip.**
-
- - Use ``pip3 install pytest-splunk-addon`` and make sure you are using python 3.7
-
-.. |Wall| replace:: ``Docker-compose.yml not found``
-
-**5. While executing test cases on Docker, all the test cases abort with the following setup failure:**
-
- - Docker-compose.yml not found.
-
-  - Provide a docker-compose.yml in the root of the current working directory.
-
- - Couldn't connect to Docker daemon.
-
-  - Need to restart docker daemon.
-
- - Service 'splunk' failed to build.
-
-  - Check your internet connection
-  - Try ``docker pull splunk/splunk`` once this is done test cases can be executed.
-
-**6. Only no-dash-no-empty test cases are passing**
-
- - If splunk-type is docker, make sure you have add-on getting installed on the Docker OR if external, install the add-on on the Splunk instance.
- - Make sure to configure the inputs to collect data or check if sufficient data required for testing is generated by the pytest-splunk-addon. - diff --git a/mkdocs.yml b/mkdocs.yml new file mode 100644 index 000000000..51c1516fe --- /dev/null +++ b/mkdocs.yml @@ -0,0 +1,76 @@ +# +# Copyright 2024 Splunk Inc. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# +site_name: Pytest Splunk Addon +site_author: Splunk +site_url: "https://splunk.github.io/pytest-splunk-addon/" +edit_uri: "tree/main/docs/" +remote_branch: gh-pages + +repo_name: Pytest Splunk Addon +repo_url: "https://github.com/splunk/pytest-splunk-addon" + +markdown_extensions: + - toc: + permalink: True + toc_depth: 3 + - smarty + - fenced_code + - sane_lists + - codehilite + - pymdownx.superfences + - pymdownx.snippets + +plugins: + - mkdocstrings: + handlers: + python: + setup_commands: + - import sys + - sys.path.append('../') + selection: + new_path_syntax: true + options: + inherited_members: false + +theme: + name: "material" + palette: + primary: "black" + accent: "orange" + features: + - content.code.copy + - navigation.indexes + - navigation.expand + +nav: + - Home: "index.md" + - How to use: "how_to_use.md" + - Common Tests: "common_tests.md" + - CIM Compatibility Tests: "cim_tests.md" + - CIM Compliance Report: "cim_compliance.md" + - Knowledge Object Tests: "field_tests.md" + - Index Time Tests: "index_time_tests.md" + - Data Generator: "sample_generator.md" + - API Documentation: + - API Overview: "api_reference/api_reference.md" + - AddonParser: "api_reference/addon_parser.md" + - CimTests: "api_reference/cim_tests.md" + - FieldsTests: "api_reference/fields_tests.md" + - IndexTimeTests: "api_reference/index_time_tests.md" + - AppTestGenerator: "api_reference/app_test_generator.md" + - DataGenerator: "api_reference/sample_generation.md" + - EventIngestor: "api_reference/event_ingestion.md" + - Troubleshooting: "troubleshoot.md" \ No newline at end of file diff --git a/poetry.lock b/poetry.lock index 0f69155a2..88fd3b77e 100644 --- a/poetry.lock +++ b/poetry.lock @@ -2,35 +2,24 @@ [[package]] name = "addonfactory-splunk-conf-parser-lib" -version = "0.3.4" +version = "0.4.0" description = "Splunk .conf files parser" optional = false python-versions = ">=3.7,<4.0" files = [ - {file = "addonfactory_splunk_conf_parser_lib-0.3.4-py3-none-any.whl", hash = "sha256:f2eb25bd03d4fd464b8b0304a87d5dc1e2935c42a6c262c6461b9b73b8e30217"}, - {file = "addonfactory_splunk_conf_parser_lib-0.3.4.tar.gz", hash = "sha256:e550105c8c85a6f553688e5b1f4f55ae7ca758a1a428cbce1e0b9b73b9f4e35d"}, -] - -[[package]] -name = "alabaster" -version = "0.7.13" -description = "A configurable sidebar-enabled Sphinx theme" -optional = false -python-versions = ">=3.6" -files = [ - {file = "alabaster-0.7.13-py3-none-any.whl", hash = "sha256:1ee19aca801bbabb5ba3f5f258e4422dfa86f82f3e9cefb0859b283cdd7f62a3"}, - {file = "alabaster-0.7.13.tar.gz", hash = "sha256:a27a4a084d5e690e16e01e03ad2b2e552c61a65469419b907243193de1a84ae2"}, + {file = 
"addonfactory_splunk_conf_parser_lib-0.4.0-py3-none-any.whl", hash = "sha256:9567b6c8cb87668c1120517fb6001f9cf3c04d07c923f57385b0bb025df42c24"}, + {file = "addonfactory_splunk_conf_parser_lib-0.4.0.tar.gz", hash = "sha256:e892cd26dc9708512f864a397ed101579cceefe2c408ea5be2dc5032d2edf05d"}, ] [[package]] name = "attrs" -version = "23.1.0" +version = "23.2.0" description = "Classes Without Boilerplate" optional = false python-versions = ">=3.7" files = [ - {file = "attrs-23.1.0-py3-none-any.whl", hash = "sha256:1f28b4522cdc2fb4256ac1a020c78acf9cba2c6b461ccd2c126f3aa8e8335d04"}, - {file = "attrs-23.1.0.tar.gz", hash = "sha256:6279836d581513a26f1bf235f9acd333bc9115683f14f7e8fae46c98fc50e015"}, + {file = "attrs-23.2.0-py3-none-any.whl", hash = "sha256:99b87a485a5820b23b879f04c2305b44b951b502fd64be915879d77a7e8fc6f1"}, + {file = "attrs-23.2.0.tar.gz", hash = "sha256:935dc3b529c262f6cf76e50877d35a4bd3c1de194fd41f47a2b7ae8f19971f30"}, ] [package.dependencies] @@ -38,118 +27,120 @@ importlib-metadata = {version = "*", markers = "python_version < \"3.8\""} [package.extras] cov = ["attrs[tests]", "coverage[toml] (>=5.3)"] -dev = ["attrs[docs,tests]", "pre-commit"] +dev = ["attrs[tests]", "pre-commit"] docs = ["furo", "myst-parser", "sphinx", "sphinx-notfound-page", "sphinxcontrib-towncrier", "towncrier", "zope-interface"] tests = ["attrs[tests-no-zope]", "zope-interface"] -tests-no-zope = ["cloudpickle", "hypothesis", "mypy (>=1.1.1)", "pympler", "pytest (>=4.3.0)", "pytest-mypy-plugins", "pytest-xdist[psutil]"] - -[[package]] -name = "babel" -version = "2.12.1" -description = "Internationalization utilities" -optional = false -python-versions = ">=3.7" -files = [ - {file = "Babel-2.12.1-py3-none-any.whl", hash = "sha256:b4246fb7677d3b98f501a39d43396d3cafdc8eadb045f4a31be01863f655c610"}, - {file = "Babel-2.12.1.tar.gz", hash = "sha256:cc2d99999cd01d44420ae725a21c9e3711b3aadc7976d6147f622d8581963455"}, -] - -[package.dependencies] -pytz = {version = ">=2015.7", markers = "python_version < \"3.9\""} +tests-mypy = ["mypy (>=1.6)", "pytest-mypy-plugins"] +tests-no-zope = ["attrs[tests-mypy]", "cloudpickle", "hypothesis", "pympler", "pytest (>=4.3.0)", "pytest-xdist[psutil]"] [[package]] name = "certifi" -version = "2023.7.22" +version = "2024.2.2" description = "Python package for providing Mozilla's CA Bundle." optional = false python-versions = ">=3.6" files = [ - {file = "certifi-2023.7.22-py3-none-any.whl", hash = "sha256:92d6037539857d8206b8f6ae472e8b77db8058fec5937a1ef3f54304089edbb9"}, - {file = "certifi-2023.7.22.tar.gz", hash = "sha256:539cc1d13202e33ca466e88b2807e29f4c13049d6d87031a3c110744495cb082"}, + {file = "certifi-2024.2.2-py3-none-any.whl", hash = "sha256:dc383c07b76109f368f6106eee2b593b04a011ea4d55f652c6ca24a754d1cdd1"}, + {file = "certifi-2024.2.2.tar.gz", hash = "sha256:0569859f95fc761b18b45ef421b1290a0f65f147e92a1e5eb3e635f9a5e4e66f"}, ] [[package]] name = "charset-normalizer" -version = "3.2.0" +version = "3.3.2" description = "The Real First Universal Charset Detector. Open, modern and actively maintained alternative to Chardet." 
optional = false python-versions = ">=3.7.0" files = [ - {file = "charset-normalizer-3.2.0.tar.gz", hash = "sha256:3bb3d25a8e6c0aedd251753a79ae98a093c7e7b471faa3aa9a93a81431987ace"}, - {file = "charset_normalizer-3.2.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:0b87549028f680ca955556e3bd57013ab47474c3124dc069faa0b6545b6c9710"}, - {file = "charset_normalizer-3.2.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:7c70087bfee18a42b4040bb9ec1ca15a08242cf5867c58726530bdf3945672ed"}, - {file = "charset_normalizer-3.2.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:a103b3a7069b62f5d4890ae1b8f0597618f628b286b03d4bc9195230b154bfa9"}, - {file = "charset_normalizer-3.2.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:94aea8eff76ee6d1cdacb07dd2123a68283cb5569e0250feab1240058f53b623"}, - {file = "charset_normalizer-3.2.0-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:db901e2ac34c931d73054d9797383d0f8009991e723dab15109740a63e7f902a"}, - {file = "charset_normalizer-3.2.0-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b0dac0ff919ba34d4df1b6131f59ce95b08b9065233446be7e459f95554c0dc8"}, - {file = "charset_normalizer-3.2.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:193cbc708ea3aca45e7221ae58f0fd63f933753a9bfb498a3b474878f12caaad"}, - {file = "charset_normalizer-3.2.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:09393e1b2a9461950b1c9a45d5fd251dc7c6f228acab64da1c9c0165d9c7765c"}, - {file = "charset_normalizer-3.2.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:baacc6aee0b2ef6f3d308e197b5d7a81c0e70b06beae1f1fcacffdbd124fe0e3"}, - {file = "charset_normalizer-3.2.0-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:bf420121d4c8dce6b889f0e8e4ec0ca34b7f40186203f06a946fa0276ba54029"}, - {file = "charset_normalizer-3.2.0-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:c04a46716adde8d927adb9457bbe39cf473e1e2c2f5d0a16ceb837e5d841ad4f"}, - {file = "charset_normalizer-3.2.0-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:aaf63899c94de41fe3cf934601b0f7ccb6b428c6e4eeb80da72c58eab077b19a"}, - {file = "charset_normalizer-3.2.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:d62e51710986674142526ab9f78663ca2b0726066ae26b78b22e0f5e571238dd"}, - {file = "charset_normalizer-3.2.0-cp310-cp310-win32.whl", hash = "sha256:04e57ab9fbf9607b77f7d057974694b4f6b142da9ed4a199859d9d4d5c63fe96"}, - {file = "charset_normalizer-3.2.0-cp310-cp310-win_amd64.whl", hash = "sha256:48021783bdf96e3d6de03a6e39a1171ed5bd7e8bb93fc84cc649d11490f87cea"}, - {file = "charset_normalizer-3.2.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:4957669ef390f0e6719db3613ab3a7631e68424604a7b448f079bee145da6e09"}, - {file = "charset_normalizer-3.2.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:46fb8c61d794b78ec7134a715a3e564aafc8f6b5e338417cb19fe9f57a5a9bf2"}, - {file = "charset_normalizer-3.2.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:f779d3ad205f108d14e99bb3859aa7dd8e9c68874617c72354d7ecaec2a054ac"}, - {file = "charset_normalizer-3.2.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f25c229a6ba38a35ae6e25ca1264621cc25d4d38dca2942a7fce0b67a4efe918"}, - {file = "charset_normalizer-3.2.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:2efb1bd13885392adfda4614c33d3b68dee4921fd0ac1d3988f8cbb7d589e72a"}, - {file = 
"charset_normalizer-3.2.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1f30b48dd7fa1474554b0b0f3fdfdd4c13b5c737a3c6284d3cdc424ec0ffff3a"}, - {file = "charset_normalizer-3.2.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:246de67b99b6851627d945db38147d1b209a899311b1305dd84916f2b88526c6"}, - {file = "charset_normalizer-3.2.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:9bd9b3b31adcb054116447ea22caa61a285d92e94d710aa5ec97992ff5eb7cf3"}, - {file = "charset_normalizer-3.2.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:8c2f5e83493748286002f9369f3e6607c565a6a90425a3a1fef5ae32a36d749d"}, - {file = "charset_normalizer-3.2.0-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:3170c9399da12c9dc66366e9d14da8bf7147e1e9d9ea566067bbce7bb74bd9c2"}, - {file = "charset_normalizer-3.2.0-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:7a4826ad2bd6b07ca615c74ab91f32f6c96d08f6fcc3902ceeedaec8cdc3bcd6"}, - {file = "charset_normalizer-3.2.0-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:3b1613dd5aee995ec6d4c69f00378bbd07614702a315a2cf6c1d21461fe17c23"}, - {file = "charset_normalizer-3.2.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:9e608aafdb55eb9f255034709e20d5a83b6d60c054df0802fa9c9883d0a937aa"}, - {file = "charset_normalizer-3.2.0-cp311-cp311-win32.whl", hash = "sha256:f2a1d0fd4242bd8643ce6f98927cf9c04540af6efa92323e9d3124f57727bfc1"}, - {file = "charset_normalizer-3.2.0-cp311-cp311-win_amd64.whl", hash = "sha256:681eb3d7e02e3c3655d1b16059fbfb605ac464c834a0c629048a30fad2b27489"}, - {file = "charset_normalizer-3.2.0-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:c57921cda3a80d0f2b8aec7e25c8aa14479ea92b5b51b6876d975d925a2ea346"}, - {file = "charset_normalizer-3.2.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:41b25eaa7d15909cf3ac4c96088c1f266a9a93ec44f87f1d13d4a0e86c81b982"}, - {file = "charset_normalizer-3.2.0-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f058f6963fd82eb143c692cecdc89e075fa0828db2e5b291070485390b2f1c9c"}, - {file = "charset_normalizer-3.2.0-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a7647ebdfb9682b7bb97e2a5e7cb6ae735b1c25008a70b906aecca294ee96cf4"}, - {file = "charset_normalizer-3.2.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:eef9df1eefada2c09a5e7a40991b9fc6ac6ef20b1372abd48d2794a316dc0449"}, - {file = "charset_normalizer-3.2.0-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e03b8895a6990c9ab2cdcd0f2fe44088ca1c65ae592b8f795c3294af00a461c3"}, - {file = "charset_normalizer-3.2.0-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:ee4006268ed33370957f55bf2e6f4d263eaf4dc3cfc473d1d90baff6ed36ce4a"}, - {file = "charset_normalizer-3.2.0-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:c4983bf937209c57240cff65906b18bb35e64ae872da6a0db937d7b4af845dd7"}, - {file = "charset_normalizer-3.2.0-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:3bb7fda7260735efe66d5107fb7e6af6a7c04c7fce9b2514e04b7a74b06bf5dd"}, - {file = "charset_normalizer-3.2.0-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:72814c01533f51d68702802d74f77ea026b5ec52793c791e2da806a3844a46c3"}, - {file = "charset_normalizer-3.2.0-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:70c610f6cbe4b9fce272c407dd9d07e33e6bf7b4aa1b7ffb6f6ded8e634e3592"}, - {file = 
"charset_normalizer-3.2.0-cp37-cp37m-win32.whl", hash = "sha256:a401b4598e5d3f4a9a811f3daf42ee2291790c7f9d74b18d75d6e21dda98a1a1"}, - {file = "charset_normalizer-3.2.0-cp37-cp37m-win_amd64.whl", hash = "sha256:c0b21078a4b56965e2b12f247467b234734491897e99c1d51cee628da9786959"}, - {file = "charset_normalizer-3.2.0-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:95eb302ff792e12aba9a8b8f8474ab229a83c103d74a750ec0bd1c1eea32e669"}, - {file = "charset_normalizer-3.2.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:1a100c6d595a7f316f1b6f01d20815d916e75ff98c27a01ae817439ea7726329"}, - {file = "charset_normalizer-3.2.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:6339d047dab2780cc6220f46306628e04d9750f02f983ddb37439ca47ced7149"}, - {file = "charset_normalizer-3.2.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e4b749b9cc6ee664a3300bb3a273c1ca8068c46be705b6c31cf5d276f8628a94"}, - {file = "charset_normalizer-3.2.0-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a38856a971c602f98472050165cea2cdc97709240373041b69030be15047691f"}, - {file = "charset_normalizer-3.2.0-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f87f746ee241d30d6ed93969de31e5ffd09a2961a051e60ae6bddde9ec3583aa"}, - {file = "charset_normalizer-3.2.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:89f1b185a01fe560bc8ae5f619e924407efca2191b56ce749ec84982fc59a32a"}, - {file = "charset_normalizer-3.2.0-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e1c8a2f4c69e08e89632defbfabec2feb8a8d99edc9f89ce33c4b9e36ab63037"}, - {file = "charset_normalizer-3.2.0-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:2f4ac36d8e2b4cc1aa71df3dd84ff8efbe3bfb97ac41242fbcfc053c67434f46"}, - {file = "charset_normalizer-3.2.0-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:a386ebe437176aab38c041de1260cd3ea459c6ce5263594399880bbc398225b2"}, - {file = "charset_normalizer-3.2.0-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:ccd16eb18a849fd8dcb23e23380e2f0a354e8daa0c984b8a732d9cfaba3a776d"}, - {file = "charset_normalizer-3.2.0-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:e6a5bf2cba5ae1bb80b154ed68a3cfa2fa00fde979a7f50d6598d3e17d9ac20c"}, - {file = "charset_normalizer-3.2.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:45de3f87179c1823e6d9e32156fb14c1927fcc9aba21433f088fdfb555b77c10"}, - {file = "charset_normalizer-3.2.0-cp38-cp38-win32.whl", hash = "sha256:1000fba1057b92a65daec275aec30586c3de2401ccdcd41f8a5c1e2c87078706"}, - {file = "charset_normalizer-3.2.0-cp38-cp38-win_amd64.whl", hash = "sha256:8b2c760cfc7042b27ebdb4a43a4453bd829a5742503599144d54a032c5dc7e9e"}, - {file = "charset_normalizer-3.2.0-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:855eafa5d5a2034b4621c74925d89c5efef61418570e5ef9b37717d9c796419c"}, - {file = "charset_normalizer-3.2.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:203f0c8871d5a7987be20c72442488a0b8cfd0f43b7973771640fc593f56321f"}, - {file = "charset_normalizer-3.2.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:e857a2232ba53ae940d3456f7533ce6ca98b81917d47adc3c7fd55dad8fab858"}, - {file = "charset_normalizer-3.2.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5e86d77b090dbddbe78867a0275cb4df08ea195e660f1f7f13435a4649e954e5"}, - {file = "charset_normalizer-3.2.0-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = 
"sha256:c4fb39a81950ec280984b3a44f5bd12819953dc5fa3a7e6fa7a80db5ee853952"}, - {file = "charset_normalizer-3.2.0-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2dee8e57f052ef5353cf608e0b4c871aee320dd1b87d351c28764fc0ca55f9f4"}, - {file = "charset_normalizer-3.2.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8700f06d0ce6f128de3ccdbc1acaea1ee264d2caa9ca05daaf492fde7c2a7200"}, - {file = "charset_normalizer-3.2.0-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1920d4ff15ce893210c1f0c0e9d19bfbecb7983c76b33f046c13a8ffbd570252"}, - {file = "charset_normalizer-3.2.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:c1c76a1743432b4b60ab3358c937a3fe1341c828ae6194108a94c69028247f22"}, - {file = "charset_normalizer-3.2.0-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:f7560358a6811e52e9c4d142d497f1a6e10103d3a6881f18d04dbce3729c0e2c"}, - {file = "charset_normalizer-3.2.0-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:c8063cf17b19661471ecbdb3df1c84f24ad2e389e326ccaf89e3fb2484d8dd7e"}, - {file = "charset_normalizer-3.2.0-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:cd6dbe0238f7743d0efe563ab46294f54f9bc8f4b9bcf57c3c666cc5bc9d1299"}, - {file = "charset_normalizer-3.2.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:1249cbbf3d3b04902ff081ffbb33ce3377fa6e4c7356f759f3cd076cc138d020"}, - {file = "charset_normalizer-3.2.0-cp39-cp39-win32.whl", hash = "sha256:6c409c0deba34f147f77efaa67b8e4bb83d2f11c8806405f76397ae5b8c0d1c9"}, - {file = "charset_normalizer-3.2.0-cp39-cp39-win_amd64.whl", hash = "sha256:7095f6fbfaa55defb6b733cfeb14efaae7a29f0b59d8cf213be4e7ca0b857b80"}, - {file = "charset_normalizer-3.2.0-py3-none-any.whl", hash = "sha256:8e098148dd37b4ce3baca71fb394c81dc5d9c7728c95df695d2dca218edf40e6"}, + {file = "charset-normalizer-3.3.2.tar.gz", hash = "sha256:f30c3cb33b24454a82faecaf01b19c18562b1e89558fb6c56de4d9118a032fd5"}, + {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:25baf083bf6f6b341f4121c2f3c548875ee6f5339300e08be3f2b2ba1721cdd3"}, + {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:06435b539f889b1f6f4ac1758871aae42dc3a8c0e24ac9e60c2384973ad73027"}, + {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9063e24fdb1e498ab71cb7419e24622516c4a04476b17a2dab57e8baa30d6e03"}, + {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6897af51655e3691ff853668779c7bad41579facacf5fd7253b0133308cf000d"}, + {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1d3193f4a680c64b4b6a9115943538edb896edc190f0b222e73761716519268e"}, + {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:cd70574b12bb8a4d2aaa0094515df2463cb429d8536cfb6c7ce983246983e5a6"}, + {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8465322196c8b4d7ab6d1e049e4c5cb460d0394da4a27d23cc242fbf0034b6b5"}, + {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a9a8e9031d613fd2009c182b69c7b2c1ef8239a0efb1df3f7c8da66d5dd3d537"}, + {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:beb58fe5cdb101e3a055192ac291b7a21e3b7ef4f67fa1d74e331a7f2124341c"}, + {file = 
"charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:e06ed3eb3218bc64786f7db41917d4e686cc4856944f53d5bdf83a6884432e12"}, + {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:2e81c7b9c8979ce92ed306c249d46894776a909505d8f5a4ba55b14206e3222f"}, + {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:572c3763a264ba47b3cf708a44ce965d98555f618ca42c926a9c1616d8f34269"}, + {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:fd1abc0d89e30cc4e02e4064dc67fcc51bd941eb395c502aac3ec19fab46b519"}, + {file = "charset_normalizer-3.3.2-cp310-cp310-win32.whl", hash = "sha256:3d47fa203a7bd9c5b6cee4736ee84ca03b8ef23193c0d1ca99b5089f72645c73"}, + {file = "charset_normalizer-3.3.2-cp310-cp310-win_amd64.whl", hash = "sha256:10955842570876604d404661fbccbc9c7e684caf432c09c715ec38fbae45ae09"}, + {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:802fe99cca7457642125a8a88a084cef28ff0cf9407060f7b93dca5aa25480db"}, + {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:573f6eac48f4769d667c4442081b1794f52919e7edada77495aaed9236d13a96"}, + {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:549a3a73da901d5bc3ce8d24e0600d1fa85524c10287f6004fbab87672bf3e1e"}, + {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f27273b60488abe721a075bcca6d7f3964f9f6f067c8c4c605743023d7d3944f"}, + {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1ceae2f17a9c33cb48e3263960dc5fc8005351ee19db217e9b1bb15d28c02574"}, + {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:65f6f63034100ead094b8744b3b97965785388f308a64cf8d7c34f2f2e5be0c4"}, + {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:753f10e867343b4511128c6ed8c82f7bec3bd026875576dfd88483c5c73b2fd8"}, + {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4a78b2b446bd7c934f5dcedc588903fb2f5eec172f3d29e52a9096a43722adfc"}, + {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e537484df0d8f426ce2afb2d0f8e1c3d0b114b83f8850e5f2fbea0e797bd82ae"}, + {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:eb6904c354526e758fda7167b33005998fb68c46fbc10e013ca97f21ca5c8887"}, + {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:deb6be0ac38ece9ba87dea880e438f25ca3eddfac8b002a2ec3d9183a454e8ae"}, + {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:4ab2fe47fae9e0f9dee8c04187ce5d09f48eabe611be8259444906793ab7cbce"}, + {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:80402cd6ee291dcb72644d6eac93785fe2c8b9cb30893c1af5b8fdd753b9d40f"}, + {file = "charset_normalizer-3.3.2-cp311-cp311-win32.whl", hash = "sha256:7cd13a2e3ddeed6913a65e66e94b51d80a041145a026c27e6bb76c31a853c6ab"}, + {file = "charset_normalizer-3.3.2-cp311-cp311-win_amd64.whl", hash = "sha256:663946639d296df6a2bb2aa51b60a2454ca1cb29835324c640dafb5ff2131a77"}, + {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:0b2b64d2bb6d3fb9112bafa732def486049e63de9618b5843bcdd081d8144cd8"}, + 
{file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:ddbb2551d7e0102e7252db79ba445cdab71b26640817ab1e3e3648dad515003b"}, + {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:55086ee1064215781fff39a1af09518bc9255b50d6333f2e4c74ca09fac6a8f6"}, + {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8f4a014bc36d3c57402e2977dada34f9c12300af536839dc38c0beab8878f38a"}, + {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a10af20b82360ab00827f916a6058451b723b4e65030c5a18577c8b2de5b3389"}, + {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8d756e44e94489e49571086ef83b2bb8ce311e730092d2c34ca8f7d925cb20aa"}, + {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:90d558489962fd4918143277a773316e56c72da56ec7aa3dc3dbbe20fdfed15b"}, + {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6ac7ffc7ad6d040517be39eb591cac5ff87416c2537df6ba3cba3bae290c0fed"}, + {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:7ed9e526742851e8d5cc9e6cf41427dfc6068d4f5a3bb03659444b4cabf6bc26"}, + {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:8bdb58ff7ba23002a4c5808d608e4e6c687175724f54a5dade5fa8c67b604e4d"}, + {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_ppc64le.whl", hash = "sha256:6b3251890fff30ee142c44144871185dbe13b11bab478a88887a639655be1068"}, + {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_s390x.whl", hash = "sha256:b4a23f61ce87adf89be746c8a8974fe1c823c891d8f86eb218bb957c924bb143"}, + {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:efcb3f6676480691518c177e3b465bcddf57cea040302f9f4e6e191af91174d4"}, + {file = "charset_normalizer-3.3.2-cp312-cp312-win32.whl", hash = "sha256:d965bba47ddeec8cd560687584e88cf699fd28f192ceb452d1d7ee807c5597b7"}, + {file = "charset_normalizer-3.3.2-cp312-cp312-win_amd64.whl", hash = "sha256:96b02a3dc4381e5494fad39be677abcb5e6634bf7b4fa83a6dd3112607547001"}, + {file = "charset_normalizer-3.3.2-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:95f2a5796329323b8f0512e09dbb7a1860c46a39da62ecb2324f116fa8fdc85c"}, + {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c002b4ffc0be611f0d9da932eb0f704fe2602a9a949d1f738e4c34c75b0863d5"}, + {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a981a536974bbc7a512cf44ed14938cf01030a99e9b3a06dd59578882f06f985"}, + {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3287761bc4ee9e33561a7e058c72ac0938c4f57fe49a09eae428fd88aafe7bb6"}, + {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:42cb296636fcc8b0644486d15c12376cb9fa75443e00fb25de0b8602e64c1714"}, + {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0a55554a2fa0d408816b3b5cedf0045f4b8e1a6065aec45849de2d6f3f8e9786"}, + {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = 
"sha256:c083af607d2515612056a31f0a8d9e0fcb5876b7bfc0abad3ecd275bc4ebc2d5"}, + {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:87d1351268731db79e0f8e745d92493ee2841c974128ef629dc518b937d9194c"}, + {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:bd8f7df7d12c2db9fab40bdd87a7c09b1530128315d047a086fa3ae3435cb3a8"}, + {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:c180f51afb394e165eafe4ac2936a14bee3eb10debc9d9e4db8958fe36afe711"}, + {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:8c622a5fe39a48f78944a87d4fb8a53ee07344641b0562c540d840748571b811"}, + {file = "charset_normalizer-3.3.2-cp37-cp37m-win32.whl", hash = "sha256:db364eca23f876da6f9e16c9da0df51aa4f104a972735574842618b8c6d999d4"}, + {file = "charset_normalizer-3.3.2-cp37-cp37m-win_amd64.whl", hash = "sha256:86216b5cee4b06df986d214f664305142d9c76df9b6512be2738aa72a2048f99"}, + {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:6463effa3186ea09411d50efc7d85360b38d5f09b870c48e4600f63af490e56a"}, + {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:6c4caeef8fa63d06bd437cd4bdcf3ffefe6738fb1b25951440d80dc7df8c03ac"}, + {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:37e55c8e51c236f95b033f6fb391d7d7970ba5fe7ff453dad675e88cf303377a"}, + {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fb69256e180cb6c8a894fee62b3afebae785babc1ee98b81cdf68bbca1987f33"}, + {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ae5f4161f18c61806f411a13b0310bea87f987c7d2ecdbdaad0e94eb2e404238"}, + {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b2b0a0c0517616b6869869f8c581d4eb2dd83a4d79e0ebcb7d373ef9956aeb0a"}, + {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:45485e01ff4d3630ec0d9617310448a8702f70e9c01906b0d0118bdf9d124cf2"}, + {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:eb00ed941194665c332bf8e078baf037d6c35d7c4f3102ea2d4f16ca94a26dc8"}, + {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:2127566c664442652f024c837091890cb1942c30937add288223dc895793f898"}, + {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:a50aebfa173e157099939b17f18600f72f84eed3049e743b68ad15bd69b6bf99"}, + {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:4d0d1650369165a14e14e1e47b372cfcb31d6ab44e6e33cb2d4e57265290044d"}, + {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:923c0c831b7cfcb071580d3f46c4baf50f174be571576556269530f4bbd79d04"}, + {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:06a81e93cd441c56a9b65d8e1d043daeb97a3d0856d177d5c90ba85acb3db087"}, + {file = "charset_normalizer-3.3.2-cp38-cp38-win32.whl", hash = "sha256:6ef1d82a3af9d3eecdba2321dc1b3c238245d890843e040e41e470ffa64c3e25"}, + {file = "charset_normalizer-3.3.2-cp38-cp38-win_amd64.whl", hash = "sha256:eb8821e09e916165e160797a6c17edda0679379a4be5c716c260e836e122f54b"}, + {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_universal2.whl", hash = 
"sha256:c235ebd9baae02f1b77bcea61bce332cb4331dc3617d254df3323aa01ab47bd4"}, + {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:5b4c145409bef602a690e7cfad0a15a55c13320ff7a3ad7ca59c13bb8ba4d45d"}, + {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:68d1f8a9e9e37c1223b656399be5d6b448dea850bed7d0f87a8311f1ff3dabb0"}, + {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:22afcb9f253dac0696b5a4be4a1c0f8762f8239e21b99680099abd9b2b1b2269"}, + {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e27ad930a842b4c5eb8ac0016b0a54f5aebbe679340c26101df33424142c143c"}, + {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1f79682fbe303db92bc2b1136016a38a42e835d932bab5b3b1bfcfbf0640e519"}, + {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b261ccdec7821281dade748d088bb6e9b69e6d15b30652b74cbbac25e280b796"}, + {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:122c7fa62b130ed55f8f285bfd56d5f4b4a5b503609d181f9ad85e55c89f4185"}, + {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:d0eccceffcb53201b5bfebb52600a5fb483a20b61da9dbc885f8b103cbe7598c"}, + {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:9f96df6923e21816da7e0ad3fd47dd8f94b2a5ce594e00677c0013018b813458"}, + {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:7f04c839ed0b6b98b1a7501a002144b76c18fb1c1850c8b98d458ac269e26ed2"}, + {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:34d1c8da1e78d2e001f363791c98a272bb734000fcef47a491c1e3b0505657a8"}, + {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:ff8fa367d09b717b2a17a052544193ad76cd49979c805768879cb63d9ca50561"}, + {file = "charset_normalizer-3.3.2-cp39-cp39-win32.whl", hash = "sha256:aed38f6e4fb3f5d6bf81bfa990a07806be9d83cf7bacef998ab1a9bd660a581f"}, + {file = "charset_normalizer-3.3.2-cp39-cp39-win_amd64.whl", hash = "sha256:b01b88d45a6fcb69667cd6d2f7a9aeb4bf53760d7fc536bf679ec94fe9f3ff3d"}, + {file = "charset_normalizer-3.3.2-py3-none-any.whl", hash = "sha256:3e4d1f6587322d2788836a99c69062fbb091331ec940e02d12d179c1d53e25fc"}, ] [[package]] @@ -250,16 +241,19 @@ files = [ ] [[package]] -name = "docutils" -version = "0.17.1" -description = "Docutils -- Python Documentation Utilities" +name = "deprecation" +version = "2.1.0" +description = "A library to handle automated deprecations" optional = false -python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*" +python-versions = "*" files = [ - {file = "docutils-0.17.1-py2.py3-none-any.whl", hash = "sha256:cf316c8370a737a022b72b56874f6602acf974a37a9fba42ec2876387549fc61"}, - {file = "docutils-0.17.1.tar.gz", hash = "sha256:686577d2e4c32380bb50cbb22f575ed742d58168cee37e99117a854bcd88f125"}, + {file = "deprecation-2.1.0-py2.py3-none-any.whl", hash = "sha256:a10811591210e1fb0e768a8c25517cabeabcba6f0bf96564f8ff45189f90b14a"}, + {file = "deprecation-2.1.0.tar.gz", hash = "sha256:72b3bde64e5d778694b0cf68178aed03d15e15477116add3fb773e581f9518ff"}, ] +[package.dependencies] +packaging = "*" + [[package]] name = "elementpath" version = "2.5.3" @@ -276,13 +270,13 @@ dev = ["Sphinx", "coverage", 
"flake8", "lxml", "memory-profiler", "mypy (==0.950 [[package]] name = "exceptiongroup" -version = "1.1.2" +version = "1.2.1" description = "Backport of PEP 654 (exception groups)" optional = false python-versions = ">=3.7" files = [ - {file = "exceptiongroup-1.1.2-py3-none-any.whl", hash = "sha256:e346e69d186172ca7cf029c8c1d16235aa0e04035e5750b4b95039e65204328f"}, - {file = "exceptiongroup-1.1.2.tar.gz", hash = "sha256:12c3e887d6485d16943a309616de20ae5582633e0a2eda17f4e10fd61c1e8af5"}, + {file = "exceptiongroup-1.2.1-py3-none-any.whl", hash = "sha256:5258b9ed329c5bbdd31a309f53cbfb0b155341807f6ff7606a1e801a891b29ad"}, + {file = "exceptiongroup-1.2.1.tar.gz", hash = "sha256:a4785e48b045528f5bfe627b6ad554ff32def154f42372786903b7abcfe1aa16"}, ] [package.extras] @@ -334,13 +328,13 @@ testing = ["covdefaults (>=2.3)", "coverage (>=7.2.7)", "diff-cover (>=7.5)", "p [[package]] name = "freezegun" -version = "1.2.2" +version = "1.5.0" description = "Let your Python tests travel through time" optional = false -python-versions = ">=3.6" +python-versions = ">=3.7" files = [ - {file = "freezegun-1.2.2-py3-none-any.whl", hash = "sha256:ea1b963b993cb9ea195adbd893a48d573fda951b0da64f60883d7e988b606c9f"}, - {file = "freezegun-1.2.2.tar.gz", hash = "sha256:cd22d1ba06941384410cd967d8a99d5ae2442f57dfafeff2fda5de8dc5c05446"}, + {file = "freezegun-1.5.0-py3-none-any.whl", hash = "sha256:ec3f4ba030e34eb6cf7e1e257308aee2c60c3d038ff35996d7475760c9ff3719"}, + {file = "freezegun-1.5.0.tar.gz", hash = "sha256:200a64359b363aa3653d8aac289584078386c7c3da77339d257e46a01fb5c77c"}, ] [package.dependencies] @@ -348,12 +342,13 @@ python-dateutil = ">=2.7" [[package]] name = "future" -version = "0.18.3" +version = "1.0.0" description = "Clean single-source support for Python 3 and 2" optional = false python-versions = ">=2.6, !=3.0.*, !=3.1.*, !=3.2.*" files = [ - {file = "future-0.18.3.tar.gz", hash = "sha256:34a17436ed1e96697a86f9de3d15a3b0be01d8bc8de9c1dffd59fb8234ed5307"}, + {file = "future-1.0.0-py3-none-any.whl", hash = "sha256:929292d34f5872e70396626ef385ec22355a1fae8ad29e1a734c3e43f9fbc216"}, + {file = "future-1.0.0.tar.gz", hash = "sha256:bd2968309307861edae1458a4f8a4f3598c03be43b97521076aebf5d94c07b05"}, ] [[package]] @@ -372,24 +367,13 @@ pyparsing = {version = ">=2.4.2,<3.0.0 || >3.0.0,<3.0.1 || >3.0.1,<3.0.2 || >3.0 [[package]] name = "idna" -version = "3.4" +version = "3.7" description = "Internationalized Domain Names in Applications (IDNA)" optional = false python-versions = ">=3.5" files = [ - {file = "idna-3.4-py3-none-any.whl", hash = "sha256:90b77e79eaa3eba6de819a0c442c0b4ceefc341a7a2ab77d7562bf49f425c5c2"}, - {file = "idna-3.4.tar.gz", hash = "sha256:814f528e8dead7d329833b91c5faa87d60bf71824cd12a7530b5526063d02cb4"}, -] - -[[package]] -name = "imagesize" -version = "1.4.1" -description = "Getting image size from png/jpeg/jpeg2000/gif file" -optional = false -python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*" -files = [ - {file = "imagesize-1.4.1-py2.py3-none-any.whl", hash = "sha256:0d8d18d08f840c19d0ee7ca1fd82490fdc3729b7ac93f49870406ddde8ef8d8b"}, - {file = "imagesize-1.4.1.tar.gz", hash = "sha256:69150444affb9cb0d5cc5a92b3676f0b2fb7cd9ae39e947a5e11a36b4497cd4a"}, + {file = "idna-3.7-py3-none-any.whl", hash = "sha256:82fee1fc78add43492d3a1898bfa6d8a904cc97d8427f683ed8e798d07761aa0"}, + {file = "idna-3.7.tar.gz", hash = "sha256:028ff3aadf0609c1fd278d8ea3089299412a7a8b9bd005dd08b9f8285bcb5cfc"}, ] [[package]] @@ -441,23 +425,6 @@ files = [ {file = "iniconfig-2.0.0.tar.gz", hash = 
"sha256:2d91e135bf72d31a410b17c16da610a82cb55f6b0477d1a902134b24a455b8b3"}, ] -[[package]] -name = "jinja2" -version = "3.1.3" -description = "A very fast and expressive template engine." -optional = false -python-versions = ">=3.7" -files = [ - {file = "Jinja2-3.1.3-py3-none-any.whl", hash = "sha256:7d6d50dd97d52cbc355597bd845fabfbac3f551e1f99619e39a35ce8c370b5fa"}, - {file = "Jinja2-3.1.3.tar.gz", hash = "sha256:ac8bd6544d4bb2c9792bf3a159e80bba8fda7f07e81bc3aed565432d5925ba90"}, -] - -[package.dependencies] -MarkupSafe = ">=2.0" - -[package.extras] -i18n = ["Babel (>=2.7)"] - [[package]] name = "jsonschema" version = "4.17.3" @@ -495,84 +462,15 @@ files = [ [package.dependencies] future = "*" -[[package]] -name = "markupsafe" -version = "2.1.3" -description = "Safely add untrusted strings to HTML/XML markup." -optional = false -python-versions = ">=3.7" -files = [ - {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:cd0f502fe016460680cd20aaa5a76d241d6f35a1c3350c474bac1273803893fa"}, - {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:e09031c87a1e51556fdcb46e5bd4f59dfb743061cf93c4d6831bf894f125eb57"}, - {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:68e78619a61ecf91e76aa3e6e8e33fc4894a2bebe93410754bd28fce0a8a4f9f"}, - {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:65c1a9bcdadc6c28eecee2c119465aebff8f7a584dd719facdd9e825ec61ab52"}, - {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:525808b8019e36eb524b8c68acdd63a37e75714eac50e988180b169d64480a00"}, - {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:962f82a3086483f5e5f64dbad880d31038b698494799b097bc59c2edf392fce6"}, - {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:aa7bd130efab1c280bed0f45501b7c8795f9fdbeb02e965371bbef3523627779"}, - {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:c9c804664ebe8f83a211cace637506669e7890fec1b4195b505c214e50dd4eb7"}, - {file = "MarkupSafe-2.1.3-cp310-cp310-win32.whl", hash = "sha256:10bbfe99883db80bdbaff2dcf681dfc6533a614f700da1287707e8a5d78a8431"}, - {file = "MarkupSafe-2.1.3-cp310-cp310-win_amd64.whl", hash = "sha256:1577735524cdad32f9f694208aa75e422adba74f1baee7551620e43a3141f559"}, - {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:ad9e82fb8f09ade1c3e1b996a6337afac2b8b9e365f926f5a61aacc71adc5b3c"}, - {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:3c0fae6c3be832a0a0473ac912810b2877c8cb9d76ca48de1ed31e1c68386575"}, - {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b076b6226fb84157e3f7c971a47ff3a679d837cf338547532ab866c57930dbee"}, - {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bfce63a9e7834b12b87c64d6b155fdd9b3b96191b6bd334bf37db7ff1fe457f2"}, - {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:338ae27d6b8745585f87218a3f23f1512dbf52c26c28e322dbe54bcede54ccb9"}, - {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e4dd52d80b8c83fdce44e12478ad2e85c64ea965e75d66dbeafb0a3e77308fcc"}, - {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_i686.whl", hash = 
"sha256:df0be2b576a7abbf737b1575f048c23fb1d769f267ec4358296f31c2479db8f9"}, - {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:5bbe06f8eeafd38e5d0a4894ffec89378b6c6a625ff57e3028921f8ff59318ac"}, - {file = "MarkupSafe-2.1.3-cp311-cp311-win32.whl", hash = "sha256:dd15ff04ffd7e05ffcb7fe79f1b98041b8ea30ae9234aed2a9168b5797c3effb"}, - {file = "MarkupSafe-2.1.3-cp311-cp311-win_amd64.whl", hash = "sha256:134da1eca9ec0ae528110ccc9e48041e0828d79f24121a1a146161103c76e686"}, - {file = "MarkupSafe-2.1.3-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:f698de3fd0c4e6972b92290a45bd9b1536bffe8c6759c62471efaa8acb4c37bc"}, - {file = "MarkupSafe-2.1.3-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:aa57bd9cf8ae831a362185ee444e15a93ecb2e344c8e52e4d721ea3ab6ef1823"}, - {file = "MarkupSafe-2.1.3-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ffcc3f7c66b5f5b7931a5aa68fc9cecc51e685ef90282f4a82f0f5e9b704ad11"}, - {file = "MarkupSafe-2.1.3-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:47d4f1c5f80fc62fdd7777d0d40a2e9dda0a05883ab11374334f6c4de38adffd"}, - {file = "MarkupSafe-2.1.3-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1f67c7038d560d92149c060157d623c542173016c4babc0c1913cca0564b9939"}, - {file = "MarkupSafe-2.1.3-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:9aad3c1755095ce347e26488214ef77e0485a3c34a50c5a5e2471dff60b9dd9c"}, - {file = "MarkupSafe-2.1.3-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:14ff806850827afd6b07a5f32bd917fb7f45b046ba40c57abdb636674a8b559c"}, - {file = "MarkupSafe-2.1.3-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8f9293864fe09b8149f0cc42ce56e3f0e54de883a9de90cd427f191c346eb2e1"}, - {file = "MarkupSafe-2.1.3-cp312-cp312-win32.whl", hash = "sha256:715d3562f79d540f251b99ebd6d8baa547118974341db04f5ad06d5ea3eb8007"}, - {file = "MarkupSafe-2.1.3-cp312-cp312-win_amd64.whl", hash = "sha256:1b8dd8c3fd14349433c79fa8abeb573a55fc0fdd769133baac1f5e07abf54aeb"}, - {file = "MarkupSafe-2.1.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:8e254ae696c88d98da6555f5ace2279cf7cd5b3f52be2b5cf97feafe883b58d2"}, - {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cb0932dc158471523c9637e807d9bfb93e06a95cbf010f1a38b98623b929ef2b"}, - {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9402b03f1a1b4dc4c19845e5c749e3ab82d5078d16a2a4c2cd2df62d57bb0707"}, - {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ca379055a47383d02a5400cb0d110cef0a776fc644cda797db0c5696cfd7e18e"}, - {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:b7ff0f54cb4ff66dd38bebd335a38e2c22c41a8ee45aa608efc890ac3e3931bc"}, - {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:c011a4149cfbcf9f03994ec2edffcb8b1dc2d2aede7ca243746df97a5d41ce48"}, - {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:56d9f2ecac662ca1611d183feb03a3fa4406469dafe241673d521dd5ae92a155"}, - {file = "MarkupSafe-2.1.3-cp37-cp37m-win32.whl", hash = "sha256:8758846a7e80910096950b67071243da3e5a20ed2546e6392603c096778d48e0"}, - {file = "MarkupSafe-2.1.3-cp37-cp37m-win_amd64.whl", hash = "sha256:787003c0ddb00500e49a10f2844fac87aa6ce977b90b0feaaf9de23c22508b24"}, - {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_universal2.whl", hash 
= "sha256:2ef12179d3a291be237280175b542c07a36e7f60718296278d8593d21ca937d4"}, - {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:2c1b19b3aaacc6e57b7e25710ff571c24d6c3613a45e905b1fde04d691b98ee0"}, - {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8afafd99945ead6e075b973fefa56379c5b5c53fd8937dad92c662da5d8fd5ee"}, - {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8c41976a29d078bb235fea9b2ecd3da465df42a562910f9022f1a03107bd02be"}, - {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d080e0a5eb2529460b30190fcfcc4199bd7f827663f858a226a81bc27beaa97e"}, - {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:69c0f17e9f5a7afdf2cc9fb2d1ce6aabdb3bafb7f38017c0b77862bcec2bbad8"}, - {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:504b320cd4b7eff6f968eddf81127112db685e81f7e36e75f9f84f0df46041c3"}, - {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:42de32b22b6b804f42c5d98be4f7e5e977ecdd9ee9b660fda1a3edf03b11792d"}, - {file = "MarkupSafe-2.1.3-cp38-cp38-win32.whl", hash = "sha256:ceb01949af7121f9fc39f7d27f91be8546f3fb112c608bc4029aef0bab86a2a5"}, - {file = "MarkupSafe-2.1.3-cp38-cp38-win_amd64.whl", hash = "sha256:1b40069d487e7edb2676d3fbdb2b0829ffa2cd63a2ec26c4938b2d34391b4ecc"}, - {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:8023faf4e01efadfa183e863fefde0046de576c6f14659e8782065bcece22198"}, - {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:6b2b56950d93e41f33b4223ead100ea0fe11f8e6ee5f641eb753ce4b77a7042b"}, - {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9dcdfd0eaf283af041973bff14a2e143b8bd64e069f4c383416ecd79a81aab58"}, - {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:05fb21170423db021895e1ea1e1f3ab3adb85d1c2333cbc2310f2a26bc77272e"}, - {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:282c2cb35b5b673bbcadb33a585408104df04f14b2d9b01d4c345a3b92861c2c"}, - {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:ab4a0df41e7c16a1392727727e7998a467472d0ad65f3ad5e6e765015df08636"}, - {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:7ef3cb2ebbf91e330e3bb937efada0edd9003683db6b57bb108c4001f37a02ea"}, - {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:0a4e4a1aff6c7ac4cd55792abf96c915634c2b97e3cc1c7129578aa68ebd754e"}, - {file = "MarkupSafe-2.1.3-cp39-cp39-win32.whl", hash = "sha256:fec21693218efe39aa7f8599346e90c705afa52c5b31ae019b2e57e8f6542bb2"}, - {file = "MarkupSafe-2.1.3-cp39-cp39-win_amd64.whl", hash = "sha256:3fd4abcb888d15a94f32b75d8fd18ee162ca0c064f35b11134be77050296d6ba"}, - {file = "MarkupSafe-2.1.3.tar.gz", hash = "sha256:af598ed32d6ae86f1b747b82783958b1a4ab8f617b06fe68795c7f026abbdcad"}, -] - [[package]] name = "packaging" -version = "23.1" +version = "24.0" description = "Core utilities for Python packages" optional = false python-versions = ">=3.7" files = [ - {file = "packaging-23.1-py3-none-any.whl", hash = "sha256:994793af429502c4ea2ebf6bf664629d07c1a9fe974af92966e4b8d2df7edc61"}, - {file = "packaging-23.1.tar.gz", hash = 
"sha256:a392980d2b6cffa644431898be54b0045151319d1e7ec34f0cfed48767dd334f"}, + {file = "packaging-24.0-py3-none-any.whl", hash = "sha256:2ddfb553fdf02fb784c234c7ba6ccc288296ceabec964ad2eae3777778130bc5"}, + {file = "packaging-24.0.tar.gz", hash = "sha256:eb82c5e3e56209074766e6885bb04b8c38a0c015d0a30036ebe7ece34c9989e9"}, ] [[package]] @@ -604,40 +502,15 @@ importlib-metadata = {version = ">=0.12", markers = "python_version < \"3.8\""} dev = ["pre-commit", "tox"] testing = ["pytest", "pytest-benchmark"] -[[package]] -name = "py" -version = "1.11.0" -description = "library with cross-python path, ini-parsing, io, code, log facilities" -optional = false -python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*" -files = [ - {file = "py-1.11.0-py2.py3-none-any.whl", hash = "sha256:607c53218732647dff4acdfcd50cb62615cedf612e72d1724fb1a0cc6405b378"}, - {file = "py-1.11.0.tar.gz", hash = "sha256:51c75c4126074b472f746a24399ad32f6053d1b34b68d2fa41e558e6f4a98719"}, -] - -[[package]] -name = "pygments" -version = "2.15.1" -description = "Pygments is a syntax highlighting package written in Python." -optional = false -python-versions = ">=3.7" -files = [ - {file = "Pygments-2.15.1-py3-none-any.whl", hash = "sha256:db2db3deb4b4179f399a09054b023b6a586b76499d36965813c71aa8ed7b5fd1"}, - {file = "Pygments-2.15.1.tar.gz", hash = "sha256:8ace4d3c1dd481894b2005f560ead0f9f19ee64fe983366be1a21e171d12775c"}, -] - -[package.extras] -plugins = ["importlib-metadata"] - [[package]] name = "pyparsing" -version = "3.1.0" +version = "3.1.2" description = "pyparsing module - Classes and methods to define and execute parsing grammars" optional = false python-versions = ">=3.6.8" files = [ - {file = "pyparsing-3.1.0-py3-none-any.whl", hash = "sha256:d554a96d1a7d3ddaf7183104485bc19fd80543ad6ac5bdb6426719d766fb06c1"}, - {file = "pyparsing-3.1.0.tar.gz", hash = "sha256:edb662d6fe322d6e990b1594b5feaeadf806803359e3d4d42f11e295e588f0ea"}, + {file = "pyparsing-3.1.2-py3-none-any.whl", hash = "sha256:f9db75911801ed778fe61bb643079ff86601aca99fcae6345aa67292038fb742"}, + {file = "pyparsing-3.1.2.tar.gz", hash = "sha256:a1bac0ce561155ecc3ed78ca94d3c9378656ad4c94c1270de543f621420f94ad"}, ] [package.extras] @@ -681,13 +554,13 @@ files = [ [[package]] name = "pytest" -version = "7.4.0" +version = "7.4.4" description = "pytest: simple powerful testing with Python" optional = false python-versions = ">=3.7" files = [ - {file = "pytest-7.4.0-py3-none-any.whl", hash = "sha256:78bf16451a2eb8c7a2ea98e32dc119fd2aa758f1d5d66dbf0a59d69a3969df32"}, - {file = "pytest-7.4.0.tar.gz", hash = "sha256:b4bf8c45bd59934ed84001ad51e11b4ee40d40a1229d2c79f9c592b0a3f6bd8a"}, + {file = "pytest-7.4.4-py3-none-any.whl", hash = "sha256:b090cdf5ed60bf4c45261be03239c2c1c22df034fbffe691abe93cd80cea01d8"}, + {file = "pytest-7.4.4.tar.gz", hash = "sha256:2cf0005922c6ace4a3e2ec8b4080eb0d9753fdc93107415332f50ce9e7994280"}, ] [package.dependencies] @@ -720,21 +593,6 @@ pytest = ">=4.6" [package.extras] testing = ["fields", "hunter", "process-tests", "pytest-xdist", "six", "virtualenv"] -[[package]] -name = "pytest-forked" -version = "1.6.0" -description = "run tests in isolated forked subprocesses" -optional = false -python-versions = ">=3.7" -files = [ - {file = "pytest-forked-1.6.0.tar.gz", hash = "sha256:4dafd46a9a600f65d822b8f605133ecf5b3e1941ebb3588e943b4e3eb71a5a3f"}, - {file = "pytest_forked-1.6.0-py3-none-any.whl", hash = "sha256:810958f66a91afb1a1e2ae83089d8dc1cd2437ac96b12963042fbb9fb4d16af0"}, -] - -[package.dependencies] -py = "*" -pytest = 
">=3.10" - [[package]] name = "pytest-ordering" version = "0.6" @@ -752,19 +610,18 @@ pytest = "*" [[package]] name = "pytest-xdist" -version = "2.5.0" -description = "pytest xdist plugin for distributed testing and loop-on-failing modes" +version = "3.5.0" +description = "pytest xdist plugin for distributed testing, most importantly across multiple CPUs" optional = false -python-versions = ">=3.6" +python-versions = ">=3.7" files = [ - {file = "pytest-xdist-2.5.0.tar.gz", hash = "sha256:4580deca3ff04ddb2ac53eba39d76cb5dd5edeac050cb6fbc768b0dd712b4edf"}, - {file = "pytest_xdist-2.5.0-py3-none-any.whl", hash = "sha256:6fe5c74fec98906deb8f2d2b616b5c782022744978e7bd4695d39c8f42d0ce65"}, + {file = "pytest-xdist-3.5.0.tar.gz", hash = "sha256:cbb36f3d67e0c478baa57fa4edc8843887e0f6cfc42d677530a36d7472b32d8a"}, + {file = "pytest_xdist-3.5.0-py3-none-any.whl", hash = "sha256:d075629c7e00b611df89f490a5063944bee7a4362a5ff11c7cc7824a03dfce24"}, ] [package.dependencies] execnet = ">=1.1" pytest = ">=6.2.0" -pytest-forked = "*" [package.extras] psutil = ["psutil (>=3.0)"] @@ -773,13 +630,13 @@ testing = ["filelock"] [[package]] name = "python-dateutil" -version = "2.8.2" +version = "2.9.0.post0" description = "Extensions to the standard Python datetime module" optional = false python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,>=2.7" files = [ - {file = "python-dateutil-2.8.2.tar.gz", hash = "sha256:0123cacc1627ae19ddf3c27a5de5bd67ee4586fbdd6440d9748f8abb483d3e86"}, - {file = "python_dateutil-2.8.2-py2.py3-none-any.whl", hash = "sha256:961d03dc3453ebbc59dbdea9e4e11c5651520a876d0f4db161e8674aae935da9"}, + {file = "python-dateutil-2.9.0.post0.tar.gz", hash = "sha256:37dd54208da7e1cd875388217d5e00ebd4179249f90fb72437e91a35459a0ad3"}, + {file = "python_dateutil-2.9.0.post0-py2.py3-none-any.whl", hash = "sha256:a8b2bc7bffae282281c8140a97d3aa9c14da0b136dfe83f850eea9a5f7470427"}, ] [package.dependencies] @@ -819,22 +676,20 @@ use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"] [[package]] name = "requests-mock" -version = "1.11.0" +version = "1.12.1" description = "Mock out responses from the requests package" optional = false -python-versions = "*" +python-versions = ">=3.5" files = [ - {file = "requests-mock-1.11.0.tar.gz", hash = "sha256:ef10b572b489a5f28e09b708697208c4a3b2b89ef80a9f01584340ea357ec3c4"}, - {file = "requests_mock-1.11.0-py2.py3-none-any.whl", hash = "sha256:f7fae383f228633f6bececebdab236c478ace2284d6292c6e7e2867b9ab74d15"}, + {file = "requests-mock-1.12.1.tar.gz", hash = "sha256:e9e12e333b525156e82a3c852f22016b9158220d2f47454de9cae8a77d371401"}, + {file = "requests_mock-1.12.1-py2.py3-none-any.whl", hash = "sha256:b1e37054004cdd5e56c84454cc7df12b25f90f382159087f4b6915aaeef39563"}, ] [package.dependencies] -requests = ">=2.3,<3" -six = "*" +requests = ">=2.22,<3" [package.extras] fixture = ["fixtures"] -test = ["fixtures", "mock", "purl", "pytest", "requests-futures", "sphinx", "testtools"] [[package]] name = "six" @@ -847,204 +702,33 @@ files = [ {file = "six-1.16.0.tar.gz", hash = "sha256:1e61c37477a1626458e36f7b1d82aa5c9b094fa4802892072e49de9c60c4c926"}, ] -[[package]] -name = "snowballstemmer" -version = "2.2.0" -description = "This package provides 29 stemmers for 28 languages generated from Snowball algorithms." 
-optional = false -python-versions = "*" -files = [ - {file = "snowballstemmer-2.2.0-py2.py3-none-any.whl", hash = "sha256:c8e1716e83cc398ae16824e5572ae04e0d9fc2c6b985fb0f900f5f0c96ecba1a"}, - {file = "snowballstemmer-2.2.0.tar.gz", hash = "sha256:09b16deb8547d3412ad7b590689584cd0fe25ec8db3be37788be3810cbf19cb1"}, -] - -[[package]] -name = "sphinx" -version = "4.5.0" -description = "Python documentation generator" -optional = false -python-versions = ">=3.6" -files = [ - {file = "Sphinx-4.5.0-py3-none-any.whl", hash = "sha256:ebf612653238bcc8f4359627a9b7ce44ede6fdd75d9d30f68255c7383d3a6226"}, - {file = "Sphinx-4.5.0.tar.gz", hash = "sha256:7bf8ca9637a4ee15af412d1a1d9689fec70523a68ca9bb9127c2f3eeb344e2e6"}, -] - -[package.dependencies] -alabaster = ">=0.7,<0.8" -babel = ">=1.3" -colorama = {version = ">=0.3.5", markers = "sys_platform == \"win32\""} -docutils = ">=0.14,<0.18" -imagesize = "*" -importlib-metadata = {version = ">=4.4", markers = "python_version < \"3.10\""} -Jinja2 = ">=2.3" -packaging = "*" -Pygments = ">=2.0" -requests = ">=2.5.0" -snowballstemmer = ">=1.1" -sphinxcontrib-applehelp = "*" -sphinxcontrib-devhelp = "*" -sphinxcontrib-htmlhelp = ">=2.0.0" -sphinxcontrib-jsmath = "*" -sphinxcontrib-qthelp = "*" -sphinxcontrib-serializinghtml = ">=1.1.5" - -[package.extras] -docs = ["sphinxcontrib-websupport"] -lint = ["docutils-stubs", "flake8 (>=3.5.0)", "isort", "mypy (>=0.931)", "types-requests", "types-typed-ast"] -test = ["cython", "html5lib", "pytest", "pytest-cov", "typed-ast"] - -[[package]] -name = "sphinx-panels" -version = "0.6.0" -description = "A sphinx extension for creating panels in a grid layout." -optional = false -python-versions = "*" -files = [ - {file = "sphinx-panels-0.6.0.tar.gz", hash = "sha256:d36dcd26358117e11888f7143db4ac2301ebe90873ac00627bf1fe526bf0f058"}, - {file = "sphinx_panels-0.6.0-py3-none-any.whl", hash = "sha256:bd64afaf85c07f8096d21c8247fc6fd757e339d1be97832c8832d6ae5ed2e61d"}, -] - -[package.dependencies] -docutils = "*" -sphinx = ">=2,<5" - -[package.extras] -code-style = ["pre-commit (>=2.7.0,<2.8.0)"] -live-dev = ["sphinx-autobuild", "web-compile (>=0.2.0,<0.3.0)"] -testing = ["pytest (>=6.0.1,<6.1.0)", "pytest-regressions (>=2.0.1,<2.1.0)"] -themes = ["myst-parser (>=0.12.9,<0.13.0)", "pydata-sphinx-theme (>=0.4.0,<0.5.0)", "sphinx-book-theme (>=0.0.36,<0.1.0)", "sphinx-rtd-theme"] - -[[package]] -name = "sphinx-rtd-theme" -version = "1.1.1" -description = "Read the Docs theme for Sphinx" -optional = false -python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,>=2.7" -files = [ - {file = "sphinx_rtd_theme-1.1.1-py2.py3-none-any.whl", hash = "sha256:31faa07d3e97c8955637fc3f1423a5ab2c44b74b8cc558a51498c202ce5cbda7"}, - {file = "sphinx_rtd_theme-1.1.1.tar.gz", hash = "sha256:6146c845f1e1947b3c3dd4432c28998a1693ccc742b4f9ad7c63129f0757c103"}, -] - -[package.dependencies] -docutils = "<0.18" -sphinx = ">=1.6,<6" - -[package.extras] -dev = ["bump2version", "sphinxcontrib-httpdomain", "transifex-client", "wheel"] - -[[package]] -name = "sphinxcontrib-applehelp" -version = "1.0.2" -description = "sphinxcontrib-applehelp is a sphinx extension which outputs Apple help books" -optional = false -python-versions = ">=3.5" -files = [ - {file = "sphinxcontrib-applehelp-1.0.2.tar.gz", hash = "sha256:a072735ec80e7675e3f432fcae8610ecf509c5f1869d17e2eecff44389cdbc58"}, - {file = "sphinxcontrib_applehelp-1.0.2-py2.py3-none-any.whl", hash = "sha256:806111e5e962be97c29ec4c1e7fe277bfd19e9652fb1a4392105b43e01af885a"}, -] - -[package.extras] -lint = 
["docutils-stubs", "flake8", "mypy"] -test = ["pytest"] - -[[package]] -name = "sphinxcontrib-devhelp" -version = "1.0.2" -description = "sphinxcontrib-devhelp is a sphinx extension which outputs Devhelp document." -optional = false -python-versions = ">=3.5" -files = [ - {file = "sphinxcontrib-devhelp-1.0.2.tar.gz", hash = "sha256:ff7f1afa7b9642e7060379360a67e9c41e8f3121f2ce9164266f61b9f4b338e4"}, - {file = "sphinxcontrib_devhelp-1.0.2-py2.py3-none-any.whl", hash = "sha256:8165223f9a335cc1af7ffe1ed31d2871f325254c0423bc0c4c7cd1c1e4734a2e"}, -] - -[package.extras] -lint = ["docutils-stubs", "flake8", "mypy"] -test = ["pytest"] - -[[package]] -name = "sphinxcontrib-htmlhelp" -version = "2.0.0" -description = "sphinxcontrib-htmlhelp is a sphinx extension which renders HTML help files" -optional = false -python-versions = ">=3.6" -files = [ - {file = "sphinxcontrib-htmlhelp-2.0.0.tar.gz", hash = "sha256:f5f8bb2d0d629f398bf47d0d69c07bc13b65f75a81ad9e2f71a63d4b7a2f6db2"}, - {file = "sphinxcontrib_htmlhelp-2.0.0-py2.py3-none-any.whl", hash = "sha256:d412243dfb797ae3ec2b59eca0e52dac12e75a241bf0e4eb861e450d06c6ed07"}, -] - -[package.extras] -lint = ["docutils-stubs", "flake8", "mypy"] -test = ["html5lib", "pytest"] - -[[package]] -name = "sphinxcontrib-jsmath" -version = "1.0.1" -description = "A sphinx extension which renders display math in HTML via JavaScript" -optional = false -python-versions = ">=3.5" -files = [ - {file = "sphinxcontrib-jsmath-1.0.1.tar.gz", hash = "sha256:a9925e4a4587247ed2191a22df5f6970656cb8ca2bd6284309578f2153e0c4b8"}, - {file = "sphinxcontrib_jsmath-1.0.1-py2.py3-none-any.whl", hash = "sha256:2ec2eaebfb78f3f2078e73666b1415417a116cc848b72e5172e596c871103178"}, -] - -[package.extras] -test = ["flake8", "mypy", "pytest"] - -[[package]] -name = "sphinxcontrib-qthelp" -version = "1.0.3" -description = "sphinxcontrib-qthelp is a sphinx extension which outputs QtHelp document." -optional = false -python-versions = ">=3.5" -files = [ - {file = "sphinxcontrib-qthelp-1.0.3.tar.gz", hash = "sha256:4c33767ee058b70dba89a6fc5c1892c0d57a54be67ddd3e7875a18d14cba5a72"}, - {file = "sphinxcontrib_qthelp-1.0.3-py2.py3-none-any.whl", hash = "sha256:bd9fc24bcb748a8d51fd4ecaade681350aa63009a347a8c14e637895444dfab6"}, -] - -[package.extras] -lint = ["docutils-stubs", "flake8", "mypy"] -test = ["pytest"] - -[[package]] -name = "sphinxcontrib-serializinghtml" -version = "1.1.5" -description = "sphinxcontrib-serializinghtml is a sphinx extension which outputs \"serialized\" HTML files (json and pickle)." -optional = false -python-versions = ">=3.5" -files = [ - {file = "sphinxcontrib-serializinghtml-1.1.5.tar.gz", hash = "sha256:aa5f6de5dfdf809ef505c4895e51ef5c9eac17d0f287933eb49ec495280b6952"}, - {file = "sphinxcontrib_serializinghtml-1.1.5-py2.py3-none-any.whl", hash = "sha256:352a9a00ae864471d3a7ead8d7d79f5fc0b57e8b3f95e9867eb9eb28999b92fd"}, -] - -[package.extras] -lint = ["docutils-stubs", "flake8", "mypy"] -test = ["pytest"] - [[package]] name = "splunk-sdk" -version = "1.7.4" +version = "2.0.1" description = "The Splunk Software Development Kit for Python." 
optional = false python-versions = "*" files = [ - {file = "splunk-sdk-1.7.4.tar.gz", hash = "sha256:8f3f149e3a0daf7526ed36882c109e4ec8080e417efe25d23f4578e86d38b9f2"}, + {file = "splunk-sdk-2.0.1.tar.gz", hash = "sha256:a1cc9b24e0c9c79ef8e2845fedcca066638219eef0018163f97795dbfa367c67"}, ] +[package.dependencies] +deprecation = "*" + [[package]] name = "splunksplwrapper" -version = "1.1.1" +version = "1.1.4" description = "Package to interact with Splunk" optional = false python-versions = ">=3.7,<4.0" files = [ - {file = "splunksplwrapper-1.1.1-py3-none-any.whl", hash = "sha256:6dd93a144c77f78d7b756a05659da76ee5e44773378336d44365e4a046e7cd1c"}, - {file = "splunksplwrapper-1.1.1.tar.gz", hash = "sha256:71d5fc3d37a9105d804d95bd9778023d3f4e7b3d5d2dc71cdcf8019a258395a8"}, + {file = "splunksplwrapper-1.1.4-py3-none-any.whl", hash = "sha256:65d62fe00a89b0f0ef849f37b15db068d293ed2fc430a1b74ca8c9bc34436f67"}, + {file = "splunksplwrapper-1.1.4.tar.gz", hash = "sha256:d8b319080b0260cc47723fe95afeddbdda35f6cb43fdb6249bbcb3c20c03aa77"}, ] [package.dependencies] defusedxml = ">=0.7.1,<0.8.0" -httplib2 = ">=0.22.0,<0.23.0" +httplib2 = ">=0.20.0" splunk-sdk = ">=1.6.20" [[package]] @@ -1133,4 +817,4 @@ testing = ["big-O", "flake8 (<5)", "jaraco.functools", "jaraco.itertools", "more [metadata] lock-version = "2.0" python-versions = "^3.7" -content-hash = "02ea5ca0c0a2e37f94c6c618bfff794fb541a8acd568f77c1dafe89ea305498c" +content-hash = "1910a4c9fc1c52c5694237630fbcb6a0d18d437fcc4ab06a4cca8cc934ab2ced" diff --git a/pyproject.toml b/pyproject.toml index 655e8a8cd..1384f67c2 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -49,21 +49,12 @@ xmlschema = "^1.11.3" splunksplwrapper = "^1.1.1" urllib3 = "<2" - [tool.poetry.group.dev.dependencies] pytest-cov = "^3.0.0" requests-mock = "^1.8.0" freezegun = "^1.2.1" pytz = "^2022.1" -[tool.poetry.group.docs] -optional = true - -[tool.poetry.group.docs.dependencies] -jinja2 = "3.1.3" -sphinx-rtd-theme = "1.1.1" -sphinx-panels = "0.6.0" - [tool.poetry.plugins] pytest11 = { plugin = "pytest_splunk_addon.plugin", "splunk" = "pytest_splunk_addon.splunk" } diff --git a/requirements.txt b/requirements.txt deleted file mode 100644 index 2b266b300..000000000 --- a/requirements.txt +++ /dev/null @@ -1,4 +0,0 @@ -Sphinx -lovely-pytest-docker -sphinx-panels -sphinx-rtd-theme \ No newline at end of file diff --git a/tests/e2e/test_splunk_addon.py b/tests/e2e/test_splunk_addon.py index 4a98cb610..eb0fd523c 100644 --- a/tests/e2e/test_splunk_addon.py +++ b/tests/e2e/test_splunk_addon.py @@ -539,43 +539,6 @@ def empty_method(): result.assert_outcomes(passed=2) -@pytest.mark.doc -def test_help_message(testdir): - result = testdir.runpytest( - "--help", - ) - # fnmatch_lines does an assertion internally - result.stdout.fnmatch_lines( - [ - "splunk-addon:", - "*--splunk-app=*", - "*--splunk-host=*", - "*--splunk-port=*", - "*--splunk-user=*", - "*--splunk-password=*", - ] - ) - - -@pytest.mark.doc -def test_docstrings(testdir): - from sphinx.application import Sphinx - - docs_dir = os.path.join(testdir.request.config.invocation_dir, "docs") - output_dir = os.path.join(docs_dir, "_build", "html") - doctree_dir = os.path.join(docs_dir, "_build", "doctrees") - all_files = 1 - app = Sphinx( - docs_dir, - docs_dir, - output_dir, - doctree_dir, - buildername="html", - warningiserror=True, - ) - app.build(force_all=all_files) - - @pytest.mark.docker @pytest.mark.splunk_app_req def test_splunk_app_req(testdir, request): From b789bb480e590c8182b94f35548ec4bcf7b3c263 Mon Sep 17 00:00:00 
2001 From: Artem Rys Date: Fri, 26 Apr 2024 12:59:33 +0200 Subject: [PATCH 2/3] ci(docs): no strict mode (#826) We can live without strict mode for now. --- .github/workflows/docs.yml | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/.github/workflows/docs.yml b/.github/workflows/docs.yml index 4fb0da166..8e85d070d 100644 --- a/.github/workflows/docs.yml +++ b/.github/workflows/docs.yml @@ -23,7 +23,7 @@ jobs: pip install mkdocs mkdocs-material mkdocstrings-python - name: Deploy to GitHub Pages if: github.ref_name == 'main' - run: mkdocs gh-deploy --force --strict + run: mkdocs gh-deploy --force - name: Build Docs if: github.ref_name != 'main' run: mkdocs build From 82dd470de7ef3b748ab05222a2823e0e71521753 Mon Sep 17 00:00:00 2001 From: harshilgajera-crest <69803385+harshilgajera-crest@users.noreply.github.com> Date: Tue, 14 May 2024 14:25:25 +0530 Subject: [PATCH 3/3] docs: fixing documentation (#836) - Updated the md files for proper reference. - Added missing brackets. Fixes #834 --- Dockerfile.tests | 2 +- docs/cim_compliance.md | 4 ++-- docs/cim_tests.md | 12 +++++------ docs/field_tests.md | 26 +++++++++++------------ docs/generate_conf.md | 2 +- docs/how_to_use.md | 45 ++++++++++++++++++++-------------------- docs/index_time_tests.md | 16 +++++++------- docs/sample_generator.md | 8 +++---- 8 files changed, 57 insertions(+), 58 deletions(-) diff --git a/Dockerfile.tests b/Dockerfile.tests index 8d2ad273f..f0f11fb27 100644 --- a/Dockerfile.tests +++ b/Dockerfile.tests @@ -13,7 +13,7 @@ # See the License for the specific language governing permissions and # limitations under the License. # -FROM ubuntu:latest +FROM ubuntu:22.04 RUN mkdir -p /work/tests RUN mkdir -p /work/test-results/functional diff --git a/docs/cim_compliance.md b/docs/cim_compliance.md index a7db94a32..4deecb402 100644 --- a/docs/cim_compliance.md +++ b/docs/cim_compliance.md @@ -29,7 +29,7 @@ There are two ways to generate the CIM Compliance report: - Append the following to [any one of the commands](how_to_use.md#test-execution) used for executing the test cases: ```console - --cim-report ``` **2. Generating the report using the test results stored in the junit-xml file** - Execute the following command: ```console - cim-report ``` ## Report Generation Troubleshooting diff --git a/docs/cim_tests.md b/docs/cim_tests.md index f0b23f63b..1aae162bd 100644 --- a/docs/cim_tests.md +++ b/docs/cim_tests.md @@ -41,7 +41,7 @@ To generate test cases only for CIM compatibility, append the following marker t ``` - #### Testcase Assertions: +#### Testcase Assertions: - There should be at least 1 event mapped with the dataset. - Each required field should be extracted in all the events mapped with the datasets. @@ -100,13 +100,13 @@ To generate test cases only for CIM compatibility, append the following marker t - Plugin gets a list of fields whose extractions are defined in props using addon_parser. - By comparing we obtain a list of fields whose extractions are not allowed but defined. -**5. Testcase to check that eventtype is not be mapped with multiple datamodels.** +**5. Testcase to check that eventtype is not mapped with multiple datamodels.** **Workflow:** - Parsing tags.conf it already has a list of eventtype mapped with the datasets. - - Using SPL we check that each eventtype is not be mapped with multiple datamodels. + - Using SPL we check that each eventtype is not mapped with multiple datamodels.
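As a rough illustration of the workflow above, a minimal SPL sketch of this check might look like the following — the eventtype name is a hypothetical placeholder, and the exact query the plugin generates may differ:

```
search eventtype="example_eventtype" | stats values(tag) as applied_tags
```

If the tags returned map to more than one data model (for example, tags for both Authentication and Network Traffic), the eventtype would be considered mapped with multiple datamodels and the test would fail.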
## Testcase Troubleshooting @@ -122,14 +122,14 @@ If all the above conditions are satisfied, further analysis of the test is requi For every CIM validation test case there is a defined structure for the stack trace. ```text - AssertionError: <> Source | Sourcetype | Field | Event Count | Field Count | Invalid Field Count | Invalid Values -------- | --------------- | ------| ----------- | ----------- | ------------------- | -------------- str | str | str | int | int | int | str - Search = - Properties for the field :: type= Required/Conditional condition= Condition for field validity= EVAL conditions diff --git a/docs/field_tests.md b/docs/field_tests.md index 6de550d58..4cc339d6a 100644 --- a/docs/field_tests.md +++ b/docs/field_tests.md @@ -33,7 +33,7 @@ To generate test cases only for knowledge objects, append the following marker t ``` Testcase verifies that there are events mapped with source/sourcetype. - Here ] + test_props_fields[::field::] ``` Testcase verifies that the field should be extracted in the source/sourcetype. - Here 0. - This verifies that all the fields are extracted in the same event. **5. Events should be present in each eventtype** @@ -104,7 +104,7 @@ To generate test cases only for knowledge objects, append the following marker t **Workflow:** - - For each eventtype mentioned in eventtypes.conf plugin generates an SPL search query and asserts event_count 0 for the eventtype. + - For each eventtype mentioned in eventtypes.conf plugin generates an SPL search query and asserts event_count > 0 for the eventtype. **6. Tags defined in tags.conf should be applied to the events.** **Workflow:** ``` Test case verifies that there are events mapped with the tag. - Here 0. **7. Search query should be present in each savedsearch.** **Workflow:** - In savedsearches.conf for each stanza, the plugin generates a test case. - - For each stanza mentioned in savedsearches.conf plugin generates an SPL search query and asserts event_count 0 for the savedsearch. + - For each stanza mentioned in savedsearches.conf plugin generates an SPL search query and asserts event_count > 0 for the savedsearch. ## Testcase Troubleshooting @@ -150,8 +150,8 @@ If all the above conditions are satisfied, further analysis of the test is requi For every test case failure, there is a defined structure for the stack trace. ```text - AssertionError: <> + Search = ``` Get the search query from the stack trace and execute it on the Splunk instance and verify which specific type of events are causing failure. diff --git a/docs/generate_conf.md b/docs/generate_conf.md index 78da4777b..1af9614ee 100644 --- a/docs/generate_conf.md +++ b/docs/generate_conf.md @@ -45,7 +45,7 @@ - Execute the following command: ```console - generate-indextime-conf [] ``` For example: diff --git a/docs/how_to_use.md b/docs/how_to_use.md index 245569e2d..5c0db16ec 100644 --- a/docs/how_to_use.md +++ b/docs/how_to_use.md @@ -20,7 +20,7 @@ There are three ways to execute the tests: Run pytest with the add-on, in an external Splunk deployment ```bash - pytest --splunk-type=external --splunk-app= --splunk-data-generator= --splunk-host= --splunk-port= --splunk-user= --splunk-password= --splunk-hec-token= ``` **2.
Running tests with docker splunk** @@ -101,6 +101,7 @@ services: SPLUNK_APP_ID: ${SPLUNK_APP_ID} SPLUNK_APP_PACKAGE: ${SPLUNK_APP_PACKAGE} SPLUNK_VERSION: ${SPLUNK_VERSION} + platform: linux/amd64 ports: - "8000" - "8088" @@ -120,6 +121,7 @@ services: SPLUNK_APP_ID: ${SPLUNK_APP_ID} SPLUNK_APP_PACKAGE: ${SPLUNK_APP_PACKAGE} SPLUNK_VERSION: ${SPLUNK_VERSION} + platform: linux/amd64 hostname: uf ports: - "9997" @@ -132,13 +134,10 @@ services: volumes: - ${CURRENT_DIR}/uf_files:${CURRENT_DIR}/uf_files -volumes: - splunk-sc4s-var: - external: false ``` -
+
Create conftest.py file ``` @@ -184,7 +183,7 @@ def docker_services_project_name(pytestconfig): Run pytest with the add-on, using the following command: ```bash - pytest --splunk-type=docker --splunk-data-generator= ``` The tool assumes the Splunk Add-on is located in a folder "package" in the project root. @@ -209,15 +208,15 @@ The tool assumes the Splunk Add-on is located in a folder "package" in the proje ```bash pytest --splunk-type=external # Whether you want to run the addon with docker or an external Splunk instance - --splunk-app= # Path to Splunk app package. The package should have the configuration files in the default folder. + --splunk-host= # Receiver Splunk instance where events are searchable. + --splunk-port= # default 8089 + --splunk-user= # default admin + --splunk-password= # default Chang3d! + --splunk-forwarder-host= # Splunk instance where forwarding to receiver instance is configured. + --splunk-hec-port= # HEC port of the forwarder instance. + --splunk-hec-token= # HEC token configured in forwarder instance. + --splunk-data-generator= # Path to pytest-splunk-addon-data.conf ``` > **_NOTE:_** @@ -243,10 +242,10 @@ There are 3 types of tests included in pytest-splunk-addon: 3. To generate test cases only for index time properties, append the following marker to pytest command: ```console - -m splunk_indextime --splunk-data-generator= ``` For detailed information on index time test execution, please refer {ref}`here Splunk index of which the events will be searched while testing. Default value: "*, _internal". ``` @@ -270,11 +269,11 @@ The following optional arguments are available to modify the default settings in 2. To increase/decrease time interval and retries for flaky tests, the user can provide the following additional arguments: ```console - --search-retry= Number of retries to make if there are no events found while searching in the Splunk instance. Default value: 0. - --search-interval= Time interval to wait before retrying the search query. Default value: 0. ``` @@ -297,7 +296,7 @@ The following optional arguments are available to modify the default settings in - **Addon related errors:** To suppress these, the user can create a file with the list of strings and provide the file in the **--ignore-addon-errors** param while test execution. ```console - --ignore-addon-errors= ``` - Sample strings in the file. @@ -328,7 +327,7 @@ The following optional arguments are available to modify the default settings in - Default value for this parameter is *store_new* ```console - --event-file-path= ``` - Path to tokenized events file @@ -380,7 +379,7 @@ The following optional arguments are available to modify the default settings in **3. Setup test environment before executing the test cases** - If any setup is required in the Splunk/test environment before executing the test cases, implement a fixture in {ref}`conftest.py ``` > **_NOTE:_** --splunk-data-generator should contain the path to *pytest-splunk-addon-data.conf*, @@ -55,7 +55,7 @@ To generate test cases only for index time properties, append the following mark - This test case will not be generated if there are no key fields specified for the event. - Key field can be assigned to a token using the field property. `i.e token.n.field = ` - Testcase assertions: +#### Testcase Assertions: - There should be at least 1 event with the sourcetype and host.
- The values of the key fields obtained from the event @@ -72,7 +72,7 @@ To generate test cases only for index time properties, append the following mark - Execute the SPL query in a Splunk instance. - - Assert the test case results as mentioned in {ref}`testcase assertions> + Search = ``` Get the search query from the stack trace and execute it on the Splunk instance and verify which specific type of events are causing failure. @@ -229,9 +229,9 @@ Get the search query from the stack trace and execute it on the Splunk instance - No test would be generated to test Key Fields for that particular stanza, and thus they won't be correctly tested. -8. When do I assign token.\.field = ` to the token for that field value. - Example: : For this sample, there is a report written in props that extracts `127.0.0.1` as `src`, diff --git a/docs/sample_generator.md b/docs/sample_generator.md index 985383200..ba3d48ce0 100644 --- a/docs/sample_generator.md +++ b/docs/sample_generator.md @@ -56,7 +56,7 @@ host_prefix = {{host_prefix}} - If the value is event, the host field should be provided for a token using "token..field = host". **input_type = modinput | scripted_input | syslog_tcp | file_monitor | windows_input | uf_file_monitor | default** - + - The input_type used in the addon to ingest data of a sourcetype used in the stanza. - The way in which the sample data is ingested into Splunk depends on the input_type. The most similar ingesting approach is used for each input_type to get accurate index-time testing. - In input_type=uf_file_monitor, the universal forwarder will use file monitor to read events and then send the data to the indexer. @@ -143,7 +143,7 @@ The following replacementType -> replacement values are supported - "n" is a number starting at 0, and increasing by 1. - For static, the token will be replaced with the value specified in the replacement setting. -- For timestamp, the token will be replaced with the strptime specified in the replacement setting. Strptime directive: +- For timestamp, the token will be replaced with the strptime specified in the replacement setting. Strptime directive: [https://docs.python.org/2/library/datetime.html#strftime-and-strptime-behavior](https://docs.python.org/2/library/datetime.html#strftime-and-strptime-behavior) - For random, the token will be replaced with a randomly picked type-aware value. - For all, for each possible replacement value, a new event will be generated and the token will be replaced with it. The configuration can be used where a token replacement contains multiple templates/values and all of the values are important and should be ingested at least once. The number of events will be multiplied by the number of values in the replacement. For example, if a sample contains 3 lines & a token replacement has a list of 2 values, then 6 events will be generated. For a replacement, if replacementType='all' is not supported, then by default the plugin will consider replacementType="random". - For file, the token will be replaced with a random value retrieved from a file specified in the replacement setting. @@ -174,8 +174,8 @@ The following replacementType -> replacement values are supported - For , the token will be replaced with a random line in the replacement file. - - Replacement file name should be a fully qualified path (i.e. \$SPLUNK_HOME/etc/apps/windows/samples/users.list). - - Windows separators should contain double forward slashes "\\" (i.e. \$SPLUNK_HOME\\etc\\apps\\windows\\samples\\users.list). + - Replacement file name should be a fully qualified path (i.e.
$SPLUNK_HOME/etc/apps/windows/samples/users.list). + - Windows separators should contain double backslashes "\\" (i.e. $SPLUNK_HOME\\etc\\apps\\windows\\samples\\users.list). - Unix separators will work on Windows and vice-versa. - Column numbers in mvfile references are indexed at 1, meaning the first column is column 1, not 0.
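To tie the file-replacement rules above together, here is a minimal sketch of a token stanza in pytest-splunk-addon-data.conf that reads the second column of a replacement file — the stanza name, token, and file path are illustrative assumptions, and the colon-suffixed column number follows the mvfile convention described above:

```
[sample_authentication.samples]
sourcetype = example:authentication
input_type = file_monitor

# Replace ##user## with a random value taken from column 2 of users.list
# (columns are indexed at 1, per the note above).
token.0.token = ##user##
token.0.replacementType = file
token.0.replacement = $SPLUNK_HOME/etc/apps/TA_example/samples/users.list:2
```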