
Commit 42d85e3

docs: update Readme to reflect Python 3.13 / Poetry 1.8.4 requirements
Parent: 0f351cc


Readme.md

Lines changed: 9 additions & 6 deletions
@@ -1,9 +1,9 @@
 # Open Edu Hub Search ETL
 
-## Step 1: Project Setup - Python 3.12 (manual approach)
+## Step 1: Project Setup Python 3.13 (manual approach)
 
 - make sure you have python3 installed (<https://docs.python-guide.org/starting/installation/>)
-  - (Python 3.12 or newer is required)
+  - (Python 3.13 is required)
 - go to project root
 - Run the following commands:
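For orientation (not part of the commit itself): the manual setup this section describes amounts to creating and activating a virtual environment before installing dependencies. A minimal sketch, assuming Python 3.13 is already on your PATH; the exact install commands sit in the README portion this hunk elides:

```bash
python3 --version           # should report 3.13.x
python3 -m venv .venv       # create the virtual environment in the project root
source .venv/bin/activate   # activate it (on Windows: .venv\Scripts\activate)
```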

@@ -22,7 +22,7 @@ python3 -m venv .venv
 
 ## Step 1 (alternative): Project Setup - Python (automated, via `poetry`)
 
-- Step 1: Make sure that you have [Poetry](https://python-poetry.org) v1.5.0+ installed
+- Step 1: Make sure that you have [Poetry](https://python-poetry.org) [v1.8.4](https://github.com/python-poetry/poetry/releases/tag/1.8.4)+ installed
   - for detailed instructions, please consult the [Poetry Installation Guide](https://python-poetry.org/docs/#installation)
 - Step 2: Open your terminal **in the project root directory**:
   - Step 2.1: If you want to have your `.venv` to be created inside the project root directory:
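Step 2.1's actual command falls outside this hunk, so as a hedged sketch: the standard Poetry setting for keeping the `.venv` inside the project root is `virtualenvs.in-project`, and the install step appears in the next hunk:

```bash
poetry --version                            # expect 1.8.4 or newer
poetry config virtualenvs.in-project true   # keep .venv in the project root
poetry install                              # install dependencies per pyproject.toml
```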
@@ -31,6 +31,7 @@ python3 -m venv .venv
 - Step 3: **Install dependencies** (according to `pyproject.toml`) by running: `poetry install`
 
 ## Step 2: Project Setup - required Docker Containers
+
 If you have Docker installed, use `docker-compose up` to start up the multi-container for `Splash` and `Playwright`-integration.
 
 As a last step, set up your config variables by copying the `.env.example`-file and modifying it if necessary:
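Put together, the container setup described here looks roughly like this (a sketch: the `.env` target name is the conventional one rather than one this diff states, and newer Docker versions spell the command `docker compose` instead of `docker-compose`):

```bash
cp .env.example .env    # copy the example config, then edit it if necessary
docker compose up       # start the Splash and Playwright containers
```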
@@ -40,7 +41,7 @@ As a last step, set up your config variables by copying the `.env.example`-file
 # Running crawlers
 
 - A crawler can be run with `scrapy crawl <spider-name>`.
-  - (It assumes that you have an edu-sharing v6.0+ instance in your `.env` settings configured which can accept the data.)
+  - (It assumes that you have an edu-sharing v8.1+ instance in your `.env` settings configured which can accept the data.)
 - If a crawler has [Scrapy Spider Contracts](https://docs.scrapy.org/en/latest/topics/contracts.html#spiders-contracts) implemented, you can test those by running `scrapy check <spider-name>`
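As a usage illustration (the spider name below is a placeholder; the diff names no concrete spider):

```bash
scrapy crawl example_spider   # run a single crawler (hypothetical name)
scrapy check example_spider   # test its Spider Contracts, if implemented
```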

@@ -60,8 +61,10 @@ docker compose up
 
 - We use Scrapy as a framework. Please check out the guides for Scrapy spider (https://docs.scrapy.org/en/latest/intro/tutorial.html)
 - To create a new spider, create a file inside `converter/spiders/<myname>_spider.py`
-- We recommend inheriting the `LomBase` class in order to get out-of-the-box support for our metadata model
-- You may also Inherit a Base Class for crawling data, if your site provides LRMI metadata, the `LrmiBase` is a good start. If your system provides an OAI interface, you may use the `OAIBase`
+- We recommend inheriting the `LomBase` class to get out-of-the-box support for our metadata model
+- You may also inherit a base class (see: `converter/spiders/base_classes/`) for crawling data.
+  - If your site provides LRMI metadata, the `LrmiBase` is a good start.
+  - If your system provides an OAI interface, you may use the `OAIBase`
 - As a sample/template, please take a look at the `sample_spider.py` or `sample_spider_alternative.py`
 - To learn more about the LOM standard we're using, you'll find useful information at https://en.wikipedia.org/wiki/Learning_object_metadata
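To make the inheritance advice concrete, a hypothetical skeleton follows. The import path is inferred from the `converter/spiders/base_classes/` hint above, and the way `LomBase` is wired in is an assumption, not an API this commit documents; `sample_spider.py` remains the authoritative template.

```python
# converter/spiders/my_site_spider.py -- hypothetical skeleton, not from this commit
import scrapy

# Assumed import path, based on the base_classes/ directory mentioned above.
from converter.spiders.base_classes import LomBase


class MySiteSpider(scrapy.Spider, LomBase):
    """Sketch of a spider that reuses LomBase for the metadata model."""

    name = "my_site_spider"  # used as <spider-name> in `scrapy crawl <spider-name>`
    start_urls = ["https://example.org/materials"]  # placeholder URL

    def parse(self, response):
        # Delegate to LomBase so it can assemble the metadata item; the exact
        # hook to call here is defined by LomBase, not by this sketch.
        return LomBase.parse(self, response)
```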
