Prepare 0.15.0 release
oltarasenko committed Apr 11, 2023
1 parent b8bd2a9 commit f65c979
Showing 3 changed files with 13 additions and 8 deletions.
15 changes: 9 additions & 6 deletions README.md
@@ -29,7 +29,7 @@ historical archival.
# mix.exs
defp deps do
[
-      {:crawly, "~> 0.14.0"},
+      {:crawly, "~> 0.15.0"},
{:floki, "~> 0.33.0"}
]
end
@@ -81,7 +81,7 @@ historical archival.
end
```

-**New in 0.15.0 (not released yet):**
+**New in 0.15.0:**

> It's possible to use the command to speed up the spider creation,
so you will have a generated file with all needed callbacks:
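The generator command itself sits in a collapsed part of this hunk (in 0.15.0 it is presumably `mix crawly.gen.spider`; that name is an assumption here). A spider generated this way implements the `Crawly.Spider` behaviour callbacks, roughly like this sketch (module name and URLs are placeholders):

```elixir
# Sketch of a generated spider; module name and URLs are placeholders.
defmodule MySpider do
  use Crawly.Spider

  @impl Crawly.Spider
  def base_url(), do: "https://example.com"

  @impl Crawly.Spider
  def init(), do: [start_urls: ["https://example.com/"]]

  @impl Crawly.Spider
  def parse_item(_response) do
    # Extract items and follow-up requests from the response here.
    %{items: [], requests: []}
  end
end
```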
@@ -116,7 +116,7 @@ historical archival.

```

-**New in 0.15.0 (not released yet):**
+**New in 0.15.0:**

> You can generate example config with the help of the following command:
`mix crawly.gen.config`
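The generated file is not shown in this diff; a Crawly config typically contains entries like the following sketch (all values are illustrative and the actual generated contents may differ):

```elixir
# config/config.exs -- illustrative only; the generated file may differ.
import Config

config :crawly,
  closespider_timeout: 10,
  concurrent_requests_per_domain: 8,
  middlewares: [
    Crawly.Middlewares.DomainFilter,
    Crawly.Middlewares.UniqueRequest,
    {Crawly.Middlewares.UserAgent, user_agents: ["Crawly Bot"]}
  ],
  pipelines: [
    Crawly.Pipelines.JSONEncoder,
    {Crawly.Pipelines.WriteToFile, extension: "jl", folder: "/tmp"}
  ]
```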
@@ -161,13 +161,16 @@ It allows to:
- Start spiders
- Stop spiders
- Preview scheduled requests
-- Preview items extracted so far (it's required to add the
-  `Crawly.Pipelines.Experimental.Preview` item pipe to have items preview)
+- View/Download items extracted
+- View/Download logs

![Crawly Management UI](docs/crawly_ui.gif)


-## Experimental UI
+## Experimental UI [Deprecated]

We currently don't have the capacity to work on the experimental UI built with Phoenix and LiveView, and we keep it here mainly for demo purposes.

The CrawlyUI project is an add-on that aims to provide an interface for managing and rapidly developing spiders.
Check out the code from [GitHub](https://github.com/oltarasenko/crawly_ui).
4 changes: 3 additions & 1 deletion documentation/standalone_crawly.md
@@ -81,12 +81,14 @@ Here we will show how to re-implement the example from Quickstart, to achieve the s

4. Now let's start Crawly (TODO: Insert link to crawly Docker repos):
```
-docker run --name crawlyApp1 -e "SPIDERS_DIR=/app/spiders" \
+docker run --name crawly -e "SPIDERS_DIR=/app/spiders" \
-it -p 4001:4001 -v $(pwd)/spiders:/app/spiders \
-v $(pwd)/crawly.config:/app/config/crawly.config \
crawly
```

You can either fetch the latest version of Crawly from DockerHub or build it yourself (`docker build -t crawly .`).

**Note:** the `SPIDERS_DIR` environment variable specifies the folder from which additional spiders are fetched; `./spiders` is used by default.

5. Open Crawly Web Management interface in your browser: https://localhost:4001/
2 changes: 1 addition & 1 deletion mix.exs
@@ -2,7 +2,7 @@ defmodule Crawly.Mixfile do
use Mix.Project

@source_url "https://github.com/oltarasenko/crawly"
-  @version "0.14.0"
+  @version "0.15.0"

def project do
[
