
# RSS Aggregator with Web Crawler

This project is an RSS aggregator with a web crawler written in Go. It collects RSS feeds from various sources and aggregates them into a single feed. The web crawler extends the functionality by discovering new RSS feeds from websites.

## Features

- Fetch RSS feeds from predefined sources.
- Discover new RSS feeds through web crawling.
- Aggregate and consolidate RSS feeds into a single feed.
- Customizable and extendable.

## Installation

Clone the repository:

```shell
git clone https://github.com/aliciacilmora/rss_aggregator.git
```

Install the dependencies:

```shell
go mod tidy
```

## Configuration

Create a `.env` file in the project root:

```env
PORT=8080
DB_URL=postgres://[username]:[password]@localhost:5432/[database_name]
```
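For illustration, a filled-in `.env` might look like the following. The credentials and database name here are hypothetical placeholders; substitute your own Postgres user, password, and database.

```env
PORT=8080
# hypothetical local credentials -- replace with your own
DB_URL=postgres://postgres:postgres@localhost:5432/rssagg
```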

## Run the application

```shell
go build && ./rss_aggregator
```

## Usage

You can send requests with the Thunder Client extension for VS Code, or with curl directly from the terminal.

The API is served at http://localhost:8080/v1/.

## API Endpoints

- **User Management:**
  - `GET /v1/users`: Get user information.
  - `POST /v1/users`: Create a new user.
- **Error Handling:**
  - `GET /v1/err`: Test the error-handling path.
- **Feeds:**
  - `GET /v1/feeds`: Get the list of feeds.
  - `POST /v1/feeds`: Add new feeds from XML or similar sources.
- **Feed Follows:**
  - `GET /v1/feed_follows`: See followed feeds.
  - `POST /v1/feed_follows`: Follow a new feed.
  - `DELETE /v1/feed_follows/{feedFollowID}`: Unfollow a feed.
- **Posts:**
  - `GET /v1/posts`: See posts from followed feeds.