A personal news aggregator that pulls information from multiple sources (Tweets, RSS, YouTube, and web articles) and uses an LLM (ChatGPT) to help us read efficiently with less noise.
In this age of information explosion, we live with noise every day, and it has only gotten worse since generative AI arrived. Time is a precious resource for each of us, and using it efficiently is more challenging than ever. Think about how much time we spend pulling, searching, and filtering content from different sources; how often we park an article, paper, or long video in a side tab but never get a chance to look at it; and how much effort it takes to organize the information we have read. We need a better way to cut out the noise, read efficiently based on our interests, and stay on track with the goals we have defined.
See this Blog post for more details.
The Auto-News was born for the following goals:
- Automatically pull feed sources, including RSS and Tweets.
- Support clipping content directly from sources (random web articles, YouTube videos), then generating summaries and translations
- Filter content based on personal interests and remove 80%+ of the noise
- A unified/central reading experience (e.g., RSS reader-like style, Notion based)
- Weekly/Monthly top-k aggregations
- UI: Notion-based, cross-platform (Web browser, iOS/Android app)
- Backend: runs on Linux/macOS
Component | Requirement |
---|---|
OS | Linux, macOS |
Memory | 6GB |
Disk | 20GB+ |
- [Required] Notion token
- [Required] OpenAI token
- [Required] Docker
- [Optional] Notion Web Clipper
- [Optional] Twitter Developer Tokens
Go to Notion, create a page as the main entry (for example, a `Readings` page), and enable Notion Integration for this page.
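The page ID needed later (`NOTION_ENTRY_PAGE_ID`) is the 32-character hex string at the end of the page's URL. A minimal sketch of pulling it out, assuming a made-up example URL:

```shell
# Hypothetical page URL; the trailing 32-char hex string is the page ID
url="https://www.notion.so/Readings-0123456789abcdef0123456789abcdef"
page_id="${url##*-}"   # strip everything up to the last '-'
echo "$page_id"
```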
Check out the repo and copy `.env.template` to `build/.env`, then fill in the environment variables:

- `NOTION_TOKEN`
- `NOTION_ENTRY_PAGE_ID`
- `OPENAI_API_KEY`
- [Optional] Variables with the `TWITTER_` prefix
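A sketch of what `build/.env` might look like after filling it in (all values below are placeholders; any variable names beyond the three required ones come from `.env.template`, not from here):

```shell
# build/.env -- placeholder values, replace with your own tokens
NOTION_TOKEN=secret_xxxxxxxxxxxxxxxx
NOTION_ENTRY_PAGE_ID=0123456789abcdef0123456789abcdef
OPENAI_API_KEY=sk-xxxxxxxxxxxxxxxx
# Optional: Twitter developer tokens (see the TWITTER_* entries in .env.template)
```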
```shell
make deps && make build && make deploy && make init
make start
```
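If the services fail to come up, a common cause is an incomplete `build/.env`. A minimal sanity check (the path and variable names come from the setup steps above):

```shell
# Report any of the three required variables missing from build/.env
envfile="build/.env"
missing=""
for v in NOTION_TOKEN NOTION_ENTRY_PAGE_ID OPENAI_API_KEY; do
  grep -q "^${v}=." "$envfile" 2>/dev/null || missing="$missing $v"
done
echo "missing:${missing:- none}"
```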
Now the services are up and running; they will pull from the sources every hour.
Go to the Notion entry page we created before, and we will see the following folder structure has been created automatically:
```
Readings
├── Inbox
│   ├── Inbox - Article
│   └── Inbox - YouTube
├── Index
│   ├── Index - Inbox
│   ├── Index - ToRead
│   ├── RSS_List
│   └── Tweet_List
└── ToRead
    └── ToRead
```
- Go to the `RSS_List` page and fill in the RSS names and URLs
- Go to the `Tweet_List` page and fill in the Twitter screen names
Go to the Notion `ToRead` database page; all the data will flow into this database later on. Create database views for the different sources (e.g., Tweets, Articles, YouTube, RSS) to make the flows easier to organize.
Now, enjoy and have fun.
For troubleshooting, we can use the URLs below to access the services and check the logs and data:
Service | Role | Panel URL |
---|---|---|
Airflow | Orchestration | http://localhost:8080 |
Milvus | Vector Database | http://localhost:9100 |
Adminer | DB accessor | http://localhost:8070 |
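A quick way to probe whether the panels are reachable (URLs from the table above; the `/health` path is Airflow's built-in health endpoint — an assumption worth verifying against your Airflow version):

```shell
# Probe each panel and print up/down per URL (requires curl)
status=""
for u in http://localhost:8080/health http://localhost:9100 http://localhost:8070; do
  if curl -sf -o /dev/null --max-time 3 "$u"; then
    status="$status up"
    echo "up:   $u"
  else
    status="$status down"
    echo "down: $u"
  fi
done
```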
To manage the services afterwards, run the following commands from the codebase folder.
```shell
# Stop
make stop

# Restart
make stop && make start

# Restart and re-initialize
make stop && make init && make start

# Upgrade, then re-initialize and restart
make upgrade && make stop && make init && make start
```