CourseGraph uses large language models and prompt-optimization techniques to automatically extract knowledge points from textbooks and other books, building a knowledge graph centered on courses, chapters, and knowledge points. To enrich each knowledge point, CourseGraph can link relevant exercises, extended reading materials, and other resources. It can also leverage multimodal large models to extract information from pptx files, images, and videos and connect it to the knowledge points.
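The extraction step can be sketched roughly as follows. This is a minimal illustration, not CourseGraph's actual implementation: the prompt wording, the canned LLM reply, and the triple format are all assumptions for demonstration.

```python
import json

# Hypothetical prompt asking the model to return knowledge points as JSON.
PROMPT_TEMPLATE = (
    "Extract the knowledge points from the following chapter text. "
    "Return a JSON list of strings.\n\nChapter: {chapter}\n\nText:\n{text}"
)

def parse_knowledge_points(course: str, chapter: str, llm_output: str) -> list[tuple]:
    """Turn the model's JSON reply into (course, chapter, knowledge_point) triples."""
    points = json.loads(llm_output)
    return [(course, chapter, p) for p in points]

# Canned reply standing in for a real model call.
reply = '["Perceptron", "Activation function", "Backpropagation"]'
triples = parse_knowledge_points("Deep Learning", "Chapter 2", reply)
print(triples[0])  # ('Deep Learning', 'Chapter 2', 'Perceptron')
```

In practice the reply would come from the model API, and the triples would then be written to Neo4j as nodes and relationships.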
- Currently, only basic knowledge graph extraction and pptx parsing are implemented, and both leave room for optimization
- Video parsing is still in the planning phase
- Improve prompt engineering and explore using Agents for related tasks
- Add more strategies for comparison experiments
- Knowledge graph-based question answering (KBQA or Graph-RAG)
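The Graph-RAG idea in the last item can be sketched as: match knowledge points against the question, expand to their neighbors in the graph, and hand the resulting subgraph to an LLM as context. The tiny in-memory graph and keyword-matching rule below are illustrative assumptions, not a planned design.

```python
# Toy graph: knowledge point -> related knowledge points (illustrative data).
GRAPH = {
    "Backpropagation": ["Gradient descent", "Chain rule"],
    "Gradient descent": ["Learning rate"],
    "Chain rule": [],
    "Learning rate": [],
}

def retrieve_context(question: str, graph: dict[str, list[str]]) -> list[str]:
    """Naive retrieval: keyword-match knowledge points, then pull in their neighbors."""
    hits = [kp for kp in graph if kp.lower() in question.lower()]
    context = []
    for kp in hits:
        context.append(kp)
        context.extend(graph[kp])
    return context

ctx = retrieve_context("How does backpropagation work?", GRAPH)
print(ctx)  # ['Backpropagation', 'Gradient descent', 'Chain rule']
```

A real Graph-RAG pipeline would retrieve from Neo4j (e.g. via Cypher) and use embeddings rather than substring matching, but the retrieve-expand-generate shape is the same.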
First, obtain an Alibaba Cloud Model Studio API Key, then choose local installation or Docker installation:
Ensure uv, Neo4j, and Rust are installed, then execute:
uv sync
On Linux, LibreOffice is required for document conversion. On Debian-based systems:
sudo apt install libreoffice
Provide the Neo4j connection password and the path to the file to be extracted, then execute:
uv run examples/get_knowledge_graph_pdf.py -p neo4j -f assets/deep-learning-from-scratch.pdf
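After the script finishes, the chapter and knowledge-point nodes live in Neo4j. Conceptually, the write side amounts to `MERGE` statements like the ones this helper builds; note that the labels (`Chapter`, `KnowledgePoint`) and the `HAS_POINT` relationship are assumptions for illustration, so check the generated graph in the Neo4j browser for the actual schema.

```python
def build_merge_query(chapter: str, point: str) -> str:
    """Build a Cypher query linking a chapter node to a knowledge-point node.

    The node labels and relationship name here are hypothetical, chosen
    only to illustrate the MERGE pattern.
    """
    return (
        f"MERGE (c:Chapter {{name: '{chapter}'}}) "
        f"MERGE (k:KnowledgePoint {{name: '{point}'}}) "
        f"MERGE (c)-[:HAS_POINT]->(k)"
    )

query = build_merge_query("Chapter 2", "Perceptron")
print(query)
```

In real code, pass values as Cypher query parameters instead of interpolating them into the string, which avoids both escaping bugs and injection issues.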
docker-compose -f docker/docker-compose.yml up -d
uv run examples/get_knowledge_graph_pdf.py -f assets/deep-learning-from-scratch.pdf
Documentation can be found in the docs
directory, or you can visit the online documentation (the online documentation is not yet ready, as the project's features are still under rapid development). If you wish to customize the documentation, follow these steps:
The documentation is built with VitePress, which requires Node.js 18 or above. Execute:
cd docs
npm i
npm run dev
Open http://localhost:5173/ in your browser to preview.
PRs and Issues are welcome, as is any other form of contribution.
This project is open-sourced under the MIT license.
If you find CourseGraph helpful for your work, please click the Cite this repository
button on the right side of the repository page to cite it.