sergiopesch/my-app

LlamaIndex LLM-powered chat web app using OpenAI models and internal documents.

This is a LlamaIndex project using Next.js, bootstrapped with create-llama.

Getting Started

First, install the dependencies:

npm install
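
You will likely also need an OpenAI API key available to the server, since the app chats using OpenAI models. A common convention (an assumption here, not something this repository documents) is to put it in a .env file at the project root; check the generated code for the exact variable it expects:

OPENAI_API_KEY=<your-openai-api-key>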

Second, run the development server:

npm run dev

Open http://localhost:3000 with your browser to see the result.

You can start editing the page by modifying app/page.tsx. The page auto-updates as you edit the file.
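
For orientation, an App Router page such as app/page.tsx is just a default-exported React component. The sketch below is not the generated chat page, only a minimal illustration of the file's shape (the component name and markup are placeholders):

// app/page.tsx - minimal sketch, not the generated chat UI
export default function Home() {
  return (
    <main>
      <h1>My LlamaIndex chat</h1>
      {/* the generated project renders its chat component here */}
    </main>
  );
}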

This project uses next/font to automatically optimize and load Inter, a custom Google Font.
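
For reference, next/font is typically wired up in the root layout. This is a sketch assuming the default App Router layout file (app/layout.tsx); the file generated in this project may look different:

// app/layout.tsx - sketch of loading Inter with next/font
import { Inter } from "next/font/google";
import type { ReactNode } from "react";

const inter = Inter({ subsets: ["latin"] });

export default function RootLayout({ children }: { children: ReactNode }) {
  return (
    <html lang="en">
      <body className={inter.className}>{children}</body>
    </html>
  );
}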

Learn More

To learn more about LlamaIndex, check out the LlamaIndexTS GitHub repository - your feedback and contributions are welcome!
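
To give a flavour of what the library does, the sketch below indexes an in-memory document and queries it with LlamaIndexTS. It is illustrative only: the document text is made up, and the exact method signatures have changed between llamaindex releases, so check the version pinned in package.json:

// sketch: index an in-memory document and ask a question with the llamaindex package
import { Document, VectorStoreIndex } from "llamaindex";

async function main() {
  // Wrap some "internal document" text; in the real app this would come from your files.
  const documents = [new Document({ text: "Our office is closed on public holidays." })];

  // Build a vector index and a query engine over it (uses OpenAI by default, so OPENAI_API_KEY must be set).
  const index = await VectorStoreIndex.fromDocuments(documents);
  const queryEngine = index.asQueryEngine();

  // Ask a question; the response object's shape depends on the llamaindex version.
  const response = await queryEngine.query({ query: "When is the office closed?" });
  console.log(response.toString());
}

main().catch(console.error);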
