Merge pull request #12 from pamelars86/fix/readme
Fix/readme
pamelars86 authored Feb 2, 2025
2 parents 31fb681 + d9ef946 commit 5b21287
Showing 2 changed files with 6 additions and 6 deletions.
4 changes: 2 additions & 2 deletions CHANGELOG.md
@@ -2,15 +2,15 @@

All notable changes to this project will be documented in this file.

## [Unreleased] - 2025-02-22
## [1.0.1] - 2025-02-22
### Added
- Added support for using either OpenAI or models available in Ollama. Users can choose whether to use OpenAI, but they need to check the `README.md` file for the OpenAI configuration.
- Reorganized the endpoint structure. Now, there is only one endpoint: `/blog_post/<video_id>`, simplifying API interactions.

### Changed
- The blog post generation system now uses a separate `prompt.yaml` file to store prompts instead of embedding them in the code. This makes it easier to manage and modify prompts.

## [1.0.0] - 2025-01-28
## [0.0.1] - 2025-01-28
### Added
- Initial version of the API with two endpoints.
- The application only allowed Ollama models for generating blog posts.
8 changes: 4 additions & 4 deletions README.md
@@ -114,7 +114,7 @@ An example `client_secrets.json` file format is shown below:
```
### (Optional) Update `prompt.yaml` for Custom Prompts

If you want to customize the prompt used by the `/video_blog` endpoint, you can update the `prompt.yaml` file located in the root directory of the project. This file allows you to define the structure and content of the prompt that will be used to generate the blog post.
If you want to customize the prompt used by the `/blog_post` endpoint, you can update the `prompt.yaml` file located in the root directory of the project. This file allows you to define the structure and content of the prompt that will be used to generate the blog post.
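As a rough illustration (not part of the project's documented setup), the sketch below shows how a custom `prompt.yaml` could be loaded and filled with video details from Python. The key names `system` and `user_template` are hypothetical placeholders; check the `prompt.yaml` shipped in the root of the repository for the actual structure.

```python
# Minimal sketch: load a custom prompt.yaml and fill in video details.
# The keys "system" and "user_template" are hypothetical placeholders;
# the real prompt.yaml in the repository root defines the actual structure.
import yaml  # requires PyYAML (pip install pyyaml)

with open("prompt.yaml", "r", encoding="utf-8") as f:
    prompts = yaml.safe_load(f)

system_prompt = prompts["system"]
user_prompt = prompts["user_template"].format(
    title="Example video title",
    description="Example video description",
    transcript="Example transcript text...",
)

print(system_prompt)
print(user_prompt)
```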

### 3. Install dependencies: If you’re using Python-based dependencies, run:
```bash
@@ -136,7 +136,7 @@ The application will be accessible at http://localhost:5000.
## Endpoints

### 1. Blog Post Generator
**Endpoint**: `/video_blog/<video_id_youtube>`
**Endpoint**: `/blog_post/<video_id_youtube>`
**Method**: `GET`
**Description**: Creates a blog post based on the video’s title, description, and transcript.

@@ -145,14 +145,14 @@ The application will be accessible at http://localhost:5000.

**Example**:
```bash
curl http://localhost:5000/video_blog/VIDEO_ID_YOUTUBE?use_openai=true
curl http://localhost:5000/blog_post/VIDEO_ID_YOUTUBE?use_openai=true
```
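
The same request can also be made from Python. The following is a minimal sketch using the `requests` library (not part of the project's documented setup), assuming the API is running locally on port 5000 as shown above:

```python
# Minimal sketch: call the blog post endpoint from Python with requests.
# Assumes the API is running locally on port 5000 and that use_openai is
# passed as a query parameter, mirroring the curl example above.
import requests

video_id = "VIDEO_ID_YOUTUBE"  # replace with a real YouTube video ID
response = requests.get(
    f"http://localhost:5000/blog_post/{video_id}",
    params={"use_openai": "true"},
    timeout=300,  # blog post generation can take a while
)
response.raise_for_status()
print(response.json())
```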

## Examples of Generated Blog Posts

In the `examples-blog-posts-generated` folder, you will find two main directories: `Ollama` and `OpenAI`. Each directory contains one subfolder per example of a blog post generated by the application, and each example folder includes the following files:

1. `{youtube_video_id}.json`: This file contains the response of the endpoint `/video_blog/`, including metadata about the video, its transcript, and the generated blog post.
1. `{youtube_video_id}.json`: This file contains the response of the endpoint `/blog_post/`, including metadata about the video, its transcript, and the generated blog post.
2. `blog_post.md`: This file contains the blog post generated from the video in a readable markdown format.

These examples demonstrate the capabilities of the application in generating blog posts from YouTube videos using different models.