diff --git a/CHANGELOG.md b/CHANGELOG.md
index 0e9d9c1..76b2625 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -2,7 +2,7 @@
 All notable changes to this project will be documented in this file.
 
-## [Unreleased] - 2025-02-22
+## [1.0.1] - 2025-02-22
 ### Added
 - Added support for using OpenAI or models available in Ollama. Users can choose whether to use OpenAI, but they need to check the `README.md` file for OpenAI configuration.
 - Reorganized the endpoint structure. Now, there is only one endpoint: `/blog_post/`, simplifying API interactions.
 
@@ -10,7 +10,7 @@ All notable changes to this project will be documented in this file.
 ### Changed
 - The blog post generation system now uses a separate `prompt.yaml` file to store prompts instead of embedding them in the code. This makes it easier to manage and modify prompts.
 
-## [1.0.0] - 2025-01-28
+## [0.0.1] - 2025-01-28
 ### Added
 - Initial version of the API with two endpoints.
 - The application only allowed Ollama models for generating blog posts.
\ No newline at end of file
diff --git a/README.md b/README.md
index df21e69..3622f81 100644
--- a/README.md
+++ b/README.md
@@ -114,7 +114,7 @@ An example `client_secrets.json` file format is shown below:
 ```
 
 ### (Optional) Update `prompt.yaml` for Custom Prompts
-If you want to customize the prompt used by the `/video_blog` endpoint, you can update the `prompt.yaml` file located in the root directory of the project. This file allows you to define the structure and content of the prompt that will be used to generate the blog post.
+If you want to customize the prompt used by the `/blog_post` endpoint, you can update the `prompt.yaml` file located in the root directory of the project. This file allows you to define the structure and content of the prompt that will be used to generate the blog post.
 
 ### 3. Install dependencies:
 If you’re using Python-based dependencies, run:
 ```bash
@@ -136,7 +136,7 @@ The application will be accessible at http://localhost:5000.
 
 ## Endpoints
 ### 1. Blog Post Generator
-**Endpoint**: `/video_blog/`
+**Endpoint**: `/blog_post/`
 **Method**: `GET`
 **Description**: Creates a blog post based on the video’s title, description, and transcript.
@@ -145,14 +145,14 @@ The application will be accessible at http://localhost:5000.
 
 **Example**:
 ```bash
-curl http://localhost:5000/video_blog/VIDEO_ID_YOUTUBE?use_openai=true
+curl http://localhost:5000/blog_post/VIDEO_ID_YOUTUBE?use_openai=true
 ```
 
 ## Examples of Generated Blog Posts
 
 In the `examples-blog-posts-generated` folder, you will find two main directories: `Ollama` and `OpenAI`. Each of these directories contains subfolders with examples of blog posts generated by the application. Each example is stored in its own folder and includes the following files:
 
-1. `{youtube_video_id}.json`: This file contains the response of the endpoint `/video_blog/`, including metadata about the video, its transcript, and the generated blog post.
+1. `{youtube_video_id}.json`: This file contains the response of the endpoint `/blog_post/`, including metadata about the video, its transcript, and the generated blog post.
 2. `blog_post.md`: This file contains the blog post generated from the video in a readable markdown format.
 
 These examples demonstrate the capabilities of the application in generating blog posts from YouTube videos using different models.
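
To try the renamed endpoint locally, a minimal sketch follows. It assumes the application is running on port 5000 as described in the README, that `use_openai=false` falls back to an Ollama model (the README only shows `use_openai=true`), and that `VIDEO_ID_YOUTUBE` is a placeholder for a real YouTube video ID; the output filename simply mirrors the `{youtube_video_id}.json` convention from the examples folder.

```bash
# Sketch only: assumes the server is running locally on port 5000,
# that use_openai=false selects the Ollama path, and that
# VIDEO_ID_YOUTUBE is replaced with a real YouTube video ID.
curl "http://localhost:5000/blog_post/VIDEO_ID_YOUTUBE?use_openai=false" \
  -o VIDEO_ID_YOUTUBE.json
```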