diff --git a/404.html b/404.html
index 012bc1d..a4c2323 100644
--- a/404.html
+++ b/404.html
@@ -4,13 +4,13 @@
 Page Not Found | eLLMental
-
+
Skip to main content

Page Not Found

We could not find what you were looking for.

Please contact the owner of the site that linked you to the original URL and let them know their link is broken.

- + \ No newline at end of file diff --git a/assets/js/483c79d4.bf236419.js b/assets/js/483c79d4.815e19bf.js similarity index 78% rename from assets/js/483c79d4.bf236419.js rename to assets/js/483c79d4.815e19bf.js index 0013da9..897f416 100644 --- a/assets/js/483c79d4.bf236419.js +++ b/assets/js/483c79d4.815e19bf.js @@ -227,7 +227,7 @@ __webpack_require__.r(__webpack_exports__); /* harmony import */ var _home_runner_work_eLLMental_eLLMental_docs_site_node_modules_babel_runtime_helpers_esm_extends_js__WEBPACK_IMPORTED_MODULE_2__ = __webpack_require__(7462); /* harmony import */ var react__WEBPACK_IMPORTED_MODULE_0__ = __webpack_require__(7294); /* harmony import */ var _mdx_js_react__WEBPACK_IMPORTED_MODULE_1__ = __webpack_require__(3905); -/* @jsxRuntime classic */ /* @jsx mdx */ /* @jsxFrag React.Fragment */const frontMatter={};const contentTitle='Core Abstractions';const metadata={"unversionedId":"components/core_abstractions","id":"components/core_abstractions","title":"Core Abstractions","description":"eLLMental uses different 3rd party components and APIs and provides a unified interface. To ensure extensibility and avoid tight coupling with any specific API, the library provides a series of abstract classes that define the expected interface for these components to work with eLLMental. To use eLLMental, you can provide your own implementation or use one of the built-in concrete implementations.","source":"@site/docs/03_components/01_core_abstractions.md","sourceDirName":"03_components","slug":"/components/core_abstractions","permalink":"/components/core_abstractions","draft":false,"tags":[],"version":"current","lastUpdatedBy":"Javier Toledo","lastUpdatedAt":1693394277,"formattedLastUpdatedAt":"Aug 30, 2023","sidebarPosition":1,"frontMatter":{},"sidebar":"docs","previous":{"title":"Getting started","permalink":"/getting-started"},"next":{"title":"EmbeddingsSpaceComponent","permalink":"/components/embeddings_space"}};const assets={};const toc=[{value:'Embedding object',id:'embedding-object',level:2},{value:'EmbeddingsGenerationModel',id:'embeddingsgenerationmodel',level:2},{value:'OpenAIEmbeddingsGenerationModel',id:'openaiembeddingsgenerationmodel',level:3},{value:'EmbeddingsStore',id:'embeddingsstore',level:2},{value:'PineconeEmbeddingsStore',id:'pineconeembeddingsstore',level:3}];const layoutProps={toc};const MDXLayout="wrapper";function MDXContent(_ref){let{components,...props}=_ref;return (0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)(MDXLayout,(0,_home_runner_work_eLLMental_eLLMental_docs_site_node_modules_babel_runtime_helpers_esm_extends_js__WEBPACK_IMPORTED_MODULE_2__/* ["default"] */ .Z)({},layoutProps,props,{components:components,mdxType:"MDXLayout"}),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("h1",{"id":"core-abstractions"},`Core Abstractions`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("p",null,`eLLMental uses different 3rd party components and APIs and provides a unified interface. To ensure extensibility and avoid tight coupling with any specific API, the library provides a series of abstract classes that define the expected interface for these components to work with eLLMental. 
To use eLLMental, you can provide your own implementation or use one of the built-in concrete implementations.`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("h2",{"id":"embedding-object"},(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("inlineCode",{parentName:"h2"},`Embedding`),` object`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("p",null,`In eLLMental embeddings are represented by the `,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("inlineCode",{parentName:"p"},`Embedding`),` record, which has the following attributes:`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("ul",null,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("li",{parentName:"ul"},(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("inlineCode",{parentName:"li"},`id`),`: An unique identifier of the embedding.`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("li",{parentName:"ul"},(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("inlineCode",{parentName:"li"},`vector`),`: A numeric vector that represents the semantic location of the text.`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("li",{parentName:"ul"},(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("inlineCode",{parentName:"li"},`metadata`),`: Additional information associated with the embedding. It can be used to store the original text, the model used to generate the embedding, or any other information you may find useful.`)),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("pre",null,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("code",{parentName:"pre","className":"language-java"},`public record Embedding( +/* @jsxRuntime classic */ /* @jsx mdx */ /* @jsxFrag React.Fragment */const frontMatter={};const contentTitle='Core Abstractions';const metadata={"unversionedId":"components/core_abstractions","id":"components/core_abstractions","title":"Core Abstractions","description":"eLLMental uses different 3rd party components and APIs and provides a unified interface. To ensure extensibility and avoid tight coupling with any specific API, the library provides a series of abstract classes that define the expected interface for these components to work with eLLMental. 
To use eLLMental, you can provide your own implementation or use one of the built-in concrete implementations.","source":"@site/docs/03_components/01_core_abstractions.md","sourceDirName":"03_components","slug":"/components/core_abstractions","permalink":"/components/core_abstractions","draft":false,"tags":[],"version":"current","lastUpdatedBy":"Juan José Rodríguez López","lastUpdatedAt":1693488408,"formattedLastUpdatedAt":"Aug 31, 2023","sidebarPosition":1,"frontMatter":{},"sidebar":"docs","previous":{"title":"Getting started","permalink":"/getting-started"},"next":{"title":"EmbeddingsSpaceComponent","permalink":"/components/embeddings_space"}};const assets={};const toc=[{value:'Embedding object',id:'embedding-object',level:2},{value:'EmbeddingsGenerationModel',id:'embeddingsgenerationmodel',level:2},{value:'OpenAIEmbeddingsGenerationModel',id:'openaiembeddingsgenerationmodel',level:3},{value:'EmbeddingsStore',id:'embeddingsstore',level:2},{value:'PineconeEmbeddingsStore',id:'pineconeembeddingsstore',level:3}];const layoutProps={toc};const MDXLayout="wrapper";function MDXContent(_ref){let{components,...props}=_ref;return (0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)(MDXLayout,(0,_home_runner_work_eLLMental_eLLMental_docs_site_node_modules_babel_runtime_helpers_esm_extends_js__WEBPACK_IMPORTED_MODULE_2__/* ["default"] */ .Z)({},layoutProps,props,{components:components,mdxType:"MDXLayout"}),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("h1",{"id":"core-abstractions"},`Core Abstractions`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("p",null,`eLLMental uses different 3rd party components and APIs and provides a unified interface. To ensure extensibility and avoid tight coupling with any specific API, the library provides a series of abstract classes that define the expected interface for these components to work with eLLMental. To use eLLMental, you can provide your own implementation or use one of the built-in concrete implementations.`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("h2",{"id":"embedding-object"},(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("inlineCode",{parentName:"h2"},`Embedding`),` object`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("p",null,`In eLLMental embeddings are represented by the `,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("inlineCode",{parentName:"p"},`Embedding`),` record, which has the following attributes:`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("ul",null,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("li",{parentName:"ul"},(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("inlineCode",{parentName:"li"},`id`),`: An unique identifier of the embedding.`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("li",{parentName:"ul"},(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("inlineCode",{parentName:"li"},`vector`),`: A numeric vector that represents the semantic location of the text.`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("li",{parentName:"ul"},(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("inlineCode",{parentName:"li"},`metadata`),`: Additional information associated with the embedding. 
It can be used to store the original text, the model used to generate the embedding, or any other information you may find useful.`)),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("pre",null,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("code",{parentName:"pre","className":"language-java"},`public record Embedding( UUID id, List vector, Map metadata diff --git a/assets/js/5b471282.6003f070.js b/assets/js/5b471282.09dde483.js similarity index 74% rename from assets/js/5b471282.6003f070.js rename to assets/js/5b471282.09dde483.js index 4d306d7..4cb82ce 100644 --- a/assets/js/5b471282.6003f070.js +++ b/assets/js/5b471282.09dde483.js @@ -227,7 +227,7 @@ __webpack_require__.r(__webpack_exports__); /* harmony import */ var _home_runner_work_eLLMental_eLLMental_docs_site_node_modules_babel_runtime_helpers_esm_extends_js__WEBPACK_IMPORTED_MODULE_2__ = __webpack_require__(7462); /* harmony import */ var react__WEBPACK_IMPORTED_MODULE_0__ = __webpack_require__(7294); /* harmony import */ var _mdx_js_react__WEBPACK_IMPORTED_MODULE_1__ = __webpack_require__(3905); -/* @jsxRuntime classic */ /* @jsx mdx */ /* @jsxFrag React.Fragment */const frontMatter={};const contentTitle='Community';const metadata={"unversionedId":"community/index","id":"community/index","title":"Community","description":"Community contributions are essential to the development and refinement of eLLMental. You can become a part of the eLLMental community in the following ways:","source":"@site/docs/04_community/index.md","sourceDirName":"04_community","slug":"/community/","permalink":"/community/","draft":false,"tags":[],"version":"current","lastUpdatedBy":"Javier Toledo","lastUpdatedAt":1693394277,"formattedLastUpdatedAt":"Aug 30, 2023","frontMatter":{},"sidebar":"docs","previous":{"title":"EmbeddingsSpaceComponent","permalink":"/components/embeddings_space"},"next":{"title":"Contributing Guide","permalink":"/community/contributing"}};const assets={};const toc=[];const layoutProps={toc};const MDXLayout="wrapper";function MDXContent(_ref){let{components,...props}=_ref;return (0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)(MDXLayout,(0,_home_runner_work_eLLMental_eLLMental_docs_site_node_modules_babel_runtime_helpers_esm_extends_js__WEBPACK_IMPORTED_MODULE_2__/* ["default"] */ .Z)({},layoutProps,props,{components:components,mdxType:"MDXLayout"}),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("h1",{"id":"community"},`Community`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("p",null,`Community contributions are essential to the development and refinement of eLLMental. 
You can become a part of the eLLMental community in the following ways:`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("ol",null,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("li",{parentName:"ol"},`Join the conversation in our `,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("a",{parentName:"li","href":"https://discord.gg/34cBbvjjAx"},`Discord server`),`.`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("li",{parentName:"ol"},`Send us suggestions, questions, or feature requests by creating a `,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("a",{parentName:"li","href":"https://github.com/theam/ellmental/issues/new"},`New Issue`),`.`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("li",{parentName:"ol"},`Look at the `,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("a",{parentName:"li","href":"https://github.com/theam/ellmental/issues"},`Open Issues`),`, choose one that interests you, and start contributing.`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("li",{parentName:"ol"},`Spread the word about eLLMental!`)),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("p",null,`Find detailed instructions and guidelines in the `,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("a",{parentName:"p","href":"contributing"},`Contributing Guide`),`, and make sure to read our `,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("a",{parentName:"p","href":"code_of_conduct"},`Code of Conduct`),` before you start contributing.`));};MDXContent.isMDXComponent=true; +/* @jsxRuntime classic */ /* @jsx mdx */ /* @jsxFrag React.Fragment */const frontMatter={};const contentTitle='Community';const metadata={"unversionedId":"community/index","id":"community/index","title":"Community","description":"Community contributions are essential to the development and refinement of eLLMental. You can become a part of the eLLMental community in the following ways:","source":"@site/docs/04_community/index.md","sourceDirName":"04_community","slug":"/community/","permalink":"/community/","draft":false,"tags":[],"version":"current","lastUpdatedBy":"Juan José Rodríguez López","lastUpdatedAt":1693488408,"formattedLastUpdatedAt":"Aug 31, 2023","frontMatter":{},"sidebar":"docs","previous":{"title":"EmbeddingsSpaceComponent","permalink":"/components/embeddings_space"},"next":{"title":"Contributing Guide","permalink":"/community/contributing"}};const assets={};const toc=[];const layoutProps={toc};const MDXLayout="wrapper";function MDXContent(_ref){let{components,...props}=_ref;return (0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)(MDXLayout,(0,_home_runner_work_eLLMental_eLLMental_docs_site_node_modules_babel_runtime_helpers_esm_extends_js__WEBPACK_IMPORTED_MODULE_2__/* ["default"] */ .Z)({},layoutProps,props,{components:components,mdxType:"MDXLayout"}),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("h1",{"id":"community"},`Community`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("p",null,`Community contributions are essential to the development and refinement of eLLMental. 
You can become a part of the eLLMental community in the following ways:`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("ol",null,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("li",{parentName:"ol"},`Join the conversation in our `,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("a",{parentName:"li","href":"https://discord.gg/34cBbvjjAx"},`Discord server`),`.`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("li",{parentName:"ol"},`Send us suggestions, questions, or feature requests by creating a `,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("a",{parentName:"li","href":"https://github.com/theam/ellmental/issues/new"},`New Issue`),`.`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("li",{parentName:"ol"},`Look at the `,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("a",{parentName:"li","href":"https://github.com/theam/ellmental/issues"},`Open Issues`),`, choose one that interests you, and start contributing.`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("li",{parentName:"ol"},`Spread the word about eLLMental!`)),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("p",null,`Find detailed instructions and guidelines in the `,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("a",{parentName:"p","href":"contributing"},`Contributing Guide`),`, and make sure to read our `,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("a",{parentName:"p","href":"code_of_conduct"},`Code of Conduct`),` before you start contributing.`));};MDXContent.isMDXComponent=true; /***/ }) diff --git a/assets/js/6866e3af.09b3a8dc.js b/assets/js/6866e3af.25c7bf52.js similarity index 84% rename from assets/js/6866e3af.09b3a8dc.js rename to assets/js/6866e3af.25c7bf52.js index a0cbfee..d1f6881 100644 --- a/assets/js/6866e3af.09b3a8dc.js +++ b/assets/js/6866e3af.25c7bf52.js @@ -227,7 +227,7 @@ __webpack_require__.r(__webpack_exports__); /* harmony import */ var _home_runner_work_eLLMental_eLLMental_docs_site_node_modules_babel_runtime_helpers_esm_extends_js__WEBPACK_IMPORTED_MODULE_2__ = __webpack_require__(7462); /* harmony import */ var react__WEBPACK_IMPORTED_MODULE_0__ = __webpack_require__(7294); /* harmony import */ var _mdx_js_react__WEBPACK_IMPORTED_MODULE_1__ = __webpack_require__(3905); -/* @jsxRuntime classic */ /* @jsx mdx */ /* @jsxFrag React.Fragment */const frontMatter={slug:'/getting-started'};const contentTitle='Getting started';const metadata={"unversionedId":"getting_started","id":"getting_started","title":"Getting started","description":"eLLMental is a library designed for building AI-powered applications written in Java, and it offers production-ready components that can be used right away in your current JVM projects. 
In this guide, we will showcase how to use the EmbeddingsSpaceComponent to find relevant text based on a query.","source":"@site/docs/02_getting_started.md","sourceDirName":".","slug":"/getting-started","permalink":"/getting-started","draft":false,"tags":[],"version":"current","lastUpdatedBy":"Javier Toledo","lastUpdatedAt":1693394277,"formattedLastUpdatedAt":"Aug 30, 2023","sidebarPosition":2,"frontMatter":{"slug":"/getting-started"},"sidebar":"docs","previous":{"title":"Introduction","permalink":"/"},"next":{"title":"Core Abstractions","permalink":"/components/core_abstractions"}};const assets={};const toc=[{value:'Step 1: Add the eLLMental dependencies',id:'step-1-add-the-ellmental-dependencies',level:2},{value:'Gradle',id:'gradle',level:3},{value:'Maven',id:'maven',level:3},{value:'Step 2: Initializing the EmbeddingsSpaceComponent',id:'step-2-initializing-the-embeddingsspacecomponent',level:2},{value:'Step 3: Running the example',id:'step-3-running-the-example',level:2},{value:'eLLMental ❤️ Springboot',id:'ellmental-️-springboot',level:2},{value:'Importing env variables from application.properties',id:'importing-env-variables-from-applicationproperties',level:3},{value:'Configuring EmbeddingsSpaceComponent',id:'configuring-embeddingsspacecomponent',level:3},{value:'Autowiring EmbeddingsSpaceComponent',id:'autowiring-embeddingsspacecomponent',level:3},{value:'Next steps',id:'next-steps',level:2}];const layoutProps={toc};const MDXLayout="wrapper";function MDXContent(_ref){let{components,...props}=_ref;return (0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)(MDXLayout,(0,_home_runner_work_eLLMental_eLLMental_docs_site_node_modules_babel_runtime_helpers_esm_extends_js__WEBPACK_IMPORTED_MODULE_2__/* ["default"] */ .Z)({},layoutProps,props,{components:components,mdxType:"MDXLayout"}),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("h1",{"id":"getting-started"},`Getting started`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("p",null,`eLLMental is a library designed for building AI-powered applications written in Java, and it offers production-ready components that can be used right away in your current JVM projects. In this guide, we will showcase how to use the `,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("inlineCode",{parentName:"p"},`EmbeddingsSpaceComponent`),` to find relevant text based on a query.`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("h2",{"id":"step-1-add-the-ellmental-dependencies"},`Step 1: Add the eLLMental dependencies`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("p",null,`In eLLMental, we make use of `,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("a",{parentName:"p","href":"https://jitpack.io"},`JitPack`),` to import eLLMental into our projects. 
Below there are some examples of how you can use it.`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("h3",{"id":"gradle"},`Gradle`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("p",null,`Incorporate the eLLMental dependencies into your `,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("inlineCode",{parentName:"p"},`build.gradle`),` file.`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("pre",null,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("code",{parentName:"pre","className":"language-java"},`allprojects { +/* @jsxRuntime classic */ /* @jsx mdx */ /* @jsxFrag React.Fragment */const frontMatter={slug:'/getting-started'};const contentTitle='Getting started';const metadata={"unversionedId":"getting_started","id":"getting_started","title":"Getting started","description":"eLLMental is a library designed for building AI-powered applications written in Java, and it offers production-ready components that can be used right away in your current JVM projects. In this guide, we will showcase how to use the EmbeddingsSpaceComponent to find relevant text based on a query.","source":"@site/docs/02_getting_started.md","sourceDirName":".","slug":"/getting-started","permalink":"/getting-started","draft":false,"tags":[],"version":"current","lastUpdatedBy":"Juan José Rodríguez López","lastUpdatedAt":1693488408,"formattedLastUpdatedAt":"Aug 31, 2023","sidebarPosition":2,"frontMatter":{"slug":"/getting-started"},"sidebar":"docs","previous":{"title":"Introduction","permalink":"/"},"next":{"title":"Core Abstractions","permalink":"/components/core_abstractions"}};const assets={};const toc=[{value:'Step 1: Add the eLLMental dependencies',id:'step-1-add-the-ellmental-dependencies',level:2},{value:'Gradle',id:'gradle',level:3},{value:'Maven',id:'maven',level:3},{value:'Step 2: Initializing the EmbeddingsSpaceComponent',id:'step-2-initializing-the-embeddingsspacecomponent',level:2},{value:'Step 3: Running the example',id:'step-3-running-the-example',level:2},{value:'eLLMental ❤️ Springboot',id:'ellmental-️-springboot',level:2},{value:'Importing env variables from application.properties',id:'importing-env-variables-from-applicationproperties',level:3},{value:'Configuring EmbeddingsSpaceComponent',id:'configuring-embeddingsspacecomponent',level:3},{value:'Autowiring EmbeddingsSpaceComponent',id:'autowiring-embeddingsspacecomponent',level:3},{value:'Next steps',id:'next-steps',level:2}];const layoutProps={toc};const MDXLayout="wrapper";function MDXContent(_ref){let{components,...props}=_ref;return (0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)(MDXLayout,(0,_home_runner_work_eLLMental_eLLMental_docs_site_node_modules_babel_runtime_helpers_esm_extends_js__WEBPACK_IMPORTED_MODULE_2__/* ["default"] */ .Z)({},layoutProps,props,{components:components,mdxType:"MDXLayout"}),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("h1",{"id":"getting-started"},`Getting started`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("p",null,`eLLMental is a library designed for building AI-powered applications written in Java, and it offers production-ready components that can be used right away in your current JVM projects. 
In this guide, we will showcase how to use the `,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("inlineCode",{parentName:"p"},`EmbeddingsSpaceComponent`),` to find relevant text based on a query.`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("h2",{"id":"step-1-add-the-ellmental-dependencies"},`Step 1: Add the eLLMental dependencies`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("p",null,`In eLLMental, we make use of `,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("a",{parentName:"p","href":"https://jitpack.io"},`JitPack`),` to import eLLMental into our projects. Below there are some examples of how you can use it.`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("h3",{"id":"gradle"},`Gradle`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("p",null,`Incorporate the eLLMental dependencies into your `,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("inlineCode",{parentName:"p"},`build.gradle`),` file.`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("pre",null,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("code",{parentName:"pre","className":"language-java"},`allprojects { repositories { maven { url 'https://jitpack.io' } } diff --git a/assets/js/972024fb.7c7b9639.js b/assets/js/972024fb.5d3d7b73.js similarity index 94% rename from assets/js/972024fb.7c7b9639.js rename to assets/js/972024fb.5d3d7b73.js index 207e96a..0073d64 100644 --- a/assets/js/972024fb.7c7b9639.js +++ b/assets/js/972024fb.5d3d7b73.js @@ -286,7 +286,7 @@ enforcement ladder`),`.`),(0,esm/* mdx */.kt)("p",null,`For answers to common qu `,(0,esm/* mdx */.kt)("a",{parentName:"p","href":"https://www.contributor-covenant.org/faq"},`https://www.contributor-covenant.org/faq`),`. 
Translations are available at `,(0,esm/* mdx */.kt)("a",{parentName:"p","href":"https://www.contributor-covenant.org/translations"},`https://www.contributor-covenant.org/translations`),`.`));};MDXContent.isMDXComponent=true; ;// CONCATENATED MODULE: ./docs/04_community/code_of_conduct.mdx -/* @jsxRuntime classic */ /* @jsx mdx */ /* @jsxFrag React.Fragment */const code_of_conduct_frontMatter={};const code_of_conduct_contentTitle='Contributor Covenant Code of Conduct';const metadata={"unversionedId":"community/code_of_conduct","id":"community/code_of_conduct","title":"Contributor Covenant Code of Conduct","description":"","source":"@site/docs/04_community/code_of_conduct.mdx","sourceDirName":"04_community","slug":"/community/code_of_conduct","permalink":"/community/code_of_conduct","draft":false,"tags":[],"version":"current","lastUpdatedBy":"Javier Toledo","lastUpdatedAt":1693394277,"formattedLastUpdatedAt":"Aug 30, 2023","frontMatter":{},"sidebar":"docs","previous":{"title":"Contributing Guide","permalink":"/community/contributing"}};const assets={};const code_of_conduct_toc=[];const code_of_conduct_layoutProps={toc: code_of_conduct_toc};const code_of_conduct_MDXLayout="wrapper";function code_of_conduct_MDXContent(_ref){let{components,...props}=_ref;return (0,esm/* mdx */.kt)(code_of_conduct_MDXLayout,(0,esm_extends/* default */.Z)({},code_of_conduct_layoutProps,props,{components:components,mdxType:"MDXLayout"}),(0,esm/* mdx */.kt)("h1",{"id":"contributor-covenant-code-of-conduct"},`Contributor Covenant Code of Conduct`),(0,esm/* mdx */.kt)("div",{class:"hiddenh1s"},(0,esm/* mdx */.kt)(MDXContent,{mdxType:"CodeOfConduct"})));};code_of_conduct_MDXContent.isMDXComponent=true; +/* @jsxRuntime classic */ /* @jsx mdx */ /* @jsxFrag React.Fragment */const code_of_conduct_frontMatter={};const code_of_conduct_contentTitle='Contributor Covenant Code of Conduct';const metadata={"unversionedId":"community/code_of_conduct","id":"community/code_of_conduct","title":"Contributor Covenant Code of Conduct","description":"","source":"@site/docs/04_community/code_of_conduct.mdx","sourceDirName":"04_community","slug":"/community/code_of_conduct","permalink":"/community/code_of_conduct","draft":false,"tags":[],"version":"current","lastUpdatedBy":"Juan José Rodríguez López","lastUpdatedAt":1693488408,"formattedLastUpdatedAt":"Aug 31, 2023","frontMatter":{},"sidebar":"docs","previous":{"title":"Contributing Guide","permalink":"/community/contributing"}};const assets={};const code_of_conduct_toc=[];const code_of_conduct_layoutProps={toc: code_of_conduct_toc};const code_of_conduct_MDXLayout="wrapper";function code_of_conduct_MDXContent(_ref){let{components,...props}=_ref;return (0,esm/* mdx */.kt)(code_of_conduct_MDXLayout,(0,esm_extends/* default */.Z)({},code_of_conduct_layoutProps,props,{components:components,mdxType:"MDXLayout"}),(0,esm/* mdx */.kt)("h1",{"id":"contributor-covenant-code-of-conduct"},`Contributor Covenant Code of Conduct`),(0,esm/* mdx */.kt)("div",{class:"hiddenh1s"},(0,esm/* mdx */.kt)(MDXContent,{mdxType:"CodeOfConduct"})));};code_of_conduct_MDXContent.isMDXComponent=true; /***/ }) diff --git a/assets/js/c01ae077.d41d41a7.js b/assets/js/c01ae077.52c2792f.js similarity index 55% rename from assets/js/c01ae077.d41d41a7.js rename to assets/js/c01ae077.52c2792f.js index a333e2f..5d8a95a 100644 --- a/assets/js/c01ae077.d41d41a7.js +++ b/assets/js/c01ae077.52c2792f.js @@ -227,7 +227,7 @@ __webpack_require__.r(__webpack_exports__); /* harmony import */ var 
_home_runner_work_eLLMental_eLLMental_docs_site_node_modules_babel_runtime_helpers_esm_extends_js__WEBPACK_IMPORTED_MODULE_2__ = __webpack_require__(7462); /* harmony import */ var react__WEBPACK_IMPORTED_MODULE_0__ = __webpack_require__(7294); /* harmony import */ var _mdx_js_react__WEBPACK_IMPORTED_MODULE_1__ = __webpack_require__(3905); -/* @jsxRuntime classic */ /* @jsx mdx */ /* @jsxFrag React.Fragment */const frontMatter={slug:'/'};const contentTitle='Introduction';const metadata={"unversionedId":"introduction","id":"introduction","title":"Introduction","description":"eLLMental is the ultimate library of components for building LLM-driven projects in the JVM.","source":"@site/docs/01_introduction.md","sourceDirName":".","slug":"/","permalink":"/","draft":false,"tags":[],"version":"current","lastUpdatedBy":"Javier Toledo","lastUpdatedAt":1693394277,"formattedLastUpdatedAt":"Aug 30, 2023","sidebarPosition":1,"frontMatter":{"slug":"/"},"sidebar":"docs","next":{"title":"Getting started","permalink":"/getting-started"}};const assets={};const toc=[{value:'What can you do with eLLMental?',id:'what-can-you-do-with-ellmental',level:2},{value:'Embeddings Space Component',id:'embeddings-space-component',level:3},{value:'eLLMental Principles',id:'ellmental-principles',level:2},{value:'Join the movement!',id:'join-the-movement',level:2}];const layoutProps={toc};const MDXLayout="wrapper";function MDXContent(_ref){let{components,...props}=_ref;return (0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)(MDXLayout,(0,_home_runner_work_eLLMental_eLLMental_docs_site_node_modules_babel_runtime_helpers_esm_extends_js__WEBPACK_IMPORTED_MODULE_2__/* ["default"] */ .Z)({},layoutProps,props,{components:components,mdxType:"MDXLayout"}),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("h1",{"id":"introduction"},`Introduction`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("p",null,`eLLMental is the ultimate library of components for building LLM-driven projects in the JVM.`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("p",null,`Wanna try? Go straight to the `,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("a",{parentName:"p","href":"/getting-started"},`Getting Started Guide`),`, or keep reading to know more about eLLMental.`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("h2",{"id":"what-can-you-do-with-ellmental"},`What can you do with eLLMental?`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("p",null,`eLLMental is divided into components that can be installed and used independently. Here's a summary of the available functionality:`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("h3",{"id":"embeddings-space-component"},`Embeddings Space Component`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("p",null,`Embedding models are a special kind of Large Language Models (LLMs) that allow, given a piece of text, to calculate a large vector that represents a point in what we call the embeddings space. This embeddings space has the property that two pieces of text that are semantically related will be placed close to each other, allowing us to calculate a semantic distance between any two given pieces of text. 
Embeddings can be used to implement powerful search features that go beyond keyword matching, find related documents in a large database, or detect redundant information even when it's written in different ways.`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("p",null,`The Embeddings Space Component provides straightforward interfaces to create and operate with embeddings, find the semantically closest documents to a given piece of text and many other operations. See the Embeddings Semantic Search Component documentation page for more details.`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("h2",{"id":"ellmental-principles"},`eLLMental Principles`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("p",null,`These are the design principles behind eLLMental:`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("ol",null,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("li",{parentName:"ol"},(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("p",{parentName:"li"},(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("strong",{parentName:"p"},`Simplicity in Complexity:`),` We aim to make the AI development process as simple and intuitive as any other library, hiding implementation details and glue code so the developer can focus on creating value.`)),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("li",{parentName:"ol"},(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("p",{parentName:"li"},(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("strong",{parentName:"p"},`Readiness for Production:`),` From development to deployment, all features of eLLMental are crafted with a production-ready mindset.`)),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("li",{parentName:"ol"},(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("p",{parentName:"li"},(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("strong",{parentName:"p"},`Continuous Improvement:`),` eLLMental continuously evolves for the better. With the support of our active community and dedicated team, we regularly add improvements and introduce new features.`))),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("h2",{"id":"join-the-movement"},`Join the movement!`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("p",null,`We'll need your help to build something that becomes really useful for everyone. 
There are many things you can do to contribute:`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("ol",null,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("li",{parentName:"ol"},`Join the conversation in our `,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("a",{parentName:"li","href":"https://discord.gg/34cBbvjjAx"},`Discord server`),`.`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("li",{parentName:"ol"},`Send us suggestions, questions, or feature requests by creating a `,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("a",{parentName:"li","href":"https://github.com/theam/ellmental/issues/new"},`New Issue`),`.`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("li",{parentName:"ol"},`Look at the `,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("a",{parentName:"li","href":"https://github.com/theam/ellmental/issues"},`Open Issues`),`, choose one that interests you, and start contributing.`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("li",{parentName:"ol"},`Spread the word about eLLMental!`)));};MDXContent.isMDXComponent=true; +/* @jsxRuntime classic */ /* @jsx mdx */ /* @jsxFrag React.Fragment */const frontMatter={slug:'/'};const contentTitle='Introduction';const metadata={"unversionedId":"introduction","id":"introduction","title":"Introduction","description":"eLLMental is the ultimate library of components for building LLM-driven projects in the JVM.","source":"@site/docs/01_introduction.md","sourceDirName":".","slug":"/","permalink":"/","draft":false,"tags":[],"version":"current","lastUpdatedBy":"Juan José Rodríguez López","lastUpdatedAt":1693488408,"formattedLastUpdatedAt":"Aug 31, 2023","sidebarPosition":1,"frontMatter":{"slug":"/"},"sidebar":"docs","next":{"title":"Getting started","permalink":"/getting-started"}};const assets={};const toc=[{value:'What can you do with eLLMental?',id:'what-can-you-do-with-ellmental',level:2},{value:'Embeddings Space Component',id:'embeddings-space-component',level:3},{value:'eLLMental Principles',id:'ellmental-principles',level:2},{value:'Join the movement!',id:'join-the-movement',level:2}];const layoutProps={toc};const MDXLayout="wrapper";function MDXContent(_ref){let{components,...props}=_ref;return (0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)(MDXLayout,(0,_home_runner_work_eLLMental_eLLMental_docs_site_node_modules_babel_runtime_helpers_esm_extends_js__WEBPACK_IMPORTED_MODULE_2__/* ["default"] */ .Z)({},layoutProps,props,{components:components,mdxType:"MDXLayout"}),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("h1",{"id":"introduction"},`Introduction`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("p",null,`eLLMental is the ultimate library of components for building LLM-driven projects in the JVM.`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("p",null,`Wanna try? Go straight to the `,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("a",{parentName:"p","href":"/getting-started"},`Getting Started Guide`),`, or keep reading to know more about eLLMental.`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("h2",{"id":"what-can-you-do-with-ellmental"},`What can you do with eLLMental?`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("p",null,`eLLMental is divided into components that can be installed and used independently. 
Here's a summary of the available functionality:`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("h3",{"id":"embeddings-space-component"},`Embeddings Space Component`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("p",null,`Embedding models are a special kind of Large Language Models (LLMs) that allow, given a piece of text, to calculate a large vector that represents a point in what we call the embeddings space. This embeddings space has the property that two pieces of text that are semantically related will be placed close to each other, allowing us to calculate a semantic distance between any two given pieces of text. Embeddings can be used to implement powerful search features that go beyond keyword matching, find related documents in a large database, or detect redundant information even when it's written in different ways.`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("p",null,`The Embeddings Space Component provides straightforward interfaces to create and operate with embeddings, find the semantically closest documents to a given piece of text and many other operations. See the Embeddings Semantic Search Component documentation page for more details.`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("h2",{"id":"ellmental-principles"},`eLLMental Principles`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("p",null,`These are the design principles behind eLLMental:`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("ol",null,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("li",{parentName:"ol"},(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("p",{parentName:"li"},(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("strong",{parentName:"p"},`Simplicity in Complexity:`),` We aim to make the AI development process as simple and intuitive as any other library, hiding implementation details and glue code so the developer can focus on creating value.`)),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("li",{parentName:"ol"},(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("p",{parentName:"li"},(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("strong",{parentName:"p"},`Readiness for Production:`),` From development to deployment, all features of eLLMental are crafted with a production-ready mindset.`)),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("li",{parentName:"ol"},(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("p",{parentName:"li"},(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("strong",{parentName:"p"},`Continuous Improvement:`),` eLLMental continuously evolves for the better. With the support of our active community and dedicated team, we regularly add improvements and introduce new features.`))),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("h2",{"id":"join-the-movement"},`Join the movement!`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("p",null,`We'll need your help to build something that becomes really useful for everyone. 
There are many things you can do to contribute:`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("ol",null,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("li",{parentName:"ol"},`Join the conversation in our `,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("a",{parentName:"li","href":"https://discord.gg/34cBbvjjAx"},`Discord server`),`.`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("li",{parentName:"ol"},`Send us suggestions, questions, or feature requests by creating a `,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("a",{parentName:"li","href":"https://github.com/theam/ellmental/issues/new"},`New Issue`),`.`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("li",{parentName:"ol"},`Look at the `,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("a",{parentName:"li","href":"https://github.com/theam/ellmental/issues"},`Open Issues`),`, choose one that interests you, and start contributing.`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("li",{parentName:"ol"},`Spread the word about eLLMental!`)));};MDXContent.isMDXComponent=true; /***/ }) diff --git a/assets/js/db385b99.c36257aa.js b/assets/js/db385b99.9e3b316d.js similarity index 96% rename from assets/js/db385b99.c36257aa.js rename to assets/js/db385b99.9e3b316d.js index b3c64f6..0ec3de1 100644 --- a/assets/js/db385b99.c36257aa.js +++ b/assets/js/db385b99.9e3b316d.js @@ -291,7 +291,7 @@ contribute and reduces the chance of duplicate work.`),(0,esm/* mdx */.kt)("h2", to `,(0,esm/* mdx */.kt)("a",{parentName:"p","href":"mailto:info@theagilemonkeys.com"},`info@theagilemonkeys.com`),`, or joining our official `,(0,esm/* mdx */.kt)("a",{parentName:"p","href":"https://discord.gg/34cBbvjjAx"},`Discord server`),`. 
We will be more than happy to hear about you!`));};MDXContent.isMDXComponent=true; ;// CONCATENATED MODULE: ./docs/04_community/contributing.mdx -/* @jsxRuntime classic */ /* @jsx mdx */ /* @jsxFrag React.Fragment */const contributing_frontMatter={};const contributing_contentTitle='Contributing Guide';const metadata={"unversionedId":"community/contributing","id":"community/contributing","title":"Contributing Guide","description":"","source":"@site/docs/04_community/contributing.mdx","sourceDirName":"04_community","slug":"/community/contributing","permalink":"/community/contributing","draft":false,"tags":[],"version":"current","lastUpdatedBy":"Javier Toledo","lastUpdatedAt":1693394277,"formattedLastUpdatedAt":"Aug 30, 2023","frontMatter":{},"sidebar":"docs","previous":{"title":"Community","permalink":"/community/"},"next":{"title":"Contributor Covenant Code of Conduct","permalink":"/community/code_of_conduct"}};const assets={};const contributing_toc=[];const contributing_layoutProps={toc: contributing_toc};const contributing_MDXLayout="wrapper";function contributing_MDXContent(_ref){let{components,...props}=_ref;return (0,esm/* mdx */.kt)(contributing_MDXLayout,(0,esm_extends/* default */.Z)({},contributing_layoutProps,props,{components:components,mdxType:"MDXLayout"}),(0,esm/* mdx */.kt)("h1",{"id":"contributing-guide"},`Contributing Guide`),(0,esm/* mdx */.kt)("div",{class:"hiddenh1s"},(0,esm/* mdx */.kt)(MDXContent,{mdxType:"ContributingGuide"})));};contributing_MDXContent.isMDXComponent=true; +/* @jsxRuntime classic */ /* @jsx mdx */ /* @jsxFrag React.Fragment */const contributing_frontMatter={};const contributing_contentTitle='Contributing Guide';const metadata={"unversionedId":"community/contributing","id":"community/contributing","title":"Contributing Guide","description":"","source":"@site/docs/04_community/contributing.mdx","sourceDirName":"04_community","slug":"/community/contributing","permalink":"/community/contributing","draft":false,"tags":[],"version":"current","lastUpdatedBy":"Juan José Rodríguez López","lastUpdatedAt":1693488408,"formattedLastUpdatedAt":"Aug 31, 2023","frontMatter":{},"sidebar":"docs","previous":{"title":"Community","permalink":"/community/"},"next":{"title":"Contributor Covenant Code of Conduct","permalink":"/community/code_of_conduct"}};const assets={};const contributing_toc=[];const contributing_layoutProps={toc: contributing_toc};const contributing_MDXLayout="wrapper";function contributing_MDXContent(_ref){let{components,...props}=_ref;return (0,esm/* mdx */.kt)(contributing_MDXLayout,(0,esm_extends/* default */.Z)({},contributing_layoutProps,props,{components:components,mdxType:"MDXLayout"}),(0,esm/* mdx */.kt)("h1",{"id":"contributing-guide"},`Contributing Guide`),(0,esm/* mdx */.kt)("div",{class:"hiddenh1s"},(0,esm/* mdx */.kt)(MDXContent,{mdxType:"ContributingGuide"})));};contributing_MDXContent.isMDXComponent=true; /***/ }) diff --git a/assets/js/e9c83615.7898d12a.js b/assets/js/e9c83615.44887121.js similarity index 75% rename from assets/js/e9c83615.7898d12a.js rename to assets/js/e9c83615.44887121.js index 40c2759..838e465 100644 --- a/assets/js/e9c83615.7898d12a.js +++ b/assets/js/e9c83615.44887121.js @@ -227,22 +227,11 @@ __webpack_require__.r(__webpack_exports__); /* harmony import */ var _home_runner_work_eLLMental_eLLMental_docs_site_node_modules_babel_runtime_helpers_esm_extends_js__WEBPACK_IMPORTED_MODULE_2__ = __webpack_require__(7462); /* harmony import */ var react__WEBPACK_IMPORTED_MODULE_0__ = __webpack_require__(7294); /* harmony 
import */ var _mdx_js_react__WEBPACK_IMPORTED_MODULE_1__ = __webpack_require__(3905); -/* @jsxRuntime classic */ /* @jsx mdx */ /* @jsxFrag React.Fragment */const frontMatter={};const contentTitle='EmbeddingsSpaceComponent';const metadata={"unversionedId":"components/embeddings_space","id":"components/embeddings_space","title":"EmbeddingsSpaceComponent","description":"Introduction","source":"@site/docs/03_components/02_embeddings_space.md","sourceDirName":"03_components","slug":"/components/embeddings_space","permalink":"/components/embeddings_space","draft":false,"tags":[],"version":"current","lastUpdatedBy":"Javier Toledo","lastUpdatedAt":1693394277,"formattedLastUpdatedAt":"Aug 30, 2023","sidebarPosition":2,"frontMatter":{},"sidebar":"docs","previous":{"title":"Core Abstractions","permalink":"/components/core_abstractions"},"next":{"title":"Community","permalink":"/community/"}};const assets={};const toc=[{value:'Introduction',id:'introduction',level:2},{value:'Overview',id:'overview',level:2},{value:'Constructor',id:'constructor',level:2},{value:'generate',id:'generate',level:2},{value:'save',id:'save',level:2},{value:'mostSimilarEmbeddings',id:'mostsimilarembeddings',level:2},{value:'calculateRelationshipVector',id:'calculaterelationshipvector',level:2},{value:'storeNamedRelationshipVector',id:'storenamedrelationshipvector',level:2},{value:'translateEmbedding',id:'translateembedding',level:2},{value:'get',id:'get',level:2},{value:'delete',id:'delete',level:2}];const layoutProps={toc};const MDXLayout="wrapper";function MDXContent(_ref){let{components,...props}=_ref;return (0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)(MDXLayout,(0,_home_runner_work_eLLMental_eLLMental_docs_site_node_modules_babel_runtime_helpers_esm_extends_js__WEBPACK_IMPORTED_MODULE_2__/* ["default"] */ .Z)({},layoutProps,props,{components:components,mdxType:"MDXLayout"}),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("h1",{"id":"embeddingsspacecomponent"},(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("inlineCode",{parentName:"h1"},`EmbeddingsSpaceComponent`)),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("h2",{"id":"introduction"},`Introduction`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("p",null,`The `,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("inlineCode",{parentName:"p"},`EmbeddingsSpaceComponent`),` represents an embeddings space, facilitating the management and operations within it. Think of embeddings as a numeric representation of the meaning behind the text. Similar to how coordinates help pinpoint locations on Earth, in the embeddings space, semantically similar concepts cluster closer together. Before diving in, make sure to follow the `,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("a",{parentName:"p","href":"/getting-started"},`Getting Started Guide`),` to install the library in your project. It'd also be advisable to familiarize yourself with the `,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("a",{parentName:"p","href":"/components/core_abstractions"},`core abstractions`),`.`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("h2",{"id":"overview"},`Overview`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("p",null,`Leveraging the power of embeddings models, this component allows you to represent pieces of text in an embeddings space. 
Once you generate an embedding, it's stored using an embeddings database, along with its metadata for efficient retrieval.`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("blockquote",null,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("p",{parentName:"blockquote"},(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("strong",{parentName:"p"},`Warning`),`: The current version supports the `,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("a",{parentName:"p","href":"https://platform.openai.com/docs/guides/embeddings"},`OpenAI embeddings model`),` and `,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("a",{parentName:"p","href":"https://www.pinecone.io"},`Pinecone`),` for storage. Please provide the necessary credentials for these services. Note: these services may involve costs, always review their pricing details before use.`)),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("p",null,`The `,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("inlineCode",{parentName:"p"},`EmbeddingsSpaceComponent`),` interface defines the following methods:`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("h2",{"id":"constructor"},`Constructor`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("p",null,`To instantiate an `,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("inlineCode",{parentName:"p"},`EmbeddingsSpaceComponent`),`, provide both an `,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("a",{parentName:"p","href":"/components/core_abstractions#embeddingsgenerationmodel"},(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("inlineCode",{parentName:"a"},`EmbeddingsGenerationModel`)),` and an `,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("a",{parentName:"p","href":"/components/core_abstractions#embeddingsstore"},(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("inlineCode",{parentName:"a"},`EmbeddingsStore`)),`.`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("pre",null,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("code",{parentName:"pre","className":"language-java"},`EmbeddingsGenerationModel openAIModel = new OpenAIEmbeddingsGenerationModel("YOUR_OPENAI_API_KEY"); +/* @jsxRuntime classic */ /* @jsx mdx */ /* @jsxFrag React.Fragment */const frontMatter={};const contentTitle='EmbeddingsSpaceComponent';const metadata={"unversionedId":"components/embeddings_space","id":"components/embeddings_space","title":"EmbeddingsSpaceComponent","description":"Introduction","source":"@site/docs/03_components/02_embeddings_space.md","sourceDirName":"03_components","slug":"/components/embeddings_space","permalink":"/components/embeddings_space","draft":false,"tags":[],"version":"current","lastUpdatedBy":"Juan José Rodríguez López","lastUpdatedAt":1693488408,"formattedLastUpdatedAt":"Aug 31, 2023","sidebarPosition":2,"frontMatter":{},"sidebar":"docs","previous":{"title":"Core Abstractions","permalink":"/components/core_abstractions"},"next":{"title":"Community","permalink":"/community/"}};const assets={};const 
toc=[{value:'Introduction',id:'introduction',level:2},{value:'Overview',id:'overview',level:2},{value:'Constructor',id:'constructor',level:2},{value:'save',id:'save',level:2},{value:'mostSimilarEmbeddings',id:'mostsimilarembeddings',level:2},{value:'calculateRelationshipVector',id:'calculaterelationshipvector',level:2},{value:'storeNamedRelationshipVector',id:'storenamedrelationshipvector',level:2},{value:'translateEmbedding',id:'translateembedding',level:2},{value:'get',id:'get',level:2},{value:'delete',id:'delete',level:2}];const layoutProps={toc};const MDXLayout="wrapper";function MDXContent(_ref){let{components,...props}=_ref;return (0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)(MDXLayout,(0,_home_runner_work_eLLMental_eLLMental_docs_site_node_modules_babel_runtime_helpers_esm_extends_js__WEBPACK_IMPORTED_MODULE_2__/* ["default"] */ .Z)({},layoutProps,props,{components:components,mdxType:"MDXLayout"}),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("h1",{"id":"embeddingsspacecomponent"},(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("inlineCode",{parentName:"h1"},`EmbeddingsSpaceComponent`)),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("h2",{"id":"introduction"},`Introduction`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("p",null,`The `,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("inlineCode",{parentName:"p"},`EmbeddingsSpaceComponent`),` represents an embeddings space, facilitating the management and operations within it. Think of embeddings as a numeric representation of the meaning behind the text. Similar to how coordinates help pinpoint locations on Earth, in the embeddings space, semantically similar concepts cluster closer together. Before diving in, make sure to follow the `,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("a",{parentName:"p","href":"/getting-started"},`Getting Started Guide`),` to install the library in your project. It'd also be advisable to familiarize yourself with the `,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("a",{parentName:"p","href":"/components/core_abstractions"},`core abstractions`),`.`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("h2",{"id":"overview"},`Overview`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("p",null,`Leveraging the power of embeddings models, this component allows you to represent pieces of text in an embeddings space. Once you generate an embedding, it's stored using an embeddings database, along with its metadata for efficient retrieval.`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("blockquote",null,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("p",{parentName:"blockquote"},(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("strong",{parentName:"p"},`Warning`),`: The current version supports the `,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("a",{parentName:"p","href":"https://platform.openai.com/docs/guides/embeddings"},`OpenAI embeddings model`),` and `,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("a",{parentName:"p","href":"https://www.pinecone.io"},`Pinecone`),` for storage. Please provide the necessary credentials for these services. 
Note: these services may involve costs, always review their pricing details before use.`)),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("p",null,`The `,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("inlineCode",{parentName:"p"},`EmbeddingsSpaceComponent`),` interface defines the following methods:`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("h2",{"id":"constructor"},`Constructor`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("p",null,`To instantiate an `,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("inlineCode",{parentName:"p"},`EmbeddingsSpaceComponent`),`, provide both an `,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("a",{parentName:"p","href":"/components/core_abstractions#embeddingsgenerationmodel"},(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("inlineCode",{parentName:"a"},`EmbeddingsGenerationModel`)),` and an `,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("a",{parentName:"p","href":"/components/core_abstractions#embeddingsstore"},(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("inlineCode",{parentName:"a"},`EmbeddingsStore`)),`.`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("pre",null,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("code",{parentName:"pre","className":"language-java"},`EmbeddingsGenerationModel openAIModel = new OpenAIEmbeddingsGenerationModel("YOUR_OPENAI_API_KEY"); EmbeddingsStore pineconeStore = new PineconeEmbeddingsStore("YOUR_PINECONE_URL", "YOUR_PINECONE_API_KEY", "YOUR_PINECONE_SPACE"); EmbeddingsSpaceComponent embeddingsSpace = new EmbeddingsSpaceComponent(openAIModel, pineconeStore); -`)),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("h2",{"id":"generate"},(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("inlineCode",{parentName:"h2"},`generate`)),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("p",null,`Generates an embedding from a text without persisting it.`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("ul",null,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("li",{parentName:"ul"},(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("strong",{parentName:"li"},`Parameters`),`:`,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("ul",{parentName:"li"},(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("li",{parentName:"ul"},(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("inlineCode",{parentName:"li"},`text`),`: The textual input for embedding.`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("li",{parentName:"ul"},(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("inlineCode",{parentName:"li"},`additionalMetadata`),`: Supplementary metadata associated with the text.`))),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("li",{parentName:"ul"},(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("strong",{parentName:"li"},`Returns`),`: The generated embedding.`)),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("pre",null,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("code",{parentName:"pre","className":"language-java"},`String sampleText = "Hello, eLLMental!"; -Map additionalMetadata = new HashMap<>(); -additionalMetadata.put("key", "value"); - -Embedding embedding = embeddingsSpace.generate(sampleText, additionalMetadata); -`)),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ 
.kt)("h2",{"id":"save"},(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("inlineCode",{parentName:"h2"},`save`)),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("p",null,`Generates and persists an embedding for a given text.`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("ul",null,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("li",{parentName:"ul"},(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("strong",{parentName:"li"},`Parameters`),`:`,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("ul",{parentName:"li"},(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("li",{parentName:"ul"},(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("inlineCode",{parentName:"li"},`text`),`: Text to be embedded.`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("li",{parentName:"ul"},(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("inlineCode",{parentName:"li"},`additionalMetadata`),`: (Optional) Additional metadata.`))),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("li",{parentName:"ul"},(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("strong",{parentName:"li"},`Returns`),`: The generated embedding.`)),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("pre",null,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("code",{parentName:"pre","className":"language-java"},`Map additionalMetadata = new HashMap<>(); -additionalMetadata.put("key", "value"); - -String sampleText = "Hello, eLLMental!"; -Embedding embedding = embeddingsSpace.save(sampleText, additionalMetadata); - -// Or just -Embedding embedding = embeddingSpace.save(sampleText); +`)),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("h2",{"id":"save"},(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("inlineCode",{parentName:"h2"},`save`)),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("p",null,`Generates and persists an embedding for a given text.`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("ul",null,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("li",{parentName:"ul"},(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("strong",{parentName:"li"},`Parameters`),`:`,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("ul",{parentName:"li"},(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("li",{parentName:"ul"},(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("inlineCode",{parentName:"li"},`text`),`: Text to be embedded.`))),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("li",{parentName:"ul"},(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("strong",{parentName:"li"},`Returns`),`: The generated embedding.`)),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("pre",null,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("code",{parentName:"pre","className":"language-java"},`String sampleText = "Hello, eLLMental!"; +Embedding embedding = embeddingsSpace.save(sampleText); `)),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("h2",{"id":"mostsimilarembeddings"},(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("inlineCode",{parentName:"h2"},`mostSimilarEmbeddings`)),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("p",null,`Fetches embeddings semantically closest to a reference text or embedding.`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("p",null,`With a reference 
text:`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("ul",null,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("li",{parentName:"ul"},(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("strong",{parentName:"li"},`Parameters`),`:`,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("ul",{parentName:"li"},(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("li",{parentName:"ul"},(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("inlineCode",{parentName:"li"},`referenceText`),`: The text for comparison.`),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("li",{parentName:"ul"},(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("inlineCode",{parentName:"li"},`limit`),`: Maximum number of similar embeddings to return.`)))),(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("pre",null,(0,_mdx_js_react__WEBPACK_IMPORTED_MODULE_1__/* .mdx */ .kt)("code",{parentName:"pre","className":"language-java"},`// First we add a few embeddings to the space embeddingsSpace.save("Hello, eLLMental!"); embeddingsSpace.save("Hello, world!"); diff --git a/assets/js/runtime~main.f011083f.js b/assets/js/runtime~main.e8c04cdc.js similarity index 98% rename from assets/js/runtime~main.f011083f.js rename to assets/js/runtime~main.e8c04cdc.js index b4ed99e..adbaa52 100644 --- a/assets/js/runtime~main.f011083f.js +++ b/assets/js/runtime~main.e8c04cdc.js @@ -136,7 +136,7 @@ /******/ // This function allow to reference async chunks /******/ __webpack_require__.u = (chunkId) => { /******/ // return url for filenames based on template -/******/ return "assets/js/" + ({"53":"935f2afb","164":"483c79d4","178":"972024fb","512":"2981f78a","514":"1be78505","728":"db385b99","805":"e9c83615","888":"6866e3af","918":"17896441","941":"c01ae077","956":"5b471282"}[chunkId] || chunkId) + "." + {"53":"1149fc24","164":"bf236419","178":"7c7b9639","512":"9b7ebc1c","514":"1065f06a","728":"c36257aa","805":"7898d12a","888":"09b3a8dc","918":"f895a5e4","941":"d41d41a7","956":"6003f070","972":"1eab1bc1"}[chunkId] + ".js"; +/******/ return "assets/js/" + ({"53":"935f2afb","164":"483c79d4","178":"972024fb","512":"2981f78a","514":"1be78505","728":"db385b99","805":"e9c83615","888":"6866e3af","918":"17896441","941":"c01ae077","956":"5b471282"}[chunkId] || chunkId) + "." + {"53":"1149fc24","164":"815e19bf","178":"5d3d7b73","512":"9b7ebc1c","514":"1065f06a","728":"9e3b316d","805":"44887121","888":"25c7bf52","918":"f895a5e4","941":"52c2792f","956":"09dde483","972":"1eab1bc1"}[chunkId] + ".js"; /******/ }; /******/ })(); /******/ diff --git a/community/code_of_conduct/index.html b/community/code_of_conduct/index.html index 41d0fc1..d6cd484 100644 --- a/community/code_of_conduct/index.html +++ b/community/code_of_conduct/index.html @@ -4,7 +4,7 @@ Contributor Covenant Code of Conduct | eLLMental - + @@ -58,8 +58,8 @@ https://www.contributor-covenant.org/version/2/0/code_of_conduct.html.

Community Impact Guidelines were inspired by Mozilla's code of conduct enforcement ladder.

For answers to common questions about this code of conduct, see the FAQ at https://www.contributor-covenant.org/faq. Translations are available at https://www.contributor-covenant.org/translations.

- +https://www.contributor-covenant.org/translations.

+ \ No newline at end of file diff --git a/community/contributing/index.html b/community/contributing/index.html index a186559..1347e23 100644 --- a/community/contributing/index.html +++ b/community/contributing/index.html @@ -4,7 +4,7 @@ Contributing Guide | eLLMental - + @@ -59,8 +59,8 @@ for impact a given change will have.

Once you've chosen an issue to work on, please assign it to yourself. This helps communicate your intention to contribute and reduces the chance of duplicate work.

Get in touch!

If you feel lost, don't hesitate to reach out to the core team. You can connect with us via email by writing to info@theagilemonkeys.com, or by joining our official Discord server. We will be more than happy to hear from you!

- +official Discord server. We will be more than happy to hear about you!

+ \ No newline at end of file diff --git a/community/index.html b/community/index.html index 2ff9125..b18b56e 100644 --- a/community/index.html +++ b/community/index.html @@ -4,13 +4,13 @@ Community | eLLMental - +
-

Community

Community contributions are essential to the development and refinement of eLLMental. You can become a part of the eLLMental community in the following ways:

  1. Join the conversation in our Discord server.
  2. Send us suggestions, questions, or feature requests by creating a New Issue.
  3. Look at the Open Issues, choose one that interests you, and start contributing.
  4. Spread the word about eLLMental!

Find detailed instructions and guidelines in the Contributing Guide, and make sure to read our Code of Conduct before you start contributing.

- +

Community

Community contributions are essential to the development and refinement of eLLMental. You can become a part of the eLLMental community in the following ways:

  1. Join the conversation in our Discord server.
  2. Send us suggestions, questions, or feature requests by creating a New Issue.
  3. Look at the Open Issues, choose one that interests you, and start contributing.
  4. Spread the word about eLLMental!

Find detailed instructions and guidelines in the Contributing Guide, and make sure to read our Code of Conduct before you start contributing.

+ \ No newline at end of file diff --git a/components/core_abstractions/index.html b/components/core_abstractions/index.html index f78783d..620fd86 100644 --- a/components/core_abstractions/index.html +++ b/components/core_abstractions/index.html @@ -4,13 +4,13 @@ Core Abstractions | eLLMental - +
-

Core Abstractions

eLLMental uses different 3rd party components and APIs and provides a unified interface. To ensure extensibility and avoid tight coupling with any specific API, the library provides a series of abstract classes that define the expected interface for these components to work with eLLMental. To use eLLMental, you can provide your own implementation or use one of the built-in concrete implementations.

Embedding object

In eLLMental embeddings are represented by the Embedding record, which has the following attributes:

  • id: A unique identifier for the embedding.
  • vector: A numeric vector that represents the semantic location of the text.
  • metadata: Additional information associated with the embedding. It can be used to store the original text, the model used to generate the embedding, or any other information you may find useful.
public record Embedding(
UUID id,
List<Double> vector,
Map<String, String> metadata
) {}

EmbeddingsGenerationModel

This abstract class defines the interface expected by eLLMental for a valid embeddings generation model.

public abstract class EmbeddingsGenerationModel {
public abstract Embedding generateEmbedding(String text);
}

OpenAIEmbeddingsGenerationModel

eLLMental provides an implementation to use OpenAI's embeddings model. This model is only accessible via API, so you'll need to initialize it with a valid OpenAI API key.

EmbeddingsGenerationModel openAIModel = new OpenAIEmbeddingsGenerationModel("YOUR_OPENAI_API_KEY");

// You'll rarely need to interact directly with the `openAIModel`, but you can use it to generate an embedding object:
Embedding embedding = openAIModel.generateEmbedding("Sample string");

The OpenAI embeddings generator will automatically include the original text, the timestamp and the model used to generate the embedding in the metadata.
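
For example, you can read that metadata back from the returned Embedding record. Here is a minimal sketch (the metadata key names used below are an assumption for illustration, not a documented contract):

Embedding embedding = openAIModel.generateEmbedding("Sample string");

// Inspect the metadata attached by the generator. The key names below
// ("text", "model") are illustrative; check which keys your eLLMental version actually stores.
Map<String, String> metadata = embedding.metadata();
System.out.println(metadata.get("text"));
System.out.println(metadata.get("model"));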

EmbeddingsStore

This abstract class defines the expected interface for a persistence mechanism capable of storing and querying embeddings:

public abstract class EmbeddingsStore {
public abstract void store(Embedding embedding);
public abstract List<Embedding> similaritySearch(Embedding reference, int limit);
}
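
Because EmbeddingsStore is abstract, you can plug in your own persistence mechanism. The following is a minimal in-memory sketch for local experimentation, assuming cosine similarity as the ranking metric; it is an illustration, not part of the library:

import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class InMemoryEmbeddingsStore extends EmbeddingsStore {
    private final List<Embedding> embeddings = new ArrayList<>();

    @Override
    public void store(Embedding embedding) {
        embeddings.add(embedding);
    }

    @Override
    public List<Embedding> similaritySearch(Embedding reference, int limit) {
        // Rank stored embeddings by cosine similarity to the reference, highest first.
        return embeddings.stream()
                .sorted(Comparator.comparingDouble(
                        (Embedding e) -> cosineSimilarity(reference.vector(), e.vector())).reversed())
                .limit(limit)
                .toList();
    }

    private static double cosineSimilarity(List<Double> a, List<Double> b) {
        double dot = 0, normA = 0, normB = 0;
        for (int i = 0; i < a.size(); i++) {
            dot += a.get(i) * b.get(i);
            normA += a.get(i) * a.get(i);
            normB += b.get(i) * b.get(i);
        }
        return dot / (Math.sqrt(normA) * Math.sqrt(normB));
    }
}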

PineconeEmbeddingsStore

eLLMental provides a concrete implementation for Pinecone, which requires a URL, an API key and a namespace.

EmbeddingsStore pineconeStore = new PineconeEmbeddingsStore("YOUR_PINECONE_URL", "YOUR_PINECONE_API_KEY", "YOUR_PINECONE_NAMESPACE");

// You can now insert or perform similarity searches using the pineconeStore instance:
pineconeStore.store(someEmbedding);
List<Embedding> similarEmbeddings = pineconeStore.similaritySearch(referenceEmbedding, 5);
- +

Core Abstractions

eLLMental uses different 3rd party components and APIs and provides a unified interface. To ensure extensibility and avoid tight coupling with any specific API, the library provides a series of abstract classes that define the expected interface for these components to work with eLLMental. To use eLLMental, you can provide your own implementation or use one of the built-in concrete implementations.

Embedding object

In eLLMental embeddings are represented by the Embedding record, which has the following attributes:

  • id: A unique identifier for the embedding.
  • vector: A numeric vector that represents the semantic location of the text.
  • metadata: Additional information associated with the embedding. It can be used to store the original text, the model used to generate the embedding, or any other information you may find useful.
public record Embedding(
UUID id,
List<Double> vector,
Map<String, String> metadata
) {}

EmbeddingsGenerationModel

This abstract class defines the interface expected by eLLMental for a valid embeddings generation model.

public abstract class EmbeddingsGenerationModel {
public abstract Embedding generateEmbedding(String text);
}

OpenAIEmbeddingsGenerationModel

eLLMental provides an implementation to use OpenAI's embeddings model. This model is only accessible via API, so you'll need to initialize it with a valid OpenAI API key.

EmbeddingsGenerationModel openAIModel = new OpenAIEmbeddingsGenerationModel("YOUR_OPENAI_API_KEY");

// You'll rarely need to interact directly with the `openAIModel`, but you can use it to generate an embedding object:
Embedding embedding = openAIModel.generateEmbedding("Sample string");

The OpenAI embeddings generator will automatically include the original text, the timestamp and the model used to generate the embedding in the metadata.
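
For example, you can read that metadata back from the returned Embedding record. Here is a minimal sketch (the metadata key names used below are an assumption for illustration, not a documented contract):

Embedding embedding = openAIModel.generateEmbedding("Sample string");

// Inspect the metadata attached by the generator. The key names below
// ("text", "model") are illustrative; check which keys your eLLMental version actually stores.
Map<String, String> metadata = embedding.metadata();
System.out.println(metadata.get("text"));
System.out.println(metadata.get("model"));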

EmbeddingsStore

This abstract class defines the expected interface for a persistence mechanism capable of storing and querying embeddings:

public abstract class EmbeddingsStore {
public abstract void store(Embedding embedding);
public abstract List<Embedding> similaritySearch(Embedding reference, int limit);
}
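
Because EmbeddingsStore is abstract, you can plug in your own persistence mechanism. The following is a minimal in-memory sketch for local experimentation, assuming cosine similarity as the ranking metric; it is an illustration, not part of the library:

import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class InMemoryEmbeddingsStore extends EmbeddingsStore {
    private final List<Embedding> embeddings = new ArrayList<>();

    @Override
    public void store(Embedding embedding) {
        embeddings.add(embedding);
    }

    @Override
    public List<Embedding> similaritySearch(Embedding reference, int limit) {
        // Rank stored embeddings by cosine similarity to the reference, highest first.
        return embeddings.stream()
                .sorted(Comparator.comparingDouble(
                        (Embedding e) -> cosineSimilarity(reference.vector(), e.vector())).reversed())
                .limit(limit)
                .toList();
    }

    private static double cosineSimilarity(List<Double> a, List<Double> b) {
        double dot = 0, normA = 0, normB = 0;
        for (int i = 0; i < a.size(); i++) {
            dot += a.get(i) * b.get(i);
            normA += a.get(i) * a.get(i);
            normB += b.get(i) * b.get(i);
        }
        return dot / (Math.sqrt(normA) * Math.sqrt(normB));
    }
}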

PineconeEmbeddingsStore

eLLMental provides a concrete implementation for Pinecone, which requires a URL, an API key and a namespace.

EmbeddingsStore pineconeStore = new PineconeEmbeddingsStore("YOUR_PINECONE_URL", "YOUR_PINECONE_API_KEY", "YOUR_PINECONE_NAMESPACE");

// You can now insert or perform similarity searches using the pineconeStore instance:
pineconeStore.store(someEmbedding);
List<Embedding> similarEmbeddings = pineconeStore.similaritySearch(referenceEmbedding, 5);
+ \ No newline at end of file diff --git a/components/embeddings_space/index.html b/components/embeddings_space/index.html index bdd58ad..0fc2220 100644 --- a/components/embeddings_space/index.html +++ b/components/embeddings_space/index.html @@ -4,13 +4,13 @@ EmbeddingsSpaceComponent | eLLMental - +
-

EmbeddingsSpaceComponent

Introduction

The EmbeddingsSpaceComponent represents an embeddings space, facilitating the management and operations within it. Think of embeddings as a numeric representation of the meaning behind the text. Similar to how coordinates help pinpoint locations on Earth, in the embeddings space, semantically similar concepts cluster closer together. Before diving in, make sure to follow the Getting Started Guide to install the library in your project. It'd also be advisable to familiarize yourself with the core abstractions.

Overview

Leveraging the power of embeddings models, this component allows you to represent pieces of text in an embeddings space. Once you generate an embedding, it's stored using an embeddings database, along with its metadata for efficient retrieval.

Warning: The current version supports the OpenAI embeddings model and Pinecone for storage. Please provide the necessary credentials for these services. Note: these services may involve costs; always review their pricing details before use.

The EmbeddingsSpaceComponent interface defines the following methods:

Constructor

To instantiate an EmbeddingsSpaceComponent, provide both an EmbeddingsGenerationModel and an EmbeddingsStore.

EmbeddingsGenerationModel openAIModel = new OpenAIEmbeddingsGenerationModel("YOUR_OPENAI_API_KEY");
EmbeddingsStore pineconeStore = new PineconeEmbeddingsStore("YOUR_PINECONE_URL", "YOUR_PINECONE_API_KEY", "YOUR_PINECONE_SPACE");
EmbeddingsSpaceComponent embeddingsSpace = new EmbeddingsSpaceComponent(openAIModel, pineconeStore);

generate

Generates an embedding from a text without persisting it.

  • Parameters:
    • text: The textual input for embedding.
    • additionalMetadata: Supplementary metadata associated with the text.
  • Returns: The generated embedding.
String sampleText = "Hello, eLLMental!";
Map<String, String> additionalMetadata = new HashMap<>();
additionalMetadata.put("key", "value");

Embedding embedding = embeddingsSpace.generate(sampleText, additionalMetadata);

save

Generates and persists an embedding for a given text.

  • Parameters:
    • text: Text to be embedded.
    • additionalMetadata: (Optional) Additional metadata.
  • Returns: The generated embedding.
Map<String, String> additionalMetadata = new HashMap<>();
additionalMetadata.put("key", "value");

String sampleText = "Hello, eLLMental!";
Embedding embedding = embeddingsSpace.save(sampleText, additionalMetadata);

// Or just
Embedding embedding = embeddingsSpace.save(sampleText);

mostSimilarEmbeddings

Fetches embeddings semantically closest to a reference text or embedding.

With a reference text:

  • Parameters:
    • referenceText: The text for comparison.
    • limit: Maximum number of similar embeddings to return.
// First we add a few embeddings to the space
embeddingsSpace.save("Hello, eLLMental!");
embeddingsSpace.save("Hello, world!");

List<Embedding> closestNeighbors = embeddingsSpace.mostSimilarEmbeddings("Greetings!", 3);
// closestNeighbors will contain the embeddings for "Hello, eLLMental!" and "Hello, world!"

With a reference embedding:

  • Parameters:
    • referenceEmbedding: The embedding for comparison.
    • limit: Maximum number of similar embeddings to return.
// First we add a few embeddings to the space
embeddingsSpace.save("Hello, eLLMental!");
embeddingsSpace.save("Hello, world!");

Embedding embedding = embeddingsSpace.generate("Hello everyone!");

List<Embedding> closestNeighbors = embeddingsSpace.mostSimilarEmbeddings(embedding, 3);
// closestNeighbors will contain the embeddings for "Hello, eLLMental!" and "Hello, world!"

calculateRelationshipVector

Computes a relationship vector from the provided text pairs, to be used with the translateEmbedding method.

  • Parameters:
    • textPairs: Array of text pairs.

For instance, with the following list of text pairs:

Text 1 | Text 2
Man | Woman
Boy | Girl
King | Queen
Prince | Princess
Father | Mother

The relationship vector for this group represents a translation in the embeddings space: given a word like those in the left column, it points to the location where the corresponding right-column word would likely appear. See the documentation for translateEmbedding for more details.

The RelationshipVector class is defined as follows:

public class RelationshipVector {
public final String label;
public final float[] vector;
}

And it can be calculated like this:

String[][] textPairs = {{"Man", "Woman"}, {"Boy", "Girl"}, {"King", "Queen"}, {"Prince", "Princess"}, {"Father", "Mother"}};
RelationshipVector relationshipVector = embeddingsSpace.calculateRelationshipVector(textPairs);
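
Conceptually, a relationship vector can be understood as the average displacement between the embeddings of each pair. The sketch below illustrates that idea on top of the EmbeddingsGenerationModel abstraction; it is a simplification for intuition, not the library's actual implementation:

// Illustrative only: average of (right - left) embedding differences across all pairs.
float[] averagePairDifference(String[][] textPairs, EmbeddingsGenerationModel model) {
    float[] sum = null;
    for (String[] pair : textPairs) {
        List<Double> left = model.generateEmbedding(pair[0]).vector();
        List<Double> right = model.generateEmbedding(pair[1]).vector();
        if (sum == null) sum = new float[left.size()];
        for (int i = 0; i < sum.length; i++) {
            sum[i] += (float) (right.get(i) - left.get(i));
        }
    }
    for (int i = 0; i < sum.length; i++) {
        sum[i] /= textPairs.length;
    }
    return sum;
}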

storeNamedRelationshipVector

Stores a relationship vector in the embeddings store and assigns it a label for later use.

  • Parameters:
    • label: The label to assign to the relationship vector.
    • relationshipVector: The relationship vector to store.
// First we calculate a relationship vector
RelationshipVector relationshipVector = embeddingsSpace.calculateRelationshipVector(textPairs);

// Then we store it in the embeddings store for future use
embeddingsSpace.storeNamedRelationshipVector("feminize", relationshipVector);

translateEmbedding

Shifts the embedding of a reference text across the embeddings space to approximate the location of text that satisfies the relationship represented by the vector.

  • Parameters:
    • referenceText: The reference text whose embedding will be translated.
    • vector: The relationship vector that determines the translation.

This is useful if you want to search for embeddings that are similar to a given one, but in a different context. For instance, suppose we start from the word "Bull" and a relationship vector calculated with the calculateRelationshipVector method as follows:

String[][] textPairs = {{"Man", "Woman"}, {"Boy", "Girl"}, {"King", "Queen"}, {"Prince", "Princess"}, {"Father", "Mother"}};
RelationshipVector relationshipVector = embeddingsSpace.calculateRelationshipVector(textPairs);

We can use the relationship vector to find the region of the embeddings space where words similar to "Cow" (rather than "Bull") would appear. Notice that embeddings cannot be reversed back into text, so we can't know for certain that this embedding represents a cow, but it gives a good approximation that can be used to refine search results later.

// This will create an estimated embedding of the word "Cow"
Embedding likelyACowEmbedding = embeddingsSpace.translateEmbedding("Bull", relationshipVector);

// We use it as any other embedding to find stored texts that are similar to "Cow"
List<Embedding> similarToCowEmbeddings = embeddingsSpace.mostSimilarEmbeddings(likelyACowEmbedding, 5);
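
Conceptually, the translation is plain vector arithmetic: the reference text is embedded and the relationship vector is added component-wise. A minimal sketch of that step (an illustration, not the library's internal code):

// Illustrative only: translated[i] = embedding[i] + relationship[i]
// (assumes java.util.List and java.util.ArrayList imports)
List<Double> translate(List<Double> embeddingVector, float[] relationshipVector) {
    List<Double> translated = new ArrayList<>(embeddingVector.size());
    for (int i = 0; i < embeddingVector.size(); i++) {
        translated.add(embeddingVector.get(i) + relationshipVector[i]);
    }
    return translated;
}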

get

Retrieves an embedding from the embeddings store using its ID.

  • Parameters:
    • id: The ID of the embedding to retrieve.
  • Returns: The desired embedding or null if not found.
Embedding embedding = embeddingsSpace.get("embedding-id");

delete

Deletes an embedding from the embeddings store using its ID.

  • Parameters:
    • id: The ID of the embedding to delete.
embeddingsSpace.delete("embedding-id");
- +

EmbeddingsSpaceComponent

Introduction

The EmbeddingsSpaceComponent represents an embeddings space, facilitating the management and operations within it. Think of embeddings as a numeric representation of the meaning behind the text. Similar to how coordinates help pinpoint locations on Earth, in the embeddings space, semantically similar concepts cluster closer together. Before diving in, make sure to follow the Getting Started Guide to install the library in your project. It'd also be advisable to familiarize yourself with the core abstractions.

Overview

Leveraging the power of embeddings models, this component allows you to represent pieces of text in an embeddings space. Once you generate an embedding, it's stored using an embeddings database, along with its metadata for efficient retrieval.

Warning: The current version supports the OpenAI embeddings model and Pinecone for storage. Please provide the necessary credentials for these services. Note: these services may involve costs; always review their pricing details before use.

The EmbeddingsSpaceComponent interface defines the following methods:

Constructor

To instantiate an EmbeddingsSpaceComponent, provide both an EmbeddingsGenerationModel and an EmbeddingsStore.

EmbeddingsGenerationModel openAIModel = new OpenAIEmbeddingsGenerationModel("YOUR_OPENAI_API_KEY");
EmbeddingsStore pineconeStore = new PineconeEmbeddingsStore("YOUR_PINECONE_URL", "YOUR_PINECONE_API_KEY", "YOUR_PINECONE_SPACE");
EmbeddingsSpaceComponent embeddingsSpace = new EmbeddingsSpaceComponent(openAIModel, pineconeStore);

save

Generates and persists an embedding for a given text.

  • Parameters:
    • text: Text to be embedded.
  • Returns: The generated embedding.
String sampleText = "Hello, eLLMental!";
Embedding embedding = embeddingsSpace.save(sampleText);

mostSimilarEmbeddings

Fetches embeddings semantically closest to a reference text or embedding.

With a reference text:

  • Parameters:
    • referenceText: The text for comparison.
    • limit: Maximum number of similar embeddings to return.
// First we add a few embeddings to the space
embeddingsSpace.save("Hello, eLLMental!");
embeddingsSpace.save("Hello, world!");

List<Embedding> closestNeighbors = embeddingsSpace.mostSimilarEmbeddings("Greetings!", 3);
// closestNeighbors will contain the embeddings for "Hello, eLLMental!" and "Hello, world!"

With a reference embedding:

  • Parameters:
    • referenceEmbedding: The embedding for comparison.
    • limit: Maximum number of similar embeddings to return.
// First we add a few embeddings to the space
embeddingsSpace.save("Hello, eLLMental!");
embeddingsSpace.save("Hello, world!");

Embedding embedding = embeddingsSpace.generate("Hello everyone!");

List<Embedding> closestNeighbors = embeddingsSpace.mostSimilarEmbeddings(embedding, 3);
// closestNeighbors will contain the embeddings for "Hello, eLLMental!" and "Hello, world!"

calculateRelationshipVector

Computes a relationship vector from the provided text pairs, to be used with the translateEmbedding method.

  • Parameters:
    • textPairs: Array of text pairs.

For instance, with the following list of text pairs:

Text 1 | Text 2
Man | Woman
Boy | Girl
King | Queen
Prince | Princess
Father | Mother

The relationship vector for this group represents a translation in the embeddings space: given a word like those in the left column, it points to the location where the corresponding right-column word would likely appear. See the documentation for translateEmbedding for more details.

The RelationshipVector class is defined as follows:

public class RelationshipVector {
public final String label;
public final float[] vector;
}

And it can be calculated like this:

String[][] textPairs = {{"Man", "Woman"}, {"Boy", "Girl"}, {"King", "Queen"}, {"Prince", "Princess"}, {"Father", "Mother"}};
RelationshipVector relationshipVector = embeddingsSpace.calculateRelationshipVector(textPairs);
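
Conceptually, a relationship vector can be understood as the average displacement between the embeddings of each pair. The sketch below illustrates that idea on top of the EmbeddingsGenerationModel abstraction; it is a simplification for intuition, not the library's actual implementation:

// Illustrative only: average of (right - left) embedding differences across all pairs.
float[] averagePairDifference(String[][] textPairs, EmbeddingsGenerationModel model) {
    float[] sum = null;
    for (String[] pair : textPairs) {
        List<Double> left = model.generateEmbedding(pair[0]).vector();
        List<Double> right = model.generateEmbedding(pair[1]).vector();
        if (sum == null) sum = new float[left.size()];
        for (int i = 0; i < sum.length; i++) {
            sum[i] += (float) (right.get(i) - left.get(i));
        }
    }
    for (int i = 0; i < sum.length; i++) {
        sum[i] /= textPairs.length;
    }
    return sum;
}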

storeNamedRelationshipVector

Stores a relationship vector in the embeddings store and assigns it a label for later use.

  • Parameters:
    • label: The label to assign to the relationship vector.
    • relationshipVector: The relationship vector to store.
// First we calculate a relationship vector
RelationshipVector relationshipVector = embeddingsSpace.calculateRelationshipVector(textPairs);

// Then we store it in the embeddings store for future use
embeddingsSpace.storeNamedRelationshipVector("feminize", relationshipVector);

translateEmbedding

Shifts the embedding of a reference text across the embeddings space to approximate the location of text that satisfies the relationship represented by the vector.

  • Parameters:
    • referenceText: The reference text whose embedding will be translated.
    • vector: The relationship vector that determines the translation.

This is useful if you want to search for embeddings that are similar to a given one, but in a different context. For instance, suppose we start from the word "Bull" and a relationship vector calculated with the calculateRelationshipVector method as follows:

String[][] textPairs = {{"Man", "Woman"}, {"Boy", "Girl"}, {"King", "Queen"}, {"Prince", "Princess"}, {"Father", "Mother"}};
RelationshipVector relationshipVector = embeddingsSpace.calculateRelationshipVector(textPairs);

We can use the relationship vector to find the region of the embeddings space where words similar to "Cow" (rather than "Bull") would appear. Notice that embeddings cannot be reversed back into text, so we can't know for certain that this embedding represents a cow, but it gives a good approximation that can be used to refine search results later.

// This will create an estimated embedding of the word "Cow"
Embedding likelyACowEmbedding = embeddingsSpace.translateEmbedding("Bull", relationshipVector);

// We use it as any other embedding to find stored texts that are similar to "Cow"
List<Embedding> similarToCowEmbeddings = embeddingsSpace.mostSimilarEmbeddings(likelyACowEmbedding, 5);
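
Conceptually, the translation is plain vector arithmetic: the reference text is embedded and the relationship vector is added component-wise. A minimal sketch of that step (an illustration, not the library's internal code):

// Illustrative only: translated[i] = embedding[i] + relationship[i]
// (assumes java.util.List and java.util.ArrayList imports)
List<Double> translate(List<Double> embeddingVector, float[] relationshipVector) {
    List<Double> translated = new ArrayList<>(embeddingVector.size());
    for (int i = 0; i < embeddingVector.size(); i++) {
        translated.add(embeddingVector.get(i) + relationshipVector[i]);
    }
    return translated;
}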

get

Retrieves an embedding from the embeddings store using its ID.

  • Parameters:
    • id: The ID of the embedding to retrieve.
  • Returns: The desired embedding or null if not found.
Embedding embedding = embeddingsSpace.get("embedding-id");

delete

Deletes an embedding from the embeddings store using its ID.

  • Parameters:
    • id: The ID of the embedding to delete.
embeddingsSpace.delete("embedding-id");
+ \ No newline at end of file diff --git a/getting-started/index.html b/getting-started/index.html index 9667ed9..f6c7f87 100644 --- a/getting-started/index.html +++ b/getting-started/index.html @@ -4,13 +4,13 @@ Getting started | eLLMental - +
-

Getting started

eLLMental is a library designed for building AI-powered applications written in Java, and it offers production-ready components that can be used right away in your current JVM projects. In this guide, we will showcase how to use the EmbeddingsSpaceComponent to find relevant text based on a query.

Step 1: Add the eLLMental dependencies

eLLMental is distributed through JitPack. Below are some examples of how to add it to your project.

Gradle

Incorporate the eLLMental dependencies into your build.gradle file.

allprojects {
repositories {
maven { url 'https://jitpack.io' }
}
}

dependencies {
implementation 'com.github.theam:ellmental:main'
}

Maven

You can also add the eLLMental dependencies into your pom.xml file.

<repositories>
<repository>
<id>jitpack.io</id>
<url>https://jitpack.io</url>
</repository>
</repositories>

<dependencies>
<dependency>
<groupId>com.github.theam</groupId>
<artifactId>eLLMental</artifactId>
<version>main</version>
</dependency>
</dependencies>

Step 2: Initializing the EmbeddingsSpaceComponent

Before initializing the EmbeddingsSpaceComponent, set up the OpenAIEmbeddingsModel and PineconeVectorStore.

Retrieve the required API tokens and configuration parameters by following the Pinecone quickstart guide and the OpenAI API keys guide.

import com.theagilemonkeys.ellmental.embeddingsmodel.openai.OpenAIEmbeddingsModel;
import com.theagilemonkeys.ellmental.vectorstore.pinecone.PineconeVectorStore;

public OpenAIEmbeddingsModel embeddingsModel() {
return new OpenAIEmbeddingsModel("OPEN_AI_API_KEY");
}

public PineconeVectorStore vectorStore() {
return new PineconeVectorStore("PINECONE_API_KEY", "PINECONE_URL", "PINECONE_NAMESPACE");
}

Now, initialize the EmbeddingsSpaceComponent:

import com.theagilemonkeys.ellmental.EmbeddingsSpaceComponent;

public EmbeddingsSpaceComponent initializeEmbeddingsSpace() {
return new EmbeddingsSpaceComponent(embeddingsModel(), vectorStore());
}

Step 3: Running the example

To run an example, you can write a simple main function:

public class MainApp {

public static void main(String[] args) {
EmbeddingsSpaceComponent embeddingsSpace = initializeEmbeddingsSpace();

// Add some embedding samples to the embeddings space.
embeddingsSpace.save("Hello, eLLMental!");
embeddingsSpace.save("Hello, world!");
embeddingsSpace.save("Hi!");
embeddingsSpace.save("Cats are cute");
embeddingsSpace.save("Dogs are loyal");
// You can provide metadata to the `save` call too
Map<String, String> metadata = new HashMap<>();
metadata.put("key", "value");
embeddingsSpace.save("Hey there!", metadata);


// Search similar embeddings
List<Embedding> results = embeddingsSpace.mostSimilarEmbeddings("Greetings!", 3);
for (Embedding embedding : results) {
System.out.println(embedding.metadata().get("text")); // the original text is stored in the metadata; the exact key name is an assumption
}
}
}

Run the main function, and you should see the most similar texts to "Greetings!":

$ ./gradlew run

> Task :run
Hello, eLLMental!
Hello, world!
Hi!

Notice that the result outputs three entries because we set the limit to 3 in the mostSimilarEmbeddings call, but you can change this value to any number you want. Keep in mind that the database calculates distances against every other embedding in the space, so higher limits may return results that are not strictly similar to the query. Also note that the list is ranked by similarity: the first result is the most similar to the query and the last is the least similar.

eLLMental ❤️ Spring Boot

If you prefer to use eLLMental from Spring Boot, you can use the application.properties file to import your environment variables and adapt the code slightly, as shown below:

Importing env variables from application.properties

# application.properties

OPEN_AI_API_KEY=<your_openai_key>
PINECONE_API_KEY=<your_pinecone_key>
PINECONE_URL=<your_pinecone_url>
PINECONE_NAMESPACE=<your_pinecone_namespace>

Configuring EmbeddingsSpaceComponent

import com.theagilemonkeys.ellmental.EmbeddingsSpaceComponent;
import com.theagilemonkeys.ellmental.embeddingsmodel.openai.OpenAIEmbeddingsModel;
import com.theagilemonkeys.ellmental.vectorstore.pinecone.PineconeVectorStore;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class EllmentalConfiguration {

@Value("${OPEN_AI_API_KEY}")
private String openAiApiKey;

@Value("${PINECONE_API_KEY}")
private String pineconeApiKey;

@Value("${PINECONE_URL}")
private String pineconeUrl;

@Value("${PINECONE_NAMESPACE}")
private String pineconeNamespace;

private OpenAIEmbeddingsModel embeddingsModel() {
return new OpenAIEmbeddingsModel(openAiApiKey);
}

private PineconeVectorStore vectorStore() {
return new PineconeVectorStore(pineconeApiKey, pineconeUrl, pineconeNamespace);
}

// Usable public Bean
@Bean
public EmbeddingsSpaceComponent embeddingsSpaceComponent() {
return new EmbeddingsSpaceComponent(embeddingsModel(), vectorStore());
}
}

Autowiring EmbeddingsSpaceComponent

import com.theagilemonkeys.ellmental.EmbeddingsSpaceComponent;
import org.springframework.beans.factory.annotation.Autowired;

public class SomeServiceClass {
private final EmbeddingsSpaceComponent embeddingsSpaceComponent;

@Autowired
public SomeServiceClass(EmbeddingsSpaceComponent embeddingsSpaceComponent){
this.embeddingsSpaceComponent = embeddingsSpaceComponent;
}

// ...Here you can use embeddingsSpaceComponent
}

Next steps

Now that you've learned the basics, you can include eLLMental in your own projects and start experimenting. Try generating embeddings for a larger corpus of texts, such as HTML files extracted by a web scraper, a series of blog posts from your database, or a collection of tweets from your Twitter account.

- +

Getting started

eLLMental is a library designed for building AI-powered applications written in Java, and it offers production-ready components that can be used right away in your current JVM projects. In this guide, we will showcase how to use the EmbeddingsSpaceComponent to find relevant text based on a query.

Step 1: Add the eLLMental dependencies

eLLMental is distributed through JitPack. Below are some examples of how to add it to your project.

Gradle

Incorporate the eLLMental dependencies into your build.gradle file.

allprojects {
repositories {
maven { url 'https://jitpack.io' }
}
}

dependencies {
implementation 'com.github.theam:ellmental:main'
}

Maven

You can also add the eLLMental dependencies into your pom.xml file.

<repositories>
<repository>
<id>jitpack.io</id>
<url>https://jitpack.io</url>
</repository>
</repositories>

<dependencies>
<dependency>
<groupId>com.github.theam</groupId>
<artifactId>eLLMental</artifactId>
<version>main</version>
</dependency>
</dependencies>

Step 2: Initializing the EmbeddingsSpaceComponent

Before initializing the EmbeddingsSpaceComponent, set up the OpenAIEmbeddingsModel and PineconeVectorStore.

Retrieve the required API tokens and configuration parameters by following the Pinecone quickstart guide and the OpenAI API keys guide.

import com.theagilemonkeys.ellmental.embeddingsmodel.openai.OpenAIEmbeddingsModel;
import com.theagilemonkeys.ellmental.vectorstore.pinecone.PineconeVectorStore;

public OpenAIEmbeddingsModel embeddingsModel() {
return new OpenAIEmbeddingsModel("OPEN_AI_API_KEY");
}

public PineconeVectorStore vectorStore() {
return new PineconeVectorStore("PINECONE_API_KEY", "PINECONE_URL", "PINECONE_NAMESPACE");
}

Now, initialize the EmbeddingsSpaceComponent:

import com.theagilemonkeys.ellmental.EmbeddingsSpaceComponent;

public EmbeddingsSpaceComponent initializeEmbeddingsSpace() {
return new EmbeddingsSpaceComponent(embeddingsModel(), vectorStore());
}

Step 3: Running the example

To run an example, you can write a simple main function:

public class MainApp {

public static void main(String[] args) {
EmbeddingsSpaceComponent embeddingsSpace = initializeEmbeddingsSpace();

// Add some embedding samples to the embeddings space.
embeddingsSpace.save("Hello, eLLMental!");
embeddingsSpace.save("Hello, world!");
embeddingsSpace.save("Hi!");
embeddingsSpace.save("Cats are cute");
embeddingsSpace.save("Dogs are loyal");
// You can provide metadata to the `save` call too
Map<String, String> metadata = new HashMap<>();
metadata.put("key", "value");
embeddingsSpace.save("Hey there!", metadata);


// Search similar embeddings
List<Embedding> results = embeddingsSpace.mostSimilarEmbeddings("Greetings!", 3);
for (Embedding embedding : results) {
System.out.println(embedding.metadata().get("text")); // the original text is stored in the metadata; the exact key name is an assumption
}
}
}

Run the main function, and you should see the most similar texts to "Greetings!":

$ ./gradlew run

> Task :run
Hello, eLLMental!
Hello, world!
Hi!

Notice that the result outputs three entries because we set the limit to 3 in the mostSimilarEmbeddings call, but you can change this value to any number you want. Keep in mind that the database calculates distances against every other embedding in the space, so higher limits may return results that are not strictly similar to the query. Also note that the list is ranked by similarity: the first result is the most similar to the query and the last is the least similar.

eLLMental ❤️ Spring Boot

If you prefer to use eLLMental from Spring Boot, you can use the application.properties file to import your environment variables and adapt the code slightly, as shown below:

Importing env variables from application.properties

# application.properties

OPEN_AI_API_KEY=<your_openai_key>
PINECONE_API_KEY=<your_pinecone_key>
PINECONE_URL=<your_pinecone_url>
PINECONE_NAMESPACE=<your_pinecone_namespace>

Configuring EmbeddingsSpaceComponent

import com.theagilemonkeys.ellmental.EmbeddingsSpaceComponent;
import com.theagilemonkeys.ellmental.embeddingsmodel.openai.OpenAIEmbeddingsModel;
import com.theagilemonkeys.ellmental.vectorstore.pinecone.PineconeVectorStore;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class EllmentalConfiguration {

@Value("${OPEN_AI_API_KEY}")
private String openAiApiKey;

@Value("${PINECONE_API_KEY}")
private String pineconeApiKey;

@Value("${PINECONE_URL}")
private String pineconeUrl;

@Value("${PINECONE_NAMESPACE}")
private String pineconeNamespace;

private OpenAIEmbeddingsModel embeddingsModel() {
return new OpenAIEmbeddingsModel(openAiApiKey);
}

private PineconeVectorStore vectorStore() {
return new PineconeVectorStore(pineconeApiKey, pineconeUrl, pineconeNamespace);
}

// Usable public Bean
@Bean
public EmbeddingsSpaceComponent embeddingsSpaceComponent() {
return new EmbeddingsSpaceComponent(embeddingsModel(), vectorStore());
}
}

Autowiring EmbeddingsSpaceComponent

import com.theagilemonkeys.ellmental.EmbeddingsSpaceComponent;
import org.springframework.beans.factory.annotation.Autowired;

public class SomeServiceClass {
private final EmbeddingsSpaceComponent embeddingsSpaceComponent;

@Autowired
public SomeServiceClass(EmbeddingsSpaceComponent embeddingsSpaceComponent){
this.embeddingsSpaceComponent = embeddingsSpaceComponent;
}

// ...Here you can use embeddingsSpaceComponent
}

Next steps

Now that you've learned the basics, you can include eLLMental in your own projects and start experimenting. Try generating embeddings for a larger corpus of texts, such as HTML files extracted by a web scraper, a series of blog posts from your database, or a collection of tweets from your Twitter account.

+ \ No newline at end of file diff --git a/index.html b/index.html index 0e06347..3b85a02 100644 --- a/index.html +++ b/index.html @@ -4,13 +4,13 @@ Introduction | eLLMental - +
-

Introduction

eLLMental is the ultimate library of components for building LLM-driven projects in the JVM.

Wanna try? Go straight to the Getting Started Guide, or keep reading to learn more about eLLMental.

What can you do with eLLMental?

eLLMental is divided into components that can be installed and used independently. Here's a summary of the available functionality:

Embeddings Space Component

Embedding models are a special kind of Large Language Model (LLM) that, given a piece of text, calculates a large vector representing a point in what we call the embeddings space. This embeddings space has the property that two pieces of text that are semantically related will be placed close to each other, allowing us to calculate a semantic distance between any two given pieces of text. Embeddings can be used to implement powerful search features that go beyond keyword matching, find related documents in a large database, or detect redundant information even when it's written in different ways.
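
To make the idea of semantic distance concrete, one common metric over embedding vectors is cosine distance; the sketch below shows how it can be computed (the actual metric used depends on the configured store):

// Cosine distance between two embedding vectors: values near 0 mean the texts
// point in the same semantic direction; larger values mean they are unrelated.
static double cosineDistance(double[] a, double[] b) {
    double dot = 0, normA = 0, normB = 0;
    for (int i = 0; i < a.length; i++) {
        dot += a[i] * b[i];
        normA += a[i] * a[i];
        normB += b[i] * b[i];
    }
    return 1 - dot / (Math.sqrt(normA) * Math.sqrt(normB));
}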

The Embeddings Space Component provides straightforward interfaces to create and operate with embeddings, find the semantically closest documents to a given piece of text, and perform many other operations. See the Embeddings Semantic Search Component documentation page for more details.

eLLMental Principles

These are the design principles behind eLLMental:

  1. Simplicity in Complexity: We aim to make the AI development process as simple and intuitive as any other library, hiding implementation details and glue code so the developer can focus on creating value.

  2. Readiness for Production: From development to deployment, all features of eLLMental are crafted with a production-ready mindset.

  3. Continuous Improvement: eLLMental continuously evolves for the better. With the support of our active community and dedicated team, we regularly add improvements and introduce new features.

Join the movement!

We'll need your help to build something that becomes really useful for everyone. There are many things you can do to contribute:

  1. Join the conversation in our Discord server.
  2. Send us suggestions, questions, or feature requests by creating a New Issue.
  3. Look at the Open Issues, choose one that interests you, and start contributing.
  4. Spread the word about eLLMental!
- +

Introduction

eLLMental is the ultimate library of components for building LLM-driven projects in the JVM.

Wanna try? Go straight to the Getting Started Guide, or keep reading to learn more about eLLMental.

What can you do with eLLMental?

eLLMental is divided into components that can be installed and used independently. Here's a summary of the available functionality:

Embeddings Space Component

Embedding models are a special kind of Large Language Model (LLM) that, given a piece of text, calculates a large vector representing a point in what we call the embeddings space. This embeddings space has the property that two pieces of text that are semantically related will be placed close to each other, allowing us to calculate a semantic distance between any two given pieces of text. Embeddings can be used to implement powerful search features that go beyond keyword matching, find related documents in a large database, or detect redundant information even when it's written in different ways.
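
To make the idea of semantic distance concrete, one common metric over embedding vectors is cosine distance; the sketch below shows how it can be computed (the actual metric used depends on the configured store):

// Cosine distance between two embedding vectors: values near 0 mean the texts
// point in the same semantic direction; larger values mean they are unrelated.
static double cosineDistance(double[] a, double[] b) {
    double dot = 0, normA = 0, normB = 0;
    for (int i = 0; i < a.length; i++) {
        dot += a[i] * b[i];
        normA += a[i] * a[i];
        normB += b[i] * b[i];
    }
    return 1 - dot / (Math.sqrt(normA) * Math.sqrt(normB));
}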

The Embeddings Space Component provides straightforward interfaces to create and operate with embeddings, find the semantically closest documents to a given piece of text, and perform many other operations. See the Embeddings Semantic Search Component documentation page for more details.

eLLMental Principles

These are the design principles behind eLLMental:

  1. Simplicity in Complexity: We aim to make the AI development process as simple and intuitive as any other library, hiding implementation details and glue code so the developer can focus on creating value.

  2. Readiness for Production: From development to deployment, all features of eLLMental are crafted with a production-ready mindset.

  3. Continuous Improvement: eLLMental continuously evolves for the better. With the support of our active community and dedicated team, we regularly add improvements and introduce new features.

Join the movement!

We'll need your help to build something that becomes really useful for everyone. There are many things you can do to contribute:

  1. Join the conversation in our Discord server.
  2. Send us suggestions, questions, or feature requests by creating a New Issue.
  3. Look at the Open Issues, choose one that interests you, and start contributing.
  4. Spread the word about eLLMental!
+ \ No newline at end of file