- Pin cohere version #327
- Add support for OctoAI LLM and embeddings #301 (Thanks @ptorru!)
- Add Qdrant as a supported knowledge base #244 (Thanks @Anush008!)
Full Changelog: https://github.com/pinecone-io/canopy/compare/v0.8.1...v0.9.0
- Fix cohere tokenizer test #307
- Fix upsert on gcp starter indexes #308
- Stop passing the unused `preamble_override` param to cohere chat #315
- Example for specifying an encoder to KB in README #302 (Thanks @coreation!)
- Remove JINA API key from mandatory env vars #303 (Thanks @aulorbe!)
Full Changelog: https://github.com/pinecone-io/canopy/compare/v0.8.0...v0.8.1
- Added support for Pydantic v2 #288 (quick example below)
Full Changelog: https://github.com/pinecone-io/canopy/compare/v0.7.0...v0.8.0
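For context on the Pydantic v2 change above (#288), the headline difference is the renamed model API. A minimal sketch with plain Pydantic, not Canopy's own models:

```python
from pydantic import BaseModel

class Document(BaseModel):
    id: str
    text: str

doc = Document(id="doc1", text="hello")
# Pydantic v2 renames the serialization methods:
print(doc.model_dump())       # v1: doc.dict()
print(doc.model_dump_json())  # v1: doc.json()
```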
- Move config directory to be part of the canopy package #278
- Fix building images on release #252
- Exporting the correct module CohereRecordEncoder #264 (Thanks @tomaarsen!)
- Fixed GRPC support #270
- Change the minimum version of FastAPI to 0.93.0 #279
- Reduce the docker image size #277
- Generalize chunk creation #258
- Add SentenceTransformersRecordEncoder #263 (Thanks @tomaarsen!)
- Add HybridRecordEncoder #265
- Make transformers optional & allow pinecone-text with dense optional #266
- Add cohere reranker #269
- Add dimension support for OpenAI embeddings #273 (example below)
- Include config template files inside the package and add a CLI command to dump them #287
- Add contributing guide #254
- Update README #267 (Thanks @aulorbe!)
- Fixed typo in dense.py docstring #280 (Thanks @ptorru!)
Full Changelog: https://github.com/pinecone-io/canopy/compare/v0.6.0...v0.7.0
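To illustrate the embedding dimension support above (#273): the stock OpenAI client (shown here directly, not through Canopy's encoder interface) accepts a `dimensions` argument on the text-embedding-3 models, roughly:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# text-embedding-3 models accept `dimensions` to shorten the returned vectors
resp = client.embeddings.create(
    model="text-embedding-3-small",
    input=["canopy is a RAG framework"],
    dimensions=256,
)
print(len(resp.data[0].embedding))  # 256
```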
- Pinecone serverless support #246 (sketch below)
- Loosen fastapi and uvicorn requirements #229
- Cleanup indexes in case of failure #232
- Add timeout to checking server health #236
- Add instruction query generator #226
- Separate LLM API params #231
- Add dockerfile #234, #237, #242
- Add support for namespaces #243
- Azure OpenAI LLM implementation #188 (Thanks @MichaelAnckaert, @aulorbe!)
- Add deployment guide (GCP) #239
Full Changelog: https://github.com/pinecone-io/canopy/compare/V0.5.0...v0.6.0
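For reference on the serverless support above (#246): creating a serverless index with the Pinecone client looks roughly like this. The index name and region are placeholders, and Canopy normally handles index creation for you:

```python
from pinecone import Pinecone, ServerlessSpec

pc = Pinecone(api_key="YOUR_API_KEY")

# A serverless index is declared via a ServerlessSpec instead of a pod type
pc.create_index(
    name="canopy--my-index",  # placeholder name
    dimension=1536,
    metric="cosine",
    spec=ServerlessSpec(cloud="aws", region="us-west-2"),
)
```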
- Bump pytest-html version #213
- Improve dataloader error handling #182
- Slightly improve error handling for external errors #222
- Cohere Embedding model support #203 (Thanks @jamescalam!), sketched below
- Add Anyscale Embedding model support #198
- Change max prompt tokens for Anyscale config #222
Full Changelog: https://github.com/pinecone-io/canopy/compare/V0.3.0...v0.5.0
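To give a flavor of what the Cohere embedding support above (#203) wraps, here is plain cohere SDK (v4-style) usage rather than Canopy's `CohereRecordEncoder` interface:

```python
import cohere

co = cohere.Client("YOUR_COHERE_API_KEY")

# Cohere's v3 embedding models require an input_type hint
resp = co.embed(
    texts=["what is a knowledge base?"],
    model="embed-english-v3.0",
    input_type="search_query",
)
print(len(resp.embeddings[0]))
```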
- Fix some typos, add dev container, faux streaming #200 (Thanks @eburnette!)
- CLI requires OpenAI API key, even if OpenAI is not being used #208
- CLI: read config file from env location #190 (Thanks @MichaelAnckaert!)
- Add document field explanations and python version badges #187
- Update README.md #192 (Thanks @tomer-w!)
- Tweaks to CLI help texts #193 (Thanks @jseldess!)
- Update README.md and change href #202
- Add Anyscale Endpoint support and Llama Tokenizer #173 (Thanks @kylehh!), illustrated below
- Add last message query generator #210
Full Changelog: https://github.com/pinecone-io/canopy/compare/V0.2.0...V0.3.0
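Anyscale Endpoints (#173) expose an OpenAI-compatible API, so client-side usage is roughly an OpenAI-style client pointed at their base URL; the model name below is just an example:

```python
from openai import OpenAI

# Anyscale Endpoints are OpenAI-compatible: only base URL and key change
client = OpenAI(
    base_url="https://api.endpoints.anyscale.com/v1",
    api_key="YOUR_ANYSCALE_API_KEY",
)
resp = client.chat.completions.create(
    model="meta-llama/Llama-2-7b-chat-hf",  # example model name
    messages=[{"role": "user", "content": "Hello!"}],
)
print(resp.choices[0].message.content)
```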
- Bug fix in E2E test that prevented running `pytest tests/` #175
- Added versioning to Canopy server's API #169 (illustrative call below)
Full Changelog: https://github.com/pinecone-io/canopy/compare/V0.1.4...V0.2.0
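On the API versioning change (#169): clients now call a version-prefixed route. The exact path below is an assumption for illustration; the point is the version prefix:

```python
import requests

# Hypothetical route against a locally running server; #169 introduced a
# version prefix (e.g. /v1) on the server's API
resp = requests.post(
    "http://localhost:8000/v1/chat/completions",
    json={"messages": [{"role": "user", "content": "Hello, Canopy!"}]},
)
print(resp.status_code)
```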
- Fixed error when trying to run `canopy chat` on Windows #166
- Fixed `canopy stop` on Windows #166
- Update incorrect pinecone quick start path #168 (Thanks @abpai!)
- Edit description in pyproject.toml
- Added the ability to load individual text files from a directory
- Bumped the `pinecone-text` dependency to fix a numpy dependency issue
- Readme fixes
- Initial release