ReliaChat is a frontend app designed to offer the simplest possible UI and broad app-language support for chatting with a locally hosted LLM.

LTS-VVE/ReliaChat

Please star our project! 🙌✨❤️👀

Note

We have fixed the backend issue, and you can now officially chat with an LLM of your choice at any time (as long as the model is installed).

Important

CHANGELOG:
The new v1.3.5 build fixes color issues that are not supported in flet 0.26.0 and 0.27.0.
Added a new setup screen to make onboarding friendlier and more customizable.
We plan to add model downloads directly into the setup (or allow them to be skipped) for the Termux/iSH integration, as we are fine-tuning a model on the right data for the Albanian and English languages. (Note that other mainstream models that run well on mobile hardware, such as Gemma or Phi, or any other model of your choice, can be used by pasting the model's URL.)

ReliaChat AI Innovations App (Source-Code)

Important

This project is under the GNU AFFERO GENERAL PUBLIC LICENSE v3.

Important

Please report issues on the GitHub/GitLab issues tab or email us directly at: [email protected].


Version 1.3.4-BETA.

Note

NEW RELEASE V 1.3.4-BETA.

Tip

What's in the new release?
1. Word filtering has been improved. Words that include profanity, discrimination and illegal behaviors has been improved.

2. UI Changes. (Chatbubble changes, filter chatbubble, performance updates, user chatbubble changes, padding.)

3. Description update.

Important

We will add filtering for AI responses in future releases.

Caution

ReliaChat AI Innovations IS NOT RESPONSIBLE FOR ANY ACTIONS YOU TAKE USING THE AI APP. THE APP IS BUILT AS A FRONT-END WHOSE ONLY RESPONSIBILITY IS TO SEND REQUESTS LOCALLY TO ALREADY-TRAINED MODELS (such as Ollama models, Hugging Face models, or external sources for customizability).
THESE MODELS ARE PRE-TRAINED BY OPEN-SOURCE COMMUNITIES. PLEASE BE SURE TO USE THE SOFTWARE PROVIDED BY ReliaChat AI Innovations ETHICALLY.

Note

This filtering feature can be turned off via the settings menu.

Note

Note that you can build the app for iOS on your own machine; however, you cannot install a pre-compiled IPA on iOS. If no resources are available to you, we recommend using a cloud VPS or waiting for the official iOS build.

Important

Please make sure you report issues that occur in the app via GitHub, GitLab, or our email at [email protected]. This will help us move forward with the project.

Tip

Please make sure you are within the EU to install (sideload) apps on iOS.

Builds

Android Build
iOS Build (Coming soon)

Note

Please remember that the iOS build is likely to come out on the first stable release.

What about ReliaChat?




Flet plus Ollama



ReliaChat is an open-source app, made for a high school project, built using Flet (a Flutter-based Python framework). It connects to a set port on your device via local requests through the Termux or iSH app (for Android and iOS respectively). It has a modern UI, an easy-to-use frontend, and easy command pasting for non-advanced users, while maintaining broad language support, ease of use, and respect for privacy.



Warning

We recommend you follow the commands below to install Ollama on your mobile device (Android/iOS) in order to chat with the desired or preset AI model.

Tip

Commands to install it on Android:

pkg update -y && pkg upgrade -y
pkg install -y python curl wget openssh git golang cmake clang
python -m ensurepip --upgrade
pip install flask requests
git clone https://github.com/ollama/ollama.git
cd ollama
go generate ./...
go build .
./ollama pull gemma:2b
wget https://raw.githubusercontent.com/LTS-VVE/ReliaChat/main/backend/backend_server_for_mobile.py
# (green ReliaChat ASCII-art banner printed here; art omitted)
echo -e "\033[32mInstallation and setup complete. Ollama and ReliaChat backend are ready to launch. Launching Server... Thank you for using ReliaChat! When attempting to relaunch the api server simply run python3 ~/ollama/backend_server_for_mobile.py or by going into the directory and running python3 backend_server_for_mobile.py. \033[0m"
./ollama serve &
python3 backend_server_for_mobile.py

Note

The UI has been customized to feel modern and easy to use; however, this is the very first version, and some UI features may not work as expected.

How the app behaves when run:

Note

UI On Sidebar.


Important

The UI may not work well, as it is in early development. Please report your issues in the GitHub/GitLab issues section or send us an email at [email protected].

Note

We are currently aware of the stack bug on mobile when trying to change the settings (such as unformatted text-entry container boxes).
We will fix this in a future release.

Note

Greeting based on system time.


Note

Erase all chats function (Erasing All Chat History), and the Settings tab.


Note

How the AI responds visually.


Note

How the AI responds visually when prompted with a filtered/censored word or a string of characters containing such phrases.


How the App Works

New filtering System:

Warning

The code below is a demonstration of how it works; the word list is located in the frontend folder of this GitHub repo.

# This script defines a function called `strict_content` which is used for content moderation.
# The function checks whether any blocked words are present in the given content and returns True if any are found.
# This is useful for filtering out inappropriate or harmful content in user-generated inputs. ReliaChat AI Innovations IS NOT RESPONSIBLE FOR ANY WARRANTY
# OR MISUSE OF THE SOFTWARE. THE SOFTWARE IS PROVIDED AS IS. FOR MORE INFO, SEE THE LICENSE: https://github.com/LTS-VVE/ReliaChat/blob/main/LICENSE.

def strict_content(content):
    # Blocked Words.
    # NOTE THAT THIS SECTION OR FILE OF CODE IS SIMPLY FOR MODERATION, AND ReliaChat AI Innovations IS NOT PROMOTING, ADVERTISING,
    # OR ACCEPTING LIABILITY FOR ANY OF THE WORDS PROVIDED FOR MODERATION.
    blocked_words = [
        "badword1", "badword2",
    ]
    
    # Currently there are only 2 languages for the profanity check. Please view our README on GitHub or GitLab for more info.


    # Check if any blocked word is in the content
    for word in blocked_words:
        if word in content.lower():
            return True
    
    return False

In the list above we can see that if the user's input contains a word from the strict_content list (in this case badword1 or badword2), the frontend will block the request from ever reaching the Ollama server, for extra protection. This way, if the AI model is producing inaccurate or morally wrong responses, or hallucinating, the filter acts as a safeguard to protect the user from unethical use of AI. This feature can be toggled off; however, it is recommended to keep it on, as small LLMs like gemma:2b do not have enough capacity to enforce moderate responses on their own. We therefore added a simple system that regulates user prompts to enforce safety.
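As a sketch of the safeguard described above, the snippet below shows how a hypothetical send_prompt helper could refuse to forward a message once strict_content flags it. Note that send_prompt, its return strings, and the FORWARD: marker are illustrative assumptions, not the app's actual code:

```python
def strict_content(content):
    # Same shape as the moderation function above: True if any blocked word appears.
    blocked_words = ["badword1", "badword2"]
    return any(word in content.lower() for word in blocked_words)


def send_prompt(message):
    # Hypothetical guard: block the request before it ever reaches the Ollama server.
    if strict_content(message):
        return "Message blocked by the content filter."
    return f"FORWARD:{message}"  # stand-in for the real request to the backend
```

In the real app, the branch that returns the blocked message would surface as the filtered chat bubble shown in the screenshots above.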

Warning

Please note that we are currently adding AI-side moderation to the filter as well; at the moment only user requests are filtered, which could prove to be unethical. Again, ReliaChat AI Innovations is not responsible for any ACTION taken on YOUR DEVICE. THE SOFTWARE IS PROVIDED AS IS. FOR MORE INFO, SEE THE License.

Diagram of how the app works:

%%{init: {
  'theme': 'dark',
  'themeVariables': {
    'fontFamily': 'arial',
    'fontSize': '16px',
    'lineColor': '#808080',
    'primaryColor': '#1f2937',
    'primaryTextColor': '#ffffff',
    'primaryBorderColor': '#404040',
    'secondaryColor': '#2d3748',
    'tertiaryColor': '#374151'
  }
}}%%
graph TD
    subgraph Termux["Termux/iSH Environment"]
        direction TB
        TermuxNode[("Mobile Linux/Unix Environment")]
        style TermuxNode fill:#1f2937,stroke:#404040
    end
    subgraph Client["Client Layer"]
        direction TB
        FrontendNode["Flet App<br/>Python UI Framework"]
        style FrontendNode fill:#2d3748,stroke:#404040
    end
    subgraph Moderation["Moderation Layer"]
        direction TB
        ModerationNode["Input Validation"]
        style ModerationNode fill:#1f2937,stroke:#404040
    end
    subgraph Server["Server Layer"]
        direction TB
        BackendNode["Flask Backend<br/>Port: 8000"]
        style BackendNode fill:#374151,stroke:#404040
    end
    subgraph AI["AI Service"]
        direction TB
        OllamaNode["Ollama LLM<br/>Port: 11434"]
        style OllamaNode fill:#1f2937,stroke:#404040
    end
    
    FrontendNode -->|"User Input"| Moderation
    Moderation -->|"Validated Input"| BackendNode
    BackendNode -->|"LLM Query"| OllamaNode
    OllamaNode -->|"AI Response"| BackendNode
    BackendNode -->|"JSON Result"| FrontendNode
    
    TermuxNode -.->|"Hosts"| BackendNode
    TermuxNode -.->|"Hosts"| OllamaNode
    
    classDef default fill:#2d3748,stroke:#404040,color:#fff
    classDef container fill:#1f2937,stroke:#404040,color:#fff
    
    class Termux,Client,Moderation,Server,AI container



Ollama Server Requests

We send requests via urllib, since building the app for iOS and Android with the requests module fails.

import json
import urllib.request

This is paired with the function that gets the Ollama response (def get_ollama_response):

def get_ollama_response(message, ip, port, temperature, custom_endpoint):
    url = f"http://{ip}:{port}{custom_endpoint}"
    headers = {
        "Content-Type": "application/json",
    }
    data = {
        "prompt": message,
        "temperature": temperature
    }
    
    try:
        data_bytes = json.dumps(data).encode('utf-8')
        req = urllib.request.Request(
            url,
            data=data_bytes,
            headers=headers,
            method='POST'
        )
        
        response_data = ""
        with urllib.request.urlopen(req) as response:
            for line in response:
                line = line.decode('utf-8')
                if line.startswith("data:"):
                    response_part = json.loads(line[5:])['response']
                    response_data = response_part  # Update the response_data to the latest part
                    print(response_part)  # Print each part of the response for debugging
        return response_data
    except Exception as e:
        return f"Failed to connect to server at {url}. Please check the IP and Port. Error: {str(e)}"

In the code above, the IP, port, and endpoint are taken from settings.json in order to connect to the desired open IP address and port with the desired endpoint.
The function that saves the settings via a confirmation dialog:

def save_settings_action(e):
    save_settings(ip_field.value, port_field.value, username_field.value, "light" if theme_toggle.value else "dark", float(temperature_input.value), custom_endpoint_field.value)
    dlg.open = False
    page.update()

Setting dlg.open = False closes the confirmation dialog; when the value is True, the dialog showing the endpoint, port, and IP is visible on the frontend. When the settings are saved, the app grabs them and substitutes them in as the {ip}/{port}/{custom_endpoint} values, which are then used to connect to the backend server (the Flask Python server) at the configured IP, PORT, and ENDPOINT.
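As a sketch of the other half of this flow, the helper below reads the saved settings back before calling get_ollama_response. The key names match the parameters of get_ollama_response above, but the DEFAULTS values and the exact layout of settings.json are assumptions, not the app's confirmed format:

```python
import json

# Assumed defaults; the real app's defaults may differ.
DEFAULTS = {
    "ip": "127.0.0.1",
    "port": "8000",
    "custom_endpoint": "/api/v1/query",
    "temperature": 0.7,
}


def load_settings(path="settings.json"):
    # Merge whatever was saved over the defaults; fall back entirely
    # to the defaults if the file is missing or unreadable.
    try:
        with open(path, "r", encoding="utf-8") as f:
            saved = json.load(f)
    except (FileNotFoundError, json.JSONDecodeError):
        saved = {}
    settings = dict(DEFAULTS)
    settings.update(saved)
    return settings
```

The returned dict can then be unpacked into the request, e.g. get_ollama_response(message, settings["ip"], settings["port"], settings["temperature"], settings["custom_endpoint"]).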

Backend.

In the backend, we route the Ollama server (ollama serve &) to the custom endpoint.

@app.route('/api/v1/query', methods=['POST'])
def query_model():
    data = request.get_json()
    prompt = data.get("prompt", "")

After the process is completed, the Flask server emits two repeated streaming responses; the code was therefore modified to remove the doubled streaming responses.

Note

Make sure to review your custom code carefully, as the function can break easily, dropping the streaming response from the server.

Language Support Customization. (App layout).


For language support, simply go to the translations.py file and translate the entries into your preferred language.
Example:

"en": {
        "settings": "Settings",
        "help": "Help",
        "privacy_policy": "Privacy Policy",
        "about": "About",
        "terms_of_use": "Terms of Use",
        "erase_all_chats": "Erase All Chats",
    },

In the code, simply replace "en" with your preferred language code (e.g. "de") and provide the translations after each colon (e.g. "settings": "Einstellungen").
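A minimal sketch of how such a dictionary could be looked up with an English fallback. The translate helper and the two-language TRANSLATIONS table are illustrative assumptions; the app's actual lookup code may differ:

```python
TRANSLATIONS = {
    "en": {"settings": "Settings", "help": "Help"},
    "de": {"settings": "Einstellungen", "help": "Hilfe"},
}


def translate(key, lang="en"):
    # Fall back to English, then to the raw key, when a translation is missing.
    return TRANSLATIONS.get(lang, {}).get(key, TRANSLATIONS["en"].get(key, key))
```

With a fallback like this, a partially translated language still renders English labels instead of crashing or showing blank strings.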

Note

Other features like a time-based greeting, an animated typewriter effect, and a blinking cursor are implemented, along with a copy button, chat history, and transparent blurry chat bubbles. (Basic app amenities.)

Supported Languages.

Note

More languages will be added soon. (Any language can be added manually by editing the code; however, on the main repo, the following are supported by default.)

Note

NOTE THAT THESE ARE APP LANGUAGES, NOT AI LANGUAGES. IF YOU WANT MULTILINGUAL SUPPORT FOR THE RESPONSES, PLEASE CHOOSE A MULTILINGUAL AI MODEL. As mentioned earlier, moderation only applies to Albanian (Shqip) and English. Please add your own moderation if desired.

Language / Flag

English 🇺🇸
Albanian 🇦🇱
German 🇩🇪
Danish 🇩🇰
Hungarian 🇭🇺
Irish 🇮🇪
Italian 🇮🇹
Norwegian 🇳🇴
Ukrainian 🇺🇦
Romanian 🇷🇴
Russian 🇷🇺
Spanish 🇪🇸
French 🇫🇷
Swedish 🇸🇪
Simplified Chinese 🇨🇳
Cantonese (Traditional Chinese) 🇭🇰
Japanese 🇯🇵
Korean 🇰🇷
Hindi 🇮🇳
Tamil 🇱🇰
Hebrew 🇮🇱
Arabic 🇦🇪
Amharic 🇪🇹
Swahili 🇹🇿
Persian (Farsi) 🇮🇷
Nepali 🇳🇵
Filipino 🇵🇭
Bulgarian 🇧🇬
Thai 🇹🇭
Portuguese 🇵🇹
Portuguese (Brazil) 🇧🇷
Indonesian 🇮🇩
Greek 🇬🇷
Croatian 🇭🇷
Serbian 🇷🇸
Finnish 🇫🇮
Macedonian 🇲🇰
Polish 🇵🇱
Turkish 🇹🇷
Georgian 🇬🇪
Kazakh 🇰🇿
Malay 🇲🇾
Vietnamese 🇻🇳
Czech 🇨🇿
Latin 🇺🇳

Tip

Hey! It looks like you actually read the documentation! We appreciate that!
