Add notice on usage of local models #12298

Open · wants to merge 3 commits into base: main
Changes from 1 commit
1 change: 1 addition & 0 deletions src/main/java/module-info.java
@@ -193,5 +193,6 @@
requires mslinks;
requires org.antlr.antlr4.runtime;
requires org.libreoffice.uno;
requires org.checkerframework.checker.qual;
// endregion
}
@@ -35,9 +35,11 @@
<ChatPromptComponent fx:id="chatPrompt" HBox.hgrow="ALWAYS" />
</HBox>
<HBox alignment="CENTER" spacing="50">
<Label fx:id="noticeText"
text="%Current AI model: %0. The AI may generate inaccurate or inappropriate responses. Please verify any information provided."
BorderPane.alignment="CENTER"/>
<VBox fx:id="noticeVbox" spacing="5" alignment="CENTER">
<Label fx:id="noticeText"
text="%Current AI model: %0. This content is generated by an AI system. Please verify its accuracy and relevance before use."
@subhramit (Member) commented on Dec 16, 2024:
I think the second sentence is redundant once the first has been read. Also, the third sentence ends with "before use", but accuracy can be verified only after the content has been generated.
Based on ChatGPT (screenshot omitted):

Three possible suggestions:

Suggested change
text="%Current AI model: %0. This content is generated by an AI system. Please verify its accuracy and relevance before use."
text="%Current AI model: %0. AI generated content may not always be accurate. Verify any important information."

Or:

Suggested change
text="%Current AI model: %0. This content is generated by an AI system. Please verify its accuracy and relevance before use."
text="%Current AI model: %0. Always verify the correctness of any important information."

Or:

Suggested change
text="%Current AI model: %0. This content is generated by an AI system. Please verify its accuracy and relevance before use."
text="%Current AI model: %0. Please always verify the correctness of AI-generated content."

and other permutations...

Collaborator (Author):
Ha! I hadn't looked at "Current model: ...". But you are right. I have now rewritten the message a bit; what do you think of it?

Member:
Yes! This one reads well.

BorderPane.alignment="CENTER"/>
</VBox>
<Button alignment="CENTER"
onAction="#onClearChatHistory"
styleClass="icon-button,narrow"
@@ -62,6 +62,7 @@ public class AiChatComponent extends VBox {
@FXML private Button notificationsButton;
@FXML private ChatPromptComponent chatPrompt;
@FXML private Label noticeText;
@FXML private VBox noticeVbox;

public AiChatComponent(AiService aiService,
StringProperty name,
@@ -109,6 +110,10 @@ private void initializeNotice() {
.replaceAll("%0", aiPreferences.getAiProvider().getLabel() + " " + aiPreferences.getSelectedChatModel());

noticeText.setText(newNotice);

if (aiPreferences.isUnsafeModelSelected()) {
noticeVbox.getChildren().add(new Label(Localization.lang("A custom or local AI model is used, JabRef is not responsible for the content generated by the model.")));
Member:
What about the other models? Do we take responsibility? I hope not :).

Proposal: "A model not covered by the EU AI act is selected. Use with extra care."

}
}

private void initializeChatPrompt() {
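Following the reviewer's proposal above, a minimal sketch of how the reworded notice could be wired into initializeNotice() (the wording and the new localization key are hypothetical, not part of this change set):

    if (aiPreferences.isUnsafeModelSelected()) {
        // Hypothetical wording from the review comment; a matching key would also have to be added to JabRef_en.properties.
        noticeVbox.getChildren().add(new Label(Localization.lang(
                "A model not covered by the EU AI act is selected. Use with extra care.")));
    }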
@@ -47,7 +47,7 @@ public PrivacyNoticeComponent(AiPreferences aiPreferences, Runnable onIAgreeButt

@FXML
private void initialize() {
addPrivacyHyperlink(aiPolicies, AiProvider.OPEN_AI);
addPrivacyHyperlink(aiPolicies, AiProvider.OPEN_AI_COMPATIBLE);
addPrivacyHyperlink(aiPolicies, AiProvider.MISTRAL_AI);
addPrivacyHyperlink(aiPolicies, AiProvider.GEMINI);
addPrivacyHyperlink(aiPolicies, AiProvider.HUGGING_FACE);
14 changes: 7 additions & 7 deletions src/main/java/org/jabref/gui/preferences/ai/AiTabViewModel.java
@@ -141,7 +141,7 @@ public AiTabViewModel(CliPreferences preferences) {

if (oldValue != null) {
switch (oldValue) {
case OPEN_AI -> {
case OPEN_AI_COMPATIBLE -> {
openAiChatModel.set(oldChatModel);
openAiApiKey.set(currentApiKey.get());
openAiApiBaseUrl.set(currentApiBaseUrl.get());
@@ -170,7 +170,7 @@ public AiTabViewModel(CliPreferences preferences) {
}

switch (newValue) {
case OPEN_AI -> {
case OPEN_AI_COMPATIBLE -> {
currentChatModel.set(openAiChatModel.get());
currentApiKey.set(openAiApiKey.get());
currentApiBaseUrl.set(openAiApiBaseUrl.get());
@@ -204,7 +204,7 @@ public AiTabViewModel(CliPreferences preferences) {
}

switch (selectedAiProvider.get()) {
case OPEN_AI -> openAiChatModel.set(newValue);
case OPEN_AI_COMPATIBLE -> openAiChatModel.set(newValue);
case MISTRAL_AI -> mistralAiChatModel.set(newValue);
case GEMINI -> geminiChatModel.set(newValue);
case HUGGING_FACE -> huggingFaceChatModel.set(newValue);
@@ -216,7 +216,7 @@ public AiTabViewModel(CliPreferences preferences) {

this.currentApiKey.addListener((observable, oldValue, newValue) -> {
switch (selectedAiProvider.get()) {
case OPEN_AI -> openAiApiKey.set(newValue);
case OPEN_AI_COMPATIBLE -> openAiApiKey.set(newValue);
case MISTRAL_AI -> mistralAiApiKey.set(newValue);
case GEMINI -> geminiAiApiKey.set(newValue);
case HUGGING_FACE -> huggingFaceApiKey.set(newValue);
@@ -226,7 +226,7 @@ public AiTabViewModel(CliPreferences preferences) {

this.currentApiBaseUrl.addListener((observable, oldValue, newValue) -> {
switch (selectedAiProvider.get()) {
case OPEN_AI -> openAiApiBaseUrl.set(newValue);
case OPEN_AI_COMPATIBLE -> openAiApiBaseUrl.set(newValue);
case MISTRAL_AI -> mistralAiApiBaseUrl.set(newValue);
case GEMINI -> geminiApiBaseUrl.set(newValue);
case HUGGING_FACE -> huggingFaceApiBaseUrl.set(newValue);
@@ -298,7 +298,7 @@ public AiTabViewModel(CliPreferences preferences) {

@Override
public void setValues() {
openAiApiKey.setValue(aiPreferences.getApiKeyForAiProvider(AiProvider.OPEN_AI));
openAiApiKey.setValue(aiPreferences.getApiKeyForAiProvider(AiProvider.OPEN_AI_COMPATIBLE));
mistralAiApiKey.setValue(aiPreferences.getApiKeyForAiProvider(AiProvider.MISTRAL_AI));
geminiAiApiKey.setValue(aiPreferences.getApiKeyForAiProvider(AiProvider.GEMINI));
huggingFaceApiKey.setValue(aiPreferences.getApiKeyForAiProvider(AiProvider.HUGGING_FACE));
@@ -351,7 +351,7 @@ public void storeSettings() {
aiPreferences.setHuggingFaceChatModel(huggingFaceChatModel.get() == null ? "" : huggingFaceChatModel.get());
aiPreferences.setGpt4AllChatModel(gpt4AllChatModel.get() == null ? "" : gpt4AllChatModel.get());

aiPreferences.storeAiApiKeyInKeyring(AiProvider.OPEN_AI, openAiApiKey.get() == null ? "" : openAiApiKey.get());
aiPreferences.storeAiApiKeyInKeyring(AiProvider.OPEN_AI_COMPATIBLE, openAiApiKey.get() == null ? "" : openAiApiKey.get());
aiPreferences.storeAiApiKeyInKeyring(AiProvider.MISTRAL_AI, mistralAiApiKey.get() == null ? "" : mistralAiApiKey.get());
aiPreferences.storeAiApiKeyInKeyring(AiProvider.GEMINI, geminiAiApiKey.get() == null ? "" : geminiAiApiKey.get());
aiPreferences.storeAiApiKeyInKeyring(AiProvider.HUGGING_FACE, huggingFaceApiKey.get() == null ? "" : huggingFaceApiKey.get());
14 changes: 7 additions & 7 deletions src/main/java/org/jabref/logic/ai/AiDefaultPreferences.java
@@ -10,11 +10,11 @@

public class AiDefaultPreferences {
public enum PredefinedChatModel {
GPT_4O_MINI(AiProvider.OPEN_AI, "gpt-4o-mini", 128000),
GPT_4O(AiProvider.OPEN_AI, "gpt-4o", 128000),
GPT_4(AiProvider.OPEN_AI, "gpt-4", 8192),
GPT_4_TURBO(AiProvider.OPEN_AI, "gpt-4-turbo", 128000),
GPT_3_5_TURBO(AiProvider.OPEN_AI, "gpt-3.5-turbo", 16385),
GPT_4O_MINI(AiProvider.OPEN_AI_COMPATIBLE, "gpt-4o-mini", 128000),
GPT_4O(AiProvider.OPEN_AI_COMPATIBLE, "gpt-4o", 128000),
GPT_4(AiProvider.OPEN_AI_COMPATIBLE, "gpt-4", 8192),
GPT_4_TURBO(AiProvider.OPEN_AI_COMPATIBLE, "gpt-4-turbo", 128000),
GPT_3_5_TURBO(AiProvider.OPEN_AI_COMPATIBLE, "gpt-3.5-turbo", 16385),
OPEN_MISTRAL_NEMO(AiProvider.MISTRAL_AI, "open-mistral-nemo", 128000),
OPEN_MISTRAL_7B(AiProvider.MISTRAL_AI, "open-mistral-7b", 32000),
// "mixtral" is not a typo.
@@ -59,10 +59,10 @@ public String toString() {
public static final boolean AUTO_GENERATE_EMBEDDINGS = false;
public static final boolean AUTO_GENERATE_SUMMARIES = false;

public static final AiProvider PROVIDER = AiProvider.OPEN_AI;
public static final AiProvider PROVIDER = AiProvider.OPEN_AI_COMPATIBLE;

public static final Map<AiProvider, PredefinedChatModel> CHAT_MODELS = Map.of(
AiProvider.OPEN_AI, PredefinedChatModel.GPT_4O_MINI,
AiProvider.OPEN_AI_COMPATIBLE, PredefinedChatModel.GPT_4O_MINI,
AiProvider.MISTRAL_AI, PredefinedChatModel.OPEN_MIXTRAL_8X22B,
AiProvider.GEMINI, PredefinedChatModel.GEMINI_1_5_FLASH,
AiProvider.HUGGING_FACE, PredefinedChatModel.BLANK_HUGGING_FACE,
27 changes: 24 additions & 3 deletions src/main/java/org/jabref/logic/ai/AiPreferences.java
@@ -392,7 +392,7 @@ public int getContextWindowSize() {
return contextWindowSize.get();
} else {
return switch (aiProvider.get()) {
case OPEN_AI -> AiDefaultPreferences.getContextWindowSize(AiProvider.OPEN_AI, openAiChatModel.get());
case OPEN_AI_COMPATIBLE -> AiDefaultPreferences.getContextWindowSize(AiProvider.OPEN_AI_COMPATIBLE, openAiChatModel.get());
case MISTRAL_AI -> AiDefaultPreferences.getContextWindowSize(AiProvider.MISTRAL_AI, mistralAiChatModel.get());
case HUGGING_FACE -> AiDefaultPreferences.getContextWindowSize(AiProvider.HUGGING_FACE, huggingFaceChatModel.get());
case GEMINI -> AiDefaultPreferences.getContextWindowSize(AiProvider.GEMINI, geminiChatModel.get());
@@ -516,7 +516,7 @@ public void addListenerToApiBaseUrls(Runnable runnable) {

public String getSelectedChatModel() {
return switch (aiProvider.get()) {
case OPEN_AI ->
case OPEN_AI_COMPATIBLE ->
openAiChatModel.get();
case MISTRAL_AI ->
mistralAiChatModel.get();
@@ -532,7 +532,7 @@ public String getSelectedChatModel() {
public String getSelectedApiBaseUrl() {
if (customizeExpertSettings.get()) {
return switch (aiProvider.get()) {
case OPEN_AI ->
case OPEN_AI_COMPATIBLE ->
openAiApiBaseUrl.get();
case MISTRAL_AI ->
mistralAiApiBaseUrl.get();
@@ -570,4 +570,25 @@ public String getTemplate(AiTemplate aiTemplate) {
public StringProperty templateProperty(AiTemplate aiTemplate) {
return templates.get(aiTemplate);
}

/**
* Returns whether the selected model is "not safe" to use. "Safe" here means that the model is a remote model
* operated by a reputable company.
* <p>
* This method exists because of the <a href="https://eur-lex.europa.eu/eli/reg/2024/1689/oj">EU AI Act</a>. LLMs are high-risk systems and may generate harmful content.
* If the user connects to a reputable remote provider (OpenAI, Gemini, Mistral AI, etc.), the models are safe to use, as they
* are subject to the EU AI Act. However, when the user selects a local model or a model from Hugging Face, any model
* can be used in JabRef, including uncensored and harmful ones.
*/
public boolean isUnsafeModelSelected() {
Member:
Rename to isSubjectToEuAiAct?

In all cases do NOT use negations in the method name. Use isSafeModelSelected.

// Any model can be chosen in GPT4All.
boolean localProvider = getAiProvider() == AiProvider.GPT4ALL;
// Any model can be chosen on HuggingFace.
boolean huggingFace = getAiProvider() == AiProvider.HUGGING_FACE;
// If the user has changed the API base URL from the default one, they have probably connected to a local model provider,
// like `ollama` or `llama.cpp`.
boolean customApiBaseUrl = !getSelectedApiBaseUrl().equals(getAiProvider().getApiUrl());

return localProvider || huggingFace || customApiBaseUrl;
}
}
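Picking up the reviewer's naming suggestion, a minimal sketch of a positively named variant that simply negates the existing checks (the method name isSafeModelSelected and the comments are illustrative only, not part of this diff):

    public boolean isSafeModelSelected() {
        // GPT4All and Hugging Face allow arbitrary, possibly uncensored, models to be loaded.
        boolean localProvider = getAiProvider() == AiProvider.GPT4ALL;
        boolean huggingFace = getAiProvider() == AiProvider.HUGGING_FACE;
        // A changed API base URL usually means a local provider such as ollama or llama.cpp.
        boolean customApiBaseUrl = !getSelectedApiBaseUrl().equals(getAiProvider().getApiUrl());

        return !(localProvider || huggingFace || customApiBaseUrl);
    }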
@@ -61,7 +61,7 @@ private void rebuild() {
}

switch (aiPreferences.getAiProvider()) {
case OPEN_AI -> {
case OPEN_AI_COMPATIBLE -> {
langchainChatModel = Optional.of(new JvmOpenAiChatLanguageModel(aiPreferences, httpClient));
}

@@ -645,14 +645,14 @@ protected JabRefCliPreferences() {
defaults.put(AI_AUTO_GENERATE_EMBEDDINGS, AiDefaultPreferences.AUTO_GENERATE_EMBEDDINGS);
defaults.put(AI_AUTO_GENERATE_SUMMARIES, AiDefaultPreferences.AUTO_GENERATE_SUMMARIES);
defaults.put(AI_PROVIDER, AiDefaultPreferences.PROVIDER.name());
defaults.put(AI_OPEN_AI_CHAT_MODEL, AiDefaultPreferences.CHAT_MODELS.get(AiProvider.OPEN_AI).getName());
defaults.put(AI_OPEN_AI_CHAT_MODEL, AiDefaultPreferences.CHAT_MODELS.get(AiProvider.OPEN_AI_COMPATIBLE).getName());
defaults.put(AI_MISTRAL_AI_CHAT_MODEL, AiDefaultPreferences.CHAT_MODELS.get(AiProvider.MISTRAL_AI).getName());
defaults.put(AI_GEMINI_CHAT_MODEL, AiDefaultPreferences.CHAT_MODELS.get(AiProvider.GEMINI).getName());
defaults.put(AI_HUGGING_FACE_CHAT_MODEL, AiDefaultPreferences.CHAT_MODELS.get(AiProvider.HUGGING_FACE).getName());
defaults.put(AI_GPT_4_ALL_MODEL, AiDefaultPreferences.CHAT_MODELS.get(AiProvider.GPT4ALL).getName());
defaults.put(AI_CUSTOMIZE_SETTINGS, AiDefaultPreferences.CUSTOMIZE_SETTINGS);
defaults.put(AI_EMBEDDING_MODEL, AiDefaultPreferences.EMBEDDING_MODEL.name());
defaults.put(AI_OPEN_AI_API_BASE_URL, AiProvider.OPEN_AI.getApiUrl());
defaults.put(AI_OPEN_AI_API_BASE_URL, AiProvider.OPEN_AI_COMPATIBLE.getApiUrl());
defaults.put(AI_MISTRAL_AI_API_BASE_URL, AiProvider.MISTRAL_AI.getApiUrl());
defaults.put(AI_GEMINI_API_BASE_URL, AiProvider.GEMINI.getApiUrl());
defaults.put(AI_HUGGING_FACE_API_BASE_URL, AiProvider.HUGGING_FACE.getApiUrl());
5 changes: 3 additions & 2 deletions src/main/java/org/jabref/model/ai/AiProvider.java
@@ -2,8 +2,10 @@

import java.io.Serializable;

import org.jabref.logic.l10n.Localization;

public enum AiProvider implements Serializable {
OPEN_AI("OpenAI", "https://api.openai.com/v1", "https://openai.com/policies/privacy-policy/"),
OPEN_AI_COMPATIBLE(Localization.lang("OpenAI (or API compatible)"), "https://api.openai.com/v1", "https://openai.com/policies/privacy-policy/"),
MISTRAL_AI("Mistral AI", "https://api.mistral.ai/v1", "https://mistral.ai/terms/#privacy-policy"),
GEMINI("Gemini", "https://generativelanguage.googleapis.com/v1beta/", "https://ai.google.dev/gemini-api/terms"),
HUGGING_FACE("Hugging Face", "https://huggingface.co/api", "https://huggingface.co/privacy"),
@@ -35,4 +37,3 @@ public String toString() {
return label;
}
}

4 changes: 3 additions & 1 deletion src/main/resources/l10n/JabRef_en.properties
@@ -2610,7 +2610,9 @@ The\ attached\ file(s)\ are\ currently\ being\ processed\ by\ %0.\ Once\ complet
Waiting\ summary\ for\ %0...=Waiting summary for %0...
Additionally,\ we\ use\ Deep\ Java\ Library\ (DJL)\ embedding\ models\ for\ both\ chatting\ and\ summarization.\ The\ embedding\ model\ will\ be\ downloaded\ in\ background\ (size\ %0)\ from\ Deep\ Java\ Library\ servers\ anonymously.=Additionally, we use Deep Java Library (DJL) embedding models for both chatting and summarization. The embedding model will be downloaded in background (size %0) from Deep Java Library servers anonymously.
An\ API\ key\ has\ to\ be\ provided=An API key has to be provided
Current\ AI\ model\:\ %0.\ The\ AI\ may\ generate\ inaccurate\ or\ inappropriate\ responses.\ Please\ verify\ any\ information\ provided.=Current AI model: %0. The AI may generate inaccurate or inappropriate responses. Please verify any information provided.
A\ custom\ or\ local\ AI\ model\ is\ used,\ JabRef\ is\ not\ responsible\ for\ the\ content\ generated\ by\ the\ model.=A custom or local AI model is used, JabRef is not responsible for the content generated by the model.
Current\ AI\ model\:\ %0.\ This\ content\ is\ generated\ by\ an\ AI\ system.\ Please\ verify\ its\ accuracy\ and\ relevance\ before\ use.=Current AI model: %0. This content is generated by an AI system. Please verify its accuracy and relevance before use.
OpenAI\ (or\ API\ compatible)=OpenAI (or API compatible)
Delete\ message\ from\ chat\ history=Delete message from chat history
Generated\ at\ %0\ by\ %1=Generated at %0 by %1
Retry=Retry
@@ -41,14 +41,14 @@ void tearDown() {

@Test
void set() {
summariesStorage.set(bibPath, "citationKey", new Summary(LocalDateTime.now(), AiProvider.OPEN_AI, "model", "contents"));
summariesStorage.set(bibPath, "citationKey", new Summary(LocalDateTime.now(), AiProvider.OPEN_AI_COMPATIBLE, "model", "contents"));
reopen();
assertEquals(Optional.of("contents"), summariesStorage.get(bibPath, "citationKey").map(Summary::content));
}

@Test
void clear() {
summariesStorage.set(bibPath, "citationKey", new Summary(LocalDateTime.now(), AiProvider.OPEN_AI, "model", "contents"));
summariesStorage.set(bibPath, "citationKey", new Summary(LocalDateTime.now(), AiProvider.OPEN_AI_COMPATIBLE, "model", "contents"));
reopen();
summariesStorage.clear(bibPath, "citationKey");
reopen();