[chore] format code and fix lint issues (#437)
This PR formats the entire repo and fixes or suppresses all lint issues
by running the following commands:

- `npm run format`
- `npm run lint`
Neet-Nestor committed May 30, 2024
1 parent 0ba1548 commit 8fe80bd
Showing 61 changed files with 1,845 additions and 1,369 deletions.
8 changes: 5 additions & 3 deletions .eslintrc.cjs
@@ -5,13 +5,15 @@ module.exports = {
  root: true,
  rules: {
    "@typescript-eslint/no-explicit-any": "off",
-   "@typescript-eslint/no-empty-function": "off"
+   "@typescript-eslint/no-empty-function": "off",
+   "@typescript-eslint/no-non-null-assertion": "off",
  },
  overrides: [
    {
-     "files": ["examples/**/*.js"],
+     "files": ["examples/**/*.js", "examples/**/*.ts"],
      "rules": {
-       "no-undef": "off"
+       "no-undef": "off",
+       "@typescript-eslint/no-unused-vars": "off"
      }
    }
  ]
13 changes: 10 additions & 3 deletions examples/README.md
@@ -8,39 +8,46 @@ Please send a pull request if you find things that belong here.
Note that all examples below run in-browser and use WebGPU as a backend.

#### Project List

- [get-started](get-started): a minimal getting-started example with chat completion (see the sketch after this list).

[![Open on JSFiddle](https://img.shields.io/badge/open-JSFiddle-blue?logo=jsfiddle&logoColor=white)](https://jsfiddle.net/neetnestor/yac9gbwf/)
[![Open on Codepen](https://img.shields.io/badge/open-codepen-gainsboro?logo=codepen)](https://codepen.io/neetnestor/pen/NWVdgey)

- [simple-chat-js](simple-chat-js): a minimal yet complete chatbot app in vanilla JavaScript.

[![Open on JSFiddle](https://img.shields.io/badge/open-JSFiddle-blue?logo=jsfiddle&logoColor=white)](https://jsfiddle.net/neetnestor/4nmgvsa2/)
[![Open on Codepen](https://img.shields.io/badge/open-codepen-gainsboro?logo=codepen)](https://codepen.io/neetnestor/pen/vYwgZaG)

- [simple-chat-ts](simple-chat-ts): a minimal yet complete chatbot app in TypeScript.
- [get-started-web-worker](get-started-web-worker): same as get-started, but using a web worker.
- [next-simple-chat](next-simple-chat): a minimal yet complete chatbot app built with [Next.js](https://nextjs.org/).
- [multi-round-chat](multi-round-chat): the APIs are purely functional, but we optimize internally so that multi-round chat reuses the KV cache.
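
For orientation, here is a minimal sketch of what the get-started flow boils down to. It assumes the `CreateMLCEngine` factory and the OpenAI-style `engine.chat.completions.create()` call exported by `@mlc-ai/web-llm`; the model ID is a placeholder, so pick any entry from `prebuiltAppConfig.model_list`.

```ts
import { CreateMLCEngine } from "@mlc-ai/web-llm";

async function main() {
  // Download (or reuse from cache) and compile the model; progress is reported via the callback.
  const engine = await CreateMLCEngine("Llama-3-8B-Instruct-q4f32_1-MLC" /* placeholder model ID */, {
    initProgressCallback: (report) => console.log(report.text),
  });

  // OpenAI-style chat completion.
  const reply = await engine.chat.completions.create({
    messages: [
      { role: "system", content: "You are a helpful assistant." },
      { role: "user", content: "Explain WebGPU in one sentence." },
    ],
  });
  console.log(reply.choices[0].message.content);
}

main();
```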

#### Advanced OpenAI API Capabilities

These examples demonstrate various capabilities via WebLLM's OpenAI-like API.

- [streaming](streaming): returns output chunks in real time in the form of an `AsyncGenerator` (see the sketch after this list).
- [json-mode](json-mode): efficiently ensures the output is in JSON format; see the [OpenAI Reference](https://platform.openai.com/docs/guides/text-generation/chat-completions-api) for more.
- [json-schema](json-schema): besides guaranteeing JSON output, ensures the output adheres to a specific JSON schema specified by the user.
- [function-calling](function-calling) (WIP): function calling with the `tools` and `tool_choice` fields.
- [seed-to-reproduce](seed-to-reproduce): uses the `seed` field to make output reproducible.
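
As a rough sketch of the streaming flow (assuming the OpenAI-style `stream: true` option, which yields an async iterable of chunks; the model ID below is a placeholder):

```ts
import { CreateMLCEngine } from "@mlc-ai/web-llm";

async function streamDemo() {
  const engine = await CreateMLCEngine("Llama-3-8B-Instruct-q4f32_1-MLC"); // placeholder model ID

  // With stream: true, the call returns chunks instead of a single response.
  const chunks = await engine.chat.completions.create({
    messages: [{ role: "user", content: "Write a haiku about WebGPU." }],
    stream: true,
  });

  let text = "";
  for await (const chunk of chunks) {
    text += chunk.choices[0]?.delta?.content ?? ""; // each chunk carries a small delta
  }
  console.log(text);
}

streamDemo();
```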

#### Chrome Extension

- [chrome-extension](chrome-extension): a Chrome extension that does not have a persistent background
- [chrome-extension-webgpu-service-worker](chrome-extension-webgpu-service-worker): a Chrome extension using a service worker, hence having a persistent background

#### Others

- [logit-processor](logit-processor): while `logit_bias` is supported, we additionally support stateful logit processing where users can specify their own rules. We also expose the low-level API `forwardTokensAndSample()`.
- [cache-usage](cache-usage): demonstrates how WebLLM supports both the [Cache API](https://developer.mozilla.org/en-US/docs/Web/API/Cache) and [IndexedDB cache](https://developer.mozilla.org/en-US/docs/Web/API/IndexedDB_API), and
  users can pick between them with `appConfig.useIndexedDBCache`. Also demonstrates various cache utilities such as checking
  whether a model is cached, deleting a model's weights from the cache, deleting a model's library wasm from the cache, etc. (see the sketch after this list).
- [simple-chat-upload](simple-chat-upload): demonstrates how to upload local models to WebLLM instead of downloading them from a URL
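
The cache workflow described in the cache-usage entry above roughly looks like the sketch below. The utility names `hasModelInCache` and `deleteModelAllInfoInCache` are assumptions based on that description; check the example's source for the exact exports.

```ts
import * as webllm from "@mlc-ai/web-llm";

async function cacheDemo() {
  // useIndexedDBCache: true selects IndexedDB; false (the default) selects the Cache API.
  const appConfig: webllm.AppConfig = {
    ...webllm.prebuiltAppConfig,
    useIndexedDBCache: true,
  };
  const modelId = "Llama-3-8B-Instruct-q4f32_1-MLC"; // placeholder model ID

  // Creating an engine downloads the model into the selected cache, or reuses it if already there.
  await webllm.CreateMLCEngine(modelId, { appConfig });

  // Cache utilities demonstrated by the example (names assumed from its description).
  console.log("cached?", await webllm.hasModelInCache(modelId, appConfig));
  await webllm.deleteModelAllInfoInCache(modelId, appConfig);
}

cacheDemo();
```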

## Demo Spaces

- [web-llm-embed](https://huggingface.co/spaces/matthoffner/web-llm-embed): document chat prototype using react-llm with transformers.js embeddings
- [DeVinci](https://x6occ-biaaa-aaaai-acqzq-cai.icp0.io/): AI chat app based on WebLLM and hosted on decentralized cloud platform
3 changes: 1 addition & 2 deletions examples/cache-usage/README.md
@@ -6,10 +6,9 @@ demonstrate the utility cache functions such as deleting models, checking if model

For more information about the two caches, see: https://developer.mozilla.org/en-US/docs/Web/API/Storage_API/Storage_quotas_and_eviction_criteria#what_technologies_store_data_in_the_browser.

To inspect the downloaded artifacts in your browser, open the developer console, go to the Application tab,
and you will find the artifacts under either `IndexedDB` or `Cache storage`.

To run the example, follow the steps under this folder.
38 changes: 19 additions & 19 deletions examples/cache-usage/src/cache_usage.html
@@ -1,24 +1,24 @@
-<!DOCTYPE html>
+<!doctype html>
<html>
-  <script>
-    webLLMGlobal = {}
-  </script>
+  <script>
+    webLLMGlobal = {};
+  </script>

-  <body>
-    <h2>WebLLM Test Page</h2>
-    Open console to see output
-    </br>
-    </br>
-    <label id="init-label"> </label>
+  <body>
+    <h2>WebLLM Test Page</h2>
+    Open console to see output
+    <br />
+    <br />
+    <label id="init-label"> </label>

-    <h3>Prompt</h3>
-    <label id="prompt-label"> </label>
+    <h3>Prompt</h3>
+    <label id="prompt-label"> </label>

-    <h3>Response</h3>
-    <label id="generate-label"> </label>
-    </br>
-    <label id="stats-label"> </label>
+    <h3>Response</h3>
+    <label id="generate-label"> </label>
+    <br />
+    <label id="stats-label"> </label>

-  <script type="module" src="./cache_usage.ts"></script>
-
-</html>
+    <script type="module" src="./cache_usage.ts"></script>
+  </body>
+</html>
14 changes: 7 additions & 7 deletions examples/chrome-extension-webgpu-service-worker/README.md
@@ -2,28 +2,28 @@

![Chrome Extension](https://github.com/mlc-ai/mlc-llm/assets/11940172/0d94cc73-eff1-4128-a6e4-70dc879f04e0)


> [!WARNING]
> Service worker support in WebGPU is enabled by default in [Chrome 124](https://chromiumdash.appspot.com/commit/8d78510e4aca5ac3cd8ee4a33e96b404eaa43246).
> If you are using Chrome 123, go to `chrome://flags/#enable-experimental-web-platform-features`, enable the `#enable-experimental-web-platform-features` flag, and **relaunch the browser**.

This example shows how we can create a Chrome extension using WebGPU and a service worker.

- The project structure is as follows:
  - `manifest.json`: A required file that lists important information about the structure and behavior of the extension. Here we use Manifest V3.
  - `popup.ts`: Script for the extension's pop-up window (a pop-up-side sketch follows at the end of this README section).
  - `background.ts`: Script for the service worker. An extension service worker is loaded when it is needed and unloaded when it goes dormant.
  - `content.js`: Content script that interacts with the DOM.
- Run

```bash
npm install
npm run build
```

This will create a new directory at `./dist/`. To load the extension into Chrome, go to Extensions > Manage Extensions and select Load Unpacked. Add the `./dist/` directory. You can now pin the extension to your toolbar and use it to chat with your favorite model!

**Note**: This example disables chatting using the contents of the active tab by default.
To enable it, set `useContext` in `popup.ts` to `true`. More info about this feature can be found
[here](https://github.com/mlc-ai/web-llm/pull/190).
However, if the web content is too large, it might run into issues. We recommend using `example.html` to
test this feature.
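
For completeness, a rough sketch of how the pop-up side might talk to this service worker. It assumes `CreateExtensionServiceWorkerMLCEngine` is the pop-up-side counterpart of the `ExtensionServiceWorkerMLCEngineHandler` used in `background.ts`; the example's `popup.ts` is the authoritative reference.

```ts
// popup.ts-style sketch (not the example's actual code)
import { CreateExtensionServiceWorkerMLCEngine } from "@mlc-ai/web-llm";

async function ask(question: string): Promise<string> {
  // Connects to the extension service worker, which loads and runs the model.
  const engine = await CreateExtensionServiceWorkerMLCEngine(
    "Llama-3-8B-Instruct-q4f32_1-MLC", // placeholder model ID
  );
  const reply = await engine.chat.completions.create({
    messages: [{ role: "user", content: question }],
  });
  return reply.choices[0].message.content ?? "";
}

ask("Summarize this page in one sentence.").then(console.log);
```
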
@@ -1,4 +1,7 @@
-import { ExtensionServiceWorkerMLCEngineHandler, MLCEngine } from "@mlc-ai/web-llm";
+import {
+  ExtensionServiceWorkerMLCEngineHandler,
+  MLCEngine,
+} from "@mlc-ai/web-llm";

// Hookup an engine to a service worker handler
const engine = new MLCEngine();
10 changes: 5 additions & 5 deletions examples/chrome-extension-webgpu-service-worker/src/content.js
@@ -1,6 +1,6 @@
// Only the content script is able to access the DOM
-chrome.runtime.onConnect.addListener(function(port) {
-  port.onMessage.addListener(function(msg) {
-    port.postMessage({contents: document.body.innerHTML});
-  });
-});
+chrome.runtime.onConnect.addListener(function (port) {
+  port.onMessage.addListener(function (msg) {
+    port.postMessage({ contents: document.body.innerHTML });
+  });
+});
31 changes: 20 additions & 11 deletions examples/chrome-extension-webgpu-service-worker/src/example.html
@@ -1,11 +1,20 @@
In the year 2154, humanity had colonized several planets in the distant reaches
of the galaxy. The planet of Xylophia-IV was one of the most remote and
inhospitable, with temperatures often dropping to -200 degrees Celsius. Despite
these harsh conditions, a team of scientists had established a research station
on the planet to study the unique geological formations and exotic flora and
fauna. One day, while conducting a routine survey of the planet's surface, the
team discovered a strange object buried deep in the ice. As they examined it
closer, they realized it was a small, metallic capsule with a glowing blue
symbol etched onto its surface. The team's leader, a brilliant scientist named
Dr. Maria Rodriguez, was immediately intrigued by the capsule's mysterious
origins. She ordered her team to bring it back to the research station for
further analysis. After weeks of studying the capsule, the team finally cracked
the code to the symbol etched onto its surface. It was a message from an alien
race, warning Earth of an impending attack from an unknown threat. The team was
shocked and dismayed by the news, but they knew they had to act quickly to warn
the rest of humanity. They transmitted the message to the nearest space station,
which relayed it to Earth's government. As the threat of attack loomed near, the
team remained on high alert, ready to face whatever dangers lay ahead. They had
uncovered the secrets of the universe, and now they were determined to protect
their planet and its inhabitants at all costs.
@@ -26,9 +26,5 @@
"service_worker": "background.ts",
"type": "module"
},
"permissions": [
"storage",
"tabs",
"webNavigation"
]
}
"permissions": ["storage", "tabs", "webNavigation"]
}