Merge pull request #138 from TitanStar73/patch-1
Update README.md
mxyng authored May 8, 2024
2 parents 2b66ade + e5c4799 · commit a8cb34e
Showing 1 changed file with 14 additions and 14 deletions.
README.md (28 changes: 14 additions & 14 deletions)
@@ -30,7 +30,7 @@ pip install ollama

```python
import ollama
-response = ollama.chat(model='llama2', messages=[
+response = ollama.chat(model='llama3', messages=[
  {
    'role': 'user',
    'content': 'Why is the sky blue?',
@@ -47,7 +47,7 @@ Response streaming can be enabled by setting `stream=True`, modifying function c
import ollama

stream = ollama.chat(
-    model='llama2',
+    model='llama3',
    messages=[{'role': 'user', 'content': 'Why is the sky blue?'}],
    stream=True,
)
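The hunk ends before the loop that consumes the generator. A minimal consumption sketch, continuing the snippet above and mirroring the async streaming example at the end of this diff, would be:

```python
# Each item yielded by the generator is a partial chat response.
for chunk in stream:
  print(chunk['message']['content'], end='', flush=True)
```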
@@ -63,13 +63,13 @@ The Ollama Python library's API is designed around the [Ollama REST API](https:/
### Chat

```python
-ollama.chat(model='llama2', messages=[{'role': 'user', 'content': 'Why is the sky blue?'}])
+ollama.chat(model='llama3', messages=[{'role': 'user', 'content': 'Why is the sky blue?'}])
```
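The non-streaming call returns the full response object. A brief usage sketch, assuming the response shape implied by the streaming loops elsewhere in this diff:

```python
import ollama

response = ollama.chat(model='llama3', messages=[{'role': 'user', 'content': 'Why is the sky blue?'}])
# The assistant's reply is nested under the 'message' key, as in the streaming examples.
print(response['message']['content'])
```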

### Generate

```python
-ollama.generate(model='llama2', prompt='Why is the sky blue?')
+ollama.generate(model='llama3', prompt='Why is the sky blue?')
```
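A short usage sketch: `generate` takes a bare prompt rather than a message list; the key holding the completion text is assumed here from the Ollama REST API's `/api/generate` response shape:

```python
import ollama

result = ollama.generate(model='llama3', prompt='Why is the sky blue?')
print(result['response'])  # completion text; 'response' key assumed from the REST API
```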

### List
@@ -81,14 +81,14 @@ ollama.list()
### Show

```python
-ollama.show('llama2')
+ollama.show('llama3')
```

### Create

```python
modelfile='''
-FROM llama2
+FROM llama3
SYSTEM You are mario from super mario bros.
'''

@@ -98,31 +98,31 @@ ollama.create(model='example', modelfile=modelfile)
### Copy

```python
-ollama.copy('llama2', 'user/llama2')
+ollama.copy('llama3', 'user/llama3')
```

### Delete

```python
-ollama.delete('llama2')
+ollama.delete('llama3')
```

### Pull

```python
-ollama.pull('llama2')
+ollama.pull('llama3')
```

### Push

```python
-ollama.push('user/llama2')
+ollama.push('user/llama3')
```

### Embeddings

```python
-ollama.embeddings(model='llama2', prompt='The sky is blue because of rayleigh scattering')
+ollama.embeddings(model='llama3', prompt='The sky is blue because of rayleigh scattering')
```
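Usage sketch: the call returns an embedding vector for the prompt; the key name is assumed here from the REST API's `/api/embeddings` response shape:

```python
import ollama

emb = ollama.embeddings(model='llama3', prompt='The sky is blue because of rayleigh scattering')
vector = emb['embedding']  # assumed key: a list of floats sized to the model's embedding width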

## Custom client
@@ -135,7 +135,7 @@ A custom client can be created with the following fields:
```python
from ollama import Client
client = Client(host='http://localhost:11434')
-response = client.chat(model='llama2', messages=[
+response = client.chat(model='llama3', messages=[
  {
    'role': 'user',
    'content': 'Why is the sky blue?',
@@ -151,7 +151,7 @@ from ollama import AsyncClient

async def chat():
  message = {'role': 'user', 'content': 'Why is the sky blue?'}
-  response = await AsyncClient().chat(model='llama2', messages=[message])
+  response = await AsyncClient().chat(model='llama3', messages=[message])

asyncio.run(chat())
```
@@ -164,7 +164,7 @@ from ollama import AsyncClient

async def chat():
  message = {'role': 'user', 'content': 'Why is the sky blue?'}
-  async for part in await AsyncClient().chat(model='llama2', messages=[message], stream=True):
+  async for part in await AsyncClient().chat(model='llama3', messages=[message], stream=True):
    print(part['message']['content'], end='', flush=True)

asyncio.run(chat())
