
LLM-based Honeypots issues #1729

Closed
MikeHorn-git opened this issue Jan 7, 2025 · 5 comments
Labels
cannot reproduce · no basic support info (Please follow the guidelines so we can help)

Comments

@MikeHorn-git

MikeHorn-git commented Jan 7, 2025

Introduction

Hello, I am encountering issues with Beelzebub and Galah.
I don't know yet whether they are mistakes on my side, a T-Pot bug, or both.

⚠️ Basic support information (commands are expected to run as root)

We happily take the time to improve T-Pot and take care of things, but we need you to take the time to create an issue that provides us with all the information we need.

  • What OS are you T-Pot running on?
    Ubuntu Server
  • What is the version of the OS lsb_release -a and uname -a?
    Linux host 6.8.0-51-generic 52-Ubuntu SMP PREEMPT_DYNAMIC Thu Dec 5 13:09:44 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux
  • What T-Pot version are you currently using (only T-Pot 24.04.x is currently supported)?
    24.04.1
  • What architecture are you running on (i.e. hardware, cloud, VM, etc.)?
    Virtualbox
  • Did you modify any scripts or configs? If yes, please attach the changes.
Yes, let's start with the setup:

Env

BEELZEBUB_LLM_MODEL: "ollama"
BEELZEBUB_LLM_HOST: "http://localhost:11434/api/chat"
BEELZEBUB_OLLAMA_MODEL: "lamma3.2"

GALAH_LLM_PROVIDER: "ollama"
GALAH_LLM_SERVER_URL: "http://localhost:11434"
GALAH_LLM_MODEL: "llama3.2"

Ollama

Running with the llama3.2 model at the moment. Here is the standard docker-compose file I use:

---
services:
  ollama:
    image: ollama/ollama
    container_name: ollama
    restart: unless-stopped
    ports:
      - 11434:11434
    volumes:
      - ollama:/root/.ollama
volumes:
  ollama: null

T-Pot Docker-compose

Apart from the commented-out read_only lines (for debugging purposes), I have the default config:

Galah

galah:
    container_name: galah
    restart: always
    depends_on:
      tpotinit:
        condition: service_healthy
    networks:
    - galah_local
    ports:
    - 80:80
    - 443:443
    - 8443:8443
    - 8080:8080
    image: ${TPOT_REPO}/galah:${TPOT_VERSION}
    pull_policy: ${TPOT_PULL_POLICY}
    environment:
      LLM_PROVIDER: ${GALAH_LLM_PROVIDER}
      LLM_SERVER_URL: ${GALAH_LLM_SERVER_URL}
      LLM_MODEL: ${GALAH_LLM_MODEL}
    # read_only: true
    volumes:
    - ${TPOT_DATA_PATH}/galah/cache:/opt/galah/config/cache
    - ${TPOT_DATA_PATH}/galah/cert:/opt/galah/config/cert
    - ${TPOT_DATA_PATH}/galah/log:/opt/galah/log

Beelzebub

beelzebub:
    container_name: beelzebub
    restart: always
    depends_on:
      tpotinit:
        condition: service_healthy
    networks:
    - beelzebub_local
    ports:
    - '22:22'
    image: ${TPOT_REPO}/beelzebub:${TPOT_VERSION}
    pull_policy: ${TPOT_PULL_POLICY}
    environment:
      LLM_MODEL: ${BEELZEBUB_LLM_MODEL}
      LLM_HOST: ${BEELZEBUB_LLM_HOST}
      OLLAMA_MODEL: ${BEELZEBUB_OLLAMA_MODEL}
    read_only: true
    volumes:
    - ${TPOT_DATA_PATH}/beelzebub/key:/opt/beelzebub/configurations/key
    - ${TPOT_DATA_PATH}/beelzebub/log:/opt/beelzebub/configurations/log

Issues

Beelzebub

Steps to reproduce

sudo docker logs beelzebub 

██████  ███████ ███████ ██      ███████ ███████ ██████  ██    ██ ██████  
██   ██ ██      ██      ██         ███  ██      ██   ██ ██    ██ ██   ██ 
██████  █████   █████   ██        ███   █████   ██████  ██    ██ ██████  
██   ██ ██      ██      ██       ███    ██      ██   ██ ██    ██ ██   ██ 
██████  ███████ ███████ ███████ ███████ ███████ ██████   ██████  ██████  
Honeypot Framework, happy hacking!
{"commands":2,"level":"info","msg":"Init service: Wordpress 6.0 LLM","port":":80","timestamp":"2025-01-07T08:43:34Z"}
{"commands":1,"level":"info","msg":"Init service: Apache 401","port":":8080","timestamp":"2025-01-07T08:43:34Z"}
{"commands":1,"level":"info","msg":"GetInstance service ssh","port":":22","timestamp":"2025-01-07T08:43:34Z"}
{"commands":8,"level":"info","msg":"GetInstance service ssh","port":":2222","timestamp":"2025-01-07T08:43:34Z"}
{"banner":"8.0.29","level":"info","msg":"Init service tcp","port":":3306","timestamp":"2025-01-07T08:43:34Z"}
vagrant@host:~$ ssh root@localhost
root@localhost's password: 
root@ubuntu:~$ ls
command not found
root@ubuntu:~$ whoami
command not found
root@ubuntu:~$ exit
Connection to localhost closed.
vagrant@host:~$ sudo docker logs beelzebub 

██████  ███████ ███████ ██      ███████ ███████ ██████  ██    ██ ██████  
██   ██ ██      ██      ██         ███  ██      ██   ██ ██    ██ ██   ██ 
██████  █████   █████   ██        ███   █████   ██████  ██    ██ ██████  
██   ██ ██      ██      ██       ███    ██      ██   ██ ██    ██ ██   ██ 
██████  ███████ ███████ ███████ ███████ ███████ ██████   ██████  ██████  
Honeypot Framework, happy hacking!
{"commands":2,"level":"info","msg":"Init service: Wordpress 6.0 LLM","port":":80","timestamp":"2025-01-07T08:43:34Z"}
{"commands":1,"level":"info","msg":"Init service: Apache 401","port":":8080","timestamp":"2025-01-07T08:43:34Z"}
{"commands":1,"level":"info","msg":"GetInstance service ssh","port":":22","timestamp":"2025-01-07T08:43:34Z"}
{"commands":8,"level":"info","msg":"GetInstance service ssh","port":":2222","timestamp":"2025-01-07T08:43:34Z"}
{"banner":"8.0.29","level":"info","msg":"Init service tcp","port":":3306","timestamp":"2025-01-07T08:43:34Z"}
{"client":"SSH-2.0-OpenSSH_9.6p1 Ubuntu-3ubuntu13.5","dest_port":"22","level":"info","message":"New SSH attempt","msg":"New SSH attempt","password":"","protocol":"SSH","service":"SSH interactive LLM","session":"d8ed1e75-f177-4ea0-8a00-60161ff3f2be","src_ip":"172.25.0.1","src_port":"33776","status":"Stateless","timestamp":"2025-01-07T09:10:10Z","username":"root"}
{"client_version":"SSH-2.0-OpenSSH_9.6p1 Ubuntu-3ubuntu13.5","dest_port":"22","environ":"LANG=en_US.UTF-8","input":"","level":"info","message":"New SSH Session","msg":"New SSH Session","protocol":"SSH","service":"SSH interactive LLM","session":"72de046e-8fd6-4f85-bd5c-e616eb13e342","src_ip":"172.25.0.1","src_port":"33776","status":"Start","timestamp":"2025-01-07T09:10:10Z","username":"root"}
{"level":"error","msg":"Error ExecuteModel: ls, Post \"http://localhost:11434/api/chat\": dial tcp [::1]:11434: connect: connection refused","timestamp":"2025-01-07T09:10:11Z"}
{"dest_port":"22","input":"ls","input_duration":"1.16s","level":"info","message":"New SSH Terminal Session","msg":"New SSH Terminal Session","output":"command not found","protocol":"SSH","service":"SSH interactive LLM","session":"72de046e-8fd6-4f85-bd5c-e616eb13e342","src_ip":"172.25.0.1","src_port":"33776","status":"Interaction","timestamp":"2025-01-07T09:10:11Z"}
{"level":"error","msg":"Error ExecuteModel: whoami, Post \"http://localhost:11434/api/chat\": dial tcp [::1]:11434: connect: connection refused","timestamp":"2025-01-07T09:10:14Z"}
{"dest_port":"22","input":"whoami","input_duration":"3.09s","level":"info","message":"New SSH Terminal Session","msg":"New SSH Terminal Session","output":"command not found","protocol":"SSH","service":"SSH interactive LLM","session":"72de046e-8fd6-4f85-bd5c-e616eb13e342","src_ip":"172.25.0.1","src_port":"33776","status":"Interaction","timestamp":"2025-01-07T09:10:14Z"}
{"dest_port":"22","level":"info","message":"End SSH Session","msg":"End SSH Session","protocol":"SSH","session":"72de046e-8fd6-4f85-bd5c-e616eb13e342","session_duration":"5.53s","src_ip":"172.25.0.1","src_port":"33776","status":"End","timestamp":"2025-01-07T09:10:15Z"}
vagrant@host:~$ curl -X POST http://localhost:11434/api/chat
{"error":"missing request body"}vagrant@host:~$

Ollama Docker logs

vagrant@host:~$ sudo docker logs ollama
[...]
[GIN-debug] POST   /api/pull                 --> github.com/ollama/ollama/server.(*Server).PullHandler-fm (5 handlers)
[GIN-debug] POST   /api/generate             --> github.com/ollama/ollama/server.(*Server).GenerateHandler-fm (5 handlers)
[GIN-debug] POST   /api/chat                 --> github.com/ollama/ollama/server.(*Server).ChatHandler-fm (5 handlers)
[GIN-debug] POST   /api/embed                --> github.com/ollama/ollama/server.(*Server).EmbedHandler-fm (5 handlers)
[GIN-debug] POST   /api/embeddings           --> github.com/ollama/ollama/server.(*Server).EmbeddingsHandler-fm (5 handlers)
[GIN-debug] POST   /api/create               --> github.com/ollama/ollama/server.(*Server).CreateHandler-fm (5 handlers)
[GIN-debug] POST   /api/push                 --> github.com/ollama/ollama/server.(*Server).PushHandler-fm (5 handlers)
[GIN-debug] POST   /api/copy                 --> github.com/ollama/ollama/server.(*Server).CopyHandler-fm (5 handlers)
[GIN-debug] DELETE /api/delete               --> github.com/ollama/ollama/server.(*Server).DeleteHandler-fm (5 handlers)
[GIN-debug] POST   /api/show                 --> github.com/ollama/ollama/server.(*Server).ShowHandler-fm (5 handlers)
[GIN-debug] POST   /api/blobs/:digest        --> github.com/ollama/ollama/server.(*Server).CreateBlobHandler-fm (5 handlers)
[GIN-debug] HEAD   /api/blobs/:digest        --> github.com/ollama/ollama/server.(*Server).HeadBlobHandler-fm (5 handlers)
[GIN-debug] GET    /api/ps                   --> github.com/ollama/ollama/server.(*Server).PsHandler-fm (5 handlers)
[GIN-debug] POST   /v1/chat/completions      --> github.com/ollama/ollama/server.(*Server).ChatHandler-fm (6 handlers)
[GIN-debug] POST   /v1/completions           --> github.com/ollama/ollama/server.(*Server).GenerateHandler-fm (6 handlers)
[GIN-debug] POST   /v1/embeddings            --> github.com/ollama/ollama/server.(*Server).EmbedHandler-fm (6 handlers)
[GIN-debug] GET    /v1/models                --> github.com/ollama/ollama/server.(*Server).ListHandler-fm (6 handlers)
[GIN-debug] GET    /v1/models/:model         --> github.com/ollama/ollama/server.(*Server).ShowHandler-fm (6 handlers)
[GIN-debug] GET    /                         --> github.com/ollama/ollama/server.(*Server).GenerateRoutes.func1 (5 handlers)
[GIN-debug] GET    /api/tags                 --> github.com/ollama/ollama/server.(*Server).ListHandler-fm (5 handlers)
[GIN-debug] GET    /api/version              --> github.com/ollama/ollama/server.(*Server).GenerateRoutes.func2 (5 handlers)
[GIN-debug] HEAD   /                         --> github.com/ollama/ollama/server.(*Server).GenerateRoutes.func1 (5 handlers)
[GIN-debug] HEAD   /api/tags                 --> github.com/ollama/ollama/server.(*Server).ListHandler-fm (5 handlers)
[GIN-debug] HEAD   /api/version              --> github.com/ollama/ollama/server.(*Server).GenerateRoutes.func2 (5 handlers)
time=2025-01-07T08:41:14.570Z level=INFO source=routes.go:1339 msg="Dynamic LLM libraries" runners="[cuda_v12_avx cpu cpu_avx cpu_avx2 cuda_v11_avx]"
time=2025-01-07T08:41:14.571Z level=INFO source=gpu.go:226 msg="looking for compatible GPUs"
time=2025-01-07T08:41:14.597Z level=INFO source=gpu.go:392 msg="no compatible GPUs were discovered"
time=2025-01-07T08:41:14.597Z level=INFO source=types.go:131 msg="inference compute" id=0 library=cpu variant=avx2 compute="" driver=0.0 name="" total="11.7 GiB" available="10.3 GiB"
[GIN] 2025/01/07 - 09:16:02 | 400 |    2.435106ms |   192.168.144.1 | POST     "/api/chat"
[GIN] 2025/01/07 - 09:38:05 | 400 |      59.913µs |   192.168.144.1 | POST     "/api/chat"

Questions

I don't understand this log line: {"level":"error","msg":"Error ExecuteModel: ls, Post \"http://localhost:11434/api/chat\": dial tcp [::1]:11434: connect: connection refused","timestamp":"2025-01-07T09:10:11Z"}
I can reach the Ollama endpoint, as shown in the steps to reproduce above, and I can SSH into and interact with Beelzebub.

Galah

Steps to reproduce

sudo docker logs galah

 ██████   █████  ██       █████  ██   ██ 
██       ██   ██ ██      ██   ██ ██   ██ 
██   ███ ███████ ██      ███████ ███████ 
██    ██ ██   ██ ██      ██   ██ ██   ██ 
 ██████  ██   ██ ███████ ██   ██ ██   ██ 
  llm-based web honeypot // version 1.0
  	author: Adel "0x4D31" Karimi

time="2025-01-07T09:50:06Z" level=info msg="starting HTTPS server on port 8443 with TLS profile: tls_profile1"
time="2025-01-07T09:50:06Z" level=info msg="starting HTTP server on port 80"
time="2025-01-07T09:50:06Z" level=info msg="starting HTTP server on port 8080"
time="2025-01-07T09:50:06Z" level=info msg="starting HTTPS server on port 443 with TLS profile: tls_profile1"
vagrant@host:~$ curl http://127.0.0.1:80
401 Unauthorizedvagrant@host:~$ curl https://127.0.0.1:443
curl: (60) SSL certificate problem: self-signed certificate
More details here: https://curl.se/docs/sslcerts.html

curl failed to verify the legitimacy of the server and therefore could not
establish a secure connection to it. To learn more about this situation and
how to fix it, please visit the web page mentioned above.
vagrant@host:~$ sudo docker logs galah

 ██████   █████  ██       █████  ██   ██ 
██       ██   ██ ██      ██   ██ ██   ██ 
██   ███ ███████ ██      ███████ ███████ 
██    ██ ██   ██ ██      ██   ██ ██   ██ 
 ██████  ██   ██ ███████ ██   ██ ██   ██ 
  llm-based web honeypot // version 1.0
  	author: Adel "0x4D31" Karimi

time="2025-01-07T09:50:06Z" level=info msg="starting HTTPS server on port 8443 with TLS profile: tls_profile1"
time="2025-01-07T09:50:06Z" level=info msg="starting HTTP server on port 80"
time="2025-01-07T09:50:06Z" level=info msg="starting HTTP server on port 8080"
time="2025-01-07T09:50:06Z" level=info msg="starting HTTPS server on port 443 with TLS profile: tls_profile1"
time="2025-01-07T09:51:26Z" level=info msg="port 80 received a request for \"/\", from source 192.168.80.1:47278"
time="2025-01-07T09:51:26Z" level=info msg="sent the response to 192.168.80.1:47278 (source: static)"
time="2025-01-07T09:51:26Z" level=error msg="error getting enrichment info for \"192.168.80.1\": lookup 1.80.168.192.in-addr.arpa. on 127.0.0.11:53: no such host"
2025/01/07 09:51:31 http: TLS handshake error from 192.168.80.1:53132: local error: tls: bad record MAC

Galah Docker debug

vagrant@host:~$ sudo docker exec -it galah sh
/opt/galah $ cat /etc/resolv.conf 
# Generated by Docker Engine.
# This file can be edited; Docker Engine will not make further changes once it
# has been modified.

nameserver 127.0.0.11
search .
options ndots:0

# Based on host file: '/etc/resolv.conf' (internal resolver)
# ExtServers: [10.0.2.3 195.36.145.100 195.36.228.100]
# Overrides: []
# Option ndots from: internal
/opt/galah $ nslookup google.com
Server:		127.0.0.11
Address:	127.0.0.11:53

Non-authoritative answer:
Name:	google.com
Address: 2a00:1450:4007:819::200e

Non-authoritative answer:
Name:	google.com
Address: 172.217.20.206

/opt/galah $ nslookup 8.8.8.8
Server:		127.0.0.11
Address:	127.0.0.11:53

Non-authoritative answer:
8.8.8.8.in-addr.arpa	name = dns.google

/opt/galah $ nslookup 192.168.80.1
Server:		127.0.0.11
Address:	127.0.0.11:53

** server can't find 1.80.168.192.in-addr.arpa: NXDOMAIN

/opt/galah $ exit
vagrant@host:~$ ip a | grep -A 5 "192.168.80.1"
    inet 192.168.80.1/20 brd 192.168.95.255 scope global br-85fb2a77e994
       valid_lft forever preferred_lft forever
    inet6 fe80::42:92ff:fec7:947/64 scope link 
       valid_lft forever preferred_lft forever

Galah Docker network

sudo docker network inspect tpotce_galah_local
[
    {
        "Name": "tpotce_galah_local",
        "Id": "85fb2a77e994eb2f74f24a008a9049463e8957ea3bc14de9d04df7e64dc516ec",
        "Created": "2025-01-07T10:49:57.918101258+01:00",
        "Scope": "local",
        "Driver": "bridge",
        "EnableIPv6": false,
        "IPAM": {
            "Driver": "default",
            "Options": null,
            "Config": [
                {
                    "Subnet": "192.168.80.0/20",
                    "Gateway": "192.168.80.1"
                }
            ]
        },
        "Internal": false,
        "Attachable": false,
        "Ingress": false,
        "ConfigFrom": {
            "Network": ""
        },
        "ConfigOnly": false,
        "Containers": {
            "99ebb680b1a948e879fc7705e3fb24994ccdd48cc932661d2992f2e9b0fcb380": {
                "Name": "galah",
                "EndpointID": "78d969a4ace0723cac83faf8d8408c4c31d0020a334ba11ef94d8a9278137391",
                "MacAddress": "02:42:c0:a8:50:02",
                "IPv4Address": "192.168.80.2/20",
                "IPv6Address": ""
            }
        },
        "Options": {},
        "Labels": {
            "com.docker.compose.config-hash": "7565fc9455f97f6dafe514a1bfd7b85b2854e077656f4d920473c416a3d86af7",
            "com.docker.compose.network": "galah_local",
            "com.docker.compose.project": "tpotce",
            "com.docker.compose.version": "2.32.1"
        }
    }
]

Questions

I don't understand why the reverse DNS lookup fails with Docker's internal DNS. Here: time="2025-01-07T09:51:26Z" level=error msg="error getting enrichment info for \"192.168.80.1\": lookup 1.80.168.192.in-addr.arpa. on 127.0.0.11:53: no such host"
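For context, the 1.80.168.192.in-addr.arpa name in that log is just the standard PTR form of the source IP. A minimal sketch of how such a name is derived (Python stdlib, illustration only):

```python
import ipaddress

# A reverse (PTR) lookup reverses the IP's octets and appends
# in-addr.arpa, which is exactly the name seen in the Galah error log.
ip = ipaddress.ip_address("192.168.80.1")
print(ip.reverse_pointer)  # 1.80.168.192.in-addr.arpa
```

Docker's embedded resolver at 127.0.0.11 can answer PTR queries for container IPs on the network, but it has no record for the bridge gateway address (and no upstream resolver has one for that private range either), hence the NXDOMAIN seen in the nslookup debug output above.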

By the way, we can also spot a TLS handshake error: 2025/01/07 09:51:31 http: TLS handshake error from 192.168.80.1:53132: local error: tls: bad record MAC. Since this is recent, introduced with the last commit c45cda4f7091b84170319feb6d54a787dcd9a182, I haven't dug into it yet.

Conclusion

Thanks for your work.
Feel free to ask if you need more information.
Cheers.

@github-actions github-actions bot added the no basic support info Please follow the guidelines so we can help label Jan 7, 2025
@t3chn0m4g3
Member

t3chn0m4g3 commented Jan 7, 2025

Please check your Ollama setup / settings, i.e.:
BEELZEBUB_OLLAMA_MODEL: "lamma3.2" => BEELZEBUB_OLLAMA_MODEL: "llama3.2"

It seems there is a communication issue between the containers and your Ollama installation, otherwise our installations work perfectly fine.

Also, I recommend using the default models from the .env.

@MikeHorn-git
Author

Thanks for catching the Beelzebub settings typo.

What is your Ollama setup for your tests?

OK, I'll test with the default .env models first.

@t3chn0m4g3
Member

On the testing environment we are using the Ollama docker image with GPU support, running on a separate instance with a FQDN and on port tcp/11434. That's it, basically running everything on defaults with the FQDN in the .env adjusted of course.

@t3chn0m4g3
Member

t3chn0m4g3 commented Jan 7, 2025

Just noticed, you cannot use localhost, as the container will try to connect locally, not the host. Either use a FQDN or a routable IP. That should do it.
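The point above can be illustrated with a short sketch (plain Python, no Docker required): resolving localhost always yields a loopback address, so a process inside a container dials the container itself, not the host where Ollama's port 11434 is published.

```python
import socket

# "localhost" always resolves to a loopback address (127.0.0.1).
# Inside a container that loopback belongs to the container itself,
# where nothing listens on 11434 -> "connection refused".
addr = socket.gethostbyname("localhost")
print(addr)  # 127.0.0.1
```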

@MikeHorn-git
Author

Just noticed, you cannot use localhost, as the container will try to connect locally, not the host. Either use a FQDN or a routable IP. That should do it.

Thanks, that works. I'm going to try running Ollama on another instance instead of on T-Pot itself. Thanks for your quick responses.
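For anyone landing here later, the resulting fix sketched against the Env section above would look something like this (192.168.1.10 is a placeholder; substitute your Docker host's routable IP or FQDN):

```yaml
BEELZEBUB_LLM_HOST: "http://192.168.1.10:11434/api/chat"
GALAH_LLM_SERVER_URL: "http://192.168.1.10:11434"
```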
