
Communication problem between frontend and backend in K8S cluster #1085

Closed
ivanfr90 opened this issue Sep 23, 2024 · 6 comments

@ivanfr90

Hi.

We are experiencing an issue in a deployment on a K8S cluster, following the SC4SNMP GUI guide.

The pods are running without problems, but after accessing the application through the browser, a communication error is displayed on the screen within a few seconds:

[screenshot: communication error message shown in the GUI]

It's also impossible to save any configuration; no action taken through the GUI appears to have any effect.

The SC4SNMP GUI is deployed in a K8S cluster:

[screenshot: SC4SNMP UI pods running in the cluster]

Environment:

  • Tested on different K8S clusters with the same problem
  • SC4SNMP Version: 1.11.0
  • UI Config:
UI:
  enable: true
  frontEnd:
    NodePort: 30001
    pullPolicy: "Always"
  backEnd:
    NodePort: 30002
    pullPolicy: "Always"
  valuesFileDirectory: "/opt/sc4snmp/gui"
  valuesFileName: ""
  keepSectionFiles: true
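
For anyone debugging the same symptom: with this config the browser loads the frontend from NodePort 30001, and the frontend then calls the backend directly on NodePort 30002, so both ports have to be reachable from the client machine, not just from inside the cluster. A minimal reachability sketch, assuming the chart is installed in the sc4snmp namespace (<node-ip> is a placeholder for a cluster node address):

# Confirm the UI services actually expose the NodePorts from the values file
kubectl -n sc4snmp get svc | grep -i ui

# From the machine running the browser, check both ports on a node IP
curl -sv http://<node-ip>:30001/ -o /dev/null    # frontend
curl -sv http://<node-ip>:30002/ -o /dev/null    # backend API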

No issues were detected in the logs:

Logs ui-backend-deployment:

Defaulted container "ui-backend" out of: ui-backend, patch-log-dirs (init)
[2024-09-16 07:42:24 +0000] [7] [INFO] Starting gunicorn 22.0.0
[2024-09-16 07:42:24 +0000] [7] [INFO] Listening at: http://0.0.0.0:5000 (7)
[2024-09-16 07:42:24 +0000] [7] [INFO] Using worker: sync
[2024-09-16 07:42:24 +0000] [8] [INFO] Booting worker with pid: 8
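
The gunicorn lines above only show that the backend process is up and listening on port 5000. To separate "backend is unhealthy" from "the path from the browser is broken", the backend can be probed from inside the cluster; a sketch, assuming the deployment is named ui-backend-deployment as in the heading above and lives in the sc4snmp namespace:

# Forward the backend's gunicorn port (5000, per the log) to localhost
kubectl -n sc4snmp port-forward deploy/ui-backend-deployment 5000:5000 &

# Any HTTP response here means the backend itself is fine and the problem
# is on the network path between the browser and NodePort 30002
curl -sv http://localhost:5000/ -o /dev/null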

Logs ui-backend-worker-deployment:

 -------------- celery@ui-backend-worker-deployment-58cb744585-8hx94 v5.2.7 (dawn-chorus)
--- ***** -----
-- ******* ---- Linux-4.18.0-553.16.1.el8_10.x86_64-x86_64-with-glibc2.36 2024-09-16 07:42:16
- *** --- * ---
- ** ---------- [config]
- ** ---------- .> app:         SC4SNMP_UI_backend:0x7f19d47a05e0
- ** ---------- .> transport:   redis://snmp-redis-headless:6379/2
- ** ---------- .> results:     disabled://
- *** --- * --- .> concurrency: 2 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
 -------------- [queues]
                .> apply_changes    exchange=apply_changes(direct) key=apply_changes


[tasks]
  . SC4SNMP_UI_backend.apply_changes.tasks.run_job

[2024-09-16 07:42:17,556: INFO/MainProcess] Connected to redis://snmp-redis-headless:6379/2
[2024-09-16 07:42:17,720: INFO/MainProcess] mingle: searching for neighbors
[2024-09-16 07:42:18,821: INFO/MainProcess] mingle: sync with 1 nodes
[2024-09-16 07:42:18,821: INFO/MainProcess] mingle: sync complete
[2024-09-16 07:42:18,852: INFO/MainProcess] celery@ui-backend-worker-deployment-58cb744585-8hx94 ready.
[2024-09-16 07:42:48,844: INFO/MainProcess] missed heartbeat from celery@ui-backend-worker-deployment-5cc48f5c54-mm4jp
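
One detail worth noting: the missed heartbeat in the last line comes from a pod with a different ReplicaSet hash (5cc48f5c54 vs the running 58cb744585), which usually points at a leftover worker from a previous rollout rather than a live fault. A quick check, again assuming the sc4snmp namespace:

# If only the 58cb744585 pod is listed, the missed heartbeat came from an
# already-terminated replica and can be ignored
kubectl -n sc4snmp get pods | grep ui-backend-worker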

Logs ui-frontend-deployment:

yarn run v1.22.19
$ webpack-dev-server --config demo/webpack.standalone.config.js --port ${DEMO_PORT-8080} --host 0.0.0.0
<i> [webpack-dev-server] Project is running at:
<i> [webpack-dev-server] Loopback: http://localhost:30001/
<i> [webpack-dev-server] On Your Network (IPv4): http://10.4.4.4:30001/
<i> [webpack-dev-server] Content not from webpack is served from '/frontend/packages/manager/public' directory
(node:28) [DEP_WEBPACK_COMPILATION_ASSETS] DeprecationWarning: Compilation.assets will be frozen in future, all modifications are deprecated.
BREAKING CHANGE: No more changes should happen to Compilation.assets after sealing the Compilation.
        Do changes to assets earlier, e. g. in Compilation.hooks.processAssets.
        Make sure to select an appropriate stage from Compilation.PROCESS_ASSETS_STAGE_*.
(Use `node --trace-deprecation ...` to show where the warning was created)
assets by path *.otf 370 KiB
  asset 62d4d7d369292a9bf23762465ec6d704.otf 94.4 KiB [emitted] (auxiliary name: main)
  asset b4f9eb8ce027016ab9b9860817451d07.otf 93.7 KiB [emitted] (auxiliary name: main)
  asset 410504d49238e955ba7dc23a7f963021.otf 92.4 KiB [emitted] (auxiliary name: main)
  asset 6a386899746222073dd64c5f74d1a69d.otf 89.8 KiB [emitted] (auxiliary name: main)
asset main.js?7eac4df873559deeb723 12.4 MiB [emitted] [immutable] (name: main)
asset index.html 427 bytes [emitted]
runtime modules 27.5 KiB 14 modules
orphan modules 681 bytes [orphan] 1 module
modules by path ../../node_modules/ 4.07 MiB 550 modules
modules by path ./ 277 KiB
  modules by path ./src/ 249 KiB 43 modules
  modules by path ./node_modules/qs/lib/*.js 26.2 KiB
    ./node_modules/qs/lib/index.js 211 bytes [built] [code generated]
    ./node_modules/qs/lib/stringify.js 9.72 KiB [built] [code generated]
    ./node_modules/qs/lib/formats.js 476 bytes [built] [code generated]
    ./node_modules/qs/lib/parse.js 9.18 KiB [built] [code generated]
    ./node_modules/qs/lib/utils.js 6.66 KiB [built] [code generated]
  ./demo/demo.jsx 1.14 KiB [built] [code generated]
  ./util.inspect (ignored) 15 bytes [built] [code generated]
webpack 5.87.0 compiled successfully in 10136 ms
assets by status 370 KiB [cached] 4 assets
assets by status 12.4 MiB [emitted]
  assets by path *.js 12.4 MiB
    asset main.js?c25e383efe5ec4ddbffa 12.4 MiB [emitted] [immutable] (name: main)
    asset main.6c418420d48e2d666e73.hot-update.js 851 bytes [emitted] [immutable] [hmr] (name: main)
  asset index.html 427 bytes [emitted]
  asset main.6c418420d48e2d666e73.hot-update.json 28 bytes [emitted] [immutable] [hmr]
Entrypoint main 12.4 MiB (370 KiB) = main.js?c25e383efe5ec4ddbffa 12.4 MiB main.6c418420d48e2d666e73.hot-update.js 851 bytes 4 auxiliary assets
cached modules 4.34 MiB [cached] 601 modules
runtime modules 27.5 KiB 14 modules
webpack 5.87.0 compiled successfully in 279 ms

Thanks.

@ikheifets-splunk ikheifets-splunk self-assigned this Sep 30, 2024
@ikheifets-splunk
Contributor

ikheifets-splunk commented Oct 1, 2024

Hello, @ivanfr90 !
Can we make a call regarding this issue? I have some questions. I will be available this Friday 14:00-20:00 CET. Please send me an invite.

@ikheifets-splunk
Contributor

ikheifets-splunk commented Oct 4, 2024

On the call we discussed the things that need to be done:

  1. In the browser, go to inspector -> network, find which HTTP request is failing, and attach the response of that request
  2. Enable OTel so the logs are visible in Splunk, because we don't have direct access to the servers running SC4SNMP.
  3. Attach the files that have been created in /opt/sc4snmp/gui by the SC4SNMP UI (configured below; see the collection sketch after the config):
UI:
 enable: true
 frontEnd:
   NodePort: 30001
   pullPolicy: "Always"
 backEnd:
   NodePort: 30002
   pullPolicy: "Always"
 valuesFileDirectory: "/opt/sc4snmp/gui"
 valuesFileName: ""
 keepSectionFiles: true
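
A sketch for collecting those files, assuming valuesFileDirectory is a host path on the node where the backend pod runs (adjust the namespace if yours differs):

# Find which node hosts the backend pod
kubectl -n sc4snmp get pods -o wide | grep ui-backend

# Then, on that node, pack everything the UI wrote for attaching to the issue
tar czf sc4snmp-gui-files.tgz -C /opt/sc4snmp gui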

@ikheifets-splunk
Contributor

@ivanfr90 once you have collected this info, please schedule a call on Thu/Fri

@ikheifets-splunk
Contributor

@ivanfr90 please let me know whether you can provide these details. We need them to reproduce the problem; after that we can schedule a meeting to troubleshoot it :)

@ivanfr90
Author

Hi @ikheifets-splunk.

After following your recommendations we were able to determine that the problem was a network configuration issue in the K8S environment. After applying the needed changes it's now working fine. I'm very grateful for your help and support.

You can mark this as resolved, thanks!!
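
For future readers hitting the same error: the fix here was a network configuration change in the cluster. A quick sweep of the usual suspects for this class of failure, with the firewalld commands assuming RHEL-family nodes (the kernel string in the worker log is el8):

# Anything cluster-side filtering traffic to the UI pods?
kubectl get networkpolicy -A

# On each node, make sure the two UI NodePorts are open
sudo firewall-cmd --list-ports
sudo firewall-cmd --permanent --add-port=30001-30002/tcp && sudo firewall-cmd --reload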

@ikheifets-splunk
Contributor

ikheifets-splunk commented Oct 21, 2024

@ivanfr90 thanks, closing
