Crawler 10.5.0 fails to deploy #414
Comments
@ivanov-petar can you please contact Sovity because we need this ASAP?
I'm not an expert on Flyway database migrations, but after internal consultation the log suggests the following:
The error may result from a corrupted database state. This type of issue can arise if the database state was unintentionally corrupted, potentially due to external factors or incomplete earlier migrations. Could you please try with a fresh database setup?
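For anyone triaging a similar failure, one way to see what Flyway thinks the database state is would be to run the Flyway Java API against the same JDBC URL the crawler uses. This is only a minimal sketch, not part of the crawler itself: the connection URL and credentials are placeholders, and the exact exception thrown on validation failure depends on the Flyway version.

```java
import org.flywaydb.core.Flyway;
import org.flywaydb.core.api.MigrationInfo;

public class InspectMigrations {
    public static void main(String[] args) {
        // Placeholder JDBC URL and credentials - point this at the crawler's database.
        Flyway flyway = Flyway.configure()
                .dataSource("jdbc:postgresql://localhost:5432/crawler", "postgres", "postgres")
                .load();

        // List every migration Flyway knows about and its current state
        // (PENDING, SUCCESS, FAILED, ...), read from flyway_schema_history.
        for (MigrationInfo m : flyway.info().all()) {
            System.out.printf("%-10s %-40s %s%n",
                    m.getVersion(), m.getDescription(), m.getState());
        }

        // Throws an exception if the applied migrations don't match the ones
        // on the classpath - the same kind of check that fails at crawler startup.
        flyway.validate();
    }
}
```

A FAILED or missing entry in that listing would point at the corrupted state described above.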
Hi Tim,
Hi! Are you sure the Catalog Crawler is pointing to the correct database? Are you trying to start the Crawler without having the actual portal backend running?
I am deploying the crawler on a local instance to test if it starts up before we deploy on our AP, so I only have the container of the crawler and an empty PostgreSQL database; there are no existing tables. So I think my answer to all your questions is yes. Shall I deploy additional components to test it properly (the full AP)? I am kind of afraid of deploying it directly on our working environment.
Ah, that explains the problem. To mitigate this, start up the Authority Portal Backend connected to the same database (local is fine, as long as it's the same DB) to let it create all necessary tables and go through the migrations, and then start up the Crawler. It should then validate the DB schema successfully.
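If you want to wait for those backend migrations automatically before launching the crawler (for example in a scripted local setup), a small readiness probe can poll the shared database for a table the backend is expected to create. This is a hedged sketch under assumptions: the JDBC URL, credentials, and the probe table name ("connector") are placeholders, not names taken from the actual Authority Portal schema.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class WaitForBackendSchema {
    public static void main(String[] args) throws InterruptedException {
        // Assumed connection details - replace with the shared database's real values.
        String url = "jdbc:postgresql://localhost:5432/authority_portal";
        // Hypothetical table name; use any table the backend migrations actually create.
        String probeTable = "connector";

        while (true) {
            try (Connection c = DriverManager.getConnection(url, "postgres", "postgres");
                 PreparedStatement ps = c.prepareStatement(
                         "SELECT 1 FROM information_schema.tables WHERE table_name = ?")) {
                ps.setString(1, probeTable);
                try (ResultSet rs = ps.executeQuery()) {
                    if (rs.next()) {
                        System.out.println("Backend schema present - safe to start the Crawler.");
                        return;
                    }
                }
            } catch (Exception e) {
                System.out.println("Database not ready yet: " + e.getMessage());
            }
            Thread.sleep(5_000); // retry every 5 seconds
        }
    }
}
```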
Thanks @kamilczaja, so just to be sure: is only the backend enough, or should all of its other dependencies besides Postgres (KC, DAPS-KC) be there as well? Would it work if I imported a dump of our database into my local empty database? Perhaps that way it would be easier to see if it actually works; my expectation would be to see some crawling in action. Please excuse me for any "silly" questions.
Hey, Backend + Postgres (potentially also IAM Keycloak) should be enough. The backend is needed to populate the DB for the Crawler. Keep in mind though, if you don't spin up a DAPS, the Crawler will start but you won't see it actually working, since it needs to talk to other connectors (which means it also needs to talk to the DAPS). Let me know if it worked out for you.
This issue was moved to a discussion.
You can continue the conversation there.
Description - What happened?
Hi, we are running AP 4.1.4 and we are now trying to deploy the catalog-crawler. According to the release notes, this should be on version 10.5.0.
I am deploying the crawler through Docker, but when the container starts, I am getting an error regarding the database migrations; please see the logs.
I am not familiar with Flyway, so I am not sure how to fix this. Shouldn't those migrations happen automatically?
Please let me know how to proceed with this.
Expected Behavior
The crawler deploys and is able to crawl the connectors registered in the specified environment.
Observed Behavior
The crawler fails to deploy.
Steps to Reproduce
No response
Context Information
No response
Relevant log output
Screenshots
No response