
deploying #4

Open · Shadoukun opened this issue Apr 23, 2017 · 6 comments

@Shadoukun

Sorry to bother you with newb stuff.

Is there any way I could get instructions for deploying this? iqdb/iqdbs and archives seem less straightforward than danbooru is.


evazion commented Apr 23, 2017

I made some (very brief) notes when I set this up a while back; I'll just paste them below. The basic outline is: create the danbooru2_archive database and configure both archives and danbooru to use it, then sign up for Amazon, create an SQS queue, and configure both archives and danbooru to use that as well.

The final step is running RUN=1 bundle exec ruby services/sqs_processor.rb --pidfile=tmp/sqs.pid --logfile=stdout. This is the daemon that listens for post update messages from SQS and saves them to the database.
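
If you want to keep that process running in the background outside of foreman, a plain nohup invocation of the same command works; the checkout path and log file here are just examples:

cd ~/src/archives    # wherever the archives checkout lives
mkdir -p tmp log
RUN=1 nohup bundle exec ruby services/sqs_processor.rb --pidfile=tmp/sqs.pid --logfile=stdout >> log/sqs_processor.log 2>&1 &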

DB:
* git clone http://github.com/r888888888/archives
* bundle install
* cp .env-SAMPLE .env
* .env: configure POSTGRES_DB / POSTGRES_USER.
* configure same db for danbooru in danbooru/.env.local or danbooru/config/database.yml.
* bundle exec rake db:setup   # if this fails on a missing role, see the sketch after this list
* psql danbooru2_archive    # test that database works
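
If rake db:setup fails because the database role doesn't exist yet, creating it by hand first is one option. A minimal sketch, assuming a local PostgreSQL install and the role/database names from .env-SAMPLE:

sudo -u postgres createuser -s danbooru    # -s keeps extension setup simple; tighten privileges later if you want
sudo -u postgres createdb -O danbooru danbooru2_archive
bundle exec rake db:setup                  # re-run now that the role and database exist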

SQS:
* sign up for aws
* create new sqs queue (or do the AWS steps from the CLI; see the sketch after this list)
* create iam group and assign sqs privs
* create iam user and save access key / secret access key
* .env: configure access key / secret access key.
* foreman start
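
If you'd rather script the AWS side than click through the console, something along these lines should cover the queue, group, and user steps. The queue/group/user names are just examples, and this assumes the AWS CLI is already configured with credentials that can manage SQS and IAM:

aws sqs create-queue --queue-name devbooru
aws iam create-group --group-name sqs-users
aws iam attach-group-policy --group-name sqs-users --policy-arn arn:aws:iam::aws:policy/AmazonSQSFullAccess
aws iam create-user --user-name devbooru-app
aws iam add-user-to-group --user-name devbooru-app --group-name sqs-users
aws iam create-access-key --user-name devbooru-app    # copy the AccessKeyId / SecretAccessKey into .env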

test SQS:
* sudo dnf install awscli  # for fedora
* aws configure            # input access key / secret key
* aws sqs list-queues
* aws sqs send-message --queue-url "$(aws sqs get-queue-url --queue-name devbooru --output text)" --message-body "test message"    # --output text makes the subshell return a bare URL instead of JSON
* aws sqs receive-message --queue-url "$(aws sqs get-queue-url --queue-name devbooru --output text)"   # see the cleanup note below
* configure danbooru/config/danbooru_local_config.rb and danbooru/config/database.yml
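
Note that receive-message only reads the test message; it stays in the queue until its visibility timeout expires. If you want to clear it out before starting the daemon:

aws sqs purge-queue --queue-url "$(aws sqs get-queue-url --queue-name devbooru --output text)"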
# archives/.env config file:

AMAZON_SQS_REGION=us-east-1
AMAZON_KEY=redacted
AMAZON_SECRET=redacted
SQS_ARCHIVES_URL=https://sqs.us-east-1.amazonaws.com/redacted/devbooru
POSTGRES_DB=danbooru2_archive
POSTGRES_USER=danbooru
RAILS_ENV=development
# danbooru/.env.local config file:

# These settings take precedence over config/unicorn/unicorn.rb.
export UNICORN_ROOT=/home/danbooru/src/danbooru
export UNICORN_TIMEOUT=60
export UNICORN_LOG=/dev/stdout

export SECRET_TOKEN=redacted
export SESSION_SECRET_KEY=redacted

# These settings take precedence over config/danbooru_local_config.rb.
export DANBOORU_APP_NAME="Devbooru"
export DANBOORU_HOSTNAME="devbooru.evazion.ml"
export DANBOORU_SOURCE_CODE_URL="https://github.com/evazion/danbooru"

export DANBOORU_IQDBS_AUTH_KEY="redacted"
export DANBOORU_IQDBS_SERVER="http://127.0.0.1:4567"
export DANBOORU_AWS_SQS_IQDB_URL="https://sqs.us-east-1.amazonaws.com/redacted/iqdb"

export DANBOORU_AWS_SQS_ARCHIVES_URL="https://sqs.us-east-1.amazonaws.com/redacted/devbooru"

export DANBOORU_AWS_ACCESS_KEY_ID="redacted"
export DANBOORU_AWS_SECRET_ACCESS_KEY="redacted"
export DANBOORU_AWS_SQS_REGION="us-east-1"

export GOOGLE_API_JSON_KEY_PATH="$UNICORN_ROOT/.google-key.json"
# danbooru/.env.development config file:

export UNICORN_LISTEN=0.0.0.0:3000
export UNICORN_PROCESSES=1
export DATABASE_URL="postgresql://localhost/danbooru2?pool=5&timeout=5000"
export RO_DATABASE_URL="postgresql://localhost/danbooru2"
export ARCHIVE_DATABASE_URL="postgresql://localhost/danbooru2_archive"

The "test SQS" step is optional; it just shows how to confirm that SQS is set up properly.

@Shadoukun

Thanks, I think I got it! Amazing instructions.

@Shadoukun

Hey, sorry again lol, but I figured I'd ask here instead of making a second issue elsewhere.

You wouldn't happen to have similar instructions for iqdbs, would you?


evazion commented May 15, 2017

Sorry, but it's been a while since I installed iqdbs and I forgot to write down the procedure once I got it working. The basic idea is similar though: run bundle install, configure .env, create and configure an SQS queue, then run the commands in the Procfile (rough sketch below).
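
From memory the rough sequence is something like the following; the repo URL and the .env-SAMPLE filename are assumptions based on how the archives repo is laid out, so adjust as needed:

git clone http://github.com/r888888888/iqdbs
cd iqdbs
bundle install
cp .env-SAMPLE .env    # fill in the SQS queue URL and AWS keys, same idea as archives
foreman start          # runs whatever the Procfile defines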

Also you'll have to compile and install iqdb itself; refer to the README for that. Danbooru's iqdb fork is a little outdated and I had to patch a few things to get it to compile cleanly. There's a newer release from 2016 at https://iqdb.org/code/ that might work better (I haven't tried it yet).
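
For the build itself I only have a guess at the usual Makefile routine; the fork URL, the binary name, and the install location are all assumptions, so defer to the README where it differs:

git clone http://github.com/r888888888/iqdb    # adjust to wherever the fork actually lives
cd iqdb
make
cp iqdb /usr/local/bin/    # install location is a guess; put it somewhere on PATH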


Shadoukun commented May 15, 2017

Yeah, I think I got it. Thanks a lot. The only issue I seem to have left is importing posts into it.

The script doesn't seem to work for me. Poking around leads me to believe that /script/fixes/029_iqdb_import.rb predates the creation of iqdbs? Correct me if I'm wrong.


evazion commented May 15, 2017

It was. You'd have to script it yourself. I haven't done it myself, but something like this might work:

#!/bin/sh
# run this from the danbooru checkout: dump one "<post id>:<md5>.jpg" line per post,
# then feed the list to iqdb to populate its database file
rails runner 'Post.all.pluck(:id, :md5).each { |id, md5| puts "#{id}:#{md5}.jpg" }' > files.txt
iqdb add iqdb.db < files.txt
