This repository has been archived by the owner on Mar 27, 2022. It is now read-only.

ERROR: (gcloud.compute.ssh) could not fetch resource: The resource 'projects/recommendationshashiproject/zones/us-central1-a/instances/hadoop-m' was not found #105

Open
shashi4414 opened this issue Oct 26, 2017 · 5 comments


shashi4414 commented Oct 26, 2017

Hi all,
I am building a recommendation project on Google Cloud. As part of the setup I:

Downloaded the bdutil repository:
git clone https://github.com/GoogleCloudPlatform/bdutil.git

Set the project ID, config bucket, and zone variables in bdutil_env.sh:
CONFIGBUCKET='unilogrootbucket' (an empty bucket I created at https://console.cloud.google.com/storage)
PROJECT='recommendationshashiproject' (https://console.cloud.google.com/home/)
GCE_ZONE='us-central1-a'

I then ran bdutil with the following commands:
./bdutil deploy -e extensions/spark/spark_env.sh
./bdutil shell

But I get this error:
ERROR: (gcloud.compute.ssh) Could not fetch resource:
The resource 'projects/recommendationshashiproject/zones/us-central1-a/instances/hadoop-m' was not found

The complete error log:
shashi_kumarmirle@recommendationshashiproject:~/bdutil-master$ ./bdutil deploy -e extensions/spark/spark_env.sh

Thu Oct 26 17:41:33 IST 2017: Using local tmp dir for staging files: /tmp/bdutil-20171026-174133-7kt
Thu Oct 26 17:41:33 IST 2017: Using custom environment-variable file(s): bdutil_env.sh extensions/spark/spark_env.sh
Thu Oct 26 17:41:33 IST 2017: Reading environment-variable file: ./bdutil_env.sh
Thu Oct 26 17:41:33 IST 2017: Reading environment-variable file: extensions/spark/spark_env.sh
Thu Oct 26 17:41:33 IST 2017: No explicit GCE_MASTER_MACHINE_TYPE provided; defaulting to value of GCE_MACHINE_TYPE: n1-standard-4
Deploy cluster with following settings?
CONFIGBUCKET='unilogrootbucket'
PROJECT='recommendationshashiproject'
GCE_IMAGE=''
GCE_IMAGE_PROJECT='debian-cloud'
GCE_IMAGE_FAMILY='debian-8'
GCE_ZONE='us-central1-a'
GCE_NETWORK='default'
GCE_TAGS='bdutil'
PREEMPTIBLE_FRACTION=0.0
PREFIX='hadoop'
NUM_WORKERS=2
MASTER_HOSTNAME='hadoop-m'
WORKERS='hadoop-w-0 hadoop-w-1'
BDUTIL_GCS_STAGING_DIR='gs://unilogrootbucket/bdutil-staging/hadoop-m'
(y/n) y
Thu Oct 26 17:41:36 IST 2017: Checking for existence of gs://unilogrootbucket...
gs://unilogrootbucket/
Thu Oct 26 17:41:38 IST 2017: Checking for existence of gs://hadoop-dist/hadoop-1.2.1-bin.tar.gz...
Thu Oct 26 17:41:41 IST 2017: Checking upload files...
Thu Oct 26 17:41:41 IST 2017: Verified './conf/hadoop1/bq-mapred-template.xml'
Thu Oct 26 17:41:41 IST 2017: Verified './conf/hadoop1/core-template.xml'
Thu Oct 26 17:41:41 IST 2017: Verified './conf/hadoop1/gcs-core-template.xml'
Thu Oct 26 17:41:41 IST 2017: Verified './conf/hadoop1/hdfs-template.xml'
Thu Oct 26 17:41:41 IST 2017: Verified './conf/hadoop1/mapred-health-check.sh'
Thu Oct 26 17:41:41 IST 2017: Verified './conf/hadoop1/mapred-template.xml'
Thu Oct 26 17:41:41 IST 2017: Verified './libexec/hadoop_helpers.sh'
Thu Oct 26 17:41:41 IST 2017: Generating 10 command groups...
Thu Oct 26 17:41:41 IST 2017: Done generating remote shell scripts.
Thu Oct 26 17:41:41 IST 2017: Creating worker instances: hadoop-w-0 hadoop-w-1
..Thu Oct 26 17:41:41 IST 2017: Creating master instance: hadoop-m
.Thu Oct 26 17:41:42 IST 2017: Waiting on async 'instances create' jobs to finish. Might take a while...
Thu Oct 26 17:42:06 IST 2017: Exited 1 : gcloud --project=recommendationshashiproject --quiet --verbosity=info compute instances create hadoop-w-1 --machine-type=n1-standard-4 --image-family=debian-8 --image-project=debian-cloud --network=default --tags=bdutil --scopes storage-full --boot-disk-type=pd-standard --zone=us-central1-a
Thu Oct 26 17:42:06 IST 2017: Exited 1 : gcloud --project=recommendationshashiproject --quiet --verbosity=info compute instances create hadoop-w-0 --machine-type=n1-standard-4 --image-family=debian-8 --image-project=debian-cloud --network=default --tags=bdutil --scopes storage-full --boot-disk-type=pd-standard --zone=us-central1-a
Thu Oct 26 17:42:07 IST 2017: Exited 1 : gcloud --project=recommendationshashiproject --quiet --verbosity=info compute instances create hadoop-m --machine-type=n1-standard-4 --image-family=debian-8 --image-project=debian-cloud --network=default --tags=bdutil --scopes storage-full --boot-disk-type=pd-standard --zone=us-central1-a
Thu Oct 26 17:42:07 IST 2017: Command failed: wait ${SUBPROC} on line 326.
Thu Oct 26 17:42:07 IST 2017: Exit code of failed command: 1
Thu Oct 26 17:42:07 IST 2017: Detailed debug info available in file: /tmp/bdutil-20171026-174133-7kt/debuginfo.txt
Thu Oct 26 17:42:07 IST 2017: Check console output for error messages and/or retry your command.
shashi_kumarmirle@recommendationshashiproject:~/bdutil-master$ ./bdutil shell
Thu Oct 26 17:42:20 IST 2017: Using local tmp dir for staging files: /tmp/bdutil-20171026-174220-tyx
Thu Oct 26 17:42:20 IST 2017: Using custom environment-variable file(s): bdutil_env.sh
Thu Oct 26 17:42:20 IST 2017: Reading environment-variable file: ./bdutil_env.sh
Thu Oct 26 17:42:20 IST 2017: No explicit GCE_MASTER_MACHINE_TYPE provided; defaulting to value of GCE_MACHINE_TYPE: n1-standard-4
Thu Oct 26 17:42:20 IST 2017: Running gcloud --project=recommendationshashiproject --quiet --verbosity=info compute ssh hadoop-m --command= --ssh-flag=-oServerAliveInterval=60 --ssh-flag=-oServerAliveCountMax=3 --ssh-flag=-oConnectTimeout=30 --zone=us-central1-a
ERROR: (gcloud.compute.ssh) Could not fetch resource:

 - The resource 'projects/recommendationshashiproject/zones/us-central1-a/instances/hadoop-m' was not found

Thu Oct 26 17:42:22 IST 2017: Exited 1 : gcloud --project=recommendationshashiproject --quiet --verbosity=info compute ssh hadoop-m --command= --ssh-flag=-oServerAliveInterval=60 --ssh-flag=-oServerAliveCountMax=3 --ssh-flag=-oConnectTimeout=30 --zone=us-central1-a
Thu Oct 26 17:42:22 IST 2017: Command failed: return ${exitcode} on line 453.
Thu Oct 26 17:42:22 IST 2017: Exit code of failed command: 1
Thu Oct 26 17:42:22 IST 2017: Detailed debug info available in file: /tmp/bduti
Thu Oct 26 17:42:22 IST 2017: Check console output for error messages and/or retry your command.
shashi_kumarmirle@recommendationshashiproject:~/bdutil-master$ ./bdutil shell
Thu Oct 26 18:05:09 IST 2017: Using local tmp dir for staging files: /tmp/bdutil-20171026-180509-mSH
Thu Oct 26 18:05:09 IST 2017: Using custom environment-variable file(s): bdutil_env.sh
Thu Oct 26 18:05:09 IST 2017: Reading environment-variable file: ./bdutil_env.sh
Thu Oct 26 18:05:09 IST 2017: No explicit GCE_MASTER_MACHINE_TYPE provided; defaulting to value of GCE_MACHINE_TYPE: n1-standard-4
Thu Oct 26 18:05:09 IST 2017: Running gcloud --project=recommendationshashiproject --quiet --verbosity=info compute ssh hadoop-m --command= --ssh-flag=-oServerAliveInterval=60 --ssh-flag=-oServerAliveCountMax=3 --ssh-flag=-oConnectTimeout=30 --zone=us-central1-a
ERROR: (gcloud.compute.ssh) Could not fetch resource:

 - The resource 'projects/recommendationshashiproject/zones/us-central1-a/instances/hadoop-m' was not found

Thu Oct 26 18:05:11 IST 2017: Exited 1 : gcloud --project=recommendationshashiproject --quiet --verbosity=info compute ssh hadoop-m --command= --ssh-flag=-oServerAliveInterval=60 --ssh-flag=-oServerAliveCountMax=3 --ssh-flag=-oConnectTimeout=30 --zone=us-central1-a
Thu Oct 26 18:05:11 IST 2017: Command failed: return ${exitcode} on line 453.
Thu Oct 26 18:05:11 IST 2017: Exit code of failed command: 1
Thu Oct 26 18:05:11 IST 2017: Detailed debug info available in file: /tmp/bdutil-20171026-180509-mSH/debuginfo.txt
Thu Oct 26 18:05:11 IST 2017: Check console output for error messages and/or retry your command.
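
For what it is worth, the deploy log above shows all three gcloud compute instances create commands exiting with status 1, so hadoop-m was never created; the later ./bdutil shell failures are just a consequence of that. A minimal way to confirm this (a suggested diagnostic, not part of the original report; the project, zone, and temp path are taken from the log):

cat /tmp/bdutil-20171026-174133-7kt/debuginfo.txt    # bdutil's detailed failure log from the deploy run
gcloud compute instances list --project=recommendationshashiproject --zones=us-central1-a    # hadoop-m should appear here once creation succeeds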

@betarelease

I am facing something similar with a Flink installation. Deploying Flink to GCP with bdutil consistently fails with one or another installation issue. Log attached.


******************* gcloud compute stdout *******************
NAME       ZONE        MACHINE_TYPE   PREEMPTIBLE  INTERNAL_IP  EXTERNAL_IP     STATUS
flink-w-0  us-west1-a  n1-standard-2               10.138.0.2   35.197.110.114  RUNNING
flink-w-1  us-west1-a  n1-standard-2               10.138.0.3   35.199.155.165  RUNNING
flink-m    us-west1-a  n1-standard-2               10.138.0.4   35.197.30.229   RUNNING

******************* gcloud compute stderr *******************
Created [https://www.googleapis.com/compute/v1/projects/dogwood-baton-185323/zones/us-west1-a/instances/flink-w-0].
Created [https://www.googleapis.com/compute/v1/projects/dogwood-baton-185323/zones/us-west1-a/instances/flink-w-1].
Created [https://www.googleapis.com/compute/v1/projects/dogwood-baton-185323/zones/us-west1-a/instances/flink-m].
Warning: Permanently added 'compute.131374822653404283' (ECDSA) to the list of known hosts.
Warning: Permanently added 'compute.1507412667037361275' (ECDSA) to the list of known hosts.
ssh: connect to host 35.197.30.229 port 22: Operation timed out
ERROR: (gcloud.compute.ssh) [/usr/bin/ssh] exited with return code [255].
Warning: Permanently added 'compute.617740097528243323' (ECDSA) to the list of known hosts.
Connection to 35.197.110.114 closed.
Connection to 35.199.155.165 closed.
Connection to 35.197.30.229 closed.
[the same three connections open and close repeatedly as the remaining remote setup commands run]

************ ERROR logs from gcloud compute stderr ************
ERROR: (gcloud.compute.ssh) [/usr/bin/ssh] exited with return code [255].

******************* Exit codes and VM logs *******************
Tue Nov 7 12:42:45 PST 2017: Exited 255 : gcloud --project=dogwood-baton-185323 --quiet --verbosity=info compute ssh flink-m --command=exit 0 --ssh-flag=-oServerAliveInterval=60 --ssh-flag=-oServerAliveCountMax=3 --ssh-flag=-oConnectTimeout=30 --zone=us-west1-a
Tue Nov 7 12:48:12 PST 2017: Exited 2 : gcloud --project=dogwood-baton-185323 --quiet --verbosity=info compute ssh flink-w-1 --command=sudo su -l -c "cd ${PWD} && ./install_flink.sh" 2>>install_flink_deploy.stderr 1>>install_flink_deploy.stdout --ssh-flag=-tt --ssh-flag=-oServerAliveInterval=60 --ssh-flag=-oServerAliveCountMax=3 --ssh-flag=-oConnectTimeout=30 --zone=us-west1-a
Tue Nov 7 12:48:12 PST 2017: Exited 2 : gcloud --project=dogwood-baton-185323 --quiet --verbosity=info compute ssh flink-w-0 --command=sudo su -l -c "cd ${PWD} && ./install_flink.sh" 2>>install_flink_deploy.stderr 1>>install_flink_deploy.stdout --ssh-flag=-tt --ssh-flag=-oServerAliveInterval=60 --ssh-flag=-oServerAliveCountMax=3 --ssh-flag=-oConnectTimeout=30 --zone=us-west1-a
Tue Nov 7 12:48:12 PST 2017: Exited 2 : gcloud --project=dogwood-baton-185323 --quiet --verbosity=info compute ssh flink-m --command=sudo su -l -c "cd ${PWD} && ./install_flink.sh" 2>>install_flink_deploy.stderr 1>>install_flink_deploy.stdout --ssh-flag=-tt --ssh-flag=-oServerAliveInterval=60 --ssh-flag=-oServerAliveCountMax=3 --ssh-flag=-oConnectTimeout=30 --zone=us-west1-a
flink-w-1: Tue Nov 7 12:48:12 PST 2017: Running gcloud --project=dogwood-baton-185323 --quiet --verbosity=info compute ssh flink-w-1 --command=tail -vn 30 *.stderr --ssh-flag=-n --ssh-flag=-oServerAliveInterval=60 --ssh-flag=-oServerAliveCountMax=3 --ssh-flag=-oConnectTimeout=30 --zone=us-west1-a
flink-w-1: ==> bootstrap.stderr <==
flink-w-1: ERROR: (gcloud.components.update)
flink-w-1: You cannot perform this action because the Cloud SDK component manager
flink-w-1: is disabled for this installation. You can run the following command
flink-w-1: to achieve the same result for this installation:
flink-w-1:
flink-w-1: sudo apt-get install google-cloud-sdk
flink-w-1:
flink-w-1: Copying gs://flink_trial/bdutil-staging/flink-m/20171107-124151-D7T/bq-mapred-template.xml...
flink-w-1: Copying gs://flink_trial/bdutil-staging/flink-m/20171107-124151-D7T/core-template.xml...
flink-w-1: Copying gs://flink_trial/bdutil-staging/flink-m/20171107-124151-D7T/gcs-core-template.xml...
flink-w-1: Copying gs://flink_trial/bdutil-staging/flink-m/20171107-124151-D7T/hdfs-template.xml...
flink-w-1: Copying gs://flink_trial/bdutil-staging/flink-m/20171107-124151-D7T/mapred-health-check.sh...
flink-w-1: Copying gs://flink_trial/bdutil-staging/flink-m/20171107-124151-D7T/mapred-template.xml...
flink-w-1: Copying gs://flink_trial/bdutil-staging/flink-m/20171107-124151-D7T/hadoop_helpers.sh...
flink-w-1: Copying gs://flink_trial/bdutil-staging/flink-m/20171107-124151-D7T/hadoop-env-setup.sh...
flink-w-1: Copying gs://flink_trial/bdutil-staging/flink-m/20171107-124151-D7T/deploy-ssh-master-setup.sh...
flink-w-1: Copying gs://flink_trial/bdutil-staging/flink-m/20171107-124151-D7T/deploy-core-setup.sh...
flink-w-1: Copying gs://flink_trial/bdutil-staging/flink-m/20171107-124151-D7T/deploy-master-nfs-setup.sh...
flink-w-1: Copying gs://flink_trial/bdutil-staging/flink-m/20171107-124151-D7T/deploy-client-nfs-setup.sh...
flink-w-1: Copying gs://flink_trial/bdutil-staging/flink-m/20171107-124151-D7T/deploy-ssh-worker-setup.sh...
flink-w-1: Copying gs://flink_trial/bdutil-staging/flink-m/20171107-124151-D7T/deploy-start.sh...
flink-w-1: Copying gs://flink_trial/bdutil-staging/flink-m/20171107-124151-D7T/install_connectors.sh...
flink-w-1: Copying gs://flink_trial/bdutil-staging/flink-m/20171107-124151-D7T/install_flink.sh...
flink-w-1: Copying gs://flink_trial/bdutil-staging/flink-m/20171107-124151-D7T/start_flink.sh...
flink-w-1: Operation completed over 17 objects/100.7 KiB.
flink-w-1:
flink-w-1: ==> deploy-client-nfs-setup_deploy.stderr <==
flink-w-1: Failed to open /dev/tty: No such device or address
flink-w-1:
flink-w-1: ==> deploy-core-setup_deploy.stderr <==
flink-w-1: dpkg-query: package 'libsnappy1' is not installed and no information is available
flink-w-1: Use dpkg --info (= dpkg-deb --info) to examine archive files,
flink-w-1: and dpkg --contents (= dpkg-deb --contents) to list their contents.
flink-w-1: dpkg-preconfigure: unable to re-open stdin: No such file or directory
flink-w-1: dpkg-query: package 'libsnappy-dev' is not installed and no information is available
flink-w-1: Use dpkg --info (= dpkg-deb --info) to examine archive files,
flink-w-1: and dpkg --contents (= dpkg-deb --contents) to list their contents.
flink-w-1: dpkg-preconfigure: unable to re-open stdin: No such file or directory
flink-w-1: Copying gs://hadoop-dist/hadoop-1.2.1-bin.tar.gz...
flink-w-1: Operation completed over 1 objects/36.3 MiB.
flink-w-1: No URLs matched: gs://hadoop-native-dist/Hadoop_1.2.1-Linux-amd64-64.tar.gz
flink-w-0: Tue Nov 7 12:48:12 PST 2017: Running gcloud --project=dogwood-baton-185323 --quiet --verbosity=info compute ssh flink-w-0 --command=tail -vn 30 *.stderr --ssh-flag=-n --ssh-flag=-oServerAliveInterval=60 --ssh-flag=-oServerAliveCountMax=3 --ssh-flag=-oConnectTimeout=30 --zone=us-west1-a
flink-w-0: [same tail output as flink-w-1 above: the bootstrap.stderr component manager error, the 17 staging files copied, the /dev/tty and libsnappy warnings, hadoop-1.2.1-bin.tar.gz copied, and No URLs matched: gs://hadoop-native-dist/Hadoop_1.2.1-Linux-amd64-64.tar.gz]
flink-w-1: dpkg-query: package 'autofs' is not installed and no information is available
flink-w-1: Use dpkg --info (= dpkg-deb --info) to examine archive files,
flink-w-1: and dpkg --contents (= dpkg-deb --contents) to list their contents.
flink-w-1: dpkg-preconfigure: unable to re-open stdin: No such file or directory
flink-w-1:
flink-w-1: ==> deploy-ssh-worker-setup_deploy.stderr <==
flink-w-1: Copying gs://flink_trial/bdutil-staging/flink-m/hadoop_master_id_rsa.pub...
flink-w-1: Operation completed over 1 objects/394.0 B.
flink-w-1:
flink-w-1: ==> install_flink_deploy.stderr <==
flink-w-1: AccessDeniedException: 403 [email protected] does not have storage.objects.list access to flink-dist.
flink-w-1:
flink-w-1: gzip: stdin: unexpected end of file
flink-w-1: tar: Child returned status 1
flink-w-1: tar: Error is not recoverable: exiting now
flink-w-1: .
flink-w-0: dpkg-query: package 'autofs' is not installed and no information is available
flink-w-0: Use dpkg --info (= dpkg-deb --info) to examine archive files,
flink-w-0: and dpkg --contents (= dpkg-deb --contents) to list their contents.
flink-w-0: dpkg-preconfigure: unable to re-open stdin: No such file or directory
flink-w-0:
flink-w-0: ==> deploy-ssh-worker-setup_deploy.stderr <==
flink-w-0: Copying gs://flink_trial/bdutil-staging/flink-m/hadoop_master_id_rsa.pub...
flink-w-0: Operation completed over 1 objects/394.0 B.
flink-w-0:
flink-w-0: ==> install_flink_deploy.stderr <==
flink-w-0: AccessDeniedException: 403 [email protected] does not have storage.objects.list access to flink-dist.
flink-w-0:
flink-w-0: gzip: stdin: unexpected end of file
flink-w-0: tar: Child returned status 1
flink-w-0: tar: Error is not recoverable: exiting now
flink-w-0: .
flink-m: Tue Nov 7 12:48:12 PST 2017: Running gcloud --project=dogwood-baton-185323 --quiet --verbosity=info compute ssh flink-m --command=tail -vn 30 *.stderr --ssh-flag=-n --ssh-flag=-oServerAliveInterval=60 --ssh-flag=-oServerAliveCountMax=3 --ssh-flag=-oConnectTimeout=30 --zone=us-west1-a
flink-m: ==> bootstrap.stderr <==
flink-m: ERROR: (gcloud.components.update)
flink-m: You cannot perform this action because the Cloud SDK component manager
flink-m: is disabled for this installation. You can run the following command
flink-m: to achieve the same result for this installation:
flink-m:
flink-m: sudo apt-get install google-cloud-sdk
flink-m:
flink-m: [same 17 staging files copied as on the workers]
flink-m: Operation completed over 17 objects/100.7 KiB.
flink-m:
flink-m: ==> deploy-client-nfs-setup_deploy.stderr <==
flink-m: Failed to open /dev/tty: No such device or address
flink-m:
flink-m: ==> deploy-core-setup_deploy.stderr <==
flink-m: dpkg-query: package 'libsnappy1' is not installed and no information is available
flink-m: dpkg-query: package 'libsnappy-dev' is not installed and no information is available
flink-m: dpkg-preconfigure: unable to re-open stdin: No such file or directory
flink-m: Copying gs://hadoop-dist/hadoop-1.2.1-bin.tar.gz...
flink-m: Operation completed over 1 objects/36.3 MiB.
flink-m: No URLs matched: gs://hadoop-native-dist/Hadoop_1.2.1-Linux-amd64-64.tar.gz
flink-m: dpkg-query: package 'autofs' is not installed and no information is available
flink-m: dpkg-preconfigure: unable to re-open stdin: No such file or directory
flink-m:
flink-m: ==> deploy-master-nfs-setup_deploy.stderr <==
flink-m: dpkg-query: package 'nfs-kernel-server' is not installed and no information is available
flink-m: Use dpkg --info (= dpkg-deb --info) to examine archive files,
flink-m: and dpkg --contents (= dpkg-deb --contents) to list their contents.
flink-m: dpkg-preconfigure: unable to re-open stdin: No such file or directory
flink-m: Failed to open /dev/tty: No such device or address
flink-m: Failed to open /dev/tty: No such device or address
flink-m: Failed to open /dev/tty: No such device or address
flink-m:
flink-m: ==> deploy-ssh-master-setup_deploy.stderr <==
flink-m: Copying file:///home/hadoop/.ssh/hadoop_master_id_rsa.pub [Content-Type=application/octet-stream]...
flink-m: Operation completed over 1 objects/394.0 B.
flink-m:
flink-m: ==> deploy-start_deploy.stderr <==
flink-m: 17/11/07 20:44:29 INFO util.GSet: VM type = 64-bit
flink-m: 17/11/07 20:44:29 INFO util.GSet: 2.0% max memory = 1398276096
flink-m: 17/11/07 20:44:29 INFO util.GSet: capacity = 2^22 = 4194304 entries
flink-m: 17/11/07 20:44:29 INFO util.GSet: recommended=4194304, actual=4194304
flink-m: 17/11/07 20:44:30 INFO namenode.FSNamesystem: fsOwner=hadoop
flink-m: 17/11/07 20:44:30 INFO namenode.FSNamesystem: supergroup=supergroup
flink-m: 17/11/07 20:44:30 INFO namenode.FSNamesystem: isPermissionEnabled=false
flink-m: 17/11/07 20:44:30 INFO namenode.FSNamesystem: dfs.block.invalidate.limit=100
flink-m: 17/11/07 20:44:30 INFO namenode.FSNamesystem: isAccessTokenEnabled=false accessKeyUpdateInterval=0 min(s), accessTokenLifetime=0 min(s)
flink-m: 17/11/07 20:44:30 INFO namenode.FSEditLog: dfs.namenode.edits.toleration.length = 0
flink-m: 17/11/07 20:44:30 INFO namenode.NameNode: Caching file names occuring more than 10 times
flink-m: 17/11/07 20:44:30 INFO common.Storage: Image file /hadoop/dfs/name/current/fsimage of size 112 bytes saved in 0 seconds.
flink-m: 17/11/07 20:44:30 INFO namenode.FSEditLog: closing edit log: position=4, editlog=/hadoop/dfs/name/current/edits
flink-m: 17/11/07 20:44:30 INFO namenode.FSEditLog: close success: truncate to 4, editlog=/hadoop/dfs/name/current/edits
flink-m: 17/11/07 20:44:30 INFO common.Storage: Storage directory /hadoop/dfs/name has been successfully formatted.
flink-m: 17/11/07 20:44:30 INFO namenode.NameNode: SHUTDOWN_MSG:
flink-m: /************************************************************
flink-m: SHUTDOWN_MSG: Shutting down NameNode at flink-m/10.138.0.4
flink-m: ************************************************************/
flink-m: Jobtracker not yet ready(1); sleeping 10.
flink-m: [the line above repeats ten times]
flink-m: 17/11/07 20:46:12 INFO gcs.GoogleHadoopFileSystemBase: GHFS version: 1.5.3-hadoop1
flink-m:
flink-m: ==> install_flink_deploy.stderr <==
flink-m: AccessDeniedException: 403 [email protected] does not have storage.objects.list access to flink-dist.
flink-m:
flink-m: gzip: stdin: unexpected end of file
flink-m: tar: Child returned status 1
flink-m: tar: Error is not recoverable: exiting now
flink-m: .


aeneasr commented Nov 12, 2017

I am encountering the same error, which is most likely due to missing read rights on the gs://flink-dist bucket. Not sure who is responsible for that bucket!
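
A quick way to reproduce the permission failure outside bdutil (a suggested check, not from the original comment) is to list the bucket directly:

gsutil ls gs://flink-dist    # fails with AccessDeniedException: 403 if the caller lacks storage.objects.list

If that fails for every authenticated account, the bucket's ACL (or the bucket itself) is broken on the publisher's side rather than in the local setup.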


aeneasr commented Nov 12, 2017

Also keep in mind that this setup runs an ancient version of Flink (0.10.0).


betarelease commented Nov 14, 2017

Thanks for those pointers @arekkas - they were a great help. It seems gs://flink-dist is indeed broken.
I was able to fix it with the following change in flink_env.sh:

-FLINK_HADOOP1_TARBALL_URI='gs://flink-dist/flink-0.10.1-bin-hadoop1-scala_2.10.tgz'
+FLINK_HADOOP1_TARBALL_URI='https://archive.apache.org/dist/flink/flink-1.3.2/flink-1.3.2-bin-hadoop2-scala_2.10.tgz'
-FLINK_HADOOP2_TARBALL_URI='gs://flink-dist/flink-0.10.1-bin-hadoop27-scala_2.10.tgz'
+FLINK_HADOOP2_TARBALL_URI='https://archive.apache.org/dist/flink/flink-1.3.2/flink-1.3.2-bin-hadoop2-scala_2.10.tgz'

I first tried the URLs on the main Flink website, but those are served from a dyn/ mirror link that did not work, so I had to find the archive URLs above to get the tarballs to download.
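
For anyone applying the same workaround, the archive URL can be sanity-checked before editing flink_env.sh (a suggested check, not part of the original comment):

curl -fIL https://archive.apache.org/dist/flink/flink-1.3.2/flink-1.3.2-bin-hadoop2-scala_2.10.tgz    # -f fails on HTTP errors, -I fetches headers only, -L follows redirects

A 200 response confirms the tarball is downloadable from archive.apache.org.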

@meghbhalerao

Try running gcloud auth login - it will direct you to your Google account login. Worked for me. Let me know if this works for you.
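
For reference, the re-authentication flow looks like this (standard gcloud commands; whether it fixes the 403 depends on which account actually has access to the bucket):

gcloud auth login    # opens a browser window to sign in with a Google account
gcloud auth list     # shows which credentialed account is currently active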
