Walkthrough jupyter notebook doesn't work #133

Open
iMurfyD opened this issue Mar 9, 2023 · 4 comments

Comments


iMurfyD commented Mar 9, 2023

First time user who gets runtime error on trying the Walkthrough jupyter notebook:

---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
Cell In[4], line 11
      9 # We use RINEX3 PRNs to identify satellites
     10 sat_prn = 'G07'
---> 11 sat_pos, sat_vel, sat_clock_err, sat_clock_drift, ephemeris = dog.get_sat_info(sat_prn, time)
     12 print("Satellite's position in ECEF (m) : \n", sat_pos, '\n')
     13 print("Satellite's velocity in ECEF (m/s) : \n", sat_vel, '\n')

File ~/opt/anaconda3/envs/missile-tid/lib/python3.10/site-packages/laika/astro_dog.py:265, in AstroDog.get_sat_info(self, prn, time)
    263 eph = None
    264 if self.pull_orbit:
--> 265   eph = self.get_orbit(prn, time)
    266 if not eph and self.pull_nav:
    267   eph = self.get_nav(prn, time)

File ~/opt/anaconda3/envs/missile-tid/lib/python3.10/site-packages/laika/astro_dog.py:106, in AstroDog.get_orbit(self, prn, time)
    104 def get_orbit(self, prn: str, time: GPSTime):
    105   skip_download = time in self.orbit_fetched_times
--> 106   orbit = self._get_latest_valid_data(self.orbits[prn], self.cached_orbit[prn], self.get_orbit_data, time, skip_download)
    107   if orbit is not None:
    108     self.cached_orbit[prn] = orbit

File ~/opt/anaconda3/envs/missile-tid/lib/python3.10/site-packages/laika/astro_dog.py:363, in AstroDog._get_latest_valid_data(self, data, latest_data, download_data_func, time, skip_download, recv_pos)
    361   download_data_func(time, recv_pos)
    362 else:
--> 363   download_data_func(time)
    364 latest_data = get_closest(time, data, recv_pos=recv_pos)
    365 if is_valid(latest_data):

File ~/opt/anaconda3/envs/missile-tid/lib/python3.10/site-packages/laika/astro_dog.py:211, in AstroDog.get_orbit_data(self, time, only_predictions)
    209   ephems_sp3 = self.download_parse_orbit(time)
    210 if sum([len(v) for v in ephems_sp3.values()]) < 5:
--> 211   raise RuntimeError(f'No orbit data found. For Time {time.as_datetime()} constellations {self.valid_const} valid ephem types {self.valid_ephem_types}')
    213 self.add_orbits(ephems_sp3)

RuntimeError: No orbit data found. For Time 2018-01-07 00:00:00 constellations ['GPS', 'GLONASS'] valid ephem types (<EphemerisType.FINAL_ORBIT: 1>, <EphemerisType.RAPID_ORBIT: 2>, <EphemerisType.ULTRA_RAPID_ORBIT: 3>)

The code block in question:

# For example if we want the position and speed of satellite 7 (a GPS sat)
# at the start of January 7th 2018. Laika's custom GPSTime object is used throughout
# and can be initialized from python's datetime.

from datetime import datetime
from laika.gps_time import GPSTime
time = GPSTime.from_datetime(datetime(2018, 1, 7))

# We use RINEX3 PRNs to identify satellites
sat_prn = 'G07'
sat_pos, sat_vel, sat_clock_err, sat_clock_drift, ephemeris = dog.get_sat_info(sat_prn, time)
print("Satellite's position in ECEF (m) : \n", sat_pos, '\n')
print("Satellite's velocity in ECEF (m/s) : \n", sat_vel, '\n')
print("Satellite's clock error (s) : \n", sat_clock_err, '\n\n')

# we can also get the pseudorange delay (tropo delay + iono delay + DCB correction)
# in the San Francisco area
receiver_position = [-2702584.60036925, -4325039.45362552, 3817393.16034817]
delay = dog.get_delay(sat_prn, time, receiver_position)
print("Satellite's delay correction (m) in San Francisco \n", delay)

Which prints out:

Downloading https://github.com/commaai/gnss-data-alt/raw/master/MCC/PRODUCTS/18006/final/Sta19826.sp3
Downloading https://github.com/commaai/gnss-data-alt/raw/master/MCC/PRODUCTS/18008/final/Sta19831.sp3
Downloading https://github.com/commaai/gnss-data-alt/raw/master/MCC/PRODUCTS/18007/final/Sta19830.sp3
Downloading https://github.com/commaai/gnss-data/raw/master/gnss/products/1982/igs19826.sp3.Z
Downloading https://github.com/commaai/gnss-data/raw/master/gnss/products/1983/igs19830.sp3.Z
Downloading https://github.com/commaai/gnss-data/raw/master/gnss/products/1983/igs19831.sp3.Z
Downloading https://github.com/commaai/gnss-data/raw/master/gnss/products/1982/igr19826.sp3.Z
Downloading https://github.com/commaai/gnss-data/raw/master/gnss/products/1983/igr19831.sp3.Z
Downloading https://github.com/commaai/gnss-data/raw/master/gnss/products/1983/igr19830.sp3.Z
Downloading https://github.com/commaai/gnss-data-alt/raw/master/MCC/PRODUCTS/18007/rapid/Sta19830.sp3
Downloading https://github.com/commaai/gnss-data-alt/raw/master/MCC/PRODUCTS/18008/rapid/Sta19831.sp3
Downloading https://github.com/commaai/gnss-data-alt/raw/master/MCC/PRODUCTS/18006/rapid/Sta19826.sp3
Downloading https://github.com/commaai/gnss-data/raw/master/gnss/products/1983/igu19831_18.sp3.Z
Downloading https://github.com/commaai/gnss-data/raw/master/gnss/products/1982/igu19826_18.sp3.Z
Downloading https://github.com/commaai/gnss-data/raw/master/gnss/products/1983/igu19830_18.sp3.Z
Downloading https://github.com/commaai/gnss-data-alt/raw/master/MCC/PRODUCTS/18006/ultra/Sta19826.sp3
Downloading https://github.com/commaai/gnss-data-alt/raw/master/MCC/PRODUCTS/18007/ultra/Sta19830.sp3
Downloading https://github.com/commaai/gnss-data-alt/raw/master/MCC/PRODUCTS/18008/ultra/Sta19831.sp3
Downloading https://github.com/commaai/gnss-data/raw/master/gnss/products/1983/igu19831_12.sp3.Z
Downloading https://github.com/commaai/gnss-data/raw/master/gnss/products/1982/igu19826_12.sp3.Z
Downloading https://github.com/commaai/gnss-data/raw/master/gnss/products/1983/igu19830_12.sp3.Z
Downloading https://github.com/commaai/gnss-data/raw/master/gnss/products/1983/igu19831_06.sp3.Z
Downloading https://github.com/commaai/gnss-data/raw/master/gnss/products/1983/igu19831_00.sp3.Z
Downloading https://github.com/commaai/gnss-data/raw/master/gnss/products/1982/igu19826_06.sp3.Z
Downloading https://github.com/commaai/gnss-data/raw/master/gnss/products/1983/igu19830_06.sp3.Z
Downloading https://github.com/commaai/gnss-data/raw/master/gnss/products/1983/igu19830_00.sp3.Z
Downloading https://github.com/commaai/gnss-data/raw/master/gnss/products/1982/igu19826_00.sp3.Z
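For reference, the week numbers in those product directories can be sanity-checked with a few lines of standard-library Python. This is the standard GPS-week convention, not laika's actual implementation:

```python
from datetime import datetime

GPS_EPOCH = datetime(1980, 1, 6)  # GPS week 0 begins here

def gps_week(dt: datetime) -> int:
    """Whole GPS weeks elapsed since the GPS epoch (leap seconds ignored)."""
    return (dt - GPS_EPOCH).days // 7

# 2018-01-07 is a Sunday and falls at the start of GPS week 1983,
# which is why the downloader probes both the .../1982/ and .../1983/
# product directories for files bracketing the requested epoch.
print(gps_week(datetime(2018, 1, 7)))  # 1983
```

So the requests themselves target the right directories; the files just aren't coming back.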

I tried signing up for an Earthdata account and installing laika from source with python setup.py install, with a .netrc file in the repository root. I also tried wiping the cache, which is at /tmp/gnss. The cache looks like this:

ls /tmp/gnss
cddis_products   russian_products

ls /tmp/gnss/cddis_products
1982 1983

ls /tmp/gnss/cddis_products/1982
igr19826.sp3.attempt_time    igs19826.sp3.attempt_time    igu19826_00.sp3.attempt_time igu19826_06.sp3.attempt_time igu19826_12.sp3.attempt_time igu19826_18.sp3.attempt_time

Running on a Mac with Python 3.10.


iMurfyD commented Mar 10, 2023

Pytest also fails, for what look like similar reasons.
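The repeated pycurl error below (failed setting cipher list: DEFAULT@SECLEVEL=1) points at the SSL backend: the @SECLEVEL= cipher-string syntax is OpenSSL-specific, so a libcurl linked against LibreSSL or SecureTransport (common on macOS) rejects it with error 59. As a quick sanity check, the backend name can be read out of the version string that pycurl.version exposes; the parser below is a standalone sketch, not laika code:

```python
def ssl_backend(curl_version: str) -> str:
    """Extract the SSL library name from a libcurl version string,
    such as the one exposed by pycurl.version."""
    for token in curl_version.split():
        for backend in ('OpenSSL', 'LibreSSL', 'SecureTransport', 'BoringSSL'):
            if token.startswith(backend):
                return backend
    return 'unknown'

# Only OpenSSL understands the '@SECLEVEL=1' suffix that
# laika/downloader.py passes to SSL_CIPHER_LIST; any other backend
# fails with pycurl error 59, as in the test output below.
print(ssl_backend('PycURL/7.45.2 libcurl/7.87.0 SecureTransport zlib/1.2.11'))
# -> SecureTransport
```

If `import pycurl; print(pycurl.version)` on the failing machine reports anything other than OpenSSL, that would explain all four downloader test failures.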

============================= test session starts ==============================
platform darwin -- Python 3.10.9, pytest-7.2.2, pluggy-1.0.0
rootdir: /Users/idesjard/software/laika
plugins: anyio-3.5.0
collected 38 items

tests/test_dop.py .....                                                  [ 13%]
tests/test_downloader.py FFFF.                                           [ 26%]
tests/test_ephemerides.py .F.                                            [ 34%]
tests/test_fail_caching.py F                                             [ 36%]
tests/test_fetch_sat_info.py .FF                                         [ 44%]
tests/test_positioning.py sF                                             [ 50%]
tests/test_prediction_orbits.py FF                                       [ 55%]
tests/test_prns.py ......                                                [ 71%]
tests/test_time.py .....                                                 [ 84%]
tests/test_time_range_holder.py ......                                   [100%]

=================================== FAILURES ===================================
__________________ TestDownloader.test_all_download_protocols __________________

self = <tests.test_downloader.TestDownloader testMethod=test_all_download_protocols>

    def test_all_download_protocols(self):
      for url_base in self.url_base:
>       dat = download_file(url_base, self.folder_path, self.filename_zipped)

tests/test_downloader.py:32: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
laika/downloader.py:38: in wrapped
    return f(url_bases, *args, **kwargs)
laika/downloader.py:234: in download_file
    return https_download_file(url)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

url = 'https://github.com/commaai/gnss-data/raw/master/gnss/products/2103/igu21034_18.sp3.Z'

    def https_download_file(url):
      crl = pycurl.Curl()
      crl.setopt(crl.CAINFO, certifi.where())
      crl.setopt(crl.URL, url)
      crl.setopt(crl.FOLLOWLOCATION, True)
      crl.setopt(crl.SSL_CIPHER_LIST, 'DEFAULT@SECLEVEL=1')
      crl.setopt(crl.COOKIEJAR, '/tmp/cddis_cookies')
      crl.setopt(pycurl.CONNECTTIMEOUT, 10)
    
      buf = BytesIO()
      crl.setopt(crl.WRITEDATA, buf)
>     crl.perform()
E     pycurl.error: (59, 'failed setting cipher list: DEFAULT@SECLEVEL=1')

laika/downloader.py:190: error
----------------------------- Captured stdout call -----------------------------
Downloading https://github.com/commaai/gnss-data/raw/master/gnss/products/2103/igu21034_18.sp3.Z
_________________________ TestDownloader.test_download _________________________

url_base = ('https://github.com/commaai/gnss-data/raw/master/gnss/products/', 'sftp://gdc.cddis.eosdis.nasa.gov/gnss/products/', 'ftp://igs.ign.fr/pub/igs/products/')
folder_path = '2103/', cache_dir = '/tmp/gnss/cddis_products'
filename = 'igu21034_18.sp3', compression = '.Z', overwrite = False

    def download_and_cache_file(url_base, folder_path: str, cache_dir: str, filename: str, compression='', overwrite=False):
      filename_zipped = filename + compression
      folder_path_abs = os.path.join(cache_dir, folder_path)
      filepath = str(hatanaka.get_decompressed_path(os.path.join(folder_path_abs, filename)))
    
      filepath_attempt = filepath + '.attempt_time'
    
      if os.path.exists(filepath_attempt):
        with open(filepath_attempt, 'r') as rf:
          last_attempt_time = float(rf.read())
        if time.time() - last_attempt_time < SECS_IN_HR:
          raise DownloadFailed(f"Too soon to try downloading {folder_path + filename_zipped} from {url_base} again since last attempt")
      if not os.path.isfile(filepath) or overwrite:
        try:
>         data_zipped = download_file(url_base, folder_path, filename_zipped)

laika/downloader.py:269: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

url_bases = ('https://github.com/commaai/gnss-data/raw/master/gnss/products/', 'sftp://gdc.cddis.eosdis.nasa.gov/gnss/products/', 'ftp://igs.ign.fr/pub/igs/products/')
args = ('2103/', 'igu21034_18.sp3.Z'), kwargs = {}
url_base = 'https://github.com/commaai/gnss-data/raw/master/gnss/products/'

    def wrapped(url_bases, *args, **kwargs):
      if isinstance(url_bases, str):
        # only one url passed, don't do the retry thing
        return f(url_bases, *args, **kwargs)
    
      # not a string, must be a list of url_bases
      for url_base in url_bases:
        try:
>         return f(url_base, *args, **kwargs)

laika/downloader.py:43: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

url_base = 'https://github.com/commaai/gnss-data/raw/master/gnss/products/'
folder_path = '2103/', filename_zipped = 'igu21034_18.sp3.Z'

    @retryable
    def download_file(url_base, folder_path, filename_zipped):
      url = url_base + folder_path + filename_zipped
      print('Downloading ' + url)
      if url.startswith('https://'):
>       return https_download_file(url)

laika/downloader.py:234: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

url = 'https://github.com/commaai/gnss-data/raw/master/gnss/products/2103/igu21034_18.sp3.Z'

    def https_download_file(url):
      crl = pycurl.Curl()
      crl.setopt(crl.CAINFO, certifi.where())
      crl.setopt(crl.URL, url)
      crl.setopt(crl.FOLLOWLOCATION, True)
      crl.setopt(crl.SSL_CIPHER_LIST, 'DEFAULT@SECLEVEL=1')
      crl.setopt(crl.COOKIEJAR, '/tmp/cddis_cookies')
      crl.setopt(pycurl.CONNECTTIMEOUT, 10)
    
      buf = BytesIO()
      crl.setopt(crl.WRITEDATA, buf)
>     crl.perform()
E     pycurl.error: (59, 'failed setting cipher list: DEFAULT@SECLEVEL=1')

laika/downloader.py:190: error

During handling of the above exception, another exception occurred:

self = <tests.test_downloader.TestDownloader testMethod=test_download>

    def test_download(self):
>     file = download_and_cache_file(self.url_base, self.folder_path, cache_dir=self.cache_dir, filename=self.filename, compression='.Z')

tests/test_downloader.py:36: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

url_base = ('https://github.com/commaai/gnss-data/raw/master/gnss/products/', 'sftp://gdc.cddis.eosdis.nasa.gov/gnss/products/', 'ftp://igs.ign.fr/pub/igs/products/')
folder_path = '2103/', cache_dir = '/tmp/gnss/cddis_products'
filename = 'igu21034_18.sp3', compression = '.Z', overwrite = False

    def download_and_cache_file(url_base, folder_path: str, cache_dir: str, filename: str, compression='', overwrite=False):
      filename_zipped = filename + compression
      folder_path_abs = os.path.join(cache_dir, folder_path)
      filepath = str(hatanaka.get_decompressed_path(os.path.join(folder_path_abs, filename)))
    
      filepath_attempt = filepath + '.attempt_time'
    
      if os.path.exists(filepath_attempt):
        with open(filepath_attempt, 'r') as rf:
          last_attempt_time = float(rf.read())
        if time.time() - last_attempt_time < SECS_IN_HR:
          raise DownloadFailed(f"Too soon to try downloading {folder_path + filename_zipped} from {url_base} again since last attempt")
      if not os.path.isfile(filepath) or overwrite:
        try:
          data_zipped = download_file(url_base, folder_path, filename_zipped)
        except (DownloadFailed, pycurl.error, socket.timeout):
          unix_time = time.time()
          os.makedirs(folder_path_abs, exist_ok=True)
          with atomic_write(filepath_attempt, mode='w', overwrite=True) as wf:
            wf.write(str(unix_time))
>         raise DownloadFailed(f"Could not download {folder_path + filename_zipped} from {url_base} ")
E         laika.downloader.DownloadFailed: Could not download 2103/igu21034_18.sp3.Z from ('https://github.com/commaai/gnss-data/raw/master/gnss/products/', 'sftp://gdc.cddis.eosdis.nasa.gov/gnss/products/', 'ftp://igs.ign.fr/pub/igs/products/')

laika/downloader.py:275: DownloadFailed
----------------------------- Captured stdout call -----------------------------
Downloading https://github.com/commaai/gnss-data/raw/master/gnss/products/2103/igu21034_18.sp3.Z
____________________ TestDownloader.test_download_overwrite ____________________

url_base = ('https://github.com/commaai/gnss-data/raw/master/gnss/products/', 'sftp://gdc.cddis.eosdis.nasa.gov/gnss/products/', 'ftp://igs.ign.fr/pub/igs/products/')
folder_path = '2103/', cache_dir = '/tmp/gnss/cddis_products'
filename = 'igu21034_18.sp3', compression = '.Z', overwrite = False

    def download_and_cache_file(url_base, folder_path: str, cache_dir: str, filename: str, compression='', overwrite=False):
      filename_zipped = filename + compression
      folder_path_abs = os.path.join(cache_dir, folder_path)
      filepath = str(hatanaka.get_decompressed_path(os.path.join(folder_path_abs, filename)))
    
      filepath_attempt = filepath + '.attempt_time'
    
      if os.path.exists(filepath_attempt):
        with open(filepath_attempt, 'r') as rf:
          last_attempt_time = float(rf.read())
        if time.time() - last_attempt_time < SECS_IN_HR:
          raise DownloadFailed(f"Too soon to try downloading {folder_path + filename_zipped} from {url_base} again since last attempt")
      if not os.path.isfile(filepath) or overwrite:
        try:
>         data_zipped = download_file(url_base, folder_path, filename_zipped)

laika/downloader.py:269: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

url_bases = ('https://github.com/commaai/gnss-data/raw/master/gnss/products/', 'sftp://gdc.cddis.eosdis.nasa.gov/gnss/products/', 'ftp://igs.ign.fr/pub/igs/products/')
args = ('2103/', 'igu21034_18.sp3.Z'), kwargs = {}
url_base = 'https://github.com/commaai/gnss-data/raw/master/gnss/products/'

    def wrapped(url_bases, *args, **kwargs):
      if isinstance(url_bases, str):
        # only one url passed, don't do the retry thing
        return f(url_bases, *args, **kwargs)
    
      # not a string, must be a list of url_bases
      for url_base in url_bases:
        try:
>         return f(url_base, *args, **kwargs)

laika/downloader.py:43: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

url_base = 'https://github.com/commaai/gnss-data/raw/master/gnss/products/'
folder_path = '2103/', filename_zipped = 'igu21034_18.sp3.Z'

    @retryable
    def download_file(url_base, folder_path, filename_zipped):
      url = url_base + folder_path + filename_zipped
      print('Downloading ' + url)
      if url.startswith('https://'):
>       return https_download_file(url)

laika/downloader.py:234: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

url = 'https://github.com/commaai/gnss-data/raw/master/gnss/products/2103/igu21034_18.sp3.Z'

    def https_download_file(url):
      crl = pycurl.Curl()
      crl.setopt(crl.CAINFO, certifi.where())
      crl.setopt(crl.URL, url)
      crl.setopt(crl.FOLLOWLOCATION, True)
      crl.setopt(crl.SSL_CIPHER_LIST, 'DEFAULT@SECLEVEL=1')
      crl.setopt(crl.COOKIEJAR, '/tmp/cddis_cookies')
      crl.setopt(pycurl.CONNECTTIMEOUT, 10)
    
      buf = BytesIO()
      crl.setopt(crl.WRITEDATA, buf)
>     crl.perform()
E     pycurl.error: (59, 'failed setting cipher list: DEFAULT@SECLEVEL=1')

laika/downloader.py:190: error

During handling of the above exception, another exception occurred:

self = <tests.test_downloader.TestDownloader testMethod=test_download_overwrite>

    def test_download_overwrite(self):
>     file = download_and_cache_file(self.url_base, self.folder_path, cache_dir=self.cache_dir, filename=self.filename, compression='.Z')

tests/test_downloader.py:40: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

url_base = ('https://github.com/commaai/gnss-data/raw/master/gnss/products/', 'sftp://gdc.cddis.eosdis.nasa.gov/gnss/products/', 'ftp://igs.ign.fr/pub/igs/products/')
folder_path = '2103/', cache_dir = '/tmp/gnss/cddis_products'
filename = 'igu21034_18.sp3', compression = '.Z', overwrite = False

    def download_and_cache_file(url_base, folder_path: str, cache_dir: str, filename: str, compression='', overwrite=False):
      filename_zipped = filename + compression
      folder_path_abs = os.path.join(cache_dir, folder_path)
      filepath = str(hatanaka.get_decompressed_path(os.path.join(folder_path_abs, filename)))
    
      filepath_attempt = filepath + '.attempt_time'
    
      if os.path.exists(filepath_attempt):
        with open(filepath_attempt, 'r') as rf:
          last_attempt_time = float(rf.read())
        if time.time() - last_attempt_time < SECS_IN_HR:
          raise DownloadFailed(f"Too soon to try downloading {folder_path + filename_zipped} from {url_base} again since last attempt")
      if not os.path.isfile(filepath) or overwrite:
        try:
          data_zipped = download_file(url_base, folder_path, filename_zipped)
        except (DownloadFailed, pycurl.error, socket.timeout):
          unix_time = time.time()
          os.makedirs(folder_path_abs, exist_ok=True)
          with atomic_write(filepath_attempt, mode='w', overwrite=True) as wf:
            wf.write(str(unix_time))
>         raise DownloadFailed(f"Could not download {folder_path + filename_zipped} from {url_base} ")
E         laika.downloader.DownloadFailed: Could not download 2103/igu21034_18.sp3.Z from ('https://github.com/commaai/gnss-data/raw/master/gnss/products/', 'sftp://gdc.cddis.eosdis.nasa.gov/gnss/products/', 'ftp://igs.ign.fr/pub/igs/products/')

laika/downloader.py:275: DownloadFailed
----------------------------- Captured stdout call -----------------------------
Downloading https://github.com/commaai/gnss-data/raw/master/gnss/products/2103/igu21034_18.sp3.Z
____________________ TestDownloader.test_wait_after_failure ____________________

url_base = ('https://github.com/commaai/gnss-data/raw/master/gnss/products/', 'sftp://gdc.cddis.eosdis.nasa.gov/gnss/products/', 'ftp://igs.ign.fr/pub/igs/products/')
folder_path = '2103/', cache_dir = '/tmp/gnss/cddis_products'
filename = 'igu21034_18.sp3', compression = '.Z', overwrite = False

    def download_and_cache_file(url_base, folder_path: str, cache_dir: str, filename: str, compression='', overwrite=False):
      filename_zipped = filename + compression
      folder_path_abs = os.path.join(cache_dir, folder_path)
      filepath = str(hatanaka.get_decompressed_path(os.path.join(folder_path_abs, filename)))
    
      filepath_attempt = filepath + '.attempt_time'
    
      if os.path.exists(filepath_attempt):
        with open(filepath_attempt, 'r') as rf:
          last_attempt_time = float(rf.read())
        if time.time() - last_attempt_time < SECS_IN_HR:
          raise DownloadFailed(f"Too soon to try downloading {folder_path + filename_zipped} from {url_base} again since last attempt")
      if not os.path.isfile(filepath) or overwrite:
        try:
>         data_zipped = download_file(url_base, folder_path, filename_zipped)

laika/downloader.py:269: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

url_bases = ('https://github.com/commaai/gnss-data/raw/master/gnss/products/', 'sftp://gdc.cddis.eosdis.nasa.gov/gnss/products/', 'ftp://igs.ign.fr/pub/igs/products/')
args = ('2103/', 'igu21034_18.sp3.Z'), kwargs = {}
url_base = 'https://github.com/commaai/gnss-data/raw/master/gnss/products/'

    def wrapped(url_bases, *args, **kwargs):
      if isinstance(url_bases, str):
        # only one url passed, don't do the retry thing
        return f(url_bases, *args, **kwargs)
    
      # not a string, must be a list of url_bases
      for url_base in url_bases:
        try:
>         return f(url_base, *args, **kwargs)

laika/downloader.py:43: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

url_base = 'https://github.com/commaai/gnss-data/raw/master/gnss/products/'
folder_path = '2103/', filename_zipped = 'igu21034_18.sp3.Z'

    @retryable
    def download_file(url_base, folder_path, filename_zipped):
      url = url_base + folder_path + filename_zipped
      print('Downloading ' + url)
      if url.startswith('https://'):
>       return https_download_file(url)

laika/downloader.py:234: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

url = 'https://github.com/commaai/gnss-data/raw/master/gnss/products/2103/igu21034_18.sp3.Z'

    def https_download_file(url):
      crl = pycurl.Curl()
      crl.setopt(crl.CAINFO, certifi.where())
      crl.setopt(crl.URL, url)
      crl.setopt(crl.FOLLOWLOCATION, True)
      crl.setopt(crl.SSL_CIPHER_LIST, 'DEFAULT@SECLEVEL=1')
      crl.setopt(crl.COOKIEJAR, '/tmp/cddis_cookies')
      crl.setopt(pycurl.CONNECTTIMEOUT, 10)
    
      buf = BytesIO()
      crl.setopt(crl.WRITEDATA, buf)
>     crl.perform()
E     pycurl.error: (59, 'failed setting cipher list: DEFAULT@SECLEVEL=1')

laika/downloader.py:190: error

During handling of the above exception, another exception occurred:

self = <tests.test_downloader.TestDownloader testMethod=test_wait_after_failure>

    def test_wait_after_failure(self):
      # Verify no failure first.
>     download_and_cache_file(self.url_base, self.folder_path, cache_dir=self.cache_dir, filename=self.filename, compression='.Z')

tests/test_downloader.py:61: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

url_base = ('https://github.com/commaai/gnss-data/raw/master/gnss/products/', 'sftp://gdc.cddis.eosdis.nasa.gov/gnss/products/', 'ftp://igs.ign.fr/pub/igs/products/')
folder_path = '2103/', cache_dir = '/tmp/gnss/cddis_products'
filename = 'igu21034_18.sp3', compression = '.Z', overwrite = False

    def download_and_cache_file(url_base, folder_path: str, cache_dir: str, filename: str, compression='', overwrite=False):
      filename_zipped = filename + compression
      folder_path_abs = os.path.join(cache_dir, folder_path)
      filepath = str(hatanaka.get_decompressed_path(os.path.join(folder_path_abs, filename)))
    
      filepath_attempt = filepath + '.attempt_time'
    
      if os.path.exists(filepath_attempt):
        with open(filepath_attempt, 'r') as rf:
          last_attempt_time = float(rf.read())
        if time.time() - last_attempt_time < SECS_IN_HR:
          raise DownloadFailed(f"Too soon to try downloading {folder_path + filename_zipped} from {url_base} again since last attempt")
      if not os.path.isfile(filepath) or overwrite:
        try:
          data_zipped = download_file(url_base, folder_path, filename_zipped)
        except (DownloadFailed, pycurl.error, socket.timeout):
          unix_time = time.time()
          os.makedirs(folder_path_abs, exist_ok=True)
          with atomic_write(filepath_attempt, mode='w', overwrite=True) as wf:
            wf.write(str(unix_time))
>         raise DownloadFailed(f"Could not download {folder_path + filename_zipped} from {url_base} ")
E         laika.downloader.DownloadFailed: Could not download 2103/igu21034_18.sp3.Z from ('https://github.com/commaai/gnss-data/raw/master/gnss/products/', 'sftp://gdc.cddis.eosdis.nasa.gov/gnss/products/', 'ftp://igs.ign.fr/pub/igs/products/')

laika/downloader.py:275: DownloadFailed
----------------------------- Captured stdout call -----------------------------
Downloading https://github.com/commaai/gnss-data/raw/master/gnss/products/2103/igu21034_18.sp3.Z
_____________________ TestAstroDog.test_nav_vs_orbit__old ______________________

self = <tests.test_ephemerides.TestAstroDog testMethod=test_nav_vs_orbit__old>

    def test_nav_vs_orbit__old(self):
      dog_orbit = AstroDog(valid_ephem_types=EphemerisType.all_orbits())
      dog_nav = AstroDog(valid_ephem_types=EphemerisType.NAV)
      for gps_time in gps_times:
        for svId in svIds:
          sat_info_nav = dog_nav.get_sat_info(svId, gps_time)
>         assert sat_info_nav is not None
E         assert None is not None

tests/test_ephemerides.py:39: AssertionError
_____________________ TestFailCache.test_no_infinite_pulls _____________________

self = <tests.test_fail_caching.TestFailCache testMethod=test_no_infinite_pulls>

    def test_no_infinite_pulls(self):
      dog = AstroDog(valid_ephem_types=EphemerisType.all_orbits())
      for gps_time in gps_times:
        for svId in svIds:
>         dog.get_sat_info(svId, gps_time)

tests/test_fail_caching.py:19: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
laika/astro_dog.py:265: in get_sat_info
    eph = self.get_orbit(prn, time)
laika/astro_dog.py:106: in get_orbit
    orbit = self._get_latest_valid_data(self.orbits[prn], self.cached_orbit[prn], self.get_orbit_data, time, skip_download)
laika/astro_dog.py:362: in _get_latest_valid_data
    download_data_func(time)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <laika.astro_dog.AstroDog object at 0x7fa672b363b0>
time = GPSTime(week=1950, tow=415621.0), only_predictions = False

    def get_orbit_data(self, time: GPSTime, only_predictions=False):
      if only_predictions:
        ephems_sp3 = self.download_parse_prediction_orbit(time)
      else:
        ephems_sp3 = self.download_parse_orbit(time)
      if sum([len(v) for v in ephems_sp3.values()]) < 5:
>       raise RuntimeError(f'No orbit data found. For Time {time.as_datetime()} constellations {self.valid_const} valid ephem types {self.valid_ephem_types}')
E       RuntimeError: No orbit data found. For Time 2017-05-25 19:27:01 constellations ('GPS', 'GLONASS') valid ephem types (<EphemerisType.FINAL_ORBIT: 1>, <EphemerisType.RAPID_ORBIT: 2>, <EphemerisType.ULTRA_RAPID_ORBIT: 3>)

laika/astro_dog.py:211: RuntimeError
__________________ TestFetchSatInfo.test_get_all_sat_info_gps __________________

self = <tests.test_fetch_sat_info.TestFetchSatInfo testMethod=test_get_all_sat_info_gps>

    def test_get_all_sat_info_gps(self):
      time = GPSTime.from_datetime(datetime(2020, 5, 1, 12, 0, 0))
      all_ephem_types = (EphemerisType.FINAL_ORBIT, EphemerisType.RAPID_ORBIT, EphemerisType.ULTRA_RAPID_ORBIT, EphemerisType.NAV)
      kwargs_list = [
        *[{"valid_const": ["GPS"], "valid_ephem_types": ephem_type} for ephem_type in all_ephem_types],
        *[{"valid_const": ["GLONASS"], "valid_ephem_types": ephem_type} for ephem_type in all_ephem_types],
        *[{"valid_const": ["BEIDOU"], "valid_ephem_types": ephem_type} for ephem_type in EphemerisType.all_orbits()],
        *[{"valid_const": ["GALILEO"], "valid_ephem_types": ephem_type} for ephem_type in EphemerisType.all_orbits()],
        *[{"valid_const": ["QZNSS"], "valid_ephem_types": ephem_type} for ephem_type in EphemerisType.all_orbits()],
      ]
    
      for kwargs in kwargs_list:
        dog = AstroDog(**kwargs)
>       infos = dog.get_all_sat_info(time)

tests/test_fetch_sat_info.py:44: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
laika/astro_dog.py:275: in get_all_sat_info
    ephs = self.get_orbits(time)
laika/astro_dog.py:113: in get_orbits
    self.get_orbit_data(time)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <laika.astro_dog.AstroDog object at 0x7fa672b1d2a0>
time = GPSTime(week=2103, tow=475200.0), only_predictions = False

    def get_orbit_data(self, time: GPSTime, only_predictions=False):
      if only_predictions:
        ephems_sp3 = self.download_parse_prediction_orbit(time)
      else:
        ephems_sp3 = self.download_parse_orbit(time)
      if sum([len(v) for v in ephems_sp3.values()]) < 5:
>       raise RuntimeError(f'No orbit data found. For Time {time.as_datetime()} constellations {self.valid_const} valid ephem types {self.valid_ephem_types}')
E       RuntimeError: No orbit data found. For Time 2020-05-01 12:00:00 constellations ['GPS'] valid ephem types [<EphemerisType.FINAL_ORBIT: 1>]

laika/astro_dog.py:211: RuntimeError
_ TestFetchSatInfo.test_no_block_satellite_when_get_info_from_not_available_period _

self = <tests.test_fetch_sat_info.TestFetchSatInfo testMethod=test_no_block_satellite_when_get_info_from_not_available_period>

    def test_no_block_satellite_when_get_info_from_not_available_period(self):
      '''If you first fetch satellite info from period when navigation data
      isn't available and next from period when navigation data are available
      then you should get correct result'''
    
      prn = "C03"
      constellations = ["GPS", "BEIDOU"]
      available_date = GPSTime.from_datetime(datetime(2020, 5, 1, 12, 0))
      not_available_date = GPSTime.from_datetime(datetime(2000, 1, 1))
    
      dog = AstroDog(valid_const=constellations)
>     sat_info = dog.get_sat_info(prn, not_available_date)

tests/test_fetch_sat_info.py:26: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
laika/astro_dog.py:265: in get_sat_info
    eph = self.get_orbit(prn, time)
laika/astro_dog.py:106: in get_orbit
    orbit = self._get_latest_valid_data(self.orbits[prn], self.cached_orbit[prn], self.get_orbit_data, time, skip_download)
laika/astro_dog.py:362: in _get_latest_valid_data
    download_data_func(time)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <laika.astro_dog.AstroDog object at 0x7fa672ae1d50>
time = GPSTime(week=1042, tow=518400.0), only_predictions = False

    def get_orbit_data(self, time: GPSTime, only_predictions=False):
      if only_predictions:
        ephems_sp3 = self.download_parse_prediction_orbit(time)
      else:
        ephems_sp3 = self.download_parse_orbit(time)
      if sum([len(v) for v in ephems_sp3.values()]) < 5:
>       raise RuntimeError(f'No orbit data found. For Time {time.as_datetime()} constellations {self.valid_const} valid ephem types {self.valid_ephem_types}')
E       RuntimeError: No orbit data found. For Time 2000-01-01 00:00:00 constellations ['GPS', 'BEIDOU'] valid ephem types (<EphemerisType.FINAL_ORBIT: 1>, <EphemerisType.RAPID_ORBIT: 2>, <EphemerisType.ULTRA_RAPID_ORBIT: 3>)

laika/astro_dog.py:211: RuntimeError
_________________ TestPositioning.test_station_position_short __________________

self = <laika.rinex_file.RINEXFile object at 0x7fa672b053f0>, filename = None
rate = None

    def __init__(self, filename, rate=None):
      self.rate = rate
      try:
>       with open(filename) as f:
E       TypeError: expected str, bytes or os.PathLike object, not NoneType

laika/rinex_file.py:48: TypeError

During handling of the above exception, another exception occurred:

self = <tests.test_positioning.TestPositioning testMethod=test_station_position_short>

    def test_station_position_short(self):
>     self.run_station_position(10)

tests/test_positioning.py:23: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
tests/test_positioning.py:38: in run_station_position
    obs_data = RINEXFile(slac_rinex_obs_file)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <laika.rinex_file.RINEXFile object at 0x7fa672b053f0>, filename = None
rate = None

    def __init__(self, filename, rate=None):
      self.rate = rate
      try:
        with open(filename) as f:
          self._read_header(f)
          self._read_data(f)
      except TypeError:
        print("TypeError, file likely not downloaded.")
>       raise DownloadError("file download failure")
E       laika.rinex_file.DownloadError: file download failure

laika/rinex_file.py:53: DownloadError
----------------------------- Captured stdout call -----------------------------
File not downloaded, check availability on server.
TypeError, file likely not downloaded.
________________________ TestPredictionOrbits.test_gps _________________________

self = <tests.test_prediction_orbits.TestPredictionOrbits testMethod=test_gps>

    def test_gps(self):
      available_date = GPSTime.from_datetime(datetime(2020, 5, 1, 12))
      dog = AstroDog(valid_const=["GPS"], valid_ephem_types=EphemerisType.ULTRA_RAPID_ORBIT)
>     dog.get_orbit_data(available_date, only_predictions=True)

tests/test_prediction_orbits.py:15: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <laika.astro_dog.AstroDog object at 0x7fa672b1faf0>
time = GPSTime(week=2103, tow=475200.0), only_predictions = True

    def get_orbit_data(self, time: GPSTime, only_predictions=False):
      if only_predictions:
        ephems_sp3 = self.download_parse_prediction_orbit(time)
      else:
        ephems_sp3 = self.download_parse_orbit(time)
      if sum([len(v) for v in ephems_sp3.values()]) < 5:
>       raise RuntimeError(f'No orbit data found. For Time {time.as_datetime()} constellations {self.valid_const} valid ephem types {self.valid_ephem_types}')
E       RuntimeError: No orbit data found. For Time 2020-05-01 12:00:00 constellations ['GPS'] valid ephem types [<EphemerisType.ULTRA_RAPID_ORBIT: 3>]

laika/astro_dog.py:211: RuntimeError
________________ TestPredictionOrbits.test_gps_and_glonass_2022 ________________

self = <tests.test_prediction_orbits.TestPredictionOrbits testMethod=test_gps_and_glonass_2022>

    def test_gps_and_glonass_2022(self):
      # Test GPS and GLONASS separately from the first date that GLONASS Ultra-Rapid prediction orbits were available
      available_date = GPSTime.from_datetime(datetime(2022, 1, 29, 11, 31))
      for t in range(0, 24, 3):
        check_date = available_date + t * SECS_IN_HR
        for const in ["GPS", "GLONASS"]:
          dog = AstroDog(valid_const=const, valid_ephem_types=EphemerisType.ULTRA_RAPID_ORBIT)
>         dog.get_orbit_data(check_date, only_predictions=True)

tests/test_prediction_orbits.py:26: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
laika/astro_dog.py:207: in get_orbit_data
    ephems_sp3 = self.download_parse_prediction_orbit(time)
laika/astro_dog.py:195: in download_parse_prediction_orbit
    result = download_prediction_orbits_russia_src(gps_time, self.cache_dir)
laika/downloader.py:402: in download_prediction_orbits_russia_src
    return download_and_cache_file_return_first_success(url_bases, folder_and_file_names, cache_dir+'russian_products/', raise_error=True)
laika/downloader.py:252: in download_and_cache_file_return_first_success
    raise last_error
laika/downloader.py:246: in download_and_cache_file_return_first_success
    file = download_and_cache_file(url_bases, folder_path, cache_dir, filename, compression, overwrite)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

url_base = 'https://github.com/commaai/gnss-data-alt/raw/master/MCC/PRODUCTS/'
folder_path = '22028/ultra/', cache_dir = '/tmp/gnss/russian_products/'
filename = 'Stark_1D_22012818.sp3', compression = '', overwrite = False

    def download_and_cache_file(url_base, folder_path: str, cache_dir: str, filename: str, compression='', overwrite=False):
      filename_zipped = filename + compression
      folder_path_abs = os.path.join(cache_dir, folder_path)
      filepath = str(hatanaka.get_decompressed_path(os.path.join(folder_path_abs, filename)))
    
      filepath_attempt = filepath + '.attempt_time'
    
      if os.path.exists(filepath_attempt):
        with open(filepath_attempt, 'r') as rf:
          last_attempt_time = float(rf.read())
        if time.time() - last_attempt_time < SECS_IN_HR:
>         raise DownloadFailed(f"Too soon to try downloading {folder_path + filename_zipped} from {url_base} again since last attempt")
E         laika.downloader.DownloadFailed: Too soon to try downloading 22028/ultra/Stark_1D_22012818.sp3 from https://github.com/commaai/gnss-data-alt/raw/master/MCC/PRODUCTS/ again since last attempt

laika/downloader.py:266: DownloadFailed
=========================== short test summary info ============================
FAILED tests/test_downloader.py::TestDownloader::test_all_download_protocols
FAILED tests/test_downloader.py::TestDownloader::test_download - laika.downlo...
FAILED tests/test_downloader.py::TestDownloader::test_download_overwrite - la...
FAILED tests/test_downloader.py::TestDownloader::test_wait_after_failure - la...
FAILED tests/test_ephemerides.py::TestAstroDog::test_nav_vs_orbit__old - asse...
FAILED tests/test_fail_caching.py::TestFailCache::test_no_infinite_pulls - Ru...
FAILED tests/test_fetch_sat_info.py::TestFetchSatInfo::test_get_all_sat_info_gps
FAILED tests/test_fetch_sat_info.py::TestFetchSatInfo::test_no_block_satellite_when_get_info_from_not_available_period
FAILED tests/test_positioning.py::TestPositioning::test_station_position_short
FAILED tests/test_prediction_orbits.py::TestPredictionOrbits::test_gps - Runt...
FAILED tests/test_prediction_orbits.py::TestPredictionOrbits::test_gps_and_glonass_2022
=================== 11 failed, 26 passed, 1 skipped in 2.52s ===================

iMurfyD commented Mar 10, 2023

Seems to be fixed with the following diff:

diff --git a/laika/downloader.py b/laika/downloader.py
index bd456af..65da306 100644
--- a/laika/downloader.py
+++ b/laika/downloader.py
@@ -181,10 +181,11 @@ def https_download_file(url):
   crl.setopt(crl.CAINFO, certifi.where())
   crl.setopt(crl.URL, url)
   crl.setopt(crl.FOLLOWLOCATION, True)
-  crl.setopt(crl.SSL_CIPHER_LIST, 'DEFAULT@SECLEVEL=1')
+  crl.setopt(crl.SSL_CIPHER_LIST, 'DEFAULT')
+  # crl.setopt(crl.SSL_CIPHER_LIST, 'DEFAULT@SECLEVEL=1')
   crl.setopt(crl.COOKIEJAR, '/tmp/cddis_cookies')
   crl.setopt(pycurl.CONNECTTIMEOUT, 10)
-  
+
   buf = BytesIO()
   crl.setopt(crl.WRITEDATA, buf)
   crl.perform()
@@ -230,6 +231,7 @@ def download_files(url_base, folder_path, cacheDir, filenames):
 def download_file(url_base, folder_path, filename_zipped):
   url = url_base + folder_path + filename_zipped
   print('Downloading ' + url)
+  time.sleep(1.0)
   if url.startswith('https://'):
     return https_download_file(url)
   elif url.startswith('ftp://'):

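As context for the `time.sleep(1.0)` line in the diff above: it throttles every download by a fixed second. A gentler pattern (purely illustrative, not part of laika) is to retry a failing download with exponential backoff, so successful requests pay no penalty:

```python
import time

# Hypothetical helper (not in laika): call fn(), retrying transient
# failures with exponentially growing delays instead of sleeping a
# fixed 1 s before every request.
def with_backoff(fn, attempts=4, base_delay=0.5):
    for i in range(attempts):
        try:
            return fn()
        except Exception:
            if i == attempts - 1:
                raise  # out of retries, propagate the last error
            time.sleep(base_delay * (2 ** i))  # 0.5 s, 1 s, 2 s, ...
```

This keeps the common (successful) path fast while still backing off when the server rate-limits.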
I can open a pull request if it's helpful. I have a poor understanding of SSL certificates, so I'm not sure what the SECLEVEL setting actually does.
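For context on what `SECLEVEL` does: OpenSSL 1.1.0+ gates ciphers, key sizes, and signature algorithms behind numeric "security levels". `DEFAULT@SECLEVEL=1` relaxes the minimum (re-allowing e.g. SHA-1 signatures and RSA keys under 2048 bits), which some older GNSS servers still need; dropping the suffix falls back to the OpenSSL build's default level. A sketch using the stdlib `ssl` module (the same cipher string pycurl passes through; assumes Python is linked against OpenSSL 1.1.0 or newer):

```python
import ssl

# Count the cipher suites a client context accepts for a given
# OpenSSL cipher string. Lower SECLEVEL => more (older) ciphers.
def cipher_count(cipher_string: str) -> int:
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.set_ciphers(cipher_string)
    return len(ctx.get_ciphers())

relaxed = cipher_count("DEFAULT@SECLEVEL=1")
strict = cipher_count("DEFAULT@SECLEVEL=2")
print(f"SECLEVEL=1 allows {relaxed} ciphers, SECLEVEL=2 allows {strict}")
```

So the diff's change from `DEFAULT@SECLEVEL=1` to `DEFAULT` effectively tightens the handshake to the system default level, which apparently matches what these servers negotiate today.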

haraschax (Collaborator) commented:

I vaguely remember this issue; if I recall correctly, it had something to do with the SSL version. Not sure anymore.


iMurfyD commented Mar 27, 2023

My version is OpenSSL 1.1.1t 7 Feb 2023.
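Since the behavior depends on the OpenSSL release your tooling links against (which can differ from the `openssl` binary on your PATH), a quick way to check what Python's `ssl` module was built with:

```python
import ssl

# The OpenSSL release CPython's ssl module was compiled against;
# pycurl may link a different libssl, but this is a useful first check.
print(ssl.OPENSSL_VERSION)       # e.g. "OpenSSL 1.1.1t  7 Feb 2023"
print(ssl.OPENSSL_VERSION_INFO)  # numeric 5-tuple of the same version
```

Note that pycurl links libcurl's SSL backend separately, so `pycurl.version` (when pycurl is installed) is worth checking too.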
