Code | Set at | Label | Goal | Prerequisites | Comment |
---|---|---|---|---|---|
13012101 | 13/01/21 | SQL Server | fundamental | | |
13012102 | 13/01/21 | Apache Server | fundamental | | |
13012103 | 13/01/21 | Processing Server(s) | fundamental | | |
13012104 | 13/01/21 | Redundant Processing Server | fundamental | | |
13012105 | 13/01/21 | DSO Website | fundamental | 13012101 | |
13012106 | 13/01/21 | Upgrade Bernese AP | euref DWG, euref VWG | 13012103 | |
13012107 | 13/01/21 | Test Bernese AP | euref DWG | 13012106 | |
13012108 | 13/01/21 | Process old data | euref DWG | 13012107 | |
13012111 | 13/01/21 | Automate SINEX distribution | euref DWG | 13012107 | |
13012109 | 13/01/21 | Time-Series SW | euref VWG | | |
13012110 | 13/01/21 | EUREF Velocities WG | euref VWG | 13012109 | |
13012112 | 13/01/21 | Get RTCM standards | real time | | |

- Continue DSO's contribution to the EUREF Densification WG (euref DWG in the Goal column)
- Start DSO's contribution to the EUREF Velocities WG (euref VWG)
- Automatic processing via Bernese in near-real-time
- Work towards processing in real time
- Work towards processing low-cost-receiver datasets
- Fundamental groundwork

13012101 SQL Server: Set up an SQL server for the automatic processing.
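
As a rough illustration of what this task might involve, the sketch below creates the kind of bookkeeping tables an automatic-processing database could hold. It uses Python's stdlib sqlite3 module purely as a stand-in for whichever SQL server is finally chosen; the table and column names are assumptions, not an agreed schema.

```python
# Sketch only: the kind of bookkeeping the processing database could hold.
# sqlite3 is used as a stand-in; the real setup targets the chosen SQL server.
import sqlite3

conn = sqlite3.connect("autoproc_demo.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS station (
    id      INTEGER PRIMARY KEY,
    name    TEXT UNIQUE NOT NULL,      -- 4-char station id, e.g. 'dyng'
    network TEXT NOT NULL              -- e.g. 'greece'
);
CREATE TABLE IF NOT EXISTS processing_job (
    id         INTEGER PRIMARY KEY,
    station_id INTEGER REFERENCES station(id),
    epoch      TEXT NOT NULL,          -- processing day (ISO date)
    solution   TEXT NOT NULL,          -- 'final' or 'ultra-rapid'
    status     TEXT DEFAULT 'pending'  -- pending / running / done / failed
);
""")
conn.commit()
conn.close()
```
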

13012102 Apache Server: Set up an Apache server to host DSO's activities.

13012103 Processing Server(s): Update/upgrade the processing servers; set up two (cloned) processing servers to host the automatic processing activities.

13012104 Redundant Processing Server: Set up one processing server to act as a redundant (backup) processing host.

13012105 DSO Website: Set up the DSO processing website (probably Django).
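
Since the stack is only tentatively Django ("probably Django"), here is a minimal sketch of one possible endpoint: a view listing the SINEX products found in a local directory. The directory path, view name, and URL are assumptions; in a real project the urlpatterns would live in the site's URLconf.

```python
# Hypothetical Django view for the DSO processing website: returns a JSON
# listing of SINEX products found in a local directory. PRODUCT_DIR and the
# URL name are placeholders, not an agreed layout.
from pathlib import Path

from django.http import JsonResponse
from django.urls import path

PRODUCT_DIR = Path("/data/products/sinex")   # assumed location of SINEX files

def sinex_list(request):
    """List the SINEX (*.SNX) files currently available for download."""
    files = sorted(p.name for p in PRODUCT_DIR.glob("*.SNX"))
    return JsonResponse({"count": len(files), "files": files})

urlpatterns = [
    path("sinex/", sinex_list, name="sinex-list"),
]
```
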

13012106 Upgrade Bernese AP: Update/upgrade all automatic processing scripts and programs. All updated routines are to be placed in autobern.

13012107 Test Bernese AP: Test that the automatic processing works for final and ultra-rapid solutions and for all networks.
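
One possible shape for such a check is sketched below: for every network and solution type, verify that the previous day's run produced a result file. The directory layout, file naming, and network list are placeholders, not the actual autobern conventions.

```python
# Sketch of a smoke test for the automatic processing: for each network and
# solution type, check that yesterday's run left a result file behind.
# Directory layout and file naming are assumptions.
import datetime
from pathlib import Path

RESULT_DIR = Path("/data/results")   # assumed results root
NETWORKS = ["greece"]                # only network named so far; extend as needed
SOLUTIONS = ["final", "ultra-rapid"]

def missing_results(day: datetime.date) -> list[str]:
    """Return the (network, solution) runs with no output file for `day`."""
    missing = []
    for net in NETWORKS:
        for sol in SOLUTIONS:
            expected = RESULT_DIR / net / sol / f"{day:%Y%j}.SNX"
            if not expected.exists():
                missing.append(f"{net}/{sol}")
    return missing

if __name__ == "__main__":
    yesterday = datetime.date.today() - datetime.timedelta(days=1)
    failed = missing_results(yesterday)
    print("OK" if not failed else f"missing: {failed}")
```
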

13012108 Process old data: Process the data of years 2019 and 2020 for network "greece".

13012109 Time-Series SW: Design and develop a (Python?) time-series software tool.
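
Whatever form the tool takes, its core computation is a linear (velocity) fit to coordinate time series. A minimal numpy sketch of that least-squares step, with synthetic data standing in for a real station series:

```python
# Core of a time-series tool: fit x(t) = x0 + v*t to a daily coordinate
# component and report the velocity. Synthetic data replaces a real series.
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(0.0, 5.0, 1.0 / 365.25)               # 5 years of daily epochs (years)
north = 0.012 * t + rng.normal(0.0, 0.002, t.size)  # 12 mm/yr signal + 2 mm noise

# Least-squares estimate of offset and velocity.
A = np.column_stack([np.ones_like(t), t])
(x0, vel), *_ = np.linalg.lstsq(A, north, rcond=None)

print(f"estimated velocity: {vel * 1000:.2f} mm/yr (true: 12.00 mm/yr)")
```
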

13012110 EUREF Velocities WG: Contact the EUREF Velocities Working Group and try to contribute. We need velocity estimates for that (as dense as possible). They (Dr. Brockmann) are very interested in StrainTool.

13012111 Automate SINEX distribution: Automate the publishing of SINEX files to the EUREF Densification WG, e.g. once every two weeks.
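
One way this automation could look, assuming delivery over FTP (the actual mechanism, host, credentials, and paths are not decided here); the two-week cadence would come from cron or a similar scheduler rather than from the script itself:

```python
# Sketch of automated SINEX publishing: upload any SINEX file newer than the
# last run to the data centre. Host and paths are placeholders; the transfer
# protocol is an assumption. Scheduling (e.g. every two weeks) is left to cron.
import ftplib
from pathlib import Path

LOCAL_DIR = Path("/data/products/sinex")    # assumed local product directory
HOST = "ftp.example.org"                    # placeholder data-centre host
REMOTE_DIR = "incoming/dso"                 # placeholder remote directory
STAMP = LOCAL_DIR / ".last_published"       # marker file from the previous run

def publish():
    last = STAMP.stat().st_mtime if STAMP.exists() else 0.0
    new_files = [p for p in LOCAL_DIR.glob("*.SNX") if p.stat().st_mtime > last]
    if not new_files:
        return
    with ftplib.FTP(HOST) as ftp:
        ftp.login()                          # anonymous here; real setup needs credentials
        ftp.cwd(REMOTE_DIR)
        for path in new_files:
            with open(path, "rb") as fh:
                ftp.storbinary(f"STOR {path.name}", fh)
    STAMP.touch()

if __name__ == "__main__":
    publish()
```
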